problem_id
stringlengths 18
22
| source
stringclasses 1
value | task_type
stringclasses 1
value | in_source_id
stringlengths 13
58
| prompt
stringlengths 1.1k
25.4k
| golden_diff
stringlengths 145
5.13k
| verification_info
stringlengths 582
39.1k
| num_tokens
int64 271
4.1k
| num_tokens_diff
int64 47
1.02k
|
---|---|---|---|---|---|---|---|---|
gh_patches_debug_25882
|
rasdani/github-patches
|
git_diff
|
mdn__kuma-6927
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
T - Update robots.txt (remove sites)
**Summary**
There are several page types that we have in our robots.txt, which means that Google can't crawl it to see that we don't want those pages to be indexed.
**Steps To Reproduce (STR)**
1. Go to Search Console
2. go to Coverage > Valid with Warnings > Indexed, though blocked by robots.txt
3. Alternatively: https://developer.mozilla.org/robots.txt
**Actual behavior**
Google has a link to https://developer.mozilla.org/en-US/docs/tag/SpeechRecognitionResult (for example), but that URL is blocked in robots.txt, so it can't follow the link to see that it redirects to https://wiki.developer.mozilla.org/en-US/docs/tag/SpeechRecognitionResult which has a `no-index` directive.
**Expected behavior**
All page types that redirect to the wiki should be removed from the robots.txt of the read-only site, since the wiki has a `no-index` directive on all content already.
**Additional context**
The reason to do this is so we can see actually problematic content show up in our search console reports, instead of this noise. Search console only shows up to 1000 pages as problematic, but there are currently more than 10k warnings, so we might be missing large issues.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `kuma/landing/views.py`
Content:
```
1 from django.conf import settings
2 from django.http import HttpResponse
3 from django.shortcuts import redirect, render
4 from django.views import static
5 from django.views.decorators.cache import never_cache
6 from django.views.generic import RedirectView
7
8 from kuma.core.decorators import ensure_wiki_domain, shared_cache_control
9 from kuma.core.utils import is_wiki
10 from kuma.feeder.models import Bundle
11 from kuma.feeder.sections import SECTION_HACKS
12 from kuma.search.models import Filter
13
14 from .utils import favicon_url
15
16
17 @shared_cache_control
18 def contribute_json(request):
19 return static.serve(request, "contribute.json", document_root=settings.ROOT)
20
21
22 @shared_cache_control
23 def home(request):
24 """Home page."""
25 context = {}
26 # Need for both wiki and react homepage
27 context["updates"] = list(Bundle.objects.recent_entries(SECTION_HACKS.updates)[:5])
28
29 # The default template name
30 template_name = "landing/react_homepage.html"
31 if is_wiki(request):
32 template_name = "landing/homepage.html"
33 context["default_filters"] = Filter.objects.default_filters()
34 return render(request, template_name, context)
35
36
37 @ensure_wiki_domain
38 @never_cache
39 def maintenance_mode(request):
40 if settings.MAINTENANCE_MODE:
41 return render(request, "landing/maintenance-mode.html")
42 else:
43 return redirect("home")
44
45
46 @ensure_wiki_domain
47 @shared_cache_control
48 def promote_buttons(request):
49 """Bug 646192: MDN affiliate buttons"""
50 return render(request, "landing/promote_buttons.html")
51
52
53 ROBOTS_ALL_ALLOWED_TXT = """\
54 User-agent: *
55 Sitemap: https://wiki.developer.mozilla.org/sitemap.xml
56
57 Disallow:
58 """
59
60 ROBOTS_ALLOWED_TXT = """\
61 User-agent: *
62 Sitemap: https://developer.mozilla.org/sitemap.xml
63
64 Disallow: /admin/
65 Disallow: /api/
66 Disallow: /*/dashboards/*
67 Disallow: /*docs/feeds
68 Disallow: /*docs/templates
69 Disallow: /*docs*Template:
70 Disallow: /*docs/all
71 Disallow: /*docs/tag*
72 Disallow: /*docs/needs-review*
73 Disallow: /*docs/localization-tag*
74 Disallow: /*docs/with-errors
75 Disallow: /*docs/without-parent
76 Disallow: /*docs/top-level
77 Disallow: /*docs/new
78 Disallow: /*docs/get-documents
79 Disallow: /*docs/submit_akismet_spam
80 Disallow: /*docs/load*
81 Disallow: /*docs/Experiment:*
82 Disallow: /*$api
83 Disallow: /*$compare
84 Disallow: /*$revision
85 Disallow: /*$history
86 Disallow: /*$children
87 Disallow: /*$flag
88 Disallow: /*$locales
89 Disallow: /*$toc
90 Disallow: /*$move
91 Disallow: /*$quick-review
92 Disallow: /*$samples
93 Disallow: /*$revert
94 Disallow: /*$repair_breadcrumbs
95 Disallow: /*$delete
96 Disallow: /*$restore
97 Disallow: /*$purge
98 Disallow: /*$subscribe
99 Disallow: /*$subscribe_to_tree
100 Disallow: /*$vote
101 Disallow: /*docs.json
102 Disallow: /*docs/ckeditor_config.js
103 Disallow: /*/files/
104 Disallow: /media
105 Disallow: /*move-requested
106 Disallow: /*preview-wiki-content
107 Disallow: /*profiles*/edit
108 Disallow: /skins
109 Disallow: /*type=feed
110 Disallow: /*users/
111 """ + "\n".join(
112 "Disallow: /{locale}/search".format(locale=locale)
113 for locale in settings.ENABLED_LOCALES
114 )
115
116 ROBOTS_GO_AWAY_TXT = """\
117 User-Agent: *
118 Disallow: /
119 """
120
121
122 @shared_cache_control
123 def robots_txt(request):
124 """Serve robots.txt that allows or forbids robots."""
125 host = request.get_host()
126 if host in settings.ALLOW_ROBOTS_DOMAINS:
127 robots = ""
128 elif host in settings.ALLOW_ROBOTS_WEB_DOMAINS:
129 if host == settings.WIKI_HOST:
130 robots = ROBOTS_ALL_ALLOWED_TXT
131 else:
132 robots = ROBOTS_ALLOWED_TXT
133 else:
134 robots = ROBOTS_GO_AWAY_TXT
135 return HttpResponse(robots, content_type="text/plain")
136
137
138 class FaviconRedirect(RedirectView):
139 """Redirect to the favicon in the static img folder (bug 1402497)"""
140
141 def get_redirect_url(self, *args, **kwargs):
142 return favicon_url()
143
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/kuma/landing/views.py b/kuma/landing/views.py
--- a/kuma/landing/views.py
+++ b/kuma/landing/views.py
@@ -61,52 +61,14 @@
User-agent: *
Sitemap: https://developer.mozilla.org/sitemap.xml
-Disallow: /admin/
Disallow: /api/
-Disallow: /*/dashboards/*
-Disallow: /*docs/feeds
-Disallow: /*docs/templates
-Disallow: /*docs*Template:
-Disallow: /*docs/all
-Disallow: /*docs/tag*
-Disallow: /*docs/needs-review*
-Disallow: /*docs/localization-tag*
-Disallow: /*docs/with-errors
-Disallow: /*docs/without-parent
-Disallow: /*docs/top-level
-Disallow: /*docs/new
Disallow: /*docs/get-documents
-Disallow: /*docs/submit_akismet_spam
-Disallow: /*docs/load*
Disallow: /*docs/Experiment:*
-Disallow: /*$api
-Disallow: /*$compare
-Disallow: /*$revision
-Disallow: /*$history
Disallow: /*$children
-Disallow: /*$flag
-Disallow: /*$locales
-Disallow: /*$toc
-Disallow: /*$move
-Disallow: /*$quick-review
-Disallow: /*$samples
-Disallow: /*$revert
-Disallow: /*$repair_breadcrumbs
-Disallow: /*$delete
-Disallow: /*$restore
-Disallow: /*$purge
-Disallow: /*$subscribe
-Disallow: /*$subscribe_to_tree
-Disallow: /*$vote
Disallow: /*docs.json
-Disallow: /*docs/ckeditor_config.js
Disallow: /*/files/
Disallow: /media
-Disallow: /*move-requested
-Disallow: /*preview-wiki-content
Disallow: /*profiles*/edit
-Disallow: /skins
-Disallow: /*type=feed
Disallow: /*users/
""" + "\n".join(
"Disallow: /{locale}/search".format(locale=locale)
|
{"golden_diff": "diff --git a/kuma/landing/views.py b/kuma/landing/views.py\n--- a/kuma/landing/views.py\n+++ b/kuma/landing/views.py\n@@ -61,52 +61,14 @@\n User-agent: *\n Sitemap: https://developer.mozilla.org/sitemap.xml\n \n-Disallow: /admin/\n Disallow: /api/\n-Disallow: /*/dashboards/*\n-Disallow: /*docs/feeds\n-Disallow: /*docs/templates\n-Disallow: /*docs*Template:\n-Disallow: /*docs/all\n-Disallow: /*docs/tag*\n-Disallow: /*docs/needs-review*\n-Disallow: /*docs/localization-tag*\n-Disallow: /*docs/with-errors\n-Disallow: /*docs/without-parent\n-Disallow: /*docs/top-level\n-Disallow: /*docs/new\n Disallow: /*docs/get-documents\n-Disallow: /*docs/submit_akismet_spam\n-Disallow: /*docs/load*\n Disallow: /*docs/Experiment:*\n-Disallow: /*$api\n-Disallow: /*$compare\n-Disallow: /*$revision\n-Disallow: /*$history\n Disallow: /*$children\n-Disallow: /*$flag\n-Disallow: /*$locales\n-Disallow: /*$toc\n-Disallow: /*$move\n-Disallow: /*$quick-review\n-Disallow: /*$samples\n-Disallow: /*$revert\n-Disallow: /*$repair_breadcrumbs\n-Disallow: /*$delete\n-Disallow: /*$restore\n-Disallow: /*$purge\n-Disallow: /*$subscribe\n-Disallow: /*$subscribe_to_tree\n-Disallow: /*$vote\n Disallow: /*docs.json\n-Disallow: /*docs/ckeditor_config.js\n Disallow: /*/files/\n Disallow: /media\n-Disallow: /*move-requested\n-Disallow: /*preview-wiki-content\n Disallow: /*profiles*/edit\n-Disallow: /skins\n-Disallow: /*type=feed\n Disallow: /*users/\n \"\"\" + \"\\n\".join(\n \"Disallow: /{locale}/search\".format(locale=locale)\n", "issue": "T - Update robots.txt (remove sites)\n**Summary**\r\nThere are several page types that we have in our robots.txt, which means that Google can't crawl it to see that we don't want those pages to be indexed.\r\n\r\n\r\n**Steps To Reproduce (STR)**\r\n1. Go to Search Console\r\n2. go to Coverage > Valid with Warnings > Indexed, though blocked by robots.txt\r\n3. Alternatively: https://developer.mozilla.org/robots.txt\r\n\r\n\r\n**Actual behavior**\r\nGoogle has a link to https://developer.mozilla.org/en-US/docs/tag/SpeechRecognitionResult (for example), but that URL is blocked in robots.txt, so it can't follow the link to see that it redirects to https://wiki.developer.mozilla.org/en-US/docs/tag/SpeechRecognitionResult which has a `no-index` directive.\r\n\r\n\r\n**Expected behavior**\r\nAll page types that redirect to the wiki should be removed from the robots.txt of the read-only site, since the wiki has a `no-index` directive on all content already.\r\n\r\n\r\n**Additional context**\r\nThe reason to do this is so we can see actually problematic content show up in our search console reports, instead of this noise. Search console only shows up to 1000 pages as problematic, but there are currently more than 10k warnings, so we might be missing large issues.\r\n\n", "before_files": [{"content": "from django.conf import settings\nfrom django.http import HttpResponse\nfrom django.shortcuts import redirect, render\nfrom django.views import static\nfrom django.views.decorators.cache import never_cache\nfrom django.views.generic import RedirectView\n\nfrom kuma.core.decorators import ensure_wiki_domain, shared_cache_control\nfrom kuma.core.utils import is_wiki\nfrom kuma.feeder.models import Bundle\nfrom kuma.feeder.sections import SECTION_HACKS\nfrom kuma.search.models import Filter\n\nfrom .utils import favicon_url\n\n\n@shared_cache_control\ndef contribute_json(request):\n return static.serve(request, \"contribute.json\", document_root=settings.ROOT)\n\n\n@shared_cache_control\ndef home(request):\n \"\"\"Home page.\"\"\"\n context = {}\n # Need for both wiki and react homepage\n context[\"updates\"] = list(Bundle.objects.recent_entries(SECTION_HACKS.updates)[:5])\n\n # The default template name\n template_name = \"landing/react_homepage.html\"\n if is_wiki(request):\n template_name = \"landing/homepage.html\"\n context[\"default_filters\"] = Filter.objects.default_filters()\n return render(request, template_name, context)\n\n\n@ensure_wiki_domain\n@never_cache\ndef maintenance_mode(request):\n if settings.MAINTENANCE_MODE:\n return render(request, \"landing/maintenance-mode.html\")\n else:\n return redirect(\"home\")\n\n\n@ensure_wiki_domain\n@shared_cache_control\ndef promote_buttons(request):\n \"\"\"Bug 646192: MDN affiliate buttons\"\"\"\n return render(request, \"landing/promote_buttons.html\")\n\n\nROBOTS_ALL_ALLOWED_TXT = \"\"\"\\\nUser-agent: *\nSitemap: https://wiki.developer.mozilla.org/sitemap.xml\n\nDisallow:\n\"\"\"\n\nROBOTS_ALLOWED_TXT = \"\"\"\\\nUser-agent: *\nSitemap: https://developer.mozilla.org/sitemap.xml\n\nDisallow: /admin/\nDisallow: /api/\nDisallow: /*/dashboards/*\nDisallow: /*docs/feeds\nDisallow: /*docs/templates\nDisallow: /*docs*Template:\nDisallow: /*docs/all\nDisallow: /*docs/tag*\nDisallow: /*docs/needs-review*\nDisallow: /*docs/localization-tag*\nDisallow: /*docs/with-errors\nDisallow: /*docs/without-parent\nDisallow: /*docs/top-level\nDisallow: /*docs/new\nDisallow: /*docs/get-documents\nDisallow: /*docs/submit_akismet_spam\nDisallow: /*docs/load*\nDisallow: /*docs/Experiment:*\nDisallow: /*$api\nDisallow: /*$compare\nDisallow: /*$revision\nDisallow: /*$history\nDisallow: /*$children\nDisallow: /*$flag\nDisallow: /*$locales\nDisallow: /*$toc\nDisallow: /*$move\nDisallow: /*$quick-review\nDisallow: /*$samples\nDisallow: /*$revert\nDisallow: /*$repair_breadcrumbs\nDisallow: /*$delete\nDisallow: /*$restore\nDisallow: /*$purge\nDisallow: /*$subscribe\nDisallow: /*$subscribe_to_tree\nDisallow: /*$vote\nDisallow: /*docs.json\nDisallow: /*docs/ckeditor_config.js\nDisallow: /*/files/\nDisallow: /media\nDisallow: /*move-requested\nDisallow: /*preview-wiki-content\nDisallow: /*profiles*/edit\nDisallow: /skins\nDisallow: /*type=feed\nDisallow: /*users/\n\"\"\" + \"\\n\".join(\n \"Disallow: /{locale}/search\".format(locale=locale)\n for locale in settings.ENABLED_LOCALES\n)\n\nROBOTS_GO_AWAY_TXT = \"\"\"\\\nUser-Agent: *\nDisallow: /\n\"\"\"\n\n\n@shared_cache_control\ndef robots_txt(request):\n \"\"\"Serve robots.txt that allows or forbids robots.\"\"\"\n host = request.get_host()\n if host in settings.ALLOW_ROBOTS_DOMAINS:\n robots = \"\"\n elif host in settings.ALLOW_ROBOTS_WEB_DOMAINS:\n if host == settings.WIKI_HOST:\n robots = ROBOTS_ALL_ALLOWED_TXT\n else:\n robots = ROBOTS_ALLOWED_TXT\n else:\n robots = ROBOTS_GO_AWAY_TXT\n return HttpResponse(robots, content_type=\"text/plain\")\n\n\nclass FaviconRedirect(RedirectView):\n \"\"\"Redirect to the favicon in the static img folder (bug 1402497)\"\"\"\n\n def get_redirect_url(self, *args, **kwargs):\n return favicon_url()\n", "path": "kuma/landing/views.py"}], "after_files": [{"content": "from django.conf import settings\nfrom django.http import HttpResponse\nfrom django.shortcuts import redirect, render\nfrom django.views import static\nfrom django.views.decorators.cache import never_cache\nfrom django.views.generic import RedirectView\n\nfrom kuma.core.decorators import ensure_wiki_domain, shared_cache_control\nfrom kuma.core.utils import is_wiki\nfrom kuma.feeder.models import Bundle\nfrom kuma.feeder.sections import SECTION_HACKS\nfrom kuma.search.models import Filter\n\nfrom .utils import favicon_url\n\n\n@shared_cache_control\ndef contribute_json(request):\n return static.serve(request, \"contribute.json\", document_root=settings.ROOT)\n\n\n@shared_cache_control\ndef home(request):\n \"\"\"Home page.\"\"\"\n context = {}\n # Need for both wiki and react homepage\n context[\"updates\"] = list(Bundle.objects.recent_entries(SECTION_HACKS.updates)[:5])\n\n # The default template name\n template_name = \"landing/react_homepage.html\"\n if is_wiki(request):\n template_name = \"landing/homepage.html\"\n context[\"default_filters\"] = Filter.objects.default_filters()\n return render(request, template_name, context)\n\n\n@ensure_wiki_domain\n@never_cache\ndef maintenance_mode(request):\n if settings.MAINTENANCE_MODE:\n return render(request, \"landing/maintenance-mode.html\")\n else:\n return redirect(\"home\")\n\n\n@ensure_wiki_domain\n@shared_cache_control\ndef promote_buttons(request):\n \"\"\"Bug 646192: MDN affiliate buttons\"\"\"\n return render(request, \"landing/promote_buttons.html\")\n\n\nROBOTS_ALL_ALLOWED_TXT = \"\"\"\\\nUser-agent: *\nSitemap: https://wiki.developer.mozilla.org/sitemap.xml\n\nDisallow:\n\"\"\"\n\nROBOTS_ALLOWED_TXT = \"\"\"\\\nUser-agent: *\nSitemap: https://developer.mozilla.org/sitemap.xml\n\nDisallow: /api/\nDisallow: /*docs/get-documents\nDisallow: /*docs/Experiment:*\nDisallow: /*$children\nDisallow: /*docs.json\nDisallow: /*/files/\nDisallow: /media\nDisallow: /*profiles*/edit\nDisallow: /*users/\n\"\"\" + \"\\n\".join(\n \"Disallow: /{locale}/search\".format(locale=locale)\n for locale in settings.ENABLED_LOCALES\n)\n\nROBOTS_GO_AWAY_TXT = \"\"\"\\\nUser-Agent: *\nDisallow: /\n\"\"\"\n\n\n@shared_cache_control\ndef robots_txt(request):\n \"\"\"Serve robots.txt that allows or forbids robots.\"\"\"\n host = request.get_host()\n if host in settings.ALLOW_ROBOTS_DOMAINS:\n robots = \"\"\n elif host in settings.ALLOW_ROBOTS_WEB_DOMAINS:\n if host == settings.WIKI_HOST:\n robots = ROBOTS_ALL_ALLOWED_TXT\n else:\n robots = ROBOTS_ALLOWED_TXT\n else:\n robots = ROBOTS_GO_AWAY_TXT\n return HttpResponse(robots, content_type=\"text/plain\")\n\n\nclass FaviconRedirect(RedirectView):\n \"\"\"Redirect to the favicon in the static img folder (bug 1402497)\"\"\"\n\n def get_redirect_url(self, *args, **kwargs):\n return favicon_url()\n", "path": "kuma/landing/views.py"}]}
| 1,845 | 448 |
gh_patches_debug_27712
|
rasdani/github-patches
|
git_diff
|
pyca__cryptography-10277
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
cryptography_vectors is missing from `test` extra
There is missing `cryptography_vectors` in the `test` extra in `pyproject.toml` so when tests are run from sdist with the `test` extra installed then testing fails and `cryptography_vectors` needs to be installed manually to make tests passing.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `release.py`
Content:
```
1 # This file is dual licensed under the terms of the Apache License, Version
2 # 2.0, and the BSD License. See the LICENSE file in the root of this repository
3 # for complete details.
4
5 import pathlib
6 import re
7 import subprocess
8
9 import click
10 import tomllib
11 from packaging.version import Version
12
13
14 def run(*args: str) -> None:
15 print(f"[running] {list(args)}")
16 subprocess.check_call(list(args))
17
18
19 @click.group()
20 def cli():
21 pass
22
23
24 @cli.command()
25 def release() -> None:
26 base_dir = pathlib.Path(__file__).parent
27 with (base_dir / "pyproject.toml").open("rb") as f:
28 pyproject = tomllib.load(f)
29 version = pyproject["project"]["version"]
30
31 if Version(version).is_prerelease:
32 raise RuntimeError(
33 f"Can't release, pyproject.toml version is pre-release: {version}"
34 )
35
36 # Tag and push the tag (this will trigger the wheel builder in Actions)
37 run("git", "tag", "-s", version, "-m", f"{version} release")
38 run("git", "push", "--tags", "git@github.com:pyca/cryptography.git")
39
40
41 def replace_version(
42 p: pathlib.Path, variable_name: str, new_version: str
43 ) -> None:
44 content = p.read_text()
45
46 pattern = rf"^{variable_name}\s*=\s*.*$"
47 match = re.search(pattern, content, re.MULTILINE)
48 assert match is not None
49
50 start, end = match.span()
51 new_content = (
52 content[:start] + f'{variable_name} = "{new_version}"' + content[end:]
53 )
54
55 p.write_text(new_content)
56
57
58 @cli.command()
59 @click.argument("new_version")
60 def bump_version(new_version: str) -> None:
61 base_dir = pathlib.Path(__file__).parent
62
63 replace_version(base_dir / "pyproject.toml", "version", new_version)
64 replace_version(
65 base_dir / "src/cryptography/__about__.py", "__version__", new_version
66 )
67 replace_version(
68 base_dir / "vectors/pyproject.toml",
69 "version",
70 new_version,
71 )
72 replace_version(
73 base_dir / "vectors/cryptography_vectors/__about__.py",
74 "__version__",
75 new_version,
76 )
77
78
79 if __name__ == "__main__":
80 cli()
81
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/release.py b/release.py
--- a/release.py
+++ b/release.py
@@ -38,23 +38,24 @@
run("git", "push", "--tags", "git@github.com:pyca/cryptography.git")
-def replace_version(
- p: pathlib.Path, variable_name: str, new_version: str
-) -> None:
+def replace_pattern(p: pathlib.Path, pattern: str, replacement: str) -> None:
content = p.read_text()
-
- pattern = rf"^{variable_name}\s*=\s*.*$"
match = re.search(pattern, content, re.MULTILINE)
assert match is not None
start, end = match.span()
- new_content = (
- content[:start] + f'{variable_name} = "{new_version}"' + content[end:]
- )
-
+ new_content = content[:start] + replacement + content[end:]
p.write_text(new_content)
+def replace_version(
+ p: pathlib.Path, variable_name: str, new_version: str
+) -> None:
+ replace_pattern(
+ p, rf"^{variable_name}\s*=\s*.*$", f'{variable_name} = "{new_version}"'
+ )
+
+
@cli.command()
@click.argument("new_version")
def bump_version(new_version: str) -> None:
@@ -75,6 +76,19 @@
new_version,
)
+ if Version(new_version).is_prerelease:
+ replace_pattern(
+ base_dir / "pyproject.toml",
+ r'"cryptography_vectors(==.*?)?"',
+ '"cryptography_vectors"',
+ )
+ else:
+ replace_pattern(
+ base_dir / "pyproject.toml",
+ r'"cryptography_vectors(==.*?)?"',
+ f'"cryptography_vectors=={new_version}"',
+ )
+
if __name__ == "__main__":
cli()
|
{"golden_diff": "diff --git a/release.py b/release.py\n--- a/release.py\n+++ b/release.py\n@@ -38,23 +38,24 @@\n run(\"git\", \"push\", \"--tags\", \"git@github.com:pyca/cryptography.git\")\n \n \n-def replace_version(\n- p: pathlib.Path, variable_name: str, new_version: str\n-) -> None:\n+def replace_pattern(p: pathlib.Path, pattern: str, replacement: str) -> None:\n content = p.read_text()\n-\n- pattern = rf\"^{variable_name}\\s*=\\s*.*$\"\n match = re.search(pattern, content, re.MULTILINE)\n assert match is not None\n \n start, end = match.span()\n- new_content = (\n- content[:start] + f'{variable_name} = \"{new_version}\"' + content[end:]\n- )\n-\n+ new_content = content[:start] + replacement + content[end:]\n p.write_text(new_content)\n \n \n+def replace_version(\n+ p: pathlib.Path, variable_name: str, new_version: str\n+) -> None:\n+ replace_pattern(\n+ p, rf\"^{variable_name}\\s*=\\s*.*$\", f'{variable_name} = \"{new_version}\"'\n+ )\n+\n+\n @cli.command()\n @click.argument(\"new_version\")\n def bump_version(new_version: str) -> None:\n@@ -75,6 +76,19 @@\n new_version,\n )\n \n+ if Version(new_version).is_prerelease:\n+ replace_pattern(\n+ base_dir / \"pyproject.toml\",\n+ r'\"cryptography_vectors(==.*?)?\"',\n+ '\"cryptography_vectors\"',\n+ )\n+ else:\n+ replace_pattern(\n+ base_dir / \"pyproject.toml\",\n+ r'\"cryptography_vectors(==.*?)?\"',\n+ f'\"cryptography_vectors=={new_version}\"',\n+ )\n+\n \n if __name__ == \"__main__\":\n cli()\n", "issue": "cryptography_vectors is missing from `test` extra\nThere is missing `cryptography_vectors` in the `test` extra in `pyproject.toml` so when tests are run from sdist with the `test` extra installed then testing fails and `cryptography_vectors` needs to be installed manually to make tests passing.\n", "before_files": [{"content": "# This file is dual licensed under the terms of the Apache License, Version\n# 2.0, and the BSD License. See the LICENSE file in the root of this repository\n# for complete details.\n\nimport pathlib\nimport re\nimport subprocess\n\nimport click\nimport tomllib\nfrom packaging.version import Version\n\n\ndef run(*args: str) -> None:\n print(f\"[running] {list(args)}\")\n subprocess.check_call(list(args))\n\n\n@click.group()\ndef cli():\n pass\n\n\n@cli.command()\ndef release() -> None:\n base_dir = pathlib.Path(__file__).parent\n with (base_dir / \"pyproject.toml\").open(\"rb\") as f:\n pyproject = tomllib.load(f)\n version = pyproject[\"project\"][\"version\"]\n\n if Version(version).is_prerelease:\n raise RuntimeError(\n f\"Can't release, pyproject.toml version is pre-release: {version}\"\n )\n\n # Tag and push the tag (this will trigger the wheel builder in Actions)\n run(\"git\", \"tag\", \"-s\", version, \"-m\", f\"{version} release\")\n run(\"git\", \"push\", \"--tags\", \"git@github.com:pyca/cryptography.git\")\n\n\ndef replace_version(\n p: pathlib.Path, variable_name: str, new_version: str\n) -> None:\n content = p.read_text()\n\n pattern = rf\"^{variable_name}\\s*=\\s*.*$\"\n match = re.search(pattern, content, re.MULTILINE)\n assert match is not None\n\n start, end = match.span()\n new_content = (\n content[:start] + f'{variable_name} = \"{new_version}\"' + content[end:]\n )\n\n p.write_text(new_content)\n\n\n@cli.command()\n@click.argument(\"new_version\")\ndef bump_version(new_version: str) -> None:\n base_dir = pathlib.Path(__file__).parent\n\n replace_version(base_dir / \"pyproject.toml\", \"version\", new_version)\n replace_version(\n base_dir / \"src/cryptography/__about__.py\", \"__version__\", new_version\n )\n replace_version(\n base_dir / \"vectors/pyproject.toml\",\n \"version\",\n new_version,\n )\n replace_version(\n base_dir / \"vectors/cryptography_vectors/__about__.py\",\n \"__version__\",\n new_version,\n )\n\n\nif __name__ == \"__main__\":\n cli()\n", "path": "release.py"}], "after_files": [{"content": "# This file is dual licensed under the terms of the Apache License, Version\n# 2.0, and the BSD License. See the LICENSE file in the root of this repository\n# for complete details.\n\nimport pathlib\nimport re\nimport subprocess\n\nimport click\nimport tomllib\nfrom packaging.version import Version\n\n\ndef run(*args: str) -> None:\n print(f\"[running] {list(args)}\")\n subprocess.check_call(list(args))\n\n\n@click.group()\ndef cli():\n pass\n\n\n@cli.command()\ndef release() -> None:\n base_dir = pathlib.Path(__file__).parent\n with (base_dir / \"pyproject.toml\").open(\"rb\") as f:\n pyproject = tomllib.load(f)\n version = pyproject[\"project\"][\"version\"]\n\n if Version(version).is_prerelease:\n raise RuntimeError(\n f\"Can't release, pyproject.toml version is pre-release: {version}\"\n )\n\n # Tag and push the tag (this will trigger the wheel builder in Actions)\n run(\"git\", \"tag\", \"-s\", version, \"-m\", f\"{version} release\")\n run(\"git\", \"push\", \"--tags\", \"git@github.com:pyca/cryptography.git\")\n\n\ndef replace_pattern(p: pathlib.Path, pattern: str, replacement: str) -> None:\n content = p.read_text()\n match = re.search(pattern, content, re.MULTILINE)\n assert match is not None\n\n start, end = match.span()\n new_content = content[:start] + replacement + content[end:]\n p.write_text(new_content)\n\n\ndef replace_version(\n p: pathlib.Path, variable_name: str, new_version: str\n) -> None:\n replace_pattern(\n p, rf\"^{variable_name}\\s*=\\s*.*$\", f'{variable_name} = \"{new_version}\"'\n )\n\n\n@cli.command()\n@click.argument(\"new_version\")\ndef bump_version(new_version: str) -> None:\n base_dir = pathlib.Path(__file__).parent\n\n replace_version(base_dir / \"pyproject.toml\", \"version\", new_version)\n replace_version(\n base_dir / \"src/cryptography/__about__.py\", \"__version__\", new_version\n )\n replace_version(\n base_dir / \"vectors/pyproject.toml\",\n \"version\",\n new_version,\n )\n replace_version(\n base_dir / \"vectors/cryptography_vectors/__about__.py\",\n \"__version__\",\n new_version,\n )\n\n if Version(new_version).is_prerelease:\n replace_pattern(\n base_dir / \"pyproject.toml\",\n r'\"cryptography_vectors(==.*?)?\"',\n '\"cryptography_vectors\"',\n )\n else:\n replace_pattern(\n base_dir / \"pyproject.toml\",\n r'\"cryptography_vectors(==.*?)?\"',\n f'\"cryptography_vectors=={new_version}\"',\n )\n\n\nif __name__ == \"__main__\":\n cli()\n", "path": "release.py"}]}
| 1,009 | 429 |
gh_patches_debug_58135
|
rasdani/github-patches
|
git_diff
|
liqd__a4-meinberlin-1579
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Untranslated "Plans"

Maybe we just remove it since there is already a header
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `meinberlin/apps/plans/forms.py`
Content:
```
1 from django import forms
2 from django.conf import settings
3 from django.db.models import Q
4 from django.utils.translation import ugettext_lazy as _
5
6 from adhocracy4.dashboard.components.forms import ProjectDashboardForm
7 from adhocracy4.maps import widgets as maps_widgets
8 from adhocracy4.projects import models as project_models
9 from meinberlin.apps.contrib import widgets as contrib_widgets
10
11 from . import models
12
13
14 def get_theme_options():
15 return models.Plan.objects\
16 .filter(~Q(theme=''))\
17 .order_by('theme')\
18 .values_list('theme', flat=True)\
19 .distinct()
20
21
22 class PlanForm(forms.ModelForm):
23
24 class Meta:
25 model = models.Plan
26 fields = [
27 'title',
28 'description_image',
29 'contact',
30 'point',
31 'point_label',
32 'district',
33 'cost',
34 'description',
35 'theme',
36 'status',
37 'participation']
38 widgets = {
39 'point': maps_widgets.MapChoosePointWidget(
40 polygon=settings.BERLIN_POLYGON),
41 'theme': contrib_widgets.TextWithDatalistWidget(attrs={
42 'options': get_theme_options
43 })
44 }
45 error_messages = {
46 'point': {
47 'required': _('Please locate the plan on the map.')
48 }
49 }
50
51 def __init__(self, *args, **kwargs):
52 super().__init__(*args, **kwargs)
53 self.fields['district'].empty_label = _('City wide')
54
55
56 class CustomMultipleChoiceField(forms.ModelMultipleChoiceField):
57
58 widget = forms.RadioSelect
59
60 def clean(self, value):
61 if value is None:
62 return super().clean([])
63 return super().clean([value])
64
65
66 class ProjectPlansDashboardForm(ProjectDashboardForm):
67 plans = CustomMultipleChoiceField(queryset=None)
68
69 class Meta:
70 model = project_models.Project
71 fields = ['plans']
72 required_for_project_publish = ['plans']
73
74 def save(self, commit=False):
75 plans = self.cleaned_data['plans']
76 self.instance.plans.set(plans)
77
78 def __init__(self, *args, **kwargs):
79 super().__init__(*args, **kwargs)
80 self.initial['plans'] = self.instance.plans.all()
81 self.fields['plans'
82 ].queryset = self.instance.organisation.plan_set.all()
83
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/meinberlin/apps/plans/forms.py b/meinberlin/apps/plans/forms.py
--- a/meinberlin/apps/plans/forms.py
+++ b/meinberlin/apps/plans/forms.py
@@ -64,7 +64,8 @@
class ProjectPlansDashboardForm(ProjectDashboardForm):
- plans = CustomMultipleChoiceField(queryset=None)
+ plans = CustomMultipleChoiceField(queryset=None,
+ label=_('Plans'))
class Meta:
model = project_models.Project
|
{"golden_diff": "diff --git a/meinberlin/apps/plans/forms.py b/meinberlin/apps/plans/forms.py\n--- a/meinberlin/apps/plans/forms.py\n+++ b/meinberlin/apps/plans/forms.py\n@@ -64,7 +64,8 @@\n \n \n class ProjectPlansDashboardForm(ProjectDashboardForm):\n- plans = CustomMultipleChoiceField(queryset=None)\n+ plans = CustomMultipleChoiceField(queryset=None,\n+ label=_('Plans'))\n \n class Meta:\n model = project_models.Project\n", "issue": "Untranslated \"Plans\"\n\r\n\r\n\r\nMaybe we just remove it since there is already a header\n", "before_files": [{"content": "from django import forms\nfrom django.conf import settings\nfrom django.db.models import Q\nfrom django.utils.translation import ugettext_lazy as _\n\nfrom adhocracy4.dashboard.components.forms import ProjectDashboardForm\nfrom adhocracy4.maps import widgets as maps_widgets\nfrom adhocracy4.projects import models as project_models\nfrom meinberlin.apps.contrib import widgets as contrib_widgets\n\nfrom . import models\n\n\ndef get_theme_options():\n return models.Plan.objects\\\n .filter(~Q(theme=''))\\\n .order_by('theme')\\\n .values_list('theme', flat=True)\\\n .distinct()\n\n\nclass PlanForm(forms.ModelForm):\n\n class Meta:\n model = models.Plan\n fields = [\n 'title',\n 'description_image',\n 'contact',\n 'point',\n 'point_label',\n 'district',\n 'cost',\n 'description',\n 'theme',\n 'status',\n 'participation']\n widgets = {\n 'point': maps_widgets.MapChoosePointWidget(\n polygon=settings.BERLIN_POLYGON),\n 'theme': contrib_widgets.TextWithDatalistWidget(attrs={\n 'options': get_theme_options\n })\n }\n error_messages = {\n 'point': {\n 'required': _('Please locate the plan on the map.')\n }\n }\n\n def __init__(self, *args, **kwargs):\n super().__init__(*args, **kwargs)\n self.fields['district'].empty_label = _('City wide')\n\n\nclass CustomMultipleChoiceField(forms.ModelMultipleChoiceField):\n\n widget = forms.RadioSelect\n\n def clean(self, value):\n if value is None:\n return super().clean([])\n return super().clean([value])\n\n\nclass ProjectPlansDashboardForm(ProjectDashboardForm):\n plans = CustomMultipleChoiceField(queryset=None)\n\n class Meta:\n model = project_models.Project\n fields = ['plans']\n required_for_project_publish = ['plans']\n\n def save(self, commit=False):\n plans = self.cleaned_data['plans']\n self.instance.plans.set(plans)\n\n def __init__(self, *args, **kwargs):\n super().__init__(*args, **kwargs)\n self.initial['plans'] = self.instance.plans.all()\n self.fields['plans'\n ].queryset = self.instance.organisation.plan_set.all()\n", "path": "meinberlin/apps/plans/forms.py"}], "after_files": [{"content": "from django import forms\nfrom django.conf import settings\nfrom django.db.models import Q\nfrom django.utils.translation import ugettext_lazy as _\n\nfrom adhocracy4.dashboard.components.forms import ProjectDashboardForm\nfrom adhocracy4.maps import widgets as maps_widgets\nfrom adhocracy4.projects import models as project_models\nfrom meinberlin.apps.contrib import widgets as contrib_widgets\n\nfrom . import models\n\n\ndef get_theme_options():\n return models.Plan.objects\\\n .filter(~Q(theme=''))\\\n .order_by('theme')\\\n .values_list('theme', flat=True)\\\n .distinct()\n\n\nclass PlanForm(forms.ModelForm):\n\n class Meta:\n model = models.Plan\n fields = [\n 'title',\n 'description_image',\n 'contact',\n 'point',\n 'point_label',\n 'district',\n 'cost',\n 'description',\n 'theme',\n 'status',\n 'participation']\n widgets = {\n 'point': maps_widgets.MapChoosePointWidget(\n polygon=settings.BERLIN_POLYGON),\n 'theme': contrib_widgets.TextWithDatalistWidget(attrs={\n 'options': get_theme_options\n })\n }\n error_messages = {\n 'point': {\n 'required': _('Please locate the plan on the map.')\n }\n }\n\n def __init__(self, *args, **kwargs):\n super().__init__(*args, **kwargs)\n self.fields['district'].empty_label = _('City wide')\n\n\nclass CustomMultipleChoiceField(forms.ModelMultipleChoiceField):\n\n widget = forms.RadioSelect\n\n def clean(self, value):\n if value is None:\n return super().clean([])\n return super().clean([value])\n\n\nclass ProjectPlansDashboardForm(ProjectDashboardForm):\n plans = CustomMultipleChoiceField(queryset=None,\n label=_('Plans'))\n\n class Meta:\n model = project_models.Project\n fields = ['plans']\n required_for_project_publish = ['plans']\n\n def save(self, commit=False):\n plans = self.cleaned_data['plans']\n self.instance.plans.set(plans)\n\n def __init__(self, *args, **kwargs):\n super().__init__(*args, **kwargs)\n self.initial['plans'] = self.instance.plans.all()\n self.fields['plans'\n ].queryset = self.instance.organisation.plan_set.all()\n", "path": "meinberlin/apps/plans/forms.py"}]}
| 1,019 | 112 |
gh_patches_debug_16236
|
rasdani/github-patches
|
git_diff
|
catalyst-team__catalyst-151
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[fix] fix registry exception for duplicate factory
@TODO:
- return exception for duplicate factory
- check that all registries usage are correct: both in `__init__.py` and `model.py` for example
[fix] fix registry exception for duplicate factory
@TODO:
- return exception for duplicate factory
- check that all registries usage are correct: both in `__init__.py` and `model.py` for example
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `catalyst/utils/registry.py`
Content:
```
1 import warnings
2 import inspect
3 from typing import Dict, Callable, Any, Union, Type, Mapping, Tuple, List, \
4 Optional, Iterator
5
6 Factory = Union[Type, Callable[..., Any]]
7 LateAddCallbak = Callable[["Registry"], None]
8 MetaFactory = Callable[[Factory, Tuple, Mapping], Any]
9
10
11 def _default_meta_factory(factory: Factory, args: Tuple, kwargs: Mapping):
12 return factory(*args, **kwargs)
13
14
15 class RegistryException(Exception):
16 def __init__(self, message):
17 super().__init__(message)
18
19
20 class Registry(Mapping):
21 """
22 Universal class allowing to add and access various factories by name
23 """
24
25 def __init__(
26 self,
27 default_name_key: str,
28 default_meta_factory: MetaFactory = _default_meta_factory
29 ):
30 """
31 :param default_name_key: Default key containing factory name when
32 creating from config
33 :param default_meta_factory: default object that calls factory.
34 Optional. Default just calls factory.
35 """
36 self.meta_factory = default_meta_factory
37 self._name_key = default_name_key
38 self._factories: Dict[str, Factory] = {}
39 self._late_add_callbacks: List[LateAddCallbak] = []
40
41 @staticmethod
42 def _get_factory_name(f, provided_name=None) -> str:
43 if not provided_name:
44 provided_name = getattr(f, "__name__", None)
45 if not provided_name:
46 raise RegistryException(
47 f"Factory {f} has no __name__ and no "
48 f"name was provided"
49 )
50 if provided_name == "<lambda>":
51 raise RegistryException(
52 "Name for lambda factories must be provided"
53 )
54 return provided_name
55
56 def _do_late_add(self):
57 if self._late_add_callbacks:
58 for cb in self._late_add_callbacks:
59 cb(self)
60 self._late_add_callbacks = []
61
62 def add(
63 self,
64 factory: Factory = None,
65 *factories: Factory,
66 name: str = None,
67 **named_factories: Factory
68 ) -> Factory:
69 """
70 Adds factory to registry with it's __name__ attribute or provided
71 name.
72 Signature is flexible.
73
74 :param factory: Factory instance
75 :param factories: More instances
76 :param name: Provided name for first instance. Use only when pass
77 single instance.
78 :param named_factories: Factory and their names as kwargs
79
80 :return: First factory passed
81 """
82 if len(factories) > 0 and name is not None:
83 raise RegistryException(
84 "Multiple factories with single name are not allowed"
85 )
86
87 if factory is not None:
88 named_factories[self._get_factory_name(factory, name)] = factory
89
90 if len(factories) > 0:
91 new = {self._get_factory_name(f): f for f in factories}
92 named_factories.update(new)
93
94 if len(named_factories) == 0:
95 warnings.warn("No factories were provided!")
96
97 # @TODO: return warning
98 # for name in named_factories:
99 # if name in self._factories:
100 # raise RegistryException(
101 # f"Factory with name '{name}' is already present"
102 # )
103
104 self._factories.update(named_factories)
105
106 return factory
107
108 def late_add(self, cb: LateAddCallbak):
109 """
110 Allows to prevent cycle imports by delaying some imports till next
111 registry query
112
113 :param cb: Callback receives registry and must call it's methods to
114 register factories
115 :return:
116 """
117 self._late_add_callbacks.append(cb)
118
119 def add_from_module(self, module) -> None:
120 """
121 Adds all factories present in module.
122 If __all__ attribute is present, takes ony what mentioned in it
123
124 :param module: module to scan
125 :return: None
126 """
127 factories = {
128 k: v
129 for k, v in module.__dict__.items()
130 if inspect.isclass(v) or inspect.isfunction(v)
131 }
132
133 # Filter by __all__ if present
134 names_to_add = getattr(module, "__all__", list(factories.keys()))
135
136 to_add = {name: factories[name] for name in names_to_add}
137
138 self.add(**to_add)
139
140 def get(self, name: str) -> Optional[Factory]:
141 """
142 Retrieves factory, without creating any objects with it
143 or raises error
144
145 :param name: factory name
146 :returns Factory
147 """
148
149 self._do_late_add()
150
151 if name is None:
152 return None
153
154 res = self._factories.get(name, None)
155
156 if not res:
157 raise RegistryException(
158 f"No factory with name '{name}' was registered"
159 )
160
161 return res
162
163 def get_if_str(self, obj: Union[str, Factory]):
164 if type(obj) is str:
165 return self.get(obj)
166 return obj
167
168 def get_instance(self, name: str, *args, meta_factory=None, **kwargs):
169 """
170 Creates instance by calling specified factory
171 with instantiate_fn
172 :param name: factory name
173 :param meta_factory: Function that calls factory the right way.
174 If not provided, default is used
175 :param args: args to pass to the factory
176 :param kwargs: kwargs to pass to the factory
177 :return: created instance
178 """
179 meta_factory = meta_factory or self.meta_factory
180 f = self.get(name)
181
182 try:
183 if hasattr(f, "create_from_params"):
184 return f.create_from_params(*args, **kwargs)
185 return meta_factory(f, args, kwargs)
186 except Exception as e:
187 raise RegistryException(
188 f"Factory '{name}' call failed: args={args} kwargs={kwargs}"
189 ) from e
190
191 def get_from_params(
192 self, *, meta_factory=None, **kwargs
193 ) -> Union[Any, Tuple[Any, Mapping[str, Any]]]:
194 """
195 Creates instance based in configuration dict with instantiation_fn.
196 If config[name_key] is None, None is returned.
197
198 :param meta_factory: Function that calls factory the right way.
199 If not provided, default is used.
200 :param kwargs: additional kwargs for factory
201 :return: result of calling instantiate_fn(factory, **config)
202 """
203
204 name = kwargs.pop(self._name_key, None)
205 if name:
206 return self.get_instance(name, meta_factory=meta_factory, **kwargs)
207
208 def all(self) -> List[str]:
209 """
210 :return: list of names of registered items
211 """
212 self._do_late_add()
213 result = list(self._factories.keys())
214
215 return result
216
217 def len(self) -> int:
218 """
219 :return: length of registered items
220 """
221 return len(self._factories)
222
223 def __str__(self) -> str:
224 return self.all().__str__()
225
226 def __repr__(self) -> str:
227 return self.all().__str__()
228
229 # mapping methods
230 def __len__(self) -> int:
231 self._do_late_add()
232 return self.len()
233
234 def __getitem__(self, name: str) -> Optional[Factory]:
235 return self.get(name)
236
237 def __iter__(self) -> Iterator[str]:
238 self._do_late_add()
239 return self._factories.__iter__()
240
241 def __contains__(self, name: str):
242 self._do_late_add()
243 return self._factories.__contains__(name)
244
245 def __setitem__(self, name: str, factory: Factory) -> None:
246 self.add(factory, name=name)
247
248
249 __all__ = ["Registry", "RegistryException"]
250
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/catalyst/utils/registry.py b/catalyst/utils/registry.py
--- a/catalyst/utils/registry.py
+++ b/catalyst/utils/registry.py
@@ -94,12 +94,15 @@
if len(named_factories) == 0:
warnings.warn("No factories were provided!")
- # @TODO: return warning
- # for name in named_factories:
- # if name in self._factories:
- # raise RegistryException(
- # f"Factory with name '{name}' is already present"
- # )
+ for name, f in named_factories.items():
+ # self._factories[name] != f is a workaround for
+ # https://github.com/catalyst-team/catalyst/issues/135
+ if name in self._factories and self._factories[name] != f:
+ raise RegistryException(
+ f"Factory with name '{name}' is already present\n"
+ f"Already registered: '{self._factories[name]}'\n"
+ f"New: '{f}'"
+ )
self._factories.update(named_factories)
|
{"golden_diff": "diff --git a/catalyst/utils/registry.py b/catalyst/utils/registry.py\n--- a/catalyst/utils/registry.py\n+++ b/catalyst/utils/registry.py\n@@ -94,12 +94,15 @@\n if len(named_factories) == 0:\n warnings.warn(\"No factories were provided!\")\n \n- # @TODO: return warning\n- # for name in named_factories:\n- # if name in self._factories:\n- # raise RegistryException(\n- # f\"Factory with name '{name}' is already present\"\n- # )\n+ for name, f in named_factories.items():\n+ # self._factories[name] != f is a workaround for\n+ # https://github.com/catalyst-team/catalyst/issues/135\n+ if name in self._factories and self._factories[name] != f:\n+ raise RegistryException(\n+ f\"Factory with name '{name}' is already present\\n\"\n+ f\"Already registered: '{self._factories[name]}'\\n\"\n+ f\"New: '{f}'\"\n+ )\n \n self._factories.update(named_factories)\n", "issue": "[fix] fix registry exception for duplicate factory\n@TODO:\r\n- return exception for duplicate factory\r\n- check that all registries usage are correct: both in `__init__.py` and `model.py` for example\n[fix] fix registry exception for duplicate factory\n@TODO:\r\n- return exception for duplicate factory\r\n- check that all registries usage are correct: both in `__init__.py` and `model.py` for example\n", "before_files": [{"content": "import warnings\nimport inspect\nfrom typing import Dict, Callable, Any, Union, Type, Mapping, Tuple, List, \\\n Optional, Iterator\n\nFactory = Union[Type, Callable[..., Any]]\nLateAddCallbak = Callable[[\"Registry\"], None]\nMetaFactory = Callable[[Factory, Tuple, Mapping], Any]\n\n\ndef _default_meta_factory(factory: Factory, args: Tuple, kwargs: Mapping):\n return factory(*args, **kwargs)\n\n\nclass RegistryException(Exception):\n def __init__(self, message):\n super().__init__(message)\n\n\nclass Registry(Mapping):\n \"\"\"\n Universal class allowing to add and access various factories by name\n \"\"\"\n\n def __init__(\n self,\n default_name_key: str,\n default_meta_factory: MetaFactory = _default_meta_factory\n ):\n \"\"\"\n :param default_name_key: Default key containing factory name when\n creating from config\n :param default_meta_factory: default object that calls factory.\n Optional. Default just calls factory.\n \"\"\"\n self.meta_factory = default_meta_factory\n self._name_key = default_name_key\n self._factories: Dict[str, Factory] = {}\n self._late_add_callbacks: List[LateAddCallbak] = []\n\n @staticmethod\n def _get_factory_name(f, provided_name=None) -> str:\n if not provided_name:\n provided_name = getattr(f, \"__name__\", None)\n if not provided_name:\n raise RegistryException(\n f\"Factory {f} has no __name__ and no \"\n f\"name was provided\"\n )\n if provided_name == \"<lambda>\":\n raise RegistryException(\n \"Name for lambda factories must be provided\"\n )\n return provided_name\n\n def _do_late_add(self):\n if self._late_add_callbacks:\n for cb in self._late_add_callbacks:\n cb(self)\n self._late_add_callbacks = []\n\n def add(\n self,\n factory: Factory = None,\n *factories: Factory,\n name: str = None,\n **named_factories: Factory\n ) -> Factory:\n \"\"\"\n Adds factory to registry with it's __name__ attribute or provided\n name.\n Signature is flexible.\n\n :param factory: Factory instance\n :param factories: More instances\n :param name: Provided name for first instance. Use only when pass\n single instance.\n :param named_factories: Factory and their names as kwargs\n\n :return: First factory passed\n \"\"\"\n if len(factories) > 0 and name is not None:\n raise RegistryException(\n \"Multiple factories with single name are not allowed\"\n )\n\n if factory is not None:\n named_factories[self._get_factory_name(factory, name)] = factory\n\n if len(factories) > 0:\n new = {self._get_factory_name(f): f for f in factories}\n named_factories.update(new)\n\n if len(named_factories) == 0:\n warnings.warn(\"No factories were provided!\")\n\n # @TODO: return warning\n # for name in named_factories:\n # if name in self._factories:\n # raise RegistryException(\n # f\"Factory with name '{name}' is already present\"\n # )\n\n self._factories.update(named_factories)\n\n return factory\n\n def late_add(self, cb: LateAddCallbak):\n \"\"\"\n Allows to prevent cycle imports by delaying some imports till next\n registry query\n\n :param cb: Callback receives registry and must call it's methods to\n register factories\n :return:\n \"\"\"\n self._late_add_callbacks.append(cb)\n\n def add_from_module(self, module) -> None:\n \"\"\"\n Adds all factories present in module.\n If __all__ attribute is present, takes ony what mentioned in it\n\n :param module: module to scan\n :return: None\n \"\"\"\n factories = {\n k: v\n for k, v in module.__dict__.items()\n if inspect.isclass(v) or inspect.isfunction(v)\n }\n\n # Filter by __all__ if present\n names_to_add = getattr(module, \"__all__\", list(factories.keys()))\n\n to_add = {name: factories[name] for name in names_to_add}\n\n self.add(**to_add)\n\n def get(self, name: str) -> Optional[Factory]:\n \"\"\"\n Retrieves factory, without creating any objects with it\n or raises error\n\n :param name: factory name\n :returns Factory\n \"\"\"\n\n self._do_late_add()\n\n if name is None:\n return None\n\n res = self._factories.get(name, None)\n\n if not res:\n raise RegistryException(\n f\"No factory with name '{name}' was registered\"\n )\n\n return res\n\n def get_if_str(self, obj: Union[str, Factory]):\n if type(obj) is str:\n return self.get(obj)\n return obj\n\n def get_instance(self, name: str, *args, meta_factory=None, **kwargs):\n \"\"\"\n Creates instance by calling specified factory\n with instantiate_fn\n :param name: factory name\n :param meta_factory: Function that calls factory the right way.\n If not provided, default is used\n :param args: args to pass to the factory\n :param kwargs: kwargs to pass to the factory\n :return: created instance\n \"\"\"\n meta_factory = meta_factory or self.meta_factory\n f = self.get(name)\n\n try:\n if hasattr(f, \"create_from_params\"):\n return f.create_from_params(*args, **kwargs)\n return meta_factory(f, args, kwargs)\n except Exception as e:\n raise RegistryException(\n f\"Factory '{name}' call failed: args={args} kwargs={kwargs}\"\n ) from e\n\n def get_from_params(\n self, *, meta_factory=None, **kwargs\n ) -> Union[Any, Tuple[Any, Mapping[str, Any]]]:\n \"\"\"\n Creates instance based in configuration dict with instantiation_fn.\n If config[name_key] is None, None is returned.\n\n :param meta_factory: Function that calls factory the right way.\n If not provided, default is used.\n :param kwargs: additional kwargs for factory\n :return: result of calling instantiate_fn(factory, **config)\n \"\"\"\n\n name = kwargs.pop(self._name_key, None)\n if name:\n return self.get_instance(name, meta_factory=meta_factory, **kwargs)\n\n def all(self) -> List[str]:\n \"\"\"\n :return: list of names of registered items\n \"\"\"\n self._do_late_add()\n result = list(self._factories.keys())\n\n return result\n\n def len(self) -> int:\n \"\"\"\n :return: length of registered items\n \"\"\"\n return len(self._factories)\n\n def __str__(self) -> str:\n return self.all().__str__()\n\n def __repr__(self) -> str:\n return self.all().__str__()\n\n # mapping methods\n def __len__(self) -> int:\n self._do_late_add()\n return self.len()\n\n def __getitem__(self, name: str) -> Optional[Factory]:\n return self.get(name)\n\n def __iter__(self) -> Iterator[str]:\n self._do_late_add()\n return self._factories.__iter__()\n\n def __contains__(self, name: str):\n self._do_late_add()\n return self._factories.__contains__(name)\n\n def __setitem__(self, name: str, factory: Factory) -> None:\n self.add(factory, name=name)\n\n\n__all__ = [\"Registry\", \"RegistryException\"]\n", "path": "catalyst/utils/registry.py"}], "after_files": [{"content": "import warnings\nimport inspect\nfrom typing import Dict, Callable, Any, Union, Type, Mapping, Tuple, List, \\\n Optional, Iterator\n\nFactory = Union[Type, Callable[..., Any]]\nLateAddCallbak = Callable[[\"Registry\"], None]\nMetaFactory = Callable[[Factory, Tuple, Mapping], Any]\n\n\ndef _default_meta_factory(factory: Factory, args: Tuple, kwargs: Mapping):\n return factory(*args, **kwargs)\n\n\nclass RegistryException(Exception):\n def __init__(self, message):\n super().__init__(message)\n\n\nclass Registry(Mapping):\n \"\"\"\n Universal class allowing to add and access various factories by name\n \"\"\"\n\n def __init__(\n self,\n default_name_key: str,\n default_meta_factory: MetaFactory = _default_meta_factory\n ):\n \"\"\"\n :param default_name_key: Default key containing factory name when\n creating from config\n :param default_meta_factory: default object that calls factory.\n Optional. Default just calls factory.\n \"\"\"\n self.meta_factory = default_meta_factory\n self._name_key = default_name_key\n self._factories: Dict[str, Factory] = {}\n self._late_add_callbacks: List[LateAddCallbak] = []\n\n @staticmethod\n def _get_factory_name(f, provided_name=None) -> str:\n if not provided_name:\n provided_name = getattr(f, \"__name__\", None)\n if not provided_name:\n raise RegistryException(\n f\"Factory {f} has no __name__ and no \"\n f\"name was provided\"\n )\n if provided_name == \"<lambda>\":\n raise RegistryException(\n \"Name for lambda factories must be provided\"\n )\n return provided_name\n\n def _do_late_add(self):\n if self._late_add_callbacks:\n for cb in self._late_add_callbacks:\n cb(self)\n self._late_add_callbacks = []\n\n def add(\n self,\n factory: Factory = None,\n *factories: Factory,\n name: str = None,\n **named_factories: Factory\n ) -> Factory:\n \"\"\"\n Adds factory to registry with it's __name__ attribute or provided\n name.\n Signature is flexible.\n\n :param factory: Factory instance\n :param factories: More instances\n :param name: Provided name for first instance. Use only when pass\n single instance.\n :param named_factories: Factory and their names as kwargs\n\n :return: First factory passed\n \"\"\"\n if len(factories) > 0 and name is not None:\n raise RegistryException(\n \"Multiple factories with single name are not allowed\"\n )\n\n if factory is not None:\n named_factories[self._get_factory_name(factory, name)] = factory\n\n if len(factories) > 0:\n new = {self._get_factory_name(f): f for f in factories}\n named_factories.update(new)\n\n if len(named_factories) == 0:\n warnings.warn(\"No factories were provided!\")\n\n for name, f in named_factories.items():\n # self._factories[name] != f is a workaround for\n # https://github.com/catalyst-team/catalyst/issues/135\n if name in self._factories and self._factories[name] != f:\n raise RegistryException(\n f\"Factory with name '{name}' is already present\\n\"\n f\"Already registered: '{self._factories[name]}'\\n\"\n f\"New: '{f}'\"\n )\n\n self._factories.update(named_factories)\n\n return factory\n\n def late_add(self, cb: LateAddCallbak):\n \"\"\"\n Allows to prevent cycle imports by delaying some imports till next\n registry query\n\n :param cb: Callback receives registry and must call it's methods to\n register factories\n :return:\n \"\"\"\n self._late_add_callbacks.append(cb)\n\n def add_from_module(self, module) -> None:\n \"\"\"\n Adds all factories present in module.\n If __all__ attribute is present, takes ony what mentioned in it\n\n :param module: module to scan\n :return: None\n \"\"\"\n factories = {\n k: v\n for k, v in module.__dict__.items()\n if inspect.isclass(v) or inspect.isfunction(v)\n }\n\n # Filter by __all__ if present\n names_to_add = getattr(module, \"__all__\", list(factories.keys()))\n\n to_add = {name: factories[name] for name in names_to_add}\n\n self.add(**to_add)\n\n def get(self, name: str) -> Optional[Factory]:\n \"\"\"\n Retrieves factory, without creating any objects with it\n or raises error\n\n :param name: factory name\n :returns Factory\n \"\"\"\n\n self._do_late_add()\n\n if name is None:\n return None\n\n res = self._factories.get(name, None)\n\n if not res:\n raise RegistryException(\n f\"No factory with name '{name}' was registered\"\n )\n\n return res\n\n def get_if_str(self, obj: Union[str, Factory]):\n if type(obj) is str:\n return self.get(obj)\n return obj\n\n def get_instance(self, name: str, *args, meta_factory=None, **kwargs):\n \"\"\"\n Creates instance by calling specified factory\n with instantiate_fn\n :param name: factory name\n :param meta_factory: Function that calls factory the right way.\n If not provided, default is used\n :param args: args to pass to the factory\n :param kwargs: kwargs to pass to the factory\n :return: created instance\n \"\"\"\n meta_factory = meta_factory or self.meta_factory\n f = self.get(name)\n\n try:\n if hasattr(f, \"create_from_params\"):\n return f.create_from_params(*args, **kwargs)\n return meta_factory(f, args, kwargs)\n except Exception as e:\n raise RegistryException(\n f\"Factory '{name}' call failed: args={args} kwargs={kwargs}\"\n ) from e\n\n def get_from_params(\n self, *, meta_factory=None, **kwargs\n ) -> Union[Any, Tuple[Any, Mapping[str, Any]]]:\n \"\"\"\n Creates instance based in configuration dict with instantiation_fn.\n If config[name_key] is None, None is returned.\n\n :param meta_factory: Function that calls factory the right way.\n If not provided, default is used.\n :param kwargs: additional kwargs for factory\n :return: result of calling instantiate_fn(factory, **config)\n \"\"\"\n\n name = kwargs.pop(self._name_key, None)\n if name:\n return self.get_instance(name, meta_factory=meta_factory, **kwargs)\n\n def all(self) -> List[str]:\n \"\"\"\n :return: list of names of registered items\n \"\"\"\n self._do_late_add()\n result = list(self._factories.keys())\n\n return result\n\n def len(self) -> int:\n \"\"\"\n :return: length of registered items\n \"\"\"\n return len(self._factories)\n\n def __str__(self) -> str:\n return self.all().__str__()\n\n def __repr__(self) -> str:\n return self.all().__str__()\n\n # mapping methods\n def __len__(self) -> int:\n self._do_late_add()\n return self.len()\n\n def __getitem__(self, name: str) -> Optional[Factory]:\n return self.get(name)\n\n def __iter__(self) -> Iterator[str]:\n self._do_late_add()\n return self._factories.__iter__()\n\n def __contains__(self, name: str):\n self._do_late_add()\n return self._factories.__contains__(name)\n\n def __setitem__(self, name: str, factory: Factory) -> None:\n self.add(factory, name=name)\n\n\n__all__ = [\"Registry\", \"RegistryException\"]\n", "path": "catalyst/utils/registry.py"}]}
| 2,676 | 265 |
gh_patches_debug_36021
|
rasdani/github-patches
|
git_diff
|
bridgecrewio__checkov-5642
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Resources that contain scalar fields named "dynamic" not parsing properly
**Describe the issue**
Running checkov against a Terraform resource which includes a scalar field called "dynamic" which _is not_ an actual HCL `dynamic` block causes a crash.
**Examples**
Using the below resource definition:
```hcl
resource "elasticsearch_index_template" "logstash" {
body = jsonencode(
{
index_patterns = [
"test-*",
]
mappings = {
properties = {
foo = {
# Note that "dynamic" here is not HCL dynamic block, but rather an elasticsearch property declared using HCL
dynamic = true
properties = {
bar = {
type = "string"
}
}
}
}
}
}
)
name = "test"
}
```
We found this bug while attempting to write a custom check to flag up use of the `provisioner` block which, since it is a [Terraform configuration](https://developer.hashicorp.com/terraform/language/resources/provisioners/syntax) available to all `resource` blocks, requires a custom check where `supported_resources = ("*")`:
```python
from typing import Any
from checkov.terraform.checks.resource.base_resource_check import BaseResourceCheck
from checkov.common.models.enums import CheckResult, CheckCategories
class DynamicTest(BaseResourceCheck):
def __init__(self) -> None:
name = "Testing crash where dynamic is a scalar value"
id = "CKV_TEST_1"
supported_resources = ("*",)
categories = (CheckCategories.GENERAL_SECURITY,)
super().__init__(
name=name,
id=id,
categories=categories,
supported_resources=supported_resources,
)
def scan_resource_conf(self, conf: dict[str, list[Any]]) -> CheckResult:
return CheckResult.PASSED
check = DynamicTest()
```
We can then run checkov:
```shell
$ checkov -f test.tf --external-checks-dir=./checks
```
**Exception Trace**
The relevant part of the trace caused by this issue is below:
```
2023-10-13 10:22:17,018 [MainThread ] [DEBUG] Running check: Testing crash where dynamic is a scalar value on file /test.tf
2023-10-13 10:22:17,019 [MainThread ] [ERROR] Failed to run check CKV_TEST_1 on /test.tf:elasticsearch_index_template.test
Traceback (most recent call last):
File "/Users/rob.westbrook/.pyenv/versions/3.10.4/lib/python3.10/site-packages/checkov/common/checks/base_check.py", line 75, in run
check_result["result"] = self.scan_entity_conf(entity_configuration, entity_type)
File "/Users/rob.westbrook/.pyenv/versions/3.10.4/lib/python3.10/site-packages/checkov/terraform/checks/resource/base_resource_check.py", line 42, in scan_entity_conf
handle_dynamic_values(conf)
File "/Users/rob.westbrook/.pyenv/versions/3.10.4/lib/python3.10/site-packages/checkov/terraform/parser_functions.py", line 186, in handle_dynamic_values
has_dynamic_block = handle_dynamic_values(conf_block[0], has_dynamic_block)
File "/Users/rob.westbrook/.pyenv/versions/3.10.4/lib/python3.10/site-packages/checkov/terraform/parser_functions.py", line 182, in handle_dynamic_values
has_dynamic_block = handle_dynamic_values(conf_block, has_dynamic_block)
File "/Users/rob.westbrook/.pyenv/versions/3.10.4/lib/python3.10/site-packages/checkov/terraform/parser_functions.py", line 182, in handle_dynamic_values
has_dynamic_block = handle_dynamic_values(conf_block, has_dynamic_block)
File "/Users/rob.westbrook/.pyenv/versions/3.10.4/lib/python3.10/site-packages/checkov/terraform/parser_functions.py", line 182, in handle_dynamic_values
has_dynamic_block = handle_dynamic_values(conf_block, has_dynamic_block)
File "/Users/rob.westbrook/.pyenv/versions/3.10.4/lib/python3.10/site-packages/checkov/terraform/parser_functions.py", line 189, in handle_dynamic_values
return process_dynamic_values(conf) or has_dynamic_block
File "/Users/rob.westbrook/.pyenv/versions/3.10.4/lib/python3.10/site-packages/checkov/terraform/parser_functions.py", line 195, in process_dynamic_values
for dynamic_element in conf.get("dynamic", {}):
TypeError: 'bool' object is not iterable
2023-10-13 10:22:17,020 [MainThread ] [INFO ] Entity configuration: {'__end_line__': 24, '__start_line__': 1, 'body': [{'index_patterns': ['test-*'], 'mappings': {'properties': {'foo': {'dynamic': True, 'properties': {'bar': {'type': 'string'}}}}}}], 'name': ['test'], '__address__': 'elasticsearch_index_template.test'}
```
**Desktop (please complete the following information):**
- OS: MacOS 13.4.1 (Ventura)
- Checkov Version: 2.5.8
**Additional context**
I believe this is due to the [process_dynamic_values](https://github.com/bridgecrewio/checkov/blob/main/checkov/terraform/parser_functions.py#L194) method, which directly fetches the "dynamic" property of `conf` but does not verify its Iterable-ness before trying to loop through and parse its elements.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `checkov/terraform/parser_functions.py`
Content:
```
1 from __future__ import annotations
2
3 import json
4 import logging
5 from collections.abc import Hashable
6 from typing import Dict, List, Union, Any, Callable
7
8 from checkov.common.util.data_structures_utils import pickle_deepcopy
9 from checkov.common.util.type_forcers import convert_str_to_bool
10 from checkov.common.util.parser_utils import eval_string, split_merge_args, string_to_native, to_string
11
12 #
13 # Functions defined in this file implement terraform functions.
14 #
15 # Inputs:
16 # - First arg (unnamed) - the value string provided to the function
17 # - "var_resolver" - function pointer to resolve variable/local references and such
18 # - "function_name" - name of the function being called (mainly useful for error reporting when a
19 # function isn't defined)
20 # These may be expanded over time, so accepting kwargs (via `**`) is recommended.
21 #
22 # If the value cannot be processed, `FUNCTION_FAILED` should be returned.
23 #
24
25 FUNCTION_FAILED = "____FUNCTION_FAILED____"
26
27
28 def merge(original: str, var_resolver: Callable[[Any], Any], **_: Any) -> dict[Hashable, Any] | str:
29 # https://www.terraform.io/docs/language/functions/merge.html
30 args = split_merge_args(original)
31 if args is None:
32 return FUNCTION_FAILED
33 merged_map = {}
34 for arg in args:
35 if arg.startswith("{"):
36 arg_value = string_to_native(arg)
37 if arg_value is None:
38 return FUNCTION_FAILED
39 else:
40 arg_value = var_resolver(arg)
41 if isinstance(arg_value, dict):
42 merged_map.update(arg_value)
43 else:
44 return FUNCTION_FAILED # don't know what this is, blow out
45 return merged_map
46
47
48 def concat(original: str, var_resolver: Callable[[Any], Any], **_: Any) -> list[Any] | str:
49 # https://www.terraform.io/docs/language/functions/concat.html
50 args = split_merge_args(original)
51 if args is None:
52 return FUNCTION_FAILED
53 merged_list = []
54 for arg in args:
55 if arg.startswith("["):
56 value = eval_string(arg)
57 if value is None:
58 logging.debug("Unable to convert to list: %s", arg)
59 return FUNCTION_FAILED
60 else:
61 value = var_resolver(arg)
62 if isinstance(value, list):
63 merged_list.extend(value)
64 else:
65 return FUNCTION_FAILED # don't know what this is, blow out
66 return merged_list
67
68
69 def tobool(original: Union[bool, str], **_: Any) -> Union[bool, str]:
70 # https://www.terraform.io/docs/configuration/functions/tobool.html
71 bool_value = convert_str_to_bool(original)
72 return bool_value if isinstance(bool_value, bool) else FUNCTION_FAILED
73
74
75 def tonumber(original: str, **_: Any) -> float | str:
76 # https://www.terraform.io/docs/configuration/functions/tonumber.html
77 if original.startswith('"') and original.endswith('"'):
78 original = original[1:-1]
79 try:
80 if "." in original:
81 return float(original)
82 else:
83 return int(original)
84 except ValueError:
85 return FUNCTION_FAILED
86
87
88 def tostring(original: str, **_: Any) -> bool | str:
89 # Indicates a safe string, all good
90 if original.startswith('"') and original.endswith('"'):
91 return original[1:-1]
92 # Otherwise, need to check for valid types (number or bool)
93 bool_value = convert_str_to_bool(original)
94 if isinstance(bool_value, bool):
95 return bool_value
96 else:
97 try:
98 if "." in original:
99 return str(float(original))
100 else:
101 return str(int(original))
102 except ValueError:
103 return FUNCTION_FAILED # no change
104
105
106 def tolist(original: str, **_: Any) -> list[Any] | str:
107 # https://www.terraform.io/docs/configuration/functions/tolist.html
108 altered_value = eval_string(original)
109 if altered_value is None:
110 return FUNCTION_FAILED
111 return altered_value if isinstance(altered_value, list) else list(altered_value)
112
113
114 def toset(original: str, **_: Any) -> set[Any] | str:
115 # https://www.terraform.io/docs/configuration/functions/toset.html
116 altered_value = eval_string(original)
117 if altered_value is None:
118 return FUNCTION_FAILED
119 return altered_value if isinstance(altered_value, set) else set(altered_value)
120
121
122 def tomap(original: str, **_: Any) -> dict[Hashable, Any] | str:
123 # https://www.terraform.io/docs/language/functions/tomap.html
124 original = original.replace(":", "=") # converted to colons by parser #shrug
125
126 altered_value = eval_string(original)
127 if altered_value is None or not isinstance(altered_value, dict):
128 return FUNCTION_FAILED
129 return _check_map_type_consistency(altered_value)
130
131
132 def map(original: str, **_: Any) -> dict[Hashable, Any] | str:
133 # https://www.terraform.io/docs/language/functions/map.html
134
135 # NOTE: Splitting by commas is annoying due to possible commas in strings. To avoid
136 # the issue, act like it's a list (to allow comma separation) and let the HCL
137 # parser deal with it. Then iterating the list is easy.
138 converted_to_list = eval_string(f"[{original}]")
139 if converted_to_list is None or len(converted_to_list) & 1: # none or odd number of args
140 return FUNCTION_FAILED
141
142 return create_map(converted_to_list)
143
144
145 def create_map(lst: list[Any]) -> dict[Hashable, Any]:
146 new_map = {}
147 for i in range(0, len(lst), 2):
148 new_map[lst[i]] = lst[i + 1]
149 return _check_map_type_consistency(new_map)
150
151
152 def _check_map_type_consistency(value: dict[Hashable, Any]) -> dict[Hashable, Any]:
153 # If there is a string and anything else, convert to string
154 had_string = False
155 had_something_else = False
156 for k, v in value.items():
157 if v == "${True}":
158 value[k] = True
159 v = True
160 elif v == "${False}":
161 value[k] = False
162 v = False
163
164 if isinstance(v, str):
165 had_string = True
166 if had_something_else:
167 break
168 else:
169 had_something_else = True
170 if had_string:
171 break
172 if had_string and had_something_else:
173 value = {k: to_string(v) for k, v in value.items()}
174 return value
175
176
177 def handle_dynamic_values(conf: Dict[str, List[Any]], has_dynamic_block: bool = False) -> bool:
178 # recursively search for blocks that are dynamic
179 for block_name in conf.keys():
180 conf_block = conf[block_name]
181 if isinstance(conf_block, dict):
182 has_dynamic_block = handle_dynamic_values(conf_block, has_dynamic_block)
183
184 # if the configuration is a block element, search down again.
185 if conf_block and isinstance(conf_block, list) and isinstance(conf_block[0], dict):
186 has_dynamic_block = handle_dynamic_values(conf_block[0], has_dynamic_block)
187
188 # if a dynamic block exists somewhere in the resource it will return True
189 return process_dynamic_values(conf) or has_dynamic_block
190
191
192 def process_dynamic_values(conf: Dict[str, List[Any]]) -> bool:
193 has_dynamic_block = False
194 for dynamic_element in conf.get("dynamic", {}):
195 if isinstance(dynamic_element, str):
196 try:
197 dynamic_element = json.loads(dynamic_element)
198 except Exception:
199 dynamic_element = {}
200
201 for element_name, element_value in dynamic_element.items():
202 if "content" in element_value:
203 if element_name in conf:
204 if not isinstance(conf[element_name], list):
205 conf[element_name] = [conf[element_name]]
206 if isinstance(element_value["content"], list):
207 conf[element_name].extend(element_value["content"])
208 else:
209 conf[element_name].append(element_value["content"])
210
211 else:
212 conf[element_name] = pickle_deepcopy(element_value["content"])
213 else:
214 # this should be the result of a successful dynamic block rendering
215 # in some cases a whole dict is added, which doesn't have a list around it
216 conf[element_name] = element_value if isinstance(element_value, list) else [element_value]
217
218 has_dynamic_block = True
219
220 return has_dynamic_block
221
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/checkov/terraform/parser_functions.py b/checkov/terraform/parser_functions.py
--- a/checkov/terraform/parser_functions.py
+++ b/checkov/terraform/parser_functions.py
@@ -7,7 +7,12 @@
from checkov.common.util.data_structures_utils import pickle_deepcopy
from checkov.common.util.type_forcers import convert_str_to_bool
-from checkov.common.util.parser_utils import eval_string, split_merge_args, string_to_native, to_string
+from checkov.common.util.parser_utils import (
+ eval_string,
+ split_merge_args,
+ string_to_native,
+ to_string,
+)
#
# Functions defined in this file implement terraform functions.
@@ -121,7 +126,7 @@
def tomap(original: str, **_: Any) -> dict[Hashable, Any] | str:
# https://www.terraform.io/docs/language/functions/tomap.html
- original = original.replace(":", "=") # converted to colons by parser #shrug
+ original = original.replace(":", "=") # converted to colons by parser #shrug
altered_value = eval_string(original)
if altered_value is None or not isinstance(altered_value, dict):
@@ -136,7 +141,7 @@
# the issue, act like it's a list (to allow comma separation) and let the HCL
# parser deal with it. Then iterating the list is easy.
converted_to_list = eval_string(f"[{original}]")
- if converted_to_list is None or len(converted_to_list) & 1: # none or odd number of args
+ if converted_to_list is None or len(converted_to_list) & 1: # none or odd number of args
return FUNCTION_FAILED
return create_map(converted_to_list)
@@ -190,8 +195,13 @@
def process_dynamic_values(conf: Dict[str, List[Any]]) -> bool:
+ dynamic_conf: Union[List[Any], Dict[str, List[Any]]] = conf.get("dynamic", {})
+
+ if not isinstance(dynamic_conf, list):
+ return False
+
has_dynamic_block = False
- for dynamic_element in conf.get("dynamic", {}):
+ for dynamic_element in dynamic_conf:
if isinstance(dynamic_element, str):
try:
dynamic_element = json.loads(dynamic_element)
|
{"golden_diff": "diff --git a/checkov/terraform/parser_functions.py b/checkov/terraform/parser_functions.py\n--- a/checkov/terraform/parser_functions.py\n+++ b/checkov/terraform/parser_functions.py\n@@ -7,7 +7,12 @@\n \n from checkov.common.util.data_structures_utils import pickle_deepcopy\n from checkov.common.util.type_forcers import convert_str_to_bool\n-from checkov.common.util.parser_utils import eval_string, split_merge_args, string_to_native, to_string\n+from checkov.common.util.parser_utils import (\n+ eval_string,\n+ split_merge_args,\n+ string_to_native,\n+ to_string,\n+)\n \n #\n # Functions defined in this file implement terraform functions.\n@@ -121,7 +126,7 @@\n \n def tomap(original: str, **_: Any) -> dict[Hashable, Any] | str:\n # https://www.terraform.io/docs/language/functions/tomap.html\n- original = original.replace(\":\", \"=\") # converted to colons by parser #shrug\n+ original = original.replace(\":\", \"=\") # converted to colons by parser #shrug\n \n altered_value = eval_string(original)\n if altered_value is None or not isinstance(altered_value, dict):\n@@ -136,7 +141,7 @@\n # the issue, act like it's a list (to allow comma separation) and let the HCL\n # parser deal with it. Then iterating the list is easy.\n converted_to_list = eval_string(f\"[{original}]\")\n- if converted_to_list is None or len(converted_to_list) & 1: # none or odd number of args\n+ if converted_to_list is None or len(converted_to_list) & 1: # none or odd number of args\n return FUNCTION_FAILED\n \n return create_map(converted_to_list)\n@@ -190,8 +195,13 @@\n \n \n def process_dynamic_values(conf: Dict[str, List[Any]]) -> bool:\n+ dynamic_conf: Union[List[Any], Dict[str, List[Any]]] = conf.get(\"dynamic\", {})\n+\n+ if not isinstance(dynamic_conf, list):\n+ return False\n+\n has_dynamic_block = False\n- for dynamic_element in conf.get(\"dynamic\", {}):\n+ for dynamic_element in dynamic_conf:\n if isinstance(dynamic_element, str):\n try:\n dynamic_element = json.loads(dynamic_element)\n", "issue": "Resources that contain scalar fields named \"dynamic\" not parsing properly\n**Describe the issue**\r\n\r\nRunning checkov against a Terraform resource which includes a scalar field called \"dynamic\" which _is not_ an actual HCL `dynamic` block causes a crash.\r\n\r\n**Examples**\r\n\r\nUsing the below resource definition:\r\n\r\n```hcl\r\nresource \"elasticsearch_index_template\" \"logstash\" {\r\n body = jsonencode(\r\n {\r\n index_patterns = [\r\n \"test-*\",\r\n ]\r\n mappings = {\r\n properties = {\r\n foo = {\r\n # Note that \"dynamic\" here is not HCL dynamic block, but rather an elasticsearch property declared using HCL\r\n dynamic = true\r\n properties = {\r\n bar = {\r\n type = \"string\"\r\n }\r\n }\r\n }\r\n }\r\n }\r\n }\r\n )\r\n\r\n name = \"test\"\r\n}\r\n```\r\n\r\nWe found this bug while attempting to write a custom check to flag up use of the `provisioner` block which, since it is a [Terraform configuration](https://developer.hashicorp.com/terraform/language/resources/provisioners/syntax) available to all `resource` blocks, requires a custom check where `supported_resources = (\"*\")`:\r\n\r\n```python\r\nfrom typing import Any\r\nfrom checkov.terraform.checks.resource.base_resource_check import BaseResourceCheck\r\nfrom checkov.common.models.enums import CheckResult, CheckCategories\r\n\r\n\r\nclass DynamicTest(BaseResourceCheck):\r\n def __init__(self) -> None:\r\n name = \"Testing crash where dynamic is a scalar value\"\r\n id = \"CKV_TEST_1\"\r\n supported_resources = (\"*\",)\r\n categories = (CheckCategories.GENERAL_SECURITY,)\r\n super().__init__(\r\n name=name,\r\n id=id,\r\n categories=categories,\r\n supported_resources=supported_resources,\r\n )\r\n\r\n def scan_resource_conf(self, conf: dict[str, list[Any]]) -> CheckResult:\r\n return CheckResult.PASSED\r\n\r\n\r\ncheck = DynamicTest()\r\n```\r\n\r\nWe can then run checkov:\r\n\r\n```shell\r\n$ checkov -f test.tf --external-checks-dir=./checks\r\n```\r\n\r\n**Exception Trace**\r\n\r\nThe relevant part of the trace caused by this issue is below:\r\n\r\n```\r\n2023-10-13 10:22:17,018 [MainThread ] [DEBUG] Running check: Testing crash where dynamic is a scalar value on file /test.tf\r\n2023-10-13 10:22:17,019 [MainThread ] [ERROR] Failed to run check CKV_TEST_1 on /test.tf:elasticsearch_index_template.test\r\nTraceback (most recent call last):\r\n File \"/Users/rob.westbrook/.pyenv/versions/3.10.4/lib/python3.10/site-packages/checkov/common/checks/base_check.py\", line 75, in run\r\n check_result[\"result\"] = self.scan_entity_conf(entity_configuration, entity_type)\r\n File \"/Users/rob.westbrook/.pyenv/versions/3.10.4/lib/python3.10/site-packages/checkov/terraform/checks/resource/base_resource_check.py\", line 42, in scan_entity_conf\r\n handle_dynamic_values(conf)\r\n File \"/Users/rob.westbrook/.pyenv/versions/3.10.4/lib/python3.10/site-packages/checkov/terraform/parser_functions.py\", line 186, in handle_dynamic_values\r\n has_dynamic_block = handle_dynamic_values(conf_block[0], has_dynamic_block)\r\n File \"/Users/rob.westbrook/.pyenv/versions/3.10.4/lib/python3.10/site-packages/checkov/terraform/parser_functions.py\", line 182, in handle_dynamic_values\r\n has_dynamic_block = handle_dynamic_values(conf_block, has_dynamic_block)\r\n File \"/Users/rob.westbrook/.pyenv/versions/3.10.4/lib/python3.10/site-packages/checkov/terraform/parser_functions.py\", line 182, in handle_dynamic_values\r\n has_dynamic_block = handle_dynamic_values(conf_block, has_dynamic_block)\r\n File \"/Users/rob.westbrook/.pyenv/versions/3.10.4/lib/python3.10/site-packages/checkov/terraform/parser_functions.py\", line 182, in handle_dynamic_values\r\n has_dynamic_block = handle_dynamic_values(conf_block, has_dynamic_block)\r\n File \"/Users/rob.westbrook/.pyenv/versions/3.10.4/lib/python3.10/site-packages/checkov/terraform/parser_functions.py\", line 189, in handle_dynamic_values\r\n return process_dynamic_values(conf) or has_dynamic_block\r\n File \"/Users/rob.westbrook/.pyenv/versions/3.10.4/lib/python3.10/site-packages/checkov/terraform/parser_functions.py\", line 195, in process_dynamic_values\r\n for dynamic_element in conf.get(\"dynamic\", {}):\r\nTypeError: 'bool' object is not iterable\r\n2023-10-13 10:22:17,020 [MainThread ] [INFO ] Entity configuration: {'__end_line__': 24, '__start_line__': 1, 'body': [{'index_patterns': ['test-*'], 'mappings': {'properties': {'foo': {'dynamic': True, 'properties': {'bar': {'type': 'string'}}}}}}], 'name': ['test'], '__address__': 'elasticsearch_index_template.test'}\r\n```\r\n\r\n**Desktop (please complete the following information):**\r\n\r\n - OS: MacOS 13.4.1 (Ventura)\r\n - Checkov Version: 2.5.8\r\n\r\n**Additional context**\r\n\r\nI believe this is due to the [process_dynamic_values](https://github.com/bridgecrewio/checkov/blob/main/checkov/terraform/parser_functions.py#L194) method, which directly fetches the \"dynamic\" property of `conf` but does not verify its Iterable-ness before trying to loop through and parse its elements.\r\n\n", "before_files": [{"content": "from __future__ import annotations\n\nimport json\nimport logging\nfrom collections.abc import Hashable\nfrom typing import Dict, List, Union, Any, Callable\n\nfrom checkov.common.util.data_structures_utils import pickle_deepcopy\nfrom checkov.common.util.type_forcers import convert_str_to_bool\nfrom checkov.common.util.parser_utils import eval_string, split_merge_args, string_to_native, to_string\n\n#\n# Functions defined in this file implement terraform functions.\n#\n# Inputs:\n# - First arg (unnamed) - the value string provided to the function\n# - \"var_resolver\" - function pointer to resolve variable/local references and such\n# - \"function_name\" - name of the function being called (mainly useful for error reporting when a\n# function isn't defined)\n# These may be expanded over time, so accepting kwargs (via `**`) is recommended.\n#\n# If the value cannot be processed, `FUNCTION_FAILED` should be returned.\n#\n\nFUNCTION_FAILED = \"____FUNCTION_FAILED____\"\n\n\ndef merge(original: str, var_resolver: Callable[[Any], Any], **_: Any) -> dict[Hashable, Any] | str:\n # https://www.terraform.io/docs/language/functions/merge.html\n args = split_merge_args(original)\n if args is None:\n return FUNCTION_FAILED\n merged_map = {}\n for arg in args:\n if arg.startswith(\"{\"):\n arg_value = string_to_native(arg)\n if arg_value is None:\n return FUNCTION_FAILED\n else:\n arg_value = var_resolver(arg)\n if isinstance(arg_value, dict):\n merged_map.update(arg_value)\n else:\n return FUNCTION_FAILED # don't know what this is, blow out\n return merged_map\n\n\ndef concat(original: str, var_resolver: Callable[[Any], Any], **_: Any) -> list[Any] | str:\n # https://www.terraform.io/docs/language/functions/concat.html\n args = split_merge_args(original)\n if args is None:\n return FUNCTION_FAILED\n merged_list = []\n for arg in args:\n if arg.startswith(\"[\"):\n value = eval_string(arg)\n if value is None:\n logging.debug(\"Unable to convert to list: %s\", arg)\n return FUNCTION_FAILED\n else:\n value = var_resolver(arg)\n if isinstance(value, list):\n merged_list.extend(value)\n else:\n return FUNCTION_FAILED # don't know what this is, blow out\n return merged_list\n\n\ndef tobool(original: Union[bool, str], **_: Any) -> Union[bool, str]:\n # https://www.terraform.io/docs/configuration/functions/tobool.html\n bool_value = convert_str_to_bool(original)\n return bool_value if isinstance(bool_value, bool) else FUNCTION_FAILED\n\n\ndef tonumber(original: str, **_: Any) -> float | str:\n # https://www.terraform.io/docs/configuration/functions/tonumber.html\n if original.startswith('\"') and original.endswith('\"'):\n original = original[1:-1]\n try:\n if \".\" in original:\n return float(original)\n else:\n return int(original)\n except ValueError:\n return FUNCTION_FAILED\n\n\ndef tostring(original: str, **_: Any) -> bool | str:\n # Indicates a safe string, all good\n if original.startswith('\"') and original.endswith('\"'):\n return original[1:-1]\n # Otherwise, need to check for valid types (number or bool)\n bool_value = convert_str_to_bool(original)\n if isinstance(bool_value, bool):\n return bool_value\n else:\n try:\n if \".\" in original:\n return str(float(original))\n else:\n return str(int(original))\n except ValueError:\n return FUNCTION_FAILED # no change\n\n\ndef tolist(original: str, **_: Any) -> list[Any] | str:\n # https://www.terraform.io/docs/configuration/functions/tolist.html\n altered_value = eval_string(original)\n if altered_value is None:\n return FUNCTION_FAILED\n return altered_value if isinstance(altered_value, list) else list(altered_value)\n\n\ndef toset(original: str, **_: Any) -> set[Any] | str:\n # https://www.terraform.io/docs/configuration/functions/toset.html\n altered_value = eval_string(original)\n if altered_value is None:\n return FUNCTION_FAILED\n return altered_value if isinstance(altered_value, set) else set(altered_value)\n\n\ndef tomap(original: str, **_: Any) -> dict[Hashable, Any] | str:\n # https://www.terraform.io/docs/language/functions/tomap.html\n original = original.replace(\":\", \"=\") # converted to colons by parser #shrug\n\n altered_value = eval_string(original)\n if altered_value is None or not isinstance(altered_value, dict):\n return FUNCTION_FAILED\n return _check_map_type_consistency(altered_value)\n\n\ndef map(original: str, **_: Any) -> dict[Hashable, Any] | str:\n # https://www.terraform.io/docs/language/functions/map.html\n\n # NOTE: Splitting by commas is annoying due to possible commas in strings. To avoid\n # the issue, act like it's a list (to allow comma separation) and let the HCL\n # parser deal with it. Then iterating the list is easy.\n converted_to_list = eval_string(f\"[{original}]\")\n if converted_to_list is None or len(converted_to_list) & 1: # none or odd number of args\n return FUNCTION_FAILED\n\n return create_map(converted_to_list)\n\n\ndef create_map(lst: list[Any]) -> dict[Hashable, Any]:\n new_map = {}\n for i in range(0, len(lst), 2):\n new_map[lst[i]] = lst[i + 1]\n return _check_map_type_consistency(new_map)\n\n\ndef _check_map_type_consistency(value: dict[Hashable, Any]) -> dict[Hashable, Any]:\n # If there is a string and anything else, convert to string\n had_string = False\n had_something_else = False\n for k, v in value.items():\n if v == \"${True}\":\n value[k] = True\n v = True\n elif v == \"${False}\":\n value[k] = False\n v = False\n\n if isinstance(v, str):\n had_string = True\n if had_something_else:\n break\n else:\n had_something_else = True\n if had_string:\n break\n if had_string and had_something_else:\n value = {k: to_string(v) for k, v in value.items()}\n return value\n\n\ndef handle_dynamic_values(conf: Dict[str, List[Any]], has_dynamic_block: bool = False) -> bool:\n # recursively search for blocks that are dynamic\n for block_name in conf.keys():\n conf_block = conf[block_name]\n if isinstance(conf_block, dict):\n has_dynamic_block = handle_dynamic_values(conf_block, has_dynamic_block)\n\n # if the configuration is a block element, search down again.\n if conf_block and isinstance(conf_block, list) and isinstance(conf_block[0], dict):\n has_dynamic_block = handle_dynamic_values(conf_block[0], has_dynamic_block)\n\n # if a dynamic block exists somewhere in the resource it will return True\n return process_dynamic_values(conf) or has_dynamic_block\n\n\ndef process_dynamic_values(conf: Dict[str, List[Any]]) -> bool:\n has_dynamic_block = False\n for dynamic_element in conf.get(\"dynamic\", {}):\n if isinstance(dynamic_element, str):\n try:\n dynamic_element = json.loads(dynamic_element)\n except Exception:\n dynamic_element = {}\n\n for element_name, element_value in dynamic_element.items():\n if \"content\" in element_value:\n if element_name in conf:\n if not isinstance(conf[element_name], list):\n conf[element_name] = [conf[element_name]]\n if isinstance(element_value[\"content\"], list):\n conf[element_name].extend(element_value[\"content\"])\n else:\n conf[element_name].append(element_value[\"content\"])\n\n else:\n conf[element_name] = pickle_deepcopy(element_value[\"content\"])\n else:\n # this should be the result of a successful dynamic block rendering\n # in some cases a whole dict is added, which doesn't have a list around it\n conf[element_name] = element_value if isinstance(element_value, list) else [element_value]\n\n has_dynamic_block = True\n\n return has_dynamic_block\n", "path": "checkov/terraform/parser_functions.py"}], "after_files": [{"content": "from __future__ import annotations\n\nimport json\nimport logging\nfrom collections.abc import Hashable\nfrom typing import Dict, List, Union, Any, Callable\n\nfrom checkov.common.util.data_structures_utils import pickle_deepcopy\nfrom checkov.common.util.type_forcers import convert_str_to_bool\nfrom checkov.common.util.parser_utils import (\n eval_string,\n split_merge_args,\n string_to_native,\n to_string,\n)\n\n#\n# Functions defined in this file implement terraform functions.\n#\n# Inputs:\n# - First arg (unnamed) - the value string provided to the function\n# - \"var_resolver\" - function pointer to resolve variable/local references and such\n# - \"function_name\" - name of the function being called (mainly useful for error reporting when a\n# function isn't defined)\n# These may be expanded over time, so accepting kwargs (via `**`) is recommended.\n#\n# If the value cannot be processed, `FUNCTION_FAILED` should be returned.\n#\n\nFUNCTION_FAILED = \"____FUNCTION_FAILED____\"\n\n\ndef merge(original: str, var_resolver: Callable[[Any], Any], **_: Any) -> dict[Hashable, Any] | str:\n # https://www.terraform.io/docs/language/functions/merge.html\n args = split_merge_args(original)\n if args is None:\n return FUNCTION_FAILED\n merged_map = {}\n for arg in args:\n if arg.startswith(\"{\"):\n arg_value = string_to_native(arg)\n if arg_value is None:\n return FUNCTION_FAILED\n else:\n arg_value = var_resolver(arg)\n if isinstance(arg_value, dict):\n merged_map.update(arg_value)\n else:\n return FUNCTION_FAILED # don't know what this is, blow out\n return merged_map\n\n\ndef concat(original: str, var_resolver: Callable[[Any], Any], **_: Any) -> list[Any] | str:\n # https://www.terraform.io/docs/language/functions/concat.html\n args = split_merge_args(original)\n if args is None:\n return FUNCTION_FAILED\n merged_list = []\n for arg in args:\n if arg.startswith(\"[\"):\n value = eval_string(arg)\n if value is None:\n logging.debug(\"Unable to convert to list: %s\", arg)\n return FUNCTION_FAILED\n else:\n value = var_resolver(arg)\n if isinstance(value, list):\n merged_list.extend(value)\n else:\n return FUNCTION_FAILED # don't know what this is, blow out\n return merged_list\n\n\ndef tobool(original: Union[bool, str], **_: Any) -> Union[bool, str]:\n # https://www.terraform.io/docs/configuration/functions/tobool.html\n bool_value = convert_str_to_bool(original)\n return bool_value if isinstance(bool_value, bool) else FUNCTION_FAILED\n\n\ndef tonumber(original: str, **_: Any) -> float | str:\n # https://www.terraform.io/docs/configuration/functions/tonumber.html\n if original.startswith('\"') and original.endswith('\"'):\n original = original[1:-1]\n try:\n if \".\" in original:\n return float(original)\n else:\n return int(original)\n except ValueError:\n return FUNCTION_FAILED\n\n\ndef tostring(original: str, **_: Any) -> bool | str:\n # Indicates a safe string, all good\n if original.startswith('\"') and original.endswith('\"'):\n return original[1:-1]\n # Otherwise, need to check for valid types (number or bool)\n bool_value = convert_str_to_bool(original)\n if isinstance(bool_value, bool):\n return bool_value\n else:\n try:\n if \".\" in original:\n return str(float(original))\n else:\n return str(int(original))\n except ValueError:\n return FUNCTION_FAILED # no change\n\n\ndef tolist(original: str, **_: Any) -> list[Any] | str:\n # https://www.terraform.io/docs/configuration/functions/tolist.html\n altered_value = eval_string(original)\n if altered_value is None:\n return FUNCTION_FAILED\n return altered_value if isinstance(altered_value, list) else list(altered_value)\n\n\ndef toset(original: str, **_: Any) -> set[Any] | str:\n # https://www.terraform.io/docs/configuration/functions/toset.html\n altered_value = eval_string(original)\n if altered_value is None:\n return FUNCTION_FAILED\n return altered_value if isinstance(altered_value, set) else set(altered_value)\n\n\ndef tomap(original: str, **_: Any) -> dict[Hashable, Any] | str:\n # https://www.terraform.io/docs/language/functions/tomap.html\n original = original.replace(\":\", \"=\") # converted to colons by parser #shrug\n\n altered_value = eval_string(original)\n if altered_value is None or not isinstance(altered_value, dict):\n return FUNCTION_FAILED\n return _check_map_type_consistency(altered_value)\n\n\ndef map(original: str, **_: Any) -> dict[Hashable, Any] | str:\n # https://www.terraform.io/docs/language/functions/map.html\n\n # NOTE: Splitting by commas is annoying due to possible commas in strings. To avoid\n # the issue, act like it's a list (to allow comma separation) and let the HCL\n # parser deal with it. Then iterating the list is easy.\n converted_to_list = eval_string(f\"[{original}]\")\n if converted_to_list is None or len(converted_to_list) & 1: # none or odd number of args\n return FUNCTION_FAILED\n\n return create_map(converted_to_list)\n\n\ndef create_map(lst: list[Any]) -> dict[Hashable, Any]:\n new_map = {}\n for i in range(0, len(lst), 2):\n new_map[lst[i]] = lst[i + 1]\n return _check_map_type_consistency(new_map)\n\n\ndef _check_map_type_consistency(value: dict[Hashable, Any]) -> dict[Hashable, Any]:\n # If there is a string and anything else, convert to string\n had_string = False\n had_something_else = False\n for k, v in value.items():\n if v == \"${True}\":\n value[k] = True\n v = True\n elif v == \"${False}\":\n value[k] = False\n v = False\n\n if isinstance(v, str):\n had_string = True\n if had_something_else:\n break\n else:\n had_something_else = True\n if had_string:\n break\n if had_string and had_something_else:\n value = {k: to_string(v) for k, v in value.items()}\n return value\n\n\ndef handle_dynamic_values(conf: Dict[str, List[Any]], has_dynamic_block: bool = False) -> bool:\n # recursively search for blocks that are dynamic\n for block_name in conf.keys():\n conf_block = conf[block_name]\n if isinstance(conf_block, dict):\n has_dynamic_block = handle_dynamic_values(conf_block, has_dynamic_block)\n\n # if the configuration is a block element, search down again.\n if conf_block and isinstance(conf_block, list) and isinstance(conf_block[0], dict):\n has_dynamic_block = handle_dynamic_values(conf_block[0], has_dynamic_block)\n\n # if a dynamic block exists somewhere in the resource it will return True\n return process_dynamic_values(conf) or has_dynamic_block\n\n\ndef process_dynamic_values(conf: Dict[str, List[Any]]) -> bool:\n dynamic_conf: Union[List[Any], Dict[str, List[Any]]] = conf.get(\"dynamic\", {})\n\n if not isinstance(dynamic_conf, list):\n return False\n\n has_dynamic_block = False\n for dynamic_element in dynamic_conf:\n if isinstance(dynamic_element, str):\n try:\n dynamic_element = json.loads(dynamic_element)\n except Exception:\n dynamic_element = {}\n\n for element_name, element_value in dynamic_element.items():\n if \"content\" in element_value:\n if element_name in conf:\n if not isinstance(conf[element_name], list):\n conf[element_name] = [conf[element_name]]\n if isinstance(element_value[\"content\"], list):\n conf[element_name].extend(element_value[\"content\"])\n else:\n conf[element_name].append(element_value[\"content\"])\n\n else:\n conf[element_name] = pickle_deepcopy(element_value[\"content\"])\n else:\n # this should be the result of a successful dynamic block rendering\n # in some cases a whole dict is added, which doesn't have a list around it\n conf[element_name] = element_value if isinstance(element_value, list) else [element_value]\n\n has_dynamic_block = True\n\n return has_dynamic_block\n", "path": "checkov/terraform/parser_functions.py"}]}
| 3,951 | 524 |
gh_patches_debug_16587
|
rasdani/github-patches
|
git_diff
|
freedomofpress__securedrop-5839
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Several testinfra tests failing against 1.8.0-rc1 Focal production instances:
## Description
`./securedrop-admin verify` against a 1.8.0-rc1 instance results in 11 test failures (of which 2 are expected)
```
============================================================================================= short test summary info =============================================================================================
FAILED app/test_apparmor.py::test_apparmor_enforced[paramiko:/app-xenial] - AssertionError: assert 'xenial' in ' /usr/bin/man\n /usr/lib/NetworkManager/nm-dhcp-client.action\n /usr/lib/NetworkManager/nm...
FAILED app/test_apparmor.py::test_apparmor_enforced[paramiko:/app-focal] - AssertionError: assert 'focal' in ' /usr/bin/man\n /usr/lib/NetworkManager/nm-dhcp-client.action\n /usr/lib/NetworkManager/nm-d...
FAILED app-code/test_securedrop_rqrequeue.py::test_securedrop_rqrequeue_service[paramiko:/app] - assert '[Unit]\nDesc...user.target\n' == '[Unit]\nDesc...user.target\n'
FAILED app-code/test_securedrop_rqworker.py::test_securedrop_rqworker_service[paramiko:/app] - assert '[Unit]\nDesc...user.target\n' == '[Unit]\nDesc...user.target\n'
FAILED app-code/test_securedrop_shredder_configuration.py::test_securedrop_shredder_service[paramiko:/app] - assert '[Unit]\nDesc...user.target\n' == '[Unit]\nDesc...user.target\n'
FAILED app-code/test_securedrop_source_deleter_configuration.py::test_securedrop_source_deleter_service[paramiko:/app] - assert '[Unit]\nDesc...user.target\n' == '[Unit]\nDesc...user.target\n'
FAILED app-code/test_securedrop_app_code.py::test_securedrop_application_apt_dependencies[paramiko:/app-libpython3.5] - AssertionError: assert False
(expected failure) FAILED common/test_fpf_apt_repo.py::test_fpf_apt_repo_present[paramiko:/app] - AssertionError: Unexpected exit code 2 for CommandResult(command=b"grep -qs -- '^deb \\[arch=amd64\\] https://apt\\.freedom\\.pre...
(expected failure) FAILED common/test_fpf_apt_repo.py::test_fpf_apt_repo_present[paramiko:/mon] - AssertionError: Unexpected exit code 2 for CommandResult(command=b"grep -qs -- '^deb \\[arch=amd64\\] https://apt\\.freedom\\.pre...
FAILED common/test_grsecurity.py::test_paxctld_focal[paramiko:/mon] - RuntimeError: Ansible module is only available with ansible connection backend
FAILED common/test_grsecurity.py::test_paxctld_focal[paramiko:/app] - RuntimeError: Ansible module is only available with ansible connection backend
============================================================= 11 failed, 441 passed, 7 skipped, 3 xfailed, 1 xpassed, 8 warnings in 842.46s (0:14:02) =============================================================
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `molecule/testinfra/conftest.py`
Content:
```
1 """
2 Configuration for TestInfra test suite for SecureDrop.
3 Handles importing host-specific test vars, so test functions
4 can be reused across multiple hosts, with varied targets.
5
6 Vars should be placed in `testinfra/vars/<hostname>.yml`.
7 """
8
9 import io
10 import os
11 import yaml
12 from typing import Any, Dict
13
14 import testutils
15
16
17 # The config tests target staging by default. It's possible to override
18 # for e.g. prod, but the associated vars files are not yet ported.
19 target_host = os.environ.get('SECUREDROP_TESTINFRA_TARGET_HOST', 'staging')
20
21
22 def securedrop_import_testinfra_vars(hostname, with_header=False):
23 """
24 Import vars from a YAML file to populate tests with host-specific
25 values used in checks. For instance, the SecureDrop docroot will
26 be under /vagrant in development, but /var/www/securedrop in staging.
27
28 Vars must be stored in `testinfra/vars/<hostname>.yml`.
29 """
30 filepath = os.path.join(os.path.dirname(__file__), "vars", hostname+".yml")
31 with io.open(filepath, 'r') as f:
32 hostvars = yaml.safe_load(f)
33
34 # Testing against both Focal and Xenial must be supported for now in both
35 # staging scenarios, and in prod via `USE_FOCAL=1 ./securedrop-admin verify`
36 testing_focal = False
37 scenario_env = "MOLECULE_SCENARIO_NAME"
38 if scenario_env in os.environ and os.environ.get(scenario_env).endswith("focal"):
39 testing_focal = True
40 if "USE_FOCAL" in os.environ:
41 testing_focal = True
42
43 if testing_focal:
44 hostvars['securedrop_venv_site_packages'] = hostvars["securedrop_venv_site_packages"].format("3.8") # noqa: E501
45 hostvars['python_version'] = "3.8"
46 else:
47 hostvars['securedrop_venv_site_packages'] = hostvars["securedrop_venv_site_packages"].format("3.5") # noqa: E501
48 hostvars['python_version'] = "3.5"
49
50 if with_header:
51 hostvars = dict(securedrop_test_vars=hostvars)
52
53 return hostvars
54
55
56 class TestVars(dict):
57 managed_attrs = {} # type: Dict[str, Any]
58
59 def __init__(self, initial: Dict[str, Any]) -> None:
60 self.securedrop_target_distribution = os.environ.get("SECUREDROP_TARGET_DISTRIBUTION")
61 self.managed_attrs.update(initial)
62
63 def __getattr__(self, name: str) -> Any:
64 """
65 If the requested attribute names a dict in managed_attrs and that
66 contains a key with the name of the target distribution,
67 e.g. "focal", return that. Otherwise return the entire item
68 under the requested name.
69 """
70 try:
71 attr = self.managed_attrs[name]
72 if isinstance(attr, dict) and self.securedrop_target_distribution in attr:
73 return attr[self.securedrop_target_distribution]
74 return attr
75 except KeyError:
76 raise AttributeError(name)
77
78 def __str__(self) -> str:
79 return str(self.managed_attrs)
80
81
82 testutils.securedrop_test_vars = TestVars(securedrop_import_testinfra_vars(target_host))
83
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/molecule/testinfra/conftest.py b/molecule/testinfra/conftest.py
--- a/molecule/testinfra/conftest.py
+++ b/molecule/testinfra/conftest.py
@@ -43,9 +43,11 @@
if testing_focal:
hostvars['securedrop_venv_site_packages'] = hostvars["securedrop_venv_site_packages"].format("3.8") # noqa: E501
hostvars['python_version'] = "3.8"
+ hostvars['apparmor_enforce_actual'] = hostvars['apparmor_enforce']['focal']
else:
hostvars['securedrop_venv_site_packages'] = hostvars["securedrop_venv_site_packages"].format("3.5") # noqa: E501
hostvars['python_version'] = "3.5"
+ hostvars['apparmor_enforce_actual'] = hostvars['apparmor_enforce']['xenial']
if with_header:
hostvars = dict(securedrop_test_vars=hostvars)
|
{"golden_diff": "diff --git a/molecule/testinfra/conftest.py b/molecule/testinfra/conftest.py\n--- a/molecule/testinfra/conftest.py\n+++ b/molecule/testinfra/conftest.py\n@@ -43,9 +43,11 @@\n if testing_focal:\n hostvars['securedrop_venv_site_packages'] = hostvars[\"securedrop_venv_site_packages\"].format(\"3.8\") # noqa: E501\n hostvars['python_version'] = \"3.8\"\n+ hostvars['apparmor_enforce_actual'] = hostvars['apparmor_enforce']['focal']\n else:\n hostvars['securedrop_venv_site_packages'] = hostvars[\"securedrop_venv_site_packages\"].format(\"3.5\") # noqa: E501\n hostvars['python_version'] = \"3.5\"\n+ hostvars['apparmor_enforce_actual'] = hostvars['apparmor_enforce']['xenial']\n \n if with_header:\n hostvars = dict(securedrop_test_vars=hostvars)\n", "issue": "Several testinfra tests failing against 1.8.0-rc1 Focal production instances:\n## Description\r\n`./securedrop-admin verify` against a 1.8.0-rc1 instance results in 11 test failures (of which 2 are expected)\r\n\r\n```\r\n============================================================================================= short test summary info =============================================================================================\r\nFAILED app/test_apparmor.py::test_apparmor_enforced[paramiko:/app-xenial] - AssertionError: assert 'xenial' in ' /usr/bin/man\\n /usr/lib/NetworkManager/nm-dhcp-client.action\\n /usr/lib/NetworkManager/nm...\r\nFAILED app/test_apparmor.py::test_apparmor_enforced[paramiko:/app-focal] - AssertionError: assert 'focal' in ' /usr/bin/man\\n /usr/lib/NetworkManager/nm-dhcp-client.action\\n /usr/lib/NetworkManager/nm-d...\r\nFAILED app-code/test_securedrop_rqrequeue.py::test_securedrop_rqrequeue_service[paramiko:/app] - assert '[Unit]\\nDesc...user.target\\n' == '[Unit]\\nDesc...user.target\\n'\r\nFAILED app-code/test_securedrop_rqworker.py::test_securedrop_rqworker_service[paramiko:/app] - assert '[Unit]\\nDesc...user.target\\n' == '[Unit]\\nDesc...user.target\\n'\r\nFAILED app-code/test_securedrop_shredder_configuration.py::test_securedrop_shredder_service[paramiko:/app] - assert '[Unit]\\nDesc...user.target\\n' == '[Unit]\\nDesc...user.target\\n'\r\nFAILED app-code/test_securedrop_source_deleter_configuration.py::test_securedrop_source_deleter_service[paramiko:/app] - assert '[Unit]\\nDesc...user.target\\n' == '[Unit]\\nDesc...user.target\\n'\r\nFAILED app-code/test_securedrop_app_code.py::test_securedrop_application_apt_dependencies[paramiko:/app-libpython3.5] - AssertionError: assert False\r\n(expected failure) FAILED common/test_fpf_apt_repo.py::test_fpf_apt_repo_present[paramiko:/app] - AssertionError: Unexpected exit code 2 for CommandResult(command=b\"grep -qs -- '^deb \\\\[arch=amd64\\\\] https://apt\\\\.freedom\\\\.pre...\r\n(expected failure) FAILED common/test_fpf_apt_repo.py::test_fpf_apt_repo_present[paramiko:/mon] - AssertionError: Unexpected exit code 2 for CommandResult(command=b\"grep -qs -- '^deb \\\\[arch=amd64\\\\] https://apt\\\\.freedom\\\\.pre...\r\nFAILED common/test_grsecurity.py::test_paxctld_focal[paramiko:/mon] - RuntimeError: Ansible module is only available with ansible connection backend\r\nFAILED common/test_grsecurity.py::test_paxctld_focal[paramiko:/app] - RuntimeError: Ansible module is only available with ansible connection backend\r\n============================================================= 11 failed, 441 passed, 7 skipped, 3 xfailed, 1 xpassed, 8 warnings in 842.46s (0:14:02) =============================================================\r\n```\r\n\n", "before_files": [{"content": "\"\"\"\nConfiguration for TestInfra test suite for SecureDrop.\nHandles importing host-specific test vars, so test functions\ncan be reused across multiple hosts, with varied targets.\n\nVars should be placed in `testinfra/vars/<hostname>.yml`.\n\"\"\"\n\nimport io\nimport os\nimport yaml\nfrom typing import Any, Dict\n\nimport testutils\n\n\n# The config tests target staging by default. It's possible to override\n# for e.g. prod, but the associated vars files are not yet ported.\ntarget_host = os.environ.get('SECUREDROP_TESTINFRA_TARGET_HOST', 'staging')\n\n\ndef securedrop_import_testinfra_vars(hostname, with_header=False):\n \"\"\"\n Import vars from a YAML file to populate tests with host-specific\n values used in checks. For instance, the SecureDrop docroot will\n be under /vagrant in development, but /var/www/securedrop in staging.\n\n Vars must be stored in `testinfra/vars/<hostname>.yml`.\n \"\"\"\n filepath = os.path.join(os.path.dirname(__file__), \"vars\", hostname+\".yml\")\n with io.open(filepath, 'r') as f:\n hostvars = yaml.safe_load(f)\n\n # Testing against both Focal and Xenial must be supported for now in both\n # staging scenarios, and in prod via `USE_FOCAL=1 ./securedrop-admin verify`\n testing_focal = False\n scenario_env = \"MOLECULE_SCENARIO_NAME\"\n if scenario_env in os.environ and os.environ.get(scenario_env).endswith(\"focal\"):\n testing_focal = True\n if \"USE_FOCAL\" in os.environ:\n testing_focal = True\n\n if testing_focal:\n hostvars['securedrop_venv_site_packages'] = hostvars[\"securedrop_venv_site_packages\"].format(\"3.8\") # noqa: E501\n hostvars['python_version'] = \"3.8\"\n else:\n hostvars['securedrop_venv_site_packages'] = hostvars[\"securedrop_venv_site_packages\"].format(\"3.5\") # noqa: E501\n hostvars['python_version'] = \"3.5\"\n\n if with_header:\n hostvars = dict(securedrop_test_vars=hostvars)\n\n return hostvars\n\n\nclass TestVars(dict):\n managed_attrs = {} # type: Dict[str, Any]\n\n def __init__(self, initial: Dict[str, Any]) -> None:\n self.securedrop_target_distribution = os.environ.get(\"SECUREDROP_TARGET_DISTRIBUTION\")\n self.managed_attrs.update(initial)\n\n def __getattr__(self, name: str) -> Any:\n \"\"\"\n If the requested attribute names a dict in managed_attrs and that\n contains a key with the name of the target distribution,\n e.g. \"focal\", return that. Otherwise return the entire item\n under the requested name.\n \"\"\"\n try:\n attr = self.managed_attrs[name]\n if isinstance(attr, dict) and self.securedrop_target_distribution in attr:\n return attr[self.securedrop_target_distribution]\n return attr\n except KeyError:\n raise AttributeError(name)\n\n def __str__(self) -> str:\n return str(self.managed_attrs)\n\n\ntestutils.securedrop_test_vars = TestVars(securedrop_import_testinfra_vars(target_host))\n", "path": "molecule/testinfra/conftest.py"}], "after_files": [{"content": "\"\"\"\nConfiguration for TestInfra test suite for SecureDrop.\nHandles importing host-specific test vars, so test functions\ncan be reused across multiple hosts, with varied targets.\n\nVars should be placed in `testinfra/vars/<hostname>.yml`.\n\"\"\"\n\nimport io\nimport os\nimport yaml\nfrom typing import Any, Dict\n\nimport testutils\n\n\n# The config tests target staging by default. It's possible to override\n# for e.g. prod, but the associated vars files are not yet ported.\ntarget_host = os.environ.get('SECUREDROP_TESTINFRA_TARGET_HOST', 'staging')\n\n\ndef securedrop_import_testinfra_vars(hostname, with_header=False):\n \"\"\"\n Import vars from a YAML file to populate tests with host-specific\n values used in checks. For instance, the SecureDrop docroot will\n be under /vagrant in development, but /var/www/securedrop in staging.\n\n Vars must be stored in `testinfra/vars/<hostname>.yml`.\n \"\"\"\n filepath = os.path.join(os.path.dirname(__file__), \"vars\", hostname+\".yml\")\n with io.open(filepath, 'r') as f:\n hostvars = yaml.safe_load(f)\n\n # Testing against both Focal and Xenial must be supported for now in both\n # staging scenarios, and in prod via `USE_FOCAL=1 ./securedrop-admin verify`\n testing_focal = False\n scenario_env = \"MOLECULE_SCENARIO_NAME\"\n if scenario_env in os.environ and os.environ.get(scenario_env).endswith(\"focal\"):\n testing_focal = True\n if \"USE_FOCAL\" in os.environ:\n testing_focal = True\n\n if testing_focal:\n hostvars['securedrop_venv_site_packages'] = hostvars[\"securedrop_venv_site_packages\"].format(\"3.8\") # noqa: E501\n hostvars['python_version'] = \"3.8\"\n hostvars['apparmor_enforce_actual'] = hostvars['apparmor_enforce']['focal']\n else:\n hostvars['securedrop_venv_site_packages'] = hostvars[\"securedrop_venv_site_packages\"].format(\"3.5\") # noqa: E501\n hostvars['python_version'] = \"3.5\"\n hostvars['apparmor_enforce_actual'] = hostvars['apparmor_enforce']['xenial']\n\n if with_header:\n hostvars = dict(securedrop_test_vars=hostvars)\n\n return hostvars\n\n\nclass TestVars(dict):\n managed_attrs = {} # type: Dict[str, Any]\n\n def __init__(self, initial: Dict[str, Any]) -> None:\n self.securedrop_target_distribution = os.environ.get(\"SECUREDROP_TARGET_DISTRIBUTION\")\n self.managed_attrs.update(initial)\n\n def __getattr__(self, name: str) -> Any:\n \"\"\"\n If the requested attribute names a dict in managed_attrs and that\n contains a key with the name of the target distribution,\n e.g. \"focal\", return that. Otherwise return the entire item\n under the requested name.\n \"\"\"\n try:\n attr = self.managed_attrs[name]\n if isinstance(attr, dict) and self.securedrop_target_distribution in attr:\n return attr[self.securedrop_target_distribution]\n return attr\n except KeyError:\n raise AttributeError(name)\n\n def __str__(self) -> str:\n return str(self.managed_attrs)\n\n\ntestutils.securedrop_test_vars = TestVars(securedrop_import_testinfra_vars(target_host))\n", "path": "molecule/testinfra/conftest.py"}]}
| 1,813 | 237 |
gh_patches_debug_18447
|
rasdani/github-patches
|
git_diff
|
facebookresearch__CompilerGym-442
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Observation 'Bitcode' without going through a temporary file.
## 🚀 Feature
Have an observation which just returns the bytes of the bitcode, rather than the file. Going through a temporary file is a (tiny) bit annoying.
```
env.observation["Bitcode"] -> bytes
env.bitcode == env.observation["Bitcode"]
```
## Motivation
I'm passing bitcode to LLVM tools. They will accept bitcode on stdin and emit it to stdout. No need for a temporary file.
## Pitch
Don't use a file when a `bytes` will do?
## Alternatives
Keep using a file.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `compiler_gym/spaces/sequence.py`
Content:
```
1 # Copyright (c) Facebook, Inc. and its affiliates.
2 #
3 # This source code is licensed under the MIT license found in the
4 # LICENSE file in the root directory of this source tree.
5 from typing import Optional, Tuple
6
7 from gym.spaces import Space
8
9 from compiler_gym.spaces.scalar import Scalar
10
11
12 class Sequence(Space):
13 """A sequence of values. Each element of the sequence is of `dtype`. The
14 length of the sequence is bounded by `size_range`.
15
16 Example:
17
18 ::
19
20 >>> space = Sequence(size_range=(0, None), dtype=str)
21 >>> space.contains("Hello, world!")
22 True
23
24 ::
25
26 >>> space = Sequence(size_range=(256, 256), dtype=bytes)
27 >>> space.contains("Hello, world!")
28 False
29
30 :ivar size_range: A tuple indicating the `(lower, upper)` bounds for
31 sequence lengths. An upper bound of `None` means no upper bound. All
32 sequences must have a lower bound of length >= 0.
33 :ivar dtype: The data type for each element in a sequence.
34 :ivar opaque_data_format: An optional string describing an opaque data
35 format, e.g. a data structure that is serialized to a string/binary
36 array for transmission to the client. It is up to the client and service
37 to agree on how to decode observations using this value. For example,
38 an opaque_data_format of `string_json` could be used to indicate that
39 the observation is a string-serialized JSON value.
40 """
41
42 def __init__(
43 self,
44 size_range: Tuple[int, Optional[int]] = (0, None),
45 dtype=bytes,
46 opaque_data_format: Optional[str] = None,
47 scalar_range: Optional[Scalar] = None,
48 ):
49 """Constructor.
50
51 :param size_range: A tuple indicating the `(lower, upper)` bounds for
52 sequence lengths. An upper bound of `None` means no upper bound. All
53 sequences must have a lower bound of length >= 0.
54 :param dtype: The data type for each element in a sequence.
55 :param opaque_data_format: An optional string describing an opaque data
56 format, e.g. a data structure that is serialized to a string/binary
57 array for transmission to the client. It is up to the client and
58 service to agree on how to decode observations using this value. For
59 example, an opaque_data_format of `string_json` could be used to
60 indicate that the observation is a string-serialized JSON value.
61 :param scalar_range: If specified, this denotes the legal range of
62 each element in the sequence. This is enforced by
63 :meth:`contains() <compiler_gym.spaces.Sequence.contains>` checks.
64 """
65 self.size_range = size_range
66 self.dtype = dtype
67 self.opaque_data_format = opaque_data_format
68 self.scalar_range = scalar_range
69
70 def __repr__(self) -> str:
71 upper_bound = "inf" if self.size_range[1] is None else self.size_range[1]
72 d = f" -> {self.opaque_data_format}" if self.opaque_data_format else ""
73 return (
74 f"{self.dtype.__name__}_list<>[{int(self.size_range[0])},{upper_bound}]){d}"
75 )
76
77 def contains(self, x):
78 lower_bound = self.size_range[0]
79 upper_bound = float("inf") if self.size_range[1] is None else self.size_range[1]
80 if not (lower_bound <= len(x) <= upper_bound):
81 return False
82 for element in x:
83 if not isinstance(element, self.dtype):
84 return False
85
86 # Run the bounds check on every scalar element, if there is a scalar
87 # range specified.
88 if self.scalar_range:
89 return all(self.scalar_range.contains(s) for s in x)
90
91 return True
92
93 def sample(self):
94 """
95 .. warning::
96 The `Sequence` space cannot be sampled from.
97
98 :raises NotImplementedError: Not supported.
99 """
100 raise NotImplementedError
101
102 def __eq__(self, other):
103 if not isinstance(other, Sequence):
104 return False
105 return (
106 self.size_range == other.size_range
107 and self.dtype == other.dtype
108 and self.opaque_data_format == other.opaque_data_format
109 )
110
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/compiler_gym/spaces/sequence.py b/compiler_gym/spaces/sequence.py
--- a/compiler_gym/spaces/sequence.py
+++ b/compiler_gym/spaces/sequence.py
@@ -79,9 +79,18 @@
upper_bound = float("inf") if self.size_range[1] is None else self.size_range[1]
if not (lower_bound <= len(x) <= upper_bound):
return False
- for element in x:
- if not isinstance(element, self.dtype):
+
+ # TODO(cummins): The dtype API is inconsistent. When dtype=str or
+ # dtype=bytes, we expect this to be the type of the entire sequence. But
+ # for dtype=int, we expect this to be the type of each element. We
+ # should distinguish these differences better.
+ if self.dtype in {str, bytes}:
+ if not isinstance(x, self.dtype):
return False
+ else:
+ for element in x:
+ if not isinstance(element, self.dtype):
+ return False
# Run the bounds check on every scalar element, if there is a scalar
# range specified.
|
{"golden_diff": "diff --git a/compiler_gym/spaces/sequence.py b/compiler_gym/spaces/sequence.py\n--- a/compiler_gym/spaces/sequence.py\n+++ b/compiler_gym/spaces/sequence.py\n@@ -79,9 +79,18 @@\n upper_bound = float(\"inf\") if self.size_range[1] is None else self.size_range[1]\n if not (lower_bound <= len(x) <= upper_bound):\n return False\n- for element in x:\n- if not isinstance(element, self.dtype):\n+\n+ # TODO(cummins): The dtype API is inconsistent. When dtype=str or\n+ # dtype=bytes, we expect this to be the type of the entire sequence. But\n+ # for dtype=int, we expect this to be the type of each element. We\n+ # should distinguish these differences better.\n+ if self.dtype in {str, bytes}:\n+ if not isinstance(x, self.dtype):\n return False\n+ else:\n+ for element in x:\n+ if not isinstance(element, self.dtype):\n+ return False\n \n # Run the bounds check on every scalar element, if there is a scalar\n # range specified.\n", "issue": "Observation 'Bitcode' without going through a temporary file.\n## \ud83d\ude80 Feature\r\n\r\nHave an observation which just returns the bytes of the bitcode, rather than the file. Going through a temporary file is a (tiny) bit annoying.\r\n\r\n```\r\nenv.observation[\"Bitcode\"] -> bytes\r\nenv.bitcode == env.observation[\"Bitcode\"]\r\n```\r\n## Motivation\r\n\r\nI'm passing bitcode to LLVM tools. They will accept bitcode on stdin and emit it to stdout. No need for a temporary file.\r\n\r\n## Pitch\r\n\r\nDon't use a file when a `bytes` will do? \r\n\r\n## Alternatives\r\n\r\nKeep using a file.\r\n\n", "before_files": [{"content": "# Copyright (c) Facebook, Inc. and its affiliates.\n#\n# This source code is licensed under the MIT license found in the\n# LICENSE file in the root directory of this source tree.\nfrom typing import Optional, Tuple\n\nfrom gym.spaces import Space\n\nfrom compiler_gym.spaces.scalar import Scalar\n\n\nclass Sequence(Space):\n \"\"\"A sequence of values. Each element of the sequence is of `dtype`. The\n length of the sequence is bounded by `size_range`.\n\n Example:\n\n ::\n\n >>> space = Sequence(size_range=(0, None), dtype=str)\n >>> space.contains(\"Hello, world!\")\n True\n\n ::\n\n >>> space = Sequence(size_range=(256, 256), dtype=bytes)\n >>> space.contains(\"Hello, world!\")\n False\n\n :ivar size_range: A tuple indicating the `(lower, upper)` bounds for\n sequence lengths. An upper bound of `None` means no upper bound. All\n sequences must have a lower bound of length >= 0.\n :ivar dtype: The data type for each element in a sequence.\n :ivar opaque_data_format: An optional string describing an opaque data\n format, e.g. a data structure that is serialized to a string/binary\n array for transmission to the client. It is up to the client and service\n to agree on how to decode observations using this value. For example,\n an opaque_data_format of `string_json` could be used to indicate that\n the observation is a string-serialized JSON value.\n \"\"\"\n\n def __init__(\n self,\n size_range: Tuple[int, Optional[int]] = (0, None),\n dtype=bytes,\n opaque_data_format: Optional[str] = None,\n scalar_range: Optional[Scalar] = None,\n ):\n \"\"\"Constructor.\n\n :param size_range: A tuple indicating the `(lower, upper)` bounds for\n sequence lengths. An upper bound of `None` means no upper bound. All\n sequences must have a lower bound of length >= 0.\n :param dtype: The data type for each element in a sequence.\n :param opaque_data_format: An optional string describing an opaque data\n format, e.g. a data structure that is serialized to a string/binary\n array for transmission to the client. It is up to the client and\n service to agree on how to decode observations using this value. For\n example, an opaque_data_format of `string_json` could be used to\n indicate that the observation is a string-serialized JSON value.\n :param scalar_range: If specified, this denotes the legal range of\n each element in the sequence. This is enforced by\n :meth:`contains() <compiler_gym.spaces.Sequence.contains>` checks.\n \"\"\"\n self.size_range = size_range\n self.dtype = dtype\n self.opaque_data_format = opaque_data_format\n self.scalar_range = scalar_range\n\n def __repr__(self) -> str:\n upper_bound = \"inf\" if self.size_range[1] is None else self.size_range[1]\n d = f\" -> {self.opaque_data_format}\" if self.opaque_data_format else \"\"\n return (\n f\"{self.dtype.__name__}_list<>[{int(self.size_range[0])},{upper_bound}]){d}\"\n )\n\n def contains(self, x):\n lower_bound = self.size_range[0]\n upper_bound = float(\"inf\") if self.size_range[1] is None else self.size_range[1]\n if not (lower_bound <= len(x) <= upper_bound):\n return False\n for element in x:\n if not isinstance(element, self.dtype):\n return False\n\n # Run the bounds check on every scalar element, if there is a scalar\n # range specified.\n if self.scalar_range:\n return all(self.scalar_range.contains(s) for s in x)\n\n return True\n\n def sample(self):\n \"\"\"\n .. warning::\n The `Sequence` space cannot be sampled from.\n\n :raises NotImplementedError: Not supported.\n \"\"\"\n raise NotImplementedError\n\n def __eq__(self, other):\n if not isinstance(other, Sequence):\n return False\n return (\n self.size_range == other.size_range\n and self.dtype == other.dtype\n and self.opaque_data_format == other.opaque_data_format\n )\n", "path": "compiler_gym/spaces/sequence.py"}], "after_files": [{"content": "# Copyright (c) Facebook, Inc. and its affiliates.\n#\n# This source code is licensed under the MIT license found in the\n# LICENSE file in the root directory of this source tree.\nfrom typing import Optional, Tuple\n\nfrom gym.spaces import Space\n\nfrom compiler_gym.spaces.scalar import Scalar\n\n\nclass Sequence(Space):\n \"\"\"A sequence of values. Each element of the sequence is of `dtype`. The\n length of the sequence is bounded by `size_range`.\n\n Example:\n\n ::\n\n >>> space = Sequence(size_range=(0, None), dtype=str)\n >>> space.contains(\"Hello, world!\")\n True\n\n ::\n\n >>> space = Sequence(size_range=(256, 256), dtype=bytes)\n >>> space.contains(\"Hello, world!\")\n False\n\n :ivar size_range: A tuple indicating the `(lower, upper)` bounds for\n sequence lengths. An upper bound of `None` means no upper bound. All\n sequences must have a lower bound of length >= 0.\n :ivar dtype: The data type for each element in a sequence.\n :ivar opaque_data_format: An optional string describing an opaque data\n format, e.g. a data structure that is serialized to a string/binary\n array for transmission to the client. It is up to the client and service\n to agree on how to decode observations using this value. For example,\n an opaque_data_format of `string_json` could be used to indicate that\n the observation is a string-serialized JSON value.\n \"\"\"\n\n def __init__(\n self,\n size_range: Tuple[int, Optional[int]] = (0, None),\n dtype=bytes,\n opaque_data_format: Optional[str] = None,\n scalar_range: Optional[Scalar] = None,\n ):\n \"\"\"Constructor.\n\n :param size_range: A tuple indicating the `(lower, upper)` bounds for\n sequence lengths. An upper bound of `None` means no upper bound. All\n sequences must have a lower bound of length >= 0.\n :param dtype: The data type for each element in a sequence.\n :param opaque_data_format: An optional string describing an opaque data\n format, e.g. a data structure that is serialized to a string/binary\n array for transmission to the client. It is up to the client and\n service to agree on how to decode observations using this value. For\n example, an opaque_data_format of `string_json` could be used to\n indicate that the observation is a string-serialized JSON value.\n :param scalar_range: If specified, this denotes the legal range of\n each element in the sequence. This is enforced by\n :meth:`contains() <compiler_gym.spaces.Sequence.contains>` checks.\n \"\"\"\n self.size_range = size_range\n self.dtype = dtype\n self.opaque_data_format = opaque_data_format\n self.scalar_range = scalar_range\n\n def __repr__(self) -> str:\n upper_bound = \"inf\" if self.size_range[1] is None else self.size_range[1]\n d = f\" -> {self.opaque_data_format}\" if self.opaque_data_format else \"\"\n return (\n f\"{self.dtype.__name__}_list<>[{int(self.size_range[0])},{upper_bound}]){d}\"\n )\n\n def contains(self, x):\n lower_bound = self.size_range[0]\n upper_bound = float(\"inf\") if self.size_range[1] is None else self.size_range[1]\n if not (lower_bound <= len(x) <= upper_bound):\n return False\n\n # TODO(cummins): The dtype API is inconsistent. When dtype=str or\n # dtype=bytes, we expect this to be the type of the entire sequence. But\n # for dtype=int, we expect this to be the type of each element. We\n # should distinguish these differences better.\n if self.dtype in {str, bytes}:\n if not isinstance(x, self.dtype):\n return False\n else:\n for element in x:\n if not isinstance(element, self.dtype):\n return False\n\n # Run the bounds check on every scalar element, if there is a scalar\n # range specified.\n if self.scalar_range:\n return all(self.scalar_range.contains(s) for s in x)\n\n return True\n\n def sample(self):\n \"\"\"\n .. warning::\n The `Sequence` space cannot be sampled from.\n\n :raises NotImplementedError: Not supported.\n \"\"\"\n raise NotImplementedError\n\n def __eq__(self, other):\n if not isinstance(other, Sequence):\n return False\n return (\n self.size_range == other.size_range\n and self.dtype == other.dtype\n and self.opaque_data_format == other.opaque_data_format\n )\n", "path": "compiler_gym/spaces/sequence.py"}]}
| 1,562 | 260 |
gh_patches_debug_17478
|
rasdani/github-patches
|
git_diff
|
yt-dlp__yt-dlp-9280
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Cloudflare Stream is unsupported
### DO NOT REMOVE OR SKIP THE ISSUE TEMPLATE
- [X] I understand that I will be **blocked** if I *intentionally* remove or skip any mandatory\* field
### Checklist
- [X] I'm reporting a new site support request
- [X] I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
- [X] I've checked that all provided URLs are playable in a browser with the same IP and same login details
- [X] I've checked that none of provided URLs [violate any copyrights](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#is-the-website-primarily-used-for-piracy) or contain any [DRM](https://en.wikipedia.org/wiki/Digital_rights_management) to the best of my knowledge
- [X] I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
- [X] I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
- [X] I've read about [sharing account credentials](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#are-you-willing-to-share-account-details-if-needed) and am willing to share it if required
### Region
Spain
### Example URLs
https://customer-aw5py76sw8wyqzmh.cloudflarestream.com/2463f6d3e06fa29710a337f5f5389fd8/iframe
### Provide a description that is worded well enough to be understood
Says URL is unsupported.
### Provide verbose output that clearly demonstrates the problem
- [X] Run **your** yt-dlp command with **-vU** flag added (`yt-dlp -vU <your command line>`)
- [X] If using API, add `'verbose': True` to `YoutubeDL` params instead
- [X] Copy the WHOLE output (starting with `[debug] Command-line config`) and insert it below
### Complete Verbose Output
```shell
$ yt-dlp -vU 'https://customer-aw5py76sw8wyqzmh.cloudflarestream.com/2463f6d3e06fa29710a337f5f5389fd8/iframe'
[debug] Command-line config: ['-vU', 'https://customer-aw5py76sw8wyqzmh.cloudflarestream.com/2463f6d3e06fa29710a337f5f5389fd8/iframe']
[debug] Encodings: locale UTF-8, fs utf-8, pref UTF-8, out utf-8, error utf-8, screen utf-8
[debug] yt-dlp version nightly@2024.02.05.232712 from yt-dlp/yt-dlp-nightly-builds [05420227a] (pip)
[debug] Python 3.10.12 (CPython x86_64 64bit) - Linux-5.15.133.1-microsoft-standard-WSL2-x86_64-with-glibc2.35 (OpenSSL 3.0.2 15 Mar 2022, glibc 2.35)
[debug] exe versions: ffmpeg 4.4.2 (setts), ffprobe 4.4.2
[debug] Optional libraries: Cryptodome-3.20.0, brotli-1.1.0, certifi-2023.11.17, mutagen-1.47.0, requests-2.31.0, secretstorage-3.3.1, sqlite3-3.37.2, urllib3-2.1.0, websockets-12.0
[debug] Proxy map: {}
[debug] Request Handlers: urllib, requests, websockets
[debug] Loaded 1832 extractors
[debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp-nightly-builds/releases/latest
Latest version: nightly@2024.02.05.232712 from yt-dlp/yt-dlp-nightly-builds
yt-dlp is up to date (nightly@2024.02.05.232712 from yt-dlp/yt-dlp-nightly-builds)
[generic] Extracting URL: https://customer-aw5py76sw8wyqzmh.cloudflarestream.com/2463f6d3e06fa29710a337f5f5389fd8/iframe
[generic] iframe: Downloading webpage
WARNING: [generic] Falling back on generic information extractor
[generic] iframe: Extracting information
[debug] Looking for embeds
ERROR: Unsupported URL: https://customer-aw5py76sw8wyqzmh.cloudflarestream.com/2463f6d3e06fa29710a337f5f5389fd8/iframe
Traceback (most recent call last):
File "/home/user/.local/lib/python3.10/site-packages/yt_dlp/YoutubeDL.py", line 1586, in wrapper
return func(self, *args, **kwargs)
File "/home/user/.local/lib/python3.10/site-packages/yt_dlp/YoutubeDL.py", line 1721, in __extract_info
ie_result = ie.extract(url)
File "/home/user/.local/lib/python3.10/site-packages/yt_dlp/extractor/common.py", line 718, in extract
ie_result = self._real_extract(url)
File "/home/user/.local/lib/python3.10/site-packages/yt_dlp/extractor/generic.py", line 2531, in _real_extract
raise UnsupportedError(url)
yt_dlp.utils.UnsupportedError: Unsupported URL: https://customer-aw5py76sw8wyqzmh.cloudflarestream.com/2463f6d3e06fa29710a337f5f5389fd8/iframe
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `yt_dlp/extractor/cloudflarestream.py`
Content:
```
1 import base64
2
3 from .common import InfoExtractor
4
5
6 class CloudflareStreamIE(InfoExtractor):
7 _DOMAIN_RE = r'(?:cloudflarestream\.com|(?:videodelivery|bytehighway)\.net)'
8 _EMBED_RE = r'embed\.%s/embed/[^/]+\.js\?.*?\bvideo=' % _DOMAIN_RE
9 _ID_RE = r'[\da-f]{32}|[\w-]+\.[\w-]+\.[\w-]+'
10 _VALID_URL = r'''(?x)
11 https?://
12 (?:
13 (?:watch\.)?%s/|
14 %s
15 )
16 (?P<id>%s)
17 ''' % (_DOMAIN_RE, _EMBED_RE, _ID_RE)
18 _EMBED_REGEX = [fr'<script[^>]+\bsrc=(["\'])(?P<url>(?:https?:)?//{_EMBED_RE}(?:{_ID_RE}).*?)\1']
19 _TESTS = [{
20 'url': 'https://embed.cloudflarestream.com/embed/we4g.fla9.latest.js?video=31c9291ab41fac05471db4e73aa11717',
21 'info_dict': {
22 'id': '31c9291ab41fac05471db4e73aa11717',
23 'ext': 'mp4',
24 'title': '31c9291ab41fac05471db4e73aa11717',
25 },
26 'params': {
27 'skip_download': True,
28 },
29 }, {
30 'url': 'https://watch.cloudflarestream.com/9df17203414fd1db3e3ed74abbe936c1',
31 'only_matching': True,
32 }, {
33 'url': 'https://cloudflarestream.com/31c9291ab41fac05471db4e73aa11717/manifest/video.mpd',
34 'only_matching': True,
35 }, {
36 'url': 'https://embed.videodelivery.net/embed/r4xu.fla9.latest.js?video=81d80727f3022488598f68d323c1ad5e',
37 'only_matching': True,
38 }]
39
40 def _real_extract(self, url):
41 video_id = self._match_id(url)
42 domain = 'bytehighway.net' if 'bytehighway.net/' in url else 'videodelivery.net'
43 base_url = 'https://%s/%s/' % (domain, video_id)
44 if '.' in video_id:
45 video_id = self._parse_json(base64.urlsafe_b64decode(
46 video_id.split('.')[1] + '==='), video_id)['sub']
47 manifest_base_url = base_url + 'manifest/video.'
48
49 formats, subtitles = self._extract_m3u8_formats_and_subtitles(
50 manifest_base_url + 'm3u8', video_id, 'mp4',
51 'm3u8_native', m3u8_id='hls', fatal=False)
52 fmts, subs = self._extract_mpd_formats_and_subtitles(
53 manifest_base_url + 'mpd', video_id, mpd_id='dash', fatal=False)
54 formats.extend(fmts)
55 self._merge_subtitles(subs, target=subtitles)
56
57 return {
58 'id': video_id,
59 'title': video_id,
60 'thumbnail': base_url + 'thumbnails/thumbnail.jpg',
61 'formats': formats,
62 'subtitles': subtitles,
63 }
64
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/yt_dlp/extractor/cloudflarestream.py b/yt_dlp/extractor/cloudflarestream.py
--- a/yt_dlp/extractor/cloudflarestream.py
+++ b/yt_dlp/extractor/cloudflarestream.py
@@ -10,7 +10,7 @@
_VALID_URL = r'''(?x)
https?://
(?:
- (?:watch\.)?%s/|
+ (?:[\w-]+\.)?%s/|
%s
)
(?P<id>%s)
@@ -35,6 +35,9 @@
}, {
'url': 'https://embed.videodelivery.net/embed/r4xu.fla9.latest.js?video=81d80727f3022488598f68d323c1ad5e',
'only_matching': True,
+ }, {
+ 'url': 'https://customer-aw5py76sw8wyqzmh.cloudflarestream.com/2463f6d3e06fa29710a337f5f5389fd8/iframe',
+ 'only_matching': True,
}]
def _real_extract(self, url):
|
{"golden_diff": "diff --git a/yt_dlp/extractor/cloudflarestream.py b/yt_dlp/extractor/cloudflarestream.py\n--- a/yt_dlp/extractor/cloudflarestream.py\n+++ b/yt_dlp/extractor/cloudflarestream.py\n@@ -10,7 +10,7 @@\n _VALID_URL = r'''(?x)\n https?://\n (?:\n- (?:watch\\.)?%s/|\n+ (?:[\\w-]+\\.)?%s/|\n %s\n )\n (?P<id>%s)\n@@ -35,6 +35,9 @@\n }, {\n 'url': 'https://embed.videodelivery.net/embed/r4xu.fla9.latest.js?video=81d80727f3022488598f68d323c1ad5e',\n 'only_matching': True,\n+ }, {\n+ 'url': 'https://customer-aw5py76sw8wyqzmh.cloudflarestream.com/2463f6d3e06fa29710a337f5f5389fd8/iframe',\n+ 'only_matching': True,\n }]\n \n def _real_extract(self, url):\n", "issue": "Cloudflare Stream is unsupported\n### DO NOT REMOVE OR SKIP THE ISSUE TEMPLATE\n\n- [X] I understand that I will be **blocked** if I *intentionally* remove or skip any mandatory\\* field\n\n### Checklist\n\n- [X] I'm reporting a new site support request\n- [X] I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))\n- [X] I've checked that all provided URLs are playable in a browser with the same IP and same login details\n- [X] I've checked that none of provided URLs [violate any copyrights](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#is-the-website-primarily-used-for-piracy) or contain any [DRM](https://en.wikipedia.org/wiki/Digital_rights_management) to the best of my knowledge\n- [X] I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates\n- [X] I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)\n- [X] I've read about [sharing account credentials](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#are-you-willing-to-share-account-details-if-needed) and am willing to share it if required\n\n### Region\n\nSpain\n\n### Example URLs\n\nhttps://customer-aw5py76sw8wyqzmh.cloudflarestream.com/2463f6d3e06fa29710a337f5f5389fd8/iframe\n\n### Provide a description that is worded well enough to be understood\n\nSays URL is unsupported.\n\n### Provide verbose output that clearly demonstrates the problem\n\n- [X] Run **your** yt-dlp command with **-vU** flag added (`yt-dlp -vU <your command line>`)\n- [X] If using API, add `'verbose': True` to `YoutubeDL` params instead\n- [X] Copy the WHOLE output (starting with `[debug] Command-line config`) and insert it below\n\n### Complete Verbose Output\n\n```shell\n$ yt-dlp -vU 'https://customer-aw5py76sw8wyqzmh.cloudflarestream.com/2463f6d3e06fa29710a337f5f5389fd8/iframe'\r\n[debug] Command-line config: ['-vU', 'https://customer-aw5py76sw8wyqzmh.cloudflarestream.com/2463f6d3e06fa29710a337f5f5389fd8/iframe']\r\n[debug] Encodings: locale UTF-8, fs utf-8, pref UTF-8, out utf-8, error utf-8, screen utf-8\r\n[debug] yt-dlp version nightly@2024.02.05.232712 from yt-dlp/yt-dlp-nightly-builds [05420227a] (pip)\r\n[debug] Python 3.10.12 (CPython x86_64 64bit) - Linux-5.15.133.1-microsoft-standard-WSL2-x86_64-with-glibc2.35 (OpenSSL 3.0.2 15 Mar 2022, glibc 2.35)\r\n[debug] exe versions: ffmpeg 4.4.2 (setts), ffprobe 4.4.2\r\n[debug] Optional libraries: Cryptodome-3.20.0, brotli-1.1.0, certifi-2023.11.17, mutagen-1.47.0, requests-2.31.0, secretstorage-3.3.1, sqlite3-3.37.2, urllib3-2.1.0, websockets-12.0\r\n[debug] Proxy map: {}\r\n[debug] Request Handlers: urllib, requests, websockets\r\n[debug] Loaded 1832 extractors\r\n[debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp-nightly-builds/releases/latest\r\nLatest version: nightly@2024.02.05.232712 from yt-dlp/yt-dlp-nightly-builds\r\nyt-dlp is up to date (nightly@2024.02.05.232712 from yt-dlp/yt-dlp-nightly-builds)\r\n[generic] Extracting URL: https://customer-aw5py76sw8wyqzmh.cloudflarestream.com/2463f6d3e06fa29710a337f5f5389fd8/iframe\r\n[generic] iframe: Downloading webpage\r\nWARNING: [generic] Falling back on generic information extractor\r\n[generic] iframe: Extracting information\r\n[debug] Looking for embeds\r\nERROR: Unsupported URL: https://customer-aw5py76sw8wyqzmh.cloudflarestream.com/2463f6d3e06fa29710a337f5f5389fd8/iframe\r\nTraceback (most recent call last):\r\n File \"/home/user/.local/lib/python3.10/site-packages/yt_dlp/YoutubeDL.py\", line 1586, in wrapper\r\n return func(self, *args, **kwargs)\r\n File \"/home/user/.local/lib/python3.10/site-packages/yt_dlp/YoutubeDL.py\", line 1721, in __extract_info\r\n ie_result = ie.extract(url)\r\n File \"/home/user/.local/lib/python3.10/site-packages/yt_dlp/extractor/common.py\", line 718, in extract\r\n ie_result = self._real_extract(url)\r\n File \"/home/user/.local/lib/python3.10/site-packages/yt_dlp/extractor/generic.py\", line 2531, in _real_extract\r\n raise UnsupportedError(url)\r\nyt_dlp.utils.UnsupportedError: Unsupported URL: https://customer-aw5py76sw8wyqzmh.cloudflarestream.com/2463f6d3e06fa29710a337f5f5389fd8/iframe\n```\n\n", "before_files": [{"content": "import base64\n\nfrom .common import InfoExtractor\n\n\nclass CloudflareStreamIE(InfoExtractor):\n _DOMAIN_RE = r'(?:cloudflarestream\\.com|(?:videodelivery|bytehighway)\\.net)'\n _EMBED_RE = r'embed\\.%s/embed/[^/]+\\.js\\?.*?\\bvideo=' % _DOMAIN_RE\n _ID_RE = r'[\\da-f]{32}|[\\w-]+\\.[\\w-]+\\.[\\w-]+'\n _VALID_URL = r'''(?x)\n https?://\n (?:\n (?:watch\\.)?%s/|\n %s\n )\n (?P<id>%s)\n ''' % (_DOMAIN_RE, _EMBED_RE, _ID_RE)\n _EMBED_REGEX = [fr'<script[^>]+\\bsrc=([\"\\'])(?P<url>(?:https?:)?//{_EMBED_RE}(?:{_ID_RE}).*?)\\1']\n _TESTS = [{\n 'url': 'https://embed.cloudflarestream.com/embed/we4g.fla9.latest.js?video=31c9291ab41fac05471db4e73aa11717',\n 'info_dict': {\n 'id': '31c9291ab41fac05471db4e73aa11717',\n 'ext': 'mp4',\n 'title': '31c9291ab41fac05471db4e73aa11717',\n },\n 'params': {\n 'skip_download': True,\n },\n }, {\n 'url': 'https://watch.cloudflarestream.com/9df17203414fd1db3e3ed74abbe936c1',\n 'only_matching': True,\n }, {\n 'url': 'https://cloudflarestream.com/31c9291ab41fac05471db4e73aa11717/manifest/video.mpd',\n 'only_matching': True,\n }, {\n 'url': 'https://embed.videodelivery.net/embed/r4xu.fla9.latest.js?video=81d80727f3022488598f68d323c1ad5e',\n 'only_matching': True,\n }]\n\n def _real_extract(self, url):\n video_id = self._match_id(url)\n domain = 'bytehighway.net' if 'bytehighway.net/' in url else 'videodelivery.net'\n base_url = 'https://%s/%s/' % (domain, video_id)\n if '.' in video_id:\n video_id = self._parse_json(base64.urlsafe_b64decode(\n video_id.split('.')[1] + '==='), video_id)['sub']\n manifest_base_url = base_url + 'manifest/video.'\n\n formats, subtitles = self._extract_m3u8_formats_and_subtitles(\n manifest_base_url + 'm3u8', video_id, 'mp4',\n 'm3u8_native', m3u8_id='hls', fatal=False)\n fmts, subs = self._extract_mpd_formats_and_subtitles(\n manifest_base_url + 'mpd', video_id, mpd_id='dash', fatal=False)\n formats.extend(fmts)\n self._merge_subtitles(subs, target=subtitles)\n\n return {\n 'id': video_id,\n 'title': video_id,\n 'thumbnail': base_url + 'thumbnails/thumbnail.jpg',\n 'formats': formats,\n 'subtitles': subtitles,\n }\n", "path": "yt_dlp/extractor/cloudflarestream.py"}], "after_files": [{"content": "import base64\n\nfrom .common import InfoExtractor\n\n\nclass CloudflareStreamIE(InfoExtractor):\n _DOMAIN_RE = r'(?:cloudflarestream\\.com|(?:videodelivery|bytehighway)\\.net)'\n _EMBED_RE = r'embed\\.%s/embed/[^/]+\\.js\\?.*?\\bvideo=' % _DOMAIN_RE\n _ID_RE = r'[\\da-f]{32}|[\\w-]+\\.[\\w-]+\\.[\\w-]+'\n _VALID_URL = r'''(?x)\n https?://\n (?:\n (?:[\\w-]+\\.)?%s/|\n %s\n )\n (?P<id>%s)\n ''' % (_DOMAIN_RE, _EMBED_RE, _ID_RE)\n _EMBED_REGEX = [fr'<script[^>]+\\bsrc=([\"\\'])(?P<url>(?:https?:)?//{_EMBED_RE}(?:{_ID_RE}).*?)\\1']\n _TESTS = [{\n 'url': 'https://embed.cloudflarestream.com/embed/we4g.fla9.latest.js?video=31c9291ab41fac05471db4e73aa11717',\n 'info_dict': {\n 'id': '31c9291ab41fac05471db4e73aa11717',\n 'ext': 'mp4',\n 'title': '31c9291ab41fac05471db4e73aa11717',\n },\n 'params': {\n 'skip_download': True,\n },\n }, {\n 'url': 'https://watch.cloudflarestream.com/9df17203414fd1db3e3ed74abbe936c1',\n 'only_matching': True,\n }, {\n 'url': 'https://cloudflarestream.com/31c9291ab41fac05471db4e73aa11717/manifest/video.mpd',\n 'only_matching': True,\n }, {\n 'url': 'https://embed.videodelivery.net/embed/r4xu.fla9.latest.js?video=81d80727f3022488598f68d323c1ad5e',\n 'only_matching': True,\n }, {\n 'url': 'https://customer-aw5py76sw8wyqzmh.cloudflarestream.com/2463f6d3e06fa29710a337f5f5389fd8/iframe',\n 'only_matching': True,\n }]\n\n def _real_extract(self, url):\n video_id = self._match_id(url)\n domain = 'bytehighway.net' if 'bytehighway.net/' in url else 'videodelivery.net'\n base_url = 'https://%s/%s/' % (domain, video_id)\n if '.' in video_id:\n video_id = self._parse_json(base64.urlsafe_b64decode(\n video_id.split('.')[1] + '==='), video_id)['sub']\n manifest_base_url = base_url + 'manifest/video.'\n\n formats, subtitles = self._extract_m3u8_formats_and_subtitles(\n manifest_base_url + 'm3u8', video_id, 'mp4',\n 'm3u8_native', m3u8_id='hls', fatal=False)\n fmts, subs = self._extract_mpd_formats_and_subtitles(\n manifest_base_url + 'mpd', video_id, mpd_id='dash', fatal=False)\n formats.extend(fmts)\n self._merge_subtitles(subs, target=subtitles)\n\n return {\n 'id': video_id,\n 'title': video_id,\n 'thumbnail': base_url + 'thumbnails/thumbnail.jpg',\n 'formats': formats,\n 'subtitles': subtitles,\n }\n", "path": "yt_dlp/extractor/cloudflarestream.py"}]}
| 2,702 | 288 |
gh_patches_debug_28347
|
rasdani/github-patches
|
git_diff
|
rotki__rotki-2858
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Upgrade to Premium popup still displayed when created user with Premium api key
Created docker image from commit https://github.com/rotki/rotki/commit/a7ef8be31f8059eeff34509667688a87a63edfc6 and created user for a first time importing api keys i was present with `Upgrade to Premium` popup, expected no to see this as far as Premium keys already were imported during user creation.
<img width="797" alt="Screen Shot 2021-05-05 at 6 40 03 PM" src="https://user-images.githubusercontent.com/15807643/117230156-fb5d4880-ae0b-11eb-9bc1-0d1d11cb5c6c.png">
<img width="1888" alt="Screen Shot 2021-05-05 at 6 43 32 PM" src="https://user-images.githubusercontent.com/15807643/117230372-79215400-ae0c-11eb-8f24-dbe79a38bab2.png">
Imported api keys second time to activate Premium
<img width="1814" alt="Screen Shot 2021-05-05 at 6 44 45 PM" src="https://user-images.githubusercontent.com/15807643/117230452-a66e0200-ae0c-11eb-9c3a-d1ab8be0be9a.png">
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `rotkehlchen/premium/sync.py`
Content:
```
1 import base64
2 import logging
3 import shutil
4 from enum import Enum
5 from typing import Any, Dict, NamedTuple, Optional, Tuple
6
7 from typing_extensions import Literal
8
9 from rotkehlchen.data_handler import DataHandler
10 from rotkehlchen.errors import (
11 PremiumAuthenticationError,
12 RemoteError,
13 RotkehlchenPermissionError,
14 UnableToDecryptRemoteData,
15 )
16 from rotkehlchen.logging import RotkehlchenLogsAdapter
17 from rotkehlchen.premium.premium import Premium, PremiumCredentials, premium_create_and_verify
18 from rotkehlchen.utils.misc import timestamp_to_date, ts_now
19
20 logger = logging.getLogger(__name__)
21 log = RotkehlchenLogsAdapter(logger)
22
23
24 class CanSync(Enum):
25 YES = 0
26 NO = 1
27 ASK_USER = 2
28
29
30 class SyncCheckResult(NamedTuple):
31 # The result of the sync check
32 can_sync: CanSync
33 # If result is ASK_USER, what should the message be?
34 message: str
35 payload: Optional[Dict[str, Any]]
36
37
38 class PremiumSyncManager():
39
40 def __init__(self, data: DataHandler, password: str) -> None:
41 # Initialize this with the value saved in the DB
42 self.last_data_upload_ts = data.db.get_last_data_upload_ts()
43 self.data = data
44 self.password = password
45 self.premium: Optional[Premium] = None
46
47 def _can_sync_data_from_server(self, new_account: bool) -> SyncCheckResult:
48 """
49 Checks if the remote data can be pulled from the server.
50
51 Returns a SyncCheckResult denoting whether we can pull for sure,
52 whether we can't pull or whether the user should be asked. If the user
53 should be asked a message is also returned
54 """
55 log.debug('can sync data from server -- start')
56 if self.premium is None:
57 return SyncCheckResult(can_sync=CanSync.NO, message='', payload=None)
58
59 b64_encoded_data, our_hash = self.data.compress_and_encrypt_db(self.password)
60
61 try:
62 metadata = self.premium.query_last_data_metadata()
63 except RemoteError as e:
64 log.debug('can sync data from server failed', error=str(e))
65 return SyncCheckResult(can_sync=CanSync.NO, message='', payload=None)
66
67 if new_account:
68 return SyncCheckResult(can_sync=CanSync.YES, message='', payload=None)
69
70 if not self.data.db.get_premium_sync():
71 # If it's not a new account and the db setting for premium syncing is off stop
72 return SyncCheckResult(can_sync=CanSync.NO, message='', payload=None)
73
74 log.debug(
75 'CAN_PULL',
76 ours=our_hash,
77 theirs=metadata.data_hash,
78 )
79 if our_hash == metadata.data_hash:
80 log.debug('sync from server stopped -- same hash')
81 # same hash -- no need to get anything
82 return SyncCheckResult(can_sync=CanSync.NO, message='', payload=None)
83
84 our_last_write_ts = self.data.db.get_last_write_ts()
85 data_bytes_size = len(base64.b64decode(b64_encoded_data))
86
87 local_more_recent = our_last_write_ts >= metadata.last_modify_ts
88 local_bigger = data_bytes_size >= metadata.data_size
89
90 if local_more_recent and local_bigger:
91 log.debug('sync from server stopped -- local is both newer and bigger')
92 return SyncCheckResult(can_sync=CanSync.NO, message='', payload=None)
93
94 if local_more_recent is False: # remote is more recent
95 message = (
96 'Detected remote database with more recent modification timestamp '
97 'than the local one. '
98 )
99 else: # remote is bigger
100 message = 'Detected remote database with bigger size than the local one. '
101
102 return SyncCheckResult(
103 can_sync=CanSync.ASK_USER,
104 message=message,
105 payload={
106 'local_size': data_bytes_size,
107 'remote_size': metadata.data_size,
108 'local_last_modified': timestamp_to_date(our_last_write_ts),
109 'remote_last_modified': timestamp_to_date(metadata.last_modify_ts),
110 },
111 )
112
113 def _sync_data_from_server_and_replace_local(self) -> Tuple[bool, str]:
114 """
115 Performs syncing of data from server and replaces local db
116
117 Returns true for success and False for error/failure
118
119 May raise:
120 - PremiumAuthenticationError due to an UnableToDecryptRemoteData
121 coming from decompress_and_decrypt_db. This happens when the given password
122 does not match the one on the saved DB.
123 """
124 if self.premium is None:
125 return False, 'Pulling failed. User does not have active premium.'
126
127 try:
128 result = self.premium.pull_data()
129 except RemoteError as e:
130 log.debug('sync from server -- pulling failed.', error=str(e))
131 return False, f'Pulling failed: {str(e)}'
132
133 if result['data'] is None:
134 log.debug('sync from server -- no data found.')
135 return False, 'No data found'
136
137 try:
138 self.data.decompress_and_decrypt_db(self.password, result['data'])
139 except UnableToDecryptRemoteData as e:
140 raise PremiumAuthenticationError(
141 'The given password can not unlock the database that was retrieved from '
142 'the server. Make sure to use the same password as when the account was created.',
143 ) from e
144
145 return True, ''
146
147 def maybe_upload_data_to_server(self, force_upload: bool = False) -> bool:
148 # if user has no premium do nothing
149 if self.premium is None:
150 return False
151
152 if not self.data.db.get_premium_sync() and not force_upload:
153 return False
154
155 # upload only once per hour
156 diff = ts_now() - self.last_data_upload_ts
157 if diff < 3600 and not force_upload:
158 return False
159
160 try:
161 metadata = self.premium.query_last_data_metadata()
162 except RemoteError as e:
163 log.debug('upload to server -- fetching metadata error', error=str(e))
164 return False
165 b64_encoded_data, our_hash = self.data.compress_and_encrypt_db(self.password)
166
167 log.debug(
168 'CAN_PUSH',
169 ours=our_hash,
170 theirs=metadata.data_hash,
171 )
172 if our_hash == metadata.data_hash and not force_upload:
173 log.debug('upload to server stopped -- same hash')
174 # same hash -- no need to upload anything
175 return False
176
177 our_last_write_ts = self.data.db.get_last_write_ts()
178 if our_last_write_ts <= metadata.last_modify_ts and not force_upload:
179 # Server's DB was modified after our local DB
180 log.debug(
181 f'upload to server stopped -- remote db({metadata.last_modify_ts}) '
182 f'more recent than local({our_last_write_ts})',
183 )
184 return False
185
186 data_bytes_size = len(base64.b64decode(b64_encoded_data))
187 if data_bytes_size < metadata.data_size and not force_upload:
188 # Let's be conservative.
189 # TODO: Here perhaps prompt user in the future
190 log.debug(
191 f'upload to server stopped -- remote db({metadata.data_size}) '
192 f'bigger than local({data_bytes_size})',
193 )
194 return False
195
196 try:
197 self.premium.upload_data(
198 data_blob=b64_encoded_data,
199 our_hash=our_hash,
200 last_modify_ts=our_last_write_ts,
201 compression_type='zlib',
202 )
203 except RemoteError as e:
204 log.debug('upload to server -- upload error', error=str(e))
205 return False
206
207 # update the last data upload value
208 self.last_data_upload_ts = ts_now()
209 self.data.db.update_last_data_upload_ts(self.last_data_upload_ts)
210 log.debug('upload to server -- success')
211 return True
212
213 def sync_data(self, action: Literal['upload', 'download']) -> Tuple[bool, str]:
214 msg = ''
215
216 if action == 'upload':
217 success = self.maybe_upload_data_to_server(force_upload=True)
218
219 if not success:
220 msg = 'Upload failed'
221 return success, msg
222
223 return self._sync_data_from_server_and_replace_local()
224
225 def try_premium_at_start(
226 self,
227 given_premium_credentials: Optional[PremiumCredentials],
228 username: str,
229 create_new: bool,
230 sync_approval: Literal['yes', 'no', 'unknown'],
231 ) -> Optional[Premium]:
232 """
233 Check if new user provided api pair or we already got one in the DB
234
235 Returns the created premium if user's premium credentials were fine.
236
237 If not it will raise PremiumAuthenticationError.
238
239 If no credentials were given it returns None
240 """
241
242 if given_premium_credentials is not None:
243 assert create_new, 'We should never get here for an already existing account'
244
245 try:
246 self.premium = premium_create_and_verify(given_premium_credentials)
247 except PremiumAuthenticationError as e:
248 log.error('Given API key is invalid')
249 # At this point we are at a new user trying to create an account with
250 # premium API keys and we failed. But a directory was created. Remove it.
251 # But create a backup of it in case something went really wrong
252 # and the directory contained data we did not want to lose
253 shutil.move(
254 self.data.user_data_dir, # type: ignore
255 self.data.data_directory / f'auto_backup_{username}_{ts_now()}',
256 )
257 raise PremiumAuthenticationError(
258 'Could not verify keys for the new account. '
259 '{}'.format(str(e)),
260 ) from e
261
262 # else, if we got premium data in the DB initialize it and try to sync with the server
263 db_credentials = self.data.db.get_rotkehlchen_premium()
264 if db_credentials:
265 assert not create_new, 'We should never get here for a new account'
266 try:
267 self.premium = premium_create_and_verify(db_credentials)
268 except PremiumAuthenticationError as e:
269 message = (
270 f'Could not authenticate with the rotkehlchen server with '
271 f'the API keys found in the Database. Error: {str(e)}'
272 )
273 log.error(message)
274 raise PremiumAuthenticationError(message) from e
275
276 if self.premium is None:
277 return None
278
279 result = self._can_sync_data_from_server(new_account=create_new)
280 if result.can_sync == CanSync.ASK_USER:
281 if sync_approval == 'unknown':
282 log.info('Remote DB is possibly newer. Ask user.')
283 raise RotkehlchenPermissionError(result.message, result.payload)
284
285 if sync_approval == 'yes':
286 log.info('User approved data sync from server')
287 if self._sync_data_from_server_and_replace_local()[0]:
288 if create_new:
289 # if we successfully synced data from the server and this is
290 # a new account, make sure the api keys are properly stored
291 # in the DB
292 self.data.db.set_rotkehlchen_premium(self.premium.credentials)
293
294 else:
295 log.debug('Could sync data from server but user refused')
296 elif result.can_sync == CanSync.YES:
297 log.info('User approved data sync from server')
298 if self._sync_data_from_server_and_replace_local()[0]:
299 if create_new:
300 # if we successfully synced data from the server and this is
301 # a new account, make sure the api keys are properly stored
302 # in the DB
303 self.data.db.set_rotkehlchen_premium(self.premium.credentials)
304
305 # else result.can_sync was no, so we do nothing
306
307 # Success, return premium
308 return self.premium
309
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/rotkehlchen/premium/sync.py b/rotkehlchen/premium/sync.py
--- a/rotkehlchen/premium/sync.py
+++ b/rotkehlchen/premium/sync.py
@@ -284,25 +284,18 @@
if sync_approval == 'yes':
log.info('User approved data sync from server')
- if self._sync_data_from_server_and_replace_local()[0]:
- if create_new:
- # if we successfully synced data from the server and this is
- # a new account, make sure the api keys are properly stored
- # in the DB
- self.data.db.set_rotkehlchen_premium(self.premium.credentials)
+ self._sync_data_from_server_and_replace_local() # this may raise due to password
else:
log.debug('Could sync data from server but user refused')
elif result.can_sync == CanSync.YES:
log.info('User approved data sync from server')
- if self._sync_data_from_server_and_replace_local()[0]:
- if create_new:
- # if we successfully synced data from the server and this is
- # a new account, make sure the api keys are properly stored
- # in the DB
- self.data.db.set_rotkehlchen_premium(self.premium.credentials)
-
- # else result.can_sync was no, so we do nothing
+ self._sync_data_from_server_and_replace_local() # this may raise due to password
+
+ if create_new:
+ # if this is a new account, make sure the api keys are properly stored
+ # in the DB
+ self.data.db.set_rotkehlchen_premium(self.premium.credentials)
# Success, return premium
return self.premium
|
{"golden_diff": "diff --git a/rotkehlchen/premium/sync.py b/rotkehlchen/premium/sync.py\n--- a/rotkehlchen/premium/sync.py\n+++ b/rotkehlchen/premium/sync.py\n@@ -284,25 +284,18 @@\n \n if sync_approval == 'yes':\n log.info('User approved data sync from server')\n- if self._sync_data_from_server_and_replace_local()[0]:\n- if create_new:\n- # if we successfully synced data from the server and this is\n- # a new account, make sure the api keys are properly stored\n- # in the DB\n- self.data.db.set_rotkehlchen_premium(self.premium.credentials)\n+ self._sync_data_from_server_and_replace_local() # this may raise due to password\n \n else:\n log.debug('Could sync data from server but user refused')\n elif result.can_sync == CanSync.YES:\n log.info('User approved data sync from server')\n- if self._sync_data_from_server_and_replace_local()[0]:\n- if create_new:\n- # if we successfully synced data from the server and this is\n- # a new account, make sure the api keys are properly stored\n- # in the DB\n- self.data.db.set_rotkehlchen_premium(self.premium.credentials)\n-\n- # else result.can_sync was no, so we do nothing\n+ self._sync_data_from_server_and_replace_local() # this may raise due to password\n+\n+ if create_new:\n+ # if this is a new account, make sure the api keys are properly stored\n+ # in the DB\n+ self.data.db.set_rotkehlchen_premium(self.premium.credentials)\n \n # Success, return premium\n return self.premium\n", "issue": "Upgrade to Premium popup still displayed when created user with Premium api key \nCreated docker image from commit https://github.com/rotki/rotki/commit/a7ef8be31f8059eeff34509667688a87a63edfc6 and created user for a first time importing api keys i was present with `Upgrade to Premium` popup, expected no to see this as far as Premium keys already were imported during user creation. \r\n\r\n<img width=\"797\" alt=\"Screen Shot 2021-05-05 at 6 40 03 PM\" src=\"https://user-images.githubusercontent.com/15807643/117230156-fb5d4880-ae0b-11eb-9bc1-0d1d11cb5c6c.png\">\r\n<img width=\"1888\" alt=\"Screen Shot 2021-05-05 at 6 43 32 PM\" src=\"https://user-images.githubusercontent.com/15807643/117230372-79215400-ae0c-11eb-8f24-dbe79a38bab2.png\">\r\n\r\nImported api keys second time to activate Premium\r\n<img width=\"1814\" alt=\"Screen Shot 2021-05-05 at 6 44 45 PM\" src=\"https://user-images.githubusercontent.com/15807643/117230452-a66e0200-ae0c-11eb-9c3a-d1ab8be0be9a.png\">\r\n\r\n\r\n\n", "before_files": [{"content": "import base64\nimport logging\nimport shutil\nfrom enum import Enum\nfrom typing import Any, Dict, NamedTuple, Optional, Tuple\n\nfrom typing_extensions import Literal\n\nfrom rotkehlchen.data_handler import DataHandler\nfrom rotkehlchen.errors import (\n PremiumAuthenticationError,\n RemoteError,\n RotkehlchenPermissionError,\n UnableToDecryptRemoteData,\n)\nfrom rotkehlchen.logging import RotkehlchenLogsAdapter\nfrom rotkehlchen.premium.premium import Premium, PremiumCredentials, premium_create_and_verify\nfrom rotkehlchen.utils.misc import timestamp_to_date, ts_now\n\nlogger = logging.getLogger(__name__)\nlog = RotkehlchenLogsAdapter(logger)\n\n\nclass CanSync(Enum):\n YES = 0\n NO = 1\n ASK_USER = 2\n\n\nclass SyncCheckResult(NamedTuple):\n # The result of the sync check\n can_sync: CanSync\n # If result is ASK_USER, what should the message be?\n message: str\n payload: Optional[Dict[str, Any]]\n\n\nclass PremiumSyncManager():\n\n def __init__(self, data: DataHandler, password: str) -> None:\n # Initialize this with the value saved in the DB\n self.last_data_upload_ts = data.db.get_last_data_upload_ts()\n self.data = data\n self.password = password\n self.premium: Optional[Premium] = None\n\n def _can_sync_data_from_server(self, new_account: bool) -> SyncCheckResult:\n \"\"\"\n Checks if the remote data can be pulled from the server.\n\n Returns a SyncCheckResult denoting whether we can pull for sure,\n whether we can't pull or whether the user should be asked. If the user\n should be asked a message is also returned\n \"\"\"\n log.debug('can sync data from server -- start')\n if self.premium is None:\n return SyncCheckResult(can_sync=CanSync.NO, message='', payload=None)\n\n b64_encoded_data, our_hash = self.data.compress_and_encrypt_db(self.password)\n\n try:\n metadata = self.premium.query_last_data_metadata()\n except RemoteError as e:\n log.debug('can sync data from server failed', error=str(e))\n return SyncCheckResult(can_sync=CanSync.NO, message='', payload=None)\n\n if new_account:\n return SyncCheckResult(can_sync=CanSync.YES, message='', payload=None)\n\n if not self.data.db.get_premium_sync():\n # If it's not a new account and the db setting for premium syncing is off stop\n return SyncCheckResult(can_sync=CanSync.NO, message='', payload=None)\n\n log.debug(\n 'CAN_PULL',\n ours=our_hash,\n theirs=metadata.data_hash,\n )\n if our_hash == metadata.data_hash:\n log.debug('sync from server stopped -- same hash')\n # same hash -- no need to get anything\n return SyncCheckResult(can_sync=CanSync.NO, message='', payload=None)\n\n our_last_write_ts = self.data.db.get_last_write_ts()\n data_bytes_size = len(base64.b64decode(b64_encoded_data))\n\n local_more_recent = our_last_write_ts >= metadata.last_modify_ts\n local_bigger = data_bytes_size >= metadata.data_size\n\n if local_more_recent and local_bigger:\n log.debug('sync from server stopped -- local is both newer and bigger')\n return SyncCheckResult(can_sync=CanSync.NO, message='', payload=None)\n\n if local_more_recent is False: # remote is more recent\n message = (\n 'Detected remote database with more recent modification timestamp '\n 'than the local one. '\n )\n else: # remote is bigger\n message = 'Detected remote database with bigger size than the local one. '\n\n return SyncCheckResult(\n can_sync=CanSync.ASK_USER,\n message=message,\n payload={\n 'local_size': data_bytes_size,\n 'remote_size': metadata.data_size,\n 'local_last_modified': timestamp_to_date(our_last_write_ts),\n 'remote_last_modified': timestamp_to_date(metadata.last_modify_ts),\n },\n )\n\n def _sync_data_from_server_and_replace_local(self) -> Tuple[bool, str]:\n \"\"\"\n Performs syncing of data from server and replaces local db\n\n Returns true for success and False for error/failure\n\n May raise:\n - PremiumAuthenticationError due to an UnableToDecryptRemoteData\n coming from decompress_and_decrypt_db. This happens when the given password\n does not match the one on the saved DB.\n \"\"\"\n if self.premium is None:\n return False, 'Pulling failed. User does not have active premium.'\n\n try:\n result = self.premium.pull_data()\n except RemoteError as e:\n log.debug('sync from server -- pulling failed.', error=str(e))\n return False, f'Pulling failed: {str(e)}'\n\n if result['data'] is None:\n log.debug('sync from server -- no data found.')\n return False, 'No data found'\n\n try:\n self.data.decompress_and_decrypt_db(self.password, result['data'])\n except UnableToDecryptRemoteData as e:\n raise PremiumAuthenticationError(\n 'The given password can not unlock the database that was retrieved from '\n 'the server. Make sure to use the same password as when the account was created.',\n ) from e\n\n return True, ''\n\n def maybe_upload_data_to_server(self, force_upload: bool = False) -> bool:\n # if user has no premium do nothing\n if self.premium is None:\n return False\n\n if not self.data.db.get_premium_sync() and not force_upload:\n return False\n\n # upload only once per hour\n diff = ts_now() - self.last_data_upload_ts\n if diff < 3600 and not force_upload:\n return False\n\n try:\n metadata = self.premium.query_last_data_metadata()\n except RemoteError as e:\n log.debug('upload to server -- fetching metadata error', error=str(e))\n return False\n b64_encoded_data, our_hash = self.data.compress_and_encrypt_db(self.password)\n\n log.debug(\n 'CAN_PUSH',\n ours=our_hash,\n theirs=metadata.data_hash,\n )\n if our_hash == metadata.data_hash and not force_upload:\n log.debug('upload to server stopped -- same hash')\n # same hash -- no need to upload anything\n return False\n\n our_last_write_ts = self.data.db.get_last_write_ts()\n if our_last_write_ts <= metadata.last_modify_ts and not force_upload:\n # Server's DB was modified after our local DB\n log.debug(\n f'upload to server stopped -- remote db({metadata.last_modify_ts}) '\n f'more recent than local({our_last_write_ts})',\n )\n return False\n\n data_bytes_size = len(base64.b64decode(b64_encoded_data))\n if data_bytes_size < metadata.data_size and not force_upload:\n # Let's be conservative.\n # TODO: Here perhaps prompt user in the future\n log.debug(\n f'upload to server stopped -- remote db({metadata.data_size}) '\n f'bigger than local({data_bytes_size})',\n )\n return False\n\n try:\n self.premium.upload_data(\n data_blob=b64_encoded_data,\n our_hash=our_hash,\n last_modify_ts=our_last_write_ts,\n compression_type='zlib',\n )\n except RemoteError as e:\n log.debug('upload to server -- upload error', error=str(e))\n return False\n\n # update the last data upload value\n self.last_data_upload_ts = ts_now()\n self.data.db.update_last_data_upload_ts(self.last_data_upload_ts)\n log.debug('upload to server -- success')\n return True\n\n def sync_data(self, action: Literal['upload', 'download']) -> Tuple[bool, str]:\n msg = ''\n\n if action == 'upload':\n success = self.maybe_upload_data_to_server(force_upload=True)\n\n if not success:\n msg = 'Upload failed'\n return success, msg\n\n return self._sync_data_from_server_and_replace_local()\n\n def try_premium_at_start(\n self,\n given_premium_credentials: Optional[PremiumCredentials],\n username: str,\n create_new: bool,\n sync_approval: Literal['yes', 'no', 'unknown'],\n ) -> Optional[Premium]:\n \"\"\"\n Check if new user provided api pair or we already got one in the DB\n\n Returns the created premium if user's premium credentials were fine.\n\n If not it will raise PremiumAuthenticationError.\n\n If no credentials were given it returns None\n \"\"\"\n\n if given_premium_credentials is not None:\n assert create_new, 'We should never get here for an already existing account'\n\n try:\n self.premium = premium_create_and_verify(given_premium_credentials)\n except PremiumAuthenticationError as e:\n log.error('Given API key is invalid')\n # At this point we are at a new user trying to create an account with\n # premium API keys and we failed. But a directory was created. Remove it.\n # But create a backup of it in case something went really wrong\n # and the directory contained data we did not want to lose\n shutil.move(\n self.data.user_data_dir, # type: ignore\n self.data.data_directory / f'auto_backup_{username}_{ts_now()}',\n )\n raise PremiumAuthenticationError(\n 'Could not verify keys for the new account. '\n '{}'.format(str(e)),\n ) from e\n\n # else, if we got premium data in the DB initialize it and try to sync with the server\n db_credentials = self.data.db.get_rotkehlchen_premium()\n if db_credentials:\n assert not create_new, 'We should never get here for a new account'\n try:\n self.premium = premium_create_and_verify(db_credentials)\n except PremiumAuthenticationError as e:\n message = (\n f'Could not authenticate with the rotkehlchen server with '\n f'the API keys found in the Database. Error: {str(e)}'\n )\n log.error(message)\n raise PremiumAuthenticationError(message) from e\n\n if self.premium is None:\n return None\n\n result = self._can_sync_data_from_server(new_account=create_new)\n if result.can_sync == CanSync.ASK_USER:\n if sync_approval == 'unknown':\n log.info('Remote DB is possibly newer. Ask user.')\n raise RotkehlchenPermissionError(result.message, result.payload)\n\n if sync_approval == 'yes':\n log.info('User approved data sync from server')\n if self._sync_data_from_server_and_replace_local()[0]:\n if create_new:\n # if we successfully synced data from the server and this is\n # a new account, make sure the api keys are properly stored\n # in the DB\n self.data.db.set_rotkehlchen_premium(self.premium.credentials)\n\n else:\n log.debug('Could sync data from server but user refused')\n elif result.can_sync == CanSync.YES:\n log.info('User approved data sync from server')\n if self._sync_data_from_server_and_replace_local()[0]:\n if create_new:\n # if we successfully synced data from the server and this is\n # a new account, make sure the api keys are properly stored\n # in the DB\n self.data.db.set_rotkehlchen_premium(self.premium.credentials)\n\n # else result.can_sync was no, so we do nothing\n\n # Success, return premium\n return self.premium\n", "path": "rotkehlchen/premium/sync.py"}], "after_files": [{"content": "import base64\nimport logging\nimport shutil\nfrom enum import Enum\nfrom typing import Any, Dict, NamedTuple, Optional, Tuple\n\nfrom typing_extensions import Literal\n\nfrom rotkehlchen.data_handler import DataHandler\nfrom rotkehlchen.errors import (\n PremiumAuthenticationError,\n RemoteError,\n RotkehlchenPermissionError,\n UnableToDecryptRemoteData,\n)\nfrom rotkehlchen.logging import RotkehlchenLogsAdapter\nfrom rotkehlchen.premium.premium import Premium, PremiumCredentials, premium_create_and_verify\nfrom rotkehlchen.utils.misc import timestamp_to_date, ts_now\n\nlogger = logging.getLogger(__name__)\nlog = RotkehlchenLogsAdapter(logger)\n\n\nclass CanSync(Enum):\n YES = 0\n NO = 1\n ASK_USER = 2\n\n\nclass SyncCheckResult(NamedTuple):\n # The result of the sync check\n can_sync: CanSync\n # If result is ASK_USER, what should the message be?\n message: str\n payload: Optional[Dict[str, Any]]\n\n\nclass PremiumSyncManager():\n\n def __init__(self, data: DataHandler, password: str) -> None:\n # Initialize this with the value saved in the DB\n self.last_data_upload_ts = data.db.get_last_data_upload_ts()\n self.data = data\n self.password = password\n self.premium: Optional[Premium] = None\n\n def _can_sync_data_from_server(self, new_account: bool) -> SyncCheckResult:\n \"\"\"\n Checks if the remote data can be pulled from the server.\n\n Returns a SyncCheckResult denoting whether we can pull for sure,\n whether we can't pull or whether the user should be asked. If the user\n should be asked a message is also returned\n \"\"\"\n log.debug('can sync data from server -- start')\n if self.premium is None:\n return SyncCheckResult(can_sync=CanSync.NO, message='', payload=None)\n\n b64_encoded_data, our_hash = self.data.compress_and_encrypt_db(self.password)\n\n try:\n metadata = self.premium.query_last_data_metadata()\n except RemoteError as e:\n log.debug('can sync data from server failed', error=str(e))\n return SyncCheckResult(can_sync=CanSync.NO, message='', payload=None)\n\n if new_account:\n return SyncCheckResult(can_sync=CanSync.YES, message='', payload=None)\n\n if not self.data.db.get_premium_sync():\n # If it's not a new account and the db setting for premium syncing is off stop\n return SyncCheckResult(can_sync=CanSync.NO, message='', payload=None)\n\n log.debug(\n 'CAN_PULL',\n ours=our_hash,\n theirs=metadata.data_hash,\n )\n if our_hash == metadata.data_hash:\n log.debug('sync from server stopped -- same hash')\n # same hash -- no need to get anything\n return SyncCheckResult(can_sync=CanSync.NO, message='', payload=None)\n\n our_last_write_ts = self.data.db.get_last_write_ts()\n data_bytes_size = len(base64.b64decode(b64_encoded_data))\n\n local_more_recent = our_last_write_ts >= metadata.last_modify_ts\n local_bigger = data_bytes_size >= metadata.data_size\n\n if local_more_recent and local_bigger:\n log.debug('sync from server stopped -- local is both newer and bigger')\n return SyncCheckResult(can_sync=CanSync.NO, message='', payload=None)\n\n if local_more_recent is False: # remote is more recent\n message = (\n 'Detected remote database with more recent modification timestamp '\n 'than the local one. '\n )\n else: # remote is bigger\n message = 'Detected remote database with bigger size than the local one. '\n\n return SyncCheckResult(\n can_sync=CanSync.ASK_USER,\n message=message,\n payload={\n 'local_size': data_bytes_size,\n 'remote_size': metadata.data_size,\n 'local_last_modified': timestamp_to_date(our_last_write_ts),\n 'remote_last_modified': timestamp_to_date(metadata.last_modify_ts),\n },\n )\n\n def _sync_data_from_server_and_replace_local(self) -> Tuple[bool, str]:\n \"\"\"\n Performs syncing of data from server and replaces local db\n\n Returns true for success and False for error/failure\n\n May raise:\n - PremiumAuthenticationError due to an UnableToDecryptRemoteData\n coming from decompress_and_decrypt_db. This happens when the given password\n does not match the one on the saved DB.\n \"\"\"\n if self.premium is None:\n return False, 'Pulling failed. User does not have active premium.'\n\n try:\n result = self.premium.pull_data()\n except RemoteError as e:\n log.debug('sync from server -- pulling failed.', error=str(e))\n return False, f'Pulling failed: {str(e)}'\n\n if result['data'] is None:\n log.debug('sync from server -- no data found.')\n return False, 'No data found'\n\n try:\n self.data.decompress_and_decrypt_db(self.password, result['data'])\n except UnableToDecryptRemoteData as e:\n raise PremiumAuthenticationError(\n 'The given password can not unlock the database that was retrieved from '\n 'the server. Make sure to use the same password as when the account was created.',\n ) from e\n\n return True, ''\n\n def maybe_upload_data_to_server(self, force_upload: bool = False) -> bool:\n # if user has no premium do nothing\n if self.premium is None:\n return False\n\n if not self.data.db.get_premium_sync() and not force_upload:\n return False\n\n # upload only once per hour\n diff = ts_now() - self.last_data_upload_ts\n if diff < 3600 and not force_upload:\n return False\n\n try:\n metadata = self.premium.query_last_data_metadata()\n except RemoteError as e:\n log.debug('upload to server -- fetching metadata error', error=str(e))\n return False\n b64_encoded_data, our_hash = self.data.compress_and_encrypt_db(self.password)\n\n log.debug(\n 'CAN_PUSH',\n ours=our_hash,\n theirs=metadata.data_hash,\n )\n if our_hash == metadata.data_hash and not force_upload:\n log.debug('upload to server stopped -- same hash')\n # same hash -- no need to upload anything\n return False\n\n our_last_write_ts = self.data.db.get_last_write_ts()\n if our_last_write_ts <= metadata.last_modify_ts and not force_upload:\n # Server's DB was modified after our local DB\n log.debug(\n f'upload to server stopped -- remote db({metadata.last_modify_ts}) '\n f'more recent than local({our_last_write_ts})',\n )\n return False\n\n data_bytes_size = len(base64.b64decode(b64_encoded_data))\n if data_bytes_size < metadata.data_size and not force_upload:\n # Let's be conservative.\n # TODO: Here perhaps prompt user in the future\n log.debug(\n f'upload to server stopped -- remote db({metadata.data_size}) '\n f'bigger than local({data_bytes_size})',\n )\n return False\n\n try:\n self.premium.upload_data(\n data_blob=b64_encoded_data,\n our_hash=our_hash,\n last_modify_ts=our_last_write_ts,\n compression_type='zlib',\n )\n except RemoteError as e:\n log.debug('upload to server -- upload error', error=str(e))\n return False\n\n # update the last data upload value\n self.last_data_upload_ts = ts_now()\n self.data.db.update_last_data_upload_ts(self.last_data_upload_ts)\n log.debug('upload to server -- success')\n return True\n\n def sync_data(self, action: Literal['upload', 'download']) -> Tuple[bool, str]:\n msg = ''\n\n if action == 'upload':\n success = self.maybe_upload_data_to_server(force_upload=True)\n\n if not success:\n msg = 'Upload failed'\n return success, msg\n\n return self._sync_data_from_server_and_replace_local()\n\n def try_premium_at_start(\n self,\n given_premium_credentials: Optional[PremiumCredentials],\n username: str,\n create_new: bool,\n sync_approval: Literal['yes', 'no', 'unknown'],\n ) -> Optional[Premium]:\n \"\"\"\n Check if new user provided api pair or we already got one in the DB\n\n Returns the created premium if user's premium credentials were fine.\n\n If not it will raise PremiumAuthenticationError.\n\n If no credentials were given it returns None\n \"\"\"\n\n if given_premium_credentials is not None:\n assert create_new, 'We should never get here for an already existing account'\n\n try:\n self.premium = premium_create_and_verify(given_premium_credentials)\n except PremiumAuthenticationError as e:\n log.error('Given API key is invalid')\n # At this point we are at a new user trying to create an account with\n # premium API keys and we failed. But a directory was created. Remove it.\n # But create a backup of it in case something went really wrong\n # and the directory contained data we did not want to lose\n shutil.move(\n self.data.user_data_dir, # type: ignore\n self.data.data_directory / f'auto_backup_{username}_{ts_now()}',\n )\n raise PremiumAuthenticationError(\n 'Could not verify keys for the new account. '\n '{}'.format(str(e)),\n ) from e\n\n # else, if we got premium data in the DB initialize it and try to sync with the server\n db_credentials = self.data.db.get_rotkehlchen_premium()\n if db_credentials:\n assert not create_new, 'We should never get here for a new account'\n try:\n self.premium = premium_create_and_verify(db_credentials)\n except PremiumAuthenticationError as e:\n message = (\n f'Could not authenticate with the rotkehlchen server with '\n f'the API keys found in the Database. Error: {str(e)}'\n )\n log.error(message)\n raise PremiumAuthenticationError(message) from e\n\n if self.premium is None:\n return None\n\n result = self._can_sync_data_from_server(new_account=create_new)\n if result.can_sync == CanSync.ASK_USER:\n if sync_approval == 'unknown':\n log.info('Remote DB is possibly newer. Ask user.')\n raise RotkehlchenPermissionError(result.message, result.payload)\n\n if sync_approval == 'yes':\n log.info('User approved data sync from server')\n self._sync_data_from_server_and_replace_local() # this may raise due to password\n\n else:\n log.debug('Could sync data from server but user refused')\n elif result.can_sync == CanSync.YES:\n log.info('User approved data sync from server')\n self._sync_data_from_server_and_replace_local() # this may raise due to password\n\n if create_new:\n # if this is a new account, make sure the api keys are properly stored\n # in the DB\n self.data.db.set_rotkehlchen_premium(self.premium.credentials)\n\n # Success, return premium\n return self.premium\n", "path": "rotkehlchen/premium/sync.py"}]}
| 4,014 | 399 |
gh_patches_debug_67277
|
rasdani/github-patches
|
git_diff
|
litestar-org__litestar-2244
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
StaticFilesConfig and virtual directories
I'm trying to write a ``FileSystemProtocol`` to load files from the package data using [importlib_resources](https://importlib-resources.readthedocs.io/en/latest/using.html#). But because ``directories`` is defined as ``DirectoryPath``, pydantic checks if the given directories exist in the local filesystem.
This is not generally true, especially in any kind of virtual filesystem (e.g. a zipped package). I think this condition should be relaxed to support virtual filesystems.
https://github.com/starlite-api/starlite/blob/9bb6dcd57c10a591377cf8e3a537e9292566d5b9/starlite/config/static_files.py#L32
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `litestar/cli/main.py`
Content:
```
1 from __future__ import annotations
2
3 import sys
4 from pathlib import Path
5 from typing import TYPE_CHECKING
6
7 from ._utils import RICH_CLICK_INSTALLED, LitestarEnv, LitestarExtensionGroup
8 from .commands import core, schema, sessions
9
10 if TYPE_CHECKING or not RICH_CLICK_INSTALLED: # pragma: no cover
11 import click
12 from click import Context, group, option, pass_context
13 from click import Path as ClickPath
14 else:
15 import rich_click as click
16 from rich_click import Context, group, option, pass_context
17 from rich_click import Path as ClickPath
18 from rich_click.cli import patch as rich_click_patch
19
20 rich_click_patch()
21 click.rich_click.USE_RICH_MARKUP = True
22 click.rich_click.USE_MARKDOWN = False
23 click.rich_click.SHOW_ARGUMENTS = True
24 click.rich_click.GROUP_ARGUMENTS_OPTIONS = True
25 click.rich_click.SHOW_ARGUMENTS = True
26 click.rich_click.GROUP_ARGUMENTS_OPTIONS = True
27 click.rich_click.STYLE_ERRORS_SUGGESTION = "magenta italic"
28 click.rich_click.ERRORS_SUGGESTION = ""
29 click.rich_click.ERRORS_EPILOGUE = ""
30 click.rich_click.MAX_WIDTH = 100
31 click.rich_click.SHOW_METAVARS_COLUMN = True
32 click.rich_click.APPEND_METAVARS_HELP = True
33
34
35 __all__ = ("litestar_group",)
36
37
38 @group(cls=LitestarExtensionGroup, context_settings={"help_option_names": ["-h", "--help"]})
39 @option("--app", "app_path", help="Module path to a Litestar application")
40 @option(
41 "--app-dir",
42 help="Look for APP in the specified directory, by adding this to the PYTHONPATH. Defaults to the current working directory.",
43 default=None,
44 type=ClickPath(dir_okay=True, file_okay=False, path_type=Path),
45 show_default=False,
46 )
47 @pass_context
48 def litestar_group(ctx: Context, app_path: str | None, app_dir: Path | None = None) -> None:
49 """Litestar CLI."""
50 sys.path.append(str(app_dir))
51
52 if ctx.obj is None: # env has not been loaded yet, so we can lazy load it
53 ctx.obj = lambda: LitestarEnv.from_env(app_path)
54
55
56 # add sub commands here
57
58 litestar_group.add_command(core.info_command)
59 litestar_group.add_command(core.run_command)
60 litestar_group.add_command(core.routes_command)
61 litestar_group.add_command(core.version_command)
62 litestar_group.add_command(sessions.sessions_group)
63 litestar_group.add_command(schema.schema_group)
64
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/litestar/cli/main.py b/litestar/cli/main.py
--- a/litestar/cli/main.py
+++ b/litestar/cli/main.py
@@ -27,7 +27,7 @@
click.rich_click.STYLE_ERRORS_SUGGESTION = "magenta italic"
click.rich_click.ERRORS_SUGGESTION = ""
click.rich_click.ERRORS_EPILOGUE = ""
- click.rich_click.MAX_WIDTH = 100
+ click.rich_click.MAX_WIDTH = 80
click.rich_click.SHOW_METAVARS_COLUMN = True
click.rich_click.APPEND_METAVARS_HELP = True
|
{"golden_diff": "diff --git a/litestar/cli/main.py b/litestar/cli/main.py\n--- a/litestar/cli/main.py\n+++ b/litestar/cli/main.py\n@@ -27,7 +27,7 @@\n click.rich_click.STYLE_ERRORS_SUGGESTION = \"magenta italic\"\n click.rich_click.ERRORS_SUGGESTION = \"\"\n click.rich_click.ERRORS_EPILOGUE = \"\"\n- click.rich_click.MAX_WIDTH = 100\n+ click.rich_click.MAX_WIDTH = 80\n click.rich_click.SHOW_METAVARS_COLUMN = True\n click.rich_click.APPEND_METAVARS_HELP = True\n", "issue": "StaticFilesConfig and virtual directories\nI'm trying to write a ``FileSystemProtocol`` to load files from the package data using [importlib_resources](https://importlib-resources.readthedocs.io/en/latest/using.html#). But because ``directories`` is defined as ``DirectoryPath``, pydantic checks if the given directories exist in the local filesystem. \r\n\r\nThis is not generally true, especially in any kind of virtual filesystem (e.g. a zipped package). I think this condition should be relaxed to support virtual filesystems.\r\n\r\nhttps://github.com/starlite-api/starlite/blob/9bb6dcd57c10a591377cf8e3a537e9292566d5b9/starlite/config/static_files.py#L32\n", "before_files": [{"content": "from __future__ import annotations\n\nimport sys\nfrom pathlib import Path\nfrom typing import TYPE_CHECKING\n\nfrom ._utils import RICH_CLICK_INSTALLED, LitestarEnv, LitestarExtensionGroup\nfrom .commands import core, schema, sessions\n\nif TYPE_CHECKING or not RICH_CLICK_INSTALLED: # pragma: no cover\n import click\n from click import Context, group, option, pass_context\n from click import Path as ClickPath\nelse:\n import rich_click as click\n from rich_click import Context, group, option, pass_context\n from rich_click import Path as ClickPath\n from rich_click.cli import patch as rich_click_patch\n\n rich_click_patch()\n click.rich_click.USE_RICH_MARKUP = True\n click.rich_click.USE_MARKDOWN = False\n click.rich_click.SHOW_ARGUMENTS = True\n click.rich_click.GROUP_ARGUMENTS_OPTIONS = True\n click.rich_click.SHOW_ARGUMENTS = True\n click.rich_click.GROUP_ARGUMENTS_OPTIONS = True\n click.rich_click.STYLE_ERRORS_SUGGESTION = \"magenta italic\"\n click.rich_click.ERRORS_SUGGESTION = \"\"\n click.rich_click.ERRORS_EPILOGUE = \"\"\n click.rich_click.MAX_WIDTH = 100\n click.rich_click.SHOW_METAVARS_COLUMN = True\n click.rich_click.APPEND_METAVARS_HELP = True\n\n\n__all__ = (\"litestar_group\",)\n\n\n@group(cls=LitestarExtensionGroup, context_settings={\"help_option_names\": [\"-h\", \"--help\"]})\n@option(\"--app\", \"app_path\", help=\"Module path to a Litestar application\")\n@option(\n \"--app-dir\",\n help=\"Look for APP in the specified directory, by adding this to the PYTHONPATH. Defaults to the current working directory.\",\n default=None,\n type=ClickPath(dir_okay=True, file_okay=False, path_type=Path),\n show_default=False,\n)\n@pass_context\ndef litestar_group(ctx: Context, app_path: str | None, app_dir: Path | None = None) -> None:\n \"\"\"Litestar CLI.\"\"\"\n sys.path.append(str(app_dir))\n\n if ctx.obj is None: # env has not been loaded yet, so we can lazy load it\n ctx.obj = lambda: LitestarEnv.from_env(app_path)\n\n\n# add sub commands here\n\nlitestar_group.add_command(core.info_command)\nlitestar_group.add_command(core.run_command)\nlitestar_group.add_command(core.routes_command)\nlitestar_group.add_command(core.version_command)\nlitestar_group.add_command(sessions.sessions_group)\nlitestar_group.add_command(schema.schema_group)\n", "path": "litestar/cli/main.py"}], "after_files": [{"content": "from __future__ import annotations\n\nimport sys\nfrom pathlib import Path\nfrom typing import TYPE_CHECKING\n\nfrom ._utils import RICH_CLICK_INSTALLED, LitestarEnv, LitestarExtensionGroup\nfrom .commands import core, schema, sessions\n\nif TYPE_CHECKING or not RICH_CLICK_INSTALLED: # pragma: no cover\n import click\n from click import Context, group, option, pass_context\n from click import Path as ClickPath\nelse:\n import rich_click as click\n from rich_click import Context, group, option, pass_context\n from rich_click import Path as ClickPath\n from rich_click.cli import patch as rich_click_patch\n\n rich_click_patch()\n click.rich_click.USE_RICH_MARKUP = True\n click.rich_click.USE_MARKDOWN = False\n click.rich_click.SHOW_ARGUMENTS = True\n click.rich_click.GROUP_ARGUMENTS_OPTIONS = True\n click.rich_click.SHOW_ARGUMENTS = True\n click.rich_click.GROUP_ARGUMENTS_OPTIONS = True\n click.rich_click.STYLE_ERRORS_SUGGESTION = \"magenta italic\"\n click.rich_click.ERRORS_SUGGESTION = \"\"\n click.rich_click.ERRORS_EPILOGUE = \"\"\n click.rich_click.MAX_WIDTH = 80\n click.rich_click.SHOW_METAVARS_COLUMN = True\n click.rich_click.APPEND_METAVARS_HELP = True\n\n\n__all__ = (\"litestar_group\",)\n\n\n@group(cls=LitestarExtensionGroup, context_settings={\"help_option_names\": [\"-h\", \"--help\"]})\n@option(\"--app\", \"app_path\", help=\"Module path to a Litestar application\")\n@option(\n \"--app-dir\",\n help=\"Look for APP in the specified directory, by adding this to the PYTHONPATH. Defaults to the current working directory.\",\n default=None,\n type=ClickPath(dir_okay=True, file_okay=False, path_type=Path),\n show_default=False,\n)\n@pass_context\ndef litestar_group(ctx: Context, app_path: str | None, app_dir: Path | None = None) -> None:\n \"\"\"Litestar CLI.\"\"\"\n sys.path.append(str(app_dir))\n\n if ctx.obj is None: # env has not been loaded yet, so we can lazy load it\n ctx.obj = lambda: LitestarEnv.from_env(app_path)\n\n\n# add sub commands here\n\nlitestar_group.add_command(core.info_command)\nlitestar_group.add_command(core.run_command)\nlitestar_group.add_command(core.routes_command)\nlitestar_group.add_command(core.version_command)\nlitestar_group.add_command(sessions.sessions_group)\nlitestar_group.add_command(schema.schema_group)\n", "path": "litestar/cli/main.py"}]}
| 1,124 | 147 |
gh_patches_debug_6898
|
rasdani/github-patches
|
git_diff
|
mozilla__bugbug-98
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Unnecessary string formatting when saving the model
https://github.com/mozilla/bugbug/blob/ab007344c5c799c08739b2d56f334afcaf242919/bugbug/model.py#L111 is not needed, we can use the variable directly.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `bugbug/model.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 # This Source Code Form is subject to the terms of the Mozilla Public
3 # License, v. 2.0. If a copy of the MPL was not distributed with this file,
4 # You can obtain one at http://mozilla.org/MPL/2.0/.
5
6 import numpy as np
7 import shap
8 from imblearn.metrics import classification_report_imbalanced
9 from imblearn.under_sampling import RandomUnderSampler
10 from sklearn import metrics
11 from sklearn.externals import joblib
12 from sklearn.feature_extraction.text import TfidfVectorizer
13 from sklearn.model_selection import cross_val_score
14 from sklearn.model_selection import train_test_split
15
16 from bugbug import bugzilla
17 from bugbug.nlp import SpacyVectorizer
18
19
20 class Model():
21 def __init__(self, lemmatization=False):
22 if lemmatization:
23 self.text_vectorizer = SpacyVectorizer
24 else:
25 self.text_vectorizer = TfidfVectorizer
26
27 self.undersampling_enabled = True
28 self.cross_validation_enabled = True
29
30 def get_feature_names(self):
31 return []
32
33 def train(self):
34 classes = self.get_labels()
35 class_names = sorted(list(set(classes.values())), reverse=True)
36
37 # Get bugs, filtering out those for which we have no labels.
38 def bugs():
39 return (bug for bug in bugzilla.get_bugs() if bug['id'] in classes)
40
41 # Calculate labels.
42 y = np.array([classes[bug['id']] for bug in bugs()])
43
44 # Extract features from the bugs.
45 X = self.extraction_pipeline.fit_transform(bugs())
46
47 print(f'X: {X.shape}, y: {y.shape}')
48
49 # Split dataset in training and test.
50 X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.1, random_state=0)
51
52 if self.undersampling_enabled:
53 # Under-sample the majority classes, as the datasets are imbalanced.
54 X_train, y_train = RandomUnderSampler(random_state=0).fit_sample(X_train, y_train)
55
56 print(f'X_train: {X_train.shape}, y_train: {y_train.shape}')
57 print(f'X_test: {X_test.shape}, y_test: {y_test.shape}')
58
59 # Use k-fold cross validation to evaluate results.
60 if self.cross_validation_enabled:
61 scores = cross_val_score(self.clf, X_train, y_train, cv=5)
62 print(f'CV Accuracy: f{scores.mean()} (+/- {scores.std() * 2})')
63
64 # Evaluate results on the test set.
65 self.clf.fit(X_train, y_train)
66
67 feature_names = self.get_feature_names()
68 if len(feature_names):
69 explainer = shap.TreeExplainer(self.clf)
70 shap_values = explainer.shap_values(X_train)
71
72 print('Feature ranking (top 20 features):')
73 # Calculate the values that represent the fraction of the model output variability attributable
74 # to each feature across the whole dataset.
75 shap_sums = np.abs(shap_values).sum(0)
76 rel_shap_sums = shap_sums / shap_sums.sum()
77 indices = np.argsort(rel_shap_sums)[::-1][:20]
78 for i, index in enumerate(indices):
79 print(f'{i + 1}. \'{feature_names[index]}\' ({rel_shap_sums[index]})')
80
81 y_pred = self.clf.predict(X_test)
82
83 print(f'No confidence threshold - {len(y_test)} classified')
84 print(metrics.confusion_matrix(y_test, y_pred, labels=class_names))
85 print(classification_report_imbalanced(y_test, y_pred, labels=class_names))
86
87 # Evaluate results on the test set for some confidence thresholds.
88 for confidence_threshold in [0.6, 0.7, 0.8, 0.9]:
89 y_pred_probas = self.clf.predict_proba(X_test)
90
91 y_test_filter = []
92 y_pred_filter = []
93 for i in range(0, len(y_test)):
94 argmax = np.argmax(y_pred_probas[i])
95 if y_pred_probas[i][argmax] < confidence_threshold:
96 continue
97
98 y_test_filter.append(y_test[i])
99 y_pred_filter.append(argmax)
100
101 y_pred_filter = self.clf._le.inverse_transform(y_pred_filter)
102
103 print(f'\nConfidence threshold > {confidence_threshold} - {len(y_test_filter)} classified')
104 print(metrics.confusion_matrix(y_test_filter, y_pred_filter, labels=class_names))
105 print(classification_report_imbalanced(y_test_filter, y_pred_filter, labels=class_names))
106
107 joblib.dump(self, f'{self.__class__.__name__.lower()}')
108
109 @staticmethod
110 def load(model_file_name):
111 return joblib.load(model_file_name)
112
113 def overwrite_classes(self, bugs, classes, probabilities):
114 return classes
115
116 def classify(self, bugs, probabilities=False, importances=False):
117 assert bugs is not None
118 assert self.extraction_pipeline is not None and self.clf is not None, 'The module needs to be initialized first'
119
120 if not isinstance(bugs, list):
121 bugs = [bugs]
122
123 assert isinstance(bugs[0], dict)
124
125 X = self.extraction_pipeline.transform(bugs)
126 if probabilities:
127 classes = self.clf.predict_proba(X)
128 else:
129 classes = self.clf.predict(X)
130
131 classes = self.overwrite_classes(bugs, classes, probabilities)
132
133 if importances:
134 explainer = shap.TreeExplainer(self.clf)
135 shap_values = explainer.shap_values(X)
136
137 shap_sums = shap_values.sum(0)
138 abs_shap_sums = np.abs(shap_sums)
139 rel_shap_sums = abs_shap_sums / abs_shap_sums.sum()
140 indices = np.argsort(abs_shap_sums)[::-1]
141 importances = [(index, shap_sums[index] > 0, rel_shap_sums[index]) for index in indices]
142
143 return classes, importances
144
145 return classes
146
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/bugbug/model.py b/bugbug/model.py
--- a/bugbug/model.py
+++ b/bugbug/model.py
@@ -104,7 +104,7 @@
print(metrics.confusion_matrix(y_test_filter, y_pred_filter, labels=class_names))
print(classification_report_imbalanced(y_test_filter, y_pred_filter, labels=class_names))
- joblib.dump(self, f'{self.__class__.__name__.lower()}')
+ joblib.dump(self, self.__class__.__name__.lower())
@staticmethod
def load(model_file_name):
|
{"golden_diff": "diff --git a/bugbug/model.py b/bugbug/model.py\n--- a/bugbug/model.py\n+++ b/bugbug/model.py\n@@ -104,7 +104,7 @@\n print(metrics.confusion_matrix(y_test_filter, y_pred_filter, labels=class_names))\n print(classification_report_imbalanced(y_test_filter, y_pred_filter, labels=class_names))\n \n- joblib.dump(self, f'{self.__class__.__name__.lower()}')\n+ joblib.dump(self, self.__class__.__name__.lower())\n \n @staticmethod\n def load(model_file_name):\n", "issue": "Unnecessary string formatting when saving the model\nhttps://github.com/mozilla/bugbug/blob/ab007344c5c799c08739b2d56f334afcaf242919/bugbug/model.py#L111 is not needed, we can use the variable directly.\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n# This Source Code Form is subject to the terms of the Mozilla Public\n# License, v. 2.0. If a copy of the MPL was not distributed with this file,\n# You can obtain one at http://mozilla.org/MPL/2.0/.\n\nimport numpy as np\nimport shap\nfrom imblearn.metrics import classification_report_imbalanced\nfrom imblearn.under_sampling import RandomUnderSampler\nfrom sklearn import metrics\nfrom sklearn.externals import joblib\nfrom sklearn.feature_extraction.text import TfidfVectorizer\nfrom sklearn.model_selection import cross_val_score\nfrom sklearn.model_selection import train_test_split\n\nfrom bugbug import bugzilla\nfrom bugbug.nlp import SpacyVectorizer\n\n\nclass Model():\n def __init__(self, lemmatization=False):\n if lemmatization:\n self.text_vectorizer = SpacyVectorizer\n else:\n self.text_vectorizer = TfidfVectorizer\n\n self.undersampling_enabled = True\n self.cross_validation_enabled = True\n\n def get_feature_names(self):\n return []\n\n def train(self):\n classes = self.get_labels()\n class_names = sorted(list(set(classes.values())), reverse=True)\n\n # Get bugs, filtering out those for which we have no labels.\n def bugs():\n return (bug for bug in bugzilla.get_bugs() if bug['id'] in classes)\n\n # Calculate labels.\n y = np.array([classes[bug['id']] for bug in bugs()])\n\n # Extract features from the bugs.\n X = self.extraction_pipeline.fit_transform(bugs())\n\n print(f'X: {X.shape}, y: {y.shape}')\n\n # Split dataset in training and test.\n X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.1, random_state=0)\n\n if self.undersampling_enabled:\n # Under-sample the majority classes, as the datasets are imbalanced.\n X_train, y_train = RandomUnderSampler(random_state=0).fit_sample(X_train, y_train)\n\n print(f'X_train: {X_train.shape}, y_train: {y_train.shape}')\n print(f'X_test: {X_test.shape}, y_test: {y_test.shape}')\n\n # Use k-fold cross validation to evaluate results.\n if self.cross_validation_enabled:\n scores = cross_val_score(self.clf, X_train, y_train, cv=5)\n print(f'CV Accuracy: f{scores.mean()} (+/- {scores.std() * 2})')\n\n # Evaluate results on the test set.\n self.clf.fit(X_train, y_train)\n\n feature_names = self.get_feature_names()\n if len(feature_names):\n explainer = shap.TreeExplainer(self.clf)\n shap_values = explainer.shap_values(X_train)\n\n print('Feature ranking (top 20 features):')\n # Calculate the values that represent the fraction of the model output variability attributable\n # to each feature across the whole dataset.\n shap_sums = np.abs(shap_values).sum(0)\n rel_shap_sums = shap_sums / shap_sums.sum()\n indices = np.argsort(rel_shap_sums)[::-1][:20]\n for i, index in enumerate(indices):\n print(f'{i + 1}. \\'{feature_names[index]}\\' ({rel_shap_sums[index]})')\n\n y_pred = self.clf.predict(X_test)\n\n print(f'No confidence threshold - {len(y_test)} classified')\n print(metrics.confusion_matrix(y_test, y_pred, labels=class_names))\n print(classification_report_imbalanced(y_test, y_pred, labels=class_names))\n\n # Evaluate results on the test set for some confidence thresholds.\n for confidence_threshold in [0.6, 0.7, 0.8, 0.9]:\n y_pred_probas = self.clf.predict_proba(X_test)\n\n y_test_filter = []\n y_pred_filter = []\n for i in range(0, len(y_test)):\n argmax = np.argmax(y_pred_probas[i])\n if y_pred_probas[i][argmax] < confidence_threshold:\n continue\n\n y_test_filter.append(y_test[i])\n y_pred_filter.append(argmax)\n\n y_pred_filter = self.clf._le.inverse_transform(y_pred_filter)\n\n print(f'\\nConfidence threshold > {confidence_threshold} - {len(y_test_filter)} classified')\n print(metrics.confusion_matrix(y_test_filter, y_pred_filter, labels=class_names))\n print(classification_report_imbalanced(y_test_filter, y_pred_filter, labels=class_names))\n\n joblib.dump(self, f'{self.__class__.__name__.lower()}')\n\n @staticmethod\n def load(model_file_name):\n return joblib.load(model_file_name)\n\n def overwrite_classes(self, bugs, classes, probabilities):\n return classes\n\n def classify(self, bugs, probabilities=False, importances=False):\n assert bugs is not None\n assert self.extraction_pipeline is not None and self.clf is not None, 'The module needs to be initialized first'\n\n if not isinstance(bugs, list):\n bugs = [bugs]\n\n assert isinstance(bugs[0], dict)\n\n X = self.extraction_pipeline.transform(bugs)\n if probabilities:\n classes = self.clf.predict_proba(X)\n else:\n classes = self.clf.predict(X)\n\n classes = self.overwrite_classes(bugs, classes, probabilities)\n\n if importances:\n explainer = shap.TreeExplainer(self.clf)\n shap_values = explainer.shap_values(X)\n\n shap_sums = shap_values.sum(0)\n abs_shap_sums = np.abs(shap_sums)\n rel_shap_sums = abs_shap_sums / abs_shap_sums.sum()\n indices = np.argsort(abs_shap_sums)[::-1]\n importances = [(index, shap_sums[index] > 0, rel_shap_sums[index]) for index in indices]\n\n return classes, importances\n\n return classes\n", "path": "bugbug/model.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n# This Source Code Form is subject to the terms of the Mozilla Public\n# License, v. 2.0. If a copy of the MPL was not distributed with this file,\n# You can obtain one at http://mozilla.org/MPL/2.0/.\n\nimport numpy as np\nimport shap\nfrom imblearn.metrics import classification_report_imbalanced\nfrom imblearn.under_sampling import RandomUnderSampler\nfrom sklearn import metrics\nfrom sklearn.externals import joblib\nfrom sklearn.feature_extraction.text import TfidfVectorizer\nfrom sklearn.model_selection import cross_val_score\nfrom sklearn.model_selection import train_test_split\n\nfrom bugbug import bugzilla\nfrom bugbug.nlp import SpacyVectorizer\n\n\nclass Model():\n def __init__(self, lemmatization=False):\n if lemmatization:\n self.text_vectorizer = SpacyVectorizer\n else:\n self.text_vectorizer = TfidfVectorizer\n\n self.undersampling_enabled = True\n self.cross_validation_enabled = True\n\n def get_feature_names(self):\n return []\n\n def train(self):\n classes = self.get_labels()\n class_names = sorted(list(set(classes.values())), reverse=True)\n\n # Get bugs, filtering out those for which we have no labels.\n def bugs():\n return (bug for bug in bugzilla.get_bugs() if bug['id'] in classes)\n\n # Calculate labels.\n y = np.array([classes[bug['id']] for bug in bugs()])\n\n # Extract features from the bugs.\n X = self.extraction_pipeline.fit_transform(bugs())\n\n print(f'X: {X.shape}, y: {y.shape}')\n\n # Split dataset in training and test.\n X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.1, random_state=0)\n\n if self.undersampling_enabled:\n # Under-sample the majority classes, as the datasets are imbalanced.\n X_train, y_train = RandomUnderSampler(random_state=0).fit_sample(X_train, y_train)\n\n print(f'X_train: {X_train.shape}, y_train: {y_train.shape}')\n print(f'X_test: {X_test.shape}, y_test: {y_test.shape}')\n\n # Use k-fold cross validation to evaluate results.\n if self.cross_validation_enabled:\n scores = cross_val_score(self.clf, X_train, y_train, cv=5)\n print(f'CV Accuracy: f{scores.mean()} (+/- {scores.std() * 2})')\n\n # Evaluate results on the test set.\n self.clf.fit(X_train, y_train)\n\n feature_names = self.get_feature_names()\n if len(feature_names):\n explainer = shap.TreeExplainer(self.clf)\n shap_values = explainer.shap_values(X_train)\n\n print('Feature ranking (top 20 features):')\n # Calculate the values that represent the fraction of the model output variability attributable\n # to each feature across the whole dataset.\n shap_sums = np.abs(shap_values).sum(0)\n rel_shap_sums = shap_sums / shap_sums.sum()\n indices = np.argsort(rel_shap_sums)[::-1][:20]\n for i, index in enumerate(indices):\n print(f'{i + 1}. \\'{feature_names[index]}\\' ({rel_shap_sums[index]})')\n\n y_pred = self.clf.predict(X_test)\n\n print(f'No confidence threshold - {len(y_test)} classified')\n print(metrics.confusion_matrix(y_test, y_pred, labels=class_names))\n print(classification_report_imbalanced(y_test, y_pred, labels=class_names))\n\n # Evaluate results on the test set for some confidence thresholds.\n for confidence_threshold in [0.6, 0.7, 0.8, 0.9]:\n y_pred_probas = self.clf.predict_proba(X_test)\n\n y_test_filter = []\n y_pred_filter = []\n for i in range(0, len(y_test)):\n argmax = np.argmax(y_pred_probas[i])\n if y_pred_probas[i][argmax] < confidence_threshold:\n continue\n\n y_test_filter.append(y_test[i])\n y_pred_filter.append(argmax)\n\n y_pred_filter = self.clf._le.inverse_transform(y_pred_filter)\n\n print(f'\\nConfidence threshold > {confidence_threshold} - {len(y_test_filter)} classified')\n print(metrics.confusion_matrix(y_test_filter, y_pred_filter, labels=class_names))\n print(classification_report_imbalanced(y_test_filter, y_pred_filter, labels=class_names))\n\n joblib.dump(self, self.__class__.__name__.lower())\n\n @staticmethod\n def load(model_file_name):\n return joblib.load(model_file_name)\n\n def overwrite_classes(self, bugs, classes, probabilities):\n return classes\n\n def classify(self, bugs, probabilities=False, importances=False):\n assert bugs is not None\n assert self.extraction_pipeline is not None and self.clf is not None, 'The module needs to be initialized first'\n\n if not isinstance(bugs, list):\n bugs = [bugs]\n\n assert isinstance(bugs[0], dict)\n\n X = self.extraction_pipeline.transform(bugs)\n if probabilities:\n classes = self.clf.predict_proba(X)\n else:\n classes = self.clf.predict(X)\n\n classes = self.overwrite_classes(bugs, classes, probabilities)\n\n if importances:\n explainer = shap.TreeExplainer(self.clf)\n shap_values = explainer.shap_values(X)\n\n shap_sums = shap_values.sum(0)\n abs_shap_sums = np.abs(shap_sums)\n rel_shap_sums = abs_shap_sums / abs_shap_sums.sum()\n indices = np.argsort(abs_shap_sums)[::-1]\n importances = [(index, shap_sums[index] > 0, rel_shap_sums[index]) for index in indices]\n\n return classes, importances\n\n return classes\n", "path": "bugbug/model.py"}]}
| 1,982 | 131 |
gh_patches_debug_28868
|
rasdani/github-patches
|
git_diff
|
streamlit__streamlit-2101
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Month dropdown on DateInput is very slow (~5s to appear)
# Summary
Month dropdown on DateInput is very slow (~5s to appear). See https://i.imgur.com/WkmHYag.gifv
# Steps to reproduce
What are the steps we should take to reproduce the bug:
1. Create `st.date_input('hi')`
2. Click on the date
3. Click on the month inside the date
## Expected behavior:
Month dropdown should appear almost instantly, with no user-perceived lag.
## Actual behavior:
Takes ~5 seconds
## Is this a regression?
Unsure. Does this date input dropdown use our virtualized selectbox?
# Debug info
- Streamlit version: develop
- Python version: 3.7.6
- Using Pipenv
- OS version: Windows 10
- Browser version: Chrome
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `lib/streamlit/elements/time_widgets.py`
Content:
```
1 from datetime import datetime, date, time
2
3 from streamlit.proto.TimeInput_pb2 import TimeInput as TimeInputProto
4 from streamlit.proto.DateInput_pb2 import DateInput as DateInputProto
5 from streamlit.errors import StreamlitAPIException
6 from .utils import _get_widget_ui_value
7
8
9 class TimeWidgetsMixin:
10 def time_input(dg, label, value=None, key=None):
11 """Display a time input widget.
12
13 Parameters
14 ----------
15 label : str
16 A short label explaining to the user what this time input is for.
17 value : datetime.time/datetime.datetime
18 The value of this widget when it first renders. This will be
19 cast to str internally. Defaults to the current time.
20 key : str
21 An optional string to use as the unique key for the widget.
22 If this is omitted, a key will be generated for the widget
23 based on its content. Multiple widgets of the same type may
24 not share the same key.
25
26 Returns
27 -------
28 datetime.time
29 The current value of the time input widget.
30
31 Example
32 -------
33 >>> t = st.time_input('Set an alarm for', datetime.time(8, 45))
34 >>> st.write('Alarm is set for', t)
35
36 """
37 # Set value default.
38 if value is None:
39 value = datetime.now().time()
40
41 # Ensure that the value is either datetime/time
42 if not isinstance(value, datetime) and not isinstance(value, time):
43 raise StreamlitAPIException(
44 "The type of the value should be either datetime or time."
45 )
46
47 # Convert datetime to time
48 if isinstance(value, datetime):
49 value = value.time()
50
51 time_input_proto = TimeInputProto()
52 time_input_proto.label = label
53 time_input_proto.default = time.strftime(value, "%H:%M")
54
55 ui_value = _get_widget_ui_value("time_input", time_input_proto, user_key=key)
56 current_value = (
57 datetime.strptime(ui_value, "%H:%M").time()
58 if ui_value is not None
59 else value
60 )
61 return dg._enqueue("time_input", time_input_proto, current_value) # type: ignore
62
63 def date_input(
64 dg,
65 label,
66 value=None,
67 min_value=datetime.min,
68 max_value=None,
69 key=None,
70 ):
71 """Display a date input widget.
72
73 Parameters
74 ----------
75 label : str
76 A short label explaining to the user what this date input is for.
77 value : datetime.date or datetime.datetime or list/tuple of datetime.date or datetime.datetime or None
78 The value of this widget when it first renders. If a list/tuple with
79 0 to 2 date/datetime values is provided, the datepicker will allow
80 users to provide a range. Defaults to today as a single-date picker.
81 min_value : datetime.date or datetime.datetime
82 The minimum selectable date. Defaults to datetime.min.
83 max_value : datetime.date or datetime.datetime
84 The maximum selectable date. Defaults to today+10y.
85 key : str
86 An optional string to use as the unique key for the widget.
87 If this is omitted, a key will be generated for the widget
88 based on its content. Multiple widgets of the same type may
89 not share the same key.
90
91 Returns
92 -------
93 datetime.date
94 The current value of the date input widget.
95
96 Example
97 -------
98 >>> d = st.date_input(
99 ... "When\'s your birthday",
100 ... datetime.date(2019, 7, 6))
101 >>> st.write('Your birthday is:', d)
102
103 """
104 # Set value default.
105 if value is None:
106 value = datetime.now().date()
107
108 single_value = isinstance(value, (date, datetime))
109 range_value = isinstance(value, (list, tuple)) and len(value) in (0, 1, 2)
110 if not single_value and not range_value:
111 raise StreamlitAPIException(
112 "DateInput value should either be an date/datetime or a list/tuple of "
113 "0 - 2 date/datetime values"
114 )
115
116 if single_value:
117 value = [value]
118
119 date_input_proto = DateInputProto()
120 date_input_proto.is_range = range_value
121
122 value = [v.date() if isinstance(v, datetime) else v for v in value]
123
124 date_input_proto.label = label
125 date_input_proto.default[:] = [date.strftime(v, "%Y/%m/%d") for v in value]
126
127 if isinstance(min_value, datetime):
128 min_value = min_value.date()
129
130 date_input_proto.min = date.strftime(min_value, "%Y/%m/%d")
131
132 if max_value is None:
133 today = date.today()
134 max_value = date(today.year + 10, today.month, today.day)
135
136 if isinstance(max_value, datetime):
137 max_value = max_value.date()
138
139 date_input_proto.max = date.strftime(max_value, "%Y/%m/%d")
140
141 ui_value = _get_widget_ui_value("date_input", date_input_proto, user_key=key)
142
143 if ui_value is not None:
144 value = getattr(ui_value, "data")
145 value = [datetime.strptime(v, "%Y/%m/%d").date() for v in value]
146
147 return_value = value[0] if single_value else tuple(value)
148 return dg._enqueue("date_input", date_input_proto, return_value) # type: ignore
149
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/lib/streamlit/elements/time_widgets.py b/lib/streamlit/elements/time_widgets.py
--- a/lib/streamlit/elements/time_widgets.py
+++ b/lib/streamlit/elements/time_widgets.py
@@ -61,12 +61,7 @@
return dg._enqueue("time_input", time_input_proto, current_value) # type: ignore
def date_input(
- dg,
- label,
- value=None,
- min_value=datetime.min,
- max_value=None,
- key=None,
+ dg, label, value=None, min_value=None, max_value=None, key=None,
):
"""Display a date input widget.
@@ -79,7 +74,7 @@
0 to 2 date/datetime values is provided, the datepicker will allow
users to provide a range. Defaults to today as a single-date picker.
min_value : datetime.date or datetime.datetime
- The minimum selectable date. Defaults to datetime.min.
+ The minimum selectable date. Defaults to today-10y.
max_value : datetime.date or datetime.datetime
The maximum selectable date. Defaults to today+10y.
key : str
@@ -126,6 +121,9 @@
if isinstance(min_value, datetime):
min_value = min_value.date()
+ elif min_value is None:
+ today = date.today()
+ min_value = date(today.year - 10, today.month, today.day)
date_input_proto.min = date.strftime(min_value, "%Y/%m/%d")
|
{"golden_diff": "diff --git a/lib/streamlit/elements/time_widgets.py b/lib/streamlit/elements/time_widgets.py\n--- a/lib/streamlit/elements/time_widgets.py\n+++ b/lib/streamlit/elements/time_widgets.py\n@@ -61,12 +61,7 @@\n return dg._enqueue(\"time_input\", time_input_proto, current_value) # type: ignore\n \n def date_input(\n- dg,\n- label,\n- value=None,\n- min_value=datetime.min,\n- max_value=None,\n- key=None,\n+ dg, label, value=None, min_value=None, max_value=None, key=None,\n ):\n \"\"\"Display a date input widget.\n \n@@ -79,7 +74,7 @@\n 0 to 2 date/datetime values is provided, the datepicker will allow\n users to provide a range. Defaults to today as a single-date picker.\n min_value : datetime.date or datetime.datetime\n- The minimum selectable date. Defaults to datetime.min.\n+ The minimum selectable date. Defaults to today-10y.\n max_value : datetime.date or datetime.datetime\n The maximum selectable date. Defaults to today+10y.\n key : str\n@@ -126,6 +121,9 @@\n \n if isinstance(min_value, datetime):\n min_value = min_value.date()\n+ elif min_value is None:\n+ today = date.today()\n+ min_value = date(today.year - 10, today.month, today.day)\n \n date_input_proto.min = date.strftime(min_value, \"%Y/%m/%d\")\n", "issue": "Month dropdown on DateInput is very slow (~5s to appear)\n# Summary\r\n\r\nMonth dropdown on DateInput is very slow (~5s to appear). See https://i.imgur.com/WkmHYag.gifv\r\n\r\n# Steps to reproduce\r\n\r\nWhat are the steps we should take to reproduce the bug:\r\n\r\n1. Create `st.date_input('hi')`\r\n2. Click on the date\r\n3. Click on the month inside the date\r\n\r\n## Expected behavior:\r\n\r\nMonth dropdown should appear almost instantly, with no user-perceived lag.\r\n\r\n## Actual behavior:\r\n\r\nTakes ~5 seconds\r\n\r\n## Is this a regression?\r\n\r\nUnsure. Does this date input dropdown use our virtualized selectbox?\r\n\r\n# Debug info\r\n\r\n- Streamlit version: develop\r\n- Python version: 3.7.6\r\n- Using Pipenv\r\n- OS version: Windows 10\r\n- Browser version: Chrome\r\n\n", "before_files": [{"content": "from datetime import datetime, date, time\n\nfrom streamlit.proto.TimeInput_pb2 import TimeInput as TimeInputProto\nfrom streamlit.proto.DateInput_pb2 import DateInput as DateInputProto\nfrom streamlit.errors import StreamlitAPIException\nfrom .utils import _get_widget_ui_value\n\n\nclass TimeWidgetsMixin:\n def time_input(dg, label, value=None, key=None):\n \"\"\"Display a time input widget.\n\n Parameters\n ----------\n label : str\n A short label explaining to the user what this time input is for.\n value : datetime.time/datetime.datetime\n The value of this widget when it first renders. This will be\n cast to str internally. Defaults to the current time.\n key : str\n An optional string to use as the unique key for the widget.\n If this is omitted, a key will be generated for the widget\n based on its content. Multiple widgets of the same type may\n not share the same key.\n\n Returns\n -------\n datetime.time\n The current value of the time input widget.\n\n Example\n -------\n >>> t = st.time_input('Set an alarm for', datetime.time(8, 45))\n >>> st.write('Alarm is set for', t)\n\n \"\"\"\n # Set value default.\n if value is None:\n value = datetime.now().time()\n\n # Ensure that the value is either datetime/time\n if not isinstance(value, datetime) and not isinstance(value, time):\n raise StreamlitAPIException(\n \"The type of the value should be either datetime or time.\"\n )\n\n # Convert datetime to time\n if isinstance(value, datetime):\n value = value.time()\n\n time_input_proto = TimeInputProto()\n time_input_proto.label = label\n time_input_proto.default = time.strftime(value, \"%H:%M\")\n\n ui_value = _get_widget_ui_value(\"time_input\", time_input_proto, user_key=key)\n current_value = (\n datetime.strptime(ui_value, \"%H:%M\").time()\n if ui_value is not None\n else value\n )\n return dg._enqueue(\"time_input\", time_input_proto, current_value) # type: ignore\n\n def date_input(\n dg,\n label,\n value=None,\n min_value=datetime.min,\n max_value=None,\n key=None,\n ):\n \"\"\"Display a date input widget.\n\n Parameters\n ----------\n label : str\n A short label explaining to the user what this date input is for.\n value : datetime.date or datetime.datetime or list/tuple of datetime.date or datetime.datetime or None\n The value of this widget when it first renders. If a list/tuple with\n 0 to 2 date/datetime values is provided, the datepicker will allow\n users to provide a range. Defaults to today as a single-date picker.\n min_value : datetime.date or datetime.datetime\n The minimum selectable date. Defaults to datetime.min.\n max_value : datetime.date or datetime.datetime\n The maximum selectable date. Defaults to today+10y.\n key : str\n An optional string to use as the unique key for the widget.\n If this is omitted, a key will be generated for the widget\n based on its content. Multiple widgets of the same type may\n not share the same key.\n\n Returns\n -------\n datetime.date\n The current value of the date input widget.\n\n Example\n -------\n >>> d = st.date_input(\n ... \"When\\'s your birthday\",\n ... datetime.date(2019, 7, 6))\n >>> st.write('Your birthday is:', d)\n\n \"\"\"\n # Set value default.\n if value is None:\n value = datetime.now().date()\n\n single_value = isinstance(value, (date, datetime))\n range_value = isinstance(value, (list, tuple)) and len(value) in (0, 1, 2)\n if not single_value and not range_value:\n raise StreamlitAPIException(\n \"DateInput value should either be an date/datetime or a list/tuple of \"\n \"0 - 2 date/datetime values\"\n )\n\n if single_value:\n value = [value]\n\n date_input_proto = DateInputProto()\n date_input_proto.is_range = range_value\n\n value = [v.date() if isinstance(v, datetime) else v for v in value]\n\n date_input_proto.label = label\n date_input_proto.default[:] = [date.strftime(v, \"%Y/%m/%d\") for v in value]\n\n if isinstance(min_value, datetime):\n min_value = min_value.date()\n\n date_input_proto.min = date.strftime(min_value, \"%Y/%m/%d\")\n\n if max_value is None:\n today = date.today()\n max_value = date(today.year + 10, today.month, today.day)\n\n if isinstance(max_value, datetime):\n max_value = max_value.date()\n\n date_input_proto.max = date.strftime(max_value, \"%Y/%m/%d\")\n\n ui_value = _get_widget_ui_value(\"date_input\", date_input_proto, user_key=key)\n\n if ui_value is not None:\n value = getattr(ui_value, \"data\")\n value = [datetime.strptime(v, \"%Y/%m/%d\").date() for v in value]\n\n return_value = value[0] if single_value else tuple(value)\n return dg._enqueue(\"date_input\", date_input_proto, return_value) # type: ignore\n", "path": "lib/streamlit/elements/time_widgets.py"}], "after_files": [{"content": "from datetime import datetime, date, time\n\nfrom streamlit.proto.TimeInput_pb2 import TimeInput as TimeInputProto\nfrom streamlit.proto.DateInput_pb2 import DateInput as DateInputProto\nfrom streamlit.errors import StreamlitAPIException\nfrom .utils import _get_widget_ui_value\n\n\nclass TimeWidgetsMixin:\n def time_input(dg, label, value=None, key=None):\n \"\"\"Display a time input widget.\n\n Parameters\n ----------\n label : str\n A short label explaining to the user what this time input is for.\n value : datetime.time/datetime.datetime\n The value of this widget when it first renders. This will be\n cast to str internally. Defaults to the current time.\n key : str\n An optional string to use as the unique key for the widget.\n If this is omitted, a key will be generated for the widget\n based on its content. Multiple widgets of the same type may\n not share the same key.\n\n Returns\n -------\n datetime.time\n The current value of the time input widget.\n\n Example\n -------\n >>> t = st.time_input('Set an alarm for', datetime.time(8, 45))\n >>> st.write('Alarm is set for', t)\n\n \"\"\"\n # Set value default.\n if value is None:\n value = datetime.now().time()\n\n # Ensure that the value is either datetime/time\n if not isinstance(value, datetime) and not isinstance(value, time):\n raise StreamlitAPIException(\n \"The type of the value should be either datetime or time.\"\n )\n\n # Convert datetime to time\n if isinstance(value, datetime):\n value = value.time()\n\n time_input_proto = TimeInputProto()\n time_input_proto.label = label\n time_input_proto.default = time.strftime(value, \"%H:%M\")\n\n ui_value = _get_widget_ui_value(\"time_input\", time_input_proto, user_key=key)\n current_value = (\n datetime.strptime(ui_value, \"%H:%M\").time()\n if ui_value is not None\n else value\n )\n return dg._enqueue(\"time_input\", time_input_proto, current_value) # type: ignore\n\n def date_input(\n dg, label, value=None, min_value=None, max_value=None, key=None,\n ):\n \"\"\"Display a date input widget.\n\n Parameters\n ----------\n label : str\n A short label explaining to the user what this date input is for.\n value : datetime.date or datetime.datetime or list/tuple of datetime.date or datetime.datetime or None\n The value of this widget when it first renders. If a list/tuple with\n 0 to 2 date/datetime values is provided, the datepicker will allow\n users to provide a range. Defaults to today as a single-date picker.\n min_value : datetime.date or datetime.datetime\n The minimum selectable date. Defaults to today-10y.\n max_value : datetime.date or datetime.datetime\n The maximum selectable date. Defaults to today+10y.\n key : str\n An optional string to use as the unique key for the widget.\n If this is omitted, a key will be generated for the widget\n based on its content. Multiple widgets of the same type may\n not share the same key.\n\n Returns\n -------\n datetime.date\n The current value of the date input widget.\n\n Example\n -------\n >>> d = st.date_input(\n ... \"When\\'s your birthday\",\n ... datetime.date(2019, 7, 6))\n >>> st.write('Your birthday is:', d)\n\n \"\"\"\n # Set value default.\n if value is None:\n value = datetime.now().date()\n\n single_value = isinstance(value, (date, datetime))\n range_value = isinstance(value, (list, tuple)) and len(value) in (0, 1, 2)\n if not single_value and not range_value:\n raise StreamlitAPIException(\n \"DateInput value should either be an date/datetime or a list/tuple of \"\n \"0 - 2 date/datetime values\"\n )\n\n if single_value:\n value = [value]\n\n date_input_proto = DateInputProto()\n date_input_proto.is_range = range_value\n\n value = [v.date() if isinstance(v, datetime) else v for v in value]\n\n date_input_proto.label = label\n date_input_proto.default[:] = [date.strftime(v, \"%Y/%m/%d\") for v in value]\n\n if isinstance(min_value, datetime):\n min_value = min_value.date()\n elif min_value is None:\n today = date.today()\n min_value = date(today.year - 10, today.month, today.day)\n\n date_input_proto.min = date.strftime(min_value, \"%Y/%m/%d\")\n\n if max_value is None:\n today = date.today()\n max_value = date(today.year + 10, today.month, today.day)\n\n if isinstance(max_value, datetime):\n max_value = max_value.date()\n\n date_input_proto.max = date.strftime(max_value, \"%Y/%m/%d\")\n\n ui_value = _get_widget_ui_value(\"date_input\", date_input_proto, user_key=key)\n\n if ui_value is not None:\n value = getattr(ui_value, \"data\")\n value = [datetime.strptime(v, \"%Y/%m/%d\").date() for v in value]\n\n return_value = value[0] if single_value else tuple(value)\n return dg._enqueue(\"date_input\", date_input_proto, return_value) # type: ignore\n", "path": "lib/streamlit/elements/time_widgets.py"}]}
| 1,971 | 343 |
gh_patches_debug_1328
|
rasdani/github-patches
|
git_diff
|
Lightning-Universe__lightning-flash-666
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
ImageEmbedder default behavior is not a flattened output
## 🐛 Bug
I discovered this issue while testing PR #655. If you run the [Image Embedding README example code](https://github.com/PyTorchLightning/lightning-flash#example-1-image-embedding), it returns a 3D tensor.
My understanding from the use of embeddings in general, and how they are used in [Fifty One](https://voxel51.com/docs/fiftyone/tutorials/image_embeddings.html) is they expect the embeddings to be 1D (for each embedding).
The reason it returns a 3D tensor is because it depends on the backbone used. The default there is `resnet101`, which returns a `2048x7x7` shape tensor. Others like inception return a flat 1D tensor, i.e. length-X.
### To Reproduce
Steps to reproduce the behavior:
Run the [README example](https://github.com/PyTorchLightning/lightning-flash#example-1-image-embedding), but remove the `embedding_dim` parameter. See below for example.
Note: as-is, this will error on `print(embeddings.shape)`, regardless of configuration, since that is a list. But the question here is around the logic for the ImageEmbedder.
#### Code sample
```python
from flash.core.data.utils import download_data
from flash.image import ImageEmbedder
# 1. Download the data
download_data("https://pl-flash-data.s3.amazonaws.com/hymenoptera_data.zip", "data/")
# 2. Create an ImageEmbedder with resnet50 trained on imagenet.
embedder = ImageEmbedder(backbone="resnet50")
# 3. Generate an embedding from an image path.
embeddings = embedder.predict("data/hymenoptera_data/predict/153783656_85f9c3ac70.jpg")
# 4. Print embeddings shape
print(embeddings.shape)
```
### Expected behavior
Expect to see a 100352x1 shape tensor as the output, instead of 2048x7x7.
### Environment
- PyTorch Version (e.g., 1.0): 1.9
- OS (e.g., Linux): Linux
- How you installed PyTorch (`conda`, `pip`, source): pip
- Build command you used (if compiling from source): N/A
- Python version: 3.8.6
- CUDA/cuDNN version: N/A
- GPU models and configuration: N/A
- Any other relevant information: N/A
### Additional context
I believe the question is around what the logic should be here:
https://github.com/PyTorchLightning/lightning-flash/blob/075de3a46d74d9fc0e769401063fede1f12d0518/flash/image/embedding/model.py#L85-L92
If `embedding_dim` is None, then the head is `nn.Identity()`. **If we desire a flat 1D embedding, then the question is: should `nn.Identity()` change to `nn.Flatten()`?**
It could be argued that the user should be left to flatten after on their own, but per the contributing guidelines, I thought this would align with "[Force User Decisions To Best Practices](https://github.com/PyTorchLightning/lightning-flash/blob/ddd942d3dfe3884a97a855446410166c3c9f16d9/.github/CONTRIBUTING.md#force-user-decisions-to-best-practices)"
Let me know your thoughts. If that makes sense, then I can update the code, run some tests, and update docs in a PR.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `flash/image/embedding/model.py`
Content:
```
1 # Copyright The PyTorch Lightning team.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14 from typing import Any, Callable, Mapping, Optional, Sequence, Tuple, Type, Union
15
16 import torch
17 from pytorch_lightning.utilities import rank_zero_warn
18 from torch import nn
19 from torch.nn import functional as F
20 from torchmetrics import Accuracy, Metric
21
22 from flash.core.data.data_source import DefaultDataKeys
23 from flash.core.model import Task
24 from flash.core.registry import FlashRegistry
25 from flash.core.utilities.imports import _IMAGE_AVAILABLE
26 from flash.core.utilities.isinstance import _isinstance
27 from flash.image.classification.data import ImageClassificationPreprocess
28
29 if _IMAGE_AVAILABLE:
30 from flash.image.classification.backbones import IMAGE_CLASSIFIER_BACKBONES
31 else:
32 IMAGE_CLASSIFIER_BACKBONES = FlashRegistry("backbones")
33
34
35 class ImageEmbedder(Task):
36 """The ``ImageEmbedder`` is a :class:`~flash.Task` for obtaining feature vectors (embeddings) from images. For
37 more details, see :ref:`image_embedder`.
38
39 Args:
40 embedding_dim: Dimension of the embedded vector. ``None`` uses the default from the backbone.
41 backbone: A model to use to extract image features, defaults to ``"swav-imagenet"``.
42 pretrained: Use a pretrained backbone, defaults to ``True``.
43 loss_fn: Loss function for training and finetuning, defaults to :func:`torch.nn.functional.cross_entropy`
44 optimizer: Optimizer to use for training and finetuning, defaults to :class:`torch.optim.SGD`.
45 metrics: Metrics to compute for training and evaluation. Can either be an metric from the `torchmetrics`
46 package, a custom metric inherenting from `torchmetrics.Metric`, a callable function or a list/dict
47 containing a combination of the aforementioned. In all cases, each metric needs to have the signature
48 `metric(preds,target)` and return a single scalar tensor. Defaults to :class:`torchmetrics.Accuracy`.
49 learning_rate: Learning rate to use for training, defaults to ``1e-3``.
50 pooling_fn: Function used to pool image to generate embeddings, defaults to :func:`torch.max`.
51 """
52
53 backbones: FlashRegistry = IMAGE_CLASSIFIER_BACKBONES
54
55 required_extras: str = "image"
56
57 def __init__(
58 self,
59 embedding_dim: Optional[int] = None,
60 backbone: str = "resnet101",
61 pretrained: bool = True,
62 loss_fn: Callable = F.cross_entropy,
63 optimizer: Type[torch.optim.Optimizer] = torch.optim.SGD,
64 metrics: Union[Metric, Callable, Mapping, Sequence, None] = (Accuracy()),
65 learning_rate: float = 1e-3,
66 pooling_fn: Callable = torch.max,
67 ):
68 super().__init__(
69 model=None,
70 loss_fn=loss_fn,
71 optimizer=optimizer,
72 metrics=metrics,
73 learning_rate=learning_rate,
74 preprocess=ImageClassificationPreprocess(),
75 )
76
77 self.save_hyperparameters()
78 self.backbone_name = backbone
79 self.embedding_dim = embedding_dim
80 assert pooling_fn in [torch.mean, torch.max]
81 self.pooling_fn = pooling_fn
82
83 self.backbone, num_features = self.backbones.get(backbone)(pretrained=pretrained)
84
85 if embedding_dim is None:
86 self.head = nn.Identity()
87 else:
88 self.head = nn.Sequential(
89 nn.Flatten(),
90 nn.Linear(num_features, embedding_dim),
91 )
92 rank_zero_warn("Adding linear layer on top of backbone. Remember to finetune first before using!")
93
94 def apply_pool(self, x):
95 x = self.pooling_fn(x, dim=-1)
96 if _isinstance(x, Tuple[torch.Tensor, torch.Tensor]):
97 x = x[0]
98 x = self.pooling_fn(x, dim=-1)
99 if _isinstance(x, Tuple[torch.Tensor, torch.Tensor]):
100 x = x[0]
101 return x
102
103 def forward(self, x) -> torch.Tensor:
104 x = self.backbone(x)
105
106 # bolts ssl models return lists
107 if isinstance(x, tuple):
108 x = x[-1]
109
110 if x.dim() == 4 and self.embedding_dim:
111 x = self.apply_pool(x)
112
113 x = self.head(x)
114 return x
115
116 def training_step(self, batch: Any, batch_idx: int) -> Any:
117 batch = (batch[DefaultDataKeys.INPUT], batch[DefaultDataKeys.TARGET])
118 return super().training_step(batch, batch_idx)
119
120 def validation_step(self, batch: Any, batch_idx: int) -> Any:
121 batch = (batch[DefaultDataKeys.INPUT], batch[DefaultDataKeys.TARGET])
122 return super().validation_step(batch, batch_idx)
123
124 def test_step(self, batch: Any, batch_idx: int) -> Any:
125 batch = (batch[DefaultDataKeys.INPUT], batch[DefaultDataKeys.TARGET])
126 return super().test_step(batch, batch_idx)
127
128 def predict_step(self, batch: Any, batch_idx: int, dataloader_idx: int = 0) -> Any:
129 batch = batch[DefaultDataKeys.INPUT]
130 return super().predict_step(batch, batch_idx, dataloader_idx=dataloader_idx)
131
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/flash/image/embedding/model.py b/flash/image/embedding/model.py
--- a/flash/image/embedding/model.py
+++ b/flash/image/embedding/model.py
@@ -107,7 +107,7 @@
if isinstance(x, tuple):
x = x[-1]
- if x.dim() == 4 and self.embedding_dim:
+ if x.dim() == 4 and not self.embedding_dim:
x = self.apply_pool(x)
x = self.head(x)
|
{"golden_diff": "diff --git a/flash/image/embedding/model.py b/flash/image/embedding/model.py\n--- a/flash/image/embedding/model.py\n+++ b/flash/image/embedding/model.py\n@@ -107,7 +107,7 @@\n if isinstance(x, tuple):\n x = x[-1]\n \n- if x.dim() == 4 and self.embedding_dim:\n+ if x.dim() == 4 and not self.embedding_dim:\n x = self.apply_pool(x)\n \n x = self.head(x)\n", "issue": "ImageEmbedder default behavior is not a flattened output\n## \ud83d\udc1b Bug\r\n\r\nI discovered this issue while testing PR #655. If you run the [Image Embedding README example code](https://github.com/PyTorchLightning/lightning-flash#example-1-image-embedding), it returns a 3D tensor. \r\nMy understanding from the use of embeddings in general, and how they are used in [Fifty One](https://voxel51.com/docs/fiftyone/tutorials/image_embeddings.html) is they expect the embeddings to be 1D (for each embedding). \r\n\r\nThe reason it returns a 3D tensor is because it depends on the backbone used. The default there is `resnet101`, which returns a `2048x7x7` shape tensor. Others like inception return a flat 1D tensor, i.e. length-X.\r\n\r\n### To Reproduce\r\n\r\nSteps to reproduce the behavior:\r\n\r\nRun the [README example](https://github.com/PyTorchLightning/lightning-flash#example-1-image-embedding), but remove the `embedding_dim` parameter. See below for example.\r\n\r\nNote: as-is, this will error on `print(embeddings.shape)`, regardless of configuration, since that is a list. But the question here is around the logic for the ImageEmbedder. \r\n\r\n\r\n#### Code sample\r\n```python\r\nfrom flash.core.data.utils import download_data\r\nfrom flash.image import ImageEmbedder\r\n\r\n# 1. Download the data\r\ndownload_data(\"https://pl-flash-data.s3.amazonaws.com/hymenoptera_data.zip\", \"data/\")\r\n\r\n# 2. Create an ImageEmbedder with resnet50 trained on imagenet.\r\nembedder = ImageEmbedder(backbone=\"resnet50\")\r\n\r\n# 3. Generate an embedding from an image path.\r\nembeddings = embedder.predict(\"data/hymenoptera_data/predict/153783656_85f9c3ac70.jpg\")\r\n\r\n# 4. Print embeddings shape\r\nprint(embeddings.shape)\r\n```\r\n\r\n### Expected behavior\r\n\r\nExpect to see a 100352x1 shape tensor as the output, instead of 2048x7x7. \r\n\r\n### Environment\r\n\r\n - PyTorch Version (e.g., 1.0): 1.9\r\n - OS (e.g., Linux): Linux\r\n - How you installed PyTorch (`conda`, `pip`, source): pip\r\n - Build command you used (if compiling from source): N/A\r\n - Python version: 3.8.6\r\n - CUDA/cuDNN version: N/A\r\n - GPU models and configuration: N/A\r\n - Any other relevant information: N/A\r\n\r\n### Additional context\r\n\r\nI believe the question is around what the logic should be here:\r\nhttps://github.com/PyTorchLightning/lightning-flash/blob/075de3a46d74d9fc0e769401063fede1f12d0518/flash/image/embedding/model.py#L85-L92\r\n\r\nIf `embedding_dim` is None, then the head is `nn.Identity()`. **If we desire a flat 1D embedding, then the question is: should `nn.Identity()` change to `nn.Flatten()`?**\r\n\r\nIt could be argued that the user should be left to flatten after on their own, but per the contributing guidelines, I thought this would align with \"[Force User Decisions To Best Practices](https://github.com/PyTorchLightning/lightning-flash/blob/ddd942d3dfe3884a97a855446410166c3c9f16d9/.github/CONTRIBUTING.md#force-user-decisions-to-best-practices)\"\r\n\r\nLet me know your thoughts. If that makes sense, then I can update the code, run some tests, and update docs in a PR. \r\n\n", "before_files": [{"content": "# Copyright The PyTorch Lightning team.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Any, Callable, Mapping, Optional, Sequence, Tuple, Type, Union\n\nimport torch\nfrom pytorch_lightning.utilities import rank_zero_warn\nfrom torch import nn\nfrom torch.nn import functional as F\nfrom torchmetrics import Accuracy, Metric\n\nfrom flash.core.data.data_source import DefaultDataKeys\nfrom flash.core.model import Task\nfrom flash.core.registry import FlashRegistry\nfrom flash.core.utilities.imports import _IMAGE_AVAILABLE\nfrom flash.core.utilities.isinstance import _isinstance\nfrom flash.image.classification.data import ImageClassificationPreprocess\n\nif _IMAGE_AVAILABLE:\n from flash.image.classification.backbones import IMAGE_CLASSIFIER_BACKBONES\nelse:\n IMAGE_CLASSIFIER_BACKBONES = FlashRegistry(\"backbones\")\n\n\nclass ImageEmbedder(Task):\n \"\"\"The ``ImageEmbedder`` is a :class:`~flash.Task` for obtaining feature vectors (embeddings) from images. For\n more details, see :ref:`image_embedder`.\n\n Args:\n embedding_dim: Dimension of the embedded vector. ``None`` uses the default from the backbone.\n backbone: A model to use to extract image features, defaults to ``\"swav-imagenet\"``.\n pretrained: Use a pretrained backbone, defaults to ``True``.\n loss_fn: Loss function for training and finetuning, defaults to :func:`torch.nn.functional.cross_entropy`\n optimizer: Optimizer to use for training and finetuning, defaults to :class:`torch.optim.SGD`.\n metrics: Metrics to compute for training and evaluation. Can either be an metric from the `torchmetrics`\n package, a custom metric inherenting from `torchmetrics.Metric`, a callable function or a list/dict\n containing a combination of the aforementioned. In all cases, each metric needs to have the signature\n `metric(preds,target)` and return a single scalar tensor. Defaults to :class:`torchmetrics.Accuracy`.\n learning_rate: Learning rate to use for training, defaults to ``1e-3``.\n pooling_fn: Function used to pool image to generate embeddings, defaults to :func:`torch.max`.\n \"\"\"\n\n backbones: FlashRegistry = IMAGE_CLASSIFIER_BACKBONES\n\n required_extras: str = \"image\"\n\n def __init__(\n self,\n embedding_dim: Optional[int] = None,\n backbone: str = \"resnet101\",\n pretrained: bool = True,\n loss_fn: Callable = F.cross_entropy,\n optimizer: Type[torch.optim.Optimizer] = torch.optim.SGD,\n metrics: Union[Metric, Callable, Mapping, Sequence, None] = (Accuracy()),\n learning_rate: float = 1e-3,\n pooling_fn: Callable = torch.max,\n ):\n super().__init__(\n model=None,\n loss_fn=loss_fn,\n optimizer=optimizer,\n metrics=metrics,\n learning_rate=learning_rate,\n preprocess=ImageClassificationPreprocess(),\n )\n\n self.save_hyperparameters()\n self.backbone_name = backbone\n self.embedding_dim = embedding_dim\n assert pooling_fn in [torch.mean, torch.max]\n self.pooling_fn = pooling_fn\n\n self.backbone, num_features = self.backbones.get(backbone)(pretrained=pretrained)\n\n if embedding_dim is None:\n self.head = nn.Identity()\n else:\n self.head = nn.Sequential(\n nn.Flatten(),\n nn.Linear(num_features, embedding_dim),\n )\n rank_zero_warn(\"Adding linear layer on top of backbone. Remember to finetune first before using!\")\n\n def apply_pool(self, x):\n x = self.pooling_fn(x, dim=-1)\n if _isinstance(x, Tuple[torch.Tensor, torch.Tensor]):\n x = x[0]\n x = self.pooling_fn(x, dim=-1)\n if _isinstance(x, Tuple[torch.Tensor, torch.Tensor]):\n x = x[0]\n return x\n\n def forward(self, x) -> torch.Tensor:\n x = self.backbone(x)\n\n # bolts ssl models return lists\n if isinstance(x, tuple):\n x = x[-1]\n\n if x.dim() == 4 and self.embedding_dim:\n x = self.apply_pool(x)\n\n x = self.head(x)\n return x\n\n def training_step(self, batch: Any, batch_idx: int) -> Any:\n batch = (batch[DefaultDataKeys.INPUT], batch[DefaultDataKeys.TARGET])\n return super().training_step(batch, batch_idx)\n\n def validation_step(self, batch: Any, batch_idx: int) -> Any:\n batch = (batch[DefaultDataKeys.INPUT], batch[DefaultDataKeys.TARGET])\n return super().validation_step(batch, batch_idx)\n\n def test_step(self, batch: Any, batch_idx: int) -> Any:\n batch = (batch[DefaultDataKeys.INPUT], batch[DefaultDataKeys.TARGET])\n return super().test_step(batch, batch_idx)\n\n def predict_step(self, batch: Any, batch_idx: int, dataloader_idx: int = 0) -> Any:\n batch = batch[DefaultDataKeys.INPUT]\n return super().predict_step(batch, batch_idx, dataloader_idx=dataloader_idx)\n", "path": "flash/image/embedding/model.py"}], "after_files": [{"content": "# Copyright The PyTorch Lightning team.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Any, Callable, Mapping, Optional, Sequence, Tuple, Type, Union\n\nimport torch\nfrom pytorch_lightning.utilities import rank_zero_warn\nfrom torch import nn\nfrom torch.nn import functional as F\nfrom torchmetrics import Accuracy, Metric\n\nfrom flash.core.data.data_source import DefaultDataKeys\nfrom flash.core.model import Task\nfrom flash.core.registry import FlashRegistry\nfrom flash.core.utilities.imports import _IMAGE_AVAILABLE\nfrom flash.core.utilities.isinstance import _isinstance\nfrom flash.image.classification.data import ImageClassificationPreprocess\n\nif _IMAGE_AVAILABLE:\n from flash.image.classification.backbones import IMAGE_CLASSIFIER_BACKBONES\nelse:\n IMAGE_CLASSIFIER_BACKBONES = FlashRegistry(\"backbones\")\n\n\nclass ImageEmbedder(Task):\n \"\"\"The ``ImageEmbedder`` is a :class:`~flash.Task` for obtaining feature vectors (embeddings) from images. For\n more details, see :ref:`image_embedder`.\n\n Args:\n embedding_dim: Dimension of the embedded vector. ``None`` uses the default from the backbone.\n backbone: A model to use to extract image features, defaults to ``\"swav-imagenet\"``.\n pretrained: Use a pretrained backbone, defaults to ``True``.\n loss_fn: Loss function for training and finetuning, defaults to :func:`torch.nn.functional.cross_entropy`\n optimizer: Optimizer to use for training and finetuning, defaults to :class:`torch.optim.SGD`.\n metrics: Metrics to compute for training and evaluation. Can either be an metric from the `torchmetrics`\n package, a custom metric inherenting from `torchmetrics.Metric`, a callable function or a list/dict\n containing a combination of the aforementioned. In all cases, each metric needs to have the signature\n `metric(preds,target)` and return a single scalar tensor. Defaults to :class:`torchmetrics.Accuracy`.\n learning_rate: Learning rate to use for training, defaults to ``1e-3``.\n pooling_fn: Function used to pool image to generate embeddings, defaults to :func:`torch.max`.\n \"\"\"\n\n backbones: FlashRegistry = IMAGE_CLASSIFIER_BACKBONES\n\n required_extras: str = \"image\"\n\n def __init__(\n self,\n embedding_dim: Optional[int] = None,\n backbone: str = \"resnet101\",\n pretrained: bool = True,\n loss_fn: Callable = F.cross_entropy,\n optimizer: Type[torch.optim.Optimizer] = torch.optim.SGD,\n metrics: Union[Metric, Callable, Mapping, Sequence, None] = (Accuracy()),\n learning_rate: float = 1e-3,\n pooling_fn: Callable = torch.max,\n ):\n super().__init__(\n model=None,\n loss_fn=loss_fn,\n optimizer=optimizer,\n metrics=metrics,\n learning_rate=learning_rate,\n preprocess=ImageClassificationPreprocess(),\n )\n\n self.save_hyperparameters()\n self.backbone_name = backbone\n self.embedding_dim = embedding_dim\n assert pooling_fn in [torch.mean, torch.max]\n self.pooling_fn = pooling_fn\n\n self.backbone, num_features = self.backbones.get(backbone)(pretrained=pretrained)\n\n if embedding_dim is None:\n self.head = nn.Identity()\n else:\n self.head = nn.Sequential(\n nn.Flatten(),\n nn.Linear(num_features, embedding_dim),\n )\n rank_zero_warn(\"Adding linear layer on top of backbone. Remember to finetune first before using!\")\n\n def apply_pool(self, x):\n x = self.pooling_fn(x, dim=-1)\n if _isinstance(x, Tuple[torch.Tensor, torch.Tensor]):\n x = x[0]\n x = self.pooling_fn(x, dim=-1)\n if _isinstance(x, Tuple[torch.Tensor, torch.Tensor]):\n x = x[0]\n return x\n\n def forward(self, x) -> torch.Tensor:\n x = self.backbone(x)\n\n # bolts ssl models return lists\n if isinstance(x, tuple):\n x = x[-1]\n\n if x.dim() == 4 and not self.embedding_dim:\n x = self.apply_pool(x)\n\n x = self.head(x)\n return x\n\n def training_step(self, batch: Any, batch_idx: int) -> Any:\n batch = (batch[DefaultDataKeys.INPUT], batch[DefaultDataKeys.TARGET])\n return super().training_step(batch, batch_idx)\n\n def validation_step(self, batch: Any, batch_idx: int) -> Any:\n batch = (batch[DefaultDataKeys.INPUT], batch[DefaultDataKeys.TARGET])\n return super().validation_step(batch, batch_idx)\n\n def test_step(self, batch: Any, batch_idx: int) -> Any:\n batch = (batch[DefaultDataKeys.INPUT], batch[DefaultDataKeys.TARGET])\n return super().test_step(batch, batch_idx)\n\n def predict_step(self, batch: Any, batch_idx: int, dataloader_idx: int = 0) -> Any:\n batch = batch[DefaultDataKeys.INPUT]\n return super().predict_step(batch, batch_idx, dataloader_idx=dataloader_idx)\n", "path": "flash/image/embedding/model.py"}]}
| 2,636 | 114 |
gh_patches_debug_40235
|
rasdani/github-patches
|
git_diff
|
qutebrowser__qutebrowser-3385
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Completion: match terms in any order
[This regex](https://github.com/The-Compiler/qutebrowser/blob/master/qutebrowser/completion/models/sortfilter.py#L75), which is used to filter items for completion, constrains the match to the same order the terms are supplied by the user. I.e., "browser qute" does not match "qutebrowser". According to [this post](http://stackoverflow.com/a/18944925/4180822), this issue can be solved by constructing better regular expression using positive lookaheads. I think it's worthwhile to examine the performance impact of the solution, or find a different one.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `qutebrowser/completion/models/histcategory.py`
Content:
```
1 # vim: ft=python fileencoding=utf-8 sts=4 sw=4 et:
2
3 # Copyright 2017 Ryan Roden-Corrent (rcorre) <ryan@rcorre.net>
4 #
5 # This file is part of qutebrowser.
6 #
7 # qutebrowser is free software: you can redistribute it and/or modify
8 # it under the terms of the GNU General Public License as published by
9 # the Free Software Foundation, either version 3 of the License, or
10 # (at your option) any later version.
11 #
12 # qutebrowser is distributed in the hope that it will be useful,
13 # but WITHOUT ANY WARRANTY; without even the implied warranty of
14 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
15 # GNU General Public License for more details.
16 #
17 # You should have received a copy of the GNU General Public License
18 # along with qutebrowser. If not, see <http://www.gnu.org/licenses/>.
19
20 """A completion category that queries the SQL History store."""
21
22 import re
23
24 from PyQt5.QtSql import QSqlQueryModel
25
26 from qutebrowser.misc import sql
27 from qutebrowser.utils import debug
28 from qutebrowser.config import config
29
30
31 class HistoryCategory(QSqlQueryModel):
32
33 """A completion category that queries the SQL History store."""
34
35 def __init__(self, *, delete_func=None, parent=None):
36 """Create a new History completion category."""
37 super().__init__(parent=parent)
38 self.name = "History"
39
40 # replace ' in timestamp-format to avoid breaking the query
41 timestamp_format = config.val.completion.timestamp_format
42 timefmt = ("strftime('{}', last_atime, 'unixepoch', 'localtime')"
43 .format(timestamp_format.replace("'", "`")))
44
45 self._query = sql.Query(' '.join([
46 "SELECT url, title, {}".format(timefmt),
47 "FROM CompletionHistory",
48 # the incoming pattern will have literal % and _ escaped with '\'
49 # we need to tell sql to treat '\' as an escape character
50 "WHERE ((url || title) LIKE :pat escape '\\')",
51 self._atime_expr(),
52 "ORDER BY last_atime DESC",
53 ]), forward_only=False)
54
55 # advertise that this model filters by URL and title
56 self.columns_to_filter = [0, 1]
57 self.delete_func = delete_func
58
59 def _atime_expr(self):
60 """If max_items is set, return an expression to limit the query."""
61 max_items = config.val.completion.web_history_max_items
62 # HistoryCategory should not be added to the completion in that case.
63 assert max_items != 0
64
65 if max_items < 0:
66 return ''
67
68 min_atime = sql.Query(' '.join([
69 'SELECT min(last_atime) FROM',
70 '(SELECT last_atime FROM CompletionHistory',
71 'ORDER BY last_atime DESC LIMIT :limit)',
72 ])).run(limit=max_items).value()
73
74 if not min_atime:
75 # if there are no history items, min_atime may be '' (issue #2849)
76 return ''
77
78 return "AND last_atime >= {}".format(min_atime)
79
80 def set_pattern(self, pattern):
81 """Set the pattern used to filter results.
82
83 Args:
84 pattern: string pattern to filter by.
85 """
86 # escape to treat a user input % or _ as a literal, not a wildcard
87 pattern = pattern.replace('%', '\\%')
88 pattern = pattern.replace('_', '\\_')
89 # treat spaces as wildcards to match any of the typed words
90 pattern = re.sub(r' +', '%', pattern)
91 pattern = '%{}%'.format(pattern)
92 with debug.log_time('sql', 'Running completion query'):
93 self._query.run(pat=pattern)
94 self.setQuery(self._query)
95
96 def removeRows(self, row, _count, _parent=None):
97 """Override QAbstractItemModel::removeRows to re-run sql query."""
98 # re-run query to reload updated table
99 with debug.log_time('sql', 'Re-running completion query post-delete'):
100 self._query.run()
101 self.setQuery(self._query)
102 while self.rowCount() < row:
103 self.fetchMore()
104 return True
105
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/qutebrowser/completion/models/histcategory.py b/qutebrowser/completion/models/histcategory.py
--- a/qutebrowser/completion/models/histcategory.py
+++ b/qutebrowser/completion/models/histcategory.py
@@ -19,8 +19,6 @@
"""A completion category that queries the SQL History store."""
-import re
-
from PyQt5.QtSql import QSqlQueryModel
from qutebrowser.misc import sql
@@ -36,21 +34,7 @@
"""Create a new History completion category."""
super().__init__(parent=parent)
self.name = "History"
-
- # replace ' in timestamp-format to avoid breaking the query
- timestamp_format = config.val.completion.timestamp_format
- timefmt = ("strftime('{}', last_atime, 'unixepoch', 'localtime')"
- .format(timestamp_format.replace("'", "`")))
-
- self._query = sql.Query(' '.join([
- "SELECT url, title, {}".format(timefmt),
- "FROM CompletionHistory",
- # the incoming pattern will have literal % and _ escaped with '\'
- # we need to tell sql to treat '\' as an escape character
- "WHERE ((url || title) LIKE :pat escape '\\')",
- self._atime_expr(),
- "ORDER BY last_atime DESC",
- ]), forward_only=False)
+ self._query = None
# advertise that this model filters by URL and title
self.columns_to_filter = [0, 1]
@@ -86,11 +70,36 @@
# escape to treat a user input % or _ as a literal, not a wildcard
pattern = pattern.replace('%', '\\%')
pattern = pattern.replace('_', '\\_')
- # treat spaces as wildcards to match any of the typed words
- pattern = re.sub(r' +', '%', pattern)
- pattern = '%{}%'.format(pattern)
+ words = ['%{}%'.format(w) for w in pattern.split(' ')]
+
+ # build a where clause to match all of the words in any order
+ # given the search term "a b", the WHERE clause would be:
+ # ((url || title) LIKE '%a%') AND ((url || title) LIKE '%b%')
+ where_clause = ' AND '.join(
+ "(url || title) LIKE :{} escape '\\'".format(i)
+ for i in range(len(words)))
+
+ # replace ' in timestamp-format to avoid breaking the query
+ timestamp_format = config.val.completion.timestamp_format
+ timefmt = ("strftime('{}', last_atime, 'unixepoch', 'localtime')"
+ .format(timestamp_format.replace("'", "`")))
+
+ if not self._query or len(words) != len(self._query.boundValues()):
+ # if the number of words changed, we need to generate a new query
+ # otherwise, we can reuse the prepared query for performance
+ self._query = sql.Query(' '.join([
+ "SELECT url, title, {}".format(timefmt),
+ "FROM CompletionHistory",
+ # the incoming pattern will have literal % and _ escaped
+ # we need to tell sql to treat '\' as an escape character
+ 'WHERE ({})'.format(where_clause),
+ self._atime_expr(),
+ "ORDER BY last_atime DESC",
+ ]), forward_only=False)
+
with debug.log_time('sql', 'Running completion query'):
- self._query.run(pat=pattern)
+ self._query.run(**{
+ str(i): w for i, w in enumerate(words)})
self.setQuery(self._query)
def removeRows(self, row, _count, _parent=None):
|
{"golden_diff": "diff --git a/qutebrowser/completion/models/histcategory.py b/qutebrowser/completion/models/histcategory.py\n--- a/qutebrowser/completion/models/histcategory.py\n+++ b/qutebrowser/completion/models/histcategory.py\n@@ -19,8 +19,6 @@\n \n \"\"\"A completion category that queries the SQL History store.\"\"\"\n \n-import re\n-\n from PyQt5.QtSql import QSqlQueryModel\n \n from qutebrowser.misc import sql\n@@ -36,21 +34,7 @@\n \"\"\"Create a new History completion category.\"\"\"\n super().__init__(parent=parent)\n self.name = \"History\"\n-\n- # replace ' in timestamp-format to avoid breaking the query\n- timestamp_format = config.val.completion.timestamp_format\n- timefmt = (\"strftime('{}', last_atime, 'unixepoch', 'localtime')\"\n- .format(timestamp_format.replace(\"'\", \"`\")))\n-\n- self._query = sql.Query(' '.join([\n- \"SELECT url, title, {}\".format(timefmt),\n- \"FROM CompletionHistory\",\n- # the incoming pattern will have literal % and _ escaped with '\\'\n- # we need to tell sql to treat '\\' as an escape character\n- \"WHERE ((url || title) LIKE :pat escape '\\\\')\",\n- self._atime_expr(),\n- \"ORDER BY last_atime DESC\",\n- ]), forward_only=False)\n+ self._query = None\n \n # advertise that this model filters by URL and title\n self.columns_to_filter = [0, 1]\n@@ -86,11 +70,36 @@\n # escape to treat a user input % or _ as a literal, not a wildcard\n pattern = pattern.replace('%', '\\\\%')\n pattern = pattern.replace('_', '\\\\_')\n- # treat spaces as wildcards to match any of the typed words\n- pattern = re.sub(r' +', '%', pattern)\n- pattern = '%{}%'.format(pattern)\n+ words = ['%{}%'.format(w) for w in pattern.split(' ')]\n+\n+ # build a where clause to match all of the words in any order\n+ # given the search term \"a b\", the WHERE clause would be:\n+ # ((url || title) LIKE '%a%') AND ((url || title) LIKE '%b%')\n+ where_clause = ' AND '.join(\n+ \"(url || title) LIKE :{} escape '\\\\'\".format(i)\n+ for i in range(len(words)))\n+\n+ # replace ' in timestamp-format to avoid breaking the query\n+ timestamp_format = config.val.completion.timestamp_format\n+ timefmt = (\"strftime('{}', last_atime, 'unixepoch', 'localtime')\"\n+ .format(timestamp_format.replace(\"'\", \"`\")))\n+\n+ if not self._query or len(words) != len(self._query.boundValues()):\n+ # if the number of words changed, we need to generate a new query\n+ # otherwise, we can reuse the prepared query for performance\n+ self._query = sql.Query(' '.join([\n+ \"SELECT url, title, {}\".format(timefmt),\n+ \"FROM CompletionHistory\",\n+ # the incoming pattern will have literal % and _ escaped\n+ # we need to tell sql to treat '\\' as an escape character\n+ 'WHERE ({})'.format(where_clause),\n+ self._atime_expr(),\n+ \"ORDER BY last_atime DESC\",\n+ ]), forward_only=False)\n+\n with debug.log_time('sql', 'Running completion query'):\n- self._query.run(pat=pattern)\n+ self._query.run(**{\n+ str(i): w for i, w in enumerate(words)})\n self.setQuery(self._query)\n \n def removeRows(self, row, _count, _parent=None):\n", "issue": "Completion: match terms in any order\n[This regex](https://github.com/The-Compiler/qutebrowser/blob/master/qutebrowser/completion/models/sortfilter.py#L75), which is used to filter items for completion, constrains the match to the same order the terms are supplied by the user. I.e., \"browser qute\" does not match \"qutebrowser\". According to [this post](http://stackoverflow.com/a/18944925/4180822), this issue can be solved by constructing better regular expression using positive lookaheads. I think it's worthwhile to examine the performance impact of the solution, or find a different one.\n\n", "before_files": [{"content": "# vim: ft=python fileencoding=utf-8 sts=4 sw=4 et:\n\n# Copyright 2017 Ryan Roden-Corrent (rcorre) <ryan@rcorre.net>\n#\n# This file is part of qutebrowser.\n#\n# qutebrowser is free software: you can redistribute it and/or modify\n# it under the terms of the GNU General Public License as published by\n# the Free Software Foundation, either version 3 of the License, or\n# (at your option) any later version.\n#\n# qutebrowser is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n# GNU General Public License for more details.\n#\n# You should have received a copy of the GNU General Public License\n# along with qutebrowser. If not, see <http://www.gnu.org/licenses/>.\n\n\"\"\"A completion category that queries the SQL History store.\"\"\"\n\nimport re\n\nfrom PyQt5.QtSql import QSqlQueryModel\n\nfrom qutebrowser.misc import sql\nfrom qutebrowser.utils import debug\nfrom qutebrowser.config import config\n\n\nclass HistoryCategory(QSqlQueryModel):\n\n \"\"\"A completion category that queries the SQL History store.\"\"\"\n\n def __init__(self, *, delete_func=None, parent=None):\n \"\"\"Create a new History completion category.\"\"\"\n super().__init__(parent=parent)\n self.name = \"History\"\n\n # replace ' in timestamp-format to avoid breaking the query\n timestamp_format = config.val.completion.timestamp_format\n timefmt = (\"strftime('{}', last_atime, 'unixepoch', 'localtime')\"\n .format(timestamp_format.replace(\"'\", \"`\")))\n\n self._query = sql.Query(' '.join([\n \"SELECT url, title, {}\".format(timefmt),\n \"FROM CompletionHistory\",\n # the incoming pattern will have literal % and _ escaped with '\\'\n # we need to tell sql to treat '\\' as an escape character\n \"WHERE ((url || title) LIKE :pat escape '\\\\')\",\n self._atime_expr(),\n \"ORDER BY last_atime DESC\",\n ]), forward_only=False)\n\n # advertise that this model filters by URL and title\n self.columns_to_filter = [0, 1]\n self.delete_func = delete_func\n\n def _atime_expr(self):\n \"\"\"If max_items is set, return an expression to limit the query.\"\"\"\n max_items = config.val.completion.web_history_max_items\n # HistoryCategory should not be added to the completion in that case.\n assert max_items != 0\n\n if max_items < 0:\n return ''\n\n min_atime = sql.Query(' '.join([\n 'SELECT min(last_atime) FROM',\n '(SELECT last_atime FROM CompletionHistory',\n 'ORDER BY last_atime DESC LIMIT :limit)',\n ])).run(limit=max_items).value()\n\n if not min_atime:\n # if there are no history items, min_atime may be '' (issue #2849)\n return ''\n\n return \"AND last_atime >= {}\".format(min_atime)\n\n def set_pattern(self, pattern):\n \"\"\"Set the pattern used to filter results.\n\n Args:\n pattern: string pattern to filter by.\n \"\"\"\n # escape to treat a user input % or _ as a literal, not a wildcard\n pattern = pattern.replace('%', '\\\\%')\n pattern = pattern.replace('_', '\\\\_')\n # treat spaces as wildcards to match any of the typed words\n pattern = re.sub(r' +', '%', pattern)\n pattern = '%{}%'.format(pattern)\n with debug.log_time('sql', 'Running completion query'):\n self._query.run(pat=pattern)\n self.setQuery(self._query)\n\n def removeRows(self, row, _count, _parent=None):\n \"\"\"Override QAbstractItemModel::removeRows to re-run sql query.\"\"\"\n # re-run query to reload updated table\n with debug.log_time('sql', 'Re-running completion query post-delete'):\n self._query.run()\n self.setQuery(self._query)\n while self.rowCount() < row:\n self.fetchMore()\n return True\n", "path": "qutebrowser/completion/models/histcategory.py"}], "after_files": [{"content": "# vim: ft=python fileencoding=utf-8 sts=4 sw=4 et:\n\n# Copyright 2017 Ryan Roden-Corrent (rcorre) <ryan@rcorre.net>\n#\n# This file is part of qutebrowser.\n#\n# qutebrowser is free software: you can redistribute it and/or modify\n# it under the terms of the GNU General Public License as published by\n# the Free Software Foundation, either version 3 of the License, or\n# (at your option) any later version.\n#\n# qutebrowser is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n# GNU General Public License for more details.\n#\n# You should have received a copy of the GNU General Public License\n# along with qutebrowser. If not, see <http://www.gnu.org/licenses/>.\n\n\"\"\"A completion category that queries the SQL History store.\"\"\"\n\nfrom PyQt5.QtSql import QSqlQueryModel\n\nfrom qutebrowser.misc import sql\nfrom qutebrowser.utils import debug\nfrom qutebrowser.config import config\n\n\nclass HistoryCategory(QSqlQueryModel):\n\n \"\"\"A completion category that queries the SQL History store.\"\"\"\n\n def __init__(self, *, delete_func=None, parent=None):\n \"\"\"Create a new History completion category.\"\"\"\n super().__init__(parent=parent)\n self.name = \"History\"\n self._query = None\n\n # advertise that this model filters by URL and title\n self.columns_to_filter = [0, 1]\n self.delete_func = delete_func\n\n def _atime_expr(self):\n \"\"\"If max_items is set, return an expression to limit the query.\"\"\"\n max_items = config.val.completion.web_history_max_items\n # HistoryCategory should not be added to the completion in that case.\n assert max_items != 0\n\n if max_items < 0:\n return ''\n\n min_atime = sql.Query(' '.join([\n 'SELECT min(last_atime) FROM',\n '(SELECT last_atime FROM CompletionHistory',\n 'ORDER BY last_atime DESC LIMIT :limit)',\n ])).run(limit=max_items).value()\n\n if not min_atime:\n # if there are no history items, min_atime may be '' (issue #2849)\n return ''\n\n return \"AND last_atime >= {}\".format(min_atime)\n\n def set_pattern(self, pattern):\n \"\"\"Set the pattern used to filter results.\n\n Args:\n pattern: string pattern to filter by.\n \"\"\"\n # escape to treat a user input % or _ as a literal, not a wildcard\n pattern = pattern.replace('%', '\\\\%')\n pattern = pattern.replace('_', '\\\\_')\n words = ['%{}%'.format(w) for w in pattern.split(' ')]\n\n # build a where clause to match all of the words in any order\n # given the search term \"a b\", the WHERE clause would be:\n # ((url || title) LIKE '%a%') AND ((url || title) LIKE '%b%')\n where_clause = ' AND '.join(\n \"(url || title) LIKE :{} escape '\\\\'\".format(i)\n for i in range(len(words)))\n\n # replace ' in timestamp-format to avoid breaking the query\n timestamp_format = config.val.completion.timestamp_format\n timefmt = (\"strftime('{}', last_atime, 'unixepoch', 'localtime')\"\n .format(timestamp_format.replace(\"'\", \"`\")))\n\n if not self._query or len(words) != len(self._query.boundValues()):\n # if the number of words changed, we need to generate a new query\n # otherwise, we can reuse the prepared query for performance\n self._query = sql.Query(' '.join([\n \"SELECT url, title, {}\".format(timefmt),\n \"FROM CompletionHistory\",\n # the incoming pattern will have literal % and _ escaped\n # we need to tell sql to treat '\\' as an escape character\n 'WHERE ({})'.format(where_clause),\n self._atime_expr(),\n \"ORDER BY last_atime DESC\",\n ]), forward_only=False)\n\n with debug.log_time('sql', 'Running completion query'):\n self._query.run(**{\n str(i): w for i, w in enumerate(words)})\n self.setQuery(self._query)\n\n def removeRows(self, row, _count, _parent=None):\n \"\"\"Override QAbstractItemModel::removeRows to re-run sql query.\"\"\"\n # re-run query to reload updated table\n with debug.log_time('sql', 'Re-running completion query post-delete'):\n self._query.run()\n self.setQuery(self._query)\n while self.rowCount() < row:\n self.fetchMore()\n return True\n", "path": "qutebrowser/completion/models/histcategory.py"}]}
| 1,521 | 821 |
gh_patches_debug_5403
|
rasdani/github-patches
|
git_diff
|
ivy-llc__ivy-18003
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
fft2
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `ivy/functional/frontends/scipy/fft/fft.py`
Content:
```
1 # global
2 import ivy
3 from ivy.functional.frontends.scipy.func_wrapper import (
4 to_ivy_arrays_and_back,
5 )
6
7
8 # fft
9 @to_ivy_arrays_and_back
10 def fft(x, n=None, axis=-1, norm="backward", overwrite_x=False):
11 return ivy.fft(x, axis, norm=norm, n=n)
12
13
14 # ifft
15 @to_ivy_arrays_and_back
16 def ifft(x, n=None, axis=-1, norm="backward", overwrite_x=False):
17 return ivy.ifft(x, axis, norm=norm, n=n)
18
19
20 # dct
21 @to_ivy_arrays_and_back
22 def dct(x, type=2, n=None, axis=-1, norm=None, overwrite_x=False, orthogonalize=None):
23 return ivy.dct(x, type=type, n=n, axis=axis, norm=norm)
24
25
26 # idct
27 @to_ivy_arrays_and_back
28 def idct(x, type=2, n=None, axis=-1, norm=None, overwrite_x=False, orthogonalize=None):
29 inverse_type = {1: 1, 2: 3, 3: 2, 4: 4}[type]
30 return ivy.dct(x, type=inverse_type, n=n, axis=axis, norm=norm)
31
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/ivy/functional/frontends/scipy/fft/fft.py b/ivy/functional/frontends/scipy/fft/fft.py
--- a/ivy/functional/frontends/scipy/fft/fft.py
+++ b/ivy/functional/frontends/scipy/fft/fft.py
@@ -28,3 +28,8 @@
def idct(x, type=2, n=None, axis=-1, norm=None, overwrite_x=False, orthogonalize=None):
inverse_type = {1: 1, 2: 3, 3: 2, 4: 4}[type]
return ivy.dct(x, type=inverse_type, n=n, axis=axis, norm=norm)
+
+
+@to_ivy_arrays_and_back
+def fft2(x, s=None, axes=(-2, -1), norm=None, overwrite_x=False):
+ return ivy.fft2(x, s=s, dim=axes, norm=norm)
|
{"golden_diff": "diff --git a/ivy/functional/frontends/scipy/fft/fft.py b/ivy/functional/frontends/scipy/fft/fft.py\n--- a/ivy/functional/frontends/scipy/fft/fft.py\n+++ b/ivy/functional/frontends/scipy/fft/fft.py\n@@ -28,3 +28,8 @@\n def idct(x, type=2, n=None, axis=-1, norm=None, overwrite_x=False, orthogonalize=None):\n inverse_type = {1: 1, 2: 3, 3: 2, 4: 4}[type]\n return ivy.dct(x, type=inverse_type, n=n, axis=axis, norm=norm)\n+\n+\n+@to_ivy_arrays_and_back\n+def fft2(x, s=None, axes=(-2, -1), norm=None, overwrite_x=False):\n+ return ivy.fft2(x, s=s, dim=axes, norm=norm)\n", "issue": "fft2\n\n", "before_files": [{"content": "# global\nimport ivy\nfrom ivy.functional.frontends.scipy.func_wrapper import (\n to_ivy_arrays_and_back,\n)\n\n\n# fft\n@to_ivy_arrays_and_back\ndef fft(x, n=None, axis=-1, norm=\"backward\", overwrite_x=False):\n return ivy.fft(x, axis, norm=norm, n=n)\n\n\n# ifft\n@to_ivy_arrays_and_back\ndef ifft(x, n=None, axis=-1, norm=\"backward\", overwrite_x=False):\n return ivy.ifft(x, axis, norm=norm, n=n)\n\n\n# dct\n@to_ivy_arrays_and_back\ndef dct(x, type=2, n=None, axis=-1, norm=None, overwrite_x=False, orthogonalize=None):\n return ivy.dct(x, type=type, n=n, axis=axis, norm=norm)\n\n\n# idct\n@to_ivy_arrays_and_back\ndef idct(x, type=2, n=None, axis=-1, norm=None, overwrite_x=False, orthogonalize=None):\n inverse_type = {1: 1, 2: 3, 3: 2, 4: 4}[type]\n return ivy.dct(x, type=inverse_type, n=n, axis=axis, norm=norm)\n", "path": "ivy/functional/frontends/scipy/fft/fft.py"}], "after_files": [{"content": "# global\nimport ivy\nfrom ivy.functional.frontends.scipy.func_wrapper import (\n to_ivy_arrays_and_back,\n)\n\n\n# fft\n@to_ivy_arrays_and_back\ndef fft(x, n=None, axis=-1, norm=\"backward\", overwrite_x=False):\n return ivy.fft(x, axis, norm=norm, n=n)\n\n\n# ifft\n@to_ivy_arrays_and_back\ndef ifft(x, n=None, axis=-1, norm=\"backward\", overwrite_x=False):\n return ivy.ifft(x, axis, norm=norm, n=n)\n\n\n# dct\n@to_ivy_arrays_and_back\ndef dct(x, type=2, n=None, axis=-1, norm=None, overwrite_x=False, orthogonalize=None):\n return ivy.dct(x, type=type, n=n, axis=axis, norm=norm)\n\n\n# idct\n@to_ivy_arrays_and_back\ndef idct(x, type=2, n=None, axis=-1, norm=None, overwrite_x=False, orthogonalize=None):\n inverse_type = {1: 1, 2: 3, 3: 2, 4: 4}[type]\n return ivy.dct(x, type=inverse_type, n=n, axis=axis, norm=norm)\n\n\n@to_ivy_arrays_and_back\ndef fft2(x, s=None, axes=(-2, -1), norm=None, overwrite_x=False):\n return ivy.fft2(x, s=s, dim=axes, norm=norm)\n", "path": "ivy/functional/frontends/scipy/fft/fft.py"}]}
| 606 | 211 |
gh_patches_debug_6100
|
rasdani/github-patches
|
git_diff
|
conan-io__conan-center-index-2011
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[bug] Some system packages fail to install (xorg/system, opengl/system)
<!--
Please don't forget to update the issue title.
Include all applicable information to help us reproduce your problem.
To help us debug your issue please explain:
--> When configuring a cmake project template that used conan i encountered an error while configuring cmake :
`opengl/system: ERROR: while executing system_requirements(): 'SysConfigOpenGLConan' object has no attribute 'warn'`. After attemping a manual installation with `conan install opengl/system@` i still have the same problem and output. I have no idea what could be causing this and after a few hours of googling i failed to find anything that would explain this. I also get the exact same output when installing `xorg/system`
### Environment Details (include every applicable attribute)
* Operating System+version: Pop!_OS 20.04 LTS x86_64
* Compiler+version: gcc (Ubuntu 9.3.0-10ubuntu2) 9.3.0
* Conan version: Conan version 1.26.0
* Python version: Python 2.7.18rc1 AND Python 3.8.2
### Steps to reproduce (Include if Applicable)
`conan install opengl/system@`
`conan install xorg/system@`
### Logs (Executed commands with output) (Include/Attach if Applicable)
```
Configuration:
[settings]
arch=x86_64
arch_build=x86_64
build_type=Release
os=Linux
os_build=Linux
[options]
[build_requires]
[env]
Installing package: opengl/system
Requirements
opengl/system from 'conan-center' - Cache
Packages
opengl/system:5ab84d6acfe1f23c4fae0ab88f26e3a396351ac9 - Cache
Installing (downloading, building) binaries...
opengl/system: ERROR: while executing system_requirements(): 'SysConfigOpenGLConan' object has no attribute 'warn'
ERROR: Error in system requirements
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `recipes/opengl/all/conanfile.py`
Content:
```
1 from conans import ConanFile, tools
2 from conans.errors import ConanException
3 import os
4
5
6 class SysConfigOpenGLConan(ConanFile):
7 name = "opengl"
8 version = "system"
9 description = "cross-platform virtual conan package for the OpenGL support"
10 topics = ("conan", "opengl", "gl")
11 url = "https://github.com/conan-io/conan-center-index"
12 homepage = "https://www.opengl.org/"
13 license = "MIT"
14 settings = ("os",)
15
16 def package_id(self):
17 self.info.header_only()
18
19 def _fill_cppinfo_from_pkgconfig(self, name):
20 pkg_config = tools.PkgConfig(name)
21 if not pkg_config.provides:
22 raise ConanException("OpenGL development files aren't available, give up")
23 libs = [lib[2:] for lib in pkg_config.libs_only_l]
24 lib_dirs = [lib[2:] for lib in pkg_config.libs_only_L]
25 ldflags = [flag for flag in pkg_config.libs_only_other]
26 include_dirs = [include[2:] for include in pkg_config.cflags_only_I]
27 cflags = [flag for flag in pkg_config.cflags_only_other if not flag.startswith("-D")]
28 defines = [flag[2:] for flag in pkg_config.cflags_only_other if flag.startswith("-D")]
29
30 self.cpp_info.system_libs.extend(libs)
31 self.cpp_info.libdirs.extend(lib_dirs)
32 self.cpp_info.sharedlinkflags.extend(ldflags)
33 self.cpp_info.exelinkflags.extend(ldflags)
34 self.cpp_info.defines.extend(defines)
35 self.cpp_info.includedirs.extend(include_dirs)
36 self.cpp_info.cflags.extend(cflags)
37 self.cpp_info.cxxflags.extend(cflags)
38
39 def system_requirements(self):
40 if tools.os_info.is_linux and self.settings.os == "Linux":
41 package_tool = tools.SystemPackageTool(conanfile=self, default_mode='verify')
42 if tools.os_info.with_yum:
43 if tools.os_info.linux_distro == "fedora" and tools.os_info.os_version >= "32":
44 packages = ["libglvnd-devel"]
45 else:
46 packages = ["mesa-libGL-devel"]
47 elif tools.os_info.with_apt:
48 ubuntu_20_or_later = tools.os_info.linux_distro == "ubuntu" and tools.os_info.os_version >= "20"
49 debian_11_or_later = tools.os_info.linux_distro == "debian" and tools.os_info.os_version >= "11"
50 if ubuntu_20_or_later or debian_11_or_later:
51 packages = ["libgl-dev"]
52 else:
53 packages = ["libgl1-mesa-dev"]
54 elif tools.os_info.with_pacman:
55 packages = ["libglvnd"]
56 elif tools.os_info.with_zypper:
57 packages = ["Mesa-libGL-devel"]
58 else:
59 self.warn("don't know how to install OpenGL for your distro")
60 package_tool.install(update=True, packages=packages)
61
62 def package_info(self):
63 if self.settings.os == "Macos":
64 self.cpp_info.defines.append("GL_SILENCE_DEPRECATION=1")
65 self.cpp_info.frameworks.append("OpenGL")
66 elif self.settings.os == "Windows":
67 self.cpp_info.system_libs = ["OpenGL32"]
68 elif self.settings.os == "Linux":
69 self._fill_cppinfo_from_pkgconfig('gl')
70
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/recipes/opengl/all/conanfile.py b/recipes/opengl/all/conanfile.py
--- a/recipes/opengl/all/conanfile.py
+++ b/recipes/opengl/all/conanfile.py
@@ -56,7 +56,7 @@
elif tools.os_info.with_zypper:
packages = ["Mesa-libGL-devel"]
else:
- self.warn("don't know how to install OpenGL for your distro")
+ self.output.warn("Don't know how to install OpenGL for your distro.")
package_tool.install(update=True, packages=packages)
def package_info(self):
|
{"golden_diff": "diff --git a/recipes/opengl/all/conanfile.py b/recipes/opengl/all/conanfile.py\n--- a/recipes/opengl/all/conanfile.py\n+++ b/recipes/opengl/all/conanfile.py\n@@ -56,7 +56,7 @@\n elif tools.os_info.with_zypper:\n packages = [\"Mesa-libGL-devel\"]\n else:\n- self.warn(\"don't know how to install OpenGL for your distro\")\n+ self.output.warn(\"Don't know how to install OpenGL for your distro.\")\n package_tool.install(update=True, packages=packages)\n \n def package_info(self):\n", "issue": "[bug] Some system packages fail to install (xorg/system, opengl/system)\n<!--\r\n Please don't forget to update the issue title.\r\n Include all applicable information to help us reproduce your problem.\r\n\r\n To help us debug your issue please explain:\r\n--> When configuring a cmake project template that used conan i encountered an error while configuring cmake : \r\n`opengl/system: ERROR: while executing system_requirements(): 'SysConfigOpenGLConan' object has no attribute 'warn'`. After attemping a manual installation with `conan install opengl/system@` i still have the same problem and output. I have no idea what could be causing this and after a few hours of googling i failed to find anything that would explain this. I also get the exact same output when installing `xorg/system`\r\n\r\n### Environment Details (include every applicable attribute)\r\n * Operating System+version: Pop!_OS 20.04 LTS x86_64\r\n * Compiler+version: gcc (Ubuntu 9.3.0-10ubuntu2) 9.3.0\r\n * Conan version: Conan version 1.26.0\r\n * Python version: Python 2.7.18rc1 AND Python 3.8.2\r\n\r\n### Steps to reproduce (Include if Applicable)\r\n`conan install opengl/system@`\r\n`conan install xorg/system@`\r\n\r\n### Logs (Executed commands with output) (Include/Attach if Applicable)\r\n\r\n```\r\nConfiguration:\r\n[settings]\r\narch=x86_64\r\narch_build=x86_64\r\nbuild_type=Release\r\nos=Linux\r\nos_build=Linux\r\n[options]\r\n[build_requires]\r\n[env]\r\n\r\nInstalling package: opengl/system\r\nRequirements\r\n opengl/system from 'conan-center' - Cache\r\nPackages\r\n opengl/system:5ab84d6acfe1f23c4fae0ab88f26e3a396351ac9 - Cache\r\n\r\nInstalling (downloading, building) binaries...\r\nopengl/system: ERROR: while executing system_requirements(): 'SysConfigOpenGLConan' object has no attribute 'warn'\r\nERROR: Error in system requirements\r\n```\r\n\n", "before_files": [{"content": "from conans import ConanFile, tools\nfrom conans.errors import ConanException\nimport os\n\n\nclass SysConfigOpenGLConan(ConanFile):\n name = \"opengl\"\n version = \"system\"\n description = \"cross-platform virtual conan package for the OpenGL support\"\n topics = (\"conan\", \"opengl\", \"gl\")\n url = \"https://github.com/conan-io/conan-center-index\"\n homepage = \"https://www.opengl.org/\"\n license = \"MIT\"\n settings = (\"os\",)\n\n def package_id(self):\n self.info.header_only()\n\n def _fill_cppinfo_from_pkgconfig(self, name):\n pkg_config = tools.PkgConfig(name)\n if not pkg_config.provides:\n raise ConanException(\"OpenGL development files aren't available, give up\")\n libs = [lib[2:] for lib in pkg_config.libs_only_l]\n lib_dirs = [lib[2:] for lib in pkg_config.libs_only_L]\n ldflags = [flag for flag in pkg_config.libs_only_other]\n include_dirs = [include[2:] for include in pkg_config.cflags_only_I]\n cflags = [flag for flag in pkg_config.cflags_only_other if not flag.startswith(\"-D\")]\n defines = [flag[2:] for flag in pkg_config.cflags_only_other if flag.startswith(\"-D\")]\n\n self.cpp_info.system_libs.extend(libs)\n self.cpp_info.libdirs.extend(lib_dirs)\n self.cpp_info.sharedlinkflags.extend(ldflags)\n self.cpp_info.exelinkflags.extend(ldflags)\n self.cpp_info.defines.extend(defines)\n self.cpp_info.includedirs.extend(include_dirs)\n self.cpp_info.cflags.extend(cflags)\n self.cpp_info.cxxflags.extend(cflags)\n\n def system_requirements(self):\n if tools.os_info.is_linux and self.settings.os == \"Linux\":\n package_tool = tools.SystemPackageTool(conanfile=self, default_mode='verify')\n if tools.os_info.with_yum:\n if tools.os_info.linux_distro == \"fedora\" and tools.os_info.os_version >= \"32\":\n packages = [\"libglvnd-devel\"]\n else:\n packages = [\"mesa-libGL-devel\"]\n elif tools.os_info.with_apt:\n ubuntu_20_or_later = tools.os_info.linux_distro == \"ubuntu\" and tools.os_info.os_version >= \"20\"\n debian_11_or_later = tools.os_info.linux_distro == \"debian\" and tools.os_info.os_version >= \"11\"\n if ubuntu_20_or_later or debian_11_or_later:\n packages = [\"libgl-dev\"]\n else:\n packages = [\"libgl1-mesa-dev\"]\n elif tools.os_info.with_pacman:\n packages = [\"libglvnd\"]\n elif tools.os_info.with_zypper:\n packages = [\"Mesa-libGL-devel\"]\n else:\n self.warn(\"don't know how to install OpenGL for your distro\")\n package_tool.install(update=True, packages=packages)\n\n def package_info(self):\n if self.settings.os == \"Macos\":\n self.cpp_info.defines.append(\"GL_SILENCE_DEPRECATION=1\")\n self.cpp_info.frameworks.append(\"OpenGL\")\n elif self.settings.os == \"Windows\":\n self.cpp_info.system_libs = [\"OpenGL32\"]\n elif self.settings.os == \"Linux\":\n self._fill_cppinfo_from_pkgconfig('gl')\n", "path": "recipes/opengl/all/conanfile.py"}], "after_files": [{"content": "from conans import ConanFile, tools\nfrom conans.errors import ConanException\nimport os\n\n\nclass SysConfigOpenGLConan(ConanFile):\n name = \"opengl\"\n version = \"system\"\n description = \"cross-platform virtual conan package for the OpenGL support\"\n topics = (\"conan\", \"opengl\", \"gl\")\n url = \"https://github.com/conan-io/conan-center-index\"\n homepage = \"https://www.opengl.org/\"\n license = \"MIT\"\n settings = (\"os\",)\n\n def package_id(self):\n self.info.header_only()\n\n def _fill_cppinfo_from_pkgconfig(self, name):\n pkg_config = tools.PkgConfig(name)\n if not pkg_config.provides:\n raise ConanException(\"OpenGL development files aren't available, give up\")\n libs = [lib[2:] for lib in pkg_config.libs_only_l]\n lib_dirs = [lib[2:] for lib in pkg_config.libs_only_L]\n ldflags = [flag for flag in pkg_config.libs_only_other]\n include_dirs = [include[2:] for include in pkg_config.cflags_only_I]\n cflags = [flag for flag in pkg_config.cflags_only_other if not flag.startswith(\"-D\")]\n defines = [flag[2:] for flag in pkg_config.cflags_only_other if flag.startswith(\"-D\")]\n\n self.cpp_info.system_libs.extend(libs)\n self.cpp_info.libdirs.extend(lib_dirs)\n self.cpp_info.sharedlinkflags.extend(ldflags)\n self.cpp_info.exelinkflags.extend(ldflags)\n self.cpp_info.defines.extend(defines)\n self.cpp_info.includedirs.extend(include_dirs)\n self.cpp_info.cflags.extend(cflags)\n self.cpp_info.cxxflags.extend(cflags)\n\n def system_requirements(self):\n if tools.os_info.is_linux and self.settings.os == \"Linux\":\n package_tool = tools.SystemPackageTool(conanfile=self, default_mode='verify')\n if tools.os_info.with_yum:\n if tools.os_info.linux_distro == \"fedora\" and tools.os_info.os_version >= \"32\":\n packages = [\"libglvnd-devel\"]\n else:\n packages = [\"mesa-libGL-devel\"]\n elif tools.os_info.with_apt:\n ubuntu_20_or_later = tools.os_info.linux_distro == \"ubuntu\" and tools.os_info.os_version >= \"20\"\n debian_11_or_later = tools.os_info.linux_distro == \"debian\" and tools.os_info.os_version >= \"11\"\n if ubuntu_20_or_later or debian_11_or_later:\n packages = [\"libgl-dev\"]\n else:\n packages = [\"libgl1-mesa-dev\"]\n elif tools.os_info.with_pacman:\n packages = [\"libglvnd\"]\n elif tools.os_info.with_zypper:\n packages = [\"Mesa-libGL-devel\"]\n else:\n self.output.warn(\"Don't know how to install OpenGL for your distro.\")\n package_tool.install(update=True, packages=packages)\n\n def package_info(self):\n if self.settings.os == \"Macos\":\n self.cpp_info.defines.append(\"GL_SILENCE_DEPRECATION=1\")\n self.cpp_info.frameworks.append(\"OpenGL\")\n elif self.settings.os == \"Windows\":\n self.cpp_info.system_libs = [\"OpenGL32\"]\n elif self.settings.os == \"Linux\":\n self._fill_cppinfo_from_pkgconfig('gl')\n", "path": "recipes/opengl/all/conanfile.py"}]}
| 1,606 | 136 |
gh_patches_debug_35476
|
rasdani/github-patches
|
git_diff
|
WeblateOrg__weblate-3377
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Migration from 3.9.1 to 3.10 fails with AttributeError: 'Comment' object has no attribute '__getstate__'
**Describe the bug**
After upgrading to 3.10 (from 3.9.1) applying trans.0053_unitdata fails with an error.
```
Applying trans.0053_unitdata...Traceback (most recent call last):
File "/home/ubuntu/weblateenv/bin/weblate", line 11, in <module>
sys.exit(main())
File "/home/ubuntu/weblateenv/local/lib/python2.7/site-packages/weblate/runner.py", line 34, in main
execute_from_command_line(argv)
File "/home/ubuntu/weblateenv/local/lib/python2.7/site-packages/django/core/management/__init__.py", line 364, in execute_from_command_line
utility.execute()
File "/home/ubuntu/weblateenv/local/lib/python2.7/site-packages/django/core/management/__init__.py", line 356, in execute
self.fetch_command(subcommand).run_from_argv(self.argv)
File "/home/ubuntu/weblateenv/local/lib/python2.7/site-packages/django/core/management/base.py", line 283, in run_from_argv
self.execute(*args, **cmd_options)
File "/home/ubuntu/weblateenv/local/lib/python2.7/site-packages/django/core/management/base.py", line 330, in execute
output = self.handle(*args, **options)
File "/home/ubuntu/weblateenv/local/lib/python2.7/site-packages/django/core/management/commands/migrate.py", line 204, in handle
fake_initial=fake_initial,
File "/home/ubuntu/weblateenv/local/lib/python2.7/site-packages/django/db/migrations/executor.py", line 115, in migrate
state = self._migrate_all_forwards(state, plan, full_plan, fake=fake, fake_initial=fake_initial)
File "/home/ubuntu/weblateenv/local/lib/python2.7/site-packages/django/db/migrations/executor.py", line 145, in _migrate_all_forwards
state = self.apply_migration(state, migration, fake=fake, fake_initial=fake_initial)
File "/home/ubuntu/weblateenv/local/lib/python2.7/site-packages/django/db/migrations/executor.py", line 244, in apply_migration
state = migration.apply(state, schema_editor)
File "/home/ubuntu/weblateenv/local/lib/python2.7/site-packages/django/db/migrations/migration.py", line 126, in apply
operation.database_forwards(self.app_label, schema_editor, old_state, project_state)
File "/home/ubuntu/weblateenv/local/lib/python2.7/site-packages/django/db/migrations/operations/special.py", line 193, in database_forwards
self.code(from_state.apps, schema_editor)
File "/home/ubuntu/weblateenv/local/lib/python2.7/site-packages/weblate/trans/migrations/0053_unitdata.py", line 24, in migrate_unitdata
state = obj.__getstate__()
AttributeError: 'Comment' object has no attribute '__getstate__'
Sentry is attempting to send 1 pending error messages
Waiting up to 10 seconds
```
**To Reproduce**
Steps to reproduce the behavior:
1. have 3.9.1 installed
1. upgrade to 3.10 via pip
1. apply changes to the settings.py file
1. run `weblate migrate`
1. see the error (https://sentry.io/share/issue/8e9bee713ce74fc9b53d800b504dcfa5/)
**Expected behavior**
no error and a working migration
**Logs**
https://sentry.io/share/issue/8e9bee713ce74fc9b53d800b504dcfa5/
**Server configuration and status**
output of `weblate list_versions`:
```
* Weblate: 3.10
* Django: 1.11.11
* siphashc: 1.2
* Whoosh: 2.7.4
* translate-toolkit: 2.4.0
* lxml: 4.2.1
* Pillow: 6.2.1
* bleach: 3.1.0
* six: 1.13.0
* python-dateutil: 2.8.1
* social-auth-core: 3.2.0
* social-auth-app-django: 3.1.0
* django-crispy-forms: 1.8.1
* oauthlib: 3.1.0
* django-compressor: 2.3
* djangorestframework: 3.9.0
* django-appconf: 1.0.3
* user-agents: 2.0
* filelock: 3.0.12
* setuptools: 39.0.1
* jellyfish: 0.6.1
* openpyxl: 2.6.4
* celery: 4.3.0
* kombu: 4.6.7
* celery-batches: 0.2
* translation-finder: 1.7
* html2text: 2019.8.11
* pycairo: 1.18.2
* pygobject: 3.34.0
* diff-match-patch: 20181111
* requests: 2.22.0
* django-redis: 4.10.0
* hiredis: 1.0.1
* sentry_sdk: 0.13.5
* Cython: 0.29.14
* misaka: 2.1.1
* GitPython: 2.1.14
* Python: 2.7.12
* Git: 2.7.4
* phply: 1.2.4
* chardet: 3.0.4
* ruamel.yaml: 0.16.5
* Database backends: django.db.backends.mysql
* Cache backends: default:RedisCache, avatar:FileBasedCache
* Email setup: django.core.mail.backends.smtp.EmailBackend: smtp.mailgun.org
* Celery: redis://127.0.0.1:6379/0, redis://127.0.0.1:6379/0, regular
* Platform: Linux 4.4.0-1049-aws (x86_64)
```
output of `weblate check --deploy`:
```
SystemCheckError: System check identified some issues:
ERRORS:
?: (weblate.E006) Please migrate your database to use PostgreSQL. Support for other database backends will be dropped in Weblate 4.0 currently sheduled on April 2020.
HINT: https://docs.weblate.org/en/weblate-3.10/admin/install.html#production-database
?: (weblate.W023) Please upgrade your installation to Python 3. Python 2 support will be dropped in Weblate 4.0 currently sheduled on April 2020.
HINT: https://docs.weblate.org/en/weblate-3.10/admin/upgrade.html#py3
System check identified 2 issues (7 silenced).
```
Not quite sure how to fix this.
Thanks in advance!
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `weblate/trans/migrations/0053_unitdata.py`
Content:
```
1 # Generated by Django 2.2.5 on 2019-12-12 14:12
2
3 from django.db import migrations
4
5 MODELS = [("trans", "Comment"), ("trans", "Suggestion"), ("checks", "Check")]
6
7
8 def migrate_unitdata(apps, schema_editor):
9 db_alias = schema_editor.connection.alias
10
11 Unit = apps.get_model("trans", "Unit")
12
13 for model in (apps.get_model(*args) for args in MODELS):
14 # Create new objects for each related unit
15 for obj in model.objects.using(db_alias).filter(unit=None).iterator():
16 units = Unit.objects.using(db_alias).filter(
17 content_hash=obj.content_hash,
18 translation__component__project=obj.project,
19 )
20 if obj.language is None:
21 units = units.filter(translation__language=obj.project.source_language)
22 else:
23 units = units.filter(translation__language=obj.language)
24 # Using __getstate__ would be cleaner, but needs Django 2.0
25 state = dict(obj.__dict__)
26 del state["_state"]
27 del state["id"]
28 del state["unit_id"]
29 for unit in units:
30 if model.objects.using(db_alias).filter(unit=unit, **state).exists():
31 continue
32 model.objects.using(db_alias).create(unit=unit, **state)
33
34 # Remove old objects without unit link
35 model.objects.using(db_alias).filter(unit=None).delete()
36
37
38 class Migration(migrations.Migration):
39
40 dependencies = [("trans", "0052_auto_20191212_1403"), ("checks", "0002_check_unit")]
41
42 operations = [
43 migrations.RunPython(migrate_unitdata, migrations.RunPython.noop, elidable=True)
44 ]
45
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/weblate/trans/migrations/0053_unitdata.py b/weblate/trans/migrations/0053_unitdata.py
--- a/weblate/trans/migrations/0053_unitdata.py
+++ b/weblate/trans/migrations/0053_unitdata.py
@@ -2,7 +2,25 @@
from django.db import migrations
-MODELS = [("trans", "Comment"), ("trans", "Suggestion"), ("checks", "Check")]
+MODELS = [
+ (
+ ("trans", "Comment"),
+ ("content_hash", "project", "language", "comment", "user", "timestamp"),
+ ),
+ (
+ ("trans", "Suggestion"),
+ (
+ "content_hash",
+ "project",
+ "language",
+ "target",
+ "user",
+ "userdetails",
+ "timestamp",
+ ),
+ ),
+ (("checks", "Check"), ("content_hash", "project", "language", "check", "ignore")),
+]
def migrate_unitdata(apps, schema_editor):
@@ -10,7 +28,8 @@
Unit = apps.get_model("trans", "Unit")
- for model in (apps.get_model(*args) for args in MODELS):
+ for model_args, fields in MODELS:
+ model = apps.get_model(*model_args)
# Create new objects for each related unit
for obj in model.objects.using(db_alias).filter(unit=None).iterator():
units = Unit.objects.using(db_alias).filter(
@@ -22,14 +41,15 @@
else:
units = units.filter(translation__language=obj.language)
# Using __getstate__ would be cleaner, but needs Django 2.0
- state = dict(obj.__dict__)
- del state["_state"]
- del state["id"]
- del state["unit_id"]
+ state = {field: getattr(obj, field) for field in fields}
for unit in units:
if model.objects.using(db_alias).filter(unit=unit, **state).exists():
continue
- model.objects.using(db_alias).create(unit=unit, **state)
+ created = model.objects.using(db_alias).create(unit=unit, **state)
+ # Migrate suggestion votes
+ if model_args == ("trans", "Suggestion"):
+ for vote in obj.vote_set.all():
+ created.vote_set.create(user=vote.user, value=vote.value)
# Remove old objects without unit link
model.objects.using(db_alias).filter(unit=None).delete()
|
{"golden_diff": "diff --git a/weblate/trans/migrations/0053_unitdata.py b/weblate/trans/migrations/0053_unitdata.py\n--- a/weblate/trans/migrations/0053_unitdata.py\n+++ b/weblate/trans/migrations/0053_unitdata.py\n@@ -2,7 +2,25 @@\n \n from django.db import migrations\n \n-MODELS = [(\"trans\", \"Comment\"), (\"trans\", \"Suggestion\"), (\"checks\", \"Check\")]\n+MODELS = [\n+ (\n+ (\"trans\", \"Comment\"),\n+ (\"content_hash\", \"project\", \"language\", \"comment\", \"user\", \"timestamp\"),\n+ ),\n+ (\n+ (\"trans\", \"Suggestion\"),\n+ (\n+ \"content_hash\",\n+ \"project\",\n+ \"language\",\n+ \"target\",\n+ \"user\",\n+ \"userdetails\",\n+ \"timestamp\",\n+ ),\n+ ),\n+ ((\"checks\", \"Check\"), (\"content_hash\", \"project\", \"language\", \"check\", \"ignore\")),\n+]\n \n \n def migrate_unitdata(apps, schema_editor):\n@@ -10,7 +28,8 @@\n \n Unit = apps.get_model(\"trans\", \"Unit\")\n \n- for model in (apps.get_model(*args) for args in MODELS):\n+ for model_args, fields in MODELS:\n+ model = apps.get_model(*model_args)\n # Create new objects for each related unit\n for obj in model.objects.using(db_alias).filter(unit=None).iterator():\n units = Unit.objects.using(db_alias).filter(\n@@ -22,14 +41,15 @@\n else:\n units = units.filter(translation__language=obj.language)\n # Using __getstate__ would be cleaner, but needs Django 2.0\n- state = dict(obj.__dict__)\n- del state[\"_state\"]\n- del state[\"id\"]\n- del state[\"unit_id\"]\n+ state = {field: getattr(obj, field) for field in fields}\n for unit in units:\n if model.objects.using(db_alias).filter(unit=unit, **state).exists():\n continue\n- model.objects.using(db_alias).create(unit=unit, **state)\n+ created = model.objects.using(db_alias).create(unit=unit, **state)\n+ # Migrate suggestion votes\n+ if model_args == (\"trans\", \"Suggestion\"):\n+ for vote in obj.vote_set.all():\n+ created.vote_set.create(user=vote.user, value=vote.value)\n \n # Remove old objects without unit link\n model.objects.using(db_alias).filter(unit=None).delete()\n", "issue": "Migration from 3.9.1 to 3.10 fails with AttributeError: 'Comment' object has no attribute '__getstate__'\n**Describe the bug**\r\nAfter upgrading to 3.10 (from 3.9.1) applying trans.0053_unitdata fails with an error.\r\n\r\n```\r\n Applying trans.0053_unitdata...Traceback (most recent call last):\r\n File \"/home/ubuntu/weblateenv/bin/weblate\", line 11, in <module>\r\n sys.exit(main())\r\n File \"/home/ubuntu/weblateenv/local/lib/python2.7/site-packages/weblate/runner.py\", line 34, in main\r\n execute_from_command_line(argv)\r\n File \"/home/ubuntu/weblateenv/local/lib/python2.7/site-packages/django/core/management/__init__.py\", line 364, in execute_from_command_line\r\n utility.execute()\r\n File \"/home/ubuntu/weblateenv/local/lib/python2.7/site-packages/django/core/management/__init__.py\", line 356, in execute\r\n self.fetch_command(subcommand).run_from_argv(self.argv)\r\n File \"/home/ubuntu/weblateenv/local/lib/python2.7/site-packages/django/core/management/base.py\", line 283, in run_from_argv\r\n self.execute(*args, **cmd_options)\r\n File \"/home/ubuntu/weblateenv/local/lib/python2.7/site-packages/django/core/management/base.py\", line 330, in execute\r\n output = self.handle(*args, **options)\r\n File \"/home/ubuntu/weblateenv/local/lib/python2.7/site-packages/django/core/management/commands/migrate.py\", line 204, in handle\r\n fake_initial=fake_initial,\r\n File \"/home/ubuntu/weblateenv/local/lib/python2.7/site-packages/django/db/migrations/executor.py\", line 115, in migrate\r\n state = self._migrate_all_forwards(state, plan, full_plan, fake=fake, fake_initial=fake_initial)\r\n File \"/home/ubuntu/weblateenv/local/lib/python2.7/site-packages/django/db/migrations/executor.py\", line 145, in _migrate_all_forwards\r\n state = self.apply_migration(state, migration, fake=fake, fake_initial=fake_initial)\r\n File \"/home/ubuntu/weblateenv/local/lib/python2.7/site-packages/django/db/migrations/executor.py\", line 244, in apply_migration\r\n state = migration.apply(state, schema_editor)\r\n File \"/home/ubuntu/weblateenv/local/lib/python2.7/site-packages/django/db/migrations/migration.py\", line 126, in apply\r\n operation.database_forwards(self.app_label, schema_editor, old_state, project_state)\r\n File \"/home/ubuntu/weblateenv/local/lib/python2.7/site-packages/django/db/migrations/operations/special.py\", line 193, in database_forwards\r\n self.code(from_state.apps, schema_editor)\r\n File \"/home/ubuntu/weblateenv/local/lib/python2.7/site-packages/weblate/trans/migrations/0053_unitdata.py\", line 24, in migrate_unitdata\r\n state = obj.__getstate__()\r\nAttributeError: 'Comment' object has no attribute '__getstate__'\r\nSentry is attempting to send 1 pending error messages\r\nWaiting up to 10 seconds\r\n```\r\n\r\n**To Reproduce**\r\nSteps to reproduce the behavior:\r\n1. have 3.9.1 installed\r\n1. upgrade to 3.10 via pip\r\n1. apply changes to the settings.py file\r\n1. run `weblate migrate`\r\n1. see the error (https://sentry.io/share/issue/8e9bee713ce74fc9b53d800b504dcfa5/)\r\n\r\n**Expected behavior**\r\nno error and a working migration\r\n\r\n**Logs**\r\nhttps://sentry.io/share/issue/8e9bee713ce74fc9b53d800b504dcfa5/\r\n\r\n**Server configuration and status**\r\noutput of `weblate list_versions`:\r\n```\r\n * Weblate: 3.10\r\n * Django: 1.11.11\r\n * siphashc: 1.2\r\n * Whoosh: 2.7.4\r\n * translate-toolkit: 2.4.0\r\n * lxml: 4.2.1\r\n * Pillow: 6.2.1\r\n * bleach: 3.1.0\r\n * six: 1.13.0\r\n * python-dateutil: 2.8.1\r\n * social-auth-core: 3.2.0\r\n * social-auth-app-django: 3.1.0\r\n * django-crispy-forms: 1.8.1\r\n * oauthlib: 3.1.0\r\n * django-compressor: 2.3\r\n * djangorestframework: 3.9.0\r\n * django-appconf: 1.0.3\r\n * user-agents: 2.0\r\n * filelock: 3.0.12\r\n * setuptools: 39.0.1\r\n * jellyfish: 0.6.1\r\n * openpyxl: 2.6.4\r\n * celery: 4.3.0\r\n * kombu: 4.6.7\r\n * celery-batches: 0.2\r\n * translation-finder: 1.7\r\n * html2text: 2019.8.11\r\n * pycairo: 1.18.2\r\n * pygobject: 3.34.0\r\n * diff-match-patch: 20181111\r\n * requests: 2.22.0\r\n * django-redis: 4.10.0\r\n * hiredis: 1.0.1\r\n * sentry_sdk: 0.13.5\r\n * Cython: 0.29.14\r\n * misaka: 2.1.1\r\n * GitPython: 2.1.14\r\n * Python: 2.7.12\r\n * Git: 2.7.4\r\n * phply: 1.2.4\r\n * chardet: 3.0.4\r\n * ruamel.yaml: 0.16.5\r\n * Database backends: django.db.backends.mysql\r\n * Cache backends: default:RedisCache, avatar:FileBasedCache\r\n * Email setup: django.core.mail.backends.smtp.EmailBackend: smtp.mailgun.org\r\n * Celery: redis://127.0.0.1:6379/0, redis://127.0.0.1:6379/0, regular\r\n * Platform: Linux 4.4.0-1049-aws (x86_64)\r\n```\r\n\r\noutput of `weblate check --deploy`:\r\n```\r\nSystemCheckError: System check identified some issues:\r\n\r\nERRORS:\r\n?: (weblate.E006) Please migrate your database to use PostgreSQL. Support for other database backends will be dropped in Weblate 4.0 currently sheduled on April 2020.\r\n\tHINT: https://docs.weblate.org/en/weblate-3.10/admin/install.html#production-database\r\n?: (weblate.W023) Please upgrade your installation to Python 3. Python 2 support will be dropped in Weblate 4.0 currently sheduled on April 2020.\r\n\tHINT: https://docs.weblate.org/en/weblate-3.10/admin/upgrade.html#py3\r\n\r\nSystem check identified 2 issues (7 silenced).\r\n```\r\n\r\n\r\nNot quite sure how to fix this.\r\nThanks in advance!\r\n\n", "before_files": [{"content": "# Generated by Django 2.2.5 on 2019-12-12 14:12\n\nfrom django.db import migrations\n\nMODELS = [(\"trans\", \"Comment\"), (\"trans\", \"Suggestion\"), (\"checks\", \"Check\")]\n\n\ndef migrate_unitdata(apps, schema_editor):\n db_alias = schema_editor.connection.alias\n\n Unit = apps.get_model(\"trans\", \"Unit\")\n\n for model in (apps.get_model(*args) for args in MODELS):\n # Create new objects for each related unit\n for obj in model.objects.using(db_alias).filter(unit=None).iterator():\n units = Unit.objects.using(db_alias).filter(\n content_hash=obj.content_hash,\n translation__component__project=obj.project,\n )\n if obj.language is None:\n units = units.filter(translation__language=obj.project.source_language)\n else:\n units = units.filter(translation__language=obj.language)\n # Using __getstate__ would be cleaner, but needs Django 2.0\n state = dict(obj.__dict__)\n del state[\"_state\"]\n del state[\"id\"]\n del state[\"unit_id\"]\n for unit in units:\n if model.objects.using(db_alias).filter(unit=unit, **state).exists():\n continue\n model.objects.using(db_alias).create(unit=unit, **state)\n\n # Remove old objects without unit link\n model.objects.using(db_alias).filter(unit=None).delete()\n\n\nclass Migration(migrations.Migration):\n\n dependencies = [(\"trans\", \"0052_auto_20191212_1403\"), (\"checks\", \"0002_check_unit\")]\n\n operations = [\n migrations.RunPython(migrate_unitdata, migrations.RunPython.noop, elidable=True)\n ]\n", "path": "weblate/trans/migrations/0053_unitdata.py"}], "after_files": [{"content": "# Generated by Django 2.2.5 on 2019-12-12 14:12\n\nfrom django.db import migrations\n\nMODELS = [\n (\n (\"trans\", \"Comment\"),\n (\"content_hash\", \"project\", \"language\", \"comment\", \"user\", \"timestamp\"),\n ),\n (\n (\"trans\", \"Suggestion\"),\n (\n \"content_hash\",\n \"project\",\n \"language\",\n \"target\",\n \"user\",\n \"userdetails\",\n \"timestamp\",\n ),\n ),\n ((\"checks\", \"Check\"), (\"content_hash\", \"project\", \"language\", \"check\", \"ignore\")),\n]\n\n\ndef migrate_unitdata(apps, schema_editor):\n db_alias = schema_editor.connection.alias\n\n Unit = apps.get_model(\"trans\", \"Unit\")\n\n for model_args, fields in MODELS:\n model = apps.get_model(*model_args)\n # Create new objects for each related unit\n for obj in model.objects.using(db_alias).filter(unit=None).iterator():\n units = Unit.objects.using(db_alias).filter(\n content_hash=obj.content_hash,\n translation__component__project=obj.project,\n )\n if obj.language is None:\n units = units.filter(translation__language=obj.project.source_language)\n else:\n units = units.filter(translation__language=obj.language)\n # Using __getstate__ would be cleaner, but needs Django 2.0\n state = {field: getattr(obj, field) for field in fields}\n for unit in units:\n if model.objects.using(db_alias).filter(unit=unit, **state).exists():\n continue\n created = model.objects.using(db_alias).create(unit=unit, **state)\n # Migrate suggestion votes\n if model_args == (\"trans\", \"Suggestion\"):\n for vote in obj.vote_set.all():\n created.vote_set.create(user=vote.user, value=vote.value)\n\n # Remove old objects without unit link\n model.objects.using(db_alias).filter(unit=None).delete()\n\n\nclass Migration(migrations.Migration):\n\n dependencies = [(\"trans\", \"0052_auto_20191212_1403\"), (\"checks\", \"0002_check_unit\")]\n\n operations = [\n migrations.RunPython(migrate_unitdata, migrations.RunPython.noop, elidable=True)\n ]\n", "path": "weblate/trans/migrations/0053_unitdata.py"}]}
| 2,446 | 580 |
gh_patches_debug_5029
|
rasdani/github-patches
|
git_diff
|
sopel-irc__sopel-1464
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Config filename is mis-parsed when running configuration wizard from command line
~~`-w` / `--configure-all` and~~ `--configure-modules` does not drop the file extension `.cfg` from the specified file when searching for a config file to use (unlike just launching the bot, where as the extension does seem to be dropped, or at least accounted for when finding the config file).
A user stopped into #sopel (2019-01-29, ~14:15 UTC-5) who had a `default.cfg` and `custom_config.cfg`. They primarily used `custom_config.cfg` and when attempting to configure modules for said `custom_config.cfg`, the used the command `sopel -c custom_config.cfg --configure-modules`. Sopel could not find `custom_config.cfg.cfg`, and without error/warning, edited `default.cfg`. Launching the bot with `sopel -c custom_config.cfg` still used the correct file though, so any changes made with the configuration wizard were never propogated to the config file that was actually being loaded.
**EDIT**: the `--configure-all` flag will just make a new config file like `custom_config.cfg.cfg`
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `sopel/config/__init__.py`
Content:
```
1 # coding=utf-8
2 """
3 The config object provides a simplified to access Sopel's configuration file.
4 The sections of the file are attributes of the object, and the keys in the
5 section are attributes of that. So, for example, the ``eggs`` attribute in the
6 ``[spam]`` section can be accessed from ``config.spam.eggs``.
7
8 Section definitions (see "Section configuration sections" below) can be added
9 to the config object with ``define_section``. When this is done, only the
10 defined keys will be available. A section can not be given more than one
11 definition. The ``[core]`` section is defined with ``CoreSection`` when the
12 object is initialized.
13
14 .. versionadded:: 6.0.0
15 """
16 # Copyright 2012-2015, Elsie Powell, embolalia.com
17 # Copyright © 2012, Elad Alfassa <elad@fedoraproject.org>
18 # Licensed under the Eiffel Forum License 2.
19
20 from __future__ import unicode_literals, absolute_import, print_function, division
21
22 from sopel.tools import iteritems, stderr
23 import sopel.tools
24 from sopel.tools import get_input
25 import sopel.loader
26 import os
27 import sys
28 if sys.version_info.major < 3:
29 import ConfigParser
30 else:
31 basestring = str
32 import configparser as ConfigParser
33 import sopel.config.core_section
34 from sopel.config.types import StaticSection
35
36
37 class ConfigurationError(Exception):
38 """ Exception type for configuration errors """
39
40 def __init__(self, value):
41 self.value = value
42
43 def __str__(self):
44 return 'ConfigurationError: %s' % self.value
45
46
47 class Config(object):
48 def __init__(self, filename, validate=True):
49 """The bot's configuration.
50
51 The given filename will be associated with the configuration, and is
52 the file which will be written if write() is called. If load is not
53 given or True, the configuration object will load the attributes from
54 the file at filename.
55
56 A few default values will be set here if they are not defined in the
57 config file, or a config file is not loaded. They are documented below.
58 """
59 self.filename = filename
60 """The config object's associated file, as noted above."""
61 self.parser = ConfigParser.RawConfigParser(allow_no_value=True)
62 self.parser.read(self.filename)
63 self.define_section('core', sopel.config.core_section.CoreSection,
64 validate=validate)
65 self.get = self.parser.get
66
67 @property
68 def homedir(self):
69 """An alias to config.core.homedir"""
70 # Technically it's the other way around, so we can bootstrap filename
71 # attributes in the core section, but whatever.
72 configured = None
73 if self.parser.has_option('core', 'homedir'):
74 configured = self.parser.get('core', 'homedir')
75 if configured:
76 return configured
77 else:
78 return os.path.dirname(self.filename)
79
80 def save(self):
81 """Save all changes to the config file."""
82 cfgfile = open(self.filename, 'w')
83 self.parser.write(cfgfile)
84 cfgfile.flush()
85 cfgfile.close()
86
87 def add_section(self, name):
88 """Add a section to the config file.
89
90 Returns ``False`` if already exists.
91 """
92 try:
93 return self.parser.add_section(name)
94 except ConfigParser.DuplicateSectionError:
95 return False
96
97 def define_section(self, name, cls_, validate=True):
98 """Define the available settings in a section.
99
100 ``cls_`` must be a subclass of ``StaticSection``. If the section has
101 already been defined with a different class, ValueError is raised.
102
103 If ``validate`` is True, the section's values will be validated, and an
104 exception raised if they are invalid. This is desirable in a module's
105 setup function, for example, but might not be in the configure function.
106 """
107 if not issubclass(cls_, StaticSection):
108 raise ValueError("Class must be a subclass of StaticSection.")
109 current = getattr(self, name, None)
110 current_name = str(current.__class__)
111 new_name = str(cls_)
112 if (current is not None and not isinstance(current, self.ConfigSection) and
113 not current_name == new_name):
114 raise ValueError(
115 "Can not re-define class for section from {} to {}.".format(
116 current_name, new_name)
117 )
118 setattr(self, name, cls_(self, name, validate=validate))
119
120 class ConfigSection(object):
121
122 """Represents a section of the config file.
123
124 Contains all keys in thesection as attributes.
125
126 """
127
128 def __init__(self, name, items, parent):
129 object.__setattr__(self, '_name', name)
130 object.__setattr__(self, '_parent', parent)
131 for item in items:
132 value = item[1].strip()
133 if not value.lower() == 'none':
134 if value.lower() == 'false':
135 value = False
136 object.__setattr__(self, item[0], value)
137
138 def __getattr__(self, name):
139 return None
140
141 def __setattr__(self, name, value):
142 object.__setattr__(self, name, value)
143 if type(value) is list:
144 value = ','.join(value)
145 self._parent.parser.set(self._name, name, value)
146
147 def get_list(self, name):
148 value = getattr(self, name)
149 if not value:
150 return []
151 if isinstance(value, basestring):
152 value = value.split(',')
153 # Keep the split value, so we don't have to keep doing this
154 setattr(self, name, value)
155 return value
156
157 def __getattr__(self, name):
158 if name in self.parser.sections():
159 items = self.parser.items(name)
160 section = self.ConfigSection(name, items, self) # Return a section
161 setattr(self, name, section)
162 return section
163 else:
164 raise AttributeError("%r object has no attribute %r"
165 % (type(self).__name__, name))
166
167 def option(self, question, default=False):
168 """Ask "y/n" and return the corresponding boolean answer.
169
170 Show user in terminal a "y/n" prompt, and return true or false based on
171 the response. If default is passed as true, the default will be shown
172 as ``[y]``, else it will be ``[n]``. ``question`` should be phrased as
173 a question, but without a question mark at the end.
174
175 """
176 d = 'n'
177 if default:
178 d = 'y'
179 ans = get_input(question + ' (y/n)? [' + d + '] ')
180 if not ans:
181 ans = d
182 return ans.lower() == 'y'
183
184 def _modules(self):
185 home = os.getcwd()
186 modules_dir = os.path.join(home, 'modules')
187 filenames = sopel.loader.enumerate_modules(self)
188 os.sys.path.insert(0, modules_dir)
189 for name, mod_spec in iteritems(filenames):
190 path, type_ = mod_spec
191 try:
192 module, _ = sopel.loader.load_module(name, path, type_)
193 except Exception as e:
194 filename, lineno = sopel.tools.get_raising_file_and_line()
195 rel_path = os.path.relpath(filename, os.path.dirname(__file__))
196 raising_stmt = "%s:%d" % (rel_path, lineno)
197 stderr("Error loading %s: %s (%s)" % (name, e, raising_stmt))
198 else:
199 if hasattr(module, 'configure'):
200 prompt = name + ' module'
201 if module.__doc__:
202 doc = module.__doc__.split('\n', 1)[0]
203 if doc:
204 prompt = doc
205 prompt = 'Configure {} (y/n)? [n]'.format(prompt)
206 do_configure = get_input(prompt)
207 do_configure = do_configure and do_configure.lower() == 'y'
208 if do_configure:
209 module.configure(self)
210 self.save()
211
212
213 def _wizard(section, config=None):
214 dotdir = os.path.expanduser('~/.sopel')
215 configpath = os.path.join(dotdir, (config or 'default') + '.cfg')
216 if section == 'all':
217 _create_config(configpath)
218 elif section == 'mod':
219 _check_dir(False)
220 if not os.path.isfile(configpath):
221 print("No config file found." +
222 " Please make one before configuring these options.")
223 sys.exit(1)
224 config = Config(configpath, validate=False)
225 config._modules()
226
227
228 def _check_dir(create=True):
229 dotdir = os.path.join(os.path.expanduser('~'), '.sopel')
230 if not os.path.isdir(dotdir):
231 if create:
232 print('Creating a config directory at ~/.sopel...')
233 try:
234 os.makedirs(dotdir)
235 except Exception as e:
236 print('There was a problem creating %s:' % dotdir, file=sys.stderr)
237 print('%s, %s' % (e.__class__, str(e)), file=sys.stderr)
238 print('Please fix this and then run Sopel again.', file=sys.stderr)
239 sys.exit(1)
240 else:
241 print("No config file found. Please make one before configuring these options.")
242 sys.exit(1)
243
244
245 def _create_config(configpath):
246 _check_dir()
247 print("Please answer the following questions" +
248 " to create your configuration file:\n")
249 try:
250 config = Config(configpath, validate=False)
251 sopel.config.core_section.configure(config)
252 if config.option(
253 'Would you like to see if there are any modules'
254 ' that need configuring'
255 ):
256 config._modules()
257 config.save()
258 except Exception: # TODO: Be specific
259 print("Encountered an error while writing the config file." +
260 " This shouldn't happen. Check permissions.")
261 raise
262 sys.exit(1)
263 print("Config file written successfully!")
264
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/sopel/config/__init__.py b/sopel/config/__init__.py
--- a/sopel/config/__init__.py
+++ b/sopel/config/__init__.py
@@ -212,7 +212,7 @@
def _wizard(section, config=None):
dotdir = os.path.expanduser('~/.sopel')
- configpath = os.path.join(dotdir, (config or 'default') + '.cfg')
+ configpath = os.path.join(dotdir, ((config or 'default.cfg') + ('.cfg' if config and not config.endswith('.cfg') else '')))
if section == 'all':
_create_config(configpath)
elif section == 'mod':
|
{"golden_diff": "diff --git a/sopel/config/__init__.py b/sopel/config/__init__.py\n--- a/sopel/config/__init__.py\n+++ b/sopel/config/__init__.py\n@@ -212,7 +212,7 @@\n \n def _wizard(section, config=None):\n dotdir = os.path.expanduser('~/.sopel')\n- configpath = os.path.join(dotdir, (config or 'default') + '.cfg')\n+ configpath = os.path.join(dotdir, ((config or 'default.cfg') + ('.cfg' if config and not config.endswith('.cfg') else '')))\n if section == 'all':\n _create_config(configpath)\n elif section == 'mod':\n", "issue": "Config filename is mis-parsed when running configuration wizard from command line\n~~`-w` / `--configure-all` and~~ `--configure-modules` does not drop the file extension `.cfg` from the specified file when searching for a config file to use (unlike just launching the bot, where as the extension does seem to be dropped, or at least accounted for when finding the config file).\r\n\r\nA user stopped into #sopel (2019-01-29, ~14:15 UTC-5) who had a `default.cfg` and `custom_config.cfg`. They primarily used `custom_config.cfg` and when attempting to configure modules for said `custom_config.cfg`, the used the command `sopel -c custom_config.cfg --configure-modules`. Sopel could not find `custom_config.cfg.cfg`, and without error/warning, edited `default.cfg`. Launching the bot with `sopel -c custom_config.cfg` still used the correct file though, so any changes made with the configuration wizard were never propogated to the config file that was actually being loaded.\r\n\r\n**EDIT**: the `--configure-all` flag will just make a new config file like `custom_config.cfg.cfg`\n", "before_files": [{"content": "# coding=utf-8\n\"\"\"\nThe config object provides a simplified to access Sopel's configuration file.\nThe sections of the file are attributes of the object, and the keys in the\nsection are attributes of that. So, for example, the ``eggs`` attribute in the\n``[spam]`` section can be accessed from ``config.spam.eggs``.\n\nSection definitions (see \"Section configuration sections\" below) can be added\nto the config object with ``define_section``. When this is done, only the\ndefined keys will be available. A section can not be given more than one\ndefinition. The ``[core]`` section is defined with ``CoreSection`` when the\nobject is initialized.\n\n.. versionadded:: 6.0.0\n\"\"\"\n# Copyright 2012-2015, Elsie Powell, embolalia.com\n# Copyright \u00a9 2012, Elad Alfassa <elad@fedoraproject.org>\n# Licensed under the Eiffel Forum License 2.\n\nfrom __future__ import unicode_literals, absolute_import, print_function, division\n\nfrom sopel.tools import iteritems, stderr\nimport sopel.tools\nfrom sopel.tools import get_input\nimport sopel.loader\nimport os\nimport sys\nif sys.version_info.major < 3:\n import ConfigParser\nelse:\n basestring = str\n import configparser as ConfigParser\nimport sopel.config.core_section\nfrom sopel.config.types import StaticSection\n\n\nclass ConfigurationError(Exception):\n \"\"\" Exception type for configuration errors \"\"\"\n\n def __init__(self, value):\n self.value = value\n\n def __str__(self):\n return 'ConfigurationError: %s' % self.value\n\n\nclass Config(object):\n def __init__(self, filename, validate=True):\n \"\"\"The bot's configuration.\n\n The given filename will be associated with the configuration, and is\n the file which will be written if write() is called. If load is not\n given or True, the configuration object will load the attributes from\n the file at filename.\n\n A few default values will be set here if they are not defined in the\n config file, or a config file is not loaded. They are documented below.\n \"\"\"\n self.filename = filename\n \"\"\"The config object's associated file, as noted above.\"\"\"\n self.parser = ConfigParser.RawConfigParser(allow_no_value=True)\n self.parser.read(self.filename)\n self.define_section('core', sopel.config.core_section.CoreSection,\n validate=validate)\n self.get = self.parser.get\n\n @property\n def homedir(self):\n \"\"\"An alias to config.core.homedir\"\"\"\n # Technically it's the other way around, so we can bootstrap filename\n # attributes in the core section, but whatever.\n configured = None\n if self.parser.has_option('core', 'homedir'):\n configured = self.parser.get('core', 'homedir')\n if configured:\n return configured\n else:\n return os.path.dirname(self.filename)\n\n def save(self):\n \"\"\"Save all changes to the config file.\"\"\"\n cfgfile = open(self.filename, 'w')\n self.parser.write(cfgfile)\n cfgfile.flush()\n cfgfile.close()\n\n def add_section(self, name):\n \"\"\"Add a section to the config file.\n\n Returns ``False`` if already exists.\n \"\"\"\n try:\n return self.parser.add_section(name)\n except ConfigParser.DuplicateSectionError:\n return False\n\n def define_section(self, name, cls_, validate=True):\n \"\"\"Define the available settings in a section.\n\n ``cls_`` must be a subclass of ``StaticSection``. If the section has\n already been defined with a different class, ValueError is raised.\n\n If ``validate`` is True, the section's values will be validated, and an\n exception raised if they are invalid. This is desirable in a module's\n setup function, for example, but might not be in the configure function.\n \"\"\"\n if not issubclass(cls_, StaticSection):\n raise ValueError(\"Class must be a subclass of StaticSection.\")\n current = getattr(self, name, None)\n current_name = str(current.__class__)\n new_name = str(cls_)\n if (current is not None and not isinstance(current, self.ConfigSection) and\n not current_name == new_name):\n raise ValueError(\n \"Can not re-define class for section from {} to {}.\".format(\n current_name, new_name)\n )\n setattr(self, name, cls_(self, name, validate=validate))\n\n class ConfigSection(object):\n\n \"\"\"Represents a section of the config file.\n\n Contains all keys in thesection as attributes.\n\n \"\"\"\n\n def __init__(self, name, items, parent):\n object.__setattr__(self, '_name', name)\n object.__setattr__(self, '_parent', parent)\n for item in items:\n value = item[1].strip()\n if not value.lower() == 'none':\n if value.lower() == 'false':\n value = False\n object.__setattr__(self, item[0], value)\n\n def __getattr__(self, name):\n return None\n\n def __setattr__(self, name, value):\n object.__setattr__(self, name, value)\n if type(value) is list:\n value = ','.join(value)\n self._parent.parser.set(self._name, name, value)\n\n def get_list(self, name):\n value = getattr(self, name)\n if not value:\n return []\n if isinstance(value, basestring):\n value = value.split(',')\n # Keep the split value, so we don't have to keep doing this\n setattr(self, name, value)\n return value\n\n def __getattr__(self, name):\n if name in self.parser.sections():\n items = self.parser.items(name)\n section = self.ConfigSection(name, items, self) # Return a section\n setattr(self, name, section)\n return section\n else:\n raise AttributeError(\"%r object has no attribute %r\"\n % (type(self).__name__, name))\n\n def option(self, question, default=False):\n \"\"\"Ask \"y/n\" and return the corresponding boolean answer.\n\n Show user in terminal a \"y/n\" prompt, and return true or false based on\n the response. If default is passed as true, the default will be shown\n as ``[y]``, else it will be ``[n]``. ``question`` should be phrased as\n a question, but without a question mark at the end.\n\n \"\"\"\n d = 'n'\n if default:\n d = 'y'\n ans = get_input(question + ' (y/n)? [' + d + '] ')\n if not ans:\n ans = d\n return ans.lower() == 'y'\n\n def _modules(self):\n home = os.getcwd()\n modules_dir = os.path.join(home, 'modules')\n filenames = sopel.loader.enumerate_modules(self)\n os.sys.path.insert(0, modules_dir)\n for name, mod_spec in iteritems(filenames):\n path, type_ = mod_spec\n try:\n module, _ = sopel.loader.load_module(name, path, type_)\n except Exception as e:\n filename, lineno = sopel.tools.get_raising_file_and_line()\n rel_path = os.path.relpath(filename, os.path.dirname(__file__))\n raising_stmt = \"%s:%d\" % (rel_path, lineno)\n stderr(\"Error loading %s: %s (%s)\" % (name, e, raising_stmt))\n else:\n if hasattr(module, 'configure'):\n prompt = name + ' module'\n if module.__doc__:\n doc = module.__doc__.split('\\n', 1)[0]\n if doc:\n prompt = doc\n prompt = 'Configure {} (y/n)? [n]'.format(prompt)\n do_configure = get_input(prompt)\n do_configure = do_configure and do_configure.lower() == 'y'\n if do_configure:\n module.configure(self)\n self.save()\n\n\ndef _wizard(section, config=None):\n dotdir = os.path.expanduser('~/.sopel')\n configpath = os.path.join(dotdir, (config or 'default') + '.cfg')\n if section == 'all':\n _create_config(configpath)\n elif section == 'mod':\n _check_dir(False)\n if not os.path.isfile(configpath):\n print(\"No config file found.\" +\n \" Please make one before configuring these options.\")\n sys.exit(1)\n config = Config(configpath, validate=False)\n config._modules()\n\n\ndef _check_dir(create=True):\n dotdir = os.path.join(os.path.expanduser('~'), '.sopel')\n if not os.path.isdir(dotdir):\n if create:\n print('Creating a config directory at ~/.sopel...')\n try:\n os.makedirs(dotdir)\n except Exception as e:\n print('There was a problem creating %s:' % dotdir, file=sys.stderr)\n print('%s, %s' % (e.__class__, str(e)), file=sys.stderr)\n print('Please fix this and then run Sopel again.', file=sys.stderr)\n sys.exit(1)\n else:\n print(\"No config file found. Please make one before configuring these options.\")\n sys.exit(1)\n\n\ndef _create_config(configpath):\n _check_dir()\n print(\"Please answer the following questions\" +\n \" to create your configuration file:\\n\")\n try:\n config = Config(configpath, validate=False)\n sopel.config.core_section.configure(config)\n if config.option(\n 'Would you like to see if there are any modules'\n ' that need configuring'\n ):\n config._modules()\n config.save()\n except Exception: # TODO: Be specific\n print(\"Encountered an error while writing the config file.\" +\n \" This shouldn't happen. Check permissions.\")\n raise\n sys.exit(1)\n print(\"Config file written successfully!\")\n", "path": "sopel/config/__init__.py"}], "after_files": [{"content": "# coding=utf-8\n\"\"\"\nThe config object provides a simplified to access Sopel's configuration file.\nThe sections of the file are attributes of the object, and the keys in the\nsection are attributes of that. So, for example, the ``eggs`` attribute in the\n``[spam]`` section can be accessed from ``config.spam.eggs``.\n\nSection definitions (see \"Section configuration sections\" below) can be added\nto the config object with ``define_section``. When this is done, only the\ndefined keys will be available. A section can not be given more than one\ndefinition. The ``[core]`` section is defined with ``CoreSection`` when the\nobject is initialized.\n\n.. versionadded:: 6.0.0\n\"\"\"\n# Copyright 2012-2015, Elsie Powell, embolalia.com\n# Copyright \u00a9 2012, Elad Alfassa <elad@fedoraproject.org>\n# Licensed under the Eiffel Forum License 2.\n\nfrom __future__ import unicode_literals, absolute_import, print_function, division\n\nfrom sopel.tools import iteritems, stderr\nimport sopel.tools\nfrom sopel.tools import get_input\nimport sopel.loader\nimport os\nimport sys\nif sys.version_info.major < 3:\n import ConfigParser\nelse:\n basestring = str\n import configparser as ConfigParser\nimport sopel.config.core_section\nfrom sopel.config.types import StaticSection\n\n\nclass ConfigurationError(Exception):\n \"\"\" Exception type for configuration errors \"\"\"\n\n def __init__(self, value):\n self.value = value\n\n def __str__(self):\n return 'ConfigurationError: %s' % self.value\n\n\nclass Config(object):\n def __init__(self, filename, validate=True):\n \"\"\"The bot's configuration.\n\n The given filename will be associated with the configuration, and is\n the file which will be written if write() is called. If load is not\n given or True, the configuration object will load the attributes from\n the file at filename.\n\n A few default values will be set here if they are not defined in the\n config file, or a config file is not loaded. They are documented below.\n \"\"\"\n self.filename = filename\n \"\"\"The config object's associated file, as noted above.\"\"\"\n self.parser = ConfigParser.RawConfigParser(allow_no_value=True)\n self.parser.read(self.filename)\n self.define_section('core', sopel.config.core_section.CoreSection,\n validate=validate)\n self.get = self.parser.get\n\n @property\n def homedir(self):\n \"\"\"An alias to config.core.homedir\"\"\"\n # Technically it's the other way around, so we can bootstrap filename\n # attributes in the core section, but whatever.\n configured = None\n if self.parser.has_option('core', 'homedir'):\n configured = self.parser.get('core', 'homedir')\n if configured:\n return configured\n else:\n return os.path.dirname(self.filename)\n\n def save(self):\n \"\"\"Save all changes to the config file.\"\"\"\n cfgfile = open(self.filename, 'w')\n self.parser.write(cfgfile)\n cfgfile.flush()\n cfgfile.close()\n\n def add_section(self, name):\n \"\"\"Add a section to the config file.\n\n Returns ``False`` if already exists.\n \"\"\"\n try:\n return self.parser.add_section(name)\n except ConfigParser.DuplicateSectionError:\n return False\n\n def define_section(self, name, cls_, validate=True):\n \"\"\"Define the available settings in a section.\n\n ``cls_`` must be a subclass of ``StaticSection``. If the section has\n already been defined with a different class, ValueError is raised.\n\n If ``validate`` is True, the section's values will be validated, and an\n exception raised if they are invalid. This is desirable in a module's\n setup function, for example, but might not be in the configure function.\n \"\"\"\n if not issubclass(cls_, StaticSection):\n raise ValueError(\"Class must be a subclass of StaticSection.\")\n current = getattr(self, name, None)\n current_name = str(current.__class__)\n new_name = str(cls_)\n if (current is not None and not isinstance(current, self.ConfigSection) and\n not current_name == new_name):\n raise ValueError(\n \"Can not re-define class for section from {} to {}.\".format(\n current_name, new_name)\n )\n setattr(self, name, cls_(self, name, validate=validate))\n\n class ConfigSection(object):\n\n \"\"\"Represents a section of the config file.\n\n Contains all keys in thesection as attributes.\n\n \"\"\"\n\n def __init__(self, name, items, parent):\n object.__setattr__(self, '_name', name)\n object.__setattr__(self, '_parent', parent)\n for item in items:\n value = item[1].strip()\n if not value.lower() == 'none':\n if value.lower() == 'false':\n value = False\n object.__setattr__(self, item[0], value)\n\n def __getattr__(self, name):\n return None\n\n def __setattr__(self, name, value):\n object.__setattr__(self, name, value)\n if type(value) is list:\n value = ','.join(value)\n self._parent.parser.set(self._name, name, value)\n\n def get_list(self, name):\n value = getattr(self, name)\n if not value:\n return []\n if isinstance(value, basestring):\n value = value.split(',')\n # Keep the split value, so we don't have to keep doing this\n setattr(self, name, value)\n return value\n\n def __getattr__(self, name):\n if name in self.parser.sections():\n items = self.parser.items(name)\n section = self.ConfigSection(name, items, self) # Return a section\n setattr(self, name, section)\n return section\n else:\n raise AttributeError(\"%r object has no attribute %r\"\n % (type(self).__name__, name))\n\n def option(self, question, default=False):\n \"\"\"Ask \"y/n\" and return the corresponding boolean answer.\n\n Show user in terminal a \"y/n\" prompt, and return true or false based on\n the response. If default is passed as true, the default will be shown\n as ``[y]``, else it will be ``[n]``. ``question`` should be phrased as\n a question, but without a question mark at the end.\n\n \"\"\"\n d = 'n'\n if default:\n d = 'y'\n ans = get_input(question + ' (y/n)? [' + d + '] ')\n if not ans:\n ans = d\n return ans.lower() == 'y'\n\n def _modules(self):\n home = os.getcwd()\n modules_dir = os.path.join(home, 'modules')\n filenames = sopel.loader.enumerate_modules(self)\n os.sys.path.insert(0, modules_dir)\n for name, mod_spec in iteritems(filenames):\n path, type_ = mod_spec\n try:\n module, _ = sopel.loader.load_module(name, path, type_)\n except Exception as e:\n filename, lineno = sopel.tools.get_raising_file_and_line()\n rel_path = os.path.relpath(filename, os.path.dirname(__file__))\n raising_stmt = \"%s:%d\" % (rel_path, lineno)\n stderr(\"Error loading %s: %s (%s)\" % (name, e, raising_stmt))\n else:\n if hasattr(module, 'configure'):\n prompt = name + ' module'\n if module.__doc__:\n doc = module.__doc__.split('\\n', 1)[0]\n if doc:\n prompt = doc\n prompt = 'Configure {} (y/n)? [n]'.format(prompt)\n do_configure = get_input(prompt)\n do_configure = do_configure and do_configure.lower() == 'y'\n if do_configure:\n module.configure(self)\n self.save()\n\n\ndef _wizard(section, config=None):\n dotdir = os.path.expanduser('~/.sopel')\n configpath = os.path.join(dotdir, ((config or 'default.cfg') + ('.cfg' if config and not config.endswith('.cfg') else '')))\n if section == 'all':\n _create_config(configpath)\n elif section == 'mod':\n _check_dir(False)\n if not os.path.isfile(configpath):\n print(\"No config file found.\" +\n \" Please make one before configuring these options.\")\n sys.exit(1)\n config = Config(configpath, validate=False)\n config._modules()\n\n\ndef _check_dir(create=True):\n dotdir = os.path.join(os.path.expanduser('~'), '.sopel')\n if not os.path.isdir(dotdir):\n if create:\n print('Creating a config directory at ~/.sopel...')\n try:\n os.makedirs(dotdir)\n except Exception as e:\n print('There was a problem creating %s:' % dotdir, file=sys.stderr)\n print('%s, %s' % (e.__class__, str(e)), file=sys.stderr)\n print('Please fix this and then run Sopel again.', file=sys.stderr)\n sys.exit(1)\n else:\n print(\"No config file found. Please make one before configuring these options.\")\n sys.exit(1)\n\n\ndef _create_config(configpath):\n _check_dir()\n print(\"Please answer the following questions\" +\n \" to create your configuration file:\\n\")\n try:\n config = Config(configpath, validate=False)\n sopel.config.core_section.configure(config)\n if config.option(\n 'Would you like to see if there are any modules'\n ' that need configuring'\n ):\n config._modules()\n config.save()\n except Exception: # TODO: Be specific\n print(\"Encountered an error while writing the config file.\" +\n \" This shouldn't happen. Check permissions.\")\n raise\n sys.exit(1)\n print(\"Config file written successfully!\")\n", "path": "sopel/config/__init__.py"}]}
| 3,376 | 158 |
gh_patches_debug_37236
|
rasdani/github-patches
|
git_diff
|
ansible__awx-11599
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
doc error with inventory plugin for latest awx.awx inventory plugin
### Please confirm the following
- [X] I agree to follow this project's [code of conduct](http://docs.ansible.com/ansible/latest/community/code_of_conduct.html).
- [X] I have checked the [current issues](https://github.com/ansible/awx/issues) for duplicates.
- [X] I understand that AWX is open source software provided for free and that I am not entitled to status updates or other assurances.
### Summary
The documentation for the inventory plugin, awx.awx.controller, was malformed.
rendered webpage for docs is here: https://docs.ansible.com/ansible/latest/collections/awx/awx/controller_inventory.html#ansible-collections-awx-awx-controller-inventory
### AWX version
N/A
### Installation method
openshift
### Modifications
no
### Ansible version
n/a
### Operating system
n/a
### Web browser
Chrome
### Steps to reproduce
go to https://docs.ansible.com/ansible/latest/collections/awx/awx/controller_inventory.html#ansible-collections-awx-awx-controller-inventory
there is docs errors on the webpage
>The documentation for the inventory plugin, awx.awx.controller, was malformed.
### Expected results
no errors
### Actual results

### Additional information
fast track this to infamous Chris Meyers
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `awx_collection/plugins/doc_fragments/auth_plugin.py`
Content:
```
1 # -*- coding: utf-8 -*-
2
3 # Copyright: (c) 2020, Ansible by Red Hat, Inc
4 # GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
5
6 from __future__ import absolute_import, division, print_function
7
8 __metaclass__ = type
9
10
11 class ModuleDocFragment(object):
12
13 # Automation Platform Controller documentation fragment
14 DOCUMENTATION = r'''
15 options:
16 host:
17 description: The network address of your Automation Platform Controller host.
18 env:
19 - name: CONTROLLER_HOST
20 - name: TOWER_HOST
21 deprecated:
22 - removed_from_collection: 'awx.awx.controller'
23 - removed_in: '4.0.0'
24 - why: Collection name change
25 - alternative: 'CONTROLLER_HOST'
26 username:
27 description: The user that you plan to use to access inventories on the controller.
28 env:
29 - name: CONTROLLER_USERNAME
30 - name: TOWER_USERNAME
31 deprecated:
32 - removed_from_collection: 'awx.awx.controller'
33 - removed_in: '4.0.0'
34 - why: Collection name change
35 - alternative: 'CONTROLLER_USERNAME'
36 password:
37 description: The password for your controller user.
38 env:
39 - name: CONTROLLER_PASSWORD
40 - name: TOWER_PASSWORD
41 deprecated:
42 - removed_from_collection: 'awx.awx.controller'
43 - removed_in: '4.0.0'
44 - why: Collection name change
45 - alternative: 'CONTROLLER_PASSWORD'
46 oauth_token:
47 description:
48 - The OAuth token to use.
49 env:
50 - name: CONTROLLER_OAUTH_TOKEN
51 - name: TOWER_OAUTH_TOKEN
52 deprecated:
53 - removed_from_collection: 'awx.awx.controller'
54 - removed_in: '4.0.0'
55 - why: Collection name change
56 - alternative: 'CONTROLLER_OAUTH_TOKEN'
57 verify_ssl:
58 description:
59 - Specify whether Ansible should verify the SSL certificate of the controller host.
60 - Defaults to True, but this is handled by the shared module_utils code
61 type: bool
62 env:
63 - name: CONTROLLER_VERIFY_SSL
64 - name: TOWER_VERIFY_SSL
65 deprecated:
66 - removed_from_collection: 'awx.awx.controller'
67 - removed_in: '4.0.0'
68 - why: Collection name change
69 - alternative: 'CONTROLLER_VERIFY_SSL'
70 aliases: [ validate_certs ]
71
72 notes:
73 - If no I(config_file) is provided we will attempt to use the tower-cli library
74 defaults to find your host information.
75 - I(config_file) should be in the following format
76 host=hostname
77 username=username
78 password=password
79 '''
80
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/awx_collection/plugins/doc_fragments/auth_plugin.py b/awx_collection/plugins/doc_fragments/auth_plugin.py
--- a/awx_collection/plugins/doc_fragments/auth_plugin.py
+++ b/awx_collection/plugins/doc_fragments/auth_plugin.py
@@ -18,42 +18,42 @@
env:
- name: CONTROLLER_HOST
- name: TOWER_HOST
- deprecated:
- - removed_from_collection: 'awx.awx.controller'
- - removed_in: '4.0.0'
- - why: Collection name change
- - alternative: 'CONTROLLER_HOST'
+ deprecated:
+ collection_name: 'awx.awx'
+ version: '4.0.0'
+ why: Collection name change
+ alternatives: 'CONTROLLER_HOST'
username:
description: The user that you plan to use to access inventories on the controller.
env:
- name: CONTROLLER_USERNAME
- name: TOWER_USERNAME
- deprecated:
- - removed_from_collection: 'awx.awx.controller'
- - removed_in: '4.0.0'
- - why: Collection name change
- - alternative: 'CONTROLLER_USERNAME'
+ deprecated:
+ collection_name: 'awx.awx'
+ version: '4.0.0'
+ why: Collection name change
+ alternatives: 'CONTROLLER_USERNAME'
password:
description: The password for your controller user.
env:
- name: CONTROLLER_PASSWORD
- name: TOWER_PASSWORD
- deprecated:
- - removed_from_collection: 'awx.awx.controller'
- - removed_in: '4.0.0'
- - why: Collection name change
- - alternative: 'CONTROLLER_PASSWORD'
+ deprecated:
+ collection_name: 'awx.awx'
+ version: '4.0.0'
+ why: Collection name change
+ alternatives: 'CONTROLLER_PASSWORD'
oauth_token:
description:
- The OAuth token to use.
env:
- name: CONTROLLER_OAUTH_TOKEN
- name: TOWER_OAUTH_TOKEN
- deprecated:
- - removed_from_collection: 'awx.awx.controller'
- - removed_in: '4.0.0'
- - why: Collection name change
- - alternative: 'CONTROLLER_OAUTH_TOKEN'
+ deprecated:
+ collection_name: 'awx.awx'
+ version: '4.0.0'
+ why: Collection name change
+ alternatives: 'CONTROLLER_OAUTH_TOKEN'
verify_ssl:
description:
- Specify whether Ansible should verify the SSL certificate of the controller host.
@@ -62,11 +62,11 @@
env:
- name: CONTROLLER_VERIFY_SSL
- name: TOWER_VERIFY_SSL
- deprecated:
- - removed_from_collection: 'awx.awx.controller'
- - removed_in: '4.0.0'
- - why: Collection name change
- - alternative: 'CONTROLLER_VERIFY_SSL'
+ deprecated:
+ collection_name: 'awx.awx'
+ version: '4.0.0'
+ why: Collection name change
+ alternatives: 'CONTROLLER_VERIFY_SSL'
aliases: [ validate_certs ]
notes:
|
{"golden_diff": "diff --git a/awx_collection/plugins/doc_fragments/auth_plugin.py b/awx_collection/plugins/doc_fragments/auth_plugin.py\n--- a/awx_collection/plugins/doc_fragments/auth_plugin.py\n+++ b/awx_collection/plugins/doc_fragments/auth_plugin.py\n@@ -18,42 +18,42 @@\n env:\n - name: CONTROLLER_HOST\n - name: TOWER_HOST\n- deprecated:\n- - removed_from_collection: 'awx.awx.controller'\n- - removed_in: '4.0.0'\n- - why: Collection name change\n- - alternative: 'CONTROLLER_HOST'\n+ deprecated:\n+ collection_name: 'awx.awx'\n+ version: '4.0.0'\n+ why: Collection name change\n+ alternatives: 'CONTROLLER_HOST'\n username:\n description: The user that you plan to use to access inventories on the controller.\n env:\n - name: CONTROLLER_USERNAME\n - name: TOWER_USERNAME\n- deprecated:\n- - removed_from_collection: 'awx.awx.controller'\n- - removed_in: '4.0.0'\n- - why: Collection name change\n- - alternative: 'CONTROLLER_USERNAME'\n+ deprecated:\n+ collection_name: 'awx.awx'\n+ version: '4.0.0'\n+ why: Collection name change\n+ alternatives: 'CONTROLLER_USERNAME'\n password:\n description: The password for your controller user.\n env:\n - name: CONTROLLER_PASSWORD\n - name: TOWER_PASSWORD\n- deprecated:\n- - removed_from_collection: 'awx.awx.controller'\n- - removed_in: '4.0.0'\n- - why: Collection name change\n- - alternative: 'CONTROLLER_PASSWORD'\n+ deprecated:\n+ collection_name: 'awx.awx'\n+ version: '4.0.0'\n+ why: Collection name change\n+ alternatives: 'CONTROLLER_PASSWORD'\n oauth_token:\n description:\n - The OAuth token to use.\n env:\n - name: CONTROLLER_OAUTH_TOKEN\n - name: TOWER_OAUTH_TOKEN\n- deprecated:\n- - removed_from_collection: 'awx.awx.controller'\n- - removed_in: '4.0.0'\n- - why: Collection name change\n- - alternative: 'CONTROLLER_OAUTH_TOKEN'\n+ deprecated:\n+ collection_name: 'awx.awx'\n+ version: '4.0.0'\n+ why: Collection name change\n+ alternatives: 'CONTROLLER_OAUTH_TOKEN'\n verify_ssl:\n description:\n - Specify whether Ansible should verify the SSL certificate of the controller host.\n@@ -62,11 +62,11 @@\n env:\n - name: CONTROLLER_VERIFY_SSL\n - name: TOWER_VERIFY_SSL\n- deprecated:\n- - removed_from_collection: 'awx.awx.controller'\n- - removed_in: '4.0.0'\n- - why: Collection name change\n- - alternative: 'CONTROLLER_VERIFY_SSL'\n+ deprecated:\n+ collection_name: 'awx.awx'\n+ version: '4.0.0'\n+ why: Collection name change\n+ alternatives: 'CONTROLLER_VERIFY_SSL'\n aliases: [ validate_certs ]\n \n notes:\n", "issue": "doc error with inventory plugin for latest awx.awx inventory plugin \n### Please confirm the following\n\n- [X] I agree to follow this project's [code of conduct](http://docs.ansible.com/ansible/latest/community/code_of_conduct.html).\n- [X] I have checked the [current issues](https://github.com/ansible/awx/issues) for duplicates.\n- [X] I understand that AWX is open source software provided for free and that I am not entitled to status updates or other assurances.\n\n### Summary\n\nThe documentation for the inventory plugin, awx.awx.controller, was malformed.\r\n\r\nrendered webpage for docs is here: https://docs.ansible.com/ansible/latest/collections/awx/awx/controller_inventory.html#ansible-collections-awx-awx-controller-inventory\n\n### AWX version\n\nN/A\n\n### Installation method\n\nopenshift\n\n### Modifications\n\nno\n\n### Ansible version\n\nn/a\n\n### Operating system\n\nn/a\n\n### Web browser\n\nChrome\n\n### Steps to reproduce\n\ngo to https://docs.ansible.com/ansible/latest/collections/awx/awx/controller_inventory.html#ansible-collections-awx-awx-controller-inventory\r\n\r\nthere is docs errors on the webpage \r\n\r\n>The documentation for the inventory plugin, awx.awx.controller, was malformed.\r\n\r\n\n\n### Expected results\n\nno errors\n\n### Actual results\n\n\r\n\n\n### Additional information\n\nfast track this to infamous Chris Meyers\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n\n# Copyright: (c) 2020, Ansible by Red Hat, Inc\n# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)\n\nfrom __future__ import absolute_import, division, print_function\n\n__metaclass__ = type\n\n\nclass ModuleDocFragment(object):\n\n # Automation Platform Controller documentation fragment\n DOCUMENTATION = r'''\noptions:\n host:\n description: The network address of your Automation Platform Controller host.\n env:\n - name: CONTROLLER_HOST\n - name: TOWER_HOST\n deprecated:\n - removed_from_collection: 'awx.awx.controller'\n - removed_in: '4.0.0'\n - why: Collection name change\n - alternative: 'CONTROLLER_HOST'\n username:\n description: The user that you plan to use to access inventories on the controller.\n env:\n - name: CONTROLLER_USERNAME\n - name: TOWER_USERNAME\n deprecated:\n - removed_from_collection: 'awx.awx.controller'\n - removed_in: '4.0.0'\n - why: Collection name change\n - alternative: 'CONTROLLER_USERNAME'\n password:\n description: The password for your controller user.\n env:\n - name: CONTROLLER_PASSWORD\n - name: TOWER_PASSWORD\n deprecated:\n - removed_from_collection: 'awx.awx.controller'\n - removed_in: '4.0.0'\n - why: Collection name change\n - alternative: 'CONTROLLER_PASSWORD'\n oauth_token:\n description:\n - The OAuth token to use.\n env:\n - name: CONTROLLER_OAUTH_TOKEN\n - name: TOWER_OAUTH_TOKEN\n deprecated:\n - removed_from_collection: 'awx.awx.controller'\n - removed_in: '4.0.0'\n - why: Collection name change\n - alternative: 'CONTROLLER_OAUTH_TOKEN'\n verify_ssl:\n description:\n - Specify whether Ansible should verify the SSL certificate of the controller host.\n - Defaults to True, but this is handled by the shared module_utils code\n type: bool\n env:\n - name: CONTROLLER_VERIFY_SSL\n - name: TOWER_VERIFY_SSL\n deprecated:\n - removed_from_collection: 'awx.awx.controller'\n - removed_in: '4.0.0'\n - why: Collection name change\n - alternative: 'CONTROLLER_VERIFY_SSL'\n aliases: [ validate_certs ]\n\nnotes:\n- If no I(config_file) is provided we will attempt to use the tower-cli library\n defaults to find your host information.\n- I(config_file) should be in the following format\n host=hostname\n username=username\n password=password\n'''\n", "path": "awx_collection/plugins/doc_fragments/auth_plugin.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n\n# Copyright: (c) 2020, Ansible by Red Hat, Inc\n# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)\n\nfrom __future__ import absolute_import, division, print_function\n\n__metaclass__ = type\n\n\nclass ModuleDocFragment(object):\n\n # Automation Platform Controller documentation fragment\n DOCUMENTATION = r'''\noptions:\n host:\n description: The network address of your Automation Platform Controller host.\n env:\n - name: CONTROLLER_HOST\n - name: TOWER_HOST\n deprecated:\n collection_name: 'awx.awx'\n version: '4.0.0'\n why: Collection name change\n alternatives: 'CONTROLLER_HOST'\n username:\n description: The user that you plan to use to access inventories on the controller.\n env:\n - name: CONTROLLER_USERNAME\n - name: TOWER_USERNAME\n deprecated:\n collection_name: 'awx.awx'\n version: '4.0.0'\n why: Collection name change\n alternatives: 'CONTROLLER_USERNAME'\n password:\n description: The password for your controller user.\n env:\n - name: CONTROLLER_PASSWORD\n - name: TOWER_PASSWORD\n deprecated:\n collection_name: 'awx.awx'\n version: '4.0.0'\n why: Collection name change\n alternatives: 'CONTROLLER_PASSWORD'\n oauth_token:\n description:\n - The OAuth token to use.\n env:\n - name: CONTROLLER_OAUTH_TOKEN\n - name: TOWER_OAUTH_TOKEN\n deprecated:\n collection_name: 'awx.awx'\n version: '4.0.0'\n why: Collection name change\n alternatives: 'CONTROLLER_OAUTH_TOKEN'\n verify_ssl:\n description:\n - Specify whether Ansible should verify the SSL certificate of the controller host.\n - Defaults to True, but this is handled by the shared module_utils code\n type: bool\n env:\n - name: CONTROLLER_VERIFY_SSL\n - name: TOWER_VERIFY_SSL\n deprecated:\n collection_name: 'awx.awx'\n version: '4.0.0'\n why: Collection name change\n alternatives: 'CONTROLLER_VERIFY_SSL'\n aliases: [ validate_certs ]\n\nnotes:\n- If no I(config_file) is provided we will attempt to use the tower-cli library\n defaults to find your host information.\n- I(config_file) should be in the following format\n host=hostname\n username=username\n password=password\n'''\n", "path": "awx_collection/plugins/doc_fragments/auth_plugin.py"}]}
| 1,384 | 739 |
gh_patches_debug_32514
|
rasdani/github-patches
|
git_diff
|
pallets__click-865
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Autocomplete support for zsh
Bash autocompletion works well, but I would really like zsh to work as well
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `click/_bashcomplete.py`
Content:
```
1 import collections
2 import copy
3 import os
4 import re
5
6 from .utils import echo
7 from .parser import split_arg_string
8 from .core import MultiCommand, Option, Argument
9 from .types import Choice
10
11 WORDBREAK = '='
12
13 COMPLETION_SCRIPT = '''
14 %(complete_func)s() {
15 local IFS=$'\n'
16 COMPREPLY=( $( env COMP_WORDS="${COMP_WORDS[*]}" \\
17 COMP_CWORD=$COMP_CWORD \\
18 %(autocomplete_var)s=complete $1 ) )
19 return 0
20 }
21
22 complete -F %(complete_func)s -o default %(script_names)s
23 '''
24
25 _invalid_ident_char_re = re.compile(r'[^a-zA-Z0-9_]')
26
27
28 def get_completion_script(prog_name, complete_var):
29 cf_name = _invalid_ident_char_re.sub('', prog_name.replace('-', '_'))
30 return (COMPLETION_SCRIPT % {
31 'complete_func': '_%s_completion' % cf_name,
32 'script_names': prog_name,
33 'autocomplete_var': complete_var,
34 }).strip() + ';'
35
36
37 def resolve_ctx(cli, prog_name, args):
38 """
39 Parse into a hierarchy of contexts. Contexts are connected through the parent variable.
40 :param cli: command definition
41 :param prog_name: the program that is running
42 :param args: full list of args
43 :return: the final context/command parsed
44 """
45 ctx = cli.make_context(prog_name, args, resilient_parsing=True)
46 args_remaining = ctx.protected_args + ctx.args
47 while ctx is not None and args_remaining:
48 if isinstance(ctx.command, MultiCommand):
49 cmd = ctx.command.get_command(ctx, args_remaining[0])
50 if cmd is None:
51 return None
52 ctx = cmd.make_context(
53 args_remaining[0], args_remaining[1:], parent=ctx, resilient_parsing=True)
54 args_remaining = ctx.protected_args + ctx.args
55 else:
56 ctx = ctx.parent
57
58 return ctx
59
60
61 def start_of_option(param_str):
62 """
63 :param param_str: param_str to check
64 :return: whether or not this is the start of an option declaration (i.e. starts "-" or "--")
65 """
66 return param_str and param_str[:1] == '-'
67
68
69 def is_incomplete_option(all_args, cmd_param):
70 """
71 :param all_args: the full original list of args supplied
72 :param cmd_param: the current command paramter
73 :return: whether or not the last option declaration (i.e. starts "-" or "--") is incomplete and
74 corresponds to this cmd_param. In other words whether this cmd_param option can still accept
75 values
76 """
77 if not isinstance(cmd_param, Option):
78 return False
79 if cmd_param.is_flag:
80 return False
81 last_option = None
82 for index, arg_str in enumerate(reversed([arg for arg in all_args if arg != WORDBREAK])):
83 if index + 1 > cmd_param.nargs:
84 break
85 if start_of_option(arg_str):
86 last_option = arg_str
87
88 return True if last_option and last_option in cmd_param.opts else False
89
90
91 def is_incomplete_argument(current_params, cmd_param):
92 """
93 :param current_params: the current params and values for this argument as already entered
94 :param cmd_param: the current command parameter
95 :return: whether or not the last argument is incomplete and corresponds to this cmd_param. In
96 other words whether or not the this cmd_param argument can still accept values
97 """
98 if not isinstance(cmd_param, Argument):
99 return False
100 current_param_values = current_params[cmd_param.name]
101 if current_param_values is None:
102 return True
103 if cmd_param.nargs == -1:
104 return True
105 if isinstance(current_param_values, collections.Iterable) \
106 and cmd_param.nargs > 1 and len(current_param_values) < cmd_param.nargs:
107 return True
108 return False
109
110
111 def get_user_autocompletions(ctx, args, incomplete, cmd_param):
112 """
113 :param ctx: context associated with the parsed command
114 :param args: full list of args
115 :param incomplete: the incomplete text to autocomplete
116 :param cmd_param: command definition
117 :return: all the possible user-specified completions for the param
118 """
119 if isinstance(cmd_param.type, Choice):
120 return [c for c in cmd_param.type.choices if c.startswith(incomplete)]
121 elif cmd_param.autocompletion is not None:
122 return cmd_param.autocompletion(ctx=ctx,
123 args=args,
124 incomplete=incomplete)
125 else:
126 return []
127
128
129 def add_subcommand_completions(ctx, incomplete, completions_out):
130 # Add subcommand completions.
131 if isinstance(ctx.command, MultiCommand):
132 completions_out.extend(
133 [c for c in ctx.command.list_commands(ctx) if c.startswith(incomplete)])
134
135 # Walk up the context list and add any other completion possibilities from chained commands
136 while ctx.parent is not None:
137 ctx = ctx.parent
138 if isinstance(ctx.command, MultiCommand) and ctx.command.chain:
139 remaining_commands = sorted(
140 set(ctx.command.list_commands(ctx)) - set(ctx.protected_args))
141 completions_out.extend(
142 [c for c in remaining_commands if c.startswith(incomplete)])
143
144
145 def get_choices(cli, prog_name, args, incomplete):
146 """
147 :param cli: command definition
148 :param prog_name: the program that is running
149 :param args: full list of args
150 :param incomplete: the incomplete text to autocomplete
151 :return: all the possible completions for the incomplete
152 """
153 all_args = copy.deepcopy(args)
154
155 ctx = resolve_ctx(cli, prog_name, args)
156 if ctx is None:
157 return []
158
159 # In newer versions of bash long opts with '='s are partitioned, but it's easier to parse
160 # without the '='
161 if start_of_option(incomplete) and WORDBREAK in incomplete:
162 partition_incomplete = incomplete.partition(WORDBREAK)
163 all_args.append(partition_incomplete[0])
164 incomplete = partition_incomplete[2]
165 elif incomplete == WORDBREAK:
166 incomplete = ''
167
168 completions = []
169 if start_of_option(incomplete):
170 # completions for partial options
171 for param in ctx.command.params:
172 if isinstance(param, Option):
173 param_opts = [param_opt for param_opt in param.opts +
174 param.secondary_opts if param_opt not in all_args or param.multiple]
175 completions.extend(
176 [c for c in param_opts if c.startswith(incomplete)])
177 return completions
178 # completion for option values from user supplied values
179 for param in ctx.command.params:
180 if is_incomplete_option(all_args, param):
181 return get_user_autocompletions(ctx, all_args, incomplete, param)
182 # completion for argument values from user supplied values
183 for param in ctx.command.params:
184 if is_incomplete_argument(ctx.params, param):
185 completions.extend(get_user_autocompletions(
186 ctx, all_args, incomplete, param))
187 # Stop looking for other completions only if this argument is required.
188 if param.required:
189 return completions
190 break
191
192 add_subcommand_completions(ctx, incomplete, completions)
193 return completions
194
195
196 def do_complete(cli, prog_name):
197 cwords = split_arg_string(os.environ['COMP_WORDS'])
198 cword = int(os.environ['COMP_CWORD'])
199 args = cwords[1:cword]
200 try:
201 incomplete = cwords[cword]
202 except IndexError:
203 incomplete = ''
204
205 for item in get_choices(cli, prog_name, args, incomplete):
206 echo(item)
207
208 return True
209
210
211 def bashcomplete(cli, prog_name, complete_var, complete_instr):
212 if complete_instr == 'source':
213 echo(get_completion_script(prog_name, complete_var))
214 return True
215 elif complete_instr == 'complete':
216 return do_complete(cli, prog_name)
217 return False
218
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/click/_bashcomplete.py b/click/_bashcomplete.py
--- a/click/_bashcomplete.py
+++ b/click/_bashcomplete.py
@@ -10,7 +10,7 @@
WORDBREAK = '='
-COMPLETION_SCRIPT = '''
+COMPLETION_SCRIPT_BASH = '''
%(complete_func)s() {
local IFS=$'\n'
COMPREPLY=( $( env COMP_WORDS="${COMP_WORDS[*]}" \\
@@ -19,15 +19,30 @@
return 0
}
-complete -F %(complete_func)s -o default %(script_names)s
+complete -F %(complete_func)s %(script_names)s
+'''
+
+COMPLETION_SCRIPT_ZSH = '''
+%(complete_func)s() {
+ emulate -L zsh
+ local IFS=$'\n'
+ local completions=( $( env COMP_WORDS="${words[*]}" \\
+ COMP_CWORD=$((CURRENT-1)) \\
+ %(autocomplete_var)s="complete" \\
+ %(script_names)s ) )
+ compadd -M 'r:|=* l:|=* r:|=*' -a -- completions
+}
+
+compdef %(complete_func)s %(script_names)s
'''
_invalid_ident_char_re = re.compile(r'[^a-zA-Z0-9_]')
-def get_completion_script(prog_name, complete_var):
+def get_completion_script(prog_name, complete_var, shell):
cf_name = _invalid_ident_char_re.sub('', prog_name.replace('-', '_'))
- return (COMPLETION_SCRIPT % {
+ script = COMPLETION_SCRIPT_ZSH if shell == 'zsh' else COMPLETION_SCRIPT_BASH
+ return (script % {
'complete_func': '_%s_completion' % cf_name,
'script_names': prog_name,
'autocomplete_var': complete_var,
@@ -209,8 +224,9 @@
def bashcomplete(cli, prog_name, complete_var, complete_instr):
- if complete_instr == 'source':
- echo(get_completion_script(prog_name, complete_var))
+ if complete_instr.startswith('source'):
+ shell = 'zsh' if complete_instr == 'source_zsh' else 'bash'
+ echo(get_completion_script(prog_name, complete_var, shell))
return True
elif complete_instr == 'complete':
return do_complete(cli, prog_name)
|
{"golden_diff": "diff --git a/click/_bashcomplete.py b/click/_bashcomplete.py\n--- a/click/_bashcomplete.py\n+++ b/click/_bashcomplete.py\n@@ -10,7 +10,7 @@\n \n WORDBREAK = '='\n \n-COMPLETION_SCRIPT = '''\n+COMPLETION_SCRIPT_BASH = '''\n %(complete_func)s() {\n local IFS=$'\\n'\n COMPREPLY=( $( env COMP_WORDS=\"${COMP_WORDS[*]}\" \\\\\n@@ -19,15 +19,30 @@\n return 0\n }\n \n-complete -F %(complete_func)s -o default %(script_names)s\n+complete -F %(complete_func)s %(script_names)s\n+'''\n+\n+COMPLETION_SCRIPT_ZSH = '''\n+%(complete_func)s() {\n+ emulate -L zsh\n+ local IFS=$'\\n'\n+ local completions=( $( env COMP_WORDS=\"${words[*]}\" \\\\\n+ COMP_CWORD=$((CURRENT-1)) \\\\\n+ %(autocomplete_var)s=\"complete\" \\\\\n+ %(script_names)s ) )\n+ compadd -M 'r:|=* l:|=* r:|=*' -a -- completions \n+}\n+\n+compdef %(complete_func)s %(script_names)s\n '''\n \n _invalid_ident_char_re = re.compile(r'[^a-zA-Z0-9_]')\n \n \n-def get_completion_script(prog_name, complete_var):\n+def get_completion_script(prog_name, complete_var, shell):\n cf_name = _invalid_ident_char_re.sub('', prog_name.replace('-', '_'))\n- return (COMPLETION_SCRIPT % {\n+ script = COMPLETION_SCRIPT_ZSH if shell == 'zsh' else COMPLETION_SCRIPT_BASH\n+ return (script % {\n 'complete_func': '_%s_completion' % cf_name,\n 'script_names': prog_name,\n 'autocomplete_var': complete_var,\n@@ -209,8 +224,9 @@\n \n \n def bashcomplete(cli, prog_name, complete_var, complete_instr):\n- if complete_instr == 'source':\n- echo(get_completion_script(prog_name, complete_var))\n+ if complete_instr.startswith('source'):\n+ shell = 'zsh' if complete_instr == 'source_zsh' else 'bash'\n+ echo(get_completion_script(prog_name, complete_var, shell))\n return True\n elif complete_instr == 'complete':\n return do_complete(cli, prog_name)\n", "issue": "Autocomplete support for zsh\nBash autocompletion works well, but I would really like zsh to work as well\n\n", "before_files": [{"content": "import collections\nimport copy\nimport os\nimport re\n\nfrom .utils import echo\nfrom .parser import split_arg_string\nfrom .core import MultiCommand, Option, Argument\nfrom .types import Choice\n\nWORDBREAK = '='\n\nCOMPLETION_SCRIPT = '''\n%(complete_func)s() {\n local IFS=$'\\n'\n COMPREPLY=( $( env COMP_WORDS=\"${COMP_WORDS[*]}\" \\\\\n COMP_CWORD=$COMP_CWORD \\\\\n %(autocomplete_var)s=complete $1 ) )\n return 0\n}\n\ncomplete -F %(complete_func)s -o default %(script_names)s\n'''\n\n_invalid_ident_char_re = re.compile(r'[^a-zA-Z0-9_]')\n\n\ndef get_completion_script(prog_name, complete_var):\n cf_name = _invalid_ident_char_re.sub('', prog_name.replace('-', '_'))\n return (COMPLETION_SCRIPT % {\n 'complete_func': '_%s_completion' % cf_name,\n 'script_names': prog_name,\n 'autocomplete_var': complete_var,\n }).strip() + ';'\n\n\ndef resolve_ctx(cli, prog_name, args):\n \"\"\"\n Parse into a hierarchy of contexts. Contexts are connected through the parent variable.\n :param cli: command definition\n :param prog_name: the program that is running\n :param args: full list of args\n :return: the final context/command parsed\n \"\"\"\n ctx = cli.make_context(prog_name, args, resilient_parsing=True)\n args_remaining = ctx.protected_args + ctx.args\n while ctx is not None and args_remaining:\n if isinstance(ctx.command, MultiCommand):\n cmd = ctx.command.get_command(ctx, args_remaining[0])\n if cmd is None:\n return None\n ctx = cmd.make_context(\n args_remaining[0], args_remaining[1:], parent=ctx, resilient_parsing=True)\n args_remaining = ctx.protected_args + ctx.args\n else:\n ctx = ctx.parent\n\n return ctx\n\n\ndef start_of_option(param_str):\n \"\"\"\n :param param_str: param_str to check\n :return: whether or not this is the start of an option declaration (i.e. starts \"-\" or \"--\")\n \"\"\"\n return param_str and param_str[:1] == '-'\n\n\ndef is_incomplete_option(all_args, cmd_param):\n \"\"\"\n :param all_args: the full original list of args supplied\n :param cmd_param: the current command paramter\n :return: whether or not the last option declaration (i.e. starts \"-\" or \"--\") is incomplete and\n corresponds to this cmd_param. In other words whether this cmd_param option can still accept\n values\n \"\"\"\n if not isinstance(cmd_param, Option):\n return False\n if cmd_param.is_flag:\n return False\n last_option = None\n for index, arg_str in enumerate(reversed([arg for arg in all_args if arg != WORDBREAK])):\n if index + 1 > cmd_param.nargs:\n break\n if start_of_option(arg_str):\n last_option = arg_str\n\n return True if last_option and last_option in cmd_param.opts else False\n\n\ndef is_incomplete_argument(current_params, cmd_param):\n \"\"\"\n :param current_params: the current params and values for this argument as already entered\n :param cmd_param: the current command parameter\n :return: whether or not the last argument is incomplete and corresponds to this cmd_param. In\n other words whether or not the this cmd_param argument can still accept values\n \"\"\"\n if not isinstance(cmd_param, Argument):\n return False\n current_param_values = current_params[cmd_param.name]\n if current_param_values is None:\n return True\n if cmd_param.nargs == -1:\n return True\n if isinstance(current_param_values, collections.Iterable) \\\n and cmd_param.nargs > 1 and len(current_param_values) < cmd_param.nargs:\n return True\n return False\n\n\ndef get_user_autocompletions(ctx, args, incomplete, cmd_param):\n \"\"\"\n :param ctx: context associated with the parsed command\n :param args: full list of args\n :param incomplete: the incomplete text to autocomplete\n :param cmd_param: command definition\n :return: all the possible user-specified completions for the param\n \"\"\"\n if isinstance(cmd_param.type, Choice):\n return [c for c in cmd_param.type.choices if c.startswith(incomplete)]\n elif cmd_param.autocompletion is not None:\n return cmd_param.autocompletion(ctx=ctx,\n args=args,\n incomplete=incomplete)\n else:\n return []\n\n\ndef add_subcommand_completions(ctx, incomplete, completions_out):\n # Add subcommand completions.\n if isinstance(ctx.command, MultiCommand):\n completions_out.extend(\n [c for c in ctx.command.list_commands(ctx) if c.startswith(incomplete)])\n\n # Walk up the context list and add any other completion possibilities from chained commands\n while ctx.parent is not None:\n ctx = ctx.parent\n if isinstance(ctx.command, MultiCommand) and ctx.command.chain:\n remaining_commands = sorted(\n set(ctx.command.list_commands(ctx)) - set(ctx.protected_args))\n completions_out.extend(\n [c for c in remaining_commands if c.startswith(incomplete)])\n\n\ndef get_choices(cli, prog_name, args, incomplete):\n \"\"\"\n :param cli: command definition\n :param prog_name: the program that is running\n :param args: full list of args\n :param incomplete: the incomplete text to autocomplete\n :return: all the possible completions for the incomplete\n \"\"\"\n all_args = copy.deepcopy(args)\n\n ctx = resolve_ctx(cli, prog_name, args)\n if ctx is None:\n return []\n\n # In newer versions of bash long opts with '='s are partitioned, but it's easier to parse\n # without the '='\n if start_of_option(incomplete) and WORDBREAK in incomplete:\n partition_incomplete = incomplete.partition(WORDBREAK)\n all_args.append(partition_incomplete[0])\n incomplete = partition_incomplete[2]\n elif incomplete == WORDBREAK:\n incomplete = ''\n\n completions = []\n if start_of_option(incomplete):\n # completions for partial options\n for param in ctx.command.params:\n if isinstance(param, Option):\n param_opts = [param_opt for param_opt in param.opts +\n param.secondary_opts if param_opt not in all_args or param.multiple]\n completions.extend(\n [c for c in param_opts if c.startswith(incomplete)])\n return completions\n # completion for option values from user supplied values\n for param in ctx.command.params:\n if is_incomplete_option(all_args, param):\n return get_user_autocompletions(ctx, all_args, incomplete, param)\n # completion for argument values from user supplied values\n for param in ctx.command.params:\n if is_incomplete_argument(ctx.params, param):\n completions.extend(get_user_autocompletions(\n ctx, all_args, incomplete, param))\n # Stop looking for other completions only if this argument is required.\n if param.required:\n return completions\n break\n\n add_subcommand_completions(ctx, incomplete, completions)\n return completions\n\n\ndef do_complete(cli, prog_name):\n cwords = split_arg_string(os.environ['COMP_WORDS'])\n cword = int(os.environ['COMP_CWORD'])\n args = cwords[1:cword]\n try:\n incomplete = cwords[cword]\n except IndexError:\n incomplete = ''\n\n for item in get_choices(cli, prog_name, args, incomplete):\n echo(item)\n\n return True\n\n\ndef bashcomplete(cli, prog_name, complete_var, complete_instr):\n if complete_instr == 'source':\n echo(get_completion_script(prog_name, complete_var))\n return True\n elif complete_instr == 'complete':\n return do_complete(cli, prog_name)\n return False\n", "path": "click/_bashcomplete.py"}], "after_files": [{"content": "import collections\nimport copy\nimport os\nimport re\n\nfrom .utils import echo\nfrom .parser import split_arg_string\nfrom .core import MultiCommand, Option, Argument\nfrom .types import Choice\n\nWORDBREAK = '='\n\nCOMPLETION_SCRIPT_BASH = '''\n%(complete_func)s() {\n local IFS=$'\\n'\n COMPREPLY=( $( env COMP_WORDS=\"${COMP_WORDS[*]}\" \\\\\n COMP_CWORD=$COMP_CWORD \\\\\n %(autocomplete_var)s=complete $1 ) )\n return 0\n}\n\ncomplete -F %(complete_func)s %(script_names)s\n'''\n\nCOMPLETION_SCRIPT_ZSH = '''\n%(complete_func)s() {\n emulate -L zsh\n local IFS=$'\\n'\n local completions=( $( env COMP_WORDS=\"${words[*]}\" \\\\\n COMP_CWORD=$((CURRENT-1)) \\\\\n %(autocomplete_var)s=\"complete\" \\\\\n %(script_names)s ) )\n compadd -M 'r:|=* l:|=* r:|=*' -a -- completions \n}\n\ncompdef %(complete_func)s %(script_names)s\n'''\n\n_invalid_ident_char_re = re.compile(r'[^a-zA-Z0-9_]')\n\n\ndef get_completion_script(prog_name, complete_var, shell):\n cf_name = _invalid_ident_char_re.sub('', prog_name.replace('-', '_'))\n script = COMPLETION_SCRIPT_ZSH if shell == 'zsh' else COMPLETION_SCRIPT_BASH\n return (script % {\n 'complete_func': '_%s_completion' % cf_name,\n 'script_names': prog_name,\n 'autocomplete_var': complete_var,\n }).strip() + ';'\n\n\ndef resolve_ctx(cli, prog_name, args):\n \"\"\"\n Parse into a hierarchy of contexts. Contexts are connected through the parent variable.\n :param cli: command definition\n :param prog_name: the program that is running\n :param args: full list of args\n :return: the final context/command parsed\n \"\"\"\n ctx = cli.make_context(prog_name, args, resilient_parsing=True)\n args_remaining = ctx.protected_args + ctx.args\n while ctx is not None and args_remaining:\n if isinstance(ctx.command, MultiCommand):\n cmd = ctx.command.get_command(ctx, args_remaining[0])\n if cmd is None:\n return None\n ctx = cmd.make_context(\n args_remaining[0], args_remaining[1:], parent=ctx, resilient_parsing=True)\n args_remaining = ctx.protected_args + ctx.args\n else:\n ctx = ctx.parent\n\n return ctx\n\n\ndef start_of_option(param_str):\n \"\"\"\n :param param_str: param_str to check\n :return: whether or not this is the start of an option declaration (i.e. starts \"-\" or \"--\")\n \"\"\"\n return param_str and param_str[:1] == '-'\n\n\ndef is_incomplete_option(all_args, cmd_param):\n \"\"\"\n :param all_args: the full original list of args supplied\n :param cmd_param: the current command paramter\n :return: whether or not the last option declaration (i.e. starts \"-\" or \"--\") is incomplete and\n corresponds to this cmd_param. In other words whether this cmd_param option can still accept\n values\n \"\"\"\n if not isinstance(cmd_param, Option):\n return False\n if cmd_param.is_flag:\n return False\n last_option = None\n for index, arg_str in enumerate(reversed([arg for arg in all_args if arg != WORDBREAK])):\n if index + 1 > cmd_param.nargs:\n break\n if start_of_option(arg_str):\n last_option = arg_str\n\n return True if last_option and last_option in cmd_param.opts else False\n\n\ndef is_incomplete_argument(current_params, cmd_param):\n \"\"\"\n :param current_params: the current params and values for this argument as already entered\n :param cmd_param: the current command parameter\n :return: whether or not the last argument is incomplete and corresponds to this cmd_param. In\n other words whether or not the this cmd_param argument can still accept values\n \"\"\"\n if not isinstance(cmd_param, Argument):\n return False\n current_param_values = current_params[cmd_param.name]\n if current_param_values is None:\n return True\n if cmd_param.nargs == -1:\n return True\n if isinstance(current_param_values, collections.Iterable) \\\n and cmd_param.nargs > 1 and len(current_param_values) < cmd_param.nargs:\n return True\n return False\n\n\ndef get_user_autocompletions(ctx, args, incomplete, cmd_param):\n \"\"\"\n :param ctx: context associated with the parsed command\n :param args: full list of args\n :param incomplete: the incomplete text to autocomplete\n :param cmd_param: command definition\n :return: all the possible user-specified completions for the param\n \"\"\"\n if isinstance(cmd_param.type, Choice):\n return [c for c in cmd_param.type.choices if c.startswith(incomplete)]\n elif cmd_param.autocompletion is not None:\n return cmd_param.autocompletion(ctx=ctx,\n args=args,\n incomplete=incomplete)\n else:\n return []\n\n\ndef add_subcommand_completions(ctx, incomplete, completions_out):\n # Add subcommand completions.\n if isinstance(ctx.command, MultiCommand):\n completions_out.extend(\n [c for c in ctx.command.list_commands(ctx) if c.startswith(incomplete)])\n\n # Walk up the context list and add any other completion possibilities from chained commands\n while ctx.parent is not None:\n ctx = ctx.parent\n if isinstance(ctx.command, MultiCommand) and ctx.command.chain:\n remaining_commands = sorted(\n set(ctx.command.list_commands(ctx)) - set(ctx.protected_args))\n completions_out.extend(\n [c for c in remaining_commands if c.startswith(incomplete)])\n\n\ndef get_choices(cli, prog_name, args, incomplete):\n \"\"\"\n :param cli: command definition\n :param prog_name: the program that is running\n :param args: full list of args\n :param incomplete: the incomplete text to autocomplete\n :return: all the possible completions for the incomplete\n \"\"\"\n all_args = copy.deepcopy(args)\n\n ctx = resolve_ctx(cli, prog_name, args)\n if ctx is None:\n return []\n\n # In newer versions of bash long opts with '='s are partitioned, but it's easier to parse\n # without the '='\n if start_of_option(incomplete) and WORDBREAK in incomplete:\n partition_incomplete = incomplete.partition(WORDBREAK)\n all_args.append(partition_incomplete[0])\n incomplete = partition_incomplete[2]\n elif incomplete == WORDBREAK:\n incomplete = ''\n\n completions = []\n if start_of_option(incomplete):\n # completions for partial options\n for param in ctx.command.params:\n if isinstance(param, Option):\n param_opts = [param_opt for param_opt in param.opts +\n param.secondary_opts if param_opt not in all_args or param.multiple]\n completions.extend(\n [c for c in param_opts if c.startswith(incomplete)])\n return completions\n # completion for option values from user supplied values\n for param in ctx.command.params:\n if is_incomplete_option(all_args, param):\n return get_user_autocompletions(ctx, all_args, incomplete, param)\n # completion for argument values from user supplied values\n for param in ctx.command.params:\n if is_incomplete_argument(ctx.params, param):\n completions.extend(get_user_autocompletions(\n ctx, all_args, incomplete, param))\n # Stop looking for other completions only if this argument is required.\n if param.required:\n return completions\n break\n\n add_subcommand_completions(ctx, incomplete, completions)\n return completions\n\n\ndef do_complete(cli, prog_name):\n cwords = split_arg_string(os.environ['COMP_WORDS'])\n cword = int(os.environ['COMP_CWORD'])\n args = cwords[1:cword]\n try:\n incomplete = cwords[cword]\n except IndexError:\n incomplete = ''\n\n for item in get_choices(cli, prog_name, args, incomplete):\n echo(item)\n\n return True\n\n\ndef bashcomplete(cli, prog_name, complete_var, complete_instr):\n if complete_instr.startswith('source'):\n shell = 'zsh' if complete_instr == 'source_zsh' else 'bash'\n echo(get_completion_script(prog_name, complete_var, shell))\n return True\n elif complete_instr == 'complete':\n return do_complete(cli, prog_name)\n return False\n", "path": "click/_bashcomplete.py"}]}
| 2,549 | 521 |
gh_patches_debug_19250
|
rasdani/github-patches
|
git_diff
|
sublimelsp__LSP-1957
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Unwanted splitting on dot of the setting keys
**Describe the bug**
In LSP-yaml, the user can specify schema mapping like so:
```
yaml.schemas: {
"https://json.schemastore.org/composer": "/*"
}
```
This doesn't work in ST because LSP splits the URL key on every dot, creating an object that the server doesn't understand.
**Expected behavior**
Should send the value of the `yaml.schemas` object without altering it
**Additional context**
I'm not sure how VSCode manages to handle it but I would guess it is somewhat taking into consideration the schema itself. We could try to check that first.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `plugin/core/collections.py`
Content:
```
1 """
2 Module with additional collections.
3 """
4 from .typing import Optional, Dict, Any, Generator
5 from copy import deepcopy
6 import sublime
7
8
9 class DottedDict:
10
11 __slots__ = ('_d',)
12
13 def __init__(self, d: Optional[Dict[str, Any]] = None) -> None:
14 """
15 Construct a DottedDict, optionally from an existing dictionary.
16
17 :param d: An existing dictionary.
18 """
19 self._d = {} # type: Dict[str, Any]
20 if d is not None:
21 self.update(d)
22
23 @classmethod
24 def from_base_and_override(cls, base: "DottedDict", override: Optional[Dict[str, Any]]) -> "DottedDict":
25 result = DottedDict(base.copy())
26 if override:
27 result.update(override)
28 return result
29
30 def get(self, path: Optional[str] = None) -> Any:
31 """
32 Get a value from the dictionary.
33
34 :param path: The path, e.g. foo.bar.baz, or None.
35
36 :returns: The value stored at the path, or None if it doesn't exist.
37 Note that this cannot distinguish between None values and
38 paths that don't exist. If the path is None, returns the
39 entire dictionary.
40 """
41 if path is None:
42 return self._d
43 current = self._d # type: Any
44 keys = path.split('.')
45 for key in keys:
46 if isinstance(current, dict):
47 current = current.get(key)
48 else:
49 return None
50 return current
51
52 def walk(self, path: str) -> Generator[Any, None, None]:
53 current = self._d # type: Any
54 keys = path.split('.')
55 for key in keys:
56 if isinstance(current, dict):
57 current = current.get(key)
58 yield current
59 else:
60 yield None
61 return
62
63 def set(self, path: str, value: Any) -> None:
64 """
65 Set a value in the dictionary.
66
67 :param path: The path, e.g. foo.bar.baz
68 :param value: The value
69 """
70 current = self._d
71 keys = path.split('.')
72 for i in range(0, len(keys) - 1):
73 key = keys[i]
74 next_current = current.get(key)
75 if not isinstance(next_current, dict):
76 next_current = {}
77 current[key] = next_current
78 current = next_current
79 current[keys[-1]] = value
80
81 def remove(self, path: str) -> None:
82 """
83 Remove a key from the dictionary.
84
85 :param path: The path, e.g. foo.bar.baz
86 """
87 current = self._d
88 keys = path.split('.')
89 for i in range(0, len(keys) - 1):
90 key = keys[i]
91 next_current = current.get(key)
92 if not isinstance(next_current, dict):
93 return
94 current = next_current
95 current.pop(keys[-1], None)
96
97 def copy(self, path: Optional[str] = None) -> Any:
98 """
99 Get a copy of the value from the dictionary or copy of whole dictionary.
100
101 :param path: The path, e.g. foo.bar.baz, or None.
102
103 :returns: A copy of the value stored at the path, or None if it doesn't exist.
104 Note that this cannot distinguish between None values and
105 paths that don't exist. If the path is None, returns a copy of the
106 entire dictionary.
107 """
108 return deepcopy(self.get(path))
109
110 def __bool__(self) -> bool:
111 """
112 If this collection has at least one key-value pair, return True, else return False.
113 """
114 return bool(self._d)
115
116 def __contains__(self, path: str) -> bool:
117 value = self.get(path)
118 return value is not None and value is not False
119
120 def clear(self) -> None:
121 """
122 Remove all key-value pairs.
123 """
124 self._d.clear()
125
126 def assign(self, d: Dict[str, Any]) -> None:
127 """
128 Overwrites the old stored dictionary with a fresh new dictionary.
129
130 :param d: The new dictionary to store
131 """
132 self._d = d
133
134 def update(self, d: Dict[str, Any]) -> None:
135 """
136 Overwrite and/or add new key-value pairs to the collection.
137
138 :param d: The overriding dictionary. Can contain nested dictionaries.
139 """
140 for key, value in d.items():
141 if isinstance(value, dict):
142 self._update_recursive(value, key)
143 else:
144 self.set(key, value)
145
146 def get_resolved(self, variables: Dict[str, str]) -> Dict[str, Any]:
147 """
148 Resolve a DottedDict that may potentially contain template variables like $folder.
149
150 :param variables: The variables
151
152 :returns: A copy of the underlying dictionary, but with the variables replaced
153 """
154 return sublime.expand_variables(self._d, variables)
155
156 def _update_recursive(self, current: Dict[str, Any], prefix: str) -> None:
157 if not current:
158 return self.set(prefix, current)
159 for key, value in current.items():
160 path = "{}.{}".format(prefix, key)
161 if isinstance(value, dict):
162 self._update_recursive(value, path)
163 else:
164 self.set(path, value)
165
166 def __repr__(self) -> str:
167 return "{}({})".format(self.__class__.__name__, repr(self._d))
168
169 def __eq__(self, other: Any) -> bool:
170 if not isinstance(other, DottedDict):
171 return False
172 return self._d == other._d
173
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/plugin/core/collections.py b/plugin/core/collections.py
--- a/plugin/core/collections.py
+++ b/plugin/core/collections.py
@@ -113,7 +113,9 @@
"""
return bool(self._d)
- def __contains__(self, path: str) -> bool:
+ def __contains__(self, path: object) -> bool:
+ if not isinstance(path, str):
+ return False
value = self.get(path)
return value is not None and value is not False
@@ -154,7 +156,7 @@
return sublime.expand_variables(self._d, variables)
def _update_recursive(self, current: Dict[str, Any], prefix: str) -> None:
- if not current:
+ if not current or any(filter(lambda key: isinstance(key, str) and (":" in key or "/" in key), current.keys())):
return self.set(prefix, current)
for key, value in current.items():
path = "{}.{}".format(prefix, key)
|
{"golden_diff": "diff --git a/plugin/core/collections.py b/plugin/core/collections.py\n--- a/plugin/core/collections.py\n+++ b/plugin/core/collections.py\n@@ -113,7 +113,9 @@\n \"\"\"\n return bool(self._d)\n \n- def __contains__(self, path: str) -> bool:\n+ def __contains__(self, path: object) -> bool:\n+ if not isinstance(path, str):\n+ return False\n value = self.get(path)\n return value is not None and value is not False\n \n@@ -154,7 +156,7 @@\n return sublime.expand_variables(self._d, variables)\n \n def _update_recursive(self, current: Dict[str, Any], prefix: str) -> None:\n- if not current:\n+ if not current or any(filter(lambda key: isinstance(key, str) and (\":\" in key or \"/\" in key), current.keys())):\n return self.set(prefix, current)\n for key, value in current.items():\n path = \"{}.{}\".format(prefix, key)\n", "issue": "Unwanted splitting on dot of the setting keys\n**Describe the bug**\r\n\r\nIn LSP-yaml, the user can specify schema mapping like so:\r\n\r\n```\r\nyaml.schemas: {\r\n \"https://json.schemastore.org/composer\": \"/*\"\r\n}\r\n```\r\n\r\nThis doesn't work in ST because LSP splits the URL key on every dot, creating an object that the server doesn't understand.\r\n\r\n**Expected behavior**\r\n\r\nShould send the value of the `yaml.schemas` object without altering it\r\n\r\n**Additional context**\r\n\r\nI'm not sure how VSCode manages to handle it but I would guess it is somewhat taking into consideration the schema itself. We could try to check that first.\n", "before_files": [{"content": "\"\"\"\nModule with additional collections.\n\"\"\"\nfrom .typing import Optional, Dict, Any, Generator\nfrom copy import deepcopy\nimport sublime\n\n\nclass DottedDict:\n\n __slots__ = ('_d',)\n\n def __init__(self, d: Optional[Dict[str, Any]] = None) -> None:\n \"\"\"\n Construct a DottedDict, optionally from an existing dictionary.\n\n :param d: An existing dictionary.\n \"\"\"\n self._d = {} # type: Dict[str, Any]\n if d is not None:\n self.update(d)\n\n @classmethod\n def from_base_and_override(cls, base: \"DottedDict\", override: Optional[Dict[str, Any]]) -> \"DottedDict\":\n result = DottedDict(base.copy())\n if override:\n result.update(override)\n return result\n\n def get(self, path: Optional[str] = None) -> Any:\n \"\"\"\n Get a value from the dictionary.\n\n :param path: The path, e.g. foo.bar.baz, or None.\n\n :returns: The value stored at the path, or None if it doesn't exist.\n Note that this cannot distinguish between None values and\n paths that don't exist. If the path is None, returns the\n entire dictionary.\n \"\"\"\n if path is None:\n return self._d\n current = self._d # type: Any\n keys = path.split('.')\n for key in keys:\n if isinstance(current, dict):\n current = current.get(key)\n else:\n return None\n return current\n\n def walk(self, path: str) -> Generator[Any, None, None]:\n current = self._d # type: Any\n keys = path.split('.')\n for key in keys:\n if isinstance(current, dict):\n current = current.get(key)\n yield current\n else:\n yield None\n return\n\n def set(self, path: str, value: Any) -> None:\n \"\"\"\n Set a value in the dictionary.\n\n :param path: The path, e.g. foo.bar.baz\n :param value: The value\n \"\"\"\n current = self._d\n keys = path.split('.')\n for i in range(0, len(keys) - 1):\n key = keys[i]\n next_current = current.get(key)\n if not isinstance(next_current, dict):\n next_current = {}\n current[key] = next_current\n current = next_current\n current[keys[-1]] = value\n\n def remove(self, path: str) -> None:\n \"\"\"\n Remove a key from the dictionary.\n\n :param path: The path, e.g. foo.bar.baz\n \"\"\"\n current = self._d\n keys = path.split('.')\n for i in range(0, len(keys) - 1):\n key = keys[i]\n next_current = current.get(key)\n if not isinstance(next_current, dict):\n return\n current = next_current\n current.pop(keys[-1], None)\n\n def copy(self, path: Optional[str] = None) -> Any:\n \"\"\"\n Get a copy of the value from the dictionary or copy of whole dictionary.\n\n :param path: The path, e.g. foo.bar.baz, or None.\n\n :returns: A copy of the value stored at the path, or None if it doesn't exist.\n Note that this cannot distinguish between None values and\n paths that don't exist. If the path is None, returns a copy of the\n entire dictionary.\n \"\"\"\n return deepcopy(self.get(path))\n\n def __bool__(self) -> bool:\n \"\"\"\n If this collection has at least one key-value pair, return True, else return False.\n \"\"\"\n return bool(self._d)\n\n def __contains__(self, path: str) -> bool:\n value = self.get(path)\n return value is not None and value is not False\n\n def clear(self) -> None:\n \"\"\"\n Remove all key-value pairs.\n \"\"\"\n self._d.clear()\n\n def assign(self, d: Dict[str, Any]) -> None:\n \"\"\"\n Overwrites the old stored dictionary with a fresh new dictionary.\n\n :param d: The new dictionary to store\n \"\"\"\n self._d = d\n\n def update(self, d: Dict[str, Any]) -> None:\n \"\"\"\n Overwrite and/or add new key-value pairs to the collection.\n\n :param d: The overriding dictionary. Can contain nested dictionaries.\n \"\"\"\n for key, value in d.items():\n if isinstance(value, dict):\n self._update_recursive(value, key)\n else:\n self.set(key, value)\n\n def get_resolved(self, variables: Dict[str, str]) -> Dict[str, Any]:\n \"\"\"\n Resolve a DottedDict that may potentially contain template variables like $folder.\n\n :param variables: The variables\n\n :returns: A copy of the underlying dictionary, but with the variables replaced\n \"\"\"\n return sublime.expand_variables(self._d, variables)\n\n def _update_recursive(self, current: Dict[str, Any], prefix: str) -> None:\n if not current:\n return self.set(prefix, current)\n for key, value in current.items():\n path = \"{}.{}\".format(prefix, key)\n if isinstance(value, dict):\n self._update_recursive(value, path)\n else:\n self.set(path, value)\n\n def __repr__(self) -> str:\n return \"{}({})\".format(self.__class__.__name__, repr(self._d))\n\n def __eq__(self, other: Any) -> bool:\n if not isinstance(other, DottedDict):\n return False\n return self._d == other._d\n", "path": "plugin/core/collections.py"}], "after_files": [{"content": "\"\"\"\nModule with additional collections.\n\"\"\"\nfrom .typing import Optional, Dict, Any, Generator\nfrom copy import deepcopy\nimport sublime\n\n\nclass DottedDict:\n\n __slots__ = ('_d',)\n\n def __init__(self, d: Optional[Dict[str, Any]] = None) -> None:\n \"\"\"\n Construct a DottedDict, optionally from an existing dictionary.\n\n :param d: An existing dictionary.\n \"\"\"\n self._d = {} # type: Dict[str, Any]\n if d is not None:\n self.update(d)\n\n @classmethod\n def from_base_and_override(cls, base: \"DottedDict\", override: Optional[Dict[str, Any]]) -> \"DottedDict\":\n result = DottedDict(base.copy())\n if override:\n result.update(override)\n return result\n\n def get(self, path: Optional[str] = None) -> Any:\n \"\"\"\n Get a value from the dictionary.\n\n :param path: The path, e.g. foo.bar.baz, or None.\n\n :returns: The value stored at the path, or None if it doesn't exist.\n Note that this cannot distinguish between None values and\n paths that don't exist. If the path is None, returns the\n entire dictionary.\n \"\"\"\n if path is None:\n return self._d\n current = self._d # type: Any\n keys = path.split('.')\n for key in keys:\n if isinstance(current, dict):\n current = current.get(key)\n else:\n return None\n return current\n\n def walk(self, path: str) -> Generator[Any, None, None]:\n current = self._d # type: Any\n keys = path.split('.')\n for key in keys:\n if isinstance(current, dict):\n current = current.get(key)\n yield current\n else:\n yield None\n return\n\n def set(self, path: str, value: Any) -> None:\n \"\"\"\n Set a value in the dictionary.\n\n :param path: The path, e.g. foo.bar.baz\n :param value: The value\n \"\"\"\n current = self._d\n keys = path.split('.')\n for i in range(0, len(keys) - 1):\n key = keys[i]\n next_current = current.get(key)\n if not isinstance(next_current, dict):\n next_current = {}\n current[key] = next_current\n current = next_current\n current[keys[-1]] = value\n\n def remove(self, path: str) -> None:\n \"\"\"\n Remove a key from the dictionary.\n\n :param path: The path, e.g. foo.bar.baz\n \"\"\"\n current = self._d\n keys = path.split('.')\n for i in range(0, len(keys) - 1):\n key = keys[i]\n next_current = current.get(key)\n if not isinstance(next_current, dict):\n return\n current = next_current\n current.pop(keys[-1], None)\n\n def copy(self, path: Optional[str] = None) -> Any:\n \"\"\"\n Get a copy of the value from the dictionary or copy of whole dictionary.\n\n :param path: The path, e.g. foo.bar.baz, or None.\n\n :returns: A copy of the value stored at the path, or None if it doesn't exist.\n Note that this cannot distinguish between None values and\n paths that don't exist. If the path is None, returns a copy of the\n entire dictionary.\n \"\"\"\n return deepcopy(self.get(path))\n\n def __bool__(self) -> bool:\n \"\"\"\n If this collection has at least one key-value pair, return True, else return False.\n \"\"\"\n return bool(self._d)\n\n def __contains__(self, path: object) -> bool:\n if not isinstance(path, str):\n return False\n value = self.get(path)\n return value is not None and value is not False\n\n def clear(self) -> None:\n \"\"\"\n Remove all key-value pairs.\n \"\"\"\n self._d.clear()\n\n def assign(self, d: Dict[str, Any]) -> None:\n \"\"\"\n Overwrites the old stored dictionary with a fresh new dictionary.\n\n :param d: The new dictionary to store\n \"\"\"\n self._d = d\n\n def update(self, d: Dict[str, Any]) -> None:\n \"\"\"\n Overwrite and/or add new key-value pairs to the collection.\n\n :param d: The overriding dictionary. Can contain nested dictionaries.\n \"\"\"\n for key, value in d.items():\n if isinstance(value, dict):\n self._update_recursive(value, key)\n else:\n self.set(key, value)\n\n def get_resolved(self, variables: Dict[str, str]) -> Dict[str, Any]:\n \"\"\"\n Resolve a DottedDict that may potentially contain template variables like $folder.\n\n :param variables: The variables\n\n :returns: A copy of the underlying dictionary, but with the variables replaced\n \"\"\"\n return sublime.expand_variables(self._d, variables)\n\n def _update_recursive(self, current: Dict[str, Any], prefix: str) -> None:\n if not current or any(filter(lambda key: isinstance(key, str) and (\":\" in key or \"/\" in key), current.keys())):\n return self.set(prefix, current)\n for key, value in current.items():\n path = \"{}.{}\".format(prefix, key)\n if isinstance(value, dict):\n self._update_recursive(value, path)\n else:\n self.set(path, value)\n\n def __repr__(self) -> str:\n return \"{}({})\".format(self.__class__.__name__, repr(self._d))\n\n def __eq__(self, other: Any) -> bool:\n if not isinstance(other, DottedDict):\n return False\n return self._d == other._d\n", "path": "plugin/core/collections.py"}]}
| 2,074 | 230 |
gh_patches_debug_1537
|
rasdani/github-patches
|
git_diff
|
mitmproxy__mitmproxy-2051
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
@concurrent annotation doesn't work in the OOP script method
I.E:
```
class SomeClass:
@concurrent [doesn't work, 2 args]
def request(self, flow):
pass
```
@concurrent annotation doesn't work in the OOP script method
I.E:
```
class SomeClass:
@concurrent [doesn't work, 2 args]
def request(self, flow):
pass
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `mitmproxy/script/concurrent.py`
Content:
```
1 """
2 This module provides a @concurrent decorator primitive to
3 offload computations from mitmproxy's main master thread.
4 """
5
6 from mitmproxy import eventsequence
7 from mitmproxy.types import basethread
8
9
10 class ScriptThread(basethread.BaseThread):
11 name = "ScriptThread"
12
13
14 def concurrent(fn):
15 if fn.__name__ not in eventsequence.Events - {"start", "configure", "tick"}:
16 raise NotImplementedError(
17 "Concurrent decorator not supported for '%s' method." % fn.__name__
18 )
19
20 def _concurrent(obj):
21 def run():
22 fn(obj)
23 if obj.reply.state == "taken":
24 if not obj.reply.has_message:
25 obj.reply.ack()
26 obj.reply.commit()
27 obj.reply.take()
28 ScriptThread(
29 "script.concurrent (%s)" % fn.__name__,
30 target=run
31 ).start()
32 return _concurrent
33
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/mitmproxy/script/concurrent.py b/mitmproxy/script/concurrent.py
--- a/mitmproxy/script/concurrent.py
+++ b/mitmproxy/script/concurrent.py
@@ -29,4 +29,8 @@
"script.concurrent (%s)" % fn.__name__,
target=run
).start()
- return _concurrent
+ # Support @concurrent for class-based addons
+ if "." in fn.__qualname__:
+ return staticmethod(_concurrent)
+ else:
+ return _concurrent
|
{"golden_diff": "diff --git a/mitmproxy/script/concurrent.py b/mitmproxy/script/concurrent.py\n--- a/mitmproxy/script/concurrent.py\n+++ b/mitmproxy/script/concurrent.py\n@@ -29,4 +29,8 @@\n \"script.concurrent (%s)\" % fn.__name__,\n target=run\n ).start()\n- return _concurrent\n+ # Support @concurrent for class-based addons\n+ if \".\" in fn.__qualname__:\n+ return staticmethod(_concurrent)\n+ else:\n+ return _concurrent\n", "issue": "@concurrent annotation doesn't work in the OOP script method\nI.E:\r\n\r\n```\r\nclass SomeClass:\r\n @concurrent [doesn't work, 2 args]\r\n def request(self, flow):\r\n pass\r\n\r\n```\n@concurrent annotation doesn't work in the OOP script method\nI.E:\r\n\r\n```\r\nclass SomeClass:\r\n @concurrent [doesn't work, 2 args]\r\n def request(self, flow):\r\n pass\r\n\r\n```\n", "before_files": [{"content": "\"\"\"\nThis module provides a @concurrent decorator primitive to\noffload computations from mitmproxy's main master thread.\n\"\"\"\n\nfrom mitmproxy import eventsequence\nfrom mitmproxy.types import basethread\n\n\nclass ScriptThread(basethread.BaseThread):\n name = \"ScriptThread\"\n\n\ndef concurrent(fn):\n if fn.__name__ not in eventsequence.Events - {\"start\", \"configure\", \"tick\"}:\n raise NotImplementedError(\n \"Concurrent decorator not supported for '%s' method.\" % fn.__name__\n )\n\n def _concurrent(obj):\n def run():\n fn(obj)\n if obj.reply.state == \"taken\":\n if not obj.reply.has_message:\n obj.reply.ack()\n obj.reply.commit()\n obj.reply.take()\n ScriptThread(\n \"script.concurrent (%s)\" % fn.__name__,\n target=run\n ).start()\n return _concurrent\n", "path": "mitmproxy/script/concurrent.py"}], "after_files": [{"content": "\"\"\"\nThis module provides a @concurrent decorator primitive to\noffload computations from mitmproxy's main master thread.\n\"\"\"\n\nfrom mitmproxy import eventsequence\nfrom mitmproxy.types import basethread\n\n\nclass ScriptThread(basethread.BaseThread):\n name = \"ScriptThread\"\n\n\ndef concurrent(fn):\n if fn.__name__ not in eventsequence.Events - {\"start\", \"configure\", \"tick\"}:\n raise NotImplementedError(\n \"Concurrent decorator not supported for '%s' method.\" % fn.__name__\n )\n\n def _concurrent(obj):\n def run():\n fn(obj)\n if obj.reply.state == \"taken\":\n if not obj.reply.has_message:\n obj.reply.ack()\n obj.reply.commit()\n obj.reply.take()\n ScriptThread(\n \"script.concurrent (%s)\" % fn.__name__,\n target=run\n ).start()\n # Support @concurrent for class-based addons\n if \".\" in fn.__qualname__:\n return staticmethod(_concurrent)\n else:\n return _concurrent\n", "path": "mitmproxy/script/concurrent.py"}]}
| 609 | 123 |
gh_patches_debug_24733
|
rasdani/github-patches
|
git_diff
|
translate__pootle-4034
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Make redis version check CRITICAL
If the redis version is not sufficiently recent the site will fail.
Currently there is a CRITICAL check for the presence of redis, but the version check is only a WARNING - i think it should be CRITICAL
The hint should also say what the minimum redis version is
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pootle/checks.py`
Content:
```
1 #!/usr/bin/env python
2 # -*- coding: utf-8 -*-
3 #
4 # Copyright (C) Pootle contributors.
5 #
6 # This file is a part of the Pootle project. It is distributed under the GPL3
7 # or later license. See the LICENSE file for a copy of the license and the
8 # AUTHORS file for copyright and authorship information.
9
10 from django.core import checks
11 from django.utils.translation import ugettext as _
12
13
14 # Minimum Translate Toolkit version required for Pootle to run.
15 TTK_MINIMUM_REQUIRED_VERSION = (1, 13, 0)
16
17 # Minimum Django version required for Pootle to run.
18 DJANGO_MINIMUM_REQUIRED_VERSION = (1, 7, 10)
19
20 # Minimum lxml version required for Pootle to run.
21 LXML_MINIMUM_REQUIRED_VERSION = (2, 2, 2, 0)
22
23 # Minimum Redis server version required.
24 # Initially set to some minimums based on:
25 # 1. Ubuntu 12.04LTS's version 2.8.4 (10.04LTS was too old for RQ)
26 # 2. RQ requires >= 2.6.0, and
27 # 3. Wanting to insist on at least the latest stable that devs are using i.e.
28 # 2.8.* versions of Redis
29 REDIS_MINIMUM_REQUIRED_VERSION = (2, 8, 4)
30
31
32 # XXX List of manage.py commands not to run the rqworker check on.
33 # Maybe tagging can improve this?
34 RQWORKER_WHITELIST = [
35 "start", "initdb", "revision", "sync_stores", "run_cherrypy",
36 "refresh_stats", "update_stores", "calculate_checks", "retry_failed_jobs",
37 "check", "runserver",
38 ]
39
40
41 def _version_to_string(version, significance=None):
42 if significance is not None:
43 version = version[significance:]
44 return '.'.join(str(n) for n in version)
45
46
47 @checks.register()
48 def check_library_versions(app_configs=None, **kwargs):
49 from django import VERSION as django_version
50 from lxml.etree import LXML_VERSION as lxml_version
51 from translate.__version__ import ver as ttk_version
52
53 errors = []
54
55 if django_version < DJANGO_MINIMUM_REQUIRED_VERSION:
56 errors.append(checks.Critical(
57 _("Your version of Django is too old."),
58 hint=_("Try pip install --upgrade 'Django==%s'" %
59 _version_to_string(DJANGO_MINIMUM_REQUIRED_VERSION)),
60 id="pootle.C002",
61 ))
62
63 if lxml_version < LXML_MINIMUM_REQUIRED_VERSION:
64 errors.append(checks.Warning(
65 _("Your version of lxml is too old."),
66 hint=_("Try pip install --upgrade lxml"),
67 id="pootle.W003",
68 ))
69
70 if ttk_version < TTK_MINIMUM_REQUIRED_VERSION:
71 errors.append(checks.Critical(
72 _("Your version of Translate Toolkit is too old."),
73 hint=_("Try pip install --upgrade translate-toolkit"),
74 id="pootle.C003",
75 ))
76
77 return errors
78
79
80 @checks.register()
81 def check_redis(app_configs=None, **kwargs):
82 from django_rq.queues import get_queue
83 from django_rq.workers import Worker
84
85 errors = []
86
87 try:
88 queue = get_queue()
89 workers = Worker.all(queue.connection)
90 except Exception as e:
91 conn_settings = queue.connection.connection_pool.connection_kwargs
92 errors.append(checks.Critical(_("Could not connect to Redis (%s)") % (e),
93 hint=_("Make sure Redis is running on %(host)s:%(port)s") % (conn_settings),
94 id="pootle.C001",
95 ))
96 else:
97 if len(queue.connection.smembers(Worker.redis_workers_keys)) == 0:
98 # We need to check we're not running manage.py rqworker right now..
99 import sys
100 if len(sys.argv) > 1 and sys.argv[1] in RQWORKER_WHITELIST:
101 errors.append(checks.Warning(
102 _("No RQ Worker running."),
103 hint=_("Run new workers with manage.py rqworker"),
104 id="pootle.W001",
105 ))
106
107 redis_version = queue.connection.info()["redis_version"].split(".")
108 if tuple(int(x) for x in redis_version) < REDIS_MINIMUM_REQUIRED_VERSION:
109 errors.append(checks.Warning(
110 _("Your version of Redis is too old."),
111 hint=_("Update your system's Redis server package"),
112 id="pootle.W002",
113 ))
114
115 return errors
116
117
118 @checks.register()
119 def check_settings(app_configs=None, **kwargs):
120 from django.conf import settings
121
122 errors = []
123
124 if "RedisCache" not in settings.CACHES.get("default", {}).get("BACKEND"):
125 errors.append(checks.Critical(
126 _("Cache backend is not set to Redis."),
127 hint=_("Set default cache backend to django_redis.cache.RedisCache\n"
128 "Current settings: %r") % (settings.CACHES.get("default")),
129 id="pootle.C005",
130 ))
131 else:
132 from django_redis import get_redis_connection
133
134 if not get_redis_connection():
135 errors.append(checks.Critical(
136 _("Could not initiate a Redis cache connection"),
137 hint=_("Double-check your CACHES settings"),
138 id="pootle.C004",
139 ))
140
141 if settings.DEBUG:
142 errors.append(checks.Warning(
143 _("DEBUG mode is on. Do not do this in production!"),
144 hint=_("Set DEBUG = False in Pootle settings"),
145 id="pootle.W005"
146 ))
147 elif "sqlite" in settings.DATABASES.get("default", {}).get("ENGINE"):
148 # We don't bother warning about sqlite in DEBUG mode.
149 errors.append(checks.Warning(
150 _("The sqlite database backend is unsupported"),
151 hint=_("Set your default database engine to postgresql_psycopg2 or mysql"),
152 id="pootle.W006",
153 ))
154
155 if settings.SESSION_ENGINE.split(".")[-1] not in ("cache", "cached_db"):
156 errors.append(checks.Warning(
157 _("Not using cached_db as session engine"),
158 hint=_("Set SESSION_ENGINE to django.contrib.sessions.backend.cached_db\n"
159 "Current settings: %r") % (settings.SESSION_ENGINE),
160 id="pootle.W007",
161 ))
162
163 if not settings.POOTLE_CONTACT_EMAIL and settings.POOTLE_CONTACT_ENABLED:
164 errors.append(checks.Warning(
165 _("POOTLE_CONTACT_EMAIL is not set."),
166 hint=_("Set POOTLE_CONTACT_EMAIL to allow users to contact "
167 "administrators through the Pootle contact form."),
168 id="pootle.W008",
169 ))
170
171 if settings.POOTLE_CONTACT_EMAIL in ("info@YOUR_DOMAIN.com") \
172 and settings.POOTLE_CONTACT_ENABLED:
173 errors.append(checks.Warning(
174 _("POOTLE_CONTACT_EMAIL is using the following default "
175 "setting %r." % settings.POOTLE_CONTACT_EMAIL),
176 hint=_("POOTLE_CONTACT_EMAIL is the address that will receive "
177 "messages sent by the contact form."),
178 id="pootle.W011",
179 ))
180
181 if not settings.DEFAULT_FROM_EMAIL:
182 errors.append(checks.Warning(
183 _("DEFAULT_FROM_EMAIL is not set."),
184 hint=_("DEFAULT_FROM_EMAIL is used in all outgoing Pootle email.\n"
185 "Don't forget to review your mail server settings."),
186 id="pootle.W009",
187 ))
188
189 if settings.DEFAULT_FROM_EMAIL in ("info@YOUR_DOMAIN.com",
190 "webmaster@localhost"):
191 errors.append(checks.Warning(
192 _("DEFAULT_FROM_EMAIL is using the following default "
193 "setting %r." % settings.DEFAULT_FROM_EMAIL),
194 hint=_("DEFAULT_FROM_EMAIL is used in all outgoing Pootle email.\n"
195 "Don't forget to review your mail server settings."),
196 id="pootle.W010",
197 ))
198
199 try:
200 markup_filter, markup_kwargs = settings.POOTLE_MARKUP_FILTER
201 except AttributeError:
202 errors.append(checks.Warning(
203 _("POOTLE_MARKUP_FILTER is missing."),
204 hint=_("Set POOTLE_MARKUP_FILTER."),
205 id="pootle.W012",
206 ))
207 except (IndexError, TypeError, ValueError):
208 errors.append(checks.Warning(
209 _("Invalid value in POOTLE_MARKUP_FILTER."),
210 hint=_("Set a valid value for POOTLE_MARKUP_FILTER."),
211 id="pootle.W013",
212 ))
213 else:
214 if markup_filter is not None:
215 try:
216 if markup_filter == 'textile':
217 import textile
218 elif markup_filter == 'markdown':
219 import markdown
220 elif markup_filter == 'restructuredtext':
221 import docutils
222 else:
223 errors.append(checks.Warning(
224 _("Invalid markup in POOTLE_MARKUP_FILTER."),
225 hint=_("Set a valid markup for POOTLE_MARKUP_FILTER."),
226 id="pootle.W014",
227 ))
228 except ImportError:
229 errors.append(checks.Warning(
230 _("POOTLE_MARKUP_FILTER is set to '%s' markup, but the "
231 "package that provides can't be found." % markup_filter),
232 hint=_("Install the package or change "
233 "POOTLE_MARKUP_FILTER."),
234 id="pootle.W015",
235 ))
236
237 return errors
238
239
240 @checks.register()
241 def check_users(app_configs=None, **kwargs):
242 from django.contrib.auth import get_user_model
243 from django.db import ProgrammingError
244 from django.db.utils import OperationalError
245
246 errors = []
247
248 User = get_user_model()
249 try:
250 admin_user = User.objects.get(username='admin')
251 except (User.DoesNotExist, OperationalError, ProgrammingError):
252 pass
253 else:
254 if admin_user.check_password('admin'):
255 errors.append(checks.Warning(
256 _("The default 'admin' user still has a password set to "
257 "'admin'."),
258 hint=_("Remove the 'admin' user or change its password."),
259 id="pootle.W016",
260 ))
261
262 return errors
263
264
265 @checks.register()
266 def check_db_transaction_on_commit(app_configs=None, **kwargs):
267 from django.db import connection
268 errors = []
269 try:
270 on_commit = connection.on_commit
271 except AttributeError:
272 errors.append(checks.Critical(
273 _("Database connection does not implement on_commit."),
274 hint=_("Set the DATABASES['default']['ENGINE'] to use a backend "
275 "from transaction_hooks.backends."),
276 id="pootle.C006",
277 ))
278 return errors
279
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/pootle/checks.py b/pootle/checks.py
--- a/pootle/checks.py
+++ b/pootle/checks.py
@@ -94,6 +94,17 @@
id="pootle.C001",
))
else:
+ redis_version = tuple(int(x) for x
+ in (queue.connection
+ .info()["redis_version"].split(".")))
+ if redis_version < REDIS_MINIMUM_REQUIRED_VERSION:
+ errors.append(checks.Critical(
+ _("Your version of Redis is too old."),
+ hint=_("Update your system's Redis server package to at least "
+ "version %s" % str(REDIS_MINIMUM_REQUIRED_VERSION)),
+ id="pootle.C007",
+ ))
+
if len(queue.connection.smembers(Worker.redis_workers_keys)) == 0:
# We need to check we're not running manage.py rqworker right now..
import sys
@@ -104,14 +115,6 @@
id="pootle.W001",
))
- redis_version = queue.connection.info()["redis_version"].split(".")
- if tuple(int(x) for x in redis_version) < REDIS_MINIMUM_REQUIRED_VERSION:
- errors.append(checks.Warning(
- _("Your version of Redis is too old."),
- hint=_("Update your system's Redis server package"),
- id="pootle.W002",
- ))
-
return errors
|
{"golden_diff": "diff --git a/pootle/checks.py b/pootle/checks.py\n--- a/pootle/checks.py\n+++ b/pootle/checks.py\n@@ -94,6 +94,17 @@\n id=\"pootle.C001\",\n ))\n else:\n+ redis_version = tuple(int(x) for x\n+ in (queue.connection\n+ .info()[\"redis_version\"].split(\".\")))\n+ if redis_version < REDIS_MINIMUM_REQUIRED_VERSION:\n+ errors.append(checks.Critical(\n+ _(\"Your version of Redis is too old.\"),\n+ hint=_(\"Update your system's Redis server package to at least \"\n+ \"version %s\" % str(REDIS_MINIMUM_REQUIRED_VERSION)),\n+ id=\"pootle.C007\",\n+ ))\n+\n if len(queue.connection.smembers(Worker.redis_workers_keys)) == 0:\n # We need to check we're not running manage.py rqworker right now..\n import sys\n@@ -104,14 +115,6 @@\n id=\"pootle.W001\",\n ))\n \n- redis_version = queue.connection.info()[\"redis_version\"].split(\".\")\n- if tuple(int(x) for x in redis_version) < REDIS_MINIMUM_REQUIRED_VERSION:\n- errors.append(checks.Warning(\n- _(\"Your version of Redis is too old.\"),\n- hint=_(\"Update your system's Redis server package\"),\n- id=\"pootle.W002\",\n- ))\n-\n return errors\n", "issue": "Make redis version check CRITICAL\nIf the redis version is not sufficiently recent the site will fail.\n\nCurrently there is a CRITICAL check for the presence of redis, but the version check is only a WARNING - i think it should be CRITICAL\n\nThe hint should also say what the minimum redis version is\n\n", "before_files": [{"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n#\n# Copyright (C) Pootle contributors.\n#\n# This file is a part of the Pootle project. It is distributed under the GPL3\n# or later license. See the LICENSE file for a copy of the license and the\n# AUTHORS file for copyright and authorship information.\n\nfrom django.core import checks\nfrom django.utils.translation import ugettext as _\n\n\n# Minimum Translate Toolkit version required for Pootle to run.\nTTK_MINIMUM_REQUIRED_VERSION = (1, 13, 0)\n\n# Minimum Django version required for Pootle to run.\nDJANGO_MINIMUM_REQUIRED_VERSION = (1, 7, 10)\n\n# Minimum lxml version required for Pootle to run.\nLXML_MINIMUM_REQUIRED_VERSION = (2, 2, 2, 0)\n\n# Minimum Redis server version required.\n# Initially set to some minimums based on:\n# 1. Ubuntu 12.04LTS's version 2.8.4 (10.04LTS was too old for RQ)\n# 2. RQ requires >= 2.6.0, and\n# 3. Wanting to insist on at least the latest stable that devs are using i.e.\n# 2.8.* versions of Redis\nREDIS_MINIMUM_REQUIRED_VERSION = (2, 8, 4)\n\n\n# XXX List of manage.py commands not to run the rqworker check on.\n# Maybe tagging can improve this?\nRQWORKER_WHITELIST = [\n \"start\", \"initdb\", \"revision\", \"sync_stores\", \"run_cherrypy\",\n \"refresh_stats\", \"update_stores\", \"calculate_checks\", \"retry_failed_jobs\",\n \"check\", \"runserver\",\n]\n\n\ndef _version_to_string(version, significance=None):\n if significance is not None:\n version = version[significance:]\n return '.'.join(str(n) for n in version)\n\n\n@checks.register()\ndef check_library_versions(app_configs=None, **kwargs):\n from django import VERSION as django_version\n from lxml.etree import LXML_VERSION as lxml_version\n from translate.__version__ import ver as ttk_version\n\n errors = []\n\n if django_version < DJANGO_MINIMUM_REQUIRED_VERSION:\n errors.append(checks.Critical(\n _(\"Your version of Django is too old.\"),\n hint=_(\"Try pip install --upgrade 'Django==%s'\" %\n _version_to_string(DJANGO_MINIMUM_REQUIRED_VERSION)),\n id=\"pootle.C002\",\n ))\n\n if lxml_version < LXML_MINIMUM_REQUIRED_VERSION:\n errors.append(checks.Warning(\n _(\"Your version of lxml is too old.\"),\n hint=_(\"Try pip install --upgrade lxml\"),\n id=\"pootle.W003\",\n ))\n\n if ttk_version < TTK_MINIMUM_REQUIRED_VERSION:\n errors.append(checks.Critical(\n _(\"Your version of Translate Toolkit is too old.\"),\n hint=_(\"Try pip install --upgrade translate-toolkit\"),\n id=\"pootle.C003\",\n ))\n\n return errors\n\n\n@checks.register()\ndef check_redis(app_configs=None, **kwargs):\n from django_rq.queues import get_queue\n from django_rq.workers import Worker\n\n errors = []\n\n try:\n queue = get_queue()\n workers = Worker.all(queue.connection)\n except Exception as e:\n conn_settings = queue.connection.connection_pool.connection_kwargs\n errors.append(checks.Critical(_(\"Could not connect to Redis (%s)\") % (e),\n hint=_(\"Make sure Redis is running on %(host)s:%(port)s\") % (conn_settings),\n id=\"pootle.C001\",\n ))\n else:\n if len(queue.connection.smembers(Worker.redis_workers_keys)) == 0:\n # We need to check we're not running manage.py rqworker right now..\n import sys\n if len(sys.argv) > 1 and sys.argv[1] in RQWORKER_WHITELIST:\n errors.append(checks.Warning(\n _(\"No RQ Worker running.\"),\n hint=_(\"Run new workers with manage.py rqworker\"),\n id=\"pootle.W001\",\n ))\n\n redis_version = queue.connection.info()[\"redis_version\"].split(\".\")\n if tuple(int(x) for x in redis_version) < REDIS_MINIMUM_REQUIRED_VERSION:\n errors.append(checks.Warning(\n _(\"Your version of Redis is too old.\"),\n hint=_(\"Update your system's Redis server package\"),\n id=\"pootle.W002\",\n ))\n\n return errors\n\n\n@checks.register()\ndef check_settings(app_configs=None, **kwargs):\n from django.conf import settings\n\n errors = []\n\n if \"RedisCache\" not in settings.CACHES.get(\"default\", {}).get(\"BACKEND\"):\n errors.append(checks.Critical(\n _(\"Cache backend is not set to Redis.\"),\n hint=_(\"Set default cache backend to django_redis.cache.RedisCache\\n\"\n \"Current settings: %r\") % (settings.CACHES.get(\"default\")),\n id=\"pootle.C005\",\n ))\n else:\n from django_redis import get_redis_connection\n\n if not get_redis_connection():\n errors.append(checks.Critical(\n _(\"Could not initiate a Redis cache connection\"),\n hint=_(\"Double-check your CACHES settings\"),\n id=\"pootle.C004\",\n ))\n\n if settings.DEBUG:\n errors.append(checks.Warning(\n _(\"DEBUG mode is on. Do not do this in production!\"),\n hint=_(\"Set DEBUG = False in Pootle settings\"),\n id=\"pootle.W005\"\n ))\n elif \"sqlite\" in settings.DATABASES.get(\"default\", {}).get(\"ENGINE\"):\n # We don't bother warning about sqlite in DEBUG mode.\n errors.append(checks.Warning(\n _(\"The sqlite database backend is unsupported\"),\n hint=_(\"Set your default database engine to postgresql_psycopg2 or mysql\"),\n id=\"pootle.W006\",\n ))\n\n if settings.SESSION_ENGINE.split(\".\")[-1] not in (\"cache\", \"cached_db\"):\n errors.append(checks.Warning(\n _(\"Not using cached_db as session engine\"),\n hint=_(\"Set SESSION_ENGINE to django.contrib.sessions.backend.cached_db\\n\"\n \"Current settings: %r\") % (settings.SESSION_ENGINE),\n id=\"pootle.W007\",\n ))\n\n if not settings.POOTLE_CONTACT_EMAIL and settings.POOTLE_CONTACT_ENABLED:\n errors.append(checks.Warning(\n _(\"POOTLE_CONTACT_EMAIL is not set.\"),\n hint=_(\"Set POOTLE_CONTACT_EMAIL to allow users to contact \"\n \"administrators through the Pootle contact form.\"),\n id=\"pootle.W008\",\n ))\n\n if settings.POOTLE_CONTACT_EMAIL in (\"info@YOUR_DOMAIN.com\") \\\n and settings.POOTLE_CONTACT_ENABLED:\n errors.append(checks.Warning(\n _(\"POOTLE_CONTACT_EMAIL is using the following default \"\n \"setting %r.\" % settings.POOTLE_CONTACT_EMAIL),\n hint=_(\"POOTLE_CONTACT_EMAIL is the address that will receive \"\n \"messages sent by the contact form.\"),\n id=\"pootle.W011\",\n ))\n\n if not settings.DEFAULT_FROM_EMAIL:\n errors.append(checks.Warning(\n _(\"DEFAULT_FROM_EMAIL is not set.\"),\n hint=_(\"DEFAULT_FROM_EMAIL is used in all outgoing Pootle email.\\n\"\n \"Don't forget to review your mail server settings.\"),\n id=\"pootle.W009\",\n ))\n\n if settings.DEFAULT_FROM_EMAIL in (\"info@YOUR_DOMAIN.com\",\n \"webmaster@localhost\"):\n errors.append(checks.Warning(\n _(\"DEFAULT_FROM_EMAIL is using the following default \"\n \"setting %r.\" % settings.DEFAULT_FROM_EMAIL),\n hint=_(\"DEFAULT_FROM_EMAIL is used in all outgoing Pootle email.\\n\"\n \"Don't forget to review your mail server settings.\"),\n id=\"pootle.W010\",\n ))\n\n try:\n markup_filter, markup_kwargs = settings.POOTLE_MARKUP_FILTER\n except AttributeError:\n errors.append(checks.Warning(\n _(\"POOTLE_MARKUP_FILTER is missing.\"),\n hint=_(\"Set POOTLE_MARKUP_FILTER.\"),\n id=\"pootle.W012\",\n ))\n except (IndexError, TypeError, ValueError):\n errors.append(checks.Warning(\n _(\"Invalid value in POOTLE_MARKUP_FILTER.\"),\n hint=_(\"Set a valid value for POOTLE_MARKUP_FILTER.\"),\n id=\"pootle.W013\",\n ))\n else:\n if markup_filter is not None:\n try:\n if markup_filter == 'textile':\n import textile\n elif markup_filter == 'markdown':\n import markdown\n elif markup_filter == 'restructuredtext':\n import docutils\n else:\n errors.append(checks.Warning(\n _(\"Invalid markup in POOTLE_MARKUP_FILTER.\"),\n hint=_(\"Set a valid markup for POOTLE_MARKUP_FILTER.\"),\n id=\"pootle.W014\",\n ))\n except ImportError:\n errors.append(checks.Warning(\n _(\"POOTLE_MARKUP_FILTER is set to '%s' markup, but the \"\n \"package that provides can't be found.\" % markup_filter),\n hint=_(\"Install the package or change \"\n \"POOTLE_MARKUP_FILTER.\"),\n id=\"pootle.W015\",\n ))\n\n return errors\n\n\n@checks.register()\ndef check_users(app_configs=None, **kwargs):\n from django.contrib.auth import get_user_model\n from django.db import ProgrammingError\n from django.db.utils import OperationalError\n\n errors = []\n\n User = get_user_model()\n try:\n admin_user = User.objects.get(username='admin')\n except (User.DoesNotExist, OperationalError, ProgrammingError):\n pass\n else:\n if admin_user.check_password('admin'):\n errors.append(checks.Warning(\n _(\"The default 'admin' user still has a password set to \"\n \"'admin'.\"),\n hint=_(\"Remove the 'admin' user or change its password.\"),\n id=\"pootle.W016\",\n ))\n\n return errors\n\n\n@checks.register()\ndef check_db_transaction_on_commit(app_configs=None, **kwargs):\n from django.db import connection\n errors = []\n try:\n on_commit = connection.on_commit\n except AttributeError:\n errors.append(checks.Critical(\n _(\"Database connection does not implement on_commit.\"),\n hint=_(\"Set the DATABASES['default']['ENGINE'] to use a backend \"\n \"from transaction_hooks.backends.\"),\n id=\"pootle.C006\",\n ))\n return errors\n", "path": "pootle/checks.py"}], "after_files": [{"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n#\n# Copyright (C) Pootle contributors.\n#\n# This file is a part of the Pootle project. It is distributed under the GPL3\n# or later license. See the LICENSE file for a copy of the license and the\n# AUTHORS file for copyright and authorship information.\n\nfrom django.core import checks\nfrom django.utils.translation import ugettext as _\n\n\n# Minimum Translate Toolkit version required for Pootle to run.\nTTK_MINIMUM_REQUIRED_VERSION = (1, 13, 0)\n\n# Minimum Django version required for Pootle to run.\nDJANGO_MINIMUM_REQUIRED_VERSION = (1, 7, 10)\n\n# Minimum lxml version required for Pootle to run.\nLXML_MINIMUM_REQUIRED_VERSION = (2, 2, 2, 0)\n\n# Minimum Redis server version required.\n# Initially set to some minimums based on:\n# 1. Ubuntu 12.04LTS's version 2.8.4 (10.04LTS was too old for RQ)\n# 2. RQ requires >= 2.6.0, and\n# 3. Wanting to insist on at least the latest stable that devs are using i.e.\n# 2.8.* versions of Redis\nREDIS_MINIMUM_REQUIRED_VERSION = (2, 8, 4)\n\n\n# XXX List of manage.py commands not to run the rqworker check on.\n# Maybe tagging can improve this?\nRQWORKER_WHITELIST = [\n \"start\", \"initdb\", \"revision\", \"sync_stores\", \"run_cherrypy\",\n \"refresh_stats\", \"update_stores\", \"calculate_checks\", \"retry_failed_jobs\",\n \"check\", \"runserver\",\n]\n\n\ndef _version_to_string(version, significance=None):\n if significance is not None:\n version = version[significance:]\n return '.'.join(str(n) for n in version)\n\n\n@checks.register()\ndef check_library_versions(app_configs=None, **kwargs):\n from django import VERSION as django_version\n from lxml.etree import LXML_VERSION as lxml_version\n from translate.__version__ import ver as ttk_version\n\n errors = []\n\n if django_version < DJANGO_MINIMUM_REQUIRED_VERSION:\n errors.append(checks.Critical(\n _(\"Your version of Django is too old.\"),\n hint=_(\"Try pip install --upgrade 'Django==%s'\" %\n _version_to_string(DJANGO_MINIMUM_REQUIRED_VERSION)),\n id=\"pootle.C002\",\n ))\n\n if lxml_version < LXML_MINIMUM_REQUIRED_VERSION:\n errors.append(checks.Warning(\n _(\"Your version of lxml is too old.\"),\n hint=_(\"Try pip install --upgrade lxml\"),\n id=\"pootle.W003\",\n ))\n\n if ttk_version < TTK_MINIMUM_REQUIRED_VERSION:\n errors.append(checks.Critical(\n _(\"Your version of Translate Toolkit is too old.\"),\n hint=_(\"Try pip install --upgrade translate-toolkit\"),\n id=\"pootle.C003\",\n ))\n\n return errors\n\n\n@checks.register()\ndef check_redis(app_configs=None, **kwargs):\n from django_rq.queues import get_queue\n from django_rq.workers import Worker\n\n errors = []\n\n try:\n queue = get_queue()\n workers = Worker.all(queue.connection)\n except Exception as e:\n conn_settings = queue.connection.connection_pool.connection_kwargs\n errors.append(checks.Critical(_(\"Could not connect to Redis (%s)\") % (e),\n hint=_(\"Make sure Redis is running on %(host)s:%(port)s\") % (conn_settings),\n id=\"pootle.C001\",\n ))\n else:\n redis_version = tuple(int(x) for x\n in (queue.connection\n .info()[\"redis_version\"].split(\".\")))\n if redis_version < REDIS_MINIMUM_REQUIRED_VERSION:\n errors.append(checks.Critical(\n _(\"Your version of Redis is too old.\"),\n hint=_(\"Update your system's Redis server package to at least \"\n \"version %s\" % str(REDIS_MINIMUM_REQUIRED_VERSION)),\n id=\"pootle.C007\",\n ))\n\n if len(queue.connection.smembers(Worker.redis_workers_keys)) == 0:\n # We need to check we're not running manage.py rqworker right now..\n import sys\n if len(sys.argv) > 1 and sys.argv[1] in RQWORKER_WHITELIST:\n errors.append(checks.Warning(\n _(\"No RQ Worker running.\"),\n hint=_(\"Run new workers with manage.py rqworker\"),\n id=\"pootle.W001\",\n ))\n\n return errors\n\n\n@checks.register()\ndef check_settings(app_configs=None, **kwargs):\n from django.conf import settings\n\n errors = []\n\n if \"RedisCache\" not in settings.CACHES.get(\"default\", {}).get(\"BACKEND\"):\n errors.append(checks.Critical(\n _(\"Cache backend is not set to Redis.\"),\n hint=_(\"Set default cache backend to django_redis.cache.RedisCache\\n\"\n \"Current settings: %r\") % (settings.CACHES.get(\"default\")),\n id=\"pootle.C005\",\n ))\n else:\n from django_redis import get_redis_connection\n\n if not get_redis_connection():\n errors.append(checks.Critical(\n _(\"Could not initiate a Redis cache connection\"),\n hint=_(\"Double-check your CACHES settings\"),\n id=\"pootle.C004\",\n ))\n\n if settings.DEBUG:\n errors.append(checks.Warning(\n _(\"DEBUG mode is on. Do not do this in production!\"),\n hint=_(\"Set DEBUG = False in Pootle settings\"),\n id=\"pootle.W005\"\n ))\n elif \"sqlite\" in settings.DATABASES.get(\"default\", {}).get(\"ENGINE\"):\n # We don't bother warning about sqlite in DEBUG mode.\n errors.append(checks.Warning(\n _(\"The sqlite database backend is unsupported\"),\n hint=_(\"Set your default database engine to postgresql_psycopg2 or mysql\"),\n id=\"pootle.W006\",\n ))\n\n if settings.SESSION_ENGINE.split(\".\")[-1] not in (\"cache\", \"cached_db\"):\n errors.append(checks.Warning(\n _(\"Not using cached_db as session engine\"),\n hint=_(\"Set SESSION_ENGINE to django.contrib.sessions.backend.cached_db\\n\"\n \"Current settings: %r\") % (settings.SESSION_ENGINE),\n id=\"pootle.W007\",\n ))\n\n if not settings.POOTLE_CONTACT_EMAIL and settings.POOTLE_CONTACT_ENABLED:\n errors.append(checks.Warning(\n _(\"POOTLE_CONTACT_EMAIL is not set.\"),\n hint=_(\"Set POOTLE_CONTACT_EMAIL to allow users to contact \"\n \"administrators through the Pootle contact form.\"),\n id=\"pootle.W008\",\n ))\n\n if settings.POOTLE_CONTACT_EMAIL in (\"info@YOUR_DOMAIN.com\") \\\n and settings.POOTLE_CONTACT_ENABLED:\n errors.append(checks.Warning(\n _(\"POOTLE_CONTACT_EMAIL is using the following default \"\n \"setting %r.\" % settings.POOTLE_CONTACT_EMAIL),\n hint=_(\"POOTLE_CONTACT_EMAIL is the address that will receive \"\n \"messages sent by the contact form.\"),\n id=\"pootle.W011\",\n ))\n\n if not settings.DEFAULT_FROM_EMAIL:\n errors.append(checks.Warning(\n _(\"DEFAULT_FROM_EMAIL is not set.\"),\n hint=_(\"DEFAULT_FROM_EMAIL is used in all outgoing Pootle email.\\n\"\n \"Don't forget to review your mail server settings.\"),\n id=\"pootle.W009\",\n ))\n\n if settings.DEFAULT_FROM_EMAIL in (\"info@YOUR_DOMAIN.com\",\n \"webmaster@localhost\"):\n errors.append(checks.Warning(\n _(\"DEFAULT_FROM_EMAIL is using the following default \"\n \"setting %r.\" % settings.DEFAULT_FROM_EMAIL),\n hint=_(\"DEFAULT_FROM_EMAIL is used in all outgoing Pootle email.\\n\"\n \"Don't forget to review your mail server settings.\"),\n id=\"pootle.W010\",\n ))\n\n try:\n markup_filter, markup_kwargs = settings.POOTLE_MARKUP_FILTER\n except AttributeError:\n errors.append(checks.Warning(\n _(\"POOTLE_MARKUP_FILTER is missing.\"),\n hint=_(\"Set POOTLE_MARKUP_FILTER.\"),\n id=\"pootle.W012\",\n ))\n except (IndexError, TypeError, ValueError):\n errors.append(checks.Warning(\n _(\"Invalid value in POOTLE_MARKUP_FILTER.\"),\n hint=_(\"Set a valid value for POOTLE_MARKUP_FILTER.\"),\n id=\"pootle.W013\",\n ))\n else:\n if markup_filter is not None:\n try:\n if markup_filter == 'textile':\n import textile\n elif markup_filter == 'markdown':\n import markdown\n elif markup_filter == 'restructuredtext':\n import docutils\n else:\n errors.append(checks.Warning(\n _(\"Invalid markup in POOTLE_MARKUP_FILTER.\"),\n hint=_(\"Set a valid markup for POOTLE_MARKUP_FILTER.\"),\n id=\"pootle.W014\",\n ))\n except ImportError:\n errors.append(checks.Warning(\n _(\"POOTLE_MARKUP_FILTER is set to '%s' markup, but the \"\n \"package that provides can't be found.\" % markup_filter),\n hint=_(\"Install the package or change \"\n \"POOTLE_MARKUP_FILTER.\"),\n id=\"pootle.W015\",\n ))\n\n return errors\n\n\n@checks.register()\ndef check_users(app_configs=None, **kwargs):\n from django.contrib.auth import get_user_model\n from django.db import ProgrammingError\n from django.db.utils import OperationalError\n\n errors = []\n\n User = get_user_model()\n try:\n admin_user = User.objects.get(username='admin')\n except (User.DoesNotExist, OperationalError, ProgrammingError):\n pass\n else:\n if admin_user.check_password('admin'):\n errors.append(checks.Warning(\n _(\"The default 'admin' user still has a password set to \"\n \"'admin'.\"),\n hint=_(\"Remove the 'admin' user or change its password.\"),\n id=\"pootle.W016\",\n ))\n\n return errors\n\n\n@checks.register()\ndef check_db_transaction_on_commit(app_configs=None, **kwargs):\n from django.db import connection\n errors = []\n try:\n on_commit = connection.on_commit\n except AttributeError:\n errors.append(checks.Critical(\n _(\"Database connection does not implement on_commit.\"),\n hint=_(\"Set the DATABASES['default']['ENGINE'] to use a backend \"\n \"from transaction_hooks.backends.\"),\n id=\"pootle.C006\",\n ))\n return errors\n", "path": "pootle/checks.py"}]}
| 3,336 | 329 |
gh_patches_debug_24986
|
rasdani/github-patches
|
git_diff
|
pytorch__examples-832
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
two lines of code in mnist/main.py
There are two arguments called batch_size and test_batch_size:
`parser.add_argument('--batch-size', type=int, default=64, metavar='N',
help='input batch size for training (default: 64)')`
`parser.add_argument('--test-batch-size', type=int, default=1000, metavar='N',
help='input batch size for testing (default: 1000)')`
but batch_size is used here:
`test_loader = torch.utils.data.DataLoader(
datasets.MNIST('../data', train=False, transform=transforms.Compose([
transforms.ToTensor(),
transforms.Normalize((0.1307,), (0.3081,))
])),
batch_size=args.batch_size, shuffle=True, **kwargs)`
Also, what does this line(line 105) do:
`test_loss = test_loss`
and it seems that `epoch` is not used in test().
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `mnist/main.py`
Content:
```
1 from __future__ import print_function
2 import argparse
3 import torch
4 import torch.nn as nn
5 import torch.nn.functional as F
6 import torch.optim as optim
7 from torchvision import datasets, transforms
8 from torch.optim.lr_scheduler import StepLR
9
10
11 class Net(nn.Module):
12 def __init__(self):
13 super(Net, self).__init__()
14 self.conv1 = nn.Conv2d(1, 32, 3, 1)
15 self.conv2 = nn.Conv2d(32, 64, 3, 1)
16 self.dropout1 = nn.Dropout2d(0.25)
17 self.dropout2 = nn.Dropout2d(0.5)
18 self.fc1 = nn.Linear(9216, 128)
19 self.fc2 = nn.Linear(128, 10)
20
21 def forward(self, x):
22 x = self.conv1(x)
23 x = F.relu(x)
24 x = self.conv2(x)
25 x = F.relu(x)
26 x = F.max_pool2d(x, 2)
27 x = self.dropout1(x)
28 x = torch.flatten(x, 1)
29 x = self.fc1(x)
30 x = F.relu(x)
31 x = self.dropout2(x)
32 x = self.fc2(x)
33 output = F.log_softmax(x, dim=1)
34 return output
35
36
37 def train(args, model, device, train_loader, optimizer, epoch):
38 model.train()
39 for batch_idx, (data, target) in enumerate(train_loader):
40 data, target = data.to(device), target.to(device)
41 optimizer.zero_grad()
42 output = model(data)
43 loss = F.nll_loss(output, target)
44 loss.backward()
45 optimizer.step()
46 if batch_idx % args.log_interval == 0:
47 print('Train Epoch: {} [{}/{} ({:.0f}%)]\tLoss: {:.6f}'.format(
48 epoch, batch_idx * len(data), len(train_loader.dataset),
49 100. * batch_idx / len(train_loader), loss.item()))
50 if args.dry_run:
51 break
52
53
54 def test(model, device, test_loader):
55 model.eval()
56 test_loss = 0
57 correct = 0
58 with torch.no_grad():
59 for data, target in test_loader:
60 data, target = data.to(device), target.to(device)
61 output = model(data)
62 test_loss += F.nll_loss(output, target, reduction='sum').item() # sum up batch loss
63 pred = output.argmax(dim=1, keepdim=True) # get the index of the max log-probability
64 correct += pred.eq(target.view_as(pred)).sum().item()
65
66 test_loss /= len(test_loader.dataset)
67
68 print('\nTest set: Average loss: {:.4f}, Accuracy: {}/{} ({:.0f}%)\n'.format(
69 test_loss, correct, len(test_loader.dataset),
70 100. * correct / len(test_loader.dataset)))
71
72
73 def main():
74 # Training settings
75 parser = argparse.ArgumentParser(description='PyTorch MNIST Example')
76 parser.add_argument('--batch-size', type=int, default=64, metavar='N',
77 help='input batch size for training (default: 64)')
78 parser.add_argument('--test-batch-size', type=int, default=1000, metavar='N',
79 help='input batch size for testing (default: 1000)')
80 parser.add_argument('--epochs', type=int, default=14, metavar='N',
81 help='number of epochs to train (default: 14)')
82 parser.add_argument('--lr', type=float, default=1.0, metavar='LR',
83 help='learning rate (default: 1.0)')
84 parser.add_argument('--gamma', type=float, default=0.7, metavar='M',
85 help='Learning rate step gamma (default: 0.7)')
86 parser.add_argument('--no-cuda', action='store_true', default=False,
87 help='disables CUDA training')
88 parser.add_argument('--dry-run', action='store_true', default=False,
89 help='quickly check a single pass')
90 parser.add_argument('--seed', type=int, default=1, metavar='S',
91 help='random seed (default: 1)')
92 parser.add_argument('--log-interval', type=int, default=10, metavar='N',
93 help='how many batches to wait before logging training status')
94 parser.add_argument('--save-model', action='store_true', default=False,
95 help='For Saving the current Model')
96 args = parser.parse_args()
97 use_cuda = not args.no_cuda and torch.cuda.is_available()
98
99 torch.manual_seed(args.seed)
100
101 device = torch.device("cuda" if use_cuda else "cpu")
102
103 kwargs = {'batch_size': args.batch_size}
104 if use_cuda:
105 kwargs.update({'num_workers': 1,
106 'pin_memory': True,
107 'shuffle': True},
108 )
109
110 transform=transforms.Compose([
111 transforms.ToTensor(),
112 transforms.Normalize((0.1307,), (0.3081,))
113 ])
114 dataset1 = datasets.MNIST('../data', train=True, download=True,
115 transform=transform)
116 dataset2 = datasets.MNIST('../data', train=False,
117 transform=transform)
118 train_loader = torch.utils.data.DataLoader(dataset1,**kwargs)
119 test_loader = torch.utils.data.DataLoader(dataset2, **kwargs)
120
121 model = Net().to(device)
122 optimizer = optim.Adadelta(model.parameters(), lr=args.lr)
123
124 scheduler = StepLR(optimizer, step_size=1, gamma=args.gamma)
125 for epoch in range(1, args.epochs + 1):
126 train(args, model, device, train_loader, optimizer, epoch)
127 test(model, device, test_loader)
128 scheduler.step()
129
130 if args.save_model:
131 torch.save(model.state_dict(), "mnist_cnn.pt")
132
133
134 if __name__ == '__main__':
135 main()
136
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/mnist/main.py b/mnist/main.py
--- a/mnist/main.py
+++ b/mnist/main.py
@@ -100,12 +100,14 @@
device = torch.device("cuda" if use_cuda else "cpu")
- kwargs = {'batch_size': args.batch_size}
+ train_kwargs = {'batch_size': args.batch_size}
+ test_kwargs = {'batch_size': args.test_batch_size}
if use_cuda:
- kwargs.update({'num_workers': 1,
+ cuda_kwargs = {'num_workers': 1,
'pin_memory': True,
- 'shuffle': True},
- )
+ 'shuffle': True}
+ train_kwargs.update(cuda_kwargs)
+ test_kwargs.update(cuda_kwargs)
transform=transforms.Compose([
transforms.ToTensor(),
@@ -115,8 +117,8 @@
transform=transform)
dataset2 = datasets.MNIST('../data', train=False,
transform=transform)
- train_loader = torch.utils.data.DataLoader(dataset1,**kwargs)
- test_loader = torch.utils.data.DataLoader(dataset2, **kwargs)
+ train_loader = torch.utils.data.DataLoader(dataset1,**train_kwargs)
+ test_loader = torch.utils.data.DataLoader(dataset2, **test_kwargs)
model = Net().to(device)
optimizer = optim.Adadelta(model.parameters(), lr=args.lr)
|
{"golden_diff": "diff --git a/mnist/main.py b/mnist/main.py\n--- a/mnist/main.py\n+++ b/mnist/main.py\n@@ -100,12 +100,14 @@\n \n device = torch.device(\"cuda\" if use_cuda else \"cpu\")\n \n- kwargs = {'batch_size': args.batch_size}\n+ train_kwargs = {'batch_size': args.batch_size}\n+ test_kwargs = {'batch_size': args.test_batch_size}\n if use_cuda:\n- kwargs.update({'num_workers': 1,\n+ cuda_kwargs = {'num_workers': 1,\n 'pin_memory': True,\n- 'shuffle': True},\n- )\n+ 'shuffle': True}\n+ train_kwargs.update(cuda_kwargs)\n+ test_kwargs.update(cuda_kwargs)\n \n transform=transforms.Compose([\n transforms.ToTensor(),\n@@ -115,8 +117,8 @@\n transform=transform)\n dataset2 = datasets.MNIST('../data', train=False,\n transform=transform)\n- train_loader = torch.utils.data.DataLoader(dataset1,**kwargs)\n- test_loader = torch.utils.data.DataLoader(dataset2, **kwargs)\n+ train_loader = torch.utils.data.DataLoader(dataset1,**train_kwargs)\n+ test_loader = torch.utils.data.DataLoader(dataset2, **test_kwargs)\n \n model = Net().to(device)\n optimizer = optim.Adadelta(model.parameters(), lr=args.lr)\n", "issue": "two lines of code in mnist/main.py\nThere are two arguments called batch_size and test_batch_size:\r\n`parser.add_argument('--batch-size', type=int, default=64, metavar='N',\r\n help='input batch size for training (default: 64)')`\r\n`parser.add_argument('--test-batch-size', type=int, default=1000, metavar='N',\r\n help='input batch size for testing (default: 1000)')`\r\nbut batch_size is used here:\r\n`test_loader = torch.utils.data.DataLoader(\r\n datasets.MNIST('../data', train=False, transform=transforms.Compose([\r\n transforms.ToTensor(),\r\n transforms.Normalize((0.1307,), (0.3081,))\r\n ])),\r\n batch_size=args.batch_size, shuffle=True, **kwargs)`\r\n\r\nAlso, what does this line(line 105) do:\r\n`test_loss = test_loss`\r\n\r\nand it seems that `epoch` is not used in test().\n", "before_files": [{"content": "from __future__ import print_function\nimport argparse\nimport torch\nimport torch.nn as nn\nimport torch.nn.functional as F\nimport torch.optim as optim\nfrom torchvision import datasets, transforms\nfrom torch.optim.lr_scheduler import StepLR\n\n\nclass Net(nn.Module):\n def __init__(self):\n super(Net, self).__init__()\n self.conv1 = nn.Conv2d(1, 32, 3, 1)\n self.conv2 = nn.Conv2d(32, 64, 3, 1)\n self.dropout1 = nn.Dropout2d(0.25)\n self.dropout2 = nn.Dropout2d(0.5)\n self.fc1 = nn.Linear(9216, 128)\n self.fc2 = nn.Linear(128, 10)\n\n def forward(self, x):\n x = self.conv1(x)\n x = F.relu(x)\n x = self.conv2(x)\n x = F.relu(x)\n x = F.max_pool2d(x, 2)\n x = self.dropout1(x)\n x = torch.flatten(x, 1)\n x = self.fc1(x)\n x = F.relu(x)\n x = self.dropout2(x)\n x = self.fc2(x)\n output = F.log_softmax(x, dim=1)\n return output\n\n\ndef train(args, model, device, train_loader, optimizer, epoch):\n model.train()\n for batch_idx, (data, target) in enumerate(train_loader):\n data, target = data.to(device), target.to(device)\n optimizer.zero_grad()\n output = model(data)\n loss = F.nll_loss(output, target)\n loss.backward()\n optimizer.step()\n if batch_idx % args.log_interval == 0:\n print('Train Epoch: {} [{}/{} ({:.0f}%)]\\tLoss: {:.6f}'.format(\n epoch, batch_idx * len(data), len(train_loader.dataset),\n 100. * batch_idx / len(train_loader), loss.item()))\n if args.dry_run:\n break\n\n\ndef test(model, device, test_loader):\n model.eval()\n test_loss = 0\n correct = 0\n with torch.no_grad():\n for data, target in test_loader:\n data, target = data.to(device), target.to(device)\n output = model(data)\n test_loss += F.nll_loss(output, target, reduction='sum').item() # sum up batch loss\n pred = output.argmax(dim=1, keepdim=True) # get the index of the max log-probability\n correct += pred.eq(target.view_as(pred)).sum().item()\n\n test_loss /= len(test_loader.dataset)\n\n print('\\nTest set: Average loss: {:.4f}, Accuracy: {}/{} ({:.0f}%)\\n'.format(\n test_loss, correct, len(test_loader.dataset),\n 100. * correct / len(test_loader.dataset)))\n\n\ndef main():\n # Training settings\n parser = argparse.ArgumentParser(description='PyTorch MNIST Example')\n parser.add_argument('--batch-size', type=int, default=64, metavar='N',\n help='input batch size for training (default: 64)')\n parser.add_argument('--test-batch-size', type=int, default=1000, metavar='N',\n help='input batch size for testing (default: 1000)')\n parser.add_argument('--epochs', type=int, default=14, metavar='N',\n help='number of epochs to train (default: 14)')\n parser.add_argument('--lr', type=float, default=1.0, metavar='LR',\n help='learning rate (default: 1.0)')\n parser.add_argument('--gamma', type=float, default=0.7, metavar='M',\n help='Learning rate step gamma (default: 0.7)')\n parser.add_argument('--no-cuda', action='store_true', default=False,\n help='disables CUDA training')\n parser.add_argument('--dry-run', action='store_true', default=False,\n help='quickly check a single pass')\n parser.add_argument('--seed', type=int, default=1, metavar='S',\n help='random seed (default: 1)')\n parser.add_argument('--log-interval', type=int, default=10, metavar='N',\n help='how many batches to wait before logging training status')\n parser.add_argument('--save-model', action='store_true', default=False,\n help='For Saving the current Model')\n args = parser.parse_args()\n use_cuda = not args.no_cuda and torch.cuda.is_available()\n\n torch.manual_seed(args.seed)\n\n device = torch.device(\"cuda\" if use_cuda else \"cpu\")\n\n kwargs = {'batch_size': args.batch_size}\n if use_cuda:\n kwargs.update({'num_workers': 1,\n 'pin_memory': True,\n 'shuffle': True},\n )\n\n transform=transforms.Compose([\n transforms.ToTensor(),\n transforms.Normalize((0.1307,), (0.3081,))\n ])\n dataset1 = datasets.MNIST('../data', train=True, download=True,\n transform=transform)\n dataset2 = datasets.MNIST('../data', train=False,\n transform=transform)\n train_loader = torch.utils.data.DataLoader(dataset1,**kwargs)\n test_loader = torch.utils.data.DataLoader(dataset2, **kwargs)\n\n model = Net().to(device)\n optimizer = optim.Adadelta(model.parameters(), lr=args.lr)\n\n scheduler = StepLR(optimizer, step_size=1, gamma=args.gamma)\n for epoch in range(1, args.epochs + 1):\n train(args, model, device, train_loader, optimizer, epoch)\n test(model, device, test_loader)\n scheduler.step()\n\n if args.save_model:\n torch.save(model.state_dict(), \"mnist_cnn.pt\")\n\n\nif __name__ == '__main__':\n main()\n", "path": "mnist/main.py"}], "after_files": [{"content": "from __future__ import print_function\nimport argparse\nimport torch\nimport torch.nn as nn\nimport torch.nn.functional as F\nimport torch.optim as optim\nfrom torchvision import datasets, transforms\nfrom torch.optim.lr_scheduler import StepLR\n\n\nclass Net(nn.Module):\n def __init__(self):\n super(Net, self).__init__()\n self.conv1 = nn.Conv2d(1, 32, 3, 1)\n self.conv2 = nn.Conv2d(32, 64, 3, 1)\n self.dropout1 = nn.Dropout2d(0.25)\n self.dropout2 = nn.Dropout2d(0.5)\n self.fc1 = nn.Linear(9216, 128)\n self.fc2 = nn.Linear(128, 10)\n\n def forward(self, x):\n x = self.conv1(x)\n x = F.relu(x)\n x = self.conv2(x)\n x = F.relu(x)\n x = F.max_pool2d(x, 2)\n x = self.dropout1(x)\n x = torch.flatten(x, 1)\n x = self.fc1(x)\n x = F.relu(x)\n x = self.dropout2(x)\n x = self.fc2(x)\n output = F.log_softmax(x, dim=1)\n return output\n\n\ndef train(args, model, device, train_loader, optimizer, epoch):\n model.train()\n for batch_idx, (data, target) in enumerate(train_loader):\n data, target = data.to(device), target.to(device)\n optimizer.zero_grad()\n output = model(data)\n loss = F.nll_loss(output, target)\n loss.backward()\n optimizer.step()\n if batch_idx % args.log_interval == 0:\n print('Train Epoch: {} [{}/{} ({:.0f}%)]\\tLoss: {:.6f}'.format(\n epoch, batch_idx * len(data), len(train_loader.dataset),\n 100. * batch_idx / len(train_loader), loss.item()))\n if args.dry_run:\n break\n\n\ndef test(model, device, test_loader):\n model.eval()\n test_loss = 0\n correct = 0\n with torch.no_grad():\n for data, target in test_loader:\n data, target = data.to(device), target.to(device)\n output = model(data)\n test_loss += F.nll_loss(output, target, reduction='sum').item() # sum up batch loss\n pred = output.argmax(dim=1, keepdim=True) # get the index of the max log-probability\n correct += pred.eq(target.view_as(pred)).sum().item()\n\n test_loss /= len(test_loader.dataset)\n\n print('\\nTest set: Average loss: {:.4f}, Accuracy: {}/{} ({:.0f}%)\\n'.format(\n test_loss, correct, len(test_loader.dataset),\n 100. * correct / len(test_loader.dataset)))\n\n\ndef main():\n # Training settings\n parser = argparse.ArgumentParser(description='PyTorch MNIST Example')\n parser.add_argument('--batch-size', type=int, default=64, metavar='N',\n help='input batch size for training (default: 64)')\n parser.add_argument('--test-batch-size', type=int, default=1000, metavar='N',\n help='input batch size for testing (default: 1000)')\n parser.add_argument('--epochs', type=int, default=14, metavar='N',\n help='number of epochs to train (default: 14)')\n parser.add_argument('--lr', type=float, default=1.0, metavar='LR',\n help='learning rate (default: 1.0)')\n parser.add_argument('--gamma', type=float, default=0.7, metavar='M',\n help='Learning rate step gamma (default: 0.7)')\n parser.add_argument('--no-cuda', action='store_true', default=False,\n help='disables CUDA training')\n parser.add_argument('--dry-run', action='store_true', default=False,\n help='quickly check a single pass')\n parser.add_argument('--seed', type=int, default=1, metavar='S',\n help='random seed (default: 1)')\n parser.add_argument('--log-interval', type=int, default=10, metavar='N',\n help='how many batches to wait before logging training status')\n parser.add_argument('--save-model', action='store_true', default=False,\n help='For Saving the current Model')\n args = parser.parse_args()\n use_cuda = not args.no_cuda and torch.cuda.is_available()\n\n torch.manual_seed(args.seed)\n\n device = torch.device(\"cuda\" if use_cuda else \"cpu\")\n\n train_kwargs = {'batch_size': args.batch_size}\n test_kwargs = {'batch_size': args.test_batch_size}\n if use_cuda:\n cuda_kwargs = {'num_workers': 1,\n 'pin_memory': True,\n 'shuffle': True}\n train_kwargs.update(cuda_kwargs)\n test_kwargs.update(cuda_kwargs)\n\n transform=transforms.Compose([\n transforms.ToTensor(),\n transforms.Normalize((0.1307,), (0.3081,))\n ])\n dataset1 = datasets.MNIST('../data', train=True, download=True,\n transform=transform)\n dataset2 = datasets.MNIST('../data', train=False,\n transform=transform)\n train_loader = torch.utils.data.DataLoader(dataset1,**train_kwargs)\n test_loader = torch.utils.data.DataLoader(dataset2, **test_kwargs)\n\n model = Net().to(device)\n optimizer = optim.Adadelta(model.parameters(), lr=args.lr)\n\n scheduler = StepLR(optimizer, step_size=1, gamma=args.gamma)\n for epoch in range(1, args.epochs + 1):\n train(args, model, device, train_loader, optimizer, epoch)\n test(model, device, test_loader)\n scheduler.step()\n\n if args.save_model:\n torch.save(model.state_dict(), \"mnist_cnn.pt\")\n\n\nif __name__ == '__main__':\n main()\n", "path": "mnist/main.py"}]}
| 2,064 | 305 |
gh_patches_debug_10057
|
rasdani/github-patches
|
git_diff
|
OCA__social-1068
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[15.0] mail_tracking: portal tests fail
<!-- Provide a general summary of the issue in the Title above.
Before opening the issue, please check that there isn't another issue addressing the same problem already created -->
## Module
mail_tracking
## Describe the bug
mail_tracking causes TestPortalWizard.test_portal_wizard_public_user and TestPortalWizard.test_portal_wizard_partner to fail
## To Reproduce
**Affected versions**: 15.0
Steps to reproduce the behavior:
1. init db with portal and mail_tracking
2. run portal tests
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `mail_tracking/models/ir_mail_server.py`
Content:
```
1 # Copyright 2016 Antonio Espinosa - <antonio.espinosa@tecnativa.com>
2 # License AGPL-3.0 or later (http://www.gnu.org/licenses/agpl.html).
3
4 import re
5 import threading
6
7 from odoo import api, models, tools
8
9
10 class IrMailServer(models.Model):
11 _inherit = "ir.mail_server"
12
13 def _tracking_headers_add(self, tracking_email_id, headers):
14 """Allow other addons to add its own tracking SMTP headers"""
15 headers = headers or {}
16 headers["X-Odoo-Database"] = getattr(threading.current_thread(), "dbname", None)
17 headers["X-Odoo-MailTracking-ID"] = tracking_email_id
18 return headers
19
20 def _tracking_email_id_body_get(self, body):
21 body = body or ""
22 # https://regex101.com/r/lW4cB1/2
23 match = re.search(r'<img[^>]*data-odoo-tracking-email=["\']([0-9]*)["\']', body)
24 return str(match.group(1)) if match and match.group(1) else False
25
26 def build_email(
27 self,
28 email_from,
29 email_to,
30 subject,
31 body,
32 email_cc=None,
33 email_bcc=None,
34 reply_to=False,
35 attachments=None,
36 message_id=None,
37 references=None,
38 object_id=False,
39 subtype="plain",
40 headers=None,
41 body_alternative=None,
42 subtype_alternative="plain",
43 ):
44 tracking_email_id = self._tracking_email_id_body_get(body)
45 if tracking_email_id:
46 headers = self._tracking_headers_add(tracking_email_id, headers)
47 msg = super(IrMailServer, self).build_email(
48 email_from,
49 email_to,
50 subject,
51 body,
52 email_cc=email_cc,
53 email_bcc=email_bcc,
54 reply_to=reply_to,
55 attachments=attachments,
56 message_id=message_id,
57 references=references,
58 object_id=object_id,
59 subtype=subtype,
60 headers=headers,
61 body_alternative=body_alternative,
62 subtype_alternative=subtype_alternative,
63 )
64 return msg
65
66 def _tracking_email_get(self, message):
67 try:
68 tracking_email_id = int(
69 message.get(
70 "X-Odoo-MailTracking-ID",
71 # Deprecated tracking header, kept as fallback
72 message["X-Odoo-Tracking-ID"],
73 )
74 )
75 except (TypeError, ValueError, KeyError):
76 tracking_email_id = False
77 return self.env["mail.tracking.email"].browse(tracking_email_id)
78
79 def _smtp_server_get(self, mail_server_id, smtp_server):
80 smtp_server_used = False
81 mail_server = None
82 if mail_server_id:
83 mail_server = self.browse(mail_server_id)
84 elif not smtp_server:
85 mail_server_ids = self.search([], order="sequence", limit=1)
86 mail_server = mail_server_ids[0] if mail_server_ids else None
87 if mail_server:
88 smtp_server_used = mail_server.smtp_host
89 else:
90 smtp_server_used = smtp_server or tools.config.get("smtp_server")
91 return smtp_server_used
92
93 @api.model
94 def send_email(
95 self,
96 message,
97 mail_server_id=None,
98 smtp_server=None,
99 smtp_port=None,
100 smtp_user=None,
101 smtp_password=None,
102 smtp_encryption=None,
103 smtp_debug=False,
104 smtp_session=None,
105 ):
106 message_id = False
107 tracking_email = self._tracking_email_get(message)
108 smtp_server_used = self.sudo()._smtp_server_get(mail_server_id, smtp_server)
109 try:
110 message_id = super(IrMailServer, self).send_email(
111 message,
112 mail_server_id=mail_server_id,
113 smtp_server=smtp_server,
114 smtp_port=smtp_port,
115 smtp_user=smtp_user,
116 smtp_password=smtp_password,
117 smtp_encryption=smtp_encryption,
118 smtp_debug=smtp_debug,
119 smtp_session=smtp_session,
120 )
121 except Exception as e:
122 if tracking_email:
123 tracking_email.smtp_error(self, smtp_server_used, e)
124 if message_id and tracking_email:
125 vals = tracking_email._tracking_sent_prepare(
126 self, smtp_server_used, message, message_id
127 )
128 if vals:
129 self.env["mail.tracking.event"].sudo().create(vals)
130 return message_id
131
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/mail_tracking/models/ir_mail_server.py b/mail_tracking/models/ir_mail_server.py
--- a/mail_tracking/models/ir_mail_server.py
+++ b/mail_tracking/models/ir_mail_server.py
@@ -45,10 +45,10 @@
if tracking_email_id:
headers = self._tracking_headers_add(tracking_email_id, headers)
msg = super(IrMailServer, self).build_email(
- email_from,
- email_to,
- subject,
- body,
+ email_from=email_from,
+ email_to=email_to,
+ subject=subject,
+ body=body,
email_cc=email_cc,
email_bcc=email_bcc,
reply_to=reply_to,
|
{"golden_diff": "diff --git a/mail_tracking/models/ir_mail_server.py b/mail_tracking/models/ir_mail_server.py\n--- a/mail_tracking/models/ir_mail_server.py\n+++ b/mail_tracking/models/ir_mail_server.py\n@@ -45,10 +45,10 @@\n if tracking_email_id:\n headers = self._tracking_headers_add(tracking_email_id, headers)\n msg = super(IrMailServer, self).build_email(\n- email_from,\n- email_to,\n- subject,\n- body,\n+ email_from=email_from,\n+ email_to=email_to,\n+ subject=subject,\n+ body=body,\n email_cc=email_cc,\n email_bcc=email_bcc,\n reply_to=reply_to,\n", "issue": "[15.0] mail_tracking: portal tests fail\n<!-- Provide a general summary of the issue in the Title above.\r\nBefore opening the issue, please check that there isn't another issue addressing the same problem already created -->\r\n\r\n## Module\r\nmail_tracking\r\n\r\n## Describe the bug\r\nmail_tracking causes TestPortalWizard.test_portal_wizard_public_user and TestPortalWizard.test_portal_wizard_partner to fail\r\n\r\n## To Reproduce\r\n**Affected versions**: 15.0\r\n\r\nSteps to reproduce the behavior:\r\n1. init db with portal and mail_tracking\r\n2. run portal tests\r\n\n", "before_files": [{"content": "# Copyright 2016 Antonio Espinosa - <antonio.espinosa@tecnativa.com>\n# License AGPL-3.0 or later (http://www.gnu.org/licenses/agpl.html).\n\nimport re\nimport threading\n\nfrom odoo import api, models, tools\n\n\nclass IrMailServer(models.Model):\n _inherit = \"ir.mail_server\"\n\n def _tracking_headers_add(self, tracking_email_id, headers):\n \"\"\"Allow other addons to add its own tracking SMTP headers\"\"\"\n headers = headers or {}\n headers[\"X-Odoo-Database\"] = getattr(threading.current_thread(), \"dbname\", None)\n headers[\"X-Odoo-MailTracking-ID\"] = tracking_email_id\n return headers\n\n def _tracking_email_id_body_get(self, body):\n body = body or \"\"\n # https://regex101.com/r/lW4cB1/2\n match = re.search(r'<img[^>]*data-odoo-tracking-email=[\"\\']([0-9]*)[\"\\']', body)\n return str(match.group(1)) if match and match.group(1) else False\n\n def build_email(\n self,\n email_from,\n email_to,\n subject,\n body,\n email_cc=None,\n email_bcc=None,\n reply_to=False,\n attachments=None,\n message_id=None,\n references=None,\n object_id=False,\n subtype=\"plain\",\n headers=None,\n body_alternative=None,\n subtype_alternative=\"plain\",\n ):\n tracking_email_id = self._tracking_email_id_body_get(body)\n if tracking_email_id:\n headers = self._tracking_headers_add(tracking_email_id, headers)\n msg = super(IrMailServer, self).build_email(\n email_from,\n email_to,\n subject,\n body,\n email_cc=email_cc,\n email_bcc=email_bcc,\n reply_to=reply_to,\n attachments=attachments,\n message_id=message_id,\n references=references,\n object_id=object_id,\n subtype=subtype,\n headers=headers,\n body_alternative=body_alternative,\n subtype_alternative=subtype_alternative,\n )\n return msg\n\n def _tracking_email_get(self, message):\n try:\n tracking_email_id = int(\n message.get(\n \"X-Odoo-MailTracking-ID\",\n # Deprecated tracking header, kept as fallback\n message[\"X-Odoo-Tracking-ID\"],\n )\n )\n except (TypeError, ValueError, KeyError):\n tracking_email_id = False\n return self.env[\"mail.tracking.email\"].browse(tracking_email_id)\n\n def _smtp_server_get(self, mail_server_id, smtp_server):\n smtp_server_used = False\n mail_server = None\n if mail_server_id:\n mail_server = self.browse(mail_server_id)\n elif not smtp_server:\n mail_server_ids = self.search([], order=\"sequence\", limit=1)\n mail_server = mail_server_ids[0] if mail_server_ids else None\n if mail_server:\n smtp_server_used = mail_server.smtp_host\n else:\n smtp_server_used = smtp_server or tools.config.get(\"smtp_server\")\n return smtp_server_used\n\n @api.model\n def send_email(\n self,\n message,\n mail_server_id=None,\n smtp_server=None,\n smtp_port=None,\n smtp_user=None,\n smtp_password=None,\n smtp_encryption=None,\n smtp_debug=False,\n smtp_session=None,\n ):\n message_id = False\n tracking_email = self._tracking_email_get(message)\n smtp_server_used = self.sudo()._smtp_server_get(mail_server_id, smtp_server)\n try:\n message_id = super(IrMailServer, self).send_email(\n message,\n mail_server_id=mail_server_id,\n smtp_server=smtp_server,\n smtp_port=smtp_port,\n smtp_user=smtp_user,\n smtp_password=smtp_password,\n smtp_encryption=smtp_encryption,\n smtp_debug=smtp_debug,\n smtp_session=smtp_session,\n )\n except Exception as e:\n if tracking_email:\n tracking_email.smtp_error(self, smtp_server_used, e)\n if message_id and tracking_email:\n vals = tracking_email._tracking_sent_prepare(\n self, smtp_server_used, message, message_id\n )\n if vals:\n self.env[\"mail.tracking.event\"].sudo().create(vals)\n return message_id\n", "path": "mail_tracking/models/ir_mail_server.py"}], "after_files": [{"content": "# Copyright 2016 Antonio Espinosa - <antonio.espinosa@tecnativa.com>\n# License AGPL-3.0 or later (http://www.gnu.org/licenses/agpl.html).\n\nimport re\nimport threading\n\nfrom odoo import api, models, tools\n\n\nclass IrMailServer(models.Model):\n _inherit = \"ir.mail_server\"\n\n def _tracking_headers_add(self, tracking_email_id, headers):\n \"\"\"Allow other addons to add its own tracking SMTP headers\"\"\"\n headers = headers or {}\n headers[\"X-Odoo-Database\"] = getattr(threading.current_thread(), \"dbname\", None)\n headers[\"X-Odoo-MailTracking-ID\"] = tracking_email_id\n return headers\n\n def _tracking_email_id_body_get(self, body):\n body = body or \"\"\n # https://regex101.com/r/lW4cB1/2\n match = re.search(r'<img[^>]*data-odoo-tracking-email=[\"\\']([0-9]*)[\"\\']', body)\n return str(match.group(1)) if match and match.group(1) else False\n\n def build_email(\n self,\n email_from,\n email_to,\n subject,\n body,\n email_cc=None,\n email_bcc=None,\n reply_to=False,\n attachments=None,\n message_id=None,\n references=None,\n object_id=False,\n subtype=\"plain\",\n headers=None,\n body_alternative=None,\n subtype_alternative=\"plain\",\n ):\n tracking_email_id = self._tracking_email_id_body_get(body)\n if tracking_email_id:\n headers = self._tracking_headers_add(tracking_email_id, headers)\n msg = super(IrMailServer, self).build_email(\n email_from=email_from,\n email_to=email_to,\n subject=subject,\n body=body,\n email_cc=email_cc,\n email_bcc=email_bcc,\n reply_to=reply_to,\n attachments=attachments,\n message_id=message_id,\n references=references,\n object_id=object_id,\n subtype=subtype,\n headers=headers,\n body_alternative=body_alternative,\n subtype_alternative=subtype_alternative,\n )\n return msg\n\n def _tracking_email_get(self, message):\n try:\n tracking_email_id = int(\n message.get(\n \"X-Odoo-MailTracking-ID\",\n # Deprecated tracking header, kept as fallback\n message[\"X-Odoo-Tracking-ID\"],\n )\n )\n except (TypeError, ValueError, KeyError):\n tracking_email_id = False\n return self.env[\"mail.tracking.email\"].browse(tracking_email_id)\n\n def _smtp_server_get(self, mail_server_id, smtp_server):\n smtp_server_used = False\n mail_server = None\n if mail_server_id:\n mail_server = self.browse(mail_server_id)\n elif not smtp_server:\n mail_server_ids = self.search([], order=\"sequence\", limit=1)\n mail_server = mail_server_ids[0] if mail_server_ids else None\n if mail_server:\n smtp_server_used = mail_server.smtp_host\n else:\n smtp_server_used = smtp_server or tools.config.get(\"smtp_server\")\n return smtp_server_used\n\n @api.model\n def send_email(\n self,\n message,\n mail_server_id=None,\n smtp_server=None,\n smtp_port=None,\n smtp_user=None,\n smtp_password=None,\n smtp_encryption=None,\n smtp_debug=False,\n smtp_session=None,\n ):\n message_id = False\n tracking_email = self._tracking_email_get(message)\n smtp_server_used = self.sudo()._smtp_server_get(mail_server_id, smtp_server)\n try:\n message_id = super(IrMailServer, self).send_email(\n message,\n mail_server_id=mail_server_id,\n smtp_server=smtp_server,\n smtp_port=smtp_port,\n smtp_user=smtp_user,\n smtp_password=smtp_password,\n smtp_encryption=smtp_encryption,\n smtp_debug=smtp_debug,\n smtp_session=smtp_session,\n )\n except Exception as e:\n if tracking_email:\n tracking_email.smtp_error(self, smtp_server_used, e)\n if message_id and tracking_email:\n vals = tracking_email._tracking_sent_prepare(\n self, smtp_server_used, message, message_id\n )\n if vals:\n self.env[\"mail.tracking.event\"].sudo().create(vals)\n return message_id\n", "path": "mail_tracking/models/ir_mail_server.py"}]}
| 1,607 | 158 |
gh_patches_debug_19291
|
rasdani/github-patches
|
git_diff
|
aws-cloudformation__cfn-lint-248
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Adding new CloudFormationLintRule's requires the filename to be identical to the Class name
*cfn-lint version: (`cfn-lint --version`)*
0.0.9
*Description of issue.*
When creating your own rules, you have to add CloudFormationLintRules, but the loader requires
the python filename to be identical to the Classname.
It would be nice if:
- we can create files with names matching the python convention.
- we could add multiple lint rules in a single module.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/cfnlint/helpers.py`
Content:
```
1 """
2 Copyright 2018 Amazon.com, Inc. or its affiliates. All Rights Reserved.
3
4 Permission is hereby granted, free of charge, to any person obtaining a copy of this
5 software and associated documentation files (the "Software"), to deal in the Software
6 without restriction, including without limitation the rights to use, copy, modify,
7 merge, publish, distribute, sublicense, and/or sell copies of the Software, and to
8 permit persons to whom the Software is furnished to do so.
9
10 THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED,
11 INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A
12 PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT
13 HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
14 OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
15 SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
16 """
17 import sys
18 import fnmatch
19 import json
20 import os
21 import imp
22 import logging
23 import re
24 import pkg_resources
25
26
27 LOGGER = logging.getLogger(__name__)
28
29 TAG_MAP = 'tag:yaml.org,2002:map'
30 UNCONVERTED_SUFFIXES = ['Ref', 'Condition']
31 FN_PREFIX = 'Fn::'
32 CONDITION_FUNCTIONS = ['Fn::If']
33 REGIONS = ['us-east-1', 'us-east-2', 'us-west-1', 'us-west-2', 'ca-central-1',
34 'eu-central-1', 'eu-west-1', 'eu-west-2', 'ap-northeast-1',
35 'ap-northeast-2', 'ap-southeast-1', 'ap-southeast-2', 'ap-south-1',
36 'sa-east-1']
37
38 REGEX_ALPHANUMERIC = re.compile('^[a-zA-Z0-9]*$')
39 REGEX_CIDR = re.compile(r'^(([0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])\.){3}([0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])(\/([0-9]|[1-2][0-9]|3[0-2]))$')
40 REGEX_IPV4 = re.compile(r'^(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)(\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)){3}$')
41 REGEX_IPV6 = re.compile(r'^(((?=.*(::))(?!.*\3.+\3))\3?|[\dA-F]{1,4}:)([\dA-F]{1,4}(\3|:\b)|\2){5}(([\dA-F]{1,4}(\3|:\b|$)|\2){2}|(((2[0-4]|1\d|[1-9])?\d|25[0-5])\.?\b){4})\Z', re.I | re.S)
42
43 AVAILABILITY_ZONES = [
44 'us-east-1a', 'us-east-1b', 'us-east-1c', 'us-east-1d', 'us-east-1e', 'us-east-1f',
45 'us-east-2a' 'us-east-2b' 'us-east-2c',
46 'us-west-1a', 'us-west-1b', 'us-west-1c',
47 'us-west-2a', 'us-west-2b', 'us-west-2c',
48 'ca-central-1a', 'ca-central-1b',
49 'sa-east-1a', 'sa-east-1b', 'sa-east-1c',
50 'eu-west-1a', 'eu-west-1b', 'eu-west-1c',
51 'eu-west-2a', 'eu-west-2b', 'eu-west-2c',
52 'eu-west-3a', 'eu-west-3b', 'eu-west-3c',
53 'eu-central-1a', 'eu-central-1b', 'eu-central-1c',
54 'eu-central-2a', 'eu-central-2b', 'eu-central-2c',
55 'ap-northeast-1a', 'ap-northeast-1b', 'ap-northeast-1c', 'ap-northeast-1d',
56 'ap-northeast-2a', 'ap-northeast-2b',
57 'ap-northeast-3a',
58 'ap-southeast-1a', 'ap-southeast-1b', 'ap-southeast-1c',
59 'ap-southeast-2a', 'ap-southeast-2b', 'ap-southeast-2c',
60 'ap-south-1a', 'ap-south-1b',
61 'cn-north-1a', 'cn-north-1b',
62 ]
63
64 FUNCTIONS = [
65 'Fn::Base64', 'Fn::GetAtt', 'Fn::GetAZs', 'Fn::ImportValue',
66 'Fn::Join', 'Fn::Split', 'Fn::FindInMap', 'Fn::Select', 'Ref',
67 'Fn::If', 'Fn::Contains', 'Fn::Sub', 'Fn::Cidr']
68
69 LIMITS = {
70 'mappings': {
71 'number': 100,
72 'attributes': 64,
73 'name': 255 # in characters
74 },
75 'outputs': {
76 'number': 60,
77 'name': 255 # in characters
78 },
79 'parameters': {
80 'number': 60,
81 'name': 255, # in characters
82 'value': 4096 # in bytes
83 },
84 'resources': {
85 'number': 200,
86 'name': 255 # in characters
87 },
88 'template': {
89 'body': 460800, # in bytes
90 'description': 1024 # in bytes
91 }
92 }
93
94
95 def load_resources(filename='/data/CloudSpecs/us-east-1.json'):
96 """Load resources"""
97
98 filename = pkg_resources.resource_filename(
99 __name__,
100 filename
101 )
102
103 data = json.load(open(filename))
104
105 return data
106
107
108 RESOURCE_SPECS = {}
109
110
111 def merge_spec(source, destination):
112 """ Recursive merge spec dict """
113
114 for key, value in source.items():
115 if isinstance(value, dict):
116 node = destination.setdefault(key, {})
117 merge_spec(value, node)
118 else:
119 destination[key] = value
120
121 return destination
122
123
124 def set_specs(override_spec_data):
125 """ Override Resource Specs """
126
127 excludes = []
128 includes = []
129
130 # Extract the exclude list from the override file
131 if 'ExcludeResourceTypes' in override_spec_data:
132 excludes = override_spec_data.pop('ExcludeResourceTypes')
133 if 'IncludeResourceTypes' in override_spec_data:
134 includes = override_spec_data.pop('IncludeResourceTypes')
135
136 for region, spec in RESOURCE_SPECS.items():
137
138 # Merge override spec file into the AWS Resource specification
139 if override_spec_data:
140 RESOURCE_SPECS[region] = merge_spec(override_spec_data, spec)
141
142 # Grab a list of all resources
143 all_resources = list(RESOURCE_SPECS[region]['ResourceTypes'].keys())[:]
144
145 resources = []
146
147 # Remove unsupported resource using includes
148 if includes:
149 for include in includes:
150 regex = re.compile(include.replace('*', '(.*)') + '$')
151 matches = [string for string in all_resources if re.match(regex, string)]
152
153 resources.extend(matches)
154 else:
155 resources = all_resources[:]
156
157 # Remove unsupported resources using the excludes
158 if excludes:
159 for exclude in excludes:
160 regex = re.compile(exclude.replace('*', '(.*)') + '$')
161 matches = [string for string in resources if re.match(regex, string)]
162
163 for match in matches:
164 resources.remove(match)
165
166 # Remove unsupported resources
167 for resource in all_resources:
168 if resource not in resources:
169 del RESOURCE_SPECS[region]['ResourceTypes'][resource]
170
171
172 def initialize_specs():
173 """ Reload Resource Specs """
174 for reg in REGIONS:
175 RESOURCE_SPECS[reg] = load_resources(filename=('/data/CloudSpecs/%s.json' % reg))
176
177
178 initialize_specs()
179
180
181 def load_plugins(directory):
182 """Load plugins"""
183 result = []
184 fh = None
185
186 for root, _, filenames in os.walk(directory):
187 for filename in fnmatch.filter(filenames, '[A-Za-z]*.py'):
188 pluginname = filename.replace('.py', '')
189 try:
190 fh, filename, desc = imp.find_module(pluginname, [root])
191 mod = imp.load_module(pluginname, fh, filename, desc)
192 obj = getattr(mod, pluginname)()
193 result.append(obj)
194 finally:
195 if fh:
196 fh.close()
197 return result
198
199
200 def override_specs(override_spec_file):
201 """Override specs file"""
202 try:
203 filename = override_spec_file
204 custom_spec_data = json.load(open(filename))
205
206 set_specs(custom_spec_data)
207 except IOError as e:
208 if e.errno == 2:
209 LOGGER.error('Override spec file not found: %s', filename)
210 sys.exit(1)
211 elif e.errno == 21:
212 LOGGER.error('Override spec file references a directory, not a file: %s', filename)
213 sys.exit(1)
214 elif e.errno == 13:
215 LOGGER.error('Permission denied when accessing override spec file: %s', filename)
216 sys.exit(1)
217 except (ValueError) as err:
218 LOGGER.error('Override spec file %s is malformed: %s', filename, err)
219 sys.exit(1)
220
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/src/cfnlint/helpers.py b/src/cfnlint/helpers.py
--- a/src/cfnlint/helpers.py
+++ b/src/cfnlint/helpers.py
@@ -21,6 +21,7 @@
import imp
import logging
import re
+import inspect
import pkg_resources
@@ -189,8 +190,12 @@
try:
fh, filename, desc = imp.find_module(pluginname, [root])
mod = imp.load_module(pluginname, fh, filename, desc)
- obj = getattr(mod, pluginname)()
- result.append(obj)
+ for _, clazz in inspect.getmembers(mod, inspect.isclass):
+ method_resolution = inspect.getmro(clazz)
+ if [clz for clz in method_resolution[1:] if clz.__module__ == 'cfnlint' and clz.__name__ == 'CloudFormationLintRule']:
+ # create and instance of subclasses of CloudFormationLintRule
+ obj = clazz()
+ result.append(obj)
finally:
if fh:
fh.close()
|
{"golden_diff": "diff --git a/src/cfnlint/helpers.py b/src/cfnlint/helpers.py\n--- a/src/cfnlint/helpers.py\n+++ b/src/cfnlint/helpers.py\n@@ -21,6 +21,7 @@\n import imp\n import logging\n import re\n+import inspect\n import pkg_resources\n \n \n@@ -189,8 +190,12 @@\n try:\n fh, filename, desc = imp.find_module(pluginname, [root])\n mod = imp.load_module(pluginname, fh, filename, desc)\n- obj = getattr(mod, pluginname)()\n- result.append(obj)\n+ for _, clazz in inspect.getmembers(mod, inspect.isclass):\n+ method_resolution = inspect.getmro(clazz)\n+ if [clz for clz in method_resolution[1:] if clz.__module__ == 'cfnlint' and clz.__name__ == 'CloudFormationLintRule']:\n+ # create and instance of subclasses of CloudFormationLintRule\n+ obj = clazz()\n+ result.append(obj)\n finally:\n if fh:\n fh.close()\n", "issue": "Adding new CloudFormationLintRule's requires the filename to be identical to the Class name\n*cfn-lint version: (`cfn-lint --version`)*\r\n0.0.9\r\n\r\n*Description of issue.*\r\n\r\nWhen creating your own rules, you have to add CloudFormationLintRules, but the loader requires\r\nthe python filename to be identical to the Classname.\r\n\r\nIt would be nice if:\r\n- we can create files with names matching the python convention.\r\n- we could add multiple lint rules in a single module.\r\n\n", "before_files": [{"content": "\"\"\"\n Copyright 2018 Amazon.com, Inc. or its affiliates. All Rights Reserved.\n\n Permission is hereby granted, free of charge, to any person obtaining a copy of this\n software and associated documentation files (the \"Software\"), to deal in the Software\n without restriction, including without limitation the rights to use, copy, modify,\n merge, publish, distribute, sublicense, and/or sell copies of the Software, and to\n permit persons to whom the Software is furnished to do so.\n\n THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED,\n INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A\n PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT\n HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION\n OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE\n SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.\n\"\"\"\nimport sys\nimport fnmatch\nimport json\nimport os\nimport imp\nimport logging\nimport re\nimport pkg_resources\n\n\nLOGGER = logging.getLogger(__name__)\n\nTAG_MAP = 'tag:yaml.org,2002:map'\nUNCONVERTED_SUFFIXES = ['Ref', 'Condition']\nFN_PREFIX = 'Fn::'\nCONDITION_FUNCTIONS = ['Fn::If']\nREGIONS = ['us-east-1', 'us-east-2', 'us-west-1', 'us-west-2', 'ca-central-1',\n 'eu-central-1', 'eu-west-1', 'eu-west-2', 'ap-northeast-1',\n 'ap-northeast-2', 'ap-southeast-1', 'ap-southeast-2', 'ap-south-1',\n 'sa-east-1']\n\nREGEX_ALPHANUMERIC = re.compile('^[a-zA-Z0-9]*$')\nREGEX_CIDR = re.compile(r'^(([0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])(\\/([0-9]|[1-2][0-9]|3[0-2]))$')\nREGEX_IPV4 = re.compile(r'^(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)(\\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)){3}$')\nREGEX_IPV6 = re.compile(r'^(((?=.*(::))(?!.*\\3.+\\3))\\3?|[\\dA-F]{1,4}:)([\\dA-F]{1,4}(\\3|:\\b)|\\2){5}(([\\dA-F]{1,4}(\\3|:\\b|$)|\\2){2}|(((2[0-4]|1\\d|[1-9])?\\d|25[0-5])\\.?\\b){4})\\Z', re.I | re.S)\n\nAVAILABILITY_ZONES = [\n 'us-east-1a', 'us-east-1b', 'us-east-1c', 'us-east-1d', 'us-east-1e', 'us-east-1f',\n 'us-east-2a' 'us-east-2b' 'us-east-2c',\n 'us-west-1a', 'us-west-1b', 'us-west-1c',\n 'us-west-2a', 'us-west-2b', 'us-west-2c',\n 'ca-central-1a', 'ca-central-1b',\n 'sa-east-1a', 'sa-east-1b', 'sa-east-1c',\n 'eu-west-1a', 'eu-west-1b', 'eu-west-1c',\n 'eu-west-2a', 'eu-west-2b', 'eu-west-2c',\n 'eu-west-3a', 'eu-west-3b', 'eu-west-3c',\n 'eu-central-1a', 'eu-central-1b', 'eu-central-1c',\n 'eu-central-2a', 'eu-central-2b', 'eu-central-2c',\n 'ap-northeast-1a', 'ap-northeast-1b', 'ap-northeast-1c', 'ap-northeast-1d',\n 'ap-northeast-2a', 'ap-northeast-2b',\n 'ap-northeast-3a',\n 'ap-southeast-1a', 'ap-southeast-1b', 'ap-southeast-1c',\n 'ap-southeast-2a', 'ap-southeast-2b', 'ap-southeast-2c',\n 'ap-south-1a', 'ap-south-1b',\n 'cn-north-1a', 'cn-north-1b',\n]\n\nFUNCTIONS = [\n 'Fn::Base64', 'Fn::GetAtt', 'Fn::GetAZs', 'Fn::ImportValue',\n 'Fn::Join', 'Fn::Split', 'Fn::FindInMap', 'Fn::Select', 'Ref',\n 'Fn::If', 'Fn::Contains', 'Fn::Sub', 'Fn::Cidr']\n\nLIMITS = {\n 'mappings': {\n 'number': 100,\n 'attributes': 64,\n 'name': 255 # in characters\n },\n 'outputs': {\n 'number': 60,\n 'name': 255 # in characters\n },\n 'parameters': {\n 'number': 60,\n 'name': 255, # in characters\n 'value': 4096 # in bytes\n },\n 'resources': {\n 'number': 200,\n 'name': 255 # in characters\n },\n 'template': {\n 'body': 460800, # in bytes\n 'description': 1024 # in bytes\n }\n}\n\n\ndef load_resources(filename='/data/CloudSpecs/us-east-1.json'):\n \"\"\"Load resources\"\"\"\n\n filename = pkg_resources.resource_filename(\n __name__,\n filename\n )\n\n data = json.load(open(filename))\n\n return data\n\n\nRESOURCE_SPECS = {}\n\n\ndef merge_spec(source, destination):\n \"\"\" Recursive merge spec dict \"\"\"\n\n for key, value in source.items():\n if isinstance(value, dict):\n node = destination.setdefault(key, {})\n merge_spec(value, node)\n else:\n destination[key] = value\n\n return destination\n\n\ndef set_specs(override_spec_data):\n \"\"\" Override Resource Specs \"\"\"\n\n excludes = []\n includes = []\n\n # Extract the exclude list from the override file\n if 'ExcludeResourceTypes' in override_spec_data:\n excludes = override_spec_data.pop('ExcludeResourceTypes')\n if 'IncludeResourceTypes' in override_spec_data:\n includes = override_spec_data.pop('IncludeResourceTypes')\n\n for region, spec in RESOURCE_SPECS.items():\n\n # Merge override spec file into the AWS Resource specification\n if override_spec_data:\n RESOURCE_SPECS[region] = merge_spec(override_spec_data, spec)\n\n # Grab a list of all resources\n all_resources = list(RESOURCE_SPECS[region]['ResourceTypes'].keys())[:]\n\n resources = []\n\n # Remove unsupported resource using includes\n if includes:\n for include in includes:\n regex = re.compile(include.replace('*', '(.*)') + '$')\n matches = [string for string in all_resources if re.match(regex, string)]\n\n resources.extend(matches)\n else:\n resources = all_resources[:]\n\n # Remove unsupported resources using the excludes\n if excludes:\n for exclude in excludes:\n regex = re.compile(exclude.replace('*', '(.*)') + '$')\n matches = [string for string in resources if re.match(regex, string)]\n\n for match in matches:\n resources.remove(match)\n\n # Remove unsupported resources\n for resource in all_resources:\n if resource not in resources:\n del RESOURCE_SPECS[region]['ResourceTypes'][resource]\n\n\ndef initialize_specs():\n \"\"\" Reload Resource Specs \"\"\"\n for reg in REGIONS:\n RESOURCE_SPECS[reg] = load_resources(filename=('/data/CloudSpecs/%s.json' % reg))\n\n\ninitialize_specs()\n\n\ndef load_plugins(directory):\n \"\"\"Load plugins\"\"\"\n result = []\n fh = None\n\n for root, _, filenames in os.walk(directory):\n for filename in fnmatch.filter(filenames, '[A-Za-z]*.py'):\n pluginname = filename.replace('.py', '')\n try:\n fh, filename, desc = imp.find_module(pluginname, [root])\n mod = imp.load_module(pluginname, fh, filename, desc)\n obj = getattr(mod, pluginname)()\n result.append(obj)\n finally:\n if fh:\n fh.close()\n return result\n\n\ndef override_specs(override_spec_file):\n \"\"\"Override specs file\"\"\"\n try:\n filename = override_spec_file\n custom_spec_data = json.load(open(filename))\n\n set_specs(custom_spec_data)\n except IOError as e:\n if e.errno == 2:\n LOGGER.error('Override spec file not found: %s', filename)\n sys.exit(1)\n elif e.errno == 21:\n LOGGER.error('Override spec file references a directory, not a file: %s', filename)\n sys.exit(1)\n elif e.errno == 13:\n LOGGER.error('Permission denied when accessing override spec file: %s', filename)\n sys.exit(1)\n except (ValueError) as err:\n LOGGER.error('Override spec file %s is malformed: %s', filename, err)\n sys.exit(1)\n", "path": "src/cfnlint/helpers.py"}], "after_files": [{"content": "\"\"\"\n Copyright 2018 Amazon.com, Inc. or its affiliates. All Rights Reserved.\n\n Permission is hereby granted, free of charge, to any person obtaining a copy of this\n software and associated documentation files (the \"Software\"), to deal in the Software\n without restriction, including without limitation the rights to use, copy, modify,\n merge, publish, distribute, sublicense, and/or sell copies of the Software, and to\n permit persons to whom the Software is furnished to do so.\n\n THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED,\n INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A\n PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT\n HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION\n OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE\n SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.\n\"\"\"\nimport sys\nimport fnmatch\nimport json\nimport os\nimport imp\nimport logging\nimport re\nimport inspect\nimport pkg_resources\n\n\nLOGGER = logging.getLogger(__name__)\n\nTAG_MAP = 'tag:yaml.org,2002:map'\nUNCONVERTED_SUFFIXES = ['Ref', 'Condition']\nFN_PREFIX = 'Fn::'\nCONDITION_FUNCTIONS = ['Fn::If']\nREGIONS = ['us-east-1', 'us-east-2', 'us-west-1', 'us-west-2', 'ca-central-1',\n 'eu-central-1', 'eu-west-1', 'eu-west-2', 'ap-northeast-1',\n 'ap-northeast-2', 'ap-southeast-1', 'ap-southeast-2', 'ap-south-1',\n 'sa-east-1']\n\nREGEX_ALPHANUMERIC = re.compile('^[a-zA-Z0-9]*$')\nREGEX_CIDR = re.compile(r'^(([0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])(\\/([0-9]|[1-2][0-9]|3[0-2]))$')\nREGEX_IPV4 = re.compile(r'^(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)(\\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)){3}$')\nREGEX_IPV6 = re.compile(r'^(((?=.*(::))(?!.*\\3.+\\3))\\3?|[\\dA-F]{1,4}:)([\\dA-F]{1,4}(\\3|:\\b)|\\2){5}(([\\dA-F]{1,4}(\\3|:\\b|$)|\\2){2}|(((2[0-4]|1\\d|[1-9])?\\d|25[0-5])\\.?\\b){4})\\Z', re.I | re.S)\n\nAVAILABILITY_ZONES = [\n 'us-east-1a', 'us-east-1b', 'us-east-1c', 'us-east-1d', 'us-east-1e', 'us-east-1f',\n 'us-east-2a' 'us-east-2b' 'us-east-2c',\n 'us-west-1a', 'us-west-1b', 'us-west-1c',\n 'us-west-2a', 'us-west-2b', 'us-west-2c',\n 'ca-central-1a', 'ca-central-1b',\n 'sa-east-1a', 'sa-east-1b', 'sa-east-1c',\n 'eu-west-1a', 'eu-west-1b', 'eu-west-1c',\n 'eu-west-2a', 'eu-west-2b', 'eu-west-2c',\n 'eu-west-3a', 'eu-west-3b', 'eu-west-3c',\n 'eu-central-1a', 'eu-central-1b', 'eu-central-1c',\n 'eu-central-2a', 'eu-central-2b', 'eu-central-2c',\n 'ap-northeast-1a', 'ap-northeast-1b', 'ap-northeast-1c', 'ap-northeast-1d',\n 'ap-northeast-2a', 'ap-northeast-2b',\n 'ap-northeast-3a',\n 'ap-southeast-1a', 'ap-southeast-1b', 'ap-southeast-1c',\n 'ap-southeast-2a', 'ap-southeast-2b', 'ap-southeast-2c',\n 'ap-south-1a', 'ap-south-1b',\n 'cn-north-1a', 'cn-north-1b',\n]\n\nFUNCTIONS = [\n 'Fn::Base64', 'Fn::GetAtt', 'Fn::GetAZs', 'Fn::ImportValue',\n 'Fn::Join', 'Fn::Split', 'Fn::FindInMap', 'Fn::Select', 'Ref',\n 'Fn::If', 'Fn::Contains', 'Fn::Sub', 'Fn::Cidr']\n\nLIMITS = {\n 'mappings': {\n 'number': 100,\n 'attributes': 64,\n 'name': 255 # in characters\n },\n 'outputs': {\n 'number': 60,\n 'name': 255 # in characters\n },\n 'parameters': {\n 'number': 60,\n 'name': 255, # in characters\n 'value': 4096 # in bytes\n },\n 'resources': {\n 'number': 200,\n 'name': 255 # in characters\n },\n 'template': {\n 'body': 460800, # in bytes\n 'description': 1024 # in bytes\n }\n}\n\n\ndef load_resources(filename='/data/CloudSpecs/us-east-1.json'):\n \"\"\"Load resources\"\"\"\n\n filename = pkg_resources.resource_filename(\n __name__,\n filename\n )\n\n data = json.load(open(filename))\n\n return data\n\n\nRESOURCE_SPECS = {}\n\n\ndef merge_spec(source, destination):\n \"\"\" Recursive merge spec dict \"\"\"\n\n for key, value in source.items():\n if isinstance(value, dict):\n node = destination.setdefault(key, {})\n merge_spec(value, node)\n else:\n destination[key] = value\n\n return destination\n\n\ndef set_specs(override_spec_data):\n \"\"\" Override Resource Specs \"\"\"\n\n excludes = []\n includes = []\n\n # Extract the exclude list from the override file\n if 'ExcludeResourceTypes' in override_spec_data:\n excludes = override_spec_data.pop('ExcludeResourceTypes')\n if 'IncludeResourceTypes' in override_spec_data:\n includes = override_spec_data.pop('IncludeResourceTypes')\n\n for region, spec in RESOURCE_SPECS.items():\n\n # Merge override spec file into the AWS Resource specification\n if override_spec_data:\n RESOURCE_SPECS[region] = merge_spec(override_spec_data, spec)\n\n # Grab a list of all resources\n all_resources = list(RESOURCE_SPECS[region]['ResourceTypes'].keys())[:]\n\n resources = []\n\n # Remove unsupported resource using includes\n if includes:\n for include in includes:\n regex = re.compile(include.replace('*', '(.*)') + '$')\n matches = [string for string in all_resources if re.match(regex, string)]\n\n resources.extend(matches)\n else:\n resources = all_resources[:]\n\n # Remove unsupported resources using the excludes\n if excludes:\n for exclude in excludes:\n regex = re.compile(exclude.replace('*', '(.*)') + '$')\n matches = [string for string in resources if re.match(regex, string)]\n\n for match in matches:\n resources.remove(match)\n\n # Remove unsupported resources\n for resource in all_resources:\n if resource not in resources:\n del RESOURCE_SPECS[region]['ResourceTypes'][resource]\n\n\ndef initialize_specs():\n \"\"\" Reload Resource Specs \"\"\"\n for reg in REGIONS:\n RESOURCE_SPECS[reg] = load_resources(filename=('/data/CloudSpecs/%s.json' % reg))\n\n\ninitialize_specs()\n\n\ndef load_plugins(directory):\n \"\"\"Load plugins\"\"\"\n result = []\n fh = None\n\n for root, _, filenames in os.walk(directory):\n for filename in fnmatch.filter(filenames, '[A-Za-z]*.py'):\n pluginname = filename.replace('.py', '')\n try:\n fh, filename, desc = imp.find_module(pluginname, [root])\n mod = imp.load_module(pluginname, fh, filename, desc)\n for _, clazz in inspect.getmembers(mod, inspect.isclass):\n method_resolution = inspect.getmro(clazz)\n if [clz for clz in method_resolution[1:] if clz.__module__ == 'cfnlint' and clz.__name__ == 'CloudFormationLintRule']:\n # create and instance of subclasses of CloudFormationLintRule\n obj = clazz()\n result.append(obj)\n finally:\n if fh:\n fh.close()\n return result\n\n\ndef override_specs(override_spec_file):\n \"\"\"Override specs file\"\"\"\n try:\n filename = override_spec_file\n custom_spec_data = json.load(open(filename))\n\n set_specs(custom_spec_data)\n except IOError as e:\n if e.errno == 2:\n LOGGER.error('Override spec file not found: %s', filename)\n sys.exit(1)\n elif e.errno == 21:\n LOGGER.error('Override spec file references a directory, not a file: %s', filename)\n sys.exit(1)\n elif e.errno == 13:\n LOGGER.error('Permission denied when accessing override spec file: %s', filename)\n sys.exit(1)\n except (ValueError) as err:\n LOGGER.error('Override spec file %s is malformed: %s', filename, err)\n sys.exit(1)\n", "path": "src/cfnlint/helpers.py"}]}
| 3,127 | 235 |
gh_patches_debug_19450
|
rasdani/github-patches
|
git_diff
|
urllib3__urllib3-711
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
SSL cert digest check uses non-constant-time comparison
This is the problematic line:
https://github.com/shazow/urllib3/blob/master/urllib3/util/ssl_.py#L137
The digests being compared are strings, and `==` is not constant-time, which could potentially allow a timing attack. While it may not immediately be obvious whether an adversary could gain any advantage by successfully timing this line, I am a firm believer in "better safe than sorry."
The best way to fix this is to use a constant-time digest comparison function, such as the built-in `hmac.compare_digest`. However, this function was only introduced in Python 2.7.7, while this library supports Python 2.6+. The next best solution would be to use something like the `constant_time_compare` function from the following package:
https://github.com/mitsuhiko/itsdangerous
Using a constant-time comparison is semantically equivalent to using `==`, and the performance impact is negligible and comes along with improved security. I believe this is a good tradeoff.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `urllib3/util/ssl_.py`
Content:
```
1 from binascii import hexlify, unhexlify
2 from hashlib import md5, sha1, sha256
3
4 from ..exceptions import SSLError, InsecurePlatformWarning
5
6
7 SSLContext = None
8 HAS_SNI = False
9 create_default_context = None
10
11 # Maps the length of a digest to a possible hash function producing this digest
12 HASHFUNC_MAP = {
13 32: md5,
14 40: sha1,
15 64: sha256,
16 }
17
18 import errno
19 import warnings
20
21 try: # Test for SSL features
22 import ssl
23 from ssl import wrap_socket, CERT_NONE, PROTOCOL_SSLv23
24 from ssl import HAS_SNI # Has SNI?
25 except ImportError:
26 pass
27
28
29 try:
30 from ssl import OP_NO_SSLv2, OP_NO_SSLv3, OP_NO_COMPRESSION
31 except ImportError:
32 OP_NO_SSLv2, OP_NO_SSLv3 = 0x1000000, 0x2000000
33 OP_NO_COMPRESSION = 0x20000
34
35 # A secure default.
36 # Sources for more information on TLS ciphers:
37 #
38 # - https://wiki.mozilla.org/Security/Server_Side_TLS
39 # - https://www.ssllabs.com/projects/best-practices/index.html
40 # - https://hynek.me/articles/hardening-your-web-servers-ssl-ciphers/
41 #
42 # The general intent is:
43 # - Prefer cipher suites that offer perfect forward secrecy (DHE/ECDHE),
44 # - prefer ECDHE over DHE for better performance,
45 # - prefer any AES-GCM over any AES-CBC for better performance and security,
46 # - use 3DES as fallback which is secure but slow,
47 # - disable NULL authentication, MD5 MACs and DSS for security reasons.
48 DEFAULT_CIPHERS = (
49 'ECDH+AESGCM:DH+AESGCM:ECDH+AES256:DH+AES256:ECDH+AES128:DH+AES:ECDH+HIGH:'
50 'DH+HIGH:ECDH+3DES:DH+3DES:RSA+AESGCM:RSA+AES:RSA+HIGH:RSA+3DES:!aNULL:'
51 '!eNULL:!MD5'
52 )
53
54 try:
55 from ssl import SSLContext # Modern SSL?
56 except ImportError:
57 import sys
58
59 class SSLContext(object): # Platform-specific: Python 2 & 3.1
60 supports_set_ciphers = ((2, 7) <= sys.version_info < (3,) or
61 (3, 2) <= sys.version_info)
62
63 def __init__(self, protocol_version):
64 self.protocol = protocol_version
65 # Use default values from a real SSLContext
66 self.check_hostname = False
67 self.verify_mode = ssl.CERT_NONE
68 self.ca_certs = None
69 self.options = 0
70 self.certfile = None
71 self.keyfile = None
72 self.ciphers = None
73
74 def load_cert_chain(self, certfile, keyfile):
75 self.certfile = certfile
76 self.keyfile = keyfile
77
78 def load_verify_locations(self, cafile=None, capath=None):
79 self.ca_certs = cafile
80
81 if capath is not None:
82 raise SSLError("CA directories not supported in older Pythons")
83
84 def set_ciphers(self, cipher_suite):
85 if not self.supports_set_ciphers:
86 raise TypeError(
87 'Your version of Python does not support setting '
88 'a custom cipher suite. Please upgrade to Python '
89 '2.7, 3.2, or later if you need this functionality.'
90 )
91 self.ciphers = cipher_suite
92
93 def wrap_socket(self, socket, server_hostname=None):
94 warnings.warn(
95 'A true SSLContext object is not available. This prevents '
96 'urllib3 from configuring SSL appropriately and may cause '
97 'certain SSL connections to fail. For more information, see '
98 'https://urllib3.readthedocs.org/en/latest/security.html'
99 '#insecureplatformwarning.',
100 InsecurePlatformWarning
101 )
102 kwargs = {
103 'keyfile': self.keyfile,
104 'certfile': self.certfile,
105 'ca_certs': self.ca_certs,
106 'cert_reqs': self.verify_mode,
107 'ssl_version': self.protocol,
108 }
109 if self.supports_set_ciphers: # Platform-specific: Python 2.7+
110 return wrap_socket(socket, ciphers=self.ciphers, **kwargs)
111 else: # Platform-specific: Python 2.6
112 return wrap_socket(socket, **kwargs)
113
114
115 def assert_fingerprint(cert, fingerprint):
116 """
117 Checks if given fingerprint matches the supplied certificate.
118
119 :param cert:
120 Certificate as bytes object.
121 :param fingerprint:
122 Fingerprint as string of hexdigits, can be interspersed by colons.
123 """
124
125 fingerprint = fingerprint.replace(':', '').lower()
126 digest_length = len(fingerprint)
127 hashfunc = HASHFUNC_MAP.get(digest_length)
128 if not hashfunc:
129 raise SSLError(
130 'Fingerprint of invalid length: {0}'.format(fingerprint))
131
132 # We need encode() here for py32; works on py2 and p33.
133 fingerprint_bytes = unhexlify(fingerprint.encode())
134
135 cert_digest = hashfunc(cert).digest()
136
137 if cert_digest != fingerprint_bytes:
138 raise SSLError('Fingerprints did not match. Expected "{0}", got "{1}".'
139 .format(fingerprint, hexlify(cert_digest)))
140
141
142 def resolve_cert_reqs(candidate):
143 """
144 Resolves the argument to a numeric constant, which can be passed to
145 the wrap_socket function/method from the ssl module.
146 Defaults to :data:`ssl.CERT_NONE`.
147 If given a string it is assumed to be the name of the constant in the
148 :mod:`ssl` module or its abbrevation.
149 (So you can specify `REQUIRED` instead of `CERT_REQUIRED`.
150 If it's neither `None` nor a string we assume it is already the numeric
151 constant which can directly be passed to wrap_socket.
152 """
153 if candidate is None:
154 return CERT_NONE
155
156 if isinstance(candidate, str):
157 res = getattr(ssl, candidate, None)
158 if res is None:
159 res = getattr(ssl, 'CERT_' + candidate)
160 return res
161
162 return candidate
163
164
165 def resolve_ssl_version(candidate):
166 """
167 like resolve_cert_reqs
168 """
169 if candidate is None:
170 return PROTOCOL_SSLv23
171
172 if isinstance(candidate, str):
173 res = getattr(ssl, candidate, None)
174 if res is None:
175 res = getattr(ssl, 'PROTOCOL_' + candidate)
176 return res
177
178 return candidate
179
180
181 def create_urllib3_context(ssl_version=None, cert_reqs=None,
182 options=None, ciphers=None):
183 """All arguments have the same meaning as ``ssl_wrap_socket``.
184
185 By default, this function does a lot of the same work that
186 ``ssl.create_default_context`` does on Python 3.4+. It:
187
188 - Disables SSLv2, SSLv3, and compression
189 - Sets a restricted set of server ciphers
190
191 If you wish to enable SSLv3, you can do::
192
193 from urllib3.util import ssl_
194 context = ssl_.create_urllib3_context()
195 context.options &= ~ssl_.OP_NO_SSLv3
196
197 You can do the same to enable compression (substituting ``COMPRESSION``
198 for ``SSLv3`` in the last line above).
199
200 :param ssl_version:
201 The desired protocol version to use. This will default to
202 PROTOCOL_SSLv23 which will negotiate the highest protocol that both
203 the server and your installation of OpenSSL support.
204 :param cert_reqs:
205 Whether to require the certificate verification. This defaults to
206 ``ssl.CERT_REQUIRED``.
207 :param options:
208 Specific OpenSSL options. These default to ``ssl.OP_NO_SSLv2``,
209 ``ssl.OP_NO_SSLv3``, ``ssl.OP_NO_COMPRESSION``.
210 :param ciphers:
211 Which cipher suites to allow the server to select.
212 :returns:
213 Constructed SSLContext object with specified options
214 :rtype: SSLContext
215 """
216 context = SSLContext(ssl_version or ssl.PROTOCOL_SSLv23)
217
218 # Setting the default here, as we may have no ssl module on import
219 cert_reqs = ssl.CERT_REQUIRED if cert_reqs is None else cert_reqs
220
221 if options is None:
222 options = 0
223 # SSLv2 is easily broken and is considered harmful and dangerous
224 options |= OP_NO_SSLv2
225 # SSLv3 has several problems and is now dangerous
226 options |= OP_NO_SSLv3
227 # Disable compression to prevent CRIME attacks for OpenSSL 1.0+
228 # (issue #309)
229 options |= OP_NO_COMPRESSION
230
231 context.options |= options
232
233 if getattr(context, 'supports_set_ciphers', True): # Platform-specific: Python 2.6
234 context.set_ciphers(ciphers or DEFAULT_CIPHERS)
235
236 context.verify_mode = cert_reqs
237 if getattr(context, 'check_hostname', None) is not None: # Platform-specific: Python 3.2
238 # We do our own verification, including fingerprints and alternative
239 # hostnames. So disable it here
240 context.check_hostname = False
241 return context
242
243
244 def ssl_wrap_socket(sock, keyfile=None, certfile=None, cert_reqs=None,
245 ca_certs=None, server_hostname=None,
246 ssl_version=None, ciphers=None, ssl_context=None,
247 ca_cert_dir=None):
248 """
249 All arguments except for server_hostname, ssl_context, and ca_cert_dir have
250 the same meaning as they do when using :func:`ssl.wrap_socket`.
251
252 :param server_hostname:
253 When SNI is supported, the expected hostname of the certificate
254 :param ssl_context:
255 A pre-made :class:`SSLContext` object. If none is provided, one will
256 be created using :func:`create_urllib3_context`.
257 :param ciphers:
258 A string of ciphers we wish the client to support. This is not
259 supported on Python 2.6 as the ssl module does not support it.
260 :param ca_cert_dir:
261 A directory containing CA certificates in multiple separate files, as
262 supported by OpenSSL's -CApath flag or the capath argument to
263 SSLContext.load_verify_locations().
264 """
265 context = ssl_context
266 if context is None:
267 context = create_urllib3_context(ssl_version, cert_reqs,
268 ciphers=ciphers)
269
270 if ca_certs or ca_cert_dir:
271 try:
272 context.load_verify_locations(ca_certs, ca_cert_dir)
273 except IOError as e: # Platform-specific: Python 2.6, 2.7, 3.2
274 raise SSLError(e)
275 # Py33 raises FileNotFoundError which subclasses OSError
276 # These are not equivalent unless we check the errno attribute
277 except OSError as e: # Platform-specific: Python 3.3 and beyond
278 if e.errno == errno.ENOENT:
279 raise SSLError(e)
280 raise
281
282 if certfile:
283 context.load_cert_chain(certfile, keyfile)
284 if HAS_SNI: # Platform-specific: OpenSSL with enabled SNI
285 return context.wrap_socket(sock, server_hostname=server_hostname)
286 return context.wrap_socket(sock)
287
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/urllib3/util/ssl_.py b/urllib3/util/ssl_.py
--- a/urllib3/util/ssl_.py
+++ b/urllib3/util/ssl_.py
@@ -17,6 +17,25 @@
import errno
import warnings
+import hmac
+
+
+def _const_compare_digest_backport(a, b):
+ """
+ Compare two digests of equal length in constant time.
+
+ The digests must be of type str/bytes.
+ Returns True if the digests match, and False otherwise.
+ """
+ result = abs(len(a) - len(b))
+ for l, r in zip(bytearray(a), bytearray(b)):
+ result |= l ^ r
+ return result == 0
+
+
+_const_compare_digest = getattr(hmac, 'compare_digest',
+ _const_compare_digest_backport)
+
try: # Test for SSL features
import ssl
@@ -134,7 +153,7 @@
cert_digest = hashfunc(cert).digest()
- if cert_digest != fingerprint_bytes:
+ if not _const_compare_digest(cert_digest, fingerprint_bytes):
raise SSLError('Fingerprints did not match. Expected "{0}", got "{1}".'
.format(fingerprint, hexlify(cert_digest)))
|
{"golden_diff": "diff --git a/urllib3/util/ssl_.py b/urllib3/util/ssl_.py\n--- a/urllib3/util/ssl_.py\n+++ b/urllib3/util/ssl_.py\n@@ -17,6 +17,25 @@\n \n import errno\n import warnings\n+import hmac\n+\n+\n+def _const_compare_digest_backport(a, b):\n+ \"\"\"\n+ Compare two digests of equal length in constant time.\n+\n+ The digests must be of type str/bytes.\n+ Returns True if the digests match, and False otherwise.\n+ \"\"\"\n+ result = abs(len(a) - len(b))\n+ for l, r in zip(bytearray(a), bytearray(b)):\n+ result |= l ^ r\n+ return result == 0\n+\n+\n+_const_compare_digest = getattr(hmac, 'compare_digest',\n+ _const_compare_digest_backport)\n+\n \n try: # Test for SSL features\n import ssl\n@@ -134,7 +153,7 @@\n \n cert_digest = hashfunc(cert).digest()\n \n- if cert_digest != fingerprint_bytes:\n+ if not _const_compare_digest(cert_digest, fingerprint_bytes):\n raise SSLError('Fingerprints did not match. Expected \"{0}\", got \"{1}\".'\n .format(fingerprint, hexlify(cert_digest)))\n", "issue": "SSL cert digest check uses non-constant-time comparison\nThis is the problematic line:\nhttps://github.com/shazow/urllib3/blob/master/urllib3/util/ssl_.py#L137\n\nThe digests being compared are strings, and `==` is not constant-time, which could potentially allow a timing attack. While it may not immediately be obvious whether an adversary could gain any advantage by successfully timing this line, I am a firm believer in \"better safe than sorry.\"\n\nThe best way to fix this is to use a constant-time digest comparison function, such as the built-in `hmac.compare_digest`. However, this function was only introduced in Python 2.7.7, while this library supports Python 2.6+. The next best solution would be to use something like the `constant_time_compare` function from the following package:\nhttps://github.com/mitsuhiko/itsdangerous\n\nUsing a constant-time comparison is semantically equivalent to using `==`, and the performance impact is negligible and comes along with improved security. I believe this is a good tradeoff.\n\n", "before_files": [{"content": "from binascii import hexlify, unhexlify\nfrom hashlib import md5, sha1, sha256\n\nfrom ..exceptions import SSLError, InsecurePlatformWarning\n\n\nSSLContext = None\nHAS_SNI = False\ncreate_default_context = None\n\n# Maps the length of a digest to a possible hash function producing this digest\nHASHFUNC_MAP = {\n 32: md5,\n 40: sha1,\n 64: sha256,\n}\n\nimport errno\nimport warnings\n\ntry: # Test for SSL features\n import ssl\n from ssl import wrap_socket, CERT_NONE, PROTOCOL_SSLv23\n from ssl import HAS_SNI # Has SNI?\nexcept ImportError:\n pass\n\n\ntry:\n from ssl import OP_NO_SSLv2, OP_NO_SSLv3, OP_NO_COMPRESSION\nexcept ImportError:\n OP_NO_SSLv2, OP_NO_SSLv3 = 0x1000000, 0x2000000\n OP_NO_COMPRESSION = 0x20000\n\n# A secure default.\n# Sources for more information on TLS ciphers:\n#\n# - https://wiki.mozilla.org/Security/Server_Side_TLS\n# - https://www.ssllabs.com/projects/best-practices/index.html\n# - https://hynek.me/articles/hardening-your-web-servers-ssl-ciphers/\n#\n# The general intent is:\n# - Prefer cipher suites that offer perfect forward secrecy (DHE/ECDHE),\n# - prefer ECDHE over DHE for better performance,\n# - prefer any AES-GCM over any AES-CBC for better performance and security,\n# - use 3DES as fallback which is secure but slow,\n# - disable NULL authentication, MD5 MACs and DSS for security reasons.\nDEFAULT_CIPHERS = (\n 'ECDH+AESGCM:DH+AESGCM:ECDH+AES256:DH+AES256:ECDH+AES128:DH+AES:ECDH+HIGH:'\n 'DH+HIGH:ECDH+3DES:DH+3DES:RSA+AESGCM:RSA+AES:RSA+HIGH:RSA+3DES:!aNULL:'\n '!eNULL:!MD5'\n)\n\ntry:\n from ssl import SSLContext # Modern SSL?\nexcept ImportError:\n import sys\n\n class SSLContext(object): # Platform-specific: Python 2 & 3.1\n supports_set_ciphers = ((2, 7) <= sys.version_info < (3,) or\n (3, 2) <= sys.version_info)\n\n def __init__(self, protocol_version):\n self.protocol = protocol_version\n # Use default values from a real SSLContext\n self.check_hostname = False\n self.verify_mode = ssl.CERT_NONE\n self.ca_certs = None\n self.options = 0\n self.certfile = None\n self.keyfile = None\n self.ciphers = None\n\n def load_cert_chain(self, certfile, keyfile):\n self.certfile = certfile\n self.keyfile = keyfile\n\n def load_verify_locations(self, cafile=None, capath=None):\n self.ca_certs = cafile\n\n if capath is not None:\n raise SSLError(\"CA directories not supported in older Pythons\")\n\n def set_ciphers(self, cipher_suite):\n if not self.supports_set_ciphers:\n raise TypeError(\n 'Your version of Python does not support setting '\n 'a custom cipher suite. Please upgrade to Python '\n '2.7, 3.2, or later if you need this functionality.'\n )\n self.ciphers = cipher_suite\n\n def wrap_socket(self, socket, server_hostname=None):\n warnings.warn(\n 'A true SSLContext object is not available. This prevents '\n 'urllib3 from configuring SSL appropriately and may cause '\n 'certain SSL connections to fail. For more information, see '\n 'https://urllib3.readthedocs.org/en/latest/security.html'\n '#insecureplatformwarning.',\n InsecurePlatformWarning\n )\n kwargs = {\n 'keyfile': self.keyfile,\n 'certfile': self.certfile,\n 'ca_certs': self.ca_certs,\n 'cert_reqs': self.verify_mode,\n 'ssl_version': self.protocol,\n }\n if self.supports_set_ciphers: # Platform-specific: Python 2.7+\n return wrap_socket(socket, ciphers=self.ciphers, **kwargs)\n else: # Platform-specific: Python 2.6\n return wrap_socket(socket, **kwargs)\n\n\ndef assert_fingerprint(cert, fingerprint):\n \"\"\"\n Checks if given fingerprint matches the supplied certificate.\n\n :param cert:\n Certificate as bytes object.\n :param fingerprint:\n Fingerprint as string of hexdigits, can be interspersed by colons.\n \"\"\"\n\n fingerprint = fingerprint.replace(':', '').lower()\n digest_length = len(fingerprint)\n hashfunc = HASHFUNC_MAP.get(digest_length)\n if not hashfunc:\n raise SSLError(\n 'Fingerprint of invalid length: {0}'.format(fingerprint))\n\n # We need encode() here for py32; works on py2 and p33.\n fingerprint_bytes = unhexlify(fingerprint.encode())\n\n cert_digest = hashfunc(cert).digest()\n\n if cert_digest != fingerprint_bytes:\n raise SSLError('Fingerprints did not match. Expected \"{0}\", got \"{1}\".'\n .format(fingerprint, hexlify(cert_digest)))\n\n\ndef resolve_cert_reqs(candidate):\n \"\"\"\n Resolves the argument to a numeric constant, which can be passed to\n the wrap_socket function/method from the ssl module.\n Defaults to :data:`ssl.CERT_NONE`.\n If given a string it is assumed to be the name of the constant in the\n :mod:`ssl` module or its abbrevation.\n (So you can specify `REQUIRED` instead of `CERT_REQUIRED`.\n If it's neither `None` nor a string we assume it is already the numeric\n constant which can directly be passed to wrap_socket.\n \"\"\"\n if candidate is None:\n return CERT_NONE\n\n if isinstance(candidate, str):\n res = getattr(ssl, candidate, None)\n if res is None:\n res = getattr(ssl, 'CERT_' + candidate)\n return res\n\n return candidate\n\n\ndef resolve_ssl_version(candidate):\n \"\"\"\n like resolve_cert_reqs\n \"\"\"\n if candidate is None:\n return PROTOCOL_SSLv23\n\n if isinstance(candidate, str):\n res = getattr(ssl, candidate, None)\n if res is None:\n res = getattr(ssl, 'PROTOCOL_' + candidate)\n return res\n\n return candidate\n\n\ndef create_urllib3_context(ssl_version=None, cert_reqs=None,\n options=None, ciphers=None):\n \"\"\"All arguments have the same meaning as ``ssl_wrap_socket``.\n\n By default, this function does a lot of the same work that\n ``ssl.create_default_context`` does on Python 3.4+. It:\n\n - Disables SSLv2, SSLv3, and compression\n - Sets a restricted set of server ciphers\n\n If you wish to enable SSLv3, you can do::\n\n from urllib3.util import ssl_\n context = ssl_.create_urllib3_context()\n context.options &= ~ssl_.OP_NO_SSLv3\n\n You can do the same to enable compression (substituting ``COMPRESSION``\n for ``SSLv3`` in the last line above).\n\n :param ssl_version:\n The desired protocol version to use. This will default to\n PROTOCOL_SSLv23 which will negotiate the highest protocol that both\n the server and your installation of OpenSSL support.\n :param cert_reqs:\n Whether to require the certificate verification. This defaults to\n ``ssl.CERT_REQUIRED``.\n :param options:\n Specific OpenSSL options. These default to ``ssl.OP_NO_SSLv2``,\n ``ssl.OP_NO_SSLv3``, ``ssl.OP_NO_COMPRESSION``.\n :param ciphers:\n Which cipher suites to allow the server to select.\n :returns:\n Constructed SSLContext object with specified options\n :rtype: SSLContext\n \"\"\"\n context = SSLContext(ssl_version or ssl.PROTOCOL_SSLv23)\n\n # Setting the default here, as we may have no ssl module on import\n cert_reqs = ssl.CERT_REQUIRED if cert_reqs is None else cert_reqs\n\n if options is None:\n options = 0\n # SSLv2 is easily broken and is considered harmful and dangerous\n options |= OP_NO_SSLv2\n # SSLv3 has several problems and is now dangerous\n options |= OP_NO_SSLv3\n # Disable compression to prevent CRIME attacks for OpenSSL 1.0+\n # (issue #309)\n options |= OP_NO_COMPRESSION\n\n context.options |= options\n\n if getattr(context, 'supports_set_ciphers', True): # Platform-specific: Python 2.6\n context.set_ciphers(ciphers or DEFAULT_CIPHERS)\n\n context.verify_mode = cert_reqs\n if getattr(context, 'check_hostname', None) is not None: # Platform-specific: Python 3.2\n # We do our own verification, including fingerprints and alternative\n # hostnames. So disable it here\n context.check_hostname = False\n return context\n\n\ndef ssl_wrap_socket(sock, keyfile=None, certfile=None, cert_reqs=None,\n ca_certs=None, server_hostname=None,\n ssl_version=None, ciphers=None, ssl_context=None,\n ca_cert_dir=None):\n \"\"\"\n All arguments except for server_hostname, ssl_context, and ca_cert_dir have\n the same meaning as they do when using :func:`ssl.wrap_socket`.\n\n :param server_hostname:\n When SNI is supported, the expected hostname of the certificate\n :param ssl_context:\n A pre-made :class:`SSLContext` object. If none is provided, one will\n be created using :func:`create_urllib3_context`.\n :param ciphers:\n A string of ciphers we wish the client to support. This is not\n supported on Python 2.6 as the ssl module does not support it.\n :param ca_cert_dir:\n A directory containing CA certificates in multiple separate files, as\n supported by OpenSSL's -CApath flag or the capath argument to\n SSLContext.load_verify_locations().\n \"\"\"\n context = ssl_context\n if context is None:\n context = create_urllib3_context(ssl_version, cert_reqs,\n ciphers=ciphers)\n\n if ca_certs or ca_cert_dir:\n try:\n context.load_verify_locations(ca_certs, ca_cert_dir)\n except IOError as e: # Platform-specific: Python 2.6, 2.7, 3.2\n raise SSLError(e)\n # Py33 raises FileNotFoundError which subclasses OSError\n # These are not equivalent unless we check the errno attribute\n except OSError as e: # Platform-specific: Python 3.3 and beyond\n if e.errno == errno.ENOENT:\n raise SSLError(e)\n raise\n\n if certfile:\n context.load_cert_chain(certfile, keyfile)\n if HAS_SNI: # Platform-specific: OpenSSL with enabled SNI\n return context.wrap_socket(sock, server_hostname=server_hostname)\n return context.wrap_socket(sock)\n", "path": "urllib3/util/ssl_.py"}], "after_files": [{"content": "from binascii import hexlify, unhexlify\nfrom hashlib import md5, sha1, sha256\n\nfrom ..exceptions import SSLError, InsecurePlatformWarning\n\n\nSSLContext = None\nHAS_SNI = False\ncreate_default_context = None\n\n# Maps the length of a digest to a possible hash function producing this digest\nHASHFUNC_MAP = {\n 32: md5,\n 40: sha1,\n 64: sha256,\n}\n\nimport errno\nimport warnings\nimport hmac\n\n\ndef _const_compare_digest_backport(a, b):\n \"\"\"\n Compare two digests of equal length in constant time.\n\n The digests must be of type str/bytes.\n Returns True if the digests match, and False otherwise.\n \"\"\"\n result = abs(len(a) - len(b))\n for l, r in zip(bytearray(a), bytearray(b)):\n result |= l ^ r\n return result == 0\n\n\n_const_compare_digest = getattr(hmac, 'compare_digest',\n _const_compare_digest_backport)\n\n\ntry: # Test for SSL features\n import ssl\n from ssl import wrap_socket, CERT_NONE, PROTOCOL_SSLv23\n from ssl import HAS_SNI # Has SNI?\nexcept ImportError:\n pass\n\n\ntry:\n from ssl import OP_NO_SSLv2, OP_NO_SSLv3, OP_NO_COMPRESSION\nexcept ImportError:\n OP_NO_SSLv2, OP_NO_SSLv3 = 0x1000000, 0x2000000\n OP_NO_COMPRESSION = 0x20000\n\n# A secure default.\n# Sources for more information on TLS ciphers:\n#\n# - https://wiki.mozilla.org/Security/Server_Side_TLS\n# - https://www.ssllabs.com/projects/best-practices/index.html\n# - https://hynek.me/articles/hardening-your-web-servers-ssl-ciphers/\n#\n# The general intent is:\n# - Prefer cipher suites that offer perfect forward secrecy (DHE/ECDHE),\n# - prefer ECDHE over DHE for better performance,\n# - prefer any AES-GCM over any AES-CBC for better performance and security,\n# - use 3DES as fallback which is secure but slow,\n# - disable NULL authentication, MD5 MACs and DSS for security reasons.\nDEFAULT_CIPHERS = (\n 'ECDH+AESGCM:DH+AESGCM:ECDH+AES256:DH+AES256:ECDH+AES128:DH+AES:ECDH+HIGH:'\n 'DH+HIGH:ECDH+3DES:DH+3DES:RSA+AESGCM:RSA+AES:RSA+HIGH:RSA+3DES:!aNULL:'\n '!eNULL:!MD5'\n)\n\ntry:\n from ssl import SSLContext # Modern SSL?\nexcept ImportError:\n import sys\n\n class SSLContext(object): # Platform-specific: Python 2 & 3.1\n supports_set_ciphers = ((2, 7) <= sys.version_info < (3,) or\n (3, 2) <= sys.version_info)\n\n def __init__(self, protocol_version):\n self.protocol = protocol_version\n # Use default values from a real SSLContext\n self.check_hostname = False\n self.verify_mode = ssl.CERT_NONE\n self.ca_certs = None\n self.options = 0\n self.certfile = None\n self.keyfile = None\n self.ciphers = None\n\n def load_cert_chain(self, certfile, keyfile):\n self.certfile = certfile\n self.keyfile = keyfile\n\n def load_verify_locations(self, cafile=None, capath=None):\n self.ca_certs = cafile\n\n if capath is not None:\n raise SSLError(\"CA directories not supported in older Pythons\")\n\n def set_ciphers(self, cipher_suite):\n if not self.supports_set_ciphers:\n raise TypeError(\n 'Your version of Python does not support setting '\n 'a custom cipher suite. Please upgrade to Python '\n '2.7, 3.2, or later if you need this functionality.'\n )\n self.ciphers = cipher_suite\n\n def wrap_socket(self, socket, server_hostname=None):\n warnings.warn(\n 'A true SSLContext object is not available. This prevents '\n 'urllib3 from configuring SSL appropriately and may cause '\n 'certain SSL connections to fail. For more information, see '\n 'https://urllib3.readthedocs.org/en/latest/security.html'\n '#insecureplatformwarning.',\n InsecurePlatformWarning\n )\n kwargs = {\n 'keyfile': self.keyfile,\n 'certfile': self.certfile,\n 'ca_certs': self.ca_certs,\n 'cert_reqs': self.verify_mode,\n 'ssl_version': self.protocol,\n }\n if self.supports_set_ciphers: # Platform-specific: Python 2.7+\n return wrap_socket(socket, ciphers=self.ciphers, **kwargs)\n else: # Platform-specific: Python 2.6\n return wrap_socket(socket, **kwargs)\n\n\ndef assert_fingerprint(cert, fingerprint):\n \"\"\"\n Checks if given fingerprint matches the supplied certificate.\n\n :param cert:\n Certificate as bytes object.\n :param fingerprint:\n Fingerprint as string of hexdigits, can be interspersed by colons.\n \"\"\"\n\n fingerprint = fingerprint.replace(':', '').lower()\n digest_length = len(fingerprint)\n hashfunc = HASHFUNC_MAP.get(digest_length)\n if not hashfunc:\n raise SSLError(\n 'Fingerprint of invalid length: {0}'.format(fingerprint))\n\n # We need encode() here for py32; works on py2 and p33.\n fingerprint_bytes = unhexlify(fingerprint.encode())\n\n cert_digest = hashfunc(cert).digest()\n\n if not _const_compare_digest(cert_digest, fingerprint_bytes):\n raise SSLError('Fingerprints did not match. Expected \"{0}\", got \"{1}\".'\n .format(fingerprint, hexlify(cert_digest)))\n\n\ndef resolve_cert_reqs(candidate):\n \"\"\"\n Resolves the argument to a numeric constant, which can be passed to\n the wrap_socket function/method from the ssl module.\n Defaults to :data:`ssl.CERT_NONE`.\n If given a string it is assumed to be the name of the constant in the\n :mod:`ssl` module or its abbrevation.\n (So you can specify `REQUIRED` instead of `CERT_REQUIRED`.\n If it's neither `None` nor a string we assume it is already the numeric\n constant which can directly be passed to wrap_socket.\n \"\"\"\n if candidate is None:\n return CERT_NONE\n\n if isinstance(candidate, str):\n res = getattr(ssl, candidate, None)\n if res is None:\n res = getattr(ssl, 'CERT_' + candidate)\n return res\n\n return candidate\n\n\ndef resolve_ssl_version(candidate):\n \"\"\"\n like resolve_cert_reqs\n \"\"\"\n if candidate is None:\n return PROTOCOL_SSLv23\n\n if isinstance(candidate, str):\n res = getattr(ssl, candidate, None)\n if res is None:\n res = getattr(ssl, 'PROTOCOL_' + candidate)\n return res\n\n return candidate\n\n\ndef create_urllib3_context(ssl_version=None, cert_reqs=None,\n options=None, ciphers=None):\n \"\"\"All arguments have the same meaning as ``ssl_wrap_socket``.\n\n By default, this function does a lot of the same work that\n ``ssl.create_default_context`` does on Python 3.4+. It:\n\n - Disables SSLv2, SSLv3, and compression\n - Sets a restricted set of server ciphers\n\n If you wish to enable SSLv3, you can do::\n\n from urllib3.util import ssl_\n context = ssl_.create_urllib3_context()\n context.options &= ~ssl_.OP_NO_SSLv3\n\n You can do the same to enable compression (substituting ``COMPRESSION``\n for ``SSLv3`` in the last line above).\n\n :param ssl_version:\n The desired protocol version to use. This will default to\n PROTOCOL_SSLv23 which will negotiate the highest protocol that both\n the server and your installation of OpenSSL support.\n :param cert_reqs:\n Whether to require the certificate verification. This defaults to\n ``ssl.CERT_REQUIRED``.\n :param options:\n Specific OpenSSL options. These default to ``ssl.OP_NO_SSLv2``,\n ``ssl.OP_NO_SSLv3``, ``ssl.OP_NO_COMPRESSION``.\n :param ciphers:\n Which cipher suites to allow the server to select.\n :returns:\n Constructed SSLContext object with specified options\n :rtype: SSLContext\n \"\"\"\n context = SSLContext(ssl_version or ssl.PROTOCOL_SSLv23)\n\n # Setting the default here, as we may have no ssl module on import\n cert_reqs = ssl.CERT_REQUIRED if cert_reqs is None else cert_reqs\n\n if options is None:\n options = 0\n # SSLv2 is easily broken and is considered harmful and dangerous\n options |= OP_NO_SSLv2\n # SSLv3 has several problems and is now dangerous\n options |= OP_NO_SSLv3\n # Disable compression to prevent CRIME attacks for OpenSSL 1.0+\n # (issue #309)\n options |= OP_NO_COMPRESSION\n\n context.options |= options\n\n if getattr(context, 'supports_set_ciphers', True): # Platform-specific: Python 2.6\n context.set_ciphers(ciphers or DEFAULT_CIPHERS)\n\n context.verify_mode = cert_reqs\n if getattr(context, 'check_hostname', None) is not None: # Platform-specific: Python 3.2\n # We do our own verification, including fingerprints and alternative\n # hostnames. So disable it here\n context.check_hostname = False\n return context\n\n\ndef ssl_wrap_socket(sock, keyfile=None, certfile=None, cert_reqs=None,\n ca_certs=None, server_hostname=None,\n ssl_version=None, ciphers=None, ssl_context=None,\n ca_cert_dir=None):\n \"\"\"\n All arguments except for server_hostname, ssl_context, and ca_cert_dir have\n the same meaning as they do when using :func:`ssl.wrap_socket`.\n\n :param server_hostname:\n When SNI is supported, the expected hostname of the certificate\n :param ssl_context:\n A pre-made :class:`SSLContext` object. If none is provided, one will\n be created using :func:`create_urllib3_context`.\n :param ciphers:\n A string of ciphers we wish the client to support. This is not\n supported on Python 2.6 as the ssl module does not support it.\n :param ca_cert_dir:\n A directory containing CA certificates in multiple separate files, as\n supported by OpenSSL's -CApath flag or the capath argument to\n SSLContext.load_verify_locations().\n \"\"\"\n context = ssl_context\n if context is None:\n context = create_urllib3_context(ssl_version, cert_reqs,\n ciphers=ciphers)\n\n if ca_certs or ca_cert_dir:\n try:\n context.load_verify_locations(ca_certs, ca_cert_dir)\n except IOError as e: # Platform-specific: Python 2.6, 2.7, 3.2\n raise SSLError(e)\n # Py33 raises FileNotFoundError which subclasses OSError\n # These are not equivalent unless we check the errno attribute\n except OSError as e: # Platform-specific: Python 3.3 and beyond\n if e.errno == errno.ENOENT:\n raise SSLError(e)\n raise\n\n if certfile:\n context.load_cert_chain(certfile, keyfile)\n if HAS_SNI: # Platform-specific: OpenSSL with enabled SNI\n return context.wrap_socket(sock, server_hostname=server_hostname)\n return context.wrap_socket(sock)\n", "path": "urllib3/util/ssl_.py"}]}
| 3,753 | 296 |
gh_patches_debug_312
|
rasdani/github-patches
|
git_diff
|
certbot__certbot-9331
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Remove the third-party `mock` dependency
>Functions in certbot.tests.util were previously using the third party mock module if it was available for backwards compatibiliy. This behavior will be removed and unittest.mock from the standard library will always be used instead.
e.g. https://github.com/certbot/certbot/blob/5c111d0bd1206d864d7cb93754e101f6073bc669/certbot/certbot/tests/util.py#L38-L50
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `certbot/setup.py`
Content:
```
1 import codecs
2 import os
3 import re
4 import sys
5
6 from pkg_resources import parse_version
7 from setuptools import __version__ as setuptools_version
8 from setuptools import find_packages
9 from setuptools import setup
10
11 min_setuptools_version='41.6.0'
12 # This conditional isn't necessary, but it provides better error messages to
13 # people who try to install this package with older versions of setuptools.
14 if parse_version(setuptools_version) < parse_version(min_setuptools_version):
15 raise RuntimeError(f'setuptools {min_setuptools_version}+ is required')
16
17 # Workaround for https://bugs.python.org/issue8876, see
18 # https://bugs.python.org/issue8876#msg208792
19 # This can be removed when using Python 2.7.9 or later:
20 # https://hg.python.org/cpython/raw-file/v2.7.9/Misc/NEWS
21 if os.path.abspath(__file__).split(os.path.sep)[1] == 'vagrant':
22 del os.link
23
24
25 def read_file(filename, encoding='utf8'):
26 """Read unicode from given file."""
27 with codecs.open(filename, encoding=encoding) as fd:
28 return fd.read()
29
30
31 here = os.path.abspath(os.path.dirname(__file__))
32
33 # read version number (and other metadata) from package init
34 init_fn = os.path.join(here, 'certbot', '__init__.py')
35 meta = dict(re.findall(r"""__([a-z]+)__ = '([^']+)""", read_file(init_fn)))
36
37 readme = read_file(os.path.join(here, 'README.rst'))
38 version = meta['version']
39
40 # This package relies on PyOpenSSL and requests, however, it isn't specified
41 # here to avoid masking the more specific request requirements in acme. See
42 # https://github.com/pypa/pip/issues/988 for more info.
43 install_requires = [
44 # We specify the minimum acme version as the current Certbot version for
45 # simplicity. See https://github.com/certbot/certbot/issues/8761 for more
46 # info.
47 f'acme>={version}',
48 # We technically need ConfigArgParse 0.10.0 for Python 2.6 support, but
49 # saying so here causes a runtime error against our temporary fork of 0.9.3
50 # in which we added 2.6 support (see #2243), so we relax the requirement.
51 'ConfigArgParse>=0.9.3',
52 'configobj>=5.0.6',
53 'cryptography>=2.5.0',
54 'distro>=1.0.1',
55 'josepy>=1.13.0',
56 'parsedatetime>=2.4',
57 'pyrfc3339',
58 'pytz>=2019.3',
59 # This dependency needs to be added using environment markers to avoid its
60 # installation on Linux.
61 'pywin32>=300 ; sys_platform == "win32"',
62 f'setuptools>={min_setuptools_version}',
63 ]
64
65 dev_extras = [
66 'azure-devops',
67 'ipdb',
68 # poetry 1.2.0+ is required for it to pin pip, setuptools, and wheel. See
69 # https://github.com/python-poetry/poetry/issues/1584.
70 'poetry>=1.2.0a1',
71 'twine',
72 ]
73
74 docs_extras = [
75 # If you have Sphinx<1.5.1, you need docutils<0.13.1
76 # https://github.com/sphinx-doc/sphinx/issues/3212
77 'Sphinx>=1.2', # Annotation support
78 'sphinx_rtd_theme',
79 ]
80
81 # Tools like pip, wheel, and tox are listed here to ensure they are properly
82 # pinned and installed during automated testing.
83 test_extras = [
84 'coverage',
85 'mypy',
86 'pip',
87 'pylint',
88 'pytest',
89 'pytest-cov',
90 'pytest-xdist',
91 'setuptools',
92 'tox',
93 'types-mock',
94 'types-pyOpenSSL',
95 'types-pyRFC3339',
96 'types-pytz',
97 'types-requests',
98 'types-setuptools',
99 'types-six',
100 # typing-extensions is required to import typing.Protocol and make the mypy checks
101 # pass (along with pylint about non-existent objects) on Python 3.7
102 'typing-extensions',
103 'wheel',
104 ]
105
106
107 all_extras = dev_extras + docs_extras + test_extras
108
109 setup(
110 name='certbot',
111 version=version,
112 description="ACME client",
113 long_description=readme,
114 url='https://github.com/letsencrypt/letsencrypt',
115 author="Certbot Project",
116 author_email='certbot-dev@eff.org',
117 license='Apache License 2.0',
118 python_requires='>=3.7',
119 classifiers=[
120 'Development Status :: 5 - Production/Stable',
121 'Environment :: Console',
122 'Environment :: Console :: Curses',
123 'Intended Audience :: System Administrators',
124 'License :: OSI Approved :: Apache Software License',
125 'Operating System :: POSIX :: Linux',
126 'Programming Language :: Python',
127 'Programming Language :: Python :: 3',
128 'Programming Language :: Python :: 3.7',
129 'Programming Language :: Python :: 3.8',
130 'Programming Language :: Python :: 3.9',
131 'Programming Language :: Python :: 3.10',
132 'Topic :: Internet :: WWW/HTTP',
133 'Topic :: Security',
134 'Topic :: System :: Installation/Setup',
135 'Topic :: System :: Networking',
136 'Topic :: System :: Systems Administration',
137 'Topic :: Utilities',
138 ],
139
140 packages=find_packages(exclude=['docs', 'examples', 'tests', 'venv']),
141 include_package_data=True,
142
143 install_requires=install_requires,
144 extras_require={
145 'all': all_extras,
146 'dev': dev_extras,
147 'docs': docs_extras,
148 'test': test_extras,
149 },
150
151 entry_points={
152 'console_scripts': [
153 'certbot = certbot.main:main',
154 ],
155 'certbot.plugins': [
156 'manual = certbot._internal.plugins.manual:Authenticator',
157 'null = certbot._internal.plugins.null:Installer',
158 'standalone = certbot._internal.plugins.standalone:Authenticator',
159 'webroot = certbot._internal.plugins.webroot:Authenticator',
160 ],
161 },
162 )
163
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/certbot/setup.py b/certbot/setup.py
--- a/certbot/setup.py
+++ b/certbot/setup.py
@@ -90,7 +90,6 @@
'pytest-xdist',
'setuptools',
'tox',
- 'types-mock',
'types-pyOpenSSL',
'types-pyRFC3339',
'types-pytz',
|
{"golden_diff": "diff --git a/certbot/setup.py b/certbot/setup.py\n--- a/certbot/setup.py\n+++ b/certbot/setup.py\n@@ -90,7 +90,6 @@\n 'pytest-xdist',\n 'setuptools',\n 'tox',\n- 'types-mock',\n 'types-pyOpenSSL',\n 'types-pyRFC3339',\n 'types-pytz',\n", "issue": "Remove the third-party `mock` dependency\n>Functions in certbot.tests.util were previously using the third party mock module if it was available for backwards compatibiliy. This behavior will be removed and unittest.mock from the standard library will always be used instead.\r\n\r\ne.g. https://github.com/certbot/certbot/blob/5c111d0bd1206d864d7cb93754e101f6073bc669/certbot/certbot/tests/util.py#L38-L50\r\n\r\n\r\n\r\n\n", "before_files": [{"content": "import codecs\nimport os\nimport re\nimport sys\n\nfrom pkg_resources import parse_version\nfrom setuptools import __version__ as setuptools_version\nfrom setuptools import find_packages\nfrom setuptools import setup\n\nmin_setuptools_version='41.6.0'\n# This conditional isn't necessary, but it provides better error messages to\n# people who try to install this package with older versions of setuptools.\nif parse_version(setuptools_version) < parse_version(min_setuptools_version):\n raise RuntimeError(f'setuptools {min_setuptools_version}+ is required')\n\n# Workaround for https://bugs.python.org/issue8876, see\n# https://bugs.python.org/issue8876#msg208792\n# This can be removed when using Python 2.7.9 or later:\n# https://hg.python.org/cpython/raw-file/v2.7.9/Misc/NEWS\nif os.path.abspath(__file__).split(os.path.sep)[1] == 'vagrant':\n del os.link\n\n\ndef read_file(filename, encoding='utf8'):\n \"\"\"Read unicode from given file.\"\"\"\n with codecs.open(filename, encoding=encoding) as fd:\n return fd.read()\n\n\nhere = os.path.abspath(os.path.dirname(__file__))\n\n# read version number (and other metadata) from package init\ninit_fn = os.path.join(here, 'certbot', '__init__.py')\nmeta = dict(re.findall(r\"\"\"__([a-z]+)__ = '([^']+)\"\"\", read_file(init_fn)))\n\nreadme = read_file(os.path.join(here, 'README.rst'))\nversion = meta['version']\n\n# This package relies on PyOpenSSL and requests, however, it isn't specified\n# here to avoid masking the more specific request requirements in acme. See\n# https://github.com/pypa/pip/issues/988 for more info.\ninstall_requires = [\n # We specify the minimum acme version as the current Certbot version for\n # simplicity. See https://github.com/certbot/certbot/issues/8761 for more\n # info.\n f'acme>={version}',\n # We technically need ConfigArgParse 0.10.0 for Python 2.6 support, but\n # saying so here causes a runtime error against our temporary fork of 0.9.3\n # in which we added 2.6 support (see #2243), so we relax the requirement.\n 'ConfigArgParse>=0.9.3',\n 'configobj>=5.0.6',\n 'cryptography>=2.5.0',\n 'distro>=1.0.1',\n 'josepy>=1.13.0',\n 'parsedatetime>=2.4',\n 'pyrfc3339',\n 'pytz>=2019.3',\n # This dependency needs to be added using environment markers to avoid its\n # installation on Linux.\n 'pywin32>=300 ; sys_platform == \"win32\"',\n f'setuptools>={min_setuptools_version}',\n]\n\ndev_extras = [\n 'azure-devops',\n 'ipdb',\n # poetry 1.2.0+ is required for it to pin pip, setuptools, and wheel. See\n # https://github.com/python-poetry/poetry/issues/1584.\n 'poetry>=1.2.0a1',\n 'twine',\n]\n\ndocs_extras = [\n # If you have Sphinx<1.5.1, you need docutils<0.13.1\n # https://github.com/sphinx-doc/sphinx/issues/3212\n 'Sphinx>=1.2', # Annotation support\n 'sphinx_rtd_theme',\n]\n\n# Tools like pip, wheel, and tox are listed here to ensure they are properly\n# pinned and installed during automated testing.\ntest_extras = [\n 'coverage',\n 'mypy',\n 'pip',\n 'pylint',\n 'pytest',\n 'pytest-cov',\n 'pytest-xdist',\n 'setuptools',\n 'tox',\n 'types-mock',\n 'types-pyOpenSSL',\n 'types-pyRFC3339',\n 'types-pytz',\n 'types-requests',\n 'types-setuptools',\n 'types-six',\n # typing-extensions is required to import typing.Protocol and make the mypy checks\n # pass (along with pylint about non-existent objects) on Python 3.7\n 'typing-extensions',\n 'wheel',\n]\n\n\nall_extras = dev_extras + docs_extras + test_extras\n\nsetup(\n name='certbot',\n version=version,\n description=\"ACME client\",\n long_description=readme,\n url='https://github.com/letsencrypt/letsencrypt',\n author=\"Certbot Project\",\n author_email='certbot-dev@eff.org',\n license='Apache License 2.0',\n python_requires='>=3.7',\n classifiers=[\n 'Development Status :: 5 - Production/Stable',\n 'Environment :: Console',\n 'Environment :: Console :: Curses',\n 'Intended Audience :: System Administrators',\n 'License :: OSI Approved :: Apache Software License',\n 'Operating System :: POSIX :: Linux',\n 'Programming Language :: Python',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.7',\n 'Programming Language :: Python :: 3.8',\n 'Programming Language :: Python :: 3.9',\n 'Programming Language :: Python :: 3.10',\n 'Topic :: Internet :: WWW/HTTP',\n 'Topic :: Security',\n 'Topic :: System :: Installation/Setup',\n 'Topic :: System :: Networking',\n 'Topic :: System :: Systems Administration',\n 'Topic :: Utilities',\n ],\n\n packages=find_packages(exclude=['docs', 'examples', 'tests', 'venv']),\n include_package_data=True,\n\n install_requires=install_requires,\n extras_require={\n 'all': all_extras,\n 'dev': dev_extras,\n 'docs': docs_extras,\n 'test': test_extras,\n },\n\n entry_points={\n 'console_scripts': [\n 'certbot = certbot.main:main',\n ],\n 'certbot.plugins': [\n 'manual = certbot._internal.plugins.manual:Authenticator',\n 'null = certbot._internal.plugins.null:Installer',\n 'standalone = certbot._internal.plugins.standalone:Authenticator',\n 'webroot = certbot._internal.plugins.webroot:Authenticator',\n ],\n },\n)\n", "path": "certbot/setup.py"}], "after_files": [{"content": "import codecs\nimport os\nimport re\nimport sys\n\nfrom pkg_resources import parse_version\nfrom setuptools import __version__ as setuptools_version\nfrom setuptools import find_packages\nfrom setuptools import setup\n\nmin_setuptools_version='41.6.0'\n# This conditional isn't necessary, but it provides better error messages to\n# people who try to install this package with older versions of setuptools.\nif parse_version(setuptools_version) < parse_version(min_setuptools_version):\n raise RuntimeError(f'setuptools {min_setuptools_version}+ is required')\n\n# Workaround for https://bugs.python.org/issue8876, see\n# https://bugs.python.org/issue8876#msg208792\n# This can be removed when using Python 2.7.9 or later:\n# https://hg.python.org/cpython/raw-file/v2.7.9/Misc/NEWS\nif os.path.abspath(__file__).split(os.path.sep)[1] == 'vagrant':\n del os.link\n\n\ndef read_file(filename, encoding='utf8'):\n \"\"\"Read unicode from given file.\"\"\"\n with codecs.open(filename, encoding=encoding) as fd:\n return fd.read()\n\n\nhere = os.path.abspath(os.path.dirname(__file__))\n\n# read version number (and other metadata) from package init\ninit_fn = os.path.join(here, 'certbot', '__init__.py')\nmeta = dict(re.findall(r\"\"\"__([a-z]+)__ = '([^']+)\"\"\", read_file(init_fn)))\n\nreadme = read_file(os.path.join(here, 'README.rst'))\nversion = meta['version']\n\n# This package relies on PyOpenSSL and requests, however, it isn't specified\n# here to avoid masking the more specific request requirements in acme. See\n# https://github.com/pypa/pip/issues/988 for more info.\ninstall_requires = [\n # We specify the minimum acme version as the current Certbot version for\n # simplicity. See https://github.com/certbot/certbot/issues/8761 for more\n # info.\n f'acme>={version}',\n # We technically need ConfigArgParse 0.10.0 for Python 2.6 support, but\n # saying so here causes a runtime error against our temporary fork of 0.9.3\n # in which we added 2.6 support (see #2243), so we relax the requirement.\n 'ConfigArgParse>=0.9.3',\n 'configobj>=5.0.6',\n 'cryptography>=2.5.0',\n 'distro>=1.0.1',\n 'josepy>=1.13.0',\n 'parsedatetime>=2.4',\n 'pyrfc3339',\n 'pytz>=2019.3',\n # This dependency needs to be added using environment markers to avoid its\n # installation on Linux.\n 'pywin32>=300 ; sys_platform == \"win32\"',\n f'setuptools>={min_setuptools_version}',\n]\n\ndev_extras = [\n 'azure-devops',\n 'ipdb',\n # poetry 1.2.0+ is required for it to pin pip, setuptools, and wheel. See\n # https://github.com/python-poetry/poetry/issues/1584.\n 'poetry>=1.2.0a1',\n 'twine',\n]\n\ndocs_extras = [\n # If you have Sphinx<1.5.1, you need docutils<0.13.1\n # https://github.com/sphinx-doc/sphinx/issues/3212\n 'Sphinx>=1.2', # Annotation support\n 'sphinx_rtd_theme',\n]\n\n# Tools like pip, wheel, and tox are listed here to ensure they are properly\n# pinned and installed during automated testing.\ntest_extras = [\n 'coverage',\n 'mypy',\n 'pip',\n 'pylint',\n 'pytest',\n 'pytest-cov',\n 'pytest-xdist',\n 'setuptools',\n 'tox',\n 'types-pyOpenSSL',\n 'types-pyRFC3339',\n 'types-pytz',\n 'types-requests',\n 'types-setuptools',\n 'types-six',\n # typing-extensions is required to import typing.Protocol and make the mypy checks\n # pass (along with pylint about non-existent objects) on Python 3.7\n 'typing-extensions',\n 'wheel',\n]\n\n\nall_extras = dev_extras + docs_extras + test_extras\n\nsetup(\n name='certbot',\n version=version,\n description=\"ACME client\",\n long_description=readme,\n url='https://github.com/letsencrypt/letsencrypt',\n author=\"Certbot Project\",\n author_email='certbot-dev@eff.org',\n license='Apache License 2.0',\n python_requires='>=3.7',\n classifiers=[\n 'Development Status :: 5 - Production/Stable',\n 'Environment :: Console',\n 'Environment :: Console :: Curses',\n 'Intended Audience :: System Administrators',\n 'License :: OSI Approved :: Apache Software License',\n 'Operating System :: POSIX :: Linux',\n 'Programming Language :: Python',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.7',\n 'Programming Language :: Python :: 3.8',\n 'Programming Language :: Python :: 3.9',\n 'Programming Language :: Python :: 3.10',\n 'Topic :: Internet :: WWW/HTTP',\n 'Topic :: Security',\n 'Topic :: System :: Installation/Setup',\n 'Topic :: System :: Networking',\n 'Topic :: System :: Systems Administration',\n 'Topic :: Utilities',\n ],\n\n packages=find_packages(exclude=['docs', 'examples', 'tests', 'venv']),\n include_package_data=True,\n\n install_requires=install_requires,\n extras_require={\n 'all': all_extras,\n 'dev': dev_extras,\n 'docs': docs_extras,\n 'test': test_extras,\n },\n\n entry_points={\n 'console_scripts': [\n 'certbot = certbot.main:main',\n ],\n 'certbot.plugins': [\n 'manual = certbot._internal.plugins.manual:Authenticator',\n 'null = certbot._internal.plugins.null:Installer',\n 'standalone = certbot._internal.plugins.standalone:Authenticator',\n 'webroot = certbot._internal.plugins.webroot:Authenticator',\n ],\n },\n)\n", "path": "certbot/setup.py"}]}
| 2,199 | 93 |
gh_patches_debug_8686
|
rasdani/github-patches
|
git_diff
|
Lightning-Universe__lightning-flash-1144
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Cannot run on iris dataset
## 🐛 Bug
Cannot run on iris dataset
### To Reproduce
Run the code sample. CSV file is attached
[iris.csv](https://github.com/PyTorchLightning/lightning-flash/files/7967078/iris.csv)
Error message:
```
Num classes: 3
GPU available: False, used: False
TPU available: False, using: 0 TPU cores
IPU available: False, using: 0 IPUs
/Users/minh.chau/miniconda3/envs/chem/lib/python3.7/site-packages/pytorch_lightning/trainer/configuration_validator.py:122: UserWarning: You defined a `validation_step` but have no `val_dataloader`. Skipping val loop.
rank_zero_warn("You defined a `validation_step` but have no `val_dataloader`. Skipping val loop.")
| Name | Type | Params
---------------------------------------------
0 | model | TabNet | 5.9 K
1 | train_metrics | ModuleDict | 0
2 | val_metrics | ModuleDict | 0
3 | test_metrics | ModuleDict | 0
---------------------------------------------
5.9 K Trainable params
0 Non-trainable params
5.9 K Total params
0.024 Total estimated model params size (MB)
......many lines later.............
return torch._C._nn.cross_entropy_loss(input, target, weight, _Reduction.get_enum(reduction), ignore_index)
IndexError: Target 3 is out of bounds.
```
#### Code sample
```python
import pandas as pd
t=pd.read_csv("iris.csv")
datamodule = TabularClassificationData.from_csv(
numerical_fields=["sepal.length","sepal.width","petal.length","petal.width"],
target_fields="variety",
train_file="iris.csv",
val_split=0.3,
batch_size=4
)
print(f"Num classes: {datamodule.num_classes}")
model = TabularClassifier.from_data(datamodule)
```
### Expected behavior
A model is created as expected
### Environment
- PyTorch Version (e.g., 1.0): 1.9.0
- OS (e.g., Linux): MacOS
- How you installed PyTorch (`conda`, `pip`, source): conda
- Python version: 3.7
- CUDA/cuDNN version: No
- GPU models and configuration: No
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `flash/tabular/input.py`
Content:
```
1 # Copyright The PyTorch Lightning team.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14 from dataclasses import dataclass
15 from io import StringIO
16 from typing import Any, Dict, List, Optional, Union
17
18 import numpy as np
19 from pytorch_lightning.utilities.exceptions import MisconfigurationException
20
21 from flash.core.data.io.input import DataKeys, Input
22 from flash.core.data.process import Deserializer
23 from flash.core.data.properties import ProcessState
24 from flash.core.data.utilities.data_frame import read_csv
25 from flash.core.utilities.imports import _PANDAS_AVAILABLE
26 from flash.tabular.classification.utils import (
27 _compute_normalization,
28 _generate_codes,
29 _pre_transform,
30 _to_cat_vars_numpy,
31 _to_num_vars_numpy,
32 )
33
34 if _PANDAS_AVAILABLE:
35 from pandas.core.frame import DataFrame
36 else:
37 DataFrame = object
38
39
40 @dataclass(unsafe_hash=True, frozen=True)
41 class TabularParametersState(ProcessState):
42 """A :class:`~flash.core.data.properties.ProcessState` containing tabular data ``parameters``."""
43
44 parameters: Optional[Dict[str, Any]]
45
46
47 class TabularDataFrameInput(Input):
48 @staticmethod
49 def _sanetize_fields(
50 categorical_fields: Optional[Union[str, List[str]]], numerical_fields: Optional[Union[str, List[str]]]
51 ):
52 if categorical_fields is None and numerical_fields is None:
53 raise RuntimeError("Both `categorical_fields` and `numerical_fields` are None!")
54
55 categorical_fields = categorical_fields or []
56 numerical_fields = numerical_fields or []
57
58 if not isinstance(categorical_fields, list):
59 categorical_fields = [categorical_fields]
60
61 if not isinstance(numerical_fields, list):
62 numerical_fields = [numerical_fields]
63
64 return categorical_fields, numerical_fields
65
66 @staticmethod
67 def compute_parameters(
68 train_data_frame: DataFrame,
69 numerical_fields: List[str],
70 categorical_fields: List[str],
71 ) -> Dict[str, Any]:
72
73 mean, std = _compute_normalization(train_data_frame, numerical_fields)
74
75 codes = _generate_codes(train_data_frame, categorical_fields)
76
77 return dict(
78 mean=mean,
79 std=std,
80 codes=codes,
81 numerical_fields=numerical_fields,
82 categorical_fields=categorical_fields,
83 )
84
85 def preprocess(
86 self,
87 df: DataFrame,
88 categorical_fields: Optional[List[str]] = None,
89 numerical_fields: Optional[List[str]] = None,
90 parameters: Dict[str, Any] = None,
91 ):
92 if self.training:
93 categorical_fields, numerical_fields = self._sanetize_fields(categorical_fields, numerical_fields)
94 parameters = self.compute_parameters(df, numerical_fields, categorical_fields)
95
96 self.set_state(TabularParametersState(parameters))
97 else:
98 parameters_state = self.get_state(TabularParametersState)
99 parameters = parameters or (parameters_state.parameters if parameters_state is not None else None)
100 if parameters is None:
101 raise MisconfigurationException(
102 "Loading tabular data for evaluation or inference requires parameters from the train data. Either "
103 "construct the train data at the same time as evaluation and inference or provide the train "
104 "`datamodule.parameters` in the `parameters` argument."
105 )
106
107 self.parameters = parameters
108
109 # impute and normalize data
110 df = _pre_transform(
111 df,
112 parameters["numerical_fields"],
113 parameters["categorical_fields"],
114 parameters["codes"],
115 parameters["mean"],
116 parameters["std"],
117 )
118
119 cat_vars = _to_cat_vars_numpy(df, parameters["categorical_fields"])
120 num_vars = _to_num_vars_numpy(df, parameters["numerical_fields"])
121
122 num_samples = len(df)
123 cat_vars = np.stack(cat_vars, 1) if len(cat_vars) else np.zeros((num_samples, 0))
124 num_vars = np.stack(num_vars, 1) if len(num_vars) else np.zeros((num_samples, 0))
125
126 return cat_vars, num_vars
127
128
129 class TabularDeserializer(Deserializer):
130 def __init__(self, *args, parameters: Optional[Dict[str, Any]] = None, **kwargs):
131 self._parameters = parameters
132 super().__init__(*args, **kwargs)
133
134 @property
135 def parameters(self) -> Dict[str, Any]:
136 if self._parameters is not None:
137 return self._parameters
138 parameters_state = self.get_state(TabularParametersState)
139 if parameters_state is not None and parameters_state.parameters is not None:
140 return parameters_state.parameters
141 raise MisconfigurationException(
142 "Tabular tasks must previously have been trained in order to support serving or the `parameters` argument "
143 "must be provided to the `serve` method."
144 )
145
146 def serve_load_sample(self, data: str) -> Any:
147 parameters = self.parameters
148
149 df = read_csv(StringIO(data))
150 df = _pre_transform(
151 df,
152 parameters["numerical_fields"],
153 parameters["categorical_fields"],
154 parameters["codes"],
155 parameters["mean"],
156 parameters["std"],
157 )
158
159 cat_vars = _to_cat_vars_numpy(df, parameters["categorical_fields"])
160 num_vars = _to_num_vars_numpy(df, parameters["numerical_fields"])
161
162 cat_vars = np.stack(cat_vars, 1)
163 num_vars = np.stack(num_vars, 1)
164
165 return [{DataKeys.INPUT: [c, n]} for c, n in zip(cat_vars, num_vars)]
166
167 @property
168 def example_input(self) -> str:
169 parameters = self.parameters
170
171 row = {}
172 for cat_col in parameters["categorical_fields"]:
173 row[cat_col] = ["test"]
174 for num_col in parameters["numerical_fields"]:
175 row[num_col] = [0]
176 return str(DataFrame.from_dict(row).to_csv())
177
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/flash/tabular/input.py b/flash/tabular/input.py
--- a/flash/tabular/input.py
+++ b/flash/tabular/input.py
@@ -120,8 +120,8 @@
num_vars = _to_num_vars_numpy(df, parameters["numerical_fields"])
num_samples = len(df)
- cat_vars = np.stack(cat_vars, 1) if len(cat_vars) else np.zeros((num_samples, 0))
- num_vars = np.stack(num_vars, 1) if len(num_vars) else np.zeros((num_samples, 0))
+ cat_vars = np.stack(cat_vars, 1) if len(cat_vars) else np.zeros((num_samples, 0), dtype=np.int64)
+ num_vars = np.stack(num_vars, 1) if len(num_vars) else np.zeros((num_samples, 0), dtype=np.float32)
return cat_vars, num_vars
|
{"golden_diff": "diff --git a/flash/tabular/input.py b/flash/tabular/input.py\n--- a/flash/tabular/input.py\n+++ b/flash/tabular/input.py\n@@ -120,8 +120,8 @@\n num_vars = _to_num_vars_numpy(df, parameters[\"numerical_fields\"])\n \n num_samples = len(df)\n- cat_vars = np.stack(cat_vars, 1) if len(cat_vars) else np.zeros((num_samples, 0))\n- num_vars = np.stack(num_vars, 1) if len(num_vars) else np.zeros((num_samples, 0))\n+ cat_vars = np.stack(cat_vars, 1) if len(cat_vars) else np.zeros((num_samples, 0), dtype=np.int64)\n+ num_vars = np.stack(num_vars, 1) if len(num_vars) else np.zeros((num_samples, 0), dtype=np.float32)\n \n return cat_vars, num_vars\n", "issue": "Cannot run on iris dataset\n## \ud83d\udc1b Bug\r\n\r\nCannot run on iris dataset\r\n\r\n### To Reproduce\r\n\r\nRun the code sample. CSV file is attached\r\n[iris.csv](https://github.com/PyTorchLightning/lightning-flash/files/7967078/iris.csv)\r\n\r\nError message:\r\n```\r\nNum classes: 3\r\nGPU available: False, used: False\r\nTPU available: False, using: 0 TPU cores\r\nIPU available: False, using: 0 IPUs\r\n/Users/minh.chau/miniconda3/envs/chem/lib/python3.7/site-packages/pytorch_lightning/trainer/configuration_validator.py:122: UserWarning: You defined a `validation_step` but have no `val_dataloader`. Skipping val loop.\r\n rank_zero_warn(\"You defined a `validation_step` but have no `val_dataloader`. Skipping val loop.\")\r\n\r\n | Name | Type | Params\r\n---------------------------------------------\r\n0 | model | TabNet | 5.9 K \r\n1 | train_metrics | ModuleDict | 0 \r\n2 | val_metrics | ModuleDict | 0 \r\n3 | test_metrics | ModuleDict | 0 \r\n---------------------------------------------\r\n5.9 K Trainable params\r\n0 Non-trainable params\r\n5.9 K Total params\r\n0.024 Total estimated model params size (MB)\r\n......many lines later.............\r\n return torch._C._nn.cross_entropy_loss(input, target, weight, _Reduction.get_enum(reduction), ignore_index)\r\nIndexError: Target 3 is out of bounds.\r\n```\r\n\r\n#### Code sample\r\n```python\r\nimport pandas as pd\r\nt=pd.read_csv(\"iris.csv\")\r\n\r\ndatamodule = TabularClassificationData.from_csv(\r\n numerical_fields=[\"sepal.length\",\"sepal.width\",\"petal.length\",\"petal.width\"],\r\n target_fields=\"variety\",\r\n train_file=\"iris.csv\",\r\n val_split=0.3,\r\n batch_size=4\r\n)\r\nprint(f\"Num classes: {datamodule.num_classes}\")\r\nmodel = TabularClassifier.from_data(datamodule)\r\n```\r\n\r\n### Expected behavior\r\n\r\nA model is created as expected\r\n\r\n### Environment\r\n\r\n - PyTorch Version (e.g., 1.0): 1.9.0\r\n - OS (e.g., Linux): MacOS\r\n - How you installed PyTorch (`conda`, `pip`, source): conda\r\n - Python version: 3.7\r\n - CUDA/cuDNN version: No\r\n - GPU models and configuration: No\n", "before_files": [{"content": "# Copyright The PyTorch Lightning team.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom dataclasses import dataclass\nfrom io import StringIO\nfrom typing import Any, Dict, List, Optional, Union\n\nimport numpy as np\nfrom pytorch_lightning.utilities.exceptions import MisconfigurationException\n\nfrom flash.core.data.io.input import DataKeys, Input\nfrom flash.core.data.process import Deserializer\nfrom flash.core.data.properties import ProcessState\nfrom flash.core.data.utilities.data_frame import read_csv\nfrom flash.core.utilities.imports import _PANDAS_AVAILABLE\nfrom flash.tabular.classification.utils import (\n _compute_normalization,\n _generate_codes,\n _pre_transform,\n _to_cat_vars_numpy,\n _to_num_vars_numpy,\n)\n\nif _PANDAS_AVAILABLE:\n from pandas.core.frame import DataFrame\nelse:\n DataFrame = object\n\n\n@dataclass(unsafe_hash=True, frozen=True)\nclass TabularParametersState(ProcessState):\n \"\"\"A :class:`~flash.core.data.properties.ProcessState` containing tabular data ``parameters``.\"\"\"\n\n parameters: Optional[Dict[str, Any]]\n\n\nclass TabularDataFrameInput(Input):\n @staticmethod\n def _sanetize_fields(\n categorical_fields: Optional[Union[str, List[str]]], numerical_fields: Optional[Union[str, List[str]]]\n ):\n if categorical_fields is None and numerical_fields is None:\n raise RuntimeError(\"Both `categorical_fields` and `numerical_fields` are None!\")\n\n categorical_fields = categorical_fields or []\n numerical_fields = numerical_fields or []\n\n if not isinstance(categorical_fields, list):\n categorical_fields = [categorical_fields]\n\n if not isinstance(numerical_fields, list):\n numerical_fields = [numerical_fields]\n\n return categorical_fields, numerical_fields\n\n @staticmethod\n def compute_parameters(\n train_data_frame: DataFrame,\n numerical_fields: List[str],\n categorical_fields: List[str],\n ) -> Dict[str, Any]:\n\n mean, std = _compute_normalization(train_data_frame, numerical_fields)\n\n codes = _generate_codes(train_data_frame, categorical_fields)\n\n return dict(\n mean=mean,\n std=std,\n codes=codes,\n numerical_fields=numerical_fields,\n categorical_fields=categorical_fields,\n )\n\n def preprocess(\n self,\n df: DataFrame,\n categorical_fields: Optional[List[str]] = None,\n numerical_fields: Optional[List[str]] = None,\n parameters: Dict[str, Any] = None,\n ):\n if self.training:\n categorical_fields, numerical_fields = self._sanetize_fields(categorical_fields, numerical_fields)\n parameters = self.compute_parameters(df, numerical_fields, categorical_fields)\n\n self.set_state(TabularParametersState(parameters))\n else:\n parameters_state = self.get_state(TabularParametersState)\n parameters = parameters or (parameters_state.parameters if parameters_state is not None else None)\n if parameters is None:\n raise MisconfigurationException(\n \"Loading tabular data for evaluation or inference requires parameters from the train data. Either \"\n \"construct the train data at the same time as evaluation and inference or provide the train \"\n \"`datamodule.parameters` in the `parameters` argument.\"\n )\n\n self.parameters = parameters\n\n # impute and normalize data\n df = _pre_transform(\n df,\n parameters[\"numerical_fields\"],\n parameters[\"categorical_fields\"],\n parameters[\"codes\"],\n parameters[\"mean\"],\n parameters[\"std\"],\n )\n\n cat_vars = _to_cat_vars_numpy(df, parameters[\"categorical_fields\"])\n num_vars = _to_num_vars_numpy(df, parameters[\"numerical_fields\"])\n\n num_samples = len(df)\n cat_vars = np.stack(cat_vars, 1) if len(cat_vars) else np.zeros((num_samples, 0))\n num_vars = np.stack(num_vars, 1) if len(num_vars) else np.zeros((num_samples, 0))\n\n return cat_vars, num_vars\n\n\nclass TabularDeserializer(Deserializer):\n def __init__(self, *args, parameters: Optional[Dict[str, Any]] = None, **kwargs):\n self._parameters = parameters\n super().__init__(*args, **kwargs)\n\n @property\n def parameters(self) -> Dict[str, Any]:\n if self._parameters is not None:\n return self._parameters\n parameters_state = self.get_state(TabularParametersState)\n if parameters_state is not None and parameters_state.parameters is not None:\n return parameters_state.parameters\n raise MisconfigurationException(\n \"Tabular tasks must previously have been trained in order to support serving or the `parameters` argument \"\n \"must be provided to the `serve` method.\"\n )\n\n def serve_load_sample(self, data: str) -> Any:\n parameters = self.parameters\n\n df = read_csv(StringIO(data))\n df = _pre_transform(\n df,\n parameters[\"numerical_fields\"],\n parameters[\"categorical_fields\"],\n parameters[\"codes\"],\n parameters[\"mean\"],\n parameters[\"std\"],\n )\n\n cat_vars = _to_cat_vars_numpy(df, parameters[\"categorical_fields\"])\n num_vars = _to_num_vars_numpy(df, parameters[\"numerical_fields\"])\n\n cat_vars = np.stack(cat_vars, 1)\n num_vars = np.stack(num_vars, 1)\n\n return [{DataKeys.INPUT: [c, n]} for c, n in zip(cat_vars, num_vars)]\n\n @property\n def example_input(self) -> str:\n parameters = self.parameters\n\n row = {}\n for cat_col in parameters[\"categorical_fields\"]:\n row[cat_col] = [\"test\"]\n for num_col in parameters[\"numerical_fields\"]:\n row[num_col] = [0]\n return str(DataFrame.from_dict(row).to_csv())\n", "path": "flash/tabular/input.py"}], "after_files": [{"content": "# Copyright The PyTorch Lightning team.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom dataclasses import dataclass\nfrom io import StringIO\nfrom typing import Any, Dict, List, Optional, Union\n\nimport numpy as np\nfrom pytorch_lightning.utilities.exceptions import MisconfigurationException\n\nfrom flash.core.data.io.input import DataKeys, Input\nfrom flash.core.data.process import Deserializer\nfrom flash.core.data.properties import ProcessState\nfrom flash.core.data.utilities.data_frame import read_csv\nfrom flash.core.utilities.imports import _PANDAS_AVAILABLE\nfrom flash.tabular.classification.utils import (\n _compute_normalization,\n _generate_codes,\n _pre_transform,\n _to_cat_vars_numpy,\n _to_num_vars_numpy,\n)\n\nif _PANDAS_AVAILABLE:\n from pandas.core.frame import DataFrame\nelse:\n DataFrame = object\n\n\n@dataclass(unsafe_hash=True, frozen=True)\nclass TabularParametersState(ProcessState):\n \"\"\"A :class:`~flash.core.data.properties.ProcessState` containing tabular data ``parameters``.\"\"\"\n\n parameters: Optional[Dict[str, Any]]\n\n\nclass TabularDataFrameInput(Input):\n @staticmethod\n def _sanetize_fields(\n categorical_fields: Optional[Union[str, List[str]]], numerical_fields: Optional[Union[str, List[str]]]\n ):\n if categorical_fields is None and numerical_fields is None:\n raise RuntimeError(\"Both `categorical_fields` and `numerical_fields` are None!\")\n\n categorical_fields = categorical_fields or []\n numerical_fields = numerical_fields or []\n\n if not isinstance(categorical_fields, list):\n categorical_fields = [categorical_fields]\n\n if not isinstance(numerical_fields, list):\n numerical_fields = [numerical_fields]\n\n return categorical_fields, numerical_fields\n\n @staticmethod\n def compute_parameters(\n train_data_frame: DataFrame,\n numerical_fields: List[str],\n categorical_fields: List[str],\n ) -> Dict[str, Any]:\n\n mean, std = _compute_normalization(train_data_frame, numerical_fields)\n\n codes = _generate_codes(train_data_frame, categorical_fields)\n\n return dict(\n mean=mean,\n std=std,\n codes=codes,\n numerical_fields=numerical_fields,\n categorical_fields=categorical_fields,\n )\n\n def preprocess(\n self,\n df: DataFrame,\n categorical_fields: Optional[List[str]] = None,\n numerical_fields: Optional[List[str]] = None,\n parameters: Dict[str, Any] = None,\n ):\n if self.training:\n categorical_fields, numerical_fields = self._sanetize_fields(categorical_fields, numerical_fields)\n parameters = self.compute_parameters(df, numerical_fields, categorical_fields)\n\n self.set_state(TabularParametersState(parameters))\n else:\n parameters_state = self.get_state(TabularParametersState)\n parameters = parameters or (parameters_state.parameters if parameters_state is not None else None)\n if parameters is None:\n raise MisconfigurationException(\n \"Loading tabular data for evaluation or inference requires parameters from the train data. Either \"\n \"construct the train data at the same time as evaluation and inference or provide the train \"\n \"`datamodule.parameters` in the `parameters` argument.\"\n )\n\n self.parameters = parameters\n\n # impute and normalize data\n df = _pre_transform(\n df,\n parameters[\"numerical_fields\"],\n parameters[\"categorical_fields\"],\n parameters[\"codes\"],\n parameters[\"mean\"],\n parameters[\"std\"],\n )\n\n cat_vars = _to_cat_vars_numpy(df, parameters[\"categorical_fields\"])\n num_vars = _to_num_vars_numpy(df, parameters[\"numerical_fields\"])\n\n num_samples = len(df)\n cat_vars = np.stack(cat_vars, 1) if len(cat_vars) else np.zeros((num_samples, 0), dtype=np.int64)\n num_vars = np.stack(num_vars, 1) if len(num_vars) else np.zeros((num_samples, 0), dtype=np.float32)\n\n return cat_vars, num_vars\n\n\nclass TabularDeserializer(Deserializer):\n def __init__(self, *args, parameters: Optional[Dict[str, Any]] = None, **kwargs):\n self._parameters = parameters\n super().__init__(*args, **kwargs)\n\n @property\n def parameters(self) -> Dict[str, Any]:\n if self._parameters is not None:\n return self._parameters\n parameters_state = self.get_state(TabularParametersState)\n if parameters_state is not None and parameters_state.parameters is not None:\n return parameters_state.parameters\n raise MisconfigurationException(\n \"Tabular tasks must previously have been trained in order to support serving or the `parameters` argument \"\n \"must be provided to the `serve` method.\"\n )\n\n def serve_load_sample(self, data: str) -> Any:\n parameters = self.parameters\n\n df = read_csv(StringIO(data))\n df = _pre_transform(\n df,\n parameters[\"numerical_fields\"],\n parameters[\"categorical_fields\"],\n parameters[\"codes\"],\n parameters[\"mean\"],\n parameters[\"std\"],\n )\n\n cat_vars = _to_cat_vars_numpy(df, parameters[\"categorical_fields\"])\n num_vars = _to_num_vars_numpy(df, parameters[\"numerical_fields\"])\n\n cat_vars = np.stack(cat_vars, 1)\n num_vars = np.stack(num_vars, 1)\n\n return [{DataKeys.INPUT: [c, n]} for c, n in zip(cat_vars, num_vars)]\n\n @property\n def example_input(self) -> str:\n parameters = self.parameters\n\n row = {}\n for cat_col in parameters[\"categorical_fields\"]:\n row[cat_col] = [\"test\"]\n for num_col in parameters[\"numerical_fields\"]:\n row[num_col] = [0]\n return str(DataFrame.from_dict(row).to_csv())\n", "path": "flash/tabular/input.py"}]}
| 2,571 | 208 |
gh_patches_debug_35032
|
rasdani/github-patches
|
git_diff
|
alltheplaces__alltheplaces-1880
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Spider quiznos is broken
During the global build at 2021-05-26-14-42-23, spider **quiznos** failed with **129 features** and **1 errors**.
Here's [the log](https://data.alltheplaces.xyz/runs/2021-05-26-14-42-23/logs/quiznos.log) and [the output](https://data.alltheplaces.xyz/runs/2021-05-26-14-42-23/output/quiznos.geojson) ([on a map](https://data.alltheplaces.xyz/map.html?show=https://data.alltheplaces.xyz/runs/2021-05-26-14-42-23/output/quiznos.geojson))
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `locations/spiders/quiznos.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 import scrapy
3 import json
4 import re
5 import traceback
6
7 from locations.items import GeojsonPointItem
8
9 URL = "http://restaurants.quiznos.com"
10
11 class QuiznosSpider(scrapy.Spider):
12 name = "quiznos"
13 item_attributes = { 'brand': "Quizno's", 'brand_wikidata': "Q1936229" }
14 allowed_domains = [URL]
15 start_urls = (
16 'http://restaurants.quiznos.com/data/stores.json?callback=storeList',
17 )
18
19 def store_hours(self, store_hours):
20 if store_hours == '' or store_hours.lower().find('close') > -1: return ''
21
22 day_groups = []
23 this_day_group = None
24 hour_intervals = []
25
26 interval = store_hours.split(' - ')
27 start_time = interval[0].split(' ')
28 end_time = interval[1].split(' ')
29 start_hour = start_time[0].split(':')
30 end_hour = end_time[0].split(':')
31
32 hour_intervals.append('{}:{}-{}:{}'.format(
33 start_hour[0],
34 start_hour[1],
35 int(end_hour[0]) + 12 if end_time[1] == 'PM' else end_hour[0],
36 end_hour[1],
37 ))
38
39 hours = ','.join(hour_intervals)
40
41 if not this_day_group:
42 this_day_group = {
43 'from_day': 'Su',
44 'to_day': 'Sa',
45 'hours': hours
46 }
47
48 day_groups.append(this_day_group)
49
50 opening_hours = ""
51 if len(day_groups) == 1 and day_groups[0]['hours'] in ('00:00-23:59', '00:00-00:00'):
52 opening_hours = '24/7'
53 else:
54 for day_group in day_groups:
55 if day_group['from_day'] == day_group['to_day']:
56 opening_hours += '{from_day} {hours}; '.format(**day_group)
57 elif day_group['from_day'] == 'Su' and day_group['to_day'] == 'Sa':
58 opening_hours += '{hours}; '.format(**day_group)
59 else:
60 opening_hours += '{from_day}-{to_day} {hours}; '.format(**day_group)
61 opening_hours = opening_hours[:-2]
62
63 return opening_hours
64
65 def parse(self, response):
66 data = response.body_as_unicode()
67 stores = json.loads(re.search(r'storeList\((.*)\)', data).group(1))
68
69 for store in stores:
70
71 yield GeojsonPointItem(
72 lat=store.get('latitude'),
73 lon=store.get('longitude'),
74 ref=str(store.get('storeid')),
75 phone=store.get('phone'),
76 name=store.get('restaurantname'),
77 opening_hours=self.store_hours(store.get('businesshours')),
78 addr_full=store.get('address1'),
79 city=store.get('city'),
80 state=store.get('statecode'),
81 postcode=store.get('zipcode'),
82 website=URL + store.get('url'),
83 )
84
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/locations/spiders/quiznos.py b/locations/spiders/quiznos.py
--- a/locations/spiders/quiznos.py
+++ b/locations/spiders/quiznos.py
@@ -2,22 +2,24 @@
import scrapy
import json
import re
-import traceback
from locations.items import GeojsonPointItem
-URL = "http://restaurants.quiznos.com"
class QuiznosSpider(scrapy.Spider):
name = "quiznos"
item_attributes = { 'brand': "Quizno's", 'brand_wikidata': "Q1936229" }
- allowed_domains = [URL]
+ allowed_domains = ["https://restaurants.quiznos.com"]
start_urls = (
- 'http://restaurants.quiznos.com/data/stores.json?callback=storeList',
+ 'https://restaurants.quiznos.com/data/stores.json',
)
def store_hours(self, store_hours):
- if store_hours == '' or store_hours.lower().find('close') > -1: return ''
+ if store_hours == '' or store_hours == ' - ' or store_hours.lower().find('close') > -1:
+ return ''
+
+ if store_hours == 'Open 24 Hours - Open 24 Hours':
+ return '24/7'
day_groups = []
this_day_group = None
@@ -67,17 +69,18 @@
stores = json.loads(re.search(r'storeList\((.*)\)', data).group(1))
for store in stores:
+ properties = {
+ "lat": store.get('latitude'),
+ "lon": store.get('longitude'),
+ "ref": str(store.get('storeid')),
+ "phone": store.get('phone'),
+ "name": store.get('restaurantname'),
+ "opening_hours": self.store_hours(store.get('businesshours')),
+ "addr_full": store.get('address1'),
+ "city": store.get('city'),
+ "state": store.get('statecode'),
+ "postcode": store.get('zipcode'),
+ "website": response.urljoin(store.get('url')),
+ }
- yield GeojsonPointItem(
- lat=store.get('latitude'),
- lon=store.get('longitude'),
- ref=str(store.get('storeid')),
- phone=store.get('phone'),
- name=store.get('restaurantname'),
- opening_hours=self.store_hours(store.get('businesshours')),
- addr_full=store.get('address1'),
- city=store.get('city'),
- state=store.get('statecode'),
- postcode=store.get('zipcode'),
- website=URL + store.get('url'),
- )
+ yield GeojsonPointItem(**properties)
|
{"golden_diff": "diff --git a/locations/spiders/quiznos.py b/locations/spiders/quiznos.py\n--- a/locations/spiders/quiznos.py\n+++ b/locations/spiders/quiznos.py\n@@ -2,22 +2,24 @@\n import scrapy\n import json\n import re\n-import traceback\n \n from locations.items import GeojsonPointItem\n \n-URL = \"http://restaurants.quiznos.com\"\n \n class QuiznosSpider(scrapy.Spider):\n name = \"quiznos\"\n item_attributes = { 'brand': \"Quizno's\", 'brand_wikidata': \"Q1936229\" }\n- allowed_domains = [URL]\n+ allowed_domains = [\"https://restaurants.quiznos.com\"]\n start_urls = (\n- 'http://restaurants.quiznos.com/data/stores.json?callback=storeList',\n+ 'https://restaurants.quiznos.com/data/stores.json',\n )\n \n def store_hours(self, store_hours):\n- if store_hours == '' or store_hours.lower().find('close') > -1: return ''\n+ if store_hours == '' or store_hours == ' - ' or store_hours.lower().find('close') > -1:\n+ return ''\n+\n+ if store_hours == 'Open 24 Hours - Open 24 Hours':\n+ return '24/7'\n \n day_groups = []\n this_day_group = None\n@@ -67,17 +69,18 @@\n stores = json.loads(re.search(r'storeList\\((.*)\\)', data).group(1))\n \n for store in stores:\n+ properties = {\n+ \"lat\": store.get('latitude'),\n+ \"lon\": store.get('longitude'),\n+ \"ref\": str(store.get('storeid')),\n+ \"phone\": store.get('phone'),\n+ \"name\": store.get('restaurantname'),\n+ \"opening_hours\": self.store_hours(store.get('businesshours')),\n+ \"addr_full\": store.get('address1'),\n+ \"city\": store.get('city'),\n+ \"state\": store.get('statecode'),\n+ \"postcode\": store.get('zipcode'),\n+ \"website\": response.urljoin(store.get('url')),\n+ }\n \n- yield GeojsonPointItem(\n- lat=store.get('latitude'),\n- lon=store.get('longitude'),\n- ref=str(store.get('storeid')),\n- phone=store.get('phone'),\n- name=store.get('restaurantname'),\n- opening_hours=self.store_hours(store.get('businesshours')),\n- addr_full=store.get('address1'),\n- city=store.get('city'),\n- state=store.get('statecode'),\n- postcode=store.get('zipcode'),\n- website=URL + store.get('url'),\n- )\n+ yield GeojsonPointItem(**properties)\n", "issue": "Spider quiznos is broken\nDuring the global build at 2021-05-26-14-42-23, spider **quiznos** failed with **129 features** and **1 errors**.\n\nHere's [the log](https://data.alltheplaces.xyz/runs/2021-05-26-14-42-23/logs/quiznos.log) and [the output](https://data.alltheplaces.xyz/runs/2021-05-26-14-42-23/output/quiznos.geojson) ([on a map](https://data.alltheplaces.xyz/map.html?show=https://data.alltheplaces.xyz/runs/2021-05-26-14-42-23/output/quiznos.geojson))\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\nimport scrapy\nimport json\nimport re\nimport traceback\n\nfrom locations.items import GeojsonPointItem\n\nURL = \"http://restaurants.quiznos.com\"\n\nclass QuiznosSpider(scrapy.Spider):\n name = \"quiznos\"\n item_attributes = { 'brand': \"Quizno's\", 'brand_wikidata': \"Q1936229\" }\n allowed_domains = [URL]\n start_urls = (\n 'http://restaurants.quiznos.com/data/stores.json?callback=storeList',\n )\n\n def store_hours(self, store_hours):\n if store_hours == '' or store_hours.lower().find('close') > -1: return ''\n\n day_groups = []\n this_day_group = None\n hour_intervals = []\n\n interval = store_hours.split(' - ')\n start_time = interval[0].split(' ')\n end_time = interval[1].split(' ')\n start_hour = start_time[0].split(':')\n end_hour = end_time[0].split(':')\n\n hour_intervals.append('{}:{}-{}:{}'.format(\n start_hour[0],\n start_hour[1],\n int(end_hour[0]) + 12 if end_time[1] == 'PM' else end_hour[0],\n end_hour[1],\n ))\n\n hours = ','.join(hour_intervals)\n\n if not this_day_group:\n this_day_group = {\n 'from_day': 'Su',\n 'to_day': 'Sa',\n 'hours': hours\n }\n\n day_groups.append(this_day_group)\n\n opening_hours = \"\"\n if len(day_groups) == 1 and day_groups[0]['hours'] in ('00:00-23:59', '00:00-00:00'):\n opening_hours = '24/7'\n else:\n for day_group in day_groups:\n if day_group['from_day'] == day_group['to_day']:\n opening_hours += '{from_day} {hours}; '.format(**day_group)\n elif day_group['from_day'] == 'Su' and day_group['to_day'] == 'Sa':\n opening_hours += '{hours}; '.format(**day_group)\n else:\n opening_hours += '{from_day}-{to_day} {hours}; '.format(**day_group)\n opening_hours = opening_hours[:-2]\n\n return opening_hours\n\n def parse(self, response):\n data = response.body_as_unicode()\n stores = json.loads(re.search(r'storeList\\((.*)\\)', data).group(1))\n\n for store in stores:\n\n yield GeojsonPointItem(\n lat=store.get('latitude'),\n lon=store.get('longitude'),\n ref=str(store.get('storeid')),\n phone=store.get('phone'),\n name=store.get('restaurantname'),\n opening_hours=self.store_hours(store.get('businesshours')),\n addr_full=store.get('address1'),\n city=store.get('city'),\n state=store.get('statecode'),\n postcode=store.get('zipcode'),\n website=URL + store.get('url'),\n )\n", "path": "locations/spiders/quiznos.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\nimport scrapy\nimport json\nimport re\n\nfrom locations.items import GeojsonPointItem\n\n\nclass QuiznosSpider(scrapy.Spider):\n name = \"quiznos\"\n item_attributes = { 'brand': \"Quizno's\", 'brand_wikidata': \"Q1936229\" }\n allowed_domains = [\"https://restaurants.quiznos.com\"]\n start_urls = (\n 'https://restaurants.quiznos.com/data/stores.json',\n )\n\n def store_hours(self, store_hours):\n if store_hours == '' or store_hours == ' - ' or store_hours.lower().find('close') > -1:\n return ''\n\n if store_hours == 'Open 24 Hours - Open 24 Hours':\n return '24/7'\n\n day_groups = []\n this_day_group = None\n hour_intervals = []\n\n interval = store_hours.split(' - ')\n start_time = interval[0].split(' ')\n end_time = interval[1].split(' ')\n start_hour = start_time[0].split(':')\n end_hour = end_time[0].split(':')\n\n hour_intervals.append('{}:{}-{}:{}'.format(\n start_hour[0],\n start_hour[1],\n int(end_hour[0]) + 12 if end_time[1] == 'PM' else end_hour[0],\n end_hour[1],\n ))\n\n hours = ','.join(hour_intervals)\n\n if not this_day_group:\n this_day_group = {\n 'from_day': 'Su',\n 'to_day': 'Sa',\n 'hours': hours\n }\n\n day_groups.append(this_day_group)\n\n opening_hours = \"\"\n if len(day_groups) == 1 and day_groups[0]['hours'] in ('00:00-23:59', '00:00-00:00'):\n opening_hours = '24/7'\n else:\n for day_group in day_groups:\n if day_group['from_day'] == day_group['to_day']:\n opening_hours += '{from_day} {hours}; '.format(**day_group)\n elif day_group['from_day'] == 'Su' and day_group['to_day'] == 'Sa':\n opening_hours += '{hours}; '.format(**day_group)\n else:\n opening_hours += '{from_day}-{to_day} {hours}; '.format(**day_group)\n opening_hours = opening_hours[:-2]\n\n return opening_hours\n\n def parse(self, response):\n data = response.body_as_unicode()\n stores = json.loads(re.search(r'storeList\\((.*)\\)', data).group(1))\n\n for store in stores:\n properties = {\n \"lat\": store.get('latitude'),\n \"lon\": store.get('longitude'),\n \"ref\": str(store.get('storeid')),\n \"phone\": store.get('phone'),\n \"name\": store.get('restaurantname'),\n \"opening_hours\": self.store_hours(store.get('businesshours')),\n \"addr_full\": store.get('address1'),\n \"city\": store.get('city'),\n \"state\": store.get('statecode'),\n \"postcode\": store.get('zipcode'),\n \"website\": response.urljoin(store.get('url')),\n }\n\n yield GeojsonPointItem(**properties)\n", "path": "locations/spiders/quiznos.py"}]}
| 1,278 | 609 |
gh_patches_debug_37893
|
rasdani/github-patches
|
git_diff
|
hpcaitech__ColossalAI-5532
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[tensor] fix some unittests
[tensor] fix some unittests
[tensor] fix some unittests
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `examples/language/grok-1/inference_tp.py`
Content:
```
1 import time
2
3 import torch
4 from grok1_policy import Grok1ForCausalLMPolicy
5 from transformers import AutoModelForCausalLM, LlamaTokenizerFast
6 from utils import get_defualt_parser, inference, print_output
7
8 import colossalai
9 from colossalai.booster import Booster
10 from colossalai.booster.plugin import HybridParallelPlugin
11 from colossalai.cluster import DistCoordinator
12 from colossalai.lazy import LazyInitContext
13 from colossalai.utils import get_current_device
14
15 if __name__ == "__main__":
16 parser = get_defualt_parser()
17 args = parser.parse_args()
18 start = time.time()
19 colossalai.launch_from_torch({})
20 coordinator = DistCoordinator()
21 plugin = HybridParallelPlugin(
22 tp_size=coordinator.world_size,
23 pp_size=1,
24 precision="bf16",
25 parallel_output=False,
26 custom_policy=Grok1ForCausalLMPolicy(),
27 )
28 booster = Booster(plugin=plugin)
29 torch.set_default_dtype(torch.bfloat16)
30 with LazyInitContext(default_device=get_current_device()):
31 model = AutoModelForCausalLM.from_pretrained(
32 args.pretrained, trust_remote_code=True, torch_dtype=torch.bfloat16
33 )
34 model, *_ = booster.boost(model)
35 model.eval()
36 init_time = time.time() - start
37
38 # A transformers-compatible version of the grok-1 tokenizer by Xenova
39 # https://huggingface.co/Xenova/grok-1-tokenizer
40 tokenizer = LlamaTokenizerFast.from_pretrained("Xenova/grok-1-tokenizer")
41
42 for text in args.text:
43 output = inference(
44 model.unwrap(),
45 tokenizer,
46 text,
47 max_new_tokens=args.max_new_tokens,
48 do_sample=args.do_sample,
49 temperature=args.temperature,
50 top_k=args.top_k,
51 top_p=args.top_p,
52 )
53 if coordinator.is_master():
54 print_output(text, tokenizer.decode(output))
55
56 overall_time = time.time() - start
57 gen_latency = overall_time - init_time
58 avg_gen_latency = gen_latency / len(args.text)
59 coordinator.print_on_master(
60 f"Initializing time: {init_time:.2f} seconds.\n"
61 f"Overall time: {overall_time:.2f} seconds. \n"
62 f"Generation latency: {gen_latency:.2f} seconds. \n"
63 f"Average generation latency: {avg_gen_latency:.2f} seconds. \n"
64 )
65
```
Path: `examples/language/grok-1/inference.py`
Content:
```
1 import time
2
3 import torch
4 from transformers import AutoModelForCausalLM, LlamaTokenizerFast
5 from utils import get_defualt_parser, inference, print_output
6
7 if __name__ == "__main__":
8 parser = get_defualt_parser()
9 args = parser.parse_args()
10 start = time.time()
11 torch.set_default_dtype(torch.bfloat16)
12 model = AutoModelForCausalLM.from_pretrained(
13 args.pretrained,
14 trust_remote_code=True,
15 device_map="auto",
16 torch_dtype=torch.bfloat16,
17 )
18 model.eval()
19 init_time = time.time() - start
20
21 # A transformers-compatible version of the grok-1 tokenizer by Xenova
22 # https://huggingface.co/Xenova/grok-1-tokenizer
23 tokenizer = LlamaTokenizerFast.from_pretrained("Xenova/grok-1-tokenizer")
24
25 for text in args.text:
26 output = inference(
27 model,
28 tokenizer,
29 text,
30 max_new_tokens=args.max_new_tokens,
31 do_sample=args.do_sample,
32 temperature=args.temperature,
33 top_k=args.top_k,
34 top_p=args.top_p,
35 )
36 print_output(text, tokenizer.decode(output))
37
38 overall_time = time.time() - start
39 gen_latency = overall_time - init_time
40 avg_gen_latency = gen_latency / len(args.text)
41 print(
42 f"Initializing time: {init_time:.2f} seconds.\n"
43 f"Overall time: {overall_time:.2f} seconds. \n"
44 f"Generation latency: {gen_latency:.2f} seconds. \n"
45 f"Average generation latency: {avg_gen_latency:.2f} seconds. \n"
46 )
47
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/examples/language/grok-1/inference.py b/examples/language/grok-1/inference.py
--- a/examples/language/grok-1/inference.py
+++ b/examples/language/grok-1/inference.py
@@ -1,7 +1,7 @@
import time
import torch
-from transformers import AutoModelForCausalLM, LlamaTokenizerFast
+from transformers import AutoModelForCausalLM, AutoTokenizer
from utils import get_defualt_parser, inference, print_output
if __name__ == "__main__":
@@ -9,6 +9,9 @@
args = parser.parse_args()
start = time.time()
torch.set_default_dtype(torch.bfloat16)
+
+ tokenizer = AutoTokenizer.from_pretrained(args.pretrained, trust_remote_code=True)
+
model = AutoModelForCausalLM.from_pretrained(
args.pretrained,
trust_remote_code=True,
@@ -18,10 +21,6 @@
model.eval()
init_time = time.time() - start
- # A transformers-compatible version of the grok-1 tokenizer by Xenova
- # https://huggingface.co/Xenova/grok-1-tokenizer
- tokenizer = LlamaTokenizerFast.from_pretrained("Xenova/grok-1-tokenizer")
-
for text in args.text:
output = inference(
model,
diff --git a/examples/language/grok-1/inference_tp.py b/examples/language/grok-1/inference_tp.py
--- a/examples/language/grok-1/inference_tp.py
+++ b/examples/language/grok-1/inference_tp.py
@@ -2,7 +2,7 @@
import torch
from grok1_policy import Grok1ForCausalLMPolicy
-from transformers import AutoModelForCausalLM, LlamaTokenizerFast
+from transformers import AutoModelForCausalLM, AutoTokenizer
from utils import get_defualt_parser, inference, print_output
import colossalai
@@ -27,6 +27,9 @@
)
booster = Booster(plugin=plugin)
torch.set_default_dtype(torch.bfloat16)
+
+ tokenizer = AutoTokenizer.from_pretrained(args.pretrained, trust_remote_code=True)
+
with LazyInitContext(default_device=get_current_device()):
model = AutoModelForCausalLM.from_pretrained(
args.pretrained, trust_remote_code=True, torch_dtype=torch.bfloat16
@@ -35,10 +38,6 @@
model.eval()
init_time = time.time() - start
- # A transformers-compatible version of the grok-1 tokenizer by Xenova
- # https://huggingface.co/Xenova/grok-1-tokenizer
- tokenizer = LlamaTokenizerFast.from_pretrained("Xenova/grok-1-tokenizer")
-
for text in args.text:
output = inference(
model.unwrap(),
|
{"golden_diff": "diff --git a/examples/language/grok-1/inference.py b/examples/language/grok-1/inference.py\n--- a/examples/language/grok-1/inference.py\n+++ b/examples/language/grok-1/inference.py\n@@ -1,7 +1,7 @@\n import time\n \n import torch\n-from transformers import AutoModelForCausalLM, LlamaTokenizerFast\n+from transformers import AutoModelForCausalLM, AutoTokenizer\n from utils import get_defualt_parser, inference, print_output\n \n if __name__ == \"__main__\":\n@@ -9,6 +9,9 @@\n args = parser.parse_args()\n start = time.time()\n torch.set_default_dtype(torch.bfloat16)\n+\n+ tokenizer = AutoTokenizer.from_pretrained(args.pretrained, trust_remote_code=True)\n+\n model = AutoModelForCausalLM.from_pretrained(\n args.pretrained,\n trust_remote_code=True,\n@@ -18,10 +21,6 @@\n model.eval()\n init_time = time.time() - start\n \n- # A transformers-compatible version of the grok-1 tokenizer by Xenova\n- # https://huggingface.co/Xenova/grok-1-tokenizer\n- tokenizer = LlamaTokenizerFast.from_pretrained(\"Xenova/grok-1-tokenizer\")\n-\n for text in args.text:\n output = inference(\n model,\ndiff --git a/examples/language/grok-1/inference_tp.py b/examples/language/grok-1/inference_tp.py\n--- a/examples/language/grok-1/inference_tp.py\n+++ b/examples/language/grok-1/inference_tp.py\n@@ -2,7 +2,7 @@\n \n import torch\n from grok1_policy import Grok1ForCausalLMPolicy\n-from transformers import AutoModelForCausalLM, LlamaTokenizerFast\n+from transformers import AutoModelForCausalLM, AutoTokenizer\n from utils import get_defualt_parser, inference, print_output\n \n import colossalai\n@@ -27,6 +27,9 @@\n )\n booster = Booster(plugin=plugin)\n torch.set_default_dtype(torch.bfloat16)\n+\n+ tokenizer = AutoTokenizer.from_pretrained(args.pretrained, trust_remote_code=True)\n+\n with LazyInitContext(default_device=get_current_device()):\n model = AutoModelForCausalLM.from_pretrained(\n args.pretrained, trust_remote_code=True, torch_dtype=torch.bfloat16\n@@ -35,10 +38,6 @@\n model.eval()\n init_time = time.time() - start\n \n- # A transformers-compatible version of the grok-1 tokenizer by Xenova\n- # https://huggingface.co/Xenova/grok-1-tokenizer\n- tokenizer = LlamaTokenizerFast.from_pretrained(\"Xenova/grok-1-tokenizer\")\n-\n for text in args.text:\n output = inference(\n model.unwrap(),\n", "issue": "[tensor] fix some unittests\n\n[tensor] fix some unittests\n\n[tensor] fix some unittests\n\n", "before_files": [{"content": "import time\n\nimport torch\nfrom grok1_policy import Grok1ForCausalLMPolicy\nfrom transformers import AutoModelForCausalLM, LlamaTokenizerFast\nfrom utils import get_defualt_parser, inference, print_output\n\nimport colossalai\nfrom colossalai.booster import Booster\nfrom colossalai.booster.plugin import HybridParallelPlugin\nfrom colossalai.cluster import DistCoordinator\nfrom colossalai.lazy import LazyInitContext\nfrom colossalai.utils import get_current_device\n\nif __name__ == \"__main__\":\n parser = get_defualt_parser()\n args = parser.parse_args()\n start = time.time()\n colossalai.launch_from_torch({})\n coordinator = DistCoordinator()\n plugin = HybridParallelPlugin(\n tp_size=coordinator.world_size,\n pp_size=1,\n precision=\"bf16\",\n parallel_output=False,\n custom_policy=Grok1ForCausalLMPolicy(),\n )\n booster = Booster(plugin=plugin)\n torch.set_default_dtype(torch.bfloat16)\n with LazyInitContext(default_device=get_current_device()):\n model = AutoModelForCausalLM.from_pretrained(\n args.pretrained, trust_remote_code=True, torch_dtype=torch.bfloat16\n )\n model, *_ = booster.boost(model)\n model.eval()\n init_time = time.time() - start\n\n # A transformers-compatible version of the grok-1 tokenizer by Xenova\n # https://huggingface.co/Xenova/grok-1-tokenizer\n tokenizer = LlamaTokenizerFast.from_pretrained(\"Xenova/grok-1-tokenizer\")\n\n for text in args.text:\n output = inference(\n model.unwrap(),\n tokenizer,\n text,\n max_new_tokens=args.max_new_tokens,\n do_sample=args.do_sample,\n temperature=args.temperature,\n top_k=args.top_k,\n top_p=args.top_p,\n )\n if coordinator.is_master():\n print_output(text, tokenizer.decode(output))\n\n overall_time = time.time() - start\n gen_latency = overall_time - init_time\n avg_gen_latency = gen_latency / len(args.text)\n coordinator.print_on_master(\n f\"Initializing time: {init_time:.2f} seconds.\\n\"\n f\"Overall time: {overall_time:.2f} seconds. \\n\"\n f\"Generation latency: {gen_latency:.2f} seconds. \\n\"\n f\"Average generation latency: {avg_gen_latency:.2f} seconds. \\n\"\n )\n", "path": "examples/language/grok-1/inference_tp.py"}, {"content": "import time\n\nimport torch\nfrom transformers import AutoModelForCausalLM, LlamaTokenizerFast\nfrom utils import get_defualt_parser, inference, print_output\n\nif __name__ == \"__main__\":\n parser = get_defualt_parser()\n args = parser.parse_args()\n start = time.time()\n torch.set_default_dtype(torch.bfloat16)\n model = AutoModelForCausalLM.from_pretrained(\n args.pretrained,\n trust_remote_code=True,\n device_map=\"auto\",\n torch_dtype=torch.bfloat16,\n )\n model.eval()\n init_time = time.time() - start\n\n # A transformers-compatible version of the grok-1 tokenizer by Xenova\n # https://huggingface.co/Xenova/grok-1-tokenizer\n tokenizer = LlamaTokenizerFast.from_pretrained(\"Xenova/grok-1-tokenizer\")\n\n for text in args.text:\n output = inference(\n model,\n tokenizer,\n text,\n max_new_tokens=args.max_new_tokens,\n do_sample=args.do_sample,\n temperature=args.temperature,\n top_k=args.top_k,\n top_p=args.top_p,\n )\n print_output(text, tokenizer.decode(output))\n\n overall_time = time.time() - start\n gen_latency = overall_time - init_time\n avg_gen_latency = gen_latency / len(args.text)\n print(\n f\"Initializing time: {init_time:.2f} seconds.\\n\"\n f\"Overall time: {overall_time:.2f} seconds. \\n\"\n f\"Generation latency: {gen_latency:.2f} seconds. \\n\"\n f\"Average generation latency: {avg_gen_latency:.2f} seconds. \\n\"\n )\n", "path": "examples/language/grok-1/inference.py"}], "after_files": [{"content": "import time\n\nimport torch\nfrom grok1_policy import Grok1ForCausalLMPolicy\nfrom transformers import AutoModelForCausalLM, AutoTokenizer\nfrom utils import get_defualt_parser, inference, print_output\n\nimport colossalai\nfrom colossalai.booster import Booster\nfrom colossalai.booster.plugin import HybridParallelPlugin\nfrom colossalai.cluster import DistCoordinator\nfrom colossalai.lazy import LazyInitContext\nfrom colossalai.utils import get_current_device\n\nif __name__ == \"__main__\":\n parser = get_defualt_parser()\n args = parser.parse_args()\n start = time.time()\n colossalai.launch_from_torch({})\n coordinator = DistCoordinator()\n plugin = HybridParallelPlugin(\n tp_size=coordinator.world_size,\n pp_size=1,\n precision=\"bf16\",\n parallel_output=False,\n custom_policy=Grok1ForCausalLMPolicy(),\n )\n booster = Booster(plugin=plugin)\n torch.set_default_dtype(torch.bfloat16)\n\n tokenizer = AutoTokenizer.from_pretrained(args.pretrained, trust_remote_code=True)\n\n with LazyInitContext(default_device=get_current_device()):\n model = AutoModelForCausalLM.from_pretrained(\n args.pretrained, trust_remote_code=True, torch_dtype=torch.bfloat16\n )\n model, *_ = booster.boost(model)\n model.eval()\n init_time = time.time() - start\n\n for text in args.text:\n output = inference(\n model.unwrap(),\n tokenizer,\n text,\n max_new_tokens=args.max_new_tokens,\n do_sample=args.do_sample,\n temperature=args.temperature,\n top_k=args.top_k,\n top_p=args.top_p,\n )\n if coordinator.is_master():\n print_output(text, tokenizer.decode(output))\n\n overall_time = time.time() - start\n gen_latency = overall_time - init_time\n avg_gen_latency = gen_latency / len(args.text)\n coordinator.print_on_master(\n f\"Initializing time: {init_time:.2f} seconds.\\n\"\n f\"Overall time: {overall_time:.2f} seconds. \\n\"\n f\"Generation latency: {gen_latency:.2f} seconds. \\n\"\n f\"Average generation latency: {avg_gen_latency:.2f} seconds. \\n\"\n )\n", "path": "examples/language/grok-1/inference_tp.py"}, {"content": "import time\n\nimport torch\nfrom transformers import AutoModelForCausalLM, AutoTokenizer\nfrom utils import get_defualt_parser, inference, print_output\n\nif __name__ == \"__main__\":\n parser = get_defualt_parser()\n args = parser.parse_args()\n start = time.time()\n torch.set_default_dtype(torch.bfloat16)\n\n tokenizer = AutoTokenizer.from_pretrained(args.pretrained, trust_remote_code=True)\n\n model = AutoModelForCausalLM.from_pretrained(\n args.pretrained,\n trust_remote_code=True,\n device_map=\"auto\",\n torch_dtype=torch.bfloat16,\n )\n model.eval()\n init_time = time.time() - start\n\n for text in args.text:\n output = inference(\n model,\n tokenizer,\n text,\n max_new_tokens=args.max_new_tokens,\n do_sample=args.do_sample,\n temperature=args.temperature,\n top_k=args.top_k,\n top_p=args.top_p,\n )\n print_output(text, tokenizer.decode(output))\n\n overall_time = time.time() - start\n gen_latency = overall_time - init_time\n avg_gen_latency = gen_latency / len(args.text)\n print(\n f\"Initializing time: {init_time:.2f} seconds.\\n\"\n f\"Overall time: {overall_time:.2f} seconds. \\n\"\n f\"Generation latency: {gen_latency:.2f} seconds. \\n\"\n f\"Average generation latency: {avg_gen_latency:.2f} seconds. \\n\"\n )\n", "path": "examples/language/grok-1/inference.py"}]}
| 1,411 | 622 |
gh_patches_debug_42178
|
rasdani/github-patches
|
git_diff
|
huggingface__dataset-viewer-1183
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
The workers fail with `mongoengine.errors.FieldDoesNotExist: The fields "{'force'}" do not exist on the document "Job"`
```
INFO: 2023-05-13 10:50:18,007 - root - Worker loop started
INFO: 2023-05-13 10:50:18,023 - root - Starting heartbeat.
ERROR: 2023-05-13 10:50:18,115 - asyncio - Task exception was never retrieved
future: <Task finished name='Task-2' coro=<every() done, defined at /src/services/worker/src/worker/executor.py:26> exception=FieldDoesNotExist('The fields "{\'force\'}" do not exist on the document "Job"')>
Traceback (most recent call last):
File "/src/services/worker/src/worker/executor.py", line 30, in every
out = func(*args, **kwargs)
File "/src/services/worker/src/worker/executor.py", line 117, in kill_zombies
zombies = queue.get_zombies(max_seconds_without_heartbeat=self.max_seconds_without_heartbeat_for_zombies)
File "/src/libs/libcommon/src/libcommon/queue.py", line 622, in get_zombies
zombies = [
File "/src/libs/libcommon/src/libcommon/queue.py", line 622, in <listcomp>
zombies = [
File "/src/services/worker/.venv/lib/python3.9/site-packages/mongoengine/queryset/queryset.py", line 110, in _iter_results
self._populate_cache()
File "/src/services/worker/.venv/lib/python3.9/site-packages/mongoengine/queryset/queryset.py", line 129, in _populate_cache
self._result_cache.append(next(self))
File "/src/services/worker/.venv/lib/python3.9/site-packages/mongoengine/queryset/base.py", line 1599, in __next__
doc = self._document._from_son(
File "/src/services/worker/.venv/lib/python3.9/site-packages/mongoengine/base/document.py", line 836, in _from_son
obj = cls(__auto_convert=False, _created=created, **data)
File "/src/services/worker/.venv/lib/python3.9/site-packages/mongoengine/base/document.py", line 99, in __init__
raise FieldDoesNotExist(msg)
mongoengine.errors.FieldDoesNotExist: The fields "{'force'}" do not exist on the document "Job"
Traceback (most recent call last):
File "/src/services/worker/src/worker/main.py", line 61, in <module>
worker_executor.start()
File "/src/services/worker/src/worker/executor.py", line 89, in start
loop.run_until_complete(
File "/usr/local/lib/python3.9/asyncio/base_events.py", line 645, in run_until_complete
raise RuntimeError('Event loop stopped before Future completed.')
RuntimeError: Event loop stopped before Future completed.
```
<img width="622" alt="Capture d’écran 2023-05-13 à 12 52 06" src="https://github.com/huggingface/datasets-server/assets/1676121/3e4d7e1a-dd02-476d-b671-57ce03d28d7b">
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `services/worker/src/worker/job_manager.py`
Content:
```
1 # SPDX-License-Identifier: Apache-2.0
2 # Copyright 2022 The HuggingFace Authors.
3
4 import logging
5 from http import HTTPStatus
6 from typing import Literal, Optional
7
8 from libcommon.config import CommonConfig
9 from libcommon.dataset import DatasetNotFoundError, get_dataset_git_revision
10 from libcommon.exceptions import CustomError
11 from libcommon.processing_graph import ProcessingGraph, ProcessingStep
12 from libcommon.simple_cache import (
13 DoesNotExist,
14 get_response_without_content_params,
15 upsert_response_params,
16 )
17 from libcommon.state import DatasetState
18 from libcommon.utils import JobInfo, JobParams, Priority, Status, orjson_dumps
19
20 from worker.common_exceptions import (
21 JobManagerCrashedError,
22 JobManagerExceededMaximumDurationError,
23 NoGitRevisionError,
24 ResponseAlreadyComputedError,
25 TooBigContentError,
26 UnexpectedError,
27 )
28 from worker.config import AppConfig, WorkerConfig
29 from worker.job_runner import JobRunner
30
31 # List of error codes that should trigger a retry.
32 ERROR_CODES_TO_RETRY: list[str] = ["ClientConnectionError"]
33
34
35 class JobManager:
36 """
37 A job manager is a class that handles a job runner compute, for a specific processing step.
38
39 Args:
40 job_info (:obj:`JobInfo`):
41 The job to process. It contains the job_id, the job type, the dataset, the config, the split
42 and the priority level.
43 common_config (:obj:`CommonConfig`):
44 The common config.
45 processing_step (:obj:`ProcessingStep`):
46 The processing step to process.
47 """
48
49 job_id: str
50 job_params: JobParams
51 priority: Priority
52 worker_config: WorkerConfig
53 common_config: CommonConfig
54 processing_step: ProcessingStep
55 processing_graph: ProcessingGraph
56 _dataset_git_revision: Optional[str] = None
57 job_runner: JobRunner
58
59 def __init__(
60 self,
61 job_info: JobInfo,
62 app_config: AppConfig,
63 job_runner: JobRunner,
64 processing_graph: ProcessingGraph,
65 ) -> None:
66 self.job_info = job_info
67 self.job_type = job_info["type"]
68 self.job_id = job_info["job_id"]
69 self.priority = job_info["priority"]
70 self.job_params = job_info["params"]
71 self.common_config = app_config.common
72 self.worker_config = app_config.worker
73 self.job_runner = job_runner
74 self.processing_graph = processing_graph
75 self.processing_step = self.job_runner.processing_step
76 self.setup()
77
78 def setup(self) -> None:
79 job_type = self.job_runner.get_job_type()
80 if self.processing_step.job_type != job_type:
81 raise ValueError(
82 f"The processing step's job type is {self.processing_step.job_type}, but the job manager only"
83 f" processes {job_type}"
84 )
85 if self.job_type != job_type:
86 raise ValueError(
87 f"The submitted job type is {self.job_type}, but the job manager only processes {job_type}"
88 )
89
90 def __str__(self) -> str:
91 return f"JobManager(job_id={self.job_id} dataset={self.job_params['dataset']} job_info={self.job_info}"
92
93 def log(self, level: int, msg: str) -> None:
94 logging.log(level=level, msg=f"[{self.processing_step.job_type}] {msg}")
95
96 def debug(self, msg: str) -> None:
97 self.log(level=logging.DEBUG, msg=msg)
98
99 def info(self, msg: str) -> None:
100 self.log(level=logging.INFO, msg=msg)
101
102 def warning(self, msg: str) -> None:
103 self.log(level=logging.WARNING, msg=msg)
104
105 def exception(self, msg: str) -> None:
106 self.log(level=logging.ERROR, msg=msg)
107
108 def critical(self, msg: str) -> None:
109 self.log(level=logging.CRITICAL, msg=msg)
110
111 def run(self) -> Literal[Status.SUCCESS, Status.ERROR]:
112 try:
113 self.info(f"compute {self}")
114 result: Literal[Status.SUCCESS, Status.ERROR] = Status.SUCCESS if self.process() else Status.ERROR
115 except Exception:
116 self.exception(f"error while computing {self}")
117 result = Status.ERROR
118 self.backfill()
119 return result
120
121 def get_dataset_git_revision(self) -> Optional[str]:
122 """Get the git revision of the dataset repository."""
123 if self._dataset_git_revision is None:
124 self._dataset_git_revision = get_dataset_git_revision(
125 dataset=self.job_params["dataset"],
126 hf_endpoint=self.common_config.hf_endpoint,
127 hf_token=self.common_config.hf_token,
128 )
129 return self._dataset_git_revision
130
131 def raise_if_parallel_response_exists(self, parallel_cache_kind: str, parallel_job_version: int) -> None:
132 try:
133 existing_response = get_response_without_content_params(
134 kind=parallel_cache_kind,
135 job_params=self.job_params,
136 )
137
138 dataset_git_revision = self.get_dataset_git_revision()
139 if (
140 existing_response["http_status"] == HTTPStatus.OK
141 and existing_response["job_runner_version"] == parallel_job_version
142 and existing_response["progress"] == 1.0 # completed response
143 and dataset_git_revision is not None
144 and existing_response["dataset_git_revision"] == dataset_git_revision
145 ):
146 raise ResponseAlreadyComputedError(
147 f"Response has already been computed and stored in cache kind: {parallel_cache_kind}. Compute will"
148 " be skipped."
149 )
150 except DoesNotExist:
151 logging.debug(f"no cache found for {parallel_cache_kind}.")
152
153 def process(
154 self,
155 ) -> bool:
156 dataset_git_revision = None
157 try:
158 dataset_git_revision = self.get_dataset_git_revision()
159 if dataset_git_revision is None:
160 self.debug(f"the dataset={self.job_params['dataset']} has no git revision, don't update the cache")
161 raise NoGitRevisionError(f"Could not get git revision for dataset {self.job_params['dataset']}")
162 try:
163 self.job_runner.pre_compute()
164 parallel_job_runner = self.job_runner.get_parallel_job_runner()
165 if parallel_job_runner:
166 self.raise_if_parallel_response_exists(
167 parallel_cache_kind=parallel_job_runner["job_type"],
168 parallel_job_version=parallel_job_runner["job_runner_version"],
169 )
170
171 job_result = self.job_runner.compute()
172 content = job_result.content
173
174 # Validate content size
175 if len(orjson_dumps(content)) > self.worker_config.content_max_bytes:
176 raise TooBigContentError(
177 "The computed response content exceeds the supported size in bytes"
178 f" ({self.worker_config.content_max_bytes})."
179 )
180 finally:
181 # ensure the post_compute hook is called even if the compute raises an exception
182 self.job_runner.post_compute()
183 upsert_response_params(
184 kind=self.processing_step.cache_kind,
185 job_params=self.job_params,
186 content=content,
187 http_status=HTTPStatus.OK,
188 job_runner_version=self.job_runner.get_job_runner_version(),
189 dataset_git_revision=dataset_git_revision,
190 progress=job_result.progress,
191 )
192 self.debug(f"dataset={self.job_params['dataset']} job_info={self.job_info} is valid, cache updated")
193 return True
194 except DatasetNotFoundError:
195 # To avoid filling the cache, we don't save this error. Otherwise, DoS is possible.
196 self.debug(f"the dataset={self.job_params['dataset']} could not be found, don't update the cache")
197 return False
198 except Exception as err:
199 e = err if isinstance(err, CustomError) else UnexpectedError(str(err), err)
200 upsert_response_params(
201 kind=self.processing_step.cache_kind,
202 job_params=self.job_params,
203 content=dict(e.as_response()),
204 http_status=e.status_code,
205 error_code=e.code,
206 details=dict(e.as_response_with_cause()),
207 job_runner_version=self.job_runner.get_job_runner_version(),
208 dataset_git_revision=dataset_git_revision,
209 )
210 self.debug(f"response for job_info={self.job_info} had an error, cache updated")
211 return False
212
213 def backfill(self) -> None:
214 """Evaluate the state of the dataset and backfill the cache if necessary."""
215 DatasetState(
216 dataset=self.job_params["dataset"],
217 processing_graph=self.processing_graph,
218 revision=self.get_dataset_git_revision(),
219 error_codes_to_retry=ERROR_CODES_TO_RETRY,
220 priority=self.priority,
221 ).backfill()
222
223 def set_crashed(self, message: str, cause: Optional[BaseException] = None) -> None:
224 error = JobManagerCrashedError(message=message, cause=cause)
225 upsert_response_params(
226 kind=self.processing_step.cache_kind,
227 job_params=self.job_params,
228 content=dict(error.as_response()),
229 http_status=error.status_code,
230 error_code=error.code,
231 details=dict(error.as_response_with_cause()),
232 job_runner_version=self.job_runner.get_job_runner_version(),
233 dataset_git_revision=self.get_dataset_git_revision(),
234 )
235 logging.debug(
236 f"response for dataset={self.job_params['dataset']} job_info={self.job_info} had an error (crashed), cache"
237 " updated"
238 )
239
240 def set_exceeded_maximum_duration(self, message: str, cause: Optional[BaseException] = None) -> None:
241 error = JobManagerExceededMaximumDurationError(message=message, cause=cause)
242 upsert_response_params(
243 kind=self.processing_step.cache_kind,
244 job_params=self.job_params,
245 content=dict(error.as_response()),
246 http_status=error.status_code,
247 error_code=error.code,
248 details=dict(error.as_response_with_cause()),
249 job_runner_version=self.job_runner.get_job_runner_version(),
250 dataset_git_revision=self.get_dataset_git_revision(),
251 )
252 logging.debug(
253 f"response for dataset={self.job_params['dataset']} job_info={self.job_info} had an error (exceeded"
254 " maximum duration), cache updated"
255 )
256
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/services/worker/src/worker/job_manager.py b/services/worker/src/worker/job_manager.py
--- a/services/worker/src/worker/job_manager.py
+++ b/services/worker/src/worker/job_manager.py
@@ -115,17 +115,25 @@
except Exception:
self.exception(f"error while computing {self}")
result = Status.ERROR
- self.backfill()
+ revision = self.get_dataset_git_revision(allow_raise=False)
+ if revision is not None:
+ self.backfill(revision=revision)
return result
- def get_dataset_git_revision(self) -> Optional[str]:
+ def get_dataset_git_revision(self, allow_raise: bool = True) -> Optional[str]:
"""Get the git revision of the dataset repository."""
if self._dataset_git_revision is None:
- self._dataset_git_revision = get_dataset_git_revision(
- dataset=self.job_params["dataset"],
- hf_endpoint=self.common_config.hf_endpoint,
- hf_token=self.common_config.hf_token,
- )
+ try:
+ self._dataset_git_revision = get_dataset_git_revision(
+ dataset=self.job_params["dataset"],
+ hf_endpoint=self.common_config.hf_endpoint,
+ hf_token=self.common_config.hf_token,
+ )
+ except Exception as e:
+ if allow_raise:
+ raise e
+ else:
+ return None
return self._dataset_git_revision
def raise_if_parallel_response_exists(self, parallel_cache_kind: str, parallel_job_version: int) -> None:
@@ -210,12 +218,12 @@
self.debug(f"response for job_info={self.job_info} had an error, cache updated")
return False
- def backfill(self) -> None:
+ def backfill(self, revision: str) -> None:
"""Evaluate the state of the dataset and backfill the cache if necessary."""
DatasetState(
dataset=self.job_params["dataset"],
processing_graph=self.processing_graph,
- revision=self.get_dataset_git_revision(),
+ revision=revision,
error_codes_to_retry=ERROR_CODES_TO_RETRY,
priority=self.priority,
).backfill()
@@ -230,7 +238,7 @@
error_code=error.code,
details=dict(error.as_response_with_cause()),
job_runner_version=self.job_runner.get_job_runner_version(),
- dataset_git_revision=self.get_dataset_git_revision(),
+ dataset_git_revision=self.get_dataset_git_revision(allow_raise=False),
)
logging.debug(
f"response for dataset={self.job_params['dataset']} job_info={self.job_info} had an error (crashed), cache"
@@ -247,7 +255,7 @@
error_code=error.code,
details=dict(error.as_response_with_cause()),
job_runner_version=self.job_runner.get_job_runner_version(),
- dataset_git_revision=self.get_dataset_git_revision(),
+ dataset_git_revision=self.get_dataset_git_revision(allow_raise=False),
)
logging.debug(
f"response for dataset={self.job_params['dataset']} job_info={self.job_info} had an error (exceeded"
|
{"golden_diff": "diff --git a/services/worker/src/worker/job_manager.py b/services/worker/src/worker/job_manager.py\n--- a/services/worker/src/worker/job_manager.py\n+++ b/services/worker/src/worker/job_manager.py\n@@ -115,17 +115,25 @@\n except Exception:\n self.exception(f\"error while computing {self}\")\n result = Status.ERROR\n- self.backfill()\n+ revision = self.get_dataset_git_revision(allow_raise=False)\n+ if revision is not None:\n+ self.backfill(revision=revision)\n return result\n \n- def get_dataset_git_revision(self) -> Optional[str]:\n+ def get_dataset_git_revision(self, allow_raise: bool = True) -> Optional[str]:\n \"\"\"Get the git revision of the dataset repository.\"\"\"\n if self._dataset_git_revision is None:\n- self._dataset_git_revision = get_dataset_git_revision(\n- dataset=self.job_params[\"dataset\"],\n- hf_endpoint=self.common_config.hf_endpoint,\n- hf_token=self.common_config.hf_token,\n- )\n+ try:\n+ self._dataset_git_revision = get_dataset_git_revision(\n+ dataset=self.job_params[\"dataset\"],\n+ hf_endpoint=self.common_config.hf_endpoint,\n+ hf_token=self.common_config.hf_token,\n+ )\n+ except Exception as e:\n+ if allow_raise:\n+ raise e\n+ else:\n+ return None\n return self._dataset_git_revision\n \n def raise_if_parallel_response_exists(self, parallel_cache_kind: str, parallel_job_version: int) -> None:\n@@ -210,12 +218,12 @@\n self.debug(f\"response for job_info={self.job_info} had an error, cache updated\")\n return False\n \n- def backfill(self) -> None:\n+ def backfill(self, revision: str) -> None:\n \"\"\"Evaluate the state of the dataset and backfill the cache if necessary.\"\"\"\n DatasetState(\n dataset=self.job_params[\"dataset\"],\n processing_graph=self.processing_graph,\n- revision=self.get_dataset_git_revision(),\n+ revision=revision,\n error_codes_to_retry=ERROR_CODES_TO_RETRY,\n priority=self.priority,\n ).backfill()\n@@ -230,7 +238,7 @@\n error_code=error.code,\n details=dict(error.as_response_with_cause()),\n job_runner_version=self.job_runner.get_job_runner_version(),\n- dataset_git_revision=self.get_dataset_git_revision(),\n+ dataset_git_revision=self.get_dataset_git_revision(allow_raise=False),\n )\n logging.debug(\n f\"response for dataset={self.job_params['dataset']} job_info={self.job_info} had an error (crashed), cache\"\n@@ -247,7 +255,7 @@\n error_code=error.code,\n details=dict(error.as_response_with_cause()),\n job_runner_version=self.job_runner.get_job_runner_version(),\n- dataset_git_revision=self.get_dataset_git_revision(),\n+ dataset_git_revision=self.get_dataset_git_revision(allow_raise=False),\n )\n logging.debug(\n f\"response for dataset={self.job_params['dataset']} job_info={self.job_info} had an error (exceeded\"\n", "issue": "The workers fail with `mongoengine.errors.FieldDoesNotExist: The fields \"{'force'}\" do not exist on the document \"Job\"`\n```\r\nINFO: 2023-05-13 10:50:18,007 - root - Worker loop started\r\nINFO: 2023-05-13 10:50:18,023 - root - Starting heartbeat.\r\nERROR: 2023-05-13 10:50:18,115 - asyncio - Task exception was never retrieved\r\nfuture: <Task finished name='Task-2' coro=<every() done, defined at /src/services/worker/src/worker/executor.py:26> exception=FieldDoesNotExist('The fields \"{\\'force\\'}\" do not exist on the document \"Job\"')>\r\nTraceback (most recent call last):\r\n File \"/src/services/worker/src/worker/executor.py\", line 30, in every\r\n out = func(*args, **kwargs)\r\n File \"/src/services/worker/src/worker/executor.py\", line 117, in kill_zombies\r\n zombies = queue.get_zombies(max_seconds_without_heartbeat=self.max_seconds_without_heartbeat_for_zombies)\r\n File \"/src/libs/libcommon/src/libcommon/queue.py\", line 622, in get_zombies\r\n zombies = [\r\n File \"/src/libs/libcommon/src/libcommon/queue.py\", line 622, in <listcomp>\r\n zombies = [\r\n File \"/src/services/worker/.venv/lib/python3.9/site-packages/mongoengine/queryset/queryset.py\", line 110, in _iter_results\r\n self._populate_cache()\r\n File \"/src/services/worker/.venv/lib/python3.9/site-packages/mongoengine/queryset/queryset.py\", line 129, in _populate_cache\r\n self._result_cache.append(next(self))\r\n File \"/src/services/worker/.venv/lib/python3.9/site-packages/mongoengine/queryset/base.py\", line 1599, in __next__\r\n doc = self._document._from_son(\r\n File \"/src/services/worker/.venv/lib/python3.9/site-packages/mongoengine/base/document.py\", line 836, in _from_son\r\n obj = cls(__auto_convert=False, _created=created, **data)\r\n File \"/src/services/worker/.venv/lib/python3.9/site-packages/mongoengine/base/document.py\", line 99, in __init__\r\n raise FieldDoesNotExist(msg)\r\nmongoengine.errors.FieldDoesNotExist: The fields \"{'force'}\" do not exist on the document \"Job\"\r\nTraceback (most recent call last):\r\n File \"/src/services/worker/src/worker/main.py\", line 61, in <module>\r\n worker_executor.start()\r\n File \"/src/services/worker/src/worker/executor.py\", line 89, in start\r\n loop.run_until_complete(\r\n File \"/usr/local/lib/python3.9/asyncio/base_events.py\", line 645, in run_until_complete\r\n raise RuntimeError('Event loop stopped before Future completed.')\r\nRuntimeError: Event loop stopped before Future completed.\r\n```\r\n\r\n<img width=\"622\" alt=\"Capture d\u2019e\u0301cran 2023-05-13 a\u0300 12 52 06\" src=\"https://github.com/huggingface/datasets-server/assets/1676121/3e4d7e1a-dd02-476d-b671-57ce03d28d7b\">\r\n\n", "before_files": [{"content": "# SPDX-License-Identifier: Apache-2.0\n# Copyright 2022 The HuggingFace Authors.\n\nimport logging\nfrom http import HTTPStatus\nfrom typing import Literal, Optional\n\nfrom libcommon.config import CommonConfig\nfrom libcommon.dataset import DatasetNotFoundError, get_dataset_git_revision\nfrom libcommon.exceptions import CustomError\nfrom libcommon.processing_graph import ProcessingGraph, ProcessingStep\nfrom libcommon.simple_cache import (\n DoesNotExist,\n get_response_without_content_params,\n upsert_response_params,\n)\nfrom libcommon.state import DatasetState\nfrom libcommon.utils import JobInfo, JobParams, Priority, Status, orjson_dumps\n\nfrom worker.common_exceptions import (\n JobManagerCrashedError,\n JobManagerExceededMaximumDurationError,\n NoGitRevisionError,\n ResponseAlreadyComputedError,\n TooBigContentError,\n UnexpectedError,\n)\nfrom worker.config import AppConfig, WorkerConfig\nfrom worker.job_runner import JobRunner\n\n# List of error codes that should trigger a retry.\nERROR_CODES_TO_RETRY: list[str] = [\"ClientConnectionError\"]\n\n\nclass JobManager:\n \"\"\"\n A job manager is a class that handles a job runner compute, for a specific processing step.\n\n Args:\n job_info (:obj:`JobInfo`):\n The job to process. It contains the job_id, the job type, the dataset, the config, the split\n and the priority level.\n common_config (:obj:`CommonConfig`):\n The common config.\n processing_step (:obj:`ProcessingStep`):\n The processing step to process.\n \"\"\"\n\n job_id: str\n job_params: JobParams\n priority: Priority\n worker_config: WorkerConfig\n common_config: CommonConfig\n processing_step: ProcessingStep\n processing_graph: ProcessingGraph\n _dataset_git_revision: Optional[str] = None\n job_runner: JobRunner\n\n def __init__(\n self,\n job_info: JobInfo,\n app_config: AppConfig,\n job_runner: JobRunner,\n processing_graph: ProcessingGraph,\n ) -> None:\n self.job_info = job_info\n self.job_type = job_info[\"type\"]\n self.job_id = job_info[\"job_id\"]\n self.priority = job_info[\"priority\"]\n self.job_params = job_info[\"params\"]\n self.common_config = app_config.common\n self.worker_config = app_config.worker\n self.job_runner = job_runner\n self.processing_graph = processing_graph\n self.processing_step = self.job_runner.processing_step\n self.setup()\n\n def setup(self) -> None:\n job_type = self.job_runner.get_job_type()\n if self.processing_step.job_type != job_type:\n raise ValueError(\n f\"The processing step's job type is {self.processing_step.job_type}, but the job manager only\"\n f\" processes {job_type}\"\n )\n if self.job_type != job_type:\n raise ValueError(\n f\"The submitted job type is {self.job_type}, but the job manager only processes {job_type}\"\n )\n\n def __str__(self) -> str:\n return f\"JobManager(job_id={self.job_id} dataset={self.job_params['dataset']} job_info={self.job_info}\"\n\n def log(self, level: int, msg: str) -> None:\n logging.log(level=level, msg=f\"[{self.processing_step.job_type}] {msg}\")\n\n def debug(self, msg: str) -> None:\n self.log(level=logging.DEBUG, msg=msg)\n\n def info(self, msg: str) -> None:\n self.log(level=logging.INFO, msg=msg)\n\n def warning(self, msg: str) -> None:\n self.log(level=logging.WARNING, msg=msg)\n\n def exception(self, msg: str) -> None:\n self.log(level=logging.ERROR, msg=msg)\n\n def critical(self, msg: str) -> None:\n self.log(level=logging.CRITICAL, msg=msg)\n\n def run(self) -> Literal[Status.SUCCESS, Status.ERROR]:\n try:\n self.info(f\"compute {self}\")\n result: Literal[Status.SUCCESS, Status.ERROR] = Status.SUCCESS if self.process() else Status.ERROR\n except Exception:\n self.exception(f\"error while computing {self}\")\n result = Status.ERROR\n self.backfill()\n return result\n\n def get_dataset_git_revision(self) -> Optional[str]:\n \"\"\"Get the git revision of the dataset repository.\"\"\"\n if self._dataset_git_revision is None:\n self._dataset_git_revision = get_dataset_git_revision(\n dataset=self.job_params[\"dataset\"],\n hf_endpoint=self.common_config.hf_endpoint,\n hf_token=self.common_config.hf_token,\n )\n return self._dataset_git_revision\n\n def raise_if_parallel_response_exists(self, parallel_cache_kind: str, parallel_job_version: int) -> None:\n try:\n existing_response = get_response_without_content_params(\n kind=parallel_cache_kind,\n job_params=self.job_params,\n )\n\n dataset_git_revision = self.get_dataset_git_revision()\n if (\n existing_response[\"http_status\"] == HTTPStatus.OK\n and existing_response[\"job_runner_version\"] == parallel_job_version\n and existing_response[\"progress\"] == 1.0 # completed response\n and dataset_git_revision is not None\n and existing_response[\"dataset_git_revision\"] == dataset_git_revision\n ):\n raise ResponseAlreadyComputedError(\n f\"Response has already been computed and stored in cache kind: {parallel_cache_kind}. Compute will\"\n \" be skipped.\"\n )\n except DoesNotExist:\n logging.debug(f\"no cache found for {parallel_cache_kind}.\")\n\n def process(\n self,\n ) -> bool:\n dataset_git_revision = None\n try:\n dataset_git_revision = self.get_dataset_git_revision()\n if dataset_git_revision is None:\n self.debug(f\"the dataset={self.job_params['dataset']} has no git revision, don't update the cache\")\n raise NoGitRevisionError(f\"Could not get git revision for dataset {self.job_params['dataset']}\")\n try:\n self.job_runner.pre_compute()\n parallel_job_runner = self.job_runner.get_parallel_job_runner()\n if parallel_job_runner:\n self.raise_if_parallel_response_exists(\n parallel_cache_kind=parallel_job_runner[\"job_type\"],\n parallel_job_version=parallel_job_runner[\"job_runner_version\"],\n )\n\n job_result = self.job_runner.compute()\n content = job_result.content\n\n # Validate content size\n if len(orjson_dumps(content)) > self.worker_config.content_max_bytes:\n raise TooBigContentError(\n \"The computed response content exceeds the supported size in bytes\"\n f\" ({self.worker_config.content_max_bytes}).\"\n )\n finally:\n # ensure the post_compute hook is called even if the compute raises an exception\n self.job_runner.post_compute()\n upsert_response_params(\n kind=self.processing_step.cache_kind,\n job_params=self.job_params,\n content=content,\n http_status=HTTPStatus.OK,\n job_runner_version=self.job_runner.get_job_runner_version(),\n dataset_git_revision=dataset_git_revision,\n progress=job_result.progress,\n )\n self.debug(f\"dataset={self.job_params['dataset']} job_info={self.job_info} is valid, cache updated\")\n return True\n except DatasetNotFoundError:\n # To avoid filling the cache, we don't save this error. Otherwise, DoS is possible.\n self.debug(f\"the dataset={self.job_params['dataset']} could not be found, don't update the cache\")\n return False\n except Exception as err:\n e = err if isinstance(err, CustomError) else UnexpectedError(str(err), err)\n upsert_response_params(\n kind=self.processing_step.cache_kind,\n job_params=self.job_params,\n content=dict(e.as_response()),\n http_status=e.status_code,\n error_code=e.code,\n details=dict(e.as_response_with_cause()),\n job_runner_version=self.job_runner.get_job_runner_version(),\n dataset_git_revision=dataset_git_revision,\n )\n self.debug(f\"response for job_info={self.job_info} had an error, cache updated\")\n return False\n\n def backfill(self) -> None:\n \"\"\"Evaluate the state of the dataset and backfill the cache if necessary.\"\"\"\n DatasetState(\n dataset=self.job_params[\"dataset\"],\n processing_graph=self.processing_graph,\n revision=self.get_dataset_git_revision(),\n error_codes_to_retry=ERROR_CODES_TO_RETRY,\n priority=self.priority,\n ).backfill()\n\n def set_crashed(self, message: str, cause: Optional[BaseException] = None) -> None:\n error = JobManagerCrashedError(message=message, cause=cause)\n upsert_response_params(\n kind=self.processing_step.cache_kind,\n job_params=self.job_params,\n content=dict(error.as_response()),\n http_status=error.status_code,\n error_code=error.code,\n details=dict(error.as_response_with_cause()),\n job_runner_version=self.job_runner.get_job_runner_version(),\n dataset_git_revision=self.get_dataset_git_revision(),\n )\n logging.debug(\n f\"response for dataset={self.job_params['dataset']} job_info={self.job_info} had an error (crashed), cache\"\n \" updated\"\n )\n\n def set_exceeded_maximum_duration(self, message: str, cause: Optional[BaseException] = None) -> None:\n error = JobManagerExceededMaximumDurationError(message=message, cause=cause)\n upsert_response_params(\n kind=self.processing_step.cache_kind,\n job_params=self.job_params,\n content=dict(error.as_response()),\n http_status=error.status_code,\n error_code=error.code,\n details=dict(error.as_response_with_cause()),\n job_runner_version=self.job_runner.get_job_runner_version(),\n dataset_git_revision=self.get_dataset_git_revision(),\n )\n logging.debug(\n f\"response for dataset={self.job_params['dataset']} job_info={self.job_info} had an error (exceeded\"\n \" maximum duration), cache updated\"\n )\n", "path": "services/worker/src/worker/job_manager.py"}], "after_files": [{"content": "# SPDX-License-Identifier: Apache-2.0\n# Copyright 2022 The HuggingFace Authors.\n\nimport logging\nfrom http import HTTPStatus\nfrom typing import Literal, Optional\n\nfrom libcommon.config import CommonConfig\nfrom libcommon.dataset import DatasetNotFoundError, get_dataset_git_revision\nfrom libcommon.exceptions import CustomError\nfrom libcommon.processing_graph import ProcessingGraph, ProcessingStep\nfrom libcommon.simple_cache import (\n DoesNotExist,\n get_response_without_content_params,\n upsert_response_params,\n)\nfrom libcommon.state import DatasetState\nfrom libcommon.utils import JobInfo, JobParams, Priority, Status, orjson_dumps\n\nfrom worker.common_exceptions import (\n JobManagerCrashedError,\n JobManagerExceededMaximumDurationError,\n NoGitRevisionError,\n ResponseAlreadyComputedError,\n TooBigContentError,\n UnexpectedError,\n)\nfrom worker.config import AppConfig, WorkerConfig\nfrom worker.job_runner import JobRunner\n\n# List of error codes that should trigger a retry.\nERROR_CODES_TO_RETRY: list[str] = [\"ClientConnectionError\"]\n\n\nclass JobManager:\n \"\"\"\n A job manager is a class that handles a job runner compute, for a specific processing step.\n\n Args:\n job_info (:obj:`JobInfo`):\n The job to process. It contains the job_id, the job type, the dataset, the config, the split\n and the priority level.\n common_config (:obj:`CommonConfig`):\n The common config.\n processing_step (:obj:`ProcessingStep`):\n The processing step to process.\n \"\"\"\n\n job_id: str\n job_params: JobParams\n priority: Priority\n worker_config: WorkerConfig\n common_config: CommonConfig\n processing_step: ProcessingStep\n processing_graph: ProcessingGraph\n _dataset_git_revision: Optional[str] = None\n job_runner: JobRunner\n\n def __init__(\n self,\n job_info: JobInfo,\n app_config: AppConfig,\n job_runner: JobRunner,\n processing_graph: ProcessingGraph,\n ) -> None:\n self.job_info = job_info\n self.job_type = job_info[\"type\"]\n self.job_id = job_info[\"job_id\"]\n self.priority = job_info[\"priority\"]\n self.job_params = job_info[\"params\"]\n self.common_config = app_config.common\n self.worker_config = app_config.worker\n self.job_runner = job_runner\n self.processing_graph = processing_graph\n self.processing_step = self.job_runner.processing_step\n self.setup()\n\n def setup(self) -> None:\n job_type = self.job_runner.get_job_type()\n if self.processing_step.job_type != job_type:\n raise ValueError(\n f\"The processing step's job type is {self.processing_step.job_type}, but the job manager only\"\n f\" processes {job_type}\"\n )\n if self.job_type != job_type:\n raise ValueError(\n f\"The submitted job type is {self.job_type}, but the job manager only processes {job_type}\"\n )\n\n def __str__(self) -> str:\n return f\"JobManager(job_id={self.job_id} dataset={self.job_params['dataset']} job_info={self.job_info}\"\n\n def log(self, level: int, msg: str) -> None:\n logging.log(level=level, msg=f\"[{self.processing_step.job_type}] {msg}\")\n\n def debug(self, msg: str) -> None:\n self.log(level=logging.DEBUG, msg=msg)\n\n def info(self, msg: str) -> None:\n self.log(level=logging.INFO, msg=msg)\n\n def warning(self, msg: str) -> None:\n self.log(level=logging.WARNING, msg=msg)\n\n def exception(self, msg: str) -> None:\n self.log(level=logging.ERROR, msg=msg)\n\n def critical(self, msg: str) -> None:\n self.log(level=logging.CRITICAL, msg=msg)\n\n def run(self) -> Literal[Status.SUCCESS, Status.ERROR]:\n try:\n self.info(f\"compute {self}\")\n result: Literal[Status.SUCCESS, Status.ERROR] = Status.SUCCESS if self.process() else Status.ERROR\n except Exception:\n self.exception(f\"error while computing {self}\")\n result = Status.ERROR\n revision = self.get_dataset_git_revision(allow_raise=False)\n if revision is not None:\n self.backfill(revision=revision)\n return result\n\n def get_dataset_git_revision(self, allow_raise: bool = True) -> Optional[str]:\n \"\"\"Get the git revision of the dataset repository.\"\"\"\n if self._dataset_git_revision is None:\n try:\n self._dataset_git_revision = get_dataset_git_revision(\n dataset=self.job_params[\"dataset\"],\n hf_endpoint=self.common_config.hf_endpoint,\n hf_token=self.common_config.hf_token,\n )\n except Exception as e:\n if allow_raise:\n raise e\n else:\n return None\n return self._dataset_git_revision\n\n def raise_if_parallel_response_exists(self, parallel_cache_kind: str, parallel_job_version: int) -> None:\n try:\n existing_response = get_response_without_content_params(\n kind=parallel_cache_kind,\n job_params=self.job_params,\n )\n\n dataset_git_revision = self.get_dataset_git_revision()\n if (\n existing_response[\"http_status\"] == HTTPStatus.OK\n and existing_response[\"job_runner_version\"] == parallel_job_version\n and existing_response[\"progress\"] == 1.0 # completed response\n and dataset_git_revision is not None\n and existing_response[\"dataset_git_revision\"] == dataset_git_revision\n ):\n raise ResponseAlreadyComputedError(\n f\"Response has already been computed and stored in cache kind: {parallel_cache_kind}. Compute will\"\n \" be skipped.\"\n )\n except DoesNotExist:\n logging.debug(f\"no cache found for {parallel_cache_kind}.\")\n\n def process(\n self,\n ) -> bool:\n dataset_git_revision = None\n try:\n dataset_git_revision = self.get_dataset_git_revision()\n if dataset_git_revision is None:\n self.debug(f\"the dataset={self.job_params['dataset']} has no git revision, don't update the cache\")\n raise NoGitRevisionError(f\"Could not get git revision for dataset {self.job_params['dataset']}\")\n try:\n self.job_runner.pre_compute()\n parallel_job_runner = self.job_runner.get_parallel_job_runner()\n if parallel_job_runner:\n self.raise_if_parallel_response_exists(\n parallel_cache_kind=parallel_job_runner[\"job_type\"],\n parallel_job_version=parallel_job_runner[\"job_runner_version\"],\n )\n\n job_result = self.job_runner.compute()\n content = job_result.content\n\n # Validate content size\n if len(orjson_dumps(content)) > self.worker_config.content_max_bytes:\n raise TooBigContentError(\n \"The computed response content exceeds the supported size in bytes\"\n f\" ({self.worker_config.content_max_bytes}).\"\n )\n finally:\n # ensure the post_compute hook is called even if the compute raises an exception\n self.job_runner.post_compute()\n upsert_response_params(\n kind=self.processing_step.cache_kind,\n job_params=self.job_params,\n content=content,\n http_status=HTTPStatus.OK,\n job_runner_version=self.job_runner.get_job_runner_version(),\n dataset_git_revision=dataset_git_revision,\n progress=job_result.progress,\n )\n self.debug(f\"dataset={self.job_params['dataset']} job_info={self.job_info} is valid, cache updated\")\n return True\n except DatasetNotFoundError:\n # To avoid filling the cache, we don't save this error. Otherwise, DoS is possible.\n self.debug(f\"the dataset={self.job_params['dataset']} could not be found, don't update the cache\")\n return False\n except Exception as err:\n e = err if isinstance(err, CustomError) else UnexpectedError(str(err), err)\n upsert_response_params(\n kind=self.processing_step.cache_kind,\n job_params=self.job_params,\n content=dict(e.as_response()),\n http_status=e.status_code,\n error_code=e.code,\n details=dict(e.as_response_with_cause()),\n job_runner_version=self.job_runner.get_job_runner_version(),\n dataset_git_revision=dataset_git_revision,\n )\n self.debug(f\"response for job_info={self.job_info} had an error, cache updated\")\n return False\n\n def backfill(self, revision: str) -> None:\n \"\"\"Evaluate the state of the dataset and backfill the cache if necessary.\"\"\"\n DatasetState(\n dataset=self.job_params[\"dataset\"],\n processing_graph=self.processing_graph,\n revision=revision,\n error_codes_to_retry=ERROR_CODES_TO_RETRY,\n priority=self.priority,\n ).backfill()\n\n def set_crashed(self, message: str, cause: Optional[BaseException] = None) -> None:\n error = JobManagerCrashedError(message=message, cause=cause)\n upsert_response_params(\n kind=self.processing_step.cache_kind,\n job_params=self.job_params,\n content=dict(error.as_response()),\n http_status=error.status_code,\n error_code=error.code,\n details=dict(error.as_response_with_cause()),\n job_runner_version=self.job_runner.get_job_runner_version(),\n dataset_git_revision=self.get_dataset_git_revision(allow_raise=False),\n )\n logging.debug(\n f\"response for dataset={self.job_params['dataset']} job_info={self.job_info} had an error (crashed), cache\"\n \" updated\"\n )\n\n def set_exceeded_maximum_duration(self, message: str, cause: Optional[BaseException] = None) -> None:\n error = JobManagerExceededMaximumDurationError(message=message, cause=cause)\n upsert_response_params(\n kind=self.processing_step.cache_kind,\n job_params=self.job_params,\n content=dict(error.as_response()),\n http_status=error.status_code,\n error_code=error.code,\n details=dict(error.as_response_with_cause()),\n job_runner_version=self.job_runner.get_job_runner_version(),\n dataset_git_revision=self.get_dataset_git_revision(allow_raise=False),\n )\n logging.debug(\n f\"response for dataset={self.job_params['dataset']} job_info={self.job_info} had an error (exceeded\"\n \" maximum duration), cache updated\"\n )\n", "path": "services/worker/src/worker/job_manager.py"}]}
| 3,839 | 681 |
gh_patches_debug_5549
|
rasdani/github-patches
|
git_diff
|
webkom__lego-2085
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Emails should contain a link to edit email preferences
Should be to just add a link to the base email template, as well as add a link to mails sent by restricted.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `lego/apps/restricted/message_processor.py`
Content:
```
1 from copy import deepcopy
2 from email.message import Message
3 from email.mime.text import MIMEText
4
5 from django.conf import settings
6 from django.core.mail import get_connection
7
8 from channels.db import database_sync_to_async
9 from structlog import get_logger
10
11 from lego.apps.action_handlers.registry import get_handler
12 from lego.apps.restricted.models import RestrictedMail
13
14 from .message import EmailMessage
15 from .utils import get_mail_token
16
17 log = get_logger()
18
19
20 class MessageProcessor:
21 """
22 Pipeline for message processing after initial parsing is done by the SMTP server.
23 """
24
25 def __init__(self, sender, message, message_data):
26 self.sender = sender
27 self.message = message
28 self.message_data = message_data
29 self.action_handler = get_handler(RestrictedMail)
30
31 @database_sync_to_async
32 def process_message(self):
33 token = self.get_token(self.message)
34 if not token:
35 log.critical("restricted_mail_no_token_found", sender=self.sender)
36 # Notify about failure
37 self.action_handler.run(
38 None, "failure", sender=self.sender, reason="TOKEN_NOT_FOUND"
39 )
40 return None
41
42 restricted_message = self.lookup_instance(self.sender, token)
43 if restricted_message is None:
44 log.critical("restricted_mail_token_not_found")
45 # Notify about failure
46 self.action_handler.run(
47 None, "failure", sender=self.sender, reason="TOKEN_INVALID"
48 )
49 return None
50
51 recipients = restricted_message.lookup_recipients()
52 sender = self.get_sender(restricted_message)
53
54 message = self.rewrite_message(self.message, sender)
55
56 if sender == settings.RESTRICTED_FROM:
57 # Add a footer with a note about the from address rewrite.
58 self.decorate(message, restricted_message.hide_sender, self.sender)
59
60 self.send(recipients, sender, message)
61 restricted_message.mark_used()
62
63 # Send a success message to the creator
64 self.action_handler.run(restricted_message, "sent")
65
66 def get_sender(self, restricted_mail):
67 """
68 Get the sender address. We use the global settings and the restricted_mail object to find
69 the sender.
70 """
71
72 if (
73 settings.RESTRICTED_ALLOW_ORIGINAL_SENDER
74 and not restricted_mail.hide_sender
75 ):
76 return self.sender
77
78 return settings.RESTRICTED_FROM
79
80 @staticmethod
81 def get_token(message):
82 """
83 Lookup the attached token, this is used to lookup the existing restricted mail in our
84 database.
85 """
86 return get_mail_token(message)
87
88 @staticmethod
89 def lookup_instance(sender, token):
90 """
91 Get the restricted_mail instance based on a token found in the received message.
92 """
93 return RestrictedMail.get_restricted_mail(sender, token)
94
95 @staticmethod
96 def rewrite_message(message, sender):
97 """
98 This function replaces the headers in the message. We preserve the headers in the
99 preserve_headers list, all other headers is removed. We do this to get a higher chance to
100 pass thinks like SPF and DKIM checks. These headers is added automatically by our outgoing
101 mail handler if the sender address is valid and managed by us.
102 """
103 preserve_headers = ["Subject", "Content-Type", "MIME-Version"]
104 headers = {}
105
106 for header in preserve_headers:
107 header_value = message.get(header)
108 if header_value:
109 headers[header] = header_value
110
111 message._headers = []
112
113 for header, value in headers.items():
114 message[header] = value
115
116 message["Sender"] = sender
117 message["From"] = sender
118
119 return message
120
121 @staticmethod
122 def send(recipients, sender, message):
123 """
124 Create a new connection and bulk send mails
125 """
126 connection = get_connection(fail_silently=False)
127 messages = [
128 EmailMessage(recipient, sender, deepcopy(message))
129 for recipient in recipients
130 ]
131 log.info(
132 "restricted_mail_process_messages", sender=sender, recipients=len(messages)
133 )
134 return connection.send_messages(messages)
135
136 @staticmethod
137 def decorate(message, hide_sender, sender):
138 """
139 Notify the recipient about the sender rewrite.
140 """
141
142 footer = ["------------", "Du kan ikke svare direkte på denne eposten."]
143
144 if not hide_sender:
145 footer.append(
146 f"Opprinnelig avsender er {sender}, send svar til denne adressen."
147 )
148 footer.append(
149 "Denne eposten har uorginal avsender for å redusere risikoen for at "
150 "meldingen oppfattes som spam."
151 )
152 else:
153 footer.append("Opprinnelig avsender har valgt å skjule sin adresse.")
154
155 footer = "\n".join(footer)
156 charset = message.get_content_charset() or "us-ascii"
157 content_type = message.get_content_type()
158
159 wrap = True
160 if not message.is_multipart() and content_type == "text/plain":
161 format_param = message.get_param("format")
162 delsp = message.get_param("delsp")
163 transfer_encoding = message.get("content-transfer-encoding")
164
165 try:
166 old_payload = message.get_payload(decode=True).decode(charset)
167 del message["content-transfer-encoding"]
168
169 footer_separator = "\n"
170 payload = old_payload + footer_separator + footer
171
172 for cset in (charset, "utf-8"):
173 try:
174 message.set_payload(payload.encode(cset), cset)
175 except UnicodeError:
176 pass
177 else:
178 if format_param:
179 message.set_param("format", format_param)
180 if delsp:
181 message.set_param("delsp", delsp)
182 wrap = False
183 break
184 except (LookupError, UnicodeError):
185 if transfer_encoding:
186 del message["content-transfer-encoding"]
187 message["Content-Transfer-Encoding"] = transfer_encoding
188
189 elif message.get_content_type() == "multipart/mixed":
190 payload = message.get_payload()
191 if not isinstance(payload, list):
192 payload = [payload]
193
194 mime_footer = MIMEText(footer.encode("utf-8"), "plain", "utf-8")
195 mime_footer["Content-Disposition"] = "inline"
196 payload.append(mime_footer)
197 message.set_payload(payload)
198 wrap = False
199
200 if not wrap:
201 return
202
203 inner = Message()
204 for h, v in message.items():
205 if h.lower().startswith("content-"):
206 inner[h] = v
207 inner.set_payload(message.get_payload())
208 inner.set_unixfrom(message.get_unixfrom())
209 inner.preamble = message.preamble
210 inner.epilogue = message.epilogue
211 inner.set_default_type(message.get_default_type())
212 if hasattr(message, "__version__"):
213 inner.__version__ = message.__version__
214 payload = [inner]
215 mime_footer = MIMEText(footer.encode("utf-8"), "plain", "utf-8")
216 mime_footer["Content-Disposition"] = "inline"
217 payload.append(mime_footer)
218 message.set_payload(payload)
219 del message["content-type"]
220 del message["content-transfer-encoding"]
221 del message["content-disposition"]
222 message["Content-Type"] = "multipart/mixed"
223
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/lego/apps/restricted/message_processor.py b/lego/apps/restricted/message_processor.py
--- a/lego/apps/restricted/message_processor.py
+++ b/lego/apps/restricted/message_processor.py
@@ -149,6 +149,9 @@
)
else:
footer.append("Opprinnelig avsender har valgt å skjule sin adresse.")
+ footer.append(
+ f"Meld deg av her: {settings.FRONTEND_URL}/users/me/settings/notifications"
+ )
footer = "\n".join(footer)
charset = message.get_content_charset() or "us-ascii"
|
{"golden_diff": "diff --git a/lego/apps/restricted/message_processor.py b/lego/apps/restricted/message_processor.py\n--- a/lego/apps/restricted/message_processor.py\n+++ b/lego/apps/restricted/message_processor.py\n@@ -149,6 +149,9 @@\n )\n else:\n footer.append(\"Opprinnelig avsender har valgt \u00e5 skjule sin adresse.\")\n+ footer.append(\n+ f\"Meld deg av her: {settings.FRONTEND_URL}/users/me/settings/notifications\"\n+ )\n \n footer = \"\\n\".join(footer)\n charset = message.get_content_charset() or \"us-ascii\"\n", "issue": "Emails should contain a link to edit email preferences\nShould be to just add a link to the base email template, as well as add a link to mails sent by restricted.\n", "before_files": [{"content": "from copy import deepcopy\nfrom email.message import Message\nfrom email.mime.text import MIMEText\n\nfrom django.conf import settings\nfrom django.core.mail import get_connection\n\nfrom channels.db import database_sync_to_async\nfrom structlog import get_logger\n\nfrom lego.apps.action_handlers.registry import get_handler\nfrom lego.apps.restricted.models import RestrictedMail\n\nfrom .message import EmailMessage\nfrom .utils import get_mail_token\n\nlog = get_logger()\n\n\nclass MessageProcessor:\n \"\"\"\n Pipeline for message processing after initial parsing is done by the SMTP server.\n \"\"\"\n\n def __init__(self, sender, message, message_data):\n self.sender = sender\n self.message = message\n self.message_data = message_data\n self.action_handler = get_handler(RestrictedMail)\n\n @database_sync_to_async\n def process_message(self):\n token = self.get_token(self.message)\n if not token:\n log.critical(\"restricted_mail_no_token_found\", sender=self.sender)\n # Notify about failure\n self.action_handler.run(\n None, \"failure\", sender=self.sender, reason=\"TOKEN_NOT_FOUND\"\n )\n return None\n\n restricted_message = self.lookup_instance(self.sender, token)\n if restricted_message is None:\n log.critical(\"restricted_mail_token_not_found\")\n # Notify about failure\n self.action_handler.run(\n None, \"failure\", sender=self.sender, reason=\"TOKEN_INVALID\"\n )\n return None\n\n recipients = restricted_message.lookup_recipients()\n sender = self.get_sender(restricted_message)\n\n message = self.rewrite_message(self.message, sender)\n\n if sender == settings.RESTRICTED_FROM:\n # Add a footer with a note about the from address rewrite.\n self.decorate(message, restricted_message.hide_sender, self.sender)\n\n self.send(recipients, sender, message)\n restricted_message.mark_used()\n\n # Send a success message to the creator\n self.action_handler.run(restricted_message, \"sent\")\n\n def get_sender(self, restricted_mail):\n \"\"\"\n Get the sender address. We use the global settings and the restricted_mail object to find\n the sender.\n \"\"\"\n\n if (\n settings.RESTRICTED_ALLOW_ORIGINAL_SENDER\n and not restricted_mail.hide_sender\n ):\n return self.sender\n\n return settings.RESTRICTED_FROM\n\n @staticmethod\n def get_token(message):\n \"\"\"\n Lookup the attached token, this is used to lookup the existing restricted mail in our\n database.\n \"\"\"\n return get_mail_token(message)\n\n @staticmethod\n def lookup_instance(sender, token):\n \"\"\"\n Get the restricted_mail instance based on a token found in the received message.\n \"\"\"\n return RestrictedMail.get_restricted_mail(sender, token)\n\n @staticmethod\n def rewrite_message(message, sender):\n \"\"\"\n This function replaces the headers in the message. We preserve the headers in the\n preserve_headers list, all other headers is removed. We do this to get a higher chance to\n pass thinks like SPF and DKIM checks. These headers is added automatically by our outgoing\n mail handler if the sender address is valid and managed by us.\n \"\"\"\n preserve_headers = [\"Subject\", \"Content-Type\", \"MIME-Version\"]\n headers = {}\n\n for header in preserve_headers:\n header_value = message.get(header)\n if header_value:\n headers[header] = header_value\n\n message._headers = []\n\n for header, value in headers.items():\n message[header] = value\n\n message[\"Sender\"] = sender\n message[\"From\"] = sender\n\n return message\n\n @staticmethod\n def send(recipients, sender, message):\n \"\"\"\n Create a new connection and bulk send mails\n \"\"\"\n connection = get_connection(fail_silently=False)\n messages = [\n EmailMessage(recipient, sender, deepcopy(message))\n for recipient in recipients\n ]\n log.info(\n \"restricted_mail_process_messages\", sender=sender, recipients=len(messages)\n )\n return connection.send_messages(messages)\n\n @staticmethod\n def decorate(message, hide_sender, sender):\n \"\"\"\n Notify the recipient about the sender rewrite.\n \"\"\"\n\n footer = [\"------------\", \"Du kan ikke svare direkte p\u00e5 denne eposten.\"]\n\n if not hide_sender:\n footer.append(\n f\"Opprinnelig avsender er {sender}, send svar til denne adressen.\"\n )\n footer.append(\n \"Denne eposten har uorginal avsender for \u00e5 redusere risikoen for at \"\n \"meldingen oppfattes som spam.\"\n )\n else:\n footer.append(\"Opprinnelig avsender har valgt \u00e5 skjule sin adresse.\")\n\n footer = \"\\n\".join(footer)\n charset = message.get_content_charset() or \"us-ascii\"\n content_type = message.get_content_type()\n\n wrap = True\n if not message.is_multipart() and content_type == \"text/plain\":\n format_param = message.get_param(\"format\")\n delsp = message.get_param(\"delsp\")\n transfer_encoding = message.get(\"content-transfer-encoding\")\n\n try:\n old_payload = message.get_payload(decode=True).decode(charset)\n del message[\"content-transfer-encoding\"]\n\n footer_separator = \"\\n\"\n payload = old_payload + footer_separator + footer\n\n for cset in (charset, \"utf-8\"):\n try:\n message.set_payload(payload.encode(cset), cset)\n except UnicodeError:\n pass\n else:\n if format_param:\n message.set_param(\"format\", format_param)\n if delsp:\n message.set_param(\"delsp\", delsp)\n wrap = False\n break\n except (LookupError, UnicodeError):\n if transfer_encoding:\n del message[\"content-transfer-encoding\"]\n message[\"Content-Transfer-Encoding\"] = transfer_encoding\n\n elif message.get_content_type() == \"multipart/mixed\":\n payload = message.get_payload()\n if not isinstance(payload, list):\n payload = [payload]\n\n mime_footer = MIMEText(footer.encode(\"utf-8\"), \"plain\", \"utf-8\")\n mime_footer[\"Content-Disposition\"] = \"inline\"\n payload.append(mime_footer)\n message.set_payload(payload)\n wrap = False\n\n if not wrap:\n return\n\n inner = Message()\n for h, v in message.items():\n if h.lower().startswith(\"content-\"):\n inner[h] = v\n inner.set_payload(message.get_payload())\n inner.set_unixfrom(message.get_unixfrom())\n inner.preamble = message.preamble\n inner.epilogue = message.epilogue\n inner.set_default_type(message.get_default_type())\n if hasattr(message, \"__version__\"):\n inner.__version__ = message.__version__\n payload = [inner]\n mime_footer = MIMEText(footer.encode(\"utf-8\"), \"plain\", \"utf-8\")\n mime_footer[\"Content-Disposition\"] = \"inline\"\n payload.append(mime_footer)\n message.set_payload(payload)\n del message[\"content-type\"]\n del message[\"content-transfer-encoding\"]\n del message[\"content-disposition\"]\n message[\"Content-Type\"] = \"multipart/mixed\"\n", "path": "lego/apps/restricted/message_processor.py"}], "after_files": [{"content": "from copy import deepcopy\nfrom email.message import Message\nfrom email.mime.text import MIMEText\n\nfrom django.conf import settings\nfrom django.core.mail import get_connection\n\nfrom structlog import get_logger\n\nfrom lego.apps.action_handlers.registry import get_handler\nfrom lego.apps.restricted.models import RestrictedMail\n\nfrom .message import EmailMessage\nfrom .utils import get_mail_token\n\nlog = get_logger()\n\n\nclass MessageProcessor:\n \"\"\"\n Pipeline for message processing after initial parsing is done by the LMTP server.\n \"\"\"\n\n def __init__(self, sender, message, message_data):\n self.sender = sender\n self.message = message\n self.message_data = message_data\n self.action_handler = get_handler(RestrictedMail)\n\n def process_message(self):\n token = self.get_token(self.message)\n if not token:\n log.critical(\"restricted_mail_no_token_found\", sender=self.sender)\n # Notify about failure\n self.action_handler.run(\n None, \"failure\", sender=self.sender, reason=\"TOKEN_NOT_FOUND\"\n )\n return None\n\n restricted_message = self.lookup_instance(self.sender, token)\n if restricted_message is None:\n log.critical(\"restricted_mail_token_not_found\")\n # Notify about failure\n self.action_handler.run(\n None, \"failure\", sender=self.sender, reason=\"TOKEN_INVALID\"\n )\n return None\n\n recipients = restricted_message.lookup_recipients()\n sender = self.get_sender(restricted_message)\n\n message = self.rewrite_message(self.message, sender)\n\n if sender == settings.RESTRICTED_FROM:\n # Add a footer with a note about the from address rewrite.\n self.decorate(message, restricted_message.hide_sender, self.sender)\n\n self.send(recipients, sender, message)\n restricted_message.mark_used()\n\n # Send a success message to the creator\n self.action_handler.run(restricted_message, \"sent\")\n\n def get_sender(self, restricted_mail):\n \"\"\"\n Get the sender address. We use the global settings and the restricted_mail object to find\n the sender.\n \"\"\"\n\n if (\n settings.RESTRICTED_ALLOW_ORIGINAL_SENDER\n and not restricted_mail.hide_sender\n ):\n return self.sender\n\n return settings.RESTRICTED_FROM\n\n @staticmethod\n def get_token(message):\n \"\"\"\n Lookup the attached token, this is used to lookup the existing restricted mail in our\n database.\n \"\"\"\n return get_mail_token(message)\n\n @staticmethod\n def lookup_instance(sender, token):\n \"\"\"\n Get the restricted_mail instance based on a token found in the received message.\n \"\"\"\n return RestrictedMail.get_restricted_mail(sender, token)\n\n @staticmethod\n def rewrite_message(message, sender):\n \"\"\"\n This function replaces the headers in the message. We preserve the headers in the\n preserve_headers list, all other headers is removed. We do this to get a higher chance to\n pass thinks like SPF and DKIM checks. These headers is added automatically by our outgoing\n mail handler if the sender address is valid and managed by us.\n \"\"\"\n preserve_headers = [\"Subject\", \"Content-Type\", \"MIME-Version\"]\n headers = {}\n\n for header in preserve_headers:\n header_value = message.get(header)\n if header_value:\n headers[header] = header_value\n\n message._headers = []\n\n for header, value in headers.items():\n message[header] = value\n\n message[\"Sender\"] = sender\n message[\"From\"] = sender\n\n return message\n\n @staticmethod\n def send(recipients, sender, message):\n \"\"\"\n Create a new connection and bulk send mails\n \"\"\"\n connection = get_connection(fail_silently=False)\n messages = [\n EmailMessage(recipient, sender, deepcopy(message))\n for recipient in recipients\n ]\n log.info(\n \"restricted_mail_process_messages\", sender=sender, recipients=len(messages)\n )\n return connection.send_messages(messages)\n\n @staticmethod\n def decorate(message, hide_sender, sender):\n \"\"\"\n Notify the recipient about the sender rewrite.\n \"\"\"\n\n footer = [\"------------\", \"Du kan ikke svare direkte p\u00e5 denne eposten.\"]\n\n if not hide_sender:\n footer.append(\n f\"Opprinnelig avsender er {sender}, send svar til denne adressen.\"\n )\n footer.append(\n \"Denne eposten har uorginal avsender for \u00e5 redusere risikoen for at \"\n \"meldingen oppfattes som spam.\"\n )\n else:\n footer.append(\"Opprinnelig avsender har valgt \u00e5 skjule sin adresse.\")\n footer.append(\n f\"Meld deg av her: {settings.FRONTEND_URL}/users/me/settings/notifications\"\n )\n\n footer = \"\\n\".join(footer)\n charset = message.get_content_charset() or \"us-ascii\"\n content_type = message.get_content_type()\n\n wrap = True\n if not message.is_multipart() and content_type == \"text/plain\":\n format_param = message.get_param(\"format\")\n delsp = message.get_param(\"delsp\")\n transfer_encoding = message.get(\"content-transfer-encoding\")\n\n try:\n old_payload = message.get_payload(decode=True).decode(charset)\n del message[\"content-transfer-encoding\"]\n\n footer_separator = \"\\n\"\n payload = old_payload + footer_separator + footer\n\n for cset in (charset, \"utf-8\"):\n try:\n message.set_payload(payload.encode(cset), cset)\n except UnicodeError:\n pass\n else:\n if format_param:\n message.set_param(\"format\", format_param)\n if delsp:\n message.set_param(\"delsp\", delsp)\n wrap = False\n break\n except (LookupError, UnicodeError):\n if transfer_encoding:\n del message[\"content-transfer-encoding\"]\n message[\"Content-Transfer-Encoding\"] = transfer_encoding\n\n elif message.get_content_type() == \"multipart/mixed\":\n payload = message.get_payload()\n if not isinstance(payload, list):\n payload = [payload]\n\n mime_footer = MIMEText(footer.encode(\"utf-8\"), \"plain\", \"utf-8\")\n mime_footer[\"Content-Disposition\"] = \"inline\"\n payload.append(mime_footer)\n message.set_payload(payload)\n wrap = False\n\n if not wrap:\n return\n\n inner = Message()\n for h, v in message.items():\n if h.lower().startswith(\"content-\"):\n inner[h] = v\n inner.set_payload(message.get_payload())\n inner.set_unixfrom(message.get_unixfrom())\n inner.preamble = message.preamble\n inner.epilogue = message.epilogue\n inner.set_default_type(message.get_default_type())\n if hasattr(message, \"__version__\"):\n inner.__version__ = message.__version__\n payload = [inner]\n mime_footer = MIMEText(footer.encode(\"utf-8\"), \"plain\", \"utf-8\")\n mime_footer[\"Content-Disposition\"] = \"inline\"\n payload.append(mime_footer)\n message.set_payload(payload)\n del message[\"content-type\"]\n del message[\"content-transfer-encoding\"]\n del message[\"content-disposition\"]\n message[\"Content-Type\"] = \"multipart/mixed\"\n", "path": "lego/apps/restricted/message_processor.py"}]}
| 2,403 | 142 |
gh_patches_debug_23134
|
rasdani/github-patches
|
git_diff
|
netbox-community__netbox-13753
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
New dashboard widgets have insufficient header contrast in dark mode
### NetBox version
v3.5.0-dev
### Python version
3.8
### Steps to Reproduce
1. Navigate to the dashboard screen with the new widgets.
### Expected Behavior
The text and icon colors within the card header should maintain at least the same contrast ratio between light and dark modes. You can see this done within the search and log in buttons in the screenshots below.
### Observed Behavior
Currently, the colors used for the text and icons for card headers in dark mode do not provide much contrast against the background, making them very difficult to see. Screenshots below for reference.


--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `netbox/extras/dashboard/widgets.py`
Content:
```
1 import uuid
2 from functools import cached_property
3 from hashlib import sha256
4 from urllib.parse import urlencode
5
6 import feedparser
7 import requests
8 from django import forms
9 from django.conf import settings
10 from django.contrib.contenttypes.models import ContentType
11 from django.core.cache import cache
12 from django.db.models import Q
13 from django.template.loader import render_to_string
14 from django.urls import NoReverseMatch, resolve, reverse
15 from django.utils.translation import gettext as _
16
17 from extras.choices import BookmarkOrderingChoices
18 from extras.utils import FeatureQuery
19 from utilities.forms import BootstrapMixin
20 from utilities.permissions import get_permission_for_model
21 from utilities.templatetags.builtins.filters import render_markdown
22 from utilities.utils import content_type_identifier, content_type_name, dict_to_querydict, get_viewname
23 from .utils import register_widget
24
25 __all__ = (
26 'BookmarksWidget',
27 'DashboardWidget',
28 'NoteWidget',
29 'ObjectCountsWidget',
30 'ObjectListWidget',
31 'RSSFeedWidget',
32 'WidgetConfigForm',
33 )
34
35
36 def get_content_type_labels():
37 return [
38 (content_type_identifier(ct), content_type_name(ct))
39 for ct in ContentType.objects.filter(
40 FeatureQuery('export_templates').get_query() | Q(app_label='extras', model='objectchange') |
41 Q(app_label='extras', model='configcontext')
42 ).order_by('app_label', 'model')
43 ]
44
45
46 def get_models_from_content_types(content_types):
47 """
48 Return a list of models corresponding to the given content types, identified by natural key.
49 """
50 models = []
51 for content_type_id in content_types:
52 app_label, model_name = content_type_id.split('.')
53 content_type = ContentType.objects.get_by_natural_key(app_label, model_name)
54 models.append(content_type.model_class())
55 return models
56
57
58 class WidgetConfigForm(BootstrapMixin, forms.Form):
59 pass
60
61
62 class DashboardWidget:
63 """
64 Base class for custom dashboard widgets.
65
66 Attributes:
67 description: A brief, user-friendly description of the widget's function
68 default_title: The string to show for the widget's title when none has been specified.
69 default_config: Default configuration parameters, as a dictionary mapping
70 width: The widget's default width (1 to 12)
71 height: The widget's default height; the number of rows it consumes
72 """
73 description = None
74 default_title = None
75 default_config = {}
76 width = 4
77 height = 3
78
79 class ConfigForm(WidgetConfigForm):
80 """
81 The widget's configuration form.
82 """
83 pass
84
85 def __init__(self, id=None, title=None, color=None, config=None, width=None, height=None, x=None, y=None):
86 self.id = id or str(uuid.uuid4())
87 self.config = config or self.default_config
88 self.title = title or self.default_title
89 self.color = color
90 if width:
91 self.width = width
92 if height:
93 self.height = height
94 self.x, self.y = x, y
95
96 def __str__(self):
97 return self.title or self.__class__.__name__
98
99 def set_layout(self, grid_item):
100 self.width = grid_item.get('w', 1)
101 self.height = grid_item.get('h', 1)
102 self.x = grid_item.get('x')
103 self.y = grid_item.get('y')
104
105 def render(self, request):
106 """
107 This method is called to render the widget's content.
108
109 Params:
110 request: The current request
111 """
112 raise NotImplementedError(f"{self.__class__} must define a render() method.")
113
114 @property
115 def name(self):
116 return f'{self.__class__.__module__.split(".")[0]}.{self.__class__.__name__}'
117
118 @property
119 def form_data(self):
120 return {
121 'title': self.title,
122 'color': self.color,
123 'config': self.config,
124 }
125
126
127 @register_widget
128 class NoteWidget(DashboardWidget):
129 default_title = _('Note')
130 description = _('Display some arbitrary custom content. Markdown is supported.')
131
132 class ConfigForm(WidgetConfigForm):
133 content = forms.CharField(
134 widget=forms.Textarea()
135 )
136
137 def render(self, request):
138 return render_markdown(self.config.get('content'))
139
140
141 @register_widget
142 class ObjectCountsWidget(DashboardWidget):
143 default_title = _('Object Counts')
144 description = _('Display a set of NetBox models and the number of objects created for each type.')
145 template_name = 'extras/dashboard/widgets/objectcounts.html'
146
147 class ConfigForm(WidgetConfigForm):
148 models = forms.MultipleChoiceField(
149 choices=get_content_type_labels
150 )
151 filters = forms.JSONField(
152 required=False,
153 label='Object filters',
154 help_text=_("Filters to apply when counting the number of objects")
155 )
156
157 def clean_filters(self):
158 if data := self.cleaned_data['filters']:
159 try:
160 dict(data)
161 except TypeError:
162 raise forms.ValidationError("Invalid format. Object filters must be passed as a dictionary.")
163 return data
164
165 def render(self, request):
166 counts = []
167 for model in get_models_from_content_types(self.config['models']):
168 permission = get_permission_for_model(model, 'view')
169 if request.user.has_perm(permission):
170 url = reverse(get_viewname(model, 'list'))
171 qs = model.objects.restrict(request.user, 'view')
172 # Apply any specified filters
173 if filters := self.config.get('filters'):
174 params = dict_to_querydict(filters)
175 filterset = getattr(resolve(url).func.view_class, 'filterset', None)
176 qs = filterset(params, qs).qs
177 url = f'{url}?{params.urlencode()}'
178 object_count = qs.count
179 counts.append((model, object_count, url))
180 else:
181 counts.append((model, None, None))
182
183 return render_to_string(self.template_name, {
184 'counts': counts,
185 })
186
187
188 @register_widget
189 class ObjectListWidget(DashboardWidget):
190 default_title = _('Object List')
191 description = _('Display an arbitrary list of objects.')
192 template_name = 'extras/dashboard/widgets/objectlist.html'
193 width = 12
194 height = 4
195
196 class ConfigForm(WidgetConfigForm):
197 model = forms.ChoiceField(
198 choices=get_content_type_labels
199 )
200 page_size = forms.IntegerField(
201 required=False,
202 min_value=1,
203 max_value=100,
204 help_text=_('The default number of objects to display')
205 )
206 url_params = forms.JSONField(
207 required=False,
208 label='URL parameters'
209 )
210
211 def clean_url_params(self):
212 if data := self.cleaned_data['url_params']:
213 try:
214 urlencode(data)
215 except (TypeError, ValueError):
216 raise forms.ValidationError("Invalid format. URL parameters must be passed as a dictionary.")
217 return data
218
219 def render(self, request):
220 app_label, model_name = self.config['model'].split('.')
221 model = ContentType.objects.get_by_natural_key(app_label, model_name).model_class()
222 viewname = get_viewname(model, action='list')
223
224 # Evaluate user's permission. Note that this controls only whether the HTMX element is
225 # embedded on the page: The view itself will also evaluate permissions separately.
226 permission = get_permission_for_model(model, 'view')
227 has_permission = request.user.has_perm(permission)
228
229 try:
230 htmx_url = reverse(viewname)
231 except NoReverseMatch:
232 htmx_url = None
233 parameters = self.config.get('url_params') or {}
234 if page_size := self.config.get('page_size'):
235 parameters['per_page'] = page_size
236
237 if parameters:
238 try:
239 htmx_url = f'{htmx_url}?{urlencode(parameters, doseq=True)}'
240 except ValueError:
241 pass
242 return render_to_string(self.template_name, {
243 'viewname': viewname,
244 'has_permission': has_permission,
245 'htmx_url': htmx_url,
246 })
247
248
249 @register_widget
250 class RSSFeedWidget(DashboardWidget):
251 default_title = _('RSS Feed')
252 default_config = {
253 'max_entries': 10,
254 'cache_timeout': 3600, # seconds
255 }
256 description = _('Embed an RSS feed from an external website.')
257 template_name = 'extras/dashboard/widgets/rssfeed.html'
258 width = 6
259 height = 4
260
261 class ConfigForm(WidgetConfigForm):
262 feed_url = forms.URLField(
263 label=_('Feed URL')
264 )
265 max_entries = forms.IntegerField(
266 min_value=1,
267 max_value=1000,
268 help_text=_('The maximum number of objects to display')
269 )
270 cache_timeout = forms.IntegerField(
271 min_value=600, # 10 minutes
272 max_value=86400, # 24 hours
273 help_text=_('How long to stored the cached content (in seconds)')
274 )
275
276 def render(self, request):
277 return render_to_string(self.template_name, {
278 'url': self.config['feed_url'],
279 **self.get_feed()
280 })
281
282 @cached_property
283 def cache_key(self):
284 url = self.config['feed_url']
285 url_checksum = sha256(url.encode('utf-8')).hexdigest()
286 return f'dashboard_rss_{url_checksum}'
287
288 def get_feed(self):
289 # Fetch RSS content from cache if available
290 if feed_content := cache.get(self.cache_key):
291 return {
292 'feed': feedparser.FeedParserDict(feed_content),
293 }
294
295 # Fetch feed content from remote server
296 try:
297 response = requests.get(
298 url=self.config['feed_url'],
299 headers={'User-Agent': f'NetBox/{settings.VERSION}'},
300 proxies=settings.HTTP_PROXIES,
301 timeout=3
302 )
303 response.raise_for_status()
304 except requests.exceptions.RequestException as e:
305 return {
306 'error': e,
307 }
308
309 # Parse feed content
310 feed = feedparser.parse(response.content)
311 if not feed.bozo:
312 # Cap number of entries
313 max_entries = self.config.get('max_entries')
314 feed['entries'] = feed['entries'][:max_entries]
315 # Cache the feed content
316 cache.set(self.cache_key, dict(feed), self.config.get('cache_timeout'))
317
318 return {
319 'feed': feed,
320 }
321
322
323 @register_widget
324 class BookmarksWidget(DashboardWidget):
325 default_title = _('Bookmarks')
326 default_config = {
327 'order_by': BookmarkOrderingChoices.ORDERING_NEWEST,
328 }
329 description = _('Show your personal bookmarks')
330 template_name = 'extras/dashboard/widgets/bookmarks.html'
331
332 class ConfigForm(WidgetConfigForm):
333 object_types = forms.MultipleChoiceField(
334 # TODO: Restrict the choices by FeatureQuery('bookmarks')
335 choices=get_content_type_labels,
336 required=False
337 )
338 order_by = forms.ChoiceField(
339 choices=BookmarkOrderingChoices
340 )
341 max_items = forms.IntegerField(
342 min_value=1,
343 required=False
344 )
345
346 def render(self, request):
347 from extras.models import Bookmark
348
349 if request.user.is_anonymous:
350 bookmarks = list()
351 else:
352 bookmarks = Bookmark.objects.filter(user=request.user).order_by(self.config['order_by'])
353 if object_types := self.config.get('object_types'):
354 models = get_models_from_content_types(object_types)
355 conent_types = ContentType.objects.get_for_models(*models).values()
356 bookmarks = bookmarks.filter(object_type__in=conent_types)
357 if max_items := self.config.get('max_items'):
358 bookmarks = bookmarks[:max_items]
359
360 return render_to_string(self.template_name, {
361 'bookmarks': bookmarks,
362 })
363
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/netbox/extras/dashboard/widgets.py b/netbox/extras/dashboard/widgets.py
--- a/netbox/extras/dashboard/widgets.py
+++ b/netbox/extras/dashboard/widgets.py
@@ -16,6 +16,7 @@
from extras.choices import BookmarkOrderingChoices
from extras.utils import FeatureQuery
+from utilities.choices import ButtonColorChoices
from utilities.forms import BootstrapMixin
from utilities.permissions import get_permission_for_model
from utilities.templatetags.builtins.filters import render_markdown
@@ -115,6 +116,22 @@
def name(self):
return f'{self.__class__.__module__.split(".")[0]}.{self.__class__.__name__}'
+ @property
+ def fg_color(self):
+ """
+ Return the appropriate foreground (text) color for the widget's color.
+ """
+ if self.color in (
+ ButtonColorChoices.CYAN,
+ ButtonColorChoices.GRAY,
+ ButtonColorChoices.GREY,
+ ButtonColorChoices.TEAL,
+ ButtonColorChoices.WHITE,
+ ButtonColorChoices.YELLOW,
+ ):
+ return ButtonColorChoices.BLACK
+ return ButtonColorChoices.WHITE
+
@property
def form_data(self):
return {
|
{"golden_diff": "diff --git a/netbox/extras/dashboard/widgets.py b/netbox/extras/dashboard/widgets.py\n--- a/netbox/extras/dashboard/widgets.py\n+++ b/netbox/extras/dashboard/widgets.py\n@@ -16,6 +16,7 @@\n \n from extras.choices import BookmarkOrderingChoices\n from extras.utils import FeatureQuery\n+from utilities.choices import ButtonColorChoices\n from utilities.forms import BootstrapMixin\n from utilities.permissions import get_permission_for_model\n from utilities.templatetags.builtins.filters import render_markdown\n@@ -115,6 +116,22 @@\n def name(self):\n return f'{self.__class__.__module__.split(\".\")[0]}.{self.__class__.__name__}'\n \n+ @property\n+ def fg_color(self):\n+ \"\"\"\n+ Return the appropriate foreground (text) color for the widget's color.\n+ \"\"\"\n+ if self.color in (\n+ ButtonColorChoices.CYAN,\n+ ButtonColorChoices.GRAY,\n+ ButtonColorChoices.GREY,\n+ ButtonColorChoices.TEAL,\n+ ButtonColorChoices.WHITE,\n+ ButtonColorChoices.YELLOW,\n+ ):\n+ return ButtonColorChoices.BLACK\n+ return ButtonColorChoices.WHITE\n+\n @property\n def form_data(self):\n return {\n", "issue": "New dashboard widgets have insufficient header contrast in dark mode\n### NetBox version\n\nv3.5.0-dev\n\n### Python version\n\n3.8\n\n### Steps to Reproduce\n\n1. Navigate to the dashboard screen with the new widgets.\n\n### Expected Behavior\n\nThe text and icon colors within the card header should maintain at least the same contrast ratio between light and dark modes. You can see this done within the search and log in buttons in the screenshots below.\n\n### Observed Behavior\n\nCurrently, the colors used for the text and icons for card headers in dark mode do not provide much contrast against the background, making them very difficult to see. Screenshots below for reference.\r\n\r\n\r\n\n", "before_files": [{"content": "import uuid\nfrom functools import cached_property\nfrom hashlib import sha256\nfrom urllib.parse import urlencode\n\nimport feedparser\nimport requests\nfrom django import forms\nfrom django.conf import settings\nfrom django.contrib.contenttypes.models import ContentType\nfrom django.core.cache import cache\nfrom django.db.models import Q\nfrom django.template.loader import render_to_string\nfrom django.urls import NoReverseMatch, resolve, reverse\nfrom django.utils.translation import gettext as _\n\nfrom extras.choices import BookmarkOrderingChoices\nfrom extras.utils import FeatureQuery\nfrom utilities.forms import BootstrapMixin\nfrom utilities.permissions import get_permission_for_model\nfrom utilities.templatetags.builtins.filters import render_markdown\nfrom utilities.utils import content_type_identifier, content_type_name, dict_to_querydict, get_viewname\nfrom .utils import register_widget\n\n__all__ = (\n 'BookmarksWidget',\n 'DashboardWidget',\n 'NoteWidget',\n 'ObjectCountsWidget',\n 'ObjectListWidget',\n 'RSSFeedWidget',\n 'WidgetConfigForm',\n)\n\n\ndef get_content_type_labels():\n return [\n (content_type_identifier(ct), content_type_name(ct))\n for ct in ContentType.objects.filter(\n FeatureQuery('export_templates').get_query() | Q(app_label='extras', model='objectchange') |\n Q(app_label='extras', model='configcontext')\n ).order_by('app_label', 'model')\n ]\n\n\ndef get_models_from_content_types(content_types):\n \"\"\"\n Return a list of models corresponding to the given content types, identified by natural key.\n \"\"\"\n models = []\n for content_type_id in content_types:\n app_label, model_name = content_type_id.split('.')\n content_type = ContentType.objects.get_by_natural_key(app_label, model_name)\n models.append(content_type.model_class())\n return models\n\n\nclass WidgetConfigForm(BootstrapMixin, forms.Form):\n pass\n\n\nclass DashboardWidget:\n \"\"\"\n Base class for custom dashboard widgets.\n\n Attributes:\n description: A brief, user-friendly description of the widget's function\n default_title: The string to show for the widget's title when none has been specified.\n default_config: Default configuration parameters, as a dictionary mapping\n width: The widget's default width (1 to 12)\n height: The widget's default height; the number of rows it consumes\n \"\"\"\n description = None\n default_title = None\n default_config = {}\n width = 4\n height = 3\n\n class ConfigForm(WidgetConfigForm):\n \"\"\"\n The widget's configuration form.\n \"\"\"\n pass\n\n def __init__(self, id=None, title=None, color=None, config=None, width=None, height=None, x=None, y=None):\n self.id = id or str(uuid.uuid4())\n self.config = config or self.default_config\n self.title = title or self.default_title\n self.color = color\n if width:\n self.width = width\n if height:\n self.height = height\n self.x, self.y = x, y\n\n def __str__(self):\n return self.title or self.__class__.__name__\n\n def set_layout(self, grid_item):\n self.width = grid_item.get('w', 1)\n self.height = grid_item.get('h', 1)\n self.x = grid_item.get('x')\n self.y = grid_item.get('y')\n\n def render(self, request):\n \"\"\"\n This method is called to render the widget's content.\n\n Params:\n request: The current request\n \"\"\"\n raise NotImplementedError(f\"{self.__class__} must define a render() method.\")\n\n @property\n def name(self):\n return f'{self.__class__.__module__.split(\".\")[0]}.{self.__class__.__name__}'\n\n @property\n def form_data(self):\n return {\n 'title': self.title,\n 'color': self.color,\n 'config': self.config,\n }\n\n\n@register_widget\nclass NoteWidget(DashboardWidget):\n default_title = _('Note')\n description = _('Display some arbitrary custom content. Markdown is supported.')\n\n class ConfigForm(WidgetConfigForm):\n content = forms.CharField(\n widget=forms.Textarea()\n )\n\n def render(self, request):\n return render_markdown(self.config.get('content'))\n\n\n@register_widget\nclass ObjectCountsWidget(DashboardWidget):\n default_title = _('Object Counts')\n description = _('Display a set of NetBox models and the number of objects created for each type.')\n template_name = 'extras/dashboard/widgets/objectcounts.html'\n\n class ConfigForm(WidgetConfigForm):\n models = forms.MultipleChoiceField(\n choices=get_content_type_labels\n )\n filters = forms.JSONField(\n required=False,\n label='Object filters',\n help_text=_(\"Filters to apply when counting the number of objects\")\n )\n\n def clean_filters(self):\n if data := self.cleaned_data['filters']:\n try:\n dict(data)\n except TypeError:\n raise forms.ValidationError(\"Invalid format. Object filters must be passed as a dictionary.\")\n return data\n\n def render(self, request):\n counts = []\n for model in get_models_from_content_types(self.config['models']):\n permission = get_permission_for_model(model, 'view')\n if request.user.has_perm(permission):\n url = reverse(get_viewname(model, 'list'))\n qs = model.objects.restrict(request.user, 'view')\n # Apply any specified filters\n if filters := self.config.get('filters'):\n params = dict_to_querydict(filters)\n filterset = getattr(resolve(url).func.view_class, 'filterset', None)\n qs = filterset(params, qs).qs\n url = f'{url}?{params.urlencode()}'\n object_count = qs.count\n counts.append((model, object_count, url))\n else:\n counts.append((model, None, None))\n\n return render_to_string(self.template_name, {\n 'counts': counts,\n })\n\n\n@register_widget\nclass ObjectListWidget(DashboardWidget):\n default_title = _('Object List')\n description = _('Display an arbitrary list of objects.')\n template_name = 'extras/dashboard/widgets/objectlist.html'\n width = 12\n height = 4\n\n class ConfigForm(WidgetConfigForm):\n model = forms.ChoiceField(\n choices=get_content_type_labels\n )\n page_size = forms.IntegerField(\n required=False,\n min_value=1,\n max_value=100,\n help_text=_('The default number of objects to display')\n )\n url_params = forms.JSONField(\n required=False,\n label='URL parameters'\n )\n\n def clean_url_params(self):\n if data := self.cleaned_data['url_params']:\n try:\n urlencode(data)\n except (TypeError, ValueError):\n raise forms.ValidationError(\"Invalid format. URL parameters must be passed as a dictionary.\")\n return data\n\n def render(self, request):\n app_label, model_name = self.config['model'].split('.')\n model = ContentType.objects.get_by_natural_key(app_label, model_name).model_class()\n viewname = get_viewname(model, action='list')\n\n # Evaluate user's permission. Note that this controls only whether the HTMX element is\n # embedded on the page: The view itself will also evaluate permissions separately.\n permission = get_permission_for_model(model, 'view')\n has_permission = request.user.has_perm(permission)\n\n try:\n htmx_url = reverse(viewname)\n except NoReverseMatch:\n htmx_url = None\n parameters = self.config.get('url_params') or {}\n if page_size := self.config.get('page_size'):\n parameters['per_page'] = page_size\n\n if parameters:\n try:\n htmx_url = f'{htmx_url}?{urlencode(parameters, doseq=True)}'\n except ValueError:\n pass\n return render_to_string(self.template_name, {\n 'viewname': viewname,\n 'has_permission': has_permission,\n 'htmx_url': htmx_url,\n })\n\n\n@register_widget\nclass RSSFeedWidget(DashboardWidget):\n default_title = _('RSS Feed')\n default_config = {\n 'max_entries': 10,\n 'cache_timeout': 3600, # seconds\n }\n description = _('Embed an RSS feed from an external website.')\n template_name = 'extras/dashboard/widgets/rssfeed.html'\n width = 6\n height = 4\n\n class ConfigForm(WidgetConfigForm):\n feed_url = forms.URLField(\n label=_('Feed URL')\n )\n max_entries = forms.IntegerField(\n min_value=1,\n max_value=1000,\n help_text=_('The maximum number of objects to display')\n )\n cache_timeout = forms.IntegerField(\n min_value=600, # 10 minutes\n max_value=86400, # 24 hours\n help_text=_('How long to stored the cached content (in seconds)')\n )\n\n def render(self, request):\n return render_to_string(self.template_name, {\n 'url': self.config['feed_url'],\n **self.get_feed()\n })\n\n @cached_property\n def cache_key(self):\n url = self.config['feed_url']\n url_checksum = sha256(url.encode('utf-8')).hexdigest()\n return f'dashboard_rss_{url_checksum}'\n\n def get_feed(self):\n # Fetch RSS content from cache if available\n if feed_content := cache.get(self.cache_key):\n return {\n 'feed': feedparser.FeedParserDict(feed_content),\n }\n\n # Fetch feed content from remote server\n try:\n response = requests.get(\n url=self.config['feed_url'],\n headers={'User-Agent': f'NetBox/{settings.VERSION}'},\n proxies=settings.HTTP_PROXIES,\n timeout=3\n )\n response.raise_for_status()\n except requests.exceptions.RequestException as e:\n return {\n 'error': e,\n }\n\n # Parse feed content\n feed = feedparser.parse(response.content)\n if not feed.bozo:\n # Cap number of entries\n max_entries = self.config.get('max_entries')\n feed['entries'] = feed['entries'][:max_entries]\n # Cache the feed content\n cache.set(self.cache_key, dict(feed), self.config.get('cache_timeout'))\n\n return {\n 'feed': feed,\n }\n\n\n@register_widget\nclass BookmarksWidget(DashboardWidget):\n default_title = _('Bookmarks')\n default_config = {\n 'order_by': BookmarkOrderingChoices.ORDERING_NEWEST,\n }\n description = _('Show your personal bookmarks')\n template_name = 'extras/dashboard/widgets/bookmarks.html'\n\n class ConfigForm(WidgetConfigForm):\n object_types = forms.MultipleChoiceField(\n # TODO: Restrict the choices by FeatureQuery('bookmarks')\n choices=get_content_type_labels,\n required=False\n )\n order_by = forms.ChoiceField(\n choices=BookmarkOrderingChoices\n )\n max_items = forms.IntegerField(\n min_value=1,\n required=False\n )\n\n def render(self, request):\n from extras.models import Bookmark\n\n if request.user.is_anonymous:\n bookmarks = list()\n else:\n bookmarks = Bookmark.objects.filter(user=request.user).order_by(self.config['order_by'])\n if object_types := self.config.get('object_types'):\n models = get_models_from_content_types(object_types)\n conent_types = ContentType.objects.get_for_models(*models).values()\n bookmarks = bookmarks.filter(object_type__in=conent_types)\n if max_items := self.config.get('max_items'):\n bookmarks = bookmarks[:max_items]\n\n return render_to_string(self.template_name, {\n 'bookmarks': bookmarks,\n })\n", "path": "netbox/extras/dashboard/widgets.py"}], "after_files": [{"content": "import uuid\nfrom functools import cached_property\nfrom hashlib import sha256\nfrom urllib.parse import urlencode\n\nimport feedparser\nimport requests\nfrom django import forms\nfrom django.conf import settings\nfrom django.contrib.contenttypes.models import ContentType\nfrom django.core.cache import cache\nfrom django.db.models import Q\nfrom django.template.loader import render_to_string\nfrom django.urls import NoReverseMatch, resolve, reverse\nfrom django.utils.translation import gettext as _\n\nfrom extras.choices import BookmarkOrderingChoices\nfrom extras.utils import FeatureQuery\nfrom utilities.choices import ButtonColorChoices\nfrom utilities.forms import BootstrapMixin\nfrom utilities.permissions import get_permission_for_model\nfrom utilities.templatetags.builtins.filters import render_markdown\nfrom utilities.utils import content_type_identifier, content_type_name, dict_to_querydict, get_viewname\nfrom .utils import register_widget\n\n__all__ = (\n 'BookmarksWidget',\n 'DashboardWidget',\n 'NoteWidget',\n 'ObjectCountsWidget',\n 'ObjectListWidget',\n 'RSSFeedWidget',\n 'WidgetConfigForm',\n)\n\n\ndef get_content_type_labels():\n return [\n (content_type_identifier(ct), content_type_name(ct))\n for ct in ContentType.objects.filter(\n FeatureQuery('export_templates').get_query() | Q(app_label='extras', model='objectchange') |\n Q(app_label='extras', model='configcontext')\n ).order_by('app_label', 'model')\n ]\n\n\ndef get_models_from_content_types(content_types):\n \"\"\"\n Return a list of models corresponding to the given content types, identified by natural key.\n \"\"\"\n models = []\n for content_type_id in content_types:\n app_label, model_name = content_type_id.split('.')\n content_type = ContentType.objects.get_by_natural_key(app_label, model_name)\n models.append(content_type.model_class())\n return models\n\n\nclass WidgetConfigForm(BootstrapMixin, forms.Form):\n pass\n\n\nclass DashboardWidget:\n \"\"\"\n Base class for custom dashboard widgets.\n\n Attributes:\n description: A brief, user-friendly description of the widget's function\n default_title: The string to show for the widget's title when none has been specified.\n default_config: Default configuration parameters, as a dictionary mapping\n width: The widget's default width (1 to 12)\n height: The widget's default height; the number of rows it consumes\n \"\"\"\n description = None\n default_title = None\n default_config = {}\n width = 4\n height = 3\n\n class ConfigForm(WidgetConfigForm):\n \"\"\"\n The widget's configuration form.\n \"\"\"\n pass\n\n def __init__(self, id=None, title=None, color=None, config=None, width=None, height=None, x=None, y=None):\n self.id = id or str(uuid.uuid4())\n self.config = config or self.default_config\n self.title = title or self.default_title\n self.color = color\n if width:\n self.width = width\n if height:\n self.height = height\n self.x, self.y = x, y\n\n def __str__(self):\n return self.title or self.__class__.__name__\n\n def set_layout(self, grid_item):\n self.width = grid_item.get('w', 1)\n self.height = grid_item.get('h', 1)\n self.x = grid_item.get('x')\n self.y = grid_item.get('y')\n\n def render(self, request):\n \"\"\"\n This method is called to render the widget's content.\n\n Params:\n request: The current request\n \"\"\"\n raise NotImplementedError(f\"{self.__class__} must define a render() method.\")\n\n @property\n def name(self):\n return f'{self.__class__.__module__.split(\".\")[0]}.{self.__class__.__name__}'\n\n @property\n def fg_color(self):\n \"\"\"\n Return the appropriate foreground (text) color for the widget's color.\n \"\"\"\n if self.color in (\n ButtonColorChoices.CYAN,\n ButtonColorChoices.GRAY,\n ButtonColorChoices.GREY,\n ButtonColorChoices.TEAL,\n ButtonColorChoices.WHITE,\n ButtonColorChoices.YELLOW,\n ):\n return ButtonColorChoices.BLACK\n return ButtonColorChoices.WHITE\n\n @property\n def form_data(self):\n return {\n 'title': self.title,\n 'color': self.color,\n 'config': self.config,\n }\n\n\n@register_widget\nclass NoteWidget(DashboardWidget):\n default_title = _('Note')\n description = _('Display some arbitrary custom content. Markdown is supported.')\n\n class ConfigForm(WidgetConfigForm):\n content = forms.CharField(\n widget=forms.Textarea()\n )\n\n def render(self, request):\n return render_markdown(self.config.get('content'))\n\n\n@register_widget\nclass ObjectCountsWidget(DashboardWidget):\n default_title = _('Object Counts')\n description = _('Display a set of NetBox models and the number of objects created for each type.')\n template_name = 'extras/dashboard/widgets/objectcounts.html'\n\n class ConfigForm(WidgetConfigForm):\n models = forms.MultipleChoiceField(\n choices=get_content_type_labels\n )\n filters = forms.JSONField(\n required=False,\n label='Object filters',\n help_text=_(\"Filters to apply when counting the number of objects\")\n )\n\n def clean_filters(self):\n if data := self.cleaned_data['filters']:\n try:\n dict(data)\n except TypeError:\n raise forms.ValidationError(\"Invalid format. Object filters must be passed as a dictionary.\")\n return data\n\n def render(self, request):\n counts = []\n for model in get_models_from_content_types(self.config['models']):\n permission = get_permission_for_model(model, 'view')\n if request.user.has_perm(permission):\n url = reverse(get_viewname(model, 'list'))\n qs = model.objects.restrict(request.user, 'view')\n # Apply any specified filters\n if filters := self.config.get('filters'):\n params = dict_to_querydict(filters)\n filterset = getattr(resolve(url).func.view_class, 'filterset', None)\n qs = filterset(params, qs).qs\n url = f'{url}?{params.urlencode()}'\n object_count = qs.count\n counts.append((model, object_count, url))\n else:\n counts.append((model, None, None))\n\n return render_to_string(self.template_name, {\n 'counts': counts,\n })\n\n\n@register_widget\nclass ObjectListWidget(DashboardWidget):\n default_title = _('Object List')\n description = _('Display an arbitrary list of objects.')\n template_name = 'extras/dashboard/widgets/objectlist.html'\n width = 12\n height = 4\n\n class ConfigForm(WidgetConfigForm):\n model = forms.ChoiceField(\n choices=get_content_type_labels\n )\n page_size = forms.IntegerField(\n required=False,\n min_value=1,\n max_value=100,\n help_text=_('The default number of objects to display')\n )\n url_params = forms.JSONField(\n required=False,\n label='URL parameters'\n )\n\n def clean_url_params(self):\n if data := self.cleaned_data['url_params']:\n try:\n urlencode(data)\n except (TypeError, ValueError):\n raise forms.ValidationError(\"Invalid format. URL parameters must be passed as a dictionary.\")\n return data\n\n def render(self, request):\n app_label, model_name = self.config['model'].split('.')\n model = ContentType.objects.get_by_natural_key(app_label, model_name).model_class()\n viewname = get_viewname(model, action='list')\n\n # Evaluate user's permission. Note that this controls only whether the HTMX element is\n # embedded on the page: The view itself will also evaluate permissions separately.\n permission = get_permission_for_model(model, 'view')\n has_permission = request.user.has_perm(permission)\n\n try:\n htmx_url = reverse(viewname)\n except NoReverseMatch:\n htmx_url = None\n parameters = self.config.get('url_params') or {}\n if page_size := self.config.get('page_size'):\n parameters['per_page'] = page_size\n\n if parameters:\n try:\n htmx_url = f'{htmx_url}?{urlencode(parameters, doseq=True)}'\n except ValueError:\n pass\n return render_to_string(self.template_name, {\n 'viewname': viewname,\n 'has_permission': has_permission,\n 'htmx_url': htmx_url,\n })\n\n\n@register_widget\nclass RSSFeedWidget(DashboardWidget):\n default_title = _('RSS Feed')\n default_config = {\n 'max_entries': 10,\n 'cache_timeout': 3600, # seconds\n }\n description = _('Embed an RSS feed from an external website.')\n template_name = 'extras/dashboard/widgets/rssfeed.html'\n width = 6\n height = 4\n\n class ConfigForm(WidgetConfigForm):\n feed_url = forms.URLField(\n label=_('Feed URL')\n )\n max_entries = forms.IntegerField(\n min_value=1,\n max_value=1000,\n help_text=_('The maximum number of objects to display')\n )\n cache_timeout = forms.IntegerField(\n min_value=600, # 10 minutes\n max_value=86400, # 24 hours\n help_text=_('How long to stored the cached content (in seconds)')\n )\n\n def render(self, request):\n return render_to_string(self.template_name, {\n 'url': self.config['feed_url'],\n **self.get_feed()\n })\n\n @cached_property\n def cache_key(self):\n url = self.config['feed_url']\n url_checksum = sha256(url.encode('utf-8')).hexdigest()\n return f'dashboard_rss_{url_checksum}'\n\n def get_feed(self):\n # Fetch RSS content from cache if available\n if feed_content := cache.get(self.cache_key):\n return {\n 'feed': feedparser.FeedParserDict(feed_content),\n }\n\n # Fetch feed content from remote server\n try:\n response = requests.get(\n url=self.config['feed_url'],\n headers={'User-Agent': f'NetBox/{settings.VERSION}'},\n proxies=settings.HTTP_PROXIES,\n timeout=3\n )\n response.raise_for_status()\n except requests.exceptions.RequestException as e:\n return {\n 'error': e,\n }\n\n # Parse feed content\n feed = feedparser.parse(response.content)\n if not feed.bozo:\n # Cap number of entries\n max_entries = self.config.get('max_entries')\n feed['entries'] = feed['entries'][:max_entries]\n # Cache the feed content\n cache.set(self.cache_key, dict(feed), self.config.get('cache_timeout'))\n\n return {\n 'feed': feed,\n }\n\n\n@register_widget\nclass BookmarksWidget(DashboardWidget):\n default_title = _('Bookmarks')\n default_config = {\n 'order_by': BookmarkOrderingChoices.ORDERING_NEWEST,\n }\n description = _('Show your personal bookmarks')\n template_name = 'extras/dashboard/widgets/bookmarks.html'\n\n class ConfigForm(WidgetConfigForm):\n object_types = forms.MultipleChoiceField(\n # TODO: Restrict the choices by FeatureQuery('bookmarks')\n choices=get_content_type_labels,\n required=False\n )\n order_by = forms.ChoiceField(\n choices=BookmarkOrderingChoices\n )\n max_items = forms.IntegerField(\n min_value=1,\n required=False\n )\n\n def render(self, request):\n from extras.models import Bookmark\n\n if request.user.is_anonymous:\n bookmarks = list()\n else:\n bookmarks = Bookmark.objects.filter(user=request.user).order_by(self.config['order_by'])\n if object_types := self.config.get('object_types'):\n models = get_models_from_content_types(object_types)\n conent_types = ContentType.objects.get_for_models(*models).values()\n bookmarks = bookmarks.filter(object_type__in=conent_types)\n if max_items := self.config.get('max_items'):\n bookmarks = bookmarks[:max_items]\n\n return render_to_string(self.template_name, {\n 'bookmarks': bookmarks,\n })\n", "path": "netbox/extras/dashboard/widgets.py"}]}
| 4,047 | 277 |
gh_patches_debug_6585
|
rasdani/github-patches
|
git_diff
|
Cog-Creators__Red-DiscordBot-1648
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[V3 Parser] --dry-run flag not handled
# Other bugs
#### What were you trying to do?
I was trying to run the bot with the `--dry-run` flag
#### What were you expecting to happen?
See the bot loading data, then stop before connecting to the gateway
```
--dry-run Makes Red quit with code 0 just before the login. This
is useful for testing the boot process.
```
#### What actually happened?
The bot connected to the gateway as usual, like if there weren't flags
#### How can we reproduce this issue?
Just run any bot with the `--dry-run` and see if it change something
### Details
I searched for the flag in the source code to see if it was actually being used somewhere... No results.
*Nice issues templates btw*
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `redbot/__main__.py`
Content:
```
1 #!/usr/bin/env python
2
3 # Discord Version check
4
5 import sys
6 import discord
7 from redbot.core.bot import Red, ExitCodes
8 from redbot.core.cog_manager import CogManagerUI
9 from redbot.core.data_manager import load_basic_configuration, config_file
10 from redbot.core.json_io import JsonIO
11 from redbot.core.global_checks import init_global_checks
12 from redbot.core.events import init_events
13 from redbot.core.cli import interactive_config, confirm, parse_cli_flags, ask_sentry
14 from redbot.core.core_commands import Core
15 from redbot.core.dev_commands import Dev
16 from redbot.core import rpc, __version__
17 import asyncio
18 import logging.handlers
19 import logging
20 import os
21
22
23 #
24 # Red - Discord Bot v3
25 #
26 # Made by Twentysix, improved by many
27 #
28
29
30 def init_loggers(cli_flags):
31 # d.py stuff
32 dpy_logger = logging.getLogger("discord")
33 dpy_logger.setLevel(logging.WARNING)
34 console = logging.StreamHandler()
35 console.setLevel(logging.WARNING)
36 dpy_logger.addHandler(console)
37
38 # Red stuff
39
40 logger = logging.getLogger("red")
41
42 red_format = logging.Formatter(
43 "%(asctime)s %(levelname)s %(module)s %(funcName)s %(lineno)d: " "%(message)s",
44 datefmt="[%d/%m/%Y %H:%M]",
45 )
46
47 stdout_handler = logging.StreamHandler(sys.stdout)
48 stdout_handler.setFormatter(red_format)
49
50 if cli_flags.debug:
51 os.environ["PYTHONASYNCIODEBUG"] = "1"
52 logger.setLevel(logging.DEBUG)
53 else:
54 logger.setLevel(logging.WARNING)
55
56 from redbot.core.data_manager import core_data_path
57
58 logfile_path = core_data_path() / "red.log"
59 fhandler = logging.handlers.RotatingFileHandler(
60 filename=str(logfile_path), encoding="utf-8", mode="a", maxBytes=10 ** 7, backupCount=5
61 )
62 fhandler.setFormatter(red_format)
63
64 logger.addHandler(fhandler)
65 logger.addHandler(stdout_handler)
66
67 # Sentry stuff
68 sentry_logger = logging.getLogger("red.sentry")
69 sentry_logger.setLevel(logging.WARNING)
70
71 return logger, sentry_logger
72
73
74 async def _get_prefix_and_token(red, indict):
75 """
76 Again, please blame <@269933075037814786> for this.
77 :param indict:
78 :return:
79 """
80 indict["token"] = await red.db.token()
81 indict["prefix"] = await red.db.prefix()
82 indict["enable_sentry"] = await red.db.enable_sentry()
83
84
85 def list_instances():
86 if not config_file.exists():
87 print(
88 "No instances have been configured! Configure one "
89 "using `redbot-setup` before trying to run the bot!"
90 )
91 sys.exit(1)
92 else:
93 data = JsonIO(config_file)._load_json()
94 text = "Configured Instances:\n\n"
95 for instance_name in sorted(data.keys()):
96 text += "{}\n".format(instance_name)
97 print(text)
98 sys.exit(0)
99
100
101 def main():
102 description = "Red - Version {}".format(__version__)
103 cli_flags = parse_cli_flags(sys.argv[1:])
104 if cli_flags.list_instances:
105 list_instances()
106 elif cli_flags.version:
107 print(description)
108 sys.exit(0)
109 elif not cli_flags.instance_name:
110 print("Error: No instance name was provided!")
111 sys.exit(1)
112 load_basic_configuration(cli_flags.instance_name)
113 log, sentry_log = init_loggers(cli_flags)
114 red = Red(cli_flags, description=description, pm_help=None)
115 init_global_checks(red)
116 init_events(red, cli_flags)
117 red.add_cog(Core(red))
118 red.add_cog(CogManagerUI())
119 if cli_flags.dev:
120 red.add_cog(Dev())
121 loop = asyncio.get_event_loop()
122 tmp_data = {}
123 loop.run_until_complete(_get_prefix_and_token(red, tmp_data))
124 token = os.environ.get("RED_TOKEN", tmp_data["token"])
125 prefix = cli_flags.prefix or tmp_data["prefix"]
126 if token is None or not prefix:
127 if cli_flags.no_prompt is False:
128 new_token = interactive_config(red, token_set=bool(token), prefix_set=bool(prefix))
129 if new_token:
130 token = new_token
131 else:
132 log.critical("Token and prefix must be set in order to login.")
133 sys.exit(1)
134 loop.run_until_complete(_get_prefix_and_token(red, tmp_data))
135 if tmp_data["enable_sentry"]:
136 red.enable_sentry()
137 cleanup_tasks = True
138 try:
139 loop.run_until_complete(red.start(token, bot=not cli_flags.not_bot))
140 except discord.LoginFailure:
141 cleanup_tasks = False # No login happened, no need for this
142 log.critical(
143 "This token doesn't seem to be valid. If it belongs to "
144 "a user account, remember that the --not-bot flag "
145 "must be used. For self-bot functionalities instead, "
146 "--self-bot"
147 )
148 db_token = red.db.token()
149 if db_token and not cli_flags.no_prompt:
150 print("\nDo you want to reset the token? (y/n)")
151 if confirm("> "):
152 loop.run_until_complete(red.db.token.set(""))
153 print("Token has been reset.")
154 except KeyboardInterrupt:
155 log.info("Keyboard interrupt detected. Quitting...")
156 loop.run_until_complete(red.logout())
157 red._shutdown_mode = ExitCodes.SHUTDOWN
158 except Exception as e:
159 log.critical("Fatal exception", exc_info=e)
160 sentry_log.critical("Fatal Exception", exc_info=e)
161 loop.run_until_complete(red.logout())
162 finally:
163 red.rpc.close()
164 if cleanup_tasks:
165 pending = asyncio.Task.all_tasks(loop=red.loop)
166 gathered = asyncio.gather(*pending, loop=red.loop, return_exceptions=True)
167 gathered.cancel()
168
169 sys.exit(red._shutdown_mode.value)
170
171
172 if __name__ == "__main__":
173 main()
174
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/redbot/__main__.py b/redbot/__main__.py
--- a/redbot/__main__.py
+++ b/redbot/__main__.py
@@ -132,6 +132,10 @@
log.critical("Token and prefix must be set in order to login.")
sys.exit(1)
loop.run_until_complete(_get_prefix_and_token(red, tmp_data))
+
+ if cli_flags.dry_run:
+ loop.run_until_complete(red.http.close())
+ sys.exit(0)
if tmp_data["enable_sentry"]:
red.enable_sentry()
cleanup_tasks = True
|
{"golden_diff": "diff --git a/redbot/__main__.py b/redbot/__main__.py\n--- a/redbot/__main__.py\n+++ b/redbot/__main__.py\n@@ -132,6 +132,10 @@\n log.critical(\"Token and prefix must be set in order to login.\")\n sys.exit(1)\n loop.run_until_complete(_get_prefix_and_token(red, tmp_data))\n+\n+ if cli_flags.dry_run:\n+ loop.run_until_complete(red.http.close())\n+ sys.exit(0)\n if tmp_data[\"enable_sentry\"]:\n red.enable_sentry()\n cleanup_tasks = True\n", "issue": "[V3 Parser] --dry-run flag not handled \n# Other bugs\r\n\r\n#### What were you trying to do?\r\n\r\nI was trying to run the bot with the `--dry-run` flag\r\n\r\n#### What were you expecting to happen?\r\n\r\nSee the bot loading data, then stop before connecting to the gateway\r\n```\r\n --dry-run Makes Red quit with code 0 just before the login. This\r\n is useful for testing the boot process.\r\n```\r\n#### What actually happened?\r\n\r\nThe bot connected to the gateway as usual, like if there weren't flags \r\n\r\n#### How can we reproduce this issue?\r\n\r\nJust run any bot with the `--dry-run` and see if it change something\r\n\r\n### Details\r\n\r\nI searched for the flag in the source code to see if it was actually being used somewhere... No results.\r\n\r\n*Nice issues templates btw*\n", "before_files": [{"content": "#!/usr/bin/env python\n\n# Discord Version check\n\nimport sys\nimport discord\nfrom redbot.core.bot import Red, ExitCodes\nfrom redbot.core.cog_manager import CogManagerUI\nfrom redbot.core.data_manager import load_basic_configuration, config_file\nfrom redbot.core.json_io import JsonIO\nfrom redbot.core.global_checks import init_global_checks\nfrom redbot.core.events import init_events\nfrom redbot.core.cli import interactive_config, confirm, parse_cli_flags, ask_sentry\nfrom redbot.core.core_commands import Core\nfrom redbot.core.dev_commands import Dev\nfrom redbot.core import rpc, __version__\nimport asyncio\nimport logging.handlers\nimport logging\nimport os\n\n\n#\n# Red - Discord Bot v3\n#\n# Made by Twentysix, improved by many\n#\n\n\ndef init_loggers(cli_flags):\n # d.py stuff\n dpy_logger = logging.getLogger(\"discord\")\n dpy_logger.setLevel(logging.WARNING)\n console = logging.StreamHandler()\n console.setLevel(logging.WARNING)\n dpy_logger.addHandler(console)\n\n # Red stuff\n\n logger = logging.getLogger(\"red\")\n\n red_format = logging.Formatter(\n \"%(asctime)s %(levelname)s %(module)s %(funcName)s %(lineno)d: \" \"%(message)s\",\n datefmt=\"[%d/%m/%Y %H:%M]\",\n )\n\n stdout_handler = logging.StreamHandler(sys.stdout)\n stdout_handler.setFormatter(red_format)\n\n if cli_flags.debug:\n os.environ[\"PYTHONASYNCIODEBUG\"] = \"1\"\n logger.setLevel(logging.DEBUG)\n else:\n logger.setLevel(logging.WARNING)\n\n from redbot.core.data_manager import core_data_path\n\n logfile_path = core_data_path() / \"red.log\"\n fhandler = logging.handlers.RotatingFileHandler(\n filename=str(logfile_path), encoding=\"utf-8\", mode=\"a\", maxBytes=10 ** 7, backupCount=5\n )\n fhandler.setFormatter(red_format)\n\n logger.addHandler(fhandler)\n logger.addHandler(stdout_handler)\n\n # Sentry stuff\n sentry_logger = logging.getLogger(\"red.sentry\")\n sentry_logger.setLevel(logging.WARNING)\n\n return logger, sentry_logger\n\n\nasync def _get_prefix_and_token(red, indict):\n \"\"\"\n Again, please blame <@269933075037814786> for this.\n :param indict:\n :return:\n \"\"\"\n indict[\"token\"] = await red.db.token()\n indict[\"prefix\"] = await red.db.prefix()\n indict[\"enable_sentry\"] = await red.db.enable_sentry()\n\n\ndef list_instances():\n if not config_file.exists():\n print(\n \"No instances have been configured! Configure one \"\n \"using `redbot-setup` before trying to run the bot!\"\n )\n sys.exit(1)\n else:\n data = JsonIO(config_file)._load_json()\n text = \"Configured Instances:\\n\\n\"\n for instance_name in sorted(data.keys()):\n text += \"{}\\n\".format(instance_name)\n print(text)\n sys.exit(0)\n\n\ndef main():\n description = \"Red - Version {}\".format(__version__)\n cli_flags = parse_cli_flags(sys.argv[1:])\n if cli_flags.list_instances:\n list_instances()\n elif cli_flags.version:\n print(description)\n sys.exit(0)\n elif not cli_flags.instance_name:\n print(\"Error: No instance name was provided!\")\n sys.exit(1)\n load_basic_configuration(cli_flags.instance_name)\n log, sentry_log = init_loggers(cli_flags)\n red = Red(cli_flags, description=description, pm_help=None)\n init_global_checks(red)\n init_events(red, cli_flags)\n red.add_cog(Core(red))\n red.add_cog(CogManagerUI())\n if cli_flags.dev:\n red.add_cog(Dev())\n loop = asyncio.get_event_loop()\n tmp_data = {}\n loop.run_until_complete(_get_prefix_and_token(red, tmp_data))\n token = os.environ.get(\"RED_TOKEN\", tmp_data[\"token\"])\n prefix = cli_flags.prefix or tmp_data[\"prefix\"]\n if token is None or not prefix:\n if cli_flags.no_prompt is False:\n new_token = interactive_config(red, token_set=bool(token), prefix_set=bool(prefix))\n if new_token:\n token = new_token\n else:\n log.critical(\"Token and prefix must be set in order to login.\")\n sys.exit(1)\n loop.run_until_complete(_get_prefix_and_token(red, tmp_data))\n if tmp_data[\"enable_sentry\"]:\n red.enable_sentry()\n cleanup_tasks = True\n try:\n loop.run_until_complete(red.start(token, bot=not cli_flags.not_bot))\n except discord.LoginFailure:\n cleanup_tasks = False # No login happened, no need for this\n log.critical(\n \"This token doesn't seem to be valid. If it belongs to \"\n \"a user account, remember that the --not-bot flag \"\n \"must be used. For self-bot functionalities instead, \"\n \"--self-bot\"\n )\n db_token = red.db.token()\n if db_token and not cli_flags.no_prompt:\n print(\"\\nDo you want to reset the token? (y/n)\")\n if confirm(\"> \"):\n loop.run_until_complete(red.db.token.set(\"\"))\n print(\"Token has been reset.\")\n except KeyboardInterrupt:\n log.info(\"Keyboard interrupt detected. Quitting...\")\n loop.run_until_complete(red.logout())\n red._shutdown_mode = ExitCodes.SHUTDOWN\n except Exception as e:\n log.critical(\"Fatal exception\", exc_info=e)\n sentry_log.critical(\"Fatal Exception\", exc_info=e)\n loop.run_until_complete(red.logout())\n finally:\n red.rpc.close()\n if cleanup_tasks:\n pending = asyncio.Task.all_tasks(loop=red.loop)\n gathered = asyncio.gather(*pending, loop=red.loop, return_exceptions=True)\n gathered.cancel()\n\n sys.exit(red._shutdown_mode.value)\n\n\nif __name__ == \"__main__\":\n main()\n", "path": "redbot/__main__.py"}], "after_files": [{"content": "#!/usr/bin/env python\n\n# Discord Version check\n\nimport sys\nimport discord\nfrom redbot.core.bot import Red, ExitCodes\nfrom redbot.core.cog_manager import CogManagerUI\nfrom redbot.core.data_manager import load_basic_configuration, config_file\nfrom redbot.core.json_io import JsonIO\nfrom redbot.core.global_checks import init_global_checks\nfrom redbot.core.events import init_events\nfrom redbot.core.cli import interactive_config, confirm, parse_cli_flags, ask_sentry\nfrom redbot.core.core_commands import Core\nfrom redbot.core.dev_commands import Dev\nfrom redbot.core import rpc, __version__\nimport asyncio\nimport logging.handlers\nimport logging\nimport os\n\n\n#\n# Red - Discord Bot v3\n#\n# Made by Twentysix, improved by many\n#\n\n\ndef init_loggers(cli_flags):\n # d.py stuff\n dpy_logger = logging.getLogger(\"discord\")\n dpy_logger.setLevel(logging.WARNING)\n console = logging.StreamHandler()\n console.setLevel(logging.WARNING)\n dpy_logger.addHandler(console)\n\n # Red stuff\n\n logger = logging.getLogger(\"red\")\n\n red_format = logging.Formatter(\n \"%(asctime)s %(levelname)s %(module)s %(funcName)s %(lineno)d: \" \"%(message)s\",\n datefmt=\"[%d/%m/%Y %H:%M]\",\n )\n\n stdout_handler = logging.StreamHandler(sys.stdout)\n stdout_handler.setFormatter(red_format)\n\n if cli_flags.debug:\n os.environ[\"PYTHONASYNCIODEBUG\"] = \"1\"\n logger.setLevel(logging.DEBUG)\n else:\n logger.setLevel(logging.WARNING)\n\n from redbot.core.data_manager import core_data_path\n\n logfile_path = core_data_path() / \"red.log\"\n fhandler = logging.handlers.RotatingFileHandler(\n filename=str(logfile_path), encoding=\"utf-8\", mode=\"a\", maxBytes=10 ** 7, backupCount=5\n )\n fhandler.setFormatter(red_format)\n\n logger.addHandler(fhandler)\n logger.addHandler(stdout_handler)\n\n # Sentry stuff\n sentry_logger = logging.getLogger(\"red.sentry\")\n sentry_logger.setLevel(logging.WARNING)\n\n return logger, sentry_logger\n\n\nasync def _get_prefix_and_token(red, indict):\n \"\"\"\n Again, please blame <@269933075037814786> for this.\n :param indict:\n :return:\n \"\"\"\n indict[\"token\"] = await red.db.token()\n indict[\"prefix\"] = await red.db.prefix()\n indict[\"enable_sentry\"] = await red.db.enable_sentry()\n\n\ndef list_instances():\n if not config_file.exists():\n print(\n \"No instances have been configured! Configure one \"\n \"using `redbot-setup` before trying to run the bot!\"\n )\n sys.exit(1)\n else:\n data = JsonIO(config_file)._load_json()\n text = \"Configured Instances:\\n\\n\"\n for instance_name in sorted(data.keys()):\n text += \"{}\\n\".format(instance_name)\n print(text)\n sys.exit(0)\n\n\ndef main():\n description = \"Red - Version {}\".format(__version__)\n cli_flags = parse_cli_flags(sys.argv[1:])\n if cli_flags.list_instances:\n list_instances()\n elif cli_flags.version:\n print(description)\n sys.exit(0)\n elif not cli_flags.instance_name:\n print(\"Error: No instance name was provided!\")\n sys.exit(1)\n load_basic_configuration(cli_flags.instance_name)\n log, sentry_log = init_loggers(cli_flags)\n red = Red(cli_flags, description=description, pm_help=None)\n init_global_checks(red)\n init_events(red, cli_flags)\n red.add_cog(Core(red))\n red.add_cog(CogManagerUI())\n if cli_flags.dev:\n red.add_cog(Dev())\n loop = asyncio.get_event_loop()\n tmp_data = {}\n loop.run_until_complete(_get_prefix_and_token(red, tmp_data))\n token = os.environ.get(\"RED_TOKEN\", tmp_data[\"token\"])\n prefix = cli_flags.prefix or tmp_data[\"prefix\"]\n if token is None or not prefix:\n if cli_flags.no_prompt is False:\n new_token = interactive_config(red, token_set=bool(token), prefix_set=bool(prefix))\n if new_token:\n token = new_token\n else:\n log.critical(\"Token and prefix must be set in order to login.\")\n sys.exit(1)\n loop.run_until_complete(_get_prefix_and_token(red, tmp_data))\n\n if cli_flags.dry_run:\n loop.run_until_complete(red.http.close())\n sys.exit(0)\n if tmp_data[\"enable_sentry\"]:\n red.enable_sentry()\n cleanup_tasks = True\n try:\n loop.run_until_complete(red.start(token, bot=not cli_flags.not_bot))\n except discord.LoginFailure:\n cleanup_tasks = False # No login happened, no need for this\n log.critical(\n \"This token doesn't seem to be valid. If it belongs to \"\n \"a user account, remember that the --not-bot flag \"\n \"must be used. For self-bot functionalities instead, \"\n \"--self-bot\"\n )\n db_token = red.db.token()\n if db_token and not cli_flags.no_prompt:\n print(\"\\nDo you want to reset the token? (y/n)\")\n if confirm(\"> \"):\n loop.run_until_complete(red.db.token.set(\"\"))\n print(\"Token has been reset.\")\n except KeyboardInterrupt:\n log.info(\"Keyboard interrupt detected. Quitting...\")\n loop.run_until_complete(red.logout())\n red._shutdown_mode = ExitCodes.SHUTDOWN\n except Exception as e:\n log.critical(\"Fatal exception\", exc_info=e)\n sentry_log.critical(\"Fatal Exception\", exc_info=e)\n loop.run_until_complete(red.logout())\n finally:\n red.rpc.close()\n if cleanup_tasks:\n pending = asyncio.Task.all_tasks(loop=red.loop)\n gathered = asyncio.gather(*pending, loop=red.loop, return_exceptions=True)\n gathered.cancel()\n\n sys.exit(red._shutdown_mode.value)\n\n\nif __name__ == \"__main__\":\n main()\n", "path": "redbot/__main__.py"}]}
| 2,161 | 136 |
gh_patches_debug_9699
|
rasdani/github-patches
|
git_diff
|
litestar-org__litestar-695
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Bug: Header values should not be lowercased
**Describe the bug**
Headers keys and values are lowercased before being rendered. While lowercasing key is fine (as per HTTP2 spec), lowercasing values should not be done.
For my use case, integrating with a payment provider involves passing a transaction ID as a header. This ID contains lowercase, uppercase letters, and is not case-insensitive.
**To Reproduce**
1. Create a new route_handler.
2. In the route_handler, modify the response: `response.headers["Foo"] = "Bar"` .
3. Observe that the key and value are returned as lowercase:

4. While reporting this, I noticed that `date` and `vary` headers are not lowercased.
**Additional context**
Looking at the Git history, it seems that originally Starlette was used for handling the responses. I thought that Starlette might also have this behaviour, but they do not lowercase the header values. See https://github.com/encode/starlette/blob/fa737beb2838f568fe2477e2316a7fdf3d3eb257/starlette/responses.py#L72 .
The relevant code in Starlite is here: https://github.com/starlite-api/starlite/blob/95379f102be249f3c9453fbf9d216a41eede2d6f/starlite/response/base.py#L257
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `starlite/response/base.py`
Content:
```
1 from typing import (
2 TYPE_CHECKING,
3 Any,
4 Dict,
5 Generic,
6 List,
7 Optional,
8 Tuple,
9 TypeVar,
10 Union,
11 cast,
12 )
13
14 from orjson import OPT_INDENT_2, OPT_OMIT_MICROSECONDS, OPT_SERIALIZE_NUMPY, dumps
15 from pydantic_openapi_schema.v3_1_0 import OpenAPI
16 from yaml import dump as dump_yaml
17
18 from starlite.datastructures import Cookie, ETag
19 from starlite.enums import MediaType, OpenAPIMediaType
20 from starlite.exceptions import ImproperlyConfiguredException
21 from starlite.status_codes import (
22 HTTP_200_OK,
23 HTTP_204_NO_CONTENT,
24 HTTP_304_NOT_MODIFIED,
25 )
26 from starlite.utils.helpers import get_enum_string_value
27 from starlite.utils.serialization import default_serializer
28
29 if TYPE_CHECKING:
30 from typing_extensions import Literal
31
32 from starlite.datastructures import BackgroundTask, BackgroundTasks
33 from starlite.types import (
34 HTTPResponseBodyEvent,
35 HTTPResponseStartEvent,
36 Receive,
37 ResponseCookies,
38 Scope,
39 Send,
40 )
41
42 T = TypeVar("T")
43
44
45 class Response(Generic[T]):
46 __slots__ = (
47 "status_code",
48 "media_type",
49 "background",
50 "headers",
51 "cookies",
52 "encoding",
53 "body",
54 "status_allows_body",
55 "is_head_response",
56 )
57
58 def __init__(
59 self,
60 content: T,
61 *,
62 status_code: int = HTTP_200_OK,
63 media_type: Union[MediaType, "OpenAPIMediaType", str] = MediaType.JSON,
64 background: Optional[Union["BackgroundTask", "BackgroundTasks"]] = None,
65 headers: Optional[Dict[str, Any]] = None,
66 cookies: Optional["ResponseCookies"] = None,
67 encoding: str = "utf-8",
68 is_head_response: bool = False,
69 ) -> None:
70 """This is the base Starlite HTTP response class, used as the basis for
71 all other response classes.
72
73 Args:
74 content: A value for the response body that will be rendered into bytes string.
75 status_code: An HTTP status code.
76 media_type: A value for the response 'Content-Type' header.
77 background: A [BackgroundTask][starlite.datastructures.BackgroundTask] instance or
78 [BackgroundTasks][starlite.datastructures.BackgroundTasks] to execute after the response is finished.
79 Defaults to None.
80 headers: A string keyed dictionary of response headers. Header keys are insensitive.
81 cookies: A list of [Cookie][starlite.datastructures.Cookie] instances to be set under the response 'Set-Cookie' header.
82 encoding: The encoding to be used for the response headers.
83 is_head_response: Whether the response should send only the headers ("head" request) or also the content.
84 """
85 self.status_code = status_code
86 self.media_type = get_enum_string_value(media_type)
87 self.background = background
88 self.headers = headers or {}
89 self.cookies = cookies or []
90 self.encoding = encoding
91 self.is_head_response = is_head_response
92 self.status_allows_body = not (
93 self.status_code in {HTTP_204_NO_CONTENT, HTTP_304_NOT_MODIFIED} or self.status_code < HTTP_200_OK
94 )
95 self.body = self.render(content) if not self.is_head_response else b""
96
97 def set_cookie(
98 self,
99 key: str,
100 value: Optional[str] = None,
101 max_age: Optional[int] = None,
102 expires: Optional[int] = None,
103 path: str = "/",
104 domain: Optional[str] = None,
105 secure: bool = False,
106 httponly: bool = False,
107 samesite: 'Literal["lax", "strict", "none"]' = "lax",
108 ) -> None:
109 """Sets a cookie on the response.
110
111 Args:
112 key: Key for the cookie.
113 value: Value for the cookie, if none given defaults to empty string.
114 max_age: Maximal age of the cookie before its invalidated.
115 expires: Expiration date as unix MS timestamp.
116 path: Path fragment that must exist in the request url for the cookie to be valid. Defaults to '/'.
117 domain: Domain for which the cookie is valid.
118 secure: Https is required for the cookie.
119 httponly: Forbids javascript to access the cookie via 'Document.cookie'.
120 samesite: Controls whether a cookie is sent with cross-site requests. Defaults to 'lax'.
121
122 Returns:
123 None.
124 """
125 self.cookies.append(
126 Cookie(
127 domain=domain,
128 expires=expires,
129 httponly=httponly,
130 key=key,
131 max_age=max_age,
132 path=path,
133 samesite=samesite,
134 secure=secure,
135 value=value,
136 )
137 )
138
139 def set_header(self, key: str, value: str) -> None:
140 """Sets a header on the response.
141
142 Args:
143 key: Header key.
144 value: Header value.
145
146 Returns:
147 None.
148 """
149 self.headers[key] = value
150
151 def set_etag(self, etag: Union[str, "ETag"]) -> None:
152 """Sets an etag header.
153
154 Args:
155 etag: An etag value.
156
157 Returns:
158 None
159 """
160 self.headers["etag"] = etag.to_header() if isinstance(etag, ETag) else etag
161
162 def delete_cookie(
163 self,
164 key: str,
165 path: str = "/",
166 domain: Optional[str] = None,
167 ) -> None:
168 """Deletes a cookie.
169
170 Args:
171 key: Key of the cookie.
172 path: Path of the cookie.
173 domain: Domain of the cookie.
174
175 Returns:
176 None.
177 """
178 cookie = Cookie(key=key, path=path, domain=domain, expires=0, max_age=0)
179 self.cookies = [c for c in self.cookies if c != cookie]
180 self.cookies.append(cookie)
181
182 @staticmethod
183 def serializer(value: Any) -> Any:
184 """Serializer hook for orjson to handle pydantic models.
185
186 Args:
187 value: A value to serialize
188 Returns:
189 A serialized value
190 Raises:
191 TypeError: if value is not supported
192 """
193 return default_serializer(value)
194
195 def render(self, content: Any) -> bytes:
196 """
197 Handles the rendering of content T into a bytes string.
198 Args:
199 content: A value for the response body that will be rendered into bytes string.
200
201 Returns:
202 An encoded bytes string
203 """
204 if self.status_allows_body:
205 if isinstance(content, bytes):
206 return content
207 if isinstance(content, str):
208 return content.encode(self.encoding)
209 if self.media_type == MediaType.JSON:
210 try:
211 return dumps(content, default=self.serializer, option=OPT_SERIALIZE_NUMPY | OPT_OMIT_MICROSECONDS)
212 except (AttributeError, ValueError, TypeError) as e:
213 raise ImproperlyConfiguredException("Unable to serialize response content") from e
214 if isinstance(content, OpenAPI):
215 content_dict = content.dict(by_alias=True, exclude_none=True)
216 if self.media_type == OpenAPIMediaType.OPENAPI_YAML:
217 return cast("bytes", dump_yaml(content_dict, default_flow_style=False).encode("utf-8"))
218 return dumps(content_dict, option=OPT_INDENT_2 | OPT_OMIT_MICROSECONDS)
219 if content is None:
220 return b""
221 raise ImproperlyConfiguredException(
222 f"unable to render response body for the given {content} with media_type {self.media_type}"
223 )
224 if content is not None:
225 raise ImproperlyConfiguredException(
226 f"status_code {self.status_code} does not support a response body value"
227 )
228 return b""
229
230 @property
231 def content_length(self) -> Optional[int]:
232 """
233
234 Returns:
235 The content length of the body (e.g. for use in a "Content-Length" header).
236 If the response does not have a body, this value is `None`
237 """
238 if self.status_allows_body:
239 return len(self.body)
240 return None
241
242 @property
243 def encoded_headers(self) -> List[Tuple[bytes, bytes]]:
244 """
245 Notes:
246 - A 'Content-Length' header will be added if appropriate and not provided by the user.
247
248 Returns:
249 A list of tuples containing the headers and cookies of the request in a format ready for ASGI transmission.
250 """
251
252 if self.media_type.startswith("text/"):
253 content_type = f"{self.media_type}; charset={self.encoding}"
254 else:
255 content_type = self.media_type
256
257 encoded_headers = [
258 *((k.lower().encode("latin-1"), str(v).lower().encode("latin-1")) for k, v in self.headers.items()),
259 *((b"set-cookie", cookie.to_header(header="").encode("latin-1")) for cookie in self.cookies),
260 (b"content-type", content_type.encode("latin-1")),
261 ]
262
263 if self.content_length and not any(key == b"content-length" for key, _ in encoded_headers):
264 encoded_headers.append((b"content-length", str(self.content_length).encode("latin-1")))
265 return encoded_headers
266
267 async def after_response(self) -> None:
268 """Executed after the response is sent.
269
270 Returns:
271 None
272 """
273 if self.background is not None:
274 await self.background()
275
276 async def start_response(self, send: "Send") -> None:
277 """
278 Emits the start event of the response. This event includes the headers and status codes.
279 Args:
280 send: The ASGI send function.
281
282 Returns:
283 None
284 """
285 event: "HTTPResponseStartEvent" = {
286 "type": "http.response.start",
287 "status": self.status_code,
288 "headers": self.encoded_headers,
289 }
290
291 await send(event)
292
293 async def send_body(self, send: "Send", receive: "Receive") -> None: # pylint: disable=unused-argument
294 """Emits the response body.
295
296 Args:
297 send: The ASGI send function.
298 receive: The ASGI receive function.
299
300 Notes:
301 - Response subclasses should customize this method if there is a need to customize sending data.
302
303 Returns:
304 None
305 """
306 event: "HTTPResponseBodyEvent" = {"type": "http.response.body", "body": self.body, "more_body": False}
307 await send(event)
308
309 async def __call__(self, scope: "Scope", receive: "Receive", send: "Send") -> None:
310 """The call method of the response is an "ASGIApp".
311
312 Args:
313 scope: The ASGI connection scope.
314 receive: The ASGI receive function.
315 send: The ASGI send function.
316
317 Returns:
318 None
319 """
320 await self.start_response(send=send)
321
322 if self.is_head_response:
323 event: "HTTPResponseBodyEvent" = {"type": "http.response.body", "body": b"", "more_body": False}
324 await send(event)
325 else:
326 await self.send_body(send=send, receive=receive)
327
328 await self.after_response()
329
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/starlite/response/base.py b/starlite/response/base.py
--- a/starlite/response/base.py
+++ b/starlite/response/base.py
@@ -255,7 +255,7 @@
content_type = self.media_type
encoded_headers = [
- *((k.lower().encode("latin-1"), str(v).lower().encode("latin-1")) for k, v in self.headers.items()),
+ *((k.lower().encode("latin-1"), str(v).encode("latin-1")) for k, v in self.headers.items()),
*((b"set-cookie", cookie.to_header(header="").encode("latin-1")) for cookie in self.cookies),
(b"content-type", content_type.encode("latin-1")),
]
|
{"golden_diff": "diff --git a/starlite/response/base.py b/starlite/response/base.py\n--- a/starlite/response/base.py\n+++ b/starlite/response/base.py\n@@ -255,7 +255,7 @@\n content_type = self.media_type\n \n encoded_headers = [\n- *((k.lower().encode(\"latin-1\"), str(v).lower().encode(\"latin-1\")) for k, v in self.headers.items()),\n+ *((k.lower().encode(\"latin-1\"), str(v).encode(\"latin-1\")) for k, v in self.headers.items()),\n *((b\"set-cookie\", cookie.to_header(header=\"\").encode(\"latin-1\")) for cookie in self.cookies),\n (b\"content-type\", content_type.encode(\"latin-1\")),\n ]\n", "issue": "Bug: Header values should not be lowercased\n**Describe the bug**\r\nHeaders keys and values are lowercased before being rendered. While lowercasing key is fine (as per HTTP2 spec), lowercasing values should not be done.\r\n\r\nFor my use case, integrating with a payment provider involves passing a transaction ID as a header. This ID contains lowercase, uppercase letters, and is not case-insensitive.\r\n\r\n**To Reproduce**\r\n1. Create a new route_handler.\r\n2. In the route_handler, modify the response: `response.headers[\"Foo\"] = \"Bar\"` .\r\n3. Observe that the key and value are returned as lowercase:\r\n\r\n\r\n\r\n4. While reporting this, I noticed that `date` and `vary` headers are not lowercased.\r\n\r\n**Additional context**\r\nLooking at the Git history, it seems that originally Starlette was used for handling the responses. I thought that Starlette might also have this behaviour, but they do not lowercase the header values. See https://github.com/encode/starlette/blob/fa737beb2838f568fe2477e2316a7fdf3d3eb257/starlette/responses.py#L72 .\r\n\r\nThe relevant code in Starlite is here: https://github.com/starlite-api/starlite/blob/95379f102be249f3c9453fbf9d216a41eede2d6f/starlite/response/base.py#L257\r\n\n", "before_files": [{"content": "from typing import (\n TYPE_CHECKING,\n Any,\n Dict,\n Generic,\n List,\n Optional,\n Tuple,\n TypeVar,\n Union,\n cast,\n)\n\nfrom orjson import OPT_INDENT_2, OPT_OMIT_MICROSECONDS, OPT_SERIALIZE_NUMPY, dumps\nfrom pydantic_openapi_schema.v3_1_0 import OpenAPI\nfrom yaml import dump as dump_yaml\n\nfrom starlite.datastructures import Cookie, ETag\nfrom starlite.enums import MediaType, OpenAPIMediaType\nfrom starlite.exceptions import ImproperlyConfiguredException\nfrom starlite.status_codes import (\n HTTP_200_OK,\n HTTP_204_NO_CONTENT,\n HTTP_304_NOT_MODIFIED,\n)\nfrom starlite.utils.helpers import get_enum_string_value\nfrom starlite.utils.serialization import default_serializer\n\nif TYPE_CHECKING:\n from typing_extensions import Literal\n\n from starlite.datastructures import BackgroundTask, BackgroundTasks\n from starlite.types import (\n HTTPResponseBodyEvent,\n HTTPResponseStartEvent,\n Receive,\n ResponseCookies,\n Scope,\n Send,\n )\n\nT = TypeVar(\"T\")\n\n\nclass Response(Generic[T]):\n __slots__ = (\n \"status_code\",\n \"media_type\",\n \"background\",\n \"headers\",\n \"cookies\",\n \"encoding\",\n \"body\",\n \"status_allows_body\",\n \"is_head_response\",\n )\n\n def __init__(\n self,\n content: T,\n *,\n status_code: int = HTTP_200_OK,\n media_type: Union[MediaType, \"OpenAPIMediaType\", str] = MediaType.JSON,\n background: Optional[Union[\"BackgroundTask\", \"BackgroundTasks\"]] = None,\n headers: Optional[Dict[str, Any]] = None,\n cookies: Optional[\"ResponseCookies\"] = None,\n encoding: str = \"utf-8\",\n is_head_response: bool = False,\n ) -> None:\n \"\"\"This is the base Starlite HTTP response class, used as the basis for\n all other response classes.\n\n Args:\n content: A value for the response body that will be rendered into bytes string.\n status_code: An HTTP status code.\n media_type: A value for the response 'Content-Type' header.\n background: A [BackgroundTask][starlite.datastructures.BackgroundTask] instance or\n [BackgroundTasks][starlite.datastructures.BackgroundTasks] to execute after the response is finished.\n Defaults to None.\n headers: A string keyed dictionary of response headers. Header keys are insensitive.\n cookies: A list of [Cookie][starlite.datastructures.Cookie] instances to be set under the response 'Set-Cookie' header.\n encoding: The encoding to be used for the response headers.\n is_head_response: Whether the response should send only the headers (\"head\" request) or also the content.\n \"\"\"\n self.status_code = status_code\n self.media_type = get_enum_string_value(media_type)\n self.background = background\n self.headers = headers or {}\n self.cookies = cookies or []\n self.encoding = encoding\n self.is_head_response = is_head_response\n self.status_allows_body = not (\n self.status_code in {HTTP_204_NO_CONTENT, HTTP_304_NOT_MODIFIED} or self.status_code < HTTP_200_OK\n )\n self.body = self.render(content) if not self.is_head_response else b\"\"\n\n def set_cookie(\n self,\n key: str,\n value: Optional[str] = None,\n max_age: Optional[int] = None,\n expires: Optional[int] = None,\n path: str = \"/\",\n domain: Optional[str] = None,\n secure: bool = False,\n httponly: bool = False,\n samesite: 'Literal[\"lax\", \"strict\", \"none\"]' = \"lax\",\n ) -> None:\n \"\"\"Sets a cookie on the response.\n\n Args:\n key: Key for the cookie.\n value: Value for the cookie, if none given defaults to empty string.\n max_age: Maximal age of the cookie before its invalidated.\n expires: Expiration date as unix MS timestamp.\n path: Path fragment that must exist in the request url for the cookie to be valid. Defaults to '/'.\n domain: Domain for which the cookie is valid.\n secure: Https is required for the cookie.\n httponly: Forbids javascript to access the cookie via 'Document.cookie'.\n samesite: Controls whether a cookie is sent with cross-site requests. Defaults to 'lax'.\n\n Returns:\n None.\n \"\"\"\n self.cookies.append(\n Cookie(\n domain=domain,\n expires=expires,\n httponly=httponly,\n key=key,\n max_age=max_age,\n path=path,\n samesite=samesite,\n secure=secure,\n value=value,\n )\n )\n\n def set_header(self, key: str, value: str) -> None:\n \"\"\"Sets a header on the response.\n\n Args:\n key: Header key.\n value: Header value.\n\n Returns:\n None.\n \"\"\"\n self.headers[key] = value\n\n def set_etag(self, etag: Union[str, \"ETag\"]) -> None:\n \"\"\"Sets an etag header.\n\n Args:\n etag: An etag value.\n\n Returns:\n None\n \"\"\"\n self.headers[\"etag\"] = etag.to_header() if isinstance(etag, ETag) else etag\n\n def delete_cookie(\n self,\n key: str,\n path: str = \"/\",\n domain: Optional[str] = None,\n ) -> None:\n \"\"\"Deletes a cookie.\n\n Args:\n key: Key of the cookie.\n path: Path of the cookie.\n domain: Domain of the cookie.\n\n Returns:\n None.\n \"\"\"\n cookie = Cookie(key=key, path=path, domain=domain, expires=0, max_age=0)\n self.cookies = [c for c in self.cookies if c != cookie]\n self.cookies.append(cookie)\n\n @staticmethod\n def serializer(value: Any) -> Any:\n \"\"\"Serializer hook for orjson to handle pydantic models.\n\n Args:\n value: A value to serialize\n Returns:\n A serialized value\n Raises:\n TypeError: if value is not supported\n \"\"\"\n return default_serializer(value)\n\n def render(self, content: Any) -> bytes:\n \"\"\"\n Handles the rendering of content T into a bytes string.\n Args:\n content: A value for the response body that will be rendered into bytes string.\n\n Returns:\n An encoded bytes string\n \"\"\"\n if self.status_allows_body:\n if isinstance(content, bytes):\n return content\n if isinstance(content, str):\n return content.encode(self.encoding)\n if self.media_type == MediaType.JSON:\n try:\n return dumps(content, default=self.serializer, option=OPT_SERIALIZE_NUMPY | OPT_OMIT_MICROSECONDS)\n except (AttributeError, ValueError, TypeError) as e:\n raise ImproperlyConfiguredException(\"Unable to serialize response content\") from e\n if isinstance(content, OpenAPI):\n content_dict = content.dict(by_alias=True, exclude_none=True)\n if self.media_type == OpenAPIMediaType.OPENAPI_YAML:\n return cast(\"bytes\", dump_yaml(content_dict, default_flow_style=False).encode(\"utf-8\"))\n return dumps(content_dict, option=OPT_INDENT_2 | OPT_OMIT_MICROSECONDS)\n if content is None:\n return b\"\"\n raise ImproperlyConfiguredException(\n f\"unable to render response body for the given {content} with media_type {self.media_type}\"\n )\n if content is not None:\n raise ImproperlyConfiguredException(\n f\"status_code {self.status_code} does not support a response body value\"\n )\n return b\"\"\n\n @property\n def content_length(self) -> Optional[int]:\n \"\"\"\n\n Returns:\n The content length of the body (e.g. for use in a \"Content-Length\" header).\n If the response does not have a body, this value is `None`\n \"\"\"\n if self.status_allows_body:\n return len(self.body)\n return None\n\n @property\n def encoded_headers(self) -> List[Tuple[bytes, bytes]]:\n \"\"\"\n Notes:\n - A 'Content-Length' header will be added if appropriate and not provided by the user.\n\n Returns:\n A list of tuples containing the headers and cookies of the request in a format ready for ASGI transmission.\n \"\"\"\n\n if self.media_type.startswith(\"text/\"):\n content_type = f\"{self.media_type}; charset={self.encoding}\"\n else:\n content_type = self.media_type\n\n encoded_headers = [\n *((k.lower().encode(\"latin-1\"), str(v).lower().encode(\"latin-1\")) for k, v in self.headers.items()),\n *((b\"set-cookie\", cookie.to_header(header=\"\").encode(\"latin-1\")) for cookie in self.cookies),\n (b\"content-type\", content_type.encode(\"latin-1\")),\n ]\n\n if self.content_length and not any(key == b\"content-length\" for key, _ in encoded_headers):\n encoded_headers.append((b\"content-length\", str(self.content_length).encode(\"latin-1\")))\n return encoded_headers\n\n async def after_response(self) -> None:\n \"\"\"Executed after the response is sent.\n\n Returns:\n None\n \"\"\"\n if self.background is not None:\n await self.background()\n\n async def start_response(self, send: \"Send\") -> None:\n \"\"\"\n Emits the start event of the response. This event includes the headers and status codes.\n Args:\n send: The ASGI send function.\n\n Returns:\n None\n \"\"\"\n event: \"HTTPResponseStartEvent\" = {\n \"type\": \"http.response.start\",\n \"status\": self.status_code,\n \"headers\": self.encoded_headers,\n }\n\n await send(event)\n\n async def send_body(self, send: \"Send\", receive: \"Receive\") -> None: # pylint: disable=unused-argument\n \"\"\"Emits the response body.\n\n Args:\n send: The ASGI send function.\n receive: The ASGI receive function.\n\n Notes:\n - Response subclasses should customize this method if there is a need to customize sending data.\n\n Returns:\n None\n \"\"\"\n event: \"HTTPResponseBodyEvent\" = {\"type\": \"http.response.body\", \"body\": self.body, \"more_body\": False}\n await send(event)\n\n async def __call__(self, scope: \"Scope\", receive: \"Receive\", send: \"Send\") -> None:\n \"\"\"The call method of the response is an \"ASGIApp\".\n\n Args:\n scope: The ASGI connection scope.\n receive: The ASGI receive function.\n send: The ASGI send function.\n\n Returns:\n None\n \"\"\"\n await self.start_response(send=send)\n\n if self.is_head_response:\n event: \"HTTPResponseBodyEvent\" = {\"type\": \"http.response.body\", \"body\": b\"\", \"more_body\": False}\n await send(event)\n else:\n await self.send_body(send=send, receive=receive)\n\n await self.after_response()\n", "path": "starlite/response/base.py"}], "after_files": [{"content": "from typing import (\n TYPE_CHECKING,\n Any,\n Dict,\n Generic,\n List,\n Optional,\n Tuple,\n TypeVar,\n Union,\n cast,\n)\n\nfrom orjson import OPT_INDENT_2, OPT_OMIT_MICROSECONDS, OPT_SERIALIZE_NUMPY, dumps\nfrom pydantic_openapi_schema.v3_1_0 import OpenAPI\nfrom yaml import dump as dump_yaml\n\nfrom starlite.datastructures import Cookie, ETag\nfrom starlite.enums import MediaType, OpenAPIMediaType\nfrom starlite.exceptions import ImproperlyConfiguredException\nfrom starlite.status_codes import (\n HTTP_200_OK,\n HTTP_204_NO_CONTENT,\n HTTP_304_NOT_MODIFIED,\n)\nfrom starlite.utils.helpers import get_enum_string_value\nfrom starlite.utils.serialization import default_serializer\n\nif TYPE_CHECKING:\n from typing_extensions import Literal\n\n from starlite.datastructures import BackgroundTask, BackgroundTasks\n from starlite.types import (\n HTTPResponseBodyEvent,\n HTTPResponseStartEvent,\n Receive,\n ResponseCookies,\n Scope,\n Send,\n )\n\nT = TypeVar(\"T\")\n\n\nclass Response(Generic[T]):\n __slots__ = (\n \"status_code\",\n \"media_type\",\n \"background\",\n \"headers\",\n \"cookies\",\n \"encoding\",\n \"body\",\n \"status_allows_body\",\n \"is_head_response\",\n )\n\n def __init__(\n self,\n content: T,\n *,\n status_code: int = HTTP_200_OK,\n media_type: Union[MediaType, \"OpenAPIMediaType\", str] = MediaType.JSON,\n background: Optional[Union[\"BackgroundTask\", \"BackgroundTasks\"]] = None,\n headers: Optional[Dict[str, Any]] = None,\n cookies: Optional[\"ResponseCookies\"] = None,\n encoding: str = \"utf-8\",\n is_head_response: bool = False,\n ) -> None:\n \"\"\"This is the base Starlite HTTP response class, used as the basis for\n all other response classes.\n\n Args:\n content: A value for the response body that will be rendered into bytes string.\n status_code: An HTTP status code.\n media_type: A value for the response 'Content-Type' header.\n background: A [BackgroundTask][starlite.datastructures.BackgroundTask] instance or\n [BackgroundTasks][starlite.datastructures.BackgroundTasks] to execute after the response is finished.\n Defaults to None.\n headers: A string keyed dictionary of response headers. Header keys are insensitive.\n cookies: A list of [Cookie][starlite.datastructures.Cookie] instances to be set under the response 'Set-Cookie' header.\n encoding: The encoding to be used for the response headers.\n is_head_response: Whether the response should send only the headers (\"head\" request) or also the content.\n \"\"\"\n self.status_code = status_code\n self.media_type = get_enum_string_value(media_type)\n self.background = background\n self.headers = headers or {}\n self.cookies = cookies or []\n self.encoding = encoding\n self.is_head_response = is_head_response\n self.status_allows_body = not (\n self.status_code in {HTTP_204_NO_CONTENT, HTTP_304_NOT_MODIFIED} or self.status_code < HTTP_200_OK\n )\n self.body = self.render(content) if not self.is_head_response else b\"\"\n\n def set_cookie(\n self,\n key: str,\n value: Optional[str] = None,\n max_age: Optional[int] = None,\n expires: Optional[int] = None,\n path: str = \"/\",\n domain: Optional[str] = None,\n secure: bool = False,\n httponly: bool = False,\n samesite: 'Literal[\"lax\", \"strict\", \"none\"]' = \"lax\",\n ) -> None:\n \"\"\"Sets a cookie on the response.\n\n Args:\n key: Key for the cookie.\n value: Value for the cookie, if none given defaults to empty string.\n max_age: Maximal age of the cookie before its invalidated.\n expires: Expiration date as unix MS timestamp.\n path: Path fragment that must exist in the request url for the cookie to be valid. Defaults to '/'.\n domain: Domain for which the cookie is valid.\n secure: Https is required for the cookie.\n httponly: Forbids javascript to access the cookie via 'Document.cookie'.\n samesite: Controls whether a cookie is sent with cross-site requests. Defaults to 'lax'.\n\n Returns:\n None.\n \"\"\"\n self.cookies.append(\n Cookie(\n domain=domain,\n expires=expires,\n httponly=httponly,\n key=key,\n max_age=max_age,\n path=path,\n samesite=samesite,\n secure=secure,\n value=value,\n )\n )\n\n def set_header(self, key: str, value: str) -> None:\n \"\"\"Sets a header on the response.\n\n Args:\n key: Header key.\n value: Header value.\n\n Returns:\n None.\n \"\"\"\n self.headers[key] = value\n\n def set_etag(self, etag: Union[str, \"ETag\"]) -> None:\n \"\"\"Sets an etag header.\n\n Args:\n etag: An etag value.\n\n Returns:\n None\n \"\"\"\n self.headers[\"etag\"] = etag.to_header() if isinstance(etag, ETag) else etag\n\n def delete_cookie(\n self,\n key: str,\n path: str = \"/\",\n domain: Optional[str] = None,\n ) -> None:\n \"\"\"Deletes a cookie.\n\n Args:\n key: Key of the cookie.\n path: Path of the cookie.\n domain: Domain of the cookie.\n\n Returns:\n None.\n \"\"\"\n cookie = Cookie(key=key, path=path, domain=domain, expires=0, max_age=0)\n self.cookies = [c for c in self.cookies if c != cookie]\n self.cookies.append(cookie)\n\n @staticmethod\n def serializer(value: Any) -> Any:\n \"\"\"Serializer hook for orjson to handle pydantic models.\n\n Args:\n value: A value to serialize\n Returns:\n A serialized value\n Raises:\n TypeError: if value is not supported\n \"\"\"\n return default_serializer(value)\n\n def render(self, content: Any) -> bytes:\n \"\"\"\n Handles the rendering of content T into a bytes string.\n Args:\n content: A value for the response body that will be rendered into bytes string.\n\n Returns:\n An encoded bytes string\n \"\"\"\n if self.status_allows_body:\n if isinstance(content, bytes):\n return content\n if isinstance(content, str):\n return content.encode(self.encoding)\n if self.media_type == MediaType.JSON:\n try:\n return dumps(content, default=self.serializer, option=OPT_SERIALIZE_NUMPY | OPT_OMIT_MICROSECONDS)\n except (AttributeError, ValueError, TypeError) as e:\n raise ImproperlyConfiguredException(\"Unable to serialize response content\") from e\n if isinstance(content, OpenAPI):\n content_dict = content.dict(by_alias=True, exclude_none=True)\n if self.media_type == OpenAPIMediaType.OPENAPI_YAML:\n return cast(\"bytes\", dump_yaml(content_dict, default_flow_style=False).encode(\"utf-8\"))\n return dumps(content_dict, option=OPT_INDENT_2 | OPT_OMIT_MICROSECONDS)\n if content is None:\n return b\"\"\n raise ImproperlyConfiguredException(\n f\"unable to render response body for the given {content} with media_type {self.media_type}\"\n )\n if content is not None:\n raise ImproperlyConfiguredException(\n f\"status_code {self.status_code} does not support a response body value\"\n )\n return b\"\"\n\n @property\n def content_length(self) -> Optional[int]:\n \"\"\"\n\n Returns:\n The content length of the body (e.g. for use in a \"Content-Length\" header).\n If the response does not have a body, this value is `None`\n \"\"\"\n if self.status_allows_body:\n return len(self.body)\n return None\n\n @property\n def encoded_headers(self) -> List[Tuple[bytes, bytes]]:\n \"\"\"\n Notes:\n - A 'Content-Length' header will be added if appropriate and not provided by the user.\n\n Returns:\n A list of tuples containing the headers and cookies of the request in a format ready for ASGI transmission.\n \"\"\"\n\n if self.media_type.startswith(\"text/\"):\n content_type = f\"{self.media_type}; charset={self.encoding}\"\n else:\n content_type = self.media_type\n\n encoded_headers = [\n *((k.lower().encode(\"latin-1\"), str(v).encode(\"latin-1\")) for k, v in self.headers.items()),\n *((b\"set-cookie\", cookie.to_header(header=\"\").encode(\"latin-1\")) for cookie in self.cookies),\n (b\"content-type\", content_type.encode(\"latin-1\")),\n ]\n\n if self.content_length and not any(key == b\"content-length\" for key, _ in encoded_headers):\n encoded_headers.append((b\"content-length\", str(self.content_length).encode(\"latin-1\")))\n return encoded_headers\n\n async def after_response(self) -> None:\n \"\"\"Executed after the response is sent.\n\n Returns:\n None\n \"\"\"\n if self.background is not None:\n await self.background()\n\n async def start_response(self, send: \"Send\") -> None:\n \"\"\"\n Emits the start event of the response. This event includes the headers and status codes.\n Args:\n send: The ASGI send function.\n\n Returns:\n None\n \"\"\"\n event: \"HTTPResponseStartEvent\" = {\n \"type\": \"http.response.start\",\n \"status\": self.status_code,\n \"headers\": self.encoded_headers,\n }\n\n await send(event)\n\n async def send_body(self, send: \"Send\", receive: \"Receive\") -> None: # pylint: disable=unused-argument\n \"\"\"Emits the response body.\n\n Args:\n send: The ASGI send function.\n receive: The ASGI receive function.\n\n Notes:\n - Response subclasses should customize this method if there is a need to customize sending data.\n\n Returns:\n None\n \"\"\"\n event: \"HTTPResponseBodyEvent\" = {\"type\": \"http.response.body\", \"body\": self.body, \"more_body\": False}\n await send(event)\n\n async def __call__(self, scope: \"Scope\", receive: \"Receive\", send: \"Send\") -> None:\n \"\"\"The call method of the response is an \"ASGIApp\".\n\n Args:\n scope: The ASGI connection scope.\n receive: The ASGI receive function.\n send: The ASGI send function.\n\n Returns:\n None\n \"\"\"\n await self.start_response(send=send)\n\n if self.is_head_response:\n event: \"HTTPResponseBodyEvent\" = {\"type\": \"http.response.body\", \"body\": b\"\", \"more_body\": False}\n await send(event)\n else:\n await self.send_body(send=send, receive=receive)\n\n await self.after_response()\n", "path": "starlite/response/base.py"}]}
| 3,984 | 163 |
gh_patches_debug_24902
|
rasdani/github-patches
|
git_diff
|
pre-commit__pre-commit-239
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
pre-commit autoupdate fails on `local` hooks repos
```
$ pre-commit autoupdate
Updating git@github.com:pre-commit/pre-commit-hooks...updating 9ce45609a92f648c87b42207410386fd69a5d1e5 -> cf550fcab3f12015f8676b8278b30e1a5bc10e70.
Updating git@github.com:pre-commit/pre-commit...updating 4352d45451296934bc17494073b82bcacca3205c -> 1c46446427ab0dfa6293221426b855420533ef8d.
Updating git@github.com:asottile/reorder_python_imports...updating aeda21eb7df6af8c9f6cd990abb086375c71c953 -> 3d86483455ab5bd06cc1069fdd5ac57be5463f10.
Updating local...An unexpected error has occurred: AttributeError: 'NoneType' object has no attribute 'repo_path'
Check the log at ~/.pre-commit/pre-commit.log
(venv-pre_commit)asottile@work:~/workspace/pre-commit$ cat ~/.pre-commit/pre-commit.log
An unexpected error has occurred: AttributeError: 'NoneType' object has no attribute 'repo_path'
Traceback (most recent call last):
File "/home/asottile/workspace/pre-commit/pre_commit/error_handler.py", line 34, in error_handler
yield
File "/home/asottile/workspace/pre-commit/pre_commit/main.py", line 142, in main
return autoupdate(runner)
File "/home/asottile/workspace/pre-commit/pre_commit/commands/autoupdate.py", line 73, in autoupdate
new_repo_config = _update_repository(repo_config, runner)
File "/home/asottile/workspace/pre-commit/pre_commit/commands/autoupdate.py", line 33, in _update_repository
with cwd(repo.repo_path_getter.repo_path):
AttributeError: 'NoneType' object has no attribute 'repo_path'
(venv-pre_commit)asottile@work:~/workspace/pre-commit$ git diff
diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml
index 397ee72..20393a7 100644
--- a/.pre-commit-config.yaml
+++ b/.pre-commit-config.yaml
@@ -20,3 +20,10 @@
sha: aeda21eb7df6af8c9f6cd990abb086375c71c953
hooks:
- id: reorder-python-imports
+- repo: local
+ hooks:
+ - id: herp
+ name: Herp
+ entry: echo derp
+ language: system
+ files: ^$
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pre_commit/repository.py`
Content:
```
1 from __future__ import unicode_literals
2
3 import logging
4 import shutil
5
6 from cached_property import cached_property
7
8 from pre_commit import git
9 from pre_commit.clientlib.validate_config import is_local_hooks
10 from pre_commit.clientlib.validate_manifest import MANIFEST_JSON_SCHEMA
11 from pre_commit.jsonschema_extensions import apply_defaults
12 from pre_commit.languages.all import languages
13 from pre_commit.languages.helpers import environment_dir
14 from pre_commit.manifest import Manifest
15 from pre_commit.prefixed_command_runner import PrefixedCommandRunner
16
17
18 logger = logging.getLogger('pre_commit')
19
20
21 class Repository(object):
22 def __init__(self, repo_config, repo_path_getter):
23 self.repo_config = repo_config
24 self.repo_path_getter = repo_path_getter
25 self.__installed = False
26
27 @classmethod
28 def create(cls, config, store):
29 if is_local_hooks(config):
30 return LocalRepository(config)
31 else:
32 repo_path_getter = store.get_repo_path_getter(
33 config['repo'], config['sha']
34 )
35 return cls(config, repo_path_getter)
36
37 @cached_property
38 def repo_url(self):
39 return self.repo_config['repo']
40
41 @cached_property
42 def sha(self):
43 return self.repo_config['sha']
44
45 @cached_property
46 def languages(self):
47 return set(
48 (hook['language'], hook['language_version'])
49 for _, hook in self.hooks
50 )
51
52 @cached_property
53 def hooks(self):
54 # TODO: merging in manifest dicts is a smell imo
55 return tuple(
56 (hook['id'], dict(self.manifest.hooks[hook['id']], **hook))
57 for hook in self.repo_config['hooks']
58 )
59
60 @cached_property
61 def manifest(self):
62 return Manifest(self.repo_path_getter)
63
64 @cached_property
65 def cmd_runner(self):
66 return PrefixedCommandRunner(self.repo_path_getter.repo_path)
67
68 def require_installed(self):
69 if self.__installed:
70 return
71
72 self.install()
73 self.__installed = True
74
75 def install(self):
76 """Install the hook repository."""
77 def language_is_installed(language_name, language_version):
78 language = languages[language_name]
79 directory = environment_dir(
80 language.ENVIRONMENT_DIR, language_version,
81 )
82 return (
83 directory is None or
84 self.cmd_runner.exists(directory, '.installed')
85 )
86
87 if not all(
88 language_is_installed(language_name, language_version)
89 for language_name, language_version in self.languages
90 ):
91 logger.info(
92 'Installing environment for {0}.'.format(self.repo_url)
93 )
94 logger.info('Once installed this environment will be reused.')
95 logger.info('This may take a few minutes...')
96
97 for language_name, language_version in self.languages:
98 language = languages[language_name]
99 if language_is_installed(language_name, language_version):
100 continue
101
102 directory = environment_dir(
103 language.ENVIRONMENT_DIR, language_version,
104 )
105 # There's potentially incomplete cleanup from previous runs
106 # Clean it up!
107 if self.cmd_runner.exists(directory):
108 shutil.rmtree(self.cmd_runner.path(directory))
109
110 language.install_environment(self.cmd_runner, language_version)
111 # Touch the .installed file (atomic) to indicate we've installed
112 open(self.cmd_runner.path(directory, '.installed'), 'w').close()
113
114 def run_hook(self, hook, file_args):
115 """Run a hook.
116
117 Args:
118 hook - Hook dictionary
119 file_args - List of files to run
120 """
121 self.require_installed()
122 return languages[hook['language']].run_hook(
123 self.cmd_runner, hook, file_args,
124 )
125
126
127 class LocalRepository(Repository):
128 def __init__(self, repo_config, repo_path_getter=None):
129 repo_path_getter = None
130 super(LocalRepository, self).__init__(repo_config, repo_path_getter)
131
132 @cached_property
133 def hooks(self):
134 return tuple(
135 (hook['id'], apply_defaults(hook, MANIFEST_JSON_SCHEMA['items']))
136 for hook in self.repo_config['hooks']
137 )
138
139 @cached_property
140 def cmd_runner(self):
141 return PrefixedCommandRunner(git.get_root())
142
143 @cached_property
144 def sha(self):
145 raise NotImplementedError
146
147 @cached_property
148 def manifest(self):
149 raise NotImplementedError
150
```
Path: `pre_commit/commands/autoupdate.py`
Content:
```
1 from __future__ import print_function
2 from __future__ import unicode_literals
3
4 import sys
5
6 from aspy.yaml import ordered_dump
7 from aspy.yaml import ordered_load
8
9 import pre_commit.constants as C
10 from pre_commit.clientlib.validate_config import CONFIG_JSON_SCHEMA
11 from pre_commit.clientlib.validate_config import load_config
12 from pre_commit.jsonschema_extensions import remove_defaults
13 from pre_commit.ordereddict import OrderedDict
14 from pre_commit.repository import Repository
15 from pre_commit.util import cmd_output
16 from pre_commit.util import cwd
17
18
19 class RepositoryCannotBeUpdatedError(RuntimeError):
20 pass
21
22
23 def _update_repository(repo_config, runner):
24 """Updates a repository to the tip of `master`. If the repository cannot
25 be updated because a hook that is configured does not exist in `master`,
26 this raises a RepositoryCannotBeUpdatedError
27
28 Args:
29 repo_config - A config for a repository
30 """
31 repo = Repository.create(repo_config, runner.store)
32
33 with cwd(repo.repo_path_getter.repo_path):
34 cmd_output('git', 'fetch')
35 head_sha = cmd_output('git', 'rev-parse', 'origin/master')[1].strip()
36
37 # Don't bother trying to update if our sha is the same
38 if head_sha == repo_config['sha']:
39 return repo_config
40
41 # Construct a new config with the head sha
42 new_config = OrderedDict(repo_config)
43 new_config['sha'] = head_sha
44 new_repo = Repository.create(new_config, runner.store)
45
46 # See if any of our hooks were deleted with the new commits
47 hooks = set(hook_id for hook_id, _ in repo.hooks)
48 hooks_missing = hooks - (hooks & set(new_repo.manifest.hooks.keys()))
49 if hooks_missing:
50 raise RepositoryCannotBeUpdatedError(
51 'Cannot update because the tip of master is missing these hooks:\n'
52 '{0}'.format(', '.join(sorted(hooks_missing)))
53 )
54
55 return new_config
56
57
58 def autoupdate(runner):
59 """Auto-update the pre-commit config to the latest versions of repos."""
60 retv = 0
61 output_configs = []
62 changed = False
63
64 input_configs = load_config(
65 runner.config_file_path,
66 load_strategy=ordered_load,
67 )
68
69 for repo_config in input_configs:
70 sys.stdout.write('Updating {0}...'.format(repo_config['repo']))
71 sys.stdout.flush()
72 try:
73 new_repo_config = _update_repository(repo_config, runner)
74 except RepositoryCannotBeUpdatedError as error:
75 print(error.args[0])
76 output_configs.append(repo_config)
77 retv = 1
78 continue
79
80 if new_repo_config['sha'] != repo_config['sha']:
81 changed = True
82 print(
83 'updating {0} -> {1}.'.format(
84 repo_config['sha'], new_repo_config['sha'],
85 )
86 )
87 output_configs.append(new_repo_config)
88 else:
89 print('already up to date.')
90 output_configs.append(repo_config)
91
92 if changed:
93 with open(runner.config_file_path, 'w') as config_file:
94 config_file.write(
95 ordered_dump(
96 remove_defaults(output_configs, CONFIG_JSON_SCHEMA),
97 **C.YAML_DUMP_KWARGS
98 )
99 )
100
101 return retv
102
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/pre_commit/commands/autoupdate.py b/pre_commit/commands/autoupdate.py
--- a/pre_commit/commands/autoupdate.py
+++ b/pre_commit/commands/autoupdate.py
@@ -8,6 +8,7 @@
import pre_commit.constants as C
from pre_commit.clientlib.validate_config import CONFIG_JSON_SCHEMA
+from pre_commit.clientlib.validate_config import is_local_hooks
from pre_commit.clientlib.validate_config import load_config
from pre_commit.jsonschema_extensions import remove_defaults
from pre_commit.ordereddict import OrderedDict
@@ -67,6 +68,8 @@
)
for repo_config in input_configs:
+ if is_local_hooks(repo_config):
+ continue
sys.stdout.write('Updating {0}...'.format(repo_config['repo']))
sys.stdout.flush()
try:
diff --git a/pre_commit/repository.py b/pre_commit/repository.py
--- a/pre_commit/repository.py
+++ b/pre_commit/repository.py
@@ -125,9 +125,8 @@
class LocalRepository(Repository):
- def __init__(self, repo_config, repo_path_getter=None):
- repo_path_getter = None
- super(LocalRepository, self).__init__(repo_config, repo_path_getter)
+ def __init__(self, repo_config):
+ super(LocalRepository, self).__init__(repo_config, None)
@cached_property
def hooks(self):
|
{"golden_diff": "diff --git a/pre_commit/commands/autoupdate.py b/pre_commit/commands/autoupdate.py\n--- a/pre_commit/commands/autoupdate.py\n+++ b/pre_commit/commands/autoupdate.py\n@@ -8,6 +8,7 @@\n \n import pre_commit.constants as C\n from pre_commit.clientlib.validate_config import CONFIG_JSON_SCHEMA\n+from pre_commit.clientlib.validate_config import is_local_hooks\n from pre_commit.clientlib.validate_config import load_config\n from pre_commit.jsonschema_extensions import remove_defaults\n from pre_commit.ordereddict import OrderedDict\n@@ -67,6 +68,8 @@\n )\n \n for repo_config in input_configs:\n+ if is_local_hooks(repo_config):\n+ continue\n sys.stdout.write('Updating {0}...'.format(repo_config['repo']))\n sys.stdout.flush()\n try:\ndiff --git a/pre_commit/repository.py b/pre_commit/repository.py\n--- a/pre_commit/repository.py\n+++ b/pre_commit/repository.py\n@@ -125,9 +125,8 @@\n \n \n class LocalRepository(Repository):\n- def __init__(self, repo_config, repo_path_getter=None):\n- repo_path_getter = None\n- super(LocalRepository, self).__init__(repo_config, repo_path_getter)\n+ def __init__(self, repo_config):\n+ super(LocalRepository, self).__init__(repo_config, None)\n \n @cached_property\n def hooks(self):\n", "issue": "pre-commit autoupdate fails on `local` hooks repos\n```\n$ pre-commit autoupdate\nUpdating git@github.com:pre-commit/pre-commit-hooks...updating 9ce45609a92f648c87b42207410386fd69a5d1e5 -> cf550fcab3f12015f8676b8278b30e1a5bc10e70.\nUpdating git@github.com:pre-commit/pre-commit...updating 4352d45451296934bc17494073b82bcacca3205c -> 1c46446427ab0dfa6293221426b855420533ef8d.\nUpdating git@github.com:asottile/reorder_python_imports...updating aeda21eb7df6af8c9f6cd990abb086375c71c953 -> 3d86483455ab5bd06cc1069fdd5ac57be5463f10.\nUpdating local...An unexpected error has occurred: AttributeError: 'NoneType' object has no attribute 'repo_path'\nCheck the log at ~/.pre-commit/pre-commit.log\n(venv-pre_commit)asottile@work:~/workspace/pre-commit$ cat ~/.pre-commit/pre-commit.log \nAn unexpected error has occurred: AttributeError: 'NoneType' object has no attribute 'repo_path'\nTraceback (most recent call last):\n File \"/home/asottile/workspace/pre-commit/pre_commit/error_handler.py\", line 34, in error_handler\n yield\n File \"/home/asottile/workspace/pre-commit/pre_commit/main.py\", line 142, in main\n return autoupdate(runner)\n File \"/home/asottile/workspace/pre-commit/pre_commit/commands/autoupdate.py\", line 73, in autoupdate\n new_repo_config = _update_repository(repo_config, runner)\n File \"/home/asottile/workspace/pre-commit/pre_commit/commands/autoupdate.py\", line 33, in _update_repository\n with cwd(repo.repo_path_getter.repo_path):\nAttributeError: 'NoneType' object has no attribute 'repo_path'\n\n(venv-pre_commit)asottile@work:~/workspace/pre-commit$ git diff\ndiff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml\nindex 397ee72..20393a7 100644\n--- a/.pre-commit-config.yaml\n+++ b/.pre-commit-config.yaml\n@@ -20,3 +20,10 @@\n sha: aeda21eb7df6af8c9f6cd990abb086375c71c953\n hooks:\n - id: reorder-python-imports\n+- repo: local\n+ hooks:\n+ - id: herp\n+ name: Herp\n+ entry: echo derp\n+ language: system\n+ files: ^$\n```\n\n", "before_files": [{"content": "from __future__ import unicode_literals\n\nimport logging\nimport shutil\n\nfrom cached_property import cached_property\n\nfrom pre_commit import git\nfrom pre_commit.clientlib.validate_config import is_local_hooks\nfrom pre_commit.clientlib.validate_manifest import MANIFEST_JSON_SCHEMA\nfrom pre_commit.jsonschema_extensions import apply_defaults\nfrom pre_commit.languages.all import languages\nfrom pre_commit.languages.helpers import environment_dir\nfrom pre_commit.manifest import Manifest\nfrom pre_commit.prefixed_command_runner import PrefixedCommandRunner\n\n\nlogger = logging.getLogger('pre_commit')\n\n\nclass Repository(object):\n def __init__(self, repo_config, repo_path_getter):\n self.repo_config = repo_config\n self.repo_path_getter = repo_path_getter\n self.__installed = False\n\n @classmethod\n def create(cls, config, store):\n if is_local_hooks(config):\n return LocalRepository(config)\n else:\n repo_path_getter = store.get_repo_path_getter(\n config['repo'], config['sha']\n )\n return cls(config, repo_path_getter)\n\n @cached_property\n def repo_url(self):\n return self.repo_config['repo']\n\n @cached_property\n def sha(self):\n return self.repo_config['sha']\n\n @cached_property\n def languages(self):\n return set(\n (hook['language'], hook['language_version'])\n for _, hook in self.hooks\n )\n\n @cached_property\n def hooks(self):\n # TODO: merging in manifest dicts is a smell imo\n return tuple(\n (hook['id'], dict(self.manifest.hooks[hook['id']], **hook))\n for hook in self.repo_config['hooks']\n )\n\n @cached_property\n def manifest(self):\n return Manifest(self.repo_path_getter)\n\n @cached_property\n def cmd_runner(self):\n return PrefixedCommandRunner(self.repo_path_getter.repo_path)\n\n def require_installed(self):\n if self.__installed:\n return\n\n self.install()\n self.__installed = True\n\n def install(self):\n \"\"\"Install the hook repository.\"\"\"\n def language_is_installed(language_name, language_version):\n language = languages[language_name]\n directory = environment_dir(\n language.ENVIRONMENT_DIR, language_version,\n )\n return (\n directory is None or\n self.cmd_runner.exists(directory, '.installed')\n )\n\n if not all(\n language_is_installed(language_name, language_version)\n for language_name, language_version in self.languages\n ):\n logger.info(\n 'Installing environment for {0}.'.format(self.repo_url)\n )\n logger.info('Once installed this environment will be reused.')\n logger.info('This may take a few minutes...')\n\n for language_name, language_version in self.languages:\n language = languages[language_name]\n if language_is_installed(language_name, language_version):\n continue\n\n directory = environment_dir(\n language.ENVIRONMENT_DIR, language_version,\n )\n # There's potentially incomplete cleanup from previous runs\n # Clean it up!\n if self.cmd_runner.exists(directory):\n shutil.rmtree(self.cmd_runner.path(directory))\n\n language.install_environment(self.cmd_runner, language_version)\n # Touch the .installed file (atomic) to indicate we've installed\n open(self.cmd_runner.path(directory, '.installed'), 'w').close()\n\n def run_hook(self, hook, file_args):\n \"\"\"Run a hook.\n\n Args:\n hook - Hook dictionary\n file_args - List of files to run\n \"\"\"\n self.require_installed()\n return languages[hook['language']].run_hook(\n self.cmd_runner, hook, file_args,\n )\n\n\nclass LocalRepository(Repository):\n def __init__(self, repo_config, repo_path_getter=None):\n repo_path_getter = None\n super(LocalRepository, self).__init__(repo_config, repo_path_getter)\n\n @cached_property\n def hooks(self):\n return tuple(\n (hook['id'], apply_defaults(hook, MANIFEST_JSON_SCHEMA['items']))\n for hook in self.repo_config['hooks']\n )\n\n @cached_property\n def cmd_runner(self):\n return PrefixedCommandRunner(git.get_root())\n\n @cached_property\n def sha(self):\n raise NotImplementedError\n\n @cached_property\n def manifest(self):\n raise NotImplementedError\n", "path": "pre_commit/repository.py"}, {"content": "from __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport sys\n\nfrom aspy.yaml import ordered_dump\nfrom aspy.yaml import ordered_load\n\nimport pre_commit.constants as C\nfrom pre_commit.clientlib.validate_config import CONFIG_JSON_SCHEMA\nfrom pre_commit.clientlib.validate_config import load_config\nfrom pre_commit.jsonschema_extensions import remove_defaults\nfrom pre_commit.ordereddict import OrderedDict\nfrom pre_commit.repository import Repository\nfrom pre_commit.util import cmd_output\nfrom pre_commit.util import cwd\n\n\nclass RepositoryCannotBeUpdatedError(RuntimeError):\n pass\n\n\ndef _update_repository(repo_config, runner):\n \"\"\"Updates a repository to the tip of `master`. If the repository cannot\n be updated because a hook that is configured does not exist in `master`,\n this raises a RepositoryCannotBeUpdatedError\n\n Args:\n repo_config - A config for a repository\n \"\"\"\n repo = Repository.create(repo_config, runner.store)\n\n with cwd(repo.repo_path_getter.repo_path):\n cmd_output('git', 'fetch')\n head_sha = cmd_output('git', 'rev-parse', 'origin/master')[1].strip()\n\n # Don't bother trying to update if our sha is the same\n if head_sha == repo_config['sha']:\n return repo_config\n\n # Construct a new config with the head sha\n new_config = OrderedDict(repo_config)\n new_config['sha'] = head_sha\n new_repo = Repository.create(new_config, runner.store)\n\n # See if any of our hooks were deleted with the new commits\n hooks = set(hook_id for hook_id, _ in repo.hooks)\n hooks_missing = hooks - (hooks & set(new_repo.manifest.hooks.keys()))\n if hooks_missing:\n raise RepositoryCannotBeUpdatedError(\n 'Cannot update because the tip of master is missing these hooks:\\n'\n '{0}'.format(', '.join(sorted(hooks_missing)))\n )\n\n return new_config\n\n\ndef autoupdate(runner):\n \"\"\"Auto-update the pre-commit config to the latest versions of repos.\"\"\"\n retv = 0\n output_configs = []\n changed = False\n\n input_configs = load_config(\n runner.config_file_path,\n load_strategy=ordered_load,\n )\n\n for repo_config in input_configs:\n sys.stdout.write('Updating {0}...'.format(repo_config['repo']))\n sys.stdout.flush()\n try:\n new_repo_config = _update_repository(repo_config, runner)\n except RepositoryCannotBeUpdatedError as error:\n print(error.args[0])\n output_configs.append(repo_config)\n retv = 1\n continue\n\n if new_repo_config['sha'] != repo_config['sha']:\n changed = True\n print(\n 'updating {0} -> {1}.'.format(\n repo_config['sha'], new_repo_config['sha'],\n )\n )\n output_configs.append(new_repo_config)\n else:\n print('already up to date.')\n output_configs.append(repo_config)\n\n if changed:\n with open(runner.config_file_path, 'w') as config_file:\n config_file.write(\n ordered_dump(\n remove_defaults(output_configs, CONFIG_JSON_SCHEMA),\n **C.YAML_DUMP_KWARGS\n )\n )\n\n return retv\n", "path": "pre_commit/commands/autoupdate.py"}], "after_files": [{"content": "from __future__ import unicode_literals\n\nimport logging\nimport shutil\n\nfrom cached_property import cached_property\n\nfrom pre_commit import git\nfrom pre_commit.clientlib.validate_config import is_local_hooks\nfrom pre_commit.clientlib.validate_manifest import MANIFEST_JSON_SCHEMA\nfrom pre_commit.jsonschema_extensions import apply_defaults\nfrom pre_commit.languages.all import languages\nfrom pre_commit.languages.helpers import environment_dir\nfrom pre_commit.manifest import Manifest\nfrom pre_commit.prefixed_command_runner import PrefixedCommandRunner\n\n\nlogger = logging.getLogger('pre_commit')\n\n\nclass Repository(object):\n def __init__(self, repo_config, repo_path_getter):\n self.repo_config = repo_config\n self.repo_path_getter = repo_path_getter\n self.__installed = False\n\n @classmethod\n def create(cls, config, store):\n if is_local_hooks(config):\n return LocalRepository(config)\n else:\n repo_path_getter = store.get_repo_path_getter(\n config['repo'], config['sha']\n )\n return cls(config, repo_path_getter)\n\n @cached_property\n def repo_url(self):\n return self.repo_config['repo']\n\n @cached_property\n def sha(self):\n return self.repo_config['sha']\n\n @cached_property\n def languages(self):\n return set(\n (hook['language'], hook['language_version'])\n for _, hook in self.hooks\n )\n\n @cached_property\n def hooks(self):\n # TODO: merging in manifest dicts is a smell imo\n return tuple(\n (hook['id'], dict(self.manifest.hooks[hook['id']], **hook))\n for hook in self.repo_config['hooks']\n )\n\n @cached_property\n def manifest(self):\n return Manifest(self.repo_path_getter)\n\n @cached_property\n def cmd_runner(self):\n return PrefixedCommandRunner(self.repo_path_getter.repo_path)\n\n def require_installed(self):\n if self.__installed:\n return\n\n self.install()\n self.__installed = True\n\n def install(self):\n \"\"\"Install the hook repository.\"\"\"\n def language_is_installed(language_name, language_version):\n language = languages[language_name]\n directory = environment_dir(\n language.ENVIRONMENT_DIR, language_version,\n )\n return (\n directory is None or\n self.cmd_runner.exists(directory, '.installed')\n )\n\n if not all(\n language_is_installed(language_name, language_version)\n for language_name, language_version in self.languages\n ):\n logger.info(\n 'Installing environment for {0}.'.format(self.repo_url)\n )\n logger.info('Once installed this environment will be reused.')\n logger.info('This may take a few minutes...')\n\n for language_name, language_version in self.languages:\n language = languages[language_name]\n if language_is_installed(language_name, language_version):\n continue\n\n directory = environment_dir(\n language.ENVIRONMENT_DIR, language_version,\n )\n # There's potentially incomplete cleanup from previous runs\n # Clean it up!\n if self.cmd_runner.exists(directory):\n shutil.rmtree(self.cmd_runner.path(directory))\n\n language.install_environment(self.cmd_runner, language_version)\n # Touch the .installed file (atomic) to indicate we've installed\n open(self.cmd_runner.path(directory, '.installed'), 'w').close()\n\n def run_hook(self, hook, file_args):\n \"\"\"Run a hook.\n\n Args:\n hook - Hook dictionary\n file_args - List of files to run\n \"\"\"\n self.require_installed()\n return languages[hook['language']].run_hook(\n self.cmd_runner, hook, file_args,\n )\n\n\nclass LocalRepository(Repository):\n def __init__(self, repo_config):\n super(LocalRepository, self).__init__(repo_config, None)\n\n @cached_property\n def hooks(self):\n return tuple(\n (hook['id'], apply_defaults(hook, MANIFEST_JSON_SCHEMA['items']))\n for hook in self.repo_config['hooks']\n )\n\n @cached_property\n def cmd_runner(self):\n return PrefixedCommandRunner(git.get_root())\n\n @cached_property\n def sha(self):\n raise NotImplementedError\n\n @cached_property\n def manifest(self):\n raise NotImplementedError\n", "path": "pre_commit/repository.py"}, {"content": "from __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport sys\n\nfrom aspy.yaml import ordered_dump\nfrom aspy.yaml import ordered_load\n\nimport pre_commit.constants as C\nfrom pre_commit.clientlib.validate_config import CONFIG_JSON_SCHEMA\nfrom pre_commit.clientlib.validate_config import is_local_hooks\nfrom pre_commit.clientlib.validate_config import load_config\nfrom pre_commit.jsonschema_extensions import remove_defaults\nfrom pre_commit.ordereddict import OrderedDict\nfrom pre_commit.repository import Repository\nfrom pre_commit.util import cmd_output\nfrom pre_commit.util import cwd\n\n\nclass RepositoryCannotBeUpdatedError(RuntimeError):\n pass\n\n\ndef _update_repository(repo_config, runner):\n \"\"\"Updates a repository to the tip of `master`. If the repository cannot\n be updated because a hook that is configured does not exist in `master`,\n this raises a RepositoryCannotBeUpdatedError\n\n Args:\n repo_config - A config for a repository\n \"\"\"\n repo = Repository.create(repo_config, runner.store)\n\n with cwd(repo.repo_path_getter.repo_path):\n cmd_output('git', 'fetch')\n head_sha = cmd_output('git', 'rev-parse', 'origin/master')[1].strip()\n\n # Don't bother trying to update if our sha is the same\n if head_sha == repo_config['sha']:\n return repo_config\n\n # Construct a new config with the head sha\n new_config = OrderedDict(repo_config)\n new_config['sha'] = head_sha\n new_repo = Repository.create(new_config, runner.store)\n\n # See if any of our hooks were deleted with the new commits\n hooks = set(hook_id for hook_id, _ in repo.hooks)\n hooks_missing = hooks - (hooks & set(new_repo.manifest.hooks.keys()))\n if hooks_missing:\n raise RepositoryCannotBeUpdatedError(\n 'Cannot update because the tip of master is missing these hooks:\\n'\n '{0}'.format(', '.join(sorted(hooks_missing)))\n )\n\n return new_config\n\n\ndef autoupdate(runner):\n \"\"\"Auto-update the pre-commit config to the latest versions of repos.\"\"\"\n retv = 0\n output_configs = []\n changed = False\n\n input_configs = load_config(\n runner.config_file_path,\n load_strategy=ordered_load,\n )\n\n for repo_config in input_configs:\n if is_local_hooks(repo_config):\n continue\n sys.stdout.write('Updating {0}...'.format(repo_config['repo']))\n sys.stdout.flush()\n try:\n new_repo_config = _update_repository(repo_config, runner)\n except RepositoryCannotBeUpdatedError as error:\n print(error.args[0])\n output_configs.append(repo_config)\n retv = 1\n continue\n\n if new_repo_config['sha'] != repo_config['sha']:\n changed = True\n print(\n 'updating {0} -> {1}.'.format(\n repo_config['sha'], new_repo_config['sha'],\n )\n )\n output_configs.append(new_repo_config)\n else:\n print('already up to date.')\n output_configs.append(repo_config)\n\n if changed:\n with open(runner.config_file_path, 'w') as config_file:\n config_file.write(\n ordered_dump(\n remove_defaults(output_configs, CONFIG_JSON_SCHEMA),\n **C.YAML_DUMP_KWARGS\n )\n )\n\n return retv\n", "path": "pre_commit/commands/autoupdate.py"}]}
| 3,163 | 312 |
gh_patches_debug_9806
|
rasdani/github-patches
|
git_diff
|
pyro-ppl__pyro-2776
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Upgrade to PyTorch 1.8
This documents our upgrade plan to PyTorch 1.8, which breaks `pyro.distributions` as in Pyro 1.5.1 release. PyTorch 1.8.0 release is scheduled for sometime in March 2021.
## Plan
- [x] #2755 Release Pyro 1.5.2 which constrains to `torch<=1.7.2`.
- [x] #2757 Upgrade dev branch to PyTorch 1.7.0, fixing tests.
- [x] #2753 Pin Pyro dev branch to PyTorch nightly build, and fix many breaking changes.
- [x] #2739 Implement `Distribution.infer_shapes()`.
- ...
- [ ] After PyTorch 1.8 release, pin Pyro dev back to PyTorch release and release Pyro 1.6.
cc @neerajprad
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `docs/source/conf.py`
Content:
```
1 # Copyright (c) 2017-2019 Uber Technologies, Inc.
2 # SPDX-License-Identifier: Apache-2.0
3
4 import os
5 import sys
6
7 import sphinx_rtd_theme
8
9 # import pkg_resources
10
11 # -*- coding: utf-8 -*-
12 #
13 # Pyro documentation build configuration file, created by
14 # sphinx-quickstart on Thu Jun 15 17:16:14 2017.
15 #
16 # This file is execfile()d with the current directory set to its
17 # containing dir.
18 #
19 # Note that not all possible configuration values are present in this
20 # autogenerated file.
21 #
22 # All configuration values have a default; values that are commented out
23 # serve to show the default.
24
25 # If extensions (or modules to document with autodoc) are in another directory,
26 # add these directories to sys.path here. If the directory is relative to the
27 # documentation root, use os.path.abspath to make it absolute, like shown here.
28 #
29 sys.path.insert(0, os.path.abspath('../..'))
30
31 # -- General configuration ------------------------------------------------
32
33 # If your documentation needs a minimal Sphinx version, state it here.
34 #
35 # needs_sphinx = '1.0'
36
37 # Add any Sphinx extension module names here, as strings. They can be
38 # extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
39 # ones.
40 extensions = [
41 'sphinx.ext.intersphinx', #
42 'sphinx.ext.todo', #
43 'sphinx.ext.mathjax', #
44 'sphinx.ext.ifconfig', #
45 'sphinx.ext.viewcode', #
46 'sphinx.ext.githubpages', #
47 'sphinx.ext.graphviz', #
48 'sphinx.ext.autodoc',
49 'sphinx.ext.doctest',
50 ]
51
52 # Disable documentation inheritance so as to avoid inheriting
53 # docstrings in a different format, e.g. when the parent class
54 # is a PyTorch class.
55
56 autodoc_inherit_docstrings = False
57
58 # Add any paths that contain templates here, relative to this directory.
59 templates_path = ['_templates']
60
61 # The suffix(es) of source filenames.
62 # You can specify multiple suffix as a list of string:
63 #
64 # source_suffix = ['.rst', '.md']
65 source_suffix = '.rst'
66
67 # The master toctree document.
68 master_doc = 'index'
69
70 # General information about the project.
71 project = u'Pyro'
72 copyright = u'2017-2018, Uber Technologies, Inc'
73 author = u'Uber AI Labs'
74
75 # The version info for the project you're documenting, acts as replacement for
76 # |version| and |release|, also used in various other places throughout the
77 # built documents.
78
79 version = ''
80
81 if 'READTHEDOCS' not in os.environ:
82 # if developing locally, use pyro.__version__ as version
83 from pyro import __version__ # noqaE402
84 version = __version__
85
86 # release version
87 release = version
88
89 # The language for content autogenerated by Sphinx. Refer to documentation
90 # for a list of supported languages.
91 #
92 # This is also used if you do content translation via gettext catalogs.
93 # Usually you set "language" from the command line for these cases.
94 language = None
95
96 # List of patterns, relative to source directory, that match files and
97 # directories to ignore when looking for source files.
98 # This patterns also effect to html_static_path and html_extra_path
99 exclude_patterns = []
100
101 # The name of the Pygments (syntax highlighting) style to use.
102 pygments_style = 'sphinx'
103
104 # If true, `todo` and `todoList` produce output, else they produce nothing.
105 todo_include_todos = True
106
107 # do not prepend module name to functions
108 add_module_names = False
109
110 # -- Options for HTML output ----------------------------------------------
111
112 # logo
113 html_logo = '_static/img/pyro_logo_wide.png'
114
115 # logo
116 html_favicon = '_static/img/favicon/favicon.ico'
117
118 # The theme to use for HTML and HTML Help pages. See the documentation for
119 # a list of builtin themes.
120 #
121 html_theme = "sphinx_rtd_theme"
122 html_theme_path = [sphinx_rtd_theme.get_html_theme_path()]
123
124 # Theme options are theme-specific and customize the look and feel of a theme
125 # further. For a list of options available for each theme, see the
126 # documentation.
127
128 html_theme_options = {
129 'navigation_depth': 3,
130 'logo_only': True,
131 }
132
133 # Add any paths that contain custom static files (such as style sheets) here,
134 # relative to this directory. They are copied after the builtin static files,
135 # so a file named "default.css" will overwrite the builtin "default.css".
136 html_static_path = ['_static']
137 html_style = 'css/pyro.css'
138
139 # -- Options for HTMLHelp output ------------------------------------------
140
141 # Output file base name for HTML help builder.
142 htmlhelp_basename = 'Pyrodoc'
143
144 # -- Options for LaTeX output ---------------------------------------------
145
146 latex_elements = {
147 # The paper size ('letterpaper' or 'a4paper').
148 #
149 # 'papersize': 'letterpaper',
150
151 # The font size ('10pt', '11pt' or '12pt').
152 #
153 # 'pointsize': '10pt',
154
155 # Additional stuff for the LaTeX preamble.
156 #
157 # 'preamble': '',
158
159 # Latex figure (float) alignment
160 #
161 # 'figure_align': 'htbp',
162 }
163
164 # Grouping the document tree into LaTeX files. List of tuples
165 # (source start file, target name, title,
166 # author, documentclass [howto, manual, or own class]).
167 latex_documents = [
168 (master_doc, 'Pyro.tex', u'Pyro Documentation', u'Uber AI Labs', 'manual'),
169 ]
170
171 # -- Options for manual page output ---------------------------------------
172
173 # One entry per manual page. List of tuples
174 # (source start file, name, description, authors, manual section).
175 man_pages = [(master_doc, 'pyro', u'Pyro Documentation', [author], 1)]
176
177 # -- Options for Texinfo output -------------------------------------------
178
179 # Grouping the document tree into Texinfo files. List of tuples
180 # (source start file, target name, title, author,
181 # dir menu entry, description, category)
182 texinfo_documents = [
183 (master_doc, 'Pyro', u'Pyro Documentation', author, 'Pyro',
184 'Deep Universal Probabilistic Programming.', 'Miscellaneous'),
185 ]
186
187 # Example configuration for intersphinx: refer to the Python standard library.
188 intersphinx_mapping = {
189 'python': ('https://docs.python.org/3/', None),
190 'torch': ('https://pytorch.org/docs/master/', None),
191 'funsor': ('http://funsor.pyro.ai/en/stable/', None),
192 'opt_einsum': ('https://optimized-einsum.readthedocs.io/en/stable/', None),
193 'scipy': ('https://docs.scipy.org/doc/scipy/reference/', None),
194 'Bio': ('https://biopython.readthedocs.io/en/latest/', None),
195 'horovod': ('https://horovod.readthedocs.io/en/stable/', None),
196 }
197
198 # document class constructors (__init__ methods):
199 """ comment out this functionality for now;
200 def skip(app, what, name, obj, skip, options):
201 if name == "__init__":
202 return False
203 return skip
204 """
205
206
207 def setup(app):
208 app.add_stylesheet('css/pyro.css')
209 # app.connect("autodoc-skip-member", skip)
210
211
212 # @jpchen's hack to get rtd builder to install latest pytorch
213 # See similar line in the install section of .travis.yml
214 if 'READTHEDOCS' in os.environ:
215 os.system('pip install numpy')
216 # TODO replace with torch_stable before release
217 # os.system('pip install torch==1.8.0+cpu torchvision==0.9.0+cpu '
218 # '-f https://download.pytorch.org/whl/torch_stable.html')
219 os.system('pip install torch==1.8.0+cpu torchvision==0.9.0 '
220 '-f https://download.pytorch.org/whl/test/cpu/torch_test.html')
221
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/docs/source/conf.py b/docs/source/conf.py
--- a/docs/source/conf.py
+++ b/docs/source/conf.py
@@ -213,8 +213,5 @@
# See similar line in the install section of .travis.yml
if 'READTHEDOCS' in os.environ:
os.system('pip install numpy')
- # TODO replace with torch_stable before release
- # os.system('pip install torch==1.8.0+cpu torchvision==0.9.0+cpu '
- # '-f https://download.pytorch.org/whl/torch_stable.html')
- os.system('pip install torch==1.8.0+cpu torchvision==0.9.0 '
- '-f https://download.pytorch.org/whl/test/cpu/torch_test.html')
+ os.system('pip install torch==1.8.0+cpu torchvision==0.9.0+cpu '
+ '-f https://download.pytorch.org/whl/torch_stable.html')
|
{"golden_diff": "diff --git a/docs/source/conf.py b/docs/source/conf.py\n--- a/docs/source/conf.py\n+++ b/docs/source/conf.py\n@@ -213,8 +213,5 @@\n # See similar line in the install section of .travis.yml\n if 'READTHEDOCS' in os.environ:\n os.system('pip install numpy')\n- # TODO replace with torch_stable before release\n- # os.system('pip install torch==1.8.0+cpu torchvision==0.9.0+cpu '\n- # '-f https://download.pytorch.org/whl/torch_stable.html')\n- os.system('pip install torch==1.8.0+cpu torchvision==0.9.0 '\n- '-f https://download.pytorch.org/whl/test/cpu/torch_test.html')\n+ os.system('pip install torch==1.8.0+cpu torchvision==0.9.0+cpu '\n+ '-f https://download.pytorch.org/whl/torch_stable.html')\n", "issue": "Upgrade to PyTorch 1.8\nThis documents our upgrade plan to PyTorch 1.8, which breaks `pyro.distributions` as in Pyro 1.5.1 release. PyTorch 1.8.0 release is scheduled for sometime in March 2021.\r\n\r\n## Plan\r\n\r\n- [x] #2755 Release Pyro 1.5.2 which constrains to `torch<=1.7.2`.\r\n- [x] #2757 Upgrade dev branch to PyTorch 1.7.0, fixing tests.\r\n- [x] #2753 Pin Pyro dev branch to PyTorch nightly build, and fix many breaking changes.\r\n- [x] #2739 Implement `Distribution.infer_shapes()`.\r\n- ...\r\n- [ ] After PyTorch 1.8 release, pin Pyro dev back to PyTorch release and release Pyro 1.6.\r\n\r\ncc @neerajprad \n", "before_files": [{"content": "# Copyright (c) 2017-2019 Uber Technologies, Inc.\n# SPDX-License-Identifier: Apache-2.0\n\nimport os\nimport sys\n\nimport sphinx_rtd_theme\n\n# import pkg_resources\n\n# -*- coding: utf-8 -*-\n#\n# Pyro documentation build configuration file, created by\n# sphinx-quickstart on Thu Jun 15 17:16:14 2017.\n#\n# This file is execfile()d with the current directory set to its\n# containing dir.\n#\n# Note that not all possible configuration values are present in this\n# autogenerated file.\n#\n# All configuration values have a default; values that are commented out\n# serve to show the default.\n\n# If extensions (or modules to document with autodoc) are in another directory,\n# add these directories to sys.path here. If the directory is relative to the\n# documentation root, use os.path.abspath to make it absolute, like shown here.\n#\nsys.path.insert(0, os.path.abspath('../..'))\n\n# -- General configuration ------------------------------------------------\n\n# If your documentation needs a minimal Sphinx version, state it here.\n#\n# needs_sphinx = '1.0'\n\n# Add any Sphinx extension module names here, as strings. They can be\n# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom\n# ones.\nextensions = [\n 'sphinx.ext.intersphinx', #\n 'sphinx.ext.todo', #\n 'sphinx.ext.mathjax', #\n 'sphinx.ext.ifconfig', #\n 'sphinx.ext.viewcode', #\n 'sphinx.ext.githubpages', #\n 'sphinx.ext.graphviz', #\n 'sphinx.ext.autodoc',\n 'sphinx.ext.doctest',\n]\n\n# Disable documentation inheritance so as to avoid inheriting\n# docstrings in a different format, e.g. when the parent class\n# is a PyTorch class.\n\nautodoc_inherit_docstrings = False\n\n# Add any paths that contain templates here, relative to this directory.\ntemplates_path = ['_templates']\n\n# The suffix(es) of source filenames.\n# You can specify multiple suffix as a list of string:\n#\n# source_suffix = ['.rst', '.md']\nsource_suffix = '.rst'\n\n# The master toctree document.\nmaster_doc = 'index'\n\n# General information about the project.\nproject = u'Pyro'\ncopyright = u'2017-2018, Uber Technologies, Inc'\nauthor = u'Uber AI Labs'\n\n# The version info for the project you're documenting, acts as replacement for\n# |version| and |release|, also used in various other places throughout the\n# built documents.\n\nversion = ''\n\nif 'READTHEDOCS' not in os.environ:\n # if developing locally, use pyro.__version__ as version\n from pyro import __version__ # noqaE402\n version = __version__\n\n# release version\nrelease = version\n\n# The language for content autogenerated by Sphinx. Refer to documentation\n# for a list of supported languages.\n#\n# This is also used if you do content translation via gettext catalogs.\n# Usually you set \"language\" from the command line for these cases.\nlanguage = None\n\n# List of patterns, relative to source directory, that match files and\n# directories to ignore when looking for source files.\n# This patterns also effect to html_static_path and html_extra_path\nexclude_patterns = []\n\n# The name of the Pygments (syntax highlighting) style to use.\npygments_style = 'sphinx'\n\n# If true, `todo` and `todoList` produce output, else they produce nothing.\ntodo_include_todos = True\n\n# do not prepend module name to functions\nadd_module_names = False\n\n# -- Options for HTML output ----------------------------------------------\n\n# logo\nhtml_logo = '_static/img/pyro_logo_wide.png'\n\n# logo\nhtml_favicon = '_static/img/favicon/favicon.ico'\n\n# The theme to use for HTML and HTML Help pages. See the documentation for\n# a list of builtin themes.\n#\nhtml_theme = \"sphinx_rtd_theme\"\nhtml_theme_path = [sphinx_rtd_theme.get_html_theme_path()]\n\n# Theme options are theme-specific and customize the look and feel of a theme\n# further. For a list of options available for each theme, see the\n# documentation.\n\nhtml_theme_options = {\n 'navigation_depth': 3,\n 'logo_only': True,\n}\n\n# Add any paths that contain custom static files (such as style sheets) here,\n# relative to this directory. They are copied after the builtin static files,\n# so a file named \"default.css\" will overwrite the builtin \"default.css\".\nhtml_static_path = ['_static']\nhtml_style = 'css/pyro.css'\n\n# -- Options for HTMLHelp output ------------------------------------------\n\n# Output file base name for HTML help builder.\nhtmlhelp_basename = 'Pyrodoc'\n\n# -- Options for LaTeX output ---------------------------------------------\n\nlatex_elements = {\n # The paper size ('letterpaper' or 'a4paper').\n #\n # 'papersize': 'letterpaper',\n\n # The font size ('10pt', '11pt' or '12pt').\n #\n # 'pointsize': '10pt',\n\n # Additional stuff for the LaTeX preamble.\n #\n # 'preamble': '',\n\n # Latex figure (float) alignment\n #\n # 'figure_align': 'htbp',\n}\n\n# Grouping the document tree into LaTeX files. List of tuples\n# (source start file, target name, title,\n# author, documentclass [howto, manual, or own class]).\nlatex_documents = [\n (master_doc, 'Pyro.tex', u'Pyro Documentation', u'Uber AI Labs', 'manual'),\n]\n\n# -- Options for manual page output ---------------------------------------\n\n# One entry per manual page. List of tuples\n# (source start file, name, description, authors, manual section).\nman_pages = [(master_doc, 'pyro', u'Pyro Documentation', [author], 1)]\n\n# -- Options for Texinfo output -------------------------------------------\n\n# Grouping the document tree into Texinfo files. List of tuples\n# (source start file, target name, title, author,\n# dir menu entry, description, category)\ntexinfo_documents = [\n (master_doc, 'Pyro', u'Pyro Documentation', author, 'Pyro',\n 'Deep Universal Probabilistic Programming.', 'Miscellaneous'),\n]\n\n# Example configuration for intersphinx: refer to the Python standard library.\nintersphinx_mapping = {\n 'python': ('https://docs.python.org/3/', None),\n 'torch': ('https://pytorch.org/docs/master/', None),\n 'funsor': ('http://funsor.pyro.ai/en/stable/', None),\n 'opt_einsum': ('https://optimized-einsum.readthedocs.io/en/stable/', None),\n 'scipy': ('https://docs.scipy.org/doc/scipy/reference/', None),\n 'Bio': ('https://biopython.readthedocs.io/en/latest/', None),\n 'horovod': ('https://horovod.readthedocs.io/en/stable/', None),\n}\n\n# document class constructors (__init__ methods):\n\"\"\" comment out this functionality for now;\ndef skip(app, what, name, obj, skip, options):\n if name == \"__init__\":\n return False\n return skip\n\"\"\"\n\n\ndef setup(app):\n app.add_stylesheet('css/pyro.css')\n# app.connect(\"autodoc-skip-member\", skip)\n\n\n# @jpchen's hack to get rtd builder to install latest pytorch\n# See similar line in the install section of .travis.yml\nif 'READTHEDOCS' in os.environ:\n os.system('pip install numpy')\n # TODO replace with torch_stable before release\n # os.system('pip install torch==1.8.0+cpu torchvision==0.9.0+cpu '\n # '-f https://download.pytorch.org/whl/torch_stable.html')\n os.system('pip install torch==1.8.0+cpu torchvision==0.9.0 '\n '-f https://download.pytorch.org/whl/test/cpu/torch_test.html')\n", "path": "docs/source/conf.py"}], "after_files": [{"content": "# Copyright (c) 2017-2019 Uber Technologies, Inc.\n# SPDX-License-Identifier: Apache-2.0\n\nimport os\nimport sys\n\nimport sphinx_rtd_theme\n\n# import pkg_resources\n\n# -*- coding: utf-8 -*-\n#\n# Pyro documentation build configuration file, created by\n# sphinx-quickstart on Thu Jun 15 17:16:14 2017.\n#\n# This file is execfile()d with the current directory set to its\n# containing dir.\n#\n# Note that not all possible configuration values are present in this\n# autogenerated file.\n#\n# All configuration values have a default; values that are commented out\n# serve to show the default.\n\n# If extensions (or modules to document with autodoc) are in another directory,\n# add these directories to sys.path here. If the directory is relative to the\n# documentation root, use os.path.abspath to make it absolute, like shown here.\n#\nsys.path.insert(0, os.path.abspath('../..'))\n\n# -- General configuration ------------------------------------------------\n\n# If your documentation needs a minimal Sphinx version, state it here.\n#\n# needs_sphinx = '1.0'\n\n# Add any Sphinx extension module names here, as strings. They can be\n# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom\n# ones.\nextensions = [\n 'sphinx.ext.intersphinx', #\n 'sphinx.ext.todo', #\n 'sphinx.ext.mathjax', #\n 'sphinx.ext.ifconfig', #\n 'sphinx.ext.viewcode', #\n 'sphinx.ext.githubpages', #\n 'sphinx.ext.graphviz', #\n 'sphinx.ext.autodoc',\n 'sphinx.ext.doctest',\n]\n\n# Disable documentation inheritance so as to avoid inheriting\n# docstrings in a different format, e.g. when the parent class\n# is a PyTorch class.\n\nautodoc_inherit_docstrings = False\n\n# Add any paths that contain templates here, relative to this directory.\ntemplates_path = ['_templates']\n\n# The suffix(es) of source filenames.\n# You can specify multiple suffix as a list of string:\n#\n# source_suffix = ['.rst', '.md']\nsource_suffix = '.rst'\n\n# The master toctree document.\nmaster_doc = 'index'\n\n# General information about the project.\nproject = u'Pyro'\ncopyright = u'2017-2018, Uber Technologies, Inc'\nauthor = u'Uber AI Labs'\n\n# The version info for the project you're documenting, acts as replacement for\n# |version| and |release|, also used in various other places throughout the\n# built documents.\n\nversion = ''\n\nif 'READTHEDOCS' not in os.environ:\n # if developing locally, use pyro.__version__ as version\n from pyro import __version__ # noqaE402\n version = __version__\n\n# release version\nrelease = version\n\n# The language for content autogenerated by Sphinx. Refer to documentation\n# for a list of supported languages.\n#\n# This is also used if you do content translation via gettext catalogs.\n# Usually you set \"language\" from the command line for these cases.\nlanguage = None\n\n# List of patterns, relative to source directory, that match files and\n# directories to ignore when looking for source files.\n# This patterns also effect to html_static_path and html_extra_path\nexclude_patterns = []\n\n# The name of the Pygments (syntax highlighting) style to use.\npygments_style = 'sphinx'\n\n# If true, `todo` and `todoList` produce output, else they produce nothing.\ntodo_include_todos = True\n\n# do not prepend module name to functions\nadd_module_names = False\n\n# -- Options for HTML output ----------------------------------------------\n\n# logo\nhtml_logo = '_static/img/pyro_logo_wide.png'\n\n# logo\nhtml_favicon = '_static/img/favicon/favicon.ico'\n\n# The theme to use for HTML and HTML Help pages. See the documentation for\n# a list of builtin themes.\n#\nhtml_theme = \"sphinx_rtd_theme\"\nhtml_theme_path = [sphinx_rtd_theme.get_html_theme_path()]\n\n# Theme options are theme-specific and customize the look and feel of a theme\n# further. For a list of options available for each theme, see the\n# documentation.\n\nhtml_theme_options = {\n 'navigation_depth': 3,\n 'logo_only': True,\n}\n\n# Add any paths that contain custom static files (such as style sheets) here,\n# relative to this directory. They are copied after the builtin static files,\n# so a file named \"default.css\" will overwrite the builtin \"default.css\".\nhtml_static_path = ['_static']\nhtml_style = 'css/pyro.css'\n\n# -- Options for HTMLHelp output ------------------------------------------\n\n# Output file base name for HTML help builder.\nhtmlhelp_basename = 'Pyrodoc'\n\n# -- Options for LaTeX output ---------------------------------------------\n\nlatex_elements = {\n # The paper size ('letterpaper' or 'a4paper').\n #\n # 'papersize': 'letterpaper',\n\n # The font size ('10pt', '11pt' or '12pt').\n #\n # 'pointsize': '10pt',\n\n # Additional stuff for the LaTeX preamble.\n #\n # 'preamble': '',\n\n # Latex figure (float) alignment\n #\n # 'figure_align': 'htbp',\n}\n\n# Grouping the document tree into LaTeX files. List of tuples\n# (source start file, target name, title,\n# author, documentclass [howto, manual, or own class]).\nlatex_documents = [\n (master_doc, 'Pyro.tex', u'Pyro Documentation', u'Uber AI Labs', 'manual'),\n]\n\n# -- Options for manual page output ---------------------------------------\n\n# One entry per manual page. List of tuples\n# (source start file, name, description, authors, manual section).\nman_pages = [(master_doc, 'pyro', u'Pyro Documentation', [author], 1)]\n\n# -- Options for Texinfo output -------------------------------------------\n\n# Grouping the document tree into Texinfo files. List of tuples\n# (source start file, target name, title, author,\n# dir menu entry, description, category)\ntexinfo_documents = [\n (master_doc, 'Pyro', u'Pyro Documentation', author, 'Pyro',\n 'Deep Universal Probabilistic Programming.', 'Miscellaneous'),\n]\n\n# Example configuration for intersphinx: refer to the Python standard library.\nintersphinx_mapping = {\n 'python': ('https://docs.python.org/3/', None),\n 'torch': ('https://pytorch.org/docs/master/', None),\n 'funsor': ('http://funsor.pyro.ai/en/stable/', None),\n 'opt_einsum': ('https://optimized-einsum.readthedocs.io/en/stable/', None),\n 'scipy': ('https://docs.scipy.org/doc/scipy/reference/', None),\n 'Bio': ('https://biopython.readthedocs.io/en/latest/', None),\n 'horovod': ('https://horovod.readthedocs.io/en/stable/', None),\n}\n\n# document class constructors (__init__ methods):\n\"\"\" comment out this functionality for now;\ndef skip(app, what, name, obj, skip, options):\n if name == \"__init__\":\n return False\n return skip\n\"\"\"\n\n\ndef setup(app):\n app.add_stylesheet('css/pyro.css')\n# app.connect(\"autodoc-skip-member\", skip)\n\n\n# @jpchen's hack to get rtd builder to install latest pytorch\n# See similar line in the install section of .travis.yml\nif 'READTHEDOCS' in os.environ:\n os.system('pip install numpy')\n os.system('pip install torch==1.8.0+cpu torchvision==0.9.0+cpu '\n '-f https://download.pytorch.org/whl/torch_stable.html')\n", "path": "docs/source/conf.py"}]}
| 2,825 | 225 |
gh_patches_debug_10427
|
rasdani/github-patches
|
git_diff
|
Qiskit__qiskit-9651
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Random-circuit conditional generation outputs incorrect typed comparison values
### Environment
- **Qiskit Terra version**: 0.23.2
- **Python version**: 3.10
- **Operating system**: macOS
### What is happening?
The output of `qiskit.circuit.random.random_circuit` for 64+ qubits with `conditional=True` can cause a variety of failures throughout transpilation, depending on the particular ordering of passes run.
### How can we reproduce the issue?
One minimal example, which triggers a bug via `StochasticSwap`:
```python
from qiskit import transpile
from qiskit.transpiler import CouplingMap
from qiskit.circuit.random import random_circuit
qc = random_circuit(71, 71, measure=True, conditional=True) # seed irrelevant
transpile(qc, coupling_map=CouplingMap.from_line(qc.num_qubits), optimization_level=0)
```
produces
```text
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
<ipython-input-12-bfcfe1cad076> in <module>
4
5 qc = random_circuit(71, 71, measure=True, conditional=True) # seed irrelevant
----> 6 transpile(qc, coupling_map=CouplingMap.from_line(qc.num_qubits), optimization_level=0)
# ... snip ...
~/code/qiskit/terra/qiskit/transpiler/passes/routing/stochastic_swap.py in _layer_update(self, dag, layer, best_layout, best_depth, best_circuit)
253 logger.debug("layer_update: there are no swaps in this layer")
254 # Output this layer
--> 255 dag.compose(layer["graph"], qubits=best_layout.reorder_bits(dag.qubits))
256
257 def _mapper(self, circuit_graph, coupling_graph, trials=20):
~/code/qiskit/terra/qiskit/dagcircuit/dagcircuit.py in compose(self, other, qubits, clbits, front, inplace)
828 pass
829 elif isinstance(nd, DAGOpNode):
--> 830 condition = dag._map_condition(
831 edge_map, getattr(nd.op, "condition", None), dag.cregs.values()
832 )
~/code/qiskit/terra/qiskit/dagcircuit/dagcircuit.py in _map_condition(wire_map, condition, target_cregs)
691 # same as the cond_val since the new_creg is also a single Clbit.
692 new_cond_val = cond_val
--> 693 elif 2 ** (cond_creg[:].index(bit)) & cond_val:
694 # If the conditional values of the Clbit 'bit' is 1 then the new_cond_val
695 # is updated such that the conditional value of the Clbit to which 'bit'
TypeError: ufunc 'bitwise_and' not supported for the input types, and the inputs could not be safely coerced to any supported types according to the casting rule ''safe''
```
### What should happen?
No type errors during compilation.
### Any suggestions?
The problem is that `random_circuit` outputs `numpy.uint64` values for the comparisons, but some points in the compiler stack need to re-map conditions, and use bitmasks against Python's bigint `int` type. The fixed-width Numpy types don't support this if the Python int they're masking / being masked by don't fit. `random_circuit` should be outputting the correct `int` type; this is compelled by the newer `IfElseOp` form, but not the old form.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `qiskit/circuit/random/utils.py`
Content:
```
1 # This code is part of Qiskit.
2 #
3 # (C) Copyright IBM 2017.
4 #
5 # This code is licensed under the Apache License, Version 2.0. You may
6 # obtain a copy of this license in the LICENSE.txt file in the root directory
7 # of this source tree or at http://www.apache.org/licenses/LICENSE-2.0.
8 #
9 # Any modifications or derivative works of this code must retain this
10 # copyright notice, and modified files need to carry a notice indicating
11 # that they have been altered from the originals.
12
13 """Utility functions for generating random circuits."""
14
15 import numpy as np
16
17 from qiskit.circuit import ClassicalRegister, QuantumCircuit, CircuitInstruction
18 from qiskit.circuit import Reset
19 from qiskit.circuit.library import standard_gates
20 from qiskit.circuit.exceptions import CircuitError
21
22
23 def random_circuit(
24 num_qubits, depth, max_operands=4, measure=False, conditional=False, reset=False, seed=None
25 ):
26 """Generate random circuit of arbitrary size and form.
27
28 This function will generate a random circuit by randomly selecting gates
29 from the set of standard gates in :mod:`qiskit.extensions`. For example:
30
31 .. plot::
32 :include-source:
33
34 from qiskit.circuit.random import random_circuit
35
36 circ = random_circuit(2, 2, measure=True)
37 circ.draw(output='mpl')
38
39 Args:
40 num_qubits (int): number of quantum wires
41 depth (int): layers of operations (i.e. critical path length)
42 max_operands (int): maximum qubit operands of each gate (between 1 and 4)
43 measure (bool): if True, measure all qubits at the end
44 conditional (bool): if True, insert middle measurements and conditionals
45 reset (bool): if True, insert middle resets
46 seed (int): sets random seed (optional)
47
48 Returns:
49 QuantumCircuit: constructed circuit
50
51 Raises:
52 CircuitError: when invalid options given
53 """
54 if num_qubits == 0:
55 return QuantumCircuit()
56 if max_operands < 1 or max_operands > 4:
57 raise CircuitError("max_operands must be between 1 and 4")
58 max_operands = max_operands if num_qubits > max_operands else num_qubits
59
60 gates_1q = [
61 # (Gate class, number of qubits, number of parameters)
62 (standard_gates.IGate, 1, 0),
63 (standard_gates.SXGate, 1, 0),
64 (standard_gates.XGate, 1, 0),
65 (standard_gates.RZGate, 1, 1),
66 (standard_gates.RGate, 1, 2),
67 (standard_gates.HGate, 1, 0),
68 (standard_gates.PhaseGate, 1, 1),
69 (standard_gates.RXGate, 1, 1),
70 (standard_gates.RYGate, 1, 1),
71 (standard_gates.SGate, 1, 0),
72 (standard_gates.SdgGate, 1, 0),
73 (standard_gates.SXdgGate, 1, 0),
74 (standard_gates.TGate, 1, 0),
75 (standard_gates.TdgGate, 1, 0),
76 (standard_gates.UGate, 1, 3),
77 (standard_gates.U1Gate, 1, 1),
78 (standard_gates.U2Gate, 1, 2),
79 (standard_gates.U3Gate, 1, 3),
80 (standard_gates.YGate, 1, 0),
81 (standard_gates.ZGate, 1, 0),
82 ]
83 if reset:
84 gates_1q.append((Reset, 1, 0))
85 gates_2q = [
86 (standard_gates.CXGate, 2, 0),
87 (standard_gates.DCXGate, 2, 0),
88 (standard_gates.CHGate, 2, 0),
89 (standard_gates.CPhaseGate, 2, 1),
90 (standard_gates.CRXGate, 2, 1),
91 (standard_gates.CRYGate, 2, 1),
92 (standard_gates.CRZGate, 2, 1),
93 (standard_gates.CSXGate, 2, 0),
94 (standard_gates.CUGate, 2, 4),
95 (standard_gates.CU1Gate, 2, 1),
96 (standard_gates.CU3Gate, 2, 3),
97 (standard_gates.CYGate, 2, 0),
98 (standard_gates.CZGate, 2, 0),
99 (standard_gates.RXXGate, 2, 1),
100 (standard_gates.RYYGate, 2, 1),
101 (standard_gates.RZZGate, 2, 1),
102 (standard_gates.RZXGate, 2, 1),
103 (standard_gates.XXMinusYYGate, 2, 2),
104 (standard_gates.XXPlusYYGate, 2, 2),
105 (standard_gates.ECRGate, 2, 0),
106 (standard_gates.CSGate, 2, 0),
107 (standard_gates.CSdgGate, 2, 0),
108 (standard_gates.SwapGate, 2, 0),
109 (standard_gates.iSwapGate, 2, 0),
110 ]
111 gates_3q = [
112 (standard_gates.CCXGate, 3, 0),
113 (standard_gates.CSwapGate, 3, 0),
114 (standard_gates.CCZGate, 3, 0),
115 (standard_gates.RCCXGate, 3, 0),
116 ]
117 gates_4q = [
118 (standard_gates.C3SXGate, 4, 0),
119 (standard_gates.RC3XGate, 4, 0),
120 ]
121
122 gates = gates_1q.copy()
123 if max_operands >= 2:
124 gates.extend(gates_2q)
125 if max_operands >= 3:
126 gates.extend(gates_3q)
127 if max_operands >= 4:
128 gates.extend(gates_4q)
129 gates = np.array(
130 gates, dtype=[("class", object), ("num_qubits", np.int64), ("num_params", np.int64)]
131 )
132 gates_1q = np.array(gates_1q, dtype=gates.dtype)
133
134 qc = QuantumCircuit(num_qubits)
135
136 if measure or conditional:
137 cr = ClassicalRegister(num_qubits, "c")
138 qc.add_register(cr)
139
140 if seed is None:
141 seed = np.random.randint(0, np.iinfo(np.int32).max)
142 rng = np.random.default_rng(seed)
143
144 qubits = np.array(qc.qubits, dtype=object, copy=True)
145
146 # Apply arbitrary random operations in layers across all qubits.
147 for _ in range(depth):
148 # We generate all the randomness for the layer in one go, to avoid many separate calls to
149 # the randomisation routines, which can be fairly slow.
150
151 # This reliably draws too much randomness, but it's less expensive than looping over more
152 # calls to the rng. After, trim it down by finding the point when we've used all the qubits.
153 gate_specs = rng.choice(gates, size=len(qubits))
154 cumulative_qubits = np.cumsum(gate_specs["num_qubits"], dtype=np.int64)
155 # Efficiently find the point in the list where the total gates would use as many as
156 # possible of, but not more than, the number of qubits in the layer. If there's slack, fill
157 # it with 1q gates.
158 max_index = np.searchsorted(cumulative_qubits, num_qubits, side="right")
159 gate_specs = gate_specs[:max_index]
160 slack = num_qubits - cumulative_qubits[max_index - 1]
161 if slack:
162 gate_specs = np.hstack((gate_specs, rng.choice(gates_1q, size=slack)))
163
164 # For efficiency in the Python loop, this uses Numpy vectorisation to pre-calculate the
165 # indices into the lists of qubits and parameters for every gate, and then suitably
166 # randomises those lists.
167 q_indices = np.empty(len(gate_specs) + 1, dtype=np.int64)
168 p_indices = np.empty(len(gate_specs) + 1, dtype=np.int64)
169 q_indices[0] = p_indices[0] = 0
170 np.cumsum(gate_specs["num_qubits"], out=q_indices[1:])
171 np.cumsum(gate_specs["num_params"], out=p_indices[1:])
172 parameters = rng.uniform(0, 2 * np.pi, size=p_indices[-1])
173 rng.shuffle(qubits)
174
175 # We've now generated everything we're going to need. Now just to add everything. The
176 # conditional check is outside the two loops to make the more common case of no conditionals
177 # faster, since in Python we don't have a compiler to do this for us.
178 if conditional:
179 is_conditional = rng.random(size=len(gate_specs)) < 0.1
180 condition_values = rng.integers(
181 0, 1 << min(num_qubits, 63), size=np.count_nonzero(is_conditional)
182 )
183 c_ptr = 0
184 for gate, q_start, q_end, p_start, p_end, is_cond in zip(
185 gate_specs["class"],
186 q_indices[:-1],
187 q_indices[1:],
188 p_indices[:-1],
189 p_indices[1:],
190 is_conditional,
191 ):
192 operation = gate(*parameters[p_start:p_end])
193 if is_cond:
194 operation.condition = (cr, condition_values[c_ptr])
195 c_ptr += 1
196 qc._append(CircuitInstruction(operation=operation, qubits=qubits[q_start:q_end]))
197 else:
198 for gate, q_start, q_end, p_start, p_end in zip(
199 gate_specs["class"], q_indices[:-1], q_indices[1:], p_indices[:-1], p_indices[1:]
200 ):
201 operation = gate(*parameters[p_start:p_end])
202 qc._append(CircuitInstruction(operation=operation, qubits=qubits[q_start:q_end]))
203
204 if measure:
205 qc.measure(qc.qubits, cr)
206
207 return qc
208
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/qiskit/circuit/random/utils.py b/qiskit/circuit/random/utils.py
--- a/qiskit/circuit/random/utils.py
+++ b/qiskit/circuit/random/utils.py
@@ -191,7 +191,8 @@
):
operation = gate(*parameters[p_start:p_end])
if is_cond:
- operation.condition = (cr, condition_values[c_ptr])
+ # The condition values are required to be bigints, not Numpy's fixed-width type.
+ operation.condition = (cr, int(condition_values[c_ptr]))
c_ptr += 1
qc._append(CircuitInstruction(operation=operation, qubits=qubits[q_start:q_end]))
else:
|
{"golden_diff": "diff --git a/qiskit/circuit/random/utils.py b/qiskit/circuit/random/utils.py\n--- a/qiskit/circuit/random/utils.py\n+++ b/qiskit/circuit/random/utils.py\n@@ -191,7 +191,8 @@\n ):\n operation = gate(*parameters[p_start:p_end])\n if is_cond:\n- operation.condition = (cr, condition_values[c_ptr])\n+ # The condition values are required to be bigints, not Numpy's fixed-width type.\n+ operation.condition = (cr, int(condition_values[c_ptr]))\n c_ptr += 1\n qc._append(CircuitInstruction(operation=operation, qubits=qubits[q_start:q_end]))\n else:\n", "issue": "Random-circuit conditional generation outputs incorrect typed comparison values\n### Environment\n\n- **Qiskit Terra version**: 0.23.2\r\n- **Python version**: 3.10\r\n- **Operating system**: macOS\r\n\n\n### What is happening?\n\nThe output of `qiskit.circuit.random.random_circuit` for 64+ qubits with `conditional=True` can cause a variety of failures throughout transpilation, depending on the particular ordering of passes run.\n\n### How can we reproduce the issue?\n\nOne minimal example, which triggers a bug via `StochasticSwap`:\r\n\r\n```python\r\nfrom qiskit import transpile\r\nfrom qiskit.transpiler import CouplingMap\r\nfrom qiskit.circuit.random import random_circuit\r\n\r\nqc = random_circuit(71, 71, measure=True, conditional=True) # seed irrelevant\r\ntranspile(qc, coupling_map=CouplingMap.from_line(qc.num_qubits), optimization_level=0)\r\n```\r\nproduces\r\n```text\r\n---------------------------------------------------------------------------\r\nTypeError Traceback (most recent call last)\r\n<ipython-input-12-bfcfe1cad076> in <module>\r\n 4\r\n 5 qc = random_circuit(71, 71, measure=True, conditional=True) # seed irrelevant\r\n----> 6 transpile(qc, coupling_map=CouplingMap.from_line(qc.num_qubits), optimization_level=0)\r\n\r\n# ... snip ...\r\n\r\n~/code/qiskit/terra/qiskit/transpiler/passes/routing/stochastic_swap.py in _layer_update(self, dag, layer, best_layout, best_depth, best_circuit)\r\n 253 logger.debug(\"layer_update: there are no swaps in this layer\")\r\n 254 # Output this layer\r\n--> 255 dag.compose(layer[\"graph\"], qubits=best_layout.reorder_bits(dag.qubits))\r\n 256\r\n 257 def _mapper(self, circuit_graph, coupling_graph, trials=20):\r\n\r\n~/code/qiskit/terra/qiskit/dagcircuit/dagcircuit.py in compose(self, other, qubits, clbits, front, inplace)\r\n 828 pass\r\n 829 elif isinstance(nd, DAGOpNode):\r\n--> 830 condition = dag._map_condition(\r\n 831 edge_map, getattr(nd.op, \"condition\", None), dag.cregs.values()\r\n 832 )\r\n\r\n~/code/qiskit/terra/qiskit/dagcircuit/dagcircuit.py in _map_condition(wire_map, condition, target_cregs)\r\n 691 # same as the cond_val since the new_creg is also a single Clbit.\r\n 692 new_cond_val = cond_val\r\n--> 693 elif 2 ** (cond_creg[:].index(bit)) & cond_val:\r\n 694 # If the conditional values of the Clbit 'bit' is 1 then the new_cond_val\r\n 695 # is updated such that the conditional value of the Clbit to which 'bit'\r\n\r\nTypeError: ufunc 'bitwise_and' not supported for the input types, and the inputs could not be safely coerced to any supported types according to the casting rule ''safe''\r\n```\n\n### What should happen?\n\nNo type errors during compilation.\n\n### Any suggestions?\n\nThe problem is that `random_circuit` outputs `numpy.uint64` values for the comparisons, but some points in the compiler stack need to re-map conditions, and use bitmasks against Python's bigint `int` type. The fixed-width Numpy types don't support this if the Python int they're masking / being masked by don't fit. `random_circuit` should be outputting the correct `int` type; this is compelled by the newer `IfElseOp` form, but not the old form.\n", "before_files": [{"content": "# This code is part of Qiskit.\n#\n# (C) Copyright IBM 2017.\n#\n# This code is licensed under the Apache License, Version 2.0. You may\n# obtain a copy of this license in the LICENSE.txt file in the root directory\n# of this source tree or at http://www.apache.org/licenses/LICENSE-2.0.\n#\n# Any modifications or derivative works of this code must retain this\n# copyright notice, and modified files need to carry a notice indicating\n# that they have been altered from the originals.\n\n\"\"\"Utility functions for generating random circuits.\"\"\"\n\nimport numpy as np\n\nfrom qiskit.circuit import ClassicalRegister, QuantumCircuit, CircuitInstruction\nfrom qiskit.circuit import Reset\nfrom qiskit.circuit.library import standard_gates\nfrom qiskit.circuit.exceptions import CircuitError\n\n\ndef random_circuit(\n num_qubits, depth, max_operands=4, measure=False, conditional=False, reset=False, seed=None\n):\n \"\"\"Generate random circuit of arbitrary size and form.\n\n This function will generate a random circuit by randomly selecting gates\n from the set of standard gates in :mod:`qiskit.extensions`. For example:\n\n .. plot::\n :include-source:\n\n from qiskit.circuit.random import random_circuit\n\n circ = random_circuit(2, 2, measure=True)\n circ.draw(output='mpl')\n\n Args:\n num_qubits (int): number of quantum wires\n depth (int): layers of operations (i.e. critical path length)\n max_operands (int): maximum qubit operands of each gate (between 1 and 4)\n measure (bool): if True, measure all qubits at the end\n conditional (bool): if True, insert middle measurements and conditionals\n reset (bool): if True, insert middle resets\n seed (int): sets random seed (optional)\n\n Returns:\n QuantumCircuit: constructed circuit\n\n Raises:\n CircuitError: when invalid options given\n \"\"\"\n if num_qubits == 0:\n return QuantumCircuit()\n if max_operands < 1 or max_operands > 4:\n raise CircuitError(\"max_operands must be between 1 and 4\")\n max_operands = max_operands if num_qubits > max_operands else num_qubits\n\n gates_1q = [\n # (Gate class, number of qubits, number of parameters)\n (standard_gates.IGate, 1, 0),\n (standard_gates.SXGate, 1, 0),\n (standard_gates.XGate, 1, 0),\n (standard_gates.RZGate, 1, 1),\n (standard_gates.RGate, 1, 2),\n (standard_gates.HGate, 1, 0),\n (standard_gates.PhaseGate, 1, 1),\n (standard_gates.RXGate, 1, 1),\n (standard_gates.RYGate, 1, 1),\n (standard_gates.SGate, 1, 0),\n (standard_gates.SdgGate, 1, 0),\n (standard_gates.SXdgGate, 1, 0),\n (standard_gates.TGate, 1, 0),\n (standard_gates.TdgGate, 1, 0),\n (standard_gates.UGate, 1, 3),\n (standard_gates.U1Gate, 1, 1),\n (standard_gates.U2Gate, 1, 2),\n (standard_gates.U3Gate, 1, 3),\n (standard_gates.YGate, 1, 0),\n (standard_gates.ZGate, 1, 0),\n ]\n if reset:\n gates_1q.append((Reset, 1, 0))\n gates_2q = [\n (standard_gates.CXGate, 2, 0),\n (standard_gates.DCXGate, 2, 0),\n (standard_gates.CHGate, 2, 0),\n (standard_gates.CPhaseGate, 2, 1),\n (standard_gates.CRXGate, 2, 1),\n (standard_gates.CRYGate, 2, 1),\n (standard_gates.CRZGate, 2, 1),\n (standard_gates.CSXGate, 2, 0),\n (standard_gates.CUGate, 2, 4),\n (standard_gates.CU1Gate, 2, 1),\n (standard_gates.CU3Gate, 2, 3),\n (standard_gates.CYGate, 2, 0),\n (standard_gates.CZGate, 2, 0),\n (standard_gates.RXXGate, 2, 1),\n (standard_gates.RYYGate, 2, 1),\n (standard_gates.RZZGate, 2, 1),\n (standard_gates.RZXGate, 2, 1),\n (standard_gates.XXMinusYYGate, 2, 2),\n (standard_gates.XXPlusYYGate, 2, 2),\n (standard_gates.ECRGate, 2, 0),\n (standard_gates.CSGate, 2, 0),\n (standard_gates.CSdgGate, 2, 0),\n (standard_gates.SwapGate, 2, 0),\n (standard_gates.iSwapGate, 2, 0),\n ]\n gates_3q = [\n (standard_gates.CCXGate, 3, 0),\n (standard_gates.CSwapGate, 3, 0),\n (standard_gates.CCZGate, 3, 0),\n (standard_gates.RCCXGate, 3, 0),\n ]\n gates_4q = [\n (standard_gates.C3SXGate, 4, 0),\n (standard_gates.RC3XGate, 4, 0),\n ]\n\n gates = gates_1q.copy()\n if max_operands >= 2:\n gates.extend(gates_2q)\n if max_operands >= 3:\n gates.extend(gates_3q)\n if max_operands >= 4:\n gates.extend(gates_4q)\n gates = np.array(\n gates, dtype=[(\"class\", object), (\"num_qubits\", np.int64), (\"num_params\", np.int64)]\n )\n gates_1q = np.array(gates_1q, dtype=gates.dtype)\n\n qc = QuantumCircuit(num_qubits)\n\n if measure or conditional:\n cr = ClassicalRegister(num_qubits, \"c\")\n qc.add_register(cr)\n\n if seed is None:\n seed = np.random.randint(0, np.iinfo(np.int32).max)\n rng = np.random.default_rng(seed)\n\n qubits = np.array(qc.qubits, dtype=object, copy=True)\n\n # Apply arbitrary random operations in layers across all qubits.\n for _ in range(depth):\n # We generate all the randomness for the layer in one go, to avoid many separate calls to\n # the randomisation routines, which can be fairly slow.\n\n # This reliably draws too much randomness, but it's less expensive than looping over more\n # calls to the rng. After, trim it down by finding the point when we've used all the qubits.\n gate_specs = rng.choice(gates, size=len(qubits))\n cumulative_qubits = np.cumsum(gate_specs[\"num_qubits\"], dtype=np.int64)\n # Efficiently find the point in the list where the total gates would use as many as\n # possible of, but not more than, the number of qubits in the layer. If there's slack, fill\n # it with 1q gates.\n max_index = np.searchsorted(cumulative_qubits, num_qubits, side=\"right\")\n gate_specs = gate_specs[:max_index]\n slack = num_qubits - cumulative_qubits[max_index - 1]\n if slack:\n gate_specs = np.hstack((gate_specs, rng.choice(gates_1q, size=slack)))\n\n # For efficiency in the Python loop, this uses Numpy vectorisation to pre-calculate the\n # indices into the lists of qubits and parameters for every gate, and then suitably\n # randomises those lists.\n q_indices = np.empty(len(gate_specs) + 1, dtype=np.int64)\n p_indices = np.empty(len(gate_specs) + 1, dtype=np.int64)\n q_indices[0] = p_indices[0] = 0\n np.cumsum(gate_specs[\"num_qubits\"], out=q_indices[1:])\n np.cumsum(gate_specs[\"num_params\"], out=p_indices[1:])\n parameters = rng.uniform(0, 2 * np.pi, size=p_indices[-1])\n rng.shuffle(qubits)\n\n # We've now generated everything we're going to need. Now just to add everything. The\n # conditional check is outside the two loops to make the more common case of no conditionals\n # faster, since in Python we don't have a compiler to do this for us.\n if conditional:\n is_conditional = rng.random(size=len(gate_specs)) < 0.1\n condition_values = rng.integers(\n 0, 1 << min(num_qubits, 63), size=np.count_nonzero(is_conditional)\n )\n c_ptr = 0\n for gate, q_start, q_end, p_start, p_end, is_cond in zip(\n gate_specs[\"class\"],\n q_indices[:-1],\n q_indices[1:],\n p_indices[:-1],\n p_indices[1:],\n is_conditional,\n ):\n operation = gate(*parameters[p_start:p_end])\n if is_cond:\n operation.condition = (cr, condition_values[c_ptr])\n c_ptr += 1\n qc._append(CircuitInstruction(operation=operation, qubits=qubits[q_start:q_end]))\n else:\n for gate, q_start, q_end, p_start, p_end in zip(\n gate_specs[\"class\"], q_indices[:-1], q_indices[1:], p_indices[:-1], p_indices[1:]\n ):\n operation = gate(*parameters[p_start:p_end])\n qc._append(CircuitInstruction(operation=operation, qubits=qubits[q_start:q_end]))\n\n if measure:\n qc.measure(qc.qubits, cr)\n\n return qc\n", "path": "qiskit/circuit/random/utils.py"}], "after_files": [{"content": "# This code is part of Qiskit.\n#\n# (C) Copyright IBM 2017.\n#\n# This code is licensed under the Apache License, Version 2.0. You may\n# obtain a copy of this license in the LICENSE.txt file in the root directory\n# of this source tree or at http://www.apache.org/licenses/LICENSE-2.0.\n#\n# Any modifications or derivative works of this code must retain this\n# copyright notice, and modified files need to carry a notice indicating\n# that they have been altered from the originals.\n\n\"\"\"Utility functions for generating random circuits.\"\"\"\n\nimport numpy as np\n\nfrom qiskit.circuit import ClassicalRegister, QuantumCircuit, CircuitInstruction\nfrom qiskit.circuit import Reset\nfrom qiskit.circuit.library import standard_gates\nfrom qiskit.circuit.exceptions import CircuitError\n\n\ndef random_circuit(\n num_qubits, depth, max_operands=4, measure=False, conditional=False, reset=False, seed=None\n):\n \"\"\"Generate random circuit of arbitrary size and form.\n\n This function will generate a random circuit by randomly selecting gates\n from the set of standard gates in :mod:`qiskit.extensions`. For example:\n\n .. plot::\n :include-source:\n\n from qiskit.circuit.random import random_circuit\n\n circ = random_circuit(2, 2, measure=True)\n circ.draw(output='mpl')\n\n Args:\n num_qubits (int): number of quantum wires\n depth (int): layers of operations (i.e. critical path length)\n max_operands (int): maximum qubit operands of each gate (between 1 and 4)\n measure (bool): if True, measure all qubits at the end\n conditional (bool): if True, insert middle measurements and conditionals\n reset (bool): if True, insert middle resets\n seed (int): sets random seed (optional)\n\n Returns:\n QuantumCircuit: constructed circuit\n\n Raises:\n CircuitError: when invalid options given\n \"\"\"\n if num_qubits == 0:\n return QuantumCircuit()\n if max_operands < 1 or max_operands > 4:\n raise CircuitError(\"max_operands must be between 1 and 4\")\n max_operands = max_operands if num_qubits > max_operands else num_qubits\n\n gates_1q = [\n # (Gate class, number of qubits, number of parameters)\n (standard_gates.IGate, 1, 0),\n (standard_gates.SXGate, 1, 0),\n (standard_gates.XGate, 1, 0),\n (standard_gates.RZGate, 1, 1),\n (standard_gates.RGate, 1, 2),\n (standard_gates.HGate, 1, 0),\n (standard_gates.PhaseGate, 1, 1),\n (standard_gates.RXGate, 1, 1),\n (standard_gates.RYGate, 1, 1),\n (standard_gates.SGate, 1, 0),\n (standard_gates.SdgGate, 1, 0),\n (standard_gates.SXdgGate, 1, 0),\n (standard_gates.TGate, 1, 0),\n (standard_gates.TdgGate, 1, 0),\n (standard_gates.UGate, 1, 3),\n (standard_gates.U1Gate, 1, 1),\n (standard_gates.U2Gate, 1, 2),\n (standard_gates.U3Gate, 1, 3),\n (standard_gates.YGate, 1, 0),\n (standard_gates.ZGate, 1, 0),\n ]\n if reset:\n gates_1q.append((Reset, 1, 0))\n gates_2q = [\n (standard_gates.CXGate, 2, 0),\n (standard_gates.DCXGate, 2, 0),\n (standard_gates.CHGate, 2, 0),\n (standard_gates.CPhaseGate, 2, 1),\n (standard_gates.CRXGate, 2, 1),\n (standard_gates.CRYGate, 2, 1),\n (standard_gates.CRZGate, 2, 1),\n (standard_gates.CSXGate, 2, 0),\n (standard_gates.CUGate, 2, 4),\n (standard_gates.CU1Gate, 2, 1),\n (standard_gates.CU3Gate, 2, 3),\n (standard_gates.CYGate, 2, 0),\n (standard_gates.CZGate, 2, 0),\n (standard_gates.RXXGate, 2, 1),\n (standard_gates.RYYGate, 2, 1),\n (standard_gates.RZZGate, 2, 1),\n (standard_gates.RZXGate, 2, 1),\n (standard_gates.XXMinusYYGate, 2, 2),\n (standard_gates.XXPlusYYGate, 2, 2),\n (standard_gates.ECRGate, 2, 0),\n (standard_gates.CSGate, 2, 0),\n (standard_gates.CSdgGate, 2, 0),\n (standard_gates.SwapGate, 2, 0),\n (standard_gates.iSwapGate, 2, 0),\n ]\n gates_3q = [\n (standard_gates.CCXGate, 3, 0),\n (standard_gates.CSwapGate, 3, 0),\n (standard_gates.CCZGate, 3, 0),\n (standard_gates.RCCXGate, 3, 0),\n ]\n gates_4q = [\n (standard_gates.C3SXGate, 4, 0),\n (standard_gates.RC3XGate, 4, 0),\n ]\n\n gates = gates_1q.copy()\n if max_operands >= 2:\n gates.extend(gates_2q)\n if max_operands >= 3:\n gates.extend(gates_3q)\n if max_operands >= 4:\n gates.extend(gates_4q)\n gates = np.array(\n gates, dtype=[(\"class\", object), (\"num_qubits\", np.int64), (\"num_params\", np.int64)]\n )\n gates_1q = np.array(gates_1q, dtype=gates.dtype)\n\n qc = QuantumCircuit(num_qubits)\n\n if measure or conditional:\n cr = ClassicalRegister(num_qubits, \"c\")\n qc.add_register(cr)\n\n if seed is None:\n seed = np.random.randint(0, np.iinfo(np.int32).max)\n rng = np.random.default_rng(seed)\n\n qubits = np.array(qc.qubits, dtype=object, copy=True)\n\n # Apply arbitrary random operations in layers across all qubits.\n for _ in range(depth):\n # We generate all the randomness for the layer in one go, to avoid many separate calls to\n # the randomisation routines, which can be fairly slow.\n\n # This reliably draws too much randomness, but it's less expensive than looping over more\n # calls to the rng. After, trim it down by finding the point when we've used all the qubits.\n gate_specs = rng.choice(gates, size=len(qubits))\n cumulative_qubits = np.cumsum(gate_specs[\"num_qubits\"], dtype=np.int64)\n # Efficiently find the point in the list where the total gates would use as many as\n # possible of, but not more than, the number of qubits in the layer. If there's slack, fill\n # it with 1q gates.\n max_index = np.searchsorted(cumulative_qubits, num_qubits, side=\"right\")\n gate_specs = gate_specs[:max_index]\n slack = num_qubits - cumulative_qubits[max_index - 1]\n if slack:\n gate_specs = np.hstack((gate_specs, rng.choice(gates_1q, size=slack)))\n\n # For efficiency in the Python loop, this uses Numpy vectorisation to pre-calculate the\n # indices into the lists of qubits and parameters for every gate, and then suitably\n # randomises those lists.\n q_indices = np.empty(len(gate_specs) + 1, dtype=np.int64)\n p_indices = np.empty(len(gate_specs) + 1, dtype=np.int64)\n q_indices[0] = p_indices[0] = 0\n np.cumsum(gate_specs[\"num_qubits\"], out=q_indices[1:])\n np.cumsum(gate_specs[\"num_params\"], out=p_indices[1:])\n parameters = rng.uniform(0, 2 * np.pi, size=p_indices[-1])\n rng.shuffle(qubits)\n\n # We've now generated everything we're going to need. Now just to add everything. The\n # conditional check is outside the two loops to make the more common case of no conditionals\n # faster, since in Python we don't have a compiler to do this for us.\n if conditional:\n is_conditional = rng.random(size=len(gate_specs)) < 0.1\n condition_values = rng.integers(\n 0, 1 << min(num_qubits, 63), size=np.count_nonzero(is_conditional)\n )\n c_ptr = 0\n for gate, q_start, q_end, p_start, p_end, is_cond in zip(\n gate_specs[\"class\"],\n q_indices[:-1],\n q_indices[1:],\n p_indices[:-1],\n p_indices[1:],\n is_conditional,\n ):\n operation = gate(*parameters[p_start:p_end])\n if is_cond:\n # The condition values are required to be bigints, not Numpy's fixed-width type.\n operation.condition = (cr, int(condition_values[c_ptr]))\n c_ptr += 1\n qc._append(CircuitInstruction(operation=operation, qubits=qubits[q_start:q_end]))\n else:\n for gate, q_start, q_end, p_start, p_end in zip(\n gate_specs[\"class\"], q_indices[:-1], q_indices[1:], p_indices[:-1], p_indices[1:]\n ):\n operation = gate(*parameters[p_start:p_end])\n qc._append(CircuitInstruction(operation=operation, qubits=qubits[q_start:q_end]))\n\n if measure:\n qc.measure(qc.qubits, cr)\n\n return qc\n", "path": "qiskit/circuit/random/utils.py"}]}
| 3,944 | 155 |
gh_patches_debug_10285
|
rasdani/github-patches
|
git_diff
|
certbot__certbot-6002
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Certbot crashes if EFF e-mail submission endpoint is unavailable
Certbot will crash if the user's e-mail information can't be submitted to the EFF endpoint:
https://community.letsencrypt.org/t/issue-when-trying-to-create-certificate/59307/1
https://github.com/certbot/certbot/blob/master/certbot/eff.py#L74
It crashes with a somewhat relevant error if the endpoint is unavailable, but a confusing error if the endpoint returns something other than JSON. It should probably not crash at all in either case. If the endpoint specifically says that the user's e-mail address is malformed, that's probably worth stopping to report to the user, but if the endpoint simply doesn't reply usefully, it's probably preferable to continue since I wouldn't think we really want to make users wait in this situation.
(Adding the "bug" tag at least for the case where the endpoint returns a non-JSON reply because the error shown to the user in this case is unhelpful.)
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `certbot/eff.py`
Content:
```
1 """Subscribes users to the EFF newsletter."""
2 import logging
3
4 import requests
5 import zope.component
6
7 from certbot import constants
8 from certbot import interfaces
9
10
11 logger = logging.getLogger(__name__)
12
13
14 def handle_subscription(config):
15 """High level function to take care of EFF newsletter subscriptions.
16
17 The user may be asked if they want to sign up for the newsletter if
18 they have not already specified.
19
20 :param .IConfig config: Client configuration.
21
22 """
23 if config.email is None:
24 if config.eff_email:
25 _report_failure("you didn't provide an e-mail address")
26 return
27 if config.eff_email is None:
28 config.eff_email = _want_subscription()
29 if config.eff_email:
30 subscribe(config.email)
31
32
33 def _want_subscription():
34 """Does the user want to be subscribed to the EFF newsletter?
35
36 :returns: True if we should subscribe the user, otherwise, False
37 :rtype: bool
38
39 """
40 prompt = (
41 'Would you be willing to share your email address with the '
42 "Electronic Frontier Foundation, a founding partner of the Let's "
43 'Encrypt project and the non-profit organization that develops '
44 "Certbot? We'd like to send you email about EFF and our work to "
45 'encrypt the web, protect its users and defend digital rights.')
46 display = zope.component.getUtility(interfaces.IDisplay)
47 return display.yesno(prompt, default=False)
48
49
50 def subscribe(email):
51 """Subscribe the user to the EFF mailing list.
52
53 :param str email: the e-mail address to subscribe
54
55 """
56 url = constants.EFF_SUBSCRIBE_URI
57 data = {'data_type': 'json',
58 'email': email,
59 'form_id': 'eff_supporters_library_subscribe_form'}
60 logger.debug('Sending POST request to %s:\n%s', url, data)
61 _check_response(requests.post(url, data=data))
62
63
64 def _check_response(response):
65 """Check for errors in the server's response.
66
67 If an error occurred, it will be reported to the user.
68
69 :param requests.Response response: the server's response to the
70 subscription request
71
72 """
73 logger.debug('Received response:\n%s', response.content)
74 if response.ok:
75 if not response.json()['status']:
76 _report_failure('your e-mail address appears to be invalid')
77 else:
78 _report_failure()
79
80
81 def _report_failure(reason=None):
82 """Notify the user of failing to sign them up for the newsletter.
83
84 :param reason: a phrase describing what the problem was
85 beginning with a lowercase letter and no closing punctuation
86 :type reason: `str` or `None`
87
88 """
89 msg = ['We were unable to subscribe you the EFF mailing list']
90 if reason is not None:
91 msg.append(' because ')
92 msg.append(reason)
93 msg.append('. You can try again later by visiting https://act.eff.org.')
94 reporter = zope.component.getUtility(interfaces.IReporter)
95 reporter.add_message(''.join(msg), reporter.LOW_PRIORITY)
96
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/certbot/eff.py b/certbot/eff.py
--- a/certbot/eff.py
+++ b/certbot/eff.py
@@ -71,11 +71,14 @@
"""
logger.debug('Received response:\n%s', response.content)
- if response.ok:
- if not response.json()['status']:
+ try:
+ response.raise_for_status()
+ if response.json()['status'] == False:
_report_failure('your e-mail address appears to be invalid')
- else:
+ except requests.exceptions.HTTPError:
_report_failure()
+ except (ValueError, KeyError):
+ _report_failure('there was a problem with the server response')
def _report_failure(reason=None):
|
{"golden_diff": "diff --git a/certbot/eff.py b/certbot/eff.py\n--- a/certbot/eff.py\n+++ b/certbot/eff.py\n@@ -71,11 +71,14 @@\n \n \"\"\"\n logger.debug('Received response:\\n%s', response.content)\n- if response.ok:\n- if not response.json()['status']:\n+ try:\n+ response.raise_for_status()\n+ if response.json()['status'] == False:\n _report_failure('your e-mail address appears to be invalid')\n- else:\n+ except requests.exceptions.HTTPError:\n _report_failure()\n+ except (ValueError, KeyError):\n+ _report_failure('there was a problem with the server response')\n \n \n def _report_failure(reason=None):\n", "issue": "Certbot crashes if EFF e-mail submission endpoint is unavailable\nCertbot will crash if the user's e-mail information can't be submitted to the EFF endpoint:\r\n\r\nhttps://community.letsencrypt.org/t/issue-when-trying-to-create-certificate/59307/1\r\n\r\nhttps://github.com/certbot/certbot/blob/master/certbot/eff.py#L74\r\n\r\nIt crashes with a somewhat relevant error if the endpoint is unavailable, but a confusing error if the endpoint returns something other than JSON. It should probably not crash at all in either case. If the endpoint specifically says that the user's e-mail address is malformed, that's probably worth stopping to report to the user, but if the endpoint simply doesn't reply usefully, it's probably preferable to continue since I wouldn't think we really want to make users wait in this situation.\r\n\r\n(Adding the \"bug\" tag at least for the case where the endpoint returns a non-JSON reply because the error shown to the user in this case is unhelpful.)\n", "before_files": [{"content": "\"\"\"Subscribes users to the EFF newsletter.\"\"\"\nimport logging\n\nimport requests\nimport zope.component\n\nfrom certbot import constants\nfrom certbot import interfaces\n\n\nlogger = logging.getLogger(__name__)\n\n\ndef handle_subscription(config):\n \"\"\"High level function to take care of EFF newsletter subscriptions.\n\n The user may be asked if they want to sign up for the newsletter if\n they have not already specified.\n\n :param .IConfig config: Client configuration.\n\n \"\"\"\n if config.email is None:\n if config.eff_email:\n _report_failure(\"you didn't provide an e-mail address\")\n return\n if config.eff_email is None:\n config.eff_email = _want_subscription()\n if config.eff_email:\n subscribe(config.email)\n\n\ndef _want_subscription():\n \"\"\"Does the user want to be subscribed to the EFF newsletter?\n\n :returns: True if we should subscribe the user, otherwise, False\n :rtype: bool\n\n \"\"\"\n prompt = (\n 'Would you be willing to share your email address with the '\n \"Electronic Frontier Foundation, a founding partner of the Let's \"\n 'Encrypt project and the non-profit organization that develops '\n \"Certbot? We'd like to send you email about EFF and our work to \"\n 'encrypt the web, protect its users and defend digital rights.')\n display = zope.component.getUtility(interfaces.IDisplay)\n return display.yesno(prompt, default=False)\n\n\ndef subscribe(email):\n \"\"\"Subscribe the user to the EFF mailing list.\n\n :param str email: the e-mail address to subscribe\n\n \"\"\"\n url = constants.EFF_SUBSCRIBE_URI\n data = {'data_type': 'json',\n 'email': email,\n 'form_id': 'eff_supporters_library_subscribe_form'}\n logger.debug('Sending POST request to %s:\\n%s', url, data)\n _check_response(requests.post(url, data=data))\n\n\ndef _check_response(response):\n \"\"\"Check for errors in the server's response.\n\n If an error occurred, it will be reported to the user.\n\n :param requests.Response response: the server's response to the\n subscription request\n\n \"\"\"\n logger.debug('Received response:\\n%s', response.content)\n if response.ok:\n if not response.json()['status']:\n _report_failure('your e-mail address appears to be invalid')\n else:\n _report_failure()\n\n\ndef _report_failure(reason=None):\n \"\"\"Notify the user of failing to sign them up for the newsletter.\n\n :param reason: a phrase describing what the problem was\n beginning with a lowercase letter and no closing punctuation\n :type reason: `str` or `None`\n\n \"\"\"\n msg = ['We were unable to subscribe you the EFF mailing list']\n if reason is not None:\n msg.append(' because ')\n msg.append(reason)\n msg.append('. You can try again later by visiting https://act.eff.org.')\n reporter = zope.component.getUtility(interfaces.IReporter)\n reporter.add_message(''.join(msg), reporter.LOW_PRIORITY)\n", "path": "certbot/eff.py"}], "after_files": [{"content": "\"\"\"Subscribes users to the EFF newsletter.\"\"\"\nimport logging\n\nimport requests\nimport zope.component\n\nfrom certbot import constants\nfrom certbot import interfaces\n\n\nlogger = logging.getLogger(__name__)\n\n\ndef handle_subscription(config):\n \"\"\"High level function to take care of EFF newsletter subscriptions.\n\n The user may be asked if they want to sign up for the newsletter if\n they have not already specified.\n\n :param .IConfig config: Client configuration.\n\n \"\"\"\n if config.email is None:\n if config.eff_email:\n _report_failure(\"you didn't provide an e-mail address\")\n return\n if config.eff_email is None:\n config.eff_email = _want_subscription()\n if config.eff_email:\n subscribe(config.email)\n\n\ndef _want_subscription():\n \"\"\"Does the user want to be subscribed to the EFF newsletter?\n\n :returns: True if we should subscribe the user, otherwise, False\n :rtype: bool\n\n \"\"\"\n prompt = (\n 'Would you be willing to share your email address with the '\n \"Electronic Frontier Foundation, a founding partner of the Let's \"\n 'Encrypt project and the non-profit organization that develops '\n \"Certbot? We'd like to send you email about EFF and our work to \"\n 'encrypt the web, protect its users and defend digital rights.')\n display = zope.component.getUtility(interfaces.IDisplay)\n return display.yesno(prompt, default=False)\n\n\ndef subscribe(email):\n \"\"\"Subscribe the user to the EFF mailing list.\n\n :param str email: the e-mail address to subscribe\n\n \"\"\"\n url = constants.EFF_SUBSCRIBE_URI\n data = {'data_type': 'json',\n 'email': email,\n 'form_id': 'eff_supporters_library_subscribe_form'}\n logger.debug('Sending POST request to %s:\\n%s', url, data)\n _check_response(requests.post(url, data=data))\n\n\ndef _check_response(response):\n \"\"\"Check for errors in the server's response.\n\n If an error occurred, it will be reported to the user.\n\n :param requests.Response response: the server's response to the\n subscription request\n\n \"\"\"\n logger.debug('Received response:\\n%s', response.content)\n try:\n response.raise_for_status()\n if response.json()['status'] == False:\n _report_failure('your e-mail address appears to be invalid')\n except requests.exceptions.HTTPError:\n _report_failure()\n except (ValueError, KeyError):\n _report_failure('there was a problem with the server response')\n\n\ndef _report_failure(reason=None):\n \"\"\"Notify the user of failing to sign them up for the newsletter.\n\n :param reason: a phrase describing what the problem was\n beginning with a lowercase letter and no closing punctuation\n :type reason: `str` or `None`\n\n \"\"\"\n msg = ['We were unable to subscribe you the EFF mailing list']\n if reason is not None:\n msg.append(' because ')\n msg.append(reason)\n msg.append('. You can try again later by visiting https://act.eff.org.')\n reporter = zope.component.getUtility(interfaces.IReporter)\n reporter.add_message(''.join(msg), reporter.LOW_PRIORITY)\n", "path": "certbot/eff.py"}]}
| 1,319 | 166 |
gh_patches_debug_27322
|
rasdani/github-patches
|
git_diff
|
bridgecrewio__checkov-3563
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
AWS Route 53 checks (CKV2_AWS_38 and CKV_AWS_39) false positive in Terraform Plan framework
**Describe the issue**
AWS Route 53 DNSSEC and Query Logging checks flag as false positive in Terraform Plan framework.
**Examples**
Use Hashicorp's basic example to reproduce
**Version (please complete the following information):**
- 2.1.226
**Additional context**
Create a new file/project using the Hashicorp example given
Run a vanilla check: `checkov -f foo.tf --quiet --compact -c CKV2_AWS_39`. Observe no issues.
Create a plan and format it with json: `terraform plan -out tfplan.bin && terraform state show tfplan.bin > tfplan.json`
Check it with terraform_plan framework: `checkov -f tfplan.json --framework terraform_plan --compact --quiet -c CKV2_AWS_39`. Observe it fail.
This same behavior occurs with CKV2_AWS_38 as well using the same basic example, and again--only with Terraform Plan: https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/route53_hosted_zone_dnssec
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `checkov/terraform/plan_parser.py`
Content:
```
1 from __future__ import annotations
2
3 import itertools
4 from typing import Optional, Tuple, Dict, List, Any
5
6 from checkov.common.parsers.node import DictNode, ListNode
7 from checkov.terraform.context_parsers.tf_plan import parse
8
9 SIMPLE_TYPES = (str, int, float, bool)
10 TF_PLAN_RESOURCE_ADDRESS = "__address__"
11 TF_PLAN_RESOURCE_CHANGE_ACTIONS = "__change_actions__"
12
13
14 def _is_simple_type(obj: Any) -> bool:
15 if obj is None:
16 return True
17 if isinstance(obj, SIMPLE_TYPES):
18 return True
19 return False
20
21
22 def _is_list_of_simple_types(obj: Any) -> bool:
23 if not isinstance(obj, list):
24 return False
25 for i in obj:
26 if not _is_simple_type(i):
27 return False
28 return True
29
30
31 def _is_list_of_dicts(obj: Any) -> bool:
32 if not isinstance(obj, list):
33 return False
34 for i in obj:
35 if isinstance(i, dict):
36 return True
37 return False
38
39
40 def _hclify(obj: DictNode, conf: Optional[DictNode] = None, parent_key: Optional[str] = None) -> Dict[str, List[Any]]:
41 ret_dict = {}
42 if not isinstance(obj, dict):
43 raise Exception("this method receives only dicts")
44 if hasattr(obj, "start_mark") and hasattr(obj, "end_mark"):
45 obj["start_line"] = obj.start_mark.line
46 obj["end_line"] = obj.end_mark.line
47 for key, value in obj.items():
48 if _is_simple_type(value) or _is_list_of_simple_types(value):
49 if parent_key == "tags":
50 ret_dict[key] = value
51 else:
52 ret_dict[key] = _clean_simple_type_list([value])
53
54 if _is_list_of_dicts(value):
55 child_list = []
56 conf_val = conf.get(key, []) if conf else []
57 for internal_val, internal_conf_val in itertools.zip_longest(value, conf_val):
58 if isinstance(internal_val, dict):
59 child_list.append(_hclify(internal_val, internal_conf_val, parent_key=key))
60 if key == "tags":
61 ret_dict[key] = [child_list]
62 else:
63 ret_dict[key] = child_list
64 if isinstance(value, dict):
65 child_dict = _hclify(value, parent_key=key)
66 if parent_key == "tags":
67 ret_dict[key] = child_dict
68 else:
69 ret_dict[key] = [child_dict]
70 if conf and isinstance(conf, dict):
71 found_ref = False
72 for conf_key in conf.keys() - obj.keys():
73 ref = next((x for x in conf[conf_key].get("references", []) if not x.startswith(("var.", "local."))), None)
74 if ref:
75 ret_dict[conf_key] = [ref]
76 found_ref = True
77 if not found_ref:
78 for value in conf.values():
79 if isinstance(value, dict) and "references" in value.keys():
80 ret_dict["references_"] = value["references"]
81
82 return ret_dict
83
84
85 def _prepare_resource_block(
86 resource: DictNode, conf: Optional[DictNode], resource_changes: dict[str, dict[str, Any]]
87 ) -> tuple[dict[str, dict[str, Any]], bool]:
88 """hclify resource if pre-conditions met.
89
90 :param resource: tf planned_values resource block
91 :param conf: tf configuration resource block
92 :param resource_changes: tf resource_changes block
93
94 :returns:
95 - resource_block: a list of strings representing the header columns
96 - prepared: whether conditions met to prepare data
97 """
98
99 resource_block: Dict[str, Dict[str, Any]] = {}
100 resource_block[resource["type"]] = {}
101 prepared = False
102 mode = ""
103 if "mode" in resource:
104 mode = resource.get("mode")
105 # Rare cases where data block appears in resources with same name as resource block and only partial values
106 # and where *_module resources don't have values field
107 if mode == "managed" and "values" in resource:
108 expressions = conf.get("expressions") if conf else None
109
110 resource_conf = _hclify(resource["values"], expressions)
111 resource_address = resource.get("address")
112 resource_conf[TF_PLAN_RESOURCE_ADDRESS] = resource_address
113
114 changes = resource_changes.get(resource_address)
115 if changes:
116 resource_conf[TF_PLAN_RESOURCE_CHANGE_ACTIONS] = changes.get("change", {}).get("actions") or []
117
118 resource_block[resource["type"]][resource.get("name", "default")] = resource_conf
119 prepared = True
120 return resource_block, prepared
121
122
123 def _find_child_modules(
124 child_modules: ListNode, resource_changes: dict[str, dict[str, Any]], root_module_conf: dict[str, Any]
125 ) -> List[Dict[str, Dict[str, Any]]]:
126 """ Find all child modules if any. Including any amount of nested child modules.
127
128 :param child_modules: list of terraform child_module objects
129 :param resource_changes: a resource address to resource changes dict
130 :param root_module_conf: configuration block of the root module
131 :returns:
132 list of terraform resource blocks
133 """
134
135 resource_blocks = []
136 for child_module in child_modules:
137 nested_child_modules = child_module.get("child_modules", [])
138 if nested_child_modules:
139 nested_blocks = _find_child_modules(
140 child_modules=nested_child_modules,
141 resource_changes=resource_changes,
142 root_module_conf=root_module_conf
143 )
144 for resource in nested_blocks:
145 resource_blocks.append(resource)
146
147 module_address = child_module.get("address", "")
148 module_call_resources = _get_module_call_resources(
149 module_address=module_address,
150 root_module_conf=root_module_conf,
151 )
152
153 for resource in child_module.get("resources", []):
154 module_call_conf = None
155 if module_address and module_call_resources:
156 module_call_conf = next(
157 (
158 module_call_resource
159 for module_call_resource in module_call_resources
160 if f"{module_address}.{module_call_resource['address']}" == resource["address"]
161 ),
162 None
163 )
164
165 resource_block, prepared = _prepare_resource_block(
166 resource=resource,
167 conf=module_call_conf,
168 resource_changes=resource_changes,
169 )
170 if prepared is True:
171 resource_blocks.append(resource_block)
172 return resource_blocks
173
174
175 def _get_module_call_resources(module_address: str, root_module_conf: dict[str, Any]) -> list[dict[str, Any]]:
176 """Extracts the resources from the 'module_calls' block under 'configuration'"""
177
178 for module_name in module_address.split("."):
179 if module_name == "module":
180 # module names are always prefixed with 'module.', therefore skip it
181 continue
182 root_module_conf = root_module_conf.get("module_calls", {}).get(module_name, {}).get("module", {})
183
184 return root_module_conf.get("resources", [])
185
186
187 def _get_resource_changes(template: dict[str, Any]) -> dict[str, dict[str, Any]]:
188 """Returns a resource address to resource changes dict"""
189
190 resource_changes_map = {}
191
192 resource_changes = template.get("resource_changes")
193 if resource_changes and isinstance(resource_changes, list):
194 resource_changes_map = {
195 change.get("address", ""): change
196 for change in resource_changes
197 }
198
199 return resource_changes_map
200
201
202 def parse_tf_plan(tf_plan_file: str, out_parsing_errors: Dict[str, str]) -> Tuple[Optional[Dict[str, Any]], Optional[List[Tuple[int, str]]]]:
203 """
204 :type tf_plan_file: str - path to plan file
205 :rtype: tf_definition dictionary and template_lines of the plan file
206 """
207 tf_definition: Dict[str, Any] = {"resource": []}
208 template, template_lines = parse(tf_plan_file, out_parsing_errors)
209 if not template:
210 return None, None
211
212 resource_changes = _get_resource_changes(template=template)
213
214 for resource in template.get("planned_values", {}).get("root_module", {}).get("resources", []):
215 conf = next(
216 (
217 x
218 for x in template.get("configuration", {}).get("root_module", {}).get("resources", [])
219 if x["type"] == resource["type"] and x["name"] == resource["name"]
220 ),
221 None,
222 )
223 resource_block, prepared = _prepare_resource_block(
224 resource=resource,
225 conf=conf,
226 resource_changes=resource_changes,
227 )
228 if prepared is True:
229 tf_definition["resource"].append(resource_block)
230 child_modules = template.get("planned_values", {}).get("root_module", {}).get("child_modules", [])
231 root_module_conf = template.get("configuration", {}).get("root_module", {})
232 # Terraform supports modules within modules so we need to search
233 # in nested modules to find all resource blocks
234 resource_blocks = _find_child_modules(
235 child_modules=child_modules,
236 resource_changes=resource_changes,
237 root_module_conf=root_module_conf,
238 )
239 for resource in resource_blocks:
240 tf_definition["resource"].append(resource)
241 return tf_definition, template_lines
242
243
244 def _clean_simple_type_list(value_list: List[Any]) -> List[Any]:
245 """
246 Given a list of simple types return a cleaned list of simple types.
247 Converts booleans that are input as strings back to booleans to maintain consistent expectations for later evaluation.
248 Sometimes Terraform Plan will output Map values as strings regardless of boolean input.
249 """
250 for i in range(len(value_list)):
251 if isinstance(value_list[i], str):
252 lower_case_value = value_list[i].lower()
253 if lower_case_value == "true":
254 value_list[i] = True
255 if lower_case_value == "false":
256 value_list[i] = False
257 return value_list
258
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/checkov/terraform/plan_parser.py b/checkov/terraform/plan_parser.py
--- a/checkov/terraform/plan_parser.py
+++ b/checkov/terraform/plan_parser.py
@@ -70,7 +70,11 @@
if conf and isinstance(conf, dict):
found_ref = False
for conf_key in conf.keys() - obj.keys():
- ref = next((x for x in conf[conf_key].get("references", []) if not x.startswith(("var.", "local."))), None)
+ conf_value = conf[conf_key]
+ if not isinstance(conf_value, dict):
+ continue
+
+ ref = next((x for x in conf_value.get("references", []) if not x.startswith(("var.", "local."))), None)
if ref:
ret_dict[conf_key] = [ref]
found_ref = True
@@ -104,10 +108,10 @@
mode = resource.get("mode")
# Rare cases where data block appears in resources with same name as resource block and only partial values
# and where *_module resources don't have values field
- if mode == "managed" and "values" in resource:
+ if mode == "managed":
expressions = conf.get("expressions") if conf else None
- resource_conf = _hclify(resource["values"], expressions)
+ resource_conf = _hclify(resource.get("values", {}), expressions)
resource_address = resource.get("address")
resource_conf[TF_PLAN_RESOURCE_ADDRESS] = resource_address
|
{"golden_diff": "diff --git a/checkov/terraform/plan_parser.py b/checkov/terraform/plan_parser.py\n--- a/checkov/terraform/plan_parser.py\n+++ b/checkov/terraform/plan_parser.py\n@@ -70,7 +70,11 @@\n if conf and isinstance(conf, dict):\n found_ref = False\n for conf_key in conf.keys() - obj.keys():\n- ref = next((x for x in conf[conf_key].get(\"references\", []) if not x.startswith((\"var.\", \"local.\"))), None)\n+ conf_value = conf[conf_key]\n+ if not isinstance(conf_value, dict):\n+ continue\n+\n+ ref = next((x for x in conf_value.get(\"references\", []) if not x.startswith((\"var.\", \"local.\"))), None)\n if ref:\n ret_dict[conf_key] = [ref]\n found_ref = True\n@@ -104,10 +108,10 @@\n mode = resource.get(\"mode\")\n # Rare cases where data block appears in resources with same name as resource block and only partial values\n # and where *_module resources don't have values field\n- if mode == \"managed\" and \"values\" in resource:\n+ if mode == \"managed\":\n expressions = conf.get(\"expressions\") if conf else None\n \n- resource_conf = _hclify(resource[\"values\"], expressions)\n+ resource_conf = _hclify(resource.get(\"values\", {}), expressions)\n resource_address = resource.get(\"address\")\n resource_conf[TF_PLAN_RESOURCE_ADDRESS] = resource_address\n", "issue": "AWS Route 53 checks (CKV2_AWS_38 and CKV_AWS_39) false positive in Terraform Plan framework\n**Describe the issue**\r\nAWS Route 53 DNSSEC and Query Logging checks flag as false positive in Terraform Plan framework.\r\n\r\n**Examples**\r\nUse Hashicorp's basic example to reproduce\r\n\r\n**Version (please complete the following information):**\r\n - 2.1.226\r\n\r\n**Additional context**\r\nCreate a new file/project using the Hashicorp example given\r\nRun a vanilla check: `checkov -f foo.tf --quiet --compact -c CKV2_AWS_39`. Observe no issues.\r\nCreate a plan and format it with json: `terraform plan -out tfplan.bin && terraform state show tfplan.bin > tfplan.json`\r\nCheck it with terraform_plan framework: `checkov -f tfplan.json --framework terraform_plan --compact --quiet -c CKV2_AWS_39`. Observe it fail.\r\n\r\nThis same behavior occurs with CKV2_AWS_38 as well using the same basic example, and again--only with Terraform Plan: https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/route53_hosted_zone_dnssec\r\n\n", "before_files": [{"content": "from __future__ import annotations\n\nimport itertools\nfrom typing import Optional, Tuple, Dict, List, Any\n\nfrom checkov.common.parsers.node import DictNode, ListNode\nfrom checkov.terraform.context_parsers.tf_plan import parse\n\nSIMPLE_TYPES = (str, int, float, bool)\nTF_PLAN_RESOURCE_ADDRESS = \"__address__\"\nTF_PLAN_RESOURCE_CHANGE_ACTIONS = \"__change_actions__\"\n\n\ndef _is_simple_type(obj: Any) -> bool:\n if obj is None:\n return True\n if isinstance(obj, SIMPLE_TYPES):\n return True\n return False\n\n\ndef _is_list_of_simple_types(obj: Any) -> bool:\n if not isinstance(obj, list):\n return False\n for i in obj:\n if not _is_simple_type(i):\n return False\n return True\n\n\ndef _is_list_of_dicts(obj: Any) -> bool:\n if not isinstance(obj, list):\n return False\n for i in obj:\n if isinstance(i, dict):\n return True\n return False\n\n\ndef _hclify(obj: DictNode, conf: Optional[DictNode] = None, parent_key: Optional[str] = None) -> Dict[str, List[Any]]:\n ret_dict = {}\n if not isinstance(obj, dict):\n raise Exception(\"this method receives only dicts\")\n if hasattr(obj, \"start_mark\") and hasattr(obj, \"end_mark\"):\n obj[\"start_line\"] = obj.start_mark.line\n obj[\"end_line\"] = obj.end_mark.line\n for key, value in obj.items():\n if _is_simple_type(value) or _is_list_of_simple_types(value):\n if parent_key == \"tags\":\n ret_dict[key] = value\n else:\n ret_dict[key] = _clean_simple_type_list([value])\n\n if _is_list_of_dicts(value):\n child_list = []\n conf_val = conf.get(key, []) if conf else []\n for internal_val, internal_conf_val in itertools.zip_longest(value, conf_val):\n if isinstance(internal_val, dict):\n child_list.append(_hclify(internal_val, internal_conf_val, parent_key=key))\n if key == \"tags\":\n ret_dict[key] = [child_list]\n else:\n ret_dict[key] = child_list\n if isinstance(value, dict):\n child_dict = _hclify(value, parent_key=key)\n if parent_key == \"tags\":\n ret_dict[key] = child_dict\n else:\n ret_dict[key] = [child_dict]\n if conf and isinstance(conf, dict):\n found_ref = False\n for conf_key in conf.keys() - obj.keys():\n ref = next((x for x in conf[conf_key].get(\"references\", []) if not x.startswith((\"var.\", \"local.\"))), None)\n if ref:\n ret_dict[conf_key] = [ref]\n found_ref = True\n if not found_ref:\n for value in conf.values():\n if isinstance(value, dict) and \"references\" in value.keys():\n ret_dict[\"references_\"] = value[\"references\"]\n\n return ret_dict\n\n\ndef _prepare_resource_block(\n resource: DictNode, conf: Optional[DictNode], resource_changes: dict[str, dict[str, Any]]\n) -> tuple[dict[str, dict[str, Any]], bool]:\n \"\"\"hclify resource if pre-conditions met.\n\n :param resource: tf planned_values resource block\n :param conf: tf configuration resource block\n :param resource_changes: tf resource_changes block\n\n :returns:\n - resource_block: a list of strings representing the header columns\n - prepared: whether conditions met to prepare data\n \"\"\"\n\n resource_block: Dict[str, Dict[str, Any]] = {}\n resource_block[resource[\"type\"]] = {}\n prepared = False\n mode = \"\"\n if \"mode\" in resource:\n mode = resource.get(\"mode\")\n # Rare cases where data block appears in resources with same name as resource block and only partial values\n # and where *_module resources don't have values field\n if mode == \"managed\" and \"values\" in resource:\n expressions = conf.get(\"expressions\") if conf else None\n\n resource_conf = _hclify(resource[\"values\"], expressions)\n resource_address = resource.get(\"address\")\n resource_conf[TF_PLAN_RESOURCE_ADDRESS] = resource_address\n\n changes = resource_changes.get(resource_address)\n if changes:\n resource_conf[TF_PLAN_RESOURCE_CHANGE_ACTIONS] = changes.get(\"change\", {}).get(\"actions\") or []\n\n resource_block[resource[\"type\"]][resource.get(\"name\", \"default\")] = resource_conf\n prepared = True\n return resource_block, prepared\n\n\ndef _find_child_modules(\n child_modules: ListNode, resource_changes: dict[str, dict[str, Any]], root_module_conf: dict[str, Any]\n) -> List[Dict[str, Dict[str, Any]]]:\n \"\"\" Find all child modules if any. Including any amount of nested child modules.\n\n :param child_modules: list of terraform child_module objects\n :param resource_changes: a resource address to resource changes dict\n :param root_module_conf: configuration block of the root module\n :returns:\n list of terraform resource blocks\n \"\"\"\n\n resource_blocks = []\n for child_module in child_modules:\n nested_child_modules = child_module.get(\"child_modules\", [])\n if nested_child_modules:\n nested_blocks = _find_child_modules(\n child_modules=nested_child_modules,\n resource_changes=resource_changes,\n root_module_conf=root_module_conf\n )\n for resource in nested_blocks:\n resource_blocks.append(resource)\n\n module_address = child_module.get(\"address\", \"\")\n module_call_resources = _get_module_call_resources(\n module_address=module_address,\n root_module_conf=root_module_conf,\n )\n\n for resource in child_module.get(\"resources\", []):\n module_call_conf = None\n if module_address and module_call_resources:\n module_call_conf = next(\n (\n module_call_resource\n for module_call_resource in module_call_resources\n if f\"{module_address}.{module_call_resource['address']}\" == resource[\"address\"]\n ),\n None\n )\n\n resource_block, prepared = _prepare_resource_block(\n resource=resource,\n conf=module_call_conf,\n resource_changes=resource_changes,\n )\n if prepared is True:\n resource_blocks.append(resource_block)\n return resource_blocks\n\n\ndef _get_module_call_resources(module_address: str, root_module_conf: dict[str, Any]) -> list[dict[str, Any]]:\n \"\"\"Extracts the resources from the 'module_calls' block under 'configuration'\"\"\"\n\n for module_name in module_address.split(\".\"):\n if module_name == \"module\":\n # module names are always prefixed with 'module.', therefore skip it\n continue\n root_module_conf = root_module_conf.get(\"module_calls\", {}).get(module_name, {}).get(\"module\", {})\n\n return root_module_conf.get(\"resources\", [])\n\n\ndef _get_resource_changes(template: dict[str, Any]) -> dict[str, dict[str, Any]]:\n \"\"\"Returns a resource address to resource changes dict\"\"\"\n\n resource_changes_map = {}\n\n resource_changes = template.get(\"resource_changes\")\n if resource_changes and isinstance(resource_changes, list):\n resource_changes_map = {\n change.get(\"address\", \"\"): change\n for change in resource_changes\n }\n\n return resource_changes_map\n\n\ndef parse_tf_plan(tf_plan_file: str, out_parsing_errors: Dict[str, str]) -> Tuple[Optional[Dict[str, Any]], Optional[List[Tuple[int, str]]]]:\n \"\"\"\n :type tf_plan_file: str - path to plan file\n :rtype: tf_definition dictionary and template_lines of the plan file\n \"\"\"\n tf_definition: Dict[str, Any] = {\"resource\": []}\n template, template_lines = parse(tf_plan_file, out_parsing_errors)\n if not template:\n return None, None\n\n resource_changes = _get_resource_changes(template=template)\n\n for resource in template.get(\"planned_values\", {}).get(\"root_module\", {}).get(\"resources\", []):\n conf = next(\n (\n x\n for x in template.get(\"configuration\", {}).get(\"root_module\", {}).get(\"resources\", [])\n if x[\"type\"] == resource[\"type\"] and x[\"name\"] == resource[\"name\"]\n ),\n None,\n )\n resource_block, prepared = _prepare_resource_block(\n resource=resource,\n conf=conf,\n resource_changes=resource_changes,\n )\n if prepared is True:\n tf_definition[\"resource\"].append(resource_block)\n child_modules = template.get(\"planned_values\", {}).get(\"root_module\", {}).get(\"child_modules\", [])\n root_module_conf = template.get(\"configuration\", {}).get(\"root_module\", {})\n # Terraform supports modules within modules so we need to search\n # in nested modules to find all resource blocks\n resource_blocks = _find_child_modules(\n child_modules=child_modules,\n resource_changes=resource_changes,\n root_module_conf=root_module_conf,\n )\n for resource in resource_blocks:\n tf_definition[\"resource\"].append(resource)\n return tf_definition, template_lines\n\n\ndef _clean_simple_type_list(value_list: List[Any]) -> List[Any]:\n \"\"\"\n Given a list of simple types return a cleaned list of simple types.\n Converts booleans that are input as strings back to booleans to maintain consistent expectations for later evaluation.\n Sometimes Terraform Plan will output Map values as strings regardless of boolean input.\n \"\"\"\n for i in range(len(value_list)):\n if isinstance(value_list[i], str):\n lower_case_value = value_list[i].lower()\n if lower_case_value == \"true\":\n value_list[i] = True\n if lower_case_value == \"false\":\n value_list[i] = False\n return value_list\n", "path": "checkov/terraform/plan_parser.py"}], "after_files": [{"content": "from __future__ import annotations\n\nimport itertools\nfrom typing import Optional, Tuple, Dict, List, Any\n\nfrom checkov.common.parsers.node import DictNode, ListNode\nfrom checkov.terraform.context_parsers.tf_plan import parse\n\nSIMPLE_TYPES = (str, int, float, bool)\nTF_PLAN_RESOURCE_ADDRESS = \"__address__\"\nTF_PLAN_RESOURCE_CHANGE_ACTIONS = \"__change_actions__\"\n\n\ndef _is_simple_type(obj: Any) -> bool:\n if obj is None:\n return True\n if isinstance(obj, SIMPLE_TYPES):\n return True\n return False\n\n\ndef _is_list_of_simple_types(obj: Any) -> bool:\n if not isinstance(obj, list):\n return False\n for i in obj:\n if not _is_simple_type(i):\n return False\n return True\n\n\ndef _is_list_of_dicts(obj: Any) -> bool:\n if not isinstance(obj, list):\n return False\n for i in obj:\n if isinstance(i, dict):\n return True\n return False\n\n\ndef _hclify(obj: DictNode, conf: Optional[DictNode] = None, parent_key: Optional[str] = None) -> Dict[str, List[Any]]:\n ret_dict = {}\n if not isinstance(obj, dict):\n raise Exception(\"this method receives only dicts\")\n if hasattr(obj, \"start_mark\") and hasattr(obj, \"end_mark\"):\n obj[\"start_line\"] = obj.start_mark.line\n obj[\"end_line\"] = obj.end_mark.line\n for key, value in obj.items():\n if _is_simple_type(value) or _is_list_of_simple_types(value):\n if parent_key == \"tags\":\n ret_dict[key] = value\n else:\n ret_dict[key] = _clean_simple_type_list([value])\n\n if _is_list_of_dicts(value):\n child_list = []\n conf_val = conf.get(key, []) if conf else []\n for internal_val, internal_conf_val in itertools.zip_longest(value, conf_val):\n if isinstance(internal_val, dict):\n child_list.append(_hclify(internal_val, internal_conf_val, parent_key=key))\n if key == \"tags\":\n ret_dict[key] = [child_list]\n else:\n ret_dict[key] = child_list\n if isinstance(value, dict):\n child_dict = _hclify(value, parent_key=key)\n if parent_key == \"tags\":\n ret_dict[key] = child_dict\n else:\n ret_dict[key] = [child_dict]\n if conf and isinstance(conf, dict):\n found_ref = False\n for conf_key in conf.keys() - obj.keys():\n conf_value = conf[conf_key]\n if not isinstance(conf_value, dict):\n continue\n\n ref = next((x for x in conf_value.get(\"references\", []) if not x.startswith((\"var.\", \"local.\"))), None)\n if ref:\n ret_dict[conf_key] = [ref]\n found_ref = True\n if not found_ref:\n for value in conf.values():\n if isinstance(value, dict) and \"references\" in value.keys():\n ret_dict[\"references_\"] = value[\"references\"]\n\n return ret_dict\n\n\ndef _prepare_resource_block(\n resource: DictNode, conf: Optional[DictNode], resource_changes: dict[str, dict[str, Any]]\n) -> tuple[dict[str, dict[str, Any]], bool]:\n \"\"\"hclify resource if pre-conditions met.\n\n :param resource: tf planned_values resource block\n :param conf: tf configuration resource block\n :param resource_changes: tf resource_changes block\n\n :returns:\n - resource_block: a list of strings representing the header columns\n - prepared: whether conditions met to prepare data\n \"\"\"\n\n resource_block: Dict[str, Dict[str, Any]] = {}\n resource_block[resource[\"type\"]] = {}\n prepared = False\n mode = \"\"\n if \"mode\" in resource:\n mode = resource.get(\"mode\")\n # Rare cases where data block appears in resources with same name as resource block and only partial values\n # and where *_module resources don't have values field\n if mode == \"managed\":\n expressions = conf.get(\"expressions\") if conf else None\n\n resource_conf = _hclify(resource.get(\"values\", {}), expressions)\n resource_address = resource.get(\"address\")\n resource_conf[TF_PLAN_RESOURCE_ADDRESS] = resource_address\n\n changes = resource_changes.get(resource_address)\n if changes:\n resource_conf[TF_PLAN_RESOURCE_CHANGE_ACTIONS] = changes.get(\"change\", {}).get(\"actions\") or []\n\n resource_block[resource[\"type\"]][resource.get(\"name\", \"default\")] = resource_conf\n prepared = True\n return resource_block, prepared\n\n\ndef _find_child_modules(\n child_modules: ListNode, resource_changes: dict[str, dict[str, Any]], root_module_conf: dict[str, Any]\n) -> List[Dict[str, Dict[str, Any]]]:\n \"\"\" Find all child modules if any. Including any amount of nested child modules.\n\n :param child_modules: list of terraform child_module objects\n :param resource_changes: a resource address to resource changes dict\n :param root_module_conf: configuration block of the root module\n :returns:\n list of terraform resource blocks\n \"\"\"\n\n resource_blocks = []\n for child_module in child_modules:\n nested_child_modules = child_module.get(\"child_modules\", [])\n if nested_child_modules:\n nested_blocks = _find_child_modules(\n child_modules=nested_child_modules,\n resource_changes=resource_changes,\n root_module_conf=root_module_conf\n )\n for resource in nested_blocks:\n resource_blocks.append(resource)\n\n module_address = child_module.get(\"address\", \"\")\n module_call_resources = _get_module_call_resources(\n module_address=module_address,\n root_module_conf=root_module_conf,\n )\n\n for resource in child_module.get(\"resources\", []):\n module_call_conf = None\n if module_address and module_call_resources:\n module_call_conf = next(\n (\n module_call_resource\n for module_call_resource in module_call_resources\n if f\"{module_address}.{module_call_resource['address']}\" == resource[\"address\"]\n ),\n None\n )\n\n resource_block, prepared = _prepare_resource_block(\n resource=resource,\n conf=module_call_conf,\n resource_changes=resource_changes,\n )\n if prepared is True:\n resource_blocks.append(resource_block)\n return resource_blocks\n\n\ndef _get_module_call_resources(module_address: str, root_module_conf: dict[str, Any]) -> list[dict[str, Any]]:\n \"\"\"Extracts the resources from the 'module_calls' block under 'configuration'\"\"\"\n\n for module_name in module_address.split(\".\"):\n if module_name == \"module\":\n # module names are always prefixed with 'module.', therefore skip it\n continue\n root_module_conf = root_module_conf.get(\"module_calls\", {}).get(module_name, {}).get(\"module\", {})\n\n return root_module_conf.get(\"resources\", [])\n\n\ndef _get_resource_changes(template: dict[str, Any]) -> dict[str, dict[str, Any]]:\n \"\"\"Returns a resource address to resource changes dict\"\"\"\n\n resource_changes_map = {}\n\n resource_changes = template.get(\"resource_changes\")\n if resource_changes and isinstance(resource_changes, list):\n resource_changes_map = {\n change.get(\"address\", \"\"): change\n for change in resource_changes\n }\n\n return resource_changes_map\n\n\ndef parse_tf_plan(tf_plan_file: str, out_parsing_errors: Dict[str, str]) -> Tuple[Optional[Dict[str, Any]], Optional[List[Tuple[int, str]]]]:\n \"\"\"\n :type tf_plan_file: str - path to plan file\n :rtype: tf_definition dictionary and template_lines of the plan file\n \"\"\"\n tf_definition: Dict[str, Any] = {\"resource\": []}\n template, template_lines = parse(tf_plan_file, out_parsing_errors)\n if not template:\n return None, None\n\n resource_changes = _get_resource_changes(template=template)\n\n for resource in template.get(\"planned_values\", {}).get(\"root_module\", {}).get(\"resources\", []):\n conf = next(\n (\n x\n for x in template.get(\"configuration\", {}).get(\"root_module\", {}).get(\"resources\", [])\n if x[\"type\"] == resource[\"type\"] and x[\"name\"] == resource[\"name\"]\n ),\n None,\n )\n resource_block, prepared = _prepare_resource_block(\n resource=resource,\n conf=conf,\n resource_changes=resource_changes,\n )\n if prepared is True:\n tf_definition[\"resource\"].append(resource_block)\n child_modules = template.get(\"planned_values\", {}).get(\"root_module\", {}).get(\"child_modules\", [])\n root_module_conf = template.get(\"configuration\", {}).get(\"root_module\", {})\n # Terraform supports modules within modules so we need to search\n # in nested modules to find all resource blocks\n resource_blocks = _find_child_modules(\n child_modules=child_modules,\n resource_changes=resource_changes,\n root_module_conf=root_module_conf,\n )\n for resource in resource_blocks:\n tf_definition[\"resource\"].append(resource)\n return tf_definition, template_lines\n\n\ndef _clean_simple_type_list(value_list: List[Any]) -> List[Any]:\n \"\"\"\n Given a list of simple types return a cleaned list of simple types.\n Converts booleans that are input as strings back to booleans to maintain consistent expectations for later evaluation.\n Sometimes Terraform Plan will output Map values as strings regardless of boolean input.\n \"\"\"\n for i in range(len(value_list)):\n if isinstance(value_list[i], str):\n lower_case_value = value_list[i].lower()\n if lower_case_value == \"true\":\n value_list[i] = True\n if lower_case_value == \"false\":\n value_list[i] = False\n return value_list\n", "path": "checkov/terraform/plan_parser.py"}]}
| 3,309 | 343 |
gh_patches_debug_34333
|
rasdani/github-patches
|
git_diff
|
internetarchive__openlibrary-8896
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Remove fixed subject line from contact form
The contact form has a drop down for choosing a topic. When patrons submit this form, the submissions all go to the same inbox. The selected topics are usually incorrect, but also make it more difficult to manage the inbox due to most messages looking the same.
### Describe the problem that you'd like solved
Let the patron populate the subject line OR auto-populate the subject line from the message.
### Proposal & Constraints
<!-- What is the proposed solution / implementation? Is there a precedent of this approach succeeding elsewhere? -->
<!-- Which suggestions or requirements should be considered for how feature needs to appear or be implemented? -->
### Additional context
<!-- Add any other context or screenshots about the feature request here. -->
### Stakeholders
<!-- @ tag stakeholders of this bug -->
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `openlibrary/plugins/openlibrary/support.py`
Content:
```
1 import hashlib
2
3 import web
4 import logging
5
6 from infogami import config
7 from infogami.utils import delegate
8 from infogami.utils.view import render_template
9
10 from openlibrary import accounts
11 from openlibrary.core import stats
12 from openlibrary.core.cache import get_memcache
13 from openlibrary.plugins.upstream.addbook import get_recaptcha
14 from openlibrary.utils.dateutil import MINUTE_SECS
15
16 logger = logging.getLogger("openlibrary")
17
18
19 class contact(delegate.page):
20 def GET(self):
21 i = web.input(path=None)
22 user = accounts.get_current_user()
23 email = user and user.email
24
25 hashed_ip = hashlib.md5(web.ctx.ip.encode('utf-8')).hexdigest()
26 has_emailed_recently = get_memcache().get('contact-POST-%s' % hashed_ip)
27 recaptcha = has_emailed_recently and get_recaptcha()
28 return render_template("support", email=email, url=i.path, recaptcha=recaptcha)
29
30 def POST(self):
31 form = web.input()
32 patron_name = form.get("name", "")
33 email = form.get("email", "")
34 topic = form.get("topic", "")
35 description = form.get("question", "")
36 url = form.get("url", "")
37 user = accounts.get_current_user()
38 useragent = web.ctx.env.get("HTTP_USER_AGENT", "")
39 if not all([email, topic, description]):
40 return ""
41
42 hashed_ip = hashlib.md5(web.ctx.ip.encode('utf-8')).hexdigest()
43 has_emailed_recently = get_memcache().get('contact-POST-%s' % hashed_ip)
44 if has_emailed_recently:
45 recap = get_recaptcha()
46 if recap and not recap.validate():
47 return render_template(
48 "message.html",
49 'Recaptcha solution was incorrect',
50 (
51 'Please <a href="javascript:history.back()">go back</a> and try '
52 'again.'
53 ),
54 )
55
56 default_assignees = config.get("support_default_assignees", {})
57 if (topic_key := str(topic.replace(" ", "_").lower())) in default_assignees:
58 assignee = default_assignees.get(topic_key)
59 else:
60 assignee = default_assignees.get("default", "openlibrary@archive.org")
61 stats.increment("ol.support.all")
62 subject = "Support case *%s*" % topic
63
64 url = web.ctx.home + url
65 displayname = user and user.get_name() or ""
66 username = user and user.get_username() or ""
67
68 message = SUPPORT_EMAIL_TEMPLATE % locals()
69 sendmail(email, assignee, subject, message)
70
71 get_memcache().set(
72 'contact-POST-%s' % hashed_ip, "true", expires=15 * MINUTE_SECS
73 )
74 return render_template("email/case_created", assignee)
75
76
77 def sendmail(from_address, to_address, subject, message):
78 if config.get('dummy_sendmail'):
79 msg = (
80 f'To: {to_address}\n'
81 f'From:{from_address}\n'
82 f'Subject:{subject}\n'
83 f'\n{web.safestr(message)}'
84 )
85
86 logger.info("sending email:\n%s", msg)
87 else:
88 web.sendmail(from_address, to_address, subject, message)
89
90
91 SUPPORT_EMAIL_TEMPLATE = """
92
93 Description:\n
94 %(description)s
95
96 A new support case has been filed by %(displayname)s <%(email)s>.
97
98 Topic: %(topic)s
99 URL: %(url)s
100 User-Agent: %(useragent)s
101 OL-username: %(username)s
102 Patron-name: %(patron_name)s
103 """
104
105
106 def setup():
107 pass
108
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/openlibrary/plugins/openlibrary/support.py b/openlibrary/plugins/openlibrary/support.py
--- a/openlibrary/plugins/openlibrary/support.py
+++ b/openlibrary/plugins/openlibrary/support.py
@@ -32,11 +32,12 @@
patron_name = form.get("name", "")
email = form.get("email", "")
topic = form.get("topic", "")
+ subject_line = form.get('subject', '')
description = form.get("question", "")
url = form.get("url", "")
user = accounts.get_current_user()
useragent = web.ctx.env.get("HTTP_USER_AGENT", "")
- if not all([email, topic, description]):
+ if not all([email, description]):
return ""
hashed_ip = hashlib.md5(web.ctx.ip.encode('utf-8')).hexdigest()
@@ -59,7 +60,7 @@
else:
assignee = default_assignees.get("default", "openlibrary@archive.org")
stats.increment("ol.support.all")
- subject = "Support case *%s*" % topic
+ subject = "Support case *%s*" % self.prepare_subject_line(subject_line)
url = web.ctx.home + url
displayname = user and user.get_name() or ""
@@ -73,6 +74,14 @@
)
return render_template("email/case_created", assignee)
+ def prepare_subject_line(self, subject, max_length=60):
+ if not subject:
+ return '[no subject]'
+ if len(subject) <= max_length:
+ return subject
+
+ return subject[:max_length]
+
def sendmail(from_address, to_address, subject, message):
if config.get('dummy_sendmail'):
@@ -95,7 +104,7 @@
A new support case has been filed by %(displayname)s <%(email)s>.
-Topic: %(topic)s
+Subject: %(subject_line)s
URL: %(url)s
User-Agent: %(useragent)s
OL-username: %(username)s
|
{"golden_diff": "diff --git a/openlibrary/plugins/openlibrary/support.py b/openlibrary/plugins/openlibrary/support.py\n--- a/openlibrary/plugins/openlibrary/support.py\n+++ b/openlibrary/plugins/openlibrary/support.py\n@@ -32,11 +32,12 @@\n patron_name = form.get(\"name\", \"\")\n email = form.get(\"email\", \"\")\n topic = form.get(\"topic\", \"\")\n+ subject_line = form.get('subject', '')\n description = form.get(\"question\", \"\")\n url = form.get(\"url\", \"\")\n user = accounts.get_current_user()\n useragent = web.ctx.env.get(\"HTTP_USER_AGENT\", \"\")\n- if not all([email, topic, description]):\n+ if not all([email, description]):\n return \"\"\n \n hashed_ip = hashlib.md5(web.ctx.ip.encode('utf-8')).hexdigest()\n@@ -59,7 +60,7 @@\n else:\n assignee = default_assignees.get(\"default\", \"openlibrary@archive.org\")\n stats.increment(\"ol.support.all\")\n- subject = \"Support case *%s*\" % topic\n+ subject = \"Support case *%s*\" % self.prepare_subject_line(subject_line)\n \n url = web.ctx.home + url\n displayname = user and user.get_name() or \"\"\n@@ -73,6 +74,14 @@\n )\n return render_template(\"email/case_created\", assignee)\n \n+ def prepare_subject_line(self, subject, max_length=60):\n+ if not subject:\n+ return '[no subject]'\n+ if len(subject) <= max_length:\n+ return subject\n+\n+ return subject[:max_length]\n+\n \n def sendmail(from_address, to_address, subject, message):\n if config.get('dummy_sendmail'):\n@@ -95,7 +104,7 @@\n \n A new support case has been filed by %(displayname)s <%(email)s>.\n \n-Topic: %(topic)s\n+Subject: %(subject_line)s\n URL: %(url)s\n User-Agent: %(useragent)s\n OL-username: %(username)s\n", "issue": "Remove fixed subject line from contact form\nThe contact form has a drop down for choosing a topic. When patrons submit this form, the submissions all go to the same inbox. The selected topics are usually incorrect, but also make it more difficult to manage the inbox due to most messages looking the same.\r\n\r\n### Describe the problem that you'd like solved\r\nLet the patron populate the subject line OR auto-populate the subject line from the message.\r\n\r\n### Proposal & Constraints\r\n<!-- What is the proposed solution / implementation? Is there a precedent of this approach succeeding elsewhere? -->\r\n\r\n<!-- Which suggestions or requirements should be considered for how feature needs to appear or be implemented? -->\r\n\r\n### Additional context\r\n<!-- Add any other context or screenshots about the feature request here. -->\r\n\r\n### Stakeholders\r\n<!-- @ tag stakeholders of this bug -->\r\n\r\n\r\n\n", "before_files": [{"content": "import hashlib\n\nimport web\nimport logging\n\nfrom infogami import config\nfrom infogami.utils import delegate\nfrom infogami.utils.view import render_template\n\nfrom openlibrary import accounts\nfrom openlibrary.core import stats\nfrom openlibrary.core.cache import get_memcache\nfrom openlibrary.plugins.upstream.addbook import get_recaptcha\nfrom openlibrary.utils.dateutil import MINUTE_SECS\n\nlogger = logging.getLogger(\"openlibrary\")\n\n\nclass contact(delegate.page):\n def GET(self):\n i = web.input(path=None)\n user = accounts.get_current_user()\n email = user and user.email\n\n hashed_ip = hashlib.md5(web.ctx.ip.encode('utf-8')).hexdigest()\n has_emailed_recently = get_memcache().get('contact-POST-%s' % hashed_ip)\n recaptcha = has_emailed_recently and get_recaptcha()\n return render_template(\"support\", email=email, url=i.path, recaptcha=recaptcha)\n\n def POST(self):\n form = web.input()\n patron_name = form.get(\"name\", \"\")\n email = form.get(\"email\", \"\")\n topic = form.get(\"topic\", \"\")\n description = form.get(\"question\", \"\")\n url = form.get(\"url\", \"\")\n user = accounts.get_current_user()\n useragent = web.ctx.env.get(\"HTTP_USER_AGENT\", \"\")\n if not all([email, topic, description]):\n return \"\"\n\n hashed_ip = hashlib.md5(web.ctx.ip.encode('utf-8')).hexdigest()\n has_emailed_recently = get_memcache().get('contact-POST-%s' % hashed_ip)\n if has_emailed_recently:\n recap = get_recaptcha()\n if recap and not recap.validate():\n return render_template(\n \"message.html\",\n 'Recaptcha solution was incorrect',\n (\n 'Please <a href=\"javascript:history.back()\">go back</a> and try '\n 'again.'\n ),\n )\n\n default_assignees = config.get(\"support_default_assignees\", {})\n if (topic_key := str(topic.replace(\" \", \"_\").lower())) in default_assignees:\n assignee = default_assignees.get(topic_key)\n else:\n assignee = default_assignees.get(\"default\", \"openlibrary@archive.org\")\n stats.increment(\"ol.support.all\")\n subject = \"Support case *%s*\" % topic\n\n url = web.ctx.home + url\n displayname = user and user.get_name() or \"\"\n username = user and user.get_username() or \"\"\n\n message = SUPPORT_EMAIL_TEMPLATE % locals()\n sendmail(email, assignee, subject, message)\n\n get_memcache().set(\n 'contact-POST-%s' % hashed_ip, \"true\", expires=15 * MINUTE_SECS\n )\n return render_template(\"email/case_created\", assignee)\n\n\ndef sendmail(from_address, to_address, subject, message):\n if config.get('dummy_sendmail'):\n msg = (\n f'To: {to_address}\\n'\n f'From:{from_address}\\n'\n f'Subject:{subject}\\n'\n f'\\n{web.safestr(message)}'\n )\n\n logger.info(\"sending email:\\n%s\", msg)\n else:\n web.sendmail(from_address, to_address, subject, message)\n\n\nSUPPORT_EMAIL_TEMPLATE = \"\"\"\n\nDescription:\\n\n%(description)s\n\nA new support case has been filed by %(displayname)s <%(email)s>.\n\nTopic: %(topic)s\nURL: %(url)s\nUser-Agent: %(useragent)s\nOL-username: %(username)s\nPatron-name: %(patron_name)s\n\"\"\"\n\n\ndef setup():\n pass\n", "path": "openlibrary/plugins/openlibrary/support.py"}], "after_files": [{"content": "import hashlib\n\nimport web\nimport logging\n\nfrom infogami import config\nfrom infogami.utils import delegate\nfrom infogami.utils.view import render_template\n\nfrom openlibrary import accounts\nfrom openlibrary.core import stats\nfrom openlibrary.core.cache import get_memcache\nfrom openlibrary.plugins.upstream.addbook import get_recaptcha\nfrom openlibrary.utils.dateutil import MINUTE_SECS\n\nlogger = logging.getLogger(\"openlibrary\")\n\n\nclass contact(delegate.page):\n def GET(self):\n i = web.input(path=None)\n user = accounts.get_current_user()\n email = user and user.email\n\n hashed_ip = hashlib.md5(web.ctx.ip.encode('utf-8')).hexdigest()\n has_emailed_recently = get_memcache().get('contact-POST-%s' % hashed_ip)\n recaptcha = has_emailed_recently and get_recaptcha()\n return render_template(\"support\", email=email, url=i.path, recaptcha=recaptcha)\n\n def POST(self):\n form = web.input()\n patron_name = form.get(\"name\", \"\")\n email = form.get(\"email\", \"\")\n topic = form.get(\"topic\", \"\")\n subject_line = form.get('subject', '')\n description = form.get(\"question\", \"\")\n url = form.get(\"url\", \"\")\n user = accounts.get_current_user()\n useragent = web.ctx.env.get(\"HTTP_USER_AGENT\", \"\")\n if not all([email, description]):\n return \"\"\n\n hashed_ip = hashlib.md5(web.ctx.ip.encode('utf-8')).hexdigest()\n has_emailed_recently = get_memcache().get('contact-POST-%s' % hashed_ip)\n if has_emailed_recently:\n recap = get_recaptcha()\n if recap and not recap.validate():\n return render_template(\n \"message.html\",\n 'Recaptcha solution was incorrect',\n (\n 'Please <a href=\"javascript:history.back()\">go back</a> and try '\n 'again.'\n ),\n )\n\n default_assignees = config.get(\"support_default_assignees\", {})\n if (topic_key := str(topic.replace(\" \", \"_\").lower())) in default_assignees:\n assignee = default_assignees.get(topic_key)\n else:\n assignee = default_assignees.get(\"default\", \"openlibrary@archive.org\")\n stats.increment(\"ol.support.all\")\n subject = \"Support case *%s*\" % self.prepare_subject_line(subject_line)\n\n url = web.ctx.home + url\n displayname = user and user.get_name() or \"\"\n username = user and user.get_username() or \"\"\n\n message = SUPPORT_EMAIL_TEMPLATE % locals()\n sendmail(email, assignee, subject, message)\n\n get_memcache().set(\n 'contact-POST-%s' % hashed_ip, \"true\", expires=15 * MINUTE_SECS\n )\n return render_template(\"email/case_created\", assignee)\n\n def prepare_subject_line(self, subject, max_length=60):\n if not subject:\n return '[no subject]'\n if len(subject) <= max_length:\n return subject\n\n return subject[:max_length]\n\n\ndef sendmail(from_address, to_address, subject, message):\n if config.get('dummy_sendmail'):\n msg = (\n f'To: {to_address}\\n'\n f'From:{from_address}\\n'\n f'Subject:{subject}\\n'\n f'\\n{web.safestr(message)}'\n )\n\n logger.info(\"sending email:\\n%s\", msg)\n else:\n web.sendmail(from_address, to_address, subject, message)\n\n\nSUPPORT_EMAIL_TEMPLATE = \"\"\"\n\nDescription:\\n\n%(description)s\n\nA new support case has been filed by %(displayname)s <%(email)s>.\n\nSubject: %(subject_line)s\nURL: %(url)s\nUser-Agent: %(useragent)s\nOL-username: %(username)s\nPatron-name: %(patron_name)s\n\"\"\"\n\n\ndef setup():\n pass\n", "path": "openlibrary/plugins/openlibrary/support.py"}]}
| 1,429 | 448 |
gh_patches_debug_30256
|
rasdani/github-patches
|
git_diff
|
TheAlgorithms__Python-3949
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
web_programming/fetch_github_info.py err
<pre>message: Bad credentials. The API can't be accessed using username/password authentication. Please create a personal access token to access this endpoint: http://github.com/settings/tokens
documentation_url: https://docs.github.com/articles/creating-a-personal-access-token-for-the-command-line
</pre>
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `web_programming/fetch_github_info.py`
Content:
```
1 #!/usr/bin/env python3
2
3 """
4 Created by sarathkaul on 14/11/19
5
6 Basic authentication using an API password is deprecated and will soon no longer work.
7 Visit https://developer.github.com/changes/2020-02-14-deprecating-password-auth
8 for more information around suggested workarounds and removal dates.
9 """
10
11
12 import requests
13
14 _GITHUB_API = "https://api.github.com/user"
15
16
17 def fetch_github_info(auth_user: str, auth_pass: str) -> dict:
18 """
19 Fetch GitHub info of a user using the requests module
20 """
21 return requests.get(_GITHUB_API, auth=(auth_user, auth_pass)).json()
22
23
24 if __name__ == "__main__":
25 for key, value in fetch_github_info("<USER NAME>", "<PASSWORD>").items():
26 print(f"{key}: {value}")
27
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/web_programming/fetch_github_info.py b/web_programming/fetch_github_info.py
--- a/web_programming/fetch_github_info.py
+++ b/web_programming/fetch_github_info.py
@@ -1,26 +1,50 @@
#!/usr/bin/env python3
-
"""
Created by sarathkaul on 14/11/19
+Updated by lawric1 on 24/11/20
-Basic authentication using an API password is deprecated and will soon no longer work.
-Visit https://developer.github.com/changes/2020-02-14-deprecating-password-auth
-for more information around suggested workarounds and removal dates.
-"""
+Authentication will be made via access token.
+To generate your personal access token visit https://github.com/settings/tokens.
+
+NOTE:
+Never hardcode any credential information in the code. Always use an environment
+file to store the private information and use the `os` module to get the information
+during runtime.
+Create a ".env" file in the root directory and write these two lines in that file
+with your token::
+
+#!/usr/bin/env bash
+export USER_TOKEN=""
+"""
+import os
+from typing import Any, Dict
import requests
-_GITHUB_API = "https://api.github.com/user"
+BASE_URL = "https://api.github.com"
+# https://docs.github.com/en/free-pro-team@latest/rest/reference/users#get-the-authenticated-user
+AUTHENTICATED_USER_ENDPOINT = BASE_URL + "/user"
-def fetch_github_info(auth_user: str, auth_pass: str) -> dict:
+# https://github.com/settings/tokens
+USER_TOKEN = os.environ.get("USER_TOKEN", "")
+
+
+def fetch_github_info(auth_token: str) -> Dict[Any, Any]:
"""
Fetch GitHub info of a user using the requests module
"""
- return requests.get(_GITHUB_API, auth=(auth_user, auth_pass)).json()
-
-
-if __name__ == "__main__":
- for key, value in fetch_github_info("<USER NAME>", "<PASSWORD>").items():
- print(f"{key}: {value}")
+ headers = {
+ "Authorization": f"token {auth_token}",
+ "Accept": "application/vnd.github.v3+json",
+ }
+ return requests.get(AUTHENTICATED_USER_ENDPOINT, headers=headers).json()
+
+
+if __name__ == "__main__": # pragma: no cover
+ if USER_TOKEN:
+ for key, value in fetch_github_info(USER_TOKEN).items():
+ print(f"{key}: {value}")
+ else:
+ raise ValueError("'USER_TOKEN' field cannot be empty.")
|
{"golden_diff": "diff --git a/web_programming/fetch_github_info.py b/web_programming/fetch_github_info.py\n--- a/web_programming/fetch_github_info.py\n+++ b/web_programming/fetch_github_info.py\n@@ -1,26 +1,50 @@\n #!/usr/bin/env python3\n-\n \"\"\"\n Created by sarathkaul on 14/11/19\n+Updated by lawric1 on 24/11/20\n \n-Basic authentication using an API password is deprecated and will soon no longer work.\n-Visit https://developer.github.com/changes/2020-02-14-deprecating-password-auth\n-for more information around suggested workarounds and removal dates.\n-\"\"\"\n+Authentication will be made via access token.\n+To generate your personal access token visit https://github.com/settings/tokens.\n+\n+NOTE:\n+Never hardcode any credential information in the code. Always use an environment\n+file to store the private information and use the `os` module to get the information\n+during runtime.\n \n+Create a \".env\" file in the root directory and write these two lines in that file\n+with your token::\n+\n+#!/usr/bin/env bash\n+export USER_TOKEN=\"\"\n+\"\"\"\n+import os\n+from typing import Any, Dict\n \n import requests\n \n-_GITHUB_API = \"https://api.github.com/user\"\n+BASE_URL = \"https://api.github.com\"\n \n+# https://docs.github.com/en/free-pro-team@latest/rest/reference/users#get-the-authenticated-user\n+AUTHENTICATED_USER_ENDPOINT = BASE_URL + \"/user\"\n \n-def fetch_github_info(auth_user: str, auth_pass: str) -> dict:\n+# https://github.com/settings/tokens\n+USER_TOKEN = os.environ.get(\"USER_TOKEN\", \"\")\n+\n+\n+def fetch_github_info(auth_token: str) -> Dict[Any, Any]:\n \"\"\"\n Fetch GitHub info of a user using the requests module\n \"\"\"\n- return requests.get(_GITHUB_API, auth=(auth_user, auth_pass)).json()\n-\n-\n-if __name__ == \"__main__\":\n- for key, value in fetch_github_info(\"<USER NAME>\", \"<PASSWORD>\").items():\n- print(f\"{key}: {value}\")\n+ headers = {\n+ \"Authorization\": f\"token {auth_token}\",\n+ \"Accept\": \"application/vnd.github.v3+json\",\n+ }\n+ return requests.get(AUTHENTICATED_USER_ENDPOINT, headers=headers).json()\n+\n+\n+if __name__ == \"__main__\": # pragma: no cover\n+ if USER_TOKEN:\n+ for key, value in fetch_github_info(USER_TOKEN).items():\n+ print(f\"{key}: {value}\")\n+ else:\n+ raise ValueError(\"'USER_TOKEN' field cannot be empty.\")\n", "issue": "web_programming/fetch_github_info.py err\n<pre>message: Bad credentials. The API can't be accessed using username/password authentication. Please create a personal access token to access this endpoint: http://github.com/settings/tokens\r\ndocumentation_url: https://docs.github.com/articles/creating-a-personal-access-token-for-the-command-line\r\n</pre>\n", "before_files": [{"content": "#!/usr/bin/env python3\n\n\"\"\"\nCreated by sarathkaul on 14/11/19\n\nBasic authentication using an API password is deprecated and will soon no longer work.\nVisit https://developer.github.com/changes/2020-02-14-deprecating-password-auth\nfor more information around suggested workarounds and removal dates.\n\"\"\"\n\n\nimport requests\n\n_GITHUB_API = \"https://api.github.com/user\"\n\n\ndef fetch_github_info(auth_user: str, auth_pass: str) -> dict:\n \"\"\"\n Fetch GitHub info of a user using the requests module\n \"\"\"\n return requests.get(_GITHUB_API, auth=(auth_user, auth_pass)).json()\n\n\nif __name__ == \"__main__\":\n for key, value in fetch_github_info(\"<USER NAME>\", \"<PASSWORD>\").items():\n print(f\"{key}: {value}\")\n", "path": "web_programming/fetch_github_info.py"}], "after_files": [{"content": "#!/usr/bin/env python3\n\"\"\"\nCreated by sarathkaul on 14/11/19\nUpdated by lawric1 on 24/11/20\n\nAuthentication will be made via access token.\nTo generate your personal access token visit https://github.com/settings/tokens.\n\nNOTE:\nNever hardcode any credential information in the code. Always use an environment\nfile to store the private information and use the `os` module to get the information\nduring runtime.\n\nCreate a \".env\" file in the root directory and write these two lines in that file\nwith your token::\n\n#!/usr/bin/env bash\nexport USER_TOKEN=\"\"\n\"\"\"\nimport os\nfrom typing import Any, Dict\n\nimport requests\n\nBASE_URL = \"https://api.github.com\"\n\n# https://docs.github.com/en/free-pro-team@latest/rest/reference/users#get-the-authenticated-user\nAUTHENTICATED_USER_ENDPOINT = BASE_URL + \"/user\"\n\n# https://github.com/settings/tokens\nUSER_TOKEN = os.environ.get(\"USER_TOKEN\", \"\")\n\n\ndef fetch_github_info(auth_token: str) -> Dict[Any, Any]:\n \"\"\"\n Fetch GitHub info of a user using the requests module\n \"\"\"\n headers = {\n \"Authorization\": f\"token {auth_token}\",\n \"Accept\": \"application/vnd.github.v3+json\",\n }\n return requests.get(AUTHENTICATED_USER_ENDPOINT, headers=headers).json()\n\n\nif __name__ == \"__main__\": # pragma: no cover\n if USER_TOKEN:\n for key, value in fetch_github_info(USER_TOKEN).items():\n print(f\"{key}: {value}\")\n else:\n raise ValueError(\"'USER_TOKEN' field cannot be empty.\")\n", "path": "web_programming/fetch_github_info.py"}]}
| 574 | 597 |
gh_patches_debug_14459
|
rasdani/github-patches
|
git_diff
|
sql-machine-learning__elasticdl-268
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[clean up] Should check tf is running eagerly early
so the program can fail early if tf is configured wrong.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `elasticdl/worker/worker.py`
Content:
```
1 import tensorflow as tf
2 from tensorflow.python.ops import math_ops
3 from proto import master_pb2
4 from proto import master_pb2_grpc
5
6
7 class Worker(object):
8 """ElasticDL worker"""
9
10 def __init__(self, model_cls, input_fn, opt_fn, channel=None):
11 """
12 Arguments:
13 model_cls: A class to define the model, which contains funcs
14 get_keras_model: return the keras model defined in the class, with a tf dataset as its input
15 output(data): get model ouput from data as input, either a single output of a dict of outputs
16 loss(output, data): get model loss from output and data as input
17 input_fn: a func to to get a dataset, which can be used as the keras model input
18 dataset = input_fn(dict_of_params)
19 dict_of_params from GetTask for DistributedTrain, from kwargs for LocalTrain
20 opt_fn: a func to get the optimizer
21 channel: grpc channel
22 """
23
24 self._model_cls = model_cls()
25 self._keras_model = self._model_cls.get_keras_model()
26 self._input_fn = input_fn
27 self._opt_fn = opt_fn
28 if channel is None:
29 self._stub = None
30 else:
31 self._stub = master_pb2_grpc.MasterStub(channel)
32 self._model_version = -1
33
34 def get_task(self):
35 # TODO: get task from master
36 pass
37
38 def get_model(self):
39 # TODO: get model from master
40 pass
41
42 def report_task_result(self):
43 # TODO: report task result to master
44 pass
45
46 def report_gradient(self):
47 # TODO: report gradient to ps
48 pass
49
50 def distributed_train(self):
51 # TODO: distributed training
52 pass
53
54 def local_train(self, batch_size, epoch=1, kwargs=None):
55 """
56 Local training for local testing. Must in eager mode.
57 Argments:
58 batch_size: batch size in training
59 epoch: the number of epoch in training
60 kwargs: contains a dict of parameters used in training
61 """
62
63 if not tf.executing_eagerly():
64 raise ValueError('Eager mode is required for LocalTrain')
65
66 dataset = self._input_fn(kwargs)
67 dataset = dataset.repeat(epoch).batch(batch_size)
68 optimizer = self._opt_fn()
69
70 for data in dataset:
71 with tf.GradientTape() as tape:
72 output = self._model_cls.output(data)
73 loss = self._model_cls.loss(output, data)
74 # Add regularization loss if any.
75 # Note: for distributed training, the regularization loss should
76 # be divided by the number of contributing workers, which
77 # might be difficult for elasticdl.
78 if self._keras_model.losses:
79 loss += math_ops.add_n(self._keras_model.losses)
80 grads = tape.gradient(loss, self._keras_model.variables)
81 optimizer.apply_gradients(zip(grads, self._keras_model.variables))
82 print('Loss is ', loss.numpy())
83 pass
84
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/elasticdl/worker/worker.py b/elasticdl/worker/worker.py
--- a/elasticdl/worker/worker.py
+++ b/elasticdl/worker/worker.py
@@ -1,4 +1,6 @@
import tensorflow as tf
+assert tf.executing_eagerly()
+
from tensorflow.python.ops import math_ops
from proto import master_pb2
from proto import master_pb2_grpc
@@ -60,9 +62,6 @@
kwargs: contains a dict of parameters used in training
"""
- if not tf.executing_eagerly():
- raise ValueError('Eager mode is required for LocalTrain')
-
dataset = self._input_fn(kwargs)
dataset = dataset.repeat(epoch).batch(batch_size)
optimizer = self._opt_fn()
|
{"golden_diff": "diff --git a/elasticdl/worker/worker.py b/elasticdl/worker/worker.py\n--- a/elasticdl/worker/worker.py\n+++ b/elasticdl/worker/worker.py\n@@ -1,4 +1,6 @@\n import tensorflow as tf\n+assert tf.executing_eagerly()\n+\n from tensorflow.python.ops import math_ops\n from proto import master_pb2\n from proto import master_pb2_grpc\n@@ -60,9 +62,6 @@\n kwargs: contains a dict of parameters used in training\n \"\"\"\n \n- if not tf.executing_eagerly():\n- raise ValueError('Eager mode is required for LocalTrain')\n-\n dataset = self._input_fn(kwargs)\n dataset = dataset.repeat(epoch).batch(batch_size)\n optimizer = self._opt_fn()\n", "issue": "[clean up] Should check tf is running eagerly early\nso the program can fail early if tf is configured wrong.\n", "before_files": [{"content": "import tensorflow as tf\nfrom tensorflow.python.ops import math_ops\nfrom proto import master_pb2\nfrom proto import master_pb2_grpc\n\n\nclass Worker(object):\n \"\"\"ElasticDL worker\"\"\"\n\n def __init__(self, model_cls, input_fn, opt_fn, channel=None):\n \"\"\"\n Arguments:\n model_cls: A class to define the model, which contains funcs\n get_keras_model: return the keras model defined in the class, with a tf dataset as its input\n output(data): get model ouput from data as input, either a single output of a dict of outputs\n loss(output, data): get model loss from output and data as input\n input_fn: a func to to get a dataset, which can be used as the keras model input\n dataset = input_fn(dict_of_params)\n dict_of_params from GetTask for DistributedTrain, from kwargs for LocalTrain\n opt_fn: a func to get the optimizer \n channel: grpc channel\n \"\"\"\n\n self._model_cls = model_cls()\n self._keras_model = self._model_cls.get_keras_model()\n self._input_fn = input_fn\n self._opt_fn = opt_fn\n if channel is None:\n self._stub = None\n else:\n self._stub = master_pb2_grpc.MasterStub(channel)\n self._model_version = -1\n\n def get_task(self):\n # TODO: get task from master\n pass\n\n def get_model(self):\n # TODO: get model from master\n pass\n\n def report_task_result(self):\n # TODO: report task result to master\n pass\n\n def report_gradient(self):\n # TODO: report gradient to ps\n pass\n\n def distributed_train(self):\n # TODO: distributed training\n pass\n\n def local_train(self, batch_size, epoch=1, kwargs=None):\n \"\"\"\n Local training for local testing. Must in eager mode.\n Argments:\n batch_size: batch size in training\n epoch: the number of epoch in training\n kwargs: contains a dict of parameters used in training\n \"\"\"\n\n if not tf.executing_eagerly():\n raise ValueError('Eager mode is required for LocalTrain')\n\n dataset = self._input_fn(kwargs)\n dataset = dataset.repeat(epoch).batch(batch_size)\n optimizer = self._opt_fn()\n\n for data in dataset:\n with tf.GradientTape() as tape:\n output = self._model_cls.output(data)\n loss = self._model_cls.loss(output, data)\n # Add regularization loss if any.\n # Note: for distributed training, the regularization loss should\n # be divided by the number of contributing workers, which\n # might be difficult for elasticdl.\n if self._keras_model.losses:\n loss += math_ops.add_n(self._keras_model.losses)\n grads = tape.gradient(loss, self._keras_model.variables)\n optimizer.apply_gradients(zip(grads, self._keras_model.variables))\n print('Loss is ', loss.numpy())\n pass\n", "path": "elasticdl/worker/worker.py"}], "after_files": [{"content": "import tensorflow as tf\nassert tf.executing_eagerly()\n\nfrom tensorflow.python.ops import math_ops\nfrom proto import master_pb2\nfrom proto import master_pb2_grpc\n\n\nclass Worker(object):\n \"\"\"ElasticDL worker\"\"\"\n\n def __init__(self, model_cls, input_fn, opt_fn, channel=None):\n \"\"\"\n Arguments:\n model_cls: A class to define the model, which contains funcs\n get_keras_model: return the keras model defined in the class, with a tf dataset as its input\n output(data): get model ouput from data as input, either a single output of a dict of outputs\n loss(output, data): get model loss from output and data as input\n input_fn: a func to to get a dataset, which can be used as the keras model input\n dataset = input_fn(dict_of_params)\n dict_of_params from GetTask for DistributedTrain, from kwargs for LocalTrain\n opt_fn: a func to get the optimizer \n channel: grpc channel\n \"\"\"\n\n self._model_cls = model_cls()\n self._keras_model = self._model_cls.get_keras_model()\n self._input_fn = input_fn\n self._opt_fn = opt_fn\n if channel is None:\n self._stub = None\n else:\n self._stub = master_pb2_grpc.MasterStub(channel)\n self._model_version = -1\n\n def get_task(self):\n # TODO: get task from master\n pass\n\n def get_model(self):\n # TODO: get model from master\n pass\n\n def report_task_result(self):\n # TODO: report task result to master\n pass\n\n def report_gradient(self):\n # TODO: report gradient to ps\n pass\n\n def distributed_train(self):\n # TODO: distributed training\n pass\n\n def local_train(self, batch_size, epoch=1, kwargs=None):\n \"\"\"\n Local training for local testing. Must in eager mode.\n Argments:\n batch_size: batch size in training\n epoch: the number of epoch in training\n kwargs: contains a dict of parameters used in training\n \"\"\"\n\n dataset = self._input_fn(kwargs)\n dataset = dataset.repeat(epoch).batch(batch_size)\n optimizer = self._opt_fn()\n\n for data in dataset:\n with tf.GradientTape() as tape:\n output = self._model_cls.output(data)\n loss = self._model_cls.loss(output, data)\n # Add regularization loss if any.\n # Note: for distributed training, the regularization loss should\n # be divided by the number of contributing workers, which\n # might be difficult for elasticdl.\n if self._keras_model.losses:\n loss += math_ops.add_n(self._keras_model.losses)\n grads = tape.gradient(loss, self._keras_model.variables)\n optimizer.apply_gradients(zip(grads, self._keras_model.variables))\n print('Loss is ', loss.numpy())\n pass\n", "path": "elasticdl/worker/worker.py"}]}
| 1,104 | 174 |
gh_patches_debug_22781
|
rasdani/github-patches
|
git_diff
|
microsoft__lisa-491
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
VERIFY-BOOT-ERROR-WARNINGS.py: fix test case
Test case is not running properly on redhat. Run the script directly, not through lisav2 to observe this better.
Release: 7.7 Snapshot 5 (most likely 7.6 should also have this behavior)
Issues to be fixed:
- ignore if syslog is not present (that exists only on ubuntu/debian).
[root@rhel77gen2 Linux]# python VERIFY-BOOT-ERROR-WARNINGS.py
2.7.5 (default, Jun 11 2019, 14:33:56)
[GCC 4.8.5 20150623 (Red Hat 4.8.5-39)]
grep: /var/log/syslog: No such file or directory
- fix grep search
[root@rhel77gen2 Linux]# grep -nw '/var/log/messages' -e 'mlx5'
[root@rhel77gen2 Linux]# grep /var/log/messages -e mlx5
Jul 29 10:02:27 rhel77gen2 kernel: mlx5_core ae35:00:02.0: firmware version: 14.23.1020
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `Testscripts/Linux/VERIFY-BOOT-ERROR-WARNINGS.py`
Content:
```
1 #!/usr/bin/python
2 # Copyright (c) Microsoft Corporation. All rights reserved.
3 # Licensed under the Apache License.
4 from azuremodules import *
5
6 white_list_xml = "ignorable-boot-errors.xml"
7
8
9 def RunTest():
10 UpdateState("TestRunning")
11 RunLog.info("Checking for ERROR and WARNING messages in system logs.")
12 errors = Run("grep -nw '/var/log/syslog' -e 'error' --ignore-case && grep -nw '/var/log/messages' -e 'error' --ignore-case")
13 warnings = Run("grep -nw '/var/log/syslog' -e 'warning' --ignore-case && grep -nw '/var/log/messages' -e 'warning' --ignore-case")
14 failures = Run("grep -nw '/var/log/syslog' -e 'fail' --ignore-case && grep -nw '/var/log/messages' -e 'fail' --ignore-case")
15 if (not errors and not warnings and not failures):
16 RunLog.info('Could not find ERROR/WARNING/FAILURE messages in syslog/messages log file.')
17 ResultLog.info('PASS')
18 else:
19 if white_list_xml and os.path.isfile(white_list_xml):
20 try:
21 import xml.etree.cElementTree as ET
22 except ImportError:
23 import xml.etree.ElementTree as ET
24
25 white_list_file = ET.parse(white_list_xml)
26 xml_root = white_list_file.getroot()
27
28 RunLog.info('Checking ignorable boot ERROR/WARNING/FAILURE messages...')
29 for node in xml_root:
30 if (failures and node.tag == "failures"):
31 failures = RemoveIgnorableMessages(failures, node)
32 if (errors and node.tag == "errors"):
33 errors = RemoveIgnorableMessages(errors, node)
34 if (warnings and node.tag == "warnings"):
35 warnings = RemoveIgnorableMessages(warnings, node)
36
37 if (errors or warnings or failures):
38 RunLog.error('Found ERROR/WARNING/FAILURE messages in logs.')
39 if(errors):
40 SplitLog('Errors', errors)
41 if(warnings):
42 SplitLog('warnings', warnings)
43 if(failures):
44 SplitLog('failures', failures)
45 ResultLog.error('FAIL')
46 else:
47 ResultLog.info('PASS')
48 UpdateState("TestCompleted")
49
50
51 def SplitLog(logType, logValues):
52 for logEntry in logValues:
53 RunLog.info(logType + ': ' + logEntry)
54
55
56 def RemoveIgnorableMessages(messages, keywords_xml_node):
57 message_list = messages.strip().split('\n')
58 valid_list = []
59 for msg in message_list:
60 for keywords in keywords_xml_node:
61 if keywords.text in msg:
62 RunLog.info('Ignorable ERROR/WARNING/FAILURE message: ' + msg)
63 break
64 else:
65 valid_list.append(msg)
66 if len(valid_list) > 0:
67 return valid_list
68 else:
69 return None
70
71 RunTest()
72
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/Testscripts/Linux/VERIFY-BOOT-ERROR-WARNINGS.py b/Testscripts/Linux/VERIFY-BOOT-ERROR-WARNINGS.py
--- a/Testscripts/Linux/VERIFY-BOOT-ERROR-WARNINGS.py
+++ b/Testscripts/Linux/VERIFY-BOOT-ERROR-WARNINGS.py
@@ -9,9 +9,9 @@
def RunTest():
UpdateState("TestRunning")
RunLog.info("Checking for ERROR and WARNING messages in system logs.")
- errors = Run("grep -nw '/var/log/syslog' -e 'error' --ignore-case && grep -nw '/var/log/messages' -e 'error' --ignore-case")
- warnings = Run("grep -nw '/var/log/syslog' -e 'warning' --ignore-case && grep -nw '/var/log/messages' -e 'warning' --ignore-case")
- failures = Run("grep -nw '/var/log/syslog' -e 'fail' --ignore-case && grep -nw '/var/log/messages' -e 'fail' --ignore-case")
+ errors = Run("grep -nw '/var/log/syslog' -e 'error' --ignore-case --no-message && grep /var/log/messages -e error --ignore-case --no-message")
+ warnings = Run("grep -nw '/var/log/syslog' -e 'warning' --ignore-case --no-message && grep /var/log/messages -e warning --ignore-case --no-message")
+ failures = Run("grep -nw '/var/log/syslog' -e 'fail' --ignore-case --no-message && grep /var/log/messages -e fail --ignore-case --no-message")
if (not errors and not warnings and not failures):
RunLog.info('Could not find ERROR/WARNING/FAILURE messages in syslog/messages log file.')
ResultLog.info('PASS')
|
{"golden_diff": "diff --git a/Testscripts/Linux/VERIFY-BOOT-ERROR-WARNINGS.py b/Testscripts/Linux/VERIFY-BOOT-ERROR-WARNINGS.py\n--- a/Testscripts/Linux/VERIFY-BOOT-ERROR-WARNINGS.py\n+++ b/Testscripts/Linux/VERIFY-BOOT-ERROR-WARNINGS.py\n@@ -9,9 +9,9 @@\n def RunTest():\n UpdateState(\"TestRunning\")\n RunLog.info(\"Checking for ERROR and WARNING messages in system logs.\")\n- errors = Run(\"grep -nw '/var/log/syslog' -e 'error' --ignore-case && grep -nw '/var/log/messages' -e 'error' --ignore-case\")\n- warnings = Run(\"grep -nw '/var/log/syslog' -e 'warning' --ignore-case && grep -nw '/var/log/messages' -e 'warning' --ignore-case\")\n- failures = Run(\"grep -nw '/var/log/syslog' -e 'fail' --ignore-case && grep -nw '/var/log/messages' -e 'fail' --ignore-case\")\n+ errors = Run(\"grep -nw '/var/log/syslog' -e 'error' --ignore-case --no-message && grep /var/log/messages -e error --ignore-case --no-message\")\n+ warnings = Run(\"grep -nw '/var/log/syslog' -e 'warning' --ignore-case --no-message && grep /var/log/messages -e warning --ignore-case --no-message\")\n+ failures = Run(\"grep -nw '/var/log/syslog' -e 'fail' --ignore-case --no-message && grep /var/log/messages -e fail --ignore-case --no-message\")\n if (not errors and not warnings and not failures):\n RunLog.info('Could not find ERROR/WARNING/FAILURE messages in syslog/messages log file.')\n ResultLog.info('PASS')\n", "issue": "VERIFY-BOOT-ERROR-WARNINGS.py: fix test case\nTest case is not running properly on redhat. Run the script directly, not through lisav2 to observe this better.\r\n\r\nRelease: 7.7 Snapshot 5 (most likely 7.6 should also have this behavior)\r\n\r\nIssues to be fixed:\r\n- ignore if syslog is not present (that exists only on ubuntu/debian).\r\n[root@rhel77gen2 Linux]# python VERIFY-BOOT-ERROR-WARNINGS.py\r\n2.7.5 (default, Jun 11 2019, 14:33:56)\r\n[GCC 4.8.5 20150623 (Red Hat 4.8.5-39)]\r\ngrep: /var/log/syslog: No such file or directory\r\n\r\n- fix grep search\r\n[root@rhel77gen2 Linux]# grep -nw '/var/log/messages' -e 'mlx5'\r\n[root@rhel77gen2 Linux]# grep /var/log/messages -e mlx5\r\nJul 29 10:02:27 rhel77gen2 kernel: mlx5_core ae35:00:02.0: firmware version: 14.23.1020\r\n\n", "before_files": [{"content": "#!/usr/bin/python\n# Copyright (c) Microsoft Corporation. All rights reserved.\n# Licensed under the Apache License.\nfrom azuremodules import *\n\nwhite_list_xml = \"ignorable-boot-errors.xml\"\n\n\ndef RunTest():\n UpdateState(\"TestRunning\")\n RunLog.info(\"Checking for ERROR and WARNING messages in system logs.\")\n errors = Run(\"grep -nw '/var/log/syslog' -e 'error' --ignore-case && grep -nw '/var/log/messages' -e 'error' --ignore-case\")\n warnings = Run(\"grep -nw '/var/log/syslog' -e 'warning' --ignore-case && grep -nw '/var/log/messages' -e 'warning' --ignore-case\")\n failures = Run(\"grep -nw '/var/log/syslog' -e 'fail' --ignore-case && grep -nw '/var/log/messages' -e 'fail' --ignore-case\")\n if (not errors and not warnings and not failures):\n RunLog.info('Could not find ERROR/WARNING/FAILURE messages in syslog/messages log file.')\n ResultLog.info('PASS')\n else:\n if white_list_xml and os.path.isfile(white_list_xml):\n try:\n import xml.etree.cElementTree as ET\n except ImportError:\n import xml.etree.ElementTree as ET\n\n white_list_file = ET.parse(white_list_xml)\n xml_root = white_list_file.getroot()\n\n RunLog.info('Checking ignorable boot ERROR/WARNING/FAILURE messages...')\n for node in xml_root:\n if (failures and node.tag == \"failures\"):\n failures = RemoveIgnorableMessages(failures, node)\n if (errors and node.tag == \"errors\"):\n errors = RemoveIgnorableMessages(errors, node)\n if (warnings and node.tag == \"warnings\"):\n warnings = RemoveIgnorableMessages(warnings, node)\n\n if (errors or warnings or failures):\n RunLog.error('Found ERROR/WARNING/FAILURE messages in logs.')\n if(errors):\n SplitLog('Errors', errors)\n if(warnings):\n SplitLog('warnings', warnings)\n if(failures):\n SplitLog('failures', failures)\n ResultLog.error('FAIL')\n else:\n ResultLog.info('PASS')\n UpdateState(\"TestCompleted\")\n\n\ndef SplitLog(logType, logValues):\n for logEntry in logValues:\n RunLog.info(logType + ': ' + logEntry)\n\n\ndef RemoveIgnorableMessages(messages, keywords_xml_node):\n message_list = messages.strip().split('\\n')\n valid_list = []\n for msg in message_list:\n for keywords in keywords_xml_node:\n if keywords.text in msg:\n RunLog.info('Ignorable ERROR/WARNING/FAILURE message: ' + msg)\n break\n else:\n valid_list.append(msg)\n if len(valid_list) > 0:\n return valid_list\n else:\n return None\n\nRunTest()\n", "path": "Testscripts/Linux/VERIFY-BOOT-ERROR-WARNINGS.py"}], "after_files": [{"content": "#!/usr/bin/python\n# Copyright (c) Microsoft Corporation. All rights reserved.\n# Licensed under the Apache License.\nfrom azuremodules import *\n\nwhite_list_xml = \"ignorable-boot-errors.xml\"\n\n\ndef RunTest():\n UpdateState(\"TestRunning\")\n RunLog.info(\"Checking for ERROR and WARNING messages in system logs.\")\n errors = Run(\"grep -nw '/var/log/syslog' -e 'error' --ignore-case --no-message && grep /var/log/messages -e error --ignore-case --no-message\")\n warnings = Run(\"grep -nw '/var/log/syslog' -e 'warning' --ignore-case --no-message && grep /var/log/messages -e warning --ignore-case --no-message\")\n failures = Run(\"grep -nw '/var/log/syslog' -e 'fail' --ignore-case --no-message && grep /var/log/messages -e fail --ignore-case --no-message\")\n if (not errors and not warnings and not failures):\n RunLog.info('Could not find ERROR/WARNING/FAILURE messages in syslog/messages log file.')\n ResultLog.info('PASS')\n else:\n if white_list_xml and os.path.isfile(white_list_xml):\n try:\n import xml.etree.cElementTree as ET\n except ImportError:\n import xml.etree.ElementTree as ET\n\n white_list_file = ET.parse(white_list_xml)\n xml_root = white_list_file.getroot()\n\n RunLog.info('Checking ignorable boot ERROR/WARNING/FAILURE messages...')\n for node in xml_root:\n if (failures and node.tag == \"failures\"):\n failures = RemoveIgnorableMessages(failures, node)\n if (errors and node.tag == \"errors\"):\n errors = RemoveIgnorableMessages(errors, node)\n if (warnings and node.tag == \"warnings\"):\n warnings = RemoveIgnorableMessages(warnings, node)\n\n if (errors or warnings or failures):\n RunLog.error('Found ERROR/WARNING/FAILURE messages in logs.')\n if(errors):\n SplitLog('Errors', errors)\n if(warnings):\n SplitLog('warnings', warnings)\n if(failures):\n SplitLog('failures', failures)\n ResultLog.error('FAIL')\n else:\n ResultLog.info('PASS')\n UpdateState(\"TestCompleted\")\n\n\ndef SplitLog(logType, logValues):\n for logEntry in logValues:\n RunLog.info(logType + ': ' + logEntry)\n\n\ndef RemoveIgnorableMessages(messages, keywords_xml_node):\n message_list = messages.strip().split('\\n')\n valid_list = []\n for msg in message_list:\n for keywords in keywords_xml_node:\n if keywords.text in msg:\n RunLog.info('Ignorable ERROR/WARNING/FAILURE message: ' + msg)\n break\n else:\n valid_list.append(msg)\n if len(valid_list) > 0:\n return valid_list\n else:\n return None\n\nRunTest()\n", "path": "Testscripts/Linux/VERIFY-BOOT-ERROR-WARNINGS.py"}]}
| 1,302 | 388 |
gh_patches_debug_23467
|
rasdani/github-patches
|
git_diff
|
saleor__saleor-809
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Calling get_price_range() on an empty item range
On the product detail page backend doesn't check existence of product variants.
If the product requires variants, but they don't exist, Django throws an exception `Calling get_price_range() on an empty item range`.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `saleor/product/utils.py`
Content:
```
1 from collections import namedtuple
2
3 from ..cart.utils import get_cart_from_request, get_or_create_cart_from_request
4 from ..core.utils import to_local_currency
5 from .forms import get_form_class_for_product
6 from .models.utils import get_attributes_display_map
7 from .models import Product
8
9
10 def products_visible_to_user(user):
11 if (user.is_authenticated() and
12 user.is_active and user.is_staff):
13 return Product.objects.all()
14 else:
15 return Product.objects.get_available_products()
16
17
18 def products_with_details(user):
19 products = products_visible_to_user(user)
20 products = products.prefetch_related('categories', 'images',
21 'variants__stock',
22 'variants__variant_images__image',
23 'attributes__values',
24 'product_class__variant_attributes__values',
25 'product_class__product_attributes__values')
26 return products
27
28
29 def get_product_images(product):
30 """
31 Returns list of product images that will be placed in product gallery
32 """
33 return list(product.images.all())
34
35
36 def products_with_availability(products, discounts, local_currency):
37 for product in products:
38 yield product, get_availability(product, discounts, local_currency)
39
40
41 ProductAvailability = namedtuple(
42 'ProductAvailability', (
43 'available', 'price_range', 'price_range_undiscounted', 'discount',
44 'price_range_local_currency', 'discount_local_currency'))
45
46
47 def get_availability(product, discounts=None, local_currency=None):
48 # In default currency
49 price_range = product.get_price_range(discounts=discounts)
50 undiscounted = product.get_price_range()
51 if undiscounted.min_price > price_range.min_price:
52 discount = undiscounted.min_price - price_range.min_price
53 else:
54 discount = None
55
56 # Local currency
57 if local_currency:
58 price_range_local = to_local_currency(
59 price_range, local_currency)
60 undiscounted_local = to_local_currency(
61 undiscounted, local_currency)
62 if (undiscounted_local and
63 undiscounted_local.min_price > price_range_local.min_price):
64 discount_local_currency = (
65 undiscounted_local.min_price - price_range_local.min_price)
66 else:
67 discount_local_currency = None
68 else:
69 price_range_local = None
70 discount_local_currency = None
71
72 is_available = product.is_in_stock() and product.is_available()
73
74 return ProductAvailability(
75 available=is_available,
76 price_range=price_range,
77 price_range_undiscounted=undiscounted,
78 discount=discount,
79 price_range_local_currency=price_range_local,
80 discount_local_currency=discount_local_currency)
81
82
83 def handle_cart_form(request, product, create_cart=False):
84 if create_cart:
85 cart = get_or_create_cart_from_request(request)
86 else:
87 cart = get_cart_from_request(request)
88
89 form_class = get_form_class_for_product(product)
90 form = form_class(cart=cart, product=product,
91 data=request.POST or None, discounts=request.discounts)
92 return form, cart
93
94
95 def products_for_cart(user):
96 products = products_visible_to_user(user)
97 products = products.prefetch_related(
98 'variants', 'variants__variant_images__image')
99 return products
100
101
102 def get_variant_picker_data(product, discounts=None):
103 availability = get_availability(product, discounts)
104 variants = product.variants.all()
105 data = {'variantAttributes': [], 'variants': []}
106
107 variant_attributes = product.product_class.variant_attributes.all()
108 for attribute in variant_attributes:
109 data['variantAttributes'].append({
110 'pk': attribute.pk,
111 'display': attribute.display,
112 'name': attribute.name,
113 'values': [{'pk': value.pk, 'display': value.display}
114 for value in attribute.values.all()]})
115
116 for variant in variants:
117 price = variant.get_price_per_item(discounts)
118 price_undiscounted = variant.get_price_per_item()
119 variant_data = {
120 'id': variant.id,
121 'price': price.gross,
122 'priceUndiscounted': price_undiscounted.gross,
123 'currency': price.currency,
124 'attributes': variant.attributes}
125 data['variants'].append(variant_data)
126
127 data['availability'] = {
128 'discount': price_as_dict(availability.discount),
129 'priceRange': price_range_as_dict(availability.price_range),
130 'priceRangeUndiscounted': price_range_as_dict(
131 availability.price_range_undiscounted)}
132 return data
133
134
135 def get_product_attributes_data(product):
136 attributes = product.product_class.product_attributes.all()
137 attributes_map = {attribute.pk: attribute for attribute in attributes}
138 values_map = get_attributes_display_map(product, attributes)
139 return {attributes_map.get(attr_pk): value_obj
140 for (attr_pk, value_obj) in values_map.items()}
141
142
143 def price_as_dict(price):
144 if not price:
145 return {}
146 return {'currency': price.currency,
147 'gross': price.gross,
148 'net': price.net}
149
150
151 def price_range_as_dict(price_range):
152 return {'maxPrice': price_as_dict(price_range.max_price),
153 'minPrice': price_as_dict(price_range.min_price)}
154
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/saleor/product/utils.py b/saleor/product/utils.py
--- a/saleor/product/utils.py
+++ b/saleor/product/utils.py
@@ -45,6 +45,15 @@
def get_availability(product, discounts=None, local_currency=None):
+ if not product.variants.exists():
+ return ProductAvailability(
+ available=False,
+ price_range=None,
+ price_range_undiscounted=None,
+ discount=None,
+ price_range_local_currency=None,
+ discount_local_currency=None)
+
# In default currency
price_range = product.get_price_range(discounts=discounts)
undiscounted = product.get_price_range()
@@ -142,12 +151,14 @@
def price_as_dict(price):
if not price:
- return {}
+ return None
return {'currency': price.currency,
'gross': price.gross,
'net': price.net}
def price_range_as_dict(price_range):
+ if not price_range:
+ return None
return {'maxPrice': price_as_dict(price_range.max_price),
'minPrice': price_as_dict(price_range.min_price)}
|
{"golden_diff": "diff --git a/saleor/product/utils.py b/saleor/product/utils.py\n--- a/saleor/product/utils.py\n+++ b/saleor/product/utils.py\n@@ -45,6 +45,15 @@\n \n \n def get_availability(product, discounts=None, local_currency=None):\n+ if not product.variants.exists():\n+ return ProductAvailability(\n+ available=False,\n+ price_range=None,\n+ price_range_undiscounted=None,\n+ discount=None,\n+ price_range_local_currency=None,\n+ discount_local_currency=None)\n+\n # In default currency\n price_range = product.get_price_range(discounts=discounts)\n undiscounted = product.get_price_range()\n@@ -142,12 +151,14 @@\n \n def price_as_dict(price):\n if not price:\n- return {}\n+ return None\n return {'currency': price.currency,\n 'gross': price.gross,\n 'net': price.net}\n \n \n def price_range_as_dict(price_range):\n+ if not price_range:\n+ return None\n return {'maxPrice': price_as_dict(price_range.max_price),\n 'minPrice': price_as_dict(price_range.min_price)}\n", "issue": "Calling get_price_range() on an empty item range\nOn the product detail page backend doesn't check existence of product variants.\r\nIf the product requires variants, but they don't exist, Django throws an exception `Calling get_price_range() on an empty item range`.\n", "before_files": [{"content": "from collections import namedtuple\n\nfrom ..cart.utils import get_cart_from_request, get_or_create_cart_from_request\nfrom ..core.utils import to_local_currency\nfrom .forms import get_form_class_for_product\nfrom .models.utils import get_attributes_display_map\nfrom .models import Product\n\n\ndef products_visible_to_user(user):\n if (user.is_authenticated() and\n user.is_active and user.is_staff):\n return Product.objects.all()\n else:\n return Product.objects.get_available_products()\n\n\ndef products_with_details(user):\n products = products_visible_to_user(user)\n products = products.prefetch_related('categories', 'images',\n 'variants__stock',\n 'variants__variant_images__image',\n 'attributes__values',\n 'product_class__variant_attributes__values',\n 'product_class__product_attributes__values')\n return products\n\n\ndef get_product_images(product):\n \"\"\"\n Returns list of product images that will be placed in product gallery\n \"\"\"\n return list(product.images.all())\n\n\ndef products_with_availability(products, discounts, local_currency):\n for product in products:\n yield product, get_availability(product, discounts, local_currency)\n\n\nProductAvailability = namedtuple(\n 'ProductAvailability', (\n 'available', 'price_range', 'price_range_undiscounted', 'discount',\n 'price_range_local_currency', 'discount_local_currency'))\n\n\ndef get_availability(product, discounts=None, local_currency=None):\n # In default currency\n price_range = product.get_price_range(discounts=discounts)\n undiscounted = product.get_price_range()\n if undiscounted.min_price > price_range.min_price:\n discount = undiscounted.min_price - price_range.min_price\n else:\n discount = None\n\n # Local currency\n if local_currency:\n price_range_local = to_local_currency(\n price_range, local_currency)\n undiscounted_local = to_local_currency(\n undiscounted, local_currency)\n if (undiscounted_local and\n undiscounted_local.min_price > price_range_local.min_price):\n discount_local_currency = (\n undiscounted_local.min_price - price_range_local.min_price)\n else:\n discount_local_currency = None\n else:\n price_range_local = None\n discount_local_currency = None\n\n is_available = product.is_in_stock() and product.is_available()\n\n return ProductAvailability(\n available=is_available,\n price_range=price_range,\n price_range_undiscounted=undiscounted,\n discount=discount,\n price_range_local_currency=price_range_local,\n discount_local_currency=discount_local_currency)\n\n\ndef handle_cart_form(request, product, create_cart=False):\n if create_cart:\n cart = get_or_create_cart_from_request(request)\n else:\n cart = get_cart_from_request(request)\n\n form_class = get_form_class_for_product(product)\n form = form_class(cart=cart, product=product,\n data=request.POST or None, discounts=request.discounts)\n return form, cart\n\n\ndef products_for_cart(user):\n products = products_visible_to_user(user)\n products = products.prefetch_related(\n 'variants', 'variants__variant_images__image')\n return products\n\n\ndef get_variant_picker_data(product, discounts=None):\n availability = get_availability(product, discounts)\n variants = product.variants.all()\n data = {'variantAttributes': [], 'variants': []}\n\n variant_attributes = product.product_class.variant_attributes.all()\n for attribute in variant_attributes:\n data['variantAttributes'].append({\n 'pk': attribute.pk,\n 'display': attribute.display,\n 'name': attribute.name,\n 'values': [{'pk': value.pk, 'display': value.display}\n for value in attribute.values.all()]})\n\n for variant in variants:\n price = variant.get_price_per_item(discounts)\n price_undiscounted = variant.get_price_per_item()\n variant_data = {\n 'id': variant.id,\n 'price': price.gross,\n 'priceUndiscounted': price_undiscounted.gross,\n 'currency': price.currency,\n 'attributes': variant.attributes}\n data['variants'].append(variant_data)\n\n data['availability'] = {\n 'discount': price_as_dict(availability.discount),\n 'priceRange': price_range_as_dict(availability.price_range),\n 'priceRangeUndiscounted': price_range_as_dict(\n availability.price_range_undiscounted)}\n return data\n\n\ndef get_product_attributes_data(product):\n attributes = product.product_class.product_attributes.all()\n attributes_map = {attribute.pk: attribute for attribute in attributes}\n values_map = get_attributes_display_map(product, attributes)\n return {attributes_map.get(attr_pk): value_obj\n for (attr_pk, value_obj) in values_map.items()}\n\n\ndef price_as_dict(price):\n if not price:\n return {}\n return {'currency': price.currency,\n 'gross': price.gross,\n 'net': price.net}\n\n\ndef price_range_as_dict(price_range):\n return {'maxPrice': price_as_dict(price_range.max_price),\n 'minPrice': price_as_dict(price_range.min_price)}\n", "path": "saleor/product/utils.py"}], "after_files": [{"content": "from collections import namedtuple\n\nfrom ..cart.utils import get_cart_from_request, get_or_create_cart_from_request\nfrom ..core.utils import to_local_currency\nfrom .forms import get_form_class_for_product\nfrom .models.utils import get_attributes_display_map\nfrom .models import Product\n\n\ndef products_visible_to_user(user):\n if (user.is_authenticated() and\n user.is_active and user.is_staff):\n return Product.objects.all()\n else:\n return Product.objects.get_available_products()\n\n\ndef products_with_details(user):\n products = products_visible_to_user(user)\n products = products.prefetch_related('categories', 'images',\n 'variants__stock',\n 'variants__variant_images__image',\n 'attributes__values',\n 'product_class__variant_attributes__values',\n 'product_class__product_attributes__values')\n return products\n\n\ndef get_product_images(product):\n \"\"\"\n Returns list of product images that will be placed in product gallery\n \"\"\"\n return list(product.images.all())\n\n\ndef products_with_availability(products, discounts, local_currency):\n for product in products:\n yield product, get_availability(product, discounts, local_currency)\n\n\nProductAvailability = namedtuple(\n 'ProductAvailability', (\n 'available', 'price_range', 'price_range_undiscounted', 'discount',\n 'price_range_local_currency', 'discount_local_currency'))\n\n\ndef get_availability(product, discounts=None, local_currency=None):\n if not product.variants.exists():\n return ProductAvailability(\n available=False,\n price_range=None,\n price_range_undiscounted=None,\n discount=None,\n price_range_local_currency=None,\n discount_local_currency=None)\n\n # In default currency\n price_range = product.get_price_range(discounts=discounts)\n undiscounted = product.get_price_range()\n if undiscounted.min_price > price_range.min_price:\n discount = undiscounted.min_price - price_range.min_price\n else:\n discount = None\n\n # Local currency\n if local_currency:\n price_range_local = to_local_currency(\n price_range, local_currency)\n undiscounted_local = to_local_currency(\n undiscounted, local_currency)\n if (undiscounted_local and\n undiscounted_local.min_price > price_range_local.min_price):\n discount_local_currency = (\n undiscounted_local.min_price - price_range_local.min_price)\n else:\n discount_local_currency = None\n else:\n price_range_local = None\n discount_local_currency = None\n\n is_available = product.is_in_stock() and product.is_available()\n\n return ProductAvailability(\n available=is_available,\n price_range=price_range,\n price_range_undiscounted=undiscounted,\n discount=discount,\n price_range_local_currency=price_range_local,\n discount_local_currency=discount_local_currency)\n\n\ndef handle_cart_form(request, product, create_cart=False):\n if create_cart:\n cart = get_or_create_cart_from_request(request)\n else:\n cart = get_cart_from_request(request)\n\n form_class = get_form_class_for_product(product)\n form = form_class(cart=cart, product=product,\n data=request.POST or None, discounts=request.discounts)\n return form, cart\n\n\ndef products_for_cart(user):\n products = products_visible_to_user(user)\n products = products.prefetch_related(\n 'variants', 'variants__variant_images__image')\n return products\n\n\ndef get_variant_picker_data(product, discounts=None):\n availability = get_availability(product, discounts)\n variants = product.variants.all()\n data = {'variantAttributes': [], 'variants': []}\n\n variant_attributes = product.product_class.variant_attributes.all()\n for attribute in variant_attributes:\n data['variantAttributes'].append({\n 'pk': attribute.pk,\n 'display': attribute.display,\n 'name': attribute.name,\n 'values': [{'pk': value.pk, 'display': value.display}\n for value in attribute.values.all()]})\n\n for variant in variants:\n price = variant.get_price_per_item(discounts)\n price_undiscounted = variant.get_price_per_item()\n variant_data = {\n 'id': variant.id,\n 'price': price.gross,\n 'priceUndiscounted': price_undiscounted.gross,\n 'currency': price.currency,\n 'attributes': variant.attributes}\n data['variants'].append(variant_data)\n\n data['availability'] = {\n 'discount': price_as_dict(availability.discount),\n 'priceRange': price_range_as_dict(availability.price_range),\n 'priceRangeUndiscounted': price_range_as_dict(\n availability.price_range_undiscounted)}\n return data\n\n\ndef get_product_attributes_data(product):\n attributes = product.product_class.product_attributes.all()\n attributes_map = {attribute.pk: attribute for attribute in attributes}\n values_map = get_attributes_display_map(product, attributes)\n return {attributes_map.get(attr_pk): value_obj\n for (attr_pk, value_obj) in values_map.items()}\n\n\ndef price_as_dict(price):\n if not price:\n return None\n return {'currency': price.currency,\n 'gross': price.gross,\n 'net': price.net}\n\n\ndef price_range_as_dict(price_range):\n if not price_range:\n return None\n return {'maxPrice': price_as_dict(price_range.max_price),\n 'minPrice': price_as_dict(price_range.min_price)}\n", "path": "saleor/product/utils.py"}]}
| 1,765 | 259 |
gh_patches_debug_6281
|
rasdani/github-patches
|
git_diff
|
biolab__orange3-3109
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
owsql: Wrong auto-discovered DiscreteVariable values
##### Orange version
3.14.0 dev
##### Expected behavior
SQL Table correctly discovers categorical variables.
##### Actual behavior
`DiscreteVariable` values for char fields have additional with space on the end.
##### Steps to reproduce the behavior
SQL for Postgres:
``` sql
CREATE TABLE COMPANY(
ID INT PRIMARY KEY NOT NULL,
ADDRESS CHAR(50)
);
INSERT INTO COMPANY (ID,ADDRESS) VALUES (1, 'California');
```
In SQL Table select tabel company
##### Additional info (worksheets, data, screenshots, ...)
This issue is because on data query `::text` is used that removes the additional white space.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `Orange/data/sql/backend/postgres.py`
Content:
```
1 import logging
2 import re
3 import warnings
4 from contextlib import contextmanager
5 from time import time
6
7 from psycopg2 import Error, OperationalError
8 from psycopg2.pool import ThreadedConnectionPool
9
10 from Orange.data import ContinuousVariable, DiscreteVariable, StringVariable, TimeVariable
11 from Orange.data.sql.backend.base import Backend, ToSql, BackendError
12
13 log = logging.getLogger(__name__)
14
15 EXTENSIONS = ('tsm_system_time', 'quantile')
16
17
18 class Psycopg2Backend(Backend):
19 """Backend for accessing data stored in a Postgres database
20 """
21
22 display_name = "PostgreSQL"
23 connection_pool = None
24 auto_create_extensions = True
25
26 def __init__(self, connection_params):
27 super().__init__(connection_params)
28
29 if self.connection_pool is None:
30 self._create_connection_pool()
31
32 if self.auto_create_extensions:
33 self._create_extensions()
34
35 def _create_connection_pool(self):
36 try:
37 self.connection_pool = ThreadedConnectionPool(
38 1, 16, **self.connection_params)
39 except Error as ex:
40 raise BackendError(str(ex)) from ex
41
42 def _create_extensions(self):
43 for ext in EXTENSIONS:
44 try:
45 query = "CREATE EXTENSION IF NOT EXISTS {}".format(ext)
46 with self.execute_sql_query(query):
47 pass
48 except OperationalError:
49 warnings.warn("Database is missing extension {}".format(ext))
50
51 def create_sql_query(self, table_name, fields, filters=(),
52 group_by=None, order_by=None,
53 offset=None, limit=None,
54 use_time_sample=None):
55 sql = ["SELECT", ', '.join(fields),
56 "FROM", table_name]
57 if use_time_sample is not None:
58 sql.append("TABLESAMPLE system_time(%i)" % use_time_sample)
59 if filters:
60 sql.extend(["WHERE", " AND ".join(filters)])
61 if group_by is not None:
62 sql.extend(["GROUP BY", ", ".join(group_by)])
63 if order_by is not None:
64 sql.extend(["ORDER BY", ",".join(order_by)])
65 if offset is not None:
66 sql.extend(["OFFSET", str(offset)])
67 if limit is not None:
68 sql.extend(["LIMIT", str(limit)])
69 return " ".join(sql)
70
71 @contextmanager
72 def execute_sql_query(self, query, params=None):
73 connection = self.connection_pool.getconn()
74 cur = connection.cursor()
75 try:
76 utfquery = cur.mogrify(query, params).decode('utf-8')
77 log.debug("Executing: %s", utfquery)
78 t = time()
79 cur.execute(query, params)
80 yield cur
81 log.info("%.2f ms: %s", 1000 * (time() - t), utfquery)
82 except Error as ex:
83 raise BackendError(str(ex)) from ex
84 finally:
85 connection.commit()
86 self.connection_pool.putconn(connection)
87
88 def quote_identifier(self, name):
89 return '"%s"' % name
90
91 def unquote_identifier(self, quoted_name):
92 if quoted_name.startswith('"'):
93 return quoted_name[1:len(quoted_name) - 1]
94 else:
95 return quoted_name
96
97 def list_tables_query(self, schema=None):
98 if schema:
99 schema_clause = "AND n.nspname = '{}'".format(schema)
100 else:
101 schema_clause = "AND pg_catalog.pg_table_is_visible(c.oid)"
102 return """SELECT n.nspname as "Schema",
103 c.relname AS "Name"
104 FROM pg_catalog.pg_class c
105 LEFT JOIN pg_catalog.pg_namespace n ON n.oid = c.relnamespace
106 WHERE c.relkind IN ('r','v','m','S','f','')
107 AND n.nspname <> 'pg_catalog'
108 AND n.nspname <> 'information_schema'
109 AND n.nspname !~ '^pg_toast'
110 {}
111 AND NOT c.relname LIKE '\\_\\_%'
112 ORDER BY 1;""".format(schema_clause)
113
114 def create_variable(self, field_name, field_metadata,
115 type_hints, inspect_table=None):
116 if field_name in type_hints:
117 var = type_hints[field_name]
118 else:
119 var = self._guess_variable(field_name, field_metadata,
120 inspect_table)
121
122 field_name_q = self.quote_identifier(field_name)
123 if var.is_continuous:
124 if isinstance(var, TimeVariable):
125 var.to_sql = ToSql("extract(epoch from {})"
126 .format(field_name_q))
127 else:
128 var.to_sql = ToSql("({})::double precision"
129 .format(field_name_q))
130 else: # discrete or string
131 var.to_sql = ToSql("({})::text"
132 .format(field_name_q))
133 return var
134
135 def _guess_variable(self, field_name, field_metadata, inspect_table):
136 type_code = field_metadata[0]
137
138 FLOATISH_TYPES = (700, 701, 1700) # real, float8, numeric
139 INT_TYPES = (20, 21, 23) # bigint, int, smallint
140 CHAR_TYPES = (25, 1042, 1043,) # text, char, varchar
141 BOOLEAN_TYPES = (16,) # bool
142 DATE_TYPES = (1082, 1114, 1184, ) # date, timestamp, timestamptz
143 # time, timestamp, timestamptz, timetz
144 TIME_TYPES = (1083, 1114, 1184, 1266,)
145
146 if type_code in FLOATISH_TYPES:
147 return ContinuousVariable.make(field_name)
148
149 if type_code in TIME_TYPES + DATE_TYPES:
150 tv = TimeVariable.make(field_name)
151 tv.have_date |= type_code in DATE_TYPES
152 tv.have_time |= type_code in TIME_TYPES
153 return tv
154
155 if type_code in INT_TYPES: # bigint, int, smallint
156 if inspect_table:
157 values = self.get_distinct_values(field_name, inspect_table)
158 if values:
159 return DiscreteVariable.make(field_name, values)
160 return ContinuousVariable.make(field_name)
161
162 if type_code in BOOLEAN_TYPES:
163 return DiscreteVariable.make(field_name, ['false', 'true'])
164
165 if type_code in CHAR_TYPES:
166 if inspect_table:
167 values = self.get_distinct_values(field_name, inspect_table)
168 if values:
169 return DiscreteVariable.make(field_name, values)
170
171 return StringVariable.make(field_name)
172
173 def count_approx(self, query):
174 sql = "EXPLAIN " + query
175 with self.execute_sql_query(sql) as cur:
176 s = ''.join(row[0] for row in cur.fetchall())
177 return int(re.findall(r'rows=(\d*)', s)[0])
178
179 def __getstate__(self):
180 # Drop connection_pool from state as it cannot be pickled
181 state = dict(self.__dict__)
182 state.pop('connection_pool', None)
183 return state
184
185 def __setstate__(self, state):
186 # Create a new connection pool if none exists
187 self.__dict__.update(state)
188 if self.connection_pool is None:
189 self._create_connection_pool()
190
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/Orange/data/sql/backend/postgres.py b/Orange/data/sql/backend/postgres.py
--- a/Orange/data/sql/backend/postgres.py
+++ b/Orange/data/sql/backend/postgres.py
@@ -165,6 +165,8 @@
if type_code in CHAR_TYPES:
if inspect_table:
values = self.get_distinct_values(field_name, inspect_table)
+ # remove trailing spaces
+ values = [v.rstrip() for v in values]
if values:
return DiscreteVariable.make(field_name, values)
|
{"golden_diff": "diff --git a/Orange/data/sql/backend/postgres.py b/Orange/data/sql/backend/postgres.py\n--- a/Orange/data/sql/backend/postgres.py\n+++ b/Orange/data/sql/backend/postgres.py\n@@ -165,6 +165,8 @@\n if type_code in CHAR_TYPES:\n if inspect_table:\n values = self.get_distinct_values(field_name, inspect_table)\n+ # remove trailing spaces\n+ values = [v.rstrip() for v in values]\n if values:\n return DiscreteVariable.make(field_name, values)\n", "issue": "owsql: Wrong auto-discovered DiscreteVariable values\n##### Orange version\r\n3.14.0 dev\r\n\r\n##### Expected behavior\r\nSQL Table correctly discovers categorical variables.\r\n\r\n##### Actual behavior\r\n`DiscreteVariable` values for char fields have additional with space on the end.\r\n\r\n##### Steps to reproduce the behavior\r\nSQL for Postgres:\r\n\r\n``` sql\r\nCREATE TABLE COMPANY( \r\n ID INT PRIMARY KEY NOT NULL,\r\n ADDRESS CHAR(50)\r\n);\r\nINSERT INTO COMPANY (ID,ADDRESS) VALUES (1, 'California');\r\n```\r\nIn SQL Table select tabel company\r\n\r\n##### Additional info (worksheets, data, screenshots, ...)\r\nThis issue is because on data query `::text` is used that removes the additional white space.\n", "before_files": [{"content": "import logging\nimport re\nimport warnings\nfrom contextlib import contextmanager\nfrom time import time\n\nfrom psycopg2 import Error, OperationalError\nfrom psycopg2.pool import ThreadedConnectionPool\n\nfrom Orange.data import ContinuousVariable, DiscreteVariable, StringVariable, TimeVariable\nfrom Orange.data.sql.backend.base import Backend, ToSql, BackendError\n\nlog = logging.getLogger(__name__)\n\nEXTENSIONS = ('tsm_system_time', 'quantile')\n\n\nclass Psycopg2Backend(Backend):\n \"\"\"Backend for accessing data stored in a Postgres database\n \"\"\"\n\n display_name = \"PostgreSQL\"\n connection_pool = None\n auto_create_extensions = True\n\n def __init__(self, connection_params):\n super().__init__(connection_params)\n\n if self.connection_pool is None:\n self._create_connection_pool()\n\n if self.auto_create_extensions:\n self._create_extensions()\n\n def _create_connection_pool(self):\n try:\n self.connection_pool = ThreadedConnectionPool(\n 1, 16, **self.connection_params)\n except Error as ex:\n raise BackendError(str(ex)) from ex\n\n def _create_extensions(self):\n for ext in EXTENSIONS:\n try:\n query = \"CREATE EXTENSION IF NOT EXISTS {}\".format(ext)\n with self.execute_sql_query(query):\n pass\n except OperationalError:\n warnings.warn(\"Database is missing extension {}\".format(ext))\n\n def create_sql_query(self, table_name, fields, filters=(),\n group_by=None, order_by=None,\n offset=None, limit=None,\n use_time_sample=None):\n sql = [\"SELECT\", ', '.join(fields),\n \"FROM\", table_name]\n if use_time_sample is not None:\n sql.append(\"TABLESAMPLE system_time(%i)\" % use_time_sample)\n if filters:\n sql.extend([\"WHERE\", \" AND \".join(filters)])\n if group_by is not None:\n sql.extend([\"GROUP BY\", \", \".join(group_by)])\n if order_by is not None:\n sql.extend([\"ORDER BY\", \",\".join(order_by)])\n if offset is not None:\n sql.extend([\"OFFSET\", str(offset)])\n if limit is not None:\n sql.extend([\"LIMIT\", str(limit)])\n return \" \".join(sql)\n\n @contextmanager\n def execute_sql_query(self, query, params=None):\n connection = self.connection_pool.getconn()\n cur = connection.cursor()\n try:\n utfquery = cur.mogrify(query, params).decode('utf-8')\n log.debug(\"Executing: %s\", utfquery)\n t = time()\n cur.execute(query, params)\n yield cur\n log.info(\"%.2f ms: %s\", 1000 * (time() - t), utfquery)\n except Error as ex:\n raise BackendError(str(ex)) from ex\n finally:\n connection.commit()\n self.connection_pool.putconn(connection)\n\n def quote_identifier(self, name):\n return '\"%s\"' % name\n\n def unquote_identifier(self, quoted_name):\n if quoted_name.startswith('\"'):\n return quoted_name[1:len(quoted_name) - 1]\n else:\n return quoted_name\n\n def list_tables_query(self, schema=None):\n if schema:\n schema_clause = \"AND n.nspname = '{}'\".format(schema)\n else:\n schema_clause = \"AND pg_catalog.pg_table_is_visible(c.oid)\"\n return \"\"\"SELECT n.nspname as \"Schema\",\n c.relname AS \"Name\"\n FROM pg_catalog.pg_class c\n LEFT JOIN pg_catalog.pg_namespace n ON n.oid = c.relnamespace\n WHERE c.relkind IN ('r','v','m','S','f','')\n AND n.nspname <> 'pg_catalog'\n AND n.nspname <> 'information_schema'\n AND n.nspname !~ '^pg_toast'\n {}\n AND NOT c.relname LIKE '\\\\_\\\\_%'\n ORDER BY 1;\"\"\".format(schema_clause)\n\n def create_variable(self, field_name, field_metadata,\n type_hints, inspect_table=None):\n if field_name in type_hints:\n var = type_hints[field_name]\n else:\n var = self._guess_variable(field_name, field_metadata,\n inspect_table)\n\n field_name_q = self.quote_identifier(field_name)\n if var.is_continuous:\n if isinstance(var, TimeVariable):\n var.to_sql = ToSql(\"extract(epoch from {})\"\n .format(field_name_q))\n else:\n var.to_sql = ToSql(\"({})::double precision\"\n .format(field_name_q))\n else: # discrete or string\n var.to_sql = ToSql(\"({})::text\"\n .format(field_name_q))\n return var\n\n def _guess_variable(self, field_name, field_metadata, inspect_table):\n type_code = field_metadata[0]\n\n FLOATISH_TYPES = (700, 701, 1700) # real, float8, numeric\n INT_TYPES = (20, 21, 23) # bigint, int, smallint\n CHAR_TYPES = (25, 1042, 1043,) # text, char, varchar\n BOOLEAN_TYPES = (16,) # bool\n DATE_TYPES = (1082, 1114, 1184, ) # date, timestamp, timestamptz\n # time, timestamp, timestamptz, timetz\n TIME_TYPES = (1083, 1114, 1184, 1266,)\n\n if type_code in FLOATISH_TYPES:\n return ContinuousVariable.make(field_name)\n\n if type_code in TIME_TYPES + DATE_TYPES:\n tv = TimeVariable.make(field_name)\n tv.have_date |= type_code in DATE_TYPES\n tv.have_time |= type_code in TIME_TYPES\n return tv\n\n if type_code in INT_TYPES: # bigint, int, smallint\n if inspect_table:\n values = self.get_distinct_values(field_name, inspect_table)\n if values:\n return DiscreteVariable.make(field_name, values)\n return ContinuousVariable.make(field_name)\n\n if type_code in BOOLEAN_TYPES:\n return DiscreteVariable.make(field_name, ['false', 'true'])\n\n if type_code in CHAR_TYPES:\n if inspect_table:\n values = self.get_distinct_values(field_name, inspect_table)\n if values:\n return DiscreteVariable.make(field_name, values)\n\n return StringVariable.make(field_name)\n\n def count_approx(self, query):\n sql = \"EXPLAIN \" + query\n with self.execute_sql_query(sql) as cur:\n s = ''.join(row[0] for row in cur.fetchall())\n return int(re.findall(r'rows=(\\d*)', s)[0])\n\n def __getstate__(self):\n # Drop connection_pool from state as it cannot be pickled\n state = dict(self.__dict__)\n state.pop('connection_pool', None)\n return state\n\n def __setstate__(self, state):\n # Create a new connection pool if none exists\n self.__dict__.update(state)\n if self.connection_pool is None:\n self._create_connection_pool()\n", "path": "Orange/data/sql/backend/postgres.py"}], "after_files": [{"content": "import logging\nimport re\nimport warnings\nfrom contextlib import contextmanager\nfrom time import time\n\nfrom psycopg2 import Error, OperationalError\nfrom psycopg2.pool import ThreadedConnectionPool\n\nfrom Orange.data import ContinuousVariable, DiscreteVariable, StringVariable, TimeVariable\nfrom Orange.data.sql.backend.base import Backend, ToSql, BackendError\n\nlog = logging.getLogger(__name__)\n\nEXTENSIONS = ('tsm_system_time', 'quantile')\n\n\nclass Psycopg2Backend(Backend):\n \"\"\"Backend for accessing data stored in a Postgres database\n \"\"\"\n\n display_name = \"PostgreSQL\"\n connection_pool = None\n auto_create_extensions = True\n\n def __init__(self, connection_params):\n super().__init__(connection_params)\n\n if self.connection_pool is None:\n self._create_connection_pool()\n\n if self.auto_create_extensions:\n self._create_extensions()\n\n def _create_connection_pool(self):\n try:\n self.connection_pool = ThreadedConnectionPool(\n 1, 16, **self.connection_params)\n except Error as ex:\n raise BackendError(str(ex)) from ex\n\n def _create_extensions(self):\n for ext in EXTENSIONS:\n try:\n query = \"CREATE EXTENSION IF NOT EXISTS {}\".format(ext)\n with self.execute_sql_query(query):\n pass\n except OperationalError:\n warnings.warn(\"Database is missing extension {}\".format(ext))\n\n def create_sql_query(self, table_name, fields, filters=(),\n group_by=None, order_by=None,\n offset=None, limit=None,\n use_time_sample=None):\n sql = [\"SELECT\", ', '.join(fields),\n \"FROM\", table_name]\n if use_time_sample is not None:\n sql.append(\"TABLESAMPLE system_time(%i)\" % use_time_sample)\n if filters:\n sql.extend([\"WHERE\", \" AND \".join(filters)])\n if group_by is not None:\n sql.extend([\"GROUP BY\", \", \".join(group_by)])\n if order_by is not None:\n sql.extend([\"ORDER BY\", \",\".join(order_by)])\n if offset is not None:\n sql.extend([\"OFFSET\", str(offset)])\n if limit is not None:\n sql.extend([\"LIMIT\", str(limit)])\n return \" \".join(sql)\n\n @contextmanager\n def execute_sql_query(self, query, params=None):\n connection = self.connection_pool.getconn()\n cur = connection.cursor()\n try:\n utfquery = cur.mogrify(query, params).decode('utf-8')\n log.debug(\"Executing: %s\", utfquery)\n t = time()\n cur.execute(query, params)\n yield cur\n log.info(\"%.2f ms: %s\", 1000 * (time() - t), utfquery)\n except Error as ex:\n raise BackendError(str(ex)) from ex\n finally:\n connection.commit()\n self.connection_pool.putconn(connection)\n\n def quote_identifier(self, name):\n return '\"%s\"' % name\n\n def unquote_identifier(self, quoted_name):\n if quoted_name.startswith('\"'):\n return quoted_name[1:len(quoted_name) - 1]\n else:\n return quoted_name\n\n def list_tables_query(self, schema=None):\n if schema:\n schema_clause = \"AND n.nspname = '{}'\".format(schema)\n else:\n schema_clause = \"AND pg_catalog.pg_table_is_visible(c.oid)\"\n return \"\"\"SELECT n.nspname as \"Schema\",\n c.relname AS \"Name\"\n FROM pg_catalog.pg_class c\n LEFT JOIN pg_catalog.pg_namespace n ON n.oid = c.relnamespace\n WHERE c.relkind IN ('r','v','m','S','f','')\n AND n.nspname <> 'pg_catalog'\n AND n.nspname <> 'information_schema'\n AND n.nspname !~ '^pg_toast'\n {}\n AND NOT c.relname LIKE '\\\\_\\\\_%'\n ORDER BY 1;\"\"\".format(schema_clause)\n\n def create_variable(self, field_name, field_metadata,\n type_hints, inspect_table=None):\n if field_name in type_hints:\n var = type_hints[field_name]\n else:\n var = self._guess_variable(field_name, field_metadata,\n inspect_table)\n\n field_name_q = self.quote_identifier(field_name)\n if var.is_continuous:\n if isinstance(var, TimeVariable):\n var.to_sql = ToSql(\"extract(epoch from {})\"\n .format(field_name_q))\n else:\n var.to_sql = ToSql(\"({})::double precision\"\n .format(field_name_q))\n else: # discrete or string\n var.to_sql = ToSql(\"({})::text\"\n .format(field_name_q))\n return var\n\n def _guess_variable(self, field_name, field_metadata, inspect_table):\n type_code = field_metadata[0]\n\n FLOATISH_TYPES = (700, 701, 1700) # real, float8, numeric\n INT_TYPES = (20, 21, 23) # bigint, int, smallint\n CHAR_TYPES = (25, 1042, 1043,) # text, char, varchar\n BOOLEAN_TYPES = (16,) # bool\n DATE_TYPES = (1082, 1114, 1184, ) # date, timestamp, timestamptz\n # time, timestamp, timestamptz, timetz\n TIME_TYPES = (1083, 1114, 1184, 1266,)\n\n if type_code in FLOATISH_TYPES:\n return ContinuousVariable.make(field_name)\n\n if type_code in TIME_TYPES + DATE_TYPES:\n tv = TimeVariable.make(field_name)\n tv.have_date |= type_code in DATE_TYPES\n tv.have_time |= type_code in TIME_TYPES\n return tv\n\n if type_code in INT_TYPES: # bigint, int, smallint\n if inspect_table:\n values = self.get_distinct_values(field_name, inspect_table)\n if values:\n return DiscreteVariable.make(field_name, values)\n return ContinuousVariable.make(field_name)\n\n if type_code in BOOLEAN_TYPES:\n return DiscreteVariable.make(field_name, ['false', 'true'])\n\n if type_code in CHAR_TYPES:\n if inspect_table:\n values = self.get_distinct_values(field_name, inspect_table)\n # remove trailing spaces\n values = [v.rstrip() for v in values]\n if values:\n return DiscreteVariable.make(field_name, values)\n\n return StringVariable.make(field_name)\n\n def count_approx(self, query):\n sql = \"EXPLAIN \" + query\n with self.execute_sql_query(sql) as cur:\n s = ''.join(row[0] for row in cur.fetchall())\n return int(re.findall(r'rows=(\\d*)', s)[0])\n\n def __getstate__(self):\n # Drop connection_pool from state as it cannot be pickled\n state = dict(self.__dict__)\n state.pop('connection_pool', None)\n return state\n\n def __setstate__(self, state):\n # Create a new connection pool if none exists\n self.__dict__.update(state)\n if self.connection_pool is None:\n self._create_connection_pool()\n", "path": "Orange/data/sql/backend/postgres.py"}]}
| 2,443 | 120 |
gh_patches_debug_54265
|
rasdani/github-patches
|
git_diff
|
hylang__hy-411
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
problem with comment parsing
I was translating some code to Hy from a textbook on Python programming (http://inventwithpython.com/pygame/index.html) and ran into a problem with this direct translation.
```
(import pygame sys)
(import [pygame.locals [*]])
(pygame.init)
(setv *displaysurf* (pygame.display.set_mode (, 400 300)))
(pygame.display.set_caption "Hello World!")
(while True ; main game loop
(do (foreach [event (pygame.event.get)]
(if (= event.type QUIT)
(do (pygame.quit)
(sys.exit))))
(pygame.display.update)))
```
I get a parse error if the end-of-line comment ("main game loop") appears where it does. It works if I remove it.
The following interaction with the prompt also surprised me.
```
=> ; some comment
hy.lex.exceptions.LexException: Could not identify the next token at line -1, column -1
```
Fixing this isn't critical, but it should probably be fixed. I do occasionally type something, realize I need to do something else first, comment it, press Enter, type whatever setup I needed, press Enter, then press Up twice, uncomment the line, and Enter to run it.
problem with comment parsing
I was translating some code to Hy from a textbook on Python programming (http://inventwithpython.com/pygame/index.html) and ran into a problem with this direct translation.
```
(import pygame sys)
(import [pygame.locals [*]])
(pygame.init)
(setv *displaysurf* (pygame.display.set_mode (, 400 300)))
(pygame.display.set_caption "Hello World!")
(while True ; main game loop
(do (foreach [event (pygame.event.get)]
(if (= event.type QUIT)
(do (pygame.quit)
(sys.exit))))
(pygame.display.update)))
```
I get a parse error if the end-of-line comment ("main game loop") appears where it does. It works if I remove it.
The following interaction with the prompt also surprised me.
```
=> ; some comment
hy.lex.exceptions.LexException: Could not identify the next token at line -1, column -1
```
Fixing this isn't critical, but it should probably be fixed. I do occasionally type something, realize I need to do something else first, comment it, press Enter, type whatever setup I needed, press Enter, then press Up twice, uncomment the line, and Enter to run it.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `hy/lex/lexer.py`
Content:
```
1 # Copyright (c) 2013 Nicolas Dandrimont <nicolas.dandrimont@crans.org>
2 #
3 # Permission is hereby granted, free of charge, to any person obtaining a
4 # copy of this software and associated documentation files (the "Software"),
5 # to deal in the Software without restriction, including without limitation
6 # the rights to use, copy, modify, merge, publish, distribute, sublicense,
7 # and/or sell copies of the Software, and to permit persons to whom the
8 # Software is furnished to do so, subject to the following conditions:
9 #
10 # The above copyright notice and this permission notice shall be included in
11 # all copies or substantial portions of the Software.
12 #
13 # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
14 # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
15 # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL
16 # THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
17 # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
18 # FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
19 # DEALINGS IN THE SOFTWARE.
20
21 from rply import LexerGenerator
22
23
24 lg = LexerGenerator()
25
26
27 # A regexp for something that should end a quoting/unquoting operator
28 # i.e. a space or a closing brace/paren/curly
29 end_quote = r'(?![\s\)\]\}])'
30
31
32 lg.add('LPAREN', r'\(')
33 lg.add('RPAREN', r'\)')
34 lg.add('LBRACKET', r'\[')
35 lg.add('RBRACKET', r'\]')
36 lg.add('LCURLY', r'\{')
37 lg.add('RCURLY', r'\}')
38 lg.add('QUOTE', r'\'%s' % end_quote)
39 lg.add('QUASIQUOTE', r'`%s' % end_quote)
40 lg.add('UNQUOTESPLICE', r'~@%s' % end_quote)
41 lg.add('UNQUOTE', r'~%s' % end_quote)
42 lg.add('HASHBANG', r'#!.*[^\r\n]')
43 lg.add('HASHREADER', r'#.')
44
45
46 lg.add('STRING', r'''(?x)
47 (?:u|r|ur|ru)? # prefix
48 " # start string
49 (?:
50 | [^"\\] # non-quote or backslash
51 | \\. # or escaped single character
52 | \\x[0-9a-fA-F]{2} # or escaped raw character
53 | \\u[0-9a-fA-F]{4} # or unicode escape
54 | \\U[0-9a-fA-F]{8} # or long unicode escape
55 )* # one or more times
56 " # end string
57 ''')
58
59
60 lg.add('IDENTIFIER', r'[^()\[\]{}\'"\s;]+')
61
62
63 lg.ignore(r';.*[\r\n]+')
64 lg.ignore(r'\s+')
65
66
67 lexer = lg.build()
68
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/hy/lex/lexer.py b/hy/lex/lexer.py
--- a/hy/lex/lexer.py
+++ b/hy/lex/lexer.py
@@ -60,7 +60,7 @@
lg.add('IDENTIFIER', r'[^()\[\]{}\'"\s;]+')
-lg.ignore(r';.*[\r\n]+')
+lg.ignore(r';.*(?=\r|\n|$)')
lg.ignore(r'\s+')
|
{"golden_diff": "diff --git a/hy/lex/lexer.py b/hy/lex/lexer.py\n--- a/hy/lex/lexer.py\n+++ b/hy/lex/lexer.py\n@@ -60,7 +60,7 @@\n lg.add('IDENTIFIER', r'[^()\\[\\]{}\\'\"\\s;]+')\n \n \n-lg.ignore(r';.*[\\r\\n]+')\n+lg.ignore(r';.*(?=\\r|\\n|$)')\n lg.ignore(r'\\s+')\n", "issue": "problem with comment parsing\nI was translating some code to Hy from a textbook on Python programming (http://inventwithpython.com/pygame/index.html) and ran into a problem with this direct translation.\n\n```\n(import pygame sys)\n(import [pygame.locals [*]])\n\n(pygame.init)\n(setv *displaysurf* (pygame.display.set_mode (, 400 300)))\n(pygame.display.set_caption \"Hello World!\")\n(while True ; main game loop\n (do (foreach [event (pygame.event.get)]\n (if (= event.type QUIT)\n (do (pygame.quit)\n (sys.exit))))\n (pygame.display.update)))\n```\n\nI get a parse error if the end-of-line comment (\"main game loop\") appears where it does. It works if I remove it.\n\nThe following interaction with the prompt also surprised me.\n\n```\n=> ; some comment\nhy.lex.exceptions.LexException: Could not identify the next token at line -1, column -1\n```\n\nFixing this isn't critical, but it should probably be fixed. I do occasionally type something, realize I need to do something else first, comment it, press Enter, type whatever setup I needed, press Enter, then press Up twice, uncomment the line, and Enter to run it.\n\nproblem with comment parsing\nI was translating some code to Hy from a textbook on Python programming (http://inventwithpython.com/pygame/index.html) and ran into a problem with this direct translation.\n\n```\n(import pygame sys)\n(import [pygame.locals [*]])\n\n(pygame.init)\n(setv *displaysurf* (pygame.display.set_mode (, 400 300)))\n(pygame.display.set_caption \"Hello World!\")\n(while True ; main game loop\n (do (foreach [event (pygame.event.get)]\n (if (= event.type QUIT)\n (do (pygame.quit)\n (sys.exit))))\n (pygame.display.update)))\n```\n\nI get a parse error if the end-of-line comment (\"main game loop\") appears where it does. It works if I remove it.\n\nThe following interaction with the prompt also surprised me.\n\n```\n=> ; some comment\nhy.lex.exceptions.LexException: Could not identify the next token at line -1, column -1\n```\n\nFixing this isn't critical, but it should probably be fixed. I do occasionally type something, realize I need to do something else first, comment it, press Enter, type whatever setup I needed, press Enter, then press Up twice, uncomment the line, and Enter to run it.\n\n", "before_files": [{"content": "# Copyright (c) 2013 Nicolas Dandrimont <nicolas.dandrimont@crans.org>\n#\n# Permission is hereby granted, free of charge, to any person obtaining a\n# copy of this software and associated documentation files (the \"Software\"),\n# to deal in the Software without restriction, including without limitation\n# the rights to use, copy, modify, merge, publish, distribute, sublicense,\n# and/or sell copies of the Software, and to permit persons to whom the\n# Software is furnished to do so, subject to the following conditions:\n#\n# The above copyright notice and this permission notice shall be included in\n# all copies or substantial portions of the Software.\n#\n# THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\n# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\n# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL\n# THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\n# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING\n# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER\n# DEALINGS IN THE SOFTWARE.\n\nfrom rply import LexerGenerator\n\n\nlg = LexerGenerator()\n\n\n# A regexp for something that should end a quoting/unquoting operator\n# i.e. a space or a closing brace/paren/curly\nend_quote = r'(?![\\s\\)\\]\\}])'\n\n\nlg.add('LPAREN', r'\\(')\nlg.add('RPAREN', r'\\)')\nlg.add('LBRACKET', r'\\[')\nlg.add('RBRACKET', r'\\]')\nlg.add('LCURLY', r'\\{')\nlg.add('RCURLY', r'\\}')\nlg.add('QUOTE', r'\\'%s' % end_quote)\nlg.add('QUASIQUOTE', r'`%s' % end_quote)\nlg.add('UNQUOTESPLICE', r'~@%s' % end_quote)\nlg.add('UNQUOTE', r'~%s' % end_quote)\nlg.add('HASHBANG', r'#!.*[^\\r\\n]')\nlg.add('HASHREADER', r'#.')\n\n\nlg.add('STRING', r'''(?x)\n (?:u|r|ur|ru)? # prefix\n \" # start string\n (?:\n | [^\"\\\\] # non-quote or backslash\n | \\\\. # or escaped single character\n | \\\\x[0-9a-fA-F]{2} # or escaped raw character\n | \\\\u[0-9a-fA-F]{4} # or unicode escape\n | \\\\U[0-9a-fA-F]{8} # or long unicode escape\n )* # one or more times\n \" # end string\n''')\n\n\nlg.add('IDENTIFIER', r'[^()\\[\\]{}\\'\"\\s;]+')\n\n\nlg.ignore(r';.*[\\r\\n]+')\nlg.ignore(r'\\s+')\n\n\nlexer = lg.build()\n", "path": "hy/lex/lexer.py"}], "after_files": [{"content": "# Copyright (c) 2013 Nicolas Dandrimont <nicolas.dandrimont@crans.org>\n#\n# Permission is hereby granted, free of charge, to any person obtaining a\n# copy of this software and associated documentation files (the \"Software\"),\n# to deal in the Software without restriction, including without limitation\n# the rights to use, copy, modify, merge, publish, distribute, sublicense,\n# and/or sell copies of the Software, and to permit persons to whom the\n# Software is furnished to do so, subject to the following conditions:\n#\n# The above copyright notice and this permission notice shall be included in\n# all copies or substantial portions of the Software.\n#\n# THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\n# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\n# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL\n# THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\n# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING\n# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER\n# DEALINGS IN THE SOFTWARE.\n\nfrom rply import LexerGenerator\n\n\nlg = LexerGenerator()\n\n\n# A regexp for something that should end a quoting/unquoting operator\n# i.e. a space or a closing brace/paren/curly\nend_quote = r'(?![\\s\\)\\]\\}])'\n\n\nlg.add('LPAREN', r'\\(')\nlg.add('RPAREN', r'\\)')\nlg.add('LBRACKET', r'\\[')\nlg.add('RBRACKET', r'\\]')\nlg.add('LCURLY', r'\\{')\nlg.add('RCURLY', r'\\}')\nlg.add('QUOTE', r'\\'%s' % end_quote)\nlg.add('QUASIQUOTE', r'`%s' % end_quote)\nlg.add('UNQUOTESPLICE', r'~@%s' % end_quote)\nlg.add('UNQUOTE', r'~%s' % end_quote)\nlg.add('HASHBANG', r'#!.*[^\\r\\n]')\nlg.add('HASHREADER', r'#.')\n\n\nlg.add('STRING', r'''(?x)\n (?:u|r|ur|ru)? # prefix\n \" # start string\n (?:\n | [^\"\\\\] # non-quote or backslash\n | \\\\. # or escaped single character\n | \\\\x[0-9a-fA-F]{2} # or escaped raw character\n | \\\\u[0-9a-fA-F]{4} # or unicode escape\n | \\\\U[0-9a-fA-F]{8} # or long unicode escape\n )* # one or more times\n \" # end string\n''')\n\n\nlg.add('IDENTIFIER', r'[^()\\[\\]{}\\'\"\\s;]+')\n\n\nlg.ignore(r';.*(?=\\r|\\n|$)')\nlg.ignore(r'\\s+')\n\n\nlexer = lg.build()\n", "path": "hy/lex/lexer.py"}]}
| 1,580 | 102 |
gh_patches_debug_20248
|
rasdani/github-patches
|
git_diff
|
liqd__a4-opin-280
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Styling in commenting text
Currently an the `<li>`s in an `<ul>` do not glow/highlight as expected when hovered. An the comment symbol is not part of the link / can't be clicked.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `euth/documents/models.py`
Content:
```
1 from ckeditor.fields import RichTextField
2 from django.contrib.contenttypes.models import ContentType
3 from django.core.exceptions import ObjectDoesNotExist, ValidationError
4 from django.db import models
5 from django.utils.functional import cached_property
6 from django.utils.translation import ugettext_lazy as _
7
8 from contrib.transforms import html_transforms
9 from euth.comments import models as comment_models
10 from euth.contrib import base_models
11 from euth.modules import models as module_models
12
13
14 class Document(module_models.Item):
15 name = models.CharField(max_length=120)
16
17 def __str__(self):
18 return "{}_document_{}".format(str(self.module), self.pk)
19
20 def clean(self, *args, **kwargs):
21 try:
22 Document.objects.get(module=self.module)
23 raise ValidationError(_('Document for that module already exists'))
24 except ObjectDoesNotExist:
25 super().clean(*args, **kwargs)
26
27 @cached_property
28 def paragraphs_sorted(self):
29 return self.paragraph_set.all().order_by('weight')
30
31
32 class Paragraph(base_models.TimeStampedModel):
33 name = models.CharField(max_length=120, blank=True)
34 text = RichTextField(config_name='document-editor')
35 weight = models.PositiveIntegerField()
36 document = models.ForeignKey(Document, on_delete=models.CASCADE)
37
38 def __str__(self):
39 return "{}_paragraph_{}".format(str(self.document), self.weight)
40
41 def save(self, *args, **kwargs):
42 self.text = html_transforms.clean_html_field(
43 self.text)
44 super().save(*args, **kwargs)
45
46 @cached_property
47 def comments(self):
48 contenttype = ContentType.objects.get_for_model(self)
49 pk = self.id
50 comments = comment_models.Comment.objects.all().filter(
51 content_type=contenttype, object_pk=pk)
52 return comments
53
```
Path: `euth_wagtail/settings/base.py`
Content:
```
1 """
2 Django settings for euth_wagtail project.
3
4 Generated by 'django-admin startproject' using Django 1.9.1.
5
6 For more information on this file, see
7 https://docs.djangoproject.com/en/1.9/topics/settings/
8
9 For the full list of settings and their values, see
10 https://docs.djangoproject.com/en/1.9/ref/settings/
11 """
12
13 # Build paths inside the project like this: os.path.join(BASE_DIR, ...)
14 import os
15
16 from django.utils.translation import ugettext_lazy as _
17
18 PROJECT_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
19 BASE_DIR = os.path.dirname(PROJECT_DIR)
20
21
22 # Quick-start development settings - unsuitable for production
23 # See https://docs.djangoproject.com/en/1.9/howto/deployment/checklist/
24
25
26 # Application definition
27
28 INSTALLED_APPS = [
29 'home',
30 'search',
31
32 'wagtail.wagtailforms',
33 'wagtail.wagtailredirects',
34 'wagtail.wagtailembeds',
35 'wagtail.wagtailsites',
36 'wagtail.wagtailusers',
37 'wagtail.wagtailsnippets',
38 'wagtail.wagtaildocs',
39 'wagtail.wagtailimages',
40 'wagtail.wagtailsearch',
41 'wagtail.wagtailadmin',
42 'wagtail.wagtailcore',
43 'wagtail.contrib.wagtailstyleguide',
44
45 'modelcluster',
46 'compressor',
47 'taggit',
48 'projects',
49 'widget_tweaks',
50 'webpack_loader',
51 'easy_thumbnails',
52 'parler',
53 'ckeditor',
54
55 'django.contrib.sites',
56 'django.contrib.admin',
57 'django.contrib.auth',
58 'django.contrib.contenttypes',
59 'django.contrib.sessions',
60 'django.contrib.messages',
61 'django.contrib.staticfiles',
62 'django_countries',
63 'rest_framework',
64 'autofixture',
65 'rules.apps.AutodiscoverRulesConfig',
66
67 'euth.users.apps.UsersConfig',
68 'euth.organisations.apps.OrganisationsConfig',
69 'euth.projects.apps.ProjectsConfig',
70 'euth.comments.apps.CommentConfig',
71 'euth.phases.apps.PhasesConfig',
72 'euth.modules.apps.ModuleConfig',
73 'euth.ideas.apps.IdeaConfig',
74 'euth.ratings.apps.RatingsConfig',
75 'euth.reports.apps.ReportConfig',
76 'euth.dashboard.apps.DashboardConfig',
77 'euth.memberships.apps.MembershipsConfig',
78 'euth.documents.apps.DocumentConfig'
79 ]
80
81 MIDDLEWARE_CLASSES = [
82 'django.contrib.sessions.middleware.SessionMiddleware',
83 'django.middleware.common.CommonMiddleware',
84 'django.middleware.csrf.CsrfViewMiddleware',
85 'django.contrib.auth.middleware.AuthenticationMiddleware',
86 'django.contrib.auth.middleware.SessionAuthenticationMiddleware',
87 'django.contrib.messages.middleware.MessageMiddleware',
88 'django.middleware.clickjacking.XFrameOptionsMiddleware',
89 'django.middleware.security.SecurityMiddleware',
90 'django.middleware.locale.LocaleMiddleware',
91 'wagtail.wagtailcore.middleware.SiteMiddleware',
92 'wagtail.wagtailredirects.middleware.RedirectMiddleware',
93 ]
94
95 SITE_ID = 1
96
97 ROOT_URLCONF = 'euth_wagtail.urls'
98
99 LOCALE_PATHS = [os.path.join(BASE_DIR, 'locale')]
100
101 TEMPLATES = [
102 {
103 'BACKEND': 'django.template.backends.django.DjangoTemplates',
104 'DIRS': [
105 os.path.join(PROJECT_DIR, 'templates'),
106 ],
107 'APP_DIRS': True,
108 'OPTIONS': {
109 'context_processors': [
110 'django.template.context_processors.debug',
111 'django.template.context_processors.request',
112 'django.contrib.auth.context_processors.auth',
113 'django.contrib.messages.context_processors.messages',
114 ],
115 },
116 },
117 ]
118
119 WSGI_APPLICATION = 'euth_wagtail.wsgi.application'
120
121
122 # Database
123 # https://docs.djangoproject.com/en/1.9/ref/settings/#databases
124
125 DATABASES = {
126 'default': {
127 'ENGINE': 'django.db.backends.sqlite3',
128 'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
129 }
130 }
131
132
133 # Auth
134 # https://docs.djangoproject.com/en/1.8/topics/auth/customizing/
135
136 AUTH_USER_MODEL = 'euth_users.User'
137
138 AUTHENTICATION_BACKENDS = (
139 'rules.permissions.ObjectPermissionBackend',
140 'django.contrib.auth.backends.ModelBackend',
141 )
142
143 CKEDITOR_CONFIGS = {
144 'default': {
145 'width': '100%',
146 'toolbar': 'Custom',
147 'toolbar_Custom': [
148 ['Bold', 'Italic', 'Underline'],
149 ['NumberedList', 'BulletedList'],
150 ['Link', 'Unlink']
151 ]
152 },
153 'document-editor': {
154 'width': '100%',
155 'toolbar': 'Custom',
156 'toolbar_Custom': [
157 ['Bold', 'Italic', 'Underline'],
158 ['NumberedList', 'BulletedList'],
159 ['Link', 'Unlink']
160 ],
161 'enterMode': 2
162 }
163 }
164
165 BLEACH_LIST = {
166 'default' : {
167 'tags': ['p','strong','em','u','ol','li','ul','a','br'],
168 'attributes': {
169 'a': ['href', 'rel']
170 }
171 }
172 }
173
174 # Internationalization
175 # https://docs.djangoproject.com/en/1.9/topics/i18n/
176
177 LANGUAGE_CODE = 'en'
178
179 TIME_ZONE = 'UTC'
180
181 USE_I18N = True
182
183 USE_L10N = True
184
185 USE_TZ = True
186
187 LANGUAGES = [
188 ('de', _('German')),
189 ('en', _('English')),
190 ('it', _('Italien')),
191 ('fr', _('French')),
192 ('sv', _('Swedish')),
193 ('sl', _('Slovene')),
194 ('da', _('Danish')),
195 ]
196
197 PARLER_LANGUAGES = {
198 1:[{'code': language_code } for language_code, language in LANGUAGES]
199 }
200
201 # fixtures
202
203 FIXTURE_DIRS = [ os.path.join(PROJECT_DIR, 'fixtures') ]
204
205 ALLOWED_UPLOAD_IMAGES = ('image/png', 'image/jpeg', 'image/gif')
206
207 THUMBNAIL_ALIASES = {
208 '': {
209 'organisation_thumbnail': {'size': (400, 400), 'crop': 'smart'},
210 'heroimage': {'size': (1500, 500), 'crop': 'smart'},
211 'project_thumbnail': {'size': (520, 330), 'crop': 'smart'},
212 'organisation_thumbnail': {'size': (740, 540), 'crop': 'smart'},
213 'avatar_small': {'size': (60, 60), 'crop': 'smart'},
214 'avatar_medium': {'size': (300, 300), 'crop': 'smart'},
215 }
216 }
217 # Static files (CSS, JavaScript, Images)
218 # https://docs.djangoproject.com/en/1.9/howto/static-files/
219
220 STATICFILES_FINDERS = [
221 'django.contrib.staticfiles.finders.FileSystemFinder',
222 'django.contrib.staticfiles.finders.AppDirectoriesFinder',
223 'compressor.finders.CompressorFinder',
224 ]
225
226
227 WEBPACK_LOADER = {
228 'DEFAULT': {
229 'CACHE': False,
230 'BUNDLE_DIR_NAME': 'bundles/', # must end with slash
231 'STATS_FILE': os.path.join(BASE_DIR, 'webpack-stats.json'),
232 'POLL_INTERVAL': 0.1,
233 'IGNORE': ['.+\.hot-update.js', '.+\.map']
234 }
235 }
236
237
238 STATICFILES_DIRS = [
239 os.path.join(BASE_DIR, 'node_modules/jquery/dist'),
240 os.path.join(BASE_DIR, 'node_modules/salvattore/dist'),
241 os.path.join(BASE_DIR, 'node_modules/bootstrap-sass/assets/javascripts'),
242 os.path.join(BASE_DIR, 'node_modules/bootstrap-sass/assets/stylesheets'),
243 os.path.join(BASE_DIR, 'node_modules/font-awesome'),
244 os.path.join(BASE_DIR, 'node_modules/owl.carousel/dist'),
245 os.path.join(PROJECT_DIR, 'static'),
246 ]
247
248 STATIC_ROOT = os.path.join(BASE_DIR, 'static')
249 STATIC_URL = '/static/'
250
251 MEDIA_ROOT = os.path.join(BASE_DIR, 'media')
252 MEDIA_URL = '/media/'
253
254 COMPRESS_PRECOMPILERS = (
255 ('text/x-scss', 'django_libsass.SassCompiler'),
256 )
257 LIBSASS_SOURCEMAPS = True
258
259 # Wagtail settings
260
261 WAGTAIL_SITE_NAME = "euth_wagtail"
262
263 # Authentification
264
265 LOGIN_URL = 'login'
266
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/euth/documents/models.py b/euth/documents/models.py
--- a/euth/documents/models.py
+++ b/euth/documents/models.py
@@ -31,7 +31,7 @@
class Paragraph(base_models.TimeStampedModel):
name = models.CharField(max_length=120, blank=True)
- text = RichTextField(config_name='document-editor')
+ text = RichTextField()
weight = models.PositiveIntegerField()
document = models.ForeignKey(Document, on_delete=models.CASCADE)
diff --git a/euth_wagtail/settings/base.py b/euth_wagtail/settings/base.py
--- a/euth_wagtail/settings/base.py
+++ b/euth_wagtail/settings/base.py
@@ -157,14 +157,13 @@
['Bold', 'Italic', 'Underline'],
['NumberedList', 'BulletedList'],
['Link', 'Unlink']
- ],
- 'enterMode': 2
+ ]
}
}
BLEACH_LIST = {
'default' : {
- 'tags': ['p','strong','em','u','ol','li','ul','a','br'],
+ 'tags': ['p','strong','em','u','ol','li','ul','a'],
'attributes': {
'a': ['href', 'rel']
}
|
{"golden_diff": "diff --git a/euth/documents/models.py b/euth/documents/models.py\n--- a/euth/documents/models.py\n+++ b/euth/documents/models.py\n@@ -31,7 +31,7 @@\n \n class Paragraph(base_models.TimeStampedModel):\n name = models.CharField(max_length=120, blank=True)\n- text = RichTextField(config_name='document-editor')\n+ text = RichTextField()\n weight = models.PositiveIntegerField()\n document = models.ForeignKey(Document, on_delete=models.CASCADE)\n \ndiff --git a/euth_wagtail/settings/base.py b/euth_wagtail/settings/base.py\n--- a/euth_wagtail/settings/base.py\n+++ b/euth_wagtail/settings/base.py\n@@ -157,14 +157,13 @@\n ['Bold', 'Italic', 'Underline'],\n ['NumberedList', 'BulletedList'],\n ['Link', 'Unlink']\n- ],\n- 'enterMode': 2\n+ ]\n }\n }\n \n BLEACH_LIST = {\n 'default' : {\n- 'tags': ['p','strong','em','u','ol','li','ul','a','br'],\n+ 'tags': ['p','strong','em','u','ol','li','ul','a'],\n 'attributes': {\n 'a': ['href', 'rel']\n }\n", "issue": "Styling in commenting text\nCurrently an the `<li>`s in an `<ul>` do not glow/highlight as expected when hovered. An the comment symbol is not part of the link / can't be clicked.\n\n", "before_files": [{"content": "from ckeditor.fields import RichTextField\nfrom django.contrib.contenttypes.models import ContentType\nfrom django.core.exceptions import ObjectDoesNotExist, ValidationError\nfrom django.db import models\nfrom django.utils.functional import cached_property\nfrom django.utils.translation import ugettext_lazy as _\n\nfrom contrib.transforms import html_transforms\nfrom euth.comments import models as comment_models\nfrom euth.contrib import base_models\nfrom euth.modules import models as module_models\n\n\nclass Document(module_models.Item):\n name = models.CharField(max_length=120)\n\n def __str__(self):\n return \"{}_document_{}\".format(str(self.module), self.pk)\n\n def clean(self, *args, **kwargs):\n try:\n Document.objects.get(module=self.module)\n raise ValidationError(_('Document for that module already exists'))\n except ObjectDoesNotExist:\n super().clean(*args, **kwargs)\n\n @cached_property\n def paragraphs_sorted(self):\n return self.paragraph_set.all().order_by('weight')\n\n\nclass Paragraph(base_models.TimeStampedModel):\n name = models.CharField(max_length=120, blank=True)\n text = RichTextField(config_name='document-editor')\n weight = models.PositiveIntegerField()\n document = models.ForeignKey(Document, on_delete=models.CASCADE)\n\n def __str__(self):\n return \"{}_paragraph_{}\".format(str(self.document), self.weight)\n\n def save(self, *args, **kwargs):\n self.text = html_transforms.clean_html_field(\n self.text)\n super().save(*args, **kwargs)\n\n @cached_property\n def comments(self):\n contenttype = ContentType.objects.get_for_model(self)\n pk = self.id\n comments = comment_models.Comment.objects.all().filter(\n content_type=contenttype, object_pk=pk)\n return comments\n", "path": "euth/documents/models.py"}, {"content": "\"\"\"\nDjango settings for euth_wagtail project.\n\nGenerated by 'django-admin startproject' using Django 1.9.1.\n\nFor more information on this file, see\nhttps://docs.djangoproject.com/en/1.9/topics/settings/\n\nFor the full list of settings and their values, see\nhttps://docs.djangoproject.com/en/1.9/ref/settings/\n\"\"\"\n\n# Build paths inside the project like this: os.path.join(BASE_DIR, ...)\nimport os\n\nfrom django.utils.translation import ugettext_lazy as _\n\nPROJECT_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))\nBASE_DIR = os.path.dirname(PROJECT_DIR)\n\n\n# Quick-start development settings - unsuitable for production\n# See https://docs.djangoproject.com/en/1.9/howto/deployment/checklist/\n\n\n# Application definition\n\nINSTALLED_APPS = [\n 'home',\n 'search',\n\n 'wagtail.wagtailforms',\n 'wagtail.wagtailredirects',\n 'wagtail.wagtailembeds',\n 'wagtail.wagtailsites',\n 'wagtail.wagtailusers',\n 'wagtail.wagtailsnippets',\n 'wagtail.wagtaildocs',\n 'wagtail.wagtailimages',\n 'wagtail.wagtailsearch',\n 'wagtail.wagtailadmin',\n 'wagtail.wagtailcore',\n 'wagtail.contrib.wagtailstyleguide',\n\n 'modelcluster',\n 'compressor',\n 'taggit',\n 'projects',\n 'widget_tweaks',\n 'webpack_loader',\n 'easy_thumbnails',\n 'parler',\n 'ckeditor',\n\n 'django.contrib.sites',\n 'django.contrib.admin',\n 'django.contrib.auth',\n 'django.contrib.contenttypes',\n 'django.contrib.sessions',\n 'django.contrib.messages',\n 'django.contrib.staticfiles',\n 'django_countries',\n 'rest_framework',\n 'autofixture',\n 'rules.apps.AutodiscoverRulesConfig',\n\n 'euth.users.apps.UsersConfig',\n 'euth.organisations.apps.OrganisationsConfig',\n 'euth.projects.apps.ProjectsConfig',\n 'euth.comments.apps.CommentConfig',\n 'euth.phases.apps.PhasesConfig',\n 'euth.modules.apps.ModuleConfig',\n 'euth.ideas.apps.IdeaConfig',\n 'euth.ratings.apps.RatingsConfig',\n 'euth.reports.apps.ReportConfig',\n 'euth.dashboard.apps.DashboardConfig',\n 'euth.memberships.apps.MembershipsConfig',\n 'euth.documents.apps.DocumentConfig'\n]\n\nMIDDLEWARE_CLASSES = [\n 'django.contrib.sessions.middleware.SessionMiddleware',\n 'django.middleware.common.CommonMiddleware',\n 'django.middleware.csrf.CsrfViewMiddleware',\n 'django.contrib.auth.middleware.AuthenticationMiddleware',\n 'django.contrib.auth.middleware.SessionAuthenticationMiddleware',\n 'django.contrib.messages.middleware.MessageMiddleware',\n 'django.middleware.clickjacking.XFrameOptionsMiddleware',\n 'django.middleware.security.SecurityMiddleware',\n 'django.middleware.locale.LocaleMiddleware',\n 'wagtail.wagtailcore.middleware.SiteMiddleware',\n 'wagtail.wagtailredirects.middleware.RedirectMiddleware',\n]\n\nSITE_ID = 1\n\nROOT_URLCONF = 'euth_wagtail.urls'\n\nLOCALE_PATHS = [os.path.join(BASE_DIR, 'locale')]\n\nTEMPLATES = [\n {\n 'BACKEND': 'django.template.backends.django.DjangoTemplates',\n 'DIRS': [\n os.path.join(PROJECT_DIR, 'templates'),\n ],\n 'APP_DIRS': True,\n 'OPTIONS': {\n 'context_processors': [\n 'django.template.context_processors.debug',\n 'django.template.context_processors.request',\n 'django.contrib.auth.context_processors.auth',\n 'django.contrib.messages.context_processors.messages',\n ],\n },\n },\n]\n\nWSGI_APPLICATION = 'euth_wagtail.wsgi.application'\n\n\n# Database\n# https://docs.djangoproject.com/en/1.9/ref/settings/#databases\n\nDATABASES = {\n 'default': {\n 'ENGINE': 'django.db.backends.sqlite3',\n 'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),\n }\n}\n\n\n# Auth\n# https://docs.djangoproject.com/en/1.8/topics/auth/customizing/\n\nAUTH_USER_MODEL = 'euth_users.User'\n\nAUTHENTICATION_BACKENDS = (\n 'rules.permissions.ObjectPermissionBackend',\n 'django.contrib.auth.backends.ModelBackend',\n)\n\nCKEDITOR_CONFIGS = {\n 'default': {\n 'width': '100%',\n 'toolbar': 'Custom',\n 'toolbar_Custom': [\n ['Bold', 'Italic', 'Underline'],\n ['NumberedList', 'BulletedList'],\n ['Link', 'Unlink']\n ]\n },\n 'document-editor': {\n 'width': '100%',\n 'toolbar': 'Custom',\n 'toolbar_Custom': [\n ['Bold', 'Italic', 'Underline'],\n ['NumberedList', 'BulletedList'],\n ['Link', 'Unlink']\n ],\n 'enterMode': 2\n }\n}\n\nBLEACH_LIST = {\n 'default' : {\n 'tags': ['p','strong','em','u','ol','li','ul','a','br'],\n 'attributes': {\n 'a': ['href', 'rel']\n }\n }\n}\n\n# Internationalization\n# https://docs.djangoproject.com/en/1.9/topics/i18n/\n\nLANGUAGE_CODE = 'en'\n\nTIME_ZONE = 'UTC'\n\nUSE_I18N = True\n\nUSE_L10N = True\n\nUSE_TZ = True\n\nLANGUAGES = [\n ('de', _('German')),\n ('en', _('English')),\n ('it', _('Italien')),\n ('fr', _('French')),\n ('sv', _('Swedish')),\n ('sl', _('Slovene')),\n ('da', _('Danish')),\n]\n\nPARLER_LANGUAGES = {\n 1:[{'code': language_code } for language_code, language in LANGUAGES]\n}\n\n# fixtures\n\nFIXTURE_DIRS = [ os.path.join(PROJECT_DIR, 'fixtures') ]\n\nALLOWED_UPLOAD_IMAGES = ('image/png', 'image/jpeg', 'image/gif')\n\nTHUMBNAIL_ALIASES = {\n '': {\n 'organisation_thumbnail': {'size': (400, 400), 'crop': 'smart'},\n 'heroimage': {'size': (1500, 500), 'crop': 'smart'},\n 'project_thumbnail': {'size': (520, 330), 'crop': 'smart'},\n 'organisation_thumbnail': {'size': (740, 540), 'crop': 'smart'},\n 'avatar_small': {'size': (60, 60), 'crop': 'smart'},\n 'avatar_medium': {'size': (300, 300), 'crop': 'smart'},\n }\n}\n# Static files (CSS, JavaScript, Images)\n# https://docs.djangoproject.com/en/1.9/howto/static-files/\n\nSTATICFILES_FINDERS = [\n 'django.contrib.staticfiles.finders.FileSystemFinder',\n 'django.contrib.staticfiles.finders.AppDirectoriesFinder',\n 'compressor.finders.CompressorFinder',\n]\n\n\nWEBPACK_LOADER = {\n 'DEFAULT': {\n 'CACHE': False,\n 'BUNDLE_DIR_NAME': 'bundles/', # must end with slash\n 'STATS_FILE': os.path.join(BASE_DIR, 'webpack-stats.json'),\n 'POLL_INTERVAL': 0.1,\n 'IGNORE': ['.+\\.hot-update.js', '.+\\.map']\n }\n}\n\n\nSTATICFILES_DIRS = [\n os.path.join(BASE_DIR, 'node_modules/jquery/dist'),\n os.path.join(BASE_DIR, 'node_modules/salvattore/dist'),\n os.path.join(BASE_DIR, 'node_modules/bootstrap-sass/assets/javascripts'),\n os.path.join(BASE_DIR, 'node_modules/bootstrap-sass/assets/stylesheets'),\n os.path.join(BASE_DIR, 'node_modules/font-awesome'),\n os.path.join(BASE_DIR, 'node_modules/owl.carousel/dist'),\n os.path.join(PROJECT_DIR, 'static'),\n]\n\nSTATIC_ROOT = os.path.join(BASE_DIR, 'static')\nSTATIC_URL = '/static/'\n\nMEDIA_ROOT = os.path.join(BASE_DIR, 'media')\nMEDIA_URL = '/media/'\n\nCOMPRESS_PRECOMPILERS = (\n ('text/x-scss', 'django_libsass.SassCompiler'),\n)\nLIBSASS_SOURCEMAPS = True\n\n# Wagtail settings\n\nWAGTAIL_SITE_NAME = \"euth_wagtail\"\n\n# Authentification\n\nLOGIN_URL = 'login'\n", "path": "euth_wagtail/settings/base.py"}], "after_files": [{"content": "from ckeditor.fields import RichTextField\nfrom django.contrib.contenttypes.models import ContentType\nfrom django.core.exceptions import ObjectDoesNotExist, ValidationError\nfrom django.db import models\nfrom django.utils.functional import cached_property\nfrom django.utils.translation import ugettext_lazy as _\n\nfrom contrib.transforms import html_transforms\nfrom euth.comments import models as comment_models\nfrom euth.contrib import base_models\nfrom euth.modules import models as module_models\n\n\nclass Document(module_models.Item):\n name = models.CharField(max_length=120)\n\n def __str__(self):\n return \"{}_document_{}\".format(str(self.module), self.pk)\n\n def clean(self, *args, **kwargs):\n try:\n Document.objects.get(module=self.module)\n raise ValidationError(_('Document for that module already exists'))\n except ObjectDoesNotExist:\n super().clean(*args, **kwargs)\n\n @cached_property\n def paragraphs_sorted(self):\n return self.paragraph_set.all().order_by('weight')\n\n\nclass Paragraph(base_models.TimeStampedModel):\n name = models.CharField(max_length=120, blank=True)\n text = RichTextField()\n weight = models.PositiveIntegerField()\n document = models.ForeignKey(Document, on_delete=models.CASCADE)\n\n def __str__(self):\n return \"{}_paragraph_{}\".format(str(self.document), self.weight)\n\n def save(self, *args, **kwargs):\n self.text = html_transforms.clean_html_field(\n self.text)\n super().save(*args, **kwargs)\n\n @cached_property\n def comments(self):\n contenttype = ContentType.objects.get_for_model(self)\n pk = self.id\n comments = comment_models.Comment.objects.all().filter(\n content_type=contenttype, object_pk=pk)\n return comments\n", "path": "euth/documents/models.py"}, {"content": "\"\"\"\nDjango settings for euth_wagtail project.\n\nGenerated by 'django-admin startproject' using Django 1.9.1.\n\nFor more information on this file, see\nhttps://docs.djangoproject.com/en/1.9/topics/settings/\n\nFor the full list of settings and their values, see\nhttps://docs.djangoproject.com/en/1.9/ref/settings/\n\"\"\"\n\n# Build paths inside the project like this: os.path.join(BASE_DIR, ...)\nimport os\n\nfrom django.utils.translation import ugettext_lazy as _\n\nPROJECT_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))\nBASE_DIR = os.path.dirname(PROJECT_DIR)\n\n\n# Quick-start development settings - unsuitable for production\n# See https://docs.djangoproject.com/en/1.9/howto/deployment/checklist/\n\n\n# Application definition\n\nINSTALLED_APPS = [\n 'home',\n 'search',\n\n 'wagtail.wagtailforms',\n 'wagtail.wagtailredirects',\n 'wagtail.wagtailembeds',\n 'wagtail.wagtailsites',\n 'wagtail.wagtailusers',\n 'wagtail.wagtailsnippets',\n 'wagtail.wagtaildocs',\n 'wagtail.wagtailimages',\n 'wagtail.wagtailsearch',\n 'wagtail.wagtailadmin',\n 'wagtail.wagtailcore',\n 'wagtail.contrib.wagtailstyleguide',\n\n 'modelcluster',\n 'compressor',\n 'taggit',\n 'projects',\n 'widget_tweaks',\n 'webpack_loader',\n 'easy_thumbnails',\n 'parler',\n 'ckeditor',\n\n 'django.contrib.sites',\n 'django.contrib.admin',\n 'django.contrib.auth',\n 'django.contrib.contenttypes',\n 'django.contrib.sessions',\n 'django.contrib.messages',\n 'django.contrib.staticfiles',\n 'django_countries',\n 'rest_framework',\n 'autofixture',\n 'rules.apps.AutodiscoverRulesConfig',\n\n 'euth.users.apps.UsersConfig',\n 'euth.organisations.apps.OrganisationsConfig',\n 'euth.projects.apps.ProjectsConfig',\n 'euth.comments.apps.CommentConfig',\n 'euth.phases.apps.PhasesConfig',\n 'euth.modules.apps.ModuleConfig',\n 'euth.ideas.apps.IdeaConfig',\n 'euth.ratings.apps.RatingsConfig',\n 'euth.reports.apps.ReportConfig',\n 'euth.dashboard.apps.DashboardConfig',\n 'euth.memberships.apps.MembershipsConfig',\n 'euth.documents.apps.DocumentConfig'\n]\n\nMIDDLEWARE_CLASSES = [\n 'django.contrib.sessions.middleware.SessionMiddleware',\n 'django.middleware.common.CommonMiddleware',\n 'django.middleware.csrf.CsrfViewMiddleware',\n 'django.contrib.auth.middleware.AuthenticationMiddleware',\n 'django.contrib.auth.middleware.SessionAuthenticationMiddleware',\n 'django.contrib.messages.middleware.MessageMiddleware',\n 'django.middleware.clickjacking.XFrameOptionsMiddleware',\n 'django.middleware.security.SecurityMiddleware',\n 'django.middleware.locale.LocaleMiddleware',\n 'wagtail.wagtailcore.middleware.SiteMiddleware',\n 'wagtail.wagtailredirects.middleware.RedirectMiddleware',\n]\n\nSITE_ID = 1\n\nROOT_URLCONF = 'euth_wagtail.urls'\n\nLOCALE_PATHS = [os.path.join(BASE_DIR, 'locale')]\n\nTEMPLATES = [\n {\n 'BACKEND': 'django.template.backends.django.DjangoTemplates',\n 'DIRS': [\n os.path.join(PROJECT_DIR, 'templates'),\n ],\n 'APP_DIRS': True,\n 'OPTIONS': {\n 'context_processors': [\n 'django.template.context_processors.debug',\n 'django.template.context_processors.request',\n 'django.contrib.auth.context_processors.auth',\n 'django.contrib.messages.context_processors.messages',\n ],\n },\n },\n]\n\nWSGI_APPLICATION = 'euth_wagtail.wsgi.application'\n\n\n# Database\n# https://docs.djangoproject.com/en/1.9/ref/settings/#databases\n\nDATABASES = {\n 'default': {\n 'ENGINE': 'django.db.backends.sqlite3',\n 'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),\n }\n}\n\n\n# Auth\n# https://docs.djangoproject.com/en/1.8/topics/auth/customizing/\n\nAUTH_USER_MODEL = 'euth_users.User'\n\nAUTHENTICATION_BACKENDS = (\n 'rules.permissions.ObjectPermissionBackend',\n 'django.contrib.auth.backends.ModelBackend',\n)\n\nCKEDITOR_CONFIGS = {\n 'default': {\n 'width': '100%',\n 'toolbar': 'Custom',\n 'toolbar_Custom': [\n ['Bold', 'Italic', 'Underline'],\n ['NumberedList', 'BulletedList'],\n ['Link', 'Unlink']\n ]\n },\n 'document-editor': {\n 'width': '100%',\n 'toolbar': 'Custom',\n 'toolbar_Custom': [\n ['Bold', 'Italic', 'Underline'],\n ['NumberedList', 'BulletedList'],\n ['Link', 'Unlink']\n ]\n }\n}\n\nBLEACH_LIST = {\n 'default' : {\n 'tags': ['p','strong','em','u','ol','li','ul','a'],\n 'attributes': {\n 'a': ['href', 'rel']\n }\n }\n}\n\n# Internationalization\n# https://docs.djangoproject.com/en/1.9/topics/i18n/\n\nLANGUAGE_CODE = 'en'\n\nTIME_ZONE = 'UTC'\n\nUSE_I18N = True\n\nUSE_L10N = True\n\nUSE_TZ = True\n\nLANGUAGES = [\n ('de', _('German')),\n ('en', _('English')),\n ('it', _('Italien')),\n ('fr', _('French')),\n ('sv', _('Swedish')),\n ('sl', _('Slovene')),\n ('da', _('Danish')),\n]\n\nPARLER_LANGUAGES = {\n 1:[{'code': language_code } for language_code, language in LANGUAGES]\n}\n\n# fixtures\n\nFIXTURE_DIRS = [ os.path.join(PROJECT_DIR, 'fixtures') ]\n\nALLOWED_UPLOAD_IMAGES = ('image/png', 'image/jpeg', 'image/gif')\n\nTHUMBNAIL_ALIASES = {\n '': {\n 'organisation_thumbnail': {'size': (400, 400), 'crop': 'smart'},\n 'heroimage': {'size': (1500, 500), 'crop': 'smart'},\n 'project_thumbnail': {'size': (520, 330), 'crop': 'smart'},\n 'organisation_thumbnail': {'size': (740, 540), 'crop': 'smart'},\n 'avatar_small': {'size': (60, 60), 'crop': 'smart'},\n 'avatar_medium': {'size': (300, 300), 'crop': 'smart'},\n }\n}\n# Static files (CSS, JavaScript, Images)\n# https://docs.djangoproject.com/en/1.9/howto/static-files/\n\nSTATICFILES_FINDERS = [\n 'django.contrib.staticfiles.finders.FileSystemFinder',\n 'django.contrib.staticfiles.finders.AppDirectoriesFinder',\n 'compressor.finders.CompressorFinder',\n]\n\n\nWEBPACK_LOADER = {\n 'DEFAULT': {\n 'CACHE': False,\n 'BUNDLE_DIR_NAME': 'bundles/', # must end with slash\n 'STATS_FILE': os.path.join(BASE_DIR, 'webpack-stats.json'),\n 'POLL_INTERVAL': 0.1,\n 'IGNORE': ['.+\\.hot-update.js', '.+\\.map']\n }\n}\n\n\nSTATICFILES_DIRS = [\n os.path.join(BASE_DIR, 'node_modules/jquery/dist'),\n os.path.join(BASE_DIR, 'node_modules/salvattore/dist'),\n os.path.join(BASE_DIR, 'node_modules/bootstrap-sass/assets/javascripts'),\n os.path.join(BASE_DIR, 'node_modules/bootstrap-sass/assets/stylesheets'),\n os.path.join(BASE_DIR, 'node_modules/font-awesome'),\n os.path.join(BASE_DIR, 'node_modules/owl.carousel/dist'),\n os.path.join(PROJECT_DIR, 'static'),\n]\n\nSTATIC_ROOT = os.path.join(BASE_DIR, 'static')\nSTATIC_URL = '/static/'\n\nMEDIA_ROOT = os.path.join(BASE_DIR, 'media')\nMEDIA_URL = '/media/'\n\nCOMPRESS_PRECOMPILERS = (\n ('text/x-scss', 'django_libsass.SassCompiler'),\n)\nLIBSASS_SOURCEMAPS = True\n\n# Wagtail settings\n\nWAGTAIL_SITE_NAME = \"euth_wagtail\"\n\n# Authentification\n\nLOGIN_URL = 'login'\n", "path": "euth_wagtail/settings/base.py"}]}
| 3,343 | 290 |
gh_patches_debug_91
|
rasdani/github-patches
|
git_diff
|
projectmesa__mesa-535
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Issue 523
add networkx to the dependency specified in setup.py for #523 to allow all the tests to run without error when installing with the `pip install -e .` command.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 #!/usr/bin/env python
2 # -*- coding: utf-8 -*-
3 import re
4
5 from setuptools import setup, find_packages
6 from codecs import open
7
8 requires = [
9 'tornado >= 4.2, < 5.0.0',
10 'numpy',
11 'pandas',
12 'tqdm',
13 'click',
14 ]
15
16 version = ''
17 with open('mesa/__init__.py', 'r') as fd:
18 version = re.search(r'^__version__\s*=\s*[\'"]([^\'"]*)[\'"]',
19 fd.read(), re.MULTILINE).group(1)
20
21 with open('README.rst', 'rb', encoding='utf-8') as f:
22 readme = f.read()
23
24 setup(
25 name='Mesa',
26 version=version,
27 description="Agent-based modeling (ABM) in Python 3+",
28 long_description=readme,
29 author='Project Mesa Team',
30 author_email='projectmesa@googlegroups.com',
31 url='https://github.com/projectmesa/mesa',
32 packages=find_packages(),
33 package_data={'mesa': ['visualization/templates/*.html', 'visualization/templates/css/*',
34 'visualization/templates/fonts/*', 'visualization/templates/js/*']},
35 include_package_data=True,
36 install_requires=requires,
37 keywords='agent based modeling model ABM simulation multi-agent',
38 license='Apache 2.0',
39 zip_safe=False,
40 classifiers=[
41 'Topic :: Scientific/Engineering',
42 'Topic :: Scientific/Engineering :: Artificial Life',
43 'Topic :: Scientific/Engineering :: Artificial Intelligence',
44 'Intended Audience :: Science/Research',
45 'Programming Language :: Python :: 3 :: Only',
46 'License :: OSI Approved :: Apache Software License',
47 'Operating System :: OS Independent',
48 'Development Status :: 3 - Alpha',
49 'Natural Language :: English',
50 ],
51 entry_points='''
52 [console_scripts]
53 mesa=mesa.main:cli
54 ''',
55 )
56
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -7,6 +7,7 @@
requires = [
'tornado >= 4.2, < 5.0.0',
+ 'networkx',
'numpy',
'pandas',
'tqdm',
|
{"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -7,6 +7,7 @@\n \n requires = [\n 'tornado >= 4.2, < 5.0.0',\n+ 'networkx',\n 'numpy',\n 'pandas',\n 'tqdm',\n", "issue": "Issue 523\nadd networkx to the dependency specified in setup.py for #523 to allow all the tests to run without error when installing with the `pip install -e .` command.\n", "before_files": [{"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\nimport re\n\nfrom setuptools import setup, find_packages\nfrom codecs import open\n\nrequires = [\n 'tornado >= 4.2, < 5.0.0',\n 'numpy',\n 'pandas',\n 'tqdm',\n 'click',\n]\n\nversion = ''\nwith open('mesa/__init__.py', 'r') as fd:\n version = re.search(r'^__version__\\s*=\\s*[\\'\"]([^\\'\"]*)[\\'\"]',\n fd.read(), re.MULTILINE).group(1)\n\nwith open('README.rst', 'rb', encoding='utf-8') as f:\n readme = f.read()\n\nsetup(\n name='Mesa',\n version=version,\n description=\"Agent-based modeling (ABM) in Python 3+\",\n long_description=readme,\n author='Project Mesa Team',\n author_email='projectmesa@googlegroups.com',\n url='https://github.com/projectmesa/mesa',\n packages=find_packages(),\n package_data={'mesa': ['visualization/templates/*.html', 'visualization/templates/css/*',\n 'visualization/templates/fonts/*', 'visualization/templates/js/*']},\n include_package_data=True,\n install_requires=requires,\n keywords='agent based modeling model ABM simulation multi-agent',\n license='Apache 2.0',\n zip_safe=False,\n classifiers=[\n 'Topic :: Scientific/Engineering',\n 'Topic :: Scientific/Engineering :: Artificial Life',\n 'Topic :: Scientific/Engineering :: Artificial Intelligence',\n 'Intended Audience :: Science/Research',\n 'Programming Language :: Python :: 3 :: Only',\n 'License :: OSI Approved :: Apache Software License',\n 'Operating System :: OS Independent',\n 'Development Status :: 3 - Alpha',\n 'Natural Language :: English',\n ],\n entry_points='''\n [console_scripts]\n mesa=mesa.main:cli\n ''',\n)\n", "path": "setup.py"}], "after_files": [{"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\nimport re\n\nfrom setuptools import setup, find_packages\nfrom codecs import open\n\nrequires = [\n 'tornado >= 4.2, < 5.0.0',\n 'networkx',\n 'numpy',\n 'pandas',\n 'tqdm',\n]\n\nversion = ''\nwith open('mesa/__init__.py', 'r') as fd:\n version = re.search(r'^__version__\\s*=\\s*[\\'\"]([^\\'\"]*)[\\'\"]',\n fd.read(), re.MULTILINE).group(1)\n\nwith open('README.rst', 'rb', encoding='utf-8') as f:\n readme = f.read()\n\nsetup(\n name='Mesa',\n version=version,\n description=\"Agent-based modeling (ABM) in Python 3+\",\n long_description=readme,\n author='Project Mesa Team',\n author_email='projectmesa@googlegroups.com',\n url='https://github.com/projectmesa/mesa',\n packages=find_packages(),\n package_data={'mesa': ['visualization/templates/*.html', 'visualization/templates/css/*',\n 'visualization/templates/fonts/*', 'visualization/templates/js/*']},\n include_package_data=True,\n install_requires=requires,\n keywords='agent based modeling model ABM simulation multi-agent',\n license='Apache 2.0',\n zip_safe=False,\n classifiers=[\n 'Topic :: Scientific/Engineering',\n 'Topic :: Scientific/Engineering :: Artificial Life',\n 'Topic :: Scientific/Engineering :: Artificial Intelligence',\n 'Intended Audience :: Science/Research',\n 'Programming Language :: Python :: 3 :: Only',\n 'License :: OSI Approved :: Apache Software License',\n 'Operating System :: OS Independent',\n 'Development Status :: 3 - Alpha',\n 'Natural Language :: English',\n ],\n)\n", "path": "setup.py"}]}
| 821 | 73 |
gh_patches_debug_24159
|
rasdani/github-patches
|
git_diff
|
pytorch__ignite-399
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Improve State documentation
Hi,
I think the documentation of states and the way they're used by engines is not detailed enough.
Currently the only way to know what attributes are available in the default State object, used by the engines generated by convenience methods such as `create_supervised_trainer`, is to look at the source code.
Some of them are mentioned in the [Concept](https://pytorch.org/ignite/concepts.html#state) part of the documentation but there is still a lot of room for confusion about the way they work and some are missing.
For example, no information about the `batch` attribute which without documentation some might assume to correspond to the batch index (kinda like `epoch` for the epoch index) which is actually `iteration`. Same thing for `epoch` which is used implicitly in examples but no mention of it in the default `State` documentation, and you might think it's obvious but then someone will try to create a default `State` and assume `epoch` is a built-in attribute set to 0 at instantiation when in fact you'll have to search through the source code to figure out that it's not a default attribute and is only created by the default `Engine` through a kwarg passed to the `State` class.
Additionally this kind of explicit documentation of the state should be done every time a custom `State` is created otherwise the user is left to guess what attributes are available in it, what they mean and what format they're in. For example, is `output` the return from the forward pass of your model? The loss?
You'll have to look at `Engine` then find out it's the output of the process_function argument. Then figure out that in the case of the `create_supervised_trainer` that function is `_update` which returns `loss.item()` and in the case of `create_supervised_evaluator` returns `(y_pred, y)`.
Thank you, and sorry for the verbosity but without it I was afraid this would be rejected as trivial.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `ignite/engine/__init__.py`
Content:
```
1 import torch
2
3 from ignite.engine.engine import Engine, State, Events
4 from ignite.utils import convert_tensor
5
6
7 def _prepare_batch(batch, device=None, non_blocking=False):
8 """Prepare batch for training: pass to a device with options.
9
10 """
11 x, y = batch
12 return (convert_tensor(x, device=device, non_blocking=non_blocking),
13 convert_tensor(y, device=device, non_blocking=non_blocking))
14
15
16 def create_supervised_trainer(model, optimizer, loss_fn,
17 device=None, non_blocking=False,
18 prepare_batch=_prepare_batch):
19 """
20 Factory function for creating a trainer for supervised models.
21
22 Args:
23 model (`torch.nn.Module`): the model to train.
24 optimizer (`torch.optim.Optimizer`): the optimizer to use.
25 loss_fn (torch.nn loss function): the loss function to use.
26 device (str, optional): device type specification (default: None).
27 Applies to both model and batches.
28 non_blocking (bool, optional): if True and this copy is between CPU and GPU, the copy may occur asynchronously
29 with respect to the host. For other cases, this argument has no effect.
30 prepare_batch (callable, optional): function that receives `batch`, `device`, `non_blocking` and outputs
31 tuple of tensors `(batch_x, batch_y)`.
32
33 Returns:
34 Engine: a trainer engine with supervised update function.
35 """
36 if device:
37 model.to(device)
38
39 def _update(engine, batch):
40 model.train()
41 optimizer.zero_grad()
42 x, y = prepare_batch(batch, device=device, non_blocking=non_blocking)
43 y_pred = model(x)
44 loss = loss_fn(y_pred, y)
45 loss.backward()
46 optimizer.step()
47 return loss.item()
48
49 return Engine(_update)
50
51
52 def create_supervised_evaluator(model, metrics={},
53 device=None, non_blocking=False,
54 prepare_batch=_prepare_batch):
55 """
56 Factory function for creating an evaluator for supervised models.
57
58 Args:
59 model (`torch.nn.Module`): the model to train.
60 metrics (dict of str - :class:`~ignite.metrics.Metric`): a map of metric names to Metrics.
61 device (str, optional): device type specification (default: None).
62 Applies to both model and batches.
63 non_blocking (bool, optional): if True and this copy is between CPU and GPU, the copy may occur asynchronously
64 with respect to the host. For other cases, this argument has no effect.
65 prepare_batch (callable, optional): function that receives `batch`, `device`, `non_blocking` and outputs
66 tuple of tensors `(batch_x, batch_y)`.
67
68 Returns:
69 Engine: an evaluator engine with supervised inference function.
70 """
71 if device:
72 model.to(device)
73
74 def _inference(engine, batch):
75 model.eval()
76 with torch.no_grad():
77 x, y = prepare_batch(batch, device=device, non_blocking=non_blocking)
78 y_pred = model(x)
79 return y_pred, y
80
81 engine = Engine(_inference)
82
83 for name, metric in metrics.items():
84 metric.attach(engine, name)
85
86 return engine
87
```
Path: `ignite/__init__.py`
Content:
```
1 import ignite.engine
2 import ignite.handlers
3 import ignite.metrics
4 import ignite.exceptions
5 import ignite.contrib
6
7 __version__ = '0.1.2'
8
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/ignite/__init__.py b/ignite/__init__.py
--- a/ignite/__init__.py
+++ b/ignite/__init__.py
@@ -3,5 +3,6 @@
import ignite.metrics
import ignite.exceptions
import ignite.contrib
+import ignite.utils
__version__ = '0.1.2'
diff --git a/ignite/engine/__init__.py b/ignite/engine/__init__.py
--- a/ignite/engine/__init__.py
+++ b/ignite/engine/__init__.py
@@ -30,6 +30,8 @@
prepare_batch (callable, optional): function that receives `batch`, `device`, `non_blocking` and outputs
tuple of tensors `(batch_x, batch_y)`.
+ Note: `engine.state.output` for this engine is the loss of the processed batch.
+
Returns:
Engine: a trainer engine with supervised update function.
"""
@@ -65,6 +67,8 @@
prepare_batch (callable, optional): function that receives `batch`, `device`, `non_blocking` and outputs
tuple of tensors `(batch_x, batch_y)`.
+ Note: `engine.state.output` for this engine is a tuple of `(batch_pred, batch_y)`.
+
Returns:
Engine: an evaluator engine with supervised inference function.
"""
|
{"golden_diff": "diff --git a/ignite/__init__.py b/ignite/__init__.py\n--- a/ignite/__init__.py\n+++ b/ignite/__init__.py\n@@ -3,5 +3,6 @@\n import ignite.metrics\n import ignite.exceptions\n import ignite.contrib\n+import ignite.utils\n \n __version__ = '0.1.2'\ndiff --git a/ignite/engine/__init__.py b/ignite/engine/__init__.py\n--- a/ignite/engine/__init__.py\n+++ b/ignite/engine/__init__.py\n@@ -30,6 +30,8 @@\n prepare_batch (callable, optional): function that receives `batch`, `device`, `non_blocking` and outputs\n tuple of tensors `(batch_x, batch_y)`.\n \n+ Note: `engine.state.output` for this engine is the loss of the processed batch.\n+\n Returns:\n Engine: a trainer engine with supervised update function.\n \"\"\"\n@@ -65,6 +67,8 @@\n prepare_batch (callable, optional): function that receives `batch`, `device`, `non_blocking` and outputs\n tuple of tensors `(batch_x, batch_y)`.\n \n+ Note: `engine.state.output` for this engine is a tuple of `(batch_pred, batch_y)`.\n+\n Returns:\n Engine: an evaluator engine with supervised inference function.\n \"\"\"\n", "issue": "Improve State documentation\nHi,\r\n\r\nI think the documentation of states and the way they're used by engines is not detailed enough.\r\nCurrently the only way to know what attributes are available in the default State object, used by the engines generated by convenience methods such as `create_supervised_trainer`, is to look at the source code. \r\nSome of them are mentioned in the [Concept](https://pytorch.org/ignite/concepts.html#state) part of the documentation but there is still a lot of room for confusion about the way they work and some are missing.\r\nFor example, no information about the `batch` attribute which without documentation some might assume to correspond to the batch index (kinda like `epoch` for the epoch index) which is actually `iteration`. Same thing for `epoch` which is used implicitly in examples but no mention of it in the default `State` documentation, and you might think it's obvious but then someone will try to create a default `State` and assume `epoch` is a built-in attribute set to 0 at instantiation when in fact you'll have to search through the source code to figure out that it's not a default attribute and is only created by the default `Engine` through a kwarg passed to the `State` class.\r\n\r\nAdditionally this kind of explicit documentation of the state should be done every time a custom `State` is created otherwise the user is left to guess what attributes are available in it, what they mean and what format they're in. For example, is `output` the return from the forward pass of your model? The loss?\r\nYou'll have to look at `Engine` then find out it's the output of the process_function argument. Then figure out that in the case of the `create_supervised_trainer` that function is `_update` which returns `loss.item()` and in the case of `create_supervised_evaluator` returns `(y_pred, y)`.\r\n\r\nThank you, and sorry for the verbosity but without it I was afraid this would be rejected as trivial.\n", "before_files": [{"content": "import torch\n\nfrom ignite.engine.engine import Engine, State, Events\nfrom ignite.utils import convert_tensor\n\n\ndef _prepare_batch(batch, device=None, non_blocking=False):\n \"\"\"Prepare batch for training: pass to a device with options.\n\n \"\"\"\n x, y = batch\n return (convert_tensor(x, device=device, non_blocking=non_blocking),\n convert_tensor(y, device=device, non_blocking=non_blocking))\n\n\ndef create_supervised_trainer(model, optimizer, loss_fn,\n device=None, non_blocking=False,\n prepare_batch=_prepare_batch):\n \"\"\"\n Factory function for creating a trainer for supervised models.\n\n Args:\n model (`torch.nn.Module`): the model to train.\n optimizer (`torch.optim.Optimizer`): the optimizer to use.\n loss_fn (torch.nn loss function): the loss function to use.\n device (str, optional): device type specification (default: None).\n Applies to both model and batches.\n non_blocking (bool, optional): if True and this copy is between CPU and GPU, the copy may occur asynchronously\n with respect to the host. For other cases, this argument has no effect.\n prepare_batch (callable, optional): function that receives `batch`, `device`, `non_blocking` and outputs\n tuple of tensors `(batch_x, batch_y)`.\n\n Returns:\n Engine: a trainer engine with supervised update function.\n \"\"\"\n if device:\n model.to(device)\n\n def _update(engine, batch):\n model.train()\n optimizer.zero_grad()\n x, y = prepare_batch(batch, device=device, non_blocking=non_blocking)\n y_pred = model(x)\n loss = loss_fn(y_pred, y)\n loss.backward()\n optimizer.step()\n return loss.item()\n\n return Engine(_update)\n\n\ndef create_supervised_evaluator(model, metrics={},\n device=None, non_blocking=False,\n prepare_batch=_prepare_batch):\n \"\"\"\n Factory function for creating an evaluator for supervised models.\n\n Args:\n model (`torch.nn.Module`): the model to train.\n metrics (dict of str - :class:`~ignite.metrics.Metric`): a map of metric names to Metrics.\n device (str, optional): device type specification (default: None).\n Applies to both model and batches.\n non_blocking (bool, optional): if True and this copy is between CPU and GPU, the copy may occur asynchronously\n with respect to the host. For other cases, this argument has no effect.\n prepare_batch (callable, optional): function that receives `batch`, `device`, `non_blocking` and outputs\n tuple of tensors `(batch_x, batch_y)`.\n\n Returns:\n Engine: an evaluator engine with supervised inference function.\n \"\"\"\n if device:\n model.to(device)\n\n def _inference(engine, batch):\n model.eval()\n with torch.no_grad():\n x, y = prepare_batch(batch, device=device, non_blocking=non_blocking)\n y_pred = model(x)\n return y_pred, y\n\n engine = Engine(_inference)\n\n for name, metric in metrics.items():\n metric.attach(engine, name)\n\n return engine\n", "path": "ignite/engine/__init__.py"}, {"content": "import ignite.engine\nimport ignite.handlers\nimport ignite.metrics\nimport ignite.exceptions\nimport ignite.contrib\n\n__version__ = '0.1.2'\n", "path": "ignite/__init__.py"}], "after_files": [{"content": "import torch\n\nfrom ignite.engine.engine import Engine, State, Events\nfrom ignite.utils import convert_tensor\n\n\ndef _prepare_batch(batch, device=None, non_blocking=False):\n \"\"\"Prepare batch for training: pass to a device with options.\n\n \"\"\"\n x, y = batch\n return (convert_tensor(x, device=device, non_blocking=non_blocking),\n convert_tensor(y, device=device, non_blocking=non_blocking))\n\n\ndef create_supervised_trainer(model, optimizer, loss_fn,\n device=None, non_blocking=False,\n prepare_batch=_prepare_batch):\n \"\"\"\n Factory function for creating a trainer for supervised models.\n\n Args:\n model (`torch.nn.Module`): the model to train.\n optimizer (`torch.optim.Optimizer`): the optimizer to use.\n loss_fn (torch.nn loss function): the loss function to use.\n device (str, optional): device type specification (default: None).\n Applies to both model and batches.\n non_blocking (bool, optional): if True and this copy is between CPU and GPU, the copy may occur asynchronously\n with respect to the host. For other cases, this argument has no effect.\n prepare_batch (callable, optional): function that receives `batch`, `device`, `non_blocking` and outputs\n tuple of tensors `(batch_x, batch_y)`.\n\n Note: `engine.state.output` for this engine is the loss of the processed batch.\n\n Returns:\n Engine: a trainer engine with supervised update function.\n \"\"\"\n if device:\n model.to(device)\n\n def _update(engine, batch):\n model.train()\n optimizer.zero_grad()\n x, y = prepare_batch(batch, device=device, non_blocking=non_blocking)\n y_pred = model(x)\n loss = loss_fn(y_pred, y)\n loss.backward()\n optimizer.step()\n return loss.item()\n\n return Engine(_update)\n\n\ndef create_supervised_evaluator(model, metrics={},\n device=None, non_blocking=False,\n prepare_batch=_prepare_batch):\n \"\"\"\n Factory function for creating an evaluator for supervised models.\n\n Args:\n model (`torch.nn.Module`): the model to train.\n metrics (dict of str - :class:`~ignite.metrics.Metric`): a map of metric names to Metrics.\n device (str, optional): device type specification (default: None).\n Applies to both model and batches.\n non_blocking (bool, optional): if True and this copy is between CPU and GPU, the copy may occur asynchronously\n with respect to the host. For other cases, this argument has no effect.\n prepare_batch (callable, optional): function that receives `batch`, `device`, `non_blocking` and outputs\n tuple of tensors `(batch_x, batch_y)`.\n\n Note: `engine.state.output` for this engine is a tuple of `(batch_pred, batch_y)`.\n\n Returns:\n Engine: an evaluator engine with supervised inference function.\n \"\"\"\n if device:\n model.to(device)\n\n def _inference(engine, batch):\n model.eval()\n with torch.no_grad():\n x, y = prepare_batch(batch, device=device, non_blocking=non_blocking)\n y_pred = model(x)\n return y_pred, y\n\n engine = Engine(_inference)\n\n for name, metric in metrics.items():\n metric.attach(engine, name)\n\n return engine\n", "path": "ignite/engine/__init__.py"}, {"content": "import ignite.engine\nimport ignite.handlers\nimport ignite.metrics\nimport ignite.exceptions\nimport ignite.contrib\nimport ignite.utils\n\n__version__ = '0.1.2'\n", "path": "ignite/__init__.py"}]}
| 1,571 | 290 |
gh_patches_debug_1437
|
rasdani/github-patches
|
git_diff
|
pytorch__text-81
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
min_freq=0 bug
**Noticed:**
```
>>>some_field.build_vocab(some_dataset, min_freq=0)
>>>padding_idx = some_field.vocab.stoi['<pad'>]
>>>print(padding_idx, '<pad>')
12 <pad>
```
Looks like <pad> is not equal to 1 which is not okay.
Printed `stoi` and `itos` as well:
```
>>>print(some_field.vocab.stoi)
defaultdict(<function Vocab.__init__.<locals>.<lambda> at 0x103f4f0d0>, {'<pad>': 12, '1': 2, '2': 3, '9': 4, '0': 5, '5': 6, '4': 7, '6': 8, '8': 9, '3': 10, '7': 11, '<unk>': 13})
>>>print(some_field.vocab.itos)
['<unk>', '<pad>', '1', '2', '9', '0', '5', '4', '6', '8', '3', '7', '<pad>', '<unk>']
```
**Possible reason:**
Counter subtract does remove the specials but puts their count at 0.
`counter.subtract({tok: counter[tok] for tok in ['<unk>'] + specials})`
**Possible solution:**
Throw an error if `min_freq < 1`
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `torchtext/vocab.py`
Content:
```
1 from __future__ import print_function
2 import array
3 from collections import defaultdict
4 import os
5 import zipfile
6
7 import six
8 from six.moves.urllib.request import urlretrieve
9 import torch
10 from tqdm import trange, tqdm
11
12 from .utils import reporthook
13
14 URL = {
15 'glove.42B': 'http://nlp.stanford.edu/data/glove.42B.300d.zip',
16 'glove.840B': 'http://nlp.stanford.edu/data/glove.840B.300d.zip',
17 'glove.twitter.27B': 'http://nlp.stanford.edu/data/glove.twitter.27B.zip',
18 'glove.6B': 'http://nlp.stanford.edu/data/glove.6B.zip'
19 }
20
21
22 def load_word_vectors(root, wv_type, dim):
23 """Load word vectors from a path, trying .pt, .txt, and .zip extensions."""
24 if isinstance(dim, int):
25 dim = str(dim) + 'd'
26 fname = os.path.join(root, wv_type + '.' + dim)
27 if os.path.isfile(fname + '.pt'):
28 fname_pt = fname + '.pt'
29 print('loading word vectors from', fname_pt)
30 return torch.load(fname_pt)
31 if os.path.isfile(fname + '.txt'):
32 fname_txt = fname + '.txt'
33 cm = open(fname_txt, 'rb')
34 cm = [line for line in cm]
35 elif os.path.basename(wv_type) in URL:
36 url = URL[wv_type]
37 print('downloading word vectors from {}'.format(url))
38 filename = os.path.basename(fname)
39 if not os.path.exists(root):
40 os.makedirs(root)
41 with tqdm(unit='B', unit_scale=True, miniters=1, desc=filename) as t:
42 fname, _ = urlretrieve(url, fname, reporthook=reporthook(t))
43 with zipfile.ZipFile(fname, "r") as zf:
44 print('extracting word vectors into {}'.format(root))
45 zf.extractall(root)
46 if not os.path.isfile(fname + '.txt'):
47 raise RuntimeError('no word vectors of requested dimension found')
48 return load_word_vectors(root, wv_type, dim)
49 else:
50 raise RuntimeError('unable to load word vectors')
51
52 wv_tokens, wv_arr, wv_size = [], array.array('d'), None
53 if cm is not None:
54 print("Loading word vectors from {}".format(fname_txt))
55 for line in trange(len(cm)):
56 entries = cm[line].strip().split(b' ')
57 word, entries = entries[0], entries[1:]
58 if wv_size is None:
59 wv_size = len(entries)
60 try:
61 if isinstance(word, six.binary_type):
62 word = word.decode('utf-8')
63 except:
64 print('non-UTF8 token', repr(word), 'ignored')
65 continue
66 wv_arr.extend(float(x) for x in entries)
67 wv_tokens.append(word)
68
69 wv_dict = {word: i for i, word in enumerate(wv_tokens)}
70 wv_arr = torch.Tensor(wv_arr).view(-1, wv_size)
71 ret = (wv_dict, wv_arr, wv_size)
72 torch.save(ret, fname + '.pt')
73 return ret
74
75
76 class Vocab(object):
77 """Defines a vocabulary object that will be used to numericalize a field.
78
79 Attributes:
80 freqs: A collections.Counter object holding the frequencies of tokens
81 in the data used to build the Vocab.
82 stoi: A collections.defaultdict instance mapping token strings to
83 numerical identifiers.
84 itos: A list of token strings indexed by their numerical identifiers.
85 vectors: A Tensor containing word vectors for the tokens in the Vocab,
86 if a word vector file has been provided.
87 """
88
89 def __init__(self, counter, max_size=None, min_freq=1, wv_dir=os.getcwd(),
90 wv_type=None, wv_dim=300, unk_init='random',
91 specials=['<pad>'], fill_from_vectors=False):
92 """Create a Vocab object from a collections.Counter.
93
94 Arguments:
95 counter: collections.Counter object holding the frequencies of
96 each value found in the data.
97 max_size: The maximum size of the vocabulary, or None for no
98 maximum. Default: None.
99 min_freq: The minimum frequency needed to include a token in the
100 vocabulary. Default: 1.
101 wv_dir: directory containing word vector file and destination for
102 downloaded word vector files
103 wv_type: type of word vectors; None for no word vectors
104 wv_dim: dimension of word vectors
105 specials: The list of special tokens (e.g., padding or eos) that
106 will be prepended to the vocabulary in addition to an <unk>
107 token.
108 fill_from_vectors: Whether to add to the vocabulary every token
109 for which a word vector specified by vectors is present
110 even if the token does not appear in the provided data.
111 unk_init: default to random initialization for word vectors not in the
112 pretrained word vector file; otherwise set to zero
113 """
114 self.freqs = counter.copy()
115 self.unk_init = unk_init
116 counter.update(['<unk>'] + specials)
117
118 if wv_type is not None:
119 wv_dict, wv_arr, self.wv_size = load_word_vectors(wv_dir, wv_type, wv_dim)
120
121 if fill_from_vectors:
122 counter.update(wv_dict.keys())
123
124 self.stoi = defaultdict(lambda: 0)
125 self.stoi.update({tok: i + 1 for i, tok in enumerate(specials)})
126 self.itos = ['<unk>'] + specials
127
128 counter.subtract({tok: counter[tok] for tok in ['<unk>'] + specials})
129 max_size = None if max_size is None else max_size + len(self.itos)
130
131 # sort by frequency, then alphabetically
132 words = sorted(counter.items(), key=lambda tup: tup[0])
133 words.sort(key=lambda tup: tup[1], reverse=True)
134
135 for k, v in words:
136 if v < min_freq or len(self.itos) == max_size:
137 break
138 self.itos.append(k)
139 self.stoi[k] = len(self.itos) - 1
140
141 if wv_type is not None:
142 self.set_vectors(wv_dict, wv_arr)
143
144 def __len__(self):
145 return len(self.itos)
146
147 def load_vectors(self, wv_dir=os.getcwd(), wv_type=None, wv_dim=300,
148 unk_init='random'):
149 """Loads word vectors into the vocab
150
151 Arguments:
152 wv_dir: directory containing word vector file and destination for
153 downloaded word vector files
154 wv_type: type of word vectors; None for no word vectors
155 wv_dim: dimension of word vectors
156
157 unk_init: default to random initialization for unknown word vectors;
158 otherwise set to zero
159 """
160 self.unk_init = unk_init
161 wv_dict, wv_arr, self.wv_size = load_word_vectors(wv_dir, wv_type, wv_dim)
162 self.set_vectors(wv_dict, wv_arr)
163
164 def set_vectors(self, wv_dict, wv_arr):
165 self.vectors = torch.Tensor(len(self), self.wv_size)
166 self.vectors.normal_(0, 1) if self.unk_init == 'random' else self.vectors.zero_()
167 for i, token in enumerate(self.itos):
168 wv_index = wv_dict.get(token, None)
169 if wv_index is not None:
170 self.vectors[i] = wv_arr[wv_index]
171
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/torchtext/vocab.py b/torchtext/vocab.py
--- a/torchtext/vocab.py
+++ b/torchtext/vocab.py
@@ -113,6 +113,7 @@
"""
self.freqs = counter.copy()
self.unk_init = unk_init
+ min_freq = max(min_freq, 1)
counter.update(['<unk>'] + specials)
if wv_type is not None:
|
{"golden_diff": "diff --git a/torchtext/vocab.py b/torchtext/vocab.py\n--- a/torchtext/vocab.py\n+++ b/torchtext/vocab.py\n@@ -113,6 +113,7 @@\n \"\"\"\n self.freqs = counter.copy()\n self.unk_init = unk_init\n+ min_freq = max(min_freq, 1)\n counter.update(['<unk>'] + specials)\n \n if wv_type is not None:\n", "issue": "min_freq=0 bug\n**Noticed:**\r\n```\r\n>>>some_field.build_vocab(some_dataset, min_freq=0)\r\n>>>padding_idx = some_field.vocab.stoi['<pad'>]\r\n>>>print(padding_idx, '<pad>')\r\n12 <pad>\r\n```\r\n\r\nLooks like <pad> is not equal to 1 which is not okay.\r\n\r\nPrinted `stoi` and `itos` as well:\r\n```\r\n>>>print(some_field.vocab.stoi)\r\ndefaultdict(<function Vocab.__init__.<locals>.<lambda> at 0x103f4f0d0>, {'<pad>': 12, '1': 2, '2': 3, '9': 4, '0': 5, '5': 6, '4': 7, '6': 8, '8': 9, '3': 10, '7': 11, '<unk>': 13})\r\n>>>print(some_field.vocab.itos)\r\n['<unk>', '<pad>', '1', '2', '9', '0', '5', '4', '6', '8', '3', '7', '<pad>', '<unk>']\r\n```\r\n\r\n**Possible reason:**\r\nCounter subtract does remove the specials but puts their count at 0.\r\n`counter.subtract({tok: counter[tok] for tok in ['<unk>'] + specials})`\r\n\r\n**Possible solution:**\r\nThrow an error if `min_freq < 1`\n", "before_files": [{"content": "from __future__ import print_function\nimport array\nfrom collections import defaultdict\nimport os\nimport zipfile\n\nimport six\nfrom six.moves.urllib.request import urlretrieve\nimport torch\nfrom tqdm import trange, tqdm\n\nfrom .utils import reporthook\n\nURL = {\n 'glove.42B': 'http://nlp.stanford.edu/data/glove.42B.300d.zip',\n 'glove.840B': 'http://nlp.stanford.edu/data/glove.840B.300d.zip',\n 'glove.twitter.27B': 'http://nlp.stanford.edu/data/glove.twitter.27B.zip',\n 'glove.6B': 'http://nlp.stanford.edu/data/glove.6B.zip'\n}\n\n\ndef load_word_vectors(root, wv_type, dim):\n \"\"\"Load word vectors from a path, trying .pt, .txt, and .zip extensions.\"\"\"\n if isinstance(dim, int):\n dim = str(dim) + 'd'\n fname = os.path.join(root, wv_type + '.' + dim)\n if os.path.isfile(fname + '.pt'):\n fname_pt = fname + '.pt'\n print('loading word vectors from', fname_pt)\n return torch.load(fname_pt)\n if os.path.isfile(fname + '.txt'):\n fname_txt = fname + '.txt'\n cm = open(fname_txt, 'rb')\n cm = [line for line in cm]\n elif os.path.basename(wv_type) in URL:\n url = URL[wv_type]\n print('downloading word vectors from {}'.format(url))\n filename = os.path.basename(fname)\n if not os.path.exists(root):\n os.makedirs(root)\n with tqdm(unit='B', unit_scale=True, miniters=1, desc=filename) as t:\n fname, _ = urlretrieve(url, fname, reporthook=reporthook(t))\n with zipfile.ZipFile(fname, \"r\") as zf:\n print('extracting word vectors into {}'.format(root))\n zf.extractall(root)\n if not os.path.isfile(fname + '.txt'):\n raise RuntimeError('no word vectors of requested dimension found')\n return load_word_vectors(root, wv_type, dim)\n else:\n raise RuntimeError('unable to load word vectors')\n\n wv_tokens, wv_arr, wv_size = [], array.array('d'), None\n if cm is not None:\n print(\"Loading word vectors from {}\".format(fname_txt))\n for line in trange(len(cm)):\n entries = cm[line].strip().split(b' ')\n word, entries = entries[0], entries[1:]\n if wv_size is None:\n wv_size = len(entries)\n try:\n if isinstance(word, six.binary_type):\n word = word.decode('utf-8')\n except:\n print('non-UTF8 token', repr(word), 'ignored')\n continue\n wv_arr.extend(float(x) for x in entries)\n wv_tokens.append(word)\n\n wv_dict = {word: i for i, word in enumerate(wv_tokens)}\n wv_arr = torch.Tensor(wv_arr).view(-1, wv_size)\n ret = (wv_dict, wv_arr, wv_size)\n torch.save(ret, fname + '.pt')\n return ret\n\n\nclass Vocab(object):\n \"\"\"Defines a vocabulary object that will be used to numericalize a field.\n\n Attributes:\n freqs: A collections.Counter object holding the frequencies of tokens\n in the data used to build the Vocab.\n stoi: A collections.defaultdict instance mapping token strings to\n numerical identifiers.\n itos: A list of token strings indexed by their numerical identifiers.\n vectors: A Tensor containing word vectors for the tokens in the Vocab,\n if a word vector file has been provided.\n \"\"\"\n\n def __init__(self, counter, max_size=None, min_freq=1, wv_dir=os.getcwd(),\n wv_type=None, wv_dim=300, unk_init='random',\n specials=['<pad>'], fill_from_vectors=False):\n \"\"\"Create a Vocab object from a collections.Counter.\n\n Arguments:\n counter: collections.Counter object holding the frequencies of\n each value found in the data.\n max_size: The maximum size of the vocabulary, or None for no\n maximum. Default: None.\n min_freq: The minimum frequency needed to include a token in the\n vocabulary. Default: 1.\n wv_dir: directory containing word vector file and destination for\n downloaded word vector files\n wv_type: type of word vectors; None for no word vectors\n wv_dim: dimension of word vectors\n specials: The list of special tokens (e.g., padding or eos) that\n will be prepended to the vocabulary in addition to an <unk>\n token.\n fill_from_vectors: Whether to add to the vocabulary every token\n for which a word vector specified by vectors is present\n even if the token does not appear in the provided data.\n unk_init: default to random initialization for word vectors not in the\n pretrained word vector file; otherwise set to zero\n \"\"\"\n self.freqs = counter.copy()\n self.unk_init = unk_init\n counter.update(['<unk>'] + specials)\n\n if wv_type is not None:\n wv_dict, wv_arr, self.wv_size = load_word_vectors(wv_dir, wv_type, wv_dim)\n\n if fill_from_vectors:\n counter.update(wv_dict.keys())\n\n self.stoi = defaultdict(lambda: 0)\n self.stoi.update({tok: i + 1 for i, tok in enumerate(specials)})\n self.itos = ['<unk>'] + specials\n\n counter.subtract({tok: counter[tok] for tok in ['<unk>'] + specials})\n max_size = None if max_size is None else max_size + len(self.itos)\n\n # sort by frequency, then alphabetically\n words = sorted(counter.items(), key=lambda tup: tup[0])\n words.sort(key=lambda tup: tup[1], reverse=True)\n\n for k, v in words:\n if v < min_freq or len(self.itos) == max_size:\n break\n self.itos.append(k)\n self.stoi[k] = len(self.itos) - 1\n\n if wv_type is not None:\n self.set_vectors(wv_dict, wv_arr)\n\n def __len__(self):\n return len(self.itos)\n\n def load_vectors(self, wv_dir=os.getcwd(), wv_type=None, wv_dim=300,\n unk_init='random'):\n \"\"\"Loads word vectors into the vocab\n\n Arguments:\n wv_dir: directory containing word vector file and destination for\n downloaded word vector files\n wv_type: type of word vectors; None for no word vectors\n wv_dim: dimension of word vectors\n\n unk_init: default to random initialization for unknown word vectors;\n otherwise set to zero\n \"\"\"\n self.unk_init = unk_init\n wv_dict, wv_arr, self.wv_size = load_word_vectors(wv_dir, wv_type, wv_dim)\n self.set_vectors(wv_dict, wv_arr)\n\n def set_vectors(self, wv_dict, wv_arr):\n self.vectors = torch.Tensor(len(self), self.wv_size)\n self.vectors.normal_(0, 1) if self.unk_init == 'random' else self.vectors.zero_()\n for i, token in enumerate(self.itos):\n wv_index = wv_dict.get(token, None)\n if wv_index is not None:\n self.vectors[i] = wv_arr[wv_index]\n", "path": "torchtext/vocab.py"}], "after_files": [{"content": "from __future__ import print_function\nimport array\nfrom collections import defaultdict\nimport os\nimport zipfile\n\nimport six\nfrom six.moves.urllib.request import urlretrieve\nimport torch\nfrom tqdm import trange, tqdm\n\nfrom .utils import reporthook\n\nURL = {\n 'glove.42B': 'http://nlp.stanford.edu/data/glove.42B.300d.zip',\n 'glove.840B': 'http://nlp.stanford.edu/data/glove.840B.300d.zip',\n 'glove.twitter.27B': 'http://nlp.stanford.edu/data/glove.twitter.27B.zip',\n 'glove.6B': 'http://nlp.stanford.edu/data/glove.6B.zip'\n}\n\n\ndef load_word_vectors(root, wv_type, dim):\n \"\"\"Load word vectors from a path, trying .pt, .txt, and .zip extensions.\"\"\"\n if isinstance(dim, int):\n dim = str(dim) + 'd'\n fname = os.path.join(root, wv_type + '.' + dim)\n if os.path.isfile(fname + '.pt'):\n fname_pt = fname + '.pt'\n print('loading word vectors from', fname_pt)\n return torch.load(fname_pt)\n if os.path.isfile(fname + '.txt'):\n fname_txt = fname + '.txt'\n cm = open(fname_txt, 'rb')\n cm = [line for line in cm]\n elif os.path.basename(wv_type) in URL:\n url = URL[wv_type]\n print('downloading word vectors from {}'.format(url))\n filename = os.path.basename(fname)\n if not os.path.exists(root):\n os.makedirs(root)\n with tqdm(unit='B', unit_scale=True, miniters=1, desc=filename) as t:\n fname, _ = urlretrieve(url, fname, reporthook=reporthook(t))\n with zipfile.ZipFile(fname, \"r\") as zf:\n print('extracting word vectors into {}'.format(root))\n zf.extractall(root)\n if not os.path.isfile(fname + '.txt'):\n raise RuntimeError('no word vectors of requested dimension found')\n return load_word_vectors(root, wv_type, dim)\n else:\n raise RuntimeError('unable to load word vectors')\n\n wv_tokens, wv_arr, wv_size = [], array.array('d'), None\n if cm is not None:\n print(\"Loading word vectors from {}\".format(fname_txt))\n for line in trange(len(cm)):\n entries = cm[line].strip().split(b' ')\n word, entries = entries[0], entries[1:]\n if wv_size is None:\n wv_size = len(entries)\n try:\n if isinstance(word, six.binary_type):\n word = word.decode('utf-8')\n except:\n print('non-UTF8 token', repr(word), 'ignored')\n continue\n wv_arr.extend(float(x) for x in entries)\n wv_tokens.append(word)\n\n wv_dict = {word: i for i, word in enumerate(wv_tokens)}\n wv_arr = torch.Tensor(wv_arr).view(-1, wv_size)\n ret = (wv_dict, wv_arr, wv_size)\n torch.save(ret, fname + '.pt')\n return ret\n\n\nclass Vocab(object):\n \"\"\"Defines a vocabulary object that will be used to numericalize a field.\n\n Attributes:\n freqs: A collections.Counter object holding the frequencies of tokens\n in the data used to build the Vocab.\n stoi: A collections.defaultdict instance mapping token strings to\n numerical identifiers.\n itos: A list of token strings indexed by their numerical identifiers.\n vectors: A Tensor containing word vectors for the tokens in the Vocab,\n if a word vector file has been provided.\n \"\"\"\n\n def __init__(self, counter, max_size=None, min_freq=1, wv_dir=os.getcwd(),\n wv_type=None, wv_dim=300, unk_init='random',\n specials=['<pad>'], fill_from_vectors=False):\n \"\"\"Create a Vocab object from a collections.Counter.\n\n Arguments:\n counter: collections.Counter object holding the frequencies of\n each value found in the data.\n max_size: The maximum size of the vocabulary, or None for no\n maximum. Default: None.\n min_freq: The minimum frequency needed to include a token in the\n vocabulary. Default: 1.\n wv_dir: directory containing word vector file and destination for\n downloaded word vector files\n wv_type: type of word vectors; None for no word vectors\n wv_dim: dimension of word vectors\n specials: The list of special tokens (e.g., padding or eos) that\n will be prepended to the vocabulary in addition to an <unk>\n token.\n fill_from_vectors: Whether to add to the vocabulary every token\n for which a word vector specified by vectors is present\n even if the token does not appear in the provided data.\n unk_init: default to random initialization for word vectors not in the\n pretrained word vector file; otherwise set to zero\n \"\"\"\n self.freqs = counter.copy()\n self.unk_init = unk_init\n min_freq = max(min_freq, 1)\n counter.update(['<unk>'] + specials)\n\n if wv_type is not None:\n wv_dict, wv_arr, self.wv_size = load_word_vectors(wv_dir, wv_type, wv_dim)\n\n if fill_from_vectors:\n counter.update(wv_dict.keys())\n\n self.stoi = defaultdict(lambda: 0)\n self.stoi.update({tok: i + 1 for i, tok in enumerate(specials)})\n self.itos = ['<unk>'] + specials\n\n counter.subtract({tok: counter[tok] for tok in ['<unk>'] + specials})\n max_size = None if max_size is None else max_size + len(self.itos)\n\n # sort by frequency, then alphabetically\n words = sorted(counter.items(), key=lambda tup: tup[0])\n words.sort(key=lambda tup: tup[1], reverse=True)\n\n for k, v in words:\n if v < min_freq or len(self.itos) == max_size:\n break\n self.itos.append(k)\n self.stoi[k] = len(self.itos) - 1\n\n if wv_type is not None:\n self.set_vectors(wv_dict, wv_arr)\n\n def __len__(self):\n return len(self.itos)\n\n def load_vectors(self, wv_dir=os.getcwd(), wv_type=None, wv_dim=300,\n unk_init='random'):\n \"\"\"Loads word vectors into the vocab\n\n Arguments:\n wv_dir: directory containing word vector file and destination for\n downloaded word vector files\n wv_type: type of word vectors; None for no word vectors\n wv_dim: dimension of word vectors\n\n unk_init: default to random initialization for unknown word vectors;\n otherwise set to zero\n \"\"\"\n self.unk_init = unk_init\n wv_dict, wv_arr, self.wv_size = load_word_vectors(wv_dir, wv_type, wv_dim)\n self.set_vectors(wv_dict, wv_arr)\n\n def set_vectors(self, wv_dict, wv_arr):\n self.vectors = torch.Tensor(len(self), self.wv_size)\n self.vectors.normal_(0, 1) if self.unk_init == 'random' else self.vectors.zero_()\n for i, token in enumerate(self.itos):\n wv_index = wv_dict.get(token, None)\n if wv_index is not None:\n self.vectors[i] = wv_arr[wv_index]\n", "path": "torchtext/vocab.py"}]}
| 2,664 | 102 |
gh_patches_debug_21524
|
rasdani/github-patches
|
git_diff
|
mindsdb__mindsdb-981
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Route to check connection
At this moment in GUI possible check connection only for already exists integration. Route need for checking connection before integration created.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `mindsdb/api/http/namespaces/config.py`
Content:
```
1 import copy
2 import traceback
3
4 from flask import request
5 from flask_restx import Resource, abort
6 from flask import current_app as ca
7
8 from mindsdb.api.http.namespaces.configs.config import ns_conf
9 from mindsdb.interfaces.database.database import DatabaseWrapper
10 from mindsdb.utilities.functions import get_all_models_meta_data
11
12
13 def get_integration(name):
14 integrations = ca.config_obj.get('integrations', {})
15 return integrations.get(name)
16
17
18 @ns_conf.route('/integrations')
19 @ns_conf.param('name', 'List all database integration')
20 class ListIntegration(Resource):
21 @ns_conf.doc('get_integrations')
22 def get(self):
23 return {'integrations': [k for k in ca.config_obj.get('integrations', {})]}
24
25
26 @ns_conf.route('/all_integrations')
27 @ns_conf.param('name', 'List all database integration')
28 class AllIntegration(Resource):
29 @ns_conf.doc('get_all_integrations')
30 def get(self):
31 integrations = copy.deepcopy(
32 ca.config_obj.get('integrations', {})
33 )
34 for integration in integrations.values():
35 if 'password' in integration:
36 integration['password'] = None
37 return integrations
38
39
40 @ns_conf.route('/integrations/<name>')
41 @ns_conf.param('name', 'Database integration')
42 class Integration(Resource):
43 @ns_conf.doc('get_integration')
44 def get(self, name):
45 integration = get_integration(name)
46 if integration is None:
47 abort(404, f'Can\'t find database integration: {name}')
48 integration = copy.deepcopy(integration)
49 if 'password' in integration:
50 integration['password'] = None
51 return integration
52
53 @ns_conf.doc('put_integration')
54 def put(self, name):
55 params = request.json.get('params')
56 if not isinstance(params, dict):
57 abort(400, "type of 'params' must be dict")
58 integration = get_integration(name)
59 if integration is not None:
60 abort(400, f"Integration with name '{name}' already exists")
61 try:
62 if 'enabled' in params:
63 params['publish'] = params['enabled']
64 del params['enabled']
65 ca.config_obj.add_db_integration(name, params)
66
67 mdb = ca.mindsdb_native
68 cst = ca.custom_models
69 model_data_arr = get_all_models_meta_data(mdb, cst)
70 dbw = DatabaseWrapper(ca.config_obj)
71 dbw.register_predictors(model_data_arr)
72 except Exception as e:
73 print(traceback.format_exc())
74 abort(500, f'Error during config update: {str(e)}')
75 return '', 200
76
77 @ns_conf.doc('delete_integration')
78 def delete(self, name):
79 integration = get_integration(name)
80 if integration is None:
81 abort(400, f"Nothing to delete. '{name}' not exists.")
82 try:
83 ca.config_obj.remove_db_integration(name)
84 except Exception as e:
85 print(traceback.format_exc())
86 abort(500, f'Error during integration delete: {str(e)}')
87 return '', 200
88
89 @ns_conf.doc('modify_integration')
90 def post(self, name):
91 params = request.json.get('params')
92 if not isinstance(params, dict):
93 abort(400, "type of 'params' must be dict")
94 integration = get_integration(name)
95 if integration is None:
96 abort(400, f"Nothin to modify. '{name}' not exists.")
97 try:
98 if 'enabled' in params:
99 params['publish'] = params['enabled']
100 del params['enabled']
101 ca.config_obj.modify_db_integration(name, params)
102 DatabaseWrapper(ca.config_obj)
103 except Exception as e:
104 print(traceback.format_exc())
105 abort(500, f'Error during integration modifycation: {str(e)}')
106 return '', 200
107
108
109 @ns_conf.route('/integrations/<name>/check')
110 @ns_conf.param('name', 'Database integration checks')
111 class Check(Resource):
112 @ns_conf.doc('check')
113 def get(self, name):
114 if get_integration(name) is None:
115 abort(404, f'Can\'t find database integration: {name}')
116 dbw = DatabaseWrapper(ca.config_obj)
117 connections = dbw.check_connections()
118 return connections.get(name, False), 200
119
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/mindsdb/api/http/namespaces/config.py b/mindsdb/api/http/namespaces/config.py
--- a/mindsdb/api/http/namespaces/config.py
+++ b/mindsdb/api/http/namespaces/config.py
@@ -55,6 +55,11 @@
params = request.json.get('params')
if not isinstance(params, dict):
abort(400, "type of 'params' must be dict")
+
+ is_test = params.get('test', False)
+ if is_test:
+ del params['test']
+
integration = get_integration(name)
if integration is not None:
abort(400, f"Integration with name '{name}' already exists")
@@ -72,6 +77,12 @@
except Exception as e:
print(traceback.format_exc())
abort(500, f'Error during config update: {str(e)}')
+
+ if is_test:
+ cons = dbw.check_connections()
+ ca.config_obj.remove_db_integration(name)
+ return {'success': cons[name]}, 200
+
return '', 200
@ns_conf.doc('delete_integration')
|
{"golden_diff": "diff --git a/mindsdb/api/http/namespaces/config.py b/mindsdb/api/http/namespaces/config.py\n--- a/mindsdb/api/http/namespaces/config.py\n+++ b/mindsdb/api/http/namespaces/config.py\n@@ -55,6 +55,11 @@\n params = request.json.get('params')\n if not isinstance(params, dict):\n abort(400, \"type of 'params' must be dict\")\n+\n+ is_test = params.get('test', False)\n+ if is_test:\n+ del params['test']\n+\n integration = get_integration(name)\n if integration is not None:\n abort(400, f\"Integration with name '{name}' already exists\")\n@@ -72,6 +77,12 @@\n except Exception as e:\n print(traceback.format_exc())\n abort(500, f'Error during config update: {str(e)}')\n+\n+ if is_test:\n+ cons = dbw.check_connections()\n+ ca.config_obj.remove_db_integration(name)\n+ return {'success': cons[name]}, 200\n+\n return '', 200\n \n @ns_conf.doc('delete_integration')\n", "issue": "Route to check connection\nAt this moment in GUI possible check connection only for already exists integration. Route need for checking connection before integration created.\n", "before_files": [{"content": "import copy\nimport traceback\n\nfrom flask import request\nfrom flask_restx import Resource, abort\nfrom flask import current_app as ca\n\nfrom mindsdb.api.http.namespaces.configs.config import ns_conf\nfrom mindsdb.interfaces.database.database import DatabaseWrapper\nfrom mindsdb.utilities.functions import get_all_models_meta_data\n\n\ndef get_integration(name):\n integrations = ca.config_obj.get('integrations', {})\n return integrations.get(name)\n\n\n@ns_conf.route('/integrations')\n@ns_conf.param('name', 'List all database integration')\nclass ListIntegration(Resource):\n @ns_conf.doc('get_integrations')\n def get(self):\n return {'integrations': [k for k in ca.config_obj.get('integrations', {})]}\n\n\n@ns_conf.route('/all_integrations')\n@ns_conf.param('name', 'List all database integration')\nclass AllIntegration(Resource):\n @ns_conf.doc('get_all_integrations')\n def get(self):\n integrations = copy.deepcopy(\n ca.config_obj.get('integrations', {})\n )\n for integration in integrations.values():\n if 'password' in integration:\n integration['password'] = None\n return integrations\n\n\n@ns_conf.route('/integrations/<name>')\n@ns_conf.param('name', 'Database integration')\nclass Integration(Resource):\n @ns_conf.doc('get_integration')\n def get(self, name):\n integration = get_integration(name)\n if integration is None:\n abort(404, f'Can\\'t find database integration: {name}')\n integration = copy.deepcopy(integration)\n if 'password' in integration:\n integration['password'] = None\n return integration\n\n @ns_conf.doc('put_integration')\n def put(self, name):\n params = request.json.get('params')\n if not isinstance(params, dict):\n abort(400, \"type of 'params' must be dict\")\n integration = get_integration(name)\n if integration is not None:\n abort(400, f\"Integration with name '{name}' already exists\")\n try:\n if 'enabled' in params:\n params['publish'] = params['enabled']\n del params['enabled']\n ca.config_obj.add_db_integration(name, params)\n\n mdb = ca.mindsdb_native\n cst = ca.custom_models\n model_data_arr = get_all_models_meta_data(mdb, cst)\n dbw = DatabaseWrapper(ca.config_obj)\n dbw.register_predictors(model_data_arr)\n except Exception as e:\n print(traceback.format_exc())\n abort(500, f'Error during config update: {str(e)}')\n return '', 200\n\n @ns_conf.doc('delete_integration')\n def delete(self, name):\n integration = get_integration(name)\n if integration is None:\n abort(400, f\"Nothing to delete. '{name}' not exists.\")\n try:\n ca.config_obj.remove_db_integration(name)\n except Exception as e:\n print(traceback.format_exc())\n abort(500, f'Error during integration delete: {str(e)}')\n return '', 200\n\n @ns_conf.doc('modify_integration')\n def post(self, name):\n params = request.json.get('params')\n if not isinstance(params, dict):\n abort(400, \"type of 'params' must be dict\")\n integration = get_integration(name)\n if integration is None:\n abort(400, f\"Nothin to modify. '{name}' not exists.\")\n try:\n if 'enabled' in params:\n params['publish'] = params['enabled']\n del params['enabled']\n ca.config_obj.modify_db_integration(name, params)\n DatabaseWrapper(ca.config_obj)\n except Exception as e:\n print(traceback.format_exc())\n abort(500, f'Error during integration modifycation: {str(e)}')\n return '', 200\n\n\n@ns_conf.route('/integrations/<name>/check')\n@ns_conf.param('name', 'Database integration checks')\nclass Check(Resource):\n @ns_conf.doc('check')\n def get(self, name):\n if get_integration(name) is None:\n abort(404, f'Can\\'t find database integration: {name}')\n dbw = DatabaseWrapper(ca.config_obj)\n connections = dbw.check_connections()\n return connections.get(name, False), 200\n", "path": "mindsdb/api/http/namespaces/config.py"}], "after_files": [{"content": "import copy\nimport traceback\n\nfrom flask import request\nfrom flask_restx import Resource, abort\nfrom flask import current_app as ca\n\nfrom mindsdb.api.http.namespaces.configs.config import ns_conf\nfrom mindsdb.interfaces.database.database import DatabaseWrapper\nfrom mindsdb.utilities.functions import get_all_models_meta_data\n\n\ndef get_integration(name):\n integrations = ca.config_obj.get('integrations', {})\n return integrations.get(name)\n\n\n@ns_conf.route('/integrations')\n@ns_conf.param('name', 'List all database integration')\nclass ListIntegration(Resource):\n @ns_conf.doc('get_integrations')\n def get(self):\n return {'integrations': [k for k in ca.config_obj.get('integrations', {})]}\n\n\n@ns_conf.route('/all_integrations')\n@ns_conf.param('name', 'List all database integration')\nclass AllIntegration(Resource):\n @ns_conf.doc('get_all_integrations')\n def get(self):\n integrations = copy.deepcopy(\n ca.config_obj.get('integrations', {})\n )\n for integration in integrations.values():\n if 'password' in integration:\n integration['password'] = None\n return integrations\n\n\n@ns_conf.route('/integrations/<name>')\n@ns_conf.param('name', 'Database integration')\nclass Integration(Resource):\n @ns_conf.doc('get_integration')\n def get(self, name):\n integration = get_integration(name)\n if integration is None:\n abort(404, f'Can\\'t find database integration: {name}')\n integration = copy.deepcopy(integration)\n if 'password' in integration:\n integration['password'] = None\n return integration\n\n @ns_conf.doc('put_integration')\n def put(self, name):\n params = request.json.get('params')\n if not isinstance(params, dict):\n abort(400, \"type of 'params' must be dict\")\n\n is_test = params.get('test', False)\n if is_test:\n del params['test']\n\n integration = get_integration(name)\n if integration is not None:\n abort(400, f\"Integration with name '{name}' already exists\")\n try:\n if 'enabled' in params:\n params['publish'] = params['enabled']\n del params['enabled']\n ca.config_obj.add_db_integration(name, params)\n\n mdb = ca.mindsdb_native\n cst = ca.custom_models\n model_data_arr = get_all_models_meta_data(mdb, cst)\n dbw = DatabaseWrapper(ca.config_obj)\n dbw.register_predictors(model_data_arr)\n except Exception as e:\n print(traceback.format_exc())\n abort(500, f'Error during config update: {str(e)}')\n\n if is_test:\n cons = dbw.check_connections()\n ca.config_obj.remove_db_integration(name)\n return {'success': cons[name]}, 200\n\n return '', 200\n\n @ns_conf.doc('delete_integration')\n def delete(self, name):\n integration = get_integration(name)\n if integration is None:\n abort(400, f\"Nothing to delete. '{name}' not exists.\")\n try:\n ca.config_obj.remove_db_integration(name)\n except Exception as e:\n print(traceback.format_exc())\n abort(500, f'Error during integration delete: {str(e)}')\n return '', 200\n\n @ns_conf.doc('modify_integration')\n def post(self, name):\n params = request.json.get('params')\n if not isinstance(params, dict):\n abort(400, \"type of 'params' must be dict\")\n integration = get_integration(name)\n if integration is None:\n abort(400, f\"Nothin to modify. '{name}' not exists.\")\n try:\n if 'enabled' in params:\n params['publish'] = params['enabled']\n del params['enabled']\n ca.config_obj.modify_db_integration(name, params)\n DatabaseWrapper(ca.config_obj)\n except Exception as e:\n print(traceback.format_exc())\n abort(500, f'Error during integration modifycation: {str(e)}')\n return '', 200\n\n\n@ns_conf.route('/integrations/<name>/check')\n@ns_conf.param('name', 'Database integration checks')\nclass Check(Resource):\n @ns_conf.doc('check')\n def get(self, name):\n if get_integration(name) is None:\n abort(404, f'Can\\'t find database integration: {name}')\n dbw = DatabaseWrapper(ca.config_obj)\n connections = dbw.check_connections()\n return connections.get(name, False), 200\n", "path": "mindsdb/api/http/namespaces/config.py"}]}
| 1,485 | 261 |
gh_patches_debug_29959
|
rasdani/github-patches
|
git_diff
|
joke2k__faker-1800
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Latvian ssn generator is not correct
* Faker version: 17.0.0
* OS: Windows
Latvian ssn generator generates ssn in format 'ddd-dd-dddd', which is not correct.
### Steps to reproduce
```
from faker import Faker
fake = Faker('lv_LV')
fake.ssn()
```
### Expected behavior
It should be 12 characters long DDMMYY-CZZZQ e.g. 170334-19862
### Actual behavior
fake.ssn() for latvian locale outputs ssn in format 'ddd-dd-dddd'
'604-87-6475'
'824-21-9489'
'411-57-8156'
'938-77-8408'
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `faker/providers/ssn/lv_LV/__init__.py`
Content:
```
1 from .. import Provider as BaseProvider
2
3
4 class Provider(BaseProvider):
5 """
6 A Faker provider for the Latvian VAT IDs
7 """
8
9 vat_id_formats = ("LV###########",)
10
11 def vat_id(self) -> str:
12 """
13 http://ec.europa.eu/taxation_customs/vies/faq.html#item_11
14 :return: a random Latvian VAT ID
15 """
16
17 return self.bothify(self.random_element(self.vat_id_formats))
18
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/faker/providers/ssn/lv_LV/__init__.py b/faker/providers/ssn/lv_LV/__init__.py
--- a/faker/providers/ssn/lv_LV/__init__.py
+++ b/faker/providers/ssn/lv_LV/__init__.py
@@ -1,7 +1,56 @@
-from .. import Provider as BaseProvider
+import datetime
+from .. import Provider as SsnProvider
+
+
+class Provider(SsnProvider):
+ def ssn(self, min_age: int = 0, max_age: int = 105) -> str:
+ """
+ Returns 11 character Latvian personal identity code (Personas kods).
+ This function assigns random age to person.
+
+ Personal code consists of eleven characters of the form DDMMYYCZZZQ, where
+ DDMMYY is the date of birth, C the century sign, ZZZ the individual
+ number and Q the control character (checksum). The number for the
+ century is either 0 (1800–1899), 1 (1900–1999), or 2 (2000–2099).
+ """
+
+ def _checksum(ssn_without_checksum):
+ weights = [1, 6, 3, 7, 9, 10, 5, 8, 4, 2]
+ weighted_sum = sum(int(digit) * weight for digit, weight in zip(ssn_without_checksum, weights))
+ reminder = (1 - weighted_sum) % 11
+ if reminder == 10:
+ return 0
+ elif reminder < -1:
+ return reminder + 11
+ return reminder
+
+ age = datetime.timedelta(days=self.generator.random.randrange(min_age * 365, max_age * 365))
+ birthday = datetime.date.today() - age
+ ssn_date = "%02d%02d%s" % (
+ birthday.day,
+ birthday.month,
+ str(birthday.year)[-2:],
+ )
+ century = self._get_century_code(birthday.year) # Century
+ suffix = self.generator.random.randrange(111, 999)
+ checksum = _checksum(f"{ssn_date}{century:01d}{suffix:03d}")
+ ssn = f"{ssn_date}-{century:01d}{suffix:03d}{checksum:01d}"
+ return ssn
+
+ @staticmethod
+ def _get_century_code(year: int) -> int:
+ """Returns the century code for a given year"""
+ if 2000 <= year < 3000:
+ code = 2
+ elif 1900 <= year < 2000:
+ code = 1
+ elif 1800 <= year < 1900:
+ code = 0
+ else:
+ raise ValueError("SSN do not support people born before the year 1800 or after the year 2999")
+ return code
-class Provider(BaseProvider):
"""
A Faker provider for the Latvian VAT IDs
"""
|
{"golden_diff": "diff --git a/faker/providers/ssn/lv_LV/__init__.py b/faker/providers/ssn/lv_LV/__init__.py\n--- a/faker/providers/ssn/lv_LV/__init__.py\n+++ b/faker/providers/ssn/lv_LV/__init__.py\n@@ -1,7 +1,56 @@\n-from .. import Provider as BaseProvider\n+import datetime\n \n+from .. import Provider as SsnProvider\n+\n+\n+class Provider(SsnProvider):\n+ def ssn(self, min_age: int = 0, max_age: int = 105) -> str:\n+ \"\"\"\n+ Returns 11 character Latvian personal identity code (Personas kods).\n+ This function assigns random age to person.\n+\n+ Personal code consists of eleven characters of the form DDMMYYCZZZQ, where\n+ DDMMYY is the date of birth, C the century sign, ZZZ the individual\n+ number and Q the control character (checksum). The number for the\n+ century is either 0 (1800\u20131899), 1 (1900\u20131999), or 2 (2000\u20132099).\n+ \"\"\"\n+\n+ def _checksum(ssn_without_checksum):\n+ weights = [1, 6, 3, 7, 9, 10, 5, 8, 4, 2]\n+ weighted_sum = sum(int(digit) * weight for digit, weight in zip(ssn_without_checksum, weights))\n+ reminder = (1 - weighted_sum) % 11\n+ if reminder == 10:\n+ return 0\n+ elif reminder < -1:\n+ return reminder + 11\n+ return reminder\n+\n+ age = datetime.timedelta(days=self.generator.random.randrange(min_age * 365, max_age * 365))\n+ birthday = datetime.date.today() - age\n+ ssn_date = \"%02d%02d%s\" % (\n+ birthday.day,\n+ birthday.month,\n+ str(birthday.year)[-2:],\n+ )\n+ century = self._get_century_code(birthday.year) # Century\n+ suffix = self.generator.random.randrange(111, 999)\n+ checksum = _checksum(f\"{ssn_date}{century:01d}{suffix:03d}\")\n+ ssn = f\"{ssn_date}-{century:01d}{suffix:03d}{checksum:01d}\"\n+ return ssn\n+\n+ @staticmethod\n+ def _get_century_code(year: int) -> int:\n+ \"\"\"Returns the century code for a given year\"\"\"\n+ if 2000 <= year < 3000:\n+ code = 2\n+ elif 1900 <= year < 2000:\n+ code = 1\n+ elif 1800 <= year < 1900:\n+ code = 0\n+ else:\n+ raise ValueError(\"SSN do not support people born before the year 1800 or after the year 2999\")\n+ return code\n \n-class Provider(BaseProvider):\n \"\"\"\n A Faker provider for the Latvian VAT IDs\n \"\"\"\n", "issue": "Latvian ssn generator is not correct\n* Faker version: 17.0.0\r\n* OS: Windows\r\n\r\n\r\nLatvian ssn generator generates ssn in format 'ddd-dd-dddd', which is not correct. \r\n\r\n### Steps to reproduce\r\n\r\n```\r\nfrom faker import Faker\r\nfake = Faker('lv_LV')\r\nfake.ssn()\r\n\r\n```\r\n\r\n### Expected behavior\r\n\r\nIt should be 12 characters long DDMMYY-CZZZQ e.g. 170334-19862\r\n\r\n### Actual behavior\r\n\r\nfake.ssn() for latvian locale outputs ssn in format 'ddd-dd-dddd'\r\n'604-87-6475'\r\n'824-21-9489'\r\n'411-57-8156'\r\n'938-77-8408'\r\n\n", "before_files": [{"content": "from .. import Provider as BaseProvider\n\n\nclass Provider(BaseProvider):\n \"\"\"\n A Faker provider for the Latvian VAT IDs\n \"\"\"\n\n vat_id_formats = (\"LV###########\",)\n\n def vat_id(self) -> str:\n \"\"\"\n http://ec.europa.eu/taxation_customs/vies/faq.html#item_11\n :return: a random Latvian VAT ID\n \"\"\"\n\n return self.bothify(self.random_element(self.vat_id_formats))\n", "path": "faker/providers/ssn/lv_LV/__init__.py"}], "after_files": [{"content": "import datetime\n\nfrom .. import Provider as SsnProvider\n\n\nclass Provider(SsnProvider):\n def ssn(self, min_age: int = 0, max_age: int = 105) -> str:\n \"\"\"\n Returns 11 character Latvian personal identity code (Personas kods).\n This function assigns random age to person.\n\n Personal code consists of eleven characters of the form DDMMYYCZZZQ, where\n DDMMYY is the date of birth, C the century sign, ZZZ the individual\n number and Q the control character (checksum). The number for the\n century is either 0 (1800\u20131899), 1 (1900\u20131999), or 2 (2000\u20132099).\n \"\"\"\n\n def _checksum(ssn_without_checksum):\n weights = [1, 6, 3, 7, 9, 10, 5, 8, 4, 2]\n weighted_sum = sum(int(digit) * weight for digit, weight in zip(ssn_without_checksum, weights))\n reminder = (1 - weighted_sum) % 11\n if reminder == 10:\n return 0\n elif reminder < -1:\n return reminder + 11\n return reminder\n\n age = datetime.timedelta(days=self.generator.random.randrange(min_age * 365, max_age * 365))\n birthday = datetime.date.today() - age\n ssn_date = \"%02d%02d%s\" % (\n birthday.day,\n birthday.month,\n str(birthday.year)[-2:],\n )\n century = self._get_century_code(birthday.year) # Century\n suffix = self.generator.random.randrange(111, 999)\n checksum = _checksum(f\"{ssn_date}{century:01d}{suffix:03d}\")\n ssn = f\"{ssn_date}-{century:01d}{suffix:03d}{checksum:01d}\"\n return ssn\n\n @staticmethod\n def _get_century_code(year: int) -> int:\n \"\"\"Returns the century code for a given year\"\"\"\n if 2000 <= year < 3000:\n code = 2\n elif 1900 <= year < 2000:\n code = 1\n elif 1800 <= year < 1900:\n code = 0\n else:\n raise ValueError(\"SSN do not support people born before the year 1800 or after the year 2999\")\n return code\n\n \"\"\"\n A Faker provider for the Latvian VAT IDs\n \"\"\"\n\n vat_id_formats = (\"LV###########\",)\n\n def vat_id(self) -> str:\n \"\"\"\n http://ec.europa.eu/taxation_customs/vies/faq.html#item_11\n :return: a random Latvian VAT ID\n \"\"\"\n\n return self.bothify(self.random_element(self.vat_id_formats))\n", "path": "faker/providers/ssn/lv_LV/__init__.py"}]}
| 592 | 738 |
gh_patches_debug_28076
|
rasdani/github-patches
|
git_diff
|
PyGithub__PyGithub-1151
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Allow setting `name` for upload_asset
Currently, PyGithub uses `path.basename` for the `name` of the file assets -- could it be added that I can choose my own name? I can of course rename the file before upload, but it would be easier if I could just specify the name.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `github/GitRelease.py`
Content:
```
1 # -*- coding: utf-8 -*-
2
3 ############################ Copyrights and license ############################
4 # #
5 # Copyright 2015 Ed Holland <eholland@alertlogic.com> #
6 # Copyright 2016 Benjamin Whitney <benjamin.whitney@ironnetcybersecurity.com> #
7 # Copyright 2016 Jannis Gebauer <ja.geb@me.com> #
8 # Copyright 2016 Peter Buckley <dx-pbuckley@users.noreply.github.com> #
9 # Copyright 2017 Chris McBride <thehighlander@users.noreply.github.com> #
10 # Copyright 2017 Simon <spam@esemi.ru> #
11 # Copyright 2018 Daniel Kesler <kesler.daniel@gmail.com> #
12 # Copyright 2018 Kuba <jakub.glapa@adspired.com> #
13 # Copyright 2018 Maarten Fonville <mfonville@users.noreply.github.com> #
14 # Copyright 2018 Shinichi TAMURA <shnch.tmr@gmail.com> #
15 # Copyright 2018 Wan Liuyang <tsfdye@gmail.com> #
16 # Copyright 2018 edquist <edquist@users.noreply.github.com> #
17 # Copyright 2018 nurupo <nurupo.contributions@gmail.com> #
18 # Copyright 2018 sfdye <tsfdye@gmail.com> #
19 # #
20 # This file is part of PyGithub. #
21 # http://pygithub.readthedocs.io/ #
22 # #
23 # PyGithub is free software: you can redistribute it and/or modify it under #
24 # the terms of the GNU Lesser General Public License as published by the Free #
25 # Software Foundation, either version 3 of the License, or (at your option) #
26 # any later version. #
27 # #
28 # PyGithub is distributed in the hope that it will be useful, but WITHOUT ANY #
29 # WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS #
30 # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more #
31 # details. #
32 # #
33 # You should have received a copy of the GNU Lesser General Public License #
34 # along with PyGithub. If not, see <http://www.gnu.org/licenses/>. #
35 # #
36 ################################################################################
37
38 from os.path import basename
39 import github.GithubObject
40 import github.NamedUser
41 import github.GitReleaseAsset
42
43
44 class GitRelease(github.GithubObject.CompletableGithubObject):
45 """
46 This class represents GitReleases. The reference can be found here https://developer.github.com/v3/repos/releases
47 """
48
49 def __repr__(self):
50 return self.get__repr__({"title": self._title.value})
51
52 @property
53 def id(self):
54 """
55 :type: integer
56 """
57 self._completeIfNotSet(self._id)
58 return self._id.value
59
60 @property
61 def body(self):
62 """
63 :type: string
64 """
65 self._completeIfNotSet(self._body)
66 return self._body.value
67
68 @property
69 def title(self):
70 """
71 :type: string
72 """
73 self._completeIfNotSet(self._title)
74 return self._title.value
75
76 @property
77 def tag_name(self):
78 """
79 :type: string
80 """
81 self._completeIfNotSet(self._tag_name)
82 return self._tag_name.value
83
84 @property
85 def target_commitish(self):
86 """
87 :type: string
88 """
89 self._completeIfNotSet(self._target_commitish)
90 return self._target_commitish.value
91
92 @property
93 def draft(self):
94 """
95 :type: bool
96 """
97 self._completeIfNotSet(self._draft)
98 return self._draft.value
99
100 @property
101 def prerelease(self):
102 """
103 :type: bool
104 """
105 self._completeIfNotSet(self._prerelease)
106 return self._prerelease.value
107
108 @property
109 def author(self):
110 """
111 :type: :class:`github.NamedUser.NamedUser`
112 """
113 self._completeIfNotSet(self._author)
114 return self._author.value
115
116 @property
117 def created_at(self):
118 """
119 :type: datetime.datetime
120 """
121 self._completeIfNotSet(self._created_at)
122 return self._created_at.value
123
124 @property
125 def published_at(self):
126 """
127 :type: datetime.datetime
128 """
129 self._completeIfNotSet(self._published_at)
130 return self._published_at.value
131
132 @property
133 def url(self):
134 """
135 :type: string
136 """
137 self._completeIfNotSet(self._url)
138 return self._url.value
139
140 @property
141 def upload_url(self):
142 """
143 :type: string
144 """
145 self._completeIfNotSet(self._upload_url)
146 return self._upload_url.value
147
148 @property
149 def html_url(self):
150 """
151 :type: string
152 """
153 self._completeIfNotSet(self._html_url)
154 return self._html_url.value
155
156 @property
157 def tarball_url(self):
158 """
159 :type: string
160 """
161 self._completeIfNotSet(self._tarball_url)
162 return self._tarball_url.value
163
164 @property
165 def zipball_url(self):
166 """
167 :type: string
168 """
169 self._completeIfNotSet(self._zipball_url)
170 return self._zipball_url.value
171
172 def delete_release(self):
173 """
174 :calls: `DELETE /repos/:owner/:repo/releases/:release_id <https://developer.github.com/v3/repos/releases/#delete-a-release>`_
175 :rtype: None
176 """
177 headers, data = self._requester.requestJsonAndCheck(
178 "DELETE",
179 self.url
180 )
181
182 def update_release(self, name, message, draft=False, prerelease=False,
183 tag_name=github.GithubObject.NotSet,
184 target_commitish=github.GithubObject.NotSet):
185 """
186 :calls: `PATCH /repos/:owner/:repo/releases/:release_id <https://developer.github.com/v3/repos/releases/#edit-a-release>`_
187 :rtype: :class:`github.GitRelease.GitRelease`
188 """
189 assert tag_name is github.GithubObject.NotSet \
190 or isinstance(tag_name, (str, unicode)), \
191 'tag_name must be a str/unicode object'
192 assert target_commitish is github.GithubObject.NotSet \
193 or isinstance(target_commitish, (str, unicode)), \
194 'target_commitish must be a str/unicode object'
195 assert isinstance(name, (str, unicode)), name
196 assert isinstance(message, (str, unicode)), message
197 assert isinstance(draft, bool), draft
198 assert isinstance(prerelease, bool), prerelease
199 if tag_name is github.GithubObject.NotSet:
200 tag_name = self.tag_name
201 post_parameters = {
202 "tag_name": tag_name,
203 "name": name,
204 "body": message,
205 "draft": draft,
206 "prerelease": prerelease,
207 }
208 # Do not set target_commitish to self.target_commitish when ommited, just don't send it
209 # alltogether in that case, in order to match the Github API behaviour. Only send it when set.
210 if target_commitish is not github.GithubObject.NotSet:
211 post_parameters['target_commitish'] = target_commitish
212 headers, data = self._requester.requestJsonAndCheck(
213 "PATCH",
214 self.url,
215 input=post_parameters
216 )
217 return github.GitRelease.GitRelease(self._requester, headers, data, completed=True)
218
219 def upload_asset(self, path, label="", content_type=""):
220 """
221 :calls: `POST https://<upload_url>/repos/:owner/:repo/releases/:release_id/assets?name=foo.zip <https://developer.github.com/v3/repos/releases/#upload-a-release-asset>`_
222 :rtype: :class:`github.GitReleaseAsset.GitReleaseAsset`
223 """
224 assert isinstance(path, (str, unicode)), path
225 assert isinstance(label, (str, unicode)), label
226
227 post_parameters = {
228 "name": basename(path),
229 "label": label
230 }
231 headers = {}
232 if len(content_type) > 0:
233 headers["Content-Type"] = content_type
234 resp_headers, data = self._requester.requestBlobAndCheck(
235 "POST",
236 self.upload_url.split("{?")[0],
237 parameters=post_parameters,
238 headers=headers,
239 input=path
240 )
241 return github.GitReleaseAsset.GitReleaseAsset(self._requester, resp_headers, data, completed=True)
242
243 def get_assets(self):
244 """
245 :calls: `GET /repos/:owner/:repo/releases/:release_id/assets <https://developer.github.com/v3/repos/releases/#list-assets-for-a-release>`_
246 :rtype: :class:`github.PaginatedList.PaginatedList`
247 """
248 return github.PaginatedList.PaginatedList(
249 github.GitReleaseAsset.GitReleaseAsset,
250 self._requester,
251 self.url + "/assets",
252 None
253 )
254
255 def _initAttributes(self):
256 self._id = github.GithubObject.NotSet
257 self._body = github.GithubObject.NotSet
258 self._title = github.GithubObject.NotSet
259 self._tag_name = github.GithubObject.NotSet
260 self._target_commitish = github.GithubObject.NotSet
261 self._draft = github.GithubObject.NotSet
262 self._prerelease = github.GithubObject.NotSet
263 self._author = github.GithubObject.NotSet
264 self._url = github.GithubObject.NotSet
265 self._upload_url = github.GithubObject.NotSet
266 self._html_url = github.GithubObject.NotSet
267 self._created_at = github.GithubObject.NotSet
268 self._published_at = github.GithubObject.NotSet
269 self._tarball_url = github.GithubObject.NotSet
270 self._zipball_url = github.GithubObject.NotSet
271
272 def _useAttributes(self, attributes):
273 if "id" in attributes:
274 self._id = self._makeIntAttribute(attributes["id"])
275 if "body" in attributes:
276 self._body = self._makeStringAttribute(attributes["body"])
277 if "name" in attributes:
278 self._title = self._makeStringAttribute(attributes["name"])
279 if "tag_name" in attributes:
280 self._tag_name = self._makeStringAttribute(attributes["tag_name"])
281 if "target_commitish" in attributes:
282 self._target_commitish = self._makeStringAttribute(attributes["target_commitish"])
283 if "draft" in attributes:
284 self._draft = self._makeBoolAttribute(attributes["draft"])
285 if "prerelease" in attributes:
286 self._prerelease = self._makeBoolAttribute(attributes["prerelease"])
287 if "author" in attributes:
288 self._author = self._makeClassAttribute(github.NamedUser.NamedUser, attributes["author"])
289 if "url" in attributes:
290 self._url = self._makeStringAttribute(attributes["url"])
291 if "upload_url" in attributes:
292 self._upload_url = self._makeStringAttribute(attributes["upload_url"])
293 if "html_url" in attributes:
294 self._html_url = self._makeStringAttribute(attributes["html_url"])
295 if "created_at" in attributes:
296 self._created_at = self._makeDatetimeAttribute(attributes["created_at"])
297 if "published_at" in attributes:
298 self._published_at = self._makeDatetimeAttribute(attributes["published_at"])
299 if "tarball_url" in attributes:
300 self._tarball_url = self._makeStringAttribute(attributes["tarball_url"])
301 if "zipball_url" in attributes:
302 self._zipball_url = self._makeStringAttribute(attributes["zipball_url"])
303
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/github/GitRelease.py b/github/GitRelease.py
--- a/github/GitRelease.py
+++ b/github/GitRelease.py
@@ -216,20 +216,24 @@
)
return github.GitRelease.GitRelease(self._requester, headers, data, completed=True)
- def upload_asset(self, path, label="", content_type=""):
+ def upload_asset(self, path, label="", content_type=github.GithubObject.NotSet, name=github.GithubObject.NotSet):
"""
- :calls: `POST https://<upload_url>/repos/:owner/:repo/releases/:release_id/assets?name=foo.zip <https://developer.github.com/v3/repos/releases/#upload-a-release-asset>`_
+ :calls: `POST https://<upload_url>/repos/:owner/:repo/releases/:release_id/assets <https://developer.github.com/v3/repos/releases/#upload-a-release-asset>`_
:rtype: :class:`github.GitReleaseAsset.GitReleaseAsset`
"""
assert isinstance(path, (str, unicode)), path
assert isinstance(label, (str, unicode)), label
+ assert name is github.GithubObject.NotSet or isinstance(name, (str, unicode)), name
post_parameters = {
- "name": basename(path),
"label": label
}
+ if name is github.GithubObject.NotSet:
+ post_parameters["name"] = basename(path)
+ else:
+ post_parameters["name"] = name
headers = {}
- if len(content_type) > 0:
+ if content_type is not github.GithubObject.NotSet:
headers["Content-Type"] = content_type
resp_headers, data = self._requester.requestBlobAndCheck(
"POST",
|
{"golden_diff": "diff --git a/github/GitRelease.py b/github/GitRelease.py\n--- a/github/GitRelease.py\n+++ b/github/GitRelease.py\n@@ -216,20 +216,24 @@\n )\n return github.GitRelease.GitRelease(self._requester, headers, data, completed=True)\n \n- def upload_asset(self, path, label=\"\", content_type=\"\"):\n+ def upload_asset(self, path, label=\"\", content_type=github.GithubObject.NotSet, name=github.GithubObject.NotSet):\n \"\"\"\n- :calls: `POST https://<upload_url>/repos/:owner/:repo/releases/:release_id/assets?name=foo.zip <https://developer.github.com/v3/repos/releases/#upload-a-release-asset>`_\n+ :calls: `POST https://<upload_url>/repos/:owner/:repo/releases/:release_id/assets <https://developer.github.com/v3/repos/releases/#upload-a-release-asset>`_\n :rtype: :class:`github.GitReleaseAsset.GitReleaseAsset`\n \"\"\"\n assert isinstance(path, (str, unicode)), path\n assert isinstance(label, (str, unicode)), label\n+ assert name is github.GithubObject.NotSet or isinstance(name, (str, unicode)), name\n \n post_parameters = {\n- \"name\": basename(path),\n \"label\": label\n }\n+ if name is github.GithubObject.NotSet:\n+ post_parameters[\"name\"] = basename(path)\n+ else:\n+ post_parameters[\"name\"] = name\n headers = {}\n- if len(content_type) > 0:\n+ if content_type is not github.GithubObject.NotSet:\n headers[\"Content-Type\"] = content_type\n resp_headers, data = self._requester.requestBlobAndCheck(\n \"POST\",\n", "issue": "Allow setting `name` for upload_asset\nCurrently, PyGithub uses `path.basename` for the `name` of the file assets -- could it be added that I can choose my own name? I can of course rename the file before upload, but it would be easier if I could just specify the name.\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n\n############################ Copyrights and license ############################\n# #\n# Copyright 2015 Ed Holland <eholland@alertlogic.com> #\n# Copyright 2016 Benjamin Whitney <benjamin.whitney@ironnetcybersecurity.com> #\n# Copyright 2016 Jannis Gebauer <ja.geb@me.com> #\n# Copyright 2016 Peter Buckley <dx-pbuckley@users.noreply.github.com> #\n# Copyright 2017 Chris McBride <thehighlander@users.noreply.github.com> #\n# Copyright 2017 Simon <spam@esemi.ru> #\n# Copyright 2018 Daniel Kesler <kesler.daniel@gmail.com> #\n# Copyright 2018 Kuba <jakub.glapa@adspired.com> #\n# Copyright 2018 Maarten Fonville <mfonville@users.noreply.github.com> #\n# Copyright 2018 Shinichi TAMURA <shnch.tmr@gmail.com> #\n# Copyright 2018 Wan Liuyang <tsfdye@gmail.com> #\n# Copyright 2018 edquist <edquist@users.noreply.github.com> #\n# Copyright 2018 nurupo <nurupo.contributions@gmail.com> #\n# Copyright 2018 sfdye <tsfdye@gmail.com> #\n# #\n# This file is part of PyGithub. #\n# http://pygithub.readthedocs.io/ #\n# #\n# PyGithub is free software: you can redistribute it and/or modify it under #\n# the terms of the GNU Lesser General Public License as published by the Free #\n# Software Foundation, either version 3 of the License, or (at your option) #\n# any later version. #\n# #\n# PyGithub is distributed in the hope that it will be useful, but WITHOUT ANY #\n# WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS #\n# FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more #\n# details. #\n# #\n# You should have received a copy of the GNU Lesser General Public License #\n# along with PyGithub. If not, see <http://www.gnu.org/licenses/>. #\n# #\n################################################################################\n\nfrom os.path import basename\nimport github.GithubObject\nimport github.NamedUser\nimport github.GitReleaseAsset\n\n\nclass GitRelease(github.GithubObject.CompletableGithubObject):\n \"\"\"\n This class represents GitReleases. The reference can be found here https://developer.github.com/v3/repos/releases\n \"\"\"\n\n def __repr__(self):\n return self.get__repr__({\"title\": self._title.value})\n\n @property\n def id(self):\n \"\"\"\n :type: integer\n \"\"\"\n self._completeIfNotSet(self._id)\n return self._id.value\n\n @property\n def body(self):\n \"\"\"\n :type: string\n \"\"\"\n self._completeIfNotSet(self._body)\n return self._body.value\n\n @property\n def title(self):\n \"\"\"\n :type: string\n \"\"\"\n self._completeIfNotSet(self._title)\n return self._title.value\n\n @property\n def tag_name(self):\n \"\"\"\n :type: string\n \"\"\"\n self._completeIfNotSet(self._tag_name)\n return self._tag_name.value\n\n @property\n def target_commitish(self):\n \"\"\"\n :type: string\n \"\"\"\n self._completeIfNotSet(self._target_commitish)\n return self._target_commitish.value\n\n @property\n def draft(self):\n \"\"\"\n :type: bool\n \"\"\"\n self._completeIfNotSet(self._draft)\n return self._draft.value\n\n @property\n def prerelease(self):\n \"\"\"\n :type: bool\n \"\"\"\n self._completeIfNotSet(self._prerelease)\n return self._prerelease.value\n\n @property\n def author(self):\n \"\"\"\n :type: :class:`github.NamedUser.NamedUser`\n \"\"\"\n self._completeIfNotSet(self._author)\n return self._author.value\n\n @property\n def created_at(self):\n \"\"\"\n :type: datetime.datetime\n \"\"\"\n self._completeIfNotSet(self._created_at)\n return self._created_at.value\n\n @property\n def published_at(self):\n \"\"\"\n :type: datetime.datetime\n \"\"\"\n self._completeIfNotSet(self._published_at)\n return self._published_at.value\n\n @property\n def url(self):\n \"\"\"\n :type: string\n \"\"\"\n self._completeIfNotSet(self._url)\n return self._url.value\n\n @property\n def upload_url(self):\n \"\"\"\n :type: string\n \"\"\"\n self._completeIfNotSet(self._upload_url)\n return self._upload_url.value\n\n @property\n def html_url(self):\n \"\"\"\n :type: string\n \"\"\"\n self._completeIfNotSet(self._html_url)\n return self._html_url.value\n\n @property\n def tarball_url(self):\n \"\"\"\n :type: string\n \"\"\"\n self._completeIfNotSet(self._tarball_url)\n return self._tarball_url.value\n\n @property\n def zipball_url(self):\n \"\"\"\n :type: string\n \"\"\"\n self._completeIfNotSet(self._zipball_url)\n return self._zipball_url.value\n\n def delete_release(self):\n \"\"\"\n :calls: `DELETE /repos/:owner/:repo/releases/:release_id <https://developer.github.com/v3/repos/releases/#delete-a-release>`_\n :rtype: None\n \"\"\"\n headers, data = self._requester.requestJsonAndCheck(\n \"DELETE\",\n self.url\n )\n\n def update_release(self, name, message, draft=False, prerelease=False,\n tag_name=github.GithubObject.NotSet,\n target_commitish=github.GithubObject.NotSet):\n \"\"\"\n :calls: `PATCH /repos/:owner/:repo/releases/:release_id <https://developer.github.com/v3/repos/releases/#edit-a-release>`_\n :rtype: :class:`github.GitRelease.GitRelease`\n \"\"\"\n assert tag_name is github.GithubObject.NotSet \\\n or isinstance(tag_name, (str, unicode)), \\\n 'tag_name must be a str/unicode object'\n assert target_commitish is github.GithubObject.NotSet \\\n or isinstance(target_commitish, (str, unicode)), \\\n 'target_commitish must be a str/unicode object'\n assert isinstance(name, (str, unicode)), name\n assert isinstance(message, (str, unicode)), message\n assert isinstance(draft, bool), draft\n assert isinstance(prerelease, bool), prerelease\n if tag_name is github.GithubObject.NotSet:\n tag_name = self.tag_name\n post_parameters = {\n \"tag_name\": tag_name,\n \"name\": name,\n \"body\": message,\n \"draft\": draft,\n \"prerelease\": prerelease,\n }\n # Do not set target_commitish to self.target_commitish when ommited, just don't send it\n # alltogether in that case, in order to match the Github API behaviour. Only send it when set.\n if target_commitish is not github.GithubObject.NotSet:\n post_parameters['target_commitish'] = target_commitish\n headers, data = self._requester.requestJsonAndCheck(\n \"PATCH\",\n self.url,\n input=post_parameters\n )\n return github.GitRelease.GitRelease(self._requester, headers, data, completed=True)\n\n def upload_asset(self, path, label=\"\", content_type=\"\"):\n \"\"\"\n :calls: `POST https://<upload_url>/repos/:owner/:repo/releases/:release_id/assets?name=foo.zip <https://developer.github.com/v3/repos/releases/#upload-a-release-asset>`_\n :rtype: :class:`github.GitReleaseAsset.GitReleaseAsset`\n \"\"\"\n assert isinstance(path, (str, unicode)), path\n assert isinstance(label, (str, unicode)), label\n\n post_parameters = {\n \"name\": basename(path),\n \"label\": label\n }\n headers = {}\n if len(content_type) > 0:\n headers[\"Content-Type\"] = content_type\n resp_headers, data = self._requester.requestBlobAndCheck(\n \"POST\",\n self.upload_url.split(\"{?\")[0],\n parameters=post_parameters,\n headers=headers,\n input=path\n )\n return github.GitReleaseAsset.GitReleaseAsset(self._requester, resp_headers, data, completed=True)\n\n def get_assets(self):\n \"\"\"\n :calls: `GET /repos/:owner/:repo/releases/:release_id/assets <https://developer.github.com/v3/repos/releases/#list-assets-for-a-release>`_\n :rtype: :class:`github.PaginatedList.PaginatedList`\n \"\"\"\n return github.PaginatedList.PaginatedList(\n github.GitReleaseAsset.GitReleaseAsset,\n self._requester,\n self.url + \"/assets\",\n None\n )\n\n def _initAttributes(self):\n self._id = github.GithubObject.NotSet\n self._body = github.GithubObject.NotSet\n self._title = github.GithubObject.NotSet\n self._tag_name = github.GithubObject.NotSet\n self._target_commitish = github.GithubObject.NotSet\n self._draft = github.GithubObject.NotSet\n self._prerelease = github.GithubObject.NotSet\n self._author = github.GithubObject.NotSet\n self._url = github.GithubObject.NotSet\n self._upload_url = github.GithubObject.NotSet\n self._html_url = github.GithubObject.NotSet\n self._created_at = github.GithubObject.NotSet\n self._published_at = github.GithubObject.NotSet\n self._tarball_url = github.GithubObject.NotSet\n self._zipball_url = github.GithubObject.NotSet\n\n def _useAttributes(self, attributes):\n if \"id\" in attributes:\n self._id = self._makeIntAttribute(attributes[\"id\"])\n if \"body\" in attributes:\n self._body = self._makeStringAttribute(attributes[\"body\"])\n if \"name\" in attributes:\n self._title = self._makeStringAttribute(attributes[\"name\"])\n if \"tag_name\" in attributes:\n self._tag_name = self._makeStringAttribute(attributes[\"tag_name\"])\n if \"target_commitish\" in attributes:\n self._target_commitish = self._makeStringAttribute(attributes[\"target_commitish\"])\n if \"draft\" in attributes:\n self._draft = self._makeBoolAttribute(attributes[\"draft\"])\n if \"prerelease\" in attributes:\n self._prerelease = self._makeBoolAttribute(attributes[\"prerelease\"])\n if \"author\" in attributes:\n self._author = self._makeClassAttribute(github.NamedUser.NamedUser, attributes[\"author\"])\n if \"url\" in attributes:\n self._url = self._makeStringAttribute(attributes[\"url\"])\n if \"upload_url\" in attributes:\n self._upload_url = self._makeStringAttribute(attributes[\"upload_url\"])\n if \"html_url\" in attributes:\n self._html_url = self._makeStringAttribute(attributes[\"html_url\"])\n if \"created_at\" in attributes:\n self._created_at = self._makeDatetimeAttribute(attributes[\"created_at\"])\n if \"published_at\" in attributes:\n self._published_at = self._makeDatetimeAttribute(attributes[\"published_at\"])\n if \"tarball_url\" in attributes:\n self._tarball_url = self._makeStringAttribute(attributes[\"tarball_url\"])\n if \"zipball_url\" in attributes:\n self._zipball_url = self._makeStringAttribute(attributes[\"zipball_url\"])\n", "path": "github/GitRelease.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n\n############################ Copyrights and license ############################\n# #\n# Copyright 2015 Ed Holland <eholland@alertlogic.com> #\n# Copyright 2016 Benjamin Whitney <benjamin.whitney@ironnetcybersecurity.com> #\n# Copyright 2016 Jannis Gebauer <ja.geb@me.com> #\n# Copyright 2016 Peter Buckley <dx-pbuckley@users.noreply.github.com> #\n# Copyright 2017 Chris McBride <thehighlander@users.noreply.github.com> #\n# Copyright 2017 Simon <spam@esemi.ru> #\n# Copyright 2018 Daniel Kesler <kesler.daniel@gmail.com> #\n# Copyright 2018 Kuba <jakub.glapa@adspired.com> #\n# Copyright 2018 Maarten Fonville <mfonville@users.noreply.github.com> #\n# Copyright 2018 Shinichi TAMURA <shnch.tmr@gmail.com> #\n# Copyright 2018 Wan Liuyang <tsfdye@gmail.com> #\n# Copyright 2018 edquist <edquist@users.noreply.github.com> #\n# Copyright 2018 nurupo <nurupo.contributions@gmail.com> #\n# Copyright 2018 sfdye <tsfdye@gmail.com> #\n# #\n# This file is part of PyGithub. #\n# http://pygithub.readthedocs.io/ #\n# #\n# PyGithub is free software: you can redistribute it and/or modify it under #\n# the terms of the GNU Lesser General Public License as published by the Free #\n# Software Foundation, either version 3 of the License, or (at your option) #\n# any later version. #\n# #\n# PyGithub is distributed in the hope that it will be useful, but WITHOUT ANY #\n# WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS #\n# FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more #\n# details. #\n# #\n# You should have received a copy of the GNU Lesser General Public License #\n# along with PyGithub. If not, see <http://www.gnu.org/licenses/>. #\n# #\n################################################################################\n\nfrom os.path import basename\nimport github.GithubObject\nimport github.NamedUser\nimport github.GitReleaseAsset\n\n\nclass GitRelease(github.GithubObject.CompletableGithubObject):\n \"\"\"\n This class represents GitReleases. The reference can be found here https://developer.github.com/v3/repos/releases\n \"\"\"\n\n def __repr__(self):\n return self.get__repr__({\"title\": self._title.value})\n\n @property\n def id(self):\n \"\"\"\n :type: integer\n \"\"\"\n self._completeIfNotSet(self._id)\n return self._id.value\n\n @property\n def body(self):\n \"\"\"\n :type: string\n \"\"\"\n self._completeIfNotSet(self._body)\n return self._body.value\n\n @property\n def title(self):\n \"\"\"\n :type: string\n \"\"\"\n self._completeIfNotSet(self._title)\n return self._title.value\n\n @property\n def tag_name(self):\n \"\"\"\n :type: string\n \"\"\"\n self._completeIfNotSet(self._tag_name)\n return self._tag_name.value\n\n @property\n def target_commitish(self):\n \"\"\"\n :type: string\n \"\"\"\n self._completeIfNotSet(self._target_commitish)\n return self._target_commitish.value\n\n @property\n def draft(self):\n \"\"\"\n :type: bool\n \"\"\"\n self._completeIfNotSet(self._draft)\n return self._draft.value\n\n @property\n def prerelease(self):\n \"\"\"\n :type: bool\n \"\"\"\n self._completeIfNotSet(self._prerelease)\n return self._prerelease.value\n\n @property\n def author(self):\n \"\"\"\n :type: :class:`github.NamedUser.NamedUser`\n \"\"\"\n self._completeIfNotSet(self._author)\n return self._author.value\n\n @property\n def created_at(self):\n \"\"\"\n :type: datetime.datetime\n \"\"\"\n self._completeIfNotSet(self._created_at)\n return self._created_at.value\n\n @property\n def published_at(self):\n \"\"\"\n :type: datetime.datetime\n \"\"\"\n self._completeIfNotSet(self._published_at)\n return self._published_at.value\n\n @property\n def url(self):\n \"\"\"\n :type: string\n \"\"\"\n self._completeIfNotSet(self._url)\n return self._url.value\n\n @property\n def upload_url(self):\n \"\"\"\n :type: string\n \"\"\"\n self._completeIfNotSet(self._upload_url)\n return self._upload_url.value\n\n @property\n def html_url(self):\n \"\"\"\n :type: string\n \"\"\"\n self._completeIfNotSet(self._html_url)\n return self._html_url.value\n\n @property\n def tarball_url(self):\n \"\"\"\n :type: string\n \"\"\"\n self._completeIfNotSet(self._tarball_url)\n return self._tarball_url.value\n\n @property\n def zipball_url(self):\n \"\"\"\n :type: string\n \"\"\"\n self._completeIfNotSet(self._zipball_url)\n return self._zipball_url.value\n\n def delete_release(self):\n \"\"\"\n :calls: `DELETE /repos/:owner/:repo/releases/:release_id <https://developer.github.com/v3/repos/releases/#delete-a-release>`_\n :rtype: None\n \"\"\"\n headers, data = self._requester.requestJsonAndCheck(\n \"DELETE\",\n self.url\n )\n\n def update_release(self, name, message, draft=False, prerelease=False,\n tag_name=github.GithubObject.NotSet,\n target_commitish=github.GithubObject.NotSet):\n \"\"\"\n :calls: `PATCH /repos/:owner/:repo/releases/:release_id <https://developer.github.com/v3/repos/releases/#edit-a-release>`_\n :rtype: :class:`github.GitRelease.GitRelease`\n \"\"\"\n assert tag_name is github.GithubObject.NotSet \\\n or isinstance(tag_name, (str, unicode)), \\\n 'tag_name must be a str/unicode object'\n assert target_commitish is github.GithubObject.NotSet \\\n or isinstance(target_commitish, (str, unicode)), \\\n 'target_commitish must be a str/unicode object'\n assert isinstance(name, (str, unicode)), name\n assert isinstance(message, (str, unicode)), message\n assert isinstance(draft, bool), draft\n assert isinstance(prerelease, bool), prerelease\n if tag_name is github.GithubObject.NotSet:\n tag_name = self.tag_name\n post_parameters = {\n \"tag_name\": tag_name,\n \"name\": name,\n \"body\": message,\n \"draft\": draft,\n \"prerelease\": prerelease,\n }\n # Do not set target_commitish to self.target_commitish when ommited, just don't send it\n # alltogether in that case, in order to match the Github API behaviour. Only send it when set.\n if target_commitish is not github.GithubObject.NotSet:\n post_parameters['target_commitish'] = target_commitish\n headers, data = self._requester.requestJsonAndCheck(\n \"PATCH\",\n self.url,\n input=post_parameters\n )\n return github.GitRelease.GitRelease(self._requester, headers, data, completed=True)\n\n def upload_asset(self, path, label=\"\", content_type=github.GithubObject.NotSet, name=github.GithubObject.NotSet):\n \"\"\"\n :calls: `POST https://<upload_url>/repos/:owner/:repo/releases/:release_id/assets <https://developer.github.com/v3/repos/releases/#upload-a-release-asset>`_\n :rtype: :class:`github.GitReleaseAsset.GitReleaseAsset`\n \"\"\"\n assert isinstance(path, (str, unicode)), path\n assert isinstance(label, (str, unicode)), label\n assert name is github.GithubObject.NotSet or isinstance(name, (str, unicode)), name\n\n post_parameters = {\n \"label\": label\n }\n if name is github.GithubObject.NotSet:\n post_parameters[\"name\"] = basename(path)\n else:\n post_parameters[\"name\"] = name\n headers = {}\n if content_type is not github.GithubObject.NotSet:\n headers[\"Content-Type\"] = content_type\n resp_headers, data = self._requester.requestBlobAndCheck(\n \"POST\",\n self.upload_url.split(\"{?\")[0],\n parameters=post_parameters,\n headers=headers,\n input=path\n )\n return github.GitReleaseAsset.GitReleaseAsset(self._requester, resp_headers, data, completed=True)\n\n def get_assets(self):\n \"\"\"\n :calls: `GET /repos/:owner/:repo/releases/:release_id/assets <https://developer.github.com/v3/repos/releases/#list-assets-for-a-release>`_\n :rtype: :class:`github.PaginatedList.PaginatedList`\n \"\"\"\n return github.PaginatedList.PaginatedList(\n github.GitReleaseAsset.GitReleaseAsset,\n self._requester,\n self.url + \"/assets\",\n None\n )\n\n def _initAttributes(self):\n self._id = github.GithubObject.NotSet\n self._body = github.GithubObject.NotSet\n self._title = github.GithubObject.NotSet\n self._tag_name = github.GithubObject.NotSet\n self._target_commitish = github.GithubObject.NotSet\n self._draft = github.GithubObject.NotSet\n self._prerelease = github.GithubObject.NotSet\n self._author = github.GithubObject.NotSet\n self._url = github.GithubObject.NotSet\n self._upload_url = github.GithubObject.NotSet\n self._html_url = github.GithubObject.NotSet\n self._created_at = github.GithubObject.NotSet\n self._published_at = github.GithubObject.NotSet\n self._tarball_url = github.GithubObject.NotSet\n self._zipball_url = github.GithubObject.NotSet\n\n def _useAttributes(self, attributes):\n if \"id\" in attributes:\n self._id = self._makeIntAttribute(attributes[\"id\"])\n if \"body\" in attributes:\n self._body = self._makeStringAttribute(attributes[\"body\"])\n if \"name\" in attributes:\n self._title = self._makeStringAttribute(attributes[\"name\"])\n if \"tag_name\" in attributes:\n self._tag_name = self._makeStringAttribute(attributes[\"tag_name\"])\n if \"target_commitish\" in attributes:\n self._target_commitish = self._makeStringAttribute(attributes[\"target_commitish\"])\n if \"draft\" in attributes:\n self._draft = self._makeBoolAttribute(attributes[\"draft\"])\n if \"prerelease\" in attributes:\n self._prerelease = self._makeBoolAttribute(attributes[\"prerelease\"])\n if \"author\" in attributes:\n self._author = self._makeClassAttribute(github.NamedUser.NamedUser, attributes[\"author\"])\n if \"url\" in attributes:\n self._url = self._makeStringAttribute(attributes[\"url\"])\n if \"upload_url\" in attributes:\n self._upload_url = self._makeStringAttribute(attributes[\"upload_url\"])\n if \"html_url\" in attributes:\n self._html_url = self._makeStringAttribute(attributes[\"html_url\"])\n if \"created_at\" in attributes:\n self._created_at = self._makeDatetimeAttribute(attributes[\"created_at\"])\n if \"published_at\" in attributes:\n self._published_at = self._makeDatetimeAttribute(attributes[\"published_at\"])\n if \"tarball_url\" in attributes:\n self._tarball_url = self._makeStringAttribute(attributes[\"tarball_url\"])\n if \"zipball_url\" in attributes:\n self._zipball_url = self._makeStringAttribute(attributes[\"zipball_url\"])\n", "path": "github/GitRelease.py"}]}
| 3,733 | 387 |
gh_patches_debug_11013
|
rasdani/github-patches
|
git_diff
|
holoviz__panel-6293
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
pn.template attribute not found when lazy loading panel
panel==1.3.8
As Panel takes ~2secs to import I would like to lazy load it in one of my packages. This package provides different utility functions. Only some are panel related.
Lazy loading of modules in Python is defined here https://docs.python.org/3/library/importlib.html#implementing-lazy-imports. When I use that I get an `AttributeError` because the attribute `template` is not defined on `panel`. Without lazy loading this works.
## Minimum reproducible example
`lazy_loader.py`
```python
import importlib.util
import sys
def lazy_import(name):
spec = importlib.util.find_spec(name)
loader = importlib.util.LazyLoader(spec.loader)
spec.loader = loader
module = importlib.util.module_from_spec(spec)
sys.modules[name] = module
loader.exec_module(module)
return module
pn = lazy_import("panel")
# import panel as pn
def extension():
return pn.extension()
```
`app.py`
```python
from lazy_loader import extension
import panel as pn
extension()
pn.template.FastListTemplate(title="Hello", main=["world"]).servable()
```
```bash
panel serve app.py --autoreload --index app
```

¨
```bash
AttributeError: module 'panel' has no attribute 'template'
Traceback (most recent call last):
File "/home/jovyan/repos/aw-lib/.venv/lib/python3.11/site-packages/bokeh/application/handlers/code_runner.py", line 229, in run
exec(self._code, module.__dict__)
File "/home/jovyan/repos/aw-lib/app.py", line 6, in <module>
pn.template.FastListTemplate(title="Hello", main=["world"]).servable()
^^^^^^^^^^^
AttributeError: module 'panel' has no attribute 'template'
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `panel/__init__.py`
Content:
```
1 """
2 Panel is a high level app and dashboarding framework
3 ====================================================
4
5 Panel is an open-source Python library that lets you create custom
6 interactive web apps and dashboards by connecting user-defined widgets
7 to plots, images, tables, or text.
8
9 Panel works with the tools you know and ❤️.
10
11 Check out https://panel.holoviz.org/
12
13 .. figure:: https://user-images.githubusercontent.com/42288570/152672367-6c239073-0ea0-4a2b-a4c0-817e8090e877.gif
14 :alt: Panel Dashboard
15
16 Panel Dashboard
17
18 How to develop a Panel app in 3 simple steps
19 --------------------------------------------
20
21 - Write the app
22
23 >>> import panel as pn
24 >>> pn.extension(sizing_mode="stretch_width", template="fast")
25 >>> pn.state.template.param.update(title="My Data App")
26 >>> pn.panel(some_python_object).servable()
27
28 - Run your app
29
30 $ panel serve my_script.py --autoreload --show
31
32 or
33
34 $ panel serve my_notebook.ipynb --autoreload --show
35
36 The app will be available in your browser!
37
38 - Change your code and save it
39
40 The app will reload with your changes!
41
42 You can also add automatic reload to jupyterlab. Check out
43 https://blog.holoviz.org/panel_0.12.0.html#JupyterLab-previews
44
45 To learn more about Panel check out
46 https://panel.holoviz.org/getting_started/index.html
47 """
48 from param import rx
49
50 from . import chat # noqa
51 from . import layout # noqa
52 from . import links # noqa
53 from . import pane # noqa
54 from . import param # noqa
55 from . import pipeline # noqa
56 from . import reactive # noqa
57 from . import viewable # noqa
58 from . import widgets # noqa
59 from .config import __version__, config, panel_extension as extension # noqa
60 from .depends import bind, depends # noqa
61 from .interact import interact # noqa
62 from .io import ( # noqa
63 _jupyter_server_extension_paths, cache, ipywidget, serve, state,
64 )
65 from .layout import ( # noqa
66 Accordion, Card, Column, FlexBox, FloatPanel, GridBox, GridSpec, GridStack,
67 HSpacer, Row, Spacer, Swipe, Tabs, VSpacer, WidgetBox,
68 )
69 from .pane import panel # noqa
70 from .param import Param, ReactiveExpr # noqa
71 from .template import Template # noqa
72 from .widgets import indicators, widget # noqa
73
74 __all__ = (
75 "__version__",
76 "Accordion",
77 "Card",
78 "chat",
79 "Column",
80 "FlexBox",
81 "FloatPanel",
82 "GridBox",
83 "GridSpec",
84 "GridStack",
85 "HSpacer",
86 "Param",
87 "ReactiveExpr",
88 "Row",
89 "Spacer",
90 "Tabs",
91 "Template",
92 "VSpacer",
93 "WidgetBox",
94 "bind",
95 "cache",
96 "config",
97 "depends",
98 "extension",
99 "indicators",
100 "interact",
101 "ipywidget",
102 "layout",
103 "links",
104 "pane",
105 "panel",
106 "param",
107 "pipeline",
108 "rx",
109 "serve",
110 "state",
111 "viewable",
112 "widgets",
113 "widget"
114 )
115
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/panel/__init__.py b/panel/__init__.py
--- a/panel/__init__.py
+++ b/panel/__init__.py
@@ -54,6 +54,7 @@
from . import param # noqa
from . import pipeline # noqa
from . import reactive # noqa
+from . import template # noqa
from . import viewable # noqa
from . import widgets # noqa
from .config import __version__, config, panel_extension as extension # noqa
@@ -108,6 +109,7 @@
"rx",
"serve",
"state",
+ "template",
"viewable",
"widgets",
"widget"
|
{"golden_diff": "diff --git a/panel/__init__.py b/panel/__init__.py\n--- a/panel/__init__.py\n+++ b/panel/__init__.py\n@@ -54,6 +54,7 @@\n from . import param # noqa\n from . import pipeline # noqa\n from . import reactive # noqa\n+from . import template # noqa\n from . import viewable # noqa\n from . import widgets # noqa\n from .config import __version__, config, panel_extension as extension # noqa\n@@ -108,6 +109,7 @@\n \"rx\",\n \"serve\",\n \"state\",\n+ \"template\",\n \"viewable\",\n \"widgets\",\n \"widget\"\n", "issue": "pn.template attribute not found when lazy loading panel\npanel==1.3.8\r\n\r\nAs Panel takes ~2secs to import I would like to lazy load it in one of my packages. This package provides different utility functions. Only some are panel related.\r\n\r\nLazy loading of modules in Python is defined here https://docs.python.org/3/library/importlib.html#implementing-lazy-imports. When I use that I get an `AttributeError` because the attribute `template` is not defined on `panel`. Without lazy loading this works.\r\n\r\n## Minimum reproducible example\r\n\r\n`lazy_loader.py`\r\n\r\n```python\r\nimport importlib.util\r\nimport sys\r\n\r\ndef lazy_import(name):\r\n spec = importlib.util.find_spec(name)\r\n loader = importlib.util.LazyLoader(spec.loader)\r\n spec.loader = loader\r\n module = importlib.util.module_from_spec(spec)\r\n sys.modules[name] = module\r\n loader.exec_module(module)\r\n return module\r\n\r\npn = lazy_import(\"panel\")\r\n# import panel as pn\r\n\r\ndef extension():\r\n return pn.extension()\r\n```\r\n\r\n`app.py`\r\n\r\n```python\r\nfrom lazy_loader import extension\r\nimport panel as pn\r\n\r\nextension()\r\n\r\npn.template.FastListTemplate(title=\"Hello\", main=[\"world\"]).servable()\r\n```\r\n\r\n```bash\r\npanel serve app.py --autoreload --index app\r\n```\r\n\r\n\r\n\u00a8\r\n\r\n```bash\r\nAttributeError: module 'panel' has no attribute 'template'\r\n\r\nTraceback (most recent call last):\r\n File \"/home/jovyan/repos/aw-lib/.venv/lib/python3.11/site-packages/bokeh/application/handlers/code_runner.py\", line 229, in run\r\n exec(self._code, module.__dict__)\r\n File \"/home/jovyan/repos/aw-lib/app.py\", line 6, in <module>\r\n pn.template.FastListTemplate(title=\"Hello\", main=[\"world\"]).servable()\r\n ^^^^^^^^^^^\r\nAttributeError: module 'panel' has no attribute 'template'\r\n```\n", "before_files": [{"content": "\"\"\"\nPanel is a high level app and dashboarding framework\n====================================================\n\nPanel is an open-source Python library that lets you create custom\ninteractive web apps and dashboards by connecting user-defined widgets\nto plots, images, tables, or text.\n\nPanel works with the tools you know and \u2764\ufe0f.\n\nCheck out https://panel.holoviz.org/\n\n.. figure:: https://user-images.githubusercontent.com/42288570/152672367-6c239073-0ea0-4a2b-a4c0-817e8090e877.gif\n :alt: Panel Dashboard\n\n Panel Dashboard\n\nHow to develop a Panel app in 3 simple steps\n--------------------------------------------\n\n- Write the app\n\n>>> import panel as pn\n>>> pn.extension(sizing_mode=\"stretch_width\", template=\"fast\")\n>>> pn.state.template.param.update(title=\"My Data App\")\n>>> pn.panel(some_python_object).servable()\n\n- Run your app\n\n$ panel serve my_script.py --autoreload --show\n\nor\n\n$ panel serve my_notebook.ipynb --autoreload --show\n\nThe app will be available in your browser!\n\n- Change your code and save it\n\nThe app will reload with your changes!\n\nYou can also add automatic reload to jupyterlab. Check out\nhttps://blog.holoviz.org/panel_0.12.0.html#JupyterLab-previews\n\nTo learn more about Panel check out\nhttps://panel.holoviz.org/getting_started/index.html\n\"\"\"\nfrom param import rx\n\nfrom . import chat # noqa\nfrom . import layout # noqa\nfrom . import links # noqa\nfrom . import pane # noqa\nfrom . import param # noqa\nfrom . import pipeline # noqa\nfrom . import reactive # noqa\nfrom . import viewable # noqa\nfrom . import widgets # noqa\nfrom .config import __version__, config, panel_extension as extension # noqa\nfrom .depends import bind, depends # noqa\nfrom .interact import interact # noqa\nfrom .io import ( # noqa\n _jupyter_server_extension_paths, cache, ipywidget, serve, state,\n)\nfrom .layout import ( # noqa\n Accordion, Card, Column, FlexBox, FloatPanel, GridBox, GridSpec, GridStack,\n HSpacer, Row, Spacer, Swipe, Tabs, VSpacer, WidgetBox,\n)\nfrom .pane import panel # noqa\nfrom .param import Param, ReactiveExpr # noqa\nfrom .template import Template # noqa\nfrom .widgets import indicators, widget # noqa\n\n__all__ = (\n \"__version__\",\n \"Accordion\",\n \"Card\",\n \"chat\",\n \"Column\",\n \"FlexBox\",\n \"FloatPanel\",\n \"GridBox\",\n \"GridSpec\",\n \"GridStack\",\n \"HSpacer\",\n \"Param\",\n \"ReactiveExpr\",\n \"Row\",\n \"Spacer\",\n \"Tabs\",\n \"Template\",\n \"VSpacer\",\n \"WidgetBox\",\n \"bind\",\n \"cache\",\n \"config\",\n \"depends\",\n \"extension\",\n \"indicators\",\n \"interact\",\n \"ipywidget\",\n \"layout\",\n \"links\",\n \"pane\",\n \"panel\",\n \"param\",\n \"pipeline\",\n \"rx\",\n \"serve\",\n \"state\",\n \"viewable\",\n \"widgets\",\n \"widget\"\n)\n", "path": "panel/__init__.py"}], "after_files": [{"content": "\"\"\"\nPanel is a high level app and dashboarding framework\n====================================================\n\nPanel is an open-source Python library that lets you create custom\ninteractive web apps and dashboards by connecting user-defined widgets\nto plots, images, tables, or text.\n\nPanel works with the tools you know and \u2764\ufe0f.\n\nCheck out https://panel.holoviz.org/\n\n.. figure:: https://user-images.githubusercontent.com/42288570/152672367-6c239073-0ea0-4a2b-a4c0-817e8090e877.gif\n :alt: Panel Dashboard\n\n Panel Dashboard\n\nHow to develop a Panel app in 3 simple steps\n--------------------------------------------\n\n- Write the app\n\n>>> import panel as pn\n>>> pn.extension(sizing_mode=\"stretch_width\", template=\"fast\")\n>>> pn.state.template.param.update(title=\"My Data App\")\n>>> pn.panel(some_python_object).servable()\n\n- Run your app\n\n$ panel serve my_script.py --autoreload --show\n\nor\n\n$ panel serve my_notebook.ipynb --autoreload --show\n\nThe app will be available in your browser!\n\n- Change your code and save it\n\nThe app will reload with your changes!\n\nYou can also add automatic reload to jupyterlab. Check out\nhttps://blog.holoviz.org/panel_0.12.0.html#JupyterLab-previews\n\nTo learn more about Panel check out\nhttps://panel.holoviz.org/getting_started/index.html\n\"\"\"\nfrom param import rx\n\nfrom . import chat # noqa\nfrom . import layout # noqa\nfrom . import links # noqa\nfrom . import pane # noqa\nfrom . import param # noqa\nfrom . import pipeline # noqa\nfrom . import reactive # noqa\nfrom . import template # noqa\nfrom . import viewable # noqa\nfrom . import widgets # noqa\nfrom .config import __version__, config, panel_extension as extension # noqa\nfrom .depends import bind, depends # noqa\nfrom .interact import interact # noqa\nfrom .io import ( # noqa\n _jupyter_server_extension_paths, cache, ipywidget, serve, state,\n)\nfrom .layout import ( # noqa\n Accordion, Card, Column, FlexBox, FloatPanel, GridBox, GridSpec, GridStack,\n HSpacer, Row, Spacer, Swipe, Tabs, VSpacer, WidgetBox,\n)\nfrom .pane import panel # noqa\nfrom .param import Param, ReactiveExpr # noqa\nfrom .template import Template # noqa\nfrom .widgets import indicators, widget # noqa\n\n__all__ = (\n \"__version__\",\n \"Accordion\",\n \"Card\",\n \"chat\",\n \"Column\",\n \"FlexBox\",\n \"FloatPanel\",\n \"GridBox\",\n \"GridSpec\",\n \"GridStack\",\n \"HSpacer\",\n \"Param\",\n \"ReactiveExpr\",\n \"Row\",\n \"Spacer\",\n \"Tabs\",\n \"Template\",\n \"VSpacer\",\n \"WidgetBox\",\n \"bind\",\n \"cache\",\n \"config\",\n \"depends\",\n \"extension\",\n \"indicators\",\n \"interact\",\n \"ipywidget\",\n \"layout\",\n \"links\",\n \"pane\",\n \"panel\",\n \"param\",\n \"pipeline\",\n \"rx\",\n \"serve\",\n \"state\",\n \"template\",\n \"viewable\",\n \"widgets\",\n \"widget\"\n)\n", "path": "panel/__init__.py"}]}
| 1,731 | 159 |
gh_patches_debug_2446
|
rasdani/github-patches
|
git_diff
|
quantumlib__Cirq-4780
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Fix deprecation warning for newly added `ClassicallyControlledOperation`
**Description of the issue**
The following deprecation warning is emitted on running the json serialization test and should be fixed.
```python
~/quantum/Cirq/cirq-core/cirq/protocols/json_serialization.py:283: DeprecationWarning: Found 'cirq_type': 'ClassicallyControlledOperation' in _json_dict_. Custom values of this field are not permitted, and will produce an error starting in Cirq v0.15.
```
**Cirq version**
0.14dev
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `cirq-core/cirq/ops/classically_controlled_operation.py`
Content:
```
1 # Copyright 2021 The Cirq Developers
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # https://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14 from typing import (
15 AbstractSet,
16 Any,
17 Dict,
18 FrozenSet,
19 List,
20 Optional,
21 Sequence,
22 TYPE_CHECKING,
23 Tuple,
24 Union,
25 )
26
27 import sympy
28
29 from cirq import protocols, value
30 from cirq.ops import raw_types
31
32 if TYPE_CHECKING:
33 import cirq
34
35
36 @value.value_equality
37 class ClassicallyControlledOperation(raw_types.Operation):
38 """Augments existing operations to be conditionally executed.
39
40 An operation that is classically controlled is executed iff all conditions
41 evaluate to True. Currently the only condition type is a measurement key.
42 A measurement key evaluates to True iff any qubit in the corresponding
43 measurement operation evaluated to a non-zero value.
44
45 This object is typically created via
46 `operation.with_classical_controls(*conditions)`.
47 """
48
49 def __init__(
50 self,
51 sub_operation: 'cirq.Operation',
52 conditions: Sequence[Union[str, 'cirq.MeasurementKey', 'cirq.Condition', sympy.Basic]],
53 ):
54 """Initializes a `ClassicallyControlledOperation`.
55
56 Multiple consecutive `ClassicallyControlledOperation` layers are
57 squashed when possible, so one should not depend on a specific number
58 of layers.
59
60 Args:
61 sub_operation: The operation to gate with a classical control
62 condition.
63 conditions: A sequence of measurement keys, or strings that can be
64 parsed into measurement keys.
65
66 Raises:
67 ValueError: If an unsupported gate is being classically
68 controlled.
69 """
70 if protocols.measurement_key_objs(sub_operation):
71 raise ValueError(
72 f'Cannot conditionally run operations with measurements: {sub_operation}'
73 )
74 conditions = tuple(conditions)
75 if isinstance(sub_operation, ClassicallyControlledOperation):
76 conditions += sub_operation._conditions
77 sub_operation = sub_operation._sub_operation
78 conds: List['cirq.Condition'] = []
79 for c in conditions:
80 if isinstance(c, str):
81 c = value.MeasurementKey.parse_serialized(c)
82 if isinstance(c, value.MeasurementKey):
83 c = value.KeyCondition(c)
84 if isinstance(c, sympy.Basic):
85 c = value.SympyCondition(c)
86 conds.append(c)
87 self._conditions: Tuple['cirq.Condition', ...] = tuple(conds)
88 self._sub_operation: 'cirq.Operation' = sub_operation
89
90 @property
91 def classical_controls(self) -> FrozenSet['cirq.Condition']:
92 return frozenset(self._conditions).union(self._sub_operation.classical_controls)
93
94 def without_classical_controls(self) -> 'cirq.Operation':
95 return self._sub_operation.without_classical_controls()
96
97 @property
98 def qubits(self):
99 return self._sub_operation.qubits
100
101 def with_qubits(self, *new_qubits):
102 return self._sub_operation.with_qubits(*new_qubits).with_classical_controls(
103 *self._conditions
104 )
105
106 def _decompose_(self):
107 result = protocols.decompose_once(self._sub_operation, NotImplemented)
108 if result is NotImplemented:
109 return NotImplemented
110
111 return [ClassicallyControlledOperation(op, self._conditions) for op in result]
112
113 def _value_equality_values_(self):
114 return (frozenset(self._conditions), self._sub_operation)
115
116 def __str__(self) -> str:
117 keys = ', '.join(map(str, self._conditions))
118 return f'{self._sub_operation}.with_classical_controls({keys})'
119
120 def __repr__(self):
121 return (
122 f'cirq.ClassicallyControlledOperation('
123 f'{self._sub_operation!r}, {list(self._conditions)!r})'
124 )
125
126 def _is_parameterized_(self) -> bool:
127 return protocols.is_parameterized(self._sub_operation)
128
129 def _parameter_names_(self) -> AbstractSet[str]:
130 return protocols.parameter_names(self._sub_operation)
131
132 def _resolve_parameters_(
133 self, resolver: 'cirq.ParamResolver', recursive: bool
134 ) -> 'ClassicallyControlledOperation':
135 new_sub_op = protocols.resolve_parameters(self._sub_operation, resolver, recursive)
136 return new_sub_op.with_classical_controls(*self._conditions)
137
138 def _circuit_diagram_info_(
139 self, args: 'cirq.CircuitDiagramInfoArgs'
140 ) -> Optional['protocols.CircuitDiagramInfo']:
141 sub_args = protocols.CircuitDiagramInfoArgs(
142 known_qubit_count=args.known_qubit_count,
143 known_qubits=args.known_qubits,
144 use_unicode_characters=args.use_unicode_characters,
145 precision=args.precision,
146 label_map=args.label_map,
147 )
148 sub_info = protocols.circuit_diagram_info(self._sub_operation, sub_args, None)
149 if sub_info is None:
150 return NotImplemented # coverage: ignore
151
152 control_count = len({k for c in self._conditions for k in c.keys})
153 wire_symbols = sub_info.wire_symbols + ('^',) * control_count
154 if any(not isinstance(c, value.KeyCondition) for c in self._conditions):
155 wire_symbols = (
156 wire_symbols[0]
157 + '(conditions=['
158 + ', '.join(str(c) for c in self._conditions)
159 + '])',
160 ) + wire_symbols[1:]
161 exponent_qubit_index = None
162 if sub_info.exponent_qubit_index is not None:
163 exponent_qubit_index = sub_info.exponent_qubit_index + control_count
164 elif sub_info.exponent is not None:
165 exponent_qubit_index = control_count
166 return protocols.CircuitDiagramInfo(
167 wire_symbols=wire_symbols,
168 exponent=sub_info.exponent,
169 exponent_qubit_index=exponent_qubit_index,
170 )
171
172 def _json_dict_(self) -> Dict[str, Any]:
173 return {
174 'cirq_type': self.__class__.__name__,
175 'conditions': self._conditions,
176 'sub_operation': self._sub_operation,
177 }
178
179 def _act_on_(self, args: 'cirq.ActOnArgs') -> bool:
180 if all(c.resolve(args.log_of_measurement_results) for c in self._conditions):
181 protocols.act_on(self._sub_operation, args)
182 return True
183
184 def _with_measurement_key_mapping_(
185 self, key_map: Dict[str, str]
186 ) -> 'ClassicallyControlledOperation':
187 conditions = [protocols.with_measurement_key_mapping(c, key_map) for c in self._conditions]
188 sub_operation = protocols.with_measurement_key_mapping(self._sub_operation, key_map)
189 sub_operation = self._sub_operation if sub_operation is NotImplemented else sub_operation
190 return sub_operation.with_classical_controls(*conditions)
191
192 def _with_key_path_prefix_(self, prefix: Tuple[str, ...]) -> 'ClassicallyControlledOperation':
193 conditions = [protocols.with_key_path_prefix(c, prefix) for c in self._conditions]
194 sub_operation = protocols.with_key_path_prefix(self._sub_operation, prefix)
195 sub_operation = self._sub_operation if sub_operation is NotImplemented else sub_operation
196 return sub_operation.with_classical_controls(*conditions)
197
198 def _with_rescoped_keys_(
199 self,
200 path: Tuple[str, ...],
201 bindable_keys: FrozenSet['cirq.MeasurementKey'],
202 ) -> 'ClassicallyControlledOperation':
203 conds = [protocols.with_rescoped_keys(c, path, bindable_keys) for c in self._conditions]
204 sub_operation = protocols.with_rescoped_keys(self._sub_operation, path, bindable_keys)
205 return sub_operation.with_classical_controls(*conds)
206
207 def _control_keys_(self) -> FrozenSet['cirq.MeasurementKey']:
208 local_keys: FrozenSet['cirq.MeasurementKey'] = frozenset(
209 k for condition in self._conditions for k in condition.keys
210 )
211 return local_keys.union(protocols.control_keys(self._sub_operation))
212
213 def _qasm_(self, args: 'cirq.QasmArgs') -> Optional[str]:
214 args.validate_version('2.0')
215 all_keys = " && ".join(c.qasm for c in self._conditions)
216 return args.format('if ({0}) {1}', all_keys, protocols.qasm(self._sub_operation, args=args))
217
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/cirq-core/cirq/ops/classically_controlled_operation.py b/cirq-core/cirq/ops/classically_controlled_operation.py
--- a/cirq-core/cirq/ops/classically_controlled_operation.py
+++ b/cirq-core/cirq/ops/classically_controlled_operation.py
@@ -171,7 +171,6 @@
def _json_dict_(self) -> Dict[str, Any]:
return {
- 'cirq_type': self.__class__.__name__,
'conditions': self._conditions,
'sub_operation': self._sub_operation,
}
|
{"golden_diff": "diff --git a/cirq-core/cirq/ops/classically_controlled_operation.py b/cirq-core/cirq/ops/classically_controlled_operation.py\n--- a/cirq-core/cirq/ops/classically_controlled_operation.py\n+++ b/cirq-core/cirq/ops/classically_controlled_operation.py\n@@ -171,7 +171,6 @@\n \n def _json_dict_(self) -> Dict[str, Any]:\n return {\n- 'cirq_type': self.__class__.__name__,\n 'conditions': self._conditions,\n 'sub_operation': self._sub_operation,\n }\n", "issue": "Fix deprecation warning for newly added `ClassicallyControlledOperation`\n**Description of the issue**\r\nThe following deprecation warning is emitted on running the json serialization test and should be fixed.\r\n```python\r\n~/quantum/Cirq/cirq-core/cirq/protocols/json_serialization.py:283: DeprecationWarning: Found 'cirq_type': 'ClassicallyControlledOperation' in _json_dict_. Custom values of this field are not permitted, and will produce an error starting in Cirq v0.15.\r\n```\r\n**Cirq version**\r\n0.14dev\r\n\n", "before_files": [{"content": "# Copyright 2021 The Cirq Developers\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# https://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import (\n AbstractSet,\n Any,\n Dict,\n FrozenSet,\n List,\n Optional,\n Sequence,\n TYPE_CHECKING,\n Tuple,\n Union,\n)\n\nimport sympy\n\nfrom cirq import protocols, value\nfrom cirq.ops import raw_types\n\nif TYPE_CHECKING:\n import cirq\n\n\n@value.value_equality\nclass ClassicallyControlledOperation(raw_types.Operation):\n \"\"\"Augments existing operations to be conditionally executed.\n\n An operation that is classically controlled is executed iff all conditions\n evaluate to True. Currently the only condition type is a measurement key.\n A measurement key evaluates to True iff any qubit in the corresponding\n measurement operation evaluated to a non-zero value.\n\n This object is typically created via\n `operation.with_classical_controls(*conditions)`.\n \"\"\"\n\n def __init__(\n self,\n sub_operation: 'cirq.Operation',\n conditions: Sequence[Union[str, 'cirq.MeasurementKey', 'cirq.Condition', sympy.Basic]],\n ):\n \"\"\"Initializes a `ClassicallyControlledOperation`.\n\n Multiple consecutive `ClassicallyControlledOperation` layers are\n squashed when possible, so one should not depend on a specific number\n of layers.\n\n Args:\n sub_operation: The operation to gate with a classical control\n condition.\n conditions: A sequence of measurement keys, or strings that can be\n parsed into measurement keys.\n\n Raises:\n ValueError: If an unsupported gate is being classically\n controlled.\n \"\"\"\n if protocols.measurement_key_objs(sub_operation):\n raise ValueError(\n f'Cannot conditionally run operations with measurements: {sub_operation}'\n )\n conditions = tuple(conditions)\n if isinstance(sub_operation, ClassicallyControlledOperation):\n conditions += sub_operation._conditions\n sub_operation = sub_operation._sub_operation\n conds: List['cirq.Condition'] = []\n for c in conditions:\n if isinstance(c, str):\n c = value.MeasurementKey.parse_serialized(c)\n if isinstance(c, value.MeasurementKey):\n c = value.KeyCondition(c)\n if isinstance(c, sympy.Basic):\n c = value.SympyCondition(c)\n conds.append(c)\n self._conditions: Tuple['cirq.Condition', ...] = tuple(conds)\n self._sub_operation: 'cirq.Operation' = sub_operation\n\n @property\n def classical_controls(self) -> FrozenSet['cirq.Condition']:\n return frozenset(self._conditions).union(self._sub_operation.classical_controls)\n\n def without_classical_controls(self) -> 'cirq.Operation':\n return self._sub_operation.without_classical_controls()\n\n @property\n def qubits(self):\n return self._sub_operation.qubits\n\n def with_qubits(self, *new_qubits):\n return self._sub_operation.with_qubits(*new_qubits).with_classical_controls(\n *self._conditions\n )\n\n def _decompose_(self):\n result = protocols.decompose_once(self._sub_operation, NotImplemented)\n if result is NotImplemented:\n return NotImplemented\n\n return [ClassicallyControlledOperation(op, self._conditions) for op in result]\n\n def _value_equality_values_(self):\n return (frozenset(self._conditions), self._sub_operation)\n\n def __str__(self) -> str:\n keys = ', '.join(map(str, self._conditions))\n return f'{self._sub_operation}.with_classical_controls({keys})'\n\n def __repr__(self):\n return (\n f'cirq.ClassicallyControlledOperation('\n f'{self._sub_operation!r}, {list(self._conditions)!r})'\n )\n\n def _is_parameterized_(self) -> bool:\n return protocols.is_parameterized(self._sub_operation)\n\n def _parameter_names_(self) -> AbstractSet[str]:\n return protocols.parameter_names(self._sub_operation)\n\n def _resolve_parameters_(\n self, resolver: 'cirq.ParamResolver', recursive: bool\n ) -> 'ClassicallyControlledOperation':\n new_sub_op = protocols.resolve_parameters(self._sub_operation, resolver, recursive)\n return new_sub_op.with_classical_controls(*self._conditions)\n\n def _circuit_diagram_info_(\n self, args: 'cirq.CircuitDiagramInfoArgs'\n ) -> Optional['protocols.CircuitDiagramInfo']:\n sub_args = protocols.CircuitDiagramInfoArgs(\n known_qubit_count=args.known_qubit_count,\n known_qubits=args.known_qubits,\n use_unicode_characters=args.use_unicode_characters,\n precision=args.precision,\n label_map=args.label_map,\n )\n sub_info = protocols.circuit_diagram_info(self._sub_operation, sub_args, None)\n if sub_info is None:\n return NotImplemented # coverage: ignore\n\n control_count = len({k for c in self._conditions for k in c.keys})\n wire_symbols = sub_info.wire_symbols + ('^',) * control_count\n if any(not isinstance(c, value.KeyCondition) for c in self._conditions):\n wire_symbols = (\n wire_symbols[0]\n + '(conditions=['\n + ', '.join(str(c) for c in self._conditions)\n + '])',\n ) + wire_symbols[1:]\n exponent_qubit_index = None\n if sub_info.exponent_qubit_index is not None:\n exponent_qubit_index = sub_info.exponent_qubit_index + control_count\n elif sub_info.exponent is not None:\n exponent_qubit_index = control_count\n return protocols.CircuitDiagramInfo(\n wire_symbols=wire_symbols,\n exponent=sub_info.exponent,\n exponent_qubit_index=exponent_qubit_index,\n )\n\n def _json_dict_(self) -> Dict[str, Any]:\n return {\n 'cirq_type': self.__class__.__name__,\n 'conditions': self._conditions,\n 'sub_operation': self._sub_operation,\n }\n\n def _act_on_(self, args: 'cirq.ActOnArgs') -> bool:\n if all(c.resolve(args.log_of_measurement_results) for c in self._conditions):\n protocols.act_on(self._sub_operation, args)\n return True\n\n def _with_measurement_key_mapping_(\n self, key_map: Dict[str, str]\n ) -> 'ClassicallyControlledOperation':\n conditions = [protocols.with_measurement_key_mapping(c, key_map) for c in self._conditions]\n sub_operation = protocols.with_measurement_key_mapping(self._sub_operation, key_map)\n sub_operation = self._sub_operation if sub_operation is NotImplemented else sub_operation\n return sub_operation.with_classical_controls(*conditions)\n\n def _with_key_path_prefix_(self, prefix: Tuple[str, ...]) -> 'ClassicallyControlledOperation':\n conditions = [protocols.with_key_path_prefix(c, prefix) for c in self._conditions]\n sub_operation = protocols.with_key_path_prefix(self._sub_operation, prefix)\n sub_operation = self._sub_operation if sub_operation is NotImplemented else sub_operation\n return sub_operation.with_classical_controls(*conditions)\n\n def _with_rescoped_keys_(\n self,\n path: Tuple[str, ...],\n bindable_keys: FrozenSet['cirq.MeasurementKey'],\n ) -> 'ClassicallyControlledOperation':\n conds = [protocols.with_rescoped_keys(c, path, bindable_keys) for c in self._conditions]\n sub_operation = protocols.with_rescoped_keys(self._sub_operation, path, bindable_keys)\n return sub_operation.with_classical_controls(*conds)\n\n def _control_keys_(self) -> FrozenSet['cirq.MeasurementKey']:\n local_keys: FrozenSet['cirq.MeasurementKey'] = frozenset(\n k for condition in self._conditions for k in condition.keys\n )\n return local_keys.union(protocols.control_keys(self._sub_operation))\n\n def _qasm_(self, args: 'cirq.QasmArgs') -> Optional[str]:\n args.validate_version('2.0')\n all_keys = \" && \".join(c.qasm for c in self._conditions)\n return args.format('if ({0}) {1}', all_keys, protocols.qasm(self._sub_operation, args=args))\n", "path": "cirq-core/cirq/ops/classically_controlled_operation.py"}], "after_files": [{"content": "# Copyright 2021 The Cirq Developers\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# https://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import (\n AbstractSet,\n Any,\n Dict,\n FrozenSet,\n List,\n Optional,\n Sequence,\n TYPE_CHECKING,\n Tuple,\n Union,\n)\n\nimport sympy\n\nfrom cirq import protocols, value\nfrom cirq.ops import raw_types\n\nif TYPE_CHECKING:\n import cirq\n\n\n@value.value_equality\nclass ClassicallyControlledOperation(raw_types.Operation):\n \"\"\"Augments existing operations to be conditionally executed.\n\n An operation that is classically controlled is executed iff all conditions\n evaluate to True. Currently the only condition type is a measurement key.\n A measurement key evaluates to True iff any qubit in the corresponding\n measurement operation evaluated to a non-zero value.\n\n This object is typically created via\n `operation.with_classical_controls(*conditions)`.\n \"\"\"\n\n def __init__(\n self,\n sub_operation: 'cirq.Operation',\n conditions: Sequence[Union[str, 'cirq.MeasurementKey', 'cirq.Condition', sympy.Basic]],\n ):\n \"\"\"Initializes a `ClassicallyControlledOperation`.\n\n Multiple consecutive `ClassicallyControlledOperation` layers are\n squashed when possible, so one should not depend on a specific number\n of layers.\n\n Args:\n sub_operation: The operation to gate with a classical control\n condition.\n conditions: A sequence of measurement keys, or strings that can be\n parsed into measurement keys.\n\n Raises:\n ValueError: If an unsupported gate is being classically\n controlled.\n \"\"\"\n if protocols.measurement_key_objs(sub_operation):\n raise ValueError(\n f'Cannot conditionally run operations with measurements: {sub_operation}'\n )\n conditions = tuple(conditions)\n if isinstance(sub_operation, ClassicallyControlledOperation):\n conditions += sub_operation._conditions\n sub_operation = sub_operation._sub_operation\n conds: List['cirq.Condition'] = []\n for c in conditions:\n if isinstance(c, str):\n c = value.MeasurementKey.parse_serialized(c)\n if isinstance(c, value.MeasurementKey):\n c = value.KeyCondition(c)\n if isinstance(c, sympy.Basic):\n c = value.SympyCondition(c)\n conds.append(c)\n self._conditions: Tuple['cirq.Condition', ...] = tuple(conds)\n self._sub_operation: 'cirq.Operation' = sub_operation\n\n @property\n def classical_controls(self) -> FrozenSet['cirq.Condition']:\n return frozenset(self._conditions).union(self._sub_operation.classical_controls)\n\n def without_classical_controls(self) -> 'cirq.Operation':\n return self._sub_operation.without_classical_controls()\n\n @property\n def qubits(self):\n return self._sub_operation.qubits\n\n def with_qubits(self, *new_qubits):\n return self._sub_operation.with_qubits(*new_qubits).with_classical_controls(\n *self._conditions\n )\n\n def _decompose_(self):\n result = protocols.decompose_once(self._sub_operation, NotImplemented)\n if result is NotImplemented:\n return NotImplemented\n\n return [ClassicallyControlledOperation(op, self._conditions) for op in result]\n\n def _value_equality_values_(self):\n return (frozenset(self._conditions), self._sub_operation)\n\n def __str__(self) -> str:\n keys = ', '.join(map(str, self._conditions))\n return f'{self._sub_operation}.with_classical_controls({keys})'\n\n def __repr__(self):\n return (\n f'cirq.ClassicallyControlledOperation('\n f'{self._sub_operation!r}, {list(self._conditions)!r})'\n )\n\n def _is_parameterized_(self) -> bool:\n return protocols.is_parameterized(self._sub_operation)\n\n def _parameter_names_(self) -> AbstractSet[str]:\n return protocols.parameter_names(self._sub_operation)\n\n def _resolve_parameters_(\n self, resolver: 'cirq.ParamResolver', recursive: bool\n ) -> 'ClassicallyControlledOperation':\n new_sub_op = protocols.resolve_parameters(self._sub_operation, resolver, recursive)\n return new_sub_op.with_classical_controls(*self._conditions)\n\n def _circuit_diagram_info_(\n self, args: 'cirq.CircuitDiagramInfoArgs'\n ) -> Optional['protocols.CircuitDiagramInfo']:\n sub_args = protocols.CircuitDiagramInfoArgs(\n known_qubit_count=args.known_qubit_count,\n known_qubits=args.known_qubits,\n use_unicode_characters=args.use_unicode_characters,\n precision=args.precision,\n label_map=args.label_map,\n )\n sub_info = protocols.circuit_diagram_info(self._sub_operation, sub_args, None)\n if sub_info is None:\n return NotImplemented # coverage: ignore\n\n control_count = len({k for c in self._conditions for k in c.keys})\n wire_symbols = sub_info.wire_symbols + ('^',) * control_count\n if any(not isinstance(c, value.KeyCondition) for c in self._conditions):\n wire_symbols = (\n wire_symbols[0]\n + '(conditions=['\n + ', '.join(str(c) for c in self._conditions)\n + '])',\n ) + wire_symbols[1:]\n exponent_qubit_index = None\n if sub_info.exponent_qubit_index is not None:\n exponent_qubit_index = sub_info.exponent_qubit_index + control_count\n elif sub_info.exponent is not None:\n exponent_qubit_index = control_count\n return protocols.CircuitDiagramInfo(\n wire_symbols=wire_symbols,\n exponent=sub_info.exponent,\n exponent_qubit_index=exponent_qubit_index,\n )\n\n def _json_dict_(self) -> Dict[str, Any]:\n return {\n 'conditions': self._conditions,\n 'sub_operation': self._sub_operation,\n }\n\n def _act_on_(self, args: 'cirq.ActOnArgs') -> bool:\n if all(c.resolve(args.log_of_measurement_results) for c in self._conditions):\n protocols.act_on(self._sub_operation, args)\n return True\n\n def _with_measurement_key_mapping_(\n self, key_map: Dict[str, str]\n ) -> 'ClassicallyControlledOperation':\n conditions = [protocols.with_measurement_key_mapping(c, key_map) for c in self._conditions]\n sub_operation = protocols.with_measurement_key_mapping(self._sub_operation, key_map)\n sub_operation = self._sub_operation if sub_operation is NotImplemented else sub_operation\n return sub_operation.with_classical_controls(*conditions)\n\n def _with_key_path_prefix_(self, prefix: Tuple[str, ...]) -> 'ClassicallyControlledOperation':\n conditions = [protocols.with_key_path_prefix(c, prefix) for c in self._conditions]\n sub_operation = protocols.with_key_path_prefix(self._sub_operation, prefix)\n sub_operation = self._sub_operation if sub_operation is NotImplemented else sub_operation\n return sub_operation.with_classical_controls(*conditions)\n\n def _with_rescoped_keys_(\n self,\n path: Tuple[str, ...],\n bindable_keys: FrozenSet['cirq.MeasurementKey'],\n ) -> 'ClassicallyControlledOperation':\n conds = [protocols.with_rescoped_keys(c, path, bindable_keys) for c in self._conditions]\n sub_operation = protocols.with_rescoped_keys(self._sub_operation, path, bindable_keys)\n return sub_operation.with_classical_controls(*conds)\n\n def _control_keys_(self) -> FrozenSet['cirq.MeasurementKey']:\n local_keys: FrozenSet['cirq.MeasurementKey'] = frozenset(\n k for condition in self._conditions for k in condition.keys\n )\n return local_keys.union(protocols.control_keys(self._sub_operation))\n\n def _qasm_(self, args: 'cirq.QasmArgs') -> Optional[str]:\n args.validate_version('2.0')\n all_keys = \" && \".join(c.qasm for c in self._conditions)\n return args.format('if ({0}) {1}', all_keys, protocols.qasm(self._sub_operation, args=args))\n", "path": "cirq-core/cirq/ops/classically_controlled_operation.py"}]}
| 2,838 | 129 |
gh_patches_debug_40115
|
rasdani/github-patches
|
git_diff
|
Pylons__pyramid-2902
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
add_translation_dirs behaves the wrong way around
This is a follow-up to #1473. See the reproducer and notes there.
I would propose an API change that changes behaviour of add_translation_dirs to append specs at the end of existing specs instead of prepending it. That way, API users simply need to understand "last spec wins". This matches much closer to the mental model of "add".
It matches the current documented behaviour of how the method is to _behave_ in case of multiple calls.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pyramid/config/i18n.py`
Content:
```
1 from pyramid.interfaces import (
2 ILocaleNegotiator,
3 ITranslationDirectories,
4 )
5
6 from pyramid.exceptions import ConfigurationError
7 from pyramid.path import AssetResolver
8 from pyramid.util import action_method
9
10 class I18NConfiguratorMixin(object):
11 @action_method
12 def set_locale_negotiator(self, negotiator):
13 """
14 Set the :term:`locale negotiator` for this application. The
15 :term:`locale negotiator` is a callable which accepts a
16 :term:`request` object and which returns a :term:`locale
17 name`. The ``negotiator`` argument should be the locale
18 negotiator implementation or a :term:`dotted Python name`
19 which refers to such an implementation.
20
21 Later calls to this method override earlier calls; there can
22 be only one locale negotiator active at a time within an
23 application. See :ref:`activating_translation` for more
24 information.
25
26 .. note::
27
28 Using the ``locale_negotiator`` argument to the
29 :class:`pyramid.config.Configurator` constructor can be used to
30 achieve the same purpose.
31 """
32 def register():
33 self._set_locale_negotiator(negotiator)
34 intr = self.introspectable('locale negotiator', None,
35 self.object_description(negotiator),
36 'locale negotiator')
37 intr['negotiator'] = negotiator
38 self.action(ILocaleNegotiator, register, introspectables=(intr,))
39
40 def _set_locale_negotiator(self, negotiator):
41 locale_negotiator = self.maybe_dotted(negotiator)
42 self.registry.registerUtility(locale_negotiator, ILocaleNegotiator)
43
44 @action_method
45 def add_translation_dirs(self, *specs):
46 """ Add one or more :term:`translation directory` paths to the
47 current configuration state. The ``specs`` argument is a
48 sequence that may contain absolute directory paths
49 (e.g. ``/usr/share/locale``) or :term:`asset specification`
50 names naming a directory path (e.g. ``some.package:locale``)
51 or a combination of the two.
52
53 Example:
54
55 .. code-block:: python
56
57 config.add_translation_dirs('/usr/share/locale',
58 'some.package:locale')
59
60 The translation directories are defined as a list in which
61 translations defined later have precedence over translations defined
62 earlier.
63
64 If multiple specs are provided in a single call to
65 ``add_translation_dirs``, the directories will be inserted in the
66 order they're provided (earlier items are trumped by later items).
67
68 .. warning::
69
70 Consecutive calls to ``add_translation_dirs`` will sort the
71 directories such that the later calls will add folders with
72 lower precedence than earlier calls.
73
74 """
75 introspectables = []
76
77 def register():
78 directories = []
79 resolver = AssetResolver(self.package_name)
80
81 # defer spec resolution until register to allow for asset
82 # overrides to take place in an earlier config phase
83 for spec in specs[::-1]: # reversed
84 # the trailing slash helps match asset overrides for folders
85 if not spec.endswith('/'):
86 spec += '/'
87 asset = resolver.resolve(spec)
88 directory = asset.abspath()
89 if not asset.isdir():
90 raise ConfigurationError('"%s" is not a directory' %
91 directory)
92 intr = self.introspectable('translation directories', directory,
93 spec, 'translation directory')
94 intr['directory'] = directory
95 intr['spec'] = spec
96 introspectables.append(intr)
97 directories.append(directory)
98
99 tdirs = self.registry.queryUtility(ITranslationDirectories)
100 if tdirs is None:
101 tdirs = []
102 self.registry.registerUtility(tdirs, ITranslationDirectories)
103 for directory in directories:
104 tdirs.insert(0, directory)
105
106 self.action(None, register, introspectables=introspectables)
107
108
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/pyramid/config/i18n.py b/pyramid/config/i18n.py
--- a/pyramid/config/i18n.py
+++ b/pyramid/config/i18n.py
@@ -42,7 +42,7 @@
self.registry.registerUtility(locale_negotiator, ILocaleNegotiator)
@action_method
- def add_translation_dirs(self, *specs):
+ def add_translation_dirs(self, *specs, **kw):
""" Add one or more :term:`translation directory` paths to the
current configuration state. The ``specs`` argument is a
sequence that may contain absolute directory paths
@@ -61,18 +61,27 @@
translations defined later have precedence over translations defined
earlier.
+ By default, consecutive calls to ``add_translation_dirs`` will add
+ directories to the start of the list. This means later calls to
+ ``add_translation_dirs`` will have their translations trumped by
+ earlier calls. If you explicitly need this call to trump an earlier
+ call then you may set ``override`` to ``True``.
+
If multiple specs are provided in a single call to
``add_translation_dirs``, the directories will be inserted in the
order they're provided (earlier items are trumped by later items).
- .. warning::
+ .. versionchanged:: 1.8
- Consecutive calls to ``add_translation_dirs`` will sort the
- directories such that the later calls will add folders with
- lower precedence than earlier calls.
+ The ``override`` parameter was added to allow a later call
+ to ``add_translation_dirs`` to override an earlier call, inserting
+ folders at the beginning of the translation directory list.
"""
introspectables = []
+ override = kw.pop('override', False)
+ if kw:
+ raise TypeError('invalid keyword arguments: %s', sorted(kw.keys()))
def register():
directories = []
@@ -80,7 +89,7 @@
# defer spec resolution until register to allow for asset
# overrides to take place in an earlier config phase
- for spec in specs[::-1]: # reversed
+ for spec in specs:
# the trailing slash helps match asset overrides for folders
if not spec.endswith('/'):
spec += '/'
@@ -100,8 +109,11 @@
if tdirs is None:
tdirs = []
self.registry.registerUtility(tdirs, ITranslationDirectories)
- for directory in directories:
- tdirs.insert(0, directory)
+ if override:
+ tdirs.extend(directories)
+ else:
+ for directory in reversed(directories):
+ tdirs.insert(0, directory)
self.action(None, register, introspectables=introspectables)
|
{"golden_diff": "diff --git a/pyramid/config/i18n.py b/pyramid/config/i18n.py\n--- a/pyramid/config/i18n.py\n+++ b/pyramid/config/i18n.py\n@@ -42,7 +42,7 @@\n self.registry.registerUtility(locale_negotiator, ILocaleNegotiator)\n \n @action_method\n- def add_translation_dirs(self, *specs):\n+ def add_translation_dirs(self, *specs, **kw):\n \"\"\" Add one or more :term:`translation directory` paths to the\n current configuration state. The ``specs`` argument is a\n sequence that may contain absolute directory paths\n@@ -61,18 +61,27 @@\n translations defined later have precedence over translations defined\n earlier.\n \n+ By default, consecutive calls to ``add_translation_dirs`` will add\n+ directories to the start of the list. This means later calls to\n+ ``add_translation_dirs`` will have their translations trumped by\n+ earlier calls. If you explicitly need this call to trump an earlier\n+ call then you may set ``override`` to ``True``.\n+\n If multiple specs are provided in a single call to\n ``add_translation_dirs``, the directories will be inserted in the\n order they're provided (earlier items are trumped by later items).\n \n- .. warning::\n+ .. versionchanged:: 1.8\n \n- Consecutive calls to ``add_translation_dirs`` will sort the\n- directories such that the later calls will add folders with\n- lower precedence than earlier calls.\n+ The ``override`` parameter was added to allow a later call\n+ to ``add_translation_dirs`` to override an earlier call, inserting\n+ folders at the beginning of the translation directory list.\n \n \"\"\"\n introspectables = []\n+ override = kw.pop('override', False)\n+ if kw:\n+ raise TypeError('invalid keyword arguments: %s', sorted(kw.keys()))\n \n def register():\n directories = []\n@@ -80,7 +89,7 @@\n \n # defer spec resolution until register to allow for asset\n # overrides to take place in an earlier config phase\n- for spec in specs[::-1]: # reversed\n+ for spec in specs:\n # the trailing slash helps match asset overrides for folders\n if not spec.endswith('/'):\n spec += '/'\n@@ -100,8 +109,11 @@\n if tdirs is None:\n tdirs = []\n self.registry.registerUtility(tdirs, ITranslationDirectories)\n- for directory in directories:\n- tdirs.insert(0, directory)\n+ if override:\n+ tdirs.extend(directories)\n+ else:\n+ for directory in reversed(directories):\n+ tdirs.insert(0, directory)\n \n self.action(None, register, introspectables=introspectables)\n", "issue": "add_translation_dirs behaves the wrong way around\nThis is a follow-up to #1473. See the reproducer and notes there.\n\nI would propose an API change that changes behaviour of add_translation_dirs to append specs at the end of existing specs instead of prepending it. That way, API users simply need to understand \"last spec wins\". This matches much closer to the mental model of \"add\".\n\nIt matches the current documented behaviour of how the method is to _behave_ in case of multiple calls.\n\n", "before_files": [{"content": "from pyramid.interfaces import (\n ILocaleNegotiator,\n ITranslationDirectories,\n )\n\nfrom pyramid.exceptions import ConfigurationError\nfrom pyramid.path import AssetResolver\nfrom pyramid.util import action_method\n\nclass I18NConfiguratorMixin(object):\n @action_method\n def set_locale_negotiator(self, negotiator):\n \"\"\"\n Set the :term:`locale negotiator` for this application. The\n :term:`locale negotiator` is a callable which accepts a\n :term:`request` object and which returns a :term:`locale\n name`. The ``negotiator`` argument should be the locale\n negotiator implementation or a :term:`dotted Python name`\n which refers to such an implementation.\n\n Later calls to this method override earlier calls; there can\n be only one locale negotiator active at a time within an\n application. See :ref:`activating_translation` for more\n information.\n\n .. note::\n\n Using the ``locale_negotiator`` argument to the\n :class:`pyramid.config.Configurator` constructor can be used to\n achieve the same purpose.\n \"\"\"\n def register():\n self._set_locale_negotiator(negotiator)\n intr = self.introspectable('locale negotiator', None,\n self.object_description(negotiator),\n 'locale negotiator')\n intr['negotiator'] = negotiator\n self.action(ILocaleNegotiator, register, introspectables=(intr,))\n\n def _set_locale_negotiator(self, negotiator):\n locale_negotiator = self.maybe_dotted(negotiator)\n self.registry.registerUtility(locale_negotiator, ILocaleNegotiator)\n\n @action_method\n def add_translation_dirs(self, *specs):\n \"\"\" Add one or more :term:`translation directory` paths to the\n current configuration state. The ``specs`` argument is a\n sequence that may contain absolute directory paths\n (e.g. ``/usr/share/locale``) or :term:`asset specification`\n names naming a directory path (e.g. ``some.package:locale``)\n or a combination of the two.\n\n Example:\n\n .. code-block:: python\n\n config.add_translation_dirs('/usr/share/locale',\n 'some.package:locale')\n\n The translation directories are defined as a list in which\n translations defined later have precedence over translations defined\n earlier.\n\n If multiple specs are provided in a single call to\n ``add_translation_dirs``, the directories will be inserted in the\n order they're provided (earlier items are trumped by later items).\n\n .. warning::\n\n Consecutive calls to ``add_translation_dirs`` will sort the\n directories such that the later calls will add folders with\n lower precedence than earlier calls.\n\n \"\"\"\n introspectables = []\n\n def register():\n directories = []\n resolver = AssetResolver(self.package_name)\n\n # defer spec resolution until register to allow for asset\n # overrides to take place in an earlier config phase\n for spec in specs[::-1]: # reversed\n # the trailing slash helps match asset overrides for folders\n if not spec.endswith('/'):\n spec += '/'\n asset = resolver.resolve(spec)\n directory = asset.abspath()\n if not asset.isdir():\n raise ConfigurationError('\"%s\" is not a directory' %\n directory)\n intr = self.introspectable('translation directories', directory,\n spec, 'translation directory')\n intr['directory'] = directory\n intr['spec'] = spec\n introspectables.append(intr)\n directories.append(directory)\n\n tdirs = self.registry.queryUtility(ITranslationDirectories)\n if tdirs is None:\n tdirs = []\n self.registry.registerUtility(tdirs, ITranslationDirectories)\n for directory in directories:\n tdirs.insert(0, directory)\n\n self.action(None, register, introspectables=introspectables)\n\n", "path": "pyramid/config/i18n.py"}], "after_files": [{"content": "from pyramid.interfaces import (\n ILocaleNegotiator,\n ITranslationDirectories,\n )\n\nfrom pyramid.exceptions import ConfigurationError\nfrom pyramid.path import AssetResolver\nfrom pyramid.util import action_method\n\nclass I18NConfiguratorMixin(object):\n @action_method\n def set_locale_negotiator(self, negotiator):\n \"\"\"\n Set the :term:`locale negotiator` for this application. The\n :term:`locale negotiator` is a callable which accepts a\n :term:`request` object and which returns a :term:`locale\n name`. The ``negotiator`` argument should be the locale\n negotiator implementation or a :term:`dotted Python name`\n which refers to such an implementation.\n\n Later calls to this method override earlier calls; there can\n be only one locale negotiator active at a time within an\n application. See :ref:`activating_translation` for more\n information.\n\n .. note::\n\n Using the ``locale_negotiator`` argument to the\n :class:`pyramid.config.Configurator` constructor can be used to\n achieve the same purpose.\n \"\"\"\n def register():\n self._set_locale_negotiator(negotiator)\n intr = self.introspectable('locale negotiator', None,\n self.object_description(negotiator),\n 'locale negotiator')\n intr['negotiator'] = negotiator\n self.action(ILocaleNegotiator, register, introspectables=(intr,))\n\n def _set_locale_negotiator(self, negotiator):\n locale_negotiator = self.maybe_dotted(negotiator)\n self.registry.registerUtility(locale_negotiator, ILocaleNegotiator)\n\n @action_method\n def add_translation_dirs(self, *specs, **kw):\n \"\"\" Add one or more :term:`translation directory` paths to the\n current configuration state. The ``specs`` argument is a\n sequence that may contain absolute directory paths\n (e.g. ``/usr/share/locale``) or :term:`asset specification`\n names naming a directory path (e.g. ``some.package:locale``)\n or a combination of the two.\n\n Example:\n\n .. code-block:: python\n\n config.add_translation_dirs('/usr/share/locale',\n 'some.package:locale')\n\n The translation directories are defined as a list in which\n translations defined later have precedence over translations defined\n earlier.\n\n By default, consecutive calls to ``add_translation_dirs`` will add\n directories to the start of the list. This means later calls to\n ``add_translation_dirs`` will have their translations trumped by\n earlier calls. If you explicitly need this call to trump an earlier\n call then you may set ``override`` to ``True``.\n\n If multiple specs are provided in a single call to\n ``add_translation_dirs``, the directories will be inserted in the\n order they're provided (earlier items are trumped by later items).\n\n .. versionchanged:: 1.8\n\n The ``override`` parameter was added to allow a later call\n to ``add_translation_dirs`` to override an earlier call, inserting\n folders at the beginning of the translation directory list.\n\n \"\"\"\n introspectables = []\n override = kw.pop('override', False)\n if kw:\n raise TypeError('invalid keyword arguments: %s', sorted(kw.keys()))\n\n def register():\n directories = []\n resolver = AssetResolver(self.package_name)\n\n # defer spec resolution until register to allow for asset\n # overrides to take place in an earlier config phase\n for spec in specs:\n # the trailing slash helps match asset overrides for folders\n if not spec.endswith('/'):\n spec += '/'\n asset = resolver.resolve(spec)\n directory = asset.abspath()\n if not asset.isdir():\n raise ConfigurationError('\"%s\" is not a directory' %\n directory)\n intr = self.introspectable('translation directories', directory,\n spec, 'translation directory')\n intr['directory'] = directory\n intr['spec'] = spec\n introspectables.append(intr)\n directories.append(directory)\n\n tdirs = self.registry.queryUtility(ITranslationDirectories)\n if tdirs is None:\n tdirs = []\n self.registry.registerUtility(tdirs, ITranslationDirectories)\n if override:\n tdirs.extend(directories)\n else:\n for directory in reversed(directories):\n tdirs.insert(0, directory)\n\n self.action(None, register, introspectables=introspectables)\n\n", "path": "pyramid/config/i18n.py"}]}
| 1,427 | 628 |
gh_patches_debug_13603
|
rasdani/github-patches
|
git_diff
|
python-telegram-bot__python-telegram-bot-1079
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Allow subprocess.Popen.stdout as file input
<!--
Thanks for reporting issues of python-telegram-bot!
Use this template to notify us if you found a bug, or if you want to request a new feature.
If you're looking for help with programming your bot using our library, feel free to ask your
questions in out telegram group at: https://t.me/pythontelegrambotgroup
To make it easier for us to help you please enter detailed information below.
Please note, we only support the latest version of python-telegram-bot and
master branch. Please make sure to upgrade & recreate the issue on the latest
version prior to opening an issue.
-->
### Steps to reproduce
1.
```python
@run_async
def pipe(bot, update):
proc = subprocess.Popen(
"cat image.jpg".split(),
stdout=subprocess.PIPE
)
update.message.reply_voice(proc.stdout)
while proc.returncode is None:
proc.poll()
```
### Expected behaviour
Allow `subprocess.Popen.stdout` be treated as valid io stream, just like `open("image.jpg", "rb")` is.
### Actual behaviour
```
...
File "/usr/lib/python3.6/site-packages/telegram/utils/request.py", line 262, in post
data = InputFile(data)
File "/usr/lib/python3.6/site-packages/telegram/files/inputfile.py", line 76, in __init__
self.filename = os.path.basename(self.input_file.name)
File "/usr/lib/python3.6/posixpath.py", line 144, in basename
p = os.fspath(p)
TypeError: expected str, bytes or os.PathLike object, not int
```
### Configuration
**Operating System:**
`Linux xiaomi 4.15.9-1-ARCH #1 SMP PREEMPT Sun Mar 11 17:54:33 UTC 2018 x86_64 GNU/Linux`
**Version of Python, python-telegram-bot & dependencies:**
```
Python 3.6.4 (default, Jan 5 2018, 02:35:40)
[GCC 7.2.1 20171224] on linux
```
### PROPOSED FIX
https://github.com/python-telegram-bot/python-telegram-bot/blob/master/telegram/files/inputfile.py#L76 should stringify `self.input_file.name` to look as:
```python
self.filename = os.path.basename(str(self.input_file.name))
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `telegram/files/inputfile.py`
Content:
```
1 #!/usr/bin/env python
2 # pylint: disable=W0622,E0611
3 #
4 # A library that provides a Python interface to the Telegram Bot API
5 # Copyright (C) 2015-2018
6 # Leandro Toledo de Souza <devs@python-telegram-bot.org>
7 #
8 # This program is free software: you can redistribute it and/or modify
9 # it under the terms of the GNU Lesser Public License as published by
10 # the Free Software Foundation, either version 3 of the License, or
11 # (at your option) any later version.
12 #
13 # This program is distributed in the hope that it will be useful,
14 # but WITHOUT ANY WARRANTY; without even the implied warranty of
15 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
16 # GNU Lesser Public License for more details.
17 #
18 # You should have received a copy of the GNU Lesser Public License
19 # along with this program. If not, see [http://www.gnu.org/licenses/].
20 """This module contains an object that represents a Telegram InputFile."""
21
22 try:
23 # python 3
24 from email.generator import _make_boundary as choose_boundary
25 except ImportError:
26 # python 2
27 from mimetools import choose_boundary
28
29 import imghdr
30 import mimetypes
31 import os
32 import sys
33
34 from telegram import TelegramError
35
36 DEFAULT_MIME_TYPE = 'application/octet-stream'
37 USER_AGENT = 'Python Telegram Bot (https://github.com/python-telegram-bot/python-telegram-bot)'
38 FILE_TYPES = ('audio', 'document', 'photo', 'sticker', 'video', 'voice', 'certificate',
39 'video_note', 'png_sticker')
40
41
42 class InputFile(object):
43 """This object represents a Telegram InputFile.
44
45 Attributes:
46 data (:obj:`dict`): Data containing an inputfile.
47
48 Args:
49 data (:obj:`dict`): Data containing an inputfile.
50
51 Raises:
52 TelegramError
53
54 """
55
56 def __init__(self, data):
57 self.data = data
58 self.boundary = choose_boundary()
59
60 for t in FILE_TYPES:
61 if t in data:
62 self.input_name = t
63 self.input_file = data.pop(t)
64 break
65 else:
66 raise TelegramError('Unknown inputfile type')
67
68 if hasattr(self.input_file, 'read'):
69 self.filename = None
70 self.input_file_content = self.input_file.read()
71 if 'filename' in data:
72 self.filename = self.data.pop('filename')
73 elif hasattr(self.input_file, 'name'):
74 # on py2.7, pylint fails to understand this properly
75 # pylint: disable=E1101
76 self.filename = os.path.basename(self.input_file.name)
77
78 try:
79 self.mimetype = self.is_image(self.input_file_content)
80 if not self.filename or '.' not in self.filename:
81 self.filename = self.mimetype.replace('/', '.')
82 except TelegramError:
83 if self.filename:
84 self.mimetype = mimetypes.guess_type(
85 self.filename)[0] or DEFAULT_MIME_TYPE
86 else:
87 self.mimetype = DEFAULT_MIME_TYPE
88
89 @property
90 def headers(self):
91 """:obj:`dict`: Headers."""
92
93 return {'User-agent': USER_AGENT, 'Content-type': self.content_type}
94
95 @property
96 def content_type(self):
97 """:obj:`str`: Content type"""
98 return 'multipart/form-data; boundary=%s' % self.boundary
99
100 def to_form(self):
101 """Transform the inputfile to multipart/form data.
102
103 Returns:
104 :obj:`str`
105
106 """
107 form = []
108 form_boundary = '--' + self.boundary
109
110 # Add data fields
111 for name in iter(self.data):
112 value = self.data[name]
113 form.extend([
114 form_boundary, 'Content-Disposition: form-data; name="%s"' % name, '', str(value)
115 ])
116
117 # Add input_file to upload
118 form.extend([
119 form_boundary, 'Content-Disposition: form-data; name="%s"; filename="%s"' %
120 (self.input_name,
121 self.filename), 'Content-Type: %s' % self.mimetype, '', self.input_file_content
122 ])
123
124 form.append('--' + self.boundary + '--')
125 form.append('')
126
127 return self._parse(form)
128
129 @staticmethod
130 def _parse(form):
131 if sys.version_info > (3,):
132 # on Python 3 form needs to be byte encoded
133 encoded_form = []
134 for item in form:
135 try:
136 encoded_form.append(item.encode())
137 except AttributeError:
138 encoded_form.append(item)
139
140 return b'\r\n'.join(encoded_form)
141 return '\r\n'.join(form)
142
143 @staticmethod
144 def is_image(stream):
145 """Check if the content file is an image by analyzing its headers.
146
147 Args:
148 stream (:obj:`str`): A str representing the content of a file.
149
150 Returns:
151 :obj:`str`: The str mime-type of an image.
152
153 """
154 image = imghdr.what(None, stream)
155 if image:
156 return 'image/%s' % image
157
158 raise TelegramError('Could not parse file content')
159
160 @staticmethod
161 def is_inputfile(data):
162 """Check if the request is a file request.
163
164 Args:
165 data (Dict[:obj:`str`, :obj:`str`]): A dict of (str, str) key/value pairs.
166
167 Returns:
168 :obj:`bool`
169
170 """
171 if data:
172 file_type = [i for i in iter(data) if i in FILE_TYPES]
173
174 if file_type:
175 file_content = data[file_type[0]]
176
177 return hasattr(file_content, 'read')
178
179 return False
180
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/telegram/files/inputfile.py b/telegram/files/inputfile.py
--- a/telegram/files/inputfile.py
+++ b/telegram/files/inputfile.py
@@ -70,7 +70,9 @@
self.input_file_content = self.input_file.read()
if 'filename' in data:
self.filename = self.data.pop('filename')
- elif hasattr(self.input_file, 'name'):
+ elif (hasattr(self.input_file, 'name') and
+ not isinstance(self.input_file.name, int) and # py3
+ self.input_file.name != '<fdopen>'): # py2
# on py2.7, pylint fails to understand this properly
# pylint: disable=E1101
self.filename = os.path.basename(self.input_file.name)
|
{"golden_diff": "diff --git a/telegram/files/inputfile.py b/telegram/files/inputfile.py\n--- a/telegram/files/inputfile.py\n+++ b/telegram/files/inputfile.py\n@@ -70,7 +70,9 @@\n self.input_file_content = self.input_file.read()\n if 'filename' in data:\n self.filename = self.data.pop('filename')\n- elif hasattr(self.input_file, 'name'):\n+ elif (hasattr(self.input_file, 'name') and\n+ not isinstance(self.input_file.name, int) and # py3\n+ self.input_file.name != '<fdopen>'): # py2\n # on py2.7, pylint fails to understand this properly\n # pylint: disable=E1101\n self.filename = os.path.basename(self.input_file.name)\n", "issue": "Allow subprocess.Popen.stdout as file input\n<!--\r\nThanks for reporting issues of python-telegram-bot!\r\n\r\nUse this template to notify us if you found a bug, or if you want to request a new feature.\r\nIf you're looking for help with programming your bot using our library, feel free to ask your\r\nquestions in out telegram group at: https://t.me/pythontelegrambotgroup\r\n\r\nTo make it easier for us to help you please enter detailed information below.\r\n\r\nPlease note, we only support the latest version of python-telegram-bot and\r\nmaster branch. Please make sure to upgrade & recreate the issue on the latest\r\nversion prior to opening an issue.\r\n-->\r\n### Steps to reproduce\r\n1.\r\n\r\n```python\r\n@run_async\r\ndef pipe(bot, update):\r\n proc = subprocess.Popen(\r\n \"cat image.jpg\".split(),\r\n stdout=subprocess.PIPE\r\n )\r\n update.message.reply_voice(proc.stdout)\r\n while proc.returncode is None:\r\n proc.poll()\r\n```\r\n\r\n### Expected behaviour\r\n\r\nAllow `subprocess.Popen.stdout` be treated as valid io stream, just like `open(\"image.jpg\", \"rb\")` is.\r\n\r\n### Actual behaviour\r\n\r\n```\r\n...\r\n File \"/usr/lib/python3.6/site-packages/telegram/utils/request.py\", line 262, in post\r\n data = InputFile(data)\r\n File \"/usr/lib/python3.6/site-packages/telegram/files/inputfile.py\", line 76, in __init__\r\n self.filename = os.path.basename(self.input_file.name)\r\n File \"/usr/lib/python3.6/posixpath.py\", line 144, in basename\r\n p = os.fspath(p)\r\nTypeError: expected str, bytes or os.PathLike object, not int\r\n```\r\n\r\n### Configuration\r\n**Operating System:**\r\n\r\n`Linux xiaomi 4.15.9-1-ARCH #1 SMP PREEMPT Sun Mar 11 17:54:33 UTC 2018 x86_64 GNU/Linux`\r\n\r\n**Version of Python, python-telegram-bot & dependencies:**\r\n\r\n```\r\nPython 3.6.4 (default, Jan 5 2018, 02:35:40) \r\n[GCC 7.2.1 20171224] on linux\r\n```\r\n\r\n### PROPOSED FIX\r\n\r\nhttps://github.com/python-telegram-bot/python-telegram-bot/blob/master/telegram/files/inputfile.py#L76 should stringify `self.input_file.name` to look as:\r\n\r\n```python\r\nself.filename = os.path.basename(str(self.input_file.name))\r\n```\n", "before_files": [{"content": "#!/usr/bin/env python\n# pylint: disable=W0622,E0611\n#\n# A library that provides a Python interface to the Telegram Bot API\n# Copyright (C) 2015-2018\n# Leandro Toledo de Souza <devs@python-telegram-bot.org>\n#\n# This program is free software: you can redistribute it and/or modify\n# it under the terms of the GNU Lesser Public License as published by\n# the Free Software Foundation, either version 3 of the License, or\n# (at your option) any later version.\n#\n# This program is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n# GNU Lesser Public License for more details.\n#\n# You should have received a copy of the GNU Lesser Public License\n# along with this program. If not, see [http://www.gnu.org/licenses/].\n\"\"\"This module contains an object that represents a Telegram InputFile.\"\"\"\n\ntry:\n # python 3\n from email.generator import _make_boundary as choose_boundary\nexcept ImportError:\n # python 2\n from mimetools import choose_boundary\n\nimport imghdr\nimport mimetypes\nimport os\nimport sys\n\nfrom telegram import TelegramError\n\nDEFAULT_MIME_TYPE = 'application/octet-stream'\nUSER_AGENT = 'Python Telegram Bot (https://github.com/python-telegram-bot/python-telegram-bot)'\nFILE_TYPES = ('audio', 'document', 'photo', 'sticker', 'video', 'voice', 'certificate',\n 'video_note', 'png_sticker')\n\n\nclass InputFile(object):\n \"\"\"This object represents a Telegram InputFile.\n\n Attributes:\n data (:obj:`dict`): Data containing an inputfile.\n\n Args:\n data (:obj:`dict`): Data containing an inputfile.\n\n Raises:\n TelegramError\n\n \"\"\"\n\n def __init__(self, data):\n self.data = data\n self.boundary = choose_boundary()\n\n for t in FILE_TYPES:\n if t in data:\n self.input_name = t\n self.input_file = data.pop(t)\n break\n else:\n raise TelegramError('Unknown inputfile type')\n\n if hasattr(self.input_file, 'read'):\n self.filename = None\n self.input_file_content = self.input_file.read()\n if 'filename' in data:\n self.filename = self.data.pop('filename')\n elif hasattr(self.input_file, 'name'):\n # on py2.7, pylint fails to understand this properly\n # pylint: disable=E1101\n self.filename = os.path.basename(self.input_file.name)\n\n try:\n self.mimetype = self.is_image(self.input_file_content)\n if not self.filename or '.' not in self.filename:\n self.filename = self.mimetype.replace('/', '.')\n except TelegramError:\n if self.filename:\n self.mimetype = mimetypes.guess_type(\n self.filename)[0] or DEFAULT_MIME_TYPE\n else:\n self.mimetype = DEFAULT_MIME_TYPE\n\n @property\n def headers(self):\n \"\"\":obj:`dict`: Headers.\"\"\"\n\n return {'User-agent': USER_AGENT, 'Content-type': self.content_type}\n\n @property\n def content_type(self):\n \"\"\":obj:`str`: Content type\"\"\"\n return 'multipart/form-data; boundary=%s' % self.boundary\n\n def to_form(self):\n \"\"\"Transform the inputfile to multipart/form data.\n\n Returns:\n :obj:`str`\n\n \"\"\"\n form = []\n form_boundary = '--' + self.boundary\n\n # Add data fields\n for name in iter(self.data):\n value = self.data[name]\n form.extend([\n form_boundary, 'Content-Disposition: form-data; name=\"%s\"' % name, '', str(value)\n ])\n\n # Add input_file to upload\n form.extend([\n form_boundary, 'Content-Disposition: form-data; name=\"%s\"; filename=\"%s\"' %\n (self.input_name,\n self.filename), 'Content-Type: %s' % self.mimetype, '', self.input_file_content\n ])\n\n form.append('--' + self.boundary + '--')\n form.append('')\n\n return self._parse(form)\n\n @staticmethod\n def _parse(form):\n if sys.version_info > (3,):\n # on Python 3 form needs to be byte encoded\n encoded_form = []\n for item in form:\n try:\n encoded_form.append(item.encode())\n except AttributeError:\n encoded_form.append(item)\n\n return b'\\r\\n'.join(encoded_form)\n return '\\r\\n'.join(form)\n\n @staticmethod\n def is_image(stream):\n \"\"\"Check if the content file is an image by analyzing its headers.\n\n Args:\n stream (:obj:`str`): A str representing the content of a file.\n\n Returns:\n :obj:`str`: The str mime-type of an image.\n\n \"\"\"\n image = imghdr.what(None, stream)\n if image:\n return 'image/%s' % image\n\n raise TelegramError('Could not parse file content')\n\n @staticmethod\n def is_inputfile(data):\n \"\"\"Check if the request is a file request.\n\n Args:\n data (Dict[:obj:`str`, :obj:`str`]): A dict of (str, str) key/value pairs.\n\n Returns:\n :obj:`bool`\n\n \"\"\"\n if data:\n file_type = [i for i in iter(data) if i in FILE_TYPES]\n\n if file_type:\n file_content = data[file_type[0]]\n\n return hasattr(file_content, 'read')\n\n return False\n", "path": "telegram/files/inputfile.py"}], "after_files": [{"content": "#!/usr/bin/env python\n# pylint: disable=W0622,E0611\n#\n# A library that provides a Python interface to the Telegram Bot API\n# Copyright (C) 2015-2018\n# Leandro Toledo de Souza <devs@python-telegram-bot.org>\n#\n# This program is free software: you can redistribute it and/or modify\n# it under the terms of the GNU Lesser Public License as published by\n# the Free Software Foundation, either version 3 of the License, or\n# (at your option) any later version.\n#\n# This program is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n# GNU Lesser Public License for more details.\n#\n# You should have received a copy of the GNU Lesser Public License\n# along with this program. If not, see [http://www.gnu.org/licenses/].\n\"\"\"This module contains an object that represents a Telegram InputFile.\"\"\"\n\ntry:\n # python 3\n from email.generator import _make_boundary as choose_boundary\nexcept ImportError:\n # python 2\n from mimetools import choose_boundary\n\nimport imghdr\nimport mimetypes\nimport os\nimport sys\n\nfrom telegram import TelegramError\n\nDEFAULT_MIME_TYPE = 'application/octet-stream'\nUSER_AGENT = 'Python Telegram Bot (https://github.com/python-telegram-bot/python-telegram-bot)'\nFILE_TYPES = ('audio', 'document', 'photo', 'sticker', 'video', 'voice', 'certificate',\n 'video_note', 'png_sticker')\n\n\nclass InputFile(object):\n \"\"\"This object represents a Telegram InputFile.\n\n Attributes:\n data (:obj:`dict`): Data containing an inputfile.\n\n Args:\n data (:obj:`dict`): Data containing an inputfile.\n\n Raises:\n TelegramError\n\n \"\"\"\n\n def __init__(self, data):\n self.data = data\n self.boundary = choose_boundary()\n\n for t in FILE_TYPES:\n if t in data:\n self.input_name = t\n self.input_file = data.pop(t)\n break\n else:\n raise TelegramError('Unknown inputfile type')\n\n if hasattr(self.input_file, 'read'):\n self.filename = None\n self.input_file_content = self.input_file.read()\n if 'filename' in data:\n self.filename = self.data.pop('filename')\n elif (hasattr(self.input_file, 'name') and\n not isinstance(self.input_file.name, int) and # py3\n self.input_file.name != '<fdopen>'): # py2\n # on py2.7, pylint fails to understand this properly\n # pylint: disable=E1101\n self.filename = os.path.basename(self.input_file.name)\n\n try:\n self.mimetype = self.is_image(self.input_file_content)\n if not self.filename or '.' not in self.filename:\n self.filename = self.mimetype.replace('/', '.')\n except TelegramError:\n if self.filename:\n self.mimetype = mimetypes.guess_type(\n self.filename)[0] or DEFAULT_MIME_TYPE\n else:\n self.mimetype = DEFAULT_MIME_TYPE\n\n @property\n def headers(self):\n \"\"\":obj:`dict`: Headers.\"\"\"\n\n return {'User-agent': USER_AGENT, 'Content-type': self.content_type}\n\n @property\n def content_type(self):\n \"\"\":obj:`str`: Content type\"\"\"\n return 'multipart/form-data; boundary=%s' % self.boundary\n\n def to_form(self):\n \"\"\"Transform the inputfile to multipart/form data.\n\n Returns:\n :obj:`str`\n\n \"\"\"\n form = []\n form_boundary = '--' + self.boundary\n\n # Add data fields\n for name in iter(self.data):\n value = self.data[name]\n form.extend([\n form_boundary, 'Content-Disposition: form-data; name=\"%s\"' % name, '', str(value)\n ])\n\n # Add input_file to upload\n form.extend([\n form_boundary, 'Content-Disposition: form-data; name=\"%s\"; filename=\"%s\"' %\n (self.input_name,\n self.filename), 'Content-Type: %s' % self.mimetype, '', self.input_file_content\n ])\n\n form.append('--' + self.boundary + '--')\n form.append('')\n\n return self._parse(form)\n\n @staticmethod\n def _parse(form):\n if sys.version_info > (3,):\n # on Python 3 form needs to be byte encoded\n encoded_form = []\n for item in form:\n try:\n encoded_form.append(item.encode())\n except AttributeError:\n encoded_form.append(item)\n\n return b'\\r\\n'.join(encoded_form)\n return '\\r\\n'.join(form)\n\n @staticmethod\n def is_image(stream):\n \"\"\"Check if the content file is an image by analyzing its headers.\n\n Args:\n stream (:obj:`str`): A str representing the content of a file.\n\n Returns:\n :obj:`str`: The str mime-type of an image.\n\n \"\"\"\n image = imghdr.what(None, stream)\n if image:\n return 'image/%s' % image\n\n raise TelegramError('Could not parse file content')\n\n @staticmethod\n def is_inputfile(data):\n \"\"\"Check if the request is a file request.\n\n Args:\n data (Dict[:obj:`str`, :obj:`str`]): A dict of (str, str) key/value pairs.\n\n Returns:\n :obj:`bool`\n\n \"\"\"\n if data:\n file_type = [i for i in iter(data) if i in FILE_TYPES]\n\n if file_type:\n file_content = data[file_type[0]]\n\n return hasattr(file_content, 'read')\n\n return False\n", "path": "telegram/files/inputfile.py"}]}
| 2,475 | 175 |
gh_patches_debug_43176
|
rasdani/github-patches
|
git_diff
|
encode__starlette-134
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Support `shutdown` as a synonym for `cleanup`
* Support either `cleanup` or `shutdown` as the ASGI lifespan message name.
* Update uvicorn to move to shutdown - https://github.com/encode/uvicorn/issues/233
* Finally, after a small period of time, drop `cleanup`
Easy PR for a contributor to jump on would be addressing the first part of this, and supporting either name.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `starlette/lifespan.py`
Content:
```
1 import asyncio
2 import logging
3 import traceback
4 import typing
5 from types import TracebackType
6 from starlette.types import ASGIApp, ASGIInstance, Receive, Message, Send
7
8
9 STATE_TRANSITION_ERROR = "Got invalid state transition on lifespan protocol."
10
11
12 class LifespanHandler:
13 def __init__(self) -> None:
14 self.startup_handlers = [] # type: typing.List[typing.Callable]
15 self.cleanup_handlers = [] # type: typing.List[typing.Callable]
16
17 def on_event(self, event_type: str) -> typing.Callable:
18 def decorator(func: typing.Callable) -> typing.Callable:
19 self.add_event_handler(event_type, func)
20 return func
21
22 return decorator
23
24 def add_event_handler(self, event_type: str, func: typing.Callable) -> None:
25 assert event_type in ("startup", "cleanup")
26
27 if event_type == "startup":
28 self.startup_handlers.append(func)
29 else:
30 self.cleanup_handlers.append(func)
31
32 async def run_startup(self) -> None:
33 for handler in self.startup_handlers:
34 if asyncio.iscoroutinefunction(handler):
35 await handler()
36 else:
37 handler()
38
39 async def run_cleanup(self) -> None:
40 for handler in self.cleanup_handlers:
41 if asyncio.iscoroutinefunction(handler):
42 await handler()
43 else:
44 handler()
45
46 def __call__(self, scope: Message) -> ASGIInstance:
47 assert scope["type"] == "lifespan"
48 return self.run_lifespan
49
50 async def run_lifespan(self, receive: Receive, send: Send) -> None:
51 message = await receive()
52 assert message["type"] == "lifespan.startup"
53 await self.run_startup()
54 await send({"type": "lifespan.startup.complete"})
55 message = await receive()
56 assert message["type"] == "lifespan.cleanup"
57 await self.run_cleanup()
58 await send({"type": "lifespan.cleanup.complete"})
59
60
61 class LifespanContext:
62 def __init__(
63 self, app: ASGIApp, startup_timeout: int = 10, cleanup_timeout: int = 10
64 ) -> None:
65 self.startup_timeout = startup_timeout
66 self.cleanup_timeout = cleanup_timeout
67 self.startup_event = asyncio.Event()
68 self.cleanup_event = asyncio.Event()
69 self.receive_queue = asyncio.Queue() # type: asyncio.Queue
70 self.asgi = app({"type": "lifespan"}) # type: ASGIInstance
71
72 def __enter__(self) -> None:
73 loop = asyncio.get_event_loop()
74 loop.create_task(self.run_lifespan())
75 loop.run_until_complete(self.wait_startup())
76
77 def __exit__(
78 self,
79 exc_type: typing.Type[BaseException],
80 exc: BaseException,
81 tb: TracebackType,
82 ) -> None:
83 loop = asyncio.get_event_loop()
84 loop.run_until_complete(self.wait_cleanup())
85
86 async def run_lifespan(self) -> None:
87 try:
88 await self.asgi(self.receive, self.send)
89 finally:
90 self.startup_event.set()
91 self.cleanup_event.set()
92
93 async def send(self, message: Message) -> None:
94 if message["type"] == "lifespan.startup.complete":
95 assert not self.startup_event.is_set(), STATE_TRANSITION_ERROR
96 assert not self.cleanup_event.is_set(), STATE_TRANSITION_ERROR
97 self.startup_event.set()
98 else:
99 assert message["type"] == "lifespan.cleanup.complete"
100 assert self.startup_event.is_set(), STATE_TRANSITION_ERROR
101 assert not self.cleanup_event.is_set(), STATE_TRANSITION_ERROR
102 self.cleanup_event.set()
103
104 async def receive(self) -> Message:
105 return await self.receive_queue.get()
106
107 async def wait_startup(self) -> None:
108 await self.receive_queue.put({"type": "lifespan.startup"})
109 await asyncio.wait_for(self.startup_event.wait(), timeout=self.startup_timeout)
110
111 async def wait_cleanup(self) -> None:
112 await self.receive_queue.put({"type": "lifespan.cleanup"})
113 await asyncio.wait_for(self.cleanup_event.wait(), timeout=self.cleanup_timeout)
114
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/starlette/lifespan.py b/starlette/lifespan.py
--- a/starlette/lifespan.py
+++ b/starlette/lifespan.py
@@ -22,7 +22,7 @@
return decorator
def add_event_handler(self, event_type: str, func: typing.Callable) -> None:
- assert event_type in ("startup", "cleanup")
+ assert event_type in ("startup", "shutdown", "cleanup")
if event_type == "startup":
self.startup_handlers.append(func)
@@ -53,19 +53,26 @@
await self.run_startup()
await send({"type": "lifespan.startup.complete"})
message = await receive()
- assert message["type"] == "lifespan.cleanup"
+ assert (
+ message["type"] == "lifespan.shutdown"
+ or message["type"] == "lifespan.cleanup"
+ )
await self.run_cleanup()
- await send({"type": "lifespan.cleanup.complete"})
+ if message["type"] == "lifespan.shutdown":
+ await send({"type": "lifespan.shutdown.complete"})
+
+ if message["type"] == "lifespan.cleanup":
+ await send({"type": "lifespan.cleanup.complete"}) # pragma: no cover
class LifespanContext:
def __init__(
- self, app: ASGIApp, startup_timeout: int = 10, cleanup_timeout: int = 10
+ self, app: ASGIApp, startup_timeout: int = 10, shutdown_timeout: int = 10
) -> None:
self.startup_timeout = startup_timeout
- self.cleanup_timeout = cleanup_timeout
+ self.shutdown_timeout = shutdown_timeout
self.startup_event = asyncio.Event()
- self.cleanup_event = asyncio.Event()
+ self.shutdown_event = asyncio.Event()
self.receive_queue = asyncio.Queue() # type: asyncio.Queue
self.asgi = app({"type": "lifespan"}) # type: ASGIInstance
@@ -81,25 +88,25 @@
tb: TracebackType,
) -> None:
loop = asyncio.get_event_loop()
- loop.run_until_complete(self.wait_cleanup())
+ loop.run_until_complete(self.wait_shutdown())
async def run_lifespan(self) -> None:
try:
await self.asgi(self.receive, self.send)
finally:
self.startup_event.set()
- self.cleanup_event.set()
+ self.shutdown_event.set()
async def send(self, message: Message) -> None:
if message["type"] == "lifespan.startup.complete":
assert not self.startup_event.is_set(), STATE_TRANSITION_ERROR
- assert not self.cleanup_event.is_set(), STATE_TRANSITION_ERROR
+ assert not self.shutdown_event.is_set(), STATE_TRANSITION_ERROR
self.startup_event.set()
else:
- assert message["type"] == "lifespan.cleanup.complete"
+ assert message["type"] == "lifespan.shutdown.complete"
assert self.startup_event.is_set(), STATE_TRANSITION_ERROR
- assert not self.cleanup_event.is_set(), STATE_TRANSITION_ERROR
- self.cleanup_event.set()
+ assert not self.shutdown_event.is_set(), STATE_TRANSITION_ERROR
+ self.shutdown_event.set()
async def receive(self) -> Message:
return await self.receive_queue.get()
@@ -108,6 +115,8 @@
await self.receive_queue.put({"type": "lifespan.startup"})
await asyncio.wait_for(self.startup_event.wait(), timeout=self.startup_timeout)
- async def wait_cleanup(self) -> None:
- await self.receive_queue.put({"type": "lifespan.cleanup"})
- await asyncio.wait_for(self.cleanup_event.wait(), timeout=self.cleanup_timeout)
+ async def wait_shutdown(self) -> None:
+ await self.receive_queue.put({"type": "lifespan.shutdown"})
+ await asyncio.wait_for(
+ self.shutdown_event.wait(), timeout=self.shutdown_timeout
+ )
|
{"golden_diff": "diff --git a/starlette/lifespan.py b/starlette/lifespan.py\n--- a/starlette/lifespan.py\n+++ b/starlette/lifespan.py\n@@ -22,7 +22,7 @@\n return decorator\n \n def add_event_handler(self, event_type: str, func: typing.Callable) -> None:\n- assert event_type in (\"startup\", \"cleanup\")\n+ assert event_type in (\"startup\", \"shutdown\", \"cleanup\")\n \n if event_type == \"startup\":\n self.startup_handlers.append(func)\n@@ -53,19 +53,26 @@\n await self.run_startup()\n await send({\"type\": \"lifespan.startup.complete\"})\n message = await receive()\n- assert message[\"type\"] == \"lifespan.cleanup\"\n+ assert (\n+ message[\"type\"] == \"lifespan.shutdown\"\n+ or message[\"type\"] == \"lifespan.cleanup\"\n+ )\n await self.run_cleanup()\n- await send({\"type\": \"lifespan.cleanup.complete\"})\n+ if message[\"type\"] == \"lifespan.shutdown\":\n+ await send({\"type\": \"lifespan.shutdown.complete\"})\n+\n+ if message[\"type\"] == \"lifespan.cleanup\":\n+ await send({\"type\": \"lifespan.cleanup.complete\"}) # pragma: no cover\n \n \n class LifespanContext:\n def __init__(\n- self, app: ASGIApp, startup_timeout: int = 10, cleanup_timeout: int = 10\n+ self, app: ASGIApp, startup_timeout: int = 10, shutdown_timeout: int = 10\n ) -> None:\n self.startup_timeout = startup_timeout\n- self.cleanup_timeout = cleanup_timeout\n+ self.shutdown_timeout = shutdown_timeout\n self.startup_event = asyncio.Event()\n- self.cleanup_event = asyncio.Event()\n+ self.shutdown_event = asyncio.Event()\n self.receive_queue = asyncio.Queue() # type: asyncio.Queue\n self.asgi = app({\"type\": \"lifespan\"}) # type: ASGIInstance\n \n@@ -81,25 +88,25 @@\n tb: TracebackType,\n ) -> None:\n loop = asyncio.get_event_loop()\n- loop.run_until_complete(self.wait_cleanup())\n+ loop.run_until_complete(self.wait_shutdown())\n \n async def run_lifespan(self) -> None:\n try:\n await self.asgi(self.receive, self.send)\n finally:\n self.startup_event.set()\n- self.cleanup_event.set()\n+ self.shutdown_event.set()\n \n async def send(self, message: Message) -> None:\n if message[\"type\"] == \"lifespan.startup.complete\":\n assert not self.startup_event.is_set(), STATE_TRANSITION_ERROR\n- assert not self.cleanup_event.is_set(), STATE_TRANSITION_ERROR\n+ assert not self.shutdown_event.is_set(), STATE_TRANSITION_ERROR\n self.startup_event.set()\n else:\n- assert message[\"type\"] == \"lifespan.cleanup.complete\"\n+ assert message[\"type\"] == \"lifespan.shutdown.complete\"\n assert self.startup_event.is_set(), STATE_TRANSITION_ERROR\n- assert not self.cleanup_event.is_set(), STATE_TRANSITION_ERROR\n- self.cleanup_event.set()\n+ assert not self.shutdown_event.is_set(), STATE_TRANSITION_ERROR\n+ self.shutdown_event.set()\n \n async def receive(self) -> Message:\n return await self.receive_queue.get()\n@@ -108,6 +115,8 @@\n await self.receive_queue.put({\"type\": \"lifespan.startup\"})\n await asyncio.wait_for(self.startup_event.wait(), timeout=self.startup_timeout)\n \n- async def wait_cleanup(self) -> None:\n- await self.receive_queue.put({\"type\": \"lifespan.cleanup\"})\n- await asyncio.wait_for(self.cleanup_event.wait(), timeout=self.cleanup_timeout)\n+ async def wait_shutdown(self) -> None:\n+ await self.receive_queue.put({\"type\": \"lifespan.shutdown\"})\n+ await asyncio.wait_for(\n+ self.shutdown_event.wait(), timeout=self.shutdown_timeout\n+ )\n", "issue": "Support `shutdown` as a synonym for `cleanup`\n* Support either `cleanup` or `shutdown` as the ASGI lifespan message name.\r\n* Update uvicorn to move to shutdown - https://github.com/encode/uvicorn/issues/233\r\n* Finally, after a small period of time, drop `cleanup`\r\n\r\nEasy PR for a contributor to jump on would be addressing the first part of this, and supporting either name.\n", "before_files": [{"content": "import asyncio\nimport logging\nimport traceback\nimport typing\nfrom types import TracebackType\nfrom starlette.types import ASGIApp, ASGIInstance, Receive, Message, Send\n\n\nSTATE_TRANSITION_ERROR = \"Got invalid state transition on lifespan protocol.\"\n\n\nclass LifespanHandler:\n def __init__(self) -> None:\n self.startup_handlers = [] # type: typing.List[typing.Callable]\n self.cleanup_handlers = [] # type: typing.List[typing.Callable]\n\n def on_event(self, event_type: str) -> typing.Callable:\n def decorator(func: typing.Callable) -> typing.Callable:\n self.add_event_handler(event_type, func)\n return func\n\n return decorator\n\n def add_event_handler(self, event_type: str, func: typing.Callable) -> None:\n assert event_type in (\"startup\", \"cleanup\")\n\n if event_type == \"startup\":\n self.startup_handlers.append(func)\n else:\n self.cleanup_handlers.append(func)\n\n async def run_startup(self) -> None:\n for handler in self.startup_handlers:\n if asyncio.iscoroutinefunction(handler):\n await handler()\n else:\n handler()\n\n async def run_cleanup(self) -> None:\n for handler in self.cleanup_handlers:\n if asyncio.iscoroutinefunction(handler):\n await handler()\n else:\n handler()\n\n def __call__(self, scope: Message) -> ASGIInstance:\n assert scope[\"type\"] == \"lifespan\"\n return self.run_lifespan\n\n async def run_lifespan(self, receive: Receive, send: Send) -> None:\n message = await receive()\n assert message[\"type\"] == \"lifespan.startup\"\n await self.run_startup()\n await send({\"type\": \"lifespan.startup.complete\"})\n message = await receive()\n assert message[\"type\"] == \"lifespan.cleanup\"\n await self.run_cleanup()\n await send({\"type\": \"lifespan.cleanup.complete\"})\n\n\nclass LifespanContext:\n def __init__(\n self, app: ASGIApp, startup_timeout: int = 10, cleanup_timeout: int = 10\n ) -> None:\n self.startup_timeout = startup_timeout\n self.cleanup_timeout = cleanup_timeout\n self.startup_event = asyncio.Event()\n self.cleanup_event = asyncio.Event()\n self.receive_queue = asyncio.Queue() # type: asyncio.Queue\n self.asgi = app({\"type\": \"lifespan\"}) # type: ASGIInstance\n\n def __enter__(self) -> None:\n loop = asyncio.get_event_loop()\n loop.create_task(self.run_lifespan())\n loop.run_until_complete(self.wait_startup())\n\n def __exit__(\n self,\n exc_type: typing.Type[BaseException],\n exc: BaseException,\n tb: TracebackType,\n ) -> None:\n loop = asyncio.get_event_loop()\n loop.run_until_complete(self.wait_cleanup())\n\n async def run_lifespan(self) -> None:\n try:\n await self.asgi(self.receive, self.send)\n finally:\n self.startup_event.set()\n self.cleanup_event.set()\n\n async def send(self, message: Message) -> None:\n if message[\"type\"] == \"lifespan.startup.complete\":\n assert not self.startup_event.is_set(), STATE_TRANSITION_ERROR\n assert not self.cleanup_event.is_set(), STATE_TRANSITION_ERROR\n self.startup_event.set()\n else:\n assert message[\"type\"] == \"lifespan.cleanup.complete\"\n assert self.startup_event.is_set(), STATE_TRANSITION_ERROR\n assert not self.cleanup_event.is_set(), STATE_TRANSITION_ERROR\n self.cleanup_event.set()\n\n async def receive(self) -> Message:\n return await self.receive_queue.get()\n\n async def wait_startup(self) -> None:\n await self.receive_queue.put({\"type\": \"lifespan.startup\"})\n await asyncio.wait_for(self.startup_event.wait(), timeout=self.startup_timeout)\n\n async def wait_cleanup(self) -> None:\n await self.receive_queue.put({\"type\": \"lifespan.cleanup\"})\n await asyncio.wait_for(self.cleanup_event.wait(), timeout=self.cleanup_timeout)\n", "path": "starlette/lifespan.py"}], "after_files": [{"content": "import asyncio\nimport logging\nimport traceback\nimport typing\nfrom types import TracebackType\nfrom starlette.types import ASGIApp, ASGIInstance, Receive, Message, Send\n\n\nSTATE_TRANSITION_ERROR = \"Got invalid state transition on lifespan protocol.\"\n\n\nclass LifespanHandler:\n def __init__(self) -> None:\n self.startup_handlers = [] # type: typing.List[typing.Callable]\n self.cleanup_handlers = [] # type: typing.List[typing.Callable]\n\n def on_event(self, event_type: str) -> typing.Callable:\n def decorator(func: typing.Callable) -> typing.Callable:\n self.add_event_handler(event_type, func)\n return func\n\n return decorator\n\n def add_event_handler(self, event_type: str, func: typing.Callable) -> None:\n assert event_type in (\"startup\", \"shutdown\", \"cleanup\")\n\n if event_type == \"startup\":\n self.startup_handlers.append(func)\n else:\n self.cleanup_handlers.append(func)\n\n async def run_startup(self) -> None:\n for handler in self.startup_handlers:\n if asyncio.iscoroutinefunction(handler):\n await handler()\n else:\n handler()\n\n async def run_cleanup(self) -> None:\n for handler in self.cleanup_handlers:\n if asyncio.iscoroutinefunction(handler):\n await handler()\n else:\n handler()\n\n def __call__(self, scope: Message) -> ASGIInstance:\n assert scope[\"type\"] == \"lifespan\"\n return self.run_lifespan\n\n async def run_lifespan(self, receive: Receive, send: Send) -> None:\n message = await receive()\n assert message[\"type\"] == \"lifespan.startup\"\n await self.run_startup()\n await send({\"type\": \"lifespan.startup.complete\"})\n message = await receive()\n assert (\n message[\"type\"] == \"lifespan.shutdown\"\n or message[\"type\"] == \"lifespan.cleanup\"\n )\n await self.run_cleanup()\n if message[\"type\"] == \"lifespan.shutdown\":\n await send({\"type\": \"lifespan.shutdown.complete\"})\n\n if message[\"type\"] == \"lifespan.cleanup\":\n await send({\"type\": \"lifespan.cleanup.complete\"}) # pragma: no cover\n\n\nclass LifespanContext:\n def __init__(\n self, app: ASGIApp, startup_timeout: int = 10, shutdown_timeout: int = 10\n ) -> None:\n self.startup_timeout = startup_timeout\n self.shutdown_timeout = shutdown_timeout\n self.startup_event = asyncio.Event()\n self.shutdown_event = asyncio.Event()\n self.receive_queue = asyncio.Queue() # type: asyncio.Queue\n self.asgi = app({\"type\": \"lifespan\"}) # type: ASGIInstance\n\n def __enter__(self) -> None:\n loop = asyncio.get_event_loop()\n loop.create_task(self.run_lifespan())\n loop.run_until_complete(self.wait_startup())\n\n def __exit__(\n self,\n exc_type: typing.Type[BaseException],\n exc: BaseException,\n tb: TracebackType,\n ) -> None:\n loop = asyncio.get_event_loop()\n loop.run_until_complete(self.wait_shutdown())\n\n async def run_lifespan(self) -> None:\n try:\n await self.asgi(self.receive, self.send)\n finally:\n self.startup_event.set()\n self.shutdown_event.set()\n\n async def send(self, message: Message) -> None:\n if message[\"type\"] == \"lifespan.startup.complete\":\n assert not self.startup_event.is_set(), STATE_TRANSITION_ERROR\n assert not self.shutdown_event.is_set(), STATE_TRANSITION_ERROR\n self.startup_event.set()\n else:\n assert message[\"type\"] == \"lifespan.shutdown.complete\"\n assert self.startup_event.is_set(), STATE_TRANSITION_ERROR\n assert not self.shutdown_event.is_set(), STATE_TRANSITION_ERROR\n self.shutdown_event.set()\n\n async def receive(self) -> Message:\n return await self.receive_queue.get()\n\n async def wait_startup(self) -> None:\n await self.receive_queue.put({\"type\": \"lifespan.startup\"})\n await asyncio.wait_for(self.startup_event.wait(), timeout=self.startup_timeout)\n\n async def wait_shutdown(self) -> None:\n await self.receive_queue.put({\"type\": \"lifespan.shutdown\"})\n await asyncio.wait_for(\n self.shutdown_event.wait(), timeout=self.shutdown_timeout\n )\n", "path": "starlette/lifespan.py"}]}
| 1,484 | 888 |
gh_patches_debug_32409
|
rasdani/github-patches
|
git_diff
|
plone__Products.CMFPlone-1044
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
HTML filtering requires a "re save" on each page for the settings to take effect.
Using: Plone 5.0b4.dev0 (5007)
Been testing the HTML filtering settings using iFrame video.
When I add an "iframe" video the saved page does not show the video (as expected). But when I toggle ON the "Disable HTML Filtering" checkbox the video still does not display on the page (even after emptying cache). If I edit the video page and immediately "Save" (making no edits) the video displays as expected. The reverse is also true. If I have video correctly displaying on the page (with html filtering disabled) and uncheck "Disable HTML Filtering" checkbox in the HTML Filtering panel, go back to the video page and reload the video still displays. If I edit the page and immediately save the page the video properly does not display. Is this the expected behavior?
In addition. If I add "iFrame" to the "Custom tags" dialogue box I cannot get the video to display with "Disable HTML Filtering" checkbox OFF. Even if I do the "Edit" and "Save" trick from above.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `Products/CMFPlone/controlpanel/browser/filter.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 from Products.CMFCore.utils import getToolByName
3 from Products.CMFPlone import PloneMessageFactory as _ # NOQA
4 from Products.CMFPlone.interfaces import IFilterSchema
5 from Products.Five.browser.pagetemplatefile import ViewPageTemplateFile
6 from Products.statusmessages.interfaces import IStatusMessage
7 from plone.autoform.form import AutoExtensibleForm
8 from plone.z3cform import layout
9 from z3c.form import button
10 from z3c.form import form
11 from Products.PortalTransforms.transforms.safe_html import VALID_TAGS
12 from Products.PortalTransforms.transforms.safe_html import NASTY_TAGS
13
14
15 class FilterControlPanel(AutoExtensibleForm, form.EditForm):
16 id = "FilterControlPanel"
17 label = _(u"HTML Filtering Settings")
18 description = _(
19 'description_html_filtering',
20 default=u"HTML generation is heavily cached across Plone. "
21 u"After changing settings here, you may have to edit "
22 u"existing content to see the changes in these filter settings "
23 u"or restart your server.")
24 schema = IFilterSchema
25 form_name = _(u"HTML Filtering Settings")
26 control_panel_view = "filter-controlpanel"
27
28 def updateActions(self): # NOQA
29 """Have to override this because we only have Save, not Cancel
30 """
31 super(FilterControlPanel, self).updateActions()
32 self.actions['save'].addClass("context")
33
34 @button.buttonAndHandler(_(u"Save"), name='save')
35 def handleSave(self, action): # NOQA
36 data, errors = self.extractData()
37 if errors:
38 self.status = self.formErrorsMessage
39 return
40
41 # Save in portal tools
42 safe_html = getattr(
43 getToolByName(self.context, 'portal_transforms'),
44 'safe_html',
45 None)
46
47 nasty_tags = data['nasty_tags']
48 custom_tags = data['custom_tags']
49 stripped_tags = data['stripped_tags']
50
51 valid = safe_html._config['valid_tags']
52
53 # start with removing tags that do not belong in valid tags
54 for value in nasty_tags + stripped_tags:
55 if value in valid:
56 del valid[value]
57 # add in custom tags
58 for custom in custom_tags:
59 if value not in valid:
60 valid[custom] = 1
61 # then, check if something was previously prevented but is no longer
62 for tag in set(VALID_TAGS.keys()) - set(valid.keys()):
63 if tag not in nasty_tags and tag not in stripped_tags:
64 valid[tag] = VALID_TAGS[tag]
65
66 # nasty tags are simple, just set the value here
67 nasty_value = {}
68 for tag in nasty_tags:
69 nasty_value[tag] = NASTY_TAGS.get(tag, VALID_TAGS.get(tag, 1))
70 safe_html._config['nasty_tags'] = nasty_value
71
72 disable_filtering = int(data['disable_filtering'])
73 if disable_filtering != safe_html._config['disable_transform']:
74 safe_html._config['disable_transform'] = disable_filtering
75
76 for attr in ('stripped_combinations', 'class_blacklist', 'stripped_attributes',
77 'style_whitelist'):
78 value = data[attr]
79 if value is None:
80 if attr == 'stripped_combinations':
81 value = {}
82 else:
83 value = []
84 if value != safe_html._config[attr]:
85 safe_html._config[attr] = value
86
87 # always reload the transform
88 safe_html._p_changed = True
89 safe_html.reload()
90
91 self.applyChanges(data)
92 IStatusMessage(self.request).addStatusMessage(
93 _(u"Changes saved."),
94 "info")
95 self.request.response.redirect(self.request.getURL())
96
97
98 class ControlPanelFormWrapper(layout.FormWrapper):
99 """Use this form as the plone.z3cform layout wrapper to get the control
100 panel layout.
101 """
102 index = ViewPageTemplateFile('filter_controlpanel.pt')
103
104
105 FilterControlPanelView = layout.wrap_form(
106 FilterControlPanel, ControlPanelFormWrapper)
107
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/Products/CMFPlone/controlpanel/browser/filter.py b/Products/CMFPlone/controlpanel/browser/filter.py
--- a/Products/CMFPlone/controlpanel/browser/filter.py
+++ b/Products/CMFPlone/controlpanel/browser/filter.py
@@ -15,12 +15,7 @@
class FilterControlPanel(AutoExtensibleForm, form.EditForm):
id = "FilterControlPanel"
label = _(u"HTML Filtering Settings")
- description = _(
- 'description_html_filtering',
- default=u"HTML generation is heavily cached across Plone. "
- u"After changing settings here, you may have to edit "
- u"existing content to see the changes in these filter settings "
- u"or restart your server.")
+ description = ""
schema = IFilterSchema
form_name = _(u"HTML Filtering Settings")
control_panel_view = "filter-controlpanel"
@@ -73,8 +68,8 @@
if disable_filtering != safe_html._config['disable_transform']:
safe_html._config['disable_transform'] = disable_filtering
- for attr in ('stripped_combinations', 'class_blacklist', 'stripped_attributes',
- 'style_whitelist'):
+ for attr in ('stripped_combinations', 'class_blacklist',
+ 'stripped_attributes', 'style_whitelist'):
value = data[attr]
if value is None:
if attr == 'stripped_combinations':
@@ -92,6 +87,11 @@
IStatusMessage(self.request).addStatusMessage(
_(u"Changes saved."),
"info")
+ IStatusMessage(self.request).addStatusMessage(
+ _(u"HTML generation is heavily cached across Plone. You may "
+ u"have to edit existing content or restart your server to see "
+ u"the changes."),
+ "warning")
self.request.response.redirect(self.request.getURL())
|
{"golden_diff": "diff --git a/Products/CMFPlone/controlpanel/browser/filter.py b/Products/CMFPlone/controlpanel/browser/filter.py\n--- a/Products/CMFPlone/controlpanel/browser/filter.py\n+++ b/Products/CMFPlone/controlpanel/browser/filter.py\n@@ -15,12 +15,7 @@\n class FilterControlPanel(AutoExtensibleForm, form.EditForm):\n id = \"FilterControlPanel\"\n label = _(u\"HTML Filtering Settings\")\n- description = _(\n- 'description_html_filtering',\n- default=u\"HTML generation is heavily cached across Plone. \"\n- u\"After changing settings here, you may have to edit \"\n- u\"existing content to see the changes in these filter settings \"\n- u\"or restart your server.\")\n+ description = \"\"\n schema = IFilterSchema\n form_name = _(u\"HTML Filtering Settings\")\n control_panel_view = \"filter-controlpanel\"\n@@ -73,8 +68,8 @@\n if disable_filtering != safe_html._config['disable_transform']:\n safe_html._config['disable_transform'] = disable_filtering\n \n- for attr in ('stripped_combinations', 'class_blacklist', 'stripped_attributes',\n- 'style_whitelist'):\n+ for attr in ('stripped_combinations', 'class_blacklist',\n+ 'stripped_attributes', 'style_whitelist'):\n value = data[attr]\n if value is None:\n if attr == 'stripped_combinations':\n@@ -92,6 +87,11 @@\n IStatusMessage(self.request).addStatusMessage(\n _(u\"Changes saved.\"),\n \"info\")\n+ IStatusMessage(self.request).addStatusMessage(\n+ _(u\"HTML generation is heavily cached across Plone. You may \"\n+ u\"have to edit existing content or restart your server to see \"\n+ u\"the changes.\"),\n+ \"warning\")\n self.request.response.redirect(self.request.getURL())\n", "issue": "HTML filtering requires a \"re save\" on each page for the settings to take effect. \nUsing: Plone 5.0b4.dev0 (5007)\n\nBeen testing the HTML filtering settings using iFrame video. \nWhen I add an \"iframe\" video the saved page does not show the video (as expected). But when I toggle ON the \"Disable HTML Filtering\" checkbox the video still does not display on the page (even after emptying cache). If I edit the video page and immediately \"Save\" (making no edits) the video displays as expected. The reverse is also true. If I have video correctly displaying on the page (with html filtering disabled) and uncheck \"Disable HTML Filtering\" checkbox in the HTML Filtering panel, go back to the video page and reload the video still displays. If I edit the page and immediately save the page the video properly does not display. Is this the expected behavior?\n\nIn addition. If I add \"iFrame\" to the \"Custom tags\" dialogue box I cannot get the video to display with \"Disable HTML Filtering\" checkbox OFF. Even if I do the \"Edit\" and \"Save\" trick from above.\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\nfrom Products.CMFCore.utils import getToolByName\nfrom Products.CMFPlone import PloneMessageFactory as _ # NOQA\nfrom Products.CMFPlone.interfaces import IFilterSchema\nfrom Products.Five.browser.pagetemplatefile import ViewPageTemplateFile\nfrom Products.statusmessages.interfaces import IStatusMessage\nfrom plone.autoform.form import AutoExtensibleForm\nfrom plone.z3cform import layout\nfrom z3c.form import button\nfrom z3c.form import form\nfrom Products.PortalTransforms.transforms.safe_html import VALID_TAGS\nfrom Products.PortalTransforms.transforms.safe_html import NASTY_TAGS\n\n\nclass FilterControlPanel(AutoExtensibleForm, form.EditForm):\n id = \"FilterControlPanel\"\n label = _(u\"HTML Filtering Settings\")\n description = _(\n 'description_html_filtering',\n default=u\"HTML generation is heavily cached across Plone. \"\n u\"After changing settings here, you may have to edit \"\n u\"existing content to see the changes in these filter settings \"\n u\"or restart your server.\")\n schema = IFilterSchema\n form_name = _(u\"HTML Filtering Settings\")\n control_panel_view = \"filter-controlpanel\"\n\n def updateActions(self): # NOQA\n \"\"\"Have to override this because we only have Save, not Cancel\n \"\"\"\n super(FilterControlPanel, self).updateActions()\n self.actions['save'].addClass(\"context\")\n\n @button.buttonAndHandler(_(u\"Save\"), name='save')\n def handleSave(self, action): # NOQA\n data, errors = self.extractData()\n if errors:\n self.status = self.formErrorsMessage\n return\n\n # Save in portal tools\n safe_html = getattr(\n getToolByName(self.context, 'portal_transforms'),\n 'safe_html',\n None)\n\n nasty_tags = data['nasty_tags']\n custom_tags = data['custom_tags']\n stripped_tags = data['stripped_tags']\n\n valid = safe_html._config['valid_tags']\n\n # start with removing tags that do not belong in valid tags\n for value in nasty_tags + stripped_tags:\n if value in valid:\n del valid[value]\n # add in custom tags\n for custom in custom_tags:\n if value not in valid:\n valid[custom] = 1\n # then, check if something was previously prevented but is no longer\n for tag in set(VALID_TAGS.keys()) - set(valid.keys()):\n if tag not in nasty_tags and tag not in stripped_tags:\n valid[tag] = VALID_TAGS[tag]\n\n # nasty tags are simple, just set the value here\n nasty_value = {}\n for tag in nasty_tags:\n nasty_value[tag] = NASTY_TAGS.get(tag, VALID_TAGS.get(tag, 1))\n safe_html._config['nasty_tags'] = nasty_value\n\n disable_filtering = int(data['disable_filtering'])\n if disable_filtering != safe_html._config['disable_transform']:\n safe_html._config['disable_transform'] = disable_filtering\n\n for attr in ('stripped_combinations', 'class_blacklist', 'stripped_attributes',\n 'style_whitelist'):\n value = data[attr]\n if value is None:\n if attr == 'stripped_combinations':\n value = {}\n else:\n value = []\n if value != safe_html._config[attr]:\n safe_html._config[attr] = value\n\n # always reload the transform\n safe_html._p_changed = True\n safe_html.reload()\n\n self.applyChanges(data)\n IStatusMessage(self.request).addStatusMessage(\n _(u\"Changes saved.\"),\n \"info\")\n self.request.response.redirect(self.request.getURL())\n\n\nclass ControlPanelFormWrapper(layout.FormWrapper):\n \"\"\"Use this form as the plone.z3cform layout wrapper to get the control\n panel layout.\n \"\"\"\n index = ViewPageTemplateFile('filter_controlpanel.pt')\n\n\nFilterControlPanelView = layout.wrap_form(\n FilterControlPanel, ControlPanelFormWrapper)\n", "path": "Products/CMFPlone/controlpanel/browser/filter.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\nfrom Products.CMFCore.utils import getToolByName\nfrom Products.CMFPlone import PloneMessageFactory as _ # NOQA\nfrom Products.CMFPlone.interfaces import IFilterSchema\nfrom Products.Five.browser.pagetemplatefile import ViewPageTemplateFile\nfrom Products.statusmessages.interfaces import IStatusMessage\nfrom plone.autoform.form import AutoExtensibleForm\nfrom plone.z3cform import layout\nfrom z3c.form import button\nfrom z3c.form import form\nfrom Products.PortalTransforms.transforms.safe_html import VALID_TAGS\nfrom Products.PortalTransforms.transforms.safe_html import NASTY_TAGS\n\n\nclass FilterControlPanel(AutoExtensibleForm, form.EditForm):\n id = \"FilterControlPanel\"\n label = _(u\"HTML Filtering Settings\")\n description = \"\"\n schema = IFilterSchema\n form_name = _(u\"HTML Filtering Settings\")\n control_panel_view = \"filter-controlpanel\"\n\n def updateActions(self): # NOQA\n \"\"\"Have to override this because we only have Save, not Cancel\n \"\"\"\n super(FilterControlPanel, self).updateActions()\n self.actions['save'].addClass(\"context\")\n\n @button.buttonAndHandler(_(u\"Save\"), name='save')\n def handleSave(self, action): # NOQA\n data, errors = self.extractData()\n if errors:\n self.status = self.formErrorsMessage\n return\n\n # Save in portal tools\n safe_html = getattr(\n getToolByName(self.context, 'portal_transforms'),\n 'safe_html',\n None)\n\n nasty_tags = data['nasty_tags']\n custom_tags = data['custom_tags']\n stripped_tags = data['stripped_tags']\n\n valid = safe_html._config['valid_tags']\n\n # start with removing tags that do not belong in valid tags\n for value in nasty_tags + stripped_tags:\n if value in valid:\n del valid[value]\n # add in custom tags\n for custom in custom_tags:\n if value not in valid:\n valid[custom] = 1\n # then, check if something was previously prevented but is no longer\n for tag in set(VALID_TAGS.keys()) - set(valid.keys()):\n if tag not in nasty_tags and tag not in stripped_tags:\n valid[tag] = VALID_TAGS[tag]\n\n # nasty tags are simple, just set the value here\n nasty_value = {}\n for tag in nasty_tags:\n nasty_value[tag] = NASTY_TAGS.get(tag, VALID_TAGS.get(tag, 1))\n safe_html._config['nasty_tags'] = nasty_value\n\n disable_filtering = int(data['disable_filtering'])\n if disable_filtering != safe_html._config['disable_transform']:\n safe_html._config['disable_transform'] = disable_filtering\n\n for attr in ('stripped_combinations', 'class_blacklist',\n 'stripped_attributes', 'style_whitelist'):\n value = data[attr]\n if value is None:\n if attr == 'stripped_combinations':\n value = {}\n else:\n value = []\n if value != safe_html._config[attr]:\n safe_html._config[attr] = value\n\n # always reload the transform\n safe_html._p_changed = True\n safe_html.reload()\n\n self.applyChanges(data)\n IStatusMessage(self.request).addStatusMessage(\n _(u\"Changes saved.\"),\n \"info\")\n IStatusMessage(self.request).addStatusMessage(\n _(u\"HTML generation is heavily cached across Plone. You may \"\n u\"have to edit existing content or restart your server to see \"\n u\"the changes.\"),\n \"warning\")\n self.request.response.redirect(self.request.getURL())\n\n\nclass ControlPanelFormWrapper(layout.FormWrapper):\n \"\"\"Use this form as the plone.z3cform layout wrapper to get the control\n panel layout.\n \"\"\"\n index = ViewPageTemplateFile('filter_controlpanel.pt')\n\n\nFilterControlPanelView = layout.wrap_form(\n FilterControlPanel, ControlPanelFormWrapper)\n", "path": "Products/CMFPlone/controlpanel/browser/filter.py"}]}
| 1,597 | 425 |
gh_patches_debug_9181
|
rasdani/github-patches
|
git_diff
|
certbot__certbot-9218
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
snap_config.prepare_env can hang forever if snapd is malfunctioning
Reported at https://community.letsencrypt.org/t/sudo-certbot-renew-hangs-forever/138649
There's some evidence in that thread that `snapd` can get into a state where it just stops responding to commands (whether via `snap`, `snapctl`, or the REST API directly).
Certbot should guard against the possibility of hanging forever, by setting timeouts on relevant network operations:
https://github.com/certbot/certbot/blob/9ca7f76505b10b2f395ddffc4ddc1cbc8afb516b/certbot/certbot/_internal/snap_config.py#L87-L89
https://github.com/certbot/certbot/blob/9ca7f76505b10b2f395ddffc4ddc1cbc8afb516b/certbot/certbot/_internal/snap_config.py#L56-L57
Edit: to clarify, although the reported issue involves the previous shell-based snap wrapper, I have reproduced the same hang and lack of effective timeout with the current Python implementation.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `certbot/certbot/_internal/snap_config.py`
Content:
```
1 """Module configuring Certbot in a snap environment"""
2 import logging
3 import socket
4 from typing import Iterable
5 from typing import List
6 from typing import Optional
7
8 from requests import Session
9 from requests.adapters import HTTPAdapter
10 from requests.exceptions import HTTPError
11 from requests.exceptions import RequestException
12
13 from certbot.compat import os
14 from certbot.errors import Error
15
16 try:
17 from urllib3.connection import HTTPConnection
18 from urllib3.connectionpool import HTTPConnectionPool
19 except ImportError:
20 # Stub imports for oldest requirements, that will never be used in snaps.
21 HTTPConnection = object # type: ignore[misc,assignment]
22 HTTPConnectionPool = object # type: ignore[misc,assignment]
23
24
25 _ARCH_TRIPLET_MAP = {
26 'arm64': 'aarch64-linux-gnu',
27 'armhf': 'arm-linux-gnueabihf',
28 'i386': 'i386-linux-gnu',
29 'ppc64el': 'powerpc64le-linux-gnu',
30 'powerpc': 'powerpc-linux-gnu',
31 'amd64': 'x86_64-linux-gnu',
32 's390x': 's390x-linux-gnu',
33 }
34
35 LOGGER = logging.getLogger(__name__)
36
37
38 def prepare_env(cli_args: List[str]) -> List[str]:
39 """
40 Prepare runtime environment for a certbot execution in snap.
41 :param list cli_args: List of command line arguments
42 :return: Update list of command line arguments
43 :rtype: list
44 """
45 snap_arch = os.environ.get('SNAP_ARCH')
46
47 if snap_arch not in _ARCH_TRIPLET_MAP:
48 raise Error('Unrecognized value of SNAP_ARCH: {0}'.format(snap_arch))
49
50 os.environ['CERTBOT_AUGEAS_PATH'] = '{0}/usr/lib/{1}/libaugeas.so.0'.format(
51 os.environ.get('SNAP'), _ARCH_TRIPLET_MAP[snap_arch])
52
53 with Session() as session:
54 session.mount('http://snapd/', _SnapdAdapter())
55
56 try:
57 response = session.get('http://snapd/v2/connections?snap=certbot&interface=content')
58 response.raise_for_status()
59 except RequestException as e:
60 if isinstance(e, HTTPError) and e.response.status_code == 404:
61 LOGGER.error('An error occurred while fetching Certbot snap plugins: '
62 'your version of snapd is outdated.')
63 LOGGER.error('Please run "sudo snap install core; sudo snap refresh core" '
64 'in your terminal and try again.')
65 else:
66 LOGGER.error('An error occurred while fetching Certbot snap plugins: '
67 'make sure the snapd service is running.')
68 raise e
69
70 data = response.json()
71 connections = ['/snap/{0}/current/lib/python3.8/site-packages/'.format(item['slot']['snap'])
72 for item in data.get('result', {}).get('established', [])
73 if item.get('plug', {}).get('plug') == 'plugin'
74 and item.get('plug-attrs', {}).get('content') == 'certbot-1']
75
76 os.environ['CERTBOT_PLUGIN_PATH'] = ':'.join(connections)
77
78 cli_args.append('--preconfigured-renewal')
79
80 return cli_args
81
82
83 class _SnapdConnection(HTTPConnection):
84 def __init__(self) -> None:
85 super().__init__("localhost")
86 self.sock: Optional[socket.socket] = None
87
88 def connect(self) -> None:
89 self.sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
90 self.sock.connect("/run/snapd.socket")
91
92
93 class _SnapdConnectionPool(HTTPConnectionPool):
94 def __init__(self) -> None:
95 super().__init__("localhost")
96
97 def _new_conn(self) -> _SnapdConnection:
98 return _SnapdConnection()
99
100
101 class _SnapdAdapter(HTTPAdapter):
102 def get_connection(self, url: str,
103 proxies: Optional[Iterable[str]] = None) -> _SnapdConnectionPool:
104 return _SnapdConnectionPool()
105
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/certbot/certbot/_internal/snap_config.py b/certbot/certbot/_internal/snap_config.py
--- a/certbot/certbot/_internal/snap_config.py
+++ b/certbot/certbot/_internal/snap_config.py
@@ -54,7 +54,8 @@
session.mount('http://snapd/', _SnapdAdapter())
try:
- response = session.get('http://snapd/v2/connections?snap=certbot&interface=content')
+ response = session.get('http://snapd/v2/connections?snap=certbot&interface=content',
+ timeout=30.0)
response.raise_for_status()
except RequestException as e:
if isinstance(e, HTTPError) and e.response.status_code == 404:
|
{"golden_diff": "diff --git a/certbot/certbot/_internal/snap_config.py b/certbot/certbot/_internal/snap_config.py\n--- a/certbot/certbot/_internal/snap_config.py\n+++ b/certbot/certbot/_internal/snap_config.py\n@@ -54,7 +54,8 @@\n session.mount('http://snapd/', _SnapdAdapter())\n \n try:\n- response = session.get('http://snapd/v2/connections?snap=certbot&interface=content')\n+ response = session.get('http://snapd/v2/connections?snap=certbot&interface=content',\n+ timeout=30.0)\n response.raise_for_status()\n except RequestException as e:\n if isinstance(e, HTTPError) and e.response.status_code == 404:\n", "issue": "snap_config.prepare_env can hang forever if snapd is malfunctioning\nReported at https://community.letsencrypt.org/t/sudo-certbot-renew-hangs-forever/138649\r\n\r\nThere's some evidence in that thread that `snapd` can get into a state where it just stops responding to commands (whether via `snap`, `snapctl`, or the REST API directly).\r\n\r\nCertbot should guard against the possibility of hanging forever, by setting timeouts on relevant network operations:\r\n\r\nhttps://github.com/certbot/certbot/blob/9ca7f76505b10b2f395ddffc4ddc1cbc8afb516b/certbot/certbot/_internal/snap_config.py#L87-L89\r\n\r\nhttps://github.com/certbot/certbot/blob/9ca7f76505b10b2f395ddffc4ddc1cbc8afb516b/certbot/certbot/_internal/snap_config.py#L56-L57\r\n\r\nEdit: to clarify, although the reported issue involves the previous shell-based snap wrapper, I have reproduced the same hang and lack of effective timeout with the current Python implementation.\n", "before_files": [{"content": "\"\"\"Module configuring Certbot in a snap environment\"\"\"\nimport logging\nimport socket\nfrom typing import Iterable\nfrom typing import List\nfrom typing import Optional\n\nfrom requests import Session\nfrom requests.adapters import HTTPAdapter\nfrom requests.exceptions import HTTPError\nfrom requests.exceptions import RequestException\n\nfrom certbot.compat import os\nfrom certbot.errors import Error\n\ntry:\n from urllib3.connection import HTTPConnection\n from urllib3.connectionpool import HTTPConnectionPool\nexcept ImportError:\n # Stub imports for oldest requirements, that will never be used in snaps.\n HTTPConnection = object # type: ignore[misc,assignment]\n HTTPConnectionPool = object # type: ignore[misc,assignment]\n\n\n_ARCH_TRIPLET_MAP = {\n 'arm64': 'aarch64-linux-gnu',\n 'armhf': 'arm-linux-gnueabihf',\n 'i386': 'i386-linux-gnu',\n 'ppc64el': 'powerpc64le-linux-gnu',\n 'powerpc': 'powerpc-linux-gnu',\n 'amd64': 'x86_64-linux-gnu',\n 's390x': 's390x-linux-gnu',\n}\n\nLOGGER = logging.getLogger(__name__)\n\n\ndef prepare_env(cli_args: List[str]) -> List[str]:\n \"\"\"\n Prepare runtime environment for a certbot execution in snap.\n :param list cli_args: List of command line arguments\n :return: Update list of command line arguments\n :rtype: list\n \"\"\"\n snap_arch = os.environ.get('SNAP_ARCH')\n\n if snap_arch not in _ARCH_TRIPLET_MAP:\n raise Error('Unrecognized value of SNAP_ARCH: {0}'.format(snap_arch))\n\n os.environ['CERTBOT_AUGEAS_PATH'] = '{0}/usr/lib/{1}/libaugeas.so.0'.format(\n os.environ.get('SNAP'), _ARCH_TRIPLET_MAP[snap_arch])\n\n with Session() as session:\n session.mount('http://snapd/', _SnapdAdapter())\n\n try:\n response = session.get('http://snapd/v2/connections?snap=certbot&interface=content')\n response.raise_for_status()\n except RequestException as e:\n if isinstance(e, HTTPError) and e.response.status_code == 404:\n LOGGER.error('An error occurred while fetching Certbot snap plugins: '\n 'your version of snapd is outdated.')\n LOGGER.error('Please run \"sudo snap install core; sudo snap refresh core\" '\n 'in your terminal and try again.')\n else:\n LOGGER.error('An error occurred while fetching Certbot snap plugins: '\n 'make sure the snapd service is running.')\n raise e\n\n data = response.json()\n connections = ['/snap/{0}/current/lib/python3.8/site-packages/'.format(item['slot']['snap'])\n for item in data.get('result', {}).get('established', [])\n if item.get('plug', {}).get('plug') == 'plugin'\n and item.get('plug-attrs', {}).get('content') == 'certbot-1']\n\n os.environ['CERTBOT_PLUGIN_PATH'] = ':'.join(connections)\n\n cli_args.append('--preconfigured-renewal')\n\n return cli_args\n\n\nclass _SnapdConnection(HTTPConnection):\n def __init__(self) -> None:\n super().__init__(\"localhost\")\n self.sock: Optional[socket.socket] = None\n\n def connect(self) -> None:\n self.sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)\n self.sock.connect(\"/run/snapd.socket\")\n\n\nclass _SnapdConnectionPool(HTTPConnectionPool):\n def __init__(self) -> None:\n super().__init__(\"localhost\")\n\n def _new_conn(self) -> _SnapdConnection:\n return _SnapdConnection()\n\n\nclass _SnapdAdapter(HTTPAdapter):\n def get_connection(self, url: str,\n proxies: Optional[Iterable[str]] = None) -> _SnapdConnectionPool:\n return _SnapdConnectionPool()\n", "path": "certbot/certbot/_internal/snap_config.py"}], "after_files": [{"content": "\"\"\"Module configuring Certbot in a snap environment\"\"\"\nimport logging\nimport socket\nfrom typing import Iterable\nfrom typing import List\nfrom typing import Optional\n\nfrom requests import Session\nfrom requests.adapters import HTTPAdapter\nfrom requests.exceptions import HTTPError\nfrom requests.exceptions import RequestException\n\nfrom certbot.compat import os\nfrom certbot.errors import Error\n\ntry:\n from urllib3.connection import HTTPConnection\n from urllib3.connectionpool import HTTPConnectionPool\nexcept ImportError:\n # Stub imports for oldest requirements, that will never be used in snaps.\n HTTPConnection = object # type: ignore[misc,assignment]\n HTTPConnectionPool = object # type: ignore[misc,assignment]\n\n\n_ARCH_TRIPLET_MAP = {\n 'arm64': 'aarch64-linux-gnu',\n 'armhf': 'arm-linux-gnueabihf',\n 'i386': 'i386-linux-gnu',\n 'ppc64el': 'powerpc64le-linux-gnu',\n 'powerpc': 'powerpc-linux-gnu',\n 'amd64': 'x86_64-linux-gnu',\n 's390x': 's390x-linux-gnu',\n}\n\nLOGGER = logging.getLogger(__name__)\n\n\ndef prepare_env(cli_args: List[str]) -> List[str]:\n \"\"\"\n Prepare runtime environment for a certbot execution in snap.\n :param list cli_args: List of command line arguments\n :return: Update list of command line arguments\n :rtype: list\n \"\"\"\n snap_arch = os.environ.get('SNAP_ARCH')\n\n if snap_arch not in _ARCH_TRIPLET_MAP:\n raise Error('Unrecognized value of SNAP_ARCH: {0}'.format(snap_arch))\n\n os.environ['CERTBOT_AUGEAS_PATH'] = '{0}/usr/lib/{1}/libaugeas.so.0'.format(\n os.environ.get('SNAP'), _ARCH_TRIPLET_MAP[snap_arch])\n\n with Session() as session:\n session.mount('http://snapd/', _SnapdAdapter())\n\n try:\n response = session.get('http://snapd/v2/connections?snap=certbot&interface=content',\n timeout=30.0)\n response.raise_for_status()\n except RequestException as e:\n if isinstance(e, HTTPError) and e.response.status_code == 404:\n LOGGER.error('An error occurred while fetching Certbot snap plugins: '\n 'your version of snapd is outdated.')\n LOGGER.error('Please run \"sudo snap install core; sudo snap refresh core\" '\n 'in your terminal and try again.')\n else:\n LOGGER.error('An error occurred while fetching Certbot snap plugins: '\n 'make sure the snapd service is running.')\n raise e\n\n data = response.json()\n connections = ['/snap/{0}/current/lib/python3.8/site-packages/'.format(item['slot']['snap'])\n for item in data.get('result', {}).get('established', [])\n if item.get('plug', {}).get('plug') == 'plugin'\n and item.get('plug-attrs', {}).get('content') == 'certbot-1']\n\n os.environ['CERTBOT_PLUGIN_PATH'] = ':'.join(connections)\n\n cli_args.append('--preconfigured-renewal')\n\n return cli_args\n\n\nclass _SnapdConnection(HTTPConnection):\n def __init__(self) -> None:\n super().__init__(\"localhost\")\n self.sock: Optional[socket.socket] = None\n\n def connect(self) -> None:\n self.sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)\n self.sock.connect(\"/run/snapd.socket\")\n\n\nclass _SnapdConnectionPool(HTTPConnectionPool):\n def __init__(self) -> None:\n super().__init__(\"localhost\")\n\n def _new_conn(self) -> _SnapdConnection:\n return _SnapdConnection()\n\n\nclass _SnapdAdapter(HTTPAdapter):\n def get_connection(self, url: str,\n proxies: Optional[Iterable[str]] = None) -> _SnapdConnectionPool:\n return _SnapdConnectionPool()\n", "path": "certbot/certbot/_internal/snap_config.py"}]}
| 1,615 | 181 |
gh_patches_debug_2671
|
rasdani/github-patches
|
git_diff
|
saleor__saleor-1389
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Add robots meta tag and "nofollow" link attribute
1. Fragile pages should be not indexed by search engines.
```
<meta name=”robots” content=”nofollow, noindex”>
```
- [x] Add above meta tag to order's confirmation page
2. Pages that brings no to little content value should not be crawled
```
<meta name=”robots” content=”nofollow”>
```
- [x] Add above meta tag to sign in/sign up/cart pages
3. Add link attribute
- [x] Links pointing to above pages should have set attribute `rel="nofollow"`
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `saleor/core/views.py`
Content:
```
1 from __future__ import unicode_literals
2
3 from django.template.response import TemplateResponse
4 from django.contrib import messages
5 from django.conf import settings
6 from django.utils.translation import pgettext_lazy
7 from impersonate.views import impersonate as orig_impersonate
8
9 from ..dashboard.views import staff_member_required
10 from ..product.utils import products_with_availability, products_for_homepage
11 from ..userprofile.models import User
12
13
14 def home(request):
15 products = products_for_homepage()[:8]
16 products = products_with_availability(
17 products, discounts=request.discounts, local_currency=request.currency)
18 return TemplateResponse(
19 request, 'home.html',
20 {'products': products, 'parent': None})
21
22
23 @staff_member_required
24 def styleguide(request):
25 return TemplateResponse(request, 'styleguide.html')
26
27
28 def impersonate(request, uid):
29 response = orig_impersonate(request, uid)
30 if request.session.modified:
31 msg = pgettext_lazy(
32 'Impersonation message',
33 'You are now logged as {}'.format(User.objects.get(pk=uid)))
34 messages.success(request, msg)
35 return response
36
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/saleor/core/views.py b/saleor/core/views.py
--- a/saleor/core/views.py
+++ b/saleor/core/views.py
@@ -2,7 +2,6 @@
from django.template.response import TemplateResponse
from django.contrib import messages
-from django.conf import settings
from django.utils.translation import pgettext_lazy
from impersonate.views import impersonate as orig_impersonate
|
{"golden_diff": "diff --git a/saleor/core/views.py b/saleor/core/views.py\n--- a/saleor/core/views.py\n+++ b/saleor/core/views.py\n@@ -2,7 +2,6 @@\n \n from django.template.response import TemplateResponse\n from django.contrib import messages\n-from django.conf import settings\n from django.utils.translation import pgettext_lazy\n from impersonate.views import impersonate as orig_impersonate\n", "issue": "Add robots meta tag and \"nofollow\" link attribute\n1. Fragile pages should be not indexed by search engines.\r\n```\r\n<meta name=\u201drobots\u201d content=\u201dnofollow, noindex\u201d>\r\n```\r\n- [x] Add above meta tag to order's confirmation page\r\n\r\n2. Pages that brings no to little content value should not be crawled\r\n```\r\n<meta name=\u201drobots\u201d content=\u201dnofollow\u201d>\r\n```\r\n- [x] Add above meta tag to sign in/sign up/cart pages \r\n3. Add link attribute\r\n- [x] Links pointing to above pages should have set attribute `rel=\"nofollow\"` \n", "before_files": [{"content": "from __future__ import unicode_literals\n\nfrom django.template.response import TemplateResponse\nfrom django.contrib import messages\nfrom django.conf import settings\nfrom django.utils.translation import pgettext_lazy\nfrom impersonate.views import impersonate as orig_impersonate\n\nfrom ..dashboard.views import staff_member_required\nfrom ..product.utils import products_with_availability, products_for_homepage\nfrom ..userprofile.models import User\n\n\ndef home(request):\n products = products_for_homepage()[:8]\n products = products_with_availability(\n products, discounts=request.discounts, local_currency=request.currency)\n return TemplateResponse(\n request, 'home.html',\n {'products': products, 'parent': None})\n\n\n@staff_member_required\ndef styleguide(request):\n return TemplateResponse(request, 'styleguide.html')\n\n\ndef impersonate(request, uid):\n response = orig_impersonate(request, uid)\n if request.session.modified:\n msg = pgettext_lazy(\n 'Impersonation message',\n 'You are now logged as {}'.format(User.objects.get(pk=uid)))\n messages.success(request, msg)\n return response\n", "path": "saleor/core/views.py"}], "after_files": [{"content": "from __future__ import unicode_literals\n\nfrom django.template.response import TemplateResponse\nfrom django.contrib import messages\nfrom django.utils.translation import pgettext_lazy\nfrom impersonate.views import impersonate as orig_impersonate\n\nfrom ..dashboard.views import staff_member_required\nfrom ..product.utils import products_with_availability, products_for_homepage\nfrom ..userprofile.models import User\n\n\ndef home(request):\n products = products_for_homepage()[:8]\n products = products_with_availability(\n products, discounts=request.discounts, local_currency=request.currency)\n return TemplateResponse(\n request, 'home.html',\n {'products': products, 'parent': None})\n\n\n@staff_member_required\ndef styleguide(request):\n return TemplateResponse(request, 'styleguide.html')\n\n\ndef impersonate(request, uid):\n response = orig_impersonate(request, uid)\n if request.session.modified:\n msg = pgettext_lazy(\n 'Impersonation message',\n 'You are now logged as {}'.format(User.objects.get(pk=uid)))\n messages.success(request, msg)\n return response\n", "path": "saleor/core/views.py"}]}
| 683 | 89 |
gh_patches_debug_9597
|
rasdani/github-patches
|
git_diff
|
google__turbinia-1017
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Cron.py unicode decode error
It looks like input evidence caused a unicode error for cron. I think this would only be raised when receiving unreadable input but filing this if we want to catch this exception differently/look into it further
```
Traceback (most recent call last):
File "/usr/local/lib/python3.8/dist-packages/turbinia-20220216-py3.8.egg/turbinia/workers/__init__.py", line 1005, in run_wrapper
self.result = self.run(evidence, self.result)
File "/usr/local/lib/python3.8/dist-packages/turbinia-20220216-py3.8.egg/turbinia/workers/cron.py", line 54, in run
crontab = input_file.read()
File "/usr/lib/python3.8/codecs.py", line 322, in decode
(result, consumed) = self._buffer_decode(data, self.errors, final)
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xdc in position 0: invalid continuation byte
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `turbinia/workers/cron.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 # Copyright 2021 Google Inc.
3 #
4 # Licensed under the Apache License, Version 2.0 (the "License");
5 # you may not use this file except in compliance with the License.
6 # You may obtain a copy of the License at
7 #
8 # http://www.apache.org/licenses/LICENSE-2.0
9 #
10 # Unless required by applicable law or agreed to in writing, software
11 # distributed under the License is distributed on an "AS IS" BASIS,
12 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
13 # See the License for the specific language governing permissions and
14 # limitations under the License.
15 """Task for analysing cron files."""
16
17 from __future__ import unicode_literals
18
19 import os
20 import re
21
22 from turbinia.evidence import EvidenceState as state
23 from turbinia.evidence import ReportText
24 from turbinia.lib import text_formatter as fmt
25 from turbinia.workers import TurbiniaTask
26 from turbinia.workers import Priority
27
28
29 class CronAnalysisTask(TurbiniaTask):
30 """Task to analyze crontabs."""
31
32 REQUIRED_STATES = [
33 state.ATTACHED, state.CONTAINER_MOUNTED, state.DECOMPRESSED
34 ]
35
36 def run(self, evidence, result):
37 """Run the cron analysis worker.
38
39 Args:
40 evidence (Evidence object): The evidence we will process.
41 result (TurbiniaTaskResult): The object to place task results into.
42
43 Returns:
44 TurbiniaTaskResult object.
45 """
46 # Where to store the resulting output file.
47 output_file_name = 'cron_analysis.txt'
48 output_file_path = os.path.join(self.output_dir, output_file_name)
49 # Set the output file as the data source for the output evidence.
50 output_evidence = ReportText(source_path=output_file_path)
51
52 # Read the input file
53 with open(evidence.local_path, 'r') as input_file:
54 crontab = input_file.read()
55
56 (report, priority, summary) = self.analyse_crontab(crontab)
57 output_evidence.text_data = report
58 result.report_priority = priority
59 result.report_data = report
60
61 # Write the report to the output file.
62 with open(output_file_path, 'wb') as fh:
63 fh.write(output_evidence.text_data.encode('utf-8'))
64
65 # Add the resulting evidence to the result object.
66 result.add_evidence(output_evidence, evidence.config)
67 result.close(self, success=True, status=summary)
68 return result
69
70 def analyse_crontab(self, crontab):
71 """Analyses a Cron file.
72
73 Args:
74 crontab (str): file content.
75
76 Returns:
77 Tuple(
78 report_text(str): The report data
79 report_priority(int): The priority of the report (0 - 100)
80 summary(str): A summary of the report (used for task status)
81 )
82 """
83 findings = []
84 wget_or_curl = re.compile(r'(wget|curl)', re.IGNORECASE | re.MULTILINE)
85 pipe_to_sh = re.compile(r'\|(.*)sh ', re.IGNORECASE | re.MULTILINE)
86 get_piped_to_sh = re.compile(
87 r'((wget|curl).*\|)+(.*sh)', re.IGNORECASE | re.MULTILINE)
88
89 if re.search(get_piped_to_sh, crontab):
90 findings.append(fmt.bullet('Remote file retrieval piped to a shell.'))
91 elif re.search(wget_or_curl, crontab):
92 findings.append(fmt.bullet('Remote file retrieval'))
93 elif re.search(pipe_to_sh, crontab):
94 findings.append(fmt.bullet('File piped to shell'))
95
96 if findings:
97 summary = 'Potentially backdoored crontab found.'
98 findings.insert(0, fmt.heading4(fmt.bold(summary)))
99 report = '\n'.join(findings)
100 return (report, Priority.HIGH, summary)
101
102 report = 'No issues found in crontabs'
103 return (report, Priority.LOW, report)
104
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/turbinia/workers/cron.py b/turbinia/workers/cron.py
--- a/turbinia/workers/cron.py
+++ b/turbinia/workers/cron.py
@@ -51,7 +51,14 @@
# Read the input file
with open(evidence.local_path, 'r') as input_file:
- crontab = input_file.read()
+ try:
+ crontab = input_file.read()
+ except UnicodeDecodeError as exception:
+ message = 'Error parsing cron file {0:s}: {1!s}'.format(
+ evidence.local_path, exception)
+ result.log(message)
+ result.close(self, success=False, status=message)
+ return result
(report, priority, summary) = self.analyse_crontab(crontab)
output_evidence.text_data = report
|
{"golden_diff": "diff --git a/turbinia/workers/cron.py b/turbinia/workers/cron.py\n--- a/turbinia/workers/cron.py\n+++ b/turbinia/workers/cron.py\n@@ -51,7 +51,14 @@\n \n # Read the input file\n with open(evidence.local_path, 'r') as input_file:\n- crontab = input_file.read()\n+ try:\n+ crontab = input_file.read()\n+ except UnicodeDecodeError as exception:\n+ message = 'Error parsing cron file {0:s}: {1!s}'.format(\n+ evidence.local_path, exception)\n+ result.log(message)\n+ result.close(self, success=False, status=message)\n+ return result\n \n (report, priority, summary) = self.analyse_crontab(crontab)\n output_evidence.text_data = report\n", "issue": "Cron.py unicode decode error\nIt looks like input evidence caused a unicode error for cron. I think this would only be raised when receiving unreadable input but filing this if we want to catch this exception differently/look into it further\r\n\r\n```\r\n Traceback (most recent call last):\r\n File \"/usr/local/lib/python3.8/dist-packages/turbinia-20220216-py3.8.egg/turbinia/workers/__init__.py\", line 1005, in run_wrapper\r\n self.result = self.run(evidence, self.result)\r\n File \"/usr/local/lib/python3.8/dist-packages/turbinia-20220216-py3.8.egg/turbinia/workers/cron.py\", line 54, in run\r\n crontab = input_file.read()\r\n File \"/usr/lib/python3.8/codecs.py\", line 322, in decode\r\n (result, consumed) = self._buffer_decode(data, self.errors, final)\r\nUnicodeDecodeError: 'utf-8' codec can't decode byte 0xdc in position 0: invalid continuation byte\r\n```\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n# Copyright 2021 Google Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Task for analysing cron files.\"\"\"\n\nfrom __future__ import unicode_literals\n\nimport os\nimport re\n\nfrom turbinia.evidence import EvidenceState as state\nfrom turbinia.evidence import ReportText\nfrom turbinia.lib import text_formatter as fmt\nfrom turbinia.workers import TurbiniaTask\nfrom turbinia.workers import Priority\n\n\nclass CronAnalysisTask(TurbiniaTask):\n \"\"\"Task to analyze crontabs.\"\"\"\n\n REQUIRED_STATES = [\n state.ATTACHED, state.CONTAINER_MOUNTED, state.DECOMPRESSED\n ]\n\n def run(self, evidence, result):\n \"\"\"Run the cron analysis worker.\n\n Args:\n evidence (Evidence object): The evidence we will process.\n result (TurbiniaTaskResult): The object to place task results into.\n\n Returns:\n TurbiniaTaskResult object.\n \"\"\"\n # Where to store the resulting output file.\n output_file_name = 'cron_analysis.txt'\n output_file_path = os.path.join(self.output_dir, output_file_name)\n # Set the output file as the data source for the output evidence.\n output_evidence = ReportText(source_path=output_file_path)\n\n # Read the input file\n with open(evidence.local_path, 'r') as input_file:\n crontab = input_file.read()\n\n (report, priority, summary) = self.analyse_crontab(crontab)\n output_evidence.text_data = report\n result.report_priority = priority\n result.report_data = report\n\n # Write the report to the output file.\n with open(output_file_path, 'wb') as fh:\n fh.write(output_evidence.text_data.encode('utf-8'))\n\n # Add the resulting evidence to the result object.\n result.add_evidence(output_evidence, evidence.config)\n result.close(self, success=True, status=summary)\n return result\n\n def analyse_crontab(self, crontab):\n \"\"\"Analyses a Cron file.\n\n Args:\n crontab (str): file content.\n\n Returns:\n Tuple(\n report_text(str): The report data\n report_priority(int): The priority of the report (0 - 100)\n summary(str): A summary of the report (used for task status)\n )\n \"\"\"\n findings = []\n wget_or_curl = re.compile(r'(wget|curl)', re.IGNORECASE | re.MULTILINE)\n pipe_to_sh = re.compile(r'\\|(.*)sh ', re.IGNORECASE | re.MULTILINE)\n get_piped_to_sh = re.compile(\n r'((wget|curl).*\\|)+(.*sh)', re.IGNORECASE | re.MULTILINE)\n\n if re.search(get_piped_to_sh, crontab):\n findings.append(fmt.bullet('Remote file retrieval piped to a shell.'))\n elif re.search(wget_or_curl, crontab):\n findings.append(fmt.bullet('Remote file retrieval'))\n elif re.search(pipe_to_sh, crontab):\n findings.append(fmt.bullet('File piped to shell'))\n\n if findings:\n summary = 'Potentially backdoored crontab found.'\n findings.insert(0, fmt.heading4(fmt.bold(summary)))\n report = '\\n'.join(findings)\n return (report, Priority.HIGH, summary)\n\n report = 'No issues found in crontabs'\n return (report, Priority.LOW, report)\n", "path": "turbinia/workers/cron.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n# Copyright 2021 Google Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Task for analysing cron files.\"\"\"\n\nfrom __future__ import unicode_literals\n\nimport os\nimport re\n\nfrom turbinia.evidence import EvidenceState as state\nfrom turbinia.evidence import ReportText\nfrom turbinia.lib import text_formatter as fmt\nfrom turbinia.workers import TurbiniaTask\nfrom turbinia.workers import Priority\n\n\nclass CronAnalysisTask(TurbiniaTask):\n \"\"\"Task to analyze crontabs.\"\"\"\n\n REQUIRED_STATES = [\n state.ATTACHED, state.CONTAINER_MOUNTED, state.DECOMPRESSED\n ]\n\n def run(self, evidence, result):\n \"\"\"Run the cron analysis worker.\n\n Args:\n evidence (Evidence object): The evidence we will process.\n result (TurbiniaTaskResult): The object to place task results into.\n\n Returns:\n TurbiniaTaskResult object.\n \"\"\"\n # Where to store the resulting output file.\n output_file_name = 'cron_analysis.txt'\n output_file_path = os.path.join(self.output_dir, output_file_name)\n # Set the output file as the data source for the output evidence.\n output_evidence = ReportText(source_path=output_file_path)\n\n # Read the input file\n with open(evidence.local_path, 'r') as input_file:\n try:\n crontab = input_file.read()\n except UnicodeDecodeError as exception:\n message = 'Error parsing cron file {0:s}: {1!s}'.format(\n evidence.local_path, exception)\n result.log(message)\n result.close(self, success=False, status=message)\n return result\n\n (report, priority, summary) = self.analyse_crontab(crontab)\n output_evidence.text_data = report\n result.report_priority = priority\n result.report_data = report\n\n # Write the report to the output file.\n with open(output_file_path, 'wb') as fh:\n fh.write(output_evidence.text_data.encode('utf-8'))\n\n # Add the resulting evidence to the result object.\n result.add_evidence(output_evidence, evidence.config)\n result.close(self, success=True, status=summary)\n return result\n\n def analyse_crontab(self, crontab):\n \"\"\"Analyses a Cron file.\n\n Args:\n crontab (str): file content.\n\n Returns:\n Tuple(\n report_text(str): The report data\n report_priority(int): The priority of the report (0 - 100)\n summary(str): A summary of the report (used for task status)\n )\n \"\"\"\n findings = []\n wget_or_curl = re.compile(r'(wget|curl)', re.IGNORECASE | re.MULTILINE)\n pipe_to_sh = re.compile(r'\\|(.*)sh ', re.IGNORECASE | re.MULTILINE)\n get_piped_to_sh = re.compile(\n r'((wget|curl).*\\|)+(.*sh)', re.IGNORECASE | re.MULTILINE)\n\n if re.search(get_piped_to_sh, crontab):\n findings.append(fmt.bullet('Remote file retrieval piped to a shell.'))\n elif re.search(wget_or_curl, crontab):\n findings.append(fmt.bullet('Remote file retrieval'))\n elif re.search(pipe_to_sh, crontab):\n findings.append(fmt.bullet('File piped to shell'))\n\n if findings:\n summary = 'Potentially backdoored crontab found.'\n findings.insert(0, fmt.heading4(fmt.bold(summary)))\n report = '\\n'.join(findings)\n return (report, Priority.HIGH, summary)\n\n report = 'No issues found in crontabs'\n return (report, Priority.LOW, report)\n", "path": "turbinia/workers/cron.py"}]}
| 1,592 | 197 |
gh_patches_debug_10779
|
rasdani/github-patches
|
git_diff
|
aio-libs__aiohttp-2319
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
web.FileResponse overrides content_type
## Long story short
When returning a `web.FileResponse` from a route, any changes to the `content_type` member is not taken into account when the file is served. Instead, the file extension is always used to determine the content type of the response.
## Expected behaviour
`Content-Type` header of the response be that of `web.FileResponse.content_type` when such has been set.
## Actual behaviour
`Content-Type` header is guessed from the file extension of the served file.
## Steps to reproduce
Create a route that responds something along the lines of
```python
response = web.FileResponse('/path/to/some/image.png')
response.content_type = 'application/octet-stream'
return response
```
then `curl -i` it and notice the `Content-Type` is `image/png`.
## Your environment
```
[~]$ uname -mrs
Linux 4.11.9-1-ARCH x86_64
[~]$ python --version
Python 3.6.2
[~]$ pip show aiohttp | grep Version
Version: 2.2.5
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `aiohttp/web_fileresponse.py`
Content:
```
1 import asyncio
2 import mimetypes
3 import os
4 import pathlib
5
6 from . import hdrs
7 from .helpers import create_future
8 from .http_writer import PayloadWriter
9 from .log import server_logger
10 from .web_exceptions import (HTTPNotModified, HTTPOk, HTTPPartialContent,
11 HTTPRequestRangeNotSatisfiable)
12 from .web_response import StreamResponse
13
14
15 __all__ = ('FileResponse',)
16
17
18 NOSENDFILE = bool(os.environ.get("AIOHTTP_NOSENDFILE"))
19
20
21 class SendfilePayloadWriter(PayloadWriter):
22
23 def set_transport(self, transport):
24 self._transport = transport
25
26 if self._drain_waiter is not None:
27 waiter, self._drain_maiter = self._drain_maiter, None
28 if not waiter.done():
29 waiter.set_result(None)
30
31 def _write(self, chunk):
32 self.output_size += len(chunk)
33 self._buffer.append(chunk)
34
35 def _sendfile_cb(self, fut, out_fd, in_fd,
36 offset, count, loop, registered):
37 if registered:
38 loop.remove_writer(out_fd)
39 if fut.cancelled():
40 return
41
42 try:
43 n = os.sendfile(out_fd, in_fd, offset, count)
44 if n == 0: # EOF reached
45 n = count
46 except (BlockingIOError, InterruptedError):
47 n = 0
48 except Exception as exc:
49 fut.set_exception(exc)
50 return
51
52 if n < count:
53 loop.add_writer(out_fd, self._sendfile_cb, fut, out_fd, in_fd,
54 offset + n, count - n, loop, True)
55 else:
56 fut.set_result(None)
57
58 @asyncio.coroutine
59 def sendfile(self, fobj, count):
60 if self._transport is None:
61 if self._drain_waiter is None:
62 self._drain_waiter = create_future(self.loop)
63
64 yield from self._drain_waiter
65
66 out_socket = self._transport.get_extra_info("socket").dup()
67 out_socket.setblocking(False)
68 out_fd = out_socket.fileno()
69 in_fd = fobj.fileno()
70 offset = fobj.tell()
71
72 loop = self.loop
73 try:
74 yield from loop.sock_sendall(out_socket, b''.join(self._buffer))
75 fut = create_future(loop)
76 self._sendfile_cb(fut, out_fd, in_fd, offset, count, loop, False)
77 yield from fut
78 except:
79 server_logger.debug('Socket error')
80 self._transport.close()
81 finally:
82 out_socket.close()
83
84 self.output_size += count
85 self._transport = None
86 self._stream.release()
87
88 @asyncio.coroutine
89 def write_eof(self, chunk=b''):
90 pass
91
92
93 class FileResponse(StreamResponse):
94 """A response object can be used to send files."""
95
96 def __init__(self, path, chunk_size=256*1024, *args, **kwargs):
97 super().__init__(*args, **kwargs)
98
99 if isinstance(path, str):
100 path = pathlib.Path(path)
101
102 self._path = path
103 self._chunk_size = chunk_size
104
105 @asyncio.coroutine
106 def _sendfile_system(self, request, fobj, count):
107 # Write count bytes of fobj to resp using
108 # the os.sendfile system call.
109 #
110 # For details check
111 # https://github.com/KeepSafe/aiohttp/issues/1177
112 # See https://github.com/KeepSafe/aiohttp/issues/958 for details
113 #
114 # request should be a aiohttp.web.Request instance.
115 # fobj should be an open file object.
116 # count should be an integer > 0.
117
118 transport = request.transport
119 if (transport.get_extra_info("sslcontext") or
120 transport.get_extra_info("socket") is None):
121 writer = yield from self._sendfile_fallback(request, fobj, count)
122 else:
123 writer = request._protocol.writer.replace(
124 request._writer, SendfilePayloadWriter)
125 request._writer = writer
126 yield from super().prepare(request)
127 yield from writer.sendfile(fobj, count)
128
129 return writer
130
131 @asyncio.coroutine
132 def _sendfile_fallback(self, request, fobj, count):
133 # Mimic the _sendfile_system() method, but without using the
134 # os.sendfile() system call. This should be used on systems
135 # that don't support the os.sendfile().
136
137 # To avoid blocking the event loop & to keep memory usage low,
138 # fobj is transferred in chunks controlled by the
139 # constructor's chunk_size argument.
140
141 writer = (yield from super().prepare(request))
142
143 self.set_tcp_cork(True)
144 try:
145 chunk_size = self._chunk_size
146
147 chunk = fobj.read(chunk_size)
148 while True:
149 yield from writer.write(chunk)
150 count = count - chunk_size
151 if count <= 0:
152 break
153 chunk = fobj.read(min(chunk_size, count))
154 finally:
155 self.set_tcp_nodelay(True)
156
157 yield from writer.drain()
158 return writer
159
160 if hasattr(os, "sendfile") and not NOSENDFILE: # pragma: no cover
161 _sendfile = _sendfile_system
162 else: # pragma: no cover
163 _sendfile = _sendfile_fallback
164
165 @asyncio.coroutine
166 def prepare(self, request):
167 filepath = self._path
168
169 gzip = False
170 if 'gzip' in request.headers.get(hdrs.ACCEPT_ENCODING, ''):
171 gzip_path = filepath.with_name(filepath.name + '.gz')
172
173 if gzip_path.is_file():
174 filepath = gzip_path
175 gzip = True
176
177 st = filepath.stat()
178
179 modsince = request.if_modified_since
180 if modsince is not None and st.st_mtime <= modsince.timestamp():
181 self.set_status(HTTPNotModified.status_code)
182 self._length_check = False
183 return (yield from super().prepare(request))
184
185 ct, encoding = mimetypes.guess_type(str(filepath))
186 if not ct:
187 ct = 'application/octet-stream'
188
189 status = HTTPOk.status_code
190 file_size = st.st_size
191 count = file_size
192
193 try:
194 rng = request.http_range
195 start = rng.start
196 end = rng.stop
197 except ValueError:
198 self.set_status(HTTPRequestRangeNotSatisfiable.status_code)
199 return (yield from super().prepare(request))
200
201 # If a range request has been made, convert start, end slice notation
202 # into file pointer offset and count
203 if start is not None or end is not None:
204 if start is None and end < 0: # return tail of file
205 start = file_size + end
206 count = -end
207 else:
208 count = (end or file_size) - start
209
210 if start + count > file_size:
211 # rfc7233:If the last-byte-pos value is
212 # absent, or if the value is greater than or equal to
213 # the current length of the representation data,
214 # the byte range is interpreted as the remainder
215 # of the representation (i.e., the server replaces the
216 # value of last-byte-pos with a value that is one less than
217 # the current length of the selected representation).
218 count = file_size - start
219
220 if start >= file_size:
221 count = 0
222
223 if count != file_size:
224 status = HTTPPartialContent.status_code
225
226 self.set_status(status)
227 self.content_type = ct
228 if encoding:
229 self.headers[hdrs.CONTENT_ENCODING] = encoding
230 if gzip:
231 self.headers[hdrs.VARY] = hdrs.ACCEPT_ENCODING
232 self.last_modified = st.st_mtime
233 self.content_length = count
234
235 if count:
236 with filepath.open('rb') as fobj:
237 if start:
238 fobj.seek(start)
239
240 return (yield from self._sendfile(request, fobj, count))
241
242 return (yield from super().prepare(request))
243
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/aiohttp/web_fileresponse.py b/aiohttp/web_fileresponse.py
--- a/aiohttp/web_fileresponse.py
+++ b/aiohttp/web_fileresponse.py
@@ -182,9 +182,13 @@
self._length_check = False
return (yield from super().prepare(request))
- ct, encoding = mimetypes.guess_type(str(filepath))
- if not ct:
- ct = 'application/octet-stream'
+ if hdrs.CONTENT_TYPE not in self.headers:
+ ct, encoding = mimetypes.guess_type(str(filepath))
+ if not ct:
+ ct = 'application/octet-stream'
+ else:
+ ct = self.headers[hdrs.CONTENT_TYPE]
+ encoding = 'gzip' if gzip else None
status = HTTPOk.status_code
file_size = st.st_size
|
{"golden_diff": "diff --git a/aiohttp/web_fileresponse.py b/aiohttp/web_fileresponse.py\n--- a/aiohttp/web_fileresponse.py\n+++ b/aiohttp/web_fileresponse.py\n@@ -182,9 +182,13 @@\n self._length_check = False\n return (yield from super().prepare(request))\n \n- ct, encoding = mimetypes.guess_type(str(filepath))\n- if not ct:\n- ct = 'application/octet-stream'\n+ if hdrs.CONTENT_TYPE not in self.headers:\n+ ct, encoding = mimetypes.guess_type(str(filepath))\n+ if not ct:\n+ ct = 'application/octet-stream'\n+ else:\n+ ct = self.headers[hdrs.CONTENT_TYPE]\n+ encoding = 'gzip' if gzip else None\n \n status = HTTPOk.status_code\n file_size = st.st_size\n", "issue": "web.FileResponse overrides content_type\n## Long story short\r\n\r\nWhen returning a `web.FileResponse` from a route, any changes to the `content_type` member is not taken into account when the file is served. Instead, the file extension is always used to determine the content type of the response.\r\n\r\n## Expected behaviour\r\n\r\n`Content-Type` header of the response be that of `web.FileResponse.content_type` when such has been set.\r\n\r\n## Actual behaviour\r\n\r\n`Content-Type` header is guessed from the file extension of the served file.\r\n\r\n## Steps to reproduce\r\n\r\nCreate a route that responds something along the lines of\r\n\r\n```python\r\n response = web.FileResponse('/path/to/some/image.png')\r\n response.content_type = 'application/octet-stream'\r\n return response\r\n```\r\n\r\nthen `curl -i` it and notice the `Content-Type` is `image/png`.\r\n\r\n## Your environment\r\n\r\n```\r\n[~]$ uname -mrs\r\nLinux 4.11.9-1-ARCH x86_64\r\n[~]$ python --version\r\nPython 3.6.2\r\n[~]$ pip show aiohttp | grep Version\r\nVersion: 2.2.5\r\n```\r\n\n", "before_files": [{"content": "import asyncio\nimport mimetypes\nimport os\nimport pathlib\n\nfrom . import hdrs\nfrom .helpers import create_future\nfrom .http_writer import PayloadWriter\nfrom .log import server_logger\nfrom .web_exceptions import (HTTPNotModified, HTTPOk, HTTPPartialContent,\n HTTPRequestRangeNotSatisfiable)\nfrom .web_response import StreamResponse\n\n\n__all__ = ('FileResponse',)\n\n\nNOSENDFILE = bool(os.environ.get(\"AIOHTTP_NOSENDFILE\"))\n\n\nclass SendfilePayloadWriter(PayloadWriter):\n\n def set_transport(self, transport):\n self._transport = transport\n\n if self._drain_waiter is not None:\n waiter, self._drain_maiter = self._drain_maiter, None\n if not waiter.done():\n waiter.set_result(None)\n\n def _write(self, chunk):\n self.output_size += len(chunk)\n self._buffer.append(chunk)\n\n def _sendfile_cb(self, fut, out_fd, in_fd,\n offset, count, loop, registered):\n if registered:\n loop.remove_writer(out_fd)\n if fut.cancelled():\n return\n\n try:\n n = os.sendfile(out_fd, in_fd, offset, count)\n if n == 0: # EOF reached\n n = count\n except (BlockingIOError, InterruptedError):\n n = 0\n except Exception as exc:\n fut.set_exception(exc)\n return\n\n if n < count:\n loop.add_writer(out_fd, self._sendfile_cb, fut, out_fd, in_fd,\n offset + n, count - n, loop, True)\n else:\n fut.set_result(None)\n\n @asyncio.coroutine\n def sendfile(self, fobj, count):\n if self._transport is None:\n if self._drain_waiter is None:\n self._drain_waiter = create_future(self.loop)\n\n yield from self._drain_waiter\n\n out_socket = self._transport.get_extra_info(\"socket\").dup()\n out_socket.setblocking(False)\n out_fd = out_socket.fileno()\n in_fd = fobj.fileno()\n offset = fobj.tell()\n\n loop = self.loop\n try:\n yield from loop.sock_sendall(out_socket, b''.join(self._buffer))\n fut = create_future(loop)\n self._sendfile_cb(fut, out_fd, in_fd, offset, count, loop, False)\n yield from fut\n except:\n server_logger.debug('Socket error')\n self._transport.close()\n finally:\n out_socket.close()\n\n self.output_size += count\n self._transport = None\n self._stream.release()\n\n @asyncio.coroutine\n def write_eof(self, chunk=b''):\n pass\n\n\nclass FileResponse(StreamResponse):\n \"\"\"A response object can be used to send files.\"\"\"\n\n def __init__(self, path, chunk_size=256*1024, *args, **kwargs):\n super().__init__(*args, **kwargs)\n\n if isinstance(path, str):\n path = pathlib.Path(path)\n\n self._path = path\n self._chunk_size = chunk_size\n\n @asyncio.coroutine\n def _sendfile_system(self, request, fobj, count):\n # Write count bytes of fobj to resp using\n # the os.sendfile system call.\n #\n # For details check\n # https://github.com/KeepSafe/aiohttp/issues/1177\n # See https://github.com/KeepSafe/aiohttp/issues/958 for details\n #\n # request should be a aiohttp.web.Request instance.\n # fobj should be an open file object.\n # count should be an integer > 0.\n\n transport = request.transport\n if (transport.get_extra_info(\"sslcontext\") or\n transport.get_extra_info(\"socket\") is None):\n writer = yield from self._sendfile_fallback(request, fobj, count)\n else:\n writer = request._protocol.writer.replace(\n request._writer, SendfilePayloadWriter)\n request._writer = writer\n yield from super().prepare(request)\n yield from writer.sendfile(fobj, count)\n\n return writer\n\n @asyncio.coroutine\n def _sendfile_fallback(self, request, fobj, count):\n # Mimic the _sendfile_system() method, but without using the\n # os.sendfile() system call. This should be used on systems\n # that don't support the os.sendfile().\n\n # To avoid blocking the event loop & to keep memory usage low,\n # fobj is transferred in chunks controlled by the\n # constructor's chunk_size argument.\n\n writer = (yield from super().prepare(request))\n\n self.set_tcp_cork(True)\n try:\n chunk_size = self._chunk_size\n\n chunk = fobj.read(chunk_size)\n while True:\n yield from writer.write(chunk)\n count = count - chunk_size\n if count <= 0:\n break\n chunk = fobj.read(min(chunk_size, count))\n finally:\n self.set_tcp_nodelay(True)\n\n yield from writer.drain()\n return writer\n\n if hasattr(os, \"sendfile\") and not NOSENDFILE: # pragma: no cover\n _sendfile = _sendfile_system\n else: # pragma: no cover\n _sendfile = _sendfile_fallback\n\n @asyncio.coroutine\n def prepare(self, request):\n filepath = self._path\n\n gzip = False\n if 'gzip' in request.headers.get(hdrs.ACCEPT_ENCODING, ''):\n gzip_path = filepath.with_name(filepath.name + '.gz')\n\n if gzip_path.is_file():\n filepath = gzip_path\n gzip = True\n\n st = filepath.stat()\n\n modsince = request.if_modified_since\n if modsince is not None and st.st_mtime <= modsince.timestamp():\n self.set_status(HTTPNotModified.status_code)\n self._length_check = False\n return (yield from super().prepare(request))\n\n ct, encoding = mimetypes.guess_type(str(filepath))\n if not ct:\n ct = 'application/octet-stream'\n\n status = HTTPOk.status_code\n file_size = st.st_size\n count = file_size\n\n try:\n rng = request.http_range\n start = rng.start\n end = rng.stop\n except ValueError:\n self.set_status(HTTPRequestRangeNotSatisfiable.status_code)\n return (yield from super().prepare(request))\n\n # If a range request has been made, convert start, end slice notation\n # into file pointer offset and count\n if start is not None or end is not None:\n if start is None and end < 0: # return tail of file\n start = file_size + end\n count = -end\n else:\n count = (end or file_size) - start\n\n if start + count > file_size:\n # rfc7233:If the last-byte-pos value is\n # absent, or if the value is greater than or equal to\n # the current length of the representation data,\n # the byte range is interpreted as the remainder\n # of the representation (i.e., the server replaces the\n # value of last-byte-pos with a value that is one less than\n # the current length of the selected representation).\n count = file_size - start\n\n if start >= file_size:\n count = 0\n\n if count != file_size:\n status = HTTPPartialContent.status_code\n\n self.set_status(status)\n self.content_type = ct\n if encoding:\n self.headers[hdrs.CONTENT_ENCODING] = encoding\n if gzip:\n self.headers[hdrs.VARY] = hdrs.ACCEPT_ENCODING\n self.last_modified = st.st_mtime\n self.content_length = count\n\n if count:\n with filepath.open('rb') as fobj:\n if start:\n fobj.seek(start)\n\n return (yield from self._sendfile(request, fobj, count))\n\n return (yield from super().prepare(request))\n", "path": "aiohttp/web_fileresponse.py"}], "after_files": [{"content": "import asyncio\nimport mimetypes\nimport os\nimport pathlib\n\nfrom . import hdrs\nfrom .helpers import create_future\nfrom .http_writer import PayloadWriter\nfrom .log import server_logger\nfrom .web_exceptions import (HTTPNotModified, HTTPOk, HTTPPartialContent,\n HTTPRequestRangeNotSatisfiable)\nfrom .web_response import StreamResponse\n\n\n__all__ = ('FileResponse',)\n\n\nNOSENDFILE = bool(os.environ.get(\"AIOHTTP_NOSENDFILE\"))\n\n\nclass SendfilePayloadWriter(PayloadWriter):\n\n def set_transport(self, transport):\n self._transport = transport\n\n if self._drain_waiter is not None:\n waiter, self._drain_maiter = self._drain_maiter, None\n if not waiter.done():\n waiter.set_result(None)\n\n def _write(self, chunk):\n self.output_size += len(chunk)\n self._buffer.append(chunk)\n\n def _sendfile_cb(self, fut, out_fd, in_fd,\n offset, count, loop, registered):\n if registered:\n loop.remove_writer(out_fd)\n if fut.cancelled():\n return\n\n try:\n n = os.sendfile(out_fd, in_fd, offset, count)\n if n == 0: # EOF reached\n n = count\n except (BlockingIOError, InterruptedError):\n n = 0\n except Exception as exc:\n fut.set_exception(exc)\n return\n\n if n < count:\n loop.add_writer(out_fd, self._sendfile_cb, fut, out_fd, in_fd,\n offset + n, count - n, loop, True)\n else:\n fut.set_result(None)\n\n @asyncio.coroutine\n def sendfile(self, fobj, count):\n if self._transport is None:\n if self._drain_waiter is None:\n self._drain_waiter = create_future(self.loop)\n\n yield from self._drain_waiter\n\n out_socket = self._transport.get_extra_info(\"socket\").dup()\n out_socket.setblocking(False)\n out_fd = out_socket.fileno()\n in_fd = fobj.fileno()\n offset = fobj.tell()\n\n loop = self.loop\n try:\n yield from loop.sock_sendall(out_socket, b''.join(self._buffer))\n fut = create_future(loop)\n self._sendfile_cb(fut, out_fd, in_fd, offset, count, loop, False)\n yield from fut\n except:\n server_logger.debug('Socket error')\n self._transport.close()\n finally:\n out_socket.close()\n\n self.output_size += count\n self._transport = None\n self._stream.release()\n\n @asyncio.coroutine\n def write_eof(self, chunk=b''):\n pass\n\n\nclass FileResponse(StreamResponse):\n \"\"\"A response object can be used to send files.\"\"\"\n\n def __init__(self, path, chunk_size=256*1024, *args, **kwargs):\n super().__init__(*args, **kwargs)\n\n if isinstance(path, str):\n path = pathlib.Path(path)\n\n self._path = path\n self._chunk_size = chunk_size\n\n @asyncio.coroutine\n def _sendfile_system(self, request, fobj, count):\n # Write count bytes of fobj to resp using\n # the os.sendfile system call.\n #\n # For details check\n # https://github.com/KeepSafe/aiohttp/issues/1177\n # See https://github.com/KeepSafe/aiohttp/issues/958 for details\n #\n # request should be a aiohttp.web.Request instance.\n # fobj should be an open file object.\n # count should be an integer > 0.\n\n transport = request.transport\n if (transport.get_extra_info(\"sslcontext\") or\n transport.get_extra_info(\"socket\") is None):\n writer = yield from self._sendfile_fallback(request, fobj, count)\n else:\n writer = request._protocol.writer.replace(\n request._writer, SendfilePayloadWriter)\n request._writer = writer\n yield from super().prepare(request)\n yield from writer.sendfile(fobj, count)\n\n return writer\n\n @asyncio.coroutine\n def _sendfile_fallback(self, request, fobj, count):\n # Mimic the _sendfile_system() method, but without using the\n # os.sendfile() system call. This should be used on systems\n # that don't support the os.sendfile().\n\n # To avoid blocking the event loop & to keep memory usage low,\n # fobj is transferred in chunks controlled by the\n # constructor's chunk_size argument.\n\n writer = (yield from super().prepare(request))\n\n self.set_tcp_cork(True)\n try:\n chunk_size = self._chunk_size\n\n chunk = fobj.read(chunk_size)\n while True:\n yield from writer.write(chunk)\n count = count - chunk_size\n if count <= 0:\n break\n chunk = fobj.read(min(chunk_size, count))\n finally:\n self.set_tcp_nodelay(True)\n\n yield from writer.drain()\n return writer\n\n if hasattr(os, \"sendfile\") and not NOSENDFILE: # pragma: no cover\n _sendfile = _sendfile_system\n else: # pragma: no cover\n _sendfile = _sendfile_fallback\n\n @asyncio.coroutine\n def prepare(self, request):\n filepath = self._path\n\n gzip = False\n if 'gzip' in request.headers.get(hdrs.ACCEPT_ENCODING, ''):\n gzip_path = filepath.with_name(filepath.name + '.gz')\n\n if gzip_path.is_file():\n filepath = gzip_path\n gzip = True\n\n st = filepath.stat()\n\n modsince = request.if_modified_since\n if modsince is not None and st.st_mtime <= modsince.timestamp():\n self.set_status(HTTPNotModified.status_code)\n self._length_check = False\n return (yield from super().prepare(request))\n\n if hdrs.CONTENT_TYPE not in self.headers:\n ct, encoding = mimetypes.guess_type(str(filepath))\n if not ct:\n ct = 'application/octet-stream'\n else:\n ct = self.headers[hdrs.CONTENT_TYPE]\n encoding = 'gzip' if gzip else None\n\n status = HTTPOk.status_code\n file_size = st.st_size\n count = file_size\n\n try:\n rng = request.http_range\n start = rng.start\n end = rng.stop\n except ValueError:\n self.set_status(HTTPRequestRangeNotSatisfiable.status_code)\n return (yield from super().prepare(request))\n\n # If a range request has been made, convert start, end slice notation\n # into file pointer offset and count\n if start is not None or end is not None:\n if start is None and end < 0: # return tail of file\n start = file_size + end\n count = -end\n else:\n count = (end or file_size) - start\n\n if start + count > file_size:\n # rfc7233:If the last-byte-pos value is\n # absent, or if the value is greater than or equal to\n # the current length of the representation data,\n # the byte range is interpreted as the remainder\n # of the representation (i.e., the server replaces the\n # value of last-byte-pos with a value that is one less than\n # the current length of the selected representation).\n count = file_size - start\n\n if start >= file_size:\n count = 0\n\n if count != file_size:\n status = HTTPPartialContent.status_code\n\n self.set_status(status)\n self.content_type = ct\n if encoding:\n self.headers[hdrs.CONTENT_ENCODING] = encoding\n if gzip:\n self.headers[hdrs.VARY] = hdrs.ACCEPT_ENCODING\n self.last_modified = st.st_mtime\n self.content_length = count\n\n if count:\n with filepath.open('rb') as fobj:\n if start:\n fobj.seek(start)\n\n return (yield from self._sendfile(request, fobj, count))\n\n return (yield from super().prepare(request))\n", "path": "aiohttp/web_fileresponse.py"}]}
| 2,918 | 198 |
gh_patches_debug_933
|
rasdani/github-patches
|
git_diff
|
python-pillow__Pillow-399
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Image opened twice if imagemagick and xv are installed
If ImageMagick and xv are both installed and you call Image.show() it will open the image twice, once with display and once with xv. This is probably related to the for loop over registered viewers. Not sure why the return 1 isn't working though.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `PIL/ImageShow.py`
Content:
```
1 #
2 # The Python Imaging Library.
3 # $Id$
4 #
5 # im.show() drivers
6 #
7 # History:
8 # 2008-04-06 fl Created
9 #
10 # Copyright (c) Secret Labs AB 2008.
11 #
12 # See the README file for information on usage and redistribution.
13 #
14
15 from __future__ import print_function
16
17 from PIL import Image
18 import os, sys
19
20 _viewers = []
21
22 def register(viewer, order=1):
23 try:
24 if issubclass(viewer, Viewer):
25 viewer = viewer()
26 except TypeError:
27 pass # raised if viewer wasn't a class
28 if order > 0:
29 _viewers.append(viewer)
30 elif order < 0:
31 _viewers.insert(0, viewer)
32
33 ##
34 # Displays a given image.
35 #
36 # @param image An image object.
37 # @param title Optional title. Not all viewers can display the title.
38 # @param **options Additional viewer options.
39 # @return True if a suitable viewer was found, false otherwise.
40
41 def show(image, title=None, **options):
42 for viewer in _viewers:
43 if viewer.show(image, title=title, **options):
44 return 1
45 return 0
46
47 ##
48 # Base class for viewers.
49
50 class Viewer:
51
52 # main api
53
54 def show(self, image, **options):
55
56 # save temporary image to disk
57 if image.mode[:4] == "I;16":
58 # @PIL88 @PIL101
59 # "I;16" isn't an 'official' mode, but we still want to
60 # provide a simple way to show 16-bit images.
61 base = "L"
62 # FIXME: auto-contrast if max() > 255?
63 else:
64 base = Image.getmodebase(image.mode)
65 if base != image.mode and image.mode != "1":
66 image = image.convert(base)
67
68 self.show_image(image, **options)
69
70 # hook methods
71
72 format = None
73
74 def get_format(self, image):
75 # return format name, or None to save as PGM/PPM
76 return self.format
77
78 def get_command(self, file, **options):
79 raise NotImplementedError
80
81 def save_image(self, image):
82 # save to temporary file, and return filename
83 return image._dump(format=self.get_format(image))
84
85 def show_image(self, image, **options):
86 # display given image
87 return self.show_file(self.save_image(image), **options)
88
89 def show_file(self, file, **options):
90 # display given file
91 os.system(self.get_command(file, **options))
92 return 1
93
94 # --------------------------------------------------------------------
95
96 if sys.platform == "win32":
97
98 class WindowsViewer(Viewer):
99 format = "BMP"
100 def get_command(self, file, **options):
101 return ("start /wait %s && ping -n 2 127.0.0.1 >NUL "
102 "&& del /f %s" % (file, file))
103
104 register(WindowsViewer)
105
106 elif sys.platform == "darwin":
107
108 class MacViewer(Viewer):
109 format = "BMP"
110 def get_command(self, file, **options):
111 # on darwin open returns immediately resulting in the temp
112 # file removal while app is opening
113 command = "open -a /Applications/Preview.app"
114 command = "(%s %s; sleep 20; rm -f %s)&" % (command, file, file)
115 return command
116
117 register(MacViewer)
118
119 else:
120
121 # unixoids
122
123 def which(executable):
124 path = os.environ.get("PATH")
125 if not path:
126 return None
127 for dirname in path.split(os.pathsep):
128 filename = os.path.join(dirname, executable)
129 if os.path.isfile(filename):
130 # FIXME: make sure it's executable
131 return filename
132 return None
133
134 class UnixViewer(Viewer):
135 def show_file(self, file, **options):
136 command, executable = self.get_command_ex(file, **options)
137 command = "(%s %s; rm -f %s)&" % (command, file, file)
138 os.system(command)
139 return 1
140
141 # implementations
142
143 class DisplayViewer(UnixViewer):
144 def get_command_ex(self, file, **options):
145 command = executable = "display"
146 return command, executable
147
148 if which("display"):
149 register(DisplayViewer)
150
151 class XVViewer(UnixViewer):
152 def get_command_ex(self, file, title=None, **options):
153 # note: xv is pretty outdated. most modern systems have
154 # imagemagick's display command instead.
155 command = executable = "xv"
156 if title:
157 # FIXME: do full escaping
158 command = command + " -name \"%s\"" % title
159 return command, executable
160
161 if which("xv"):
162 register(XVViewer)
163
164 if __name__ == "__main__":
165 # usage: python ImageShow.py imagefile [title]
166 print(show(Image.open(sys.argv[1]), *sys.argv[2:]))
167
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/PIL/ImageShow.py b/PIL/ImageShow.py
--- a/PIL/ImageShow.py
+++ b/PIL/ImageShow.py
@@ -65,7 +65,7 @@
if base != image.mode and image.mode != "1":
image = image.convert(base)
- self.show_image(image, **options)
+ return self.show_image(image, **options)
# hook methods
|
{"golden_diff": "diff --git a/PIL/ImageShow.py b/PIL/ImageShow.py\n--- a/PIL/ImageShow.py\n+++ b/PIL/ImageShow.py\n@@ -65,7 +65,7 @@\n if base != image.mode and image.mode != \"1\":\n image = image.convert(base)\n \n- self.show_image(image, **options)\n+ return self.show_image(image, **options)\n \n # hook methods\n", "issue": "Image opened twice if imagemagick and xv are installed\nIf ImageMagick and xv are both installed and you call Image.show() it will open the image twice, once with display and once with xv. This is probably related to the for loop over registered viewers. Not sure why the return 1 isn't working though.\n\n", "before_files": [{"content": "#\n# The Python Imaging Library.\n# $Id$\n#\n# im.show() drivers\n#\n# History:\n# 2008-04-06 fl Created\n#\n# Copyright (c) Secret Labs AB 2008.\n#\n# See the README file for information on usage and redistribution.\n#\n\nfrom __future__ import print_function\n\nfrom PIL import Image\nimport os, sys\n\n_viewers = []\n\ndef register(viewer, order=1):\n try:\n if issubclass(viewer, Viewer):\n viewer = viewer()\n except TypeError:\n pass # raised if viewer wasn't a class\n if order > 0:\n _viewers.append(viewer)\n elif order < 0:\n _viewers.insert(0, viewer)\n\n##\n# Displays a given image.\n#\n# @param image An image object.\n# @param title Optional title. Not all viewers can display the title.\n# @param **options Additional viewer options.\n# @return True if a suitable viewer was found, false otherwise.\n\ndef show(image, title=None, **options):\n for viewer in _viewers:\n if viewer.show(image, title=title, **options):\n return 1\n return 0\n\n##\n# Base class for viewers.\n\nclass Viewer:\n\n # main api\n\n def show(self, image, **options):\n\n # save temporary image to disk\n if image.mode[:4] == \"I;16\":\n # @PIL88 @PIL101\n # \"I;16\" isn't an 'official' mode, but we still want to\n # provide a simple way to show 16-bit images.\n base = \"L\"\n # FIXME: auto-contrast if max() > 255?\n else:\n base = Image.getmodebase(image.mode)\n if base != image.mode and image.mode != \"1\":\n image = image.convert(base)\n\n self.show_image(image, **options)\n\n # hook methods\n\n format = None\n\n def get_format(self, image):\n # return format name, or None to save as PGM/PPM\n return self.format\n\n def get_command(self, file, **options):\n raise NotImplementedError\n\n def save_image(self, image):\n # save to temporary file, and return filename\n return image._dump(format=self.get_format(image))\n\n def show_image(self, image, **options):\n # display given image\n return self.show_file(self.save_image(image), **options)\n\n def show_file(self, file, **options):\n # display given file\n os.system(self.get_command(file, **options))\n return 1\n\n# --------------------------------------------------------------------\n\nif sys.platform == \"win32\":\n\n class WindowsViewer(Viewer):\n format = \"BMP\"\n def get_command(self, file, **options):\n return (\"start /wait %s && ping -n 2 127.0.0.1 >NUL \"\n \"&& del /f %s\" % (file, file))\n\n register(WindowsViewer)\n\nelif sys.platform == \"darwin\":\n\n class MacViewer(Viewer):\n format = \"BMP\"\n def get_command(self, file, **options):\n # on darwin open returns immediately resulting in the temp\n # file removal while app is opening\n command = \"open -a /Applications/Preview.app\"\n command = \"(%s %s; sleep 20; rm -f %s)&\" % (command, file, file)\n return command\n\n register(MacViewer)\n\nelse:\n\n # unixoids\n\n def which(executable):\n path = os.environ.get(\"PATH\")\n if not path:\n return None\n for dirname in path.split(os.pathsep):\n filename = os.path.join(dirname, executable)\n if os.path.isfile(filename):\n # FIXME: make sure it's executable\n return filename\n return None\n\n class UnixViewer(Viewer):\n def show_file(self, file, **options):\n command, executable = self.get_command_ex(file, **options)\n command = \"(%s %s; rm -f %s)&\" % (command, file, file)\n os.system(command)\n return 1\n\n # implementations\n\n class DisplayViewer(UnixViewer):\n def get_command_ex(self, file, **options):\n command = executable = \"display\"\n return command, executable\n\n if which(\"display\"):\n register(DisplayViewer)\n\n class XVViewer(UnixViewer):\n def get_command_ex(self, file, title=None, **options):\n # note: xv is pretty outdated. most modern systems have\n # imagemagick's display command instead.\n command = executable = \"xv\"\n if title:\n # FIXME: do full escaping\n command = command + \" -name \\\"%s\\\"\" % title\n return command, executable\n\n if which(\"xv\"):\n register(XVViewer)\n\nif __name__ == \"__main__\":\n # usage: python ImageShow.py imagefile [title]\n print(show(Image.open(sys.argv[1]), *sys.argv[2:]))\n", "path": "PIL/ImageShow.py"}], "after_files": [{"content": "#\n# The Python Imaging Library.\n# $Id$\n#\n# im.show() drivers\n#\n# History:\n# 2008-04-06 fl Created\n#\n# Copyright (c) Secret Labs AB 2008.\n#\n# See the README file for information on usage and redistribution.\n#\n\nfrom __future__ import print_function\n\nfrom PIL import Image\nimport os, sys\n\n_viewers = []\n\ndef register(viewer, order=1):\n try:\n if issubclass(viewer, Viewer):\n viewer = viewer()\n except TypeError:\n pass # raised if viewer wasn't a class\n if order > 0:\n _viewers.append(viewer)\n elif order < 0:\n _viewers.insert(0, viewer)\n\n##\n# Displays a given image.\n#\n# @param image An image object.\n# @param title Optional title. Not all viewers can display the title.\n# @param **options Additional viewer options.\n# @return True if a suitable viewer was found, false otherwise.\n\ndef show(image, title=None, **options):\n for viewer in _viewers:\n if viewer.show(image, title=title, **options):\n return 1\n return 0\n\n##\n# Base class for viewers.\n\nclass Viewer:\n\n # main api\n\n def show(self, image, **options):\n\n # save temporary image to disk\n if image.mode[:4] == \"I;16\":\n # @PIL88 @PIL101\n # \"I;16\" isn't an 'official' mode, but we still want to\n # provide a simple way to show 16-bit images.\n base = \"L\"\n # FIXME: auto-contrast if max() > 255?\n else:\n base = Image.getmodebase(image.mode)\n if base != image.mode and image.mode != \"1\":\n image = image.convert(base)\n\n return self.show_image(image, **options)\n\n # hook methods\n\n format = None\n\n def get_format(self, image):\n # return format name, or None to save as PGM/PPM\n return self.format\n\n def get_command(self, file, **options):\n raise NotImplementedError\n\n def save_image(self, image):\n # save to temporary file, and return filename\n return image._dump(format=self.get_format(image))\n\n def show_image(self, image, **options):\n # display given image\n return self.show_file(self.save_image(image), **options)\n\n def show_file(self, file, **options):\n # display given file\n os.system(self.get_command(file, **options))\n return 1\n\n# --------------------------------------------------------------------\n\nif sys.platform == \"win32\":\n\n class WindowsViewer(Viewer):\n format = \"BMP\"\n def get_command(self, file, **options):\n return (\"start /wait %s && ping -n 2 127.0.0.1 >NUL \"\n \"&& del /f %s\" % (file, file))\n\n register(WindowsViewer)\n\nelif sys.platform == \"darwin\":\n\n class MacViewer(Viewer):\n format = \"BMP\"\n def get_command(self, file, **options):\n # on darwin open returns immediately resulting in the temp\n # file removal while app is opening\n command = \"open -a /Applications/Preview.app\"\n command = \"(%s %s; sleep 20; rm -f %s)&\" % (command, file, file)\n return command\n\n register(MacViewer)\n\nelse:\n\n # unixoids\n\n def which(executable):\n path = os.environ.get(\"PATH\")\n if not path:\n return None\n for dirname in path.split(os.pathsep):\n filename = os.path.join(dirname, executable)\n if os.path.isfile(filename):\n # FIXME: make sure it's executable\n return filename\n return None\n\n class UnixViewer(Viewer):\n def show_file(self, file, **options):\n command, executable = self.get_command_ex(file, **options)\n command = \"(%s %s; rm -f %s)&\" % (command, file, file)\n os.system(command)\n return 1\n\n # implementations\n\n class DisplayViewer(UnixViewer):\n def get_command_ex(self, file, **options):\n command = executable = \"display\"\n return command, executable\n\n if which(\"display\"):\n register(DisplayViewer)\n\n class XVViewer(UnixViewer):\n def get_command_ex(self, file, title=None, **options):\n # note: xv is pretty outdated. most modern systems have\n # imagemagick's display command instead.\n command = executable = \"xv\"\n if title:\n # FIXME: do full escaping\n command = command + \" -name \\\"%s\\\"\" % title\n return command, executable\n\n if which(\"xv\"):\n register(XVViewer)\n\nif __name__ == \"__main__\":\n # usage: python ImageShow.py imagefile [title]\n print(show(Image.open(sys.argv[1]), *sys.argv[2:]))\n", "path": "PIL/ImageShow.py"}]}
| 1,844 | 92 |
gh_patches_debug_20118
|
rasdani/github-patches
|
git_diff
|
google__turbinia-321
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
turbinia_job_graph.py doesn't support new job manager
turbinia_job_graph.py needs to be updated to support the new job manager (from #257).
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `tools/turbinia_job_graph.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 # Copyright 2018 Google Inc.
3 #
4 # Licensed under the Apache License, Version 2.0 (the "License");
5 # you may not use this file except in compliance with the License.
6 # You may obtain a copy of the License at
7 #
8 # http://www.apache.org/licenses/LICENSE-2.0
9 #
10 # Unless required by applicable law or agreed to in writing, software
11 # distributed under the License is distributed on an "AS IS" BASIS,
12 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
13 # See the License for the specific language governing permissions and
14 # limitations under the License.
15 """Graph to visualise job/evidence relationships."""
16
17 from __future__ import unicode_literals
18
19 import argparse
20 import graphviz
21
22 from turbinia.jobs import get_jobs as turbinia_jobs
23
24
25 def create_graph():
26 """Create graph of relationships between Turbinia jobs and evidence.
27
28 Returns:
29 Instance of graphviz.dot.Digraph
30 """
31 dot = graphviz.Digraph(comment='Turbinia Evidence graph', format='png')
32 for job in turbinia_jobs():
33 dot.node(job.name)
34 for evidence in job.evidence_input:
35 dot.node(evidence.__name__, shape='box')
36 dot.edge(evidence.__name__, job.name)
37
38 for evidence in job.evidence_output:
39 dot.node(evidence.__name__, shape='box')
40 dot.edge(job.name, evidence.__name__)
41 return dot
42
43
44 if __name__ == '__main__':
45 parser = argparse.ArgumentParser(
46 description='Create Turbinia evidence graph.')
47 parser.add_argument('filename', type=unicode, help='where to save the file')
48 args = parser.parse_args()
49
50 graph = create_graph()
51 output_file = args.filename.replace('.png', '')
52
53 try:
54 rendered_graph = graph.render(filename=output_file, cleanup=True)
55 print('Graph generated and saved to: {0}'.format(rendered_graph))
56 except graphviz.ExecutableNotFound:
57 print('Graphviz is not installed - Run: apt-get install graphviz')
58
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/tools/turbinia_job_graph.py b/tools/turbinia_job_graph.py
--- a/tools/turbinia_job_graph.py
+++ b/tools/turbinia_job_graph.py
@@ -19,7 +19,7 @@
import argparse
import graphviz
-from turbinia.jobs import get_jobs as turbinia_jobs
+from turbinia.jobs import manager as jobs_manager
def create_graph():
@@ -29,15 +29,15 @@
Instance of graphviz.dot.Digraph
"""
dot = graphviz.Digraph(comment='Turbinia Evidence graph', format='png')
- for job in turbinia_jobs():
- dot.node(job.name)
+ for _, job in jobs_manager.JobsManager.GetJobs():
+ dot.node(job.NAME)
for evidence in job.evidence_input:
dot.node(evidence.__name__, shape='box')
- dot.edge(evidence.__name__, job.name)
+ dot.edge(evidence.__name__, job.NAME)
for evidence in job.evidence_output:
dot.node(evidence.__name__, shape='box')
- dot.edge(job.name, evidence.__name__)
+ dot.edge(job.NAME, evidence.__name__)
return dot
|
{"golden_diff": "diff --git a/tools/turbinia_job_graph.py b/tools/turbinia_job_graph.py\n--- a/tools/turbinia_job_graph.py\n+++ b/tools/turbinia_job_graph.py\n@@ -19,7 +19,7 @@\n import argparse\n import graphviz\n \n-from turbinia.jobs import get_jobs as turbinia_jobs\n+from turbinia.jobs import manager as jobs_manager\n \n \n def create_graph():\n@@ -29,15 +29,15 @@\n Instance of graphviz.dot.Digraph\n \"\"\"\n dot = graphviz.Digraph(comment='Turbinia Evidence graph', format='png')\n- for job in turbinia_jobs():\n- dot.node(job.name)\n+ for _, job in jobs_manager.JobsManager.GetJobs():\n+ dot.node(job.NAME)\n for evidence in job.evidence_input:\n dot.node(evidence.__name__, shape='box')\n- dot.edge(evidence.__name__, job.name)\n+ dot.edge(evidence.__name__, job.NAME)\n \n for evidence in job.evidence_output:\n dot.node(evidence.__name__, shape='box')\n- dot.edge(job.name, evidence.__name__)\n+ dot.edge(job.NAME, evidence.__name__)\n return dot\n", "issue": "turbinia_job_graph.py doesn't support new job manager\nturbinia_job_graph.py needs to be updated to support the new job manager (from #257).\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n# Copyright 2018 Google Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Graph to visualise job/evidence relationships.\"\"\"\n\nfrom __future__ import unicode_literals\n\nimport argparse\nimport graphviz\n\nfrom turbinia.jobs import get_jobs as turbinia_jobs\n\n\ndef create_graph():\n \"\"\"Create graph of relationships between Turbinia jobs and evidence.\n\n Returns:\n Instance of graphviz.dot.Digraph\n \"\"\"\n dot = graphviz.Digraph(comment='Turbinia Evidence graph', format='png')\n for job in turbinia_jobs():\n dot.node(job.name)\n for evidence in job.evidence_input:\n dot.node(evidence.__name__, shape='box')\n dot.edge(evidence.__name__, job.name)\n\n for evidence in job.evidence_output:\n dot.node(evidence.__name__, shape='box')\n dot.edge(job.name, evidence.__name__)\n return dot\n\n\nif __name__ == '__main__':\n parser = argparse.ArgumentParser(\n description='Create Turbinia evidence graph.')\n parser.add_argument('filename', type=unicode, help='where to save the file')\n args = parser.parse_args()\n\n graph = create_graph()\n output_file = args.filename.replace('.png', '')\n\n try:\n rendered_graph = graph.render(filename=output_file, cleanup=True)\n print('Graph generated and saved to: {0}'.format(rendered_graph))\n except graphviz.ExecutableNotFound:\n print('Graphviz is not installed - Run: apt-get install graphviz')\n", "path": "tools/turbinia_job_graph.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n# Copyright 2018 Google Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Graph to visualise job/evidence relationships.\"\"\"\n\nfrom __future__ import unicode_literals\n\nimport argparse\nimport graphviz\n\nfrom turbinia.jobs import manager as jobs_manager\n\n\ndef create_graph():\n \"\"\"Create graph of relationships between Turbinia jobs and evidence.\n\n Returns:\n Instance of graphviz.dot.Digraph\n \"\"\"\n dot = graphviz.Digraph(comment='Turbinia Evidence graph', format='png')\n for _, job in jobs_manager.JobsManager.GetJobs():\n dot.node(job.NAME)\n for evidence in job.evidence_input:\n dot.node(evidence.__name__, shape='box')\n dot.edge(evidence.__name__, job.NAME)\n\n for evidence in job.evidence_output:\n dot.node(evidence.__name__, shape='box')\n dot.edge(job.NAME, evidence.__name__)\n return dot\n\n\nif __name__ == '__main__':\n parser = argparse.ArgumentParser(\n description='Create Turbinia evidence graph.')\n parser.add_argument('filename', type=unicode, help='where to save the file')\n args = parser.parse_args()\n\n graph = create_graph()\n output_file = args.filename.replace('.png', '')\n\n try:\n rendered_graph = graph.render(filename=output_file, cleanup=True)\n print('Graph generated and saved to: {0}'.format(rendered_graph))\n except graphviz.ExecutableNotFound:\n print('Graphviz is not installed - Run: apt-get install graphviz')\n", "path": "tools/turbinia_job_graph.py"}]}
| 851 | 269 |
gh_patches_debug_11740
|
rasdani/github-patches
|
git_diff
|
locustio__locust-2715
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Locust insists on using IPv6 despite being in an IPv4 stack.
### Prerequisites
- [X] I am using [the latest version of Locust](https://github.com/locustio/locust/releases/)
- [X] I am reporting a bug, not asking a question
### Description
I have an environment that is IPv4-only, but there is an IPv6 address listed in /etc/hosts. When I try to connect to the master using the following command, it results in an error:
```
# locust -f ./p4_basic_code.py --worker --master-host locust-master -L DEBUG
[2024-04-25 06:24:31,138] ip-192-168-0-235.ap-northeast-3.compute.internal/DEBUG/locust.runners: Failed to connect to master locust-master:5557, retry 1/60.
```
By changing [self.socket.setsockopt(zmq.IPV6, 1)](https://github.com/locustio/locust/blob/master/locust/rpc/zmqrpc.py#L18) to 0 resolves the issue.
```
# cat /etc/hosts
127.0.0.1 localhost
192.168.0.235 ip-192-168-0-235.ap-northeast-3.compute.internal
127.255.0.1 locust-master
2600:f0f0:0:0:0:0:0:1 locust-master
```
I recommend detecting whether the current environment supports IPv6 before enabling it.
### Command line
locust -f ./p4_basic_code.py --worker --master-host locust-master -L DEBUG
### Locustfile contents
```python3
N/A
```
### Python version
3.11
### Locust version
2.26.0
### Operating system
Debian 12
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `locust/rpc/zmqrpc.py`
Content:
```
1 from locust.exception import RPCError, RPCReceiveError, RPCSendError
2 from locust.util.exception_handler import retry
3
4 import msgpack.exceptions as msgerr
5 import zmq.error as zmqerr
6 import zmq.green as zmq
7
8 from .protocol import Message
9
10
11 class BaseSocket:
12 def __init__(self, sock_type):
13 context = zmq.Context()
14 self.socket = context.socket(sock_type)
15
16 self.socket.setsockopt(zmq.TCP_KEEPALIVE, 1)
17 self.socket.setsockopt(zmq.TCP_KEEPALIVE_IDLE, 30)
18 self.socket.setsockopt(zmq.IPV6, 1)
19
20 @retry()
21 def send(self, msg):
22 try:
23 self.socket.send(msg.serialize(), zmq.NOBLOCK)
24 except zmqerr.ZMQError as e:
25 raise RPCSendError("ZMQ sent failure") from e
26
27 @retry()
28 def send_to_client(self, msg):
29 try:
30 self.socket.send_multipart([msg.node_id.encode(), msg.serialize()])
31 except zmqerr.ZMQError as e:
32 raise RPCSendError("ZMQ sent failure") from e
33
34 def recv(self):
35 try:
36 data = self.socket.recv()
37 msg = Message.unserialize(data)
38 except msgerr.ExtraData as e:
39 raise RPCReceiveError("ZMQ interrupted message") from e
40 except zmqerr.ZMQError as e:
41 raise RPCError("ZMQ network broken") from e
42 return msg
43
44 def recv_from_client(self):
45 try:
46 data = self.socket.recv_multipart()
47 addr = data[0].decode()
48 except UnicodeDecodeError as e:
49 raise RPCReceiveError("ZMQ interrupted or corrupted message") from e
50 except zmqerr.ZMQError as e:
51 raise RPCError("ZMQ network broken") from e
52 try:
53 msg = Message.unserialize(data[1])
54 except (UnicodeDecodeError, msgerr.ExtraData) as e:
55 raise RPCReceiveError("ZMQ interrupted or corrupted message", addr=addr) from e
56 return addr, msg
57
58 def close(self, linger=None):
59 self.socket.close(linger=linger)
60
61
62 class Server(BaseSocket):
63 def __init__(self, host, port):
64 BaseSocket.__init__(self, zmq.ROUTER)
65 if port == 0:
66 self.port = self.socket.bind_to_random_port(f"tcp://{host}")
67 else:
68 try:
69 self.socket.bind("tcp://%s:%i" % (host, port))
70 self.port = port
71 except zmqerr.ZMQError as e:
72 raise RPCError(f"Socket bind failure: {e}")
73
74
75 class Client(BaseSocket):
76 def __init__(self, host, port, identity):
77 BaseSocket.__init__(self, zmq.DEALER)
78 self.socket.setsockopt(zmq.IDENTITY, identity.encode())
79 self.socket.connect("tcp://%s:%i" % (host, port))
80
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/locust/rpc/zmqrpc.py b/locust/rpc/zmqrpc.py
--- a/locust/rpc/zmqrpc.py
+++ b/locust/rpc/zmqrpc.py
@@ -4,6 +4,7 @@
import msgpack.exceptions as msgerr
import zmq.error as zmqerr
import zmq.green as zmq
+from urllib3.util.connection import HAS_IPV6
from .protocol import Message
@@ -15,7 +16,8 @@
self.socket.setsockopt(zmq.TCP_KEEPALIVE, 1)
self.socket.setsockopt(zmq.TCP_KEEPALIVE_IDLE, 30)
- self.socket.setsockopt(zmq.IPV6, 1)
+ if HAS_IPV6:
+ self.socket.setsockopt(zmq.IPV6, 1)
@retry()
def send(self, msg):
|
{"golden_diff": "diff --git a/locust/rpc/zmqrpc.py b/locust/rpc/zmqrpc.py\n--- a/locust/rpc/zmqrpc.py\n+++ b/locust/rpc/zmqrpc.py\n@@ -4,6 +4,7 @@\n import msgpack.exceptions as msgerr\n import zmq.error as zmqerr\n import zmq.green as zmq\n+from urllib3.util.connection import HAS_IPV6\n \n from .protocol import Message\n \n@@ -15,7 +16,8 @@\n \n self.socket.setsockopt(zmq.TCP_KEEPALIVE, 1)\n self.socket.setsockopt(zmq.TCP_KEEPALIVE_IDLE, 30)\n- self.socket.setsockopt(zmq.IPV6, 1)\n+ if HAS_IPV6:\n+ self.socket.setsockopt(zmq.IPV6, 1)\n \n @retry()\n def send(self, msg):\n", "issue": "Locust insists on using IPv6 despite being in an IPv4 stack.\n### Prerequisites\r\n\r\n- [X] I am using [the latest version of Locust](https://github.com/locustio/locust/releases/)\r\n- [X] I am reporting a bug, not asking a question\r\n\r\n### Description\r\n\r\nI have an environment that is IPv4-only, but there is an IPv6 address listed in /etc/hosts. When I try to connect to the master using the following command, it results in an error:\r\n\r\n```\r\n# locust -f ./p4_basic_code.py --worker --master-host locust-master -L DEBUG\r\n[2024-04-25 06:24:31,138] ip-192-168-0-235.ap-northeast-3.compute.internal/DEBUG/locust.runners: Failed to connect to master locust-master:5557, retry 1/60.\r\n```\r\n\r\nBy changing [self.socket.setsockopt(zmq.IPV6, 1)](https://github.com/locustio/locust/blob/master/locust/rpc/zmqrpc.py#L18) to 0 resolves the issue.\r\n```\r\n# cat /etc/hosts\r\n127.0.0.1 localhost\r\n192.168.0.235 ip-192-168-0-235.ap-northeast-3.compute.internal\r\n127.255.0.1 locust-master\r\n2600:f0f0:0:0:0:0:0:1 locust-master\r\n```\r\n\r\nI recommend detecting whether the current environment supports IPv6 before enabling it.\r\n\r\n### Command line\r\n\r\nlocust -f ./p4_basic_code.py --worker --master-host locust-master -L DEBUG\r\n\r\n### Locustfile contents\r\n\r\n```python3\r\nN/A\r\n```\r\n\r\n\r\n### Python version\r\n\r\n3.11\r\n\r\n### Locust version\r\n\r\n2.26.0\r\n\r\n### Operating system\r\n\r\nDebian 12\n", "before_files": [{"content": "from locust.exception import RPCError, RPCReceiveError, RPCSendError\nfrom locust.util.exception_handler import retry\n\nimport msgpack.exceptions as msgerr\nimport zmq.error as zmqerr\nimport zmq.green as zmq\n\nfrom .protocol import Message\n\n\nclass BaseSocket:\n def __init__(self, sock_type):\n context = zmq.Context()\n self.socket = context.socket(sock_type)\n\n self.socket.setsockopt(zmq.TCP_KEEPALIVE, 1)\n self.socket.setsockopt(zmq.TCP_KEEPALIVE_IDLE, 30)\n self.socket.setsockopt(zmq.IPV6, 1)\n\n @retry()\n def send(self, msg):\n try:\n self.socket.send(msg.serialize(), zmq.NOBLOCK)\n except zmqerr.ZMQError as e:\n raise RPCSendError(\"ZMQ sent failure\") from e\n\n @retry()\n def send_to_client(self, msg):\n try:\n self.socket.send_multipart([msg.node_id.encode(), msg.serialize()])\n except zmqerr.ZMQError as e:\n raise RPCSendError(\"ZMQ sent failure\") from e\n\n def recv(self):\n try:\n data = self.socket.recv()\n msg = Message.unserialize(data)\n except msgerr.ExtraData as e:\n raise RPCReceiveError(\"ZMQ interrupted message\") from e\n except zmqerr.ZMQError as e:\n raise RPCError(\"ZMQ network broken\") from e\n return msg\n\n def recv_from_client(self):\n try:\n data = self.socket.recv_multipart()\n addr = data[0].decode()\n except UnicodeDecodeError as e:\n raise RPCReceiveError(\"ZMQ interrupted or corrupted message\") from e\n except zmqerr.ZMQError as e:\n raise RPCError(\"ZMQ network broken\") from e\n try:\n msg = Message.unserialize(data[1])\n except (UnicodeDecodeError, msgerr.ExtraData) as e:\n raise RPCReceiveError(\"ZMQ interrupted or corrupted message\", addr=addr) from e\n return addr, msg\n\n def close(self, linger=None):\n self.socket.close(linger=linger)\n\n\nclass Server(BaseSocket):\n def __init__(self, host, port):\n BaseSocket.__init__(self, zmq.ROUTER)\n if port == 0:\n self.port = self.socket.bind_to_random_port(f\"tcp://{host}\")\n else:\n try:\n self.socket.bind(\"tcp://%s:%i\" % (host, port))\n self.port = port\n except zmqerr.ZMQError as e:\n raise RPCError(f\"Socket bind failure: {e}\")\n\n\nclass Client(BaseSocket):\n def __init__(self, host, port, identity):\n BaseSocket.__init__(self, zmq.DEALER)\n self.socket.setsockopt(zmq.IDENTITY, identity.encode())\n self.socket.connect(\"tcp://%s:%i\" % (host, port))\n", "path": "locust/rpc/zmqrpc.py"}], "after_files": [{"content": "from locust.exception import RPCError, RPCReceiveError, RPCSendError\nfrom locust.util.exception_handler import retry\n\nimport msgpack.exceptions as msgerr\nimport zmq.error as zmqerr\nimport zmq.green as zmq\nfrom urllib3.util.connection import HAS_IPV6\n\nfrom .protocol import Message\n\n\nclass BaseSocket:\n def __init__(self, sock_type):\n context = zmq.Context()\n self.socket = context.socket(sock_type)\n\n self.socket.setsockopt(zmq.TCP_KEEPALIVE, 1)\n self.socket.setsockopt(zmq.TCP_KEEPALIVE_IDLE, 30)\n if HAS_IPV6:\n self.socket.setsockopt(zmq.IPV6, 1)\n\n @retry()\n def send(self, msg):\n try:\n self.socket.send(msg.serialize(), zmq.NOBLOCK)\n except zmqerr.ZMQError as e:\n raise RPCSendError(\"ZMQ sent failure\") from e\n\n @retry()\n def send_to_client(self, msg):\n try:\n self.socket.send_multipart([msg.node_id.encode(), msg.serialize()])\n except zmqerr.ZMQError as e:\n raise RPCSendError(\"ZMQ sent failure\") from e\n\n def recv(self):\n try:\n data = self.socket.recv()\n msg = Message.unserialize(data)\n except msgerr.ExtraData as e:\n raise RPCReceiveError(\"ZMQ interrupted message\") from e\n except zmqerr.ZMQError as e:\n raise RPCError(\"ZMQ network broken\") from e\n return msg\n\n def recv_from_client(self):\n try:\n data = self.socket.recv_multipart()\n addr = data[0].decode()\n except UnicodeDecodeError as e:\n raise RPCReceiveError(\"ZMQ interrupted or corrupted message\") from e\n except zmqerr.ZMQError as e:\n raise RPCError(\"ZMQ network broken\") from e\n try:\n msg = Message.unserialize(data[1])\n except (UnicodeDecodeError, msgerr.ExtraData) as e:\n raise RPCReceiveError(\"ZMQ interrupted or corrupted message\", addr=addr) from e\n return addr, msg\n\n def close(self, linger=None):\n self.socket.close(linger=linger)\n\n\nclass Server(BaseSocket):\n def __init__(self, host, port):\n BaseSocket.__init__(self, zmq.ROUTER)\n if port == 0:\n self.port = self.socket.bind_to_random_port(f\"tcp://{host}\")\n else:\n try:\n self.socket.bind(\"tcp://%s:%i\" % (host, port))\n self.port = port\n except zmqerr.ZMQError as e:\n raise RPCError(f\"Socket bind failure: {e}\")\n\n\nclass Client(BaseSocket):\n def __init__(self, host, port, identity):\n BaseSocket.__init__(self, zmq.DEALER)\n self.socket.setsockopt(zmq.IDENTITY, identity.encode())\n self.socket.connect(\"tcp://%s:%i\" % (host, port))\n", "path": "locust/rpc/zmqrpc.py"}]}
| 1,494 | 196 |
gh_patches_debug_25123
|
rasdani/github-patches
|
git_diff
|
sql-machine-learning__elasticdl-1044
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Move _pos_int and _non_neg_int to python/common/args.py
`_pos_int` and `_non_neg_int` should be common functions
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `elasticdl/python/master/args.py`
Content:
```
1 import argparse
2
3 from elasticdl.python.common.args import add_common_params, add_train_params
4
5
6 def _pos_int(arg):
7 res = int(arg)
8 if res <= 0:
9 raise ValueError("Positive integer argument required. Got %s" % res)
10 return res
11
12
13 def _non_neg_int(arg):
14 res = int(arg)
15 if res < 0:
16 raise ValueError(
17 "Non-negative integer argument required. Get %s" % res
18 )
19 return res
20
21
22 def parse_args():
23 parser = argparse.ArgumentParser(description="ElasticDL Master")
24 parser.add_argument(
25 "--port",
26 default=50001,
27 type=_pos_int,
28 help="The listening port of master",
29 )
30 parser.add_argument(
31 "--worker_image", help="Docker image for workers", default=None
32 )
33 parser.add_argument(
34 "--worker_pod_priority", help="Priority requested by workers"
35 )
36 parser.add_argument(
37 "--prediction_data_dir",
38 help="Prediction data directory. Files should be in RecordIO format",
39 default="",
40 )
41 parser.add_argument(
42 "--log_level",
43 choices=["DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL"],
44 type=str.upper,
45 default="INFO",
46 help="The logging level. Default to WARNING",
47 )
48 add_common_params(parser)
49 add_train_params(parser)
50
51 args = parser.parse_args()
52
53 if all(
54 v == "" or v is None
55 for v in [
56 args.training_data_dir,
57 args.evaluation_data_dir,
58 args.prediction_data_dir,
59 ]
60 ):
61 raise ValueError(
62 "At least one of the data directories needs to be provided"
63 )
64
65 if args.prediction_data_dir and (
66 args.training_data_dir or args.evaluation_data_dir
67 ):
68 raise ValueError(
69 "Running prediction together with training or evaluation "
70 "is not supported"
71 )
72 if args.prediction_data_dir and not args.checkpoint_filename_for_init:
73 raise ValueError(
74 "checkpoint_filename_for_init is required for running "
75 "prediction job"
76 )
77
78 return args
79
```
Path: `elasticdl/python/common/args.py`
Content:
```
1 def add_common_params(parser):
2 parser.add_argument(
3 "--model_zoo",
4 help="The directory that contains user-defined model files "
5 "or a specific model file",
6 required=True,
7 )
8 parser.add_argument(
9 "--docker_image_prefix",
10 default="",
11 help="The prefix for generated Docker images, if set, the image is "
12 "also pushed to the registry",
13 )
14 parser.add_argument("--image_base", help="Base Docker image.")
15 parser.add_argument("--job_name", help="ElasticDL job name", required=True)
16 parser.add_argument(
17 "--master_resource_request",
18 default="cpu=0.1,memory=1024Mi",
19 type=str,
20 help="The minimal resource required by master, "
21 "e.g. cpu=0.1,memory=1024Mi,disk=1024Mi,gpu=1",
22 )
23 parser.add_argument(
24 "--master_resource_limit",
25 type=str,
26 help="The maximal resource required by master, "
27 "e.g. cpu=0.1,memory=1024Mi,disk=1024Mi,gpu=1, "
28 "default to master_resource_request",
29 )
30 parser.add_argument(
31 "--num_workers", type=int, help="Number of workers", default=0
32 )
33 parser.add_argument(
34 "--worker_resource_request",
35 default="cpu=1,memory=4096Mi",
36 type=str,
37 help="The minimal resource required by worker, "
38 "e.g. cpu=1,memory=1024Mi,disk=1024Mi,gpu=1",
39 )
40 parser.add_argument(
41 "--worker_resource_limit",
42 type=str,
43 help="The maximal resource required by worker, "
44 "e.g. cpu=1,memory=1024Mi,disk=1024Mi,gpu=1,"
45 "default to worker_resource_request",
46 )
47 parser.add_argument(
48 "--master_pod_priority", help="The requested priority of master pod"
49 )
50 parser.add_argument(
51 "--volume",
52 help="The Kubernetes volume information, "
53 'e.g. "claim_name=c1,mount_path=/path1".',
54 )
55 parser.add_argument(
56 "--image_pull_policy",
57 default="Always",
58 help="The image pull policy of master and worker",
59 )
60 parser.add_argument(
61 "--restart_policy",
62 default="Never",
63 help="The pod restart policy when pod crashed",
64 )
65 parser.add_argument(
66 "--extra_pypi_index", help="The extra python package repository"
67 )
68 parser.add_argument(
69 "--namespace",
70 default="default",
71 type=str,
72 help="The name of the Kubernetes namespace where ElasticDL "
73 "pods will be created",
74 )
75 parser.add_argument("--records_per_task", type=int, required=True)
76 parser.add_argument(
77 "--minibatch_size",
78 type=int,
79 help="Minibatch size used by workers",
80 required=True,
81 )
82 parser.add_argument(
83 "--dataset_fn",
84 type=str,
85 default="dataset_fn",
86 help="The name of the dataset function defined in the model file",
87 )
88 parser.add_argument(
89 "--eval_metrics_fn",
90 type=str,
91 default="eval_metrics_fn",
92 help="The name of the evaluation metrics function defined "
93 "in the model file",
94 )
95 parser.add_argument(
96 "--model_def",
97 type=str,
98 required=True,
99 help="The import path to the model definition function/class in the "
100 'model zoo, e.g. "cifar10_subclass.cifar10_subclass.CustomModel"',
101 )
102 parser.add_argument(
103 "--model_params",
104 type=str,
105 default="",
106 help="The dictionary of model parameters in a string that will be "
107 'used to instantiate the model, e.g. "param1=1,param2=2"',
108 )
109 parser.add_argument(
110 "--cluster_spec",
111 help="The file that contains user-defined cluster specification",
112 default="",
113 )
114
115
116 def add_train_params(parser):
117 parser.add_argument(
118 "--tensorboard_log_dir",
119 default="",
120 type=str,
121 help="Directory where TensorBoard will look to find "
122 "TensorFlow event files that it can display. "
123 "TensorBoard will recursively walk the directory "
124 "structure rooted at log dir, looking for .*tfevents.* "
125 "files. You may also pass a comma separated list of log "
126 "directories, and TensorBoard will watch each "
127 "directory.",
128 )
129 parser.add_argument("--num_epochs", type=int, default=1)
130 parser.add_argument(
131 "--grads_to_wait",
132 type=int,
133 help="Number of gradients to wait before updating model",
134 default=2,
135 )
136 parser.add_argument(
137 "--training_data_dir",
138 help="Training data directory. Files should be in RecordIO format",
139 default="",
140 )
141 parser.add_argument(
142 "--evaluation_data_dir",
143 help="Evaluation data directory. Files should be in RecordIO format",
144 default="",
145 )
146 parser.add_argument(
147 "--evaluation_steps",
148 type=int,
149 help="Evaluate the model every this many steps."
150 "If 0, step-based evaluation is disabled",
151 default=0,
152 )
153 parser.add_argument(
154 "--evaluation_start_delay_secs",
155 type=int,
156 help="Start time-based evaluation only after waiting for "
157 "this many seconds",
158 default=100,
159 )
160 parser.add_argument(
161 "--evaluation_throttle_secs",
162 type=int,
163 help="Do not re-evaluate unless the last evaluation was started "
164 "at least this many seconds ago."
165 "If 0, time-based evaluation is disabled",
166 default=0,
167 )
168 parser.add_argument(
169 "--checkpoint_steps",
170 type=int,
171 help="Save checkpoint every this many steps."
172 "If 0, no checkpoints to save.",
173 default=0,
174 )
175 parser.add_argument(
176 "--checkpoint_dir",
177 help="The directory to store the checkpoint files",
178 default="",
179 )
180 parser.add_argument(
181 "--keep_checkpoint_max",
182 type=int,
183 help="The maximum number of recent checkpoint files to keep."
184 "If 0, keep all.",
185 default=0,
186 )
187 parser.add_argument(
188 "--checkpoint_filename_for_init",
189 help="The checkpoint file to initialize the training model",
190 default="",
191 )
192 parser.add_argument(
193 "--loss",
194 type=str,
195 default="loss",
196 help="The name of the loss function defined in the model file",
197 )
198 parser.add_argument(
199 "--optimizer",
200 type=str,
201 default="optimizer",
202 help="The name of the optimizer defined in the model file",
203 )
204 parser.add_argument(
205 "--output",
206 type=str,
207 default="",
208 help="The path to save the final trained model",
209 )
210
211
212 def add_evaluate_params(parser):
213 parser.add_argument(
214 "--evaluation_data_dir",
215 help="Evaluation data directory. Files should be in RecordIO format",
216 required=True,
217 )
218 parser.add_argument(
219 "--checkpoint_filename_for_init",
220 help="The checkpoint file to initialize the training model",
221 required=True,
222 )
223
224
225 def add_predict_params(parser):
226 parser.add_argument(
227 "--prediction_data_dir",
228 help="Prediction data directory. Files should be in RecordIO format",
229 required=True,
230 )
231 parser.add_argument(
232 "--checkpoint_filename_for_init",
233 help="The checkpoint file to initialize the training model",
234 required=True,
235 )
236
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/elasticdl/python/common/args.py b/elasticdl/python/common/args.py
--- a/elasticdl/python/common/args.py
+++ b/elasticdl/python/common/args.py
@@ -1,3 +1,19 @@
+def pos_int(arg):
+ res = int(arg)
+ if res <= 0:
+ raise ValueError("Positive integer argument required. Got %s" % res)
+ return res
+
+
+def non_neg_int(arg):
+ res = int(arg)
+ if res < 0:
+ raise ValueError(
+ "Non-negative integer argument required. Get %s" % res
+ )
+ return res
+
+
def add_common_params(parser):
parser.add_argument(
"--model_zoo",
diff --git a/elasticdl/python/master/args.py b/elasticdl/python/master/args.py
--- a/elasticdl/python/master/args.py
+++ b/elasticdl/python/master/args.py
@@ -1,22 +1,10 @@
import argparse
-from elasticdl.python.common.args import add_common_params, add_train_params
-
-
-def _pos_int(arg):
- res = int(arg)
- if res <= 0:
- raise ValueError("Positive integer argument required. Got %s" % res)
- return res
-
-
-def _non_neg_int(arg):
- res = int(arg)
- if res < 0:
- raise ValueError(
- "Non-negative integer argument required. Get %s" % res
- )
- return res
+from elasticdl.python.common.args import (
+ add_common_params,
+ add_train_params,
+ pos_int,
+)
def parse_args():
@@ -24,7 +12,7 @@
parser.add_argument(
"--port",
default=50001,
- type=_pos_int,
+ type=pos_int,
help="The listening port of master",
)
parser.add_argument(
|
{"golden_diff": "diff --git a/elasticdl/python/common/args.py b/elasticdl/python/common/args.py\n--- a/elasticdl/python/common/args.py\n+++ b/elasticdl/python/common/args.py\n@@ -1,3 +1,19 @@\n+def pos_int(arg):\n+ res = int(arg)\n+ if res <= 0:\n+ raise ValueError(\"Positive integer argument required. Got %s\" % res)\n+ return res\n+\n+\n+def non_neg_int(arg):\n+ res = int(arg)\n+ if res < 0:\n+ raise ValueError(\n+ \"Non-negative integer argument required. Get %s\" % res\n+ )\n+ return res\n+\n+\n def add_common_params(parser):\n parser.add_argument(\n \"--model_zoo\",\ndiff --git a/elasticdl/python/master/args.py b/elasticdl/python/master/args.py\n--- a/elasticdl/python/master/args.py\n+++ b/elasticdl/python/master/args.py\n@@ -1,22 +1,10 @@\n import argparse\n \n-from elasticdl.python.common.args import add_common_params, add_train_params\n-\n-\n-def _pos_int(arg):\n- res = int(arg)\n- if res <= 0:\n- raise ValueError(\"Positive integer argument required. Got %s\" % res)\n- return res\n-\n-\n-def _non_neg_int(arg):\n- res = int(arg)\n- if res < 0:\n- raise ValueError(\n- \"Non-negative integer argument required. Get %s\" % res\n- )\n- return res\n+from elasticdl.python.common.args import (\n+ add_common_params,\n+ add_train_params,\n+ pos_int,\n+)\n \n \n def parse_args():\n@@ -24,7 +12,7 @@\n parser.add_argument(\n \"--port\",\n default=50001,\n- type=_pos_int,\n+ type=pos_int,\n help=\"The listening port of master\",\n )\n parser.add_argument(\n", "issue": "Move _pos_int and _non_neg_int to python/common/args.py\n`_pos_int` and `_non_neg_int` should be common functions\n", "before_files": [{"content": "import argparse\n\nfrom elasticdl.python.common.args import add_common_params, add_train_params\n\n\ndef _pos_int(arg):\n res = int(arg)\n if res <= 0:\n raise ValueError(\"Positive integer argument required. Got %s\" % res)\n return res\n\n\ndef _non_neg_int(arg):\n res = int(arg)\n if res < 0:\n raise ValueError(\n \"Non-negative integer argument required. Get %s\" % res\n )\n return res\n\n\ndef parse_args():\n parser = argparse.ArgumentParser(description=\"ElasticDL Master\")\n parser.add_argument(\n \"--port\",\n default=50001,\n type=_pos_int,\n help=\"The listening port of master\",\n )\n parser.add_argument(\n \"--worker_image\", help=\"Docker image for workers\", default=None\n )\n parser.add_argument(\n \"--worker_pod_priority\", help=\"Priority requested by workers\"\n )\n parser.add_argument(\n \"--prediction_data_dir\",\n help=\"Prediction data directory. Files should be in RecordIO format\",\n default=\"\",\n )\n parser.add_argument(\n \"--log_level\",\n choices=[\"DEBUG\", \"INFO\", \"WARNING\", \"ERROR\", \"CRITICAL\"],\n type=str.upper,\n default=\"INFO\",\n help=\"The logging level. Default to WARNING\",\n )\n add_common_params(parser)\n add_train_params(parser)\n\n args = parser.parse_args()\n\n if all(\n v == \"\" or v is None\n for v in [\n args.training_data_dir,\n args.evaluation_data_dir,\n args.prediction_data_dir,\n ]\n ):\n raise ValueError(\n \"At least one of the data directories needs to be provided\"\n )\n\n if args.prediction_data_dir and (\n args.training_data_dir or args.evaluation_data_dir\n ):\n raise ValueError(\n \"Running prediction together with training or evaluation \"\n \"is not supported\"\n )\n if args.prediction_data_dir and not args.checkpoint_filename_for_init:\n raise ValueError(\n \"checkpoint_filename_for_init is required for running \"\n \"prediction job\"\n )\n\n return args\n", "path": "elasticdl/python/master/args.py"}, {"content": "def add_common_params(parser):\n parser.add_argument(\n \"--model_zoo\",\n help=\"The directory that contains user-defined model files \"\n \"or a specific model file\",\n required=True,\n )\n parser.add_argument(\n \"--docker_image_prefix\",\n default=\"\",\n help=\"The prefix for generated Docker images, if set, the image is \"\n \"also pushed to the registry\",\n )\n parser.add_argument(\"--image_base\", help=\"Base Docker image.\")\n parser.add_argument(\"--job_name\", help=\"ElasticDL job name\", required=True)\n parser.add_argument(\n \"--master_resource_request\",\n default=\"cpu=0.1,memory=1024Mi\",\n type=str,\n help=\"The minimal resource required by master, \"\n \"e.g. cpu=0.1,memory=1024Mi,disk=1024Mi,gpu=1\",\n )\n parser.add_argument(\n \"--master_resource_limit\",\n type=str,\n help=\"The maximal resource required by master, \"\n \"e.g. cpu=0.1,memory=1024Mi,disk=1024Mi,gpu=1, \"\n \"default to master_resource_request\",\n )\n parser.add_argument(\n \"--num_workers\", type=int, help=\"Number of workers\", default=0\n )\n parser.add_argument(\n \"--worker_resource_request\",\n default=\"cpu=1,memory=4096Mi\",\n type=str,\n help=\"The minimal resource required by worker, \"\n \"e.g. cpu=1,memory=1024Mi,disk=1024Mi,gpu=1\",\n )\n parser.add_argument(\n \"--worker_resource_limit\",\n type=str,\n help=\"The maximal resource required by worker, \"\n \"e.g. cpu=1,memory=1024Mi,disk=1024Mi,gpu=1,\"\n \"default to worker_resource_request\",\n )\n parser.add_argument(\n \"--master_pod_priority\", help=\"The requested priority of master pod\"\n )\n parser.add_argument(\n \"--volume\",\n help=\"The Kubernetes volume information, \"\n 'e.g. \"claim_name=c1,mount_path=/path1\".',\n )\n parser.add_argument(\n \"--image_pull_policy\",\n default=\"Always\",\n help=\"The image pull policy of master and worker\",\n )\n parser.add_argument(\n \"--restart_policy\",\n default=\"Never\",\n help=\"The pod restart policy when pod crashed\",\n )\n parser.add_argument(\n \"--extra_pypi_index\", help=\"The extra python package repository\"\n )\n parser.add_argument(\n \"--namespace\",\n default=\"default\",\n type=str,\n help=\"The name of the Kubernetes namespace where ElasticDL \"\n \"pods will be created\",\n )\n parser.add_argument(\"--records_per_task\", type=int, required=True)\n parser.add_argument(\n \"--minibatch_size\",\n type=int,\n help=\"Minibatch size used by workers\",\n required=True,\n )\n parser.add_argument(\n \"--dataset_fn\",\n type=str,\n default=\"dataset_fn\",\n help=\"The name of the dataset function defined in the model file\",\n )\n parser.add_argument(\n \"--eval_metrics_fn\",\n type=str,\n default=\"eval_metrics_fn\",\n help=\"The name of the evaluation metrics function defined \"\n \"in the model file\",\n )\n parser.add_argument(\n \"--model_def\",\n type=str,\n required=True,\n help=\"The import path to the model definition function/class in the \"\n 'model zoo, e.g. \"cifar10_subclass.cifar10_subclass.CustomModel\"',\n )\n parser.add_argument(\n \"--model_params\",\n type=str,\n default=\"\",\n help=\"The dictionary of model parameters in a string that will be \"\n 'used to instantiate the model, e.g. \"param1=1,param2=2\"',\n )\n parser.add_argument(\n \"--cluster_spec\",\n help=\"The file that contains user-defined cluster specification\",\n default=\"\",\n )\n\n\ndef add_train_params(parser):\n parser.add_argument(\n \"--tensorboard_log_dir\",\n default=\"\",\n type=str,\n help=\"Directory where TensorBoard will look to find \"\n \"TensorFlow event files that it can display. \"\n \"TensorBoard will recursively walk the directory \"\n \"structure rooted at log dir, looking for .*tfevents.* \"\n \"files. You may also pass a comma separated list of log \"\n \"directories, and TensorBoard will watch each \"\n \"directory.\",\n )\n parser.add_argument(\"--num_epochs\", type=int, default=1)\n parser.add_argument(\n \"--grads_to_wait\",\n type=int,\n help=\"Number of gradients to wait before updating model\",\n default=2,\n )\n parser.add_argument(\n \"--training_data_dir\",\n help=\"Training data directory. Files should be in RecordIO format\",\n default=\"\",\n )\n parser.add_argument(\n \"--evaluation_data_dir\",\n help=\"Evaluation data directory. Files should be in RecordIO format\",\n default=\"\",\n )\n parser.add_argument(\n \"--evaluation_steps\",\n type=int,\n help=\"Evaluate the model every this many steps.\"\n \"If 0, step-based evaluation is disabled\",\n default=0,\n )\n parser.add_argument(\n \"--evaluation_start_delay_secs\",\n type=int,\n help=\"Start time-based evaluation only after waiting for \"\n \"this many seconds\",\n default=100,\n )\n parser.add_argument(\n \"--evaluation_throttle_secs\",\n type=int,\n help=\"Do not re-evaluate unless the last evaluation was started \"\n \"at least this many seconds ago.\"\n \"If 0, time-based evaluation is disabled\",\n default=0,\n )\n parser.add_argument(\n \"--checkpoint_steps\",\n type=int,\n help=\"Save checkpoint every this many steps.\"\n \"If 0, no checkpoints to save.\",\n default=0,\n )\n parser.add_argument(\n \"--checkpoint_dir\",\n help=\"The directory to store the checkpoint files\",\n default=\"\",\n )\n parser.add_argument(\n \"--keep_checkpoint_max\",\n type=int,\n help=\"The maximum number of recent checkpoint files to keep.\"\n \"If 0, keep all.\",\n default=0,\n )\n parser.add_argument(\n \"--checkpoint_filename_for_init\",\n help=\"The checkpoint file to initialize the training model\",\n default=\"\",\n )\n parser.add_argument(\n \"--loss\",\n type=str,\n default=\"loss\",\n help=\"The name of the loss function defined in the model file\",\n )\n parser.add_argument(\n \"--optimizer\",\n type=str,\n default=\"optimizer\",\n help=\"The name of the optimizer defined in the model file\",\n )\n parser.add_argument(\n \"--output\",\n type=str,\n default=\"\",\n help=\"The path to save the final trained model\",\n )\n\n\ndef add_evaluate_params(parser):\n parser.add_argument(\n \"--evaluation_data_dir\",\n help=\"Evaluation data directory. Files should be in RecordIO format\",\n required=True,\n )\n parser.add_argument(\n \"--checkpoint_filename_for_init\",\n help=\"The checkpoint file to initialize the training model\",\n required=True,\n )\n\n\ndef add_predict_params(parser):\n parser.add_argument(\n \"--prediction_data_dir\",\n help=\"Prediction data directory. Files should be in RecordIO format\",\n required=True,\n )\n parser.add_argument(\n \"--checkpoint_filename_for_init\",\n help=\"The checkpoint file to initialize the training model\",\n required=True,\n )\n", "path": "elasticdl/python/common/args.py"}], "after_files": [{"content": "import argparse\n\nfrom elasticdl.python.common.args import (\n add_common_params,\n add_train_params,\n pos_int,\n)\n\n\ndef parse_args():\n parser = argparse.ArgumentParser(description=\"ElasticDL Master\")\n parser.add_argument(\n \"--port\",\n default=50001,\n type=pos_int,\n help=\"The listening port of master\",\n )\n parser.add_argument(\n \"--worker_image\", help=\"Docker image for workers\", default=None\n )\n parser.add_argument(\n \"--worker_pod_priority\", help=\"Priority requested by workers\"\n )\n parser.add_argument(\n \"--prediction_data_dir\",\n help=\"Prediction data directory. Files should be in RecordIO format\",\n default=\"\",\n )\n parser.add_argument(\n \"--log_level\",\n choices=[\"DEBUG\", \"INFO\", \"WARNING\", \"ERROR\", \"CRITICAL\"],\n type=str.upper,\n default=\"INFO\",\n help=\"The logging level. Default to WARNING\",\n )\n add_common_params(parser)\n add_train_params(parser)\n\n args = parser.parse_args()\n\n if all(\n v == \"\" or v is None\n for v in [\n args.training_data_dir,\n args.evaluation_data_dir,\n args.prediction_data_dir,\n ]\n ):\n raise ValueError(\n \"At least one of the data directories needs to be provided\"\n )\n\n if args.prediction_data_dir and (\n args.training_data_dir or args.evaluation_data_dir\n ):\n raise ValueError(\n \"Running prediction together with training or evaluation \"\n \"is not supported\"\n )\n if args.prediction_data_dir and not args.checkpoint_filename_for_init:\n raise ValueError(\n \"checkpoint_filename_for_init is required for running \"\n \"prediction job\"\n )\n\n return args\n", "path": "elasticdl/python/master/args.py"}, {"content": "def pos_int(arg):\n res = int(arg)\n if res <= 0:\n raise ValueError(\"Positive integer argument required. Got %s\" % res)\n return res\n\n\ndef non_neg_int(arg):\n res = int(arg)\n if res < 0:\n raise ValueError(\n \"Non-negative integer argument required. Get %s\" % res\n )\n return res\n\n\ndef add_common_params(parser):\n parser.add_argument(\n \"--model_zoo\",\n help=\"The directory that contains user-defined model files \"\n \"or a specific model file\",\n required=True,\n )\n parser.add_argument(\n \"--docker_image_prefix\",\n default=\"\",\n help=\"The prefix for generated Docker images, if set, the image is \"\n \"also pushed to the registry\",\n )\n parser.add_argument(\"--image_base\", help=\"Base Docker image.\")\n parser.add_argument(\"--job_name\", help=\"ElasticDL job name\", required=True)\n parser.add_argument(\n \"--master_resource_request\",\n default=\"cpu=0.1,memory=1024Mi\",\n type=str,\n help=\"The minimal resource required by master, \"\n \"e.g. cpu=0.1,memory=1024Mi,disk=1024Mi,gpu=1\",\n )\n parser.add_argument(\n \"--master_resource_limit\",\n type=str,\n help=\"The maximal resource required by master, \"\n \"e.g. cpu=0.1,memory=1024Mi,disk=1024Mi,gpu=1, \"\n \"default to master_resource_request\",\n )\n parser.add_argument(\n \"--num_workers\", type=int, help=\"Number of workers\", default=0\n )\n parser.add_argument(\n \"--worker_resource_request\",\n default=\"cpu=1,memory=4096Mi\",\n type=str,\n help=\"The minimal resource required by worker, \"\n \"e.g. cpu=1,memory=1024Mi,disk=1024Mi,gpu=1\",\n )\n parser.add_argument(\n \"--worker_resource_limit\",\n type=str,\n help=\"The maximal resource required by worker, \"\n \"e.g. cpu=1,memory=1024Mi,disk=1024Mi,gpu=1,\"\n \"default to worker_resource_request\",\n )\n parser.add_argument(\n \"--master_pod_priority\", help=\"The requested priority of master pod\"\n )\n parser.add_argument(\n \"--volume\",\n help=\"The Kubernetes volume information, \"\n 'e.g. \"claim_name=c1,mount_path=/path1\".',\n )\n parser.add_argument(\n \"--image_pull_policy\",\n default=\"Always\",\n help=\"The image pull policy of master and worker\",\n )\n parser.add_argument(\n \"--restart_policy\",\n default=\"Never\",\n help=\"The pod restart policy when pod crashed\",\n )\n parser.add_argument(\n \"--extra_pypi_index\", help=\"The extra python package repository\"\n )\n parser.add_argument(\n \"--namespace\",\n default=\"default\",\n type=str,\n help=\"The name of the Kubernetes namespace where ElasticDL \"\n \"pods will be created\",\n )\n parser.add_argument(\"--records_per_task\", type=int, required=True)\n parser.add_argument(\n \"--minibatch_size\",\n type=int,\n help=\"Minibatch size used by workers\",\n required=True,\n )\n parser.add_argument(\n \"--dataset_fn\",\n type=str,\n default=\"dataset_fn\",\n help=\"The name of the dataset function defined in the model file\",\n )\n parser.add_argument(\n \"--eval_metrics_fn\",\n type=str,\n default=\"eval_metrics_fn\",\n help=\"The name of the evaluation metrics function defined \"\n \"in the model file\",\n )\n parser.add_argument(\n \"--model_def\",\n type=str,\n required=True,\n help=\"The import path to the model definition function/class in the \"\n 'model zoo, e.g. \"cifar10_subclass.cifar10_subclass.CustomModel\"',\n )\n parser.add_argument(\n \"--model_params\",\n type=str,\n default=\"\",\n help=\"The dictionary of model parameters in a string that will be \"\n 'used to instantiate the model, e.g. \"param1=1,param2=2\"',\n )\n parser.add_argument(\n \"--cluster_spec\",\n help=\"The file that contains user-defined cluster specification\",\n default=\"\",\n )\n\n\ndef add_train_params(parser):\n parser.add_argument(\n \"--tensorboard_log_dir\",\n default=\"\",\n type=str,\n help=\"Directory where TensorBoard will look to find \"\n \"TensorFlow event files that it can display. \"\n \"TensorBoard will recursively walk the directory \"\n \"structure rooted at log dir, looking for .*tfevents.* \"\n \"files. You may also pass a comma separated list of log \"\n \"directories, and TensorBoard will watch each \"\n \"directory.\",\n )\n parser.add_argument(\"--num_epochs\", type=int, default=1)\n parser.add_argument(\n \"--grads_to_wait\",\n type=int,\n help=\"Number of gradients to wait before updating model\",\n default=2,\n )\n parser.add_argument(\n \"--training_data_dir\",\n help=\"Training data directory. Files should be in RecordIO format\",\n default=\"\",\n )\n parser.add_argument(\n \"--evaluation_data_dir\",\n help=\"Evaluation data directory. Files should be in RecordIO format\",\n default=\"\",\n )\n parser.add_argument(\n \"--evaluation_steps\",\n type=int,\n help=\"Evaluate the model every this many steps.\"\n \"If 0, step-based evaluation is disabled\",\n default=0,\n )\n parser.add_argument(\n \"--evaluation_start_delay_secs\",\n type=int,\n help=\"Start time-based evaluation only after waiting for \"\n \"this many seconds\",\n default=100,\n )\n parser.add_argument(\n \"--evaluation_throttle_secs\",\n type=int,\n help=\"Do not re-evaluate unless the last evaluation was started \"\n \"at least this many seconds ago.\"\n \"If 0, time-based evaluation is disabled\",\n default=0,\n )\n parser.add_argument(\n \"--checkpoint_steps\",\n type=int,\n help=\"Save checkpoint every this many steps.\"\n \"If 0, no checkpoints to save.\",\n default=0,\n )\n parser.add_argument(\n \"--checkpoint_dir\",\n help=\"The directory to store the checkpoint files\",\n default=\"\",\n )\n parser.add_argument(\n \"--keep_checkpoint_max\",\n type=int,\n help=\"The maximum number of recent checkpoint files to keep.\"\n \"If 0, keep all.\",\n default=0,\n )\n parser.add_argument(\n \"--checkpoint_filename_for_init\",\n help=\"The checkpoint file to initialize the training model\",\n default=\"\",\n )\n parser.add_argument(\n \"--loss\",\n type=str,\n default=\"loss\",\n help=\"The name of the loss function defined in the model file\",\n )\n parser.add_argument(\n \"--optimizer\",\n type=str,\n default=\"optimizer\",\n help=\"The name of the optimizer defined in the model file\",\n )\n parser.add_argument(\n \"--output\",\n type=str,\n default=\"\",\n help=\"The path to save the final trained model\",\n )\n\n\ndef add_evaluate_params(parser):\n parser.add_argument(\n \"--evaluation_data_dir\",\n help=\"Evaluation data directory. Files should be in RecordIO format\",\n required=True,\n )\n parser.add_argument(\n \"--checkpoint_filename_for_init\",\n help=\"The checkpoint file to initialize the training model\",\n required=True,\n )\n\n\ndef add_predict_params(parser):\n parser.add_argument(\n \"--prediction_data_dir\",\n help=\"Prediction data directory. Files should be in RecordIO format\",\n required=True,\n )\n parser.add_argument(\n \"--checkpoint_filename_for_init\",\n help=\"The checkpoint file to initialize the training model\",\n required=True,\n )\n", "path": "elasticdl/python/common/args.py"}]}
| 3,148 | 434 |
gh_patches_debug_1391
|
rasdani/github-patches
|
git_diff
|
opsdroid__opsdroid-1683
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
skill-seen broken with redis database?
I've been testing opsdroid with a redis database and the seen skill appears to be having problems serializing python datetime objects.
user: when did you last see user?
opsdroid: Whoops there has been an error.
opsdroid: Check the log for details.
this is the opsdroid log with DEBUG logging enabled:
```
notrexroof_1 | DEBUG opsdroid.memory: Putting seen to memory.
notrexroof_1 | DEBUG opsdroid.database.redis: Putting seen into Redis.
notrexroof_1 | ERROR opsdroid.core: Exception when running skill 'seen'.
notrexroof_1 | Traceback (most recent call last):
notrexroof_1 | File "/usr/local/lib/python3.8/site-packages/opsdroid/core.py", line 427, in run_skill
notrexroof_1 | return await skill(self, config, event)
notrexroof_1 | File "/root/.local/share/opsdroid/opsdroid-modules/skill/seen/__init__.py", line 16, in last_seen
notrexroof_1 | await message.respond("I last saw {} {}".format(name, human(seen[name], precision=1)))
notrexroof_1 | File "/root/.local/share/opsdroid/site-packages/ago.py", line 55, in human
notrexroof_1 | delta = get_delta_from_subject(subject)
notrexroof_1 | File "/root/.local/share/opsdroid/site-packages/ago.py", line 16, in get_delta_from_subject
notrexroof_1 | subject = float(subject)
notrexroof_1 | TypeError: float() argument must be a string or a number, not 'dict'
```
I know this hasn't been touched in a few years, but I'm wondering if there is a general issue with serializing objects into a redis database within opsdroid.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `opsdroid/database/redis/__init__.py`
Content:
```
1 """Module for storing data within Redis."""
2 import json
3 import logging
4
5 import aioredis
6 from aioredis import parser
7 from voluptuous import Any
8
9 from opsdroid.database import Database
10 from opsdroid.helper import JSONEncoder, JSONDecoder
11
12 _LOGGER = logging.getLogger(__name__)
13 CONFIG_SCHEMA = {"host": str, "port": Any(int, str), "database": int, "password": str}
14
15
16 class RedisDatabase(Database):
17 """Database class for storing data within a Redis instance."""
18
19 def __init__(self, config, opsdroid=None):
20 """Initialise the redis database.
21
22 Set basic properties of the database. Initialise properties like
23 name, connection arguments, database file, table name and config.
24
25 Args:
26 config (dict): The configuration of the database which consists
27 of `file` and `table` name of the sqlite database
28 specified in `configuration.yaml` file.
29 opsdroid (OpsDroid): An instance of opsdroid.core.
30
31 """
32 super().__init__(config, opsdroid=opsdroid)
33 self.config = config
34 self.client = None
35 self.host = self.config.get("host", "localhost")
36 self.port = self.config.get("port", 6379)
37 self.database = self.config.get("database", 0)
38 self.password = self.config.get("password", None)
39 _LOGGER.debug(_("Loaded Redis database connector."))
40
41 async def connect(self):
42 """Connect to the database.
43
44 This method will connect to a Redis database. By default it will
45 connect to Redis on localhost on port 6379
46
47 """
48 try:
49 self.client = await aioredis.create_pool(
50 address=(self.host, int(self.port)),
51 db=self.database,
52 password=self.password,
53 parser=parser.PyReader,
54 )
55
56 _LOGGER.info(
57 _("Connected to Redis database %s from %s on port %s."),
58 self.database,
59 self.host,
60 self.port,
61 )
62 except OSError:
63 _LOGGER.warning(
64 _("Unable to connect to Redis database on address: %s port: %s."),
65 self.host,
66 self.port,
67 )
68
69 async def put(self, key, data):
70 """Store the data object in Redis against the key.
71
72 Args:
73 key (string): The key to store the data object under.
74 data (object): The data object to store.
75
76 """
77 if self.client:
78 _LOGGER.debug(_("Putting %s into Redis."), key)
79 await self.client.execute("SET", key, json.dumps(data, cls=JSONEncoder))
80
81 async def get(self, key):
82 """Get data from Redis for a given key.
83
84 Args:
85 key (string): The key to lookup in the database.
86
87 Returns:
88 object or None: The data object stored for that key, or None if no
89 object found for that key.
90
91 """
92 if self.client:
93 _LOGGER.debug(_("Getting %s from Redis."), key)
94 data = await self.client.execute("GET", key)
95
96 if data:
97 return json.loads(data, encoding=JSONDecoder)
98
99 return None
100
101 async def delete(self, key):
102 """Delete data from Redis for a given key.
103
104 Args:
105 key (string): The key to delete in the database.
106
107 """
108 if self.client:
109 _LOGGER.debug(_("Deleting %s from Redis."), key)
110 await self.client.execute("DEL", key)
111
112 async def disconnect(self):
113 """Disconnect from the database."""
114 if self.client:
115 self.client.close()
116
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/opsdroid/database/redis/__init__.py b/opsdroid/database/redis/__init__.py
--- a/opsdroid/database/redis/__init__.py
+++ b/opsdroid/database/redis/__init__.py
@@ -94,7 +94,7 @@
data = await self.client.execute("GET", key)
if data:
- return json.loads(data, encoding=JSONDecoder)
+ return json.loads(data, object_hook=JSONDecoder())
return None
|
{"golden_diff": "diff --git a/opsdroid/database/redis/__init__.py b/opsdroid/database/redis/__init__.py\n--- a/opsdroid/database/redis/__init__.py\n+++ b/opsdroid/database/redis/__init__.py\n@@ -94,7 +94,7 @@\n data = await self.client.execute(\"GET\", key)\n \n if data:\n- return json.loads(data, encoding=JSONDecoder)\n+ return json.loads(data, object_hook=JSONDecoder())\n \n return None\n", "issue": "skill-seen broken with redis database?\nI've been testing opsdroid with a redis database and the seen skill appears to be having problems serializing python datetime objects.\r\n\r\nuser: when did you last see user?\r\nopsdroid: Whoops there has been an error.\r\nopsdroid: Check the log for details.\r\n\r\nthis is the opsdroid log with DEBUG logging enabled:\r\n\r\n```\r\nnotrexroof_1 | DEBUG opsdroid.memory: Putting seen to memory.\r\nnotrexroof_1 | DEBUG opsdroid.database.redis: Putting seen into Redis.\r\nnotrexroof_1 | ERROR opsdroid.core: Exception when running skill 'seen'.\r\nnotrexroof_1 | Traceback (most recent call last):\r\nnotrexroof_1 | File \"/usr/local/lib/python3.8/site-packages/opsdroid/core.py\", line 427, in run_skill\r\nnotrexroof_1 | return await skill(self, config, event)\r\nnotrexroof_1 | File \"/root/.local/share/opsdroid/opsdroid-modules/skill/seen/__init__.py\", line 16, in last_seen\r\nnotrexroof_1 | await message.respond(\"I last saw {} {}\".format(name, human(seen[name], precision=1)))\r\nnotrexroof_1 | File \"/root/.local/share/opsdroid/site-packages/ago.py\", line 55, in human\r\nnotrexroof_1 | delta = get_delta_from_subject(subject)\r\nnotrexroof_1 | File \"/root/.local/share/opsdroid/site-packages/ago.py\", line 16, in get_delta_from_subject\r\nnotrexroof_1 | subject = float(subject)\r\nnotrexroof_1 | TypeError: float() argument must be a string or a number, not 'dict'\r\n```\r\n\r\nI know this hasn't been touched in a few years, but I'm wondering if there is a general issue with serializing objects into a redis database within opsdroid.\r\n\r\n\n", "before_files": [{"content": "\"\"\"Module for storing data within Redis.\"\"\"\nimport json\nimport logging\n\nimport aioredis\nfrom aioredis import parser\nfrom voluptuous import Any\n\nfrom opsdroid.database import Database\nfrom opsdroid.helper import JSONEncoder, JSONDecoder\n\n_LOGGER = logging.getLogger(__name__)\nCONFIG_SCHEMA = {\"host\": str, \"port\": Any(int, str), \"database\": int, \"password\": str}\n\n\nclass RedisDatabase(Database):\n \"\"\"Database class for storing data within a Redis instance.\"\"\"\n\n def __init__(self, config, opsdroid=None):\n \"\"\"Initialise the redis database.\n\n Set basic properties of the database. Initialise properties like\n name, connection arguments, database file, table name and config.\n\n Args:\n config (dict): The configuration of the database which consists\n of `file` and `table` name of the sqlite database\n specified in `configuration.yaml` file.\n opsdroid (OpsDroid): An instance of opsdroid.core.\n\n \"\"\"\n super().__init__(config, opsdroid=opsdroid)\n self.config = config\n self.client = None\n self.host = self.config.get(\"host\", \"localhost\")\n self.port = self.config.get(\"port\", 6379)\n self.database = self.config.get(\"database\", 0)\n self.password = self.config.get(\"password\", None)\n _LOGGER.debug(_(\"Loaded Redis database connector.\"))\n\n async def connect(self):\n \"\"\"Connect to the database.\n\n This method will connect to a Redis database. By default it will\n connect to Redis on localhost on port 6379\n\n \"\"\"\n try:\n self.client = await aioredis.create_pool(\n address=(self.host, int(self.port)),\n db=self.database,\n password=self.password,\n parser=parser.PyReader,\n )\n\n _LOGGER.info(\n _(\"Connected to Redis database %s from %s on port %s.\"),\n self.database,\n self.host,\n self.port,\n )\n except OSError:\n _LOGGER.warning(\n _(\"Unable to connect to Redis database on address: %s port: %s.\"),\n self.host,\n self.port,\n )\n\n async def put(self, key, data):\n \"\"\"Store the data object in Redis against the key.\n\n Args:\n key (string): The key to store the data object under.\n data (object): The data object to store.\n\n \"\"\"\n if self.client:\n _LOGGER.debug(_(\"Putting %s into Redis.\"), key)\n await self.client.execute(\"SET\", key, json.dumps(data, cls=JSONEncoder))\n\n async def get(self, key):\n \"\"\"Get data from Redis for a given key.\n\n Args:\n key (string): The key to lookup in the database.\n\n Returns:\n object or None: The data object stored for that key, or None if no\n object found for that key.\n\n \"\"\"\n if self.client:\n _LOGGER.debug(_(\"Getting %s from Redis.\"), key)\n data = await self.client.execute(\"GET\", key)\n\n if data:\n return json.loads(data, encoding=JSONDecoder)\n\n return None\n\n async def delete(self, key):\n \"\"\"Delete data from Redis for a given key.\n\n Args:\n key (string): The key to delete in the database.\n\n \"\"\"\n if self.client:\n _LOGGER.debug(_(\"Deleting %s from Redis.\"), key)\n await self.client.execute(\"DEL\", key)\n\n async def disconnect(self):\n \"\"\"Disconnect from the database.\"\"\"\n if self.client:\n self.client.close()\n", "path": "opsdroid/database/redis/__init__.py"}], "after_files": [{"content": "\"\"\"Module for storing data within Redis.\"\"\"\nimport json\nimport logging\n\nimport aioredis\nfrom aioredis import parser\nfrom voluptuous import Any\n\nfrom opsdroid.database import Database\nfrom opsdroid.helper import JSONEncoder, JSONDecoder\n\n_LOGGER = logging.getLogger(__name__)\nCONFIG_SCHEMA = {\"host\": str, \"port\": Any(int, str), \"database\": int, \"password\": str}\n\n\nclass RedisDatabase(Database):\n \"\"\"Database class for storing data within a Redis instance.\"\"\"\n\n def __init__(self, config, opsdroid=None):\n \"\"\"Initialise the redis database.\n\n Set basic properties of the database. Initialise properties like\n name, connection arguments, database file, table name and config.\n\n Args:\n config (dict): The configuration of the database which consists\n of `file` and `table` name of the sqlite database\n specified in `configuration.yaml` file.\n opsdroid (OpsDroid): An instance of opsdroid.core.\n\n \"\"\"\n super().__init__(config, opsdroid=opsdroid)\n self.config = config\n self.client = None\n self.host = self.config.get(\"host\", \"localhost\")\n self.port = self.config.get(\"port\", 6379)\n self.database = self.config.get(\"database\", 0)\n self.password = self.config.get(\"password\", None)\n _LOGGER.debug(_(\"Loaded Redis database connector.\"))\n\n async def connect(self):\n \"\"\"Connect to the database.\n\n This method will connect to a Redis database. By default it will\n connect to Redis on localhost on port 6379\n\n \"\"\"\n try:\n self.client = await aioredis.create_pool(\n address=(self.host, int(self.port)),\n db=self.database,\n password=self.password,\n parser=parser.PyReader,\n )\n\n _LOGGER.info(\n _(\"Connected to Redis database %s from %s on port %s.\"),\n self.database,\n self.host,\n self.port,\n )\n except OSError:\n _LOGGER.warning(\n _(\"Unable to connect to Redis database on address: %s port: %s.\"),\n self.host,\n self.port,\n )\n\n async def put(self, key, data):\n \"\"\"Store the data object in Redis against the key.\n\n Args:\n key (string): The key to store the data object under.\n data (object): The data object to store.\n\n \"\"\"\n if self.client:\n _LOGGER.debug(_(\"Putting %s into Redis.\"), key)\n await self.client.execute(\"SET\", key, json.dumps(data, cls=JSONEncoder))\n\n async def get(self, key):\n \"\"\"Get data from Redis for a given key.\n\n Args:\n key (string): The key to lookup in the database.\n\n Returns:\n object or None: The data object stored for that key, or None if no\n object found for that key.\n\n \"\"\"\n if self.client:\n _LOGGER.debug(_(\"Getting %s from Redis.\"), key)\n data = await self.client.execute(\"GET\", key)\n\n if data:\n return json.loads(data, object_hook=JSONDecoder())\n\n return None\n\n async def delete(self, key):\n \"\"\"Delete data from Redis for a given key.\n\n Args:\n key (string): The key to delete in the database.\n\n \"\"\"\n if self.client:\n _LOGGER.debug(_(\"Deleting %s from Redis.\"), key)\n await self.client.execute(\"DEL\", key)\n\n async def disconnect(self):\n \"\"\"Disconnect from the database.\"\"\"\n if self.client:\n self.client.close()\n", "path": "opsdroid/database/redis/__init__.py"}]}
| 1,717 | 115 |
gh_patches_debug_3892
|
rasdani/github-patches
|
git_diff
|
redis__redis-py-1901
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
run GraphCommands.explain raise TypeError
**Version**: 4.1.0
**Python Version**: 3.8
**Platform**: MacOS 12.0.1
**Description**: I'm using redisgraph:2.8.7 with redis-py. I met a error when I run explain. Error:
```python
graph.explain('match (n:person) where n.name = "ken" return n')
```
raise exception
```
File "/lib/python3.8/site-packages/redis/commands/graph/commands.py", line 140, in explain
return "\n".join(plan)
TypeError: sequence item 0: expected str instance, bytes found
```
```python
graph.execute_command('GRAPH.EXPLAIN', graph.name, 'match (n:person) where n.name = "ken" return n')
```
return
```python
[b'Results',
b' Project',
b' Filter',
b' Node By Label Scan | (n:person)']
```
The problem is the result that type is List[byte].
I guess the code should modify as this:
```python
def explain(self, query, params=None):
"""
Get the execution plan for given query,
Returns an array of operations.
For more information see `GRAPH.EXPLAIN <https://oss.redis.com/redisgraph/master/commands/#graphexplain>`_. # noqa
Args:
-------
query:
The query that will be executed.
params: dict
Query parameters.
"""
if params is not None:
query = self._build_params_header(params) + query
plan = self.execute_command("GRAPH.EXPLAIN", self.name, query)
return "\n".join([b.decode() for b in plan])
```
or
```python
def explain(self, query, params=None):
"""
Get the execution plan for given query,
Returns an array of operations.
For more information see `GRAPH.EXPLAIN <https://oss.redis.com/redisgraph/master/commands/#graphexplain>`_. # noqa
Args:
-------
query:
The query that will be executed.
params: dict
Query parameters.
"""
if params is not None:
query = self._build_params_header(params) + query
plan = self.execute_command("GRAPH.EXPLAIN", self.name, query)
return plan
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `redis/commands/graph/commands.py`
Content:
```
1 from redis import DataError
2 from redis.exceptions import ResponseError
3
4 from .exceptions import VersionMismatchException
5 from .query_result import QueryResult
6
7
8 class GraphCommands:
9 """RedisGraph Commands"""
10
11 def commit(self):
12 """
13 Create entire graph.
14 For more information see `CREATE <https://oss.redis.com/redisgraph/master/commands/#create>`_. # noqa
15 """
16 if len(self.nodes) == 0 and len(self.edges) == 0:
17 return None
18
19 query = "CREATE "
20 for _, node in self.nodes.items():
21 query += str(node) + ","
22
23 query += ",".join([str(edge) for edge in self.edges])
24
25 # Discard leading comma.
26 if query[-1] == ",":
27 query = query[:-1]
28
29 return self.query(query)
30
31 def query(self, q, params=None, timeout=None, read_only=False, profile=False):
32 """
33 Executes a query against the graph.
34 For more information see `GRAPH.QUERY <https://oss.redis.com/redisgraph/master/commands/#graphquery>`_. # noqa
35
36 Args:
37
38 q : str
39 The query.
40 params : dict
41 Query parameters.
42 timeout : int
43 Maximum runtime for read queries in milliseconds.
44 read_only : bool
45 Executes a readonly query if set to True.
46 profile : bool
47 Return details on results produced by and time
48 spent in each operation.
49 """
50
51 # maintain original 'q'
52 query = q
53
54 # handle query parameters
55 if params is not None:
56 query = self._build_params_header(params) + query
57
58 # construct query command
59 # ask for compact result-set format
60 # specify known graph version
61 if profile:
62 cmd = "GRAPH.PROFILE"
63 else:
64 cmd = "GRAPH.RO_QUERY" if read_only else "GRAPH.QUERY"
65 command = [cmd, self.name, query, "--compact"]
66
67 # include timeout is specified
68 if timeout:
69 if not isinstance(timeout, int):
70 raise Exception("Timeout argument must be a positive integer")
71 command += ["timeout", timeout]
72
73 # issue query
74 try:
75 response = self.execute_command(*command)
76 return QueryResult(self, response, profile)
77 except ResponseError as e:
78 if "wrong number of arguments" in str(e):
79 print(
80 "Note: RedisGraph Python requires server version 2.2.8 or above"
81 ) # noqa
82 if "unknown command" in str(e) and read_only:
83 # `GRAPH.RO_QUERY` is unavailable in older versions.
84 return self.query(q, params, timeout, read_only=False)
85 raise e
86 except VersionMismatchException as e:
87 # client view over the graph schema is out of sync
88 # set client version and refresh local schema
89 self.version = e.version
90 self._refresh_schema()
91 # re-issue query
92 return self.query(q, params, timeout, read_only)
93
94 def merge(self, pattern):
95 """
96 Merge pattern.
97 For more information see `MERGE <https://oss.redis.com/redisgraph/master/commands/#merge>`_. # noqa
98 """
99 query = "MERGE "
100 query += str(pattern)
101
102 return self.query(query)
103
104 def delete(self):
105 """
106 Deletes graph.
107 For more information see `DELETE <https://oss.redis.com/redisgraph/master/commands/#delete>`_. # noqa
108 """
109 self._clear_schema()
110 return self.execute_command("GRAPH.DELETE", self.name)
111
112 # declared here, to override the built in redis.db.flush()
113 def flush(self):
114 """
115 Commit the graph and reset the edges and the nodes to zero length.
116 """
117 self.commit()
118 self.nodes = {}
119 self.edges = []
120
121 def explain(self, query, params=None):
122 """
123 Get the execution plan for given query,
124 Returns an array of operations.
125 For more information see `GRAPH.EXPLAIN <https://oss.redis.com/redisgraph/master/commands/#graphexplain>`_. # noqa
126
127 Args:
128
129 query:
130 The query that will be executed.
131 params: dict
132 Query parameters.
133 """
134 if params is not None:
135 query = self._build_params_header(params) + query
136
137 plan = self.execute_command("GRAPH.EXPLAIN", self.name, query)
138 return "\n".join(plan)
139
140 def bulk(self, **kwargs):
141 """Internal only. Not supported."""
142 raise NotImplementedError(
143 "GRAPH.BULK is internal only. "
144 "Use https://github.com/redisgraph/redisgraph-bulk-loader."
145 )
146
147 def profile(self, query):
148 """
149 Execute a query and produce an execution plan augmented with metrics
150 for each operation's execution. Return a string representation of a
151 query execution plan, with details on results produced by and time
152 spent in each operation.
153 For more information see `GRAPH.PROFILE <https://oss.redis.com/redisgraph/master/commands/#graphprofile>`_. # noqa
154 """
155 return self.query(query, profile=True)
156
157 def slowlog(self):
158 """
159 Get a list containing up to 10 of the slowest queries issued
160 against the given graph ID.
161 For more information see `GRAPH.SLOWLOG <https://oss.redis.com/redisgraph/master/commands/#graphslowlog>`_. # noqa
162
163 Each item in the list has the following structure:
164 1. A unix timestamp at which the log entry was processed.
165 2. The issued command.
166 3. The issued query.
167 4. The amount of time needed for its execution, in milliseconds.
168 """
169 return self.execute_command("GRAPH.SLOWLOG", self.name)
170
171 def config(self, name, value=None, set=False):
172 """
173 Retrieve or update a RedisGraph configuration.
174 For more information see `GRAPH.CONFIG <https://oss.redis.com/redisgraph/master/commands/#graphconfig>`_. # noqa
175
176 Args:
177
178 name : str
179 The name of the configuration
180 value :
181 The value we want to set (can be used only when `set` is on)
182 set : bool
183 Turn on to set a configuration. Default behavior is get.
184 """
185 params = ["SET" if set else "GET", name]
186 if value is not None:
187 if set:
188 params.append(value)
189 else:
190 raise DataError(
191 "``value`` can be provided only when ``set`` is True"
192 ) # noqa
193 return self.execute_command("GRAPH.CONFIG", *params)
194
195 def list_keys(self):
196 """
197 Lists all graph keys in the keyspace.
198 For more information see `GRAPH.LIST <https://oss.redis.com/redisgraph/master/commands/#graphlist>`_. # noqa
199 """
200 return self.execute_command("GRAPH.LIST")
201
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/redis/commands/graph/commands.py b/redis/commands/graph/commands.py
--- a/redis/commands/graph/commands.py
+++ b/redis/commands/graph/commands.py
@@ -135,6 +135,8 @@
query = self._build_params_header(params) + query
plan = self.execute_command("GRAPH.EXPLAIN", self.name, query)
+ if isinstance(plan[0], bytes):
+ plan = [b.decode() for b in plan]
return "\n".join(plan)
def bulk(self, **kwargs):
|
{"golden_diff": "diff --git a/redis/commands/graph/commands.py b/redis/commands/graph/commands.py\n--- a/redis/commands/graph/commands.py\n+++ b/redis/commands/graph/commands.py\n@@ -135,6 +135,8 @@\n query = self._build_params_header(params) + query\n \n plan = self.execute_command(\"GRAPH.EXPLAIN\", self.name, query)\n+ if isinstance(plan[0], bytes):\n+ plan = [b.decode() for b in plan]\n return \"\\n\".join(plan)\n \n def bulk(self, **kwargs):\n", "issue": "run GraphCommands.explain raise TypeError\n\r\n**Version**: 4.1.0\r\n\r\n**Python Version**: 3.8\r\n\r\n**Platform**: MacOS 12.0.1\r\n\r\n**Description**: I'm using redisgraph:2.8.7 with redis-py. I met a error when I run explain. Error:\r\n```python\r\ngraph.explain('match (n:person) where n.name = \"ken\" return n')\r\n```\r\nraise exception\r\n``` \r\nFile \"/lib/python3.8/site-packages/redis/commands/graph/commands.py\", line 140, in explain\r\n return \"\\n\".join(plan)\r\nTypeError: sequence item 0: expected str instance, bytes found\r\n```\r\n\r\n```python\r\ngraph.execute_command('GRAPH.EXPLAIN', graph.name, 'match (n:person) where n.name = \"ken\" return n')\r\n```\r\nreturn\r\n```python\r\n[b'Results',\r\n b' Project',\r\n b' Filter',\r\n b' Node By Label Scan | (n:person)']\r\n```\r\n\r\nThe problem is the result that type is List[byte].\r\nI guess the code should modify as this:\r\n```python\r\n def explain(self, query, params=None):\r\n \"\"\"\r\n Get the execution plan for given query,\r\n Returns an array of operations.\r\n For more information see `GRAPH.EXPLAIN <https://oss.redis.com/redisgraph/master/commands/#graphexplain>`_. # noqa\r\n\r\n Args:\r\n\r\n -------\r\n query:\r\n The query that will be executed.\r\n params: dict\r\n Query parameters.\r\n \"\"\"\r\n if params is not None:\r\n query = self._build_params_header(params) + query\r\n\r\n plan = self.execute_command(\"GRAPH.EXPLAIN\", self.name, query)\r\n return \"\\n\".join([b.decode() for b in plan])\r\n```\r\nor\r\n```python\r\n def explain(self, query, params=None):\r\n \"\"\"\r\n Get the execution plan for given query,\r\n Returns an array of operations.\r\n For more information see `GRAPH.EXPLAIN <https://oss.redis.com/redisgraph/master/commands/#graphexplain>`_. # noqa\r\n\r\n Args:\r\n\r\n -------\r\n query:\r\n The query that will be executed.\r\n params: dict\r\n Query parameters.\r\n \"\"\"\r\n if params is not None:\r\n query = self._build_params_header(params) + query\r\n\r\n plan = self.execute_command(\"GRAPH.EXPLAIN\", self.name, query)\r\n return plan\r\n```\r\n\n", "before_files": [{"content": "from redis import DataError\nfrom redis.exceptions import ResponseError\n\nfrom .exceptions import VersionMismatchException\nfrom .query_result import QueryResult\n\n\nclass GraphCommands:\n \"\"\"RedisGraph Commands\"\"\"\n\n def commit(self):\n \"\"\"\n Create entire graph.\n For more information see `CREATE <https://oss.redis.com/redisgraph/master/commands/#create>`_. # noqa\n \"\"\"\n if len(self.nodes) == 0 and len(self.edges) == 0:\n return None\n\n query = \"CREATE \"\n for _, node in self.nodes.items():\n query += str(node) + \",\"\n\n query += \",\".join([str(edge) for edge in self.edges])\n\n # Discard leading comma.\n if query[-1] == \",\":\n query = query[:-1]\n\n return self.query(query)\n\n def query(self, q, params=None, timeout=None, read_only=False, profile=False):\n \"\"\"\n Executes a query against the graph.\n For more information see `GRAPH.QUERY <https://oss.redis.com/redisgraph/master/commands/#graphquery>`_. # noqa\n\n Args:\n\n q : str\n The query.\n params : dict\n Query parameters.\n timeout : int\n Maximum runtime for read queries in milliseconds.\n read_only : bool\n Executes a readonly query if set to True.\n profile : bool\n Return details on results produced by and time\n spent in each operation.\n \"\"\"\n\n # maintain original 'q'\n query = q\n\n # handle query parameters\n if params is not None:\n query = self._build_params_header(params) + query\n\n # construct query command\n # ask for compact result-set format\n # specify known graph version\n if profile:\n cmd = \"GRAPH.PROFILE\"\n else:\n cmd = \"GRAPH.RO_QUERY\" if read_only else \"GRAPH.QUERY\"\n command = [cmd, self.name, query, \"--compact\"]\n\n # include timeout is specified\n if timeout:\n if not isinstance(timeout, int):\n raise Exception(\"Timeout argument must be a positive integer\")\n command += [\"timeout\", timeout]\n\n # issue query\n try:\n response = self.execute_command(*command)\n return QueryResult(self, response, profile)\n except ResponseError as e:\n if \"wrong number of arguments\" in str(e):\n print(\n \"Note: RedisGraph Python requires server version 2.2.8 or above\"\n ) # noqa\n if \"unknown command\" in str(e) and read_only:\n # `GRAPH.RO_QUERY` is unavailable in older versions.\n return self.query(q, params, timeout, read_only=False)\n raise e\n except VersionMismatchException as e:\n # client view over the graph schema is out of sync\n # set client version and refresh local schema\n self.version = e.version\n self._refresh_schema()\n # re-issue query\n return self.query(q, params, timeout, read_only)\n\n def merge(self, pattern):\n \"\"\"\n Merge pattern.\n For more information see `MERGE <https://oss.redis.com/redisgraph/master/commands/#merge>`_. # noqa\n \"\"\"\n query = \"MERGE \"\n query += str(pattern)\n\n return self.query(query)\n\n def delete(self):\n \"\"\"\n Deletes graph.\n For more information see `DELETE <https://oss.redis.com/redisgraph/master/commands/#delete>`_. # noqa\n \"\"\"\n self._clear_schema()\n return self.execute_command(\"GRAPH.DELETE\", self.name)\n\n # declared here, to override the built in redis.db.flush()\n def flush(self):\n \"\"\"\n Commit the graph and reset the edges and the nodes to zero length.\n \"\"\"\n self.commit()\n self.nodes = {}\n self.edges = []\n\n def explain(self, query, params=None):\n \"\"\"\n Get the execution plan for given query,\n Returns an array of operations.\n For more information see `GRAPH.EXPLAIN <https://oss.redis.com/redisgraph/master/commands/#graphexplain>`_. # noqa\n\n Args:\n\n query:\n The query that will be executed.\n params: dict\n Query parameters.\n \"\"\"\n if params is not None:\n query = self._build_params_header(params) + query\n\n plan = self.execute_command(\"GRAPH.EXPLAIN\", self.name, query)\n return \"\\n\".join(plan)\n\n def bulk(self, **kwargs):\n \"\"\"Internal only. Not supported.\"\"\"\n raise NotImplementedError(\n \"GRAPH.BULK is internal only. \"\n \"Use https://github.com/redisgraph/redisgraph-bulk-loader.\"\n )\n\n def profile(self, query):\n \"\"\"\n Execute a query and produce an execution plan augmented with metrics\n for each operation's execution. Return a string representation of a\n query execution plan, with details on results produced by and time\n spent in each operation.\n For more information see `GRAPH.PROFILE <https://oss.redis.com/redisgraph/master/commands/#graphprofile>`_. # noqa\n \"\"\"\n return self.query(query, profile=True)\n\n def slowlog(self):\n \"\"\"\n Get a list containing up to 10 of the slowest queries issued\n against the given graph ID.\n For more information see `GRAPH.SLOWLOG <https://oss.redis.com/redisgraph/master/commands/#graphslowlog>`_. # noqa\n\n Each item in the list has the following structure:\n 1. A unix timestamp at which the log entry was processed.\n 2. The issued command.\n 3. The issued query.\n 4. The amount of time needed for its execution, in milliseconds.\n \"\"\"\n return self.execute_command(\"GRAPH.SLOWLOG\", self.name)\n\n def config(self, name, value=None, set=False):\n \"\"\"\n Retrieve or update a RedisGraph configuration.\n For more information see `GRAPH.CONFIG <https://oss.redis.com/redisgraph/master/commands/#graphconfig>`_. # noqa\n\n Args:\n\n name : str\n The name of the configuration\n value :\n The value we want to set (can be used only when `set` is on)\n set : bool\n Turn on to set a configuration. Default behavior is get.\n \"\"\"\n params = [\"SET\" if set else \"GET\", name]\n if value is not None:\n if set:\n params.append(value)\n else:\n raise DataError(\n \"``value`` can be provided only when ``set`` is True\"\n ) # noqa\n return self.execute_command(\"GRAPH.CONFIG\", *params)\n\n def list_keys(self):\n \"\"\"\n Lists all graph keys in the keyspace.\n For more information see `GRAPH.LIST <https://oss.redis.com/redisgraph/master/commands/#graphlist>`_. # noqa\n \"\"\"\n return self.execute_command(\"GRAPH.LIST\")\n", "path": "redis/commands/graph/commands.py"}], "after_files": [{"content": "from redis import DataError\nfrom redis.exceptions import ResponseError\n\nfrom .exceptions import VersionMismatchException\nfrom .query_result import QueryResult\n\n\nclass GraphCommands:\n \"\"\"RedisGraph Commands\"\"\"\n\n def commit(self):\n \"\"\"\n Create entire graph.\n For more information see `CREATE <https://oss.redis.com/redisgraph/master/commands/#create>`_. # noqa\n \"\"\"\n if len(self.nodes) == 0 and len(self.edges) == 0:\n return None\n\n query = \"CREATE \"\n for _, node in self.nodes.items():\n query += str(node) + \",\"\n\n query += \",\".join([str(edge) for edge in self.edges])\n\n # Discard leading comma.\n if query[-1] == \",\":\n query = query[:-1]\n\n return self.query(query)\n\n def query(self, q, params=None, timeout=None, read_only=False, profile=False):\n \"\"\"\n Executes a query against the graph.\n For more information see `GRAPH.QUERY <https://oss.redis.com/redisgraph/master/commands/#graphquery>`_. # noqa\n\n Args:\n\n q : str\n The query.\n params : dict\n Query parameters.\n timeout : int\n Maximum runtime for read queries in milliseconds.\n read_only : bool\n Executes a readonly query if set to True.\n profile : bool\n Return details on results produced by and time\n spent in each operation.\n \"\"\"\n\n # maintain original 'q'\n query = q\n\n # handle query parameters\n if params is not None:\n query = self._build_params_header(params) + query\n\n # construct query command\n # ask for compact result-set format\n # specify known graph version\n if profile:\n cmd = \"GRAPH.PROFILE\"\n else:\n cmd = \"GRAPH.RO_QUERY\" if read_only else \"GRAPH.QUERY\"\n command = [cmd, self.name, query, \"--compact\"]\n\n # include timeout is specified\n if timeout:\n if not isinstance(timeout, int):\n raise Exception(\"Timeout argument must be a positive integer\")\n command += [\"timeout\", timeout]\n\n # issue query\n try:\n response = self.execute_command(*command)\n return QueryResult(self, response, profile)\n except ResponseError as e:\n if \"wrong number of arguments\" in str(e):\n print(\n \"Note: RedisGraph Python requires server version 2.2.8 or above\"\n ) # noqa\n if \"unknown command\" in str(e) and read_only:\n # `GRAPH.RO_QUERY` is unavailable in older versions.\n return self.query(q, params, timeout, read_only=False)\n raise e\n except VersionMismatchException as e:\n # client view over the graph schema is out of sync\n # set client version and refresh local schema\n self.version = e.version\n self._refresh_schema()\n # re-issue query\n return self.query(q, params, timeout, read_only)\n\n def merge(self, pattern):\n \"\"\"\n Merge pattern.\n For more information see `MERGE <https://oss.redis.com/redisgraph/master/commands/#merge>`_. # noqa\n \"\"\"\n query = \"MERGE \"\n query += str(pattern)\n\n return self.query(query)\n\n def delete(self):\n \"\"\"\n Deletes graph.\n For more information see `DELETE <https://oss.redis.com/redisgraph/master/commands/#delete>`_. # noqa\n \"\"\"\n self._clear_schema()\n return self.execute_command(\"GRAPH.DELETE\", self.name)\n\n # declared here, to override the built in redis.db.flush()\n def flush(self):\n \"\"\"\n Commit the graph and reset the edges and the nodes to zero length.\n \"\"\"\n self.commit()\n self.nodes = {}\n self.edges = []\n\n def explain(self, query, params=None):\n \"\"\"\n Get the execution plan for given query,\n Returns an array of operations.\n For more information see `GRAPH.EXPLAIN <https://oss.redis.com/redisgraph/master/commands/#graphexplain>`_. # noqa\n\n Args:\n\n query:\n The query that will be executed.\n params: dict\n Query parameters.\n \"\"\"\n if params is not None:\n query = self._build_params_header(params) + query\n\n plan = self.execute_command(\"GRAPH.EXPLAIN\", self.name, query)\n if isinstance(plan[0], bytes):\n plan = [b.decode() for b in plan]\n return \"\\n\".join(plan)\n\n def bulk(self, **kwargs):\n \"\"\"Internal only. Not supported.\"\"\"\n raise NotImplementedError(\n \"GRAPH.BULK is internal only. \"\n \"Use https://github.com/redisgraph/redisgraph-bulk-loader.\"\n )\n\n def profile(self, query):\n \"\"\"\n Execute a query and produce an execution plan augmented with metrics\n for each operation's execution. Return a string representation of a\n query execution plan, with details on results produced by and time\n spent in each operation.\n For more information see `GRAPH.PROFILE <https://oss.redis.com/redisgraph/master/commands/#graphprofile>`_. # noqa\n \"\"\"\n return self.query(query, profile=True)\n\n def slowlog(self):\n \"\"\"\n Get a list containing up to 10 of the slowest queries issued\n against the given graph ID.\n For more information see `GRAPH.SLOWLOG <https://oss.redis.com/redisgraph/master/commands/#graphslowlog>`_. # noqa\n\n Each item in the list has the following structure:\n 1. A unix timestamp at which the log entry was processed.\n 2. The issued command.\n 3. The issued query.\n 4. The amount of time needed for its execution, in milliseconds.\n \"\"\"\n return self.execute_command(\"GRAPH.SLOWLOG\", self.name)\n\n def config(self, name, value=None, set=False):\n \"\"\"\n Retrieve or update a RedisGraph configuration.\n For more information see `GRAPH.CONFIG <https://oss.redis.com/redisgraph/master/commands/#graphconfig>`_. # noqa\n\n Args:\n\n name : str\n The name of the configuration\n value :\n The value we want to set (can be used only when `set` is on)\n set : bool\n Turn on to set a configuration. Default behavior is get.\n \"\"\"\n params = [\"SET\" if set else \"GET\", name]\n if value is not None:\n if set:\n params.append(value)\n else:\n raise DataError(\n \"``value`` can be provided only when ``set`` is True\"\n ) # noqa\n return self.execute_command(\"GRAPH.CONFIG\", *params)\n\n def list_keys(self):\n \"\"\"\n Lists all graph keys in the keyspace.\n For more information see `GRAPH.LIST <https://oss.redis.com/redisgraph/master/commands/#graphlist>`_. # noqa\n \"\"\"\n return self.execute_command(\"GRAPH.LIST\")\n", "path": "redis/commands/graph/commands.py"}]}
| 2,754 | 127 |
gh_patches_debug_13451
|
rasdani/github-patches
|
git_diff
|
CTPUG__wafer-312
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Schedule Editor does not clear extra fields on existing items
When replacing an existing item in the schedule editor, the notes, css_class and details fields are not replaced or cleared.
While this can be useful to leave css_class untouched, it is surprising behaviour and usually the wrong thing to do for notes and details.
I think we must clear notes and details when this happens.
We should probably clear all extra fields when replacing an item, for the sake of predictablity,
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `wafer/schedule/serializers.py`
Content:
```
1 from rest_framework import serializers
2
3 from wafer.talks.models import Talk
4 from wafer.pages.models import Page
5 from wafer.schedule.models import ScheduleItem, Venue, Slot
6
7
8 class ScheduleItemSerializer(serializers.HyperlinkedModelSerializer):
9 page = serializers.PrimaryKeyRelatedField(
10 allow_null=True, queryset=Page.objects.all())
11 talk = serializers.PrimaryKeyRelatedField(
12 allow_null=True, queryset=Talk.objects.all())
13 venue = serializers.PrimaryKeyRelatedField(
14 allow_null=True, queryset=Venue.objects.all())
15 slots = serializers.PrimaryKeyRelatedField(
16 allow_null=True, many=True, queryset=Slot.objects.all())
17
18 class Meta:
19 model = ScheduleItem
20 fields = ('id', 'talk', 'page', 'venue', 'slots')
21
22 def create(self, validated_data):
23 venue_id = validated_data['venue']
24 slots = validated_data['slots']
25 talk = validated_data.get('talk')
26 page = validated_data.get('page')
27
28 try:
29 existing_schedule_item = ScheduleItem.objects.get(
30 venue_id=venue_id, slots__in=slots)
31 except ScheduleItem.DoesNotExist:
32 pass
33 else:
34 existing_schedule_item.talk = talk
35 existing_schedule_item.page = page
36 existing_schedule_item.slots = slots
37 existing_schedule_item.save()
38 return existing_schedule_item
39 return super(ScheduleItemSerializer, self).create(validated_data)
40
41
42
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/wafer/schedule/serializers.py b/wafer/schedule/serializers.py
--- a/wafer/schedule/serializers.py
+++ b/wafer/schedule/serializers.py
@@ -34,6 +34,12 @@
existing_schedule_item.talk = talk
existing_schedule_item.page = page
existing_schedule_item.slots = slots
+ # Clear any existing details that aren't editable by the
+ # schedule edit view
+ existing_schedule_item.details = ''
+ existing_schedule_item.notes = ''
+ existing_schedule_item.css_class = ''
+ existing_schedule_item.expand = False
existing_schedule_item.save()
return existing_schedule_item
return super(ScheduleItemSerializer, self).create(validated_data)
|
{"golden_diff": "diff --git a/wafer/schedule/serializers.py b/wafer/schedule/serializers.py\n--- a/wafer/schedule/serializers.py\n+++ b/wafer/schedule/serializers.py\n@@ -34,6 +34,12 @@\n existing_schedule_item.talk = talk\n existing_schedule_item.page = page\n existing_schedule_item.slots = slots\n+ # Clear any existing details that aren't editable by the\n+ # schedule edit view\n+ existing_schedule_item.details = ''\n+ existing_schedule_item.notes = ''\n+ existing_schedule_item.css_class = ''\n+ existing_schedule_item.expand = False\n existing_schedule_item.save()\n return existing_schedule_item\n return super(ScheduleItemSerializer, self).create(validated_data)\n", "issue": "Schedule Editor does not clear extra fields on existing items\nWhen replacing an existing item in the schedule editor, the notes, css_class and details fields are not replaced or cleared.\n\nWhile this can be useful to leave css_class untouched, it is surprising behaviour and usually the wrong thing to do for notes and details. \n\nI think we must clear notes and details when this happens.\n\nWe should probably clear all extra fields when replacing an item, for the sake of predictablity,\n\n", "before_files": [{"content": "from rest_framework import serializers\n\nfrom wafer.talks.models import Talk\nfrom wafer.pages.models import Page\nfrom wafer.schedule.models import ScheduleItem, Venue, Slot\n\n\nclass ScheduleItemSerializer(serializers.HyperlinkedModelSerializer):\n page = serializers.PrimaryKeyRelatedField(\n allow_null=True, queryset=Page.objects.all())\n talk = serializers.PrimaryKeyRelatedField(\n allow_null=True, queryset=Talk.objects.all())\n venue = serializers.PrimaryKeyRelatedField(\n allow_null=True, queryset=Venue.objects.all())\n slots = serializers.PrimaryKeyRelatedField(\n allow_null=True, many=True, queryset=Slot.objects.all())\n\n class Meta:\n model = ScheduleItem\n fields = ('id', 'talk', 'page', 'venue', 'slots')\n\n def create(self, validated_data):\n venue_id = validated_data['venue']\n slots = validated_data['slots']\n talk = validated_data.get('talk')\n page = validated_data.get('page')\n\n try:\n existing_schedule_item = ScheduleItem.objects.get(\n venue_id=venue_id, slots__in=slots)\n except ScheduleItem.DoesNotExist:\n pass\n else:\n existing_schedule_item.talk = talk\n existing_schedule_item.page = page\n existing_schedule_item.slots = slots\n existing_schedule_item.save()\n return existing_schedule_item\n return super(ScheduleItemSerializer, self).create(validated_data)\n\n\n", "path": "wafer/schedule/serializers.py"}], "after_files": [{"content": "from rest_framework import serializers\n\nfrom wafer.talks.models import Talk\nfrom wafer.pages.models import Page\nfrom wafer.schedule.models import ScheduleItem, Venue, Slot\n\n\nclass ScheduleItemSerializer(serializers.HyperlinkedModelSerializer):\n page = serializers.PrimaryKeyRelatedField(\n allow_null=True, queryset=Page.objects.all())\n talk = serializers.PrimaryKeyRelatedField(\n allow_null=True, queryset=Talk.objects.all())\n venue = serializers.PrimaryKeyRelatedField(\n allow_null=True, queryset=Venue.objects.all())\n slots = serializers.PrimaryKeyRelatedField(\n allow_null=True, many=True, queryset=Slot.objects.all())\n\n class Meta:\n model = ScheduleItem\n fields = ('id', 'talk', 'page', 'venue', 'slots')\n\n def create(self, validated_data):\n venue_id = validated_data['venue']\n slots = validated_data['slots']\n talk = validated_data.get('talk')\n page = validated_data.get('page')\n\n try:\n existing_schedule_item = ScheduleItem.objects.get(\n venue_id=venue_id, slots__in=slots)\n except ScheduleItem.DoesNotExist:\n pass\n else:\n existing_schedule_item.talk = talk\n existing_schedule_item.page = page\n existing_schedule_item.slots = slots\n # Clear any existing details that aren't editable by the\n # schedule edit view\n existing_schedule_item.details = ''\n existing_schedule_item.notes = ''\n existing_schedule_item.css_class = ''\n existing_schedule_item.expand = False\n existing_schedule_item.save()\n return existing_schedule_item\n return super(ScheduleItemSerializer, self).create(validated_data)\n\n\n", "path": "wafer/schedule/serializers.py"}]}
| 730 | 169 |
gh_patches_debug_19609
|
rasdani/github-patches
|
git_diff
|
fossasia__open-event-server-4050
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
IntegrityError: (psycopg2.IntegrityError) null value in column "name" violates not-null constraint
https://sentry.eventyay.com/eventyay/api/issues/14/
```
IntegrityError: (psycopg2.IntegrityError) null value in column "name" violates not-null constraint
DETAIL: Failing row contains (392, null, null, null, null, null, 176).
CONTEXT: SQL statement "update microlocations set name = NULL where name = '';update microlocations set room = NULL where room = '';"
PL/pgSQL function replace_in_table(text,text,text,boolean) line 23 at EXECUTE
[SQL: "SELECT replace_in_table('microlocations', '''''', 'NULL', true)"]
(25 additional frame(s) were not displayed)
...
File "sqlalchemy/engine/base.py", line 1189, in _execute_context
context)
File "sqlalchemy/engine/base.py", line 1402, in _handle_dbapi_exception
exc_info
File "sqlalchemy/util/compat.py", line 203, in raise_from_cause
reraise(type(exception), exception, tb=exc_tb, cause=cause)
File "sqlalchemy/engine/base.py", line 1182, in _execute_context
context)
File "sqlalchemy/engine/default.py", line 470, in do_execute
cursor.execute(statement, parameters)
IntegrityError: (psycopg2.IntegrityError) null value in column "name" violates not-null constraint
DETAIL: Failing row contains (392, null, null, null, null, null, 176).
CONTEXT: SQL statement "update microlocations set name = NULL where name = '';update microlocations set room = NULL where room = '';"
PL/pgSQL function replace_in_table(text,text,text,boolean) line 23 at EXECUTE
[SQL: "SELECT replace_in_table('microlocations', '''''', 'NULL', true)"]
```
> This error occurred when running the migrations against a clone of the production database
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `migrations/versions/1b9c8698c632_.py`
Content:
```
1 """empty message
2
3 Revision ID: 1b9c8698c632
4 Revises: af9215895b80
5 Create Date: 2017-06-15 22:39:27.240374
6
7 """
8
9 # revision identifiers, used by Alembic.
10 revision = '1b9c8698c632'
11 down_revision = 'af9215895b80'
12
13 from alembic import op
14 import sqlalchemy as sa
15 import sqlalchemy_utils
16
17
18 class ReplaceableObject(object):
19 def __init__(self, name, sqltext):
20 self.name = name
21 self.sqltext = sqltext
22
23 # function to find a and replace a string(the whole string should match, not substring)
24 # across all the columns of a table.
25 # If run is true then the function executes the query.
26 replace_in_table = ReplaceableObject(
27 "replace_in_table(table_name text, original text, replacement text, run boolean)",
28 """
29 returns text language plpgsql
30 as $$
31 declare
32 r record;
33 q text;
34 begin
35 q = '';
36 for r in
37 select attname
38 from pg_attribute
39 where attrelid = table_name::regclass
40 and attnum > 0
41 and not attisdropped
42 and (atttypid = 25
43 or atttypid = 1042
44 or atttypid = 1043)
45 order by attnum
46 loop
47 q = format($fmt$%supdate %s set %s = %s where %s = %s;$fmt$,
48 q, table_name, r.attname, replacement, r.attname, original);
49 end loop;
50 q = format('%s;', rtrim(q, ';'));
51 if run then
52 execute q;
53 end if;
54 return q;
55 end $$;
56 """
57 )
58
59
60 def upgrade():
61 # ### commands auto generated by Alembic - please adjust! ###
62 op.create_or_replace_sp(replace_in_table)
63 op.alter_column('social_links', 'link',
64 existing_type=sa.VARCHAR(),
65 nullable=True)
66 op.alter_column('sessions', 'title',
67 existing_type=sa.VARCHAR(),
68 nullable=True)
69 op.alter_column('sessions_version', 'title',
70 existing_type=sa.VARCHAR(),
71 nullable=True)
72 op.alter_column('call_for_papers', 'announcement',
73 existing_type=sa.VARCHAR(),
74 nullable=True)
75
76 # to create these statements automatically, please print them like
77 # print("""op.execute("SELECT replace_in_table('""" + table[0] + """', '''''', 'NULL', true)",
78 # execution_options=None)""")
79 # inside the function to fetch all tables and models in the views directory
80 # TRY AS HARD AS YOU CAN TO NOT BRING THE APP LOGIC IN MIGRATIONS
81 op.execute("SELECT replace_in_table('event_copyrights', '''''', 'NULL', true)",
82 execution_options=None)
83 op.execute("SELECT replace_in_table('mails', '''''', 'NULL', true)",
84 execution_options=None)
85 op.execute("SELECT replace_in_table('social_links', '''''', 'NULL', true)",
86 execution_options=None)
87 op.execute("SELECT replace_in_table('booked_ticket', '''''', 'NULL', true)",
88 execution_options=None)
89 op.execute("SELECT replace_in_table('tax', '''''', 'NULL', true)",
90 execution_options=None)
91 op.execute("SELECT replace_in_table('custom_sys_roles', '''''', 'NULL', true)",
92 execution_options=None)
93 op.execute("SELECT replace_in_table('custom_forms', '''''', 'NULL', true)",
94 execution_options=None)
95 op.execute("SELECT replace_in_table('microlocations', '''''', 'NULL', true)",
96 execution_options=None)
97 op.execute("SELECT replace_in_table('user_permissions', '''''', 'NULL', true)",
98 execution_options=None)
99 op.execute("SELECT replace_in_table('event_user', '''''', 'NULL', true)",
100 execution_options=None)
101 op.execute("SELECT replace_in_table('orders', '''''', 'NULL', true)",
102 execution_options=None)
103 op.execute("SELECT replace_in_table('orders_tickets', '''''', 'NULL', true)",
104 execution_options=None)
105 op.execute("SELECT replace_in_table('discount_codes', '''''', 'NULL', true)",
106 execution_options=None)
107 op.execute("SELECT replace_in_table('ticket_fees', '''''', 'NULL', true)",
108 execution_options=None)
109 op.execute("SELECT replace_in_table('access_codes', '''''', 'NULL', true)",
110 execution_options=None)
111 op.execute("SELECT replace_in_table('user_system_role', '''''', 'NULL', true)",
112 execution_options=None)
113 op.execute("SELECT replace_in_table('message_settings', '''''', 'NULL', true)",
114 execution_options=None)
115 op.execute("SELECT replace_in_table('image_sizes', '''''', 'NULL', true)",
116 execution_options=None)
117 op.execute("SELECT replace_in_table('speaker', '''''', 'NULL', true)",
118 execution_options=None)
119 op.execute("SELECT replace_in_table('stripe_authorizations', '''''', 'NULL', true)",
120 execution_options=None)
121 op.execute("SELECT replace_in_table('role_invites', '''''', 'NULL', true)",
122 execution_options=None)
123 op.execute("SELECT replace_in_table('custom_placeholders', '''''', 'NULL', true)",
124 execution_options=None)
125 op.execute("SELECT replace_in_table('session_types', '''''', 'NULL', true)",
126 execution_options=None)
127 op.execute("SELECT replace_in_table('events', '''''', 'NULL', true)",
128 execution_options=None)
129 op.execute("SELECT replace_in_table('activities', '''''', 'NULL', true)",
130 execution_options=None)
131 op.execute("SELECT replace_in_table('users', '''''', 'NULL', true)",
132 execution_options=None)
133 op.execute("SELECT replace_in_table('sessions', '''''', 'NULL', true)",
134 execution_options=None)
135 op.execute("SELECT replace_in_table('sponsors', '''''', 'NULL', true)",
136 execution_options=None)
137 op.execute("SELECT replace_in_table('ticket_holders', '''''', 'NULL', true)",
138 execution_options=None)
139 op.execute("SELECT replace_in_table('notifications', '''''', 'NULL', true)",
140 execution_options=None)
141 op.execute("SELECT replace_in_table('tracks', '''''', 'NULL', true)",
142 execution_options=None)
143 op.execute("SELECT replace_in_table('users_events_roles', '''''', 'NULL', true)",
144 execution_options=None)
145 op.execute("SELECT replace_in_table('email_notifications', '''''', 'NULL', true)",
146 execution_options=None)
147 op.execute("SELECT replace_in_table('services', '''''', 'NULL', true)",
148 execution_options=None)
149 op.execute("SELECT replace_in_table('export_jobs', '''''', 'NULL', true)",
150 execution_options=None)
151 op.execute("SELECT replace_in_table('image_config', '''''', 'NULL', true)",
152 execution_options=None)
153 op.execute("SELECT replace_in_table('pages', '''''', 'NULL', true)",
154 execution_options=None)
155 op.execute("SELECT replace_in_table('permissions', '''''', 'NULL', true)",
156 execution_options=None)
157 op.execute("SELECT replace_in_table('tickets', '''''', 'NULL', true)",
158 execution_options=None)
159 op.execute("SELECT replace_in_table('roles', '''''', 'NULL', true)",
160 execution_options=None)
161 op.execute("SELECT replace_in_table('settings', '''''', 'NULL', true)",
162 execution_options=None)
163 op.execute("SELECT replace_in_table('versions', '''''', 'NULL', true)",
164 execution_options=None)
165 op.execute("SELECT replace_in_table('modules', '''''', 'NULL', true)",
166 execution_options=None)
167 op.execute("SELECT replace_in_table('event_invoices', '''''', 'NULL', true)",
168 execution_options=None)
169 op.execute("SELECT replace_in_table('import_jobs', '''''', 'NULL', true)",
170 execution_options=None)
171 op.execute("SELECT replace_in_table('call_for_papers', '''''', 'NULL', true)",
172 execution_options=None)
173 op.execute("SELECT replace_in_table('panel_permissions', '''''', 'NULL', true)",
174 execution_options=None)
175 op.execute("SELECT replace_in_table('ticket_tag', '''''', 'NULL', true)",
176 execution_options=None)
177 op.execute("SELECT replace_in_table('invites', '''''', 'NULL', true)",
178 execution_options=None)
179
180 op.execute("SELECT replace_in_table('sessions_version', '''''', 'NULL', true)",
181 execution_options=None)
182 op.execute("SELECT replace_in_table('events_version', '''''', 'NULL', true)",
183 execution_options=None)
184 # ### end Alembic commands ###
185
186
187 def downgrade():
188 # ### commands auto generated by Alembic - please adjust! ###
189 op.drop_sp(replace_in_table)
190 op.alter_column('social_links', 'link',
191 existing_type=sa.VARCHAR(),
192 nullable=False)
193 op.alter_column('sessions', 'title',
194 existing_type=sa.VARCHAR(),
195 nullable=False)
196 op.alter_column('sessions_version', 'title',
197 existing_type=sa.VARCHAR(),
198 nullable=False)
199 op.alter_column('call_for_papers', 'announcement',
200 existing_type=sa.VARCHAR(),
201 nullable=False)
202 # ### end Alembic commands ###
203
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/migrations/versions/1b9c8698c632_.py b/migrations/versions/1b9c8698c632_.py
--- a/migrations/versions/1b9c8698c632_.py
+++ b/migrations/versions/1b9c8698c632_.py
@@ -23,6 +23,7 @@
# function to find a and replace a string(the whole string should match, not substring)
# across all the columns of a table.
# If run is true then the function executes the query.
+# This will work only for nullable columns
replace_in_table = ReplaceableObject(
"replace_in_table(table_name text, original text, replacement text, run boolean)",
"""
@@ -42,6 +43,7 @@
and (atttypid = 25
or atttypid = 1042
or atttypid = 1043)
+ and not attnotnull
order by attnum
loop
q = format($fmt$%supdate %s set %s = %s where %s = %s;$fmt$,
|
{"golden_diff": "diff --git a/migrations/versions/1b9c8698c632_.py b/migrations/versions/1b9c8698c632_.py\n--- a/migrations/versions/1b9c8698c632_.py\n+++ b/migrations/versions/1b9c8698c632_.py\n@@ -23,6 +23,7 @@\n # function to find a and replace a string(the whole string should match, not substring)\n # across all the columns of a table.\n # If run is true then the function executes the query.\n+# This will work only for nullable columns\n replace_in_table = ReplaceableObject(\n \"replace_in_table(table_name text, original text, replacement text, run boolean)\",\n \"\"\"\n@@ -42,6 +43,7 @@\n and (atttypid = 25\n or atttypid = 1042\n or atttypid = 1043)\n+ and not attnotnull\n order by attnum\n loop\n q = format($fmt$%supdate %s set %s = %s where %s = %s;$fmt$,\n", "issue": "IntegrityError: (psycopg2.IntegrityError) null value in column \"name\" violates not-null constraint\nhttps://sentry.eventyay.com/eventyay/api/issues/14/\r\n\r\n```\r\nIntegrityError: (psycopg2.IntegrityError) null value in column \"name\" violates not-null constraint\r\nDETAIL: Failing row contains (392, null, null, null, null, null, 176).\r\nCONTEXT: SQL statement \"update microlocations set name = NULL where name = '';update microlocations set room = NULL where room = '';\"\r\nPL/pgSQL function replace_in_table(text,text,text,boolean) line 23 at EXECUTE\r\n [SQL: \"SELECT replace_in_table('microlocations', '''''', 'NULL', true)\"]\r\n(25 additional frame(s) were not displayed)\r\n...\r\n File \"sqlalchemy/engine/base.py\", line 1189, in _execute_context\r\n context)\r\n File \"sqlalchemy/engine/base.py\", line 1402, in _handle_dbapi_exception\r\n exc_info\r\n File \"sqlalchemy/util/compat.py\", line 203, in raise_from_cause\r\n reraise(type(exception), exception, tb=exc_tb, cause=cause)\r\n File \"sqlalchemy/engine/base.py\", line 1182, in _execute_context\r\n context)\r\n File \"sqlalchemy/engine/default.py\", line 470, in do_execute\r\n cursor.execute(statement, parameters)\r\n\r\nIntegrityError: (psycopg2.IntegrityError) null value in column \"name\" violates not-null constraint\r\nDETAIL: Failing row contains (392, null, null, null, null, null, 176).\r\nCONTEXT: SQL statement \"update microlocations set name = NULL where name = '';update microlocations set room = NULL where room = '';\"\r\nPL/pgSQL function replace_in_table(text,text,text,boolean) line 23 at EXECUTE\r\n [SQL: \"SELECT replace_in_table('microlocations', '''''', 'NULL', true)\"]\r\n```\r\n\r\n> This error occurred when running the migrations against a clone of the production database\n", "before_files": [{"content": "\"\"\"empty message\n\nRevision ID: 1b9c8698c632\nRevises: af9215895b80\nCreate Date: 2017-06-15 22:39:27.240374\n\n\"\"\"\n\n# revision identifiers, used by Alembic.\nrevision = '1b9c8698c632'\ndown_revision = 'af9215895b80'\n\nfrom alembic import op\nimport sqlalchemy as sa\nimport sqlalchemy_utils\n\n\nclass ReplaceableObject(object):\n def __init__(self, name, sqltext):\n self.name = name\n self.sqltext = sqltext\n\n# function to find a and replace a string(the whole string should match, not substring)\n# across all the columns of a table.\n# If run is true then the function executes the query.\nreplace_in_table = ReplaceableObject(\n \"replace_in_table(table_name text, original text, replacement text, run boolean)\",\n \"\"\"\n returns text language plpgsql\n as $$\n declare\n r record;\n q text;\n begin\n q = '';\n for r in\n select attname\n from pg_attribute\n where attrelid = table_name::regclass\n and attnum > 0\n and not attisdropped\n and (atttypid = 25\n or atttypid = 1042\n or atttypid = 1043)\n order by attnum\n loop\n q = format($fmt$%supdate %s set %s = %s where %s = %s;$fmt$,\n q, table_name, r.attname, replacement, r.attname, original);\n end loop;\n q = format('%s;', rtrim(q, ';'));\n if run then\n execute q;\n end if;\n return q;\n end $$;\n \"\"\"\n)\n\n\ndef upgrade():\n # ### commands auto generated by Alembic - please adjust! ###\n op.create_or_replace_sp(replace_in_table)\n op.alter_column('social_links', 'link',\n existing_type=sa.VARCHAR(),\n nullable=True)\n op.alter_column('sessions', 'title',\n existing_type=sa.VARCHAR(),\n nullable=True)\n op.alter_column('sessions_version', 'title',\n existing_type=sa.VARCHAR(),\n nullable=True)\n op.alter_column('call_for_papers', 'announcement',\n existing_type=sa.VARCHAR(),\n nullable=True)\n\n # to create these statements automatically, please print them like\n # print(\"\"\"op.execute(\"SELECT replace_in_table('\"\"\" + table[0] + \"\"\"', '''''', 'NULL', true)\",\n # execution_options=None)\"\"\")\n # inside the function to fetch all tables and models in the views directory\n # TRY AS HARD AS YOU CAN TO NOT BRING THE APP LOGIC IN MIGRATIONS\n op.execute(\"SELECT replace_in_table('event_copyrights', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('mails', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('social_links', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('booked_ticket', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('tax', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('custom_sys_roles', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('custom_forms', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('microlocations', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('user_permissions', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('event_user', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('orders', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('orders_tickets', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('discount_codes', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('ticket_fees', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('access_codes', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('user_system_role', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('message_settings', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('image_sizes', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('speaker', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('stripe_authorizations', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('role_invites', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('custom_placeholders', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('session_types', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('events', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('activities', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('users', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('sessions', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('sponsors', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('ticket_holders', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('notifications', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('tracks', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('users_events_roles', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('email_notifications', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('services', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('export_jobs', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('image_config', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('pages', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('permissions', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('tickets', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('roles', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('settings', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('versions', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('modules', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('event_invoices', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('import_jobs', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('call_for_papers', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('panel_permissions', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('ticket_tag', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('invites', '''''', 'NULL', true)\",\n execution_options=None)\n\n op.execute(\"SELECT replace_in_table('sessions_version', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('events_version', '''''', 'NULL', true)\",\n execution_options=None)\n # ### end Alembic commands ###\n\n\ndef downgrade():\n # ### commands auto generated by Alembic - please adjust! ###\n op.drop_sp(replace_in_table)\n op.alter_column('social_links', 'link',\n existing_type=sa.VARCHAR(),\n nullable=False)\n op.alter_column('sessions', 'title',\n existing_type=sa.VARCHAR(),\n nullable=False)\n op.alter_column('sessions_version', 'title',\n existing_type=sa.VARCHAR(),\n nullable=False)\n op.alter_column('call_for_papers', 'announcement',\n existing_type=sa.VARCHAR(),\n nullable=False)\n # ### end Alembic commands ###\n", "path": "migrations/versions/1b9c8698c632_.py"}], "after_files": [{"content": "\"\"\"empty message\n\nRevision ID: 1b9c8698c632\nRevises: af9215895b80\nCreate Date: 2017-06-15 22:39:27.240374\n\n\"\"\"\n\n# revision identifiers, used by Alembic.\nrevision = '1b9c8698c632'\ndown_revision = 'af9215895b80'\n\nfrom alembic import op\nimport sqlalchemy as sa\nimport sqlalchemy_utils\n\n\nclass ReplaceableObject(object):\n def __init__(self, name, sqltext):\n self.name = name\n self.sqltext = sqltext\n\n# function to find a and replace a string(the whole string should match, not substring)\n# across all the columns of a table.\n# If run is true then the function executes the query.\n# This will work only for nullable columns\nreplace_in_table = ReplaceableObject(\n \"replace_in_table(table_name text, original text, replacement text, run boolean)\",\n \"\"\"\n returns text language plpgsql\n as $$\n declare\n r record;\n q text;\n begin\n q = '';\n for r in\n select attname\n from pg_attribute\n where attrelid = table_name::regclass\n and attnum > 0\n and not attisdropped\n and (atttypid = 25\n or atttypid = 1042\n or atttypid = 1043)\n and not attnotnull\n order by attnum\n loop\n q = format($fmt$%supdate %s set %s = %s where %s = %s;$fmt$,\n q, table_name, r.attname, replacement, r.attname, original);\n end loop;\n q = format('%s;', rtrim(q, ';'));\n if run then\n execute q;\n end if;\n return q;\n end $$;\n \"\"\"\n)\n\n\ndef upgrade():\n # ### commands auto generated by Alembic - please adjust! ###\n op.create_or_replace_sp(replace_in_table)\n op.alter_column('social_links', 'link',\n existing_type=sa.VARCHAR(),\n nullable=True)\n op.alter_column('sessions', 'title',\n existing_type=sa.VARCHAR(),\n nullable=True)\n op.alter_column('sessions_version', 'title',\n existing_type=sa.VARCHAR(),\n nullable=True)\n op.alter_column('call_for_papers', 'announcement',\n existing_type=sa.VARCHAR(),\n nullable=True)\n\n # to create these statements automatically, please print them like\n # print(\"\"\"op.execute(\"SELECT replace_in_table('\"\"\" + table[0] + \"\"\"', '''''', 'NULL', true)\",\n # execution_options=None)\"\"\")\n # inside the function to fetch all tables and models in the views directory\n # TRY AS HARD AS YOU CAN TO NOT BRING THE APP LOGIC IN MIGRATIONS\n op.execute(\"SELECT replace_in_table('event_copyrights', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('mails', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('social_links', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('booked_ticket', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('tax', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('custom_sys_roles', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('custom_forms', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('microlocations', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('user_permissions', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('event_user', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('orders', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('orders_tickets', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('discount_codes', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('ticket_fees', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('access_codes', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('user_system_role', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('message_settings', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('image_sizes', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('speaker', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('stripe_authorizations', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('role_invites', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('custom_placeholders', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('session_types', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('events', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('activities', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('users', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('sessions', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('sponsors', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('ticket_holders', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('notifications', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('tracks', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('users_events_roles', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('email_notifications', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('services', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('export_jobs', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('image_config', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('pages', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('permissions', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('tickets', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('roles', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('settings', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('versions', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('modules', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('event_invoices', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('import_jobs', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('call_for_papers', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('panel_permissions', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('ticket_tag', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('invites', '''''', 'NULL', true)\",\n execution_options=None)\n\n op.execute(\"SELECT replace_in_table('sessions_version', '''''', 'NULL', true)\",\n execution_options=None)\n op.execute(\"SELECT replace_in_table('events_version', '''''', 'NULL', true)\",\n execution_options=None)\n # ### end Alembic commands ###\n\n\ndef downgrade():\n # ### commands auto generated by Alembic - please adjust! ###\n op.drop_sp(replace_in_table)\n op.alter_column('social_links', 'link',\n existing_type=sa.VARCHAR(),\n nullable=False)\n op.alter_column('sessions', 'title',\n existing_type=sa.VARCHAR(),\n nullable=False)\n op.alter_column('sessions_version', 'title',\n existing_type=sa.VARCHAR(),\n nullable=False)\n op.alter_column('call_for_papers', 'announcement',\n existing_type=sa.VARCHAR(),\n nullable=False)\n # ### end Alembic commands ###\n", "path": "migrations/versions/1b9c8698c632_.py"}]}
| 3,299 | 267 |
gh_patches_debug_827
|
rasdani/github-patches
|
git_diff
|
elastic__apm-agent-python-1558
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xff in position 369: invalid start byte
**Describe the bug**:
Sending PUT request to FastAPI with binary file encoded, its returns a error.
**To Reproduce**
1. Configure FastAPI with elasticapm.contrib.starlette
2. Send a PUT request with binary file
**Environment (please complete the following information)**
- OS: Linux
- Python version: 3.8
- Framework and version: fastapi 0.61.2
- APM Server version: 8
- Agent version: 8
The same problema was resolved in the main code : [#344](https://github.com/elastic/apm-agent-python/issues/334)
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `elasticapm/contrib/starlette/utils.py`
Content:
```
1 # BSD 3-Clause License
2 #
3 # Copyright (c) 2019, Elasticsearch BV
4 # All rights reserved.
5 #
6 # Redistribution and use in source and binary forms, with or without
7 # modification, are permitted provided that the following conditions are met:
8 #
9 # * Redistributions of source code must retain the above copyright notice, this
10 # list of conditions and the following disclaimer.
11 #
12 # * Redistributions in binary form must reproduce the above copyright notice,
13 # this list of conditions and the following disclaimer in the documentation
14 # and/or other materials provided with the distribution.
15 #
16 # * Neither the name of the copyright holder nor the names of its
17 # contributors may be used to endorse or promote products derived from
18 # this software without specific prior written permission.
19 #
20 # THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
21 # AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
22 # IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
23 # DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
24 # FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
25 # DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
26 # SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
27 # CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
28 # OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
29 # OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
30 import asyncio
31
32 from starlette.datastructures import Headers
33 from starlette.requests import Request
34 from starlette.types import Message
35
36 from elasticapm.conf import Config, constants
37 from elasticapm.utils import get_url_dict
38
39
40 async def get_data_from_request(request: Request, config: Config, event_type: str) -> dict:
41 """Loads data from incoming request for APM capturing.
42
43 Args:
44 request (Request)
45 config (Config)
46 event_type (str)
47
48 Returns:
49 dict
50 """
51 result = {
52 "method": request.method,
53 "socket": {"remote_address": _get_client_ip(request)},
54 "cookies": request.cookies,
55 }
56 if config.capture_headers:
57 result["headers"] = dict(request.headers)
58
59 if request.method in constants.HTTP_WITH_BODY:
60 if config.capture_body not in ("all", event_type):
61 result["body"] = "[REDACTED]"
62 else:
63 body = None
64 try:
65 body = await get_body(request)
66 except Exception:
67 pass
68 if body is not None:
69 result["body"] = body
70
71 result["url"] = get_url_dict(str(request.url))
72
73 return result
74
75
76 async def get_data_from_response(message: dict, config: Config, event_type: str) -> dict:
77 """Loads data from response for APM capturing.
78
79 Args:
80 message (dict)
81 config (Config)
82 event_type (str)
83
84 Returns:
85 dict
86 """
87 result = {}
88
89 if "status" in message:
90 result["status_code"] = message["status"]
91
92 if config.capture_headers and "headers" in message:
93 headers = Headers(raw=message["headers"])
94 result["headers"] = {key: ";".join(headers.getlist(key)) for key in headers.keys()}
95
96 return result
97
98
99 async def set_body(request: Request, body: bytes):
100 """Overwrites body in Starlette.
101
102 Args:
103 request (Request)
104 body (bytes)
105 """
106
107 async def receive() -> Message:
108 await asyncio.sleep(0)
109 return {"type": "http.request", "body": body}
110
111 request._receive = receive
112
113
114 async def get_body(request: Request) -> str:
115 """Gets body from the request.
116
117 When we consume the body, we replace the streaming mechanism with
118 a mocked version -- this workaround came from
119 https://github.com/encode/starlette/issues/495#issuecomment-513138055
120
121 Args:
122 request (Request)
123
124 Returns:
125 str
126 """
127 body = await request.body()
128 await set_body(request, body)
129
130 request._stream_consumed = False
131
132 return body.decode("utf-8")
133
134
135 async def query_params_to_dict(query_params: str) -> dict:
136 """Transforms query params from URL to dictionary
137
138 Args:
139 query_params (str)
140
141 Returns:
142 dict
143
144 Examples:
145 >>> print(query_params_to_dict(b"key=val&key2=val2"))
146 {"key": "val", "key2": "val2"}
147 """
148 query_params = query_params.split("&")
149 res = {}
150 for param in query_params:
151 key, val = param.split("=")
152 res[key] = val
153
154 return res
155
156
157 def _get_client_ip(request: Request):
158 x_forwarded_for = request.headers.get("HTTP_X_FORWARDED_FOR")
159 if x_forwarded_for:
160 ip = x_forwarded_for.split(",")[0]
161 else:
162 ip = request.headers.get("REMOTE_ADDR")
163 return ip
164
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/elasticapm/contrib/starlette/utils.py b/elasticapm/contrib/starlette/utils.py
--- a/elasticapm/contrib/starlette/utils.py
+++ b/elasticapm/contrib/starlette/utils.py
@@ -129,7 +129,7 @@
request._stream_consumed = False
- return body.decode("utf-8")
+ return body.decode("utf-8", errors="replace")
async def query_params_to_dict(query_params: str) -> dict:
|
{"golden_diff": "diff --git a/elasticapm/contrib/starlette/utils.py b/elasticapm/contrib/starlette/utils.py\n--- a/elasticapm/contrib/starlette/utils.py\n+++ b/elasticapm/contrib/starlette/utils.py\n@@ -129,7 +129,7 @@\n \n request._stream_consumed = False\n \n- return body.decode(\"utf-8\")\n+ return body.decode(\"utf-8\", errors=\"replace\")\n \n \n async def query_params_to_dict(query_params: str) -> dict:\n", "issue": "UnicodeDecodeError: 'utf-8' codec can't decode byte 0xff in position 369: invalid start byte\n**Describe the bug**: \r\nSending PUT request to FastAPI with binary file encoded, its returns a error.\r\n\r\n**To Reproduce**\r\n\r\n1. Configure FastAPI with elasticapm.contrib.starlette\r\n2. Send a PUT request with binary file\r\n\r\n**Environment (please complete the following information)**\r\n- OS: Linux\r\n- Python version: 3.8\r\n- Framework and version: fastapi 0.61.2\r\n- APM Server version: 8\r\n- Agent version: 8\r\n\r\nThe same problema was resolved in the main code : [#344](https://github.com/elastic/apm-agent-python/issues/334)\r\n\n", "before_files": [{"content": "# BSD 3-Clause License\n#\n# Copyright (c) 2019, Elasticsearch BV\n# All rights reserved.\n#\n# Redistribution and use in source and binary forms, with or without\n# modification, are permitted provided that the following conditions are met:\n#\n# * Redistributions of source code must retain the above copyright notice, this\n# list of conditions and the following disclaimer.\n#\n# * Redistributions in binary form must reproduce the above copyright notice,\n# this list of conditions and the following disclaimer in the documentation\n# and/or other materials provided with the distribution.\n#\n# * Neither the name of the copyright holder nor the names of its\n# contributors may be used to endorse or promote products derived from\n# this software without specific prior written permission.\n#\n# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\"\n# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE\n# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE\n# DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE\n# FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL\n# DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR\n# SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER\n# CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,\n# OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE\n# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\nimport asyncio\n\nfrom starlette.datastructures import Headers\nfrom starlette.requests import Request\nfrom starlette.types import Message\n\nfrom elasticapm.conf import Config, constants\nfrom elasticapm.utils import get_url_dict\n\n\nasync def get_data_from_request(request: Request, config: Config, event_type: str) -> dict:\n \"\"\"Loads data from incoming request for APM capturing.\n\n Args:\n request (Request)\n config (Config)\n event_type (str)\n\n Returns:\n dict\n \"\"\"\n result = {\n \"method\": request.method,\n \"socket\": {\"remote_address\": _get_client_ip(request)},\n \"cookies\": request.cookies,\n }\n if config.capture_headers:\n result[\"headers\"] = dict(request.headers)\n\n if request.method in constants.HTTP_WITH_BODY:\n if config.capture_body not in (\"all\", event_type):\n result[\"body\"] = \"[REDACTED]\"\n else:\n body = None\n try:\n body = await get_body(request)\n except Exception:\n pass\n if body is not None:\n result[\"body\"] = body\n\n result[\"url\"] = get_url_dict(str(request.url))\n\n return result\n\n\nasync def get_data_from_response(message: dict, config: Config, event_type: str) -> dict:\n \"\"\"Loads data from response for APM capturing.\n\n Args:\n message (dict)\n config (Config)\n event_type (str)\n\n Returns:\n dict\n \"\"\"\n result = {}\n\n if \"status\" in message:\n result[\"status_code\"] = message[\"status\"]\n\n if config.capture_headers and \"headers\" in message:\n headers = Headers(raw=message[\"headers\"])\n result[\"headers\"] = {key: \";\".join(headers.getlist(key)) for key in headers.keys()}\n\n return result\n\n\nasync def set_body(request: Request, body: bytes):\n \"\"\"Overwrites body in Starlette.\n\n Args:\n request (Request)\n body (bytes)\n \"\"\"\n\n async def receive() -> Message:\n await asyncio.sleep(0)\n return {\"type\": \"http.request\", \"body\": body}\n\n request._receive = receive\n\n\nasync def get_body(request: Request) -> str:\n \"\"\"Gets body from the request.\n\n When we consume the body, we replace the streaming mechanism with\n a mocked version -- this workaround came from\n https://github.com/encode/starlette/issues/495#issuecomment-513138055\n\n Args:\n request (Request)\n\n Returns:\n str\n \"\"\"\n body = await request.body()\n await set_body(request, body)\n\n request._stream_consumed = False\n\n return body.decode(\"utf-8\")\n\n\nasync def query_params_to_dict(query_params: str) -> dict:\n \"\"\"Transforms query params from URL to dictionary\n\n Args:\n query_params (str)\n\n Returns:\n dict\n\n Examples:\n >>> print(query_params_to_dict(b\"key=val&key2=val2\"))\n {\"key\": \"val\", \"key2\": \"val2\"}\n \"\"\"\n query_params = query_params.split(\"&\")\n res = {}\n for param in query_params:\n key, val = param.split(\"=\")\n res[key] = val\n\n return res\n\n\ndef _get_client_ip(request: Request):\n x_forwarded_for = request.headers.get(\"HTTP_X_FORWARDED_FOR\")\n if x_forwarded_for:\n ip = x_forwarded_for.split(\",\")[0]\n else:\n ip = request.headers.get(\"REMOTE_ADDR\")\n return ip\n", "path": "elasticapm/contrib/starlette/utils.py"}], "after_files": [{"content": "# BSD 3-Clause License\n#\n# Copyright (c) 2019, Elasticsearch BV\n# All rights reserved.\n#\n# Redistribution and use in source and binary forms, with or without\n# modification, are permitted provided that the following conditions are met:\n#\n# * Redistributions of source code must retain the above copyright notice, this\n# list of conditions and the following disclaimer.\n#\n# * Redistributions in binary form must reproduce the above copyright notice,\n# this list of conditions and the following disclaimer in the documentation\n# and/or other materials provided with the distribution.\n#\n# * Neither the name of the copyright holder nor the names of its\n# contributors may be used to endorse or promote products derived from\n# this software without specific prior written permission.\n#\n# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\"\n# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE\n# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE\n# DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE\n# FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL\n# DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR\n# SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER\n# CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,\n# OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE\n# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\nimport asyncio\n\nfrom starlette.datastructures import Headers\nfrom starlette.requests import Request\nfrom starlette.types import Message\n\nfrom elasticapm.conf import Config, constants\nfrom elasticapm.utils import get_url_dict\n\n\nasync def get_data_from_request(request: Request, config: Config, event_type: str) -> dict:\n \"\"\"Loads data from incoming request for APM capturing.\n\n Args:\n request (Request)\n config (Config)\n event_type (str)\n\n Returns:\n dict\n \"\"\"\n result = {\n \"method\": request.method,\n \"socket\": {\"remote_address\": _get_client_ip(request)},\n \"cookies\": request.cookies,\n }\n if config.capture_headers:\n result[\"headers\"] = dict(request.headers)\n\n if request.method in constants.HTTP_WITH_BODY:\n if config.capture_body not in (\"all\", event_type):\n result[\"body\"] = \"[REDACTED]\"\n else:\n body = None\n try:\n body = await get_body(request)\n except Exception:\n pass\n if body is not None:\n result[\"body\"] = body\n\n result[\"url\"] = get_url_dict(str(request.url))\n\n return result\n\n\nasync def get_data_from_response(message: dict, config: Config, event_type: str) -> dict:\n \"\"\"Loads data from response for APM capturing.\n\n Args:\n message (dict)\n config (Config)\n event_type (str)\n\n Returns:\n dict\n \"\"\"\n result = {}\n\n if \"status\" in message:\n result[\"status_code\"] = message[\"status\"]\n\n if config.capture_headers and \"headers\" in message:\n headers = Headers(raw=message[\"headers\"])\n result[\"headers\"] = {key: \";\".join(headers.getlist(key)) for key in headers.keys()}\n\n return result\n\n\nasync def set_body(request: Request, body: bytes):\n \"\"\"Overwrites body in Starlette.\n\n Args:\n request (Request)\n body (bytes)\n \"\"\"\n\n async def receive() -> Message:\n await asyncio.sleep(0)\n return {\"type\": \"http.request\", \"body\": body}\n\n request._receive = receive\n\n\nasync def get_body(request: Request) -> str:\n \"\"\"Gets body from the request.\n\n When we consume the body, we replace the streaming mechanism with\n a mocked version -- this workaround came from\n https://github.com/encode/starlette/issues/495#issuecomment-513138055\n\n Args:\n request (Request)\n\n Returns:\n str\n \"\"\"\n body = await request.body()\n await set_body(request, body)\n\n request._stream_consumed = False\n\n return body.decode(\"utf-8\", errors=\"replace\")\n\n\nasync def query_params_to_dict(query_params: str) -> dict:\n \"\"\"Transforms query params from URL to dictionary\n\n Args:\n query_params (str)\n\n Returns:\n dict\n\n Examples:\n >>> print(query_params_to_dict(b\"key=val&key2=val2\"))\n {\"key\": \"val\", \"key2\": \"val2\"}\n \"\"\"\n query_params = query_params.split(\"&\")\n res = {}\n for param in query_params:\n key, val = param.split(\"=\")\n res[key] = val\n\n return res\n\n\ndef _get_client_ip(request: Request):\n x_forwarded_for = request.headers.get(\"HTTP_X_FORWARDED_FOR\")\n if x_forwarded_for:\n ip = x_forwarded_for.split(\",\")[0]\n else:\n ip = request.headers.get(\"REMOTE_ADDR\")\n return ip\n", "path": "elasticapm/contrib/starlette/utils.py"}]}
| 1,938 | 115 |
gh_patches_debug_13618
|
rasdani/github-patches
|
git_diff
|
mlflow__mlflow-8016
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[BUG] Send `Cache-Control` headers from static files routes
### Issues Policy acknowledgement
- [X] I have read and agree to submit bug reports in accordance with the [issues policy](https://www.github.com/mlflow/mlflow/blob/master/ISSUE_POLICY.md)
### Willingness to contribute
Yes. I would be willing to contribute a fix for this bug with guidance from the MLflow community.
### MLflow version
- Client: 2.1.1 & master
- Tracking server: 2.1.1 & master
### System information
- **OS Platform and Distribution (e.g., Linux Ubuntu 16.04)**: all
- **Python version**: 3.10
- **yarn version, if running the dev UI**: 3.3.0
### Describe the problem
I was doing some performance profiling on the mlflow ui, as users were reporting slow load times both on initial page load and when navigating. The react app is quite a large bundle that's fairly split out via webpack.
I noticed the entirety of the js files are being re-fetched on every navigation. Since these files contain a hash, they are essentially immutable files and could be cached almost indefinitely by the browser. (I assume this is a content hash from webpack, but would need to confirm).
Seems the browser is re-fetching/re-validating the files due to flask setting a [`Cache-Control: no-cache` ](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Cache-Control#directives) header for static files. This can also cause upstream proxies (nginx etc) not to cache the content.
It seems to makes sense to add a `max_age` parameter to the [`send_from_directory`](https://github.com/mlflow/mlflow/blob/68e11fbc39a7e1c0d42ab6a38a6243dbc71e031f/mlflow/server/__init__.py#L77-L88) calls in `mlflow.server.__init__.py`. The [`max_age `](https://flask.palletsprojects.com/en/2.2.x/api/?highlight=send_file#flask.send_file) parameter would cause a `Cache-Control: public max_age=2419200` header to be sent. Browsers would then cache the static files which would improve performance on the front and and reduce server load.
This issue explains more about why the default is not to send cache headers: https://github.com/pallets/werkzeug/issues/1882. It seems the mlflow static files would want the inverse of the default behavior.
Ideally, something like a `Cache-Control: max-age=2419200, stale-while-revalidate=2419200` header could be sent. But it isn't possible via `send_from_directory` directly. One could edit the `Response`'s cache control policy manually for more fine grained control.
Example easy mode patch:
```patch
dex bbb5110e7..6264f5391 100644
--- a/mlflow/server/__init__.py
+++ b/mlflow/server/__init__.py
@@ -76,16 +76,17 @@ def serve_get_metric_history_bulk():
# We expect the react app to be built assuming it is hosted at /static-files, so that requests for
# CSS/JS resources will be made to e.g. /static-files/main.css and we can handle them here.
+# cache_timeout could be really high actually
@app.route(_add_static_prefix("/static-files/<path:path>"))
def serve_static_file(path):
- return send_from_directory(STATIC_DIR, path)
+ return send_from_directory(STATIC_DIR, path, max_age=2419200)
# Serve the index.html for the React App for all other routes.
@app.route(_add_static_prefix("/"))
def serve():
if os.path.exists(os.path.join(STATIC_DIR, "index.html")):
- return send_from_directory(STATIC_DIR, "index.html")
+ return send_from_directory(STATIC_DIR, "index.html", max_age=2419200)
text = textwrap.dedent(
"""
```
Thoughts?
### Tracking information
<!-- PLEASE KEEP BACKTICKS AND CHECK PREVIEW -->
Developer tools in browser.
### Code to reproduce issue
`Cache-Control: no-cache`
### Stack trace
n/a
### Other info / logs
n/a
### What component(s) does this bug affect?
- [ ] `area/artifacts`: Artifact stores and artifact logging
- [ ] `area/build`: Build and test infrastructure for MLflow
- [ ] `area/docs`: MLflow documentation pages
- [ ] `area/examples`: Example code
- [ ] `area/model-registry`: Model Registry service, APIs, and the fluent client calls for Model Registry
- [ ] `area/models`: MLmodel format, model serialization/deserialization, flavors
- [ ] `area/recipes`: Recipes, Recipe APIs, Recipe configs, Recipe Templates
- [ ] `area/projects`: MLproject format, project running backends
- [ ] `area/scoring`: MLflow Model server, model deployment tools, Spark UDFs
- [X] `area/server-infra`: MLflow Tracking server backend
- [ ] `area/tracking`: Tracking Service, tracking client APIs, autologging
### What interface(s) does this bug affect?
- [X] `area/uiux`: Front-end, user experience, plotting, JavaScript, JavaScript dev server
- [ ] `area/docker`: Docker use across MLflow's components, such as MLflow Projects and MLflow Models
- [ ] `area/sqlalchemy`: Use of SQLAlchemy in the Tracking Service or Model Registry
- [ ] `area/windows`: Windows support
### What language(s) does this bug affect?
- [ ] `language/r`: R APIs and clients
- [ ] `language/java`: Java APIs and clients
- [ ] `language/new`: Proposals for new client languages
### What integration(s) does this bug affect?
- [ ] `integrations/azure`: Azure and Azure ML integrations
- [ ] `integrations/sagemaker`: SageMaker integrations
- [ ] `integrations/databricks`: Databricks integrations
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `mlflow/server/__init__.py`
Content:
```
1 import os
2 import shlex
3 import sys
4 import textwrap
5 import importlib.metadata
6
7 from flask import Flask, send_from_directory, Response
8
9 from mlflow.exceptions import MlflowException
10 from mlflow.server import handlers
11 from mlflow.server.handlers import (
12 get_artifact_handler,
13 get_metric_history_bulk_handler,
14 STATIC_PREFIX_ENV_VAR,
15 _add_static_prefix,
16 get_model_version_artifact_handler,
17 )
18 from mlflow.utils.process import _exec_cmd
19 from mlflow.version import VERSION
20
21 # NB: These are internal environment variables used for communication between
22 # the cli and the forked gunicorn processes.
23 BACKEND_STORE_URI_ENV_VAR = "_MLFLOW_SERVER_FILE_STORE"
24 REGISTRY_STORE_URI_ENV_VAR = "_MLFLOW_SERVER_REGISTRY_STORE"
25 ARTIFACT_ROOT_ENV_VAR = "_MLFLOW_SERVER_ARTIFACT_ROOT"
26 ARTIFACTS_DESTINATION_ENV_VAR = "_MLFLOW_SERVER_ARTIFACT_DESTINATION"
27 PROMETHEUS_EXPORTER_ENV_VAR = "prometheus_multiproc_dir"
28 SERVE_ARTIFACTS_ENV_VAR = "_MLFLOW_SERVER_SERVE_ARTIFACTS"
29 ARTIFACTS_ONLY_ENV_VAR = "_MLFLOW_SERVER_ARTIFACTS_ONLY"
30
31 REL_STATIC_DIR = "js/build"
32
33 app = Flask(__name__, static_folder=REL_STATIC_DIR)
34 STATIC_DIR = os.path.join(app.root_path, REL_STATIC_DIR)
35
36
37 for http_path, handler, methods in handlers.get_endpoints():
38 app.add_url_rule(http_path, handler.__name__, handler, methods=methods)
39
40 if os.getenv(PROMETHEUS_EXPORTER_ENV_VAR):
41 from mlflow.server.prometheus_exporter import activate_prometheus_exporter
42
43 prometheus_metrics_path = os.getenv(PROMETHEUS_EXPORTER_ENV_VAR)
44 if not os.path.exists(prometheus_metrics_path):
45 os.makedirs(prometheus_metrics_path)
46 activate_prometheus_exporter(app)
47
48
49 # Provide a health check endpoint to ensure the application is responsive
50 @app.route("/health")
51 def health():
52 return "OK", 200
53
54
55 # Provide an endpoint to query the version of mlflow running on the server
56 @app.route("/version")
57 def version():
58 return VERSION, 200
59
60
61 # Serve the "get-artifact" route.
62 @app.route(_add_static_prefix("/get-artifact"))
63 def serve_artifacts():
64 return get_artifact_handler()
65
66
67 # Serve the "model-versions/get-artifact" route.
68 @app.route(_add_static_prefix("/model-versions/get-artifact"))
69 def serve_model_version_artifact():
70 return get_model_version_artifact_handler()
71
72
73 # Serve the "metrics/get-history-bulk" route.
74 @app.route(_add_static_prefix("/ajax-api/2.0/mlflow/metrics/get-history-bulk"))
75 def serve_get_metric_history_bulk():
76 return get_metric_history_bulk_handler()
77
78
79 # We expect the react app to be built assuming it is hosted at /static-files, so that requests for
80 # CSS/JS resources will be made to e.g. /static-files/main.css and we can handle them here.
81 @app.route(_add_static_prefix("/static-files/<path:path>"))
82 def serve_static_file(path):
83 return send_from_directory(STATIC_DIR, path)
84
85
86 # Serve the index.html for the React App for all other routes.
87 @app.route(_add_static_prefix("/"))
88 def serve():
89 if os.path.exists(os.path.join(STATIC_DIR, "index.html")):
90 return send_from_directory(STATIC_DIR, "index.html")
91
92 text = textwrap.dedent(
93 """
94 Unable to display MLflow UI - landing page (index.html) not found.
95
96 You are very likely running the MLflow server using a source installation of the Python MLflow
97 package.
98
99 If you are a developer making MLflow source code changes and intentionally running a source
100 installation of MLflow, you can view the UI by running the Javascript dev server:
101 https://github.com/mlflow/mlflow/blob/master/CONTRIBUTING.md#running-the-javascript-dev-server
102
103 Otherwise, uninstall MLflow via 'pip uninstall mlflow', reinstall an official MLflow release
104 from PyPI via 'pip install mlflow', and rerun the MLflow server.
105 """
106 )
107 return Response(text, mimetype="text/plain")
108
109
110 def _find_app(app_name: str) -> str:
111 apps = importlib.metadata.entry_points().get("mlflow.app", [])
112 for app in apps:
113 if app.name == app_name:
114 return app.value
115
116 raise MlflowException(
117 f"Failed to find app '{app_name}'. Available apps: {[a.name for a in apps]}"
118 )
119
120
121 def _build_waitress_command(waitress_opts, host, port, app_name):
122 opts = shlex.split(waitress_opts) if waitress_opts else []
123 return (
124 ["waitress-serve"]
125 + opts
126 + ["--host=%s" % host, "--port=%s" % port, "--ident=mlflow", app_name]
127 )
128
129
130 def _build_gunicorn_command(gunicorn_opts, host, port, workers, app_name):
131 bind_address = f"{host}:{port}"
132 opts = shlex.split(gunicorn_opts) if gunicorn_opts else []
133 return ["gunicorn"] + opts + ["-b", bind_address, "-w", "%s" % workers, app_name]
134
135
136 def _run_server(
137 file_store_path,
138 registry_store_uri,
139 default_artifact_root,
140 serve_artifacts,
141 artifacts_only,
142 artifacts_destination,
143 host,
144 port,
145 static_prefix=None,
146 workers=None,
147 gunicorn_opts=None,
148 waitress_opts=None,
149 expose_prometheus=None,
150 app_name=None,
151 ):
152 """
153 Run the MLflow server, wrapping it in gunicorn or waitress on windows
154 :param static_prefix: If set, the index.html asset will be served from the path static_prefix.
155 If left None, the index.html asset will be served from the root path.
156 :return: None
157 """
158 env_map = {}
159 if file_store_path:
160 env_map[BACKEND_STORE_URI_ENV_VAR] = file_store_path
161 if registry_store_uri:
162 env_map[REGISTRY_STORE_URI_ENV_VAR] = registry_store_uri
163 if default_artifact_root:
164 env_map[ARTIFACT_ROOT_ENV_VAR] = default_artifact_root
165 if serve_artifacts:
166 env_map[SERVE_ARTIFACTS_ENV_VAR] = "true"
167 if artifacts_only:
168 env_map[ARTIFACTS_ONLY_ENV_VAR] = "true"
169 if artifacts_destination:
170 env_map[ARTIFACTS_DESTINATION_ENV_VAR] = artifacts_destination
171 if static_prefix:
172 env_map[STATIC_PREFIX_ENV_VAR] = static_prefix
173
174 if expose_prometheus:
175 env_map[PROMETHEUS_EXPORTER_ENV_VAR] = expose_prometheus
176
177 app_spec = f"{__name__}:app" if app_name is None else _find_app(app_name)
178 # TODO: eventually may want waitress on non-win32
179 if sys.platform == "win32":
180 full_command = _build_waitress_command(waitress_opts, host, port, app_spec)
181 else:
182 full_command = _build_gunicorn_command(gunicorn_opts, host, port, workers or 4, app_spec)
183 _exec_cmd(full_command, extra_env=env_map, capture_output=False)
184
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/mlflow/server/__init__.py b/mlflow/server/__init__.py
--- a/mlflow/server/__init__.py
+++ b/mlflow/server/__init__.py
@@ -78,9 +78,10 @@
# We expect the react app to be built assuming it is hosted at /static-files, so that requests for
# CSS/JS resources will be made to e.g. /static-files/main.css and we can handle them here.
+# The files are hashed based on source code, so ok to send Cache-Control headers via max_age.
@app.route(_add_static_prefix("/static-files/<path:path>"))
def serve_static_file(path):
- return send_from_directory(STATIC_DIR, path)
+ return send_from_directory(STATIC_DIR, path, max_age=2419200)
# Serve the index.html for the React App for all other routes.
|
{"golden_diff": "diff --git a/mlflow/server/__init__.py b/mlflow/server/__init__.py\n--- a/mlflow/server/__init__.py\n+++ b/mlflow/server/__init__.py\n@@ -78,9 +78,10 @@\n \n # We expect the react app to be built assuming it is hosted at /static-files, so that requests for\n # CSS/JS resources will be made to e.g. /static-files/main.css and we can handle them here.\n+# The files are hashed based on source code, so ok to send Cache-Control headers via max_age.\n @app.route(_add_static_prefix(\"/static-files/<path:path>\"))\n def serve_static_file(path):\n- return send_from_directory(STATIC_DIR, path)\n+ return send_from_directory(STATIC_DIR, path, max_age=2419200)\n \n \n # Serve the index.html for the React App for all other routes.\n", "issue": "[BUG] Send `Cache-Control` headers from static files routes\n### Issues Policy acknowledgement\n\n- [X] I have read and agree to submit bug reports in accordance with the [issues policy](https://www.github.com/mlflow/mlflow/blob/master/ISSUE_POLICY.md)\n\n### Willingness to contribute\n\nYes. I would be willing to contribute a fix for this bug with guidance from the MLflow community.\n\n### MLflow version\n\n- Client: 2.1.1 & master\r\n- Tracking server: 2.1.1 & master\r\n\n\n### System information\n\n- **OS Platform and Distribution (e.g., Linux Ubuntu 16.04)**: all\r\n- **Python version**: 3.10\r\n- **yarn version, if running the dev UI**: 3.3.0\r\n\n\n### Describe the problem\n\nI was doing some performance profiling on the mlflow ui, as users were reporting slow load times both on initial page load and when navigating. The react app is quite a large bundle that's fairly split out via webpack.\r\n\r\nI noticed the entirety of the js files are being re-fetched on every navigation. Since these files contain a hash, they are essentially immutable files and could be cached almost indefinitely by the browser. (I assume this is a content hash from webpack, but would need to confirm).\r\n\r\nSeems the browser is re-fetching/re-validating the files due to flask setting a [`Cache-Control: no-cache` ](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Cache-Control#directives) header for static files. This can also cause upstream proxies (nginx etc) not to cache the content.\r\n\r\nIt seems to makes sense to add a `max_age` parameter to the [`send_from_directory`](https://github.com/mlflow/mlflow/blob/68e11fbc39a7e1c0d42ab6a38a6243dbc71e031f/mlflow/server/__init__.py#L77-L88) calls in `mlflow.server.__init__.py`. The [`max_age `](https://flask.palletsprojects.com/en/2.2.x/api/?highlight=send_file#flask.send_file) parameter would cause a `Cache-Control: public max_age=2419200` header to be sent. Browsers would then cache the static files which would improve performance on the front and and reduce server load.\r\n\r\nThis issue explains more about why the default is not to send cache headers: https://github.com/pallets/werkzeug/issues/1882. It seems the mlflow static files would want the inverse of the default behavior.\r\n\r\nIdeally, something like a `Cache-Control: max-age=2419200, stale-while-revalidate=2419200` header could be sent. But it isn't possible via `send_from_directory` directly. One could edit the `Response`'s cache control policy manually for more fine grained control.\r\n\r\nExample easy mode patch:\r\n\r\n```patch\r\ndex bbb5110e7..6264f5391 100644\r\n--- a/mlflow/server/__init__.py\r\n+++ b/mlflow/server/__init__.py\r\n@@ -76,16 +76,17 @@ def serve_get_metric_history_bulk():\r\n\r\n # We expect the react app to be built assuming it is hosted at /static-files, so that requests for\r\n # CSS/JS resources will be made to e.g. /static-files/main.css and we can handle them here.\r\n+# cache_timeout could be really high actually\r\n @app.route(_add_static_prefix(\"/static-files/<path:path>\"))\r\n def serve_static_file(path):\r\n- return send_from_directory(STATIC_DIR, path)\r\n+ return send_from_directory(STATIC_DIR, path, max_age=2419200)\r\n\r\n\r\n # Serve the index.html for the React App for all other routes.\r\n @app.route(_add_static_prefix(\"/\"))\r\n def serve():\r\n if os.path.exists(os.path.join(STATIC_DIR, \"index.html\")):\r\n- return send_from_directory(STATIC_DIR, \"index.html\")\r\n+ return send_from_directory(STATIC_DIR, \"index.html\", max_age=2419200)\r\n\r\n text = textwrap.dedent(\r\n \"\"\"\r\n```\r\n\r\nThoughts?\n\n### Tracking information\n\n<!-- PLEASE KEEP BACKTICKS AND CHECK PREVIEW -->\r\nDeveloper tools in browser.\n\n### Code to reproduce issue\n\n`Cache-Control: no-cache`\n\n### Stack trace\n\nn/a\n\n### Other info / logs\n\nn/a\n\n### What component(s) does this bug affect?\n\n- [ ] `area/artifacts`: Artifact stores and artifact logging\n- [ ] `area/build`: Build and test infrastructure for MLflow\n- [ ] `area/docs`: MLflow documentation pages\n- [ ] `area/examples`: Example code\n- [ ] `area/model-registry`: Model Registry service, APIs, and the fluent client calls for Model Registry\n- [ ] `area/models`: MLmodel format, model serialization/deserialization, flavors\n- [ ] `area/recipes`: Recipes, Recipe APIs, Recipe configs, Recipe Templates\n- [ ] `area/projects`: MLproject format, project running backends\n- [ ] `area/scoring`: MLflow Model server, model deployment tools, Spark UDFs\n- [X] `area/server-infra`: MLflow Tracking server backend\n- [ ] `area/tracking`: Tracking Service, tracking client APIs, autologging\n\n### What interface(s) does this bug affect?\n\n- [X] `area/uiux`: Front-end, user experience, plotting, JavaScript, JavaScript dev server\n- [ ] `area/docker`: Docker use across MLflow's components, such as MLflow Projects and MLflow Models\n- [ ] `area/sqlalchemy`: Use of SQLAlchemy in the Tracking Service or Model Registry\n- [ ] `area/windows`: Windows support\n\n### What language(s) does this bug affect?\n\n- [ ] `language/r`: R APIs and clients\n- [ ] `language/java`: Java APIs and clients\n- [ ] `language/new`: Proposals for new client languages\n\n### What integration(s) does this bug affect?\n\n- [ ] `integrations/azure`: Azure and Azure ML integrations\n- [ ] `integrations/sagemaker`: SageMaker integrations\n- [ ] `integrations/databricks`: Databricks integrations\n", "before_files": [{"content": "import os\nimport shlex\nimport sys\nimport textwrap\nimport importlib.metadata\n\nfrom flask import Flask, send_from_directory, Response\n\nfrom mlflow.exceptions import MlflowException\nfrom mlflow.server import handlers\nfrom mlflow.server.handlers import (\n get_artifact_handler,\n get_metric_history_bulk_handler,\n STATIC_PREFIX_ENV_VAR,\n _add_static_prefix,\n get_model_version_artifact_handler,\n)\nfrom mlflow.utils.process import _exec_cmd\nfrom mlflow.version import VERSION\n\n# NB: These are internal environment variables used for communication between\n# the cli and the forked gunicorn processes.\nBACKEND_STORE_URI_ENV_VAR = \"_MLFLOW_SERVER_FILE_STORE\"\nREGISTRY_STORE_URI_ENV_VAR = \"_MLFLOW_SERVER_REGISTRY_STORE\"\nARTIFACT_ROOT_ENV_VAR = \"_MLFLOW_SERVER_ARTIFACT_ROOT\"\nARTIFACTS_DESTINATION_ENV_VAR = \"_MLFLOW_SERVER_ARTIFACT_DESTINATION\"\nPROMETHEUS_EXPORTER_ENV_VAR = \"prometheus_multiproc_dir\"\nSERVE_ARTIFACTS_ENV_VAR = \"_MLFLOW_SERVER_SERVE_ARTIFACTS\"\nARTIFACTS_ONLY_ENV_VAR = \"_MLFLOW_SERVER_ARTIFACTS_ONLY\"\n\nREL_STATIC_DIR = \"js/build\"\n\napp = Flask(__name__, static_folder=REL_STATIC_DIR)\nSTATIC_DIR = os.path.join(app.root_path, REL_STATIC_DIR)\n\n\nfor http_path, handler, methods in handlers.get_endpoints():\n app.add_url_rule(http_path, handler.__name__, handler, methods=methods)\n\nif os.getenv(PROMETHEUS_EXPORTER_ENV_VAR):\n from mlflow.server.prometheus_exporter import activate_prometheus_exporter\n\n prometheus_metrics_path = os.getenv(PROMETHEUS_EXPORTER_ENV_VAR)\n if not os.path.exists(prometheus_metrics_path):\n os.makedirs(prometheus_metrics_path)\n activate_prometheus_exporter(app)\n\n\n# Provide a health check endpoint to ensure the application is responsive\n@app.route(\"/health\")\ndef health():\n return \"OK\", 200\n\n\n# Provide an endpoint to query the version of mlflow running on the server\n@app.route(\"/version\")\ndef version():\n return VERSION, 200\n\n\n# Serve the \"get-artifact\" route.\n@app.route(_add_static_prefix(\"/get-artifact\"))\ndef serve_artifacts():\n return get_artifact_handler()\n\n\n# Serve the \"model-versions/get-artifact\" route.\n@app.route(_add_static_prefix(\"/model-versions/get-artifact\"))\ndef serve_model_version_artifact():\n return get_model_version_artifact_handler()\n\n\n# Serve the \"metrics/get-history-bulk\" route.\n@app.route(_add_static_prefix(\"/ajax-api/2.0/mlflow/metrics/get-history-bulk\"))\ndef serve_get_metric_history_bulk():\n return get_metric_history_bulk_handler()\n\n\n# We expect the react app to be built assuming it is hosted at /static-files, so that requests for\n# CSS/JS resources will be made to e.g. /static-files/main.css and we can handle them here.\n@app.route(_add_static_prefix(\"/static-files/<path:path>\"))\ndef serve_static_file(path):\n return send_from_directory(STATIC_DIR, path)\n\n\n# Serve the index.html for the React App for all other routes.\n@app.route(_add_static_prefix(\"/\"))\ndef serve():\n if os.path.exists(os.path.join(STATIC_DIR, \"index.html\")):\n return send_from_directory(STATIC_DIR, \"index.html\")\n\n text = textwrap.dedent(\n \"\"\"\n Unable to display MLflow UI - landing page (index.html) not found.\n\n You are very likely running the MLflow server using a source installation of the Python MLflow\n package.\n\n If you are a developer making MLflow source code changes and intentionally running a source\n installation of MLflow, you can view the UI by running the Javascript dev server:\n https://github.com/mlflow/mlflow/blob/master/CONTRIBUTING.md#running-the-javascript-dev-server\n\n Otherwise, uninstall MLflow via 'pip uninstall mlflow', reinstall an official MLflow release\n from PyPI via 'pip install mlflow', and rerun the MLflow server.\n \"\"\"\n )\n return Response(text, mimetype=\"text/plain\")\n\n\ndef _find_app(app_name: str) -> str:\n apps = importlib.metadata.entry_points().get(\"mlflow.app\", [])\n for app in apps:\n if app.name == app_name:\n return app.value\n\n raise MlflowException(\n f\"Failed to find app '{app_name}'. Available apps: {[a.name for a in apps]}\"\n )\n\n\ndef _build_waitress_command(waitress_opts, host, port, app_name):\n opts = shlex.split(waitress_opts) if waitress_opts else []\n return (\n [\"waitress-serve\"]\n + opts\n + [\"--host=%s\" % host, \"--port=%s\" % port, \"--ident=mlflow\", app_name]\n )\n\n\ndef _build_gunicorn_command(gunicorn_opts, host, port, workers, app_name):\n bind_address = f\"{host}:{port}\"\n opts = shlex.split(gunicorn_opts) if gunicorn_opts else []\n return [\"gunicorn\"] + opts + [\"-b\", bind_address, \"-w\", \"%s\" % workers, app_name]\n\n\ndef _run_server(\n file_store_path,\n registry_store_uri,\n default_artifact_root,\n serve_artifacts,\n artifacts_only,\n artifacts_destination,\n host,\n port,\n static_prefix=None,\n workers=None,\n gunicorn_opts=None,\n waitress_opts=None,\n expose_prometheus=None,\n app_name=None,\n):\n \"\"\"\n Run the MLflow server, wrapping it in gunicorn or waitress on windows\n :param static_prefix: If set, the index.html asset will be served from the path static_prefix.\n If left None, the index.html asset will be served from the root path.\n :return: None\n \"\"\"\n env_map = {}\n if file_store_path:\n env_map[BACKEND_STORE_URI_ENV_VAR] = file_store_path\n if registry_store_uri:\n env_map[REGISTRY_STORE_URI_ENV_VAR] = registry_store_uri\n if default_artifact_root:\n env_map[ARTIFACT_ROOT_ENV_VAR] = default_artifact_root\n if serve_artifacts:\n env_map[SERVE_ARTIFACTS_ENV_VAR] = \"true\"\n if artifacts_only:\n env_map[ARTIFACTS_ONLY_ENV_VAR] = \"true\"\n if artifacts_destination:\n env_map[ARTIFACTS_DESTINATION_ENV_VAR] = artifacts_destination\n if static_prefix:\n env_map[STATIC_PREFIX_ENV_VAR] = static_prefix\n\n if expose_prometheus:\n env_map[PROMETHEUS_EXPORTER_ENV_VAR] = expose_prometheus\n\n app_spec = f\"{__name__}:app\" if app_name is None else _find_app(app_name)\n # TODO: eventually may want waitress on non-win32\n if sys.platform == \"win32\":\n full_command = _build_waitress_command(waitress_opts, host, port, app_spec)\n else:\n full_command = _build_gunicorn_command(gunicorn_opts, host, port, workers or 4, app_spec)\n _exec_cmd(full_command, extra_env=env_map, capture_output=False)\n", "path": "mlflow/server/__init__.py"}], "after_files": [{"content": "import os\nimport shlex\nimport sys\nimport textwrap\nimport importlib.metadata\n\nfrom flask import Flask, send_from_directory, Response\n\nfrom mlflow.exceptions import MlflowException\nfrom mlflow.server import handlers\nfrom mlflow.server.handlers import (\n get_artifact_handler,\n get_metric_history_bulk_handler,\n STATIC_PREFIX_ENV_VAR,\n _add_static_prefix,\n get_model_version_artifact_handler,\n)\nfrom mlflow.utils.process import _exec_cmd\nfrom mlflow.version import VERSION\n\n# NB: These are internal environment variables used for communication between\n# the cli and the forked gunicorn processes.\nBACKEND_STORE_URI_ENV_VAR = \"_MLFLOW_SERVER_FILE_STORE\"\nREGISTRY_STORE_URI_ENV_VAR = \"_MLFLOW_SERVER_REGISTRY_STORE\"\nARTIFACT_ROOT_ENV_VAR = \"_MLFLOW_SERVER_ARTIFACT_ROOT\"\nARTIFACTS_DESTINATION_ENV_VAR = \"_MLFLOW_SERVER_ARTIFACT_DESTINATION\"\nPROMETHEUS_EXPORTER_ENV_VAR = \"prometheus_multiproc_dir\"\nSERVE_ARTIFACTS_ENV_VAR = \"_MLFLOW_SERVER_SERVE_ARTIFACTS\"\nARTIFACTS_ONLY_ENV_VAR = \"_MLFLOW_SERVER_ARTIFACTS_ONLY\"\n\nREL_STATIC_DIR = \"js/build\"\n\napp = Flask(__name__, static_folder=REL_STATIC_DIR)\nSTATIC_DIR = os.path.join(app.root_path, REL_STATIC_DIR)\n\n\nfor http_path, handler, methods in handlers.get_endpoints():\n app.add_url_rule(http_path, handler.__name__, handler, methods=methods)\n\nif os.getenv(PROMETHEUS_EXPORTER_ENV_VAR):\n from mlflow.server.prometheus_exporter import activate_prometheus_exporter\n\n prometheus_metrics_path = os.getenv(PROMETHEUS_EXPORTER_ENV_VAR)\n if not os.path.exists(prometheus_metrics_path):\n os.makedirs(prometheus_metrics_path)\n activate_prometheus_exporter(app)\n\n\n# Provide a health check endpoint to ensure the application is responsive\n@app.route(\"/health\")\ndef health():\n return \"OK\", 200\n\n\n# Provide an endpoint to query the version of mlflow running on the server\n@app.route(\"/version\")\ndef version():\n return VERSION, 200\n\n\n# Serve the \"get-artifact\" route.\n@app.route(_add_static_prefix(\"/get-artifact\"))\ndef serve_artifacts():\n return get_artifact_handler()\n\n\n# Serve the \"model-versions/get-artifact\" route.\n@app.route(_add_static_prefix(\"/model-versions/get-artifact\"))\ndef serve_model_version_artifact():\n return get_model_version_artifact_handler()\n\n\n# Serve the \"metrics/get-history-bulk\" route.\n@app.route(_add_static_prefix(\"/ajax-api/2.0/mlflow/metrics/get-history-bulk\"))\ndef serve_get_metric_history_bulk():\n return get_metric_history_bulk_handler()\n\n\n# We expect the react app to be built assuming it is hosted at /static-files, so that requests for\n# CSS/JS resources will be made to e.g. /static-files/main.css and we can handle them here.\n# The files are hashed based on source code, so ok to send Cache-Control headers via max_age.\n@app.route(_add_static_prefix(\"/static-files/<path:path>\"))\ndef serve_static_file(path):\n return send_from_directory(STATIC_DIR, path, max_age=2419200)\n\n\n# Serve the index.html for the React App for all other routes.\n@app.route(_add_static_prefix(\"/\"))\ndef serve():\n if os.path.exists(os.path.join(STATIC_DIR, \"index.html\")):\n return send_from_directory(STATIC_DIR, \"index.html\")\n\n text = textwrap.dedent(\n \"\"\"\n Unable to display MLflow UI - landing page (index.html) not found.\n\n You are very likely running the MLflow server using a source installation of the Python MLflow\n package.\n\n If you are a developer making MLflow source code changes and intentionally running a source\n installation of MLflow, you can view the UI by running the Javascript dev server:\n https://github.com/mlflow/mlflow/blob/master/CONTRIBUTING.md#running-the-javascript-dev-server\n\n Otherwise, uninstall MLflow via 'pip uninstall mlflow', reinstall an official MLflow release\n from PyPI via 'pip install mlflow', and rerun the MLflow server.\n \"\"\"\n )\n return Response(text, mimetype=\"text/plain\")\n\n\ndef _find_app(app_name: str) -> str:\n apps = importlib.metadata.entry_points().get(\"mlflow.app\", [])\n for app in apps:\n if app.name == app_name:\n return app.value\n\n raise MlflowException(\n f\"Failed to find app '{app_name}'. Available apps: {[a.name for a in apps]}\"\n )\n\n\ndef _build_waitress_command(waitress_opts, host, port, app_name):\n opts = shlex.split(waitress_opts) if waitress_opts else []\n return (\n [\"waitress-serve\"]\n + opts\n + [\"--host=%s\" % host, \"--port=%s\" % port, \"--ident=mlflow\", app_name]\n )\n\n\ndef _build_gunicorn_command(gunicorn_opts, host, port, workers, app_name):\n bind_address = f\"{host}:{port}\"\n opts = shlex.split(gunicorn_opts) if gunicorn_opts else []\n return [\"gunicorn\"] + opts + [\"-b\", bind_address, \"-w\", \"%s\" % workers, app_name]\n\n\ndef _run_server(\n file_store_path,\n registry_store_uri,\n default_artifact_root,\n serve_artifacts,\n artifacts_only,\n artifacts_destination,\n host,\n port,\n static_prefix=None,\n workers=None,\n gunicorn_opts=None,\n waitress_opts=None,\n expose_prometheus=None,\n app_name=None,\n):\n \"\"\"\n Run the MLflow server, wrapping it in gunicorn or waitress on windows\n :param static_prefix: If set, the index.html asset will be served from the path static_prefix.\n If left None, the index.html asset will be served from the root path.\n :return: None\n \"\"\"\n env_map = {}\n if file_store_path:\n env_map[BACKEND_STORE_URI_ENV_VAR] = file_store_path\n if registry_store_uri:\n env_map[REGISTRY_STORE_URI_ENV_VAR] = registry_store_uri\n if default_artifact_root:\n env_map[ARTIFACT_ROOT_ENV_VAR] = default_artifact_root\n if serve_artifacts:\n env_map[SERVE_ARTIFACTS_ENV_VAR] = \"true\"\n if artifacts_only:\n env_map[ARTIFACTS_ONLY_ENV_VAR] = \"true\"\n if artifacts_destination:\n env_map[ARTIFACTS_DESTINATION_ENV_VAR] = artifacts_destination\n if static_prefix:\n env_map[STATIC_PREFIX_ENV_VAR] = static_prefix\n\n if expose_prometheus:\n env_map[PROMETHEUS_EXPORTER_ENV_VAR] = expose_prometheus\n\n app_spec = f\"{__name__}:app\" if app_name is None else _find_app(app_name)\n # TODO: eventually may want waitress on non-win32\n if sys.platform == \"win32\":\n full_command = _build_waitress_command(waitress_opts, host, port, app_spec)\n else:\n full_command = _build_gunicorn_command(gunicorn_opts, host, port, workers or 4, app_spec)\n _exec_cmd(full_command, extra_env=env_map, capture_output=False)\n", "path": "mlflow/server/__init__.py"}]}
| 3,646 | 193 |
gh_patches_debug_18803
|
rasdani/github-patches
|
git_diff
|
akvo__akvo-rsr-2746
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
List of reports not loading for some users
The reports page fetches user information from `/rest/v1/user/{id}` and this seems to be failing when the Country name of primary organisation is unicode.
Created via Reamaze:
Link: https://akvoo.reamaze.com/admin/conversations/list-of-reports-not-loading
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `akvo/rest/views/user.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 """Akvo RSR is covered by the GNU Affero General Public License.
3
4 See more details in the license.txt file located at the root folder of the Akvo RSR module.
5 For additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.
6 """
7
8 from django.contrib.auth import get_user_model
9 from django.db import IntegrityError
10 from django.utils.translation import ugettext_lazy as _
11
12 from rest_framework import status
13 from rest_framework.authentication import SessionAuthentication
14 from rest_framework.decorators import api_view
15 from rest_framework.decorators import authentication_classes
16 from rest_framework.exceptions import PermissionDenied
17 from rest_framework.response import Response
18
19 from akvo.rest.models import TastyTokenAuthentication
20 from ...rsr.models import Employment
21 from ...rsr.models import Organisation
22 from ..viewsets import BaseRSRViewSet
23 from ..serializers import EmploymentSerializer
24 from ..serializers import OrganisationSerializer
25 from ..serializers import UserSerializer, UserDetailsSerializer, UserPasswordSerializer
26
27
28 class UserViewSet(BaseRSRViewSet):
29
30 """User resource."""
31
32 queryset = get_user_model().objects.select_related(
33 'organisation',
34 'organisation__primary_location',
35 'organisation__primary_location__country',
36 'organisation__primary_location__location_target',
37 'organisation__primary_location__location_target__internal_org_ids',
38 ).prefetch_related(
39 'organisations',
40 'organisations__primary_location',
41 'organisations__primary_location__country',
42 'organisations__primary_location__location_target',)
43 serializer_class = UserSerializer
44
45
46 @api_view(['POST'])
47 def change_password(request, pk=None):
48 # Get the user, or return an error if the user does not exist
49 try:
50 user = get_user_model().objects.get(pk=pk)
51 except get_user_model().DoesNotExist:
52 return Response({'user': _('User does not exist')}, status=status.HTTP_400_BAD_REQUEST)
53
54 # Users are only allowed to edit their own details
55 request_user = getattr(request, 'user', None)
56 if not user == request_user:
57 raise PermissionDenied()
58
59 # Process request
60 serializer = UserPasswordSerializer(data=request.data, instance=user)
61 if serializer.is_valid():
62 user = serializer.update(serializer.instance, serializer.validated_data)
63 user.save()
64 return Response({'status': 'password set'})
65 else:
66 return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
67
68
69 @api_view(['POST'])
70 def update_details(request, pk=None):
71 # Get the user, or return an error if the user does not exist
72 try:
73 user = get_user_model().objects.get(pk=pk)
74 except get_user_model().DoesNotExist:
75 return Response({'user': _('User does not exist')}, status=status.HTTP_400_BAD_REQUEST)
76
77 # Users are only allowed to edit their own details
78 request_user = getattr(request, 'user', None)
79 if not user == request_user:
80 raise PermissionDenied()
81
82 # Process request
83 serializer = UserDetailsSerializer(data=request.data)
84 if serializer.is_valid():
85 serializer.update(user, serializer.initial_data)
86 return Response(request.data)
87 else:
88 return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
89
90
91 @api_view(['POST'])
92 @authentication_classes([SessionAuthentication, TastyTokenAuthentication])
93 def request_organisation(request, pk=None):
94
95 # Add missing keys to request.data to simplify code and getting working
96 # with DRF-3 Serializer
97 for key in ('group', 'country', 'job_title'):
98 if key not in request.data:
99 request.data[key] = ''
100
101 # Get the user, or return an error if the user does not exist
102 try:
103 user_by_pk = get_user_model().objects.get(pk=pk)
104 except get_user_model().DoesNotExist:
105 return Response({'user': _('User does not exist')}, status=status.HTTP_400_BAD_REQUEST)
106
107 # request.user is the user identified by the auth token
108 user = request.user
109 # Users themselves are only allowed to request to join an organisation
110 if not user_by_pk == user:
111 raise PermissionDenied()
112 request.data['user'] = pk
113
114 # Process request
115 serializer = EmploymentSerializer(data=request.data)
116 if serializer.is_valid():
117 try:
118 organisation = Organisation.objects.get(pk=serializer.data['organisation'])
119 employment = Employment.objects.create(
120 user=user,
121 organisation=organisation,
122 country=serializer.data['country'],
123 job_title=serializer.data['job_title'],
124 is_approved=False,
125 )
126 except IntegrityError:
127 return Response({'detail': _(u'User already linked to this organisation')},
128 status=status.HTTP_409_CONFLICT)
129
130 data = EmploymentSerializer(employment).data
131 if data['country']:
132 data['country_full'] = employment.iati_country().name
133 data['country_name'] = str(employment.iati_country())
134 else:
135 data['country_full'] = ''
136 data['organisation_full'] = OrganisationSerializer(organisation).data
137 data['id'] = employment.pk
138
139 return Response(data)
140 else:
141 return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
142
```
Path: `akvo/rsr/models/employment.py`
Content:
```
1 # -*- coding: utf-8 -*-
2
3 # Akvo RSR is covered by the GNU Affero General Public License.
4 # See more details in the license.txt file located at the root folder of the Akvo RSR module.
5 # For additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.
6
7 from akvo.codelists import models as codelist_models
8 from akvo.codelists.store.codelists_v202 import COUNTRY
9 from akvo.utils import codelist_choices, codelist_value
10
11 from django.contrib.admin.models import LogEntry, CHANGE
12 from django.contrib.auth import get_user_model
13 from django.contrib.auth.models import Group
14 from django.contrib.contenttypes.models import ContentType
15 from django.core.exceptions import ObjectDoesNotExist
16 from django.db import models
17 from django.db.models.query import QuerySet as DjangoQuerySet
18 from django.forms.models import model_to_dict
19 from django.utils.encoding import force_unicode
20 from django.utils.translation import ugettext_lazy as _
21
22 from .models_utils import QuerySetManager
23
24 from ..fields import ValidXMLCharField
25
26
27 class Employment(models.Model):
28 organisation = models.ForeignKey('Organisation', verbose_name=_(u'organisation'),
29 related_name='employees')
30 user = models.ForeignKey('User', verbose_name=_(u'user'), related_name='employers')
31 group = models.ForeignKey(Group, verbose_name=_(u'group'), null=True,
32 related_name='employments', related_query_name='employment',
33 on_delete=models.SET_NULL,
34 help_text=_('The permissions group for this user\'s employment.'))
35 is_approved = models.BooleanField(_('approved'), default=False,
36 help_text=_('Designates whether this employment is approved '
37 'by an administrator.'))
38 country = ValidXMLCharField(
39 _(u'country'), blank=True, max_length=2, choices=codelist_choices(COUNTRY, show_code=False)
40 )
41 job_title = ValidXMLCharField(_(u'job title'), max_length=50, blank=True)
42
43 objects = QuerySetManager()
44
45 class QuerySet(DjangoQuerySet):
46 def organisations(self):
47 """
48 Return an Organisation QuerySet containing the organisations of the Employment QuerySet
49 """
50 from ..models import Organisation
51 return Organisation.objects.filter(employees__in=self).distinct()
52
53 def users(self):
54 """
55 Return a User QuerySet containing the users of the Employment QuerySet
56 """
57 return get_user_model().objects.filter(employers__in=self).distinct()
58
59 def approved(self):
60 """
61 Return an Employment QuerySet containing the approved Employments
62 """
63 return self.filter(is_approved=True)
64
65 class Meta:
66 app_label = 'rsr'
67 verbose_name = _(u'user employment')
68 verbose_name_plural = _(u'user employments')
69 unique_together = ('organisation', 'user', 'group')
70
71 def __unicode__(self):
72 return u"{0} {1}: {2}".format(self.user.first_name, self.user.last_name,
73 self.organisation.name)
74
75 def iati_country(self):
76 return codelist_value(codelist_models.Country, self, 'country')
77
78 def iati_country_unicode(self):
79 return str(self.iati_country())
80
81 def approve(self, approved_by):
82 """
83 Approve an Employment. Will also be logged in the Employment admin history.
84
85 :param approved_by: User that approves the Employment
86 """
87 if not self.is_approved:
88 self.is_approved = True
89 self.save()
90
91 # Log in Employment admin history
92 LogEntry.objects.log_action(
93 user_id=approved_by.pk,
94 content_type_id=ContentType.objects.get_for_model(self).pk,
95 object_id=self.pk,
96 object_repr=force_unicode(self),
97 action_flag=CHANGE,
98 change_message=u'Changed is_approved, outside of admin.'
99 )
100
101 def to_dict(self, org_list):
102 # Set groups in right order
103 all_groups = []
104 auth_group_names = ['Users', 'User Managers', 'Project Editors', 'Admins']
105 for auth_group_name in auth_group_names:
106 try:
107 all_groups.append(Group.objects.get(name=auth_group_name))
108 except ObjectDoesNotExist:
109 continue
110
111 user_group = model_to_dict(self.group, fields=['id', 'name']) if self.group else None
112 other_groups = [model_to_dict(group, fields=['id', 'name']) for group in all_groups]
113
114 return dict(
115 id=self.pk,
116 organisation_full=model_to_dict(self.organisation, fields=['id', 'name', 'long_name', ]),
117 user_full=model_to_dict(self.user, fields=['id', 'first_name', 'last_name', 'email', ]),
118 is_approved=self.is_approved,
119 job_title=self.job_title,
120 country_full=self.iati_country().name if self.country else '',
121 group=user_group,
122 other_groups=other_groups,
123 actions=True if self.organisation in org_list else False,
124 )
125
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/akvo/rest/views/user.py b/akvo/rest/views/user.py
--- a/akvo/rest/views/user.py
+++ b/akvo/rest/views/user.py
@@ -130,7 +130,7 @@
data = EmploymentSerializer(employment).data
if data['country']:
data['country_full'] = employment.iati_country().name
- data['country_name'] = str(employment.iati_country())
+ data['country_name'] = unicode(employment.iati_country())
else:
data['country_full'] = ''
data['organisation_full'] = OrganisationSerializer(organisation).data
diff --git a/akvo/rsr/models/employment.py b/akvo/rsr/models/employment.py
--- a/akvo/rsr/models/employment.py
+++ b/akvo/rsr/models/employment.py
@@ -76,7 +76,7 @@
return codelist_value(codelist_models.Country, self, 'country')
def iati_country_unicode(self):
- return str(self.iati_country())
+ return unicode(self.iati_country())
def approve(self, approved_by):
"""
|
{"golden_diff": "diff --git a/akvo/rest/views/user.py b/akvo/rest/views/user.py\n--- a/akvo/rest/views/user.py\n+++ b/akvo/rest/views/user.py\n@@ -130,7 +130,7 @@\n data = EmploymentSerializer(employment).data\n if data['country']:\n data['country_full'] = employment.iati_country().name\n- data['country_name'] = str(employment.iati_country())\n+ data['country_name'] = unicode(employment.iati_country())\n else:\n data['country_full'] = ''\n data['organisation_full'] = OrganisationSerializer(organisation).data\ndiff --git a/akvo/rsr/models/employment.py b/akvo/rsr/models/employment.py\n--- a/akvo/rsr/models/employment.py\n+++ b/akvo/rsr/models/employment.py\n@@ -76,7 +76,7 @@\n return codelist_value(codelist_models.Country, self, 'country')\n \n def iati_country_unicode(self):\n- return str(self.iati_country())\n+ return unicode(self.iati_country())\n \n def approve(self, approved_by):\n \"\"\"\n", "issue": "List of reports not loading for some users\nThe reports page fetches user information from `/rest/v1/user/{id}` and this seems to be failing when the Country name of primary organisation is unicode. \r\n\r\nCreated via Reamaze:\r\n\r\nLink: https://akvoo.reamaze.com/admin/conversations/list-of-reports-not-loading\r\n\r\n\r\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n\"\"\"Akvo RSR is covered by the GNU Affero General Public License.\n\nSee more details in the license.txt file located at the root folder of the Akvo RSR module.\nFor additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.\n\"\"\"\n\nfrom django.contrib.auth import get_user_model\nfrom django.db import IntegrityError\nfrom django.utils.translation import ugettext_lazy as _\n\nfrom rest_framework import status\nfrom rest_framework.authentication import SessionAuthentication\nfrom rest_framework.decorators import api_view\nfrom rest_framework.decorators import authentication_classes\nfrom rest_framework.exceptions import PermissionDenied\nfrom rest_framework.response import Response\n\nfrom akvo.rest.models import TastyTokenAuthentication\nfrom ...rsr.models import Employment\nfrom ...rsr.models import Organisation\nfrom ..viewsets import BaseRSRViewSet\nfrom ..serializers import EmploymentSerializer\nfrom ..serializers import OrganisationSerializer\nfrom ..serializers import UserSerializer, UserDetailsSerializer, UserPasswordSerializer\n\n\nclass UserViewSet(BaseRSRViewSet):\n\n \"\"\"User resource.\"\"\"\n\n queryset = get_user_model().objects.select_related(\n 'organisation',\n 'organisation__primary_location',\n 'organisation__primary_location__country',\n 'organisation__primary_location__location_target',\n 'organisation__primary_location__location_target__internal_org_ids',\n ).prefetch_related(\n 'organisations',\n 'organisations__primary_location',\n 'organisations__primary_location__country',\n 'organisations__primary_location__location_target',)\n serializer_class = UserSerializer\n\n\n@api_view(['POST'])\ndef change_password(request, pk=None):\n # Get the user, or return an error if the user does not exist\n try:\n user = get_user_model().objects.get(pk=pk)\n except get_user_model().DoesNotExist:\n return Response({'user': _('User does not exist')}, status=status.HTTP_400_BAD_REQUEST)\n\n # Users are only allowed to edit their own details\n request_user = getattr(request, 'user', None)\n if not user == request_user:\n raise PermissionDenied()\n\n # Process request\n serializer = UserPasswordSerializer(data=request.data, instance=user)\n if serializer.is_valid():\n user = serializer.update(serializer.instance, serializer.validated_data)\n user.save()\n return Response({'status': 'password set'})\n else:\n return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)\n\n\n@api_view(['POST'])\ndef update_details(request, pk=None):\n # Get the user, or return an error if the user does not exist\n try:\n user = get_user_model().objects.get(pk=pk)\n except get_user_model().DoesNotExist:\n return Response({'user': _('User does not exist')}, status=status.HTTP_400_BAD_REQUEST)\n\n # Users are only allowed to edit their own details\n request_user = getattr(request, 'user', None)\n if not user == request_user:\n raise PermissionDenied()\n\n # Process request\n serializer = UserDetailsSerializer(data=request.data)\n if serializer.is_valid():\n serializer.update(user, serializer.initial_data)\n return Response(request.data)\n else:\n return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)\n\n\n@api_view(['POST'])\n@authentication_classes([SessionAuthentication, TastyTokenAuthentication])\ndef request_organisation(request, pk=None):\n\n # Add missing keys to request.data to simplify code and getting working\n # with DRF-3 Serializer\n for key in ('group', 'country', 'job_title'):\n if key not in request.data:\n request.data[key] = ''\n\n # Get the user, or return an error if the user does not exist\n try:\n user_by_pk = get_user_model().objects.get(pk=pk)\n except get_user_model().DoesNotExist:\n return Response({'user': _('User does not exist')}, status=status.HTTP_400_BAD_REQUEST)\n\n # request.user is the user identified by the auth token\n user = request.user\n # Users themselves are only allowed to request to join an organisation\n if not user_by_pk == user:\n raise PermissionDenied()\n request.data['user'] = pk\n\n # Process request\n serializer = EmploymentSerializer(data=request.data)\n if serializer.is_valid():\n try:\n organisation = Organisation.objects.get(pk=serializer.data['organisation'])\n employment = Employment.objects.create(\n user=user,\n organisation=organisation,\n country=serializer.data['country'],\n job_title=serializer.data['job_title'],\n is_approved=False,\n )\n except IntegrityError:\n return Response({'detail': _(u'User already linked to this organisation')},\n status=status.HTTP_409_CONFLICT)\n\n data = EmploymentSerializer(employment).data\n if data['country']:\n data['country_full'] = employment.iati_country().name\n data['country_name'] = str(employment.iati_country())\n else:\n data['country_full'] = ''\n data['organisation_full'] = OrganisationSerializer(organisation).data\n data['id'] = employment.pk\n\n return Response(data)\n else:\n return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)\n", "path": "akvo/rest/views/user.py"}, {"content": "# -*- coding: utf-8 -*-\n\n# Akvo RSR is covered by the GNU Affero General Public License.\n# See more details in the license.txt file located at the root folder of the Akvo RSR module.\n# For additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.\n\nfrom akvo.codelists import models as codelist_models\nfrom akvo.codelists.store.codelists_v202 import COUNTRY\nfrom akvo.utils import codelist_choices, codelist_value\n\nfrom django.contrib.admin.models import LogEntry, CHANGE\nfrom django.contrib.auth import get_user_model\nfrom django.contrib.auth.models import Group\nfrom django.contrib.contenttypes.models import ContentType\nfrom django.core.exceptions import ObjectDoesNotExist\nfrom django.db import models\nfrom django.db.models.query import QuerySet as DjangoQuerySet\nfrom django.forms.models import model_to_dict\nfrom django.utils.encoding import force_unicode\nfrom django.utils.translation import ugettext_lazy as _\n\nfrom .models_utils import QuerySetManager\n\nfrom ..fields import ValidXMLCharField\n\n\nclass Employment(models.Model):\n organisation = models.ForeignKey('Organisation', verbose_name=_(u'organisation'),\n related_name='employees')\n user = models.ForeignKey('User', verbose_name=_(u'user'), related_name='employers')\n group = models.ForeignKey(Group, verbose_name=_(u'group'), null=True,\n related_name='employments', related_query_name='employment',\n on_delete=models.SET_NULL,\n help_text=_('The permissions group for this user\\'s employment.'))\n is_approved = models.BooleanField(_('approved'), default=False,\n help_text=_('Designates whether this employment is approved '\n 'by an administrator.'))\n country = ValidXMLCharField(\n _(u'country'), blank=True, max_length=2, choices=codelist_choices(COUNTRY, show_code=False)\n )\n job_title = ValidXMLCharField(_(u'job title'), max_length=50, blank=True)\n\n objects = QuerySetManager()\n\n class QuerySet(DjangoQuerySet):\n def organisations(self):\n \"\"\"\n Return an Organisation QuerySet containing the organisations of the Employment QuerySet\n \"\"\"\n from ..models import Organisation\n return Organisation.objects.filter(employees__in=self).distinct()\n\n def users(self):\n \"\"\"\n Return a User QuerySet containing the users of the Employment QuerySet\n \"\"\"\n return get_user_model().objects.filter(employers__in=self).distinct()\n\n def approved(self):\n \"\"\"\n Return an Employment QuerySet containing the approved Employments\n \"\"\"\n return self.filter(is_approved=True)\n\n class Meta:\n app_label = 'rsr'\n verbose_name = _(u'user employment')\n verbose_name_plural = _(u'user employments')\n unique_together = ('organisation', 'user', 'group')\n\n def __unicode__(self):\n return u\"{0} {1}: {2}\".format(self.user.first_name, self.user.last_name,\n self.organisation.name)\n\n def iati_country(self):\n return codelist_value(codelist_models.Country, self, 'country')\n\n def iati_country_unicode(self):\n return str(self.iati_country())\n\n def approve(self, approved_by):\n \"\"\"\n Approve an Employment. Will also be logged in the Employment admin history.\n\n :param approved_by: User that approves the Employment\n \"\"\"\n if not self.is_approved:\n self.is_approved = True\n self.save()\n\n # Log in Employment admin history\n LogEntry.objects.log_action(\n user_id=approved_by.pk,\n content_type_id=ContentType.objects.get_for_model(self).pk,\n object_id=self.pk,\n object_repr=force_unicode(self),\n action_flag=CHANGE,\n change_message=u'Changed is_approved, outside of admin.'\n )\n\n def to_dict(self, org_list):\n # Set groups in right order\n all_groups = []\n auth_group_names = ['Users', 'User Managers', 'Project Editors', 'Admins']\n for auth_group_name in auth_group_names:\n try:\n all_groups.append(Group.objects.get(name=auth_group_name))\n except ObjectDoesNotExist:\n continue\n\n user_group = model_to_dict(self.group, fields=['id', 'name']) if self.group else None\n other_groups = [model_to_dict(group, fields=['id', 'name']) for group in all_groups]\n\n return dict(\n id=self.pk,\n organisation_full=model_to_dict(self.organisation, fields=['id', 'name', 'long_name', ]),\n user_full=model_to_dict(self.user, fields=['id', 'first_name', 'last_name', 'email', ]),\n is_approved=self.is_approved,\n job_title=self.job_title,\n country_full=self.iati_country().name if self.country else '',\n group=user_group,\n other_groups=other_groups,\n actions=True if self.organisation in org_list else False,\n )\n", "path": "akvo/rsr/models/employment.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n\"\"\"Akvo RSR is covered by the GNU Affero General Public License.\n\nSee more details in the license.txt file located at the root folder of the Akvo RSR module.\nFor additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.\n\"\"\"\n\nfrom django.contrib.auth import get_user_model\nfrom django.db import IntegrityError\nfrom django.utils.translation import ugettext_lazy as _\n\nfrom rest_framework import status\nfrom rest_framework.authentication import SessionAuthentication\nfrom rest_framework.decorators import api_view\nfrom rest_framework.decorators import authentication_classes\nfrom rest_framework.exceptions import PermissionDenied\nfrom rest_framework.response import Response\n\nfrom akvo.rest.models import TastyTokenAuthentication\nfrom ...rsr.models import Employment\nfrom ...rsr.models import Organisation\nfrom ..viewsets import BaseRSRViewSet\nfrom ..serializers import EmploymentSerializer\nfrom ..serializers import OrganisationSerializer\nfrom ..serializers import UserSerializer, UserDetailsSerializer, UserPasswordSerializer\n\n\nclass UserViewSet(BaseRSRViewSet):\n\n \"\"\"User resource.\"\"\"\n\n queryset = get_user_model().objects.select_related(\n 'organisation',\n 'organisation__primary_location',\n 'organisation__primary_location__country',\n 'organisation__primary_location__location_target',\n 'organisation__primary_location__location_target__internal_org_ids',\n ).prefetch_related(\n 'organisations',\n 'organisations__primary_location',\n 'organisations__primary_location__country',\n 'organisations__primary_location__location_target',)\n serializer_class = UserSerializer\n\n\n@api_view(['POST'])\ndef change_password(request, pk=None):\n # Get the user, or return an error if the user does not exist\n try:\n user = get_user_model().objects.get(pk=pk)\n except get_user_model().DoesNotExist:\n return Response({'user': _('User does not exist')}, status=status.HTTP_400_BAD_REQUEST)\n\n # Users are only allowed to edit their own details\n request_user = getattr(request, 'user', None)\n if not user == request_user:\n raise PermissionDenied()\n\n # Process request\n serializer = UserPasswordSerializer(data=request.data, instance=user)\n if serializer.is_valid():\n user = serializer.update(serializer.instance, serializer.validated_data)\n user.save()\n return Response({'status': 'password set'})\n else:\n return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)\n\n\n@api_view(['POST'])\ndef update_details(request, pk=None):\n # Get the user, or return an error if the user does not exist\n try:\n user = get_user_model().objects.get(pk=pk)\n except get_user_model().DoesNotExist:\n return Response({'user': _('User does not exist')}, status=status.HTTP_400_BAD_REQUEST)\n\n # Users are only allowed to edit their own details\n request_user = getattr(request, 'user', None)\n if not user == request_user:\n raise PermissionDenied()\n\n # Process request\n serializer = UserDetailsSerializer(data=request.data)\n if serializer.is_valid():\n serializer.update(user, serializer.initial_data)\n return Response(request.data)\n else:\n return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)\n\n\n@api_view(['POST'])\n@authentication_classes([SessionAuthentication, TastyTokenAuthentication])\ndef request_organisation(request, pk=None):\n\n # Add missing keys to request.data to simplify code and getting working\n # with DRF-3 Serializer\n for key in ('group', 'country', 'job_title'):\n if key not in request.data:\n request.data[key] = ''\n\n # Get the user, or return an error if the user does not exist\n try:\n user_by_pk = get_user_model().objects.get(pk=pk)\n except get_user_model().DoesNotExist:\n return Response({'user': _('User does not exist')}, status=status.HTTP_400_BAD_REQUEST)\n\n # request.user is the user identified by the auth token\n user = request.user\n # Users themselves are only allowed to request to join an organisation\n if not user_by_pk == user:\n raise PermissionDenied()\n request.data['user'] = pk\n\n # Process request\n serializer = EmploymentSerializer(data=request.data)\n if serializer.is_valid():\n try:\n organisation = Organisation.objects.get(pk=serializer.data['organisation'])\n employment = Employment.objects.create(\n user=user,\n organisation=organisation,\n country=serializer.data['country'],\n job_title=serializer.data['job_title'],\n is_approved=False,\n )\n except IntegrityError:\n return Response({'detail': _(u'User already linked to this organisation')},\n status=status.HTTP_409_CONFLICT)\n\n data = EmploymentSerializer(employment).data\n if data['country']:\n data['country_full'] = employment.iati_country().name\n data['country_name'] = unicode(employment.iati_country())\n else:\n data['country_full'] = ''\n data['organisation_full'] = OrganisationSerializer(organisation).data\n data['id'] = employment.pk\n\n return Response(data)\n else:\n return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)\n", "path": "akvo/rest/views/user.py"}, {"content": "# -*- coding: utf-8 -*-\n\n# Akvo RSR is covered by the GNU Affero General Public License.\n# See more details in the license.txt file located at the root folder of the Akvo RSR module.\n# For additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.\n\nfrom akvo.codelists import models as codelist_models\nfrom akvo.codelists.store.codelists_v202 import COUNTRY\nfrom akvo.utils import codelist_choices, codelist_value\n\nfrom django.contrib.admin.models import LogEntry, CHANGE\nfrom django.contrib.auth import get_user_model\nfrom django.contrib.auth.models import Group\nfrom django.contrib.contenttypes.models import ContentType\nfrom django.core.exceptions import ObjectDoesNotExist\nfrom django.db import models\nfrom django.db.models.query import QuerySet as DjangoQuerySet\nfrom django.forms.models import model_to_dict\nfrom django.utils.encoding import force_unicode\nfrom django.utils.translation import ugettext_lazy as _\n\nfrom .models_utils import QuerySetManager\n\nfrom ..fields import ValidXMLCharField\n\n\nclass Employment(models.Model):\n organisation = models.ForeignKey('Organisation', verbose_name=_(u'organisation'),\n related_name='employees')\n user = models.ForeignKey('User', verbose_name=_(u'user'), related_name='employers')\n group = models.ForeignKey(Group, verbose_name=_(u'group'), null=True,\n related_name='employments', related_query_name='employment',\n on_delete=models.SET_NULL,\n help_text=_('The permissions group for this user\\'s employment.'))\n is_approved = models.BooleanField(_('approved'), default=False,\n help_text=_('Designates whether this employment is approved '\n 'by an administrator.'))\n country = ValidXMLCharField(\n _(u'country'), blank=True, max_length=2, choices=codelist_choices(COUNTRY, show_code=False)\n )\n job_title = ValidXMLCharField(_(u'job title'), max_length=50, blank=True)\n\n objects = QuerySetManager()\n\n class QuerySet(DjangoQuerySet):\n def organisations(self):\n \"\"\"\n Return an Organisation QuerySet containing the organisations of the Employment QuerySet\n \"\"\"\n from ..models import Organisation\n return Organisation.objects.filter(employees__in=self).distinct()\n\n def users(self):\n \"\"\"\n Return a User QuerySet containing the users of the Employment QuerySet\n \"\"\"\n return get_user_model().objects.filter(employers__in=self).distinct()\n\n def approved(self):\n \"\"\"\n Return an Employment QuerySet containing the approved Employments\n \"\"\"\n return self.filter(is_approved=True)\n\n class Meta:\n app_label = 'rsr'\n verbose_name = _(u'user employment')\n verbose_name_plural = _(u'user employments')\n unique_together = ('organisation', 'user', 'group')\n\n def __unicode__(self):\n return u\"{0} {1}: {2}\".format(self.user.first_name, self.user.last_name,\n self.organisation.name)\n\n def iati_country(self):\n return codelist_value(codelist_models.Country, self, 'country')\n\n def iati_country_unicode(self):\n return unicode(self.iati_country())\n\n def approve(self, approved_by):\n \"\"\"\n Approve an Employment. Will also be logged in the Employment admin history.\n\n :param approved_by: User that approves the Employment\n \"\"\"\n if not self.is_approved:\n self.is_approved = True\n self.save()\n\n # Log in Employment admin history\n LogEntry.objects.log_action(\n user_id=approved_by.pk,\n content_type_id=ContentType.objects.get_for_model(self).pk,\n object_id=self.pk,\n object_repr=force_unicode(self),\n action_flag=CHANGE,\n change_message=u'Changed is_approved, outside of admin.'\n )\n\n def to_dict(self, org_list):\n # Set groups in right order\n all_groups = []\n auth_group_names = ['Users', 'User Managers', 'Project Editors', 'Admins']\n for auth_group_name in auth_group_names:\n try:\n all_groups.append(Group.objects.get(name=auth_group_name))\n except ObjectDoesNotExist:\n continue\n\n user_group = model_to_dict(self.group, fields=['id', 'name']) if self.group else None\n other_groups = [model_to_dict(group, fields=['id', 'name']) for group in all_groups]\n\n return dict(\n id=self.pk,\n organisation_full=model_to_dict(self.organisation, fields=['id', 'name', 'long_name', ]),\n user_full=model_to_dict(self.user, fields=['id', 'first_name', 'last_name', 'email', ]),\n is_approved=self.is_approved,\n job_title=self.job_title,\n country_full=self.iati_country().name if self.country else '',\n group=user_group,\n other_groups=other_groups,\n actions=True if self.organisation in org_list else False,\n )\n", "path": "akvo/rsr/models/employment.py"}]}
| 3,095 | 252 |
gh_patches_debug_27813
|
rasdani/github-patches
|
git_diff
|
scrapy__scrapy-4413
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Offsite middleware ignoring port
In my spider I have the following:
class MySpider(BaseSpider):
``` python
allowed_domains = ['192.169.0.15:8080']
```
and in the parse method I do something like:
``` python
yield Request('http://192.169.0.15:8080/mypage.html', self.my_callback_function)
```
the result when I run the code is that that scrapy reports:
DEBUG: Filtered offsite request to '192.168.0.15': <GET http://192.168.0.15:8080/mypage.html>
Which is wrong - it seems to be ignoring the port. If I change the allowed_domains to:
``` python
allowed_domains = ['192.169.0.15:8080', '192.16.0.15']
```
Then it works as you would expect it to. No big deal, can work around it but I think it is a bug. The problem being located in the should_follow method of the OffsiteMiddleware class in contrib/spidermiddleware/offsite.py
Fixed "Offsite middleware ignoring port #50" issue
ref: https://github.com/scrapy/scrapy/issues/50
Although a [patch](https://github.com/scrapy/scrapy/issues/50#issuecomment-289200833) was suggested, simply applying it fails another test. As shown here.
```
============================================================ FAILURES ============================================================
____________________________________________________ EngineTest.test_crawler _____________________________________________________
result = None, g = <generator object test_crawler at 0x7f396bc25780>, deferred = <Deferred at 0x7f396b6a6a28 current result: None>
def _inlineCallbacks(result, g, deferred):
"""
See L{inlineCallbacks}.
"""
# This function is complicated by the need to prevent unbounded recursion
# arising from repeatedly yielding immediately ready deferreds. This while
# loop and the waiting variable solve that by manually unfolding the
# recursion.
waiting = [True, # waiting for result?
None] # result
while 1:
try:
# Send the last result back as the result of the yield expression.
isFailure = isinstance(result, failure.Failure)
if isFailure:
result = result.throwExceptionIntoGenerator(g)
else:
> result = g.send(result)
/scrapy/.tox/py27/local/lib/python2.7/site-packages/twisted/internet/defer.py:1386:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/scrapy/tests/test_engine.py:167: in test_crawler
self._assert_visited_urls()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <tests.test_engine.EngineTest testMethod=test_crawler>
def _assert_visited_urls(self):
must_be_visited = ["/", "/redirect", "/redirected",
"/item1.html", "/item2.html", "/item999.html"]
urls_visited = set([rp[0].url for rp in self.run.respplug])
urls_expected = set([self.run.geturl(p) for p in must_be_visited])
> assert urls_expected <= urls_visited, "URLs not visited: %s" % list(urls_expected - urls_visited)
E AssertionError: URLs not visited: ['http://localhost:42519/item2.html', 'http://localhost:42519/item999.html', 'http://localhost:42519/item1.html']
/scrapy/tests/test_engine.py:183: AssertionError
=============================================== 1 failed, 3 passed in 1.37 seconds ===============================================
ERROR: InvocationError: '/scrapy/.tox/py27/bin/py.test --cov=scrapy --cov-report= tests/test_engine.py'
____________________________________________________________ summary _____________________________________________________________
```
So, "/scrapy/tests/test_engine.py" is failing.
It looks the the test creates spiders by using
```
allowed_domains = ["scrapytest.org", "localhost"]
```
A spider fails to visit some urls with port number like 'http://localhost:37089/item1.html'.
Since test server is launched each time we run the test, it has different port number for local host each time. This means we cannot add a fixed port number to "localhost" in 'allowed_domain" list.
So I modified the patch to...
```py
# scrapy/spidermiddlewares/offsite.py
def should_follow(self, request, spider):
regex = self.host_regex
# hostname can be None for wrong urls (like javascript links)
hostname = urlparse_cached(request).hostname or ''
netloc = urlparse_cached(request).netloc or ''
return bool(regex.search(hostname)) or bool(regex.search(netloc))
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `scrapy/spidermiddlewares/offsite.py`
Content:
```
1 """
2 Offsite Spider Middleware
3
4 See documentation in docs/topics/spider-middleware.rst
5 """
6 import re
7 import logging
8 import warnings
9
10 from scrapy import signals
11 from scrapy.http import Request
12 from scrapy.utils.httpobj import urlparse_cached
13
14 logger = logging.getLogger(__name__)
15
16
17 class OffsiteMiddleware(object):
18
19 def __init__(self, stats):
20 self.stats = stats
21
22 @classmethod
23 def from_crawler(cls, crawler):
24 o = cls(crawler.stats)
25 crawler.signals.connect(o.spider_opened, signal=signals.spider_opened)
26 return o
27
28 def process_spider_output(self, response, result, spider):
29 for x in result:
30 if isinstance(x, Request):
31 if x.dont_filter or self.should_follow(x, spider):
32 yield x
33 else:
34 domain = urlparse_cached(x).hostname
35 if domain and domain not in self.domains_seen:
36 self.domains_seen.add(domain)
37 logger.debug(
38 "Filtered offsite request to %(domain)r: %(request)s",
39 {'domain': domain, 'request': x}, extra={'spider': spider})
40 self.stats.inc_value('offsite/domains', spider=spider)
41 self.stats.inc_value('offsite/filtered', spider=spider)
42 else:
43 yield x
44
45 def should_follow(self, request, spider):
46 regex = self.host_regex
47 # hostname can be None for wrong urls (like javascript links)
48 host = urlparse_cached(request).hostname or ''
49 return bool(regex.search(host))
50
51 def get_host_regex(self, spider):
52 """Override this method to implement a different offsite policy"""
53 allowed_domains = getattr(spider, 'allowed_domains', None)
54 if not allowed_domains:
55 return re.compile('') # allow all by default
56 url_pattern = re.compile("^https?://.*$")
57 for domain in allowed_domains:
58 if url_pattern.match(domain):
59 message = ("allowed_domains accepts only domains, not URLs. "
60 "Ignoring URL entry %s in allowed_domains." % domain)
61 warnings.warn(message, URLWarning)
62 domains = [re.escape(d) for d in allowed_domains if d is not None]
63 regex = r'^(.*\.)?(%s)$' % '|'.join(domains)
64 return re.compile(regex)
65
66 def spider_opened(self, spider):
67 self.host_regex = self.get_host_regex(spider)
68 self.domains_seen = set()
69
70
71 class URLWarning(Warning):
72 pass
73
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/scrapy/spidermiddlewares/offsite.py b/scrapy/spidermiddlewares/offsite.py
--- a/scrapy/spidermiddlewares/offsite.py
+++ b/scrapy/spidermiddlewares/offsite.py
@@ -53,13 +53,22 @@
allowed_domains = getattr(spider, 'allowed_domains', None)
if not allowed_domains:
return re.compile('') # allow all by default
- url_pattern = re.compile("^https?://.*$")
+ url_pattern = re.compile(r"^https?://.*$")
+ port_pattern = re.compile(r":\d+$")
+ domains = []
for domain in allowed_domains:
- if url_pattern.match(domain):
+ if domain is None:
+ continue
+ elif url_pattern.match(domain):
message = ("allowed_domains accepts only domains, not URLs. "
"Ignoring URL entry %s in allowed_domains." % domain)
warnings.warn(message, URLWarning)
- domains = [re.escape(d) for d in allowed_domains if d is not None]
+ elif port_pattern.search(domain):
+ message = ("allowed_domains accepts only domains without ports. "
+ "Ignoring entry %s in allowed_domains." % domain)
+ warnings.warn(message, PortWarning)
+ else:
+ domains.append(re.escape(domain))
regex = r'^(.*\.)?(%s)$' % '|'.join(domains)
return re.compile(regex)
@@ -70,3 +79,7 @@
class URLWarning(Warning):
pass
+
+
+class PortWarning(Warning):
+ pass
|
{"golden_diff": "diff --git a/scrapy/spidermiddlewares/offsite.py b/scrapy/spidermiddlewares/offsite.py\n--- a/scrapy/spidermiddlewares/offsite.py\n+++ b/scrapy/spidermiddlewares/offsite.py\n@@ -53,13 +53,22 @@\n allowed_domains = getattr(spider, 'allowed_domains', None)\n if not allowed_domains:\n return re.compile('') # allow all by default\n- url_pattern = re.compile(\"^https?://.*$\")\n+ url_pattern = re.compile(r\"^https?://.*$\")\n+ port_pattern = re.compile(r\":\\d+$\")\n+ domains = []\n for domain in allowed_domains:\n- if url_pattern.match(domain):\n+ if domain is None:\n+ continue\n+ elif url_pattern.match(domain):\n message = (\"allowed_domains accepts only domains, not URLs. \"\n \"Ignoring URL entry %s in allowed_domains.\" % domain)\n warnings.warn(message, URLWarning)\n- domains = [re.escape(d) for d in allowed_domains if d is not None]\n+ elif port_pattern.search(domain):\n+ message = (\"allowed_domains accepts only domains without ports. \"\n+ \"Ignoring entry %s in allowed_domains.\" % domain)\n+ warnings.warn(message, PortWarning)\n+ else:\n+ domains.append(re.escape(domain))\n regex = r'^(.*\\.)?(%s)$' % '|'.join(domains)\n return re.compile(regex)\n \n@@ -70,3 +79,7 @@\n \n class URLWarning(Warning):\n pass\n+\n+\n+class PortWarning(Warning):\n+ pass\n", "issue": "Offsite middleware ignoring port\nIn my spider I have the following:\n\nclass MySpider(BaseSpider):\n\n``` python\n allowed_domains = ['192.169.0.15:8080']\n```\n\nand in the parse method I do something like:\n\n``` python\n yield Request('http://192.169.0.15:8080/mypage.html', self.my_callback_function)\n```\n\nthe result when I run the code is that that scrapy reports:\n\n DEBUG: Filtered offsite request to '192.168.0.15': <GET http://192.168.0.15:8080/mypage.html>\n\nWhich is wrong - it seems to be ignoring the port. If I change the allowed_domains to:\n\n``` python\n allowed_domains = ['192.169.0.15:8080', '192.16.0.15']\n```\n\nThen it works as you would expect it to. No big deal, can work around it but I think it is a bug. The problem being located in the should_follow method of the OffsiteMiddleware class in contrib/spidermiddleware/offsite.py\n\nFixed \"Offsite middleware ignoring port #50\" issue\nref: https://github.com/scrapy/scrapy/issues/50\r\n\r\nAlthough a [patch](https://github.com/scrapy/scrapy/issues/50#issuecomment-289200833) was suggested, simply applying it fails another test. As shown here. \r\n\r\n```\r\n\r\n============================================================ FAILURES ============================================================\r\n____________________________________________________ EngineTest.test_crawler _____________________________________________________\r\n\r\nresult = None, g = <generator object test_crawler at 0x7f396bc25780>, deferred = <Deferred at 0x7f396b6a6a28 current result: None>\r\n\r\n def _inlineCallbacks(result, g, deferred):\r\n \"\"\"\r\n See L{inlineCallbacks}.\r\n \"\"\"\r\n # This function is complicated by the need to prevent unbounded recursion\r\n # arising from repeatedly yielding immediately ready deferreds. This while\r\n # loop and the waiting variable solve that by manually unfolding the\r\n # recursion.\r\n\r\n waiting = [True, # waiting for result?\r\n None] # result\r\n\r\n while 1:\r\n try:\r\n # Send the last result back as the result of the yield expression.\r\n isFailure = isinstance(result, failure.Failure)\r\n if isFailure:\r\n result = result.throwExceptionIntoGenerator(g)\r\n else:\r\n> result = g.send(result)\r\n\r\n/scrapy/.tox/py27/local/lib/python2.7/site-packages/twisted/internet/defer.py:1386:\r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n/scrapy/tests/test_engine.py:167: in test_crawler\r\n self._assert_visited_urls()\r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _\r\n\r\nself = <tests.test_engine.EngineTest testMethod=test_crawler>\r\n\r\n def _assert_visited_urls(self):\r\n must_be_visited = [\"/\", \"/redirect\", \"/redirected\",\r\n \"/item1.html\", \"/item2.html\", \"/item999.html\"]\r\n urls_visited = set([rp[0].url for rp in self.run.respplug])\r\n urls_expected = set([self.run.geturl(p) for p in must_be_visited])\r\n> assert urls_expected <= urls_visited, \"URLs not visited: %s\" % list(urls_expected - urls_visited)\r\nE AssertionError: URLs not visited: ['http://localhost:42519/item2.html', 'http://localhost:42519/item999.html', 'http://localhost:42519/item1.html']\r\n\r\n/scrapy/tests/test_engine.py:183: AssertionError\r\n=============================================== 1 failed, 3 passed in 1.37 seconds ===============================================\r\nERROR: InvocationError: '/scrapy/.tox/py27/bin/py.test --cov=scrapy --cov-report= tests/test_engine.py'\r\n____________________________________________________________ summary _____________________________________________________________\r\n\r\n```\r\n\r\nSo, \"/scrapy/tests/test_engine.py\" is failing.\r\nIt looks the the test creates spiders by using \r\n\r\n```\r\nallowed_domains = [\"scrapytest.org\", \"localhost\"] \r\n```\r\n\r\nA spider fails to visit some urls with port number like 'http://localhost:37089/item1.html'.\r\nSince test server is launched each time we run the test, it has different port number for local host each time. This means we cannot add a fixed port number to \"localhost\" in 'allowed_domain\" list.\r\n\r\nSo I modified the patch to... \r\n```py\r\n# scrapy/spidermiddlewares/offsite.py\r\n\r\n def should_follow(self, request, spider):\r\n regex = self.host_regex\r\n # hostname can be None for wrong urls (like javascript links)\r\n hostname = urlparse_cached(request).hostname or ''\r\n netloc = urlparse_cached(request).netloc or ''\r\n return bool(regex.search(hostname)) or bool(regex.search(netloc))\r\n```\r\n\r\n\r\n\n", "before_files": [{"content": "\"\"\"\nOffsite Spider Middleware\n\nSee documentation in docs/topics/spider-middleware.rst\n\"\"\"\nimport re\nimport logging\nimport warnings\n\nfrom scrapy import signals\nfrom scrapy.http import Request\nfrom scrapy.utils.httpobj import urlparse_cached\n\nlogger = logging.getLogger(__name__)\n\n\nclass OffsiteMiddleware(object):\n\n def __init__(self, stats):\n self.stats = stats\n\n @classmethod\n def from_crawler(cls, crawler):\n o = cls(crawler.stats)\n crawler.signals.connect(o.spider_opened, signal=signals.spider_opened)\n return o\n\n def process_spider_output(self, response, result, spider):\n for x in result:\n if isinstance(x, Request):\n if x.dont_filter or self.should_follow(x, spider):\n yield x\n else:\n domain = urlparse_cached(x).hostname\n if domain and domain not in self.domains_seen:\n self.domains_seen.add(domain)\n logger.debug(\n \"Filtered offsite request to %(domain)r: %(request)s\",\n {'domain': domain, 'request': x}, extra={'spider': spider})\n self.stats.inc_value('offsite/domains', spider=spider)\n self.stats.inc_value('offsite/filtered', spider=spider)\n else:\n yield x\n\n def should_follow(self, request, spider):\n regex = self.host_regex\n # hostname can be None for wrong urls (like javascript links)\n host = urlparse_cached(request).hostname or ''\n return bool(regex.search(host))\n\n def get_host_regex(self, spider):\n \"\"\"Override this method to implement a different offsite policy\"\"\"\n allowed_domains = getattr(spider, 'allowed_domains', None)\n if not allowed_domains:\n return re.compile('') # allow all by default\n url_pattern = re.compile(\"^https?://.*$\")\n for domain in allowed_domains:\n if url_pattern.match(domain):\n message = (\"allowed_domains accepts only domains, not URLs. \"\n \"Ignoring URL entry %s in allowed_domains.\" % domain)\n warnings.warn(message, URLWarning)\n domains = [re.escape(d) for d in allowed_domains if d is not None]\n regex = r'^(.*\\.)?(%s)$' % '|'.join(domains)\n return re.compile(regex)\n\n def spider_opened(self, spider):\n self.host_regex = self.get_host_regex(spider)\n self.domains_seen = set()\n\n\nclass URLWarning(Warning):\n pass\n", "path": "scrapy/spidermiddlewares/offsite.py"}], "after_files": [{"content": "\"\"\"\nOffsite Spider Middleware\n\nSee documentation in docs/topics/spider-middleware.rst\n\"\"\"\nimport re\nimport logging\nimport warnings\n\nfrom scrapy import signals\nfrom scrapy.http import Request\nfrom scrapy.utils.httpobj import urlparse_cached\n\nlogger = logging.getLogger(__name__)\n\n\nclass OffsiteMiddleware(object):\n\n def __init__(self, stats):\n self.stats = stats\n\n @classmethod\n def from_crawler(cls, crawler):\n o = cls(crawler.stats)\n crawler.signals.connect(o.spider_opened, signal=signals.spider_opened)\n return o\n\n def process_spider_output(self, response, result, spider):\n for x in result:\n if isinstance(x, Request):\n if x.dont_filter or self.should_follow(x, spider):\n yield x\n else:\n domain = urlparse_cached(x).hostname\n if domain and domain not in self.domains_seen:\n self.domains_seen.add(domain)\n logger.debug(\n \"Filtered offsite request to %(domain)r: %(request)s\",\n {'domain': domain, 'request': x}, extra={'spider': spider})\n self.stats.inc_value('offsite/domains', spider=spider)\n self.stats.inc_value('offsite/filtered', spider=spider)\n else:\n yield x\n\n def should_follow(self, request, spider):\n regex = self.host_regex\n # hostname can be None for wrong urls (like javascript links)\n host = urlparse_cached(request).hostname or ''\n return bool(regex.search(host))\n\n def get_host_regex(self, spider):\n \"\"\"Override this method to implement a different offsite policy\"\"\"\n allowed_domains = getattr(spider, 'allowed_domains', None)\n if not allowed_domains:\n return re.compile('') # allow all by default\n url_pattern = re.compile(r\"^https?://.*$\")\n port_pattern = re.compile(r\":\\d+$\")\n domains = []\n for domain in allowed_domains:\n if domain is None:\n continue\n elif url_pattern.match(domain):\n message = (\"allowed_domains accepts only domains, not URLs. \"\n \"Ignoring URL entry %s in allowed_domains.\" % domain)\n warnings.warn(message, URLWarning)\n elif port_pattern.search(domain):\n message = (\"allowed_domains accepts only domains without ports. \"\n \"Ignoring entry %s in allowed_domains.\" % domain)\n warnings.warn(message, PortWarning)\n else:\n domains.append(re.escape(domain))\n regex = r'^(.*\\.)?(%s)$' % '|'.join(domains)\n return re.compile(regex)\n\n def spider_opened(self, spider):\n self.host_regex = self.get_host_regex(spider)\n self.domains_seen = set()\n\n\nclass URLWarning(Warning):\n pass\n\n\nclass PortWarning(Warning):\n pass\n", "path": "scrapy/spidermiddlewares/offsite.py"}]}
| 2,124 | 346 |
gh_patches_debug_15689
|
rasdani/github-patches
|
git_diff
|
ipython__ipython-9186
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
tools/github_stats fails on 2fa.
```
$ python tools/github_stats.py --milestone 4.1 --since-tag 4.0.0
no cache
fetching GitHub stats since 2015-08-12 00:00:00 (tag: 4.0.0, milestone: 4.1)
Please enter your github username and password. These are not stored, only used to get an oAuth token. You can revoke this at any time on Github.
Username: *****
Password:*****
Traceback (most recent call last):
File "github_stats.py", line 165, in <module>
auth=True)
File "/Users/bussonniermatthias/dev/ipython/tools/gh_api.py", line 175, in get_milestone_id
milestones = get_milestones(project, auth=auth, **params)
File "/Users/bussonniermatthias/dev/ipython/tools/gh_api.py", line 168, in get_milestones
headers = make_auth_header()
File "/Users/bussonniermatthias/dev/ipython/tools/gh_api.py", line 73, in make_auth_header
return {'Authorization': 'token ' + get_auth_token()}
File "/Users/bussonniermatthias/dev/ipython/tools/gh_api.py", line 67, in get_auth_token
response.raise_for_status()
File "/Users/bussonniermatthias/anaconda3/lib/python3.5/site-packages/requests/models.py", line 840, in raise_for_status
raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://api.github.com/authorizations
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `tools/gh_api.py`
Content:
```
1 """Functions for Github API requests."""
2 from __future__ import print_function
3
4 try:
5 input = raw_input
6 except NameError:
7 pass
8
9 import os
10 import re
11 import sys
12
13 import requests
14 import getpass
15 import json
16
17 try:
18 import requests_cache
19 except ImportError:
20 print("no cache", file=sys.stderr)
21 else:
22 requests_cache.install_cache("gh_api", expire_after=3600)
23
24 # Keyring stores passwords by a 'username', but we're not storing a username and
25 # password
26 fake_username = 'ipython_tools'
27
28 class Obj(dict):
29 """Dictionary with attribute access to names."""
30 def __getattr__(self, name):
31 try:
32 return self[name]
33 except KeyError:
34 raise AttributeError(name)
35
36 def __setattr__(self, name, val):
37 self[name] = val
38
39 token = None
40 def get_auth_token():
41 global token
42
43 if token is not None:
44 return token
45
46 import keyring
47 token = keyring.get_password('github', fake_username)
48 if token is not None:
49 return token
50
51 print("Please enter your github username and password. These are not "
52 "stored, only used to get an oAuth token. You can revoke this at "
53 "any time on Github.")
54 user = input("Username: ")
55 pw = getpass.getpass("Password: ")
56
57 auth_request = {
58 "scopes": [
59 "public_repo",
60 "gist"
61 ],
62 "note": "IPython tools",
63 "note_url": "https://github.com/ipython/ipython/tree/master/tools",
64 }
65 response = requests.post('https://api.github.com/authorizations',
66 auth=(user, pw), data=json.dumps(auth_request))
67 response.raise_for_status()
68 token = json.loads(response.text)['token']
69 keyring.set_password('github', fake_username, token)
70 return token
71
72 def make_auth_header():
73 return {'Authorization': 'token ' + get_auth_token()}
74
75 def post_issue_comment(project, num, body):
76 url = 'https://api.github.com/repos/{project}/issues/{num}/comments'.format(project=project, num=num)
77 payload = json.dumps({'body': body})
78 requests.post(url, data=payload, headers=make_auth_header())
79
80 def post_gist(content, description='', filename='file', auth=False):
81 """Post some text to a Gist, and return the URL."""
82 post_data = json.dumps({
83 "description": description,
84 "public": True,
85 "files": {
86 filename: {
87 "content": content
88 }
89 }
90 }).encode('utf-8')
91
92 headers = make_auth_header() if auth else {}
93 response = requests.post("https://api.github.com/gists", data=post_data, headers=headers)
94 response.raise_for_status()
95 response_data = json.loads(response.text)
96 return response_data['html_url']
97
98 def get_pull_request(project, num, auth=False):
99 """get pull request info by number
100 """
101 url = "https://api.github.com/repos/{project}/pulls/{num}".format(project=project, num=num)
102 if auth:
103 header = make_auth_header()
104 else:
105 header = None
106 print("fetching %s" % url, file=sys.stderr)
107 response = requests.get(url, headers=header)
108 response.raise_for_status()
109 return json.loads(response.text, object_hook=Obj)
110
111 def get_pull_request_files(project, num, auth=False):
112 """get list of files in a pull request"""
113 url = "https://api.github.com/repos/{project}/pulls/{num}/files".format(project=project, num=num)
114 if auth:
115 header = make_auth_header()
116 else:
117 header = None
118 return get_paged_request(url, headers=header)
119
120 element_pat = re.compile(r'<(.+?)>')
121 rel_pat = re.compile(r'rel=[\'"](\w+)[\'"]')
122
123 def get_paged_request(url, headers=None, **params):
124 """get a full list, handling APIv3's paging"""
125 results = []
126 params.setdefault("per_page", 100)
127 while True:
128 if '?' in url:
129 params = None
130 print("fetching %s" % url, file=sys.stderr)
131 else:
132 print("fetching %s with %s" % (url, params), file=sys.stderr)
133 response = requests.get(url, headers=headers, params=params)
134 response.raise_for_status()
135 results.extend(response.json())
136 if 'next' in response.links:
137 url = response.links['next']['url']
138 else:
139 break
140 return results
141
142 def get_pulls_list(project, auth=False, **params):
143 """get pull request list"""
144 params.setdefault("state", "closed")
145 url = "https://api.github.com/repos/{project}/pulls".format(project=project)
146 if auth:
147 headers = make_auth_header()
148 else:
149 headers = None
150 pages = get_paged_request(url, headers=headers, **params)
151 return pages
152
153 def get_issues_list(project, auth=False, **params):
154 """get issues list"""
155 params.setdefault("state", "closed")
156 url = "https://api.github.com/repos/{project}/issues".format(project=project)
157 if auth:
158 headers = make_auth_header()
159 else:
160 headers = None
161 pages = get_paged_request(url, headers=headers, **params)
162 return pages
163
164 def get_milestones(project, auth=False, **params):
165 params.setdefault('state', 'all')
166 url = "https://api.github.com/repos/{project}/milestones".format(project=project)
167 if auth:
168 headers = make_auth_header()
169 else:
170 headers = None
171 milestones = get_paged_request(url, headers=headers, **params)
172 return milestones
173
174 def get_milestone_id(project, milestone, auth=False, **params):
175 milestones = get_milestones(project, auth=auth, **params)
176 for mstone in milestones:
177 if mstone['title'] == milestone:
178 return mstone['number']
179 else:
180 raise ValueError("milestone %s not found" % milestone)
181
182 def is_pull_request(issue):
183 """Return True if the given issue is a pull request."""
184 return bool(issue.get('pull_request', {}).get('html_url', None))
185
186 def get_authors(pr):
187 print("getting authors for #%i" % pr['number'], file=sys.stderr)
188 h = make_auth_header()
189 r = requests.get(pr['commits_url'], headers=h)
190 r.raise_for_status()
191 commits = r.json()
192 authors = []
193 for commit in commits:
194 author = commit['commit']['author']
195 authors.append("%s <%s>" % (author['name'], author['email']))
196 return authors
197
198 # encode_multipart_formdata is from urllib3.filepost
199 # The only change is to iter_fields, to enforce S3's required key ordering
200
201 def iter_fields(fields):
202 fields = fields.copy()
203 for key in ('key', 'acl', 'Filename', 'success_action_status', 'AWSAccessKeyId',
204 'Policy', 'Signature', 'Content-Type', 'file'):
205 yield (key, fields.pop(key))
206 for (k,v) in fields.items():
207 yield k,v
208
209 def encode_multipart_formdata(fields, boundary=None):
210 """
211 Encode a dictionary of ``fields`` using the multipart/form-data mime format.
212
213 :param fields:
214 Dictionary of fields or list of (key, value) field tuples. The key is
215 treated as the field name, and the value as the body of the form-data
216 bytes. If the value is a tuple of two elements, then the first element
217 is treated as the filename of the form-data section.
218
219 Field names and filenames must be unicode.
220
221 :param boundary:
222 If not specified, then a random boundary will be generated using
223 :func:`mimetools.choose_boundary`.
224 """
225 # copy requests imports in here:
226 from io import BytesIO
227 from requests.packages.urllib3.filepost import (
228 choose_boundary, six, writer, b, get_content_type
229 )
230 body = BytesIO()
231 if boundary is None:
232 boundary = choose_boundary()
233
234 for fieldname, value in iter_fields(fields):
235 body.write(b('--%s\r\n' % (boundary)))
236
237 if isinstance(value, tuple):
238 filename, data = value
239 writer(body).write('Content-Disposition: form-data; name="%s"; '
240 'filename="%s"\r\n' % (fieldname, filename))
241 body.write(b('Content-Type: %s\r\n\r\n' %
242 (get_content_type(filename))))
243 else:
244 data = value
245 writer(body).write('Content-Disposition: form-data; name="%s"\r\n'
246 % (fieldname))
247 body.write(b'Content-Type: text/plain\r\n\r\n')
248
249 if isinstance(data, int):
250 data = str(data) # Backwards compatibility
251 if isinstance(data, six.text_type):
252 writer(body).write(data)
253 else:
254 body.write(data)
255
256 body.write(b'\r\n')
257
258 body.write(b('--%s--\r\n' % (boundary)))
259
260 content_type = b('multipart/form-data; boundary=%s' % boundary)
261
262 return body.getvalue(), content_type
263
264
265 def post_download(project, filename, name=None, description=""):
266 """Upload a file to the GitHub downloads area"""
267 if name is None:
268 name = os.path.basename(filename)
269 with open(filename, 'rb') as f:
270 filedata = f.read()
271
272 url = "https://api.github.com/repos/{project}/downloads".format(project=project)
273
274 payload = json.dumps(dict(name=name, size=len(filedata),
275 description=description))
276 response = requests.post(url, data=payload, headers=make_auth_header())
277 response.raise_for_status()
278 reply = json.loads(response.content)
279 s3_url = reply['s3_url']
280
281 fields = dict(
282 key=reply['path'],
283 acl=reply['acl'],
284 success_action_status=201,
285 Filename=reply['name'],
286 AWSAccessKeyId=reply['accesskeyid'],
287 Policy=reply['policy'],
288 Signature=reply['signature'],
289 file=(reply['name'], filedata),
290 )
291 fields['Content-Type'] = reply['mime_type']
292 data, content_type = encode_multipart_formdata(fields)
293 s3r = requests.post(s3_url, data=data, headers={'Content-Type': content_type})
294 return s3r
295
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/tools/gh_api.py b/tools/gh_api.py
--- a/tools/gh_api.py
+++ b/tools/gh_api.py
@@ -64,6 +64,13 @@
}
response = requests.post('https://api.github.com/authorizations',
auth=(user, pw), data=json.dumps(auth_request))
+ if response.status_code == 401 and response.headers.get('X-GitHub-OTP') == 'required; sms':
+ print("Your login API resquest a SMS one time password")
+ sms_pw = getpass.getpass("SMS password: ")
+ response = requests.post('https://api.github.com/authorizations',
+ auth=(user, pw),
+ data=json.dumps(auth_request),
+ headers={'X-GitHub-OTP':sms_pw})
response.raise_for_status()
token = json.loads(response.text)['token']
keyring.set_password('github', fake_username, token)
|
{"golden_diff": "diff --git a/tools/gh_api.py b/tools/gh_api.py\n--- a/tools/gh_api.py\n+++ b/tools/gh_api.py\n@@ -64,6 +64,13 @@\n }\n response = requests.post('https://api.github.com/authorizations',\n auth=(user, pw), data=json.dumps(auth_request))\n+ if response.status_code == 401 and response.headers.get('X-GitHub-OTP') == 'required; sms':\n+ print(\"Your login API resquest a SMS one time password\")\n+ sms_pw = getpass.getpass(\"SMS password: \")\n+ response = requests.post('https://api.github.com/authorizations',\n+ auth=(user, pw), \n+ data=json.dumps(auth_request),\n+ headers={'X-GitHub-OTP':sms_pw})\n response.raise_for_status()\n token = json.loads(response.text)['token']\n keyring.set_password('github', fake_username, token)\n", "issue": "tools/github_stats fails on 2fa. \n```\n$ python tools/github_stats.py --milestone 4.1 --since-tag 4.0.0\nno cache\nfetching GitHub stats since 2015-08-12 00:00:00 (tag: 4.0.0, milestone: 4.1)\nPlease enter your github username and password. These are not stored, only used to get an oAuth token. You can revoke this at any time on Github.\nUsername: *****\nPassword:*****\nTraceback (most recent call last):\n File \"github_stats.py\", line 165, in <module>\n auth=True)\n File \"/Users/bussonniermatthias/dev/ipython/tools/gh_api.py\", line 175, in get_milestone_id\n milestones = get_milestones(project, auth=auth, **params)\n File \"/Users/bussonniermatthias/dev/ipython/tools/gh_api.py\", line 168, in get_milestones\n headers = make_auth_header()\n File \"/Users/bussonniermatthias/dev/ipython/tools/gh_api.py\", line 73, in make_auth_header\n return {'Authorization': 'token ' + get_auth_token()}\n File \"/Users/bussonniermatthias/dev/ipython/tools/gh_api.py\", line 67, in get_auth_token\n response.raise_for_status()\n File \"/Users/bussonniermatthias/anaconda3/lib/python3.5/site-packages/requests/models.py\", line 840, in raise_for_status\n raise HTTPError(http_error_msg, response=self)\nrequests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://api.github.com/authorizations\n```\n\n", "before_files": [{"content": "\"\"\"Functions for Github API requests.\"\"\"\nfrom __future__ import print_function\n\ntry:\n input = raw_input\nexcept NameError:\n pass\n\nimport os\nimport re\nimport sys\n\nimport requests\nimport getpass\nimport json\n\ntry:\n import requests_cache\nexcept ImportError:\n print(\"no cache\", file=sys.stderr)\nelse:\n requests_cache.install_cache(\"gh_api\", expire_after=3600)\n\n# Keyring stores passwords by a 'username', but we're not storing a username and\n# password\nfake_username = 'ipython_tools'\n\nclass Obj(dict):\n \"\"\"Dictionary with attribute access to names.\"\"\"\n def __getattr__(self, name):\n try:\n return self[name]\n except KeyError:\n raise AttributeError(name)\n\n def __setattr__(self, name, val):\n self[name] = val\n\ntoken = None\ndef get_auth_token():\n global token\n\n if token is not None:\n return token\n\n import keyring\n token = keyring.get_password('github', fake_username)\n if token is not None:\n return token\n\n print(\"Please enter your github username and password. These are not \"\n \"stored, only used to get an oAuth token. You can revoke this at \"\n \"any time on Github.\")\n user = input(\"Username: \")\n pw = getpass.getpass(\"Password: \")\n\n auth_request = {\n \"scopes\": [\n \"public_repo\",\n \"gist\"\n ],\n \"note\": \"IPython tools\",\n \"note_url\": \"https://github.com/ipython/ipython/tree/master/tools\",\n }\n response = requests.post('https://api.github.com/authorizations',\n auth=(user, pw), data=json.dumps(auth_request))\n response.raise_for_status()\n token = json.loads(response.text)['token']\n keyring.set_password('github', fake_username, token)\n return token\n\ndef make_auth_header():\n return {'Authorization': 'token ' + get_auth_token()}\n\ndef post_issue_comment(project, num, body):\n url = 'https://api.github.com/repos/{project}/issues/{num}/comments'.format(project=project, num=num)\n payload = json.dumps({'body': body})\n requests.post(url, data=payload, headers=make_auth_header())\n\ndef post_gist(content, description='', filename='file', auth=False):\n \"\"\"Post some text to a Gist, and return the URL.\"\"\"\n post_data = json.dumps({\n \"description\": description,\n \"public\": True,\n \"files\": {\n filename: {\n \"content\": content\n }\n }\n }).encode('utf-8')\n\n headers = make_auth_header() if auth else {}\n response = requests.post(\"https://api.github.com/gists\", data=post_data, headers=headers)\n response.raise_for_status()\n response_data = json.loads(response.text)\n return response_data['html_url']\n\ndef get_pull_request(project, num, auth=False):\n \"\"\"get pull request info by number\n \"\"\"\n url = \"https://api.github.com/repos/{project}/pulls/{num}\".format(project=project, num=num)\n if auth:\n header = make_auth_header()\n else:\n header = None\n print(\"fetching %s\" % url, file=sys.stderr)\n response = requests.get(url, headers=header)\n response.raise_for_status()\n return json.loads(response.text, object_hook=Obj)\n\ndef get_pull_request_files(project, num, auth=False):\n \"\"\"get list of files in a pull request\"\"\"\n url = \"https://api.github.com/repos/{project}/pulls/{num}/files\".format(project=project, num=num)\n if auth:\n header = make_auth_header()\n else:\n header = None\n return get_paged_request(url, headers=header)\n\nelement_pat = re.compile(r'<(.+?)>')\nrel_pat = re.compile(r'rel=[\\'\"](\\w+)[\\'\"]')\n\ndef get_paged_request(url, headers=None, **params):\n \"\"\"get a full list, handling APIv3's paging\"\"\"\n results = []\n params.setdefault(\"per_page\", 100)\n while True:\n if '?' in url:\n params = None\n print(\"fetching %s\" % url, file=sys.stderr)\n else:\n print(\"fetching %s with %s\" % (url, params), file=sys.stderr)\n response = requests.get(url, headers=headers, params=params)\n response.raise_for_status()\n results.extend(response.json())\n if 'next' in response.links:\n url = response.links['next']['url']\n else:\n break\n return results\n\ndef get_pulls_list(project, auth=False, **params):\n \"\"\"get pull request list\"\"\"\n params.setdefault(\"state\", \"closed\")\n url = \"https://api.github.com/repos/{project}/pulls\".format(project=project)\n if auth:\n headers = make_auth_header()\n else:\n headers = None\n pages = get_paged_request(url, headers=headers, **params)\n return pages\n\ndef get_issues_list(project, auth=False, **params):\n \"\"\"get issues list\"\"\"\n params.setdefault(\"state\", \"closed\")\n url = \"https://api.github.com/repos/{project}/issues\".format(project=project)\n if auth:\n headers = make_auth_header()\n else:\n headers = None\n pages = get_paged_request(url, headers=headers, **params)\n return pages\n\ndef get_milestones(project, auth=False, **params):\n params.setdefault('state', 'all')\n url = \"https://api.github.com/repos/{project}/milestones\".format(project=project)\n if auth:\n headers = make_auth_header()\n else:\n headers = None\n milestones = get_paged_request(url, headers=headers, **params)\n return milestones\n\ndef get_milestone_id(project, milestone, auth=False, **params):\n milestones = get_milestones(project, auth=auth, **params)\n for mstone in milestones:\n if mstone['title'] == milestone:\n return mstone['number']\n else:\n raise ValueError(\"milestone %s not found\" % milestone)\n\ndef is_pull_request(issue):\n \"\"\"Return True if the given issue is a pull request.\"\"\"\n return bool(issue.get('pull_request', {}).get('html_url', None))\n\ndef get_authors(pr):\n print(\"getting authors for #%i\" % pr['number'], file=sys.stderr)\n h = make_auth_header()\n r = requests.get(pr['commits_url'], headers=h)\n r.raise_for_status()\n commits = r.json()\n authors = []\n for commit in commits:\n author = commit['commit']['author']\n authors.append(\"%s <%s>\" % (author['name'], author['email']))\n return authors\n\n# encode_multipart_formdata is from urllib3.filepost\n# The only change is to iter_fields, to enforce S3's required key ordering\n\ndef iter_fields(fields):\n fields = fields.copy()\n for key in ('key', 'acl', 'Filename', 'success_action_status', 'AWSAccessKeyId',\n 'Policy', 'Signature', 'Content-Type', 'file'):\n yield (key, fields.pop(key))\n for (k,v) in fields.items():\n yield k,v\n\ndef encode_multipart_formdata(fields, boundary=None):\n \"\"\"\n Encode a dictionary of ``fields`` using the multipart/form-data mime format.\n\n :param fields:\n Dictionary of fields or list of (key, value) field tuples. The key is\n treated as the field name, and the value as the body of the form-data\n bytes. If the value is a tuple of two elements, then the first element\n is treated as the filename of the form-data section.\n\n Field names and filenames must be unicode.\n\n :param boundary:\n If not specified, then a random boundary will be generated using\n :func:`mimetools.choose_boundary`.\n \"\"\"\n # copy requests imports in here:\n from io import BytesIO\n from requests.packages.urllib3.filepost import (\n choose_boundary, six, writer, b, get_content_type\n )\n body = BytesIO()\n if boundary is None:\n boundary = choose_boundary()\n\n for fieldname, value in iter_fields(fields):\n body.write(b('--%s\\r\\n' % (boundary)))\n\n if isinstance(value, tuple):\n filename, data = value\n writer(body).write('Content-Disposition: form-data; name=\"%s\"; '\n 'filename=\"%s\"\\r\\n' % (fieldname, filename))\n body.write(b('Content-Type: %s\\r\\n\\r\\n' %\n (get_content_type(filename))))\n else:\n data = value\n writer(body).write('Content-Disposition: form-data; name=\"%s\"\\r\\n'\n % (fieldname))\n body.write(b'Content-Type: text/plain\\r\\n\\r\\n')\n\n if isinstance(data, int):\n data = str(data) # Backwards compatibility\n if isinstance(data, six.text_type):\n writer(body).write(data)\n else:\n body.write(data)\n\n body.write(b'\\r\\n')\n\n body.write(b('--%s--\\r\\n' % (boundary)))\n\n content_type = b('multipart/form-data; boundary=%s' % boundary)\n\n return body.getvalue(), content_type\n\n\ndef post_download(project, filename, name=None, description=\"\"):\n \"\"\"Upload a file to the GitHub downloads area\"\"\"\n if name is None:\n name = os.path.basename(filename)\n with open(filename, 'rb') as f:\n filedata = f.read()\n\n url = \"https://api.github.com/repos/{project}/downloads\".format(project=project)\n\n payload = json.dumps(dict(name=name, size=len(filedata),\n description=description))\n response = requests.post(url, data=payload, headers=make_auth_header())\n response.raise_for_status()\n reply = json.loads(response.content)\n s3_url = reply['s3_url']\n\n fields = dict(\n key=reply['path'],\n acl=reply['acl'],\n success_action_status=201,\n Filename=reply['name'],\n AWSAccessKeyId=reply['accesskeyid'],\n Policy=reply['policy'],\n Signature=reply['signature'],\n file=(reply['name'], filedata),\n )\n fields['Content-Type'] = reply['mime_type']\n data, content_type = encode_multipart_formdata(fields)\n s3r = requests.post(s3_url, data=data, headers={'Content-Type': content_type})\n return s3r\n", "path": "tools/gh_api.py"}], "after_files": [{"content": "\"\"\"Functions for Github API requests.\"\"\"\nfrom __future__ import print_function\n\ntry:\n input = raw_input\nexcept NameError:\n pass\n\nimport os\nimport re\nimport sys\n\nimport requests\nimport getpass\nimport json\n\ntry:\n import requests_cache\nexcept ImportError:\n print(\"no cache\", file=sys.stderr)\nelse:\n requests_cache.install_cache(\"gh_api\", expire_after=3600)\n\n# Keyring stores passwords by a 'username', but we're not storing a username and\n# password\nfake_username = 'ipython_tools'\n\nclass Obj(dict):\n \"\"\"Dictionary with attribute access to names.\"\"\"\n def __getattr__(self, name):\n try:\n return self[name]\n except KeyError:\n raise AttributeError(name)\n\n def __setattr__(self, name, val):\n self[name] = val\n\ntoken = None\ndef get_auth_token():\n global token\n\n if token is not None:\n return token\n\n import keyring\n token = keyring.get_password('github', fake_username)\n if token is not None:\n return token\n\n print(\"Please enter your github username and password. These are not \"\n \"stored, only used to get an oAuth token. You can revoke this at \"\n \"any time on Github.\")\n user = input(\"Username: \")\n pw = getpass.getpass(\"Password: \")\n\n auth_request = {\n \"scopes\": [\n \"public_repo\",\n \"gist\"\n ],\n \"note\": \"IPython tools\",\n \"note_url\": \"https://github.com/ipython/ipython/tree/master/tools\",\n }\n response = requests.post('https://api.github.com/authorizations',\n auth=(user, pw), data=json.dumps(auth_request))\n if response.status_code == 401 and response.headers.get('X-GitHub-OTP') == 'required; sms':\n print(\"Your login API resquest a SMS one time password\")\n sms_pw = getpass.getpass(\"SMS password: \")\n response = requests.post('https://api.github.com/authorizations',\n auth=(user, pw), \n data=json.dumps(auth_request),\n headers={'X-GitHub-OTP':sms_pw})\n response.raise_for_status()\n token = json.loads(response.text)['token']\n keyring.set_password('github', fake_username, token)\n return token\n\ndef make_auth_header():\n return {'Authorization': 'token ' + get_auth_token()}\n\ndef post_issue_comment(project, num, body):\n url = 'https://api.github.com/repos/{project}/issues/{num}/comments'.format(project=project, num=num)\n payload = json.dumps({'body': body})\n requests.post(url, data=payload, headers=make_auth_header())\n\ndef post_gist(content, description='', filename='file', auth=False):\n \"\"\"Post some text to a Gist, and return the URL.\"\"\"\n post_data = json.dumps({\n \"description\": description,\n \"public\": True,\n \"files\": {\n filename: {\n \"content\": content\n }\n }\n }).encode('utf-8')\n\n headers = make_auth_header() if auth else {}\n response = requests.post(\"https://api.github.com/gists\", data=post_data, headers=headers)\n response.raise_for_status()\n response_data = json.loads(response.text)\n return response_data['html_url']\n\ndef get_pull_request(project, num, auth=False):\n \"\"\"get pull request info by number\n \"\"\"\n url = \"https://api.github.com/repos/{project}/pulls/{num}\".format(project=project, num=num)\n if auth:\n header = make_auth_header()\n else:\n header = None\n print(\"fetching %s\" % url, file=sys.stderr)\n response = requests.get(url, headers=header)\n response.raise_for_status()\n return json.loads(response.text, object_hook=Obj)\n\ndef get_pull_request_files(project, num, auth=False):\n \"\"\"get list of files in a pull request\"\"\"\n url = \"https://api.github.com/repos/{project}/pulls/{num}/files\".format(project=project, num=num)\n if auth:\n header = make_auth_header()\n else:\n header = None\n return get_paged_request(url, headers=header)\n\nelement_pat = re.compile(r'<(.+?)>')\nrel_pat = re.compile(r'rel=[\\'\"](\\w+)[\\'\"]')\n\ndef get_paged_request(url, headers=None, **params):\n \"\"\"get a full list, handling APIv3's paging\"\"\"\n results = []\n params.setdefault(\"per_page\", 100)\n while True:\n if '?' in url:\n params = None\n print(\"fetching %s\" % url, file=sys.stderr)\n else:\n print(\"fetching %s with %s\" % (url, params), file=sys.stderr)\n response = requests.get(url, headers=headers, params=params)\n response.raise_for_status()\n results.extend(response.json())\n if 'next' in response.links:\n url = response.links['next']['url']\n else:\n break\n return results\n\ndef get_pulls_list(project, auth=False, **params):\n \"\"\"get pull request list\"\"\"\n params.setdefault(\"state\", \"closed\")\n url = \"https://api.github.com/repos/{project}/pulls\".format(project=project)\n if auth:\n headers = make_auth_header()\n else:\n headers = None\n pages = get_paged_request(url, headers=headers, **params)\n return pages\n\ndef get_issues_list(project, auth=False, **params):\n \"\"\"get issues list\"\"\"\n params.setdefault(\"state\", \"closed\")\n url = \"https://api.github.com/repos/{project}/issues\".format(project=project)\n if auth:\n headers = make_auth_header()\n else:\n headers = None\n pages = get_paged_request(url, headers=headers, **params)\n return pages\n\ndef get_milestones(project, auth=False, **params):\n params.setdefault('state', 'all')\n url = \"https://api.github.com/repos/{project}/milestones\".format(project=project)\n if auth:\n headers = make_auth_header()\n else:\n headers = None\n milestones = get_paged_request(url, headers=headers, **params)\n return milestones\n\ndef get_milestone_id(project, milestone, auth=False, **params):\n milestones = get_milestones(project, auth=auth, **params)\n for mstone in milestones:\n if mstone['title'] == milestone:\n return mstone['number']\n else:\n raise ValueError(\"milestone %s not found\" % milestone)\n\ndef is_pull_request(issue):\n \"\"\"Return True if the given issue is a pull request.\"\"\"\n return bool(issue.get('pull_request', {}).get('html_url', None))\n\ndef get_authors(pr):\n print(\"getting authors for #%i\" % pr['number'], file=sys.stderr)\n h = make_auth_header()\n r = requests.get(pr['commits_url'], headers=h)\n r.raise_for_status()\n commits = r.json()\n authors = []\n for commit in commits:\n author = commit['commit']['author']\n authors.append(\"%s <%s>\" % (author['name'], author['email']))\n return authors\n\n# encode_multipart_formdata is from urllib3.filepost\n# The only change is to iter_fields, to enforce S3's required key ordering\n\ndef iter_fields(fields):\n fields = fields.copy()\n for key in ('key', 'acl', 'Filename', 'success_action_status', 'AWSAccessKeyId',\n 'Policy', 'Signature', 'Content-Type', 'file'):\n yield (key, fields.pop(key))\n for (k,v) in fields.items():\n yield k,v\n\ndef encode_multipart_formdata(fields, boundary=None):\n \"\"\"\n Encode a dictionary of ``fields`` using the multipart/form-data mime format.\n\n :param fields:\n Dictionary of fields or list of (key, value) field tuples. The key is\n treated as the field name, and the value as the body of the form-data\n bytes. If the value is a tuple of two elements, then the first element\n is treated as the filename of the form-data section.\n\n Field names and filenames must be unicode.\n\n :param boundary:\n If not specified, then a random boundary will be generated using\n :func:`mimetools.choose_boundary`.\n \"\"\"\n # copy requests imports in here:\n from io import BytesIO\n from requests.packages.urllib3.filepost import (\n choose_boundary, six, writer, b, get_content_type\n )\n body = BytesIO()\n if boundary is None:\n boundary = choose_boundary()\n\n for fieldname, value in iter_fields(fields):\n body.write(b('--%s\\r\\n' % (boundary)))\n\n if isinstance(value, tuple):\n filename, data = value\n writer(body).write('Content-Disposition: form-data; name=\"%s\"; '\n 'filename=\"%s\"\\r\\n' % (fieldname, filename))\n body.write(b('Content-Type: %s\\r\\n\\r\\n' %\n (get_content_type(filename))))\n else:\n data = value\n writer(body).write('Content-Disposition: form-data; name=\"%s\"\\r\\n'\n % (fieldname))\n body.write(b'Content-Type: text/plain\\r\\n\\r\\n')\n\n if isinstance(data, int):\n data = str(data) # Backwards compatibility\n if isinstance(data, six.text_type):\n writer(body).write(data)\n else:\n body.write(data)\n\n body.write(b'\\r\\n')\n\n body.write(b('--%s--\\r\\n' % (boundary)))\n\n content_type = b('multipart/form-data; boundary=%s' % boundary)\n\n return body.getvalue(), content_type\n\n\ndef post_download(project, filename, name=None, description=\"\"):\n \"\"\"Upload a file to the GitHub downloads area\"\"\"\n if name is None:\n name = os.path.basename(filename)\n with open(filename, 'rb') as f:\n filedata = f.read()\n\n url = \"https://api.github.com/repos/{project}/downloads\".format(project=project)\n\n payload = json.dumps(dict(name=name, size=len(filedata),\n description=description))\n response = requests.post(url, data=payload, headers=make_auth_header())\n response.raise_for_status()\n reply = json.loads(response.content)\n s3_url = reply['s3_url']\n\n fields = dict(\n key=reply['path'],\n acl=reply['acl'],\n success_action_status=201,\n Filename=reply['name'],\n AWSAccessKeyId=reply['accesskeyid'],\n Policy=reply['policy'],\n Signature=reply['signature'],\n file=(reply['name'], filedata),\n )\n fields['Content-Type'] = reply['mime_type']\n data, content_type = encode_multipart_formdata(fields)\n s3r = requests.post(s3_url, data=data, headers={'Content-Type': content_type})\n return s3r\n", "path": "tools/gh_api.py"}]}
| 3,716 | 209 |
gh_patches_debug_24514
|
rasdani/github-patches
|
git_diff
|
bridgecrewio__checkov-2025
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Check CKV_AZURE_50 fails incorrectly for resource 'azurerm_virtual_machine'
CKV_AZURE_50 always fails for the resource "azurerm_virtual_machine" due to an incorrect check of the existence of the attribute "allow_extension_operations":
https://github.com/bridgecrewio/checkov/blob/25388a34231e09ac17b266ad9db0b4c0e806e956/checkov/terraform/checks/resource/azure/AzureInstanceExtensions.py#L15
The Terraform resource "azurerm_virtual_machine" does not have an attribute named "allow_extension_operations" (see [Terraform Resouce Docu](https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs/resources/virtual_machine).
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `checkov/terraform/checks/resource/azure/AzureInstanceExtensions.py`
Content:
```
1 from checkov.common.models.enums import CheckCategories, CheckResult
2 from checkov.terraform.checks.resource.base_resource_value_check import BaseResourceValueCheck
3
4
5 class AzureInstanceExtensions(BaseResourceValueCheck):
6 def __init__(self):
7 name = "Ensure Virtual Machine Extensions are not Installed"
8 id = "CKV_AZURE_50"
9 supported_resources = ['azurerm_virtual_machine', 'azurerm_linux_virtual_machine']
10 categories = [CheckCategories.GENERAL_SECURITY]
11 super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources,
12 missing_block_result=CheckResult.PASSED)
13
14 def get_inspected_key(self):
15 return 'allow_extension_operations'
16
17 def get_expected_value(self):
18 return False
19
20
21 check = AzureInstanceExtensions()
22
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/checkov/terraform/checks/resource/azure/AzureInstanceExtensions.py b/checkov/terraform/checks/resource/azure/AzureInstanceExtensions.py
--- a/checkov/terraform/checks/resource/azure/AzureInstanceExtensions.py
+++ b/checkov/terraform/checks/resource/azure/AzureInstanceExtensions.py
@@ -1,20 +1,21 @@
-from checkov.common.models.enums import CheckCategories, CheckResult
+from typing import Any
+
+from checkov.common.models.enums import CheckCategories
from checkov.terraform.checks.resource.base_resource_value_check import BaseResourceValueCheck
class AzureInstanceExtensions(BaseResourceValueCheck):
- def __init__(self):
+ def __init__(self) -> None:
name = "Ensure Virtual Machine Extensions are not Installed"
id = "CKV_AZURE_50"
- supported_resources = ['azurerm_virtual_machine', 'azurerm_linux_virtual_machine']
+ supported_resources = ["azurerm_linux_virtual_machine", "azurerm_windows_virtual_machine"]
categories = [CheckCategories.GENERAL_SECURITY]
- super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources,
- missing_block_result=CheckResult.PASSED)
+ super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)
- def get_inspected_key(self):
- return 'allow_extension_operations'
+ def get_inspected_key(self) -> str:
+ return "allow_extension_operations"
- def get_expected_value(self):
+ def get_expected_value(self) -> Any:
return False
|
{"golden_diff": "diff --git a/checkov/terraform/checks/resource/azure/AzureInstanceExtensions.py b/checkov/terraform/checks/resource/azure/AzureInstanceExtensions.py\n--- a/checkov/terraform/checks/resource/azure/AzureInstanceExtensions.py\n+++ b/checkov/terraform/checks/resource/azure/AzureInstanceExtensions.py\n@@ -1,20 +1,21 @@\n-from checkov.common.models.enums import CheckCategories, CheckResult\n+from typing import Any\n+\n+from checkov.common.models.enums import CheckCategories\n from checkov.terraform.checks.resource.base_resource_value_check import BaseResourceValueCheck\n \n \n class AzureInstanceExtensions(BaseResourceValueCheck):\n- def __init__(self):\n+ def __init__(self) -> None:\n name = \"Ensure Virtual Machine Extensions are not Installed\"\n id = \"CKV_AZURE_50\"\n- supported_resources = ['azurerm_virtual_machine', 'azurerm_linux_virtual_machine']\n+ supported_resources = [\"azurerm_linux_virtual_machine\", \"azurerm_windows_virtual_machine\"]\n categories = [CheckCategories.GENERAL_SECURITY]\n- super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources,\n- missing_block_result=CheckResult.PASSED)\n+ super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)\n \n- def get_inspected_key(self):\n- return 'allow_extension_operations'\n+ def get_inspected_key(self) -> str:\n+ return \"allow_extension_operations\"\n \n- def get_expected_value(self):\n+ def get_expected_value(self) -> Any:\n return False\n", "issue": "Check CKV_AZURE_50 fails incorrectly for resource 'azurerm_virtual_machine'\nCKV_AZURE_50 always fails for the resource \"azurerm_virtual_machine\" due to an incorrect check of the existence of the attribute \"allow_extension_operations\":\r\n\r\nhttps://github.com/bridgecrewio/checkov/blob/25388a34231e09ac17b266ad9db0b4c0e806e956/checkov/terraform/checks/resource/azure/AzureInstanceExtensions.py#L15\r\n\r\nThe Terraform resource \"azurerm_virtual_machine\" does not have an attribute named \"allow_extension_operations\" (see [Terraform Resouce Docu](https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs/resources/virtual_machine).\r\n\r\n\n", "before_files": [{"content": "from checkov.common.models.enums import CheckCategories, CheckResult\nfrom checkov.terraform.checks.resource.base_resource_value_check import BaseResourceValueCheck\n\n\nclass AzureInstanceExtensions(BaseResourceValueCheck):\n def __init__(self):\n name = \"Ensure Virtual Machine Extensions are not Installed\"\n id = \"CKV_AZURE_50\"\n supported_resources = ['azurerm_virtual_machine', 'azurerm_linux_virtual_machine']\n categories = [CheckCategories.GENERAL_SECURITY]\n super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources,\n missing_block_result=CheckResult.PASSED)\n\n def get_inspected_key(self):\n return 'allow_extension_operations'\n\n def get_expected_value(self):\n return False\n\n\ncheck = AzureInstanceExtensions()\n", "path": "checkov/terraform/checks/resource/azure/AzureInstanceExtensions.py"}], "after_files": [{"content": "from typing import Any\n\nfrom checkov.common.models.enums import CheckCategories\nfrom checkov.terraform.checks.resource.base_resource_value_check import BaseResourceValueCheck\n\n\nclass AzureInstanceExtensions(BaseResourceValueCheck):\n def __init__(self) -> None:\n name = \"Ensure Virtual Machine Extensions are not Installed\"\n id = \"CKV_AZURE_50\"\n supported_resources = [\"azurerm_linux_virtual_machine\", \"azurerm_windows_virtual_machine\"]\n categories = [CheckCategories.GENERAL_SECURITY]\n super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)\n\n def get_inspected_key(self) -> str:\n return \"allow_extension_operations\"\n\n def get_expected_value(self) -> Any:\n return False\n\n\ncheck = AzureInstanceExtensions()\n", "path": "checkov/terraform/checks/resource/azure/AzureInstanceExtensions.py"}]}
| 656 | 353 |
gh_patches_debug_24763
|
rasdani/github-patches
|
git_diff
|
certbot__certbot-8521
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Certbot-dns-google fails with HTTP 412 - Precondition not met
## My operating system is (include version):
CentOS 7
## I installed Certbot with (certbot-auto, OS package manager, pip, etc):
yum install -y certbot
pip install certbot-dns-google
## I ran this command and it produced this output:
certbot certonly --dns-google --dns-google-credentials <path> --server https://acme-v02.api.letsencrypt.org/directory -d ssl1234.example.net -d *.example.net -d example.net ...
## Certbot's behavior differed from what I expected because:
Encountered error adding TXT record: <HttpError 412 when requesting https://www.googleapis.com/dns/v1/projects/projectname/managedZones/id/changes?alt=json returned "Precondition not met for 'entity.change.deletions[0]'">
This problem only appeared if multiple domains are requested (around 15 SAN Names), single Domains work fine. I tested both wildcard domains and single domains - both fail sometimes.
## Here is a Certbot log showing the issue (if available):
###### Logs are stored in `/var/log/letsencrypt` by default. Feel free to redact domains, e-mail and IP addresses as you see fit.
2018-05-24 16:05:21,163:WARNING:certbot_dns_google.dns_google:Encountered error deleting TXT record: <HttpError 412 when requesting https://www.googleapis.com/dns/v1/projects/projectname/managedZones/id/changes?alt=json returned "Precondition not met for 'entity.change.deletions[0]'">
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `certbot-dns-google/certbot_dns_google/_internal/dns_google.py`
Content:
```
1 """DNS Authenticator for Google Cloud DNS."""
2 import json
3 import logging
4
5 from googleapiclient import discovery
6 from googleapiclient import errors as googleapiclient_errors
7 import httplib2
8 from oauth2client.service_account import ServiceAccountCredentials
9 import zope.interface
10
11 from certbot import errors
12 from certbot import interfaces
13 from certbot.plugins import dns_common
14
15 logger = logging.getLogger(__name__)
16
17 ACCT_URL = 'https://developers.google.com/identity/protocols/OAuth2ServiceAccount#creatinganaccount'
18 PERMISSIONS_URL = 'https://cloud.google.com/dns/access-control#permissions_and_roles'
19 METADATA_URL = 'http://metadata.google.internal/computeMetadata/v1/'
20 METADATA_HEADERS = {'Metadata-Flavor': 'Google'}
21
22
23 @zope.interface.implementer(interfaces.IAuthenticator)
24 @zope.interface.provider(interfaces.IPluginFactory)
25 class Authenticator(dns_common.DNSAuthenticator):
26 """DNS Authenticator for Google Cloud DNS
27
28 This Authenticator uses the Google Cloud DNS API to fulfill a dns-01 challenge.
29 """
30
31 description = ('Obtain certificates using a DNS TXT record (if you are using Google Cloud DNS '
32 'for DNS).')
33 ttl = 60
34
35 def __init__(self, *args, **kwargs):
36 super(Authenticator, self).__init__(*args, **kwargs)
37 self.credentials = None
38
39 @classmethod
40 def add_parser_arguments(cls, add): # pylint: disable=arguments-differ
41 super(Authenticator, cls).add_parser_arguments(add, default_propagation_seconds=60)
42 add('credentials',
43 help=('Path to Google Cloud DNS service account JSON file. (See {0} for' +
44 'information about creating a service account and {1} for information about the' +
45 'required permissions.)').format(ACCT_URL, PERMISSIONS_URL),
46 default=None)
47
48 def more_info(self): # pylint: disable=missing-function-docstring
49 return 'This plugin configures a DNS TXT record to respond to a dns-01 challenge using ' + \
50 'the Google Cloud DNS API.'
51
52 def _setup_credentials(self):
53 if self.conf('credentials') is None:
54 try:
55 # use project_id query to check for availability of google metadata server
56 # we won't use the result but know we're not on GCP when an exception is thrown
57 _GoogleClient.get_project_id()
58 except (ValueError, httplib2.ServerNotFoundError):
59 raise errors.PluginError('Unable to get Google Cloud Metadata and no credentials'
60 ' specified. Automatic credential lookup is only '
61 'available on Google Cloud Platform. Please configure'
62 ' credentials using --dns-google-credentials <file>')
63 else:
64 self._configure_file('credentials',
65 'path to Google Cloud DNS service account JSON file')
66
67 dns_common.validate_file_permissions(self.conf('credentials'))
68
69 def _perform(self, domain, validation_name, validation):
70 self._get_google_client().add_txt_record(domain, validation_name, validation, self.ttl)
71
72 def _cleanup(self, domain, validation_name, validation):
73 self._get_google_client().del_txt_record(domain, validation_name, validation, self.ttl)
74
75 def _get_google_client(self):
76 return _GoogleClient(self.conf('credentials'))
77
78
79 class _GoogleClient(object):
80 """
81 Encapsulates all communication with the Google Cloud DNS API.
82 """
83
84 def __init__(self, account_json=None, dns_api=None):
85
86 scopes = ['https://www.googleapis.com/auth/ndev.clouddns.readwrite']
87 if account_json is not None:
88 try:
89 credentials = ServiceAccountCredentials.from_json_keyfile_name(account_json, scopes)
90 with open(account_json) as account:
91 self.project_id = json.load(account)['project_id']
92 except Exception as e:
93 raise errors.PluginError(
94 "Error parsing credentials file '{}': {}".format(account_json, e))
95 else:
96 credentials = None
97 self.project_id = self.get_project_id()
98
99 if not dns_api:
100 self.dns = discovery.build('dns', 'v1',
101 credentials=credentials,
102 cache_discovery=False)
103 else:
104 self.dns = dns_api
105
106 def add_txt_record(self, domain, record_name, record_content, record_ttl):
107 """
108 Add a TXT record using the supplied information.
109
110 :param str domain: The domain to use to look up the managed zone.
111 :param str record_name: The record name (typically beginning with '_acme-challenge.').
112 :param str record_content: The record content (typically the challenge validation).
113 :param int record_ttl: The record TTL (number of seconds that the record may be cached).
114 :raises certbot.errors.PluginError: if an error occurs communicating with the Google API
115 """
116
117 zone_id = self._find_managed_zone_id(domain)
118
119 record_contents = self.get_existing_txt_rrset(zone_id, record_name)
120 if record_contents is None:
121 record_contents = []
122 add_records = record_contents[:]
123
124 if "\""+record_content+"\"" in record_contents:
125 # The process was interrupted previously and validation token exists
126 return
127
128 add_records.append(record_content)
129
130 data = {
131 "kind": "dns#change",
132 "additions": [
133 {
134 "kind": "dns#resourceRecordSet",
135 "type": "TXT",
136 "name": record_name + ".",
137 "rrdatas": add_records,
138 "ttl": record_ttl,
139 },
140 ],
141 }
142
143 if record_contents:
144 # We need to remove old records in the same request
145 data["deletions"] = [
146 {
147 "kind": "dns#resourceRecordSet",
148 "type": "TXT",
149 "name": record_name + ".",
150 "rrdatas": record_contents,
151 "ttl": record_ttl,
152 },
153 ]
154
155 changes = self.dns.changes()
156
157 try:
158 request = changes.create(project=self.project_id, managedZone=zone_id, body=data)
159 response = request.execute()
160
161 status = response['status']
162 change = response['id']
163 while status == 'pending':
164 request = changes.get(project=self.project_id, managedZone=zone_id, changeId=change)
165 response = request.execute()
166 status = response['status']
167 except googleapiclient_errors.Error as e:
168 logger.error('Encountered error adding TXT record: %s', e)
169 raise errors.PluginError('Error communicating with the Google Cloud DNS API: {0}'
170 .format(e))
171
172 def del_txt_record(self, domain, record_name, record_content, record_ttl):
173 """
174 Delete a TXT record using the supplied information.
175
176 :param str domain: The domain to use to look up the managed zone.
177 :param str record_name: The record name (typically beginning with '_acme-challenge.').
178 :param str record_content: The record content (typically the challenge validation).
179 :param int record_ttl: The record TTL (number of seconds that the record may be cached).
180 :raises certbot.errors.PluginError: if an error occurs communicating with the Google API
181 """
182
183 try:
184 zone_id = self._find_managed_zone_id(domain)
185 except errors.PluginError as e:
186 logger.warning('Error finding zone. Skipping cleanup.')
187 return
188
189 record_contents = self.get_existing_txt_rrset(zone_id, record_name)
190 if record_contents is None:
191 record_contents = ["\"" + record_content + "\""]
192
193 data = {
194 "kind": "dns#change",
195 "deletions": [
196 {
197 "kind": "dns#resourceRecordSet",
198 "type": "TXT",
199 "name": record_name + ".",
200 "rrdatas": record_contents,
201 "ttl": record_ttl,
202 },
203 ],
204 }
205
206 # Remove the record being deleted from the list
207 readd_contents = [r for r in record_contents if r != "\"" + record_content + "\""]
208 if readd_contents:
209 # We need to remove old records in the same request
210 data["additions"] = [
211 {
212 "kind": "dns#resourceRecordSet",
213 "type": "TXT",
214 "name": record_name + ".",
215 "rrdatas": readd_contents,
216 "ttl": record_ttl,
217 },
218 ]
219
220 changes = self.dns.changes()
221
222 try:
223 request = changes.create(project=self.project_id, managedZone=zone_id, body=data)
224 request.execute()
225 except googleapiclient_errors.Error as e:
226 logger.warning('Encountered error deleting TXT record: %s', e)
227
228 def get_existing_txt_rrset(self, zone_id, record_name):
229 """
230 Get existing TXT records from the RRset for the record name.
231
232 If an error occurs while requesting the record set, it is suppressed
233 and None is returned.
234
235 :param str zone_id: The ID of the managed zone.
236 :param str record_name: The record name (typically beginning with '_acme-challenge.').
237
238 :returns: List of TXT record values or None
239 :rtype: `list` of `string` or `None`
240
241 """
242 rrs_request = self.dns.resourceRecordSets()
243 request = rrs_request.list(managedZone=zone_id, project=self.project_id)
244 # Add dot as the API returns absolute domains
245 record_name += "."
246 try:
247 response = request.execute()
248 except googleapiclient_errors.Error:
249 logger.info("Unable to list existing records. If you're "
250 "requesting a wildcard certificate, this might not work.")
251 logger.debug("Error was:", exc_info=True)
252 else:
253 if response:
254 for rr in response["rrsets"]:
255 if rr["name"] == record_name and rr["type"] == "TXT":
256 return rr["rrdatas"]
257 return None
258
259 def _find_managed_zone_id(self, domain):
260 """
261 Find the managed zone for a given domain.
262
263 :param str domain: The domain for which to find the managed zone.
264 :returns: The ID of the managed zone, if found.
265 :rtype: str
266 :raises certbot.errors.PluginError: if the managed zone cannot be found.
267 """
268
269 zone_dns_name_guesses = dns_common.base_domain_name_guesses(domain)
270
271 mz = self.dns.managedZones()
272 for zone_name in zone_dns_name_guesses:
273 try:
274 request = mz.list(project=self.project_id, dnsName=zone_name + '.')
275 response = request.execute()
276 zones = response['managedZones']
277 except googleapiclient_errors.Error as e:
278 raise errors.PluginError('Encountered error finding managed zone: {0}'
279 .format(e))
280
281 for zone in zones:
282 zone_id = zone['id']
283 if 'privateVisibilityConfig' not in zone:
284 logger.debug('Found id of %s for %s using name %s', zone_id, domain, zone_name)
285 return zone_id
286
287 raise errors.PluginError('Unable to determine managed zone for {0} using zone names: {1}.'
288 .format(domain, zone_dns_name_guesses))
289
290 @staticmethod
291 def get_project_id():
292 """
293 Query the google metadata service for the current project ID
294
295 This only works on Google Cloud Platform
296
297 :raises ServerNotFoundError: Not running on Google Compute or DNS not available
298 :raises ValueError: Server is found, but response code is not 200
299 :returns: project id
300 """
301 url = '{0}project/project-id'.format(METADATA_URL)
302
303 # Request an access token from the metadata server.
304 http = httplib2.Http()
305 r, content = http.request(url, headers=METADATA_HEADERS)
306 if r.status != 200:
307 raise ValueError("Invalid status code: {0}".format(r))
308
309 if isinstance(content, bytes):
310 return content.decode()
311 return content
312
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/certbot-dns-google/certbot_dns_google/_internal/dns_google.py b/certbot-dns-google/certbot_dns_google/_internal/dns_google.py
--- a/certbot-dns-google/certbot_dns_google/_internal/dns_google.py
+++ b/certbot-dns-google/certbot_dns_google/_internal/dns_google.py
@@ -240,9 +240,10 @@
"""
rrs_request = self.dns.resourceRecordSets()
- request = rrs_request.list(managedZone=zone_id, project=self.project_id)
# Add dot as the API returns absolute domains
record_name += "."
+ request = rrs_request.list(project=self.project_id, managedZone=zone_id, name=record_name,
+ type="TXT")
try:
response = request.execute()
except googleapiclient_errors.Error:
@@ -250,10 +251,8 @@
"requesting a wildcard certificate, this might not work.")
logger.debug("Error was:", exc_info=True)
else:
- if response:
- for rr in response["rrsets"]:
- if rr["name"] == record_name and rr["type"] == "TXT":
- return rr["rrdatas"]
+ if response and response["rrsets"]:
+ return response["rrsets"][0]["rrdatas"]
return None
def _find_managed_zone_id(self, domain):
|
{"golden_diff": "diff --git a/certbot-dns-google/certbot_dns_google/_internal/dns_google.py b/certbot-dns-google/certbot_dns_google/_internal/dns_google.py\n--- a/certbot-dns-google/certbot_dns_google/_internal/dns_google.py\n+++ b/certbot-dns-google/certbot_dns_google/_internal/dns_google.py\n@@ -240,9 +240,10 @@\n \n \"\"\"\n rrs_request = self.dns.resourceRecordSets()\n- request = rrs_request.list(managedZone=zone_id, project=self.project_id)\n # Add dot as the API returns absolute domains\n record_name += \".\"\n+ request = rrs_request.list(project=self.project_id, managedZone=zone_id, name=record_name,\n+ type=\"TXT\")\n try:\n response = request.execute()\n except googleapiclient_errors.Error:\n@@ -250,10 +251,8 @@\n \"requesting a wildcard certificate, this might not work.\")\n logger.debug(\"Error was:\", exc_info=True)\n else:\n- if response:\n- for rr in response[\"rrsets\"]:\n- if rr[\"name\"] == record_name and rr[\"type\"] == \"TXT\":\n- return rr[\"rrdatas\"]\n+ if response and response[\"rrsets\"]:\n+ return response[\"rrsets\"][0][\"rrdatas\"]\n return None\n \n def _find_managed_zone_id(self, domain):\n", "issue": "Certbot-dns-google fails with HTTP 412 - Precondition not met\n## My operating system is (include version):\r\nCentOS 7\r\n\r\n## I installed Certbot with (certbot-auto, OS package manager, pip, etc):\r\nyum install -y certbot\r\npip install certbot-dns-google\r\n\r\n## I ran this command and it produced this output:\r\n\r\ncertbot certonly --dns-google --dns-google-credentials <path> --server https://acme-v02.api.letsencrypt.org/directory -d ssl1234.example.net -d *.example.net -d example.net ...\r\n\r\n## Certbot's behavior differed from what I expected because:\r\n\r\nEncountered error adding TXT record: <HttpError 412 when requesting https://www.googleapis.com/dns/v1/projects/projectname/managedZones/id/changes?alt=json returned \"Precondition not met for 'entity.change.deletions[0]'\">\r\n\r\nThis problem only appeared if multiple domains are requested (around 15 SAN Names), single Domains work fine. I tested both wildcard domains and single domains - both fail sometimes. \r\n\r\n## Here is a Certbot log showing the issue (if available):\r\n###### Logs are stored in `/var/log/letsencrypt` by default. Feel free to redact domains, e-mail and IP addresses as you see fit.\r\n\r\n2018-05-24 16:05:21,163:WARNING:certbot_dns_google.dns_google:Encountered error deleting TXT record: <HttpError 412 when requesting https://www.googleapis.com/dns/v1/projects/projectname/managedZones/id/changes?alt=json returned \"Precondition not met for 'entity.change.deletions[0]'\">\r\n\n", "before_files": [{"content": "\"\"\"DNS Authenticator for Google Cloud DNS.\"\"\"\nimport json\nimport logging\n\nfrom googleapiclient import discovery\nfrom googleapiclient import errors as googleapiclient_errors\nimport httplib2\nfrom oauth2client.service_account import ServiceAccountCredentials\nimport zope.interface\n\nfrom certbot import errors\nfrom certbot import interfaces\nfrom certbot.plugins import dns_common\n\nlogger = logging.getLogger(__name__)\n\nACCT_URL = 'https://developers.google.com/identity/protocols/OAuth2ServiceAccount#creatinganaccount'\nPERMISSIONS_URL = 'https://cloud.google.com/dns/access-control#permissions_and_roles'\nMETADATA_URL = 'http://metadata.google.internal/computeMetadata/v1/'\nMETADATA_HEADERS = {'Metadata-Flavor': 'Google'}\n\n\n@zope.interface.implementer(interfaces.IAuthenticator)\n@zope.interface.provider(interfaces.IPluginFactory)\nclass Authenticator(dns_common.DNSAuthenticator):\n \"\"\"DNS Authenticator for Google Cloud DNS\n\n This Authenticator uses the Google Cloud DNS API to fulfill a dns-01 challenge.\n \"\"\"\n\n description = ('Obtain certificates using a DNS TXT record (if you are using Google Cloud DNS '\n 'for DNS).')\n ttl = 60\n\n def __init__(self, *args, **kwargs):\n super(Authenticator, self).__init__(*args, **kwargs)\n self.credentials = None\n\n @classmethod\n def add_parser_arguments(cls, add): # pylint: disable=arguments-differ\n super(Authenticator, cls).add_parser_arguments(add, default_propagation_seconds=60)\n add('credentials',\n help=('Path to Google Cloud DNS service account JSON file. (See {0} for' +\n 'information about creating a service account and {1} for information about the' +\n 'required permissions.)').format(ACCT_URL, PERMISSIONS_URL),\n default=None)\n\n def more_info(self): # pylint: disable=missing-function-docstring\n return 'This plugin configures a DNS TXT record to respond to a dns-01 challenge using ' + \\\n 'the Google Cloud DNS API.'\n\n def _setup_credentials(self):\n if self.conf('credentials') is None:\n try:\n # use project_id query to check for availability of google metadata server\n # we won't use the result but know we're not on GCP when an exception is thrown\n _GoogleClient.get_project_id()\n except (ValueError, httplib2.ServerNotFoundError):\n raise errors.PluginError('Unable to get Google Cloud Metadata and no credentials'\n ' specified. Automatic credential lookup is only '\n 'available on Google Cloud Platform. Please configure'\n ' credentials using --dns-google-credentials <file>')\n else:\n self._configure_file('credentials',\n 'path to Google Cloud DNS service account JSON file')\n\n dns_common.validate_file_permissions(self.conf('credentials'))\n\n def _perform(self, domain, validation_name, validation):\n self._get_google_client().add_txt_record(domain, validation_name, validation, self.ttl)\n\n def _cleanup(self, domain, validation_name, validation):\n self._get_google_client().del_txt_record(domain, validation_name, validation, self.ttl)\n\n def _get_google_client(self):\n return _GoogleClient(self.conf('credentials'))\n\n\nclass _GoogleClient(object):\n \"\"\"\n Encapsulates all communication with the Google Cloud DNS API.\n \"\"\"\n\n def __init__(self, account_json=None, dns_api=None):\n\n scopes = ['https://www.googleapis.com/auth/ndev.clouddns.readwrite']\n if account_json is not None:\n try:\n credentials = ServiceAccountCredentials.from_json_keyfile_name(account_json, scopes)\n with open(account_json) as account:\n self.project_id = json.load(account)['project_id']\n except Exception as e:\n raise errors.PluginError(\n \"Error parsing credentials file '{}': {}\".format(account_json, e))\n else:\n credentials = None\n self.project_id = self.get_project_id()\n\n if not dns_api:\n self.dns = discovery.build('dns', 'v1',\n credentials=credentials,\n cache_discovery=False)\n else:\n self.dns = dns_api\n\n def add_txt_record(self, domain, record_name, record_content, record_ttl):\n \"\"\"\n Add a TXT record using the supplied information.\n\n :param str domain: The domain to use to look up the managed zone.\n :param str record_name: The record name (typically beginning with '_acme-challenge.').\n :param str record_content: The record content (typically the challenge validation).\n :param int record_ttl: The record TTL (number of seconds that the record may be cached).\n :raises certbot.errors.PluginError: if an error occurs communicating with the Google API\n \"\"\"\n\n zone_id = self._find_managed_zone_id(domain)\n\n record_contents = self.get_existing_txt_rrset(zone_id, record_name)\n if record_contents is None:\n record_contents = []\n add_records = record_contents[:]\n\n if \"\\\"\"+record_content+\"\\\"\" in record_contents:\n # The process was interrupted previously and validation token exists\n return\n\n add_records.append(record_content)\n\n data = {\n \"kind\": \"dns#change\",\n \"additions\": [\n {\n \"kind\": \"dns#resourceRecordSet\",\n \"type\": \"TXT\",\n \"name\": record_name + \".\",\n \"rrdatas\": add_records,\n \"ttl\": record_ttl,\n },\n ],\n }\n\n if record_contents:\n # We need to remove old records in the same request\n data[\"deletions\"] = [\n {\n \"kind\": \"dns#resourceRecordSet\",\n \"type\": \"TXT\",\n \"name\": record_name + \".\",\n \"rrdatas\": record_contents,\n \"ttl\": record_ttl,\n },\n ]\n\n changes = self.dns.changes()\n\n try:\n request = changes.create(project=self.project_id, managedZone=zone_id, body=data)\n response = request.execute()\n\n status = response['status']\n change = response['id']\n while status == 'pending':\n request = changes.get(project=self.project_id, managedZone=zone_id, changeId=change)\n response = request.execute()\n status = response['status']\n except googleapiclient_errors.Error as e:\n logger.error('Encountered error adding TXT record: %s', e)\n raise errors.PluginError('Error communicating with the Google Cloud DNS API: {0}'\n .format(e))\n\n def del_txt_record(self, domain, record_name, record_content, record_ttl):\n \"\"\"\n Delete a TXT record using the supplied information.\n\n :param str domain: The domain to use to look up the managed zone.\n :param str record_name: The record name (typically beginning with '_acme-challenge.').\n :param str record_content: The record content (typically the challenge validation).\n :param int record_ttl: The record TTL (number of seconds that the record may be cached).\n :raises certbot.errors.PluginError: if an error occurs communicating with the Google API\n \"\"\"\n\n try:\n zone_id = self._find_managed_zone_id(domain)\n except errors.PluginError as e:\n logger.warning('Error finding zone. Skipping cleanup.')\n return\n\n record_contents = self.get_existing_txt_rrset(zone_id, record_name)\n if record_contents is None:\n record_contents = [\"\\\"\" + record_content + \"\\\"\"]\n\n data = {\n \"kind\": \"dns#change\",\n \"deletions\": [\n {\n \"kind\": \"dns#resourceRecordSet\",\n \"type\": \"TXT\",\n \"name\": record_name + \".\",\n \"rrdatas\": record_contents,\n \"ttl\": record_ttl,\n },\n ],\n }\n\n # Remove the record being deleted from the list\n readd_contents = [r for r in record_contents if r != \"\\\"\" + record_content + \"\\\"\"]\n if readd_contents:\n # We need to remove old records in the same request\n data[\"additions\"] = [\n {\n \"kind\": \"dns#resourceRecordSet\",\n \"type\": \"TXT\",\n \"name\": record_name + \".\",\n \"rrdatas\": readd_contents,\n \"ttl\": record_ttl,\n },\n ]\n\n changes = self.dns.changes()\n\n try:\n request = changes.create(project=self.project_id, managedZone=zone_id, body=data)\n request.execute()\n except googleapiclient_errors.Error as e:\n logger.warning('Encountered error deleting TXT record: %s', e)\n\n def get_existing_txt_rrset(self, zone_id, record_name):\n \"\"\"\n Get existing TXT records from the RRset for the record name.\n\n If an error occurs while requesting the record set, it is suppressed\n and None is returned.\n\n :param str zone_id: The ID of the managed zone.\n :param str record_name: The record name (typically beginning with '_acme-challenge.').\n\n :returns: List of TXT record values or None\n :rtype: `list` of `string` or `None`\n\n \"\"\"\n rrs_request = self.dns.resourceRecordSets()\n request = rrs_request.list(managedZone=zone_id, project=self.project_id)\n # Add dot as the API returns absolute domains\n record_name += \".\"\n try:\n response = request.execute()\n except googleapiclient_errors.Error:\n logger.info(\"Unable to list existing records. If you're \"\n \"requesting a wildcard certificate, this might not work.\")\n logger.debug(\"Error was:\", exc_info=True)\n else:\n if response:\n for rr in response[\"rrsets\"]:\n if rr[\"name\"] == record_name and rr[\"type\"] == \"TXT\":\n return rr[\"rrdatas\"]\n return None\n\n def _find_managed_zone_id(self, domain):\n \"\"\"\n Find the managed zone for a given domain.\n\n :param str domain: The domain for which to find the managed zone.\n :returns: The ID of the managed zone, if found.\n :rtype: str\n :raises certbot.errors.PluginError: if the managed zone cannot be found.\n \"\"\"\n\n zone_dns_name_guesses = dns_common.base_domain_name_guesses(domain)\n\n mz = self.dns.managedZones()\n for zone_name in zone_dns_name_guesses:\n try:\n request = mz.list(project=self.project_id, dnsName=zone_name + '.')\n response = request.execute()\n zones = response['managedZones']\n except googleapiclient_errors.Error as e:\n raise errors.PluginError('Encountered error finding managed zone: {0}'\n .format(e))\n\n for zone in zones:\n zone_id = zone['id']\n if 'privateVisibilityConfig' not in zone:\n logger.debug('Found id of %s for %s using name %s', zone_id, domain, zone_name)\n return zone_id\n\n raise errors.PluginError('Unable to determine managed zone for {0} using zone names: {1}.'\n .format(domain, zone_dns_name_guesses))\n\n @staticmethod\n def get_project_id():\n \"\"\"\n Query the google metadata service for the current project ID\n\n This only works on Google Cloud Platform\n\n :raises ServerNotFoundError: Not running on Google Compute or DNS not available\n :raises ValueError: Server is found, but response code is not 200\n :returns: project id\n \"\"\"\n url = '{0}project/project-id'.format(METADATA_URL)\n\n # Request an access token from the metadata server.\n http = httplib2.Http()\n r, content = http.request(url, headers=METADATA_HEADERS)\n if r.status != 200:\n raise ValueError(\"Invalid status code: {0}\".format(r))\n\n if isinstance(content, bytes):\n return content.decode()\n return content\n", "path": "certbot-dns-google/certbot_dns_google/_internal/dns_google.py"}], "after_files": [{"content": "\"\"\"DNS Authenticator for Google Cloud DNS.\"\"\"\nimport json\nimport logging\n\nfrom googleapiclient import discovery\nfrom googleapiclient import errors as googleapiclient_errors\nimport httplib2\nfrom oauth2client.service_account import ServiceAccountCredentials\nimport zope.interface\n\nfrom certbot import errors\nfrom certbot import interfaces\nfrom certbot.plugins import dns_common\n\nlogger = logging.getLogger(__name__)\n\nACCT_URL = 'https://developers.google.com/identity/protocols/OAuth2ServiceAccount#creatinganaccount'\nPERMISSIONS_URL = 'https://cloud.google.com/dns/access-control#permissions_and_roles'\nMETADATA_URL = 'http://metadata.google.internal/computeMetadata/v1/'\nMETADATA_HEADERS = {'Metadata-Flavor': 'Google'}\n\n\n@zope.interface.implementer(interfaces.IAuthenticator)\n@zope.interface.provider(interfaces.IPluginFactory)\nclass Authenticator(dns_common.DNSAuthenticator):\n \"\"\"DNS Authenticator for Google Cloud DNS\n\n This Authenticator uses the Google Cloud DNS API to fulfill a dns-01 challenge.\n \"\"\"\n\n description = ('Obtain certificates using a DNS TXT record (if you are using Google Cloud DNS '\n 'for DNS).')\n ttl = 60\n\n def __init__(self, *args, **kwargs):\n super(Authenticator, self).__init__(*args, **kwargs)\n self.credentials = None\n\n @classmethod\n def add_parser_arguments(cls, add): # pylint: disable=arguments-differ\n super(Authenticator, cls).add_parser_arguments(add, default_propagation_seconds=60)\n add('credentials',\n help=('Path to Google Cloud DNS service account JSON file. (See {0} for' +\n 'information about creating a service account and {1} for information about the' +\n 'required permissions.)').format(ACCT_URL, PERMISSIONS_URL),\n default=None)\n\n def more_info(self): # pylint: disable=missing-function-docstring\n return 'This plugin configures a DNS TXT record to respond to a dns-01 challenge using ' + \\\n 'the Google Cloud DNS API.'\n\n def _setup_credentials(self):\n if self.conf('credentials') is None:\n try:\n # use project_id query to check for availability of google metadata server\n # we won't use the result but know we're not on GCP when an exception is thrown\n _GoogleClient.get_project_id()\n except (ValueError, httplib2.ServerNotFoundError):\n raise errors.PluginError('Unable to get Google Cloud Metadata and no credentials'\n ' specified. Automatic credential lookup is only '\n 'available on Google Cloud Platform. Please configure'\n ' credentials using --dns-google-credentials <file>')\n else:\n self._configure_file('credentials',\n 'path to Google Cloud DNS service account JSON file')\n\n dns_common.validate_file_permissions(self.conf('credentials'))\n\n def _perform(self, domain, validation_name, validation):\n self._get_google_client().add_txt_record(domain, validation_name, validation, self.ttl)\n\n def _cleanup(self, domain, validation_name, validation):\n self._get_google_client().del_txt_record(domain, validation_name, validation, self.ttl)\n\n def _get_google_client(self):\n return _GoogleClient(self.conf('credentials'))\n\n\nclass _GoogleClient(object):\n \"\"\"\n Encapsulates all communication with the Google Cloud DNS API.\n \"\"\"\n\n def __init__(self, account_json=None, dns_api=None):\n\n scopes = ['https://www.googleapis.com/auth/ndev.clouddns.readwrite']\n if account_json is not None:\n try:\n credentials = ServiceAccountCredentials.from_json_keyfile_name(account_json, scopes)\n with open(account_json) as account:\n self.project_id = json.load(account)['project_id']\n except Exception as e:\n raise errors.PluginError(\n \"Error parsing credentials file '{}': {}\".format(account_json, e))\n else:\n credentials = None\n self.project_id = self.get_project_id()\n\n if not dns_api:\n self.dns = discovery.build('dns', 'v1',\n credentials=credentials,\n cache_discovery=False)\n else:\n self.dns = dns_api\n\n def add_txt_record(self, domain, record_name, record_content, record_ttl):\n \"\"\"\n Add a TXT record using the supplied information.\n\n :param str domain: The domain to use to look up the managed zone.\n :param str record_name: The record name (typically beginning with '_acme-challenge.').\n :param str record_content: The record content (typically the challenge validation).\n :param int record_ttl: The record TTL (number of seconds that the record may be cached).\n :raises certbot.errors.PluginError: if an error occurs communicating with the Google API\n \"\"\"\n\n zone_id = self._find_managed_zone_id(domain)\n\n record_contents = self.get_existing_txt_rrset(zone_id, record_name)\n if record_contents is None:\n record_contents = []\n add_records = record_contents[:]\n\n if \"\\\"\"+record_content+\"\\\"\" in record_contents:\n # The process was interrupted previously and validation token exists\n return\n\n add_records.append(record_content)\n\n data = {\n \"kind\": \"dns#change\",\n \"additions\": [\n {\n \"kind\": \"dns#resourceRecordSet\",\n \"type\": \"TXT\",\n \"name\": record_name + \".\",\n \"rrdatas\": add_records,\n \"ttl\": record_ttl,\n },\n ],\n }\n\n if record_contents:\n # We need to remove old records in the same request\n data[\"deletions\"] = [\n {\n \"kind\": \"dns#resourceRecordSet\",\n \"type\": \"TXT\",\n \"name\": record_name + \".\",\n \"rrdatas\": record_contents,\n \"ttl\": record_ttl,\n },\n ]\n\n changes = self.dns.changes()\n\n try:\n request = changes.create(project=self.project_id, managedZone=zone_id, body=data)\n response = request.execute()\n\n status = response['status']\n change = response['id']\n while status == 'pending':\n request = changes.get(project=self.project_id, managedZone=zone_id, changeId=change)\n response = request.execute()\n status = response['status']\n except googleapiclient_errors.Error as e:\n logger.error('Encountered error adding TXT record: %s', e)\n raise errors.PluginError('Error communicating with the Google Cloud DNS API: {0}'\n .format(e))\n\n def del_txt_record(self, domain, record_name, record_content, record_ttl):\n \"\"\"\n Delete a TXT record using the supplied information.\n\n :param str domain: The domain to use to look up the managed zone.\n :param str record_name: The record name (typically beginning with '_acme-challenge.').\n :param str record_content: The record content (typically the challenge validation).\n :param int record_ttl: The record TTL (number of seconds that the record may be cached).\n :raises certbot.errors.PluginError: if an error occurs communicating with the Google API\n \"\"\"\n\n try:\n zone_id = self._find_managed_zone_id(domain)\n except errors.PluginError as e:\n logger.warning('Error finding zone. Skipping cleanup.')\n return\n\n record_contents = self.get_existing_txt_rrset(zone_id, record_name)\n if record_contents is None:\n record_contents = [\"\\\"\" + record_content + \"\\\"\"]\n\n data = {\n \"kind\": \"dns#change\",\n \"deletions\": [\n {\n \"kind\": \"dns#resourceRecordSet\",\n \"type\": \"TXT\",\n \"name\": record_name + \".\",\n \"rrdatas\": record_contents,\n \"ttl\": record_ttl,\n },\n ],\n }\n\n # Remove the record being deleted from the list\n readd_contents = [r for r in record_contents if r != \"\\\"\" + record_content + \"\\\"\"]\n if readd_contents:\n # We need to remove old records in the same request\n data[\"additions\"] = [\n {\n \"kind\": \"dns#resourceRecordSet\",\n \"type\": \"TXT\",\n \"name\": record_name + \".\",\n \"rrdatas\": readd_contents,\n \"ttl\": record_ttl,\n },\n ]\n\n changes = self.dns.changes()\n\n try:\n request = changes.create(project=self.project_id, managedZone=zone_id, body=data)\n request.execute()\n except googleapiclient_errors.Error as e:\n logger.warning('Encountered error deleting TXT record: %s', e)\n\n def get_existing_txt_rrset(self, zone_id, record_name):\n \"\"\"\n Get existing TXT records from the RRset for the record name.\n\n If an error occurs while requesting the record set, it is suppressed\n and None is returned.\n\n :param str zone_id: The ID of the managed zone.\n :param str record_name: The record name (typically beginning with '_acme-challenge.').\n\n :returns: List of TXT record values or None\n :rtype: `list` of `string` or `None`\n\n \"\"\"\n rrs_request = self.dns.resourceRecordSets()\n # Add dot as the API returns absolute domains\n record_name += \".\"\n request = rrs_request.list(project=self.project_id, managedZone=zone_id, name=record_name,\n type=\"TXT\")\n try:\n response = request.execute()\n except googleapiclient_errors.Error:\n logger.info(\"Unable to list existing records. If you're \"\n \"requesting a wildcard certificate, this might not work.\")\n logger.debug(\"Error was:\", exc_info=True)\n else:\n if response and response[\"rrsets\"]:\n return response[\"rrsets\"][0][\"rrdatas\"]\n return None\n\n def _find_managed_zone_id(self, domain):\n \"\"\"\n Find the managed zone for a given domain.\n\n :param str domain: The domain for which to find the managed zone.\n :returns: The ID of the managed zone, if found.\n :rtype: str\n :raises certbot.errors.PluginError: if the managed zone cannot be found.\n \"\"\"\n\n zone_dns_name_guesses = dns_common.base_domain_name_guesses(domain)\n\n mz = self.dns.managedZones()\n for zone_name in zone_dns_name_guesses:\n try:\n request = mz.list(project=self.project_id, dnsName=zone_name + '.')\n response = request.execute()\n zones = response['managedZones']\n except googleapiclient_errors.Error as e:\n raise errors.PluginError('Encountered error finding managed zone: {0}'\n .format(e))\n\n for zone in zones:\n zone_id = zone['id']\n if 'privateVisibilityConfig' not in zone:\n logger.debug('Found id of %s for %s using name %s', zone_id, domain, zone_name)\n return zone_id\n\n raise errors.PluginError('Unable to determine managed zone for {0} using zone names: {1}.'\n .format(domain, zone_dns_name_guesses))\n\n @staticmethod\n def get_project_id():\n \"\"\"\n Query the google metadata service for the current project ID\n\n This only works on Google Cloud Platform\n\n :raises ServerNotFoundError: Not running on Google Compute or DNS not available\n :raises ValueError: Server is found, but response code is not 200\n :returns: project id\n \"\"\"\n url = '{0}project/project-id'.format(METADATA_URL)\n\n # Request an access token from the metadata server.\n http = httplib2.Http()\n r, content = http.request(url, headers=METADATA_HEADERS)\n if r.status != 200:\n raise ValueError(\"Invalid status code: {0}\".format(r))\n\n if isinstance(content, bytes):\n return content.decode()\n return content\n", "path": "certbot-dns-google/certbot_dns_google/_internal/dns_google.py"}]}
| 4,049 | 317 |
gh_patches_debug_10912
|
rasdani/github-patches
|
git_diff
|
lnbits__lnbits-888
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Tip Jar extension broken
Tip Jars are broken on v0.9.1
Setup:
- venv python based setup (python 3.9.13)
- LNBits v0.9.1
- Backend: PostgreSQL
- LND-Rest connection
Steps to reproduce:
- add extension "Tip Jar"
- choose wallet
- do not choose onchain
- test tip jar, fill out donator and amount, hit submit
- error occurrs: `'NoneType' object has no attribute 'mempool_endpoint'`
debug log:
```
"POST /tipjar/api/v1/tips HTTP/1.1" 200
ERROR | lnbits.app:basic_error:182 | handled error
ERROR | lnbits.app:basic_error:183 | ERROR:
Traceback (most recent call last):
File "/home/lnbits/lnbits-legend/venv/lib/python3.9/site-packages/starlette/middleware/errors.py", line 162, in __call__
await self.app(scope, receive, _send)
File "/home/lnbits/lnbits-legend/venv/lib/python3.9/site-packages/starlette/middleware/gzip.py", line 24, in __call__
await responder(scope, receive, send)
File "/home/lnbits/lnbits-legend/venv/lib/python3.9/site-packages/starlette/middleware/gzip.py", line 43, in __call__
await self.app(scope, receive, self.send_with_gzip)
File "/home/lnbits/lnbits-legend/venv/lib/python3.9/site-packages/starlette/middleware/cors.py", line 84, in __call__
await self.app(scope, receive, send)
File "/home/lnbits/lnbits-legend/venv/lib/python3.9/site-packages/starlette/exceptions.py", line 93, in __call__
raise exc
File "/home/lnbits/lnbits-legend/venv/lib/python3.9/site-packages/starlette/exceptions.py", line 82, in __call__
await self.app(scope, receive, sender)
File "/home/lnbits/lnbits-legend/venv/lib/python3.9/site-packages/fastapi/middleware/asyncexitstack.py", line 21, in __call__
raise e
File "/home/lnbits/lnbits-legend/venv/lib/python3.9/site-packages/fastapi/middleware/asyncexitstack.py", line 18, in __call__
await self.app(scope, receive, send)
File "/home/lnbits/lnbits-legend/venv/lib/python3.9/site-packages/starlette/routing.py", line 670, in __call__
await route.handle(scope, receive, send)
File "/home/lnbits/lnbits-legend/venv/lib/python3.9/site-packages/starlette/routing.py", line 266, in handle
await self.app(scope, receive, send)
File "/home/lnbits/lnbits-legend/venv/lib/python3.9/site-packages/starlette/routing.py", line 65, in app
response = await func(request)
File "/home/lnbits/lnbits-legend/venv/lib/python3.9/site-packages/fastapi/routing.py", line 227, in app
raw_response = await run_endpoint_function(
File "/home/lnbits/lnbits-legend/venv/lib/python3.9/site-packages/fastapi/routing.py", line 160, in run_endpoint_function
return await dependant.call(**values)
File "/home/lnbits/lnbits-legend/./lnbits/extensions/satspay/views.py", line 43, in display
"mempool_endpoint": onchainwallet_config.mempool_endpoint,
AttributeError: 'NoneType' object has no attribute 'mempool_endpoint'
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `lnbits/extensions/tipjar/crud.py`
Content:
```
1 from typing import Optional
2
3 from lnbits.db import SQLITE
4
5 from ..satspay.crud import delete_charge # type: ignore
6 from . import db
7 from .models import Tip, TipJar, createTipJar
8
9
10 async def create_tip(
11 id: int, wallet: str, message: str, name: str, sats: int, tipjar: str
12 ) -> Tip:
13 """Create a new Tip"""
14 await db.execute(
15 """
16 INSERT INTO tipjar.Tips (
17 id,
18 wallet,
19 name,
20 message,
21 sats,
22 tipjar
23 )
24 VALUES (?, ?, ?, ?, ?, ?)
25 """,
26 (id, wallet, name, message, sats, tipjar),
27 )
28
29 tip = await get_tip(id)
30 assert tip, "Newly created tip couldn't be retrieved"
31 return tip
32
33
34 async def create_tipjar(data: createTipJar) -> TipJar:
35 """Create a new TipJar"""
36
37 returning = "" if db.type == SQLITE else "RETURNING ID"
38 method = db.execute if db.type == SQLITE else db.fetchone
39
40 result = await (method)(
41 f"""
42 INSERT INTO tipjar.TipJars (
43 name,
44 wallet,
45 webhook,
46 onchain
47 )
48 VALUES (?, ?, ?, ?)
49 {returning}
50 """,
51 (data.name, data.wallet, data.webhook, data.onchain),
52 )
53 if db.type == SQLITE:
54 tipjar_id = result._result_proxy.lastrowid
55 else:
56 tipjar_id = result[0]
57
58 tipjar = await get_tipjar(tipjar_id)
59 assert tipjar
60 return tipjar
61
62
63 async def get_tipjar(tipjar_id: int) -> Optional[TipJar]:
64 """Return a tipjar by ID"""
65 row = await db.fetchone("SELECT * FROM tipjar.TipJars WHERE id = ?", (tipjar_id,))
66 return TipJar(**row) if row else None
67
68
69 async def get_tipjars(wallet_id: str) -> Optional[list]:
70 """Return all TipJars belonging assigned to the wallet_id"""
71 rows = await db.fetchall(
72 "SELECT * FROM tipjar.TipJars WHERE wallet = ?", (wallet_id,)
73 )
74 return [TipJar(**row) for row in rows] if rows else None
75
76
77 async def delete_tipjar(tipjar_id: int) -> None:
78 """Delete a TipJar and all corresponding Tips"""
79 await db.execute("DELETE FROM tipjar.TipJars WHERE id = ?", (tipjar_id,))
80 rows = await db.fetchall("SELECT * FROM tipjar.Tips WHERE tipjar = ?", (tipjar_id,))
81 for row in rows:
82 await delete_tip(row["id"])
83
84
85 async def get_tip(tip_id: str) -> Optional[Tip]:
86 """Return a Tip"""
87 row = await db.fetchone("SELECT * FROM tipjar.Tips WHERE id = ?", (tip_id,))
88 return Tip(**row) if row else None
89
90
91 async def get_tips(wallet_id: str) -> Optional[list]:
92 """Return all Tips assigned to wallet_id"""
93 rows = await db.fetchall("SELECT * FROM tipjar.Tips WHERE wallet = ?", (wallet_id,))
94 return [Tip(**row) for row in rows] if rows else None
95
96
97 async def delete_tip(tip_id: str) -> None:
98 """Delete a Tip and its corresponding statspay charge"""
99 await db.execute("DELETE FROM tipjar.Tips WHERE id = ?", (tip_id,))
100 await delete_charge(tip_id)
101
102
103 async def update_tip(tip_id: str, **kwargs) -> Tip:
104 """Update a Tip"""
105 q = ", ".join([f"{field[0]} = ?" for field in kwargs.items()])
106 await db.execute(
107 f"UPDATE tipjar.Tips SET {q} WHERE id = ?", (*kwargs.values(), tip_id)
108 )
109 row = await db.fetchone("SELECT * FROM tipjar.Tips WHERE id = ?", (tip_id,))
110 assert row, "Newly updated tip couldn't be retrieved"
111 return Tip(**row)
112
113
114 async def update_tipjar(tipjar_id: str, **kwargs) -> TipJar:
115 """Update a tipjar"""
116 q = ", ".join([f"{field[0]} = ?" for field in kwargs.items()])
117 await db.execute(
118 f"UPDATE tipjar.TipJars SET {q} WHERE id = ?", (*kwargs.values(), tipjar_id)
119 )
120 row = await db.fetchone("SELECT * FROM tipjar.TipJars WHERE id = ?", (tipjar_id,))
121 assert row, "Newly updated tipjar couldn't be retrieved"
122 return TipJar(**row)
123
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/lnbits/extensions/tipjar/crud.py b/lnbits/extensions/tipjar/crud.py
--- a/lnbits/extensions/tipjar/crud.py
+++ b/lnbits/extensions/tipjar/crud.py
@@ -76,10 +76,10 @@
async def delete_tipjar(tipjar_id: int) -> None:
"""Delete a TipJar and all corresponding Tips"""
- await db.execute("DELETE FROM tipjar.TipJars WHERE id = ?", (tipjar_id,))
rows = await db.fetchall("SELECT * FROM tipjar.Tips WHERE tipjar = ?", (tipjar_id,))
for row in rows:
await delete_tip(row["id"])
+ await db.execute("DELETE FROM tipjar.TipJars WHERE id = ?", (tipjar_id,))
async def get_tip(tip_id: str) -> Optional[Tip]:
|
{"golden_diff": "diff --git a/lnbits/extensions/tipjar/crud.py b/lnbits/extensions/tipjar/crud.py\n--- a/lnbits/extensions/tipjar/crud.py\n+++ b/lnbits/extensions/tipjar/crud.py\n@@ -76,10 +76,10 @@\n \n async def delete_tipjar(tipjar_id: int) -> None:\n \"\"\"Delete a TipJar and all corresponding Tips\"\"\"\n- await db.execute(\"DELETE FROM tipjar.TipJars WHERE id = ?\", (tipjar_id,))\n rows = await db.fetchall(\"SELECT * FROM tipjar.Tips WHERE tipjar = ?\", (tipjar_id,))\n for row in rows:\n await delete_tip(row[\"id\"])\n+ await db.execute(\"DELETE FROM tipjar.TipJars WHERE id = ?\", (tipjar_id,))\n \n \n async def get_tip(tip_id: str) -> Optional[Tip]:\n", "issue": "Tip Jar extension broken\nTip Jars are broken on v0.9.1\r\n\r\nSetup:\r\n- venv python based setup (python 3.9.13)\r\n- LNBits v0.9.1\r\n- Backend: PostgreSQL\r\n- LND-Rest connection\r\n\r\nSteps to reproduce: \r\n- add extension \"Tip Jar\"\r\n- choose wallet\r\n- do not choose onchain\r\n- test tip jar, fill out donator and amount, hit submit\r\n- error occurrs: `'NoneType' object has no attribute 'mempool_endpoint'`\r\n\r\ndebug log:\r\n```\r\n\"POST /tipjar/api/v1/tips HTTP/1.1\" 200\r\nERROR | lnbits.app:basic_error:182 | handled error\r\nERROR | lnbits.app:basic_error:183 | ERROR:\r\nTraceback (most recent call last):\r\n File \"/home/lnbits/lnbits-legend/venv/lib/python3.9/site-packages/starlette/middleware/errors.py\", line 162, in __call__\r\n await self.app(scope, receive, _send)\r\n File \"/home/lnbits/lnbits-legend/venv/lib/python3.9/site-packages/starlette/middleware/gzip.py\", line 24, in __call__\r\n await responder(scope, receive, send)\r\n File \"/home/lnbits/lnbits-legend/venv/lib/python3.9/site-packages/starlette/middleware/gzip.py\", line 43, in __call__\r\n await self.app(scope, receive, self.send_with_gzip)\r\n File \"/home/lnbits/lnbits-legend/venv/lib/python3.9/site-packages/starlette/middleware/cors.py\", line 84, in __call__\r\n await self.app(scope, receive, send)\r\n File \"/home/lnbits/lnbits-legend/venv/lib/python3.9/site-packages/starlette/exceptions.py\", line 93, in __call__\r\n raise exc\r\n File \"/home/lnbits/lnbits-legend/venv/lib/python3.9/site-packages/starlette/exceptions.py\", line 82, in __call__\r\n await self.app(scope, receive, sender)\r\n File \"/home/lnbits/lnbits-legend/venv/lib/python3.9/site-packages/fastapi/middleware/asyncexitstack.py\", line 21, in __call__\r\n raise e\r\n File \"/home/lnbits/lnbits-legend/venv/lib/python3.9/site-packages/fastapi/middleware/asyncexitstack.py\", line 18, in __call__\r\n await self.app(scope, receive, send)\r\n File \"/home/lnbits/lnbits-legend/venv/lib/python3.9/site-packages/starlette/routing.py\", line 670, in __call__\r\n await route.handle(scope, receive, send)\r\n File \"/home/lnbits/lnbits-legend/venv/lib/python3.9/site-packages/starlette/routing.py\", line 266, in handle\r\n await self.app(scope, receive, send)\r\n File \"/home/lnbits/lnbits-legend/venv/lib/python3.9/site-packages/starlette/routing.py\", line 65, in app\r\n response = await func(request)\r\n File \"/home/lnbits/lnbits-legend/venv/lib/python3.9/site-packages/fastapi/routing.py\", line 227, in app\r\n raw_response = await run_endpoint_function(\r\n File \"/home/lnbits/lnbits-legend/venv/lib/python3.9/site-packages/fastapi/routing.py\", line 160, in run_endpoint_function\r\n return await dependant.call(**values)\r\n File \"/home/lnbits/lnbits-legend/./lnbits/extensions/satspay/views.py\", line 43, in display\r\n \"mempool_endpoint\": onchainwallet_config.mempool_endpoint,\r\nAttributeError: 'NoneType' object has no attribute 'mempool_endpoint'\r\n```\n", "before_files": [{"content": "from typing import Optional\n\nfrom lnbits.db import SQLITE\n\nfrom ..satspay.crud import delete_charge # type: ignore\nfrom . import db\nfrom .models import Tip, TipJar, createTipJar\n\n\nasync def create_tip(\n id: int, wallet: str, message: str, name: str, sats: int, tipjar: str\n) -> Tip:\n \"\"\"Create a new Tip\"\"\"\n await db.execute(\n \"\"\"\n INSERT INTO tipjar.Tips (\n id,\n wallet,\n name,\n message,\n sats,\n tipjar\n )\n VALUES (?, ?, ?, ?, ?, ?)\n \"\"\",\n (id, wallet, name, message, sats, tipjar),\n )\n\n tip = await get_tip(id)\n assert tip, \"Newly created tip couldn't be retrieved\"\n return tip\n\n\nasync def create_tipjar(data: createTipJar) -> TipJar:\n \"\"\"Create a new TipJar\"\"\"\n\n returning = \"\" if db.type == SQLITE else \"RETURNING ID\"\n method = db.execute if db.type == SQLITE else db.fetchone\n\n result = await (method)(\n f\"\"\"\n INSERT INTO tipjar.TipJars (\n name,\n wallet,\n webhook,\n onchain\n )\n VALUES (?, ?, ?, ?)\n {returning}\n \"\"\",\n (data.name, data.wallet, data.webhook, data.onchain),\n )\n if db.type == SQLITE:\n tipjar_id = result._result_proxy.lastrowid\n else:\n tipjar_id = result[0]\n\n tipjar = await get_tipjar(tipjar_id)\n assert tipjar\n return tipjar\n\n\nasync def get_tipjar(tipjar_id: int) -> Optional[TipJar]:\n \"\"\"Return a tipjar by ID\"\"\"\n row = await db.fetchone(\"SELECT * FROM tipjar.TipJars WHERE id = ?\", (tipjar_id,))\n return TipJar(**row) if row else None\n\n\nasync def get_tipjars(wallet_id: str) -> Optional[list]:\n \"\"\"Return all TipJars belonging assigned to the wallet_id\"\"\"\n rows = await db.fetchall(\n \"SELECT * FROM tipjar.TipJars WHERE wallet = ?\", (wallet_id,)\n )\n return [TipJar(**row) for row in rows] if rows else None\n\n\nasync def delete_tipjar(tipjar_id: int) -> None:\n \"\"\"Delete a TipJar and all corresponding Tips\"\"\"\n await db.execute(\"DELETE FROM tipjar.TipJars WHERE id = ?\", (tipjar_id,))\n rows = await db.fetchall(\"SELECT * FROM tipjar.Tips WHERE tipjar = ?\", (tipjar_id,))\n for row in rows:\n await delete_tip(row[\"id\"])\n\n\nasync def get_tip(tip_id: str) -> Optional[Tip]:\n \"\"\"Return a Tip\"\"\"\n row = await db.fetchone(\"SELECT * FROM tipjar.Tips WHERE id = ?\", (tip_id,))\n return Tip(**row) if row else None\n\n\nasync def get_tips(wallet_id: str) -> Optional[list]:\n \"\"\"Return all Tips assigned to wallet_id\"\"\"\n rows = await db.fetchall(\"SELECT * FROM tipjar.Tips WHERE wallet = ?\", (wallet_id,))\n return [Tip(**row) for row in rows] if rows else None\n\n\nasync def delete_tip(tip_id: str) -> None:\n \"\"\"Delete a Tip and its corresponding statspay charge\"\"\"\n await db.execute(\"DELETE FROM tipjar.Tips WHERE id = ?\", (tip_id,))\n await delete_charge(tip_id)\n\n\nasync def update_tip(tip_id: str, **kwargs) -> Tip:\n \"\"\"Update a Tip\"\"\"\n q = \", \".join([f\"{field[0]} = ?\" for field in kwargs.items()])\n await db.execute(\n f\"UPDATE tipjar.Tips SET {q} WHERE id = ?\", (*kwargs.values(), tip_id)\n )\n row = await db.fetchone(\"SELECT * FROM tipjar.Tips WHERE id = ?\", (tip_id,))\n assert row, \"Newly updated tip couldn't be retrieved\"\n return Tip(**row)\n\n\nasync def update_tipjar(tipjar_id: str, **kwargs) -> TipJar:\n \"\"\"Update a tipjar\"\"\"\n q = \", \".join([f\"{field[0]} = ?\" for field in kwargs.items()])\n await db.execute(\n f\"UPDATE tipjar.TipJars SET {q} WHERE id = ?\", (*kwargs.values(), tipjar_id)\n )\n row = await db.fetchone(\"SELECT * FROM tipjar.TipJars WHERE id = ?\", (tipjar_id,))\n assert row, \"Newly updated tipjar couldn't be retrieved\"\n return TipJar(**row)\n", "path": "lnbits/extensions/tipjar/crud.py"}], "after_files": [{"content": "from typing import Optional\n\nfrom lnbits.db import SQLITE\n\nfrom ..satspay.crud import delete_charge # type: ignore\nfrom . import db\nfrom .models import Tip, TipJar, createTipJar\n\n\nasync def create_tip(\n id: int, wallet: str, message: str, name: str, sats: int, tipjar: str\n) -> Tip:\n \"\"\"Create a new Tip\"\"\"\n await db.execute(\n \"\"\"\n INSERT INTO tipjar.Tips (\n id,\n wallet,\n name,\n message,\n sats,\n tipjar\n )\n VALUES (?, ?, ?, ?, ?, ?)\n \"\"\",\n (id, wallet, name, message, sats, tipjar),\n )\n\n tip = await get_tip(id)\n assert tip, \"Newly created tip couldn't be retrieved\"\n return tip\n\n\nasync def create_tipjar(data: createTipJar) -> TipJar:\n \"\"\"Create a new TipJar\"\"\"\n\n returning = \"\" if db.type == SQLITE else \"RETURNING ID\"\n method = db.execute if db.type == SQLITE else db.fetchone\n\n result = await (method)(\n f\"\"\"\n INSERT INTO tipjar.TipJars (\n name,\n wallet,\n webhook,\n onchain\n )\n VALUES (?, ?, ?, ?)\n {returning}\n \"\"\",\n (data.name, data.wallet, data.webhook, data.onchain),\n )\n if db.type == SQLITE:\n tipjar_id = result._result_proxy.lastrowid\n else:\n tipjar_id = result[0]\n\n tipjar = await get_tipjar(tipjar_id)\n assert tipjar\n return tipjar\n\n\nasync def get_tipjar(tipjar_id: int) -> Optional[TipJar]:\n \"\"\"Return a tipjar by ID\"\"\"\n row = await db.fetchone(\"SELECT * FROM tipjar.TipJars WHERE id = ?\", (tipjar_id,))\n return TipJar(**row) if row else None\n\n\nasync def get_tipjars(wallet_id: str) -> Optional[list]:\n \"\"\"Return all TipJars belonging assigned to the wallet_id\"\"\"\n rows = await db.fetchall(\n \"SELECT * FROM tipjar.TipJars WHERE wallet = ?\", (wallet_id,)\n )\n return [TipJar(**row) for row in rows] if rows else None\n\n\nasync def delete_tipjar(tipjar_id: int) -> None:\n \"\"\"Delete a TipJar and all corresponding Tips\"\"\"\n rows = await db.fetchall(\"SELECT * FROM tipjar.Tips WHERE tipjar = ?\", (tipjar_id,))\n for row in rows:\n await delete_tip(row[\"id\"])\n await db.execute(\"DELETE FROM tipjar.TipJars WHERE id = ?\", (tipjar_id,))\n\n\nasync def get_tip(tip_id: str) -> Optional[Tip]:\n \"\"\"Return a Tip\"\"\"\n row = await db.fetchone(\"SELECT * FROM tipjar.Tips WHERE id = ?\", (tip_id,))\n return Tip(**row) if row else None\n\n\nasync def get_tips(wallet_id: str) -> Optional[list]:\n \"\"\"Return all Tips assigned to wallet_id\"\"\"\n rows = await db.fetchall(\"SELECT * FROM tipjar.Tips WHERE wallet = ?\", (wallet_id,))\n return [Tip(**row) for row in rows] if rows else None\n\n\nasync def delete_tip(tip_id: str) -> None:\n \"\"\"Delete a Tip and its corresponding statspay charge\"\"\"\n await db.execute(\"DELETE FROM tipjar.Tips WHERE id = ?\", (tip_id,))\n await delete_charge(tip_id)\n\n\nasync def update_tip(tip_id: str, **kwargs) -> Tip:\n \"\"\"Update a Tip\"\"\"\n q = \", \".join([f\"{field[0]} = ?\" for field in kwargs.items()])\n await db.execute(\n f\"UPDATE tipjar.Tips SET {q} WHERE id = ?\", (*kwargs.values(), tip_id)\n )\n row = await db.fetchone(\"SELECT * FROM tipjar.Tips WHERE id = ?\", (tip_id,))\n assert row, \"Newly updated tip couldn't be retrieved\"\n return Tip(**row)\n\n\nasync def update_tipjar(tipjar_id: str, **kwargs) -> TipJar:\n \"\"\"Update a tipjar\"\"\"\n q = \", \".join([f\"{field[0]} = ?\" for field in kwargs.items()])\n await db.execute(\n f\"UPDATE tipjar.TipJars SET {q} WHERE id = ?\", (*kwargs.values(), tipjar_id)\n )\n row = await db.fetchone(\"SELECT * FROM tipjar.TipJars WHERE id = ?\", (tipjar_id,))\n assert row, \"Newly updated tipjar couldn't be retrieved\"\n return TipJar(**row)\n", "path": "lnbits/extensions/tipjar/crud.py"}]}
| 2,409 | 193 |
gh_patches_debug_18533
|
rasdani/github-patches
|
git_diff
|
kivy__kivy-1812
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Splitter doesn't respect max_size upon resize (it doesn't resize proportionally)
When you have a Splitter in e.g. a boxlayout with some maximum size that depends on another widget's size, max_size is not respected. Also, I'd be nice, if when resizing the parent, the splitter's size adjusts itself proportionally to what it was before.
To see the issue, run the following code, maximize the window, move the splitter all the way to the left, and resize the window to a smaller size. The splitter will now be larger than the specified max_size:
```
from kivy.base import runTouchApp
from kivy.uix.boxlayout import BoxLayout
from kivy.lang import Builder
kv = '''
<Test>:
Label:
text: 'left'
Splitter:
max_size: self.parent.width - 200
Label:
text: 'right'
'''
Builder.load_string(kv)
class Test(BoxLayout):
pass
if __name__ == '__main__':
runTouchApp(Test())
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `kivy/uix/splitter.py`
Content:
```
1 '''
2 Splitter
3 ======
4
5 .. versionadded:: 1.5.0
6
7 .. image:: images/splitter.jpg
8 :align: right
9
10 The :class:`Splitter` is a widget that helps you re-size it's child
11 widget/layout by letting you re-size it via
12 dragging the boundary or double tapping the boundary. This widget is similar to
13 the :class:`~kivy.uix.scrollview.ScrollView` in that it allows only one child
14 widget.
15
16 Usage::
17
18 splitter = Splitter(sizable_from = 'right')
19 splitter.add_widget(layout_or_widget_instance)
20 splitter.min_size = 100
21 splitter.max_size = 250
22
23 To change the size of the strip/border used for resizing::
24
25 splitter.strip_size = '10pt'
26
27 To change it's appearance::
28
29 splitter.strip_cls = your_custom_class
30
31 You can also change the appearance of the `strip_cls`, which defaults to
32 :class:`SplitterStrip`, by overriding the `kv` rule in your app::
33
34 <SplitterStrip>:
35 horizontal: True if self.parent and self.parent.sizable_from[0] \
36 in ('t', 'b') else False
37 background_normal: 'path to normal horizontal image' \
38 if self.horizontal else 'path to vertical normal image'
39 background_down: 'path to pressed horizontal image' \
40 if self.horizontal else 'path to vertical pressed image'
41
42 '''
43
44
45 __all__ = ('Splitter', )
46
47 from kivy.compat import string_types
48 from kivy.factory import Factory
49 from kivy.uix.button import Button
50 from kivy.properties import (OptionProperty, NumericProperty, ObjectProperty,
51 ListProperty)
52 from kivy.uix.boxlayout import BoxLayout
53
54
55 class SplitterStrip(Button):
56 '''Class used for tbe graphical representation of a
57 :class:`kivy.uix.splitter.SplitterStripe`.
58 '''
59 pass
60
61
62 class Splitter(BoxLayout):
63 '''See module documentation.
64
65 :Events:
66 `on_press`:
67 Fired when the splitter is pressed.
68 `on_release`:
69 Fired when the splitter is released.
70
71 .. versionchanged:: 1.6.0
72 Added `on_press` and `on_release` events.
73
74 '''
75
76 border = ListProperty([4, 4, 4, 4])
77 '''Border used for the
78 :class:`~kivy.graphics.vertex_instructions.BorderImage`
79 graphics instruction.
80
81 This must be a list of four values: (top, right, bottom, left).
82 Read the BorderImage instructions for more information about how
83 to use it.
84
85 :attr:`border` is a :class:`~kivy.properties.ListProperty` and
86 defaults to (4, 4, 4, 4).
87 '''
88
89 strip_cls = ObjectProperty(SplitterStrip)
90 '''Specifies the class of the resize Strip.
91
92 :attr:`strip_cls` is an :class:`kivy.properties.ObjectProperty` and
93 defaults to :class:`~kivy.uix.splitter.SplitterStrip`, which is of type
94 :class:`~kivy.uix.button.Button`.
95
96 .. versionchanged:: 1.8.0
97
98 If you set a string, the :class:`~kivy.factory.Factory` will be used to
99 resolve the class.
100
101 '''
102
103 sizable_from = OptionProperty('left', options=(
104 'left', 'right', 'top', 'bottom'))
105 '''Specifies whether the widget is resizable. Options are::
106 `left`, `right`, `top` or `bottom`
107
108 :attr:`sizable_from` is an :class:`~kivy.properties.OptionProperty`
109 and defaults to `left`.
110 '''
111
112 strip_size = NumericProperty('10pt')
113 '''Specifies the size of resize strip
114
115 :attr:`strp_size` is a :class:`~kivy.properties.NumericProperty`
116 defaults to `10pt`
117 '''
118
119 min_size = NumericProperty('100pt')
120 '''Specifies the minimum size beyond which the widget is not resizable.
121
122 :attr:`min_size` is a :class:`~kivy.properties.NumericProperty` and
123 defaults to `100pt`.
124 '''
125
126 max_size = NumericProperty('500pt')
127 '''Specifies the maximum size beyond which the widget is not resizable.
128
129 :attr:`max_size` is a :class:`~kivy.properties.NumericProperty`
130 and defaults to `500pt`.
131 '''
132
133 __events__ = ('on_press', 'on_release')
134
135 def __init__(self, **kwargs):
136 self._container = None
137 self._strip = None
138 super(Splitter, self).__init__(**kwargs)
139
140 def on_sizable_from(self, instance, sizable_from):
141 if not instance._container:
142 return
143
144 sup = super(Splitter, instance)
145 _strp = instance._strip
146 if _strp:
147 # remove any previous binds
148 _strp.unbind(on_touch_down=instance.strip_down)
149 _strp.unbind(on_touch_move=instance.strip_move)
150 _strp.unbind(on_touch_up=instance.strip_up)
151 self.unbind(disabled=_strp.setter('disabled'))
152
153 sup.remove_widget(instance._strip)
154 else:
155 cls = instance.strip_cls
156 if isinstance(cls, string_types):
157 cls = Factory.get(cls)
158 instance._strip = _strp = cls()
159
160 sz_frm = instance.sizable_from[0]
161 if sz_frm in ('l', 'r'):
162 _strp.size_hint = None, 1
163 _strp.width = instance.strip_size
164 instance.orientation = 'horizontal'
165 instance.unbind(strip_size=_strp.setter('width'))
166 instance.bind(strip_size=_strp.setter('width'))
167 else:
168 _strp.size_hint = 1, None
169 _strp.height = instance.strip_size
170 instance.orientation = 'vertical'
171 instance.unbind(strip_size=_strp.setter('height'))
172 instance.bind(strip_size=_strp.setter('height'))
173
174 index = 1
175 if sz_frm in ('r', 'b'):
176 index = 0
177 sup.add_widget(_strp, index)
178
179 _strp.bind(on_touch_down=instance.strip_down)
180 _strp.bind(on_touch_move=instance.strip_move)
181 _strp.bind(on_touch_up=instance.strip_up)
182 _strp.disabled = self.disabled
183 self.bind(disabled=_strp.setter('disabled'))
184
185 def add_widget(self, widget, index=0):
186 if self._container or not widget:
187 return Exception('Splitter accepts only one Child')
188 self._container = widget
189 sz_frm = self.sizable_from[0]
190 if sz_frm in ('l', 'r'):
191 widget.size_hint_x = 1
192 else:
193 widget.size_hint_y = 1
194
195 index = 0
196 if sz_frm in ('r', 'b'):
197 index = 1
198 super(Splitter, self).add_widget(widget, index)
199 self.on_sizable_from(self, self.sizable_from)
200
201 def remove_widget(self, widget, *largs):
202 super(Splitter, self).remove_widget(widget)
203 if widget == self._container:
204 self._container = None
205
206 def clear_widgets(self):
207 self.remove_widget(self._container)
208
209 def strip_down(self, instance, touch):
210 if not instance.collide_point(*touch.pos):
211 return False
212 touch.grab(self)
213 self.dispatch('on_press')
214
215 def on_press(self):
216 pass
217
218 def strip_move(self, instance, touch):
219 if touch.grab_current is not instance:
220 return False
221 sign = 1
222 max_size = self.max_size
223 min_size = self.min_size
224 sz_frm = self.sizable_from[0]
225
226 if sz_frm in ('t', 'b'):
227 diff_y = (touch.dy)
228 if sz_frm == 'b':
229 sign = -1
230 if self.size_hint_y:
231 self.size_hint_y = None
232 if self.height > 0:
233 self.height += sign * diff_y
234 else:
235 self.height = 1
236
237 height = self.height
238 self.height = max(min_size, min(height, max_size))
239 else:
240 diff_x = (touch.dx)
241 if sz_frm == 'l':
242 sign = -1
243 if self.size_hint_x:
244 self.size_hint_x = None
245 if self.width > 0:
246 self.width += sign * (diff_x)
247 else:
248 self.width = 1
249
250 width = self.width
251 self.width = max(min_size, min(width, max_size))
252
253 def strip_up(self, instance, touch):
254 if touch.grab_current is not instance:
255 return
256
257 if touch.is_double_tap:
258 max_size = self.max_size
259 min_size = self.min_size
260 sz_frm = self.sizable_from[0]
261 s = self.size
262
263 if sz_frm in ('t', 'b'):
264 if self.size_hint_y:
265 self.size_hint_y = None
266 if s[1] - min_size <= max_size - s[1]:
267 self.height = max_size
268 else:
269 self.height = min_size
270 else:
271 if self.size_hint_x:
272 self.size_hint_x = None
273 if s[0] - min_size <= max_size - s[0]:
274 self.width = max_size
275 else:
276 self.width = min_size
277 touch.ungrab(instance)
278 self.dispatch('on_release')
279
280 def on_release(self):
281 pass
282
283
284 if __name__ == '__main__':
285 from kivy.app import App
286 from kivy.uix.button import Button
287 from kivy.uix.floatlayout import FloatLayout
288
289 class SplitterApp(App):
290
291 def build(self):
292 root = FloatLayout()
293 bx = BoxLayout()
294 bx.add_widget(Button())
295 bx.add_widget(Button())
296 bx2 = BoxLayout()
297 bx2.add_widget(Button())
298 bx2.add_widget(Button())
299 bx2.add_widget(Button())
300 spl = Splitter(
301 size_hint=(1, .25),
302 pos_hint = {'top': 1},
303 sizable_from = 'bottom')
304 spl1 = Splitter(
305 sizable_from='left',
306 size_hint=(None, 1), width=90)
307 spl1.add_widget(Button())
308 bx.add_widget(spl1)
309 spl.add_widget(bx)
310
311 spl2 = Splitter(size_hint=(.25, 1))
312 spl2.add_widget(bx2)
313 spl2.sizable_from = 'right'
314 root.add_widget(spl)
315 root.add_widget(spl2)
316 return root
317
318 SplitterApp().run()
319
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/kivy/uix/splitter.py b/kivy/uix/splitter.py
--- a/kivy/uix/splitter.py
+++ b/kivy/uix/splitter.py
@@ -136,6 +136,9 @@
self._container = None
self._strip = None
super(Splitter, self).__init__(**kwargs)
+ self.bind(size=self._do_size,
+ max_size=self._do_size,
+ min_size=self._do_size)
def on_sizable_from(self, instance, sizable_from):
if not instance._container:
@@ -215,6 +218,12 @@
def on_press(self):
pass
+ def _do_size(self, instance, value):
+ if self.sizable_from[0] in ('l', 'r'):
+ self.width = max(self.min_size, min(self.width, self.max_size))
+ else:
+ self.height = max(self.min_size, min(self.height, self.max_size))
+
def strip_move(self, instance, touch):
if touch.grab_current is not instance:
return False
|
{"golden_diff": "diff --git a/kivy/uix/splitter.py b/kivy/uix/splitter.py\n--- a/kivy/uix/splitter.py\n+++ b/kivy/uix/splitter.py\n@@ -136,6 +136,9 @@\n self._container = None\n self._strip = None\n super(Splitter, self).__init__(**kwargs)\n+ self.bind(size=self._do_size,\n+ max_size=self._do_size,\n+ min_size=self._do_size)\n \n def on_sizable_from(self, instance, sizable_from):\n if not instance._container:\n@@ -215,6 +218,12 @@\n def on_press(self):\n pass\n \n+ def _do_size(self, instance, value):\n+ if self.sizable_from[0] in ('l', 'r'):\n+ self.width = max(self.min_size, min(self.width, self.max_size))\n+ else:\n+ self.height = max(self.min_size, min(self.height, self.max_size))\n+\n def strip_move(self, instance, touch):\n if touch.grab_current is not instance:\n return False\n", "issue": "Splitter doesn't respect max_size upon resize (it doesn't resize proportionally)\nWhen you have a Splitter in e.g. a boxlayout with some maximum size that depends on another widget's size, max_size is not respected. Also, I'd be nice, if when resizing the parent, the splitter's size adjusts itself proportionally to what it was before.\n\nTo see the issue, run the following code, maximize the window, move the splitter all the way to the left, and resize the window to a smaller size. The splitter will now be larger than the specified max_size:\n\n```\nfrom kivy.base import runTouchApp\nfrom kivy.uix.boxlayout import BoxLayout\nfrom kivy.lang import Builder\n\n\nkv = '''\n<Test>:\n Label:\n text: 'left'\n Splitter:\n max_size: self.parent.width - 200\n Label:\n text: 'right'\n\n\n'''\nBuilder.load_string(kv)\n\nclass Test(BoxLayout):\n pass\n\nif __name__ == '__main__':\n runTouchApp(Test())\n```\n\n", "before_files": [{"content": "'''\nSplitter\n======\n\n.. versionadded:: 1.5.0\n\n.. image:: images/splitter.jpg\n :align: right\n\nThe :class:`Splitter` is a widget that helps you re-size it's child\nwidget/layout by letting you re-size it via\ndragging the boundary or double tapping the boundary. This widget is similar to\nthe :class:`~kivy.uix.scrollview.ScrollView` in that it allows only one child\nwidget.\n\nUsage::\n\n splitter = Splitter(sizable_from = 'right')\n splitter.add_widget(layout_or_widget_instance)\n splitter.min_size = 100\n splitter.max_size = 250\n\nTo change the size of the strip/border used for resizing::\n\n splitter.strip_size = '10pt'\n\nTo change it's appearance::\n\n splitter.strip_cls = your_custom_class\n\nYou can also change the appearance of the `strip_cls`, which defaults to\n:class:`SplitterStrip`, by overriding the `kv` rule in your app::\n\n <SplitterStrip>:\n horizontal: True if self.parent and self.parent.sizable_from[0] \\\nin ('t', 'b') else False\n background_normal: 'path to normal horizontal image' \\\nif self.horizontal else 'path to vertical normal image'\n background_down: 'path to pressed horizontal image' \\\nif self.horizontal else 'path to vertical pressed image'\n\n'''\n\n\n__all__ = ('Splitter', )\n\nfrom kivy.compat import string_types\nfrom kivy.factory import Factory\nfrom kivy.uix.button import Button\nfrom kivy.properties import (OptionProperty, NumericProperty, ObjectProperty,\n ListProperty)\nfrom kivy.uix.boxlayout import BoxLayout\n\n\nclass SplitterStrip(Button):\n '''Class used for tbe graphical representation of a\n :class:`kivy.uix.splitter.SplitterStripe`.\n '''\n pass\n\n\nclass Splitter(BoxLayout):\n '''See module documentation.\n\n :Events:\n `on_press`:\n Fired when the splitter is pressed.\n `on_release`:\n Fired when the splitter is released.\n\n .. versionchanged:: 1.6.0\n Added `on_press` and `on_release` events.\n\n '''\n\n border = ListProperty([4, 4, 4, 4])\n '''Border used for the\n :class:`~kivy.graphics.vertex_instructions.BorderImage`\n graphics instruction.\n\n This must be a list of four values: (top, right, bottom, left).\n Read the BorderImage instructions for more information about how\n to use it.\n\n :attr:`border` is a :class:`~kivy.properties.ListProperty` and\n defaults to (4, 4, 4, 4).\n '''\n\n strip_cls = ObjectProperty(SplitterStrip)\n '''Specifies the class of the resize Strip.\n\n :attr:`strip_cls` is an :class:`kivy.properties.ObjectProperty` and\n defaults to :class:`~kivy.uix.splitter.SplitterStrip`, which is of type\n :class:`~kivy.uix.button.Button`.\n\n .. versionchanged:: 1.8.0\n\n If you set a string, the :class:`~kivy.factory.Factory` will be used to\n resolve the class.\n\n '''\n\n sizable_from = OptionProperty('left', options=(\n 'left', 'right', 'top', 'bottom'))\n '''Specifies whether the widget is resizable. Options are::\n `left`, `right`, `top` or `bottom`\n\n :attr:`sizable_from` is an :class:`~kivy.properties.OptionProperty`\n and defaults to `left`.\n '''\n\n strip_size = NumericProperty('10pt')\n '''Specifies the size of resize strip\n\n :attr:`strp_size` is a :class:`~kivy.properties.NumericProperty`\n defaults to `10pt`\n '''\n\n min_size = NumericProperty('100pt')\n '''Specifies the minimum size beyond which the widget is not resizable.\n\n :attr:`min_size` is a :class:`~kivy.properties.NumericProperty` and\n defaults to `100pt`.\n '''\n\n max_size = NumericProperty('500pt')\n '''Specifies the maximum size beyond which the widget is not resizable.\n\n :attr:`max_size` is a :class:`~kivy.properties.NumericProperty`\n and defaults to `500pt`.\n '''\n\n __events__ = ('on_press', 'on_release')\n\n def __init__(self, **kwargs):\n self._container = None\n self._strip = None\n super(Splitter, self).__init__(**kwargs)\n\n def on_sizable_from(self, instance, sizable_from):\n if not instance._container:\n return\n\n sup = super(Splitter, instance)\n _strp = instance._strip\n if _strp:\n # remove any previous binds\n _strp.unbind(on_touch_down=instance.strip_down)\n _strp.unbind(on_touch_move=instance.strip_move)\n _strp.unbind(on_touch_up=instance.strip_up)\n self.unbind(disabled=_strp.setter('disabled'))\n\n sup.remove_widget(instance._strip)\n else:\n cls = instance.strip_cls\n if isinstance(cls, string_types):\n cls = Factory.get(cls)\n instance._strip = _strp = cls()\n\n sz_frm = instance.sizable_from[0]\n if sz_frm in ('l', 'r'):\n _strp.size_hint = None, 1\n _strp.width = instance.strip_size\n instance.orientation = 'horizontal'\n instance.unbind(strip_size=_strp.setter('width'))\n instance.bind(strip_size=_strp.setter('width'))\n else:\n _strp.size_hint = 1, None\n _strp.height = instance.strip_size\n instance.orientation = 'vertical'\n instance.unbind(strip_size=_strp.setter('height'))\n instance.bind(strip_size=_strp.setter('height'))\n\n index = 1\n if sz_frm in ('r', 'b'):\n index = 0\n sup.add_widget(_strp, index)\n\n _strp.bind(on_touch_down=instance.strip_down)\n _strp.bind(on_touch_move=instance.strip_move)\n _strp.bind(on_touch_up=instance.strip_up)\n _strp.disabled = self.disabled\n self.bind(disabled=_strp.setter('disabled'))\n\n def add_widget(self, widget, index=0):\n if self._container or not widget:\n return Exception('Splitter accepts only one Child')\n self._container = widget\n sz_frm = self.sizable_from[0]\n if sz_frm in ('l', 'r'):\n widget.size_hint_x = 1\n else:\n widget.size_hint_y = 1\n\n index = 0\n if sz_frm in ('r', 'b'):\n index = 1\n super(Splitter, self).add_widget(widget, index)\n self.on_sizable_from(self, self.sizable_from)\n\n def remove_widget(self, widget, *largs):\n super(Splitter, self).remove_widget(widget)\n if widget == self._container:\n self._container = None\n\n def clear_widgets(self):\n self.remove_widget(self._container)\n\n def strip_down(self, instance, touch):\n if not instance.collide_point(*touch.pos):\n return False\n touch.grab(self)\n self.dispatch('on_press')\n\n def on_press(self):\n pass\n\n def strip_move(self, instance, touch):\n if touch.grab_current is not instance:\n return False\n sign = 1\n max_size = self.max_size\n min_size = self.min_size\n sz_frm = self.sizable_from[0]\n\n if sz_frm in ('t', 'b'):\n diff_y = (touch.dy)\n if sz_frm == 'b':\n sign = -1\n if self.size_hint_y:\n self.size_hint_y = None\n if self.height > 0:\n self.height += sign * diff_y\n else:\n self.height = 1\n\n height = self.height\n self.height = max(min_size, min(height, max_size))\n else:\n diff_x = (touch.dx)\n if sz_frm == 'l':\n sign = -1\n if self.size_hint_x:\n self.size_hint_x = None\n if self.width > 0:\n self.width += sign * (diff_x)\n else:\n self.width = 1\n\n width = self.width\n self.width = max(min_size, min(width, max_size))\n\n def strip_up(self, instance, touch):\n if touch.grab_current is not instance:\n return\n\n if touch.is_double_tap:\n max_size = self.max_size\n min_size = self.min_size\n sz_frm = self.sizable_from[0]\n s = self.size\n\n if sz_frm in ('t', 'b'):\n if self.size_hint_y:\n self.size_hint_y = None\n if s[1] - min_size <= max_size - s[1]:\n self.height = max_size\n else:\n self.height = min_size\n else:\n if self.size_hint_x:\n self.size_hint_x = None\n if s[0] - min_size <= max_size - s[0]:\n self.width = max_size\n else:\n self.width = min_size\n touch.ungrab(instance)\n self.dispatch('on_release')\n\n def on_release(self):\n pass\n\n\nif __name__ == '__main__':\n from kivy.app import App\n from kivy.uix.button import Button\n from kivy.uix.floatlayout import FloatLayout\n\n class SplitterApp(App):\n\n def build(self):\n root = FloatLayout()\n bx = BoxLayout()\n bx.add_widget(Button())\n bx.add_widget(Button())\n bx2 = BoxLayout()\n bx2.add_widget(Button())\n bx2.add_widget(Button())\n bx2.add_widget(Button())\n spl = Splitter(\n size_hint=(1, .25),\n pos_hint = {'top': 1},\n sizable_from = 'bottom')\n spl1 = Splitter(\n sizable_from='left',\n size_hint=(None, 1), width=90)\n spl1.add_widget(Button())\n bx.add_widget(spl1)\n spl.add_widget(bx)\n\n spl2 = Splitter(size_hint=(.25, 1))\n spl2.add_widget(bx2)\n spl2.sizable_from = 'right'\n root.add_widget(spl)\n root.add_widget(spl2)\n return root\n\n SplitterApp().run()\n", "path": "kivy/uix/splitter.py"}], "after_files": [{"content": "'''\nSplitter\n======\n\n.. versionadded:: 1.5.0\n\n.. image:: images/splitter.jpg\n :align: right\n\nThe :class:`Splitter` is a widget that helps you re-size it's child\nwidget/layout by letting you re-size it via\ndragging the boundary or double tapping the boundary. This widget is similar to\nthe :class:`~kivy.uix.scrollview.ScrollView` in that it allows only one child\nwidget.\n\nUsage::\n\n splitter = Splitter(sizable_from = 'right')\n splitter.add_widget(layout_or_widget_instance)\n splitter.min_size = 100\n splitter.max_size = 250\n\nTo change the size of the strip/border used for resizing::\n\n splitter.strip_size = '10pt'\n\nTo change it's appearance::\n\n splitter.strip_cls = your_custom_class\n\nYou can also change the appearance of the `strip_cls`, which defaults to\n:class:`SplitterStrip`, by overriding the `kv` rule in your app::\n\n <SplitterStrip>:\n horizontal: True if self.parent and self.parent.sizable_from[0] \\\nin ('t', 'b') else False\n background_normal: 'path to normal horizontal image' \\\nif self.horizontal else 'path to vertical normal image'\n background_down: 'path to pressed horizontal image' \\\nif self.horizontal else 'path to vertical pressed image'\n\n'''\n\n\n__all__ = ('Splitter', )\n\nfrom kivy.compat import string_types\nfrom kivy.factory import Factory\nfrom kivy.uix.button import Button\nfrom kivy.properties import (OptionProperty, NumericProperty, ObjectProperty,\n ListProperty)\nfrom kivy.uix.boxlayout import BoxLayout\n\n\nclass SplitterStrip(Button):\n '''Class used for tbe graphical representation of a\n :class:`kivy.uix.splitter.SplitterStripe`.\n '''\n pass\n\n\nclass Splitter(BoxLayout):\n '''See module documentation.\n\n :Events:\n `on_press`:\n Fired when the splitter is pressed.\n `on_release`:\n Fired when the splitter is released.\n\n .. versionchanged:: 1.6.0\n Added `on_press` and `on_release` events.\n\n '''\n\n border = ListProperty([4, 4, 4, 4])\n '''Border used for the\n :class:`~kivy.graphics.vertex_instructions.BorderImage`\n graphics instruction.\n\n This must be a list of four values: (top, right, bottom, left).\n Read the BorderImage instructions for more information about how\n to use it.\n\n :attr:`border` is a :class:`~kivy.properties.ListProperty` and\n defaults to (4, 4, 4, 4).\n '''\n\n strip_cls = ObjectProperty(SplitterStrip)\n '''Specifies the class of the resize Strip.\n\n :attr:`strip_cls` is an :class:`kivy.properties.ObjectProperty` and\n defaults to :class:`~kivy.uix.splitter.SplitterStrip`, which is of type\n :class:`~kivy.uix.button.Button`.\n\n .. versionchanged:: 1.8.0\n\n If you set a string, the :class:`~kivy.factory.Factory` will be used to\n resolve the class.\n\n '''\n\n sizable_from = OptionProperty('left', options=(\n 'left', 'right', 'top', 'bottom'))\n '''Specifies whether the widget is resizable. Options are::\n `left`, `right`, `top` or `bottom`\n\n :attr:`sizable_from` is an :class:`~kivy.properties.OptionProperty`\n and defaults to `left`.\n '''\n\n strip_size = NumericProperty('10pt')\n '''Specifies the size of resize strip\n\n :attr:`strp_size` is a :class:`~kivy.properties.NumericProperty`\n defaults to `10pt`\n '''\n\n min_size = NumericProperty('100pt')\n '''Specifies the minimum size beyond which the widget is not resizable.\n\n :attr:`min_size` is a :class:`~kivy.properties.NumericProperty` and\n defaults to `100pt`.\n '''\n\n max_size = NumericProperty('500pt')\n '''Specifies the maximum size beyond which the widget is not resizable.\n\n :attr:`max_size` is a :class:`~kivy.properties.NumericProperty`\n and defaults to `500pt`.\n '''\n\n __events__ = ('on_press', 'on_release')\n\n def __init__(self, **kwargs):\n self._container = None\n self._strip = None\n super(Splitter, self).__init__(**kwargs)\n self.bind(size=self._do_size,\n max_size=self._do_size,\n min_size=self._do_size)\n\n def on_sizable_from(self, instance, sizable_from):\n if not instance._container:\n return\n\n sup = super(Splitter, instance)\n _strp = instance._strip\n if _strp:\n # remove any previous binds\n _strp.unbind(on_touch_down=instance.strip_down)\n _strp.unbind(on_touch_move=instance.strip_move)\n _strp.unbind(on_touch_up=instance.strip_up)\n self.unbind(disabled=_strp.setter('disabled'))\n\n sup.remove_widget(instance._strip)\n else:\n cls = instance.strip_cls\n if isinstance(cls, string_types):\n cls = Factory.get(cls)\n instance._strip = _strp = cls()\n\n sz_frm = instance.sizable_from[0]\n if sz_frm in ('l', 'r'):\n _strp.size_hint = None, 1\n _strp.width = instance.strip_size\n instance.orientation = 'horizontal'\n instance.unbind(strip_size=_strp.setter('width'))\n instance.bind(strip_size=_strp.setter('width'))\n else:\n _strp.size_hint = 1, None\n _strp.height = instance.strip_size\n instance.orientation = 'vertical'\n instance.unbind(strip_size=_strp.setter('height'))\n instance.bind(strip_size=_strp.setter('height'))\n\n index = 1\n if sz_frm in ('r', 'b'):\n index = 0\n sup.add_widget(_strp, index)\n\n _strp.bind(on_touch_down=instance.strip_down)\n _strp.bind(on_touch_move=instance.strip_move)\n _strp.bind(on_touch_up=instance.strip_up)\n _strp.disabled = self.disabled\n self.bind(disabled=_strp.setter('disabled'))\n\n def add_widget(self, widget, index=0):\n if self._container or not widget:\n return Exception('Splitter accepts only one Child')\n self._container = widget\n sz_frm = self.sizable_from[0]\n if sz_frm in ('l', 'r'):\n widget.size_hint_x = 1\n else:\n widget.size_hint_y = 1\n\n index = 0\n if sz_frm in ('r', 'b'):\n index = 1\n super(Splitter, self).add_widget(widget, index)\n self.on_sizable_from(self, self.sizable_from)\n\n def remove_widget(self, widget, *largs):\n super(Splitter, self).remove_widget(widget)\n if widget == self._container:\n self._container = None\n\n def clear_widgets(self):\n self.remove_widget(self._container)\n\n def strip_down(self, instance, touch):\n if not instance.collide_point(*touch.pos):\n return False\n touch.grab(self)\n self.dispatch('on_press')\n\n def on_press(self):\n pass\n\n def _do_size(self, instance, value):\n if self.sizable_from[0] in ('l', 'r'):\n self.width = max(self.min_size, min(self.width, self.max_size))\n else:\n self.height = max(self.min_size, min(self.height, self.max_size))\n\n def strip_move(self, instance, touch):\n if touch.grab_current is not instance:\n return False\n sign = 1\n max_size = self.max_size\n min_size = self.min_size\n sz_frm = self.sizable_from[0]\n\n if sz_frm in ('t', 'b'):\n diff_y = (touch.dy)\n if sz_frm == 'b':\n sign = -1\n if self.size_hint_y:\n self.size_hint_y = None\n if self.height > 0:\n self.height += sign * diff_y\n else:\n self.height = 1\n\n height = self.height\n self.height = max(min_size, min(height, max_size))\n else:\n diff_x = (touch.dx)\n if sz_frm == 'l':\n sign = -1\n if self.size_hint_x:\n self.size_hint_x = None\n if self.width > 0:\n self.width += sign * (diff_x)\n else:\n self.width = 1\n\n width = self.width\n self.width = max(min_size, min(width, max_size))\n\n def strip_up(self, instance, touch):\n if touch.grab_current is not instance:\n return\n\n if touch.is_double_tap:\n max_size = self.max_size\n min_size = self.min_size\n sz_frm = self.sizable_from[0]\n s = self.size\n\n if sz_frm in ('t', 'b'):\n if self.size_hint_y:\n self.size_hint_y = None\n if s[1] - min_size <= max_size - s[1]:\n self.height = max_size\n else:\n self.height = min_size\n else:\n if self.size_hint_x:\n self.size_hint_x = None\n if s[0] - min_size <= max_size - s[0]:\n self.width = max_size\n else:\n self.width = min_size\n touch.ungrab(instance)\n self.dispatch('on_release')\n\n def on_release(self):\n pass\n\n\nif __name__ == '__main__':\n from kivy.app import App\n from kivy.uix.button import Button\n from kivy.uix.floatlayout import FloatLayout\n\n class SplitterApp(App):\n\n def build(self):\n root = FloatLayout()\n bx = BoxLayout()\n bx.add_widget(Button())\n bx.add_widget(Button())\n bx2 = BoxLayout()\n bx2.add_widget(Button())\n bx2.add_widget(Button())\n bx2.add_widget(Button())\n spl = Splitter(\n size_hint=(1, .25),\n pos_hint = {'top': 1},\n sizable_from = 'bottom')\n spl1 = Splitter(\n sizable_from='left',\n size_hint=(None, 1), width=90)\n spl1.add_widget(Button())\n bx.add_widget(spl1)\n spl.add_widget(bx)\n\n spl2 = Splitter(size_hint=(.25, 1))\n spl2.add_widget(bx2)\n spl2.sizable_from = 'right'\n root.add_widget(spl)\n root.add_widget(spl2)\n return root\n\n SplitterApp().run()\n", "path": "kivy/uix/splitter.py"}]}
| 3,670 | 252 |
gh_patches_debug_59105
|
rasdani/github-patches
|
git_diff
|
fedora-infra__bodhi-4220
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Bodhi doesn't correctly mark updates as critpath updates for Rawhide
Rawhide updates are never identified as critpath updates:
https://bodhi.fedoraproject.org/updates/FEDORA-2021-c8f0463820
https://bodhi.fedoraproject.org/updates/FEDORA-2021-d8b1203333
The problem is in the interface with PDC:
https://github.com/fedora-infra/bodhi/blob/d275cc2bb9e513dc10e2009262adc989e199dadf/bodhi/server/models.py#L2103
and
https://github.com/fedora-infra/bodhi/blob/d275cc2bb9e513dc10e2009262adc989e199dadf/bodhi/server/util.py#L959
We query PDC for "name=f34", but rawhide in PDC is always "name=master"...
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `bodhi/server/consumers/automatic_updates.py`
Content:
```
1 # Copyright © 2019 Red Hat, Inc. and others.
2 #
3 # This file is part of Bodhi.
4 #
5 # This program is free software; you can redistribute it and/or
6 # modify it under the terms of the GNU General Public License
7 # as published by the Free Software Foundation; either version 2
8 # of the License, or (at your option) any later version.
9 #
10 # This program is distributed in the hope that it will be useful,
11 # but WITHOUT ANY WARRANTY; without even the implied warranty of
12 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
13 # GNU General Public License for more details.
14 #
15 # You should have received a copy of the GNU General Public License along with
16 # this program; if not, write to the Free Software Foundation, Inc., 51
17 # Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
18 """
19 The Bodhi handler that creates updates automatically from tagged builds.
20
21 This module is responsible for the process of creating updates when builds are
22 tagged with certain tags.
23 """
24
25 import logging
26 import re
27
28 import fedora_messaging
29
30 from bodhi.server import buildsys
31 from bodhi.server.config import config
32 from bodhi.server.models import (
33 Bug, Build, ContentType, Package, Release, Update, UpdateStatus, UpdateType, User)
34 from bodhi.server.tasks import work_on_bugs_task
35 from bodhi.server.util import transactional_session_maker
36
37 log = logging.getLogger('bodhi')
38
39
40 class AutomaticUpdateHandler:
41 """
42 The Bodhi Automatic Update Handler.
43
44 A consumer that listens for messages about tagged builds and creates
45 updates from them.
46 """
47
48 def __init__(self, db_factory: transactional_session_maker = None):
49 """
50 Initialize the Automatic Update Handler.
51
52 Args:
53 db_factory: If given, used as the db_factory for this handler. If
54 None (the default), a new TransactionalSessionMaker is created and
55 used.
56 """
57 if not db_factory:
58 self.db_factory = transactional_session_maker()
59 else:
60 self.db_factory = db_factory
61
62 def __call__(self, message: fedora_messaging.api.Message) -> None:
63 """Create updates from appropriately tagged builds.
64
65 Args:
66 message: The message we are processing.
67 """
68 body = message.body
69
70 missing = []
71 for mandatory in ('tag', 'build_id', 'name', 'version', 'release'):
72 if mandatory not in body:
73 missing.append(mandatory)
74 if missing:
75 log.debug(f"Received incomplete tag message. Missing: {', '.join(missing)}")
76 return
77
78 btag = body['tag']
79 bnvr = '{name}-{version}-{release}'.format(**body)
80
81 koji = buildsys.get_session()
82
83 kbuildinfo = koji.getBuild(bnvr)
84 if not kbuildinfo:
85 log.debug(f"Can't find Koji build for {bnvr}.")
86 return
87
88 if 'nvr' not in kbuildinfo:
89 log.debug(f"Koji build info for {bnvr} doesn't contain 'nvr'.")
90 return
91
92 if 'owner_name' not in kbuildinfo:
93 log.debug(f"Koji build info for {bnvr} doesn't contain 'owner_name'.")
94 return
95
96 if kbuildinfo['owner_name'] in config.get('automatic_updates_blacklist'):
97 log.debug(f"{bnvr} owned by {kbuildinfo['owner_name']} who is listed in "
98 "automatic_updates_blacklist, skipping.")
99 return
100
101 # some APIs want the Koji build info, some others want the same
102 # wrapped in a larger (request?) structure
103 rbuildinfo = {
104 'info': kbuildinfo,
105 'nvr': kbuildinfo['nvr'].rsplit('-', 2),
106 }
107
108 with self.db_factory() as dbsession:
109 rel = dbsession.query(Release).filter_by(create_automatic_updates=True,
110 candidate_tag=btag).first()
111 if not rel:
112 log.debug(f"Ignoring build being tagged into {btag!r}, no release configured for "
113 "automatic updates for it found.")
114 return
115
116 bcls = ContentType.infer_content_class(Build, kbuildinfo)
117 build = bcls.get(bnvr)
118 if build and build.update:
119 log.info(f"Build, active update for {bnvr} exists already, skipping.")
120 return
121
122 if not build:
123 log.debug(f"Build for {bnvr} doesn't exist yet, creating.")
124
125 # Package.get_or_create() infers content type already
126 log.debug("Getting/creating related package object.")
127 pkg = Package.get_or_create(dbsession, rbuildinfo)
128
129 log.debug("Creating build object, adding it to the DB.")
130 build = bcls(nvr=bnvr, package=pkg, release=rel)
131 dbsession.add(build)
132
133 owner_name = kbuildinfo['owner_name']
134 user = User.get(owner_name)
135 if not user:
136 log.debug(f"Creating bodhi user for '{owner_name}'.")
137 # Leave email, groups blank, these will be filled
138 # in or updated when they log into Bodhi next time, see
139 # bodhi.server.security:remember_me().
140 user = User(name=owner_name)
141 dbsession.add(user)
142
143 log.debug(f"Creating new update for {bnvr}.")
144 try:
145 changelog = build.get_changelog(lastupdate=True)
146 except ValueError:
147 # Often due to bot-generated builds
148 # https://github.com/fedora-infra/bodhi/issues/4146
149 changelog = None
150 except Exception:
151 # Re-raise exception, so that the message can be re-queued
152 raise
153 closing_bugs = []
154 if changelog:
155 log.debug("Adding changelog to update notes.")
156 notes = f"""Automatic update for {bnvr}.
157
158 ##### **Changelog**
159
160 ```
161 {changelog}
162 ```"""
163
164 if rel.name not in config.get('bz_exclude_rels'):
165 for b in re.finditer(config.get('bz_regex'), changelog, re.IGNORECASE):
166 idx = int(b.group(1))
167 log.debug(f'Adding bug #{idx} to the update.')
168 bug = Bug.get(idx)
169 if bug is None:
170 bug = Bug(bug_id=idx)
171 dbsession.add(bug)
172 dbsession.flush()
173 if bug not in closing_bugs:
174 closing_bugs.append(bug)
175 else:
176 notes = f"Automatic update for {bnvr}."
177 update = Update(
178 release=rel,
179 builds=[build],
180 bugs=closing_bugs,
181 notes=notes,
182 type=UpdateType.unspecified,
183 stable_karma=3,
184 unstable_karma=-3,
185 autokarma=False,
186 user=user,
187 status=UpdateStatus.pending,
188 )
189
190 # Comment on the update that it was automatically created.
191 update.comment(
192 dbsession,
193 str("This update was automatically created"),
194 author="bodhi",
195 )
196
197 update.add_tag(update.release.pending_signing_tag)
198
199 log.debug("Adding new update to the database.")
200 dbsession.add(update)
201
202 log.debug("Flushing changes to the database.")
203 dbsession.flush()
204
205 # Obsolete older updates which may be stuck in testing due to failed gating
206 try:
207 update.obsolete_older_updates(dbsession)
208 except Exception as e:
209 log.error(f'Problem obsoleting older updates: {e}')
210
211 alias = update.alias
212 buglist = [b.bug_id for b in update.bugs]
213
214 # This must be run after dbsession is closed so changes are committed to db
215 work_on_bugs_task.delay(alias, buglist)
216
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/bodhi/server/consumers/automatic_updates.py b/bodhi/server/consumers/automatic_updates.py
--- a/bodhi/server/consumers/automatic_updates.py
+++ b/bodhi/server/consumers/automatic_updates.py
@@ -185,6 +185,7 @@
autokarma=False,
user=user,
status=UpdateStatus.pending,
+ critpath=Update.contains_critpath_component([build], rel.branch),
)
# Comment on the update that it was automatically created.
|
{"golden_diff": "diff --git a/bodhi/server/consumers/automatic_updates.py b/bodhi/server/consumers/automatic_updates.py\n--- a/bodhi/server/consumers/automatic_updates.py\n+++ b/bodhi/server/consumers/automatic_updates.py\n@@ -185,6 +185,7 @@\n autokarma=False,\n user=user,\n status=UpdateStatus.pending,\n+ critpath=Update.contains_critpath_component([build], rel.branch),\n )\n \n # Comment on the update that it was automatically created.\n", "issue": "Bodhi doesn't correctly mark updates as critpath updates for Rawhide\nRawhide updates are never identified as critpath updates:\r\nhttps://bodhi.fedoraproject.org/updates/FEDORA-2021-c8f0463820\r\nhttps://bodhi.fedoraproject.org/updates/FEDORA-2021-d8b1203333\r\n\r\nThe problem is in the interface with PDC:\r\nhttps://github.com/fedora-infra/bodhi/blob/d275cc2bb9e513dc10e2009262adc989e199dadf/bodhi/server/models.py#L2103\r\nand\r\nhttps://github.com/fedora-infra/bodhi/blob/d275cc2bb9e513dc10e2009262adc989e199dadf/bodhi/server/util.py#L959\r\n\r\nWe query PDC for \"name=f34\", but rawhide in PDC is always \"name=master\"...\n", "before_files": [{"content": "# Copyright \u00a9 2019 Red Hat, Inc. and others.\n#\n# This file is part of Bodhi.\n#\n# This program is free software; you can redistribute it and/or\n# modify it under the terms of the GNU General Public License\n# as published by the Free Software Foundation; either version 2\n# of the License, or (at your option) any later version.\n#\n# This program is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n# GNU General Public License for more details.\n#\n# You should have received a copy of the GNU General Public License along with\n# this program; if not, write to the Free Software Foundation, Inc., 51\n# Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.\n\"\"\"\nThe Bodhi handler that creates updates automatically from tagged builds.\n\nThis module is responsible for the process of creating updates when builds are\ntagged with certain tags.\n\"\"\"\n\nimport logging\nimport re\n\nimport fedora_messaging\n\nfrom bodhi.server import buildsys\nfrom bodhi.server.config import config\nfrom bodhi.server.models import (\n Bug, Build, ContentType, Package, Release, Update, UpdateStatus, UpdateType, User)\nfrom bodhi.server.tasks import work_on_bugs_task\nfrom bodhi.server.util import transactional_session_maker\n\nlog = logging.getLogger('bodhi')\n\n\nclass AutomaticUpdateHandler:\n \"\"\"\n The Bodhi Automatic Update Handler.\n\n A consumer that listens for messages about tagged builds and creates\n updates from them.\n \"\"\"\n\n def __init__(self, db_factory: transactional_session_maker = None):\n \"\"\"\n Initialize the Automatic Update Handler.\n\n Args:\n db_factory: If given, used as the db_factory for this handler. If\n None (the default), a new TransactionalSessionMaker is created and\n used.\n \"\"\"\n if not db_factory:\n self.db_factory = transactional_session_maker()\n else:\n self.db_factory = db_factory\n\n def __call__(self, message: fedora_messaging.api.Message) -> None:\n \"\"\"Create updates from appropriately tagged builds.\n\n Args:\n message: The message we are processing.\n \"\"\"\n body = message.body\n\n missing = []\n for mandatory in ('tag', 'build_id', 'name', 'version', 'release'):\n if mandatory not in body:\n missing.append(mandatory)\n if missing:\n log.debug(f\"Received incomplete tag message. Missing: {', '.join(missing)}\")\n return\n\n btag = body['tag']\n bnvr = '{name}-{version}-{release}'.format(**body)\n\n koji = buildsys.get_session()\n\n kbuildinfo = koji.getBuild(bnvr)\n if not kbuildinfo:\n log.debug(f\"Can't find Koji build for {bnvr}.\")\n return\n\n if 'nvr' not in kbuildinfo:\n log.debug(f\"Koji build info for {bnvr} doesn't contain 'nvr'.\")\n return\n\n if 'owner_name' not in kbuildinfo:\n log.debug(f\"Koji build info for {bnvr} doesn't contain 'owner_name'.\")\n return\n\n if kbuildinfo['owner_name'] in config.get('automatic_updates_blacklist'):\n log.debug(f\"{bnvr} owned by {kbuildinfo['owner_name']} who is listed in \"\n \"automatic_updates_blacklist, skipping.\")\n return\n\n # some APIs want the Koji build info, some others want the same\n # wrapped in a larger (request?) structure\n rbuildinfo = {\n 'info': kbuildinfo,\n 'nvr': kbuildinfo['nvr'].rsplit('-', 2),\n }\n\n with self.db_factory() as dbsession:\n rel = dbsession.query(Release).filter_by(create_automatic_updates=True,\n candidate_tag=btag).first()\n if not rel:\n log.debug(f\"Ignoring build being tagged into {btag!r}, no release configured for \"\n \"automatic updates for it found.\")\n return\n\n bcls = ContentType.infer_content_class(Build, kbuildinfo)\n build = bcls.get(bnvr)\n if build and build.update:\n log.info(f\"Build, active update for {bnvr} exists already, skipping.\")\n return\n\n if not build:\n log.debug(f\"Build for {bnvr} doesn't exist yet, creating.\")\n\n # Package.get_or_create() infers content type already\n log.debug(\"Getting/creating related package object.\")\n pkg = Package.get_or_create(dbsession, rbuildinfo)\n\n log.debug(\"Creating build object, adding it to the DB.\")\n build = bcls(nvr=bnvr, package=pkg, release=rel)\n dbsession.add(build)\n\n owner_name = kbuildinfo['owner_name']\n user = User.get(owner_name)\n if not user:\n log.debug(f\"Creating bodhi user for '{owner_name}'.\")\n # Leave email, groups blank, these will be filled\n # in or updated when they log into Bodhi next time, see\n # bodhi.server.security:remember_me().\n user = User(name=owner_name)\n dbsession.add(user)\n\n log.debug(f\"Creating new update for {bnvr}.\")\n try:\n changelog = build.get_changelog(lastupdate=True)\n except ValueError:\n # Often due to bot-generated builds\n # https://github.com/fedora-infra/bodhi/issues/4146\n changelog = None\n except Exception:\n # Re-raise exception, so that the message can be re-queued\n raise\n closing_bugs = []\n if changelog:\n log.debug(\"Adding changelog to update notes.\")\n notes = f\"\"\"Automatic update for {bnvr}.\n\n##### **Changelog**\n\n```\n{changelog}\n```\"\"\"\n\n if rel.name not in config.get('bz_exclude_rels'):\n for b in re.finditer(config.get('bz_regex'), changelog, re.IGNORECASE):\n idx = int(b.group(1))\n log.debug(f'Adding bug #{idx} to the update.')\n bug = Bug.get(idx)\n if bug is None:\n bug = Bug(bug_id=idx)\n dbsession.add(bug)\n dbsession.flush()\n if bug not in closing_bugs:\n closing_bugs.append(bug)\n else:\n notes = f\"Automatic update for {bnvr}.\"\n update = Update(\n release=rel,\n builds=[build],\n bugs=closing_bugs,\n notes=notes,\n type=UpdateType.unspecified,\n stable_karma=3,\n unstable_karma=-3,\n autokarma=False,\n user=user,\n status=UpdateStatus.pending,\n )\n\n # Comment on the update that it was automatically created.\n update.comment(\n dbsession,\n str(\"This update was automatically created\"),\n author=\"bodhi\",\n )\n\n update.add_tag(update.release.pending_signing_tag)\n\n log.debug(\"Adding new update to the database.\")\n dbsession.add(update)\n\n log.debug(\"Flushing changes to the database.\")\n dbsession.flush()\n\n # Obsolete older updates which may be stuck in testing due to failed gating\n try:\n update.obsolete_older_updates(dbsession)\n except Exception as e:\n log.error(f'Problem obsoleting older updates: {e}')\n\n alias = update.alias\n buglist = [b.bug_id for b in update.bugs]\n\n # This must be run after dbsession is closed so changes are committed to db\n work_on_bugs_task.delay(alias, buglist)\n", "path": "bodhi/server/consumers/automatic_updates.py"}], "after_files": [{"content": "# Copyright \u00a9 2019 Red Hat, Inc. and others.\n#\n# This file is part of Bodhi.\n#\n# This program is free software; you can redistribute it and/or\n# modify it under the terms of the GNU General Public License\n# as published by the Free Software Foundation; either version 2\n# of the License, or (at your option) any later version.\n#\n# This program is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n# GNU General Public License for more details.\n#\n# You should have received a copy of the GNU General Public License along with\n# this program; if not, write to the Free Software Foundation, Inc., 51\n# Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.\n\"\"\"\nThe Bodhi handler that creates updates automatically from tagged builds.\n\nThis module is responsible for the process of creating updates when builds are\ntagged with certain tags.\n\"\"\"\n\nimport logging\nimport re\n\nimport fedora_messaging\n\nfrom bodhi.server import buildsys\nfrom bodhi.server.config import config\nfrom bodhi.server.models import (\n Bug, Build, ContentType, Package, Release, Update, UpdateStatus, UpdateType, User)\nfrom bodhi.server.tasks import work_on_bugs_task\nfrom bodhi.server.util import transactional_session_maker\n\nlog = logging.getLogger('bodhi')\n\n\nclass AutomaticUpdateHandler:\n \"\"\"\n The Bodhi Automatic Update Handler.\n\n A consumer that listens for messages about tagged builds and creates\n updates from them.\n \"\"\"\n\n def __init__(self, db_factory: transactional_session_maker = None):\n \"\"\"\n Initialize the Automatic Update Handler.\n\n Args:\n db_factory: If given, used as the db_factory for this handler. If\n None (the default), a new TransactionalSessionMaker is created and\n used.\n \"\"\"\n if not db_factory:\n self.db_factory = transactional_session_maker()\n else:\n self.db_factory = db_factory\n\n def __call__(self, message: fedora_messaging.api.Message) -> None:\n \"\"\"Create updates from appropriately tagged builds.\n\n Args:\n message: The message we are processing.\n \"\"\"\n body = message.body\n\n missing = []\n for mandatory in ('tag', 'build_id', 'name', 'version', 'release'):\n if mandatory not in body:\n missing.append(mandatory)\n if missing:\n log.debug(f\"Received incomplete tag message. Missing: {', '.join(missing)}\")\n return\n\n btag = body['tag']\n bnvr = '{name}-{version}-{release}'.format(**body)\n\n koji = buildsys.get_session()\n\n kbuildinfo = koji.getBuild(bnvr)\n if not kbuildinfo:\n log.debug(f\"Can't find Koji build for {bnvr}.\")\n return\n\n if 'nvr' not in kbuildinfo:\n log.debug(f\"Koji build info for {bnvr} doesn't contain 'nvr'.\")\n return\n\n if 'owner_name' not in kbuildinfo:\n log.debug(f\"Koji build info for {bnvr} doesn't contain 'owner_name'.\")\n return\n\n if kbuildinfo['owner_name'] in config.get('automatic_updates_blacklist'):\n log.debug(f\"{bnvr} owned by {kbuildinfo['owner_name']} who is listed in \"\n \"automatic_updates_blacklist, skipping.\")\n return\n\n # some APIs want the Koji build info, some others want the same\n # wrapped in a larger (request?) structure\n rbuildinfo = {\n 'info': kbuildinfo,\n 'nvr': kbuildinfo['nvr'].rsplit('-', 2),\n }\n\n with self.db_factory() as dbsession:\n rel = dbsession.query(Release).filter_by(create_automatic_updates=True,\n candidate_tag=btag).first()\n if not rel:\n log.debug(f\"Ignoring build being tagged into {btag!r}, no release configured for \"\n \"automatic updates for it found.\")\n return\n\n bcls = ContentType.infer_content_class(Build, kbuildinfo)\n build = bcls.get(bnvr)\n if build and build.update:\n log.info(f\"Build, active update for {bnvr} exists already, skipping.\")\n return\n\n if not build:\n log.debug(f\"Build for {bnvr} doesn't exist yet, creating.\")\n\n # Package.get_or_create() infers content type already\n log.debug(\"Getting/creating related package object.\")\n pkg = Package.get_or_create(dbsession, rbuildinfo)\n\n log.debug(\"Creating build object, adding it to the DB.\")\n build = bcls(nvr=bnvr, package=pkg, release=rel)\n dbsession.add(build)\n\n owner_name = kbuildinfo['owner_name']\n user = User.get(owner_name)\n if not user:\n log.debug(f\"Creating bodhi user for '{owner_name}'.\")\n # Leave email, groups blank, these will be filled\n # in or updated when they log into Bodhi next time, see\n # bodhi.server.security:remember_me().\n user = User(name=owner_name)\n dbsession.add(user)\n\n log.debug(f\"Creating new update for {bnvr}.\")\n try:\n changelog = build.get_changelog(lastupdate=True)\n except ValueError:\n # Often due to bot-generated builds\n # https://github.com/fedora-infra/bodhi/issues/4146\n changelog = None\n except Exception:\n # Re-raise exception, so that the message can be re-queued\n raise\n closing_bugs = []\n if changelog:\n log.debug(\"Adding changelog to update notes.\")\n notes = f\"\"\"Automatic update for {bnvr}.\n\n##### **Changelog**\n\n```\n{changelog}\n```\"\"\"\n\n if rel.name not in config.get('bz_exclude_rels'):\n for b in re.finditer(config.get('bz_regex'), changelog, re.IGNORECASE):\n idx = int(b.group(1))\n log.debug(f'Adding bug #{idx} to the update.')\n bug = Bug.get(idx)\n if bug is None:\n bug = Bug(bug_id=idx)\n dbsession.add(bug)\n dbsession.flush()\n if bug not in closing_bugs:\n closing_bugs.append(bug)\n else:\n notes = f\"Automatic update for {bnvr}.\"\n update = Update(\n release=rel,\n builds=[build],\n bugs=closing_bugs,\n notes=notes,\n type=UpdateType.unspecified,\n stable_karma=3,\n unstable_karma=-3,\n autokarma=False,\n user=user,\n status=UpdateStatus.pending,\n critpath=Update.contains_critpath_component([build], rel.branch),\n )\n\n # Comment on the update that it was automatically created.\n update.comment(\n dbsession,\n str(\"This update was automatically created\"),\n author=\"bodhi\",\n )\n\n update.add_tag(update.release.pending_signing_tag)\n\n log.debug(\"Adding new update to the database.\")\n dbsession.add(update)\n\n log.debug(\"Flushing changes to the database.\")\n dbsession.flush()\n\n # Obsolete older updates which may be stuck in testing due to failed gating\n try:\n update.obsolete_older_updates(dbsession)\n except Exception as e:\n log.error(f'Problem obsoleting older updates: {e}')\n\n alias = update.alias\n buglist = [b.bug_id for b in update.bugs]\n\n # This must be run after dbsession is closed so changes are committed to db\n work_on_bugs_task.delay(alias, buglist)\n", "path": "bodhi/server/consumers/automatic_updates.py"}]}
| 2,741 | 118 |
gh_patches_debug_5776
|
rasdani/github-patches
|
git_diff
|
ibis-project__ibis-6327
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
bug: `pivot_longer` doesn't work with field names containing spaces
### What happened?
when using pivot_longer, the names_to field and values_to arguments can't have a space in them
### What version of ibis are you using?
5.1.0
### What backend(s) are you using, if any?
DuckDB
### Relevant log output
```sh
>>> import ibis
>>> import ibis.selectors as s
>>> from ibis import _
>>> ibis.options.interactive = True
>>> relig_income = ibis.examples.relig_income_raw.fetch()
>>> relig_income.pivot_longer(~s.c("religion"), names_to="in come", values_to="my count")
This produces a ParserException: Parser Error: syntax error at or near "in"
LINE 2: ...e" := ?, "my count" := t1."$100-150k"), struct_pack("in come" := ?, "my count"...
This can be fixed by removing the space, but would be nice to at least fix the error message or documentation to say there should be no spaces in the names and values
```
### Code of Conduct
- [X] I agree to follow this project's Code of Conduct
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `ibis/backends/base/sql/alchemy/datatypes.py`
Content:
```
1 from __future__ import annotations
2
3 from typing import Callable, Mapping, Optional
4
5 import sqlalchemy as sa
6 import sqlalchemy.types as sat
7 import toolz
8 from sqlalchemy.ext.compiler import compiles
9
10 import ibis.expr.datatypes as dt
11 from ibis.backends.base.sql.alchemy.geospatial import geospatial_supported
12 from ibis.common.collections import FrozenDict
13
14 if geospatial_supported:
15 import geoalchemy2 as ga
16
17
18 class ArrayType(sat.UserDefinedType):
19 def __init__(self, value_type: sat.TypeEngine):
20 self.value_type = sat.to_instance(value_type)
21
22 def result_processor(self, dialect, coltype) -> None:
23 if not coltype.lower().startswith("array"):
24 return None
25
26 inner_processor = (
27 self.value_type.result_processor(dialect, coltype[len("array(") : -1])
28 or toolz.identity
29 )
30
31 return lambda v: v if v is None else list(map(inner_processor, v))
32
33
34 @compiles(ArrayType, "default")
35 def compiles_array(element, compiler, **kw):
36 return f"ARRAY({compiler.process(element.value_type, **kw)})"
37
38
39 class StructType(sat.UserDefinedType):
40 cache_ok = True
41
42 def __init__(self, fields: Mapping[str, sat.TypeEngine]) -> None:
43 self.fields = FrozenDict(
44 {name: sat.to_instance(typ) for name, typ in fields.items()}
45 )
46
47
48 @compiles(StructType, "default")
49 def compiles_struct(element, compiler, **kw):
50 content = ", ".join(
51 f"{field} {compiler.process(typ, **kw)}"
52 for field, typ in element.fields.items()
53 )
54 return f"STRUCT({content})"
55
56
57 class MapType(sat.UserDefinedType):
58 def __init__(self, key_type: sat.TypeEngine, value_type: sat.TypeEngine):
59 self.key_type = sat.to_instance(key_type)
60 self.value_type = sat.to_instance(value_type)
61
62
63 @compiles(MapType, "default")
64 def compiles_map(element, compiler, **kw):
65 key_type = compiler.process(element.key_type, **kw)
66 value_type = compiler.process(element.value_type, **kw)
67 return f"MAP({key_type}, {value_type})"
68
69
70 class UInt64(sat.Integer):
71 pass
72
73
74 class UInt32(sat.Integer):
75 pass
76
77
78 class UInt16(sat.Integer):
79 pass
80
81
82 class UInt8(sat.Integer):
83 pass
84
85
86 @compiles(UInt64, "postgresql")
87 @compiles(UInt32, "postgresql")
88 @compiles(UInt16, "postgresql")
89 @compiles(UInt8, "postgresql")
90 @compiles(UInt64, "mssql")
91 @compiles(UInt32, "mssql")
92 @compiles(UInt16, "mssql")
93 @compiles(UInt8, "mssql")
94 @compiles(UInt64, "mysql")
95 @compiles(UInt32, "mysql")
96 @compiles(UInt16, "mysql")
97 @compiles(UInt8, "mysql")
98 @compiles(UInt64, "snowflake")
99 @compiles(UInt32, "snowflake")
100 @compiles(UInt16, "snowflake")
101 @compiles(UInt8, "snowflake")
102 @compiles(UInt64, "sqlite")
103 @compiles(UInt32, "sqlite")
104 @compiles(UInt16, "sqlite")
105 @compiles(UInt8, "sqlite")
106 @compiles(UInt64, "trino")
107 @compiles(UInt32, "trino")
108 @compiles(UInt16, "trino")
109 @compiles(UInt8, "trino")
110 def compile_uint(element, compiler, **kw):
111 dialect_name = compiler.dialect.name
112 raise TypeError(
113 f"unsigned integers are not supported in the {dialect_name} backend"
114 )
115
116
117 try:
118 UUID = sat.UUID
119 except AttributeError:
120
121 class UUID(sat.String):
122 pass
123
124 else:
125
126 @compiles(UUID, "default")
127 def compiles_uuid(element, compiler, **kw):
128 return "UUID"
129
130
131 class Unknown(sa.Text):
132 pass
133
134
135 _from_sqlalchemy_types = {
136 sat.BOOLEAN: dt.Boolean,
137 sat.Boolean: dt.Boolean,
138 sat.BINARY: dt.Binary,
139 sat.LargeBinary: dt.Binary,
140 sat.DATE: dt.Date,
141 sat.Date: dt.Date,
142 sat.TEXT: dt.String,
143 sat.Text: dt.String,
144 sat.TIME: dt.Time,
145 sat.Time: dt.Time,
146 sat.VARCHAR: dt.String,
147 sat.CHAR: dt.String,
148 sat.String: dt.String,
149 sat.SMALLINT: dt.Int16,
150 sat.SmallInteger: dt.Int16,
151 sat.INTEGER: dt.Int32,
152 sat.Integer: dt.Int32,
153 sat.BIGINT: dt.Int64,
154 sat.BigInteger: dt.Int64,
155 sat.REAL: dt.Float32,
156 sat.FLOAT: dt.Float64,
157 UInt16: dt.UInt16,
158 UInt32: dt.UInt32,
159 UInt64: dt.UInt64,
160 UInt8: dt.UInt8,
161 Unknown: dt.Unknown,
162 sat.JSON: dt.JSON,
163 UUID: dt.UUID,
164 }
165
166 _to_sqlalchemy_types = {
167 dt.Null: sat.NullType,
168 dt.Date: sat.Date,
169 dt.Time: sat.Time,
170 dt.Boolean: sat.Boolean,
171 dt.Binary: sat.LargeBinary,
172 dt.String: sat.Text,
173 dt.Decimal: sat.Numeric,
174 # Mantissa-based
175 dt.Float16: sat.REAL,
176 dt.Float32: sat.REAL,
177 # precision is the number of bits in the mantissa
178 # without specifying this, some backends interpret the type as FLOAT, which
179 # means float32 (and precision == 24)
180 dt.Float64: sat.FLOAT(precision=53),
181 dt.Int8: sat.SmallInteger,
182 dt.Int16: sat.SmallInteger,
183 dt.Int32: sat.Integer,
184 dt.Int64: sat.BigInteger,
185 dt.UInt8: UInt8,
186 dt.UInt16: UInt16,
187 dt.UInt32: UInt32,
188 dt.UInt64: UInt64,
189 dt.JSON: sat.JSON,
190 dt.Interval: sat.Interval,
191 dt.Unknown: Unknown,
192 dt.MACADDR: sat.Text,
193 dt.INET: sat.Text,
194 dt.UUID: UUID,
195 }
196
197
198 def dtype_to_sqlalchemy(
199 dtype: dt.DataType,
200 converter: Optional[Callable[[dt.DataType], sa.TypeEngine]] = None,
201 ):
202 """Convert an Ibis type to a SQLAlchemy type.
203
204 Parameters
205 ----------
206 dtype
207 Ibis type to convert.
208 converter
209 Converter function to use for nested types. If not provided, this function
210 will be used recursively. Should only be used when defining new converter for
211 dialects.
212
213 Returns
214 -------
215 sa.TypeEngine
216 """
217 convert = converter or dtype_to_sqlalchemy
218
219 if dtype.is_decimal():
220 return sat.NUMERIC(dtype.precision, dtype.scale)
221 elif dtype.is_timestamp():
222 return sat.TIMESTAMP(timezone=bool(dtype.timezone))
223 elif dtype.is_array():
224 return ArrayType(convert(dtype.value_type))
225 elif dtype.is_struct():
226 fields = {k: convert(v) for k, v in dtype.fields.items()}
227 return StructType(fields)
228 elif dtype.is_map():
229 return MapType(convert(dtype.key_type), convert(dtype.value_type))
230 elif dtype.is_geospatial():
231 if geospatial_supported:
232 if dtype.geotype == 'geometry':
233 return ga.Geometry
234 elif dtype.geotype == 'geography':
235 return ga.Geography
236 else:
237 return ga.types._GISType
238 else:
239 raise TypeError("geospatial types are not supported")
240 else:
241 return _to_sqlalchemy_types[type(dtype)]
242
243
244 _FLOAT_PREC_TO_TYPE = {
245 11: dt.Float16,
246 24: dt.Float32,
247 53: dt.Float64,
248 }
249
250 _GEOSPATIAL_TYPES = {
251 "POINT": dt.Point,
252 "LINESTRING": dt.LineString,
253 "POLYGON": dt.Polygon,
254 "MULTILINESTRING": dt.MultiLineString,
255 "MULTIPOINT": dt.MultiPoint,
256 "MULTIPOLYGON": dt.MultiPolygon,
257 "GEOMETRY": dt.Geometry,
258 "GEOGRAPHY": dt.Geography,
259 }
260
261
262 def dtype_from_sqlalchemy(typ, nullable=True, converter=None):
263 """Convert a SQLAlchemy type to an Ibis type.
264
265 Parameters
266 ----------
267 typ
268 SQLAlchemy type to convert.
269 nullable : bool, optional
270 Whether the returned type should be nullable.
271 converter
272 Converter function to use for nested types. If not provided, this function
273 will be used recursively. Should only be used when defining new converter for
274 dialects.
275
276 Returns
277 -------
278 dt.DataType
279 """
280 convert = converter or dtype_from_sqlalchemy
281
282 if dtype := _from_sqlalchemy_types.get(type(typ)):
283 return dtype(nullable=nullable)
284 elif isinstance(typ, sat.Float):
285 if (float_typ := _FLOAT_PREC_TO_TYPE.get(typ.precision)) is not None:
286 return float_typ(nullable=nullable)
287 return dt.Decimal(typ.precision, typ.scale, nullable=nullable)
288 elif isinstance(typ, sat.Numeric):
289 return dt.Decimal(typ.precision, typ.scale, nullable=nullable)
290 elif isinstance(typ, ArrayType):
291 return dt.Array(convert(typ.value_type), nullable=nullable)
292 elif isinstance(typ, sat.ARRAY):
293 ndim = typ.dimensions
294 if ndim is not None and ndim != 1:
295 raise NotImplementedError("Nested array types not yet supported")
296 return dt.Array(convert(typ.item_type), nullable=nullable)
297 elif isinstance(typ, StructType):
298 fields = {k: convert(v) for k, v in typ.fields.items()}
299 return dt.Struct(fields, nullable=nullable)
300 elif isinstance(typ, MapType):
301 return dt.Map(convert(typ.key_type), convert(typ.value_type), nullable=nullable)
302 elif isinstance(typ, sa.DateTime):
303 timezone = "UTC" if typ.timezone else None
304 return dt.Timestamp(timezone, nullable=nullable)
305 elif geospatial_supported and isinstance(typ, ga.types._GISType):
306 name = typ.geometry_type.upper()
307 try:
308 return _GEOSPATIAL_TYPES[name](geotype=typ.name, nullable=nullable)
309 except KeyError:
310 raise ValueError(f"Unrecognized geometry type: {name}")
311 else:
312 raise TypeError(f"Unable to convert type: {typ!r}")
313
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/ibis/backends/base/sql/alchemy/datatypes.py b/ibis/backends/base/sql/alchemy/datatypes.py
--- a/ibis/backends/base/sql/alchemy/datatypes.py
+++ b/ibis/backends/base/sql/alchemy/datatypes.py
@@ -47,8 +47,9 @@
@compiles(StructType, "default")
def compiles_struct(element, compiler, **kw):
+ quote = compiler.dialect.identifier_preparer.quote
content = ", ".join(
- f"{field} {compiler.process(typ, **kw)}"
+ f"{quote(field)} {compiler.process(typ, **kw)}"
for field, typ in element.fields.items()
)
return f"STRUCT({content})"
|
{"golden_diff": "diff --git a/ibis/backends/base/sql/alchemy/datatypes.py b/ibis/backends/base/sql/alchemy/datatypes.py\n--- a/ibis/backends/base/sql/alchemy/datatypes.py\n+++ b/ibis/backends/base/sql/alchemy/datatypes.py\n@@ -47,8 +47,9 @@\n \n @compiles(StructType, \"default\")\n def compiles_struct(element, compiler, **kw):\n+ quote = compiler.dialect.identifier_preparer.quote\n content = \", \".join(\n- f\"{field} {compiler.process(typ, **kw)}\"\n+ f\"{quote(field)} {compiler.process(typ, **kw)}\"\n for field, typ in element.fields.items()\n )\n return f\"STRUCT({content})\"\n", "issue": "bug: `pivot_longer` doesn't work with field names containing spaces\n### What happened?\n\nwhen using pivot_longer, the names_to field and values_to arguments can't have a space in them\n\n### What version of ibis are you using?\n\n5.1.0\n\n### What backend(s) are you using, if any?\n\nDuckDB\n\n### Relevant log output\n\n```sh\n>>> import ibis\r\n>>> import ibis.selectors as s\r\n>>> from ibis import _\r\n>>> ibis.options.interactive = True\r\n>>> relig_income = ibis.examples.relig_income_raw.fetch()\r\n>>> relig_income.pivot_longer(~s.c(\"religion\"), names_to=\"in come\", values_to=\"my count\")\r\n\r\nThis produces a ParserException: Parser Error: syntax error at or near \"in\"\r\nLINE 2: ...e\" := ?, \"my count\" := t1.\"$100-150k\"), struct_pack(\"in come\" := ?, \"my count\"...\r\n\r\nThis can be fixed by removing the space, but would be nice to at least fix the error message or documentation to say there should be no spaces in the names and values\n```\n\n\n### Code of Conduct\n\n- [X] I agree to follow this project's Code of Conduct\n", "before_files": [{"content": "from __future__ import annotations\n\nfrom typing import Callable, Mapping, Optional\n\nimport sqlalchemy as sa\nimport sqlalchemy.types as sat\nimport toolz\nfrom sqlalchemy.ext.compiler import compiles\n\nimport ibis.expr.datatypes as dt\nfrom ibis.backends.base.sql.alchemy.geospatial import geospatial_supported\nfrom ibis.common.collections import FrozenDict\n\nif geospatial_supported:\n import geoalchemy2 as ga\n\n\nclass ArrayType(sat.UserDefinedType):\n def __init__(self, value_type: sat.TypeEngine):\n self.value_type = sat.to_instance(value_type)\n\n def result_processor(self, dialect, coltype) -> None:\n if not coltype.lower().startswith(\"array\"):\n return None\n\n inner_processor = (\n self.value_type.result_processor(dialect, coltype[len(\"array(\") : -1])\n or toolz.identity\n )\n\n return lambda v: v if v is None else list(map(inner_processor, v))\n\n\n@compiles(ArrayType, \"default\")\ndef compiles_array(element, compiler, **kw):\n return f\"ARRAY({compiler.process(element.value_type, **kw)})\"\n\n\nclass StructType(sat.UserDefinedType):\n cache_ok = True\n\n def __init__(self, fields: Mapping[str, sat.TypeEngine]) -> None:\n self.fields = FrozenDict(\n {name: sat.to_instance(typ) for name, typ in fields.items()}\n )\n\n\n@compiles(StructType, \"default\")\ndef compiles_struct(element, compiler, **kw):\n content = \", \".join(\n f\"{field} {compiler.process(typ, **kw)}\"\n for field, typ in element.fields.items()\n )\n return f\"STRUCT({content})\"\n\n\nclass MapType(sat.UserDefinedType):\n def __init__(self, key_type: sat.TypeEngine, value_type: sat.TypeEngine):\n self.key_type = sat.to_instance(key_type)\n self.value_type = sat.to_instance(value_type)\n\n\n@compiles(MapType, \"default\")\ndef compiles_map(element, compiler, **kw):\n key_type = compiler.process(element.key_type, **kw)\n value_type = compiler.process(element.value_type, **kw)\n return f\"MAP({key_type}, {value_type})\"\n\n\nclass UInt64(sat.Integer):\n pass\n\n\nclass UInt32(sat.Integer):\n pass\n\n\nclass UInt16(sat.Integer):\n pass\n\n\nclass UInt8(sat.Integer):\n pass\n\n\n@compiles(UInt64, \"postgresql\")\n@compiles(UInt32, \"postgresql\")\n@compiles(UInt16, \"postgresql\")\n@compiles(UInt8, \"postgresql\")\n@compiles(UInt64, \"mssql\")\n@compiles(UInt32, \"mssql\")\n@compiles(UInt16, \"mssql\")\n@compiles(UInt8, \"mssql\")\n@compiles(UInt64, \"mysql\")\n@compiles(UInt32, \"mysql\")\n@compiles(UInt16, \"mysql\")\n@compiles(UInt8, \"mysql\")\n@compiles(UInt64, \"snowflake\")\n@compiles(UInt32, \"snowflake\")\n@compiles(UInt16, \"snowflake\")\n@compiles(UInt8, \"snowflake\")\n@compiles(UInt64, \"sqlite\")\n@compiles(UInt32, \"sqlite\")\n@compiles(UInt16, \"sqlite\")\n@compiles(UInt8, \"sqlite\")\n@compiles(UInt64, \"trino\")\n@compiles(UInt32, \"trino\")\n@compiles(UInt16, \"trino\")\n@compiles(UInt8, \"trino\")\ndef compile_uint(element, compiler, **kw):\n dialect_name = compiler.dialect.name\n raise TypeError(\n f\"unsigned integers are not supported in the {dialect_name} backend\"\n )\n\n\ntry:\n UUID = sat.UUID\nexcept AttributeError:\n\n class UUID(sat.String):\n pass\n\nelse:\n\n @compiles(UUID, \"default\")\n def compiles_uuid(element, compiler, **kw):\n return \"UUID\"\n\n\nclass Unknown(sa.Text):\n pass\n\n\n_from_sqlalchemy_types = {\n sat.BOOLEAN: dt.Boolean,\n sat.Boolean: dt.Boolean,\n sat.BINARY: dt.Binary,\n sat.LargeBinary: dt.Binary,\n sat.DATE: dt.Date,\n sat.Date: dt.Date,\n sat.TEXT: dt.String,\n sat.Text: dt.String,\n sat.TIME: dt.Time,\n sat.Time: dt.Time,\n sat.VARCHAR: dt.String,\n sat.CHAR: dt.String,\n sat.String: dt.String,\n sat.SMALLINT: dt.Int16,\n sat.SmallInteger: dt.Int16,\n sat.INTEGER: dt.Int32,\n sat.Integer: dt.Int32,\n sat.BIGINT: dt.Int64,\n sat.BigInteger: dt.Int64,\n sat.REAL: dt.Float32,\n sat.FLOAT: dt.Float64,\n UInt16: dt.UInt16,\n UInt32: dt.UInt32,\n UInt64: dt.UInt64,\n UInt8: dt.UInt8,\n Unknown: dt.Unknown,\n sat.JSON: dt.JSON,\n UUID: dt.UUID,\n}\n\n_to_sqlalchemy_types = {\n dt.Null: sat.NullType,\n dt.Date: sat.Date,\n dt.Time: sat.Time,\n dt.Boolean: sat.Boolean,\n dt.Binary: sat.LargeBinary,\n dt.String: sat.Text,\n dt.Decimal: sat.Numeric,\n # Mantissa-based\n dt.Float16: sat.REAL,\n dt.Float32: sat.REAL,\n # precision is the number of bits in the mantissa\n # without specifying this, some backends interpret the type as FLOAT, which\n # means float32 (and precision == 24)\n dt.Float64: sat.FLOAT(precision=53),\n dt.Int8: sat.SmallInteger,\n dt.Int16: sat.SmallInteger,\n dt.Int32: sat.Integer,\n dt.Int64: sat.BigInteger,\n dt.UInt8: UInt8,\n dt.UInt16: UInt16,\n dt.UInt32: UInt32,\n dt.UInt64: UInt64,\n dt.JSON: sat.JSON,\n dt.Interval: sat.Interval,\n dt.Unknown: Unknown,\n dt.MACADDR: sat.Text,\n dt.INET: sat.Text,\n dt.UUID: UUID,\n}\n\n\ndef dtype_to_sqlalchemy(\n dtype: dt.DataType,\n converter: Optional[Callable[[dt.DataType], sa.TypeEngine]] = None,\n):\n \"\"\"Convert an Ibis type to a SQLAlchemy type.\n\n Parameters\n ----------\n dtype\n Ibis type to convert.\n converter\n Converter function to use for nested types. If not provided, this function\n will be used recursively. Should only be used when defining new converter for\n dialects.\n\n Returns\n -------\n sa.TypeEngine\n \"\"\"\n convert = converter or dtype_to_sqlalchemy\n\n if dtype.is_decimal():\n return sat.NUMERIC(dtype.precision, dtype.scale)\n elif dtype.is_timestamp():\n return sat.TIMESTAMP(timezone=bool(dtype.timezone))\n elif dtype.is_array():\n return ArrayType(convert(dtype.value_type))\n elif dtype.is_struct():\n fields = {k: convert(v) for k, v in dtype.fields.items()}\n return StructType(fields)\n elif dtype.is_map():\n return MapType(convert(dtype.key_type), convert(dtype.value_type))\n elif dtype.is_geospatial():\n if geospatial_supported:\n if dtype.geotype == 'geometry':\n return ga.Geometry\n elif dtype.geotype == 'geography':\n return ga.Geography\n else:\n return ga.types._GISType\n else:\n raise TypeError(\"geospatial types are not supported\")\n else:\n return _to_sqlalchemy_types[type(dtype)]\n\n\n_FLOAT_PREC_TO_TYPE = {\n 11: dt.Float16,\n 24: dt.Float32,\n 53: dt.Float64,\n}\n\n_GEOSPATIAL_TYPES = {\n \"POINT\": dt.Point,\n \"LINESTRING\": dt.LineString,\n \"POLYGON\": dt.Polygon,\n \"MULTILINESTRING\": dt.MultiLineString,\n \"MULTIPOINT\": dt.MultiPoint,\n \"MULTIPOLYGON\": dt.MultiPolygon,\n \"GEOMETRY\": dt.Geometry,\n \"GEOGRAPHY\": dt.Geography,\n}\n\n\ndef dtype_from_sqlalchemy(typ, nullable=True, converter=None):\n \"\"\"Convert a SQLAlchemy type to an Ibis type.\n\n Parameters\n ----------\n typ\n SQLAlchemy type to convert.\n nullable : bool, optional\n Whether the returned type should be nullable.\n converter\n Converter function to use for nested types. If not provided, this function\n will be used recursively. Should only be used when defining new converter for\n dialects.\n\n Returns\n -------\n dt.DataType\n \"\"\"\n convert = converter or dtype_from_sqlalchemy\n\n if dtype := _from_sqlalchemy_types.get(type(typ)):\n return dtype(nullable=nullable)\n elif isinstance(typ, sat.Float):\n if (float_typ := _FLOAT_PREC_TO_TYPE.get(typ.precision)) is not None:\n return float_typ(nullable=nullable)\n return dt.Decimal(typ.precision, typ.scale, nullable=nullable)\n elif isinstance(typ, sat.Numeric):\n return dt.Decimal(typ.precision, typ.scale, nullable=nullable)\n elif isinstance(typ, ArrayType):\n return dt.Array(convert(typ.value_type), nullable=nullable)\n elif isinstance(typ, sat.ARRAY):\n ndim = typ.dimensions\n if ndim is not None and ndim != 1:\n raise NotImplementedError(\"Nested array types not yet supported\")\n return dt.Array(convert(typ.item_type), nullable=nullable)\n elif isinstance(typ, StructType):\n fields = {k: convert(v) for k, v in typ.fields.items()}\n return dt.Struct(fields, nullable=nullable)\n elif isinstance(typ, MapType):\n return dt.Map(convert(typ.key_type), convert(typ.value_type), nullable=nullable)\n elif isinstance(typ, sa.DateTime):\n timezone = \"UTC\" if typ.timezone else None\n return dt.Timestamp(timezone, nullable=nullable)\n elif geospatial_supported and isinstance(typ, ga.types._GISType):\n name = typ.geometry_type.upper()\n try:\n return _GEOSPATIAL_TYPES[name](geotype=typ.name, nullable=nullable)\n except KeyError:\n raise ValueError(f\"Unrecognized geometry type: {name}\")\n else:\n raise TypeError(f\"Unable to convert type: {typ!r}\")\n", "path": "ibis/backends/base/sql/alchemy/datatypes.py"}], "after_files": [{"content": "from __future__ import annotations\n\nfrom typing import Callable, Mapping, Optional\n\nimport sqlalchemy as sa\nimport sqlalchemy.types as sat\nimport toolz\nfrom sqlalchemy.ext.compiler import compiles\n\nimport ibis.expr.datatypes as dt\nfrom ibis.backends.base.sql.alchemy.geospatial import geospatial_supported\nfrom ibis.common.collections import FrozenDict\n\nif geospatial_supported:\n import geoalchemy2 as ga\n\n\nclass ArrayType(sat.UserDefinedType):\n def __init__(self, value_type: sat.TypeEngine):\n self.value_type = sat.to_instance(value_type)\n\n def result_processor(self, dialect, coltype) -> None:\n if not coltype.lower().startswith(\"array\"):\n return None\n\n inner_processor = (\n self.value_type.result_processor(dialect, coltype[len(\"array(\") : -1])\n or toolz.identity\n )\n\n return lambda v: v if v is None else list(map(inner_processor, v))\n\n\n@compiles(ArrayType, \"default\")\ndef compiles_array(element, compiler, **kw):\n return f\"ARRAY({compiler.process(element.value_type, **kw)})\"\n\n\nclass StructType(sat.UserDefinedType):\n cache_ok = True\n\n def __init__(self, fields: Mapping[str, sat.TypeEngine]) -> None:\n self.fields = FrozenDict(\n {name: sat.to_instance(typ) for name, typ in fields.items()}\n )\n\n\n@compiles(StructType, \"default\")\ndef compiles_struct(element, compiler, **kw):\n quote = compiler.dialect.identifier_preparer.quote\n content = \", \".join(\n f\"{quote(field)} {compiler.process(typ, **kw)}\"\n for field, typ in element.fields.items()\n )\n return f\"STRUCT({content})\"\n\n\nclass MapType(sat.UserDefinedType):\n def __init__(self, key_type: sat.TypeEngine, value_type: sat.TypeEngine):\n self.key_type = sat.to_instance(key_type)\n self.value_type = sat.to_instance(value_type)\n\n\n@compiles(MapType, \"default\")\ndef compiles_map(element, compiler, **kw):\n key_type = compiler.process(element.key_type, **kw)\n value_type = compiler.process(element.value_type, **kw)\n return f\"MAP({key_type}, {value_type})\"\n\n\nclass UInt64(sat.Integer):\n pass\n\n\nclass UInt32(sat.Integer):\n pass\n\n\nclass UInt16(sat.Integer):\n pass\n\n\nclass UInt8(sat.Integer):\n pass\n\n\n@compiles(UInt64, \"postgresql\")\n@compiles(UInt32, \"postgresql\")\n@compiles(UInt16, \"postgresql\")\n@compiles(UInt8, \"postgresql\")\n@compiles(UInt64, \"mssql\")\n@compiles(UInt32, \"mssql\")\n@compiles(UInt16, \"mssql\")\n@compiles(UInt8, \"mssql\")\n@compiles(UInt64, \"mysql\")\n@compiles(UInt32, \"mysql\")\n@compiles(UInt16, \"mysql\")\n@compiles(UInt8, \"mysql\")\n@compiles(UInt64, \"snowflake\")\n@compiles(UInt32, \"snowflake\")\n@compiles(UInt16, \"snowflake\")\n@compiles(UInt8, \"snowflake\")\n@compiles(UInt64, \"sqlite\")\n@compiles(UInt32, \"sqlite\")\n@compiles(UInt16, \"sqlite\")\n@compiles(UInt8, \"sqlite\")\n@compiles(UInt64, \"trino\")\n@compiles(UInt32, \"trino\")\n@compiles(UInt16, \"trino\")\n@compiles(UInt8, \"trino\")\ndef compile_uint(element, compiler, **kw):\n dialect_name = compiler.dialect.name\n raise TypeError(\n f\"unsigned integers are not supported in the {dialect_name} backend\"\n )\n\n\ntry:\n UUID = sat.UUID\nexcept AttributeError:\n\n class UUID(sat.String):\n pass\n\nelse:\n\n @compiles(UUID, \"default\")\n def compiles_uuid(element, compiler, **kw):\n return \"UUID\"\n\n\nclass Unknown(sa.Text):\n pass\n\n\n_from_sqlalchemy_types = {\n sat.BOOLEAN: dt.Boolean,\n sat.Boolean: dt.Boolean,\n sat.BINARY: dt.Binary,\n sat.LargeBinary: dt.Binary,\n sat.DATE: dt.Date,\n sat.Date: dt.Date,\n sat.TEXT: dt.String,\n sat.Text: dt.String,\n sat.TIME: dt.Time,\n sat.Time: dt.Time,\n sat.VARCHAR: dt.String,\n sat.CHAR: dt.String,\n sat.String: dt.String,\n sat.SMALLINT: dt.Int16,\n sat.SmallInteger: dt.Int16,\n sat.INTEGER: dt.Int32,\n sat.Integer: dt.Int32,\n sat.BIGINT: dt.Int64,\n sat.BigInteger: dt.Int64,\n sat.REAL: dt.Float32,\n sat.FLOAT: dt.Float64,\n UInt16: dt.UInt16,\n UInt32: dt.UInt32,\n UInt64: dt.UInt64,\n UInt8: dt.UInt8,\n Unknown: dt.Unknown,\n sat.JSON: dt.JSON,\n UUID: dt.UUID,\n}\n\n_to_sqlalchemy_types = {\n dt.Null: sat.NullType,\n dt.Date: sat.Date,\n dt.Time: sat.Time,\n dt.Boolean: sat.Boolean,\n dt.Binary: sat.LargeBinary,\n dt.String: sat.Text,\n dt.Decimal: sat.Numeric,\n # Mantissa-based\n dt.Float16: sat.REAL,\n dt.Float32: sat.REAL,\n # precision is the number of bits in the mantissa\n # without specifying this, some backends interpret the type as FLOAT, which\n # means float32 (and precision == 24)\n dt.Float64: sat.FLOAT(precision=53),\n dt.Int8: sat.SmallInteger,\n dt.Int16: sat.SmallInteger,\n dt.Int32: sat.Integer,\n dt.Int64: sat.BigInteger,\n dt.UInt8: UInt8,\n dt.UInt16: UInt16,\n dt.UInt32: UInt32,\n dt.UInt64: UInt64,\n dt.JSON: sat.JSON,\n dt.Interval: sat.Interval,\n dt.Unknown: Unknown,\n dt.MACADDR: sat.Text,\n dt.INET: sat.Text,\n dt.UUID: UUID,\n}\n\n\ndef dtype_to_sqlalchemy(\n dtype: dt.DataType,\n converter: Optional[Callable[[dt.DataType], sa.TypeEngine]] = None,\n):\n \"\"\"Convert an Ibis type to a SQLAlchemy type.\n\n Parameters\n ----------\n dtype\n Ibis type to convert.\n converter\n Converter function to use for nested types. If not provided, this function\n will be used recursively. Should only be used when defining new converter for\n dialects.\n\n Returns\n -------\n sa.TypeEngine\n \"\"\"\n convert = converter or dtype_to_sqlalchemy\n\n if dtype.is_decimal():\n return sat.NUMERIC(dtype.precision, dtype.scale)\n elif dtype.is_timestamp():\n return sat.TIMESTAMP(timezone=bool(dtype.timezone))\n elif dtype.is_array():\n return ArrayType(convert(dtype.value_type))\n elif dtype.is_struct():\n fields = {k: convert(v) for k, v in dtype.fields.items()}\n return StructType(fields)\n elif dtype.is_map():\n return MapType(convert(dtype.key_type), convert(dtype.value_type))\n elif dtype.is_geospatial():\n if geospatial_supported:\n if dtype.geotype == 'geometry':\n return ga.Geometry\n elif dtype.geotype == 'geography':\n return ga.Geography\n else:\n return ga.types._GISType\n else:\n raise TypeError(\"geospatial types are not supported\")\n else:\n return _to_sqlalchemy_types[type(dtype)]\n\n\n_FLOAT_PREC_TO_TYPE = {\n 11: dt.Float16,\n 24: dt.Float32,\n 53: dt.Float64,\n}\n\n_GEOSPATIAL_TYPES = {\n \"POINT\": dt.Point,\n \"LINESTRING\": dt.LineString,\n \"POLYGON\": dt.Polygon,\n \"MULTILINESTRING\": dt.MultiLineString,\n \"MULTIPOINT\": dt.MultiPoint,\n \"MULTIPOLYGON\": dt.MultiPolygon,\n \"GEOMETRY\": dt.Geometry,\n \"GEOGRAPHY\": dt.Geography,\n}\n\n\ndef dtype_from_sqlalchemy(typ, nullable=True, converter=None):\n \"\"\"Convert a SQLAlchemy type to an Ibis type.\n\n Parameters\n ----------\n typ\n SQLAlchemy type to convert.\n nullable : bool, optional\n Whether the returned type should be nullable.\n converter\n Converter function to use for nested types. If not provided, this function\n will be used recursively. Should only be used when defining new converter for\n dialects.\n\n Returns\n -------\n dt.DataType\n \"\"\"\n convert = converter or dtype_from_sqlalchemy\n\n if dtype := _from_sqlalchemy_types.get(type(typ)):\n return dtype(nullable=nullable)\n elif isinstance(typ, sat.Float):\n if (float_typ := _FLOAT_PREC_TO_TYPE.get(typ.precision)) is not None:\n return float_typ(nullable=nullable)\n return dt.Decimal(typ.precision, typ.scale, nullable=nullable)\n elif isinstance(typ, sat.Numeric):\n return dt.Decimal(typ.precision, typ.scale, nullable=nullable)\n elif isinstance(typ, ArrayType):\n return dt.Array(convert(typ.value_type), nullable=nullable)\n elif isinstance(typ, sat.ARRAY):\n ndim = typ.dimensions\n if ndim is not None and ndim != 1:\n raise NotImplementedError(\"Nested array types not yet supported\")\n return dt.Array(convert(typ.item_type), nullable=nullable)\n elif isinstance(typ, StructType):\n fields = {k: convert(v) for k, v in typ.fields.items()}\n return dt.Struct(fields, nullable=nullable)\n elif isinstance(typ, MapType):\n return dt.Map(convert(typ.key_type), convert(typ.value_type), nullable=nullable)\n elif isinstance(typ, sa.DateTime):\n timezone = \"UTC\" if typ.timezone else None\n return dt.Timestamp(timezone, nullable=nullable)\n elif geospatial_supported and isinstance(typ, ga.types._GISType):\n name = typ.geometry_type.upper()\n try:\n return _GEOSPATIAL_TYPES[name](geotype=typ.name, nullable=nullable)\n except KeyError:\n raise ValueError(f\"Unrecognized geometry type: {name}\")\n else:\n raise TypeError(f\"Unable to convert type: {typ!r}\")\n", "path": "ibis/backends/base/sql/alchemy/datatypes.py"}]}
| 3,656 | 162 |
gh_patches_debug_9846
|
rasdani/github-patches
|
git_diff
|
microsoft__playwright-python-472
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Interactive mode (REPL) (v 1.8.0a1) Error
**from playwright.sync_api import sync_playwright
playwright = sync_playwright().start()**
Traceback (most recent call last):
File "C:\Python37\lib\site-packages\playwright\_impl\_transport.py", line 27, in _get_stderr_fileno
return sys.stderr.fileno()
io.UnsupportedOperation: fileno
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "<pyshell#1>", line 1, in <module>
playwright = sync_playwright().start()
File "C:\Python37\lib\site-packages\playwright\sync_api\_context_manager.py", line 73, in start
return self.__enter__()
File "C:\Python37\lib\site-packages\playwright\sync_api\_context_manager.py", line 67, in __enter__
dispatcher_fiber.switch()
File "C:\Python37\lib\site-packages\playwright\sync_api\_context_manager.py", line 48, in greenlet_main
loop.run_until_complete(self._connection.run_as_sync())
File "C:\Python37\lib\asyncio\base_events.py", line 587, in run_until_complete
return future.result()
File "C:\Python37\lib\site-packages\playwright\_impl\_connection.py", line 134, in run_as_sync
await self.run()
File "C:\Python37\lib\site-packages\playwright\_impl\_connection.py", line 139, in run
await self._transport.run()
File "C:\Python37\lib\site-packages\playwright\_impl\_transport.py", line 62, in run
stderr=_get_stderr_fileno(),
File "C:\Python37\lib\site-packages\playwright\_impl\_transport.py", line 34, in _get_stderr_fileno
return sys.__stderr__.fileno()
AttributeError: 'NoneType' object has no attribute 'fileno'
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `playwright/_impl/_transport.py`
Content:
```
1 # Copyright (c) Microsoft Corporation.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 import asyncio
16 import io
17 import json
18 import os
19 import sys
20 from pathlib import Path
21 from typing import Dict, Optional
22
23
24 # Sourced from: https://github.com/pytest-dev/pytest/blob/da01ee0a4bb0af780167ecd228ab3ad249511302/src/_pytest/faulthandler.py#L69-L77
25 def _get_stderr_fileno() -> Optional[int]:
26 try:
27 return sys.stderr.fileno()
28 except (AttributeError, io.UnsupportedOperation):
29 # pytest-xdist monkeypatches sys.stderr with an object that is not an actual file.
30 # https://docs.python.org/3/library/faulthandler.html#issue-with-file-descriptors
31 # This is potentially dangerous, but the best we can do.
32 if not hasattr(sys, "__stderr__"):
33 return None
34 return sys.__stderr__.fileno()
35
36
37 class Transport:
38 def __init__(self, driver_executable: Path) -> None:
39 super().__init__()
40 self.on_message = lambda _: None
41 self._stopped = False
42 self._driver_executable = driver_executable
43 self._loop: asyncio.AbstractEventLoop
44
45 def stop(self) -> None:
46 self._stopped = True
47 self._output.close()
48
49 async def run(self) -> None:
50 self._loop = asyncio.get_running_loop()
51
52 driver_env = os.environ.copy()
53 # VSCode's JavaScript Debug Terminal provides it but driver/pkg does not support it
54 driver_env.pop("NODE_OPTIONS", None)
55
56 proc = await asyncio.create_subprocess_exec(
57 str(self._driver_executable),
58 "run-driver",
59 env=driver_env,
60 stdin=asyncio.subprocess.PIPE,
61 stdout=asyncio.subprocess.PIPE,
62 stderr=_get_stderr_fileno(),
63 limit=32768,
64 )
65 assert proc.stdout
66 assert proc.stdin
67 self._output = proc.stdin
68
69 while not self._stopped:
70 try:
71 buffer = await proc.stdout.readexactly(4)
72 length = int.from_bytes(buffer, byteorder="little", signed=False)
73 buffer = bytes(0)
74 while length:
75 to_read = min(length, 32768)
76 data = await proc.stdout.readexactly(to_read)
77 length -= to_read
78 if len(buffer):
79 buffer = buffer + data
80 else:
81 buffer = data
82 obj = json.loads(buffer)
83
84 if "DEBUGP" in os.environ: # pragma: no cover
85 print("\x1b[33mRECV>\x1b[0m", json.dumps(obj, indent=2))
86 self.on_message(obj)
87 except asyncio.IncompleteReadError:
88 break
89 await asyncio.sleep(0)
90
91 def send(self, message: Dict) -> None:
92 msg = json.dumps(message)
93 if "DEBUGP" in os.environ: # pragma: no cover
94 print("\x1b[32mSEND>\x1b[0m", json.dumps(message, indent=2))
95 data = msg.encode()
96 self._output.write(
97 len(data).to_bytes(4, byteorder="little", signed=False) + data
98 )
99
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/playwright/_impl/_transport.py b/playwright/_impl/_transport.py
--- a/playwright/_impl/_transport.py
+++ b/playwright/_impl/_transport.py
@@ -29,7 +29,7 @@
# pytest-xdist monkeypatches sys.stderr with an object that is not an actual file.
# https://docs.python.org/3/library/faulthandler.html#issue-with-file-descriptors
# This is potentially dangerous, but the best we can do.
- if not hasattr(sys, "__stderr__"):
+ if not hasattr(sys, "__stderr__") or not sys.__stderr__:
return None
return sys.__stderr__.fileno()
|
{"golden_diff": "diff --git a/playwright/_impl/_transport.py b/playwright/_impl/_transport.py\n--- a/playwright/_impl/_transport.py\n+++ b/playwright/_impl/_transport.py\n@@ -29,7 +29,7 @@\n # pytest-xdist monkeypatches sys.stderr with an object that is not an actual file.\n # https://docs.python.org/3/library/faulthandler.html#issue-with-file-descriptors\n # This is potentially dangerous, but the best we can do.\n- if not hasattr(sys, \"__stderr__\"):\n+ if not hasattr(sys, \"__stderr__\") or not sys.__stderr__:\n return None\n return sys.__stderr__.fileno()\n", "issue": "Interactive mode (REPL) (v 1.8.0a1) Error\n**from playwright.sync_api import sync_playwright\r\nplaywright = sync_playwright().start()**\r\n\r\nTraceback (most recent call last):\r\n File \"C:\\Python37\\lib\\site-packages\\playwright\\_impl\\_transport.py\", line 27, in _get_stderr_fileno\r\n return sys.stderr.fileno()\r\nio.UnsupportedOperation: fileno\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File \"<pyshell#1>\", line 1, in <module>\r\n playwright = sync_playwright().start()\r\n File \"C:\\Python37\\lib\\site-packages\\playwright\\sync_api\\_context_manager.py\", line 73, in start\r\n return self.__enter__()\r\n File \"C:\\Python37\\lib\\site-packages\\playwright\\sync_api\\_context_manager.py\", line 67, in __enter__\r\n dispatcher_fiber.switch()\r\n File \"C:\\Python37\\lib\\site-packages\\playwright\\sync_api\\_context_manager.py\", line 48, in greenlet_main\r\n loop.run_until_complete(self._connection.run_as_sync())\r\n File \"C:\\Python37\\lib\\asyncio\\base_events.py\", line 587, in run_until_complete\r\n return future.result()\r\n File \"C:\\Python37\\lib\\site-packages\\playwright\\_impl\\_connection.py\", line 134, in run_as_sync\r\n await self.run()\r\n File \"C:\\Python37\\lib\\site-packages\\playwright\\_impl\\_connection.py\", line 139, in run\r\n await self._transport.run()\r\n File \"C:\\Python37\\lib\\site-packages\\playwright\\_impl\\_transport.py\", line 62, in run\r\n stderr=_get_stderr_fileno(),\r\n File \"C:\\Python37\\lib\\site-packages\\playwright\\_impl\\_transport.py\", line 34, in _get_stderr_fileno\r\n return sys.__stderr__.fileno()\r\nAttributeError: 'NoneType' object has no attribute 'fileno'\n", "before_files": [{"content": "# Copyright (c) Microsoft Corporation.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport asyncio\nimport io\nimport json\nimport os\nimport sys\nfrom pathlib import Path\nfrom typing import Dict, Optional\n\n\n# Sourced from: https://github.com/pytest-dev/pytest/blob/da01ee0a4bb0af780167ecd228ab3ad249511302/src/_pytest/faulthandler.py#L69-L77\ndef _get_stderr_fileno() -> Optional[int]:\n try:\n return sys.stderr.fileno()\n except (AttributeError, io.UnsupportedOperation):\n # pytest-xdist monkeypatches sys.stderr with an object that is not an actual file.\n # https://docs.python.org/3/library/faulthandler.html#issue-with-file-descriptors\n # This is potentially dangerous, but the best we can do.\n if not hasattr(sys, \"__stderr__\"):\n return None\n return sys.__stderr__.fileno()\n\n\nclass Transport:\n def __init__(self, driver_executable: Path) -> None:\n super().__init__()\n self.on_message = lambda _: None\n self._stopped = False\n self._driver_executable = driver_executable\n self._loop: asyncio.AbstractEventLoop\n\n def stop(self) -> None:\n self._stopped = True\n self._output.close()\n\n async def run(self) -> None:\n self._loop = asyncio.get_running_loop()\n\n driver_env = os.environ.copy()\n # VSCode's JavaScript Debug Terminal provides it but driver/pkg does not support it\n driver_env.pop(\"NODE_OPTIONS\", None)\n\n proc = await asyncio.create_subprocess_exec(\n str(self._driver_executable),\n \"run-driver\",\n env=driver_env,\n stdin=asyncio.subprocess.PIPE,\n stdout=asyncio.subprocess.PIPE,\n stderr=_get_stderr_fileno(),\n limit=32768,\n )\n assert proc.stdout\n assert proc.stdin\n self._output = proc.stdin\n\n while not self._stopped:\n try:\n buffer = await proc.stdout.readexactly(4)\n length = int.from_bytes(buffer, byteorder=\"little\", signed=False)\n buffer = bytes(0)\n while length:\n to_read = min(length, 32768)\n data = await proc.stdout.readexactly(to_read)\n length -= to_read\n if len(buffer):\n buffer = buffer + data\n else:\n buffer = data\n obj = json.loads(buffer)\n\n if \"DEBUGP\" in os.environ: # pragma: no cover\n print(\"\\x1b[33mRECV>\\x1b[0m\", json.dumps(obj, indent=2))\n self.on_message(obj)\n except asyncio.IncompleteReadError:\n break\n await asyncio.sleep(0)\n\n def send(self, message: Dict) -> None:\n msg = json.dumps(message)\n if \"DEBUGP\" in os.environ: # pragma: no cover\n print(\"\\x1b[32mSEND>\\x1b[0m\", json.dumps(message, indent=2))\n data = msg.encode()\n self._output.write(\n len(data).to_bytes(4, byteorder=\"little\", signed=False) + data\n )\n", "path": "playwright/_impl/_transport.py"}], "after_files": [{"content": "# Copyright (c) Microsoft Corporation.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport asyncio\nimport io\nimport json\nimport os\nimport sys\nfrom pathlib import Path\nfrom typing import Dict, Optional\n\n\n# Sourced from: https://github.com/pytest-dev/pytest/blob/da01ee0a4bb0af780167ecd228ab3ad249511302/src/_pytest/faulthandler.py#L69-L77\ndef _get_stderr_fileno() -> Optional[int]:\n try:\n return sys.stderr.fileno()\n except (AttributeError, io.UnsupportedOperation):\n # pytest-xdist monkeypatches sys.stderr with an object that is not an actual file.\n # https://docs.python.org/3/library/faulthandler.html#issue-with-file-descriptors\n # This is potentially dangerous, but the best we can do.\n if not hasattr(sys, \"__stderr__\") or not sys.__stderr__:\n return None\n return sys.__stderr__.fileno()\n\n\nclass Transport:\n def __init__(self, driver_executable: Path) -> None:\n super().__init__()\n self.on_message = lambda _: None\n self._stopped = False\n self._driver_executable = driver_executable\n self._loop: asyncio.AbstractEventLoop\n\n def stop(self) -> None:\n self._stopped = True\n self._output.close()\n\n async def run(self) -> None:\n self._loop = asyncio.get_running_loop()\n\n driver_env = os.environ.copy()\n # VSCode's JavaScript Debug Terminal provides it but driver/pkg does not support it\n driver_env.pop(\"NODE_OPTIONS\", None)\n\n proc = await asyncio.create_subprocess_exec(\n str(self._driver_executable),\n \"run-driver\",\n env=driver_env,\n stdin=asyncio.subprocess.PIPE,\n stdout=asyncio.subprocess.PIPE,\n stderr=_get_stderr_fileno(),\n limit=32768,\n )\n assert proc.stdout\n assert proc.stdin\n self._output = proc.stdin\n\n while not self._stopped:\n try:\n buffer = await proc.stdout.readexactly(4)\n length = int.from_bytes(buffer, byteorder=\"little\", signed=False)\n buffer = bytes(0)\n while length:\n to_read = min(length, 32768)\n data = await proc.stdout.readexactly(to_read)\n length -= to_read\n if len(buffer):\n buffer = buffer + data\n else:\n buffer = data\n obj = json.loads(buffer)\n\n if \"DEBUGP\" in os.environ: # pragma: no cover\n print(\"\\x1b[33mRECV>\\x1b[0m\", json.dumps(obj, indent=2))\n self.on_message(obj)\n except asyncio.IncompleteReadError:\n break\n await asyncio.sleep(0)\n\n def send(self, message: Dict) -> None:\n msg = json.dumps(message)\n if \"DEBUGP\" in os.environ: # pragma: no cover\n print(\"\\x1b[32mSEND>\\x1b[0m\", json.dumps(message, indent=2))\n data = msg.encode()\n self._output.write(\n len(data).to_bytes(4, byteorder=\"little\", signed=False) + data\n )\n", "path": "playwright/_impl/_transport.py"}]}
| 1,754 | 149 |
gh_patches_debug_829
|
rasdani/github-patches
|
git_diff
|
ivy-llc__ivy-15926
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
log
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `ivy/functional/frontends/paddle/tensor/math.py`
Content:
```
1 # global
2 import ivy
3 from ivy.func_wrapper import with_unsupported_dtypes, with_supported_dtypes
4 from ivy.functional.frontends.paddle.func_wrapper import (
5 to_ivy_arrays_and_back,
6 )
7
8
9 @with_unsupported_dtypes({"2.4.2 and below": ("float16", "bfloat16")}, "paddle")
10 @to_ivy_arrays_and_back
11 def sin(x, name=None):
12 return ivy.sin(x)
13
14
15 @with_unsupported_dtypes({"2.4.2 and below": ("float16", "bfloat16")}, "paddle")
16 @to_ivy_arrays_and_back
17 def cos(x, name=None):
18 return ivy.cos(x)
19
20
21 @with_unsupported_dtypes({"2.4.2 and below": ("float16", "bfloat16")}, "paddle")
22 @to_ivy_arrays_and_back
23 def acos(x, name=None):
24 return ivy.acos(x)
25
26
27 @with_unsupported_dtypes({"2.4.2 and below": ("float16", "bfloat16")}, "paddle")
28 @to_ivy_arrays_and_back
29 def cosh(x, name=None):
30 return ivy.cosh(x)
31
32
33 @with_unsupported_dtypes({"2.4.2 and below": ("float16", "bfloat16")}, "paddle")
34 @to_ivy_arrays_and_back
35 def tanh(x, name=None):
36 return ivy.tanh(x)
37
38
39 @with_unsupported_dtypes({"2.4.2 and below": ("float16", "bfloat16")}, "paddle")
40 @to_ivy_arrays_and_back
41 def acosh(x, name=None):
42 return ivy.acosh(x)
43
44
45 @with_supported_dtypes({"2.4.2 and below": ("float32", "float64")}, "paddle")
46 @to_ivy_arrays_and_back
47 def asin(x, name=None):
48 return ivy.asin(x)
49
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/ivy/functional/frontends/paddle/tensor/math.py b/ivy/functional/frontends/paddle/tensor/math.py
--- a/ivy/functional/frontends/paddle/tensor/math.py
+++ b/ivy/functional/frontends/paddle/tensor/math.py
@@ -46,3 +46,9 @@
@to_ivy_arrays_and_back
def asin(x, name=None):
return ivy.asin(x)
+
+
+@with_unsupported_dtypes({"2.4.2 and below": ("float16", "bfloat16")}, "paddle")
+@to_ivy_arrays_and_back
+def log(x, name=None):
+ return ivy.log(x)
|
{"golden_diff": "diff --git a/ivy/functional/frontends/paddle/tensor/math.py b/ivy/functional/frontends/paddle/tensor/math.py\n--- a/ivy/functional/frontends/paddle/tensor/math.py\n+++ b/ivy/functional/frontends/paddle/tensor/math.py\n@@ -46,3 +46,9 @@\n @to_ivy_arrays_and_back\n def asin(x, name=None):\n return ivy.asin(x)\n+\n+\n+@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n+@to_ivy_arrays_and_back\n+def log(x, name=None):\n+ return ivy.log(x)\n", "issue": "log\n\n", "before_files": [{"content": "# global\nimport ivy\nfrom ivy.func_wrapper import with_unsupported_dtypes, with_supported_dtypes\nfrom ivy.functional.frontends.paddle.func_wrapper import (\n to_ivy_arrays_and_back,\n)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef sin(x, name=None):\n return ivy.sin(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef cos(x, name=None):\n return ivy.cos(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef acos(x, name=None):\n return ivy.acos(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef cosh(x, name=None):\n return ivy.cosh(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef tanh(x, name=None):\n return ivy.tanh(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef acosh(x, name=None):\n return ivy.acosh(x)\n\n\n@with_supported_dtypes({\"2.4.2 and below\": (\"float32\", \"float64\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef asin(x, name=None):\n return ivy.asin(x)\n", "path": "ivy/functional/frontends/paddle/tensor/math.py"}], "after_files": [{"content": "# global\nimport ivy\nfrom ivy.func_wrapper import with_unsupported_dtypes, with_supported_dtypes\nfrom ivy.functional.frontends.paddle.func_wrapper import (\n to_ivy_arrays_and_back,\n)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef sin(x, name=None):\n return ivy.sin(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef cos(x, name=None):\n return ivy.cos(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef acos(x, name=None):\n return ivy.acos(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef cosh(x, name=None):\n return ivy.cosh(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef tanh(x, name=None):\n return ivy.tanh(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef acosh(x, name=None):\n return ivy.acosh(x)\n\n\n@with_supported_dtypes({\"2.4.2 and below\": (\"float32\", \"float64\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef asin(x, name=None):\n return ivy.asin(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef log(x, name=None):\n return ivy.log(x)\n", "path": "ivy/functional/frontends/paddle/tensor/math.py"}]}
| 793 | 155 |
gh_patches_debug_37848
|
rasdani/github-patches
|
git_diff
|
pantsbuild__pants-20177
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Support optional 3rd party dependencies in helm
**Is your feature request related to a problem? Please describe.**
I am generating a helm chart that would ordinarily have chart dependencies that are optional based on a condition. The method of defining helm artifacts doesn't seem to allow for this.
**Describe the solution you'd like**
Preferably defining a 3rd party helm artifact would simply allow it to be used manually inside the chart.yaml instead of injecting the chart automatically (or at least have the option) so that you could manually configure the chart yaml as you would like.
**Describe alternatives you've considered**
I think I could make proxy 1st party chart that references the 3rd party chart and then make that optional but that is very awkward.
**Additional context**
Add any other context or screenshots about the feature request here.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/python/pants/backend/helm/dependency_inference/chart.py`
Content:
```
1 # Copyright 2022 Pants project contributors (see CONTRIBUTORS.md).
2 # Licensed under the Apache License, Version 2.0 (see LICENSE).
3
4 from __future__ import annotations
5
6 import logging
7 from dataclasses import dataclass
8 from typing import Iterable
9
10 from pants.backend.helm.resolve import artifacts
11 from pants.backend.helm.resolve.artifacts import ThirdPartyHelmArtifactMapping
12 from pants.backend.helm.subsystems.helm import HelmSubsystem
13 from pants.backend.helm.target_types import (
14 AllHelmChartTargets,
15 HelmChartDependenciesField,
16 HelmChartMetaSourceField,
17 HelmChartTarget,
18 )
19 from pants.backend.helm.target_types import rules as helm_target_types_rules
20 from pants.backend.helm.util_rules import chart_metadata
21 from pants.backend.helm.util_rules.chart_metadata import HelmChartDependency, HelmChartMetadata
22 from pants.engine.addresses import Address
23 from pants.engine.internals.selectors import Get, MultiGet
24 from pants.engine.rules import collect_rules, rule
25 from pants.engine.target import (
26 DependenciesRequest,
27 ExplicitlyProvidedDependencies,
28 FieldSet,
29 InferDependenciesRequest,
30 InferredDependencies,
31 )
32 from pants.engine.unions import UnionRule
33 from pants.util.frozendict import FrozenDict
34 from pants.util.logging import LogLevel
35 from pants.util.ordered_set import OrderedSet
36 from pants.util.strutil import bullet_list, pluralize
37
38 logger = logging.getLogger(__name__)
39
40
41 class DuplicateHelmChartNamesFound(Exception):
42 def __init__(self, duplicates: Iterable[tuple[str, Address]]) -> None:
43 super().__init__(
44 f"Found more than one `{HelmChartTarget.alias}` target using the same chart name:\n\n"
45 f"{bullet_list([f'{addr} -> {name}' for name, addr in duplicates])}"
46 )
47
48
49 class UnknownHelmChartDependency(Exception):
50 def __init__(self, address: Address, dependency: HelmChartDependency) -> None:
51 super().__init__(
52 f"Can not find any declared artifact for dependency '{dependency.name}' "
53 f"declared at `Chart.yaml` in Helm chart at address: {address}"
54 )
55
56
57 class FirstPartyHelmChartMapping(FrozenDict[str, Address]):
58 pass
59
60
61 @rule
62 async def first_party_helm_chart_mapping(
63 all_helm_chart_tgts: AllHelmChartTargets,
64 ) -> FirstPartyHelmChartMapping:
65 charts_metadata = await MultiGet(
66 Get(HelmChartMetadata, HelmChartMetaSourceField, tgt[HelmChartMetaSourceField])
67 for tgt in all_helm_chart_tgts
68 )
69
70 name_addr_mapping: dict[str, Address] = {}
71 duplicate_chart_names: OrderedSet[tuple[str, Address]] = OrderedSet()
72
73 for meta, tgt in zip(charts_metadata, all_helm_chart_tgts):
74 if meta.name in name_addr_mapping:
75 duplicate_chart_names.add((meta.name, name_addr_mapping[meta.name]))
76 duplicate_chart_names.add((meta.name, tgt.address))
77 continue
78
79 name_addr_mapping[meta.name] = tgt.address
80
81 if duplicate_chart_names:
82 raise DuplicateHelmChartNamesFound(duplicate_chart_names)
83
84 return FirstPartyHelmChartMapping(name_addr_mapping)
85
86
87 @dataclass(frozen=True)
88 class HelmChartDependenciesInferenceFieldSet(FieldSet):
89 required_fields = (HelmChartMetaSourceField, HelmChartDependenciesField)
90
91 source: HelmChartMetaSourceField
92 dependencies: HelmChartDependenciesField
93
94
95 class InferHelmChartDependenciesRequest(InferDependenciesRequest):
96 infer_from = HelmChartDependenciesInferenceFieldSet
97
98
99 @rule(desc="Inferring Helm chart dependencies", level=LogLevel.DEBUG)
100 async def infer_chart_dependencies_via_metadata(
101 request: InferHelmChartDependenciesRequest,
102 first_party_mapping: FirstPartyHelmChartMapping,
103 third_party_mapping: ThirdPartyHelmArtifactMapping,
104 subsystem: HelmSubsystem,
105 ) -> InferredDependencies:
106 address = request.field_set.address
107
108 # Parse Chart.yaml for explicitly set dependencies.
109 explicitly_provided_deps, metadata = await MultiGet(
110 Get(ExplicitlyProvidedDependencies, DependenciesRequest(request.field_set.dependencies)),
111 Get(HelmChartMetadata, HelmChartMetaSourceField, request.field_set.source),
112 )
113
114 remotes = subsystem.remotes()
115
116 def resolve_dependency_url(dependency: HelmChartDependency) -> str | None:
117 if not dependency.repository:
118 registry = remotes.default_registry
119 if registry:
120 return f"{registry.address}/{dependency.name}"
121 return None
122 else:
123 return f"{dependency.repository}/{dependency.name}"
124
125 # Associate dependencies in Chart.yaml with addresses.
126 dependencies: OrderedSet[Address] = OrderedSet()
127 for chart_dep in metadata.dependencies:
128 candidate_addrs = []
129
130 first_party_dep = first_party_mapping.get(chart_dep.name)
131 if first_party_dep:
132 candidate_addrs.append(first_party_dep)
133
134 dependency_url = resolve_dependency_url(chart_dep)
135 third_party_dep = third_party_mapping.get(dependency_url) if dependency_url else None
136 if third_party_dep:
137 candidate_addrs.append(third_party_dep)
138
139 if not candidate_addrs:
140 raise UnknownHelmChartDependency(address, chart_dep)
141
142 matches = frozenset(candidate_addrs).difference(explicitly_provided_deps.includes)
143
144 explicitly_provided_deps.maybe_warn_of_ambiguous_dependency_inference(
145 matches,
146 address,
147 context=f"The Helm chart {address} declares `{chart_dep.name}` as dependency",
148 import_reference="helm dependency",
149 )
150
151 maybe_disambiguated = explicitly_provided_deps.disambiguated(matches)
152 if maybe_disambiguated:
153 dependencies.add(maybe_disambiguated)
154
155 logger.debug(
156 f"Inferred {pluralize(len(dependencies), 'dependency')} for target at address: {address}"
157 )
158 return InferredDependencies(dependencies)
159
160
161 def rules():
162 return [
163 *collect_rules(),
164 *artifacts.rules(),
165 *helm_target_types_rules(),
166 *chart_metadata.rules(),
167 UnionRule(InferDependenciesRequest, InferHelmChartDependenciesRequest),
168 ]
169
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/src/python/pants/backend/helm/dependency_inference/chart.py b/src/python/pants/backend/helm/dependency_inference/chart.py
--- a/src/python/pants/backend/helm/dependency_inference/chart.py
+++ b/src/python/pants/backend/helm/dependency_inference/chart.py
@@ -4,11 +4,13 @@
from __future__ import annotations
import logging
+import os
from dataclasses import dataclass
from typing import Iterable
from pants.backend.helm.resolve import artifacts
from pants.backend.helm.resolve.artifacts import ThirdPartyHelmArtifactMapping
+from pants.backend.helm.resolve.remotes import HelmRemotes
from pants.backend.helm.subsystems.helm import HelmSubsystem
from pants.backend.helm.target_types import (
AllHelmChartTargets,
@@ -96,6 +98,16 @@
infer_from = HelmChartDependenciesInferenceFieldSet
+def resolve_dependency_url(remotes: HelmRemotes, dependency: HelmChartDependency) -> str | None:
+ if not dependency.repository:
+ registry = remotes.default_registry
+ if registry:
+ return os.path.join(registry.address, dependency.name)
+ return None
+ else:
+ return os.path.join(dependency.repository, dependency.name)
+
+
@rule(desc="Inferring Helm chart dependencies", level=LogLevel.DEBUG)
async def infer_chart_dependencies_via_metadata(
request: InferHelmChartDependenciesRequest,
@@ -113,15 +125,6 @@
remotes = subsystem.remotes()
- def resolve_dependency_url(dependency: HelmChartDependency) -> str | None:
- if not dependency.repository:
- registry = remotes.default_registry
- if registry:
- return f"{registry.address}/{dependency.name}"
- return None
- else:
- return f"{dependency.repository}/{dependency.name}"
-
# Associate dependencies in Chart.yaml with addresses.
dependencies: OrderedSet[Address] = OrderedSet()
for chart_dep in metadata.dependencies:
@@ -131,7 +134,7 @@
if first_party_dep:
candidate_addrs.append(first_party_dep)
- dependency_url = resolve_dependency_url(chart_dep)
+ dependency_url = resolve_dependency_url(remotes, chart_dep)
third_party_dep = third_party_mapping.get(dependency_url) if dependency_url else None
if third_party_dep:
candidate_addrs.append(third_party_dep)
|
{"golden_diff": "diff --git a/src/python/pants/backend/helm/dependency_inference/chart.py b/src/python/pants/backend/helm/dependency_inference/chart.py\n--- a/src/python/pants/backend/helm/dependency_inference/chart.py\n+++ b/src/python/pants/backend/helm/dependency_inference/chart.py\n@@ -4,11 +4,13 @@\n from __future__ import annotations\n \n import logging\n+import os\n from dataclasses import dataclass\n from typing import Iterable\n \n from pants.backend.helm.resolve import artifacts\n from pants.backend.helm.resolve.artifacts import ThirdPartyHelmArtifactMapping\n+from pants.backend.helm.resolve.remotes import HelmRemotes\n from pants.backend.helm.subsystems.helm import HelmSubsystem\n from pants.backend.helm.target_types import (\n AllHelmChartTargets,\n@@ -96,6 +98,16 @@\n infer_from = HelmChartDependenciesInferenceFieldSet\n \n \n+def resolve_dependency_url(remotes: HelmRemotes, dependency: HelmChartDependency) -> str | None:\n+ if not dependency.repository:\n+ registry = remotes.default_registry\n+ if registry:\n+ return os.path.join(registry.address, dependency.name)\n+ return None\n+ else:\n+ return os.path.join(dependency.repository, dependency.name)\n+\n+\n @rule(desc=\"Inferring Helm chart dependencies\", level=LogLevel.DEBUG)\n async def infer_chart_dependencies_via_metadata(\n request: InferHelmChartDependenciesRequest,\n@@ -113,15 +125,6 @@\n \n remotes = subsystem.remotes()\n \n- def resolve_dependency_url(dependency: HelmChartDependency) -> str | None:\n- if not dependency.repository:\n- registry = remotes.default_registry\n- if registry:\n- return f\"{registry.address}/{dependency.name}\"\n- return None\n- else:\n- return f\"{dependency.repository}/{dependency.name}\"\n-\n # Associate dependencies in Chart.yaml with addresses.\n dependencies: OrderedSet[Address] = OrderedSet()\n for chart_dep in metadata.dependencies:\n@@ -131,7 +134,7 @@\n if first_party_dep:\n candidate_addrs.append(first_party_dep)\n \n- dependency_url = resolve_dependency_url(chart_dep)\n+ dependency_url = resolve_dependency_url(remotes, chart_dep)\n third_party_dep = third_party_mapping.get(dependency_url) if dependency_url else None\n if third_party_dep:\n candidate_addrs.append(third_party_dep)\n", "issue": "Support optional 3rd party dependencies in helm\n**Is your feature request related to a problem? Please describe.**\r\nI am generating a helm chart that would ordinarily have chart dependencies that are optional based on a condition. The method of defining helm artifacts doesn't seem to allow for this.\r\n\r\n**Describe the solution you'd like**\r\nPreferably defining a 3rd party helm artifact would simply allow it to be used manually inside the chart.yaml instead of injecting the chart automatically (or at least have the option) so that you could manually configure the chart yaml as you would like.\r\n\r\n**Describe alternatives you've considered**\r\nI think I could make proxy 1st party chart that references the 3rd party chart and then make that optional but that is very awkward.\r\n\r\n**Additional context**\r\nAdd any other context or screenshots about the feature request here.\r\n\n", "before_files": [{"content": "# Copyright 2022 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\nfrom __future__ import annotations\n\nimport logging\nfrom dataclasses import dataclass\nfrom typing import Iterable\n\nfrom pants.backend.helm.resolve import artifacts\nfrom pants.backend.helm.resolve.artifacts import ThirdPartyHelmArtifactMapping\nfrom pants.backend.helm.subsystems.helm import HelmSubsystem\nfrom pants.backend.helm.target_types import (\n AllHelmChartTargets,\n HelmChartDependenciesField,\n HelmChartMetaSourceField,\n HelmChartTarget,\n)\nfrom pants.backend.helm.target_types import rules as helm_target_types_rules\nfrom pants.backend.helm.util_rules import chart_metadata\nfrom pants.backend.helm.util_rules.chart_metadata import HelmChartDependency, HelmChartMetadata\nfrom pants.engine.addresses import Address\nfrom pants.engine.internals.selectors import Get, MultiGet\nfrom pants.engine.rules import collect_rules, rule\nfrom pants.engine.target import (\n DependenciesRequest,\n ExplicitlyProvidedDependencies,\n FieldSet,\n InferDependenciesRequest,\n InferredDependencies,\n)\nfrom pants.engine.unions import UnionRule\nfrom pants.util.frozendict import FrozenDict\nfrom pants.util.logging import LogLevel\nfrom pants.util.ordered_set import OrderedSet\nfrom pants.util.strutil import bullet_list, pluralize\n\nlogger = logging.getLogger(__name__)\n\n\nclass DuplicateHelmChartNamesFound(Exception):\n def __init__(self, duplicates: Iterable[tuple[str, Address]]) -> None:\n super().__init__(\n f\"Found more than one `{HelmChartTarget.alias}` target using the same chart name:\\n\\n\"\n f\"{bullet_list([f'{addr} -> {name}' for name, addr in duplicates])}\"\n )\n\n\nclass UnknownHelmChartDependency(Exception):\n def __init__(self, address: Address, dependency: HelmChartDependency) -> None:\n super().__init__(\n f\"Can not find any declared artifact for dependency '{dependency.name}' \"\n f\"declared at `Chart.yaml` in Helm chart at address: {address}\"\n )\n\n\nclass FirstPartyHelmChartMapping(FrozenDict[str, Address]):\n pass\n\n\n@rule\nasync def first_party_helm_chart_mapping(\n all_helm_chart_tgts: AllHelmChartTargets,\n) -> FirstPartyHelmChartMapping:\n charts_metadata = await MultiGet(\n Get(HelmChartMetadata, HelmChartMetaSourceField, tgt[HelmChartMetaSourceField])\n for tgt in all_helm_chart_tgts\n )\n\n name_addr_mapping: dict[str, Address] = {}\n duplicate_chart_names: OrderedSet[tuple[str, Address]] = OrderedSet()\n\n for meta, tgt in zip(charts_metadata, all_helm_chart_tgts):\n if meta.name in name_addr_mapping:\n duplicate_chart_names.add((meta.name, name_addr_mapping[meta.name]))\n duplicate_chart_names.add((meta.name, tgt.address))\n continue\n\n name_addr_mapping[meta.name] = tgt.address\n\n if duplicate_chart_names:\n raise DuplicateHelmChartNamesFound(duplicate_chart_names)\n\n return FirstPartyHelmChartMapping(name_addr_mapping)\n\n\n@dataclass(frozen=True)\nclass HelmChartDependenciesInferenceFieldSet(FieldSet):\n required_fields = (HelmChartMetaSourceField, HelmChartDependenciesField)\n\n source: HelmChartMetaSourceField\n dependencies: HelmChartDependenciesField\n\n\nclass InferHelmChartDependenciesRequest(InferDependenciesRequest):\n infer_from = HelmChartDependenciesInferenceFieldSet\n\n\n@rule(desc=\"Inferring Helm chart dependencies\", level=LogLevel.DEBUG)\nasync def infer_chart_dependencies_via_metadata(\n request: InferHelmChartDependenciesRequest,\n first_party_mapping: FirstPartyHelmChartMapping,\n third_party_mapping: ThirdPartyHelmArtifactMapping,\n subsystem: HelmSubsystem,\n) -> InferredDependencies:\n address = request.field_set.address\n\n # Parse Chart.yaml for explicitly set dependencies.\n explicitly_provided_deps, metadata = await MultiGet(\n Get(ExplicitlyProvidedDependencies, DependenciesRequest(request.field_set.dependencies)),\n Get(HelmChartMetadata, HelmChartMetaSourceField, request.field_set.source),\n )\n\n remotes = subsystem.remotes()\n\n def resolve_dependency_url(dependency: HelmChartDependency) -> str | None:\n if not dependency.repository:\n registry = remotes.default_registry\n if registry:\n return f\"{registry.address}/{dependency.name}\"\n return None\n else:\n return f\"{dependency.repository}/{dependency.name}\"\n\n # Associate dependencies in Chart.yaml with addresses.\n dependencies: OrderedSet[Address] = OrderedSet()\n for chart_dep in metadata.dependencies:\n candidate_addrs = []\n\n first_party_dep = first_party_mapping.get(chart_dep.name)\n if first_party_dep:\n candidate_addrs.append(first_party_dep)\n\n dependency_url = resolve_dependency_url(chart_dep)\n third_party_dep = third_party_mapping.get(dependency_url) if dependency_url else None\n if third_party_dep:\n candidate_addrs.append(third_party_dep)\n\n if not candidate_addrs:\n raise UnknownHelmChartDependency(address, chart_dep)\n\n matches = frozenset(candidate_addrs).difference(explicitly_provided_deps.includes)\n\n explicitly_provided_deps.maybe_warn_of_ambiguous_dependency_inference(\n matches,\n address,\n context=f\"The Helm chart {address} declares `{chart_dep.name}` as dependency\",\n import_reference=\"helm dependency\",\n )\n\n maybe_disambiguated = explicitly_provided_deps.disambiguated(matches)\n if maybe_disambiguated:\n dependencies.add(maybe_disambiguated)\n\n logger.debug(\n f\"Inferred {pluralize(len(dependencies), 'dependency')} for target at address: {address}\"\n )\n return InferredDependencies(dependencies)\n\n\ndef rules():\n return [\n *collect_rules(),\n *artifacts.rules(),\n *helm_target_types_rules(),\n *chart_metadata.rules(),\n UnionRule(InferDependenciesRequest, InferHelmChartDependenciesRequest),\n ]\n", "path": "src/python/pants/backend/helm/dependency_inference/chart.py"}], "after_files": [{"content": "# Copyright 2022 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\nfrom __future__ import annotations\n\nimport logging\nimport os\nfrom dataclasses import dataclass\nfrom typing import Iterable\n\nfrom pants.backend.helm.resolve import artifacts\nfrom pants.backend.helm.resolve.artifacts import ThirdPartyHelmArtifactMapping\nfrom pants.backend.helm.resolve.remotes import HelmRemotes\nfrom pants.backend.helm.subsystems.helm import HelmSubsystem\nfrom pants.backend.helm.target_types import (\n AllHelmChartTargets,\n HelmChartDependenciesField,\n HelmChartMetaSourceField,\n HelmChartTarget,\n)\nfrom pants.backend.helm.target_types import rules as helm_target_types_rules\nfrom pants.backend.helm.util_rules import chart_metadata\nfrom pants.backend.helm.util_rules.chart_metadata import HelmChartDependency, HelmChartMetadata\nfrom pants.engine.addresses import Address\nfrom pants.engine.internals.selectors import Get, MultiGet\nfrom pants.engine.rules import collect_rules, rule\nfrom pants.engine.target import (\n DependenciesRequest,\n ExplicitlyProvidedDependencies,\n FieldSet,\n InferDependenciesRequest,\n InferredDependencies,\n)\nfrom pants.engine.unions import UnionRule\nfrom pants.util.frozendict import FrozenDict\nfrom pants.util.logging import LogLevel\nfrom pants.util.ordered_set import OrderedSet\nfrom pants.util.strutil import bullet_list, pluralize\n\nlogger = logging.getLogger(__name__)\n\n\nclass DuplicateHelmChartNamesFound(Exception):\n def __init__(self, duplicates: Iterable[tuple[str, Address]]) -> None:\n super().__init__(\n f\"Found more than one `{HelmChartTarget.alias}` target using the same chart name:\\n\\n\"\n f\"{bullet_list([f'{addr} -> {name}' for name, addr in duplicates])}\"\n )\n\n\nclass UnknownHelmChartDependency(Exception):\n def __init__(self, address: Address, dependency: HelmChartDependency) -> None:\n super().__init__(\n f\"Can not find any declared artifact for dependency '{dependency.name}' \"\n f\"declared at `Chart.yaml` in Helm chart at address: {address}\"\n )\n\n\nclass FirstPartyHelmChartMapping(FrozenDict[str, Address]):\n pass\n\n\n@rule\nasync def first_party_helm_chart_mapping(\n all_helm_chart_tgts: AllHelmChartTargets,\n) -> FirstPartyHelmChartMapping:\n charts_metadata = await MultiGet(\n Get(HelmChartMetadata, HelmChartMetaSourceField, tgt[HelmChartMetaSourceField])\n for tgt in all_helm_chart_tgts\n )\n\n name_addr_mapping: dict[str, Address] = {}\n duplicate_chart_names: OrderedSet[tuple[str, Address]] = OrderedSet()\n\n for meta, tgt in zip(charts_metadata, all_helm_chart_tgts):\n if meta.name in name_addr_mapping:\n duplicate_chart_names.add((meta.name, name_addr_mapping[meta.name]))\n duplicate_chart_names.add((meta.name, tgt.address))\n continue\n\n name_addr_mapping[meta.name] = tgt.address\n\n if duplicate_chart_names:\n raise DuplicateHelmChartNamesFound(duplicate_chart_names)\n\n return FirstPartyHelmChartMapping(name_addr_mapping)\n\n\n@dataclass(frozen=True)\nclass HelmChartDependenciesInferenceFieldSet(FieldSet):\n required_fields = (HelmChartMetaSourceField, HelmChartDependenciesField)\n\n source: HelmChartMetaSourceField\n dependencies: HelmChartDependenciesField\n\n\nclass InferHelmChartDependenciesRequest(InferDependenciesRequest):\n infer_from = HelmChartDependenciesInferenceFieldSet\n\n\ndef resolve_dependency_url(remotes: HelmRemotes, dependency: HelmChartDependency) -> str | None:\n if not dependency.repository:\n registry = remotes.default_registry\n if registry:\n return os.path.join(registry.address, dependency.name)\n return None\n else:\n return os.path.join(dependency.repository, dependency.name)\n\n\n@rule(desc=\"Inferring Helm chart dependencies\", level=LogLevel.DEBUG)\nasync def infer_chart_dependencies_via_metadata(\n request: InferHelmChartDependenciesRequest,\n first_party_mapping: FirstPartyHelmChartMapping,\n third_party_mapping: ThirdPartyHelmArtifactMapping,\n subsystem: HelmSubsystem,\n) -> InferredDependencies:\n address = request.field_set.address\n\n # Parse Chart.yaml for explicitly set dependencies.\n explicitly_provided_deps, metadata = await MultiGet(\n Get(ExplicitlyProvidedDependencies, DependenciesRequest(request.field_set.dependencies)),\n Get(HelmChartMetadata, HelmChartMetaSourceField, request.field_set.source),\n )\n\n remotes = subsystem.remotes()\n\n # Associate dependencies in Chart.yaml with addresses.\n dependencies: OrderedSet[Address] = OrderedSet()\n for chart_dep in metadata.dependencies:\n candidate_addrs = []\n\n first_party_dep = first_party_mapping.get(chart_dep.name)\n if first_party_dep:\n candidate_addrs.append(first_party_dep)\n\n dependency_url = resolve_dependency_url(remotes, chart_dep)\n third_party_dep = third_party_mapping.get(dependency_url) if dependency_url else None\n if third_party_dep:\n candidate_addrs.append(third_party_dep)\n\n if not candidate_addrs:\n raise UnknownHelmChartDependency(address, chart_dep)\n\n matches = frozenset(candidate_addrs).difference(explicitly_provided_deps.includes)\n\n explicitly_provided_deps.maybe_warn_of_ambiguous_dependency_inference(\n matches,\n address,\n context=f\"The Helm chart {address} declares `{chart_dep.name}` as dependency\",\n import_reference=\"helm dependency\",\n )\n\n maybe_disambiguated = explicitly_provided_deps.disambiguated(matches)\n if maybe_disambiguated:\n dependencies.add(maybe_disambiguated)\n\n logger.debug(\n f\"Inferred {pluralize(len(dependencies), 'dependency')} for target at address: {address}\"\n )\n return InferredDependencies(dependencies)\n\n\ndef rules():\n return [\n *collect_rules(),\n *artifacts.rules(),\n *helm_target_types_rules(),\n *chart_metadata.rules(),\n UnionRule(InferDependenciesRequest, InferHelmChartDependenciesRequest),\n ]\n", "path": "src/python/pants/backend/helm/dependency_inference/chart.py"}]}
| 2,136 | 528 |
gh_patches_debug_36271
|
rasdani/github-patches
|
git_diff
|
opsdroid__opsdroid-1835
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Upgrade redis database backend to use aioredis v2
Looks like v2 of `aioredis` is out and has breaking changes which affect us. In #1809 I've pinned us to v1 for now but we should upgrade things to work with v2.
Specifically importing the parser fails
https://github.com/opsdroid/opsdroid/blob/a45490d1bdceca39b49880e20262b55ea0be101d/opsdroid/database/redis/__init__.py#L6
```python-traceback
ImportError while importing test module '/home/runner/work/opsdroid/opsdroid/opsdroid/database/redis/tests/test_redis.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
/opt/hostedtoolcache/Python/3.7.11/x64/lib/python3.7/importlib/__init__.py:127: in import_module
return _bootstrap._gcd_import(name[level:], package, level)
opsdroid/database/redis/tests/test_redis.py:8: in <module>
from opsdroid.database.redis import RedisDatabase
opsdroid/database/redis/__init__.py:6: in <module>
from aioredis import parser
E ImportError: cannot import name 'parser' from 'aioredis' (/home/runner/work/opsdroid/opsdroid/.tox/py37-e2e/lib/python3.7/site-packages/aioredis/__init__.py)
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `opsdroid/database/redis/__init__.py`
Content:
```
1 """Module for storing data within Redis."""
2 import json
3 import logging
4
5 import aioredis
6 from aioredis import parser
7 from voluptuous import Any
8
9 from opsdroid.database import Database
10 from opsdroid.helper import JSONEncoder, JSONDecoder
11
12 _LOGGER = logging.getLogger(__name__)
13 CONFIG_SCHEMA = {"host": str, "port": Any(int, str), "database": int, "password": str}
14
15
16 class RedisDatabase(Database):
17 """Database class for storing data within a Redis instance."""
18
19 def __init__(self, config, opsdroid=None):
20 """Initialise the redis database.
21
22 Set basic properties of the database. Initialise properties like
23 name, connection arguments, database file, table name and config.
24
25 Args:
26 config (dict): The configuration of the database which consists
27 of `file` and `table` name of the sqlite database
28 specified in `configuration.yaml` file.
29 opsdroid (OpsDroid): An instance of opsdroid.core.
30
31 """
32 super().__init__(config, opsdroid=opsdroid)
33 self.config = config
34 self.client = None
35 self.host = self.config.get("host", "localhost")
36 self.port = self.config.get("port", 6379)
37 self.database = self.config.get("database", 0)
38 self.password = self.config.get("password", None)
39 _LOGGER.debug(_("Loaded Redis database connector."))
40
41 async def connect(self):
42 """Connect to the database.
43
44 This method will connect to a Redis database. By default it will
45 connect to Redis on localhost on port 6379
46
47 """
48 try:
49 self.client = await aioredis.create_pool(
50 address=(self.host, int(self.port)),
51 db=self.database,
52 password=self.password,
53 parser=parser.PyReader,
54 )
55
56 _LOGGER.info(
57 _("Connected to Redis database %s from %s on port %s."),
58 self.database,
59 self.host,
60 self.port,
61 )
62 except OSError:
63 _LOGGER.warning(
64 _("Unable to connect to Redis database on address: %s port: %s."),
65 self.host,
66 self.port,
67 )
68
69 async def put(self, key, data):
70 """Store the data object in Redis against the key.
71
72 Args:
73 key (string): The key to store the data object under.
74 data (object): The data object to store.
75
76 """
77 if self.client:
78 _LOGGER.debug(_("Putting %s into Redis."), key)
79 await self.client.execute("SET", key, json.dumps(data, cls=JSONEncoder))
80
81 async def get(self, key):
82 """Get data from Redis for a given key.
83
84 Args:
85 key (string): The key to lookup in the database.
86
87 Returns:
88 object or None: The data object stored for that key, or None if no
89 object found for that key.
90
91 """
92 if self.client:
93 _LOGGER.debug(_("Getting %s from Redis."), key)
94 data = await self.client.execute("GET", key)
95
96 if data:
97 return json.loads(data, object_hook=JSONDecoder())
98
99 return None
100
101 async def delete(self, key):
102 """Delete data from Redis for a given key.
103
104 Args:
105 key (string): The key to delete in the database.
106
107 """
108 if self.client:
109 _LOGGER.debug(_("Deleting %s from Redis."), key)
110 await self.client.execute("DEL", key)
111
112 async def disconnect(self):
113 """Disconnect from the database."""
114 if self.client:
115 self.client.close()
116
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/opsdroid/database/redis/__init__.py b/opsdroid/database/redis/__init__.py
--- a/opsdroid/database/redis/__init__.py
+++ b/opsdroid/database/redis/__init__.py
@@ -2,8 +2,7 @@
import json
import logging
-import aioredis
-from aioredis import parser
+from aioredis import Redis
from voluptuous import Any
from opsdroid.database import Database
@@ -46,12 +45,13 @@
"""
try:
- self.client = await aioredis.create_pool(
- address=(self.host, int(self.port)),
+ self.client = Redis(
+ host=self.host,
+ port=int(self.port),
db=self.database,
password=self.password,
- parser=parser.PyReader,
)
+ await self.client.ping() # to actually initiate a connection
_LOGGER.info(
_("Connected to Redis database %s from %s on port %s."),
@@ -76,7 +76,9 @@
"""
if self.client:
_LOGGER.debug(_("Putting %s into Redis."), key)
- await self.client.execute("SET", key, json.dumps(data, cls=JSONEncoder))
+ await self.client.execute_command(
+ "SET", key, json.dumps(data, cls=JSONEncoder)
+ )
async def get(self, key):
"""Get data from Redis for a given key.
@@ -91,7 +93,7 @@
"""
if self.client:
_LOGGER.debug(_("Getting %s from Redis."), key)
- data = await self.client.execute("GET", key)
+ data = await self.client.execute_command("GET", key)
if data:
return json.loads(data, object_hook=JSONDecoder())
@@ -107,9 +109,9 @@
"""
if self.client:
_LOGGER.debug(_("Deleting %s from Redis."), key)
- await self.client.execute("DEL", key)
+ await self.client.execute_command("DEL", key)
async def disconnect(self):
"""Disconnect from the database."""
if self.client:
- self.client.close()
+ await self.client.close()
|
{"golden_diff": "diff --git a/opsdroid/database/redis/__init__.py b/opsdroid/database/redis/__init__.py\n--- a/opsdroid/database/redis/__init__.py\n+++ b/opsdroid/database/redis/__init__.py\n@@ -2,8 +2,7 @@\n import json\n import logging\n \n-import aioredis\n-from aioredis import parser\n+from aioredis import Redis\n from voluptuous import Any\n \n from opsdroid.database import Database\n@@ -46,12 +45,13 @@\n \n \"\"\"\n try:\n- self.client = await aioredis.create_pool(\n- address=(self.host, int(self.port)),\n+ self.client = Redis(\n+ host=self.host,\n+ port=int(self.port),\n db=self.database,\n password=self.password,\n- parser=parser.PyReader,\n )\n+ await self.client.ping() # to actually initiate a connection\n \n _LOGGER.info(\n _(\"Connected to Redis database %s from %s on port %s.\"),\n@@ -76,7 +76,9 @@\n \"\"\"\n if self.client:\n _LOGGER.debug(_(\"Putting %s into Redis.\"), key)\n- await self.client.execute(\"SET\", key, json.dumps(data, cls=JSONEncoder))\n+ await self.client.execute_command(\n+ \"SET\", key, json.dumps(data, cls=JSONEncoder)\n+ )\n \n async def get(self, key):\n \"\"\"Get data from Redis for a given key.\n@@ -91,7 +93,7 @@\n \"\"\"\n if self.client:\n _LOGGER.debug(_(\"Getting %s from Redis.\"), key)\n- data = await self.client.execute(\"GET\", key)\n+ data = await self.client.execute_command(\"GET\", key)\n \n if data:\n return json.loads(data, object_hook=JSONDecoder())\n@@ -107,9 +109,9 @@\n \"\"\"\n if self.client:\n _LOGGER.debug(_(\"Deleting %s from Redis.\"), key)\n- await self.client.execute(\"DEL\", key)\n+ await self.client.execute_command(\"DEL\", key)\n \n async def disconnect(self):\n \"\"\"Disconnect from the database.\"\"\"\n if self.client:\n- self.client.close()\n+ await self.client.close()\n", "issue": "Upgrade redis database backend to use aioredis v2\nLooks like v2 of `aioredis` is out and has breaking changes which affect us. In #1809 I've pinned us to v1 for now but we should upgrade things to work with v2.\r\n\r\nSpecifically importing the parser fails\r\n\r\nhttps://github.com/opsdroid/opsdroid/blob/a45490d1bdceca39b49880e20262b55ea0be101d/opsdroid/database/redis/__init__.py#L6\r\n\r\n```python-traceback\r\n ImportError while importing test module '/home/runner/work/opsdroid/opsdroid/opsdroid/database/redis/tests/test_redis.py'.\r\n Hint: make sure your test modules/packages have valid Python names.\r\n Traceback:\r\n /opt/hostedtoolcache/Python/3.7.11/x64/lib/python3.7/importlib/__init__.py:127: in import_module\r\n return _bootstrap._gcd_import(name[level:], package, level)\r\n opsdroid/database/redis/tests/test_redis.py:8: in <module>\r\n from opsdroid.database.redis import RedisDatabase\r\n opsdroid/database/redis/__init__.py:6: in <module>\r\n from aioredis import parser\r\n E ImportError: cannot import name 'parser' from 'aioredis' (/home/runner/work/opsdroid/opsdroid/.tox/py37-e2e/lib/python3.7/site-packages/aioredis/__init__.py)\r\n```\n", "before_files": [{"content": "\"\"\"Module for storing data within Redis.\"\"\"\nimport json\nimport logging\n\nimport aioredis\nfrom aioredis import parser\nfrom voluptuous import Any\n\nfrom opsdroid.database import Database\nfrom opsdroid.helper import JSONEncoder, JSONDecoder\n\n_LOGGER = logging.getLogger(__name__)\nCONFIG_SCHEMA = {\"host\": str, \"port\": Any(int, str), \"database\": int, \"password\": str}\n\n\nclass RedisDatabase(Database):\n \"\"\"Database class for storing data within a Redis instance.\"\"\"\n\n def __init__(self, config, opsdroid=None):\n \"\"\"Initialise the redis database.\n\n Set basic properties of the database. Initialise properties like\n name, connection arguments, database file, table name and config.\n\n Args:\n config (dict): The configuration of the database which consists\n of `file` and `table` name of the sqlite database\n specified in `configuration.yaml` file.\n opsdroid (OpsDroid): An instance of opsdroid.core.\n\n \"\"\"\n super().__init__(config, opsdroid=opsdroid)\n self.config = config\n self.client = None\n self.host = self.config.get(\"host\", \"localhost\")\n self.port = self.config.get(\"port\", 6379)\n self.database = self.config.get(\"database\", 0)\n self.password = self.config.get(\"password\", None)\n _LOGGER.debug(_(\"Loaded Redis database connector.\"))\n\n async def connect(self):\n \"\"\"Connect to the database.\n\n This method will connect to a Redis database. By default it will\n connect to Redis on localhost on port 6379\n\n \"\"\"\n try:\n self.client = await aioredis.create_pool(\n address=(self.host, int(self.port)),\n db=self.database,\n password=self.password,\n parser=parser.PyReader,\n )\n\n _LOGGER.info(\n _(\"Connected to Redis database %s from %s on port %s.\"),\n self.database,\n self.host,\n self.port,\n )\n except OSError:\n _LOGGER.warning(\n _(\"Unable to connect to Redis database on address: %s port: %s.\"),\n self.host,\n self.port,\n )\n\n async def put(self, key, data):\n \"\"\"Store the data object in Redis against the key.\n\n Args:\n key (string): The key to store the data object under.\n data (object): The data object to store.\n\n \"\"\"\n if self.client:\n _LOGGER.debug(_(\"Putting %s into Redis.\"), key)\n await self.client.execute(\"SET\", key, json.dumps(data, cls=JSONEncoder))\n\n async def get(self, key):\n \"\"\"Get data from Redis for a given key.\n\n Args:\n key (string): The key to lookup in the database.\n\n Returns:\n object or None: The data object stored for that key, or None if no\n object found for that key.\n\n \"\"\"\n if self.client:\n _LOGGER.debug(_(\"Getting %s from Redis.\"), key)\n data = await self.client.execute(\"GET\", key)\n\n if data:\n return json.loads(data, object_hook=JSONDecoder())\n\n return None\n\n async def delete(self, key):\n \"\"\"Delete data from Redis for a given key.\n\n Args:\n key (string): The key to delete in the database.\n\n \"\"\"\n if self.client:\n _LOGGER.debug(_(\"Deleting %s from Redis.\"), key)\n await self.client.execute(\"DEL\", key)\n\n async def disconnect(self):\n \"\"\"Disconnect from the database.\"\"\"\n if self.client:\n self.client.close()\n", "path": "opsdroid/database/redis/__init__.py"}], "after_files": [{"content": "\"\"\"Module for storing data within Redis.\"\"\"\nimport json\nimport logging\n\nfrom aioredis import Redis\nfrom voluptuous import Any\n\nfrom opsdroid.database import Database\nfrom opsdroid.helper import JSONEncoder, JSONDecoder\n\n_LOGGER = logging.getLogger(__name__)\nCONFIG_SCHEMA = {\"host\": str, \"port\": Any(int, str), \"database\": int, \"password\": str}\n\n\nclass RedisDatabase(Database):\n \"\"\"Database class for storing data within a Redis instance.\"\"\"\n\n def __init__(self, config, opsdroid=None):\n \"\"\"Initialise the redis database.\n\n Set basic properties of the database. Initialise properties like\n name, connection arguments, database file, table name and config.\n\n Args:\n config (dict): The configuration of the database which consists\n of `file` and `table` name of the sqlite database\n specified in `configuration.yaml` file.\n opsdroid (OpsDroid): An instance of opsdroid.core.\n\n \"\"\"\n super().__init__(config, opsdroid=opsdroid)\n self.config = config\n self.client = None\n self.host = self.config.get(\"host\", \"localhost\")\n self.port = self.config.get(\"port\", 6379)\n self.database = self.config.get(\"database\", 0)\n self.password = self.config.get(\"password\", None)\n _LOGGER.debug(_(\"Loaded Redis database connector.\"))\n\n async def connect(self):\n \"\"\"Connect to the database.\n\n This method will connect to a Redis database. By default it will\n connect to Redis on localhost on port 6379\n\n \"\"\"\n try:\n self.client = Redis(\n host=self.host,\n port=int(self.port),\n db=self.database,\n password=self.password,\n )\n await self.client.ping() # to actually initiate a connection\n\n _LOGGER.info(\n _(\"Connected to Redis database %s from %s on port %s.\"),\n self.database,\n self.host,\n self.port,\n )\n except OSError:\n _LOGGER.warning(\n _(\"Unable to connect to Redis database on address: %s port: %s.\"),\n self.host,\n self.port,\n )\n\n async def put(self, key, data):\n \"\"\"Store the data object in Redis against the key.\n\n Args:\n key (string): The key to store the data object under.\n data (object): The data object to store.\n\n \"\"\"\n if self.client:\n _LOGGER.debug(_(\"Putting %s into Redis.\"), key)\n await self.client.execute_command(\n \"SET\", key, json.dumps(data, cls=JSONEncoder)\n )\n\n async def get(self, key):\n \"\"\"Get data from Redis for a given key.\n\n Args:\n key (string): The key to lookup in the database.\n\n Returns:\n object or None: The data object stored for that key, or None if no\n object found for that key.\n\n \"\"\"\n if self.client:\n _LOGGER.debug(_(\"Getting %s from Redis.\"), key)\n data = await self.client.execute_command(\"GET\", key)\n\n if data:\n return json.loads(data, object_hook=JSONDecoder())\n\n return None\n\n async def delete(self, key):\n \"\"\"Delete data from Redis for a given key.\n\n Args:\n key (string): The key to delete in the database.\n\n \"\"\"\n if self.client:\n _LOGGER.debug(_(\"Deleting %s from Redis.\"), key)\n await self.client.execute_command(\"DEL\", key)\n\n async def disconnect(self):\n \"\"\"Disconnect from the database.\"\"\"\n if self.client:\n await self.client.close()\n", "path": "opsdroid/database/redis/__init__.py"}]}
| 1,633 | 490 |
gh_patches_debug_28142
|
rasdani/github-patches
|
git_diff
|
pypa__pip-8617
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Fail legacy install if .egg-info directory was not created
**What's the problem this feature will solve?**
Currently on the legacy install code path ([here](https://github.com/pypa/pip/blob/e0239735a3d8a504e00133090dde2210b6564a38/src/pip/_internal/req/req_install.py#L997-L1004)) if execution of `setup.py install ...` didn't result in an `.egg-info` directory being created we trace a warning and continue.
This is bad because we will have installed an uninstallable package and the only penalty is a small warning which is very easy to miss.
**Describe the solution you'd like**
Instead of tracing a warning we should throw an exception.
This prevents additional packages from being installed and possibly hiding the warning from users.
**Alternative Solutions**
Continue tracing the warning as before.
**Additional context**
* [this support query](https://kite.trade/forum/discussion/4335/could-not-find-egg-info-directory-in-install-record-for-kiteconnect) for [this package](https://github.com/zerodhatech/pykiteconnect/blob/7af60a776fbfafb6c30cfbd8d4f6c192618c5af5/setup.py#L48) which does an egg install in its setup.py regardless of setuptools version.
* [this query](https://stackoverflow.com/questions/26091641/what-does-a-could-not-find-egg-info-directory-in-install-record-from-pip-mean) which implicates having an older version of setuptools
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/pip/_internal/operations/install/legacy.py`
Content:
```
1 """Legacy installation process, i.e. `setup.py install`.
2 """
3
4 import logging
5 import os
6 import sys
7 from distutils.util import change_root
8
9 from pip._internal.utils.deprecation import deprecated
10 from pip._internal.utils.logging import indent_log
11 from pip._internal.utils.misc import ensure_dir
12 from pip._internal.utils.setuptools_build import make_setuptools_install_args
13 from pip._internal.utils.subprocess import runner_with_spinner_message
14 from pip._internal.utils.temp_dir import TempDirectory
15 from pip._internal.utils.typing import MYPY_CHECK_RUNNING
16
17 if MYPY_CHECK_RUNNING:
18 from typing import List, Optional, Sequence
19
20 from pip._internal.build_env import BuildEnvironment
21 from pip._internal.models.scheme import Scheme
22
23
24 logger = logging.getLogger(__name__)
25
26
27 class LegacyInstallFailure(Exception):
28 def __init__(self):
29 # type: () -> None
30 self.parent = sys.exc_info()
31
32
33 def install(
34 install_options, # type: List[str]
35 global_options, # type: Sequence[str]
36 root, # type: Optional[str]
37 home, # type: Optional[str]
38 prefix, # type: Optional[str]
39 use_user_site, # type: bool
40 pycompile, # type: bool
41 scheme, # type: Scheme
42 setup_py_path, # type: str
43 isolated, # type: bool
44 req_name, # type: str
45 build_env, # type: BuildEnvironment
46 unpacked_source_directory, # type: str
47 req_description, # type: str
48 ):
49 # type: (...) -> bool
50
51 header_dir = scheme.headers
52
53 with TempDirectory(kind="record") as temp_dir:
54 try:
55 record_filename = os.path.join(temp_dir.path, 'install-record.txt')
56 install_args = make_setuptools_install_args(
57 setup_py_path,
58 global_options=global_options,
59 install_options=install_options,
60 record_filename=record_filename,
61 root=root,
62 prefix=prefix,
63 header_dir=header_dir,
64 home=home,
65 use_user_site=use_user_site,
66 no_user_config=isolated,
67 pycompile=pycompile,
68 )
69
70 runner = runner_with_spinner_message(
71 "Running setup.py install for {}".format(req_name)
72 )
73 with indent_log(), build_env:
74 runner(
75 cmd=install_args,
76 cwd=unpacked_source_directory,
77 )
78
79 if not os.path.exists(record_filename):
80 logger.debug('Record file %s not found', record_filename)
81 # Signal to the caller that we didn't install the new package
82 return False
83
84 except Exception:
85 # Signal to the caller that we didn't install the new package
86 raise LegacyInstallFailure
87
88 # At this point, we have successfully installed the requirement.
89
90 # We intentionally do not use any encoding to read the file because
91 # setuptools writes the file using distutils.file_util.write_file,
92 # which does not specify an encoding.
93 with open(record_filename) as f:
94 record_lines = f.read().splitlines()
95
96 def prepend_root(path):
97 # type: (str) -> str
98 if root is None or not os.path.isabs(path):
99 return path
100 else:
101 return change_root(root, path)
102
103 for line in record_lines:
104 directory = os.path.dirname(line)
105 if directory.endswith('.egg-info'):
106 egg_info_dir = prepend_root(directory)
107 break
108 else:
109 deprecated(
110 reason=(
111 "{} did not indicate that it installed an "
112 ".egg-info directory. Only setup.py projects "
113 "generating .egg-info directories are supported."
114 ).format(req_description),
115 replacement=(
116 "for maintainers: updating the setup.py of {0}. "
117 "For users: contact the maintainers of {0} to let "
118 "them know to update their setup.py.".format(
119 req_name
120 )
121 ),
122 gone_in="20.2",
123 issue=6998,
124 )
125 # FIXME: put the record somewhere
126 return True
127
128 new_lines = []
129 for line in record_lines:
130 filename = line.strip()
131 if os.path.isdir(filename):
132 filename += os.path.sep
133 new_lines.append(
134 os.path.relpath(prepend_root(filename), egg_info_dir)
135 )
136 new_lines.sort()
137 ensure_dir(egg_info_dir)
138 inst_files_path = os.path.join(egg_info_dir, 'installed-files.txt')
139 with open(inst_files_path, 'w') as f:
140 f.write('\n'.join(new_lines) + '\n')
141
142 return True
143
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/src/pip/_internal/operations/install/legacy.py b/src/pip/_internal/operations/install/legacy.py
--- a/src/pip/_internal/operations/install/legacy.py
+++ b/src/pip/_internal/operations/install/legacy.py
@@ -6,7 +6,7 @@
import sys
from distutils.util import change_root
-from pip._internal.utils.deprecation import deprecated
+from pip._internal.exceptions import InstallationError
from pip._internal.utils.logging import indent_log
from pip._internal.utils.misc import ensure_dir
from pip._internal.utils.setuptools_build import make_setuptools_install_args
@@ -106,24 +106,12 @@
egg_info_dir = prepend_root(directory)
break
else:
- deprecated(
- reason=(
- "{} did not indicate that it installed an "
- ".egg-info directory. Only setup.py projects "
- "generating .egg-info directories are supported."
- ).format(req_description),
- replacement=(
- "for maintainers: updating the setup.py of {0}. "
- "For users: contact the maintainers of {0} to let "
- "them know to update their setup.py.".format(
- req_name
- )
- ),
- gone_in="20.2",
- issue=6998,
- )
- # FIXME: put the record somewhere
- return True
+ message = (
+ "{} did not indicate that it installed an "
+ ".egg-info directory. Only setup.py projects "
+ "generating .egg-info directories are supported."
+ ).format(req_description)
+ raise InstallationError(message)
new_lines = []
for line in record_lines:
|
{"golden_diff": "diff --git a/src/pip/_internal/operations/install/legacy.py b/src/pip/_internal/operations/install/legacy.py\n--- a/src/pip/_internal/operations/install/legacy.py\n+++ b/src/pip/_internal/operations/install/legacy.py\n@@ -6,7 +6,7 @@\n import sys\n from distutils.util import change_root\n \n-from pip._internal.utils.deprecation import deprecated\n+from pip._internal.exceptions import InstallationError\n from pip._internal.utils.logging import indent_log\n from pip._internal.utils.misc import ensure_dir\n from pip._internal.utils.setuptools_build import make_setuptools_install_args\n@@ -106,24 +106,12 @@\n egg_info_dir = prepend_root(directory)\n break\n else:\n- deprecated(\n- reason=(\n- \"{} did not indicate that it installed an \"\n- \".egg-info directory. Only setup.py projects \"\n- \"generating .egg-info directories are supported.\"\n- ).format(req_description),\n- replacement=(\n- \"for maintainers: updating the setup.py of {0}. \"\n- \"For users: contact the maintainers of {0} to let \"\n- \"them know to update their setup.py.\".format(\n- req_name\n- )\n- ),\n- gone_in=\"20.2\",\n- issue=6998,\n- )\n- # FIXME: put the record somewhere\n- return True\n+ message = (\n+ \"{} did not indicate that it installed an \"\n+ \".egg-info directory. Only setup.py projects \"\n+ \"generating .egg-info directories are supported.\"\n+ ).format(req_description)\n+ raise InstallationError(message)\n \n new_lines = []\n for line in record_lines:\n", "issue": "Fail legacy install if .egg-info directory was not created\n**What's the problem this feature will solve?**\r\n\r\nCurrently on the legacy install code path ([here](https://github.com/pypa/pip/blob/e0239735a3d8a504e00133090dde2210b6564a38/src/pip/_internal/req/req_install.py#L997-L1004)) if execution of `setup.py install ...` didn't result in an `.egg-info` directory being created we trace a warning and continue.\r\n\r\nThis is bad because we will have installed an uninstallable package and the only penalty is a small warning which is very easy to miss.\r\n\r\n**Describe the solution you'd like**\r\n\r\nInstead of tracing a warning we should throw an exception.\r\n\r\nThis prevents additional packages from being installed and possibly hiding the warning from users.\r\n\r\n**Alternative Solutions**\r\n\r\nContinue tracing the warning as before.\r\n\r\n**Additional context**\r\n\r\n* [this support query](https://kite.trade/forum/discussion/4335/could-not-find-egg-info-directory-in-install-record-for-kiteconnect) for [this package](https://github.com/zerodhatech/pykiteconnect/blob/7af60a776fbfafb6c30cfbd8d4f6c192618c5af5/setup.py#L48) which does an egg install in its setup.py regardless of setuptools version.\r\n* [this query](https://stackoverflow.com/questions/26091641/what-does-a-could-not-find-egg-info-directory-in-install-record-from-pip-mean) which implicates having an older version of setuptools\n", "before_files": [{"content": "\"\"\"Legacy installation process, i.e. `setup.py install`.\n\"\"\"\n\nimport logging\nimport os\nimport sys\nfrom distutils.util import change_root\n\nfrom pip._internal.utils.deprecation import deprecated\nfrom pip._internal.utils.logging import indent_log\nfrom pip._internal.utils.misc import ensure_dir\nfrom pip._internal.utils.setuptools_build import make_setuptools_install_args\nfrom pip._internal.utils.subprocess import runner_with_spinner_message\nfrom pip._internal.utils.temp_dir import TempDirectory\nfrom pip._internal.utils.typing import MYPY_CHECK_RUNNING\n\nif MYPY_CHECK_RUNNING:\n from typing import List, Optional, Sequence\n\n from pip._internal.build_env import BuildEnvironment\n from pip._internal.models.scheme import Scheme\n\n\nlogger = logging.getLogger(__name__)\n\n\nclass LegacyInstallFailure(Exception):\n def __init__(self):\n # type: () -> None\n self.parent = sys.exc_info()\n\n\ndef install(\n install_options, # type: List[str]\n global_options, # type: Sequence[str]\n root, # type: Optional[str]\n home, # type: Optional[str]\n prefix, # type: Optional[str]\n use_user_site, # type: bool\n pycompile, # type: bool\n scheme, # type: Scheme\n setup_py_path, # type: str\n isolated, # type: bool\n req_name, # type: str\n build_env, # type: BuildEnvironment\n unpacked_source_directory, # type: str\n req_description, # type: str\n):\n # type: (...) -> bool\n\n header_dir = scheme.headers\n\n with TempDirectory(kind=\"record\") as temp_dir:\n try:\n record_filename = os.path.join(temp_dir.path, 'install-record.txt')\n install_args = make_setuptools_install_args(\n setup_py_path,\n global_options=global_options,\n install_options=install_options,\n record_filename=record_filename,\n root=root,\n prefix=prefix,\n header_dir=header_dir,\n home=home,\n use_user_site=use_user_site,\n no_user_config=isolated,\n pycompile=pycompile,\n )\n\n runner = runner_with_spinner_message(\n \"Running setup.py install for {}\".format(req_name)\n )\n with indent_log(), build_env:\n runner(\n cmd=install_args,\n cwd=unpacked_source_directory,\n )\n\n if not os.path.exists(record_filename):\n logger.debug('Record file %s not found', record_filename)\n # Signal to the caller that we didn't install the new package\n return False\n\n except Exception:\n # Signal to the caller that we didn't install the new package\n raise LegacyInstallFailure\n\n # At this point, we have successfully installed the requirement.\n\n # We intentionally do not use any encoding to read the file because\n # setuptools writes the file using distutils.file_util.write_file,\n # which does not specify an encoding.\n with open(record_filename) as f:\n record_lines = f.read().splitlines()\n\n def prepend_root(path):\n # type: (str) -> str\n if root is None or not os.path.isabs(path):\n return path\n else:\n return change_root(root, path)\n\n for line in record_lines:\n directory = os.path.dirname(line)\n if directory.endswith('.egg-info'):\n egg_info_dir = prepend_root(directory)\n break\n else:\n deprecated(\n reason=(\n \"{} did not indicate that it installed an \"\n \".egg-info directory. Only setup.py projects \"\n \"generating .egg-info directories are supported.\"\n ).format(req_description),\n replacement=(\n \"for maintainers: updating the setup.py of {0}. \"\n \"For users: contact the maintainers of {0} to let \"\n \"them know to update their setup.py.\".format(\n req_name\n )\n ),\n gone_in=\"20.2\",\n issue=6998,\n )\n # FIXME: put the record somewhere\n return True\n\n new_lines = []\n for line in record_lines:\n filename = line.strip()\n if os.path.isdir(filename):\n filename += os.path.sep\n new_lines.append(\n os.path.relpath(prepend_root(filename), egg_info_dir)\n )\n new_lines.sort()\n ensure_dir(egg_info_dir)\n inst_files_path = os.path.join(egg_info_dir, 'installed-files.txt')\n with open(inst_files_path, 'w') as f:\n f.write('\\n'.join(new_lines) + '\\n')\n\n return True\n", "path": "src/pip/_internal/operations/install/legacy.py"}], "after_files": [{"content": "\"\"\"Legacy installation process, i.e. `setup.py install`.\n\"\"\"\n\nimport logging\nimport os\nimport sys\nfrom distutils.util import change_root\n\nfrom pip._internal.exceptions import InstallationError\nfrom pip._internal.utils.logging import indent_log\nfrom pip._internal.utils.misc import ensure_dir\nfrom pip._internal.utils.setuptools_build import make_setuptools_install_args\nfrom pip._internal.utils.subprocess import runner_with_spinner_message\nfrom pip._internal.utils.temp_dir import TempDirectory\nfrom pip._internal.utils.typing import MYPY_CHECK_RUNNING\n\nif MYPY_CHECK_RUNNING:\n from typing import List, Optional, Sequence\n\n from pip._internal.build_env import BuildEnvironment\n from pip._internal.models.scheme import Scheme\n\n\nlogger = logging.getLogger(__name__)\n\n\nclass LegacyInstallFailure(Exception):\n def __init__(self):\n # type: () -> None\n self.parent = sys.exc_info()\n\n\ndef install(\n install_options, # type: List[str]\n global_options, # type: Sequence[str]\n root, # type: Optional[str]\n home, # type: Optional[str]\n prefix, # type: Optional[str]\n use_user_site, # type: bool\n pycompile, # type: bool\n scheme, # type: Scheme\n setup_py_path, # type: str\n isolated, # type: bool\n req_name, # type: str\n build_env, # type: BuildEnvironment\n unpacked_source_directory, # type: str\n req_description, # type: str\n):\n # type: (...) -> bool\n\n header_dir = scheme.headers\n\n with TempDirectory(kind=\"record\") as temp_dir:\n try:\n record_filename = os.path.join(temp_dir.path, 'install-record.txt')\n install_args = make_setuptools_install_args(\n setup_py_path,\n global_options=global_options,\n install_options=install_options,\n record_filename=record_filename,\n root=root,\n prefix=prefix,\n header_dir=header_dir,\n home=home,\n use_user_site=use_user_site,\n no_user_config=isolated,\n pycompile=pycompile,\n )\n\n runner = runner_with_spinner_message(\n \"Running setup.py install for {}\".format(req_name)\n )\n with indent_log(), build_env:\n runner(\n cmd=install_args,\n cwd=unpacked_source_directory,\n )\n\n if not os.path.exists(record_filename):\n logger.debug('Record file %s not found', record_filename)\n # Signal to the caller that we didn't install the new package\n return False\n\n except Exception:\n # Signal to the caller that we didn't install the new package\n raise LegacyInstallFailure\n\n # At this point, we have successfully installed the requirement.\n\n # We intentionally do not use any encoding to read the file because\n # setuptools writes the file using distutils.file_util.write_file,\n # which does not specify an encoding.\n with open(record_filename) as f:\n record_lines = f.read().splitlines()\n\n def prepend_root(path):\n # type: (str) -> str\n if root is None or not os.path.isabs(path):\n return path\n else:\n return change_root(root, path)\n\n for line in record_lines:\n directory = os.path.dirname(line)\n if directory.endswith('.egg-info'):\n egg_info_dir = prepend_root(directory)\n break\n else:\n message = (\n \"{} did not indicate that it installed an \"\n \".egg-info directory. Only setup.py projects \"\n \"generating .egg-info directories are supported.\"\n ).format(req_description)\n raise InstallationError(message)\n\n new_lines = []\n for line in record_lines:\n filename = line.strip()\n if os.path.isdir(filename):\n filename += os.path.sep\n new_lines.append(\n os.path.relpath(prepend_root(filename), egg_info_dir)\n )\n new_lines.sort()\n ensure_dir(egg_info_dir)\n inst_files_path = os.path.join(egg_info_dir, 'installed-files.txt')\n with open(inst_files_path, 'w') as f:\n f.write('\\n'.join(new_lines) + '\\n')\n\n return True\n", "path": "src/pip/_internal/operations/install/legacy.py"}]}
| 1,957 | 378 |
gh_patches_debug_18128
|
rasdani/github-patches
|
git_diff
|
Lightning-AI__pytorch-lightning-1377
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Tensorboard logger error: lightning_logs directory not exists in multi-node DDP on nodes with rank != 0
## 🐛 Bug
In multi-node DDP train mode on all nodes except rank 0 errors appears at the start of the training caused by accessing lightning_logs directory in tensorboard logger which is not exist at the moment.
### To Reproduce
Steps to reproduce the behavior:
1. setup multi-node cluster (without SLURM)
2. set environment variables on each node:
```
export MASTER_ADDR=<rank 0 node IP>
export MASTER_PORT=23456
export RANK=<node id>
export SLURM_NODEID=<node id>
export WORLD_SIZE=<world-size>
```
3. install dependencies:
```
pip install torch torchvision hydra-core pytorch-lightning
```
4. copy app.y and conf.yaml to each node
5. run script on each node
```
python app.py
```
6. see the error:
```
Exception:
-- Process 0 terminated with the following error:
Traceback (most recent call last):
File "/home/ubuntu/anaconda3/envs/nightly_pt/lib/python3.6/site-packages/torch/multiprocessing/spawn.py", line 19, in _wrap
fn(i, *args)
File "/home/ubuntu/anaconda3/envs/nightly_pt/lib/python3.6/site-packages/pytorch_lightning/trainer/distrib_data_parallel.py", line 342, in ddp_train
self.run_pretrain_routine(model)
File "/home/ubuntu/anaconda3/envs/nightly_pt/lib/python3.6/site-packages/pytorch_lightning/trainer/trainer.py", line 777, in run_pretrain_routine
self.configure_checkpoint_callback()
File "/home/ubuntu/anaconda3/envs/nightly_pt/lib/python3.6/site-packages/pytorch_lightning/trainer/callback_config.py", line 45, in configure_checkpoint_callback
f'version_{self.logger.version}',
File "/home/ubuntu/anaconda3/envs/nightly_pt/lib/python3.6/site-packages/pytorch_lightning/loggers/tensorboard.py", line 161, in version
self._version = self._get_next_version()
File "/home/ubuntu/anaconda3/envs/nightly_pt/lib/python3.6/site-packages/pytorch_lightning/loggers/tensorboard.py", line 167, in _get_next_version
for d in os.listdir(root_dir):
FileNotFoundError: [Errno 2] No such file or directory: '/home/ubuntu/pytorch-lightning-intro-guide/outputs/2020-04-04/15-53-26/lightning_logs'
```
#### Code sample
app.py:
```
import pathlib
import hydra
import pytorch_lightning as pl
import torch
from omegaconf import OmegaConf
from torch.nn import functional as F
from torch.optim import Adam
from torch.utils.data import DataLoader, random_split
from torchvision import datasets, transforms
class LitMNIST(pl.LightningModule):
def __init__(self):
super().__init__()
self.layer_1 = torch.nn.Linear(28 * 28, 128)
self.layer_2 = torch.nn.Linear(128, 256)
self.layer_3 = torch.nn.Linear(256, 10)
self.train_dataset = None
self.val_dataset = None
self.test_dataset = None
def forward(self, x):
batch_size, channels, width, height = x.size()
x = x.view(batch_size, -1)
x = self.layer_1(x)
x = F.relu(x)
x = self.layer_2(x)
x = F.relu(x)
x = self.layer_3(x)
x = F.log_softmax(x, dim=1)
return x
def prepare_data(self):
# transform
transform = transforms.Compose(
[transforms.ToTensor(), transforms.Normalize((0.1307,), (0.3081,))])
# download
data_dir = pathlib.Path.home() / 'data'
mnist_train = datasets.MNIST(data_dir, train=True,
download=True, transform=transform)
mnist_test = datasets.MNIST(data_dir, train=False,
download=True, transform=transform)
# train/val split
mnist_train, mnist_val = random_split(mnist_train, [55000, 5000])
# assign to use in dataloaders
self.train_dataset = mnist_train
self.val_dataset = mnist_val
self.test_dataset = mnist_test
def train_dataloader(self):
return DataLoader(self.train_dataset, batch_size=64)
def val_dataloader(self):
return DataLoader(self.val_dataset, batch_size=64)
def test_dataloader(self):
return DataLoader(self.test_dataset, batch_size=64)
def configure_optimizers(self):
return Adam(self.parameters(), lr=1e-3)
def training_step(self, batch, batch_idx):
x, y = batch
logits = self(x)
loss = F.nll_loss(logits, y)
# add logging
logs = {'loss': loss}
return {'loss': loss, 'log': logs}
def validation_step(self, batch, batch_idx):
x, y = batch
logits = self(x)
loss = F.nll_loss(logits, y)
return {'val_loss': loss}
def validation_epoch_end(self, outputs):
avg_loss = torch.stack( # pylint: disable=no-member
[x['val_loss'] for x in outputs]).mean()
tensorboard_logs = {'val_loss': avg_loss}
return {'avg_val_loss': avg_loss, 'log': tensorboard_logs}
def test_step(self, batch, batch_idx):
x, y = batch
logits = self(x)
loss = F.nll_loss(logits, y)
return {'val_loss': loss}
def test_epoch_end(self, outputs):
avg_loss = torch.stack( # pylint: disable=no-member
[x['val_loss'] for x in outputs]).mean()
tensorboard_logs = {'val_loss': avg_loss}
return {'avg_val_loss': avg_loss, 'log': tensorboard_logs}
def init_ddp_connection(self, proc_rank: int, world_size: int) -> None:
torch.distributed.init_process_group(
'nccl', rank=proc_rank, world_size=world_size)
@hydra.main(config_path='conf.yaml')
def main(conf: OmegaConf):
model = LitMNIST()
trainer = pl.Trainer(gpus=conf.gpus,
num_nodes=conf.num_nodes,
distributed_backend=conf.distributed_backend,
max_epochs=3)
trainer.fit(model)
if __name__ == '__main__':
main() # pylint: disable=no-value-for-parameter
```
conf.yaml:
```
gpus: 1
num_nodes: 2
distributed_backend: ddp
```
### Expected behavior
Train should go without error
### Environment
```
cuda:
GPU:
Tesla K80
Tesla K80
Tesla K80
Tesla K80
Tesla K80
Tesla K80
Tesla K80
Tesla K80
available: True
version: 10.1
packages:
numpy: 1.18.1
pyTorch_debug: False
pyTorch_version: 1.4.0
pytorch-lightning: 0.7.1
tensorboard: 2.2.0
tqdm: 4.45.0
system:
OS: Linux
architecture:
64bit
processor: x86_64
python: 3.6.10
version: #113-Ubuntu SMP Wed Jan 29 14:54:54 UTC 2020
```
### Additional context
<!-- Add any other context about the problem here. -->
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pytorch_lightning/loggers/tensorboard.py`
Content:
```
1 import csv
2 import os
3 from argparse import Namespace
4 from typing import Optional, Dict, Union, Any
5 from warnings import warn
6
7 import torch
8 from pkg_resources import parse_version
9 from torch.utils.tensorboard import SummaryWriter
10
11 from pytorch_lightning.loggers.base import LightningLoggerBase, rank_zero_only
12
13
14 class TensorBoardLogger(LightningLoggerBase):
15 r"""
16
17 Log to local file system in TensorBoard format
18
19 Implemented using :class:`torch.utils.tensorboard.SummaryWriter`. Logs are saved to
20 ``os.path.join(save_dir, name, version)``
21
22 Example:
23 .. code-block:: python
24
25 logger = TensorBoardLogger("tb_logs", name="my_model")
26 trainer = Trainer(logger=logger)
27 trainer.train(model)
28
29 Args:
30 save_dir: Save directory
31 name: Experiment name. Defaults to "default". If it is the empty string then no per-experiment
32 subdirectory is used.
33 version: Experiment version. If version is not specified the logger inspects the save
34 directory for existing versions, then automatically assigns the next available version.
35 If it is a string then it is used as the run-specific subdirectory name,
36 otherwise version_${version} is used.
37 \**kwargs: Other arguments are passed directly to the :class:`SummaryWriter` constructor.
38
39 """
40 NAME_CSV_TAGS = 'meta_tags.csv'
41
42 def __init__(
43 self, save_dir: str, name: Optional[str] = "default",
44 version: Optional[Union[int, str]] = None, **kwargs
45 ):
46 super().__init__()
47 self.save_dir = save_dir
48 self._name = name
49 self._version = version
50
51 self._experiment = None
52 self.tags = {}
53 self.kwargs = kwargs
54
55 @property
56 def root_dir(self) -> str:
57 """
58 Parent directory for all tensorboard checkpoint subdirectories.
59 If the experiment name parameter is None or the empty string, no experiment subdirectory is used
60 and checkpoint will be saved in save_dir/version_dir
61 """
62 if self.name is None or len(self.name) == 0:
63 return self.save_dir
64 else:
65 return os.path.join(self.save_dir, self.name)
66
67 @property
68 def log_dir(self) -> str:
69 """
70 The directory for this run's tensorboard checkpoint. By default, it is named 'version_${self.version}'
71 but it can be overridden by passing a string value for the constructor's version parameter
72 instead of None or an int
73 """
74 # create a pseudo standard path ala test-tube
75 version = self.version if isinstance(self.version, str) else f"version_{self.version}"
76 log_dir = os.path.join(self.root_dir, version)
77 return log_dir
78
79 @property
80 def experiment(self) -> SummaryWriter:
81 r"""
82
83 Actual tensorboard object. To use tensorboard features do the following.
84
85 Example::
86
87 self.logger.experiment.some_tensorboard_function()
88
89 """
90 if self._experiment is not None:
91 return self._experiment
92
93 os.makedirs(self.root_dir, exist_ok=True)
94 self._experiment = SummaryWriter(log_dir=self.log_dir, **self.kwargs)
95 return self._experiment
96
97 @rank_zero_only
98 def log_hyperparams(self, params: Union[Dict[str, Any], Namespace]) -> None:
99 params = self._convert_params(params)
100 params = self._flatten_dict(params)
101 sanitized_params = self._sanitize_params(params)
102
103 if parse_version(torch.__version__) < parse_version("1.3.0"):
104 warn(
105 f"Hyperparameter logging is not available for Torch version {torch.__version__}."
106 " Skipping log_hyperparams. Upgrade to Torch 1.3.0 or above to enable"
107 " hyperparameter logging."
108 )
109 else:
110 from torch.utils.tensorboard.summary import hparams
111 exp, ssi, sei = hparams(sanitized_params, {})
112 writer = self.experiment._get_file_writer()
113 writer.add_summary(exp)
114 writer.add_summary(ssi)
115 writer.add_summary(sei)
116
117 # some alternative should be added
118 self.tags.update(sanitized_params)
119
120 @rank_zero_only
121 def log_metrics(self, metrics: Dict[str, float], step: Optional[int] = None) -> None:
122 for k, v in metrics.items():
123 if isinstance(v, torch.Tensor):
124 v = v.item()
125 self.experiment.add_scalar(k, v, step)
126
127 @rank_zero_only
128 def save(self) -> None:
129 try:
130 self.experiment.flush()
131 except AttributeError:
132 # you are using PT version (<v1.2) which does not have implemented flush
133 self.experiment._get_file_writer().flush()
134
135 dir_path = self.log_dir
136 if not os.path.isdir(dir_path):
137 dir_path = self.save_dir
138
139 # prepare the file path
140 meta_tags_path = os.path.join(dir_path, self.NAME_CSV_TAGS)
141
142 # save the metatags file
143 with open(meta_tags_path, 'w', newline='') as csvfile:
144 fieldnames = ['key', 'value']
145 writer = csv.DictWriter(csvfile, fieldnames=fieldnames)
146 writer.writerow({'key': 'key', 'value': 'value'})
147 for k, v in self.tags.items():
148 writer.writerow({'key': k, 'value': v})
149
150 @rank_zero_only
151 def finalize(self, status: str) -> None:
152 self.save()
153
154 @property
155 def name(self) -> str:
156 return self._name
157
158 @property
159 def version(self) -> int:
160 if self._version is None:
161 self._version = self._get_next_version()
162 return self._version
163
164 def _get_next_version(self):
165 root_dir = os.path.join(self.save_dir, self.name)
166 existing_versions = []
167 for d in os.listdir(root_dir):
168 if os.path.isdir(os.path.join(root_dir, d)) and d.startswith("version_"):
169 existing_versions.append(int(d.split("_")[1]))
170
171 if len(existing_versions) == 0:
172 return 0
173
174 return max(existing_versions) + 1
175
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/pytorch_lightning/loggers/tensorboard.py b/pytorch_lightning/loggers/tensorboard.py
--- a/pytorch_lightning/loggers/tensorboard.py
+++ b/pytorch_lightning/loggers/tensorboard.py
@@ -9,6 +9,7 @@
from torch.utils.tensorboard import SummaryWriter
from pytorch_lightning.loggers.base import LightningLoggerBase, rank_zero_only
+from pytorch_lightning import _logger as log
class TensorBoardLogger(LightningLoggerBase):
@@ -163,6 +164,11 @@
def _get_next_version(self):
root_dir = os.path.join(self.save_dir, self.name)
+
+ if not os.path.isdir(root_dir):
+ log.warning('Missing logger folder: %s', root_dir)
+ return 0
+
existing_versions = []
for d in os.listdir(root_dir):
if os.path.isdir(os.path.join(root_dir, d)) and d.startswith("version_"):
|
{"golden_diff": "diff --git a/pytorch_lightning/loggers/tensorboard.py b/pytorch_lightning/loggers/tensorboard.py\n--- a/pytorch_lightning/loggers/tensorboard.py\n+++ b/pytorch_lightning/loggers/tensorboard.py\n@@ -9,6 +9,7 @@\n from torch.utils.tensorboard import SummaryWriter\n \n from pytorch_lightning.loggers.base import LightningLoggerBase, rank_zero_only\n+from pytorch_lightning import _logger as log\n \n \n class TensorBoardLogger(LightningLoggerBase):\n@@ -163,6 +164,11 @@\n \n def _get_next_version(self):\n root_dir = os.path.join(self.save_dir, self.name)\n+\n+ if not os.path.isdir(root_dir):\n+ log.warning('Missing logger folder: %s', root_dir)\n+ return 0\n+\n existing_versions = []\n for d in os.listdir(root_dir):\n if os.path.isdir(os.path.join(root_dir, d)) and d.startswith(\"version_\"):\n", "issue": "Tensorboard logger error: lightning_logs directory not exists in multi-node DDP on nodes with rank != 0\n## \ud83d\udc1b Bug\r\n\r\nIn multi-node DDP train mode on all nodes except rank 0 errors appears at the start of the training caused by accessing lightning_logs directory in tensorboard logger which is not exist at the moment.\r\n\r\n### To Reproduce\r\n\r\nSteps to reproduce the behavior:\r\n\r\n1. setup multi-node cluster (without SLURM)\r\n2. set environment variables on each node:\r\n```\r\nexport MASTER_ADDR=<rank 0 node IP>\r\nexport MASTER_PORT=23456\r\nexport RANK=<node id>\r\nexport SLURM_NODEID=<node id>\r\nexport WORLD_SIZE=<world-size>\r\n```\r\n3. install dependencies:\r\n```\r\npip install torch torchvision hydra-core pytorch-lightning\r\n```\r\n4. copy app.y and conf.yaml to each node\r\n5. run script on each node\r\n```\r\npython app.py\r\n```\r\n6. see the error:\r\n```\r\nException:\r\n\r\n-- Process 0 terminated with the following error:\r\nTraceback (most recent call last):\r\n File \"/home/ubuntu/anaconda3/envs/nightly_pt/lib/python3.6/site-packages/torch/multiprocessing/spawn.py\", line 19, in _wrap\r\n fn(i, *args)\r\n File \"/home/ubuntu/anaconda3/envs/nightly_pt/lib/python3.6/site-packages/pytorch_lightning/trainer/distrib_data_parallel.py\", line 342, in ddp_train\r\n self.run_pretrain_routine(model)\r\n File \"/home/ubuntu/anaconda3/envs/nightly_pt/lib/python3.6/site-packages/pytorch_lightning/trainer/trainer.py\", line 777, in run_pretrain_routine\r\n self.configure_checkpoint_callback()\r\n File \"/home/ubuntu/anaconda3/envs/nightly_pt/lib/python3.6/site-packages/pytorch_lightning/trainer/callback_config.py\", line 45, in configure_checkpoint_callback\r\n f'version_{self.logger.version}',\r\n File \"/home/ubuntu/anaconda3/envs/nightly_pt/lib/python3.6/site-packages/pytorch_lightning/loggers/tensorboard.py\", line 161, in version\r\n self._version = self._get_next_version()\r\n File \"/home/ubuntu/anaconda3/envs/nightly_pt/lib/python3.6/site-packages/pytorch_lightning/loggers/tensorboard.py\", line 167, in _get_next_version\r\n for d in os.listdir(root_dir):\r\nFileNotFoundError: [Errno 2] No such file or directory: '/home/ubuntu/pytorch-lightning-intro-guide/outputs/2020-04-04/15-53-26/lightning_logs'\r\n```\r\n\r\n#### Code sample\r\n\r\napp.py:\r\n```\r\nimport pathlib\r\n\r\nimport hydra\r\nimport pytorch_lightning as pl\r\nimport torch\r\nfrom omegaconf import OmegaConf\r\nfrom torch.nn import functional as F\r\nfrom torch.optim import Adam\r\nfrom torch.utils.data import DataLoader, random_split\r\nfrom torchvision import datasets, transforms\r\n\r\n\r\nclass LitMNIST(pl.LightningModule):\r\n def __init__(self):\r\n super().__init__()\r\n self.layer_1 = torch.nn.Linear(28 * 28, 128)\r\n self.layer_2 = torch.nn.Linear(128, 256)\r\n self.layer_3 = torch.nn.Linear(256, 10)\r\n\r\n self.train_dataset = None\r\n self.val_dataset = None\r\n self.test_dataset = None\r\n\r\n def forward(self, x):\r\n batch_size, channels, width, height = x.size()\r\n x = x.view(batch_size, -1)\r\n x = self.layer_1(x)\r\n x = F.relu(x)\r\n x = self.layer_2(x)\r\n x = F.relu(x)\r\n x = self.layer_3(x)\r\n x = F.log_softmax(x, dim=1)\r\n return x\r\n\r\n def prepare_data(self):\r\n # transform\r\n transform = transforms.Compose(\r\n [transforms.ToTensor(), transforms.Normalize((0.1307,), (0.3081,))])\r\n\r\n # download\r\n data_dir = pathlib.Path.home() / 'data'\r\n mnist_train = datasets.MNIST(data_dir, train=True,\r\n download=True, transform=transform)\r\n mnist_test = datasets.MNIST(data_dir, train=False,\r\n download=True, transform=transform)\r\n\r\n # train/val split\r\n mnist_train, mnist_val = random_split(mnist_train, [55000, 5000])\r\n\r\n # assign to use in dataloaders\r\n self.train_dataset = mnist_train\r\n self.val_dataset = mnist_val\r\n self.test_dataset = mnist_test\r\n\r\n def train_dataloader(self):\r\n return DataLoader(self.train_dataset, batch_size=64)\r\n\r\n def val_dataloader(self):\r\n return DataLoader(self.val_dataset, batch_size=64)\r\n\r\n def test_dataloader(self):\r\n return DataLoader(self.test_dataset, batch_size=64)\r\n\r\n def configure_optimizers(self):\r\n return Adam(self.parameters(), lr=1e-3)\r\n\r\n def training_step(self, batch, batch_idx):\r\n x, y = batch\r\n logits = self(x)\r\n loss = F.nll_loss(logits, y)\r\n\r\n # add logging\r\n logs = {'loss': loss}\r\n return {'loss': loss, 'log': logs}\r\n\r\n def validation_step(self, batch, batch_idx):\r\n x, y = batch\r\n logits = self(x)\r\n loss = F.nll_loss(logits, y)\r\n return {'val_loss': loss}\r\n\r\n def validation_epoch_end(self, outputs):\r\n avg_loss = torch.stack( # pylint: disable=no-member\r\n [x['val_loss'] for x in outputs]).mean()\r\n tensorboard_logs = {'val_loss': avg_loss}\r\n return {'avg_val_loss': avg_loss, 'log': tensorboard_logs}\r\n\r\n def test_step(self, batch, batch_idx):\r\n x, y = batch\r\n logits = self(x)\r\n loss = F.nll_loss(logits, y)\r\n return {'val_loss': loss}\r\n\r\n def test_epoch_end(self, outputs):\r\n avg_loss = torch.stack( # pylint: disable=no-member\r\n [x['val_loss'] for x in outputs]).mean()\r\n tensorboard_logs = {'val_loss': avg_loss}\r\n return {'avg_val_loss': avg_loss, 'log': tensorboard_logs}\r\n\r\n def init_ddp_connection(self, proc_rank: int, world_size: int) -> None:\r\n torch.distributed.init_process_group(\r\n 'nccl', rank=proc_rank, world_size=world_size)\r\n\r\n\r\n@hydra.main(config_path='conf.yaml')\r\ndef main(conf: OmegaConf):\r\n model = LitMNIST()\r\n\r\n trainer = pl.Trainer(gpus=conf.gpus,\r\n num_nodes=conf.num_nodes,\r\n distributed_backend=conf.distributed_backend,\r\n max_epochs=3)\r\n trainer.fit(model)\r\n\r\n\r\nif __name__ == '__main__':\r\n main() # pylint: disable=no-value-for-parameter\r\n```\r\n\r\nconf.yaml:\r\n```\r\ngpus: 1\r\nnum_nodes: 2\r\ndistributed_backend: ddp\r\n```\r\n\r\n### Expected behavior\r\n\r\nTrain should go without error\r\n\r\n### Environment\r\n```\r\ncuda:\r\n\tGPU:\r\n\t\tTesla K80\r\n\t\tTesla K80\r\n\t\tTesla K80\r\n\t\tTesla K80\r\n\t\tTesla K80\r\n\t\tTesla K80\r\n\t\tTesla K80\r\n\t\tTesla K80\r\n\tavailable: True\r\n\tversion: 10.1\r\npackages:\r\n\tnumpy: 1.18.1\r\n\tpyTorch_debug: False\r\n\tpyTorch_version: 1.4.0\r\n\tpytorch-lightning: 0.7.1\r\n\ttensorboard: 2.2.0\r\n\ttqdm: 4.45.0\r\nsystem:\r\n\tOS: Linux\r\n\tarchitecture:\r\n\t\t64bit\r\n\r\n\tprocessor: x86_64\r\n\tpython: 3.6.10\r\n\tversion: #113-Ubuntu SMP Wed Jan 29 14:54:54 UTC 2020\r\n```\r\n### Additional context\r\n\r\n<!-- Add any other context about the problem here. -->\r\n\n", "before_files": [{"content": "import csv\nimport os\nfrom argparse import Namespace\nfrom typing import Optional, Dict, Union, Any\nfrom warnings import warn\n\nimport torch\nfrom pkg_resources import parse_version\nfrom torch.utils.tensorboard import SummaryWriter\n\nfrom pytorch_lightning.loggers.base import LightningLoggerBase, rank_zero_only\n\n\nclass TensorBoardLogger(LightningLoggerBase):\n r\"\"\"\n\n Log to local file system in TensorBoard format\n\n Implemented using :class:`torch.utils.tensorboard.SummaryWriter`. Logs are saved to\n ``os.path.join(save_dir, name, version)``\n\n Example:\n .. code-block:: python\n\n logger = TensorBoardLogger(\"tb_logs\", name=\"my_model\")\n trainer = Trainer(logger=logger)\n trainer.train(model)\n\n Args:\n save_dir: Save directory\n name: Experiment name. Defaults to \"default\". If it is the empty string then no per-experiment\n subdirectory is used.\n version: Experiment version. If version is not specified the logger inspects the save\n directory for existing versions, then automatically assigns the next available version.\n If it is a string then it is used as the run-specific subdirectory name,\n otherwise version_${version} is used.\n \\**kwargs: Other arguments are passed directly to the :class:`SummaryWriter` constructor.\n\n \"\"\"\n NAME_CSV_TAGS = 'meta_tags.csv'\n\n def __init__(\n self, save_dir: str, name: Optional[str] = \"default\",\n version: Optional[Union[int, str]] = None, **kwargs\n ):\n super().__init__()\n self.save_dir = save_dir\n self._name = name\n self._version = version\n\n self._experiment = None\n self.tags = {}\n self.kwargs = kwargs\n\n @property\n def root_dir(self) -> str:\n \"\"\"\n Parent directory for all tensorboard checkpoint subdirectories.\n If the experiment name parameter is None or the empty string, no experiment subdirectory is used\n and checkpoint will be saved in save_dir/version_dir\n \"\"\"\n if self.name is None or len(self.name) == 0:\n return self.save_dir\n else:\n return os.path.join(self.save_dir, self.name)\n\n @property\n def log_dir(self) -> str:\n \"\"\"\n The directory for this run's tensorboard checkpoint. By default, it is named 'version_${self.version}'\n but it can be overridden by passing a string value for the constructor's version parameter\n instead of None or an int\n \"\"\"\n # create a pseudo standard path ala test-tube\n version = self.version if isinstance(self.version, str) else f\"version_{self.version}\"\n log_dir = os.path.join(self.root_dir, version)\n return log_dir\n\n @property\n def experiment(self) -> SummaryWriter:\n r\"\"\"\n\n Actual tensorboard object. To use tensorboard features do the following.\n\n Example::\n\n self.logger.experiment.some_tensorboard_function()\n\n \"\"\"\n if self._experiment is not None:\n return self._experiment\n\n os.makedirs(self.root_dir, exist_ok=True)\n self._experiment = SummaryWriter(log_dir=self.log_dir, **self.kwargs)\n return self._experiment\n\n @rank_zero_only\n def log_hyperparams(self, params: Union[Dict[str, Any], Namespace]) -> None:\n params = self._convert_params(params)\n params = self._flatten_dict(params)\n sanitized_params = self._sanitize_params(params)\n\n if parse_version(torch.__version__) < parse_version(\"1.3.0\"):\n warn(\n f\"Hyperparameter logging is not available for Torch version {torch.__version__}.\"\n \" Skipping log_hyperparams. Upgrade to Torch 1.3.0 or above to enable\"\n \" hyperparameter logging.\"\n )\n else:\n from torch.utils.tensorboard.summary import hparams\n exp, ssi, sei = hparams(sanitized_params, {})\n writer = self.experiment._get_file_writer()\n writer.add_summary(exp)\n writer.add_summary(ssi)\n writer.add_summary(sei)\n\n # some alternative should be added\n self.tags.update(sanitized_params)\n\n @rank_zero_only\n def log_metrics(self, metrics: Dict[str, float], step: Optional[int] = None) -> None:\n for k, v in metrics.items():\n if isinstance(v, torch.Tensor):\n v = v.item()\n self.experiment.add_scalar(k, v, step)\n\n @rank_zero_only\n def save(self) -> None:\n try:\n self.experiment.flush()\n except AttributeError:\n # you are using PT version (<v1.2) which does not have implemented flush\n self.experiment._get_file_writer().flush()\n\n dir_path = self.log_dir\n if not os.path.isdir(dir_path):\n dir_path = self.save_dir\n\n # prepare the file path\n meta_tags_path = os.path.join(dir_path, self.NAME_CSV_TAGS)\n\n # save the metatags file\n with open(meta_tags_path, 'w', newline='') as csvfile:\n fieldnames = ['key', 'value']\n writer = csv.DictWriter(csvfile, fieldnames=fieldnames)\n writer.writerow({'key': 'key', 'value': 'value'})\n for k, v in self.tags.items():\n writer.writerow({'key': k, 'value': v})\n\n @rank_zero_only\n def finalize(self, status: str) -> None:\n self.save()\n\n @property\n def name(self) -> str:\n return self._name\n\n @property\n def version(self) -> int:\n if self._version is None:\n self._version = self._get_next_version()\n return self._version\n\n def _get_next_version(self):\n root_dir = os.path.join(self.save_dir, self.name)\n existing_versions = []\n for d in os.listdir(root_dir):\n if os.path.isdir(os.path.join(root_dir, d)) and d.startswith(\"version_\"):\n existing_versions.append(int(d.split(\"_\")[1]))\n\n if len(existing_versions) == 0:\n return 0\n\n return max(existing_versions) + 1\n", "path": "pytorch_lightning/loggers/tensorboard.py"}], "after_files": [{"content": "import csv\nimport os\nfrom argparse import Namespace\nfrom typing import Optional, Dict, Union, Any\nfrom warnings import warn\n\nimport torch\nfrom pkg_resources import parse_version\nfrom torch.utils.tensorboard import SummaryWriter\n\nfrom pytorch_lightning.loggers.base import LightningLoggerBase, rank_zero_only\nfrom pytorch_lightning import _logger as log\n\n\nclass TensorBoardLogger(LightningLoggerBase):\n r\"\"\"\n\n Log to local file system in TensorBoard format\n\n Implemented using :class:`torch.utils.tensorboard.SummaryWriter`. Logs are saved to\n ``os.path.join(save_dir, name, version)``\n\n Example:\n .. code-block:: python\n\n logger = TensorBoardLogger(\"tb_logs\", name=\"my_model\")\n trainer = Trainer(logger=logger)\n trainer.train(model)\n\n Args:\n save_dir: Save directory\n name: Experiment name. Defaults to \"default\". If it is the empty string then no per-experiment\n subdirectory is used.\n version: Experiment version. If version is not specified the logger inspects the save\n directory for existing versions, then automatically assigns the next available version.\n If it is a string then it is used as the run-specific subdirectory name,\n otherwise version_${version} is used.\n \\**kwargs: Other arguments are passed directly to the :class:`SummaryWriter` constructor.\n\n \"\"\"\n NAME_CSV_TAGS = 'meta_tags.csv'\n\n def __init__(\n self, save_dir: str, name: Optional[str] = \"default\",\n version: Optional[Union[int, str]] = None, **kwargs\n ):\n super().__init__()\n self.save_dir = save_dir\n self._name = name\n self._version = version\n\n self._experiment = None\n self.tags = {}\n self.kwargs = kwargs\n\n @property\n def root_dir(self) -> str:\n \"\"\"\n Parent directory for all tensorboard checkpoint subdirectories.\n If the experiment name parameter is None or the empty string, no experiment subdirectory is used\n and checkpoint will be saved in save_dir/version_dir\n \"\"\"\n if self.name is None or len(self.name) == 0:\n return self.save_dir\n else:\n return os.path.join(self.save_dir, self.name)\n\n @property\n def log_dir(self) -> str:\n \"\"\"\n The directory for this run's tensorboard checkpoint. By default, it is named 'version_${self.version}'\n but it can be overridden by passing a string value for the constructor's version parameter\n instead of None or an int\n \"\"\"\n # create a pseudo standard path ala test-tube\n version = self.version if isinstance(self.version, str) else f\"version_{self.version}\"\n log_dir = os.path.join(self.root_dir, version)\n return log_dir\n\n @property\n def experiment(self) -> SummaryWriter:\n r\"\"\"\n\n Actual tensorboard object. To use tensorboard features do the following.\n\n Example::\n\n self.logger.experiment.some_tensorboard_function()\n\n \"\"\"\n if self._experiment is not None:\n return self._experiment\n\n os.makedirs(self.root_dir, exist_ok=True)\n self._experiment = SummaryWriter(log_dir=self.log_dir, **self.kwargs)\n return self._experiment\n\n @rank_zero_only\n def log_hyperparams(self, params: Union[Dict[str, Any], Namespace]) -> None:\n params = self._convert_params(params)\n params = self._flatten_dict(params)\n sanitized_params = self._sanitize_params(params)\n\n if parse_version(torch.__version__) < parse_version(\"1.3.0\"):\n warn(\n f\"Hyperparameter logging is not available for Torch version {torch.__version__}.\"\n \" Skipping log_hyperparams. Upgrade to Torch 1.3.0 or above to enable\"\n \" hyperparameter logging.\"\n )\n else:\n from torch.utils.tensorboard.summary import hparams\n exp, ssi, sei = hparams(sanitized_params, {})\n writer = self.experiment._get_file_writer()\n writer.add_summary(exp)\n writer.add_summary(ssi)\n writer.add_summary(sei)\n\n # some alternative should be added\n self.tags.update(sanitized_params)\n\n @rank_zero_only\n def log_metrics(self, metrics: Dict[str, float], step: Optional[int] = None) -> None:\n for k, v in metrics.items():\n if isinstance(v, torch.Tensor):\n v = v.item()\n self.experiment.add_scalar(k, v, step)\n\n @rank_zero_only\n def save(self) -> None:\n try:\n self.experiment.flush()\n except AttributeError:\n # you are using PT version (<v1.2) which does not have implemented flush\n self.experiment._get_file_writer().flush()\n\n dir_path = self.log_dir\n if not os.path.isdir(dir_path):\n dir_path = self.save_dir\n\n # prepare the file path\n meta_tags_path = os.path.join(dir_path, self.NAME_CSV_TAGS)\n\n # save the metatags file\n with open(meta_tags_path, 'w', newline='') as csvfile:\n fieldnames = ['key', 'value']\n writer = csv.DictWriter(csvfile, fieldnames=fieldnames)\n writer.writerow({'key': 'key', 'value': 'value'})\n for k, v in self.tags.items():\n writer.writerow({'key': k, 'value': v})\n\n @rank_zero_only\n def finalize(self, status: str) -> None:\n self.save()\n\n @property\n def name(self) -> str:\n return self._name\n\n @property\n def version(self) -> int:\n if self._version is None:\n self._version = self._get_next_version()\n return self._version\n\n def _get_next_version(self):\n root_dir = os.path.join(self.save_dir, self.name)\n\n if not os.path.isdir(root_dir):\n log.warning('Missing logger folder: %s', root_dir)\n return 0\n\n existing_versions = []\n for d in os.listdir(root_dir):\n if os.path.isdir(os.path.join(root_dir, d)) and d.startswith(\"version_\"):\n existing_versions.append(int(d.split(\"_\")[1]))\n\n if len(existing_versions) == 0:\n return 0\n\n return max(existing_versions) + 1\n", "path": "pytorch_lightning/loggers/tensorboard.py"}]}
| 3,825 | 216 |
gh_patches_debug_38319
|
rasdani/github-patches
|
git_diff
|
python__peps-2415
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Infra: Fix non-PEP txt files being included and erroring out the build
Presently, as described in #2404 , Sphinx attempts to render any `.txt` files not explicitly excluded in `exclude_patterns` of `conf.py`, in the root or any subdirectories. This results in build failure if any are found, particularly in untracked files which we don't control (e.g. `vendor.txt` in venvs).
Per the [Sphinx docs](https://www.sphinx-doc.org/en/master/usage/configuration.html#id11) we can use full Unix shell-style glob syntax, so adding the glob `**/*.txt` to `exclude_patterns` neatly fixes most instance of this problem by ensuring that `.txt` files are not matched unless they are in the root directory (as all the valid PEPs are).
To handle any remaining cases, i.e. non-PEP `.txt` files that _are_ in the root directory and not explicitly excluded, adding `*[!0-9].txt`,`*[!0-9]?.txt`, `*[!0-9]??.txt`, `*[!0-9]???.txt`, `*[!p]?????.txt`, `*[!e]??????.txt`, and `*[!p]???????.txt` ensures that to be included, `.txt` files must match the equivalent of the regex `^pep?[0-9]{4}\.txt$`, which is only used for PEPs.
This solution was tested on some sample files in the root and subdirectories, including the reported `venv-spam/vendor.txt`, and several with varying degrees of names somewhat similar to PEPs, and it excluded all of them, while not excluding any valid PEPs.
It also obviates the need for some of our other explicit excludes, but I opted to leave them for now
Fix #2404
`make sphinx` Breaks when Certain Untracked Files are Present
* `make sphinx` fails if I have some file like "pep-0684-old.rst" (the filename regex in pep_sphinx_extensions/pep_zero_generator/pep_index_generator.py isn't precise enough)
* `make sphinx` fails if I have a venv dir named "venv-spam" sitting at the top of the repo (tries to parse a vendor.txt file inside the venv)
In both cases I'd expect it to work fine, ignoring the relevant files.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pep_sphinx_extensions/__init__.py`
Content:
```
1 """Sphinx extensions for performant PEP processing"""
2
3 from __future__ import annotations
4
5 from typing import TYPE_CHECKING
6
7 from docutils.writers.html5_polyglot import HTMLTranslator
8 from sphinx import environment
9
10 from pep_sphinx_extensions.pep_processor.html import pep_html_builder
11 from pep_sphinx_extensions.pep_processor.html import pep_html_translator
12 from pep_sphinx_extensions.pep_processor.parsing import pep_parser
13 from pep_sphinx_extensions.pep_processor.parsing import pep_role
14 from pep_sphinx_extensions.pep_processor.transforms import pep_references
15 from pep_sphinx_extensions.pep_zero_generator.pep_index_generator import create_pep_zero
16
17 if TYPE_CHECKING:
18 from sphinx.application import Sphinx
19
20
21 def _depart_maths():
22 pass # No-op callable for the type checker
23
24
25 def _update_config_for_builder(app: Sphinx) -> None:
26 app.env.document_ids = {} # For PEPReferenceRoleTitleText
27 if app.builder.name == "dirhtml":
28 app.env.settings["pep_url"] = "../pep-{:0>4}"
29
30 # internal_builder exists if Sphinx is run by build.py
31 if "internal_builder" not in app.tags:
32 app.connect("build-finished", _post_build) # Post-build tasks
33
34
35 def _post_build(app: Sphinx, exception: Exception | None) -> None:
36 from pathlib import Path
37
38 from build import create_index_file
39
40 if exception is not None:
41 return
42 create_index_file(Path(app.outdir), app.builder.name)
43
44
45 def setup(app: Sphinx) -> dict[str, bool]:
46 """Initialize Sphinx extension."""
47
48 environment.default_settings["pep_url"] = "pep-{:0>4}.html"
49 environment.default_settings["halt_level"] = 2 # Fail on Docutils warning
50
51 # Register plugin logic
52 app.add_builder(pep_html_builder.FileBuilder, override=True)
53 app.add_builder(pep_html_builder.DirectoryBuilder, override=True)
54
55 app.add_source_parser(pep_parser.PEPParser) # Add PEP transforms
56
57 app.set_translator("html", pep_html_translator.PEPTranslator) # Docutils Node Visitor overrides (html builder)
58 app.set_translator("dirhtml", pep_html_translator.PEPTranslator) # Docutils Node Visitor overrides (dirhtml builder)
59
60 app.add_role("pep", pep_role.PEPRole(), override=True) # Transform PEP references to links
61
62 app.add_post_transform(pep_references.PEPReferenceRoleTitleText)
63
64 # Register event callbacks
65 app.connect("builder-inited", _update_config_for_builder) # Update configuration values for builder used
66 app.connect("env-before-read-docs", create_pep_zero) # PEP 0 hook
67
68 # Mathematics rendering
69 inline_maths = HTMLTranslator.visit_math, _depart_maths
70 block_maths = HTMLTranslator.visit_math_block, _depart_maths
71 app.add_html_math_renderer("maths_to_html", inline_maths, block_maths) # Render maths to HTML
72
73 # Parallel safety: https://www.sphinx-doc.org/en/master/extdev/index.html#extension-metadata
74 return {"parallel_read_safe": True, "parallel_write_safe": True}
75
```
Path: `conf.py`
Content:
```
1 # This file is placed in the public domain or under the
2 # CC0-1.0-Universal license, whichever is more permissive.
3
4 """Configuration for building PEPs using Sphinx."""
5
6 from pathlib import Path
7 import sys
8
9 sys.path.append(str(Path("pep_sphinx_extensions").absolute()))
10
11 # -- Project information -----------------------------------------------------
12
13 project = "PEPs"
14 master_doc = "contents"
15
16 # -- General configuration ---------------------------------------------------
17
18 # Add any Sphinx extension module names here, as strings.
19 extensions = ["pep_sphinx_extensions", "sphinx.ext.githubpages"]
20
21 # The file extensions of source files. Sphinx uses these suffixes as sources.
22 source_suffix = {
23 ".rst": "pep",
24 ".txt": "pep",
25 }
26
27 # List of patterns (relative to source dir) to ignore when looking for source files.
28 exclude_patterns = [
29 # Windows:
30 "Thumbs.db",
31 ".DS_Store",
32 # Python:
33 ".venv",
34 "venv",
35 "requirements.txt",
36 # Sphinx:
37 "build",
38 "output.txt", # Link-check output
39 # PEPs:
40 "pep-0012",
41 "README.rst",
42 "CONTRIBUTING.rst",
43 "pep_sphinx_extensions/LICENCE.rst",
44 # Miscellaneous
45 ".codespell",
46 ]
47
48 # -- Options for HTML output -------------------------------------------------
49
50 # HTML output settings
51 html_math_renderer = "maths_to_html" # Maths rendering
52
53 # Theme settings
54 html_theme_path = ["pep_sphinx_extensions"]
55 html_theme = "pep_theme" # The actual theme directory (child of html_theme_path)
56 html_use_index = False # Disable index (we use PEP 0)
57 html_style = "" # must be defined here or in theme.conf, but is unused
58 html_permalinks = False # handled in the PEPContents transform
59 html_baseurl = "https://peps.python.org" # to create the CNAME file
60 gettext_auto_build = False # speed-ups
61
62 templates_path = ['pep_sphinx_extensions/pep_theme/templates'] # Theme template relative paths from `confdir`
63
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/conf.py b/conf.py
--- a/conf.py
+++ b/conf.py
@@ -8,6 +8,11 @@
sys.path.append(str(Path("pep_sphinx_extensions").absolute()))
+# Add 'include_patterns' as a config variable
+from sphinx.config import Config
+Config.config_values['include_patterns'] = [], 'env', []
+del Config
+
# -- Project information -----------------------------------------------------
project = "PEPs"
@@ -25,24 +30,20 @@
}
# List of patterns (relative to source dir) to ignore when looking for source files.
+include_patterns = [
+ # Required for Sphinx
+ "contents.rst",
+ # PEP files
+ "pep-????.rst",
+ "pep-????.txt",
+ # PEP ancillary files
+ "pep-????/*.rst",
+ # Documentation
+ "docs/*.rst",
+]
exclude_patterns = [
- # Windows:
- "Thumbs.db",
- ".DS_Store",
- # Python:
- ".venv",
- "venv",
- "requirements.txt",
- # Sphinx:
- "build",
- "output.txt", # Link-check output
- # PEPs:
- "pep-0012",
- "README.rst",
- "CONTRIBUTING.rst",
- "pep_sphinx_extensions/LICENCE.rst",
- # Miscellaneous
- ".codespell",
+ # PEP Template
+ "pep-0012/pep-NNNN.rst",
]
# -- Options for HTML output -------------------------------------------------
diff --git a/pep_sphinx_extensions/__init__.py b/pep_sphinx_extensions/__init__.py
--- a/pep_sphinx_extensions/__init__.py
+++ b/pep_sphinx_extensions/__init__.py
@@ -6,6 +6,7 @@
from docutils.writers.html5_polyglot import HTMLTranslator
from sphinx import environment
+from sphinx import project
from pep_sphinx_extensions.pep_processor.html import pep_html_builder
from pep_sphinx_extensions.pep_processor.html import pep_html_translator
@@ -16,6 +17,37 @@
if TYPE_CHECKING:
from sphinx.application import Sphinx
+ from sphinx.config import Config
+
+
+def find_files(self: environment.BuildEnvironment, config: Config, _b) -> None:
+ """Find all pep source files."""
+ import fnmatch
+ from pathlib import Path
+
+ root = Path(self.project.srcdir).absolute()
+ self.project.docnames = set()
+ for pattern in config.include_patterns:
+ for path in root.glob(pattern):
+ filename = str(path.relative_to(root))
+ if any(fnmatch.fnmatch(filename, pattern) for pattern in config.exclude_patterns):
+ continue
+
+ doc_name = self.project.path2doc(filename)
+ if not doc_name:
+ continue
+
+ if doc_name not in self.project.docnames:
+ self.project.docnames.add(doc_name)
+ continue
+
+ other_files = [str(f.relative_to(root)) for f in root.glob(f"{doc_name}.*")]
+ project.logger.warning(
+ f'multiple files found for the document "{doc_name}": {other_files!r}\n'
+ f'Use {self.doc2path(doc_name)!r} for the build.', once=True)
+
+
+environment.BuildEnvironment.find_files = find_files
def _depart_maths():
|
{"golden_diff": "diff --git a/conf.py b/conf.py\n--- a/conf.py\n+++ b/conf.py\n@@ -8,6 +8,11 @@\n \n sys.path.append(str(Path(\"pep_sphinx_extensions\").absolute()))\n \n+# Add 'include_patterns' as a config variable\n+from sphinx.config import Config\n+Config.config_values['include_patterns'] = [], 'env', []\n+del Config\n+\n # -- Project information -----------------------------------------------------\n \n project = \"PEPs\"\n@@ -25,24 +30,20 @@\n }\n \n # List of patterns (relative to source dir) to ignore when looking for source files.\n+include_patterns = [\n+ # Required for Sphinx\n+ \"contents.rst\",\n+ # PEP files\n+ \"pep-????.rst\",\n+ \"pep-????.txt\",\n+ # PEP ancillary files\n+ \"pep-????/*.rst\",\n+ # Documentation\n+ \"docs/*.rst\",\n+]\n exclude_patterns = [\n- # Windows:\n- \"Thumbs.db\",\n- \".DS_Store\",\n- # Python:\n- \".venv\",\n- \"venv\",\n- \"requirements.txt\",\n- # Sphinx:\n- \"build\",\n- \"output.txt\", # Link-check output\n- # PEPs:\n- \"pep-0012\",\n- \"README.rst\",\n- \"CONTRIBUTING.rst\",\n- \"pep_sphinx_extensions/LICENCE.rst\",\n- # Miscellaneous\n- \".codespell\",\n+ # PEP Template\n+ \"pep-0012/pep-NNNN.rst\",\n ]\n \n # -- Options for HTML output -------------------------------------------------\ndiff --git a/pep_sphinx_extensions/__init__.py b/pep_sphinx_extensions/__init__.py\n--- a/pep_sphinx_extensions/__init__.py\n+++ b/pep_sphinx_extensions/__init__.py\n@@ -6,6 +6,7 @@\n \n from docutils.writers.html5_polyglot import HTMLTranslator\n from sphinx import environment\n+from sphinx import project\n \n from pep_sphinx_extensions.pep_processor.html import pep_html_builder\n from pep_sphinx_extensions.pep_processor.html import pep_html_translator\n@@ -16,6 +17,37 @@\n \n if TYPE_CHECKING:\n from sphinx.application import Sphinx\n+ from sphinx.config import Config\n+\n+\n+def find_files(self: environment.BuildEnvironment, config: Config, _b) -> None:\n+ \"\"\"Find all pep source files.\"\"\"\n+ import fnmatch\n+ from pathlib import Path\n+\n+ root = Path(self.project.srcdir).absolute()\n+ self.project.docnames = set()\n+ for pattern in config.include_patterns:\n+ for path in root.glob(pattern):\n+ filename = str(path.relative_to(root))\n+ if any(fnmatch.fnmatch(filename, pattern) for pattern in config.exclude_patterns):\n+ continue\n+\n+ doc_name = self.project.path2doc(filename)\n+ if not doc_name:\n+ continue\n+\n+ if doc_name not in self.project.docnames:\n+ self.project.docnames.add(doc_name)\n+ continue\n+\n+ other_files = [str(f.relative_to(root)) for f in root.glob(f\"{doc_name}.*\")]\n+ project.logger.warning(\n+ f'multiple files found for the document \"{doc_name}\": {other_files!r}\\n'\n+ f'Use {self.doc2path(doc_name)!r} for the build.', once=True)\n+\n+\n+environment.BuildEnvironment.find_files = find_files\n \n \n def _depart_maths():\n", "issue": "Infra: Fix non-PEP txt files being included and erroring out the build\nPresently, as described in #2404 , Sphinx attempts to render any `.txt` files not explicitly excluded in `exclude_patterns` of `conf.py`, in the root or any subdirectories. This results in build failure if any are found, particularly in untracked files which we don't control (e.g. `vendor.txt` in venvs).\r\n\r\nPer the [Sphinx docs](https://www.sphinx-doc.org/en/master/usage/configuration.html#id11) we can use full Unix shell-style glob syntax, so adding the glob `**/*.txt` to `exclude_patterns` neatly fixes most instance of this problem by ensuring that `.txt` files are not matched unless they are in the root directory (as all the valid PEPs are).\r\n\r\nTo handle any remaining cases, i.e. non-PEP `.txt` files that _are_ in the root directory and not explicitly excluded, adding `*[!0-9].txt`,`*[!0-9]?.txt`, `*[!0-9]??.txt`, `*[!0-9]???.txt`, `*[!p]?????.txt`, `*[!e]??????.txt`, and `*[!p]???????.txt` ensures that to be included, `.txt` files must match the equivalent of the regex `^pep?[0-9]{4}\\.txt$`, which is only used for PEPs.\r\n\r\nThis solution was tested on some sample files in the root and subdirectories, including the reported `venv-spam/vendor.txt`, and several with varying degrees of names somewhat similar to PEPs, and it excluded all of them, while not excluding any valid PEPs.\r\n\r\nIt also obviates the need for some of our other explicit excludes, but I opted to leave them for now\r\n\r\nFix #2404 \n`make sphinx` Breaks when Certain Untracked Files are Present\n* `make sphinx` fails if I have some file like \"pep-0684-old.rst\" (the filename regex in pep_sphinx_extensions/pep_zero_generator/pep_index_generator.py isn't precise enough)\r\n* `make sphinx` fails if I have a venv dir named \"venv-spam\" sitting at the top of the repo (tries to parse a vendor.txt file inside the venv)\r\n\r\nIn both cases I'd expect it to work fine, ignoring the relevant files.\n", "before_files": [{"content": "\"\"\"Sphinx extensions for performant PEP processing\"\"\"\n\nfrom __future__ import annotations\n\nfrom typing import TYPE_CHECKING\n\nfrom docutils.writers.html5_polyglot import HTMLTranslator\nfrom sphinx import environment\n\nfrom pep_sphinx_extensions.pep_processor.html import pep_html_builder\nfrom pep_sphinx_extensions.pep_processor.html import pep_html_translator\nfrom pep_sphinx_extensions.pep_processor.parsing import pep_parser\nfrom pep_sphinx_extensions.pep_processor.parsing import pep_role\nfrom pep_sphinx_extensions.pep_processor.transforms import pep_references\nfrom pep_sphinx_extensions.pep_zero_generator.pep_index_generator import create_pep_zero\n\nif TYPE_CHECKING:\n from sphinx.application import Sphinx\n\n\ndef _depart_maths():\n pass # No-op callable for the type checker\n\n\ndef _update_config_for_builder(app: Sphinx) -> None:\n app.env.document_ids = {} # For PEPReferenceRoleTitleText\n if app.builder.name == \"dirhtml\":\n app.env.settings[\"pep_url\"] = \"../pep-{:0>4}\"\n\n # internal_builder exists if Sphinx is run by build.py\n if \"internal_builder\" not in app.tags:\n app.connect(\"build-finished\", _post_build) # Post-build tasks\n\n\ndef _post_build(app: Sphinx, exception: Exception | None) -> None:\n from pathlib import Path\n\n from build import create_index_file\n\n if exception is not None:\n return\n create_index_file(Path(app.outdir), app.builder.name)\n\n\ndef setup(app: Sphinx) -> dict[str, bool]:\n \"\"\"Initialize Sphinx extension.\"\"\"\n\n environment.default_settings[\"pep_url\"] = \"pep-{:0>4}.html\"\n environment.default_settings[\"halt_level\"] = 2 # Fail on Docutils warning\n\n # Register plugin logic\n app.add_builder(pep_html_builder.FileBuilder, override=True)\n app.add_builder(pep_html_builder.DirectoryBuilder, override=True)\n\n app.add_source_parser(pep_parser.PEPParser) # Add PEP transforms\n\n app.set_translator(\"html\", pep_html_translator.PEPTranslator) # Docutils Node Visitor overrides (html builder)\n app.set_translator(\"dirhtml\", pep_html_translator.PEPTranslator) # Docutils Node Visitor overrides (dirhtml builder)\n\n app.add_role(\"pep\", pep_role.PEPRole(), override=True) # Transform PEP references to links\n\n app.add_post_transform(pep_references.PEPReferenceRoleTitleText)\n\n # Register event callbacks\n app.connect(\"builder-inited\", _update_config_for_builder) # Update configuration values for builder used\n app.connect(\"env-before-read-docs\", create_pep_zero) # PEP 0 hook\n\n # Mathematics rendering\n inline_maths = HTMLTranslator.visit_math, _depart_maths\n block_maths = HTMLTranslator.visit_math_block, _depart_maths\n app.add_html_math_renderer(\"maths_to_html\", inline_maths, block_maths) # Render maths to HTML\n\n # Parallel safety: https://www.sphinx-doc.org/en/master/extdev/index.html#extension-metadata\n return {\"parallel_read_safe\": True, \"parallel_write_safe\": True}\n", "path": "pep_sphinx_extensions/__init__.py"}, {"content": "# This file is placed in the public domain or under the\n# CC0-1.0-Universal license, whichever is more permissive.\n\n\"\"\"Configuration for building PEPs using Sphinx.\"\"\"\n\nfrom pathlib import Path\nimport sys\n\nsys.path.append(str(Path(\"pep_sphinx_extensions\").absolute()))\n\n# -- Project information -----------------------------------------------------\n\nproject = \"PEPs\"\nmaster_doc = \"contents\"\n\n# -- General configuration ---------------------------------------------------\n\n# Add any Sphinx extension module names here, as strings.\nextensions = [\"pep_sphinx_extensions\", \"sphinx.ext.githubpages\"]\n\n# The file extensions of source files. Sphinx uses these suffixes as sources.\nsource_suffix = {\n \".rst\": \"pep\",\n \".txt\": \"pep\",\n}\n\n# List of patterns (relative to source dir) to ignore when looking for source files.\nexclude_patterns = [\n # Windows:\n \"Thumbs.db\",\n \".DS_Store\",\n # Python:\n \".venv\",\n \"venv\",\n \"requirements.txt\",\n # Sphinx:\n \"build\",\n \"output.txt\", # Link-check output\n # PEPs:\n \"pep-0012\",\n \"README.rst\",\n \"CONTRIBUTING.rst\",\n \"pep_sphinx_extensions/LICENCE.rst\",\n # Miscellaneous\n \".codespell\",\n]\n\n# -- Options for HTML output -------------------------------------------------\n\n# HTML output settings\nhtml_math_renderer = \"maths_to_html\" # Maths rendering\n\n# Theme settings\nhtml_theme_path = [\"pep_sphinx_extensions\"]\nhtml_theme = \"pep_theme\" # The actual theme directory (child of html_theme_path)\nhtml_use_index = False # Disable index (we use PEP 0)\nhtml_style = \"\" # must be defined here or in theme.conf, but is unused\nhtml_permalinks = False # handled in the PEPContents transform\nhtml_baseurl = \"https://peps.python.org\" # to create the CNAME file\ngettext_auto_build = False # speed-ups\n\ntemplates_path = ['pep_sphinx_extensions/pep_theme/templates'] # Theme template relative paths from `confdir`\n", "path": "conf.py"}], "after_files": [{"content": "\"\"\"Sphinx extensions for performant PEP processing\"\"\"\n\nfrom __future__ import annotations\n\nfrom typing import TYPE_CHECKING\n\nfrom docutils.writers.html5_polyglot import HTMLTranslator\nfrom sphinx import environment\nfrom sphinx import project\n\nfrom pep_sphinx_extensions.pep_processor.html import pep_html_builder\nfrom pep_sphinx_extensions.pep_processor.html import pep_html_translator\nfrom pep_sphinx_extensions.pep_processor.parsing import pep_parser\nfrom pep_sphinx_extensions.pep_processor.parsing import pep_role\nfrom pep_sphinx_extensions.pep_processor.transforms import pep_references\nfrom pep_sphinx_extensions.pep_zero_generator.pep_index_generator import create_pep_zero\n\nif TYPE_CHECKING:\n from sphinx.application import Sphinx\n from sphinx.config import Config\n\n\ndef find_files(self: environment.BuildEnvironment, config: Config, _b) -> None:\n \"\"\"Find all pep source files.\"\"\"\n import fnmatch\n from pathlib import Path\n\n root = Path(self.project.srcdir).absolute()\n self.project.docnames = set()\n for pattern in config.include_patterns:\n for path in root.glob(pattern):\n filename = str(path.relative_to(root))\n if any(fnmatch.fnmatch(filename, pattern) for pattern in config.exclude_patterns):\n continue\n\n doc_name = self.project.path2doc(filename)\n if not doc_name:\n continue\n\n if doc_name not in self.project.docnames:\n self.project.docnames.add(doc_name)\n continue\n\n other_files = [str(f.relative_to(root)) for f in root.glob(f\"{doc_name}.*\")]\n project.logger.warning(\n f'multiple files found for the document \"{doc_name}\": {other_files!r}\\n'\n f'Use {self.doc2path(doc_name)!r} for the build.', once=True)\n\n\nenvironment.BuildEnvironment.find_files = find_files\n\n\ndef _depart_maths():\n pass # No-op callable for the type checker\n\n\ndef _update_config_for_builder(app: Sphinx) -> None:\n app.env.document_ids = {} # For PEPReferenceRoleTitleText\n if app.builder.name == \"dirhtml\":\n app.env.settings[\"pep_url\"] = \"../pep-{:0>4}\"\n\n # internal_builder exists if Sphinx is run by build.py\n if \"internal_builder\" not in app.tags:\n app.connect(\"build-finished\", _post_build) # Post-build tasks\n\n\ndef _post_build(app: Sphinx, exception: Exception | None) -> None:\n from pathlib import Path\n\n from build import create_index_file\n\n if exception is not None:\n return\n create_index_file(Path(app.outdir), app.builder.name)\n\n\ndef setup(app: Sphinx) -> dict[str, bool]:\n \"\"\"Initialize Sphinx extension.\"\"\"\n\n environment.default_settings[\"pep_url\"] = \"pep-{:0>4}.html\"\n environment.default_settings[\"halt_level\"] = 2 # Fail on Docutils warning\n\n # Register plugin logic\n app.add_builder(pep_html_builder.FileBuilder, override=True)\n app.add_builder(pep_html_builder.DirectoryBuilder, override=True)\n\n app.add_source_parser(pep_parser.PEPParser) # Add PEP transforms\n\n app.set_translator(\"html\", pep_html_translator.PEPTranslator) # Docutils Node Visitor overrides (html builder)\n app.set_translator(\"dirhtml\", pep_html_translator.PEPTranslator) # Docutils Node Visitor overrides (dirhtml builder)\n\n app.add_role(\"pep\", pep_role.PEPRole(), override=True) # Transform PEP references to links\n\n app.add_post_transform(pep_references.PEPReferenceRoleTitleText)\n\n # Register event callbacks\n app.connect(\"builder-inited\", _update_config_for_builder) # Update configuration values for builder used\n app.connect(\"env-before-read-docs\", create_pep_zero) # PEP 0 hook\n\n # Mathematics rendering\n inline_maths = HTMLTranslator.visit_math, _depart_maths\n block_maths = HTMLTranslator.visit_math_block, _depart_maths\n app.add_html_math_renderer(\"maths_to_html\", inline_maths, block_maths) # Render maths to HTML\n\n # Parallel safety: https://www.sphinx-doc.org/en/master/extdev/index.html#extension-metadata\n return {\"parallel_read_safe\": True, \"parallel_write_safe\": True}\n", "path": "pep_sphinx_extensions/__init__.py"}, {"content": "# This file is placed in the public domain or under the\n# CC0-1.0-Universal license, whichever is more permissive.\n\n\"\"\"Configuration for building PEPs using Sphinx.\"\"\"\n\nfrom pathlib import Path\nimport sys\n\nsys.path.append(str(Path(\"pep_sphinx_extensions\").absolute()))\n\n# Add 'include_patterns' as a config variable\nfrom sphinx.config import Config\nConfig.config_values['include_patterns'] = [], 'env', []\ndel Config\n\n# -- Project information -----------------------------------------------------\n\nproject = \"PEPs\"\nmaster_doc = \"contents\"\n\n# -- General configuration ---------------------------------------------------\n\n# Add any Sphinx extension module names here, as strings.\nextensions = [\"pep_sphinx_extensions\", \"sphinx.ext.githubpages\"]\n\n# The file extensions of source files. Sphinx uses these suffixes as sources.\nsource_suffix = {\n \".rst\": \"pep\",\n \".txt\": \"pep\",\n}\n\n# List of patterns (relative to source dir) to ignore when looking for source files.\ninclude_patterns = [\n # Required for Sphinx\n \"contents.rst\",\n # PEP files\n \"pep-????.rst\",\n \"pep-????.txt\",\n # PEP ancillary files\n \"pep-????/*.rst\",\n # Documentation\n \"docs/*.rst\",\n]\nexclude_patterns = [\n # PEP Template\n \"pep-0012/pep-NNNN.rst\",\n]\n\n# -- Options for HTML output -------------------------------------------------\n\n# HTML output settings\nhtml_math_renderer = \"maths_to_html\" # Maths rendering\n\n# Theme settings\nhtml_theme_path = [\"pep_sphinx_extensions\"]\nhtml_theme = \"pep_theme\" # The actual theme directory (child of html_theme_path)\nhtml_use_index = False # Disable index (we use PEP 0)\nhtml_style = \"\" # must be defined here or in theme.conf, but is unused\nhtml_permalinks = False # handled in the PEPContents transform\nhtml_baseurl = \"https://peps.python.org\" # to create the CNAME file\ngettext_auto_build = False # speed-ups\n\ntemplates_path = ['pep_sphinx_extensions/pep_theme/templates'] # Theme template relative paths from `confdir`\n", "path": "conf.py"}]}
| 2,236 | 792 |
gh_patches_debug_12063
|
rasdani/github-patches
|
git_diff
|
spyder-ide__spyder-3619
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Error when connecting to external kernels
qt 4.8.7 does not have this bug, but qt5.6 with mac will have
```
Spyder Internal Console
This console is used to report application
internal errors and to inspect Spyder
internals with the following commands:
spy.app, spy.window, dir(spy)
Please don't use it to run your code
>>> WARNING:traitlets:kernel died: 3.0032799243927
Traceback (most recent call last):
File "/Users/HWJ/anaconda3/lib/python3.5/site-packages/qtconsole/base_frontend_mixin.py", line 163, in _dispatch
handler(msg)
File "/Users/HWJ/anaconda3/lib/python3.5/site-packages/spyder/widgets/ipythonconsole/namespacebrowser.py", line 182, in _handle_execute_reply
self.set_namespace_view_settings()
File "/Users/HWJ/anaconda3/lib/python3.5/site-packages/spyder/widgets/ipythonconsole/namespacebrowser.py", line 68, in set_namespace_view_settings
settings = to_text_string(self.namespacebrowser.get_view_settings())
AttributeError: 'NoneType' object has no attribute 'get_view_settings'
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `spyder/widgets/ipythonconsole/namespacebrowser.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 #
3 # Copyright © Spyder Project Contributors
4 # Licensed under the terms of the MIT License
5 # (see spyder/__init__.py for details)
6
7 """
8 Widget that handle communications between the IPython Console and
9 the Variable Explorer
10 """
11
12 from qtpy.QtCore import QEventLoop
13
14 from ipykernel.pickleutil import CannedObject
15 from ipykernel.serialize import deserialize_object
16 from qtconsole.rich_jupyter_widget import RichJupyterWidget
17
18 from spyder.config.base import _
19 from spyder.py3compat import to_text_string
20
21
22 class NamepaceBrowserWidget(RichJupyterWidget):
23 """
24 Widget with the necessary attributes and methods to handle communications
25 between the IPython Console and the Variable Explorer
26 """
27
28 # Reference to the nsb widget connected to this client
29 namespacebrowser = None
30
31 # To save the replies of kernel method executions (except
32 # getting values of variables)
33 _kernel_methods = {}
34
35 # To save values and messages returned by the kernel
36 _kernel_value = None
37 _kernel_is_starting = True
38
39 # --- Public API --------------------------------------------------
40 def set_namespacebrowser(self, namespacebrowser):
41 """Set namespace browser widget"""
42 self.namespacebrowser = namespacebrowser
43 self.configure_namespacebrowser()
44
45 def configure_namespacebrowser(self):
46 """Configure associated namespace browser widget"""
47 # Tell it that we are connected to client
48 self.namespacebrowser.is_ipyclient = True
49
50 # Update namespace view
51 self.sig_namespace_view.connect(lambda data:
52 self.namespacebrowser.process_remote_view(data))
53
54 # Update properties of variables
55 self.sig_var_properties.connect(lambda data:
56 self.namespacebrowser.set_var_properties(data))
57
58 def refresh_namespacebrowser(self):
59 """Refresh namespace browser"""
60 if self.namespacebrowser:
61 self.silent_exec_method(
62 'get_ipython().kernel.get_namespace_view()')
63 self.silent_exec_method(
64 'get_ipython().kernel.get_var_properties()')
65
66 def set_namespace_view_settings(self):
67 """Set the namespace view settings"""
68 settings = to_text_string(self.namespacebrowser.get_view_settings())
69 code = u"get_ipython().kernel.namespace_view_settings = %s" % settings
70 self.silent_execute(code)
71
72 def get_value(self, name):
73 """Ask kernel for a value"""
74 # Don't ask for values while reading (ipdb) is active
75 if self._reading:
76 raise ValueError(_("Inspecting and setting values while debugging "
77 "in IPython consoles is not supported yet by "
78 "Spyder."))
79
80 # Wait until the kernel returns the value
81 wait_loop = QEventLoop()
82 self.sig_got_reply.connect(wait_loop.quit)
83 self.silent_execute("get_ipython().kernel.get_value('%s')" % name)
84 wait_loop.exec_()
85
86 # Remove loop connection and loop
87 self.sig_got_reply.disconnect(wait_loop.quit)
88 wait_loop = None
89
90 # Handle exceptions
91 if self._kernel_value is None:
92 if self._kernel_reply:
93 msg = self._kernel_reply[:]
94 self._kernel_reply = None
95 raise ValueError(msg)
96
97 return self._kernel_value
98
99 def set_value(self, name, value):
100 """Set value for a variable"""
101 value = to_text_string(value)
102 self.silent_execute("get_ipython().kernel.set_value('%s', %s)" %
103 (name, value))
104
105 def remove_value(self, name):
106 """Remove a variable"""
107 self.silent_execute("get_ipython().kernel.remove_value('%s')" % name)
108
109 def copy_value(self, orig_name, new_name):
110 """Copy a variable"""
111 self.silent_execute("get_ipython().kernel.copy_value('%s', '%s')" %
112 (orig_name, new_name))
113
114 def load_data(self, filename, ext):
115 # Wait until the kernel tries to load the file
116 wait_loop = QEventLoop()
117 self.sig_got_reply.connect(wait_loop.quit)
118 self.silent_exec_method(
119 "get_ipython().kernel.load_data('%s', '%s')" % (filename, ext))
120 wait_loop.exec_()
121
122 # Remove loop connection and loop
123 self.sig_got_reply.disconnect(wait_loop.quit)
124 wait_loop = None
125
126 return self._kernel_reply
127
128 def save_namespace(self, filename):
129 # Wait until the kernel tries to save the file
130 wait_loop = QEventLoop()
131 self.sig_got_reply.connect(wait_loop.quit)
132 self.silent_exec_method("get_ipython().kernel.save_namespace('%s')" %
133 filename)
134 wait_loop.exec_()
135
136 # Remove loop connection and loop
137 self.sig_got_reply.disconnect(wait_loop.quit)
138 wait_loop = None
139
140 return self._kernel_reply
141
142 # ---- Private API (defined by us) ------------------------------
143 def _handle_data_message(self, msg):
144 """
145 Handle raw (serialized) data sent by the kernel
146
147 We only handle data asked by Spyder, in case people use
148 publish_data for other purposes.
149 """
150 # Deserialize data
151 try:
152 data = deserialize_object(msg['buffers'])[0]
153 except Exception as msg:
154 self._kernel_value = None
155 self._kernel_reply = repr(msg)
156 self.sig_got_reply.emit()
157 return
158
159 # We only handle data asked by Spyder
160 value = data.get('__spy_data__', None)
161 if value is not None:
162 if isinstance(value, CannedObject):
163 value = value.get_object()
164 self._kernel_value = value
165 self.sig_got_reply.emit()
166
167 # ---- Private API (overrode by us) ----------------------------
168 def _handle_execute_reply(self, msg):
169 """
170 Reimplemented to handle communications between Spyder
171 and the kernel
172 """
173 msg_id = msg['parent_header']['msg_id']
174 info = self._request_info['execute'].get(msg_id)
175 # unset reading flag, because if execute finished, raw_input can't
176 # still be pending.
177 self._reading = False
178
179 # Refresh namespacebrowser after the kernel starts running
180 exec_count = msg['content']['execution_count']
181 if exec_count == 0 and self._kernel_is_starting:
182 self.set_namespace_view_settings()
183 self.refresh_namespacebrowser()
184 self._kernel_is_starting = False
185
186 # Handle silent execution of kernel methods
187 if info and info.kind == 'silent_exec_method' and not self._hidden:
188 self.handle_exec_method(msg)
189 self._request_info['execute'].pop(msg_id)
190 else:
191 super(NamepaceBrowserWidget, self)._handle_execute_reply(msg)
192
193 def _handle_status(self, msg):
194 """
195 Reimplemented to refresh the namespacebrowser after kernel
196 restarts
197 """
198 state = msg['content'].get('execution_state', '')
199 msg_type = msg['parent_header'].get('msg_type', '')
200 if state == 'starting' and not self._kernel_is_starting:
201 # This handles restarts when the kernel dies
202 # unexpectedly
203 self._kernel_is_starting = True
204 elif state == 'idle' and msg_type == 'shutdown_request':
205 # This handles restarts asked by the user
206 if self.namespacebrowser is not None:
207 self.set_namespace_view_settings()
208 self.refresh_namespacebrowser()
209 else:
210 super(NamepaceBrowserWidget, self)._handle_status(msg)
211
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/spyder/widgets/ipythonconsole/namespacebrowser.py b/spyder/widgets/ipythonconsole/namespacebrowser.py
--- a/spyder/widgets/ipythonconsole/namespacebrowser.py
+++ b/spyder/widgets/ipythonconsole/namespacebrowser.py
@@ -179,8 +179,9 @@
# Refresh namespacebrowser after the kernel starts running
exec_count = msg['content']['execution_count']
if exec_count == 0 and self._kernel_is_starting:
- self.set_namespace_view_settings()
- self.refresh_namespacebrowser()
+ if self.namespacebrowser is not None:
+ self.set_namespace_view_settings()
+ self.refresh_namespacebrowser()
self._kernel_is_starting = False
# Handle silent execution of kernel methods
|
{"golden_diff": "diff --git a/spyder/widgets/ipythonconsole/namespacebrowser.py b/spyder/widgets/ipythonconsole/namespacebrowser.py\n--- a/spyder/widgets/ipythonconsole/namespacebrowser.py\n+++ b/spyder/widgets/ipythonconsole/namespacebrowser.py\n@@ -179,8 +179,9 @@\n # Refresh namespacebrowser after the kernel starts running\n exec_count = msg['content']['execution_count']\n if exec_count == 0 and self._kernel_is_starting:\n- self.set_namespace_view_settings()\n- self.refresh_namespacebrowser()\n+ if self.namespacebrowser is not None:\n+ self.set_namespace_view_settings()\n+ self.refresh_namespacebrowser()\n self._kernel_is_starting = False\n \n # Handle silent execution of kernel methods\n", "issue": "Error when connecting to external kernels\nqt 4.8.7 does not have this bug, but qt5.6 with mac will have\n\n```\nSpyder Internal Console\n\nThis console is used to report application\ninternal errors and to inspect Spyder\ninternals with the following commands:\n spy.app, spy.window, dir(spy)\n\nPlease don't use it to run your code\n\n>>> WARNING:traitlets:kernel died: 3.0032799243927\nTraceback (most recent call last):\n File \"/Users/HWJ/anaconda3/lib/python3.5/site-packages/qtconsole/base_frontend_mixin.py\", line 163, in _dispatch\n handler(msg)\n File \"/Users/HWJ/anaconda3/lib/python3.5/site-packages/spyder/widgets/ipythonconsole/namespacebrowser.py\", line 182, in _handle_execute_reply\n self.set_namespace_view_settings()\n File \"/Users/HWJ/anaconda3/lib/python3.5/site-packages/spyder/widgets/ipythonconsole/namespacebrowser.py\", line 68, in set_namespace_view_settings\n settings = to_text_string(self.namespacebrowser.get_view_settings())\nAttributeError: 'NoneType' object has no attribute 'get_view_settings'\n```\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n#\n# Copyright \u00a9 Spyder Project Contributors\n# Licensed under the terms of the MIT License\n# (see spyder/__init__.py for details)\n\n\"\"\"\nWidget that handle communications between the IPython Console and\nthe Variable Explorer\n\"\"\"\n\nfrom qtpy.QtCore import QEventLoop\n\nfrom ipykernel.pickleutil import CannedObject\nfrom ipykernel.serialize import deserialize_object\nfrom qtconsole.rich_jupyter_widget import RichJupyterWidget\n\nfrom spyder.config.base import _\nfrom spyder.py3compat import to_text_string\n\n\nclass NamepaceBrowserWidget(RichJupyterWidget):\n \"\"\"\n Widget with the necessary attributes and methods to handle communications\n between the IPython Console and the Variable Explorer\n \"\"\"\n\n # Reference to the nsb widget connected to this client\n namespacebrowser = None\n\n # To save the replies of kernel method executions (except\n # getting values of variables)\n _kernel_methods = {}\n\n # To save values and messages returned by the kernel\n _kernel_value = None\n _kernel_is_starting = True\n\n # --- Public API --------------------------------------------------\n def set_namespacebrowser(self, namespacebrowser):\n \"\"\"Set namespace browser widget\"\"\"\n self.namespacebrowser = namespacebrowser\n self.configure_namespacebrowser()\n\n def configure_namespacebrowser(self):\n \"\"\"Configure associated namespace browser widget\"\"\"\n # Tell it that we are connected to client\n self.namespacebrowser.is_ipyclient = True\n\n # Update namespace view\n self.sig_namespace_view.connect(lambda data:\n self.namespacebrowser.process_remote_view(data))\n\n # Update properties of variables\n self.sig_var_properties.connect(lambda data:\n self.namespacebrowser.set_var_properties(data))\n\n def refresh_namespacebrowser(self):\n \"\"\"Refresh namespace browser\"\"\"\n if self.namespacebrowser:\n self.silent_exec_method(\n 'get_ipython().kernel.get_namespace_view()')\n self.silent_exec_method(\n 'get_ipython().kernel.get_var_properties()')\n\n def set_namespace_view_settings(self):\n \"\"\"Set the namespace view settings\"\"\"\n settings = to_text_string(self.namespacebrowser.get_view_settings())\n code = u\"get_ipython().kernel.namespace_view_settings = %s\" % settings\n self.silent_execute(code)\n\n def get_value(self, name):\n \"\"\"Ask kernel for a value\"\"\"\n # Don't ask for values while reading (ipdb) is active\n if self._reading:\n raise ValueError(_(\"Inspecting and setting values while debugging \"\n \"in IPython consoles is not supported yet by \"\n \"Spyder.\"))\n\n # Wait until the kernel returns the value\n wait_loop = QEventLoop()\n self.sig_got_reply.connect(wait_loop.quit)\n self.silent_execute(\"get_ipython().kernel.get_value('%s')\" % name)\n wait_loop.exec_()\n\n # Remove loop connection and loop\n self.sig_got_reply.disconnect(wait_loop.quit)\n wait_loop = None\n\n # Handle exceptions\n if self._kernel_value is None:\n if self._kernel_reply:\n msg = self._kernel_reply[:]\n self._kernel_reply = None\n raise ValueError(msg)\n\n return self._kernel_value\n\n def set_value(self, name, value):\n \"\"\"Set value for a variable\"\"\"\n value = to_text_string(value)\n self.silent_execute(\"get_ipython().kernel.set_value('%s', %s)\" %\n (name, value))\n\n def remove_value(self, name):\n \"\"\"Remove a variable\"\"\"\n self.silent_execute(\"get_ipython().kernel.remove_value('%s')\" % name)\n\n def copy_value(self, orig_name, new_name):\n \"\"\"Copy a variable\"\"\"\n self.silent_execute(\"get_ipython().kernel.copy_value('%s', '%s')\" %\n (orig_name, new_name))\n\n def load_data(self, filename, ext):\n # Wait until the kernel tries to load the file\n wait_loop = QEventLoop()\n self.sig_got_reply.connect(wait_loop.quit)\n self.silent_exec_method(\n \"get_ipython().kernel.load_data('%s', '%s')\" % (filename, ext))\n wait_loop.exec_()\n\n # Remove loop connection and loop\n self.sig_got_reply.disconnect(wait_loop.quit)\n wait_loop = None\n\n return self._kernel_reply\n\n def save_namespace(self, filename):\n # Wait until the kernel tries to save the file\n wait_loop = QEventLoop()\n self.sig_got_reply.connect(wait_loop.quit)\n self.silent_exec_method(\"get_ipython().kernel.save_namespace('%s')\" %\n filename)\n wait_loop.exec_()\n\n # Remove loop connection and loop\n self.sig_got_reply.disconnect(wait_loop.quit)\n wait_loop = None\n\n return self._kernel_reply\n\n # ---- Private API (defined by us) ------------------------------\n def _handle_data_message(self, msg):\n \"\"\"\n Handle raw (serialized) data sent by the kernel\n\n We only handle data asked by Spyder, in case people use\n publish_data for other purposes.\n \"\"\"\n # Deserialize data\n try:\n data = deserialize_object(msg['buffers'])[0]\n except Exception as msg:\n self._kernel_value = None\n self._kernel_reply = repr(msg)\n self.sig_got_reply.emit()\n return\n\n # We only handle data asked by Spyder\n value = data.get('__spy_data__', None)\n if value is not None:\n if isinstance(value, CannedObject):\n value = value.get_object()\n self._kernel_value = value\n self.sig_got_reply.emit()\n\n # ---- Private API (overrode by us) ----------------------------\n def _handle_execute_reply(self, msg):\n \"\"\"\n Reimplemented to handle communications between Spyder\n and the kernel\n \"\"\"\n msg_id = msg['parent_header']['msg_id']\n info = self._request_info['execute'].get(msg_id)\n # unset reading flag, because if execute finished, raw_input can't\n # still be pending.\n self._reading = False\n\n # Refresh namespacebrowser after the kernel starts running\n exec_count = msg['content']['execution_count']\n if exec_count == 0 and self._kernel_is_starting:\n self.set_namespace_view_settings()\n self.refresh_namespacebrowser()\n self._kernel_is_starting = False\n\n # Handle silent execution of kernel methods\n if info and info.kind == 'silent_exec_method' and not self._hidden:\n self.handle_exec_method(msg)\n self._request_info['execute'].pop(msg_id)\n else:\n super(NamepaceBrowserWidget, self)._handle_execute_reply(msg)\n\n def _handle_status(self, msg):\n \"\"\"\n Reimplemented to refresh the namespacebrowser after kernel\n restarts\n \"\"\"\n state = msg['content'].get('execution_state', '')\n msg_type = msg['parent_header'].get('msg_type', '')\n if state == 'starting' and not self._kernel_is_starting:\n # This handles restarts when the kernel dies\n # unexpectedly\n self._kernel_is_starting = True\n elif state == 'idle' and msg_type == 'shutdown_request':\n # This handles restarts asked by the user\n if self.namespacebrowser is not None:\n self.set_namespace_view_settings()\n self.refresh_namespacebrowser()\n else:\n super(NamepaceBrowserWidget, self)._handle_status(msg)\n", "path": "spyder/widgets/ipythonconsole/namespacebrowser.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n#\n# Copyright \u00a9 Spyder Project Contributors\n# Licensed under the terms of the MIT License\n# (see spyder/__init__.py for details)\n\n\"\"\"\nWidget that handle communications between the IPython Console and\nthe Variable Explorer\n\"\"\"\n\nfrom qtpy.QtCore import QEventLoop\n\nfrom ipykernel.pickleutil import CannedObject\nfrom ipykernel.serialize import deserialize_object\nfrom qtconsole.rich_jupyter_widget import RichJupyterWidget\n\nfrom spyder.config.base import _\nfrom spyder.py3compat import to_text_string\n\n\nclass NamepaceBrowserWidget(RichJupyterWidget):\n \"\"\"\n Widget with the necessary attributes and methods to handle communications\n between the IPython Console and the Variable Explorer\n \"\"\"\n\n # Reference to the nsb widget connected to this client\n namespacebrowser = None\n\n # To save the replies of kernel method executions (except\n # getting values of variables)\n _kernel_methods = {}\n\n # To save values and messages returned by the kernel\n _kernel_value = None\n _kernel_is_starting = True\n\n # --- Public API --------------------------------------------------\n def set_namespacebrowser(self, namespacebrowser):\n \"\"\"Set namespace browser widget\"\"\"\n self.namespacebrowser = namespacebrowser\n self.configure_namespacebrowser()\n\n def configure_namespacebrowser(self):\n \"\"\"Configure associated namespace browser widget\"\"\"\n # Tell it that we are connected to client\n self.namespacebrowser.is_ipyclient = True\n\n # Update namespace view\n self.sig_namespace_view.connect(lambda data:\n self.namespacebrowser.process_remote_view(data))\n\n # Update properties of variables\n self.sig_var_properties.connect(lambda data:\n self.namespacebrowser.set_var_properties(data))\n\n def refresh_namespacebrowser(self):\n \"\"\"Refresh namespace browser\"\"\"\n if self.namespacebrowser:\n self.silent_exec_method(\n 'get_ipython().kernel.get_namespace_view()')\n self.silent_exec_method(\n 'get_ipython().kernel.get_var_properties()')\n\n def set_namespace_view_settings(self):\n \"\"\"Set the namespace view settings\"\"\"\n settings = to_text_string(self.namespacebrowser.get_view_settings())\n code = u\"get_ipython().kernel.namespace_view_settings = %s\" % settings\n self.silent_execute(code)\n\n def get_value(self, name):\n \"\"\"Ask kernel for a value\"\"\"\n # Don't ask for values while reading (ipdb) is active\n if self._reading:\n raise ValueError(_(\"Inspecting and setting values while debugging \"\n \"in IPython consoles is not supported yet by \"\n \"Spyder.\"))\n\n # Wait until the kernel returns the value\n wait_loop = QEventLoop()\n self.sig_got_reply.connect(wait_loop.quit)\n self.silent_execute(\"get_ipython().kernel.get_value('%s')\" % name)\n wait_loop.exec_()\n\n # Remove loop connection and loop\n self.sig_got_reply.disconnect(wait_loop.quit)\n wait_loop = None\n\n # Handle exceptions\n if self._kernel_value is None:\n if self._kernel_reply:\n msg = self._kernel_reply[:]\n self._kernel_reply = None\n raise ValueError(msg)\n\n return self._kernel_value\n\n def set_value(self, name, value):\n \"\"\"Set value for a variable\"\"\"\n value = to_text_string(value)\n self.silent_execute(\"get_ipython().kernel.set_value('%s', %s)\" %\n (name, value))\n\n def remove_value(self, name):\n \"\"\"Remove a variable\"\"\"\n self.silent_execute(\"get_ipython().kernel.remove_value('%s')\" % name)\n\n def copy_value(self, orig_name, new_name):\n \"\"\"Copy a variable\"\"\"\n self.silent_execute(\"get_ipython().kernel.copy_value('%s', '%s')\" %\n (orig_name, new_name))\n\n def load_data(self, filename, ext):\n # Wait until the kernel tries to load the file\n wait_loop = QEventLoop()\n self.sig_got_reply.connect(wait_loop.quit)\n self.silent_exec_method(\n \"get_ipython().kernel.load_data('%s', '%s')\" % (filename, ext))\n wait_loop.exec_()\n\n # Remove loop connection and loop\n self.sig_got_reply.disconnect(wait_loop.quit)\n wait_loop = None\n\n return self._kernel_reply\n\n def save_namespace(self, filename):\n # Wait until the kernel tries to save the file\n wait_loop = QEventLoop()\n self.sig_got_reply.connect(wait_loop.quit)\n self.silent_exec_method(\"get_ipython().kernel.save_namespace('%s')\" %\n filename)\n wait_loop.exec_()\n\n # Remove loop connection and loop\n self.sig_got_reply.disconnect(wait_loop.quit)\n wait_loop = None\n\n return self._kernel_reply\n\n # ---- Private API (defined by us) ------------------------------\n def _handle_data_message(self, msg):\n \"\"\"\n Handle raw (serialized) data sent by the kernel\n\n We only handle data asked by Spyder, in case people use\n publish_data for other purposes.\n \"\"\"\n # Deserialize data\n try:\n data = deserialize_object(msg['buffers'])[0]\n except Exception as msg:\n self._kernel_value = None\n self._kernel_reply = repr(msg)\n self.sig_got_reply.emit()\n return\n\n # We only handle data asked by Spyder\n value = data.get('__spy_data__', None)\n if value is not None:\n if isinstance(value, CannedObject):\n value = value.get_object()\n self._kernel_value = value\n self.sig_got_reply.emit()\n\n # ---- Private API (overrode by us) ----------------------------\n def _handle_execute_reply(self, msg):\n \"\"\"\n Reimplemented to handle communications between Spyder\n and the kernel\n \"\"\"\n msg_id = msg['parent_header']['msg_id']\n info = self._request_info['execute'].get(msg_id)\n # unset reading flag, because if execute finished, raw_input can't\n # still be pending.\n self._reading = False\n\n # Refresh namespacebrowser after the kernel starts running\n exec_count = msg['content']['execution_count']\n if exec_count == 0 and self._kernel_is_starting:\n if self.namespacebrowser is not None:\n self.set_namespace_view_settings()\n self.refresh_namespacebrowser()\n self._kernel_is_starting = False\n\n # Handle silent execution of kernel methods\n if info and info.kind == 'silent_exec_method' and not self._hidden:\n self.handle_exec_method(msg)\n self._request_info['execute'].pop(msg_id)\n else:\n super(NamepaceBrowserWidget, self)._handle_execute_reply(msg)\n\n def _handle_status(self, msg):\n \"\"\"\n Reimplemented to refresh the namespacebrowser after kernel\n restarts\n \"\"\"\n state = msg['content'].get('execution_state', '')\n msg_type = msg['parent_header'].get('msg_type', '')\n if state == 'starting' and not self._kernel_is_starting:\n # This handles restarts when the kernel dies\n # unexpectedly\n self._kernel_is_starting = True\n elif state == 'idle' and msg_type == 'shutdown_request':\n # This handles restarts asked by the user\n if self.namespacebrowser is not None:\n self.set_namespace_view_settings()\n self.refresh_namespacebrowser()\n else:\n super(NamepaceBrowserWidget, self)._handle_status(msg)\n", "path": "spyder/widgets/ipythonconsole/namespacebrowser.py"}]}
| 2,657 | 170 |
gh_patches_debug_73
|
rasdani/github-patches
|
git_diff
|
pypa__setuptools-1043
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
No module named 'six'
Flask's Tox/Travis builds all started failing. Looks like a new version of setuptools was just released that has a problem with six.
~~~pytb
Obtaining file:///home/david/Projects/flask
Installing collected packages: Flask
Running setup.py develop for Flask
Complete output from command /home/david/Projects/flask/.tox/py/bin/python3 -c "import setuptools, tokenize;__file__='/home/david/Projects/flask/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" develop --no-deps:
/home/david/Projects/flask/.tox/py/lib/python3.6/site-packages/setuptools/dist.py:336: UserWarning: Normalizing '0.13-dev' to '0.13.dev0'
normalized_version,
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/home/david/Projects/flask/setup.py", line 109, in <module>
'''
File "/usr/lib64/python3.6/distutils/core.py", line 134, in setup
ok = dist.parse_command_line()
File "/home/david/Projects/flask/.tox/py/lib/python3.6/site-packages/setuptools/dist.py", line 363, in parse_command_line
result = _Distribution.parse_command_line(self)
File "/usr/lib64/python3.6/distutils/dist.py", line 472, in parse_command_line
args = self._parse_command_opts(parser, args)
File "/home/david/Projects/flask/.tox/py/lib/python3.6/site-packages/setuptools/dist.py", line 674, in _parse_command_opts
nargs = _Distribution._parse_command_opts(self, parser, args)
File "/usr/lib64/python3.6/distutils/dist.py", line 528, in _parse_command_opts
cmd_class = self.get_command_class(command)
File "/home/david/Projects/flask/.tox/py/lib/python3.6/site-packages/setuptools/dist.py", line 495, in get_command_class
self.cmdclass[command] = cmdclass = ep.load()
File "/home/david/Projects/flask/.tox/py/lib/python3.6/site-packages/pkg_resources/__init__.py", line 2303, in load
return self.resolve()
File "/home/david/Projects/flask/.tox/py/lib/python3.6/site-packages/pkg_resources/__init__.py", line 2309, in resolve
module = __import__(self.module_name, fromlist=['__name__'], level=0)
File "/home/david/Projects/flask/.tox/py/lib/python3.6/site-packages/setuptools/command/develop.py", line 11, in <module>
from setuptools.command.easy_install import easy_install
File "/home/david/Projects/flask/.tox/py/lib/python3.6/site-packages/setuptools/command/easy_install.py", line 49, in <module>
from setuptools.py27compat import rmtree_safe
File "/home/david/Projects/flask/.tox/py/lib/python3.6/site-packages/setuptools/py27compat.py", line 7, in <module>
import six
ModuleNotFoundError: No module named 'six'
~~~
Example failed build log: https://travis-ci.org/pallets/flask/jobs/238166427#L242
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setuptools/py27compat.py`
Content:
```
1 """
2 Compatibility Support for Python 2.7 and earlier
3 """
4
5 import platform
6
7 import six
8
9
10 def get_all_headers(message, key):
11 """
12 Given an HTTPMessage, return all headers matching a given key.
13 """
14 return message.get_all(key)
15
16
17 if six.PY2:
18 def get_all_headers(message, key):
19 return message.getheaders(key)
20
21
22 linux_py2_ascii = (
23 platform.system() == 'Linux' and
24 six.PY2
25 )
26
27 rmtree_safe = str if linux_py2_ascii else lambda x: x
28 """Workaround for http://bugs.python.org/issue24672"""
29
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/setuptools/py27compat.py b/setuptools/py27compat.py
--- a/setuptools/py27compat.py
+++ b/setuptools/py27compat.py
@@ -4,7 +4,7 @@
import platform
-import six
+from setuptools.extern import six
def get_all_headers(message, key):
|
{"golden_diff": "diff --git a/setuptools/py27compat.py b/setuptools/py27compat.py\n--- a/setuptools/py27compat.py\n+++ b/setuptools/py27compat.py\n@@ -4,7 +4,7 @@\n \n import platform\n \n-import six\n+from setuptools.extern import six\n \n \n def get_all_headers(message, key):\n", "issue": "No module named 'six'\nFlask's Tox/Travis builds all started failing. Looks like a new version of setuptools was just released that has a problem with six.\r\n\r\n~~~pytb\r\nObtaining file:///home/david/Projects/flask\r\nInstalling collected packages: Flask\r\n Running setup.py develop for Flask\r\n Complete output from command /home/david/Projects/flask/.tox/py/bin/python3 -c \"import setuptools, tokenize;__file__='/home/david/Projects/flask/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\\r\\n', '\\n');f.close();exec(compile(code, __file__, 'exec'))\" develop --no-deps:\r\n /home/david/Projects/flask/.tox/py/lib/python3.6/site-packages/setuptools/dist.py:336: UserWarning: Normalizing '0.13-dev' to '0.13.dev0'\r\n normalized_version,\r\n Traceback (most recent call last):\r\n File \"<string>\", line 1, in <module>\r\n File \"/home/david/Projects/flask/setup.py\", line 109, in <module>\r\n '''\r\n File \"/usr/lib64/python3.6/distutils/core.py\", line 134, in setup\r\n ok = dist.parse_command_line()\r\n File \"/home/david/Projects/flask/.tox/py/lib/python3.6/site-packages/setuptools/dist.py\", line 363, in parse_command_line\r\n result = _Distribution.parse_command_line(self)\r\n File \"/usr/lib64/python3.6/distutils/dist.py\", line 472, in parse_command_line\r\n args = self._parse_command_opts(parser, args)\r\n File \"/home/david/Projects/flask/.tox/py/lib/python3.6/site-packages/setuptools/dist.py\", line 674, in _parse_command_opts\r\n nargs = _Distribution._parse_command_opts(self, parser, args)\r\n File \"/usr/lib64/python3.6/distutils/dist.py\", line 528, in _parse_command_opts\r\n cmd_class = self.get_command_class(command)\r\n File \"/home/david/Projects/flask/.tox/py/lib/python3.6/site-packages/setuptools/dist.py\", line 495, in get_command_class\r\n self.cmdclass[command] = cmdclass = ep.load()\r\n File \"/home/david/Projects/flask/.tox/py/lib/python3.6/site-packages/pkg_resources/__init__.py\", line 2303, in load\r\n return self.resolve()\r\n File \"/home/david/Projects/flask/.tox/py/lib/python3.6/site-packages/pkg_resources/__init__.py\", line 2309, in resolve\r\n module = __import__(self.module_name, fromlist=['__name__'], level=0)\r\n File \"/home/david/Projects/flask/.tox/py/lib/python3.6/site-packages/setuptools/command/develop.py\", line 11, in <module>\r\n from setuptools.command.easy_install import easy_install\r\n File \"/home/david/Projects/flask/.tox/py/lib/python3.6/site-packages/setuptools/command/easy_install.py\", line 49, in <module>\r\n from setuptools.py27compat import rmtree_safe\r\n File \"/home/david/Projects/flask/.tox/py/lib/python3.6/site-packages/setuptools/py27compat.py\", line 7, in <module>\r\n import six\r\n ModuleNotFoundError: No module named 'six'\r\n~~~\r\n\r\nExample failed build log: https://travis-ci.org/pallets/flask/jobs/238166427#L242\n", "before_files": [{"content": "\"\"\"\nCompatibility Support for Python 2.7 and earlier\n\"\"\"\n\nimport platform\n\nimport six\n\n\ndef get_all_headers(message, key):\n \"\"\"\n Given an HTTPMessage, return all headers matching a given key.\n \"\"\"\n return message.get_all(key)\n\n\nif six.PY2:\n def get_all_headers(message, key):\n return message.getheaders(key)\n\n\nlinux_py2_ascii = (\n platform.system() == 'Linux' and\n six.PY2\n)\n\nrmtree_safe = str if linux_py2_ascii else lambda x: x\n\"\"\"Workaround for http://bugs.python.org/issue24672\"\"\"\n", "path": "setuptools/py27compat.py"}], "after_files": [{"content": "\"\"\"\nCompatibility Support for Python 2.7 and earlier\n\"\"\"\n\nimport platform\n\nfrom setuptools.extern import six\n\n\ndef get_all_headers(message, key):\n \"\"\"\n Given an HTTPMessage, return all headers matching a given key.\n \"\"\"\n return message.get_all(key)\n\n\nif six.PY2:\n def get_all_headers(message, key):\n return message.getheaders(key)\n\n\nlinux_py2_ascii = (\n platform.system() == 'Linux' and\n six.PY2\n)\n\nrmtree_safe = str if linux_py2_ascii else lambda x: x\n\"\"\"Workaround for http://bugs.python.org/issue24672\"\"\"\n", "path": "setuptools/py27compat.py"}]}
| 1,254 | 73 |
gh_patches_debug_23359
|
rasdani/github-patches
|
git_diff
|
openstates__openstates-scrapers-2072
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
MS failing since at least 2017-12-31
MS has been failing since 2017-12-31
Based on automated runs it appears that MS has not run successfully in 7 days (2017-12-31).
```
00:00:18 CRITICAL pupa: Session(s) 2018 Regular Session were reported by Mississippi.get_session_list() but were not found in Mississippi.legislative_sessions or Mississippi.ignored_scraped_sessions.
no pupa_settings on path, using defaults
ms (scrape, import)
bills: {}
committee: {}
people: {}
```
Visit http://bobsled.openstates.org for more info.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `openstates/ms/people.py`
Content:
```
1 import lxml.etree
2
3 from pupa.scrape import Person, Scraper
4
5 import scrapelib
6 import os.path
7
8
9 CAP_ADDRESS = """P. O. Box 1018
10 Jackson, MS 39215"""
11
12
13 class MSLegislatorScraper(Scraper):
14
15 def scrape(self, chamber=None):
16 if chamber:
17 yield from self.scrape_legs(chamber)
18 else:
19 yield from self.scrape_legs('upper')
20 yield from self.scrape_legs('lower')
21
22 def scrape_legs(self, chamber):
23 if chamber == 'upper':
24 url = 'http://billstatus.ls.state.ms.us/members/ss_membs.xml'
25 range_num = 5
26 else:
27 url = 'http://billstatus.ls.state.ms.us/members/hr_membs.xml'
28 range_num = 6
29
30 leg_dir_page = self.get(url)
31 root = lxml.etree.fromstring(leg_dir_page.content)
32 for mr in root.xpath('//LEGISLATURE/MEMBER'):
33 for num in range(1, range_num):
34 leg_path = "string(M%s_NAME)" % num
35 leg_link_path = "string(M%s_LINK)" % num
36 leg = mr.xpath(leg_path)
37 leg_link = mr.xpath(leg_link_path)
38 role = ""
39 yield from self.scrape_details(chamber, leg, leg_link, role)
40
41 # TODO: come back and do roles correctly at some point
42
43 # if chamber == 'lower':
44 # chair_name = root.xpath('string(//CHAIR_NAME)')
45 # chair_link = root.xpath('string(//CHAIR_LINK)')
46 # role = root.xpath('string(//CHAIR_TITLE)')
47 # yield from self.scrape_details(chamber, chair_name, chair_link, role)
48 # else:
49 # Senate Chair is the Governor. Info has to be hard coded
50 # chair_name = root.xpath('string(//CHAIR_NAME)')
51 # role = root.xpath('string(//CHAIR_TITLE)')
52 # TODO: if we're going to hardcode the governor, do it better
53 # district = "Governor"
54 # leg = Legislator(term_name, chamber, district, chair_name,
55 # first_name="", last_name="", middle_name="",
56 # party="Republican", role=role)
57
58 # disabling this since it creates duplicates right now
59 # protemp_name = root.xpath('string(//PROTEMP_NAME)')
60 # protemp_link = root.xpath('string(//PROTEMP_LINK)')
61 # role = root.xpath('string(//PROTEMP_TITLE)')
62 # yield from self.scrape_details(chamber, protemp_name, protemp_link, role)
63
64 def scrape_details(self, chamber, leg_name, leg_link, role):
65 if not leg_link:
66 # Vacant post, likely:
67 if "Vacancy" in leg_name:
68 return
69 raise Exception("leg_link is null. something went wrong")
70 try:
71 url = 'http://billstatus.ls.state.ms.us/members/%s' % leg_link
72 url_root = os.path.dirname(url)
73 details_page = self.get(url)
74 root = lxml.etree.fromstring(details_page.content)
75 party = root.xpath('string(//PARTY)')
76
77 district = root.xpath('string(//DISTRICT)')
78
79 photo = "%s/%s" % (url_root, root.xpath('string(//IMG_NAME)'))
80
81 home_phone = root.xpath('string(//H_PHONE)')
82
83 home_address = root.xpath('string(//H_ADDRESS)')
84 home_address2 = root.xpath('string(//H_ADDRESS2)')
85 home_city = root.xpath('string(//H_CITY)')
86 home_zip = root.xpath('string(//H_ZIP)')
87
88 home_address_total = "%s\n%s\n%s\n%s" % (
89 home_address,
90 home_address2,
91 home_city,
92 home_zip
93 )
94
95 # bis_phone = root.xpath('string(//B_PHONE)')
96 capital_phone = root.xpath('string(//CAP_PHONE)')
97 # other_phone = root.xpath('string(//OTH_PHONE)')
98 org_info = root.xpath('string(//ORG_INFO)')
99 email_name = root.xpath('string(//EMAIL_ADDRESS)').strip()
100 cap_room = root.xpath('string(//CAP_ROOM)')
101
102 if leg_name in ('Lataisha Jackson', 'John G. Faulkner', 'Cheikh Taylor'):
103 assert not party, ("Remove special-casing for this Democrat without a "
104 "listed party: {}").format(leg_name)
105 party = 'Democratic'
106 elif leg_name in ('James W. Mathis',
107 'John Glen Corley',
108 'Missy McGee',
109 'Kevin Ford'):
110 assert not party, ("Remove special-casing for this Republican without"
111 " a listed party: {}").format(leg_name)
112 party = 'Republican'
113 elif party == 'D':
114 party = 'Democratic'
115 elif party == 'R':
116 party = 'Republican'
117 else:
118 raise AssertionError(
119 "A member with no identifiable party was found: {}".format(leg_name))
120 leg = Person(primary_org=chamber,
121 district=district,
122 party=party,
123 image=photo,
124 name=leg_name,
125 role=role
126 )
127 leg.extras['org_info'] = org_info
128 leg.add_source(url)
129 leg.add_link(url)
130
131 if email_name != "":
132 if "@" in email_name:
133 email = email_name
134 else:
135 email = '%s@%s.ms.gov' % (email_name,
136 {"upper": "senate", "lower": "house"}[chamber])
137 leg.add_contact_detail(type='email', value=email, note='Capitol Office')
138
139 if capital_phone != "":
140 leg.add_contact_detail(type='voice', value=capital_phone, note='Capitol Office')
141
142 if cap_room != "":
143 address = "Room %s\n%s" % (cap_room, CAP_ADDRESS)
144 else:
145 address = CAP_ADDRESS
146 leg.add_contact_detail(type='address', value=address, note='Capitol Office')
147
148 if home_phone != "":
149 leg.add_contact_detail(type='voice', value=home_phone, note='District Office')
150
151 if home_address_total != "":
152 leg.add_contact_detail(type='address',
153 value=home_address_total,
154 note='District Office')
155
156 yield leg
157 except scrapelib.HTTPError as e:
158 self.warning(str(e))
159
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/openstates/ms/people.py b/openstates/ms/people.py
--- a/openstates/ms/people.py
+++ b/openstates/ms/people.py
@@ -99,14 +99,12 @@
email_name = root.xpath('string(//EMAIL_ADDRESS)').strip()
cap_room = root.xpath('string(//CAP_ROOM)')
- if leg_name in ('Lataisha Jackson', 'John G. Faulkner', 'Cheikh Taylor'):
+ if leg_name in ('Lataisha Jackson', 'John G. Faulkner'):
assert not party, ("Remove special-casing for this Democrat without a "
"listed party: {}").format(leg_name)
party = 'Democratic'
elif leg_name in ('James W. Mathis',
- 'John Glen Corley',
- 'Missy McGee',
- 'Kevin Ford'):
+ 'John Glen Corley'):
assert not party, ("Remove special-casing for this Republican without"
" a listed party: {}").format(leg_name)
party = 'Republican'
|
{"golden_diff": "diff --git a/openstates/ms/people.py b/openstates/ms/people.py\n--- a/openstates/ms/people.py\n+++ b/openstates/ms/people.py\n@@ -99,14 +99,12 @@\n email_name = root.xpath('string(//EMAIL_ADDRESS)').strip()\n cap_room = root.xpath('string(//CAP_ROOM)')\n \n- if leg_name in ('Lataisha Jackson', 'John G. Faulkner', 'Cheikh Taylor'):\n+ if leg_name in ('Lataisha Jackson', 'John G. Faulkner'):\n assert not party, (\"Remove special-casing for this Democrat without a \"\n \"listed party: {}\").format(leg_name)\n party = 'Democratic'\n elif leg_name in ('James W. Mathis',\n- 'John Glen Corley',\n- 'Missy McGee',\n- 'Kevin Ford'):\n+ 'John Glen Corley'):\n assert not party, (\"Remove special-casing for this Republican without\"\n \" a listed party: {}\").format(leg_name)\n party = 'Republican'\n", "issue": "MS failing since at least 2017-12-31\nMS has been failing since 2017-12-31\n\nBased on automated runs it appears that MS has not run successfully in 7 days (2017-12-31).\n\n\n```\n 00:00:18 CRITICAL pupa: Session(s) 2018 Regular Session were reported by Mississippi.get_session_list() but were not found in Mississippi.legislative_sessions or Mississippi.ignored_scraped_sessions.\nno pupa_settings on path, using defaults\nms (scrape, import)\n bills: {}\n committee: {}\n people: {}\n```\n\nVisit http://bobsled.openstates.org for more info.\n\n", "before_files": [{"content": "import lxml.etree\n\nfrom pupa.scrape import Person, Scraper\n\nimport scrapelib\nimport os.path\n\n\nCAP_ADDRESS = \"\"\"P. O. Box 1018\nJackson, MS 39215\"\"\"\n\n\nclass MSLegislatorScraper(Scraper):\n\n def scrape(self, chamber=None):\n if chamber:\n yield from self.scrape_legs(chamber)\n else:\n yield from self.scrape_legs('upper')\n yield from self.scrape_legs('lower')\n\n def scrape_legs(self, chamber):\n if chamber == 'upper':\n url = 'http://billstatus.ls.state.ms.us/members/ss_membs.xml'\n range_num = 5\n else:\n url = 'http://billstatus.ls.state.ms.us/members/hr_membs.xml'\n range_num = 6\n\n leg_dir_page = self.get(url)\n root = lxml.etree.fromstring(leg_dir_page.content)\n for mr in root.xpath('//LEGISLATURE/MEMBER'):\n for num in range(1, range_num):\n leg_path = \"string(M%s_NAME)\" % num\n leg_link_path = \"string(M%s_LINK)\" % num\n leg = mr.xpath(leg_path)\n leg_link = mr.xpath(leg_link_path)\n role = \"\"\n yield from self.scrape_details(chamber, leg, leg_link, role)\n\n # TODO: come back and do roles correctly at some point\n\n # if chamber == 'lower':\n # chair_name = root.xpath('string(//CHAIR_NAME)')\n # chair_link = root.xpath('string(//CHAIR_LINK)')\n # role = root.xpath('string(//CHAIR_TITLE)')\n # yield from self.scrape_details(chamber, chair_name, chair_link, role)\n # else:\n # Senate Chair is the Governor. Info has to be hard coded\n # chair_name = root.xpath('string(//CHAIR_NAME)')\n # role = root.xpath('string(//CHAIR_TITLE)')\n # TODO: if we're going to hardcode the governor, do it better\n # district = \"Governor\"\n # leg = Legislator(term_name, chamber, district, chair_name,\n # first_name=\"\", last_name=\"\", middle_name=\"\",\n # party=\"Republican\", role=role)\n\n # disabling this since it creates duplicates right now\n # protemp_name = root.xpath('string(//PROTEMP_NAME)')\n # protemp_link = root.xpath('string(//PROTEMP_LINK)')\n # role = root.xpath('string(//PROTEMP_TITLE)')\n # yield from self.scrape_details(chamber, protemp_name, protemp_link, role)\n\n def scrape_details(self, chamber, leg_name, leg_link, role):\n if not leg_link:\n # Vacant post, likely:\n if \"Vacancy\" in leg_name:\n return\n raise Exception(\"leg_link is null. something went wrong\")\n try:\n url = 'http://billstatus.ls.state.ms.us/members/%s' % leg_link\n url_root = os.path.dirname(url)\n details_page = self.get(url)\n root = lxml.etree.fromstring(details_page.content)\n party = root.xpath('string(//PARTY)')\n\n district = root.xpath('string(//DISTRICT)')\n\n photo = \"%s/%s\" % (url_root, root.xpath('string(//IMG_NAME)'))\n\n home_phone = root.xpath('string(//H_PHONE)')\n\n home_address = root.xpath('string(//H_ADDRESS)')\n home_address2 = root.xpath('string(//H_ADDRESS2)')\n home_city = root.xpath('string(//H_CITY)')\n home_zip = root.xpath('string(//H_ZIP)')\n\n home_address_total = \"%s\\n%s\\n%s\\n%s\" % (\n home_address,\n home_address2,\n home_city,\n home_zip\n )\n\n # bis_phone = root.xpath('string(//B_PHONE)')\n capital_phone = root.xpath('string(//CAP_PHONE)')\n # other_phone = root.xpath('string(//OTH_PHONE)')\n org_info = root.xpath('string(//ORG_INFO)')\n email_name = root.xpath('string(//EMAIL_ADDRESS)').strip()\n cap_room = root.xpath('string(//CAP_ROOM)')\n\n if leg_name in ('Lataisha Jackson', 'John G. Faulkner', 'Cheikh Taylor'):\n assert not party, (\"Remove special-casing for this Democrat without a \"\n \"listed party: {}\").format(leg_name)\n party = 'Democratic'\n elif leg_name in ('James W. Mathis',\n 'John Glen Corley',\n 'Missy McGee',\n 'Kevin Ford'):\n assert not party, (\"Remove special-casing for this Republican without\"\n \" a listed party: {}\").format(leg_name)\n party = 'Republican'\n elif party == 'D':\n party = 'Democratic'\n elif party == 'R':\n party = 'Republican'\n else:\n raise AssertionError(\n \"A member with no identifiable party was found: {}\".format(leg_name))\n leg = Person(primary_org=chamber,\n district=district,\n party=party,\n image=photo,\n name=leg_name,\n role=role\n )\n leg.extras['org_info'] = org_info\n leg.add_source(url)\n leg.add_link(url)\n\n if email_name != \"\":\n if \"@\" in email_name:\n email = email_name\n else:\n email = '%s@%s.ms.gov' % (email_name,\n {\"upper\": \"senate\", \"lower\": \"house\"}[chamber])\n leg.add_contact_detail(type='email', value=email, note='Capitol Office')\n\n if capital_phone != \"\":\n leg.add_contact_detail(type='voice', value=capital_phone, note='Capitol Office')\n\n if cap_room != \"\":\n address = \"Room %s\\n%s\" % (cap_room, CAP_ADDRESS)\n else:\n address = CAP_ADDRESS\n leg.add_contact_detail(type='address', value=address, note='Capitol Office')\n\n if home_phone != \"\":\n leg.add_contact_detail(type='voice', value=home_phone, note='District Office')\n\n if home_address_total != \"\":\n leg.add_contact_detail(type='address',\n value=home_address_total,\n note='District Office')\n\n yield leg\n except scrapelib.HTTPError as e:\n self.warning(str(e))\n", "path": "openstates/ms/people.py"}], "after_files": [{"content": "import lxml.etree\n\nfrom pupa.scrape import Person, Scraper\n\nimport scrapelib\nimport os.path\n\n\nCAP_ADDRESS = \"\"\"P. O. Box 1018\nJackson, MS 39215\"\"\"\n\n\nclass MSLegislatorScraper(Scraper):\n\n def scrape(self, chamber=None):\n if chamber:\n yield from self.scrape_legs(chamber)\n else:\n yield from self.scrape_legs('upper')\n yield from self.scrape_legs('lower')\n\n def scrape_legs(self, chamber):\n if chamber == 'upper':\n url = 'http://billstatus.ls.state.ms.us/members/ss_membs.xml'\n range_num = 5\n else:\n url = 'http://billstatus.ls.state.ms.us/members/hr_membs.xml'\n range_num = 6\n\n leg_dir_page = self.get(url)\n root = lxml.etree.fromstring(leg_dir_page.content)\n for mr in root.xpath('//LEGISLATURE/MEMBER'):\n for num in range(1, range_num):\n leg_path = \"string(M%s_NAME)\" % num\n leg_link_path = \"string(M%s_LINK)\" % num\n leg = mr.xpath(leg_path)\n leg_link = mr.xpath(leg_link_path)\n role = \"\"\n yield from self.scrape_details(chamber, leg, leg_link, role)\n\n # TODO: come back and do roles correctly at some point\n\n # if chamber == 'lower':\n # chair_name = root.xpath('string(//CHAIR_NAME)')\n # chair_link = root.xpath('string(//CHAIR_LINK)')\n # role = root.xpath('string(//CHAIR_TITLE)')\n # yield from self.scrape_details(chamber, chair_name, chair_link, role)\n # else:\n # Senate Chair is the Governor. Info has to be hard coded\n # chair_name = root.xpath('string(//CHAIR_NAME)')\n # role = root.xpath('string(//CHAIR_TITLE)')\n # TODO: if we're going to hardcode the governor, do it better\n # district = \"Governor\"\n # leg = Legislator(term_name, chamber, district, chair_name,\n # first_name=\"\", last_name=\"\", middle_name=\"\",\n # party=\"Republican\", role=role)\n\n # disabling this since it creates duplicates right now\n # protemp_name = root.xpath('string(//PROTEMP_NAME)')\n # protemp_link = root.xpath('string(//PROTEMP_LINK)')\n # role = root.xpath('string(//PROTEMP_TITLE)')\n # yield from self.scrape_details(chamber, protemp_name, protemp_link, role)\n\n def scrape_details(self, chamber, leg_name, leg_link, role):\n if not leg_link:\n # Vacant post, likely:\n if \"Vacancy\" in leg_name:\n return\n raise Exception(\"leg_link is null. something went wrong\")\n try:\n url = 'http://billstatus.ls.state.ms.us/members/%s' % leg_link\n url_root = os.path.dirname(url)\n details_page = self.get(url)\n root = lxml.etree.fromstring(details_page.content)\n party = root.xpath('string(//PARTY)')\n\n district = root.xpath('string(//DISTRICT)')\n\n photo = \"%s/%s\" % (url_root, root.xpath('string(//IMG_NAME)'))\n\n home_phone = root.xpath('string(//H_PHONE)')\n\n home_address = root.xpath('string(//H_ADDRESS)')\n home_address2 = root.xpath('string(//H_ADDRESS2)')\n home_city = root.xpath('string(//H_CITY)')\n home_zip = root.xpath('string(//H_ZIP)')\n\n home_address_total = \"%s\\n%s\\n%s\\n%s\" % (\n home_address,\n home_address2,\n home_city,\n home_zip\n )\n\n # bis_phone = root.xpath('string(//B_PHONE)')\n capital_phone = root.xpath('string(//CAP_PHONE)')\n # other_phone = root.xpath('string(//OTH_PHONE)')\n org_info = root.xpath('string(//ORG_INFO)')\n email_name = root.xpath('string(//EMAIL_ADDRESS)').strip()\n cap_room = root.xpath('string(//CAP_ROOM)')\n\n if leg_name in ('Lataisha Jackson', 'John G. Faulkner'):\n assert not party, (\"Remove special-casing for this Democrat without a \"\n \"listed party: {}\").format(leg_name)\n party = 'Democratic'\n elif leg_name in ('James W. Mathis',\n 'John Glen Corley'):\n assert not party, (\"Remove special-casing for this Republican without\"\n \" a listed party: {}\").format(leg_name)\n party = 'Republican'\n elif party == 'D':\n party = 'Democratic'\n elif party == 'R':\n party = 'Republican'\n else:\n raise AssertionError(\n \"A member with no identifiable party was found: {}\".format(leg_name))\n leg = Person(primary_org=chamber,\n district=district,\n party=party,\n image=photo,\n name=leg_name,\n role=role\n )\n leg.extras['org_info'] = org_info\n leg.add_source(url)\n leg.add_link(url)\n\n if email_name != \"\":\n if \"@\" in email_name:\n email = email_name\n else:\n email = '%s@%s.ms.gov' % (email_name,\n {\"upper\": \"senate\", \"lower\": \"house\"}[chamber])\n leg.add_contact_detail(type='email', value=email, note='Capitol Office')\n\n if capital_phone != \"\":\n leg.add_contact_detail(type='voice', value=capital_phone, note='Capitol Office')\n\n if cap_room != \"\":\n address = \"Room %s\\n%s\" % (cap_room, CAP_ADDRESS)\n else:\n address = CAP_ADDRESS\n leg.add_contact_detail(type='address', value=address, note='Capitol Office')\n\n if home_phone != \"\":\n leg.add_contact_detail(type='voice', value=home_phone, note='District Office')\n\n if home_address_total != \"\":\n leg.add_contact_detail(type='address',\n value=home_address_total,\n note='District Office')\n\n yield leg\n except scrapelib.HTTPError as e:\n self.warning(str(e))\n", "path": "openstates/ms/people.py"}]}
| 2,189 | 230 |
gh_patches_debug_28656
|
rasdani/github-patches
|
git_diff
|
openai__gym-2909
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[Bug Report] Undue UserWarning: We recommend you to use a symmetric and normalized Box action space (range=[-1, 1])
It seems like there is an inversion in the `gym.utils.env_checker.check_env` function: observation space is checked with `check_action_space`
https://github.com/openai/gym/blob/053ee809324861dc1bec19feee97b47ba1557a94/gym/utils/env_checker.py#L237
and action space is checked with `check_observation_space`
https://github.com/openai/gym/blob/053ee809324861dc1bec19feee97b47ba1557a94/gym/utils/env_checker.py#L233
causing undue warnings such as
```log
UserWarning: WARN: We recommend you to use a symmetric and normalized Box action space (range=[-1, 1])
```
### Checklist
- [x] I have checked that there is no similar [issue](https://github.com/openai/gym/issues) in the repo (**required**)
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `gym/wrappers/env_checker.py`
Content:
```
1 """A passive environment checker wrapper for an environment's observation and action space along with the reset, step and render functions."""
2 from typing import Tuple, Union
3
4 import gym
5 from gym.core import ActType, ObsType
6 from gym.utils.passive_env_checker import (
7 check_action_space,
8 check_observation_space,
9 passive_env_render_check,
10 passive_env_reset_check,
11 passive_env_step_check,
12 )
13
14
15 class PassiveEnvChecker(gym.Wrapper):
16 """A passive environment checker wrapper that surrounds the step, reset and render functions to check they follow the gym API."""
17
18 def __init__(self, env):
19 """Initialises the wrapper with the environments, run the observation and action space tests."""
20 super().__init__(env)
21
22 assert hasattr(
23 env, "action_space"
24 ), "You must specify a action space. https://www.gymlibrary.ml/content/environment_creation/"
25 check_observation_space(env.action_space)
26 assert hasattr(
27 env, "observation_space"
28 ), "You must specify an observation space. https://www.gymlibrary.ml/content/environment_creation/"
29 check_action_space(env.observation_space)
30
31 self.checked_reset = False
32 self.checked_step = False
33 self.checked_render = False
34
35 def step(self, action: ActType) -> Tuple[ObsType, float, bool, dict]:
36 """Steps through the environment that on the first call will run the `passive_env_step_check`."""
37 if self.checked_step is False:
38 self.checked_step = True
39 return passive_env_step_check(self.env, action)
40 else:
41 return self.env.step(action)
42
43 def reset(self, **kwargs) -> Union[ObsType, Tuple[ObsType, dict]]:
44 """Resets the environment that on the first call will run the `passive_env_reset_check`."""
45 if self.checked_reset is False:
46 self.checked_reset = True
47 return passive_env_reset_check(self.env, **kwargs)
48 else:
49 return self.env.reset(**kwargs)
50
51 def render(self, *args, **kwargs):
52 """Renders the environment that on the first call will run the `passive_env_render_check`."""
53 if self.checked_render is False:
54 self.checked_render = True
55 return passive_env_render_check(self.env, *args, **kwargs)
56 else:
57 return self.env.render(*args, **kwargs)
58
```
Path: `gym/utils/env_checker.py`
Content:
```
1 """A set of functions for checking an environment details.
2
3 This file is originally from the Stable Baselines3 repository hosted on GitHub
4 (https://github.com/DLR-RM/stable-baselines3/)
5 Original Author: Antonin Raffin
6
7 It also uses some warnings/assertions from the PettingZoo repository hosted on GitHub
8 (https://github.com/PettingZoo-Team/PettingZoo)
9 Original Author: J K Terry
10
11 This was rewritten and split into "env_checker.py" and "passive_env_checker.py" for invasive and passive environment checking
12 Original Author: Mark Towers
13
14 These projects are covered by the MIT License.
15 """
16
17 import inspect
18 from copy import deepcopy
19
20 import numpy as np
21
22 import gym
23 from gym import error, logger
24 from gym.utils.passive_env_checker import (
25 check_action_space,
26 check_observation_space,
27 passive_env_reset_check,
28 passive_env_step_check,
29 )
30
31
32 def data_equivalence(data_1, data_2) -> bool:
33 """Assert equality between data 1 and 2, i.e observations, actions, info.
34
35 Args:
36 data_1: data structure 1
37 data_2: data structure 2
38
39 Returns:
40 If observation 1 and 2 are equivalent
41 """
42 if type(data_1) == type(data_2):
43 if isinstance(data_1, dict):
44 return data_1.keys() == data_2.keys() and all(
45 data_equivalence(data_1[k], data_2[k]) for k in data_1.keys()
46 )
47 elif isinstance(data_1, tuple):
48 return len(data_1) == len(data_2) and all(
49 data_equivalence(o_1, o_2) for o_1, o_2 in zip(data_1, data_2)
50 )
51 elif isinstance(data_1, np.ndarray):
52 return np.all(data_1 == data_2)
53 else:
54 return data_1 == data_2
55 else:
56 return False
57
58
59 def check_reset_seed(env: gym.Env):
60 """Check that the environment can be reset with a seed.
61
62 Args:
63 env: The environment to check
64
65 Raises:
66 AssertionError: The environment cannot be reset with a random seed,
67 even though `seed` or `kwargs` appear in the signature.
68 """
69 signature = inspect.signature(env.reset)
70 if "seed" in signature.parameters or "kwargs" in signature.parameters:
71 try:
72 obs_1 = env.reset(seed=123)
73 assert obs_1 in env.observation_space
74 obs_2 = env.reset(seed=123)
75 assert obs_2 in env.observation_space
76 assert data_equivalence(obs_1, obs_2)
77 seed_123_rng = deepcopy(env.unwrapped.np_random)
78
79 # Note: for some environment, they may initialise at the same state, therefore we cannot check the obs_1 != obs_3
80 obs_4 = env.reset(seed=None)
81 assert obs_4 in env.observation_space
82
83 assert (
84 env.unwrapped.np_random.bit_generator.state
85 != seed_123_rng.bit_generator.state
86 )
87 except TypeError as e:
88 raise AssertionError(
89 "The environment cannot be reset with a random seed, even though `seed` or `kwargs` appear in the signature. "
90 "This should never happen, please report this issue. "
91 f"The error was: {e}"
92 )
93
94 if env.unwrapped.np_random is None:
95 logger.warn(
96 "Resetting the environment did not result in seeding its random number generator. "
97 "This is likely due to not calling `super().reset(seed=seed)` in the `reset` method. "
98 "If you do not use the python-level random number generator, this is not a problem."
99 )
100
101 seed_param = signature.parameters.get("seed")
102 # Check the default value is None
103 if seed_param is not None and seed_param.default is not None:
104 logger.warn(
105 "The default seed argument in reset should be `None`, "
106 "otherwise the environment will by default always be deterministic"
107 )
108 else:
109 raise error.Error(
110 "The `reset` method does not provide the `return_info` keyword argument"
111 )
112
113
114 def check_reset_info(env: gym.Env):
115 """Checks that :meth:`reset` supports the ``return_info`` keyword.
116
117 Args:
118 env: The environment to check
119
120 Raises:
121 AssertionError: The environment cannot be reset with `return_info=True`,
122 even though `return_info` or `kwargs` appear in the signature.
123 """
124 signature = inspect.signature(env.reset)
125 if "return_info" in signature.parameters or "kwargs" in signature.parameters:
126 try:
127 result = env.reset(return_info=True)
128 assert (
129 len(result) == 2
130 ), "Calling the reset method with `return_info=True` did not return a 2-tuple"
131 obs, info = result
132 assert isinstance(
133 info, dict
134 ), "The second element returned by `env.reset(return_info=True)` was not a dictionary"
135 except TypeError as e:
136 raise AssertionError(
137 "The environment cannot be reset with `return_info=True`, even though `return_info` or `kwargs` "
138 "appear in the signature. This should never happen, please report this issue. "
139 f"The error was: {e}"
140 )
141 else:
142 raise error.Error(
143 "The `reset` method does not provide the `return_info` keyword argument"
144 )
145
146
147 def check_reset_options(env: gym.Env):
148 """Check that the environment can be reset with options.
149
150 Args:
151 env: The environment to check
152
153 Raises:
154 AssertionError: The environment cannot be reset with options,
155 even though `options` or `kwargs` appear in the signature.
156 """
157 signature = inspect.signature(env.reset)
158 if "options" in signature.parameters or "kwargs" in signature.parameters:
159 try:
160 env.reset(options={})
161 except TypeError as e:
162 raise AssertionError(
163 "The environment cannot be reset with options, even though `options` or `kwargs` appear in the signature. "
164 "This should never happen, please report this issue. "
165 f"The error was: {e}"
166 )
167 else:
168 raise error.Error(
169 "The `reset` method does not provide the `options` keyword argument"
170 )
171
172
173 def check_render(env: gym.Env, warn: bool = True):
174 """Check the declared render modes/fps of the environment.
175
176 Args:
177 env: The environment to check
178 warn: Whether to output additional warnings
179 """
180 render_modes = env.metadata.get("render_modes")
181 if render_modes is None:
182 if warn:
183 logger.warn(
184 "No render modes was declared in the environment "
185 " (env.metadata['render_modes'] is None or not defined), "
186 "you may have trouble when calling `.render()`"
187 )
188
189 render_fps = env.metadata.get("render_fps")
190 # We only require `render_fps` if rendering is actually implemented
191 if render_fps is None and render_modes is not None and len(render_modes) > 0:
192 if warn:
193 logger.warn(
194 "No render fps was declared in the environment "
195 " (env.metadata['render_fps'] is None or not defined), "
196 "rendering may occur at inconsistent fps"
197 )
198
199 if warn:
200 if not hasattr(env, "render_mode"): # TODO: raise an error with gym 1.0
201 logger.warn("Environments must define render_mode attribute.")
202 elif env.render_mode is not None and env.render_mode not in render_modes:
203 logger.warn(
204 "The environment was initialized successfully with an unsupported render mode."
205 )
206
207
208 def check_env(env: gym.Env, warn: bool = None, skip_render_check: bool = True):
209 """Check that an environment follows Gym API.
210
211 This is an invasive function that calls the environment's reset and step.
212
213 This is particularly useful when using a custom environment.
214 Please take a look at https://www.gymlibrary.ml/content/environment_creation/
215 for more information about the API.
216
217 Args:
218 env: The Gym environment that will be checked
219 warn: Ignored
220 skip_render_check: Whether to skip the checks for the render method. True by default (useful for the CI)
221 """
222 if warn is not None:
223 logger.warn("`check_env` warn parameter is now ignored.")
224
225 assert isinstance(
226 env, gym.Env
227 ), "Your environment must inherit from the gym.Env class https://www.gymlibrary.ml/content/environment_creation/"
228
229 # ============= Check the spaces (observation and action) ================
230 assert hasattr(
231 env, "action_space"
232 ), "You must specify a action space. https://www.gymlibrary.ml/content/environment_creation/"
233 check_observation_space(env.action_space)
234 assert hasattr(
235 env, "observation_space"
236 ), "You must specify an observation space. https://www.gymlibrary.ml/content/environment_creation/"
237 check_action_space(env.observation_space)
238
239 # ==== Check the reset method ====
240 check_reset_seed(env)
241 check_reset_options(env)
242 check_reset_info(env)
243
244 # ============ Check the returned values ===============
245 passive_env_reset_check(env)
246 passive_env_step_check(env, env.action_space.sample())
247
248 # ==== Check the render method and the declared render modes ====
249 if not skip_render_check:
250 check_render(env)
251
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/gym/utils/env_checker.py b/gym/utils/env_checker.py
--- a/gym/utils/env_checker.py
+++ b/gym/utils/env_checker.py
@@ -230,11 +230,11 @@
assert hasattr(
env, "action_space"
), "You must specify a action space. https://www.gymlibrary.ml/content/environment_creation/"
- check_observation_space(env.action_space)
+ check_action_space(env.action_space)
assert hasattr(
env, "observation_space"
), "You must specify an observation space. https://www.gymlibrary.ml/content/environment_creation/"
- check_action_space(env.observation_space)
+ check_observation_space(env.observation_space)
# ==== Check the reset method ====
check_reset_seed(env)
diff --git a/gym/wrappers/env_checker.py b/gym/wrappers/env_checker.py
--- a/gym/wrappers/env_checker.py
+++ b/gym/wrappers/env_checker.py
@@ -22,11 +22,11 @@
assert hasattr(
env, "action_space"
), "You must specify a action space. https://www.gymlibrary.ml/content/environment_creation/"
- check_observation_space(env.action_space)
+ check_action_space(env.action_space)
assert hasattr(
env, "observation_space"
), "You must specify an observation space. https://www.gymlibrary.ml/content/environment_creation/"
- check_action_space(env.observation_space)
+ check_observation_space(env.observation_space)
self.checked_reset = False
self.checked_step = False
|
{"golden_diff": "diff --git a/gym/utils/env_checker.py b/gym/utils/env_checker.py\n--- a/gym/utils/env_checker.py\n+++ b/gym/utils/env_checker.py\n@@ -230,11 +230,11 @@\n assert hasattr(\n env, \"action_space\"\n ), \"You must specify a action space. https://www.gymlibrary.ml/content/environment_creation/\"\n- check_observation_space(env.action_space)\n+ check_action_space(env.action_space)\n assert hasattr(\n env, \"observation_space\"\n ), \"You must specify an observation space. https://www.gymlibrary.ml/content/environment_creation/\"\n- check_action_space(env.observation_space)\n+ check_observation_space(env.observation_space)\n \n # ==== Check the reset method ====\n check_reset_seed(env)\ndiff --git a/gym/wrappers/env_checker.py b/gym/wrappers/env_checker.py\n--- a/gym/wrappers/env_checker.py\n+++ b/gym/wrappers/env_checker.py\n@@ -22,11 +22,11 @@\n assert hasattr(\n env, \"action_space\"\n ), \"You must specify a action space. https://www.gymlibrary.ml/content/environment_creation/\"\n- check_observation_space(env.action_space)\n+ check_action_space(env.action_space)\n assert hasattr(\n env, \"observation_space\"\n ), \"You must specify an observation space. https://www.gymlibrary.ml/content/environment_creation/\"\n- check_action_space(env.observation_space)\n+ check_observation_space(env.observation_space)\n \n self.checked_reset = False\n self.checked_step = False\n", "issue": "[Bug Report] Undue UserWarning: We recommend you to use a symmetric and normalized Box action space (range=[-1, 1])\nIt seems like there is an inversion in the `gym.utils.env_checker.check_env` function: observation space is checked with `check_action_space`\r\n\r\nhttps://github.com/openai/gym/blob/053ee809324861dc1bec19feee97b47ba1557a94/gym/utils/env_checker.py#L237\r\n\r\nand action space is checked with `check_observation_space`\r\n\r\nhttps://github.com/openai/gym/blob/053ee809324861dc1bec19feee97b47ba1557a94/gym/utils/env_checker.py#L233\r\n\r\ncausing undue warnings such as \r\n\r\n```log\r\nUserWarning: WARN: We recommend you to use a symmetric and normalized Box action space (range=[-1, 1])\r\n```\r\n\r\n### Checklist\r\n\r\n- [x] I have checked that there is no similar [issue](https://github.com/openai/gym/issues) in the repo (**required**)\r\n\n", "before_files": [{"content": "\"\"\"A passive environment checker wrapper for an environment's observation and action space along with the reset, step and render functions.\"\"\"\nfrom typing import Tuple, Union\n\nimport gym\nfrom gym.core import ActType, ObsType\nfrom gym.utils.passive_env_checker import (\n check_action_space,\n check_observation_space,\n passive_env_render_check,\n passive_env_reset_check,\n passive_env_step_check,\n)\n\n\nclass PassiveEnvChecker(gym.Wrapper):\n \"\"\"A passive environment checker wrapper that surrounds the step, reset and render functions to check they follow the gym API.\"\"\"\n\n def __init__(self, env):\n \"\"\"Initialises the wrapper with the environments, run the observation and action space tests.\"\"\"\n super().__init__(env)\n\n assert hasattr(\n env, \"action_space\"\n ), \"You must specify a action space. https://www.gymlibrary.ml/content/environment_creation/\"\n check_observation_space(env.action_space)\n assert hasattr(\n env, \"observation_space\"\n ), \"You must specify an observation space. https://www.gymlibrary.ml/content/environment_creation/\"\n check_action_space(env.observation_space)\n\n self.checked_reset = False\n self.checked_step = False\n self.checked_render = False\n\n def step(self, action: ActType) -> Tuple[ObsType, float, bool, dict]:\n \"\"\"Steps through the environment that on the first call will run the `passive_env_step_check`.\"\"\"\n if self.checked_step is False:\n self.checked_step = True\n return passive_env_step_check(self.env, action)\n else:\n return self.env.step(action)\n\n def reset(self, **kwargs) -> Union[ObsType, Tuple[ObsType, dict]]:\n \"\"\"Resets the environment that on the first call will run the `passive_env_reset_check`.\"\"\"\n if self.checked_reset is False:\n self.checked_reset = True\n return passive_env_reset_check(self.env, **kwargs)\n else:\n return self.env.reset(**kwargs)\n\n def render(self, *args, **kwargs):\n \"\"\"Renders the environment that on the first call will run the `passive_env_render_check`.\"\"\"\n if self.checked_render is False:\n self.checked_render = True\n return passive_env_render_check(self.env, *args, **kwargs)\n else:\n return self.env.render(*args, **kwargs)\n", "path": "gym/wrappers/env_checker.py"}, {"content": "\"\"\"A set of functions for checking an environment details.\n\nThis file is originally from the Stable Baselines3 repository hosted on GitHub\n(https://github.com/DLR-RM/stable-baselines3/)\nOriginal Author: Antonin Raffin\n\nIt also uses some warnings/assertions from the PettingZoo repository hosted on GitHub\n(https://github.com/PettingZoo-Team/PettingZoo)\nOriginal Author: J K Terry\n\nThis was rewritten and split into \"env_checker.py\" and \"passive_env_checker.py\" for invasive and passive environment checking\nOriginal Author: Mark Towers\n\nThese projects are covered by the MIT License.\n\"\"\"\n\nimport inspect\nfrom copy import deepcopy\n\nimport numpy as np\n\nimport gym\nfrom gym import error, logger\nfrom gym.utils.passive_env_checker import (\n check_action_space,\n check_observation_space,\n passive_env_reset_check,\n passive_env_step_check,\n)\n\n\ndef data_equivalence(data_1, data_2) -> bool:\n \"\"\"Assert equality between data 1 and 2, i.e observations, actions, info.\n\n Args:\n data_1: data structure 1\n data_2: data structure 2\n\n Returns:\n If observation 1 and 2 are equivalent\n \"\"\"\n if type(data_1) == type(data_2):\n if isinstance(data_1, dict):\n return data_1.keys() == data_2.keys() and all(\n data_equivalence(data_1[k], data_2[k]) for k in data_1.keys()\n )\n elif isinstance(data_1, tuple):\n return len(data_1) == len(data_2) and all(\n data_equivalence(o_1, o_2) for o_1, o_2 in zip(data_1, data_2)\n )\n elif isinstance(data_1, np.ndarray):\n return np.all(data_1 == data_2)\n else:\n return data_1 == data_2\n else:\n return False\n\n\ndef check_reset_seed(env: gym.Env):\n \"\"\"Check that the environment can be reset with a seed.\n\n Args:\n env: The environment to check\n\n Raises:\n AssertionError: The environment cannot be reset with a random seed,\n even though `seed` or `kwargs` appear in the signature.\n \"\"\"\n signature = inspect.signature(env.reset)\n if \"seed\" in signature.parameters or \"kwargs\" in signature.parameters:\n try:\n obs_1 = env.reset(seed=123)\n assert obs_1 in env.observation_space\n obs_2 = env.reset(seed=123)\n assert obs_2 in env.observation_space\n assert data_equivalence(obs_1, obs_2)\n seed_123_rng = deepcopy(env.unwrapped.np_random)\n\n # Note: for some environment, they may initialise at the same state, therefore we cannot check the obs_1 != obs_3\n obs_4 = env.reset(seed=None)\n assert obs_4 in env.observation_space\n\n assert (\n env.unwrapped.np_random.bit_generator.state\n != seed_123_rng.bit_generator.state\n )\n except TypeError as e:\n raise AssertionError(\n \"The environment cannot be reset with a random seed, even though `seed` or `kwargs` appear in the signature. \"\n \"This should never happen, please report this issue. \"\n f\"The error was: {e}\"\n )\n\n if env.unwrapped.np_random is None:\n logger.warn(\n \"Resetting the environment did not result in seeding its random number generator. \"\n \"This is likely due to not calling `super().reset(seed=seed)` in the `reset` method. \"\n \"If you do not use the python-level random number generator, this is not a problem.\"\n )\n\n seed_param = signature.parameters.get(\"seed\")\n # Check the default value is None\n if seed_param is not None and seed_param.default is not None:\n logger.warn(\n \"The default seed argument in reset should be `None`, \"\n \"otherwise the environment will by default always be deterministic\"\n )\n else:\n raise error.Error(\n \"The `reset` method does not provide the `return_info` keyword argument\"\n )\n\n\ndef check_reset_info(env: gym.Env):\n \"\"\"Checks that :meth:`reset` supports the ``return_info`` keyword.\n\n Args:\n env: The environment to check\n\n Raises:\n AssertionError: The environment cannot be reset with `return_info=True`,\n even though `return_info` or `kwargs` appear in the signature.\n \"\"\"\n signature = inspect.signature(env.reset)\n if \"return_info\" in signature.parameters or \"kwargs\" in signature.parameters:\n try:\n result = env.reset(return_info=True)\n assert (\n len(result) == 2\n ), \"Calling the reset method with `return_info=True` did not return a 2-tuple\"\n obs, info = result\n assert isinstance(\n info, dict\n ), \"The second element returned by `env.reset(return_info=True)` was not a dictionary\"\n except TypeError as e:\n raise AssertionError(\n \"The environment cannot be reset with `return_info=True`, even though `return_info` or `kwargs` \"\n \"appear in the signature. This should never happen, please report this issue. \"\n f\"The error was: {e}\"\n )\n else:\n raise error.Error(\n \"The `reset` method does not provide the `return_info` keyword argument\"\n )\n\n\ndef check_reset_options(env: gym.Env):\n \"\"\"Check that the environment can be reset with options.\n\n Args:\n env: The environment to check\n\n Raises:\n AssertionError: The environment cannot be reset with options,\n even though `options` or `kwargs` appear in the signature.\n \"\"\"\n signature = inspect.signature(env.reset)\n if \"options\" in signature.parameters or \"kwargs\" in signature.parameters:\n try:\n env.reset(options={})\n except TypeError as e:\n raise AssertionError(\n \"The environment cannot be reset with options, even though `options` or `kwargs` appear in the signature. \"\n \"This should never happen, please report this issue. \"\n f\"The error was: {e}\"\n )\n else:\n raise error.Error(\n \"The `reset` method does not provide the `options` keyword argument\"\n )\n\n\ndef check_render(env: gym.Env, warn: bool = True):\n \"\"\"Check the declared render modes/fps of the environment.\n\n Args:\n env: The environment to check\n warn: Whether to output additional warnings\n \"\"\"\n render_modes = env.metadata.get(\"render_modes\")\n if render_modes is None:\n if warn:\n logger.warn(\n \"No render modes was declared in the environment \"\n \" (env.metadata['render_modes'] is None or not defined), \"\n \"you may have trouble when calling `.render()`\"\n )\n\n render_fps = env.metadata.get(\"render_fps\")\n # We only require `render_fps` if rendering is actually implemented\n if render_fps is None and render_modes is not None and len(render_modes) > 0:\n if warn:\n logger.warn(\n \"No render fps was declared in the environment \"\n \" (env.metadata['render_fps'] is None or not defined), \"\n \"rendering may occur at inconsistent fps\"\n )\n\n if warn:\n if not hasattr(env, \"render_mode\"): # TODO: raise an error with gym 1.0\n logger.warn(\"Environments must define render_mode attribute.\")\n elif env.render_mode is not None and env.render_mode not in render_modes:\n logger.warn(\n \"The environment was initialized successfully with an unsupported render mode.\"\n )\n\n\ndef check_env(env: gym.Env, warn: bool = None, skip_render_check: bool = True):\n \"\"\"Check that an environment follows Gym API.\n\n This is an invasive function that calls the environment's reset and step.\n\n This is particularly useful when using a custom environment.\n Please take a look at https://www.gymlibrary.ml/content/environment_creation/\n for more information about the API.\n\n Args:\n env: The Gym environment that will be checked\n warn: Ignored\n skip_render_check: Whether to skip the checks for the render method. True by default (useful for the CI)\n \"\"\"\n if warn is not None:\n logger.warn(\"`check_env` warn parameter is now ignored.\")\n\n assert isinstance(\n env, gym.Env\n ), \"Your environment must inherit from the gym.Env class https://www.gymlibrary.ml/content/environment_creation/\"\n\n # ============= Check the spaces (observation and action) ================\n assert hasattr(\n env, \"action_space\"\n ), \"You must specify a action space. https://www.gymlibrary.ml/content/environment_creation/\"\n check_observation_space(env.action_space)\n assert hasattr(\n env, \"observation_space\"\n ), \"You must specify an observation space. https://www.gymlibrary.ml/content/environment_creation/\"\n check_action_space(env.observation_space)\n\n # ==== Check the reset method ====\n check_reset_seed(env)\n check_reset_options(env)\n check_reset_info(env)\n\n # ============ Check the returned values ===============\n passive_env_reset_check(env)\n passive_env_step_check(env, env.action_space.sample())\n\n # ==== Check the render method and the declared render modes ====\n if not skip_render_check:\n check_render(env)\n", "path": "gym/utils/env_checker.py"}], "after_files": [{"content": "\"\"\"A passive environment checker wrapper for an environment's observation and action space along with the reset, step and render functions.\"\"\"\nfrom typing import Tuple, Union\n\nimport gym\nfrom gym.core import ActType, ObsType\nfrom gym.utils.passive_env_checker import (\n check_action_space,\n check_observation_space,\n passive_env_render_check,\n passive_env_reset_check,\n passive_env_step_check,\n)\n\n\nclass PassiveEnvChecker(gym.Wrapper):\n \"\"\"A passive environment checker wrapper that surrounds the step, reset and render functions to check they follow the gym API.\"\"\"\n\n def __init__(self, env):\n \"\"\"Initialises the wrapper with the environments, run the observation and action space tests.\"\"\"\n super().__init__(env)\n\n assert hasattr(\n env, \"action_space\"\n ), \"You must specify a action space. https://www.gymlibrary.ml/content/environment_creation/\"\n check_action_space(env.action_space)\n assert hasattr(\n env, \"observation_space\"\n ), \"You must specify an observation space. https://www.gymlibrary.ml/content/environment_creation/\"\n check_observation_space(env.observation_space)\n\n self.checked_reset = False\n self.checked_step = False\n self.checked_render = False\n\n def step(self, action: ActType) -> Tuple[ObsType, float, bool, dict]:\n \"\"\"Steps through the environment that on the first call will run the `passive_env_step_check`.\"\"\"\n if self.checked_step is False:\n self.checked_step = True\n return passive_env_step_check(self.env, action)\n else:\n return self.env.step(action)\n\n def reset(self, **kwargs) -> Union[ObsType, Tuple[ObsType, dict]]:\n \"\"\"Resets the environment that on the first call will run the `passive_env_reset_check`.\"\"\"\n if self.checked_reset is False:\n self.checked_reset = True\n return passive_env_reset_check(self.env, **kwargs)\n else:\n return self.env.reset(**kwargs)\n\n def render(self, *args, **kwargs):\n \"\"\"Renders the environment that on the first call will run the `passive_env_render_check`.\"\"\"\n if self.checked_render is False:\n self.checked_render = True\n return passive_env_render_check(self.env, *args, **kwargs)\n else:\n return self.env.render(*args, **kwargs)\n", "path": "gym/wrappers/env_checker.py"}, {"content": "\"\"\"A set of functions for checking an environment details.\n\nThis file is originally from the Stable Baselines3 repository hosted on GitHub\n(https://github.com/DLR-RM/stable-baselines3/)\nOriginal Author: Antonin Raffin\n\nIt also uses some warnings/assertions from the PettingZoo repository hosted on GitHub\n(https://github.com/PettingZoo-Team/PettingZoo)\nOriginal Author: J K Terry\n\nThis was rewritten and split into \"env_checker.py\" and \"passive_env_checker.py\" for invasive and passive environment checking\nOriginal Author: Mark Towers\n\nThese projects are covered by the MIT License.\n\"\"\"\n\nimport inspect\nfrom copy import deepcopy\n\nimport numpy as np\n\nimport gym\nfrom gym import error, logger\nfrom gym.utils.passive_env_checker import (\n check_action_space,\n check_observation_space,\n passive_env_reset_check,\n passive_env_step_check,\n)\n\n\ndef data_equivalence(data_1, data_2) -> bool:\n \"\"\"Assert equality between data 1 and 2, i.e observations, actions, info.\n\n Args:\n data_1: data structure 1\n data_2: data structure 2\n\n Returns:\n If observation 1 and 2 are equivalent\n \"\"\"\n if type(data_1) == type(data_2):\n if isinstance(data_1, dict):\n return data_1.keys() == data_2.keys() and all(\n data_equivalence(data_1[k], data_2[k]) for k in data_1.keys()\n )\n elif isinstance(data_1, tuple):\n return len(data_1) == len(data_2) and all(\n data_equivalence(o_1, o_2) for o_1, o_2 in zip(data_1, data_2)\n )\n elif isinstance(data_1, np.ndarray):\n return np.all(data_1 == data_2)\n else:\n return data_1 == data_2\n else:\n return False\n\n\ndef check_reset_seed(env: gym.Env):\n \"\"\"Check that the environment can be reset with a seed.\n\n Args:\n env: The environment to check\n\n Raises:\n AssertionError: The environment cannot be reset with a random seed,\n even though `seed` or `kwargs` appear in the signature.\n \"\"\"\n signature = inspect.signature(env.reset)\n if \"seed\" in signature.parameters or \"kwargs\" in signature.parameters:\n try:\n obs_1 = env.reset(seed=123)\n assert obs_1 in env.observation_space\n obs_2 = env.reset(seed=123)\n assert obs_2 in env.observation_space\n assert data_equivalence(obs_1, obs_2)\n seed_123_rng = deepcopy(env.unwrapped.np_random)\n\n # Note: for some environment, they may initialise at the same state, therefore we cannot check the obs_1 != obs_3\n obs_4 = env.reset(seed=None)\n assert obs_4 in env.observation_space\n\n assert (\n env.unwrapped.np_random.bit_generator.state\n != seed_123_rng.bit_generator.state\n )\n except TypeError as e:\n raise AssertionError(\n \"The environment cannot be reset with a random seed, even though `seed` or `kwargs` appear in the signature. \"\n \"This should never happen, please report this issue. \"\n f\"The error was: {e}\"\n )\n\n if env.unwrapped.np_random is None:\n logger.warn(\n \"Resetting the environment did not result in seeding its random number generator. \"\n \"This is likely due to not calling `super().reset(seed=seed)` in the `reset` method. \"\n \"If you do not use the python-level random number generator, this is not a problem.\"\n )\n\n seed_param = signature.parameters.get(\"seed\")\n # Check the default value is None\n if seed_param is not None and seed_param.default is not None:\n logger.warn(\n \"The default seed argument in reset should be `None`, \"\n \"otherwise the environment will by default always be deterministic\"\n )\n else:\n raise error.Error(\n \"The `reset` method does not provide the `return_info` keyword argument\"\n )\n\n\ndef check_reset_info(env: gym.Env):\n \"\"\"Checks that :meth:`reset` supports the ``return_info`` keyword.\n\n Args:\n env: The environment to check\n\n Raises:\n AssertionError: The environment cannot be reset with `return_info=True`,\n even though `return_info` or `kwargs` appear in the signature.\n \"\"\"\n signature = inspect.signature(env.reset)\n if \"return_info\" in signature.parameters or \"kwargs\" in signature.parameters:\n try:\n result = env.reset(return_info=True)\n assert (\n len(result) == 2\n ), \"Calling the reset method with `return_info=True` did not return a 2-tuple\"\n obs, info = result\n assert isinstance(\n info, dict\n ), \"The second element returned by `env.reset(return_info=True)` was not a dictionary\"\n except TypeError as e:\n raise AssertionError(\n \"The environment cannot be reset with `return_info=True`, even though `return_info` or `kwargs` \"\n \"appear in the signature. This should never happen, please report this issue. \"\n f\"The error was: {e}\"\n )\n else:\n raise error.Error(\n \"The `reset` method does not provide the `return_info` keyword argument\"\n )\n\n\ndef check_reset_options(env: gym.Env):\n \"\"\"Check that the environment can be reset with options.\n\n Args:\n env: The environment to check\n\n Raises:\n AssertionError: The environment cannot be reset with options,\n even though `options` or `kwargs` appear in the signature.\n \"\"\"\n signature = inspect.signature(env.reset)\n if \"options\" in signature.parameters or \"kwargs\" in signature.parameters:\n try:\n env.reset(options={})\n except TypeError as e:\n raise AssertionError(\n \"The environment cannot be reset with options, even though `options` or `kwargs` appear in the signature. \"\n \"This should never happen, please report this issue. \"\n f\"The error was: {e}\"\n )\n else:\n raise error.Error(\n \"The `reset` method does not provide the `options` keyword argument\"\n )\n\n\ndef check_render(env: gym.Env, warn: bool = True):\n \"\"\"Check the declared render modes/fps of the environment.\n\n Args:\n env: The environment to check\n warn: Whether to output additional warnings\n \"\"\"\n render_modes = env.metadata.get(\"render_modes\")\n if render_modes is None:\n if warn:\n logger.warn(\n \"No render modes was declared in the environment \"\n \" (env.metadata['render_modes'] is None or not defined), \"\n \"you may have trouble when calling `.render()`\"\n )\n\n render_fps = env.metadata.get(\"render_fps\")\n # We only require `render_fps` if rendering is actually implemented\n if render_fps is None and render_modes is not None and len(render_modes) > 0:\n if warn:\n logger.warn(\n \"No render fps was declared in the environment \"\n \" (env.metadata['render_fps'] is None or not defined), \"\n \"rendering may occur at inconsistent fps\"\n )\n\n if warn:\n if not hasattr(env, \"render_mode\"): # TODO: raise an error with gym 1.0\n logger.warn(\"Environments must define render_mode attribute.\")\n elif env.render_mode is not None and env.render_mode not in render_modes:\n logger.warn(\n \"The environment was initialized successfully with an unsupported render mode.\"\n )\n\n\ndef check_env(env: gym.Env, warn: bool = None, skip_render_check: bool = True):\n \"\"\"Check that an environment follows Gym API.\n\n This is an invasive function that calls the environment's reset and step.\n\n This is particularly useful when using a custom environment.\n Please take a look at https://www.gymlibrary.ml/content/environment_creation/\n for more information about the API.\n\n Args:\n env: The Gym environment that will be checked\n warn: Ignored\n skip_render_check: Whether to skip the checks for the render method. True by default (useful for the CI)\n \"\"\"\n if warn is not None:\n logger.warn(\"`check_env` warn parameter is now ignored.\")\n\n assert isinstance(\n env, gym.Env\n ), \"Your environment must inherit from the gym.Env class https://www.gymlibrary.ml/content/environment_creation/\"\n\n # ============= Check the spaces (observation and action) ================\n assert hasattr(\n env, \"action_space\"\n ), \"You must specify a action space. https://www.gymlibrary.ml/content/environment_creation/\"\n check_action_space(env.action_space)\n assert hasattr(\n env, \"observation_space\"\n ), \"You must specify an observation space. https://www.gymlibrary.ml/content/environment_creation/\"\n check_observation_space(env.observation_space)\n\n # ==== Check the reset method ====\n check_reset_seed(env)\n check_reset_options(env)\n check_reset_info(env)\n\n # ============ Check the returned values ===============\n passive_env_reset_check(env)\n passive_env_step_check(env, env.action_space.sample())\n\n # ==== Check the render method and the declared render modes ====\n if not skip_render_check:\n check_render(env)\n", "path": "gym/utils/env_checker.py"}]}
| 3,842 | 353 |
gh_patches_debug_23965
|
rasdani/github-patches
|
git_diff
|
ansible__ansible-41597
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
service_facts returns misleading state on systemd hosts
##### SUMMARY
The system module service_facts returns misleading state information on systemd hosts.
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
service_facts
##### ANSIBLE VERSION
```
ansible 2.5.2
config file = /etc/ansible/ansible.cfg
configured module search path = [u'/home/USER/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python2.7/site-packages/ansible
executable location = /usr/bin/ansible
python version = 2.7.5 (default, Feb 20 2018, 09:19:12) [GCC 4.8.5 20150623 (Red Hat 4.8.5-28)]
```
##### CONFIGURATION
N/A
##### OS / ENVIRONMENT
Ansible controller: RHEL 7.5
Remote host: RHEL 7.4
##### STEPS TO REPRODUCE
The service_facts module returns one of two states for each service it discovers: either "running" or "stopped".
```json
{
"ansible_facts": {
"services": {
"rsyncd.service": {
"name": "rsyncd.service",
"source": "systemd",
"state": "stopped"
},
"rsyslog.service": {
"name": "rsyslog.service",
"source": "systemd",
"state": "running"
},
...
}
}
}
```
On systemd hosts, the module obtains state information via `systemctl list-unit-files --type=service` (see [line 181 in module source code](https://github.com/ansible/ansible/blob/devel/lib/ansible/modules/system/service_facts.py#L181)). However, this command does not report running/stopped status. Instead, it reports enabled/disable status which is not the same thing. Enabled/disabled determines whether the service will be started by systemd _on reboot_, but says nothing about whether it is currently running or stopped.
Here are two examples:
**1. Service disabled & started:**
```bash
# On remote host
systemctl disable <service>
systemctl start <service>
systemctl status <service>
# systemctl reports: loaded, disabled, active/running
```
**2. Service enabled & stopped:**
```bash
# On remote host
systemctl enable <service>
systemctl stop <service>
systemctl status <service>
# systemctl reports: loaded, enabled, inactive/dead
```
**On the Ansible controller,** use the service_facts module to query the <service> just configured on the remote host.
##### EXPECTED RESULTS
**1. Service disabled & started:** service_facts should report "running"
**2. Service enabled & stopped:** service_facts should report "stopped"
##### ACTUAL RESULTS
**1. Service disabled & started:** service_facts reports "stopped"
**2. Service enabled & stopped:** service_facts reports "running"
These misleading reports are caused by the test in [line 186 of the source code](https://github.com/ansible/ansible/blob/devel/lib/ansible/modules/system/service_facts.py#L186). The service_facts module explicitly tests whether the service is "enabled" and equates that with running/stopped, which is incorrect for systemd/systemctl.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `lib/ansible/modules/system/service_facts.py`
Content:
```
1 #!/usr/bin/python
2 # (c) 2017, Ansible Project
3 # GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
4
5 # originally copied from AWX's scan_services module to bring this functionality
6 # into Core
7
8 from __future__ import absolute_import, division, print_function
9 __metaclass__ = type
10
11
12 ANSIBLE_METADATA = {'metadata_version': '1.1',
13 'status': ['preview'],
14 'supported_by': 'community'}
15
16 DOCUMENTATION = '''
17 ---
18 module: service_facts
19 short_description: Return service state information as fact data
20 description:
21 - Return service state information as fact data for various service management utilities
22 version_added: "2.5"
23 requirements: ["Any of the following supported init systems: systemd, sysv, upstart"]
24
25 author:
26 - Matthew Jones
27 - Adam Miller (@maxamillion)
28 '''
29
30 EXAMPLES = '''
31 - name: populate service facts
32 service_facts:
33
34 - debug:
35 var: ansible_facts.services
36
37 '''
38
39 RETURN = '''
40 ansible_facts:
41 description: facts to add to ansible_facts about the services on the system
42 returned: always
43 type: complex
44 contains:
45 "services": {
46 "network": {
47 "source": "sysv",
48 "state": "running",
49 "name": "network"
50 },
51 arp-ethers.service: {
52 "source": "systemd",
53 "state": "stopped",
54 "name": "arp-ethers.service"
55 }
56 }
57 '''
58
59
60 import re
61 from ansible.module_utils.basic import AnsibleModule
62
63
64 class BaseService(object):
65
66 def __init__(self, module):
67 self.module = module
68 self.incomplete_warning = False
69
70
71 class ServiceScanService(BaseService):
72
73 def gather_services(self):
74 services = {}
75 service_path = self.module.get_bin_path("service")
76 if service_path is None:
77 return None
78 initctl_path = self.module.get_bin_path("initctl")
79 chkconfig_path = self.module.get_bin_path("chkconfig")
80
81 # sysvinit
82 if service_path is not None and chkconfig_path is None:
83 rc, stdout, stderr = self.module.run_command("%s --status-all 2>&1 | grep -E \"\\[ (\\+|\\-) \\]\"" % service_path, use_unsafe_shell=True)
84 for line in stdout.split("\n"):
85 line_data = line.split()
86 if len(line_data) < 4:
87 continue # Skipping because we expected more data
88 service_name = " ".join(line_data[3:])
89 if line_data[1] == "+":
90 service_state = "running"
91 else:
92 service_state = "stopped"
93 services[service_name] = {"name": service_name, "state": service_state, "source": "sysv"}
94
95 # Upstart
96 if initctl_path is not None and chkconfig_path is None:
97 p = re.compile(r'^\s?(?P<name>.*)\s(?P<goal>\w+)\/(?P<state>\w+)(\,\sprocess\s(?P<pid>[0-9]+))?\s*$')
98 rc, stdout, stderr = self.module.run_command("%s list" % initctl_path)
99 real_stdout = stdout.replace("\r", "")
100 for line in real_stdout.split("\n"):
101 m = p.match(line)
102 if not m:
103 continue
104 service_name = m.group('name')
105 service_goal = m.group('goal')
106 service_state = m.group('state')
107 if m.group('pid'):
108 pid = m.group('pid')
109 else:
110 pid = None # NOQA
111 payload = {"name": service_name, "state": service_state, "goal": service_goal, "source": "upstart"}
112 services[service_name] = payload
113
114 # RH sysvinit
115 elif chkconfig_path is not None:
116 # print '%s --status-all | grep -E "is (running|stopped)"' % service_path
117 p = re.compile(
118 r'(?P<service>.*?)\s+[0-9]:(?P<rl0>on|off)\s+[0-9]:(?P<rl1>on|off)\s+[0-9]:(?P<rl2>on|off)\s+'
119 r'[0-9]:(?P<rl3>on|off)\s+[0-9]:(?P<rl4>on|off)\s+[0-9]:(?P<rl5>on|off)\s+[0-9]:(?P<rl6>on|off)')
120 rc, stdout, stderr = self.module.run_command('%s' % chkconfig_path, use_unsafe_shell=True)
121 # Check for special cases where stdout does not fit pattern
122 match_any = False
123 for line in stdout.split('\n'):
124 if p.match(line):
125 match_any = True
126 if not match_any:
127 p_simple = re.compile(r'(?P<service>.*?)\s+(?P<rl0>on|off)')
128 match_any = False
129 for line in stdout.split('\n'):
130 if p_simple.match(line):
131 match_any = True
132 if match_any:
133 # Try extra flags " -l --allservices" needed for SLES11
134 rc, stdout, stderr = self.module.run_command('%s -l --allservices' % chkconfig_path, use_unsafe_shell=True)
135 elif '--list' in stderr:
136 # Extra flag needed for RHEL5
137 rc, stdout, stderr = self.module.run_command('%s --list' % chkconfig_path, use_unsafe_shell=True)
138 for line in stdout.split('\n'):
139 m = p.match(line)
140 if m:
141 service_name = m.group('service')
142 service_state = 'stopped'
143 if m.group('rl3') == 'on':
144 rc, stdout, stderr = self.module.run_command('%s %s status' % (service_path, service_name), use_unsafe_shell=True)
145 service_state = rc
146 if rc in (0,):
147 service_state = 'running'
148 # elif rc in (1,3):
149 else:
150 if 'root' in stderr or 'permission' in stderr.lower() or 'not in sudoers' in stderr.lower():
151 self.incomplete_warning = True
152 continue
153 else:
154 service_state = 'stopped'
155 service_data = {"name": service_name, "state": service_state, "source": "sysv"}
156 services[service_name] = service_data
157 return services
158
159
160 class SystemctlScanService(BaseService):
161
162 def systemd_enabled(self):
163 # Check if init is the systemd command, using comm as cmdline could be symlink
164 try:
165 f = open('/proc/1/comm', 'r')
166 except IOError:
167 # If comm doesn't exist, old kernel, no systemd
168 return False
169 for line in f:
170 if 'systemd' in line:
171 return True
172 return False
173
174 def gather_services(self):
175 services = {}
176 if not self.systemd_enabled():
177 return None
178 systemctl_path = self.module.get_bin_path("systemctl", opt_dirs=["/usr/bin", "/usr/local/bin"])
179 if systemctl_path is None:
180 return None
181 rc, stdout, stderr = self.module.run_command("%s list-unit-files --type=service | tail -n +2 | head -n -2" % systemctl_path, use_unsafe_shell=True)
182 for line in stdout.split("\n"):
183 line_data = line.split()
184 if len(line_data) != 2:
185 continue
186 if line_data[1] == "enabled":
187 state_val = "running"
188 else:
189 state_val = "stopped"
190 services[line_data[0]] = {"name": line_data[0], "state": state_val, "source": "systemd"}
191 return services
192
193
194 def main():
195 module = AnsibleModule(argument_spec=dict(), supports_check_mode=True)
196 service_modules = (ServiceScanService, SystemctlScanService)
197 all_services = {}
198 incomplete_warning = False
199 for svc_module in service_modules:
200 svcmod = svc_module(module)
201 svc = svcmod.gather_services()
202 if svc is not None:
203 all_services.update(svc)
204 if svcmod.incomplete_warning:
205 incomplete_warning = True
206 if len(all_services) == 0:
207 results = dict(skipped=True, msg="Failed to find any services. Sometimes this is due to insufficient privileges.")
208 else:
209 results = dict(ansible_facts=dict(services=all_services))
210 if incomplete_warning:
211 results['msg'] = "WARNING: Could not find status for all services. Sometimes this is due to insufficient privileges."
212 module.exit_json(**results)
213
214
215 if __name__ == '__main__':
216 main()
217
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/lib/ansible/modules/system/service_facts.py b/lib/ansible/modules/system/service_facts.py
--- a/lib/ansible/modules/system/service_facts.py
+++ b/lib/ansible/modules/system/service_facts.py
@@ -178,16 +178,16 @@
systemctl_path = self.module.get_bin_path("systemctl", opt_dirs=["/usr/bin", "/usr/local/bin"])
if systemctl_path is None:
return None
- rc, stdout, stderr = self.module.run_command("%s list-unit-files --type=service | tail -n +2 | head -n -2" % systemctl_path, use_unsafe_shell=True)
- for line in stdout.split("\n"):
- line_data = line.split()
- if len(line_data) != 2:
- continue
- if line_data[1] == "enabled":
+ rc, stdout, stderr = self.module.run_command("%s list-units --no-pager --type service --all" % systemctl_path, use_unsafe_shell=True)
+ for line in [svc_line for svc_line in stdout.split('\n') if '.service' in svc_line and 'not-found' not in svc_line]:
+ service_name = line.split()[0]
+ if "running" in line:
state_val = "running"
else:
+ if 'failed' in line:
+ service_name = line.split()[1]
state_val = "stopped"
- services[line_data[0]] = {"name": line_data[0], "state": state_val, "source": "systemd"}
+ services[service_name] = {"name": service_name, "state": state_val, "source": "systemd"}
return services
|
{"golden_diff": "diff --git a/lib/ansible/modules/system/service_facts.py b/lib/ansible/modules/system/service_facts.py\n--- a/lib/ansible/modules/system/service_facts.py\n+++ b/lib/ansible/modules/system/service_facts.py\n@@ -178,16 +178,16 @@\n systemctl_path = self.module.get_bin_path(\"systemctl\", opt_dirs=[\"/usr/bin\", \"/usr/local/bin\"])\n if systemctl_path is None:\n return None\n- rc, stdout, stderr = self.module.run_command(\"%s list-unit-files --type=service | tail -n +2 | head -n -2\" % systemctl_path, use_unsafe_shell=True)\n- for line in stdout.split(\"\\n\"):\n- line_data = line.split()\n- if len(line_data) != 2:\n- continue\n- if line_data[1] == \"enabled\":\n+ rc, stdout, stderr = self.module.run_command(\"%s list-units --no-pager --type service --all\" % systemctl_path, use_unsafe_shell=True)\n+ for line in [svc_line for svc_line in stdout.split('\\n') if '.service' in svc_line and 'not-found' not in svc_line]:\n+ service_name = line.split()[0]\n+ if \"running\" in line:\n state_val = \"running\"\n else:\n+ if 'failed' in line:\n+ service_name = line.split()[1]\n state_val = \"stopped\"\n- services[line_data[0]] = {\"name\": line_data[0], \"state\": state_val, \"source\": \"systemd\"}\n+ services[service_name] = {\"name\": service_name, \"state\": state_val, \"source\": \"systemd\"}\n return services\n", "issue": "service_facts returns misleading state on systemd hosts\n##### SUMMARY\r\nThe system module service_facts returns misleading state information on systemd hosts.\r\n\r\n##### ISSUE TYPE\r\n - Bug Report\r\n\r\n##### COMPONENT NAME\r\nservice_facts\r\n\r\n##### ANSIBLE VERSION\r\n```\r\nansible 2.5.2\r\n config file = /etc/ansible/ansible.cfg\r\n configured module search path = [u'/home/USER/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']\r\n ansible python module location = /usr/lib/python2.7/site-packages/ansible\r\n executable location = /usr/bin/ansible\r\n python version = 2.7.5 (default, Feb 20 2018, 09:19:12) [GCC 4.8.5 20150623 (Red Hat 4.8.5-28)]\r\n```\r\n\r\n##### CONFIGURATION\r\nN/A\r\n\r\n##### OS / ENVIRONMENT\r\nAnsible controller: RHEL 7.5\r\nRemote host: RHEL 7.4\r\n\r\n##### STEPS TO REPRODUCE\r\nThe service_facts module returns one of two states for each service it discovers: either \"running\" or \"stopped\". \r\n\r\n```json\r\n{\r\n \"ansible_facts\": { \r\n \"services\": {\r\n \"rsyncd.service\": {\r\n \"name\": \"rsyncd.service\",\r\n \"source\": \"systemd\",\r\n \"state\": \"stopped\"\r\n },\r\n \"rsyslog.service\": {\r\n \"name\": \"rsyslog.service\",\r\n \"source\": \"systemd\",\r\n \"state\": \"running\"\r\n },\r\n ... \r\n } \r\n }\r\n}\r\n```\r\n\r\nOn systemd hosts, the module obtains state information via `systemctl list-unit-files --type=service` (see [line 181 in module source code](https://github.com/ansible/ansible/blob/devel/lib/ansible/modules/system/service_facts.py#L181)). However, this command does not report running/stopped status. Instead, it reports enabled/disable status which is not the same thing. Enabled/disabled determines whether the service will be started by systemd _on reboot_, but says nothing about whether it is currently running or stopped.\r\n\r\nHere are two examples:\r\n\r\n**1. Service disabled & started:**\r\n\r\n```bash\r\n# On remote host\r\nsystemctl disable <service>\r\nsystemctl start <service>\r\nsystemctl status <service>\r\n# systemctl reports: loaded, disabled, active/running\r\n```\r\n\r\n**2. Service enabled & stopped:**\r\n\r\n```bash\r\n# On remote host\r\nsystemctl enable <service>\r\nsystemctl stop <service>\r\nsystemctl status <service>\r\n# systemctl reports: loaded, enabled, inactive/dead\r\n```\r\n\r\n**On the Ansible controller,** use the service_facts module to query the <service> just configured on the remote host.\r\n\r\n##### EXPECTED RESULTS\r\n**1. Service disabled & started:** service_facts should report \"running\"\r\n\r\n**2. Service enabled & stopped:** service_facts should report \"stopped\"\r\n\r\n##### ACTUAL RESULTS\r\n**1. Service disabled & started:** service_facts reports \"stopped\"\r\n\r\n**2. Service enabled & stopped:** service_facts reports \"running\"\r\n\r\nThese misleading reports are caused by the test in [line 186 of the source code](https://github.com/ansible/ansible/blob/devel/lib/ansible/modules/system/service_facts.py#L186). The service_facts module explicitly tests whether the service is \"enabled\" and equates that with running/stopped, which is incorrect for systemd/systemctl.\r\n\n", "before_files": [{"content": "#!/usr/bin/python\n# (c) 2017, Ansible Project\n# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)\n\n# originally copied from AWX's scan_services module to bring this functionality\n# into Core\n\nfrom __future__ import absolute_import, division, print_function\n__metaclass__ = type\n\n\nANSIBLE_METADATA = {'metadata_version': '1.1',\n 'status': ['preview'],\n 'supported_by': 'community'}\n\nDOCUMENTATION = '''\n---\nmodule: service_facts\nshort_description: Return service state information as fact data\ndescription:\n - Return service state information as fact data for various service management utilities\nversion_added: \"2.5\"\nrequirements: [\"Any of the following supported init systems: systemd, sysv, upstart\"]\n\nauthor:\n - Matthew Jones\n - Adam Miller (@maxamillion)\n'''\n\nEXAMPLES = '''\n- name: populate service facts\n service_facts:\n\n- debug:\n var: ansible_facts.services\n\n'''\n\nRETURN = '''\nansible_facts:\n description: facts to add to ansible_facts about the services on the system\n returned: always\n type: complex\n contains:\n \"services\": {\n \"network\": {\n \"source\": \"sysv\",\n \"state\": \"running\",\n \"name\": \"network\"\n },\n arp-ethers.service: {\n \"source\": \"systemd\",\n \"state\": \"stopped\",\n \"name\": \"arp-ethers.service\"\n }\n }\n'''\n\n\nimport re\nfrom ansible.module_utils.basic import AnsibleModule\n\n\nclass BaseService(object):\n\n def __init__(self, module):\n self.module = module\n self.incomplete_warning = False\n\n\nclass ServiceScanService(BaseService):\n\n def gather_services(self):\n services = {}\n service_path = self.module.get_bin_path(\"service\")\n if service_path is None:\n return None\n initctl_path = self.module.get_bin_path(\"initctl\")\n chkconfig_path = self.module.get_bin_path(\"chkconfig\")\n\n # sysvinit\n if service_path is not None and chkconfig_path is None:\n rc, stdout, stderr = self.module.run_command(\"%s --status-all 2>&1 | grep -E \\\"\\\\[ (\\\\+|\\\\-) \\\\]\\\"\" % service_path, use_unsafe_shell=True)\n for line in stdout.split(\"\\n\"):\n line_data = line.split()\n if len(line_data) < 4:\n continue # Skipping because we expected more data\n service_name = \" \".join(line_data[3:])\n if line_data[1] == \"+\":\n service_state = \"running\"\n else:\n service_state = \"stopped\"\n services[service_name] = {\"name\": service_name, \"state\": service_state, \"source\": \"sysv\"}\n\n # Upstart\n if initctl_path is not None and chkconfig_path is None:\n p = re.compile(r'^\\s?(?P<name>.*)\\s(?P<goal>\\w+)\\/(?P<state>\\w+)(\\,\\sprocess\\s(?P<pid>[0-9]+))?\\s*$')\n rc, stdout, stderr = self.module.run_command(\"%s list\" % initctl_path)\n real_stdout = stdout.replace(\"\\r\", \"\")\n for line in real_stdout.split(\"\\n\"):\n m = p.match(line)\n if not m:\n continue\n service_name = m.group('name')\n service_goal = m.group('goal')\n service_state = m.group('state')\n if m.group('pid'):\n pid = m.group('pid')\n else:\n pid = None # NOQA\n payload = {\"name\": service_name, \"state\": service_state, \"goal\": service_goal, \"source\": \"upstart\"}\n services[service_name] = payload\n\n # RH sysvinit\n elif chkconfig_path is not None:\n # print '%s --status-all | grep -E \"is (running|stopped)\"' % service_path\n p = re.compile(\n r'(?P<service>.*?)\\s+[0-9]:(?P<rl0>on|off)\\s+[0-9]:(?P<rl1>on|off)\\s+[0-9]:(?P<rl2>on|off)\\s+'\n r'[0-9]:(?P<rl3>on|off)\\s+[0-9]:(?P<rl4>on|off)\\s+[0-9]:(?P<rl5>on|off)\\s+[0-9]:(?P<rl6>on|off)')\n rc, stdout, stderr = self.module.run_command('%s' % chkconfig_path, use_unsafe_shell=True)\n # Check for special cases where stdout does not fit pattern\n match_any = False\n for line in stdout.split('\\n'):\n if p.match(line):\n match_any = True\n if not match_any:\n p_simple = re.compile(r'(?P<service>.*?)\\s+(?P<rl0>on|off)')\n match_any = False\n for line in stdout.split('\\n'):\n if p_simple.match(line):\n match_any = True\n if match_any:\n # Try extra flags \" -l --allservices\" needed for SLES11\n rc, stdout, stderr = self.module.run_command('%s -l --allservices' % chkconfig_path, use_unsafe_shell=True)\n elif '--list' in stderr:\n # Extra flag needed for RHEL5\n rc, stdout, stderr = self.module.run_command('%s --list' % chkconfig_path, use_unsafe_shell=True)\n for line in stdout.split('\\n'):\n m = p.match(line)\n if m:\n service_name = m.group('service')\n service_state = 'stopped'\n if m.group('rl3') == 'on':\n rc, stdout, stderr = self.module.run_command('%s %s status' % (service_path, service_name), use_unsafe_shell=True)\n service_state = rc\n if rc in (0,):\n service_state = 'running'\n # elif rc in (1,3):\n else:\n if 'root' in stderr or 'permission' in stderr.lower() or 'not in sudoers' in stderr.lower():\n self.incomplete_warning = True\n continue\n else:\n service_state = 'stopped'\n service_data = {\"name\": service_name, \"state\": service_state, \"source\": \"sysv\"}\n services[service_name] = service_data\n return services\n\n\nclass SystemctlScanService(BaseService):\n\n def systemd_enabled(self):\n # Check if init is the systemd command, using comm as cmdline could be symlink\n try:\n f = open('/proc/1/comm', 'r')\n except IOError:\n # If comm doesn't exist, old kernel, no systemd\n return False\n for line in f:\n if 'systemd' in line:\n return True\n return False\n\n def gather_services(self):\n services = {}\n if not self.systemd_enabled():\n return None\n systemctl_path = self.module.get_bin_path(\"systemctl\", opt_dirs=[\"/usr/bin\", \"/usr/local/bin\"])\n if systemctl_path is None:\n return None\n rc, stdout, stderr = self.module.run_command(\"%s list-unit-files --type=service | tail -n +2 | head -n -2\" % systemctl_path, use_unsafe_shell=True)\n for line in stdout.split(\"\\n\"):\n line_data = line.split()\n if len(line_data) != 2:\n continue\n if line_data[1] == \"enabled\":\n state_val = \"running\"\n else:\n state_val = \"stopped\"\n services[line_data[0]] = {\"name\": line_data[0], \"state\": state_val, \"source\": \"systemd\"}\n return services\n\n\ndef main():\n module = AnsibleModule(argument_spec=dict(), supports_check_mode=True)\n service_modules = (ServiceScanService, SystemctlScanService)\n all_services = {}\n incomplete_warning = False\n for svc_module in service_modules:\n svcmod = svc_module(module)\n svc = svcmod.gather_services()\n if svc is not None:\n all_services.update(svc)\n if svcmod.incomplete_warning:\n incomplete_warning = True\n if len(all_services) == 0:\n results = dict(skipped=True, msg=\"Failed to find any services. Sometimes this is due to insufficient privileges.\")\n else:\n results = dict(ansible_facts=dict(services=all_services))\n if incomplete_warning:\n results['msg'] = \"WARNING: Could not find status for all services. Sometimes this is due to insufficient privileges.\"\n module.exit_json(**results)\n\n\nif __name__ == '__main__':\n main()\n", "path": "lib/ansible/modules/system/service_facts.py"}], "after_files": [{"content": "#!/usr/bin/python\n# (c) 2017, Ansible Project\n# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)\n\n# originally copied from AWX's scan_services module to bring this functionality\n# into Core\n\nfrom __future__ import absolute_import, division, print_function\n__metaclass__ = type\n\n\nANSIBLE_METADATA = {'metadata_version': '1.1',\n 'status': ['preview'],\n 'supported_by': 'community'}\n\nDOCUMENTATION = '''\n---\nmodule: service_facts\nshort_description: Return service state information as fact data\ndescription:\n - Return service state information as fact data for various service management utilities\nversion_added: \"2.5\"\nrequirements: [\"Any of the following supported init systems: systemd, sysv, upstart\"]\n\nauthor:\n - Matthew Jones\n - Adam Miller (@maxamillion)\n'''\n\nEXAMPLES = '''\n- name: populate service facts\n service_facts:\n\n- debug:\n var: ansible_facts.services\n\n'''\n\nRETURN = '''\nansible_facts:\n description: facts to add to ansible_facts about the services on the system\n returned: always\n type: complex\n contains:\n \"services\": {\n \"network\": {\n \"source\": \"sysv\",\n \"state\": \"running\",\n \"name\": \"network\"\n },\n arp-ethers.service: {\n \"source\": \"systemd\",\n \"state\": \"stopped\",\n \"name\": \"arp-ethers.service\"\n }\n }\n'''\n\n\nimport re\nfrom ansible.module_utils.basic import AnsibleModule\n\n\nclass BaseService(object):\n\n def __init__(self, module):\n self.module = module\n self.incomplete_warning = False\n\n\nclass ServiceScanService(BaseService):\n\n def gather_services(self):\n services = {}\n service_path = self.module.get_bin_path(\"service\")\n if service_path is None:\n return None\n initctl_path = self.module.get_bin_path(\"initctl\")\n chkconfig_path = self.module.get_bin_path(\"chkconfig\")\n\n # sysvinit\n if service_path is not None and chkconfig_path is None:\n rc, stdout, stderr = self.module.run_command(\"%s --status-all 2>&1 | grep -E \\\"\\\\[ (\\\\+|\\\\-) \\\\]\\\"\" % service_path, use_unsafe_shell=True)\n for line in stdout.split(\"\\n\"):\n line_data = line.split()\n if len(line_data) < 4:\n continue # Skipping because we expected more data\n service_name = \" \".join(line_data[3:])\n if line_data[1] == \"+\":\n service_state = \"running\"\n else:\n service_state = \"stopped\"\n services[service_name] = {\"name\": service_name, \"state\": service_state, \"source\": \"sysv\"}\n\n # Upstart\n if initctl_path is not None and chkconfig_path is None:\n p = re.compile(r'^\\s?(?P<name>.*)\\s(?P<goal>\\w+)\\/(?P<state>\\w+)(\\,\\sprocess\\s(?P<pid>[0-9]+))?\\s*$')\n rc, stdout, stderr = self.module.run_command(\"%s list\" % initctl_path)\n real_stdout = stdout.replace(\"\\r\", \"\")\n for line in real_stdout.split(\"\\n\"):\n m = p.match(line)\n if not m:\n continue\n service_name = m.group('name')\n service_goal = m.group('goal')\n service_state = m.group('state')\n if m.group('pid'):\n pid = m.group('pid')\n else:\n pid = None # NOQA\n payload = {\"name\": service_name, \"state\": service_state, \"goal\": service_goal, \"source\": \"upstart\"}\n services[service_name] = payload\n\n # RH sysvinit\n elif chkconfig_path is not None:\n # print '%s --status-all | grep -E \"is (running|stopped)\"' % service_path\n p = re.compile(\n r'(?P<service>.*?)\\s+[0-9]:(?P<rl0>on|off)\\s+[0-9]:(?P<rl1>on|off)\\s+[0-9]:(?P<rl2>on|off)\\s+'\n r'[0-9]:(?P<rl3>on|off)\\s+[0-9]:(?P<rl4>on|off)\\s+[0-9]:(?P<rl5>on|off)\\s+[0-9]:(?P<rl6>on|off)')\n rc, stdout, stderr = self.module.run_command('%s' % chkconfig_path, use_unsafe_shell=True)\n # Check for special cases where stdout does not fit pattern\n match_any = False\n for line in stdout.split('\\n'):\n if p.match(line):\n match_any = True\n if not match_any:\n p_simple = re.compile(r'(?P<service>.*?)\\s+(?P<rl0>on|off)')\n match_any = False\n for line in stdout.split('\\n'):\n if p_simple.match(line):\n match_any = True\n if match_any:\n # Try extra flags \" -l --allservices\" needed for SLES11\n rc, stdout, stderr = self.module.run_command('%s -l --allservices' % chkconfig_path, use_unsafe_shell=True)\n elif '--list' in stderr:\n # Extra flag needed for RHEL5\n rc, stdout, stderr = self.module.run_command('%s --list' % chkconfig_path, use_unsafe_shell=True)\n for line in stdout.split('\\n'):\n m = p.match(line)\n if m:\n service_name = m.group('service')\n service_state = 'stopped'\n if m.group('rl3') == 'on':\n rc, stdout, stderr = self.module.run_command('%s %s status' % (service_path, service_name), use_unsafe_shell=True)\n service_state = rc\n if rc in (0,):\n service_state = 'running'\n # elif rc in (1,3):\n else:\n if 'root' in stderr or 'permission' in stderr.lower() or 'not in sudoers' in stderr.lower():\n self.incomplete_warning = True\n continue\n else:\n service_state = 'stopped'\n service_data = {\"name\": service_name, \"state\": service_state, \"source\": \"sysv\"}\n services[service_name] = service_data\n return services\n\n\nclass SystemctlScanService(BaseService):\n\n def systemd_enabled(self):\n # Check if init is the systemd command, using comm as cmdline could be symlink\n try:\n f = open('/proc/1/comm', 'r')\n except IOError:\n # If comm doesn't exist, old kernel, no systemd\n return False\n for line in f:\n if 'systemd' in line:\n return True\n return False\n\n def gather_services(self):\n services = {}\n if not self.systemd_enabled():\n return None\n systemctl_path = self.module.get_bin_path(\"systemctl\", opt_dirs=[\"/usr/bin\", \"/usr/local/bin\"])\n if systemctl_path is None:\n return None\n rc, stdout, stderr = self.module.run_command(\"%s list-units --no-pager --type service --all\" % systemctl_path, use_unsafe_shell=True)\n for line in [svc_line for svc_line in stdout.split('\\n') if '.service' in svc_line and 'not-found' not in svc_line]:\n service_name = line.split()[0]\n if \"running\" in line:\n state_val = \"running\"\n else:\n if 'failed' in line:\n service_name = line.split()[1]\n state_val = \"stopped\"\n services[service_name] = {\"name\": service_name, \"state\": state_val, \"source\": \"systemd\"}\n return services\n\n\ndef main():\n module = AnsibleModule(argument_spec=dict(), supports_check_mode=True)\n service_modules = (ServiceScanService, SystemctlScanService)\n all_services = {}\n incomplete_warning = False\n for svc_module in service_modules:\n svcmod = svc_module(module)\n svc = svcmod.gather_services()\n if svc is not None:\n all_services.update(svc)\n if svcmod.incomplete_warning:\n incomplete_warning = True\n if len(all_services) == 0:\n results = dict(skipped=True, msg=\"Failed to find any services. Sometimes this is due to insufficient privileges.\")\n else:\n results = dict(ansible_facts=dict(services=all_services))\n if incomplete_warning:\n results['msg'] = \"WARNING: Could not find status for all services. Sometimes this is due to insufficient privileges.\"\n module.exit_json(**results)\n\n\nif __name__ == '__main__':\n main()\n", "path": "lib/ansible/modules/system/service_facts.py"}]}
| 3,497 | 376 |
gh_patches_debug_40886
|
rasdani/github-patches
|
git_diff
|
pwndbg__pwndbg-900
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
telescope should not fold single values, should probably be configurable
In this screenshot, we should simply display the 0x28 and 0x90 and 0x108 offset in the stack trace. The ellipsis isn't particularly useful here, and actually adds to cognitive load due to having to check how many values are skipped.
This should probably be behind a config, e.g. `telescope-fold-minimum` which is expressed in pointer widths. This will allow people to control the setting to their choosing (or disable folding it with a value of 0 or -1 or something).

--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pwndbg/commands/telescope.py`
Content:
```
1 #!/usr/bin/env python
2 # -*- coding: utf-8 -*-
3 """
4 Prints out pointer chains starting at some address in memory.
5
6 Generally used to print out the stack or register values.
7 """
8
9 import argparse
10 import collections
11 import math
12
13 import pwndbg.arch
14 import pwndbg.chain
15 import pwndbg.color.telescope as T
16 import pwndbg.color.theme as theme
17 import pwndbg.commands
18 import pwndbg.config
19 import pwndbg.memory
20 import pwndbg.regs
21 import pwndbg.typeinfo
22
23 telescope_lines = pwndbg.config.Parameter('telescope-lines', 8, 'number of lines to printed by the telescope command')
24 skip_repeating_values = pwndbg.config.Parameter('telescope-skip-repeating-val', True,
25 'whether to skip repeating values of the telescope command')
26
27 offset_separator = theme.Parameter('telescope-offset-separator', '│', 'offset separator of the telescope command')
28 offset_delimiter = theme.Parameter('telescope-offset-delimiter', ':', 'offset delimiter of the telescope command')
29 repeating_marker = theme.Parameter('telescope-repeating-marker', '... ↓',
30 'repeating values marker of the telescope command')
31
32
33 parser = argparse.ArgumentParser(description="""
34 Recursively dereferences pointers starting at the specified address
35 ($sp by default)
36 """)
37 parser.add_argument("address", nargs="?", default=None, type=int, help="The address to telescope at.")
38 parser.add_argument("count", nargs="?", default=telescope_lines, type=int, help="The number of lines to show.")
39 @pwndbg.commands.ArgparsedCommand(parser)
40 @pwndbg.commands.OnlyWhenRunning
41 def telescope(address=None, count=telescope_lines, to_string=False):
42 """
43 Recursively dereferences pointers starting at the specified address
44 ($sp by default)
45 """
46 ptrsize = pwndbg.typeinfo.ptrsize
47 if telescope.repeat:
48 address = telescope.last_address + ptrsize
49 telescope.offset += 1
50 else:
51 telescope.offset = 0
52
53 address = int(address if address else pwndbg.regs.sp) & pwndbg.arch.ptrmask
54 count = max(int(count), 1) & pwndbg.arch.ptrmask
55 delimiter = T.delimiter(offset_delimiter)
56 separator = T.separator(offset_separator)
57
58 # Allow invocation of "telescope 20" to dump 20 bytes at the stack pointer
59 if address < pwndbg.memory.MMAP_MIN_ADDR and not pwndbg.memory.peek(address):
60 count = address
61 address = pwndbg.regs.sp
62
63 # Allow invocation of "telescope a b" to dump all bytes from A to B
64 if int(address) <= int(count):
65 # adjust count if it is an address. use ceil division as count is number of
66 # ptrsize values and we don't want to strip out a value if dest is unaligned
67 count -= address
68 count = max(math.ceil(count / ptrsize), 1)
69
70 reg_values = collections.defaultdict(lambda: [])
71 for reg in pwndbg.regs.common:
72 reg_values[pwndbg.regs[reg]].append(reg)
73 # address = pwndbg.memory.poi(pwndbg.typeinfo.ppvoid, address)
74
75 start = address
76 stop = address + (count*ptrsize)
77 step = ptrsize
78
79 # Find all registers which show up in the trace
80 regs = {}
81 for i in range(start, stop, step):
82 values = list(reg_values[i])
83
84 for width in range(1, pwndbg.arch.ptrsize):
85 values.extend('%s-%i' % (r,width) for r in reg_values[i+width])
86
87 regs[i] = ' '.join(values)
88
89 # Find the longest set of register information
90 if regs:
91 longest_regs = max(map(len, regs.values())) + 1
92 else:
93 longest_regs = 0
94
95 # Print everything out
96 result = []
97 last = None
98 skip = False
99 for i,addr in enumerate(range(start, stop, step)):
100 if not pwndbg.memory.peek(addr):
101 result.append("<Could not read memory at %#x>" % addr)
102 break
103
104 # Collapse repeating values.
105 value = pwndbg.memory.pvoid(addr)
106 if skip_repeating_values and last == value:
107 if not skip:
108 result.append(T.repeating_marker('%s' % repeating_marker))
109 skip = True
110 continue
111 last = value
112 skip = False
113
114 line = ' '.join((T.offset("%02x%s%04x%s" % (i + telescope.offset, delimiter,
115 addr - start + (telescope.offset * ptrsize), separator)),
116 T.register(regs[addr].ljust(longest_regs)),
117 pwndbg.chain.format(addr)))
118 result.append(line)
119 telescope.offset += i
120 telescope.last_address = addr
121
122 if not to_string:
123 print('\n'.join(result))
124
125 return result
126
127
128 parser = argparse.ArgumentParser(description='dereferences on stack data with specified count and offset.')
129 parser.add_argument('count', nargs='?', default=8, type=int,
130 help='number of element to dump')
131 parser.add_argument('offset', nargs='?', default=0, type=int,
132 help='Element offset from $sp (support negative offset)')
133
134
135 @pwndbg.commands.ArgparsedCommand(parser)
136 @pwndbg.commands.OnlyWhenRunning
137 def stack(count, offset):
138 ptrsize = pwndbg.typeinfo.ptrsize
139 telescope.repeat = stack.repeat
140 telescope(address=pwndbg.regs.sp + offset * ptrsize, count=count)
141
142
143 telescope.last_address = 0
144 telescope.offset = 0
145
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/pwndbg/commands/telescope.py b/pwndbg/commands/telescope.py
--- a/pwndbg/commands/telescope.py
+++ b/pwndbg/commands/telescope.py
@@ -23,6 +23,8 @@
telescope_lines = pwndbg.config.Parameter('telescope-lines', 8, 'number of lines to printed by the telescope command')
skip_repeating_values = pwndbg.config.Parameter('telescope-skip-repeating-val', True,
'whether to skip repeating values of the telescope command')
+skip_repeating_values_minimum = pwndbg.config.Parameter('telescope-skip-repeating-val-minimum', 3,
+ 'minimum amount of repeated values before skipping lines')
offset_separator = theme.Parameter('telescope-offset-separator', '│', 'offset separator of the telescope command')
offset_delimiter = theme.Parameter('telescope-offset-delimiter', ':', 'offset delimiter of the telescope command')
@@ -88,34 +90,47 @@
# Find the longest set of register information
if regs:
- longest_regs = max(map(len, regs.values())) + 1
+ longest_regs = max(map(len, regs.values()))
else:
longest_regs = 0
# Print everything out
result = []
- last = None
- skip = False
- for i,addr in enumerate(range(start, stop, step)):
+ last = None
+ collapse_buffer = []
+ skipped_padding = 2 + len(offset_delimiter) + 4 + len(offset_separator) + 1 + longest_regs + 1 - len(repeating_marker)
+
+ # Collapse repeating values exceeding minimum delta.
+ def collapse_repeating_values():
+ # The first line was already printed, hence increment by 1
+ if collapse_buffer and len(collapse_buffer) + 1 >= skip_repeating_values_minimum:
+ result.append(T.repeating_marker('%s%s%i skipped' % (repeating_marker, ' ' * skipped_padding, len(collapse_buffer))))
+ else:
+ result.extend(collapse_buffer)
+ collapse_buffer.clear()
+
+ for i, addr in enumerate(range(start, stop, step)):
if not pwndbg.memory.peek(addr):
+ collapse_repeating_values()
result.append("<Could not read memory at %#x>" % addr)
break
- # Collapse repeating values.
- value = pwndbg.memory.pvoid(addr)
- if skip_repeating_values and last == value:
- if not skip:
- result.append(T.repeating_marker('%s' % repeating_marker))
- skip = True
- continue
- last = value
- skip = False
-
line = ' '.join((T.offset("%02x%s%04x%s" % (i + telescope.offset, delimiter,
addr - start + (telescope.offset * ptrsize), separator)),
T.register(regs[addr].ljust(longest_regs)),
pwndbg.chain.format(addr)))
+
+ # Buffer repeating values.
+ if skip_repeating_values:
+ value = pwndbg.memory.pvoid(addr)
+ if last == value:
+ collapse_buffer.append(line)
+ continue
+ collapse_repeating_values()
+ last = value
+
result.append(line)
+
telescope.offset += i
telescope.last_address = addr
|
{"golden_diff": "diff --git a/pwndbg/commands/telescope.py b/pwndbg/commands/telescope.py\n--- a/pwndbg/commands/telescope.py\n+++ b/pwndbg/commands/telescope.py\n@@ -23,6 +23,8 @@\n telescope_lines = pwndbg.config.Parameter('telescope-lines', 8, 'number of lines to printed by the telescope command')\n skip_repeating_values = pwndbg.config.Parameter('telescope-skip-repeating-val', True,\n 'whether to skip repeating values of the telescope command')\n+skip_repeating_values_minimum = pwndbg.config.Parameter('telescope-skip-repeating-val-minimum', 3,\n+ 'minimum amount of repeated values before skipping lines')\n \n offset_separator = theme.Parameter('telescope-offset-separator', '\u2502', 'offset separator of the telescope command')\n offset_delimiter = theme.Parameter('telescope-offset-delimiter', ':', 'offset delimiter of the telescope command')\n@@ -88,34 +90,47 @@\n \n # Find the longest set of register information\n if regs:\n- longest_regs = max(map(len, regs.values())) + 1\n+ longest_regs = max(map(len, regs.values()))\n else:\n longest_regs = 0\n \n # Print everything out\n result = []\n- last = None\n- skip = False\n- for i,addr in enumerate(range(start, stop, step)):\n+ last = None\n+ collapse_buffer = []\n+ skipped_padding = 2 + len(offset_delimiter) + 4 + len(offset_separator) + 1 + longest_regs + 1 - len(repeating_marker)\n+\n+ # Collapse repeating values exceeding minimum delta.\n+ def collapse_repeating_values():\n+ # The first line was already printed, hence increment by 1\n+ if collapse_buffer and len(collapse_buffer) + 1 >= skip_repeating_values_minimum:\n+ result.append(T.repeating_marker('%s%s%i skipped' % (repeating_marker, ' ' * skipped_padding, len(collapse_buffer))))\n+ else:\n+ result.extend(collapse_buffer)\n+ collapse_buffer.clear()\n+\n+ for i, addr in enumerate(range(start, stop, step)):\n if not pwndbg.memory.peek(addr):\n+ collapse_repeating_values()\n result.append(\"<Could not read memory at %#x>\" % addr)\n break\n \n- # Collapse repeating values.\n- value = pwndbg.memory.pvoid(addr)\n- if skip_repeating_values and last == value:\n- if not skip:\n- result.append(T.repeating_marker('%s' % repeating_marker))\n- skip = True\n- continue\n- last = value\n- skip = False\n-\n line = ' '.join((T.offset(\"%02x%s%04x%s\" % (i + telescope.offset, delimiter,\n addr - start + (telescope.offset * ptrsize), separator)),\n T.register(regs[addr].ljust(longest_regs)),\n pwndbg.chain.format(addr)))\n+\n+ # Buffer repeating values.\n+ if skip_repeating_values:\n+ value = pwndbg.memory.pvoid(addr)\n+ if last == value:\n+ collapse_buffer.append(line)\n+ continue\n+ collapse_repeating_values()\n+ last = value\n+\n result.append(line)\n+\n telescope.offset += i\n telescope.last_address = addr\n", "issue": "telescope should not fold single values, should probably be configurable\nIn this screenshot, we should simply display the 0x28 and 0x90 and 0x108 offset in the stack trace. The ellipsis isn't particularly useful here, and actually adds to cognitive load due to having to check how many values are skipped.\r\n\r\nThis should probably be behind a config, e.g. `telescope-fold-minimum` which is expressed in pointer widths. This will allow people to control the setting to their choosing (or disable folding it with a value of 0 or -1 or something).\r\n\r\n\r\n\n", "before_files": [{"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n\"\"\"\nPrints out pointer chains starting at some address in memory.\n\nGenerally used to print out the stack or register values.\n\"\"\"\n\nimport argparse\nimport collections\nimport math\n\nimport pwndbg.arch\nimport pwndbg.chain\nimport pwndbg.color.telescope as T\nimport pwndbg.color.theme as theme\nimport pwndbg.commands\nimport pwndbg.config\nimport pwndbg.memory\nimport pwndbg.regs\nimport pwndbg.typeinfo\n\ntelescope_lines = pwndbg.config.Parameter('telescope-lines', 8, 'number of lines to printed by the telescope command')\nskip_repeating_values = pwndbg.config.Parameter('telescope-skip-repeating-val', True,\n 'whether to skip repeating values of the telescope command')\n\noffset_separator = theme.Parameter('telescope-offset-separator', '\u2502', 'offset separator of the telescope command')\noffset_delimiter = theme.Parameter('telescope-offset-delimiter', ':', 'offset delimiter of the telescope command')\nrepeating_marker = theme.Parameter('telescope-repeating-marker', '... \u2193',\n 'repeating values marker of the telescope command')\n\n\nparser = argparse.ArgumentParser(description=\"\"\"\n Recursively dereferences pointers starting at the specified address\n ($sp by default)\n \"\"\")\nparser.add_argument(\"address\", nargs=\"?\", default=None, type=int, help=\"The address to telescope at.\")\nparser.add_argument(\"count\", nargs=\"?\", default=telescope_lines, type=int, help=\"The number of lines to show.\")\n@pwndbg.commands.ArgparsedCommand(parser)\n@pwndbg.commands.OnlyWhenRunning\ndef telescope(address=None, count=telescope_lines, to_string=False):\n \"\"\"\n Recursively dereferences pointers starting at the specified address\n ($sp by default)\n \"\"\"\n ptrsize = pwndbg.typeinfo.ptrsize\n if telescope.repeat:\n address = telescope.last_address + ptrsize\n telescope.offset += 1\n else:\n telescope.offset = 0\n\n address = int(address if address else pwndbg.regs.sp) & pwndbg.arch.ptrmask\n count = max(int(count), 1) & pwndbg.arch.ptrmask\n delimiter = T.delimiter(offset_delimiter)\n separator = T.separator(offset_separator)\n\n # Allow invocation of \"telescope 20\" to dump 20 bytes at the stack pointer\n if address < pwndbg.memory.MMAP_MIN_ADDR and not pwndbg.memory.peek(address):\n count = address\n address = pwndbg.regs.sp\n\n # Allow invocation of \"telescope a b\" to dump all bytes from A to B\n if int(address) <= int(count):\n # adjust count if it is an address. use ceil division as count is number of\n # ptrsize values and we don't want to strip out a value if dest is unaligned\n count -= address\n count = max(math.ceil(count / ptrsize), 1)\n\n reg_values = collections.defaultdict(lambda: [])\n for reg in pwndbg.regs.common:\n reg_values[pwndbg.regs[reg]].append(reg)\n # address = pwndbg.memory.poi(pwndbg.typeinfo.ppvoid, address)\n\n start = address\n stop = address + (count*ptrsize)\n step = ptrsize\n\n # Find all registers which show up in the trace\n regs = {}\n for i in range(start, stop, step):\n values = list(reg_values[i])\n\n for width in range(1, pwndbg.arch.ptrsize):\n values.extend('%s-%i' % (r,width) for r in reg_values[i+width])\n\n regs[i] = ' '.join(values)\n\n # Find the longest set of register information\n if regs:\n longest_regs = max(map(len, regs.values())) + 1\n else:\n longest_regs = 0\n\n # Print everything out\n result = []\n last = None\n skip = False\n for i,addr in enumerate(range(start, stop, step)):\n if not pwndbg.memory.peek(addr):\n result.append(\"<Could not read memory at %#x>\" % addr)\n break\n\n # Collapse repeating values.\n value = pwndbg.memory.pvoid(addr)\n if skip_repeating_values and last == value:\n if not skip:\n result.append(T.repeating_marker('%s' % repeating_marker))\n skip = True\n continue\n last = value\n skip = False\n\n line = ' '.join((T.offset(\"%02x%s%04x%s\" % (i + telescope.offset, delimiter,\n addr - start + (telescope.offset * ptrsize), separator)),\n T.register(regs[addr].ljust(longest_regs)),\n pwndbg.chain.format(addr)))\n result.append(line)\n telescope.offset += i\n telescope.last_address = addr\n\n if not to_string:\n print('\\n'.join(result))\n\n return result\n\n\nparser = argparse.ArgumentParser(description='dereferences on stack data with specified count and offset.')\nparser.add_argument('count', nargs='?', default=8, type=int,\n help='number of element to dump')\nparser.add_argument('offset', nargs='?', default=0, type=int,\n help='Element offset from $sp (support negative offset)')\n\n\n@pwndbg.commands.ArgparsedCommand(parser)\n@pwndbg.commands.OnlyWhenRunning\ndef stack(count, offset):\n ptrsize = pwndbg.typeinfo.ptrsize\n telescope.repeat = stack.repeat\n telescope(address=pwndbg.regs.sp + offset * ptrsize, count=count)\n\n\ntelescope.last_address = 0\ntelescope.offset = 0\n", "path": "pwndbg/commands/telescope.py"}], "after_files": [{"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n\"\"\"\nPrints out pointer chains starting at some address in memory.\n\nGenerally used to print out the stack or register values.\n\"\"\"\n\nimport argparse\nimport collections\nimport math\n\nimport pwndbg.arch\nimport pwndbg.chain\nimport pwndbg.color.telescope as T\nimport pwndbg.color.theme as theme\nimport pwndbg.commands\nimport pwndbg.config\nimport pwndbg.memory\nimport pwndbg.regs\nimport pwndbg.typeinfo\n\ntelescope_lines = pwndbg.config.Parameter('telescope-lines', 8, 'number of lines to printed by the telescope command')\nskip_repeating_values = pwndbg.config.Parameter('telescope-skip-repeating-val', True,\n 'whether to skip repeating values of the telescope command')\nskip_repeating_values_minimum = pwndbg.config.Parameter('telescope-skip-repeating-val-minimum', 3,\n 'minimum amount of repeated values before skipping lines')\n\noffset_separator = theme.Parameter('telescope-offset-separator', '\u2502', 'offset separator of the telescope command')\noffset_delimiter = theme.Parameter('telescope-offset-delimiter', ':', 'offset delimiter of the telescope command')\nrepeating_marker = theme.Parameter('telescope-repeating-marker', '... \u2193',\n 'repeating values marker of the telescope command')\n\n\nparser = argparse.ArgumentParser(description=\"\"\"\n Recursively dereferences pointers starting at the specified address\n ($sp by default)\n \"\"\")\nparser.add_argument(\"address\", nargs=\"?\", default=None, type=int, help=\"The address to telescope at.\")\nparser.add_argument(\"count\", nargs=\"?\", default=telescope_lines, type=int, help=\"The number of lines to show.\")\n@pwndbg.commands.ArgparsedCommand(parser)\n@pwndbg.commands.OnlyWhenRunning\ndef telescope(address=None, count=telescope_lines, to_string=False):\n \"\"\"\n Recursively dereferences pointers starting at the specified address\n ($sp by default)\n \"\"\"\n ptrsize = pwndbg.typeinfo.ptrsize\n if telescope.repeat:\n address = telescope.last_address + ptrsize\n telescope.offset += 1\n else:\n telescope.offset = 0\n\n address = int(address if address else pwndbg.regs.sp) & pwndbg.arch.ptrmask\n count = max(int(count), 1) & pwndbg.arch.ptrmask\n delimiter = T.delimiter(offset_delimiter)\n separator = T.separator(offset_separator)\n\n # Allow invocation of \"telescope 20\" to dump 20 bytes at the stack pointer\n if address < pwndbg.memory.MMAP_MIN_ADDR and not pwndbg.memory.peek(address):\n count = address\n address = pwndbg.regs.sp\n\n # Allow invocation of \"telescope a b\" to dump all bytes from A to B\n if int(address) <= int(count):\n # adjust count if it is an address. use ceil division as count is number of\n # ptrsize values and we don't want to strip out a value if dest is unaligned\n count -= address\n count = max(math.ceil(count / ptrsize), 1)\n\n reg_values = collections.defaultdict(lambda: [])\n for reg in pwndbg.regs.common:\n reg_values[pwndbg.regs[reg]].append(reg)\n # address = pwndbg.memory.poi(pwndbg.typeinfo.ppvoid, address)\n\n start = address\n stop = address + (count*ptrsize)\n step = ptrsize\n\n # Find all registers which show up in the trace\n regs = {}\n for i in range(start, stop, step):\n values = list(reg_values[i])\n\n for width in range(1, pwndbg.arch.ptrsize):\n values.extend('%s-%i' % (r,width) for r in reg_values[i+width])\n\n regs[i] = ' '.join(values)\n\n # Find the longest set of register information\n if regs:\n longest_regs = max(map(len, regs.values()))\n else:\n longest_regs = 0\n\n # Print everything out\n result = []\n last = None\n collapse_buffer = []\n skipped_padding = 2 + len(offset_delimiter) + 4 + len(offset_separator) + 1 + longest_regs + 1 - len(repeating_marker)\n\n # Collapse repeating values exceeding minimum delta.\n def collapse_repeating_values():\n # The first line was already printed, hence increment by 1\n if collapse_buffer and len(collapse_buffer) + 1 >= skip_repeating_values_minimum:\n result.append(T.repeating_marker('%s%s%i skipped' % (repeating_marker, ' ' * skipped_padding, len(collapse_buffer))))\n else:\n result.extend(collapse_buffer)\n collapse_buffer.clear()\n\n for i, addr in enumerate(range(start, stop, step)):\n if not pwndbg.memory.peek(addr):\n collapse_repeating_values()\n result.append(\"<Could not read memory at %#x>\" % addr)\n break\n\n line = ' '.join((T.offset(\"%02x%s%04x%s\" % (i + telescope.offset, delimiter,\n addr - start + (telescope.offset * ptrsize), separator)),\n T.register(regs[addr].ljust(longest_regs)),\n pwndbg.chain.format(addr)))\n\n # Buffer repeating values.\n if skip_repeating_values:\n value = pwndbg.memory.pvoid(addr)\n if last == value:\n collapse_buffer.append(line)\n continue\n collapse_repeating_values()\n last = value\n\n result.append(line)\n\n telescope.offset += i\n telescope.last_address = addr\n\n if not to_string:\n print('\\n'.join(result))\n\n return result\n\n\nparser = argparse.ArgumentParser(description='dereferences on stack data with specified count and offset.')\nparser.add_argument('count', nargs='?', default=8, type=int,\n help='number of element to dump')\nparser.add_argument('offset', nargs='?', default=0, type=int,\n help='Element offset from $sp (support negative offset)')\n\n\n@pwndbg.commands.ArgparsedCommand(parser)\n@pwndbg.commands.OnlyWhenRunning\ndef stack(count, offset):\n ptrsize = pwndbg.typeinfo.ptrsize\n telescope.repeat = stack.repeat\n telescope(address=pwndbg.regs.sp + offset * ptrsize, count=count)\n\n\ntelescope.last_address = 0\ntelescope.offset = 0\n", "path": "pwndbg/commands/telescope.py"}]}
| 2,019 | 735 |
gh_patches_debug_283
|
rasdani/github-patches
|
git_diff
|
scikit-image__scikit-image-938
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Draw circle does not obey shape argument - v0.93
In previous version worked fine, but I just installed new version 0.93 and draw.circle is not working properly. When I apply the circle for image 1024x1024 with following arguments:
rr,cc=circle(-5.2796287128712879E+02, 1.5003712871287132E+02, 9.8910961199417170E+02, (1024,1024))
I get negative values in rr and cc arrays. Which obviously leads to errors when applied to:
img[rr,cc]=0
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `skimage/draw/draw.py`
Content:
```
1 # coding: utf-8
2 import numpy as np
3
4
5 def _coords_inside_image(rr, cc, shape):
6 mask = (rr >= 0) & (rr < shape[0]) & (cc >= 0) & (cc < shape[1])
7 return rr[mask], cc[mask]
8
9
10 def ellipse(cy, cx, yradius, xradius, shape=None):
11 """Generate coordinates of pixels within ellipse.
12
13 Parameters
14 ----------
15 cy, cx : double
16 Centre coordinate of ellipse.
17 yradius, xradius : double
18 Minor and major semi-axes. ``(x/xradius)**2 + (y/yradius)**2 = 1``.
19 shape : tuple, optional
20 Image shape which is used to determine maximum extents of output pixel
21 coordinates. This is useful for ellipses which exceed the image size.
22 By default the full extents of the ellipse are used.
23
24 Returns
25 -------
26 rr, cc : ndarray of int
27 Pixel coordinates of ellipse.
28 May be used to directly index into an array, e.g.
29 ``img[rr, cc] = 1``.
30
31 Examples
32 --------
33 >>> from skimage.draw import ellipse
34 >>> img = np.zeros((10, 10), dtype=np.uint8)
35 >>> rr, cc = ellipse(5, 5, 3, 4)
36 >>> img[rr, cc] = 1
37 >>> img
38 array([[0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
39 [0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
40 [0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
41 [0, 0, 0, 1, 1, 1, 1, 1, 0, 0],
42 [0, 0, 1, 1, 1, 1, 1, 1, 1, 0],
43 [0, 0, 1, 1, 1, 1, 1, 1, 1, 0],
44 [0, 0, 1, 1, 1, 1, 1, 1, 1, 0],
45 [0, 0, 0, 1, 1, 1, 1, 1, 0, 0],
46 [0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
47 [0, 0, 0, 0, 0, 0, 0, 0, 0, 0]], dtype=uint8)
48
49 """
50
51 dr = 1 / float(yradius)
52 dc = 1 / float(xradius)
53
54 r, c = np.ogrid[-1:1:dr, -1:1:dc]
55 rr, cc = np.nonzero(r ** 2 + c ** 2 < 1)
56
57 rr.flags.writeable = True
58 cc.flags.writeable = True
59 rr += cy - yradius
60 cc += cx - xradius
61
62 if shape is not None:
63 _coords_inside_image(rr, cc, shape)
64
65 return rr, cc
66
67
68 def circle(cy, cx, radius, shape=None):
69 """Generate coordinates of pixels within circle.
70
71 Parameters
72 ----------
73 cy, cx : double
74 Centre coordinate of circle.
75 radius: double
76 Radius of circle.
77 shape : tuple, optional
78 Image shape which is used to determine maximum extents of output pixel
79 coordinates. This is useful for circles which exceed the image size.
80 By default the full extents of the circle are used.
81
82 Returns
83 -------
84 rr, cc : ndarray of int
85 Pixel coordinates of circle.
86 May be used to directly index into an array, e.g.
87 ``img[rr, cc] = 1``.
88 Notes
89 -----
90 This function is a wrapper for skimage.draw.ellipse()
91
92 Examples
93 --------
94 >>> from skimage.draw import circle
95 >>> img = np.zeros((10, 10), dtype=np.uint8)
96 >>> rr, cc = circle(4, 4, 5)
97 >>> img[rr, cc] = 1
98 >>> img
99 array([[0, 0, 1, 1, 1, 1, 1, 0, 0, 0],
100 [0, 1, 1, 1, 1, 1, 1, 1, 0, 0],
101 [1, 1, 1, 1, 1, 1, 1, 1, 1, 0],
102 [1, 1, 1, 1, 1, 1, 1, 1, 1, 0],
103 [1, 1, 1, 1, 1, 1, 1, 1, 1, 0],
104 [1, 1, 1, 1, 1, 1, 1, 1, 1, 0],
105 [1, 1, 1, 1, 1, 1, 1, 1, 1, 0],
106 [0, 1, 1, 1, 1, 1, 1, 1, 0, 0],
107 [0, 0, 1, 1, 1, 1, 1, 0, 0, 0],
108 [0, 0, 0, 0, 0, 0, 0, 0, 0, 0]], dtype=uint8)
109
110 """
111
112 return ellipse(cy, cx, radius, radius, shape)
113
114
115 def set_color(img, coords, color):
116 """Set pixel color in the image at the given coordinates.
117
118 Coordinates that exceed the shape of the image will be ignored.
119
120 Parameters
121 ----------
122 img : (M, N, D) ndarray
123 Image
124 coords : ((P,) ndarray, (P,) ndarray)
125 Coordinates of pixels to be colored.
126 color : (D,) ndarray
127 Color to be assigned to coordinates in the image.
128
129 Returns
130 -------
131 img : (M, N, D) ndarray
132 The updated image.
133
134 Examples
135 --------
136 >>> from skimage.draw import line, set_color
137 >>> img = np.zeros((10, 10), dtype=np.uint8)
138 >>> rr, cc = line(1, 1, 20, 20)
139 >>> set_color(img, (rr, cc), 1)
140 >>> img
141 array([[0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
142 [0, 1, 0, 0, 0, 0, 0, 0, 0, 0],
143 [0, 0, 1, 0, 0, 0, 0, 0, 0, 0],
144 [0, 0, 0, 1, 0, 0, 0, 0, 0, 0],
145 [0, 0, 0, 0, 1, 0, 0, 0, 0, 0],
146 [0, 0, 0, 0, 0, 1, 0, 0, 0, 0],
147 [0, 0, 0, 0, 0, 0, 1, 0, 0, 0],
148 [0, 0, 0, 0, 0, 0, 0, 1, 0, 0],
149 [0, 0, 0, 0, 0, 0, 0, 0, 1, 0],
150 [0, 0, 0, 0, 0, 0, 0, 0, 0, 1]], dtype=uint8)
151
152 """
153
154 rr, cc = coords
155 rr, cc = _coords_inside_image(rr, cc, img.shape)
156 img[rr, cc] = color
157
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/skimage/draw/draw.py b/skimage/draw/draw.py
--- a/skimage/draw/draw.py
+++ b/skimage/draw/draw.py
@@ -60,7 +60,7 @@
cc += cx - xradius
if shape is not None:
- _coords_inside_image(rr, cc, shape)
+ return _coords_inside_image(rr, cc, shape)
return rr, cc
|
{"golden_diff": "diff --git a/skimage/draw/draw.py b/skimage/draw/draw.py\n--- a/skimage/draw/draw.py\n+++ b/skimage/draw/draw.py\n@@ -60,7 +60,7 @@\n cc += cx - xradius\n \n if shape is not None:\n- _coords_inside_image(rr, cc, shape)\n+ return _coords_inside_image(rr, cc, shape)\n \n return rr, cc\n", "issue": "Draw circle does not obey shape argument - v0.93\nIn previous version worked fine, but I just installed new version 0.93 and draw.circle is not working properly. When I apply the circle for image 1024x1024 with following arguments:\nrr,cc=circle(-5.2796287128712879E+02, 1.5003712871287132E+02, 9.8910961199417170E+02, (1024,1024))\n\nI get negative values in rr and cc arrays. Which obviously leads to errors when applied to:\nimg[rr,cc]=0\n\n", "before_files": [{"content": "# coding: utf-8\nimport numpy as np\n\n\ndef _coords_inside_image(rr, cc, shape):\n mask = (rr >= 0) & (rr < shape[0]) & (cc >= 0) & (cc < shape[1])\n return rr[mask], cc[mask]\n\n\ndef ellipse(cy, cx, yradius, xradius, shape=None):\n \"\"\"Generate coordinates of pixels within ellipse.\n\n Parameters\n ----------\n cy, cx : double\n Centre coordinate of ellipse.\n yradius, xradius : double\n Minor and major semi-axes. ``(x/xradius)**2 + (y/yradius)**2 = 1``.\n shape : tuple, optional\n Image shape which is used to determine maximum extents of output pixel\n coordinates. This is useful for ellipses which exceed the image size.\n By default the full extents of the ellipse are used.\n\n Returns\n -------\n rr, cc : ndarray of int\n Pixel coordinates of ellipse.\n May be used to directly index into an array, e.g.\n ``img[rr, cc] = 1``.\n\n Examples\n --------\n >>> from skimage.draw import ellipse\n >>> img = np.zeros((10, 10), dtype=np.uint8)\n >>> rr, cc = ellipse(5, 5, 3, 4)\n >>> img[rr, cc] = 1\n >>> img\n array([[0, 0, 0, 0, 0, 0, 0, 0, 0, 0],\n [0, 0, 0, 0, 0, 0, 0, 0, 0, 0],\n [0, 0, 0, 0, 0, 0, 0, 0, 0, 0],\n [0, 0, 0, 1, 1, 1, 1, 1, 0, 0],\n [0, 0, 1, 1, 1, 1, 1, 1, 1, 0],\n [0, 0, 1, 1, 1, 1, 1, 1, 1, 0],\n [0, 0, 1, 1, 1, 1, 1, 1, 1, 0],\n [0, 0, 0, 1, 1, 1, 1, 1, 0, 0],\n [0, 0, 0, 0, 0, 0, 0, 0, 0, 0],\n [0, 0, 0, 0, 0, 0, 0, 0, 0, 0]], dtype=uint8)\n\n \"\"\"\n\n dr = 1 / float(yradius)\n dc = 1 / float(xradius)\n\n r, c = np.ogrid[-1:1:dr, -1:1:dc]\n rr, cc = np.nonzero(r ** 2 + c ** 2 < 1)\n\n rr.flags.writeable = True\n cc.flags.writeable = True\n rr += cy - yradius\n cc += cx - xradius\n\n if shape is not None:\n _coords_inside_image(rr, cc, shape)\n\n return rr, cc\n\n\ndef circle(cy, cx, radius, shape=None):\n \"\"\"Generate coordinates of pixels within circle.\n\n Parameters\n ----------\n cy, cx : double\n Centre coordinate of circle.\n radius: double\n Radius of circle.\n shape : tuple, optional\n Image shape which is used to determine maximum extents of output pixel\n coordinates. This is useful for circles which exceed the image size.\n By default the full extents of the circle are used.\n\n Returns\n -------\n rr, cc : ndarray of int\n Pixel coordinates of circle.\n May be used to directly index into an array, e.g.\n ``img[rr, cc] = 1``.\n Notes\n -----\n This function is a wrapper for skimage.draw.ellipse()\n\n Examples\n --------\n >>> from skimage.draw import circle\n >>> img = np.zeros((10, 10), dtype=np.uint8)\n >>> rr, cc = circle(4, 4, 5)\n >>> img[rr, cc] = 1\n >>> img\n array([[0, 0, 1, 1, 1, 1, 1, 0, 0, 0],\n [0, 1, 1, 1, 1, 1, 1, 1, 0, 0],\n [1, 1, 1, 1, 1, 1, 1, 1, 1, 0],\n [1, 1, 1, 1, 1, 1, 1, 1, 1, 0],\n [1, 1, 1, 1, 1, 1, 1, 1, 1, 0],\n [1, 1, 1, 1, 1, 1, 1, 1, 1, 0],\n [1, 1, 1, 1, 1, 1, 1, 1, 1, 0],\n [0, 1, 1, 1, 1, 1, 1, 1, 0, 0],\n [0, 0, 1, 1, 1, 1, 1, 0, 0, 0],\n [0, 0, 0, 0, 0, 0, 0, 0, 0, 0]], dtype=uint8)\n\n \"\"\"\n\n return ellipse(cy, cx, radius, radius, shape)\n\n\ndef set_color(img, coords, color):\n \"\"\"Set pixel color in the image at the given coordinates.\n\n Coordinates that exceed the shape of the image will be ignored.\n\n Parameters\n ----------\n img : (M, N, D) ndarray\n Image\n coords : ((P,) ndarray, (P,) ndarray)\n Coordinates of pixels to be colored.\n color : (D,) ndarray\n Color to be assigned to coordinates in the image.\n\n Returns\n -------\n img : (M, N, D) ndarray\n The updated image.\n\n Examples\n --------\n >>> from skimage.draw import line, set_color\n >>> img = np.zeros((10, 10), dtype=np.uint8)\n >>> rr, cc = line(1, 1, 20, 20)\n >>> set_color(img, (rr, cc), 1)\n >>> img\n array([[0, 0, 0, 0, 0, 0, 0, 0, 0, 0],\n [0, 1, 0, 0, 0, 0, 0, 0, 0, 0],\n [0, 0, 1, 0, 0, 0, 0, 0, 0, 0],\n [0, 0, 0, 1, 0, 0, 0, 0, 0, 0],\n [0, 0, 0, 0, 1, 0, 0, 0, 0, 0],\n [0, 0, 0, 0, 0, 1, 0, 0, 0, 0],\n [0, 0, 0, 0, 0, 0, 1, 0, 0, 0],\n [0, 0, 0, 0, 0, 0, 0, 1, 0, 0],\n [0, 0, 0, 0, 0, 0, 0, 0, 1, 0],\n [0, 0, 0, 0, 0, 0, 0, 0, 0, 1]], dtype=uint8)\n\n \"\"\"\n\n rr, cc = coords\n rr, cc = _coords_inside_image(rr, cc, img.shape)\n img[rr, cc] = color\n", "path": "skimage/draw/draw.py"}], "after_files": [{"content": "# coding: utf-8\nimport numpy as np\n\n\ndef _coords_inside_image(rr, cc, shape):\n mask = (rr >= 0) & (rr < shape[0]) & (cc >= 0) & (cc < shape[1])\n return rr[mask], cc[mask]\n\n\ndef ellipse(cy, cx, yradius, xradius, shape=None):\n \"\"\"Generate coordinates of pixels within ellipse.\n\n Parameters\n ----------\n cy, cx : double\n Centre coordinate of ellipse.\n yradius, xradius : double\n Minor and major semi-axes. ``(x/xradius)**2 + (y/yradius)**2 = 1``.\n shape : tuple, optional\n Image shape which is used to determine maximum extents of output pixel\n coordinates. This is useful for ellipses which exceed the image size.\n By default the full extents of the ellipse are used.\n\n Returns\n -------\n rr, cc : ndarray of int\n Pixel coordinates of ellipse.\n May be used to directly index into an array, e.g.\n ``img[rr, cc] = 1``.\n\n Examples\n --------\n >>> from skimage.draw import ellipse\n >>> img = np.zeros((10, 10), dtype=np.uint8)\n >>> rr, cc = ellipse(5, 5, 3, 4)\n >>> img[rr, cc] = 1\n >>> img\n array([[0, 0, 0, 0, 0, 0, 0, 0, 0, 0],\n [0, 0, 0, 0, 0, 0, 0, 0, 0, 0],\n [0, 0, 0, 0, 0, 0, 0, 0, 0, 0],\n [0, 0, 0, 1, 1, 1, 1, 1, 0, 0],\n [0, 0, 1, 1, 1, 1, 1, 1, 1, 0],\n [0, 0, 1, 1, 1, 1, 1, 1, 1, 0],\n [0, 0, 1, 1, 1, 1, 1, 1, 1, 0],\n [0, 0, 0, 1, 1, 1, 1, 1, 0, 0],\n [0, 0, 0, 0, 0, 0, 0, 0, 0, 0],\n [0, 0, 0, 0, 0, 0, 0, 0, 0, 0]], dtype=uint8)\n\n \"\"\"\n\n dr = 1 / float(yradius)\n dc = 1 / float(xradius)\n\n r, c = np.ogrid[-1:1:dr, -1:1:dc]\n rr, cc = np.nonzero(r ** 2 + c ** 2 < 1)\n\n rr.flags.writeable = True\n cc.flags.writeable = True\n rr += cy - yradius\n cc += cx - xradius\n\n if shape is not None:\n return _coords_inside_image(rr, cc, shape)\n\n return rr, cc\n\n\ndef circle(cy, cx, radius, shape=None):\n \"\"\"Generate coordinates of pixels within circle.\n\n Parameters\n ----------\n cy, cx : double\n Centre coordinate of circle.\n radius: double\n Radius of circle.\n shape : tuple, optional\n Image shape which is used to determine maximum extents of output pixel\n coordinates. This is useful for circles which exceed the image size.\n By default the full extents of the circle are used.\n\n Returns\n -------\n rr, cc : ndarray of int\n Pixel coordinates of circle.\n May be used to directly index into an array, e.g.\n ``img[rr, cc] = 1``.\n Notes\n -----\n This function is a wrapper for skimage.draw.ellipse()\n\n Examples\n --------\n >>> from skimage.draw import circle\n >>> img = np.zeros((10, 10), dtype=np.uint8)\n >>> rr, cc = circle(4, 4, 5)\n >>> img[rr, cc] = 1\n >>> img\n array([[0, 0, 1, 1, 1, 1, 1, 0, 0, 0],\n [0, 1, 1, 1, 1, 1, 1, 1, 0, 0],\n [1, 1, 1, 1, 1, 1, 1, 1, 1, 0],\n [1, 1, 1, 1, 1, 1, 1, 1, 1, 0],\n [1, 1, 1, 1, 1, 1, 1, 1, 1, 0],\n [1, 1, 1, 1, 1, 1, 1, 1, 1, 0],\n [1, 1, 1, 1, 1, 1, 1, 1, 1, 0],\n [0, 1, 1, 1, 1, 1, 1, 1, 0, 0],\n [0, 0, 1, 1, 1, 1, 1, 0, 0, 0],\n [0, 0, 0, 0, 0, 0, 0, 0, 0, 0]], dtype=uint8)\n\n \"\"\"\n\n return ellipse(cy, cx, radius, radius, shape)\n\n\ndef set_color(img, coords, color):\n \"\"\"Set pixel color in the image at the given coordinates.\n\n Coordinates that exceed the shape of the image will be ignored.\n\n Parameters\n ----------\n img : (M, N, D) ndarray\n Image\n coords : ((P,) ndarray, (P,) ndarray)\n Coordinates of pixels to be colored.\n color : (D,) ndarray\n Color to be assigned to coordinates in the image.\n\n Returns\n -------\n img : (M, N, D) ndarray\n The updated image.\n\n Examples\n --------\n >>> from skimage.draw import line, set_color\n >>> img = np.zeros((10, 10), dtype=np.uint8)\n >>> rr, cc = line(1, 1, 20, 20)\n >>> set_color(img, (rr, cc), 1)\n >>> img\n array([[0, 0, 0, 0, 0, 0, 0, 0, 0, 0],\n [0, 1, 0, 0, 0, 0, 0, 0, 0, 0],\n [0, 0, 1, 0, 0, 0, 0, 0, 0, 0],\n [0, 0, 0, 1, 0, 0, 0, 0, 0, 0],\n [0, 0, 0, 0, 1, 0, 0, 0, 0, 0],\n [0, 0, 0, 0, 0, 1, 0, 0, 0, 0],\n [0, 0, 0, 0, 0, 0, 1, 0, 0, 0],\n [0, 0, 0, 0, 0, 0, 0, 1, 0, 0],\n [0, 0, 0, 0, 0, 0, 0, 0, 1, 0],\n [0, 0, 0, 0, 0, 0, 0, 0, 0, 1]], dtype=uint8)\n\n \"\"\"\n\n rr, cc = coords\n rr, cc = _coords_inside_image(rr, cc, img.shape)\n img[rr, cc] = color\n", "path": "skimage/draw/draw.py"}]}
| 2,668 | 103 |
gh_patches_debug_15236
|
rasdani/github-patches
|
git_diff
|
lightly-ai__lightly-298
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
loader.num_workers=12 doesn't actually spawn multiple workers
I've noticed that when trying to create an embedding with the param `loader.num_workers` there seems to be no difference in CPU consumption;
`lightly-embed input_dir='./75000' trainer.max_epochs=1 loader.num_workers=12`
This could be that my images are ~8kb and there is some other bottleneck preventing the CPUs to max-out? in My ActivityMonitor (macOSX) I see that I have 80% idle CPU
System: MacBook Pro (16-inch, 2019), 2.6 GHz 6-Core Intel Core i7
OS: MacOSX 11.2.3
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `lightly/cli/upload_cli.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 """**Lightly Upload:** Upload images to the Lightly platform.
3
4 This module contains the entrypoint for the **lightly-upload**
5 command-line interface.
6 """
7
8 # Copyright (c) 2020. Lightly AG and its affiliates.
9 # All Rights Reserved
10
11 import hydra
12
13 import torchvision
14
15 from lightly.cli._helpers import fix_input_path
16
17 from lightly.api.utils import getenv
18 from lightly.api.api_workflow_client import ApiWorkflowClient
19 from lightly.data import LightlyDataset
20
21
22 def _upload_cli(cfg, is_cli_call=True):
23
24 input_dir = cfg['input_dir']
25 if input_dir and is_cli_call:
26 input_dir = fix_input_path(input_dir)
27
28 path_to_embeddings = cfg['embeddings']
29 if path_to_embeddings and is_cli_call:
30 path_to_embeddings = fix_input_path(path_to_embeddings)
31
32 dataset_id = cfg['dataset_id']
33 token = cfg['token']
34
35 size = cfg['resize']
36 if not isinstance(size, int):
37 size = tuple(size)
38 transform = None
39 if isinstance(size, tuple) or size > 0:
40 transform = torchvision.transforms.Resize(size)
41
42 if not token or not dataset_id:
43 print('Please specify your access token and dataset id.')
44 print('For help, try: lightly-upload --help')
45 return
46
47 api_workflow_client = ApiWorkflowClient(
48 token=token, dataset_id=dataset_id
49 )
50
51 if input_dir:
52 mode = cfg['upload']
53 dataset = LightlyDataset(input_dir=input_dir, transform=transform)
54 api_workflow_client.upload_dataset(
55 input=dataset, mode=mode
56 )
57
58 if path_to_embeddings:
59 name = cfg['embedding_name']
60 api_workflow_client.upload_embeddings(
61 path_to_embeddings_csv=path_to_embeddings, name=name
62 )
63
64
65 @hydra.main(config_path='config', config_name='config')
66 def upload_cli(cfg):
67 """Upload images/embeddings from the command-line to the Lightly platform.
68
69 Args:
70 cfg:
71 The default configs are loaded from the config file.
72 To overwrite them please see the section on the config file
73 (.config.config.yaml).
74
75 Command-Line Args:
76 input_dir:
77 Path to the input directory where images are stored.
78 embeddings:
79 Path to the csv file storing the embeddings generated by
80 lightly.
81 token:
82 User access token to the Lightly platform. If dataset_id
83 and token are specified, the images and embeddings are
84 uploaded to the platform.
85 dataset_id:
86 Identifier of the dataset on the Lightly platform. If
87 dataset_id and token are specified, the images and
88 embeddings are uploaded to the platform.
89 upload:
90 String to determine whether to upload the full images,
91 thumbnails only, or metadata only.
92
93 Must be one of ['full', 'thumbnails', 'metadata']
94 embedding_name:
95 Assign the embedding a name in order to identify it on the
96 Lightly platform.
97 resize:
98 Desired size of the uploaded images. If negative, default size is used.
99 If size is a sequence like (h, w), output size will be matched to
100 this. If size is an int, smaller edge of the image will be matched
101 to this number. i.e, if height > width, then image will be rescaled
102 to (size * height / width, size).
103
104 Examples:
105 >>> # upload thumbnails to the Lightly platform
106 >>> lightly-upload input_dir=data/ token='123' dataset_id='XYZ'
107 >>>
108 >>> # upload full images to the Lightly platform
109 >>> lightly-upload input_dir=data/ token='123' dataset_id='XYZ' upload='full'
110 >>>
111 >>> # upload metadata to the Lightly platform
112 >>> lightly-upload input_dir=data/ token='123' dataset_id='XYZ' upload='metadata'
113 >>>
114 >>> # upload embeddings to the Lightly platform (must have uploaded images beforehand)
115 >>> lightly-upload embeddings=embeddings.csv token='123' dataset_id='XYZ'
116 >>>
117 >>> # upload both, images and embeddings in a single command
118 >>> lightly-upload input_dir=data/ embeddings=embeddings.csv upload='full' \\
119 >>> token='123' dataset_id='XYZ'
120
121 """
122 _upload_cli(cfg)
123
124
125 def entry():
126 upload_cli()
127
```
Path: `lightly/api/api_workflow_upload_dataset.py`
Content:
```
1 import warnings
2 from concurrent.futures.thread import ThreadPoolExecutor
3 from typing import Union
4 import tqdm
5
6 from lightly.openapi_generated.swagger_client import TagCreator
7 from lightly.openapi_generated.swagger_client.models.sample_create_request import SampleCreateRequest
8 from lightly.api.utils import check_filename, check_image, get_thumbnail_from_img, PIL_to_bytes
9 from lightly.openapi_generated.swagger_client.models.initial_tag_create_request import InitialTagCreateRequest
10 from lightly.data.dataset import LightlyDataset
11
12 from lightly.api.utils import retry
13
14
15 class _UploadDatasetMixin:
16
17 def upload_dataset(self, input: Union[str, LightlyDataset], max_workers: int = 8,
18 mode: str = 'thumbnails', verbose: bool = True):
19 """Uploads a dataset to to the Lightly cloud solution.
20
21 Args:
22 input:
23 one of the following:
24 - the path to the dataset, e.g. "path/to/dataset"
25 - the dataset in form of a LightlyDataset
26 max_workers:
27 Maximum number of workers uploading images in parallel.
28 max_requests:
29 Maximum number of requests a single worker can do before he has
30 to wait for the others.
31 mode:
32 One of [full, thumbnails, metadata]. Whether to upload thumbnails,
33 full images, or metadata only.
34
35 Raises:
36 ValueError if dataset is too large or input has the wrong type
37 RuntimeError if the connection to the server failed.
38
39 """
40 no_tags_on_server = len(self._get_all_tags())
41 if no_tags_on_server > 0:
42 warnings.warn(f"Dataset with id {self.dataset_id} has already been completely uploaded to the platform. Skipping upload.")
43 return
44
45 # Check input variable 'input'
46 if isinstance(input, str):
47 dataset = LightlyDataset(input_dir=input)
48 elif isinstance(input, LightlyDataset):
49 dataset = input
50 else:
51 raise ValueError(f"input must either be a LightlyDataset or the path to the dataset as str, "
52 f"but is of type {type(input)}")
53
54 # check the allowed dataset size
55 max_dataset_size_str = self.quota_api.get_quota_maximum_dataset_size()
56 max_dataset_size = int(max_dataset_size_str)
57 if len(dataset) > max_dataset_size:
58 msg = f'Your dataset has {len(dataset)} samples which'
59 msg += f' is more than the allowed maximum of {max_dataset_size}'
60 raise ValueError(msg)
61
62 # handle the case where len(dataset) < max_workers
63 max_workers = min(len(dataset), max_workers)
64
65 # upload the samples
66 if verbose:
67 print(f'Uploading images (with {max_workers} workers).', flush=True)
68
69 pbar = tqdm.tqdm(unit='imgs', total=len(dataset))
70 tqdm_lock = tqdm.tqdm.get_lock()
71
72 # define lambda function for concurrent upload
73 def lambda_(i):
74 # load image
75 image, label, filename = dataset[i]
76 # try to upload image
77 try:
78 self._upload_single_image(
79 image=image,
80 label=label,
81 filename=filename,
82 mode=mode,
83 )
84 success = True
85 except Exception as e:
86 warnings.warn(f"Upload of image {filename} failed with error {e}")
87 success = False
88
89 # update the progress bar
90 tqdm_lock.acquire() # lock
91 pbar.update(1) # update
92 tqdm_lock.release() # unlock
93 # return whether the upload was successful
94 return success
95
96 with ThreadPoolExecutor(max_workers=max_workers) as executor:
97 results = list(executor.map(
98 lambda_, [i for i in range(len(dataset))], chunksize=1))
99
100 if not all(results):
101 msg = 'Warning: Unsuccessful upload(s)! '
102 msg += 'This could cause problems when uploading embeddings.'
103 msg += 'Failed at image: {}'.format(results.index(False))
104 warnings.warn(msg)
105
106 # set image type of data and create initial tag
107 if mode == 'full':
108 img_type = 'full'
109 elif mode == 'thumbnails':
110 img_type = 'thumbnail'
111 else:
112 img_type = 'meta'
113
114 initial_tag_create_request = InitialTagCreateRequest(img_type=img_type, creator=TagCreator.USER_PIP)
115 self.tags_api.create_initial_tag_by_dataset_id(body=initial_tag_create_request, dataset_id=self.dataset_id)
116
117 def _upload_single_image(self, image, label, filename: str, mode):
118 """Uploads a single image to the Lightly platform.
119
120 """
121
122 # check whether the filename is too long
123 basename = filename
124 if not check_filename(basename):
125 msg = (f'Filename {basename} is longer than the allowed maximum of '
126 'characters and will be skipped.')
127 warnings.warn(msg)
128 return False
129
130 # calculate metadata, and check if corrupted
131 metadata = check_image(image)
132
133 # generate thumbnail if necessary
134 thumbname = None
135 if not metadata['is_corrupted'] and mode in ["thumbnails", "full"]:
136 thumbname = '.'.join(basename.split('.')[:-1]) + '_thumb.webp'
137
138 body = SampleCreateRequest(file_name=basename, thumb_name=thumbname, meta_data=metadata)
139 sample_id = retry(
140 self.samples_api.create_sample_by_dataset_id,
141 body=body,
142 dataset_id=self.dataset_id
143 ).id
144
145 if not metadata['is_corrupted'] and mode in ["thumbnails", "full"]:
146 thumbnail = get_thumbnail_from_img(image)
147 image_to_upload = PIL_to_bytes(thumbnail, ext='webp', quality=90)
148
149 signed_url = retry(
150 self.samples_api.get_sample_image_write_url_by_id,
151 dataset_id=self.dataset_id,
152 sample_id=sample_id,
153 is_thumbnail=True
154 )
155
156 # try to upload thumbnail
157 retry(
158 self.upload_file_with_signed_url,
159 image_to_upload,
160 signed_url
161 )
162
163 thumbnail.close()
164
165 if not metadata['is_corrupted'] and mode == "full":
166 image_to_upload = PIL_to_bytes(image)
167
168 signed_url = retry(
169 self.samples_api.get_sample_image_write_url_by_id,
170 dataset_id=self.dataset_id,
171 sample_id=sample_id,
172 is_thumbnail=False
173 )
174
175 # try to upload thumbnail
176 retry(
177 self.upload_file_with_signed_url,
178 image_to_upload,
179 signed_url
180 )
181
182 image.close()
183
184
185
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/lightly/api/api_workflow_upload_dataset.py b/lightly/api/api_workflow_upload_dataset.py
--- a/lightly/api/api_workflow_upload_dataset.py
+++ b/lightly/api/api_workflow_upload_dataset.py
@@ -61,6 +61,7 @@
# handle the case where len(dataset) < max_workers
max_workers = min(len(dataset), max_workers)
+ max_workers = max(max_workers, 1)
# upload the samples
if verbose:
diff --git a/lightly/cli/upload_cli.py b/lightly/cli/upload_cli.py
--- a/lightly/cli/upload_cli.py
+++ b/lightly/cli/upload_cli.py
@@ -52,7 +52,7 @@
mode = cfg['upload']
dataset = LightlyDataset(input_dir=input_dir, transform=transform)
api_workflow_client.upload_dataset(
- input=dataset, mode=mode
+ input=dataset, mode=mode, max_workers=cfg['loader']['num_workers']
)
if path_to_embeddings:
|
{"golden_diff": "diff --git a/lightly/api/api_workflow_upload_dataset.py b/lightly/api/api_workflow_upload_dataset.py\n--- a/lightly/api/api_workflow_upload_dataset.py\n+++ b/lightly/api/api_workflow_upload_dataset.py\n@@ -61,6 +61,7 @@\n \n # handle the case where len(dataset) < max_workers\n max_workers = min(len(dataset), max_workers)\n+ max_workers = max(max_workers, 1)\n \n # upload the samples\n if verbose:\ndiff --git a/lightly/cli/upload_cli.py b/lightly/cli/upload_cli.py\n--- a/lightly/cli/upload_cli.py\n+++ b/lightly/cli/upload_cli.py\n@@ -52,7 +52,7 @@\n mode = cfg['upload']\n dataset = LightlyDataset(input_dir=input_dir, transform=transform)\n api_workflow_client.upload_dataset(\n- input=dataset, mode=mode\n+ input=dataset, mode=mode, max_workers=cfg['loader']['num_workers']\n )\n \n if path_to_embeddings:\n", "issue": "loader.num_workers=12 doesn't actually spawn multiple workers\nI've noticed that when trying to create an embedding with the param `loader.num_workers` there seems to be no difference in CPU consumption;\r\n`lightly-embed input_dir='./75000' trainer.max_epochs=1 loader.num_workers=12`\r\n\r\nThis could be that my images are ~8kb and there is some other bottleneck preventing the CPUs to max-out? in My ActivityMonitor (macOSX) I see that I have 80% idle CPU\r\n\r\nSystem: MacBook Pro (16-inch, 2019), 2.6 GHz 6-Core Intel Core i7\r\nOS: MacOSX 11.2.3\r\n\r\n\r\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n\"\"\"**Lightly Upload:** Upload images to the Lightly platform.\n\nThis module contains the entrypoint for the **lightly-upload**\ncommand-line interface.\n\"\"\"\n\n# Copyright (c) 2020. Lightly AG and its affiliates.\n# All Rights Reserved\n\nimport hydra\n\nimport torchvision\n\nfrom lightly.cli._helpers import fix_input_path\n\nfrom lightly.api.utils import getenv\nfrom lightly.api.api_workflow_client import ApiWorkflowClient\nfrom lightly.data import LightlyDataset\n\n\ndef _upload_cli(cfg, is_cli_call=True):\n\n input_dir = cfg['input_dir']\n if input_dir and is_cli_call:\n input_dir = fix_input_path(input_dir)\n\n path_to_embeddings = cfg['embeddings']\n if path_to_embeddings and is_cli_call:\n path_to_embeddings = fix_input_path(path_to_embeddings)\n\n dataset_id = cfg['dataset_id']\n token = cfg['token']\n\n size = cfg['resize']\n if not isinstance(size, int):\n size = tuple(size)\n transform = None\n if isinstance(size, tuple) or size > 0:\n transform = torchvision.transforms.Resize(size)\n\n if not token or not dataset_id:\n print('Please specify your access token and dataset id.')\n print('For help, try: lightly-upload --help')\n return\n\n api_workflow_client = ApiWorkflowClient(\n token=token, dataset_id=dataset_id\n )\n\n if input_dir:\n mode = cfg['upload']\n dataset = LightlyDataset(input_dir=input_dir, transform=transform)\n api_workflow_client.upload_dataset(\n input=dataset, mode=mode\n )\n\n if path_to_embeddings:\n name = cfg['embedding_name']\n api_workflow_client.upload_embeddings(\n path_to_embeddings_csv=path_to_embeddings, name=name\n )\n\n\n@hydra.main(config_path='config', config_name='config')\ndef upload_cli(cfg):\n \"\"\"Upload images/embeddings from the command-line to the Lightly platform.\n\n Args:\n cfg:\n The default configs are loaded from the config file.\n To overwrite them please see the section on the config file \n (.config.config.yaml).\n \n Command-Line Args:\n input_dir:\n Path to the input directory where images are stored.\n embeddings:\n Path to the csv file storing the embeddings generated by\n lightly.\n token:\n User access token to the Lightly platform. If dataset_id\n and token are specified, the images and embeddings are \n uploaded to the platform.\n dataset_id:\n Identifier of the dataset on the Lightly platform. If \n dataset_id and token are specified, the images and \n embeddings are uploaded to the platform.\n upload:\n String to determine whether to upload the full images, \n thumbnails only, or metadata only.\n\n Must be one of ['full', 'thumbnails', 'metadata']\n embedding_name:\n Assign the embedding a name in order to identify it on the \n Lightly platform.\n resize:\n Desired size of the uploaded images. If negative, default size is used.\n If size is a sequence like (h, w), output size will be matched to \n this. If size is an int, smaller edge of the image will be matched \n to this number. i.e, if height > width, then image will be rescaled\n to (size * height / width, size).\n\n Examples:\n >>> #\u00a0upload thumbnails to the Lightly platform\n >>> lightly-upload input_dir=data/ token='123' dataset_id='XYZ'\n >>> \n >>> # upload full images to the Lightly platform\n >>> lightly-upload input_dir=data/ token='123' dataset_id='XYZ' upload='full'\n >>>\n >>> # upload metadata to the Lightly platform\n >>> lightly-upload input_dir=data/ token='123' dataset_id='XYZ' upload='metadata'\n >>>\n >>> # upload embeddings to the Lightly platform (must have uploaded images beforehand)\n >>> lightly-upload embeddings=embeddings.csv token='123' dataset_id='XYZ'\n >>>\n >>> # upload both, images and embeddings in a single command\n >>> lightly-upload input_dir=data/ embeddings=embeddings.csv upload='full' \\\\\n >>> token='123' dataset_id='XYZ'\n\n \"\"\"\n _upload_cli(cfg)\n\n\ndef entry():\n upload_cli()\n", "path": "lightly/cli/upload_cli.py"}, {"content": "import warnings\nfrom concurrent.futures.thread import ThreadPoolExecutor\nfrom typing import Union\nimport tqdm\n\nfrom lightly.openapi_generated.swagger_client import TagCreator\nfrom lightly.openapi_generated.swagger_client.models.sample_create_request import SampleCreateRequest\nfrom lightly.api.utils import check_filename, check_image, get_thumbnail_from_img, PIL_to_bytes\nfrom lightly.openapi_generated.swagger_client.models.initial_tag_create_request import InitialTagCreateRequest\nfrom lightly.data.dataset import LightlyDataset\n\nfrom lightly.api.utils import retry\n\n\nclass _UploadDatasetMixin:\n\n def upload_dataset(self, input: Union[str, LightlyDataset], max_workers: int = 8,\n mode: str = 'thumbnails', verbose: bool = True):\n \"\"\"Uploads a dataset to to the Lightly cloud solution.\n\n Args:\n input:\n one of the following:\n - the path to the dataset, e.g. \"path/to/dataset\"\n - the dataset in form of a LightlyDataset\n max_workers:\n Maximum number of workers uploading images in parallel.\n max_requests:\n Maximum number of requests a single worker can do before he has\n to wait for the others.\n mode:\n One of [full, thumbnails, metadata]. Whether to upload thumbnails,\n full images, or metadata only.\n\n Raises:\n ValueError if dataset is too large or input has the wrong type\n RuntimeError if the connection to the server failed.\n\n \"\"\"\n no_tags_on_server = len(self._get_all_tags())\n if no_tags_on_server > 0:\n warnings.warn(f\"Dataset with id {self.dataset_id} has already been completely uploaded to the platform. Skipping upload.\")\n return\n\n # Check input variable 'input'\n if isinstance(input, str):\n dataset = LightlyDataset(input_dir=input)\n elif isinstance(input, LightlyDataset):\n dataset = input\n else:\n raise ValueError(f\"input must either be a LightlyDataset or the path to the dataset as str, \"\n f\"but is of type {type(input)}\")\n\n # check the allowed dataset size\n max_dataset_size_str = self.quota_api.get_quota_maximum_dataset_size()\n max_dataset_size = int(max_dataset_size_str)\n if len(dataset) > max_dataset_size:\n msg = f'Your dataset has {len(dataset)} samples which'\n msg += f' is more than the allowed maximum of {max_dataset_size}'\n raise ValueError(msg)\n\n # handle the case where len(dataset) < max_workers\n max_workers = min(len(dataset), max_workers)\n\n # upload the samples\n if verbose:\n print(f'Uploading images (with {max_workers} workers).', flush=True)\n\n pbar = tqdm.tqdm(unit='imgs', total=len(dataset))\n tqdm_lock = tqdm.tqdm.get_lock()\n\n # define lambda function for concurrent upload\n def lambda_(i):\n # load image\n image, label, filename = dataset[i]\n # try to upload image\n try:\n self._upload_single_image(\n image=image,\n label=label,\n filename=filename,\n mode=mode,\n )\n success = True\n except Exception as e:\n warnings.warn(f\"Upload of image {filename} failed with error {e}\")\n success = False\n\n # update the progress bar\n tqdm_lock.acquire() # lock\n pbar.update(1) # update\n tqdm_lock.release() # unlock\n # return whether the upload was successful\n return success\n\n with ThreadPoolExecutor(max_workers=max_workers) as executor:\n results = list(executor.map(\n lambda_, [i for i in range(len(dataset))], chunksize=1))\n\n if not all(results):\n msg = 'Warning: Unsuccessful upload(s)! '\n msg += 'This could cause problems when uploading embeddings.'\n msg += 'Failed at image: {}'.format(results.index(False))\n warnings.warn(msg)\n\n # set image type of data and create initial tag\n if mode == 'full':\n img_type = 'full'\n elif mode == 'thumbnails':\n img_type = 'thumbnail'\n else:\n img_type = 'meta'\n\n initial_tag_create_request = InitialTagCreateRequest(img_type=img_type, creator=TagCreator.USER_PIP)\n self.tags_api.create_initial_tag_by_dataset_id(body=initial_tag_create_request, dataset_id=self.dataset_id)\n\n def _upload_single_image(self, image, label, filename: str, mode):\n \"\"\"Uploads a single image to the Lightly platform.\n\n \"\"\"\n\n # check whether the filename is too long\n basename = filename\n if not check_filename(basename):\n msg = (f'Filename {basename} is longer than the allowed maximum of '\n 'characters and will be skipped.')\n warnings.warn(msg)\n return False\n\n # calculate metadata, and check if corrupted\n metadata = check_image(image)\n\n # generate thumbnail if necessary\n thumbname = None\n if not metadata['is_corrupted'] and mode in [\"thumbnails\", \"full\"]:\n thumbname = '.'.join(basename.split('.')[:-1]) + '_thumb.webp'\n\n body = SampleCreateRequest(file_name=basename, thumb_name=thumbname, meta_data=metadata)\n sample_id = retry(\n self.samples_api.create_sample_by_dataset_id,\n body=body,\n dataset_id=self.dataset_id\n ).id\n\n if not metadata['is_corrupted'] and mode in [\"thumbnails\", \"full\"]:\n thumbnail = get_thumbnail_from_img(image)\n image_to_upload = PIL_to_bytes(thumbnail, ext='webp', quality=90)\n\n signed_url = retry(\n self.samples_api.get_sample_image_write_url_by_id,\n dataset_id=self.dataset_id,\n sample_id=sample_id,\n is_thumbnail=True\n )\n\n # try to upload thumbnail\n retry(\n self.upload_file_with_signed_url,\n image_to_upload,\n signed_url\n )\n\n thumbnail.close()\n\n if not metadata['is_corrupted'] and mode == \"full\":\n image_to_upload = PIL_to_bytes(image)\n\n signed_url = retry(\n self.samples_api.get_sample_image_write_url_by_id,\n dataset_id=self.dataset_id,\n sample_id=sample_id,\n is_thumbnail=False\n )\n\n # try to upload thumbnail\n retry(\n self.upload_file_with_signed_url,\n image_to_upload,\n signed_url\n )\n\n image.close()\n\n\n", "path": "lightly/api/api_workflow_upload_dataset.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n\"\"\"**Lightly Upload:** Upload images to the Lightly platform.\n\nThis module contains the entrypoint for the **lightly-upload**\ncommand-line interface.\n\"\"\"\n\n# Copyright (c) 2020. Lightly AG and its affiliates.\n# All Rights Reserved\n\nimport hydra\n\nimport torchvision\n\nfrom lightly.cli._helpers import fix_input_path\n\nfrom lightly.api.utils import getenv\nfrom lightly.api.api_workflow_client import ApiWorkflowClient\nfrom lightly.data import LightlyDataset\n\n\ndef _upload_cli(cfg, is_cli_call=True):\n\n input_dir = cfg['input_dir']\n if input_dir and is_cli_call:\n input_dir = fix_input_path(input_dir)\n\n path_to_embeddings = cfg['embeddings']\n if path_to_embeddings and is_cli_call:\n path_to_embeddings = fix_input_path(path_to_embeddings)\n\n dataset_id = cfg['dataset_id']\n token = cfg['token']\n\n size = cfg['resize']\n if not isinstance(size, int):\n size = tuple(size)\n transform = None\n if isinstance(size, tuple) or size > 0:\n transform = torchvision.transforms.Resize(size)\n\n if not token or not dataset_id:\n print('Please specify your access token and dataset id.')\n print('For help, try: lightly-upload --help')\n return\n\n api_workflow_client = ApiWorkflowClient(\n token=token, dataset_id=dataset_id\n )\n\n if input_dir:\n mode = cfg['upload']\n dataset = LightlyDataset(input_dir=input_dir, transform=transform)\n api_workflow_client.upload_dataset(\n input=dataset, mode=mode, max_workers=cfg['loader']['num_workers']\n )\n\n if path_to_embeddings:\n name = cfg['embedding_name']\n api_workflow_client.upload_embeddings(\n path_to_embeddings_csv=path_to_embeddings, name=name\n )\n\n\n@hydra.main(config_path='config', config_name='config')\ndef upload_cli(cfg):\n \"\"\"Upload images/embeddings from the command-line to the Lightly platform.\n\n Args:\n cfg:\n The default configs are loaded from the config file.\n To overwrite them please see the section on the config file \n (.config.config.yaml).\n \n Command-Line Args:\n input_dir:\n Path to the input directory where images are stored.\n embeddings:\n Path to the csv file storing the embeddings generated by\n lightly.\n token:\n User access token to the Lightly platform. If dataset_id\n and token are specified, the images and embeddings are \n uploaded to the platform.\n dataset_id:\n Identifier of the dataset on the Lightly platform. If \n dataset_id and token are specified, the images and \n embeddings are uploaded to the platform.\n upload:\n String to determine whether to upload the full images, \n thumbnails only, or metadata only.\n\n Must be one of ['full', 'thumbnails', 'metadata']\n embedding_name:\n Assign the embedding a name in order to identify it on the \n Lightly platform.\n resize:\n Desired size of the uploaded images. If negative, default size is used.\n If size is a sequence like (h, w), output size will be matched to \n this. If size is an int, smaller edge of the image will be matched \n to this number. i.e, if height > width, then image will be rescaled\n to (size * height / width, size).\n\n Examples:\n >>> #\u00a0upload thumbnails to the Lightly platform\n >>> lightly-upload input_dir=data/ token='123' dataset_id='XYZ'\n >>> \n >>> # upload full images to the Lightly platform\n >>> lightly-upload input_dir=data/ token='123' dataset_id='XYZ' upload='full'\n >>>\n >>> # upload metadata to the Lightly platform\n >>> lightly-upload input_dir=data/ token='123' dataset_id='XYZ' upload='metadata'\n >>>\n >>> # upload embeddings to the Lightly platform (must have uploaded images beforehand)\n >>> lightly-upload embeddings=embeddings.csv token='123' dataset_id='XYZ'\n >>>\n >>> # upload both, images and embeddings in a single command\n >>> lightly-upload input_dir=data/ embeddings=embeddings.csv upload='full' \\\\\n >>> token='123' dataset_id='XYZ'\n\n \"\"\"\n _upload_cli(cfg)\n\n\ndef entry():\n upload_cli()\n", "path": "lightly/cli/upload_cli.py"}, {"content": "import warnings\nfrom concurrent.futures.thread import ThreadPoolExecutor\nfrom typing import Union\nimport tqdm\n\nfrom lightly.openapi_generated.swagger_client import TagCreator\nfrom lightly.openapi_generated.swagger_client.models.sample_create_request import SampleCreateRequest\nfrom lightly.api.utils import check_filename, check_image, get_thumbnail_from_img, PIL_to_bytes\nfrom lightly.openapi_generated.swagger_client.models.initial_tag_create_request import InitialTagCreateRequest\nfrom lightly.data.dataset import LightlyDataset\n\nfrom lightly.api.utils import retry\n\n\nclass _UploadDatasetMixin:\n\n def upload_dataset(self, input: Union[str, LightlyDataset], max_workers: int = 8,\n mode: str = 'thumbnails', verbose: bool = True):\n \"\"\"Uploads a dataset to to the Lightly cloud solution.\n\n Args:\n input:\n one of the following:\n - the path to the dataset, e.g. \"path/to/dataset\"\n - the dataset in form of a LightlyDataset\n max_workers:\n Maximum number of workers uploading images in parallel.\n max_requests:\n Maximum number of requests a single worker can do before he has\n to wait for the others.\n mode:\n One of [full, thumbnails, metadata]. Whether to upload thumbnails,\n full images, or metadata only.\n\n Raises:\n ValueError if dataset is too large or input has the wrong type\n RuntimeError if the connection to the server failed.\n\n \"\"\"\n no_tags_on_server = len(self._get_all_tags())\n if no_tags_on_server > 0:\n warnings.warn(f\"Dataset with id {self.dataset_id} has already been completely uploaded to the platform. Skipping upload.\")\n return\n\n # Check input variable 'input'\n if isinstance(input, str):\n dataset = LightlyDataset(input_dir=input)\n elif isinstance(input, LightlyDataset):\n dataset = input\n else:\n raise ValueError(f\"input must either be a LightlyDataset or the path to the dataset as str, \"\n f\"but is of type {type(input)}\")\n\n # check the allowed dataset size\n max_dataset_size_str = self.quota_api.get_quota_maximum_dataset_size()\n max_dataset_size = int(max_dataset_size_str)\n if len(dataset) > max_dataset_size:\n msg = f'Your dataset has {len(dataset)} samples which'\n msg += f' is more than the allowed maximum of {max_dataset_size}'\n raise ValueError(msg)\n\n # handle the case where len(dataset) < max_workers\n max_workers = min(len(dataset), max_workers)\n max_workers = max(max_workers, 1)\n\n # upload the samples\n if verbose:\n print(f'Uploading images (with {max_workers} workers).', flush=True)\n\n pbar = tqdm.tqdm(unit='imgs', total=len(dataset))\n tqdm_lock = tqdm.tqdm.get_lock()\n\n # define lambda function for concurrent upload\n def lambda_(i):\n # load image\n image, label, filename = dataset[i]\n # try to upload image\n try:\n self._upload_single_image(\n image=image,\n label=label,\n filename=filename,\n mode=mode,\n )\n success = True\n except Exception as e:\n warnings.warn(f\"Upload of image {filename} failed with error {e}\")\n success = False\n\n # update the progress bar\n tqdm_lock.acquire() # lock\n pbar.update(1) # update\n tqdm_lock.release() # unlock\n # return whether the upload was successful\n return success\n\n with ThreadPoolExecutor(max_workers=max_workers) as executor:\n results = list(executor.map(\n lambda_, [i for i in range(len(dataset))], chunksize=1))\n\n if not all(results):\n msg = 'Warning: Unsuccessful upload(s)! '\n msg += 'This could cause problems when uploading embeddings.'\n msg += 'Failed at image: {}'.format(results.index(False))\n warnings.warn(msg)\n\n # set image type of data and create initial tag\n if mode == 'full':\n img_type = 'full'\n elif mode == 'thumbnails':\n img_type = 'thumbnail'\n else:\n img_type = 'meta'\n\n initial_tag_create_request = InitialTagCreateRequest(img_type=img_type, creator=TagCreator.USER_PIP)\n self.tags_api.create_initial_tag_by_dataset_id(body=initial_tag_create_request, dataset_id=self.dataset_id)\n\n def _upload_single_image(self, image, label, filename: str, mode):\n \"\"\"Uploads a single image to the Lightly platform.\n\n \"\"\"\n\n # check whether the filename is too long\n basename = filename\n if not check_filename(basename):\n msg = (f'Filename {basename} is longer than the allowed maximum of '\n 'characters and will be skipped.')\n warnings.warn(msg)\n return False\n\n # calculate metadata, and check if corrupted\n metadata = check_image(image)\n\n # generate thumbnail if necessary\n thumbname = None\n if not metadata['is_corrupted'] and mode in [\"thumbnails\", \"full\"]:\n thumbname = '.'.join(basename.split('.')[:-1]) + '_thumb.webp'\n\n body = SampleCreateRequest(file_name=basename, thumb_name=thumbname, meta_data=metadata)\n sample_id = retry(\n self.samples_api.create_sample_by_dataset_id,\n body=body,\n dataset_id=self.dataset_id\n ).id\n\n if not metadata['is_corrupted'] and mode in [\"thumbnails\", \"full\"]:\n thumbnail = get_thumbnail_from_img(image)\n image_to_upload = PIL_to_bytes(thumbnail, ext='webp', quality=90)\n\n signed_url = retry(\n self.samples_api.get_sample_image_write_url_by_id,\n dataset_id=self.dataset_id,\n sample_id=sample_id,\n is_thumbnail=True\n )\n\n # try to upload thumbnail\n retry(\n self.upload_file_with_signed_url,\n image_to_upload,\n signed_url\n )\n\n thumbnail.close()\n\n if not metadata['is_corrupted'] and mode == \"full\":\n image_to_upload = PIL_to_bytes(image)\n\n signed_url = retry(\n self.samples_api.get_sample_image_write_url_by_id,\n dataset_id=self.dataset_id,\n sample_id=sample_id,\n is_thumbnail=False\n )\n\n # try to upload thumbnail\n retry(\n self.upload_file_with_signed_url,\n image_to_upload,\n signed_url\n )\n\n image.close()\n\n\n", "path": "lightly/api/api_workflow_upload_dataset.py"}]}
| 3,471 | 220 |
gh_patches_debug_32135
|
rasdani/github-patches
|
git_diff
|
Azure__azure-cli-extensions-1830
|
We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
az monitor app-insights query Execute a simple query over past 3.5 days.
## Describe the bug
Trying to query Application Insights via the example provided via the [az monitor app-insights query](https://docs.microsoft.com/en-us/cli/azure/ext/application-insights/monitor/app-insights?view=azure-cli-latest#ext-application-insights-az-monitor-app-insights-query) docs.
**Command Name**
`az monitor app-insights query
Extension Name: application-insights. Version: 0.1.1.`
**Errors:**
```
unsupported operand type(s) for -: 'datetime.datetime' and 'str'
Traceback (most recent call last):
python3.6/site-packages/knack/cli.py, ln 206, in invoke
cmd_result = self.invocation.execute(args)
cli/core/commands/__init__.py, ln 603, in execute
raise ex
cli/core/commands/__init__.py, ln 661, in _run_jobs_serially
results.append(self._run_job(expanded_arg, cmd_copy))
...
cliextensions/application-insights/azext_applicationinsights/util.py, ln 40, in get_timespan
start_time = (dateutil.parser.parse(end_time) - offset).isoformat()
TypeError: unsupported operand type(s) for -: 'datetime.datetime' and 'str'
```
## To Reproduce:
Steps to reproduce the behavior. Note that argument values have been redacted, as they may contain sensitive information.
- _Put any pre-requisite steps here..._
- `az monitor app-insights query --app <Your-GUID> --analytics-query 'requests | summarize count() by bin(timestamp, 1h)' --offset P3DT12H`
## Expected Behavior
Execute a simple query over the past 3.5 days.
## Environment Summary
```
Linux-4.4.0-18362-Microsoft-x86_64-with-debian-buster-sid
Python 3.6.5
Shell: bash
azure-cli 2.0.75
Extensions:
application-insights 0.1.1
log-analytics 0.1.3
```
## Additional Context
<!--Please don't remove this:-->
<!--auto-generated-->
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/application-insights/azext_applicationinsights/custom.py`
Content:
```
1 # --------------------------------------------------------------------------------------------
2 # Copyright (c) Microsoft Corporation. All rights reserved.
3 # Licensed under the MIT License. See License.txt in the project root for license information.
4 # --------------------------------------------------------------------------------------------
5
6 # pylint: disable=line-too-long, protected-access
7
8 from knack.util import CLIError
9 from knack.log import get_logger
10 from azext_applicationinsights.vendored_sdks.applicationinsights.models import ErrorResponseException
11 from msrestazure.azure_exceptions import CloudError
12 from .util import get_id_from_azure_resource, get_query_targets, get_timespan, get_linked_properties
13
14 logger = get_logger(__name__)
15 HELP_MESSAGE = " Please use `az feature register --name AIWorkspacePreview --namespace microsoft.insights` to register the feature"
16
17
18 def execute_query(cmd, client, application, analytics_query, start_time=None, end_time=None, offset='1h', resource_group_name=None):
19 """Executes a query against the provided Application Insights application."""
20 from .vendored_sdks.applicationinsights.models import QueryBody
21 targets = get_query_targets(cmd.cli_ctx, application, resource_group_name)
22 try:
23 return client.query.execute(targets[0], QueryBody(query=analytics_query, timespan=get_timespan(cmd.cli_ctx, start_time, end_time, offset), applications=targets[1:]))
24 except ErrorResponseException as ex:
25 if "PathNotFoundError" in ex.message:
26 raise ValueError("The Application Insight is not found. Please check the app id again.")
27 raise ex
28
29
30 def get_events(cmd, client, application, event_type, event=None, start_time=None, end_time=None, offset='1h', resource_group_name=None):
31 timespan = get_timespan(cmd.cli_ctx, start_time, end_time, offset)
32 if event:
33 return client.events.get(get_id_from_azure_resource(cmd.cli_ctx, application, resource_group=resource_group_name), event_type, event, timespan=timespan)
34 return client.events.get_by_type(get_id_from_azure_resource(cmd.cli_ctx, application, resource_group=resource_group_name), event_type, timespan=get_timespan(cmd.cli_ctx, start_time, end_time, offset))
35
36
37 def get_metric(cmd, client, application, metric, start_time=None, end_time=None, offset='1h', interval=None, aggregation=None, segment=None, top=None, orderby=None, filter_arg=None, resource_group_name=None):
38 return client.metrics.get(get_id_from_azure_resource(cmd.cli_ctx, application, resource_group=resource_group_name), metric, timespan=get_timespan(cmd.cli_ctx, start_time, end_time, offset), interval=interval, aggregation=aggregation, segment=segment, top=top, orderby=orderby, filter_arg=filter_arg)
39
40
41 def get_metrics_metadata(cmd, client, application, resource_group_name=None):
42 return client.metrics.get_metadata(get_id_from_azure_resource(cmd.cli_ctx, application, resource_group=resource_group_name))
43
44
45 def create_or_update_component(cmd, client, application, resource_group_name, location, tags=None,
46 kind="web", application_type='web', workspace_resource_id=None,
47 public_network_access_for_ingestion=None, public_network_access_for_query=None):
48 # due to service limitation, we have to do such a hack. We must refract the logic later.
49 if workspace_resource_id is None:
50 from .vendored_sdks.mgmt_applicationinsights.v2018_05_01_preview.models import ApplicationInsightsComponent
51 component = ApplicationInsightsComponent(location=location, kind=kind, application_type=application_type, tags=tags,
52 public_network_access_for_ingestion=public_network_access_for_ingestion,
53 public_network_access_for_query=public_network_access_for_query)
54 return client.create_or_update(resource_group_name, application, component)
55
56 from .vendored_sdks.mgmt_applicationinsights.v2020_02_02_preview.models import ApplicationInsightsComponent
57 component = ApplicationInsightsComponent(location=location, kind=kind, application_type=application_type,
58 tags=tags, workspace_resource_id=workspace_resource_id,
59 public_network_access_for_ingestion=public_network_access_for_ingestion,
60 public_network_access_for_query=public_network_access_for_query)
61 from ._client_factory import applicationinsights_mgmt_plane_client
62 client = applicationinsights_mgmt_plane_client(cmd.cli_ctx, api_version='2020-02-02-preview').components
63 try:
64 return client.create_or_update(resource_group_name, application, component)
65 except CloudError as ex:
66 ex.error._message = ex.error._message + HELP_MESSAGE
67 raise ex
68
69
70 def update_component(cmd, client, application, resource_group_name, kind=None, workspace_resource_id=None,
71 public_network_access_for_ingestion=None, public_network_access_for_query=None):
72 from ._client_factory import applicationinsights_mgmt_plane_client
73 existing_component = None
74 if workspace_resource_id is not None:
75 latest_client = applicationinsights_mgmt_plane_client(cmd.cli_ctx, api_version='2020-02-02-preview').components
76 try:
77 existing_component = latest_client.get(resource_group_name, application)
78 except CloudError as ex:
79 ex.error._message = ex.error._message + HELP_MESSAGE
80 raise ex
81 existing_component.workspace_resource_id = workspace_resource_id or None
82 else:
83 existing_component = client.get(resource_group_name, application)
84 if kind:
85 existing_component.kind = kind
86 if public_network_access_for_ingestion is not None:
87 existing_component.public_network_access_for_ingestion = public_network_access_for_ingestion
88 if public_network_access_for_query is not None:
89 existing_component.public_network_access_for_query = public_network_access_for_query
90
91 if hasattr(existing_component, 'workspace_resource_id') and existing_component.workspace_resource_id is not None:
92 client = applicationinsights_mgmt_plane_client(cmd.cli_ctx, api_version='2020-02-02-preview').components
93 return client.create_or_update(resource_group_name, application, existing_component)
94
95 from .vendored_sdks.mgmt_applicationinsights.v2018_05_01_preview.models import ApplicationInsightsComponent
96 component = ApplicationInsightsComponent(**(vars(existing_component)))
97 return client.create_or_update(resource_group_name, application, component)
98
99
100 def update_component_tags(client, application, resource_group_name, tags):
101 return client.update_tags(resource_group_name, application, tags)
102
103
104 def get_component(client, application, resource_group_name):
105 return client.get(resource_group_name, application)
106
107
108 def show_components(cmd, client, application=None, resource_group_name=None):
109 from ._client_factory import applicationinsights_mgmt_plane_client
110 if application:
111 if resource_group_name:
112 latest_client = applicationinsights_mgmt_plane_client(cmd.cli_ctx,
113 api_version='2020-02-02-preview').components
114 try:
115 return latest_client.get(resource_group_name, application)
116 except CloudError:
117 logger.warning(HELP_MESSAGE)
118 return client.get(resource_group_name, application)
119 raise CLIError("Application provided without resource group. Either specify app with resource group, or remove app.")
120 if resource_group_name:
121 return client.list_by_resource_group(resource_group_name)
122 return client.list()
123
124
125 def delete_component(client, application, resource_group_name):
126 return client.delete(resource_group_name, application)
127
128
129 def create_api_key(cmd, client, application, resource_group_name, api_key, read_properties=None, write_properties=None):
130 from .vendored_sdks.mgmt_applicationinsights.models import APIKeyRequest
131 if read_properties is None:
132 read_properties = ['ReadTelemetry', 'AuthenticateSDKControlChannel']
133 if write_properties is None:
134 write_properties = ['WriteAnnotations']
135 linked_read_properties, linked_write_properties = get_linked_properties(cmd.cli_ctx, application, resource_group_name, read_properties, write_properties)
136 api_key_request = APIKeyRequest(name=api_key,
137 linked_read_properties=linked_read_properties,
138 linked_write_properties=linked_write_properties)
139 return client.create(resource_group_name, application, api_key_request)
140
141
142 def show_api_key(client, application, resource_group_name, api_key=None):
143 if api_key is None:
144 return client.list(resource_group_name, application)
145 result = list(filter(lambda result: result.name == api_key, client.list(resource_group_name, application)))
146 if len(result) == 1:
147 return result[0]
148 if len(result) > 1:
149 return result
150 return None
151
152
153 def delete_api_key(client, application, resource_group_name, api_key):
154 existing_key = list(filter(lambda result: result.name == api_key, client.list(resource_group_name, application)))
155 if existing_key != []:
156 return client.delete(resource_group_name, application, existing_key[0].id.split('/')[-1])
157 raise CLIError('--api-key provided but key not found for deletion.')
158
159
160 def show_component_billing(client, application, resource_group_name):
161 return client.get(resource_group_name=resource_group_name, resource_name=application)
162
163
164 def update_component_billing(client, application, resource_group_name, cap=None, stop_sending_notification_when_hitting_cap=None):
165 billing_features = client.get(resource_group_name=resource_group_name, resource_name=application)
166 if cap is not None:
167 billing_features.data_volume_cap.cap = cap
168 if stop_sending_notification_when_hitting_cap is not None:
169 billing_features.data_volume_cap.stop_send_notification_when_hit_cap = stop_sending_notification_when_hitting_cap
170 return client.update(resource_group_name=resource_group_name,
171 resource_name=application,
172 data_volume_cap=billing_features.data_volume_cap,
173 current_billing_features=billing_features.current_billing_features)
174
175
176 def get_component_linked_storage_account(client, resource_group_name, application):
177 return client.get(resource_group_name=resource_group_name, resource_name=application)
178
179
180 def create_component_linked_storage_account(client, resource_group_name, application, storage_account_id):
181 return client.create_and_update(resource_group_name=resource_group_name, resource_name=application, linked_storage_account=storage_account_id)
182
183
184 def update_component_linked_storage_account(client, resource_group_name, application, storage_account_id):
185 return client.update(resource_group_name=resource_group_name, resource_name=application, linked_storage_account=storage_account_id)
186
187
188 def delete_component_linked_storage_account(client, resource_group_name, application):
189 return client.delete(resource_group_name=resource_group_name, resource_name=application)
190
```
Path: `src/application-insights/setup.py`
Content:
```
1 #!/usr/bin/env python
2
3 # --------------------------------------------------------------------------------------------
4 # Copyright (c) Microsoft Corporation. All rights reserved.
5 # Licensed under the MIT License. See License.txt in the project root for license information.
6 # --------------------------------------------------------------------------------------------
7
8 from codecs import open
9 from setuptools import setup, find_packages
10
11 VERSION = "0.1.7"
12
13 CLASSIFIERS = [
14 'Development Status :: 4 - Beta',
15 'Intended Audience :: Developers',
16 'Intended Audience :: System Administrators',
17 'Programming Language :: Python',
18 'Programming Language :: Python :: 2',
19 'Programming Language :: Python :: 2.7',
20 'Programming Language :: Python :: 3',
21 'Programming Language :: Python :: 3.4',
22 'Programming Language :: Python :: 3.5',
23 'Programming Language :: Python :: 3.6',
24 'License :: OSI Approved :: MIT License',
25 ]
26
27 DEPENDENCIES = []
28
29 with open('README.rst', 'r', encoding='utf-8') as f:
30 README = f.read()
31 with open('HISTORY.rst', 'r', encoding='utf-8') as f:
32 HISTORY = f.read()
33
34 setup(
35 name='application-insights',
36 version=VERSION,
37 description='Support for managing Application Insights components and querying metrics, events, and logs from such components.',
38 long_description=README + '\n\n' + HISTORY,
39 license='MIT',
40 author='Ace Eldeib',
41 author_email='aleldeib@microsoft.com',
42 url='https://github.com/Azure/azure-cli-extensions/tree/master/src/application-insights',
43 classifiers=CLASSIFIERS,
44 packages=find_packages(exclude=["tests"]),
45 package_data={'azext_applicationinsights': ['azext_metadata.json']},
46 install_requires=DEPENDENCIES
47 )
48
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
|
diff --git a/src/application-insights/azext_applicationinsights/custom.py b/src/application-insights/azext_applicationinsights/custom.py
--- a/src/application-insights/azext_applicationinsights/custom.py
+++ b/src/application-insights/azext_applicationinsights/custom.py
@@ -5,6 +5,8 @@
# pylint: disable=line-too-long, protected-access
+import datetime
+import isodate
from knack.util import CLIError
from knack.log import get_logger
from azext_applicationinsights.vendored_sdks.applicationinsights.models import ErrorResponseException
@@ -19,6 +21,8 @@
"""Executes a query against the provided Application Insights application."""
from .vendored_sdks.applicationinsights.models import QueryBody
targets = get_query_targets(cmd.cli_ctx, application, resource_group_name)
+ if not isinstance(offset, datetime.timedelta):
+ offset = isodate.parse_duration(offset)
try:
return client.query.execute(targets[0], QueryBody(query=analytics_query, timespan=get_timespan(cmd.cli_ctx, start_time, end_time, offset), applications=targets[1:]))
except ErrorResponseException as ex:
diff --git a/src/application-insights/setup.py b/src/application-insights/setup.py
--- a/src/application-insights/setup.py
+++ b/src/application-insights/setup.py
@@ -8,7 +8,7 @@
from codecs import open
from setuptools import setup, find_packages
-VERSION = "0.1.7"
+VERSION = "0.1.8"
CLASSIFIERS = [
'Development Status :: 4 - Beta',
@@ -24,7 +24,9 @@
'License :: OSI Approved :: MIT License',
]
-DEPENDENCIES = []
+DEPENDENCIES = [
+ "isodate~=0.6.0"
+]
with open('README.rst', 'r', encoding='utf-8') as f:
README = f.read()
|
{"golden_diff": "diff --git a/src/application-insights/azext_applicationinsights/custom.py b/src/application-insights/azext_applicationinsights/custom.py\n--- a/src/application-insights/azext_applicationinsights/custom.py\n+++ b/src/application-insights/azext_applicationinsights/custom.py\n@@ -5,6 +5,8 @@\n \n # pylint: disable=line-too-long, protected-access\n \n+import datetime\n+import isodate\n from knack.util import CLIError\n from knack.log import get_logger\n from azext_applicationinsights.vendored_sdks.applicationinsights.models import ErrorResponseException\n@@ -19,6 +21,8 @@\n \"\"\"Executes a query against the provided Application Insights application.\"\"\"\n from .vendored_sdks.applicationinsights.models import QueryBody\n targets = get_query_targets(cmd.cli_ctx, application, resource_group_name)\n+ if not isinstance(offset, datetime.timedelta):\n+ offset = isodate.parse_duration(offset)\n try:\n return client.query.execute(targets[0], QueryBody(query=analytics_query, timespan=get_timespan(cmd.cli_ctx, start_time, end_time, offset), applications=targets[1:]))\n except ErrorResponseException as ex:\ndiff --git a/src/application-insights/setup.py b/src/application-insights/setup.py\n--- a/src/application-insights/setup.py\n+++ b/src/application-insights/setup.py\n@@ -8,7 +8,7 @@\n from codecs import open\n from setuptools import setup, find_packages\n \n-VERSION = \"0.1.7\"\n+VERSION = \"0.1.8\"\n \n CLASSIFIERS = [\n 'Development Status :: 4 - Beta',\n@@ -24,7 +24,9 @@\n 'License :: OSI Approved :: MIT License',\n ]\n \n-DEPENDENCIES = []\n+DEPENDENCIES = [\n+ \"isodate~=0.6.0\"\n+]\n \n with open('README.rst', 'r', encoding='utf-8') as f:\n README = f.read()\n", "issue": "az monitor app-insights query Execute a simple query over past 3.5 days.\n## Describe the bug\r\nTrying to query Application Insights via the example provided via the [az monitor app-insights query](https://docs.microsoft.com/en-us/cli/azure/ext/application-insights/monitor/app-insights?view=azure-cli-latest#ext-application-insights-az-monitor-app-insights-query) docs.\r\n\r\n**Command Name**\r\n`az monitor app-insights query\r\nExtension Name: application-insights. Version: 0.1.1.`\r\n\r\n**Errors:**\r\n```\r\nunsupported operand type(s) for -: 'datetime.datetime' and 'str'\r\nTraceback (most recent call last):\r\npython3.6/site-packages/knack/cli.py, ln 206, in invoke\r\n cmd_result = self.invocation.execute(args)\r\ncli/core/commands/__init__.py, ln 603, in execute\r\n raise ex\r\ncli/core/commands/__init__.py, ln 661, in _run_jobs_serially\r\n results.append(self._run_job(expanded_arg, cmd_copy))\r\n...\r\ncliextensions/application-insights/azext_applicationinsights/util.py, ln 40, in get_timespan\r\n start_time = (dateutil.parser.parse(end_time) - offset).isoformat()\r\nTypeError: unsupported operand type(s) for -: 'datetime.datetime' and 'str'\r\n```\r\n\r\n## To Reproduce:\r\nSteps to reproduce the behavior. Note that argument values have been redacted, as they may contain sensitive information.\r\n\r\n- _Put any pre-requisite steps here..._\r\n- `az monitor app-insights query --app <Your-GUID> --analytics-query 'requests | summarize count() by bin(timestamp, 1h)' --offset P3DT12H`\r\n\r\n## Expected Behavior\r\nExecute a simple query over the past 3.5 days.\r\n\r\n## Environment Summary\r\n```\r\nLinux-4.4.0-18362-Microsoft-x86_64-with-debian-buster-sid\r\nPython 3.6.5\r\nShell: bash\r\n\r\nazure-cli 2.0.75\r\n\r\nExtensions:\r\napplication-insights 0.1.1\r\nlog-analytics 0.1.3\r\n\r\n```\r\n## Additional Context\r\n\r\n<!--Please don't remove this:-->\r\n<!--auto-generated-->\r\n\n", "before_files": [{"content": "# --------------------------------------------------------------------------------------------\n# Copyright (c) Microsoft Corporation. All rights reserved.\n# Licensed under the MIT License. See License.txt in the project root for license information.\n# --------------------------------------------------------------------------------------------\n\n# pylint: disable=line-too-long, protected-access\n\nfrom knack.util import CLIError\nfrom knack.log import get_logger\nfrom azext_applicationinsights.vendored_sdks.applicationinsights.models import ErrorResponseException\nfrom msrestazure.azure_exceptions import CloudError\nfrom .util import get_id_from_azure_resource, get_query_targets, get_timespan, get_linked_properties\n\nlogger = get_logger(__name__)\nHELP_MESSAGE = \" Please use `az feature register --name AIWorkspacePreview --namespace microsoft.insights` to register the feature\"\n\n\ndef execute_query(cmd, client, application, analytics_query, start_time=None, end_time=None, offset='1h', resource_group_name=None):\n \"\"\"Executes a query against the provided Application Insights application.\"\"\"\n from .vendored_sdks.applicationinsights.models import QueryBody\n targets = get_query_targets(cmd.cli_ctx, application, resource_group_name)\n try:\n return client.query.execute(targets[0], QueryBody(query=analytics_query, timespan=get_timespan(cmd.cli_ctx, start_time, end_time, offset), applications=targets[1:]))\n except ErrorResponseException as ex:\n if \"PathNotFoundError\" in ex.message:\n raise ValueError(\"The Application Insight is not found. Please check the app id again.\")\n raise ex\n\n\ndef get_events(cmd, client, application, event_type, event=None, start_time=None, end_time=None, offset='1h', resource_group_name=None):\n timespan = get_timespan(cmd.cli_ctx, start_time, end_time, offset)\n if event:\n return client.events.get(get_id_from_azure_resource(cmd.cli_ctx, application, resource_group=resource_group_name), event_type, event, timespan=timespan)\n return client.events.get_by_type(get_id_from_azure_resource(cmd.cli_ctx, application, resource_group=resource_group_name), event_type, timespan=get_timespan(cmd.cli_ctx, start_time, end_time, offset))\n\n\ndef get_metric(cmd, client, application, metric, start_time=None, end_time=None, offset='1h', interval=None, aggregation=None, segment=None, top=None, orderby=None, filter_arg=None, resource_group_name=None):\n return client.metrics.get(get_id_from_azure_resource(cmd.cli_ctx, application, resource_group=resource_group_name), metric, timespan=get_timespan(cmd.cli_ctx, start_time, end_time, offset), interval=interval, aggregation=aggregation, segment=segment, top=top, orderby=orderby, filter_arg=filter_arg)\n\n\ndef get_metrics_metadata(cmd, client, application, resource_group_name=None):\n return client.metrics.get_metadata(get_id_from_azure_resource(cmd.cli_ctx, application, resource_group=resource_group_name))\n\n\ndef create_or_update_component(cmd, client, application, resource_group_name, location, tags=None,\n kind=\"web\", application_type='web', workspace_resource_id=None,\n public_network_access_for_ingestion=None, public_network_access_for_query=None):\n # due to service limitation, we have to do such a hack. We must refract the logic later.\n if workspace_resource_id is None:\n from .vendored_sdks.mgmt_applicationinsights.v2018_05_01_preview.models import ApplicationInsightsComponent\n component = ApplicationInsightsComponent(location=location, kind=kind, application_type=application_type, tags=tags,\n public_network_access_for_ingestion=public_network_access_for_ingestion,\n public_network_access_for_query=public_network_access_for_query)\n return client.create_or_update(resource_group_name, application, component)\n\n from .vendored_sdks.mgmt_applicationinsights.v2020_02_02_preview.models import ApplicationInsightsComponent\n component = ApplicationInsightsComponent(location=location, kind=kind, application_type=application_type,\n tags=tags, workspace_resource_id=workspace_resource_id,\n public_network_access_for_ingestion=public_network_access_for_ingestion,\n public_network_access_for_query=public_network_access_for_query)\n from ._client_factory import applicationinsights_mgmt_plane_client\n client = applicationinsights_mgmt_plane_client(cmd.cli_ctx, api_version='2020-02-02-preview').components\n try:\n return client.create_or_update(resource_group_name, application, component)\n except CloudError as ex:\n ex.error._message = ex.error._message + HELP_MESSAGE\n raise ex\n\n\ndef update_component(cmd, client, application, resource_group_name, kind=None, workspace_resource_id=None,\n public_network_access_for_ingestion=None, public_network_access_for_query=None):\n from ._client_factory import applicationinsights_mgmt_plane_client\n existing_component = None\n if workspace_resource_id is not None:\n latest_client = applicationinsights_mgmt_plane_client(cmd.cli_ctx, api_version='2020-02-02-preview').components\n try:\n existing_component = latest_client.get(resource_group_name, application)\n except CloudError as ex:\n ex.error._message = ex.error._message + HELP_MESSAGE\n raise ex\n existing_component.workspace_resource_id = workspace_resource_id or None\n else:\n existing_component = client.get(resource_group_name, application)\n if kind:\n existing_component.kind = kind\n if public_network_access_for_ingestion is not None:\n existing_component.public_network_access_for_ingestion = public_network_access_for_ingestion\n if public_network_access_for_query is not None:\n existing_component.public_network_access_for_query = public_network_access_for_query\n\n if hasattr(existing_component, 'workspace_resource_id') and existing_component.workspace_resource_id is not None:\n client = applicationinsights_mgmt_plane_client(cmd.cli_ctx, api_version='2020-02-02-preview').components\n return client.create_or_update(resource_group_name, application, existing_component)\n\n from .vendored_sdks.mgmt_applicationinsights.v2018_05_01_preview.models import ApplicationInsightsComponent\n component = ApplicationInsightsComponent(**(vars(existing_component)))\n return client.create_or_update(resource_group_name, application, component)\n\n\ndef update_component_tags(client, application, resource_group_name, tags):\n return client.update_tags(resource_group_name, application, tags)\n\n\ndef get_component(client, application, resource_group_name):\n return client.get(resource_group_name, application)\n\n\ndef show_components(cmd, client, application=None, resource_group_name=None):\n from ._client_factory import applicationinsights_mgmt_plane_client\n if application:\n if resource_group_name:\n latest_client = applicationinsights_mgmt_plane_client(cmd.cli_ctx,\n api_version='2020-02-02-preview').components\n try:\n return latest_client.get(resource_group_name, application)\n except CloudError:\n logger.warning(HELP_MESSAGE)\n return client.get(resource_group_name, application)\n raise CLIError(\"Application provided without resource group. Either specify app with resource group, or remove app.\")\n if resource_group_name:\n return client.list_by_resource_group(resource_group_name)\n return client.list()\n\n\ndef delete_component(client, application, resource_group_name):\n return client.delete(resource_group_name, application)\n\n\ndef create_api_key(cmd, client, application, resource_group_name, api_key, read_properties=None, write_properties=None):\n from .vendored_sdks.mgmt_applicationinsights.models import APIKeyRequest\n if read_properties is None:\n read_properties = ['ReadTelemetry', 'AuthenticateSDKControlChannel']\n if write_properties is None:\n write_properties = ['WriteAnnotations']\n linked_read_properties, linked_write_properties = get_linked_properties(cmd.cli_ctx, application, resource_group_name, read_properties, write_properties)\n api_key_request = APIKeyRequest(name=api_key,\n linked_read_properties=linked_read_properties,\n linked_write_properties=linked_write_properties)\n return client.create(resource_group_name, application, api_key_request)\n\n\ndef show_api_key(client, application, resource_group_name, api_key=None):\n if api_key is None:\n return client.list(resource_group_name, application)\n result = list(filter(lambda result: result.name == api_key, client.list(resource_group_name, application)))\n if len(result) == 1:\n return result[0]\n if len(result) > 1:\n return result\n return None\n\n\ndef delete_api_key(client, application, resource_group_name, api_key):\n existing_key = list(filter(lambda result: result.name == api_key, client.list(resource_group_name, application)))\n if existing_key != []:\n return client.delete(resource_group_name, application, existing_key[0].id.split('/')[-1])\n raise CLIError('--api-key provided but key not found for deletion.')\n\n\ndef show_component_billing(client, application, resource_group_name):\n return client.get(resource_group_name=resource_group_name, resource_name=application)\n\n\ndef update_component_billing(client, application, resource_group_name, cap=None, stop_sending_notification_when_hitting_cap=None):\n billing_features = client.get(resource_group_name=resource_group_name, resource_name=application)\n if cap is not None:\n billing_features.data_volume_cap.cap = cap\n if stop_sending_notification_when_hitting_cap is not None:\n billing_features.data_volume_cap.stop_send_notification_when_hit_cap = stop_sending_notification_when_hitting_cap\n return client.update(resource_group_name=resource_group_name,\n resource_name=application,\n data_volume_cap=billing_features.data_volume_cap,\n current_billing_features=billing_features.current_billing_features)\n\n\ndef get_component_linked_storage_account(client, resource_group_name, application):\n return client.get(resource_group_name=resource_group_name, resource_name=application)\n\n\ndef create_component_linked_storage_account(client, resource_group_name, application, storage_account_id):\n return client.create_and_update(resource_group_name=resource_group_name, resource_name=application, linked_storage_account=storage_account_id)\n\n\ndef update_component_linked_storage_account(client, resource_group_name, application, storage_account_id):\n return client.update(resource_group_name=resource_group_name, resource_name=application, linked_storage_account=storage_account_id)\n\n\ndef delete_component_linked_storage_account(client, resource_group_name, application):\n return client.delete(resource_group_name=resource_group_name, resource_name=application)\n", "path": "src/application-insights/azext_applicationinsights/custom.py"}, {"content": "#!/usr/bin/env python\n\n# --------------------------------------------------------------------------------------------\n# Copyright (c) Microsoft Corporation. All rights reserved.\n# Licensed under the MIT License. See License.txt in the project root for license information.\n# --------------------------------------------------------------------------------------------\n\nfrom codecs import open\nfrom setuptools import setup, find_packages\n\nVERSION = \"0.1.7\"\n\nCLASSIFIERS = [\n 'Development Status :: 4 - Beta',\n 'Intended Audience :: Developers',\n 'Intended Audience :: System Administrators',\n 'Programming Language :: Python',\n 'Programming Language :: Python :: 2',\n 'Programming Language :: Python :: 2.7',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.4',\n 'Programming Language :: Python :: 3.5',\n 'Programming Language :: Python :: 3.6',\n 'License :: OSI Approved :: MIT License',\n]\n\nDEPENDENCIES = []\n\nwith open('README.rst', 'r', encoding='utf-8') as f:\n README = f.read()\nwith open('HISTORY.rst', 'r', encoding='utf-8') as f:\n HISTORY = f.read()\n\nsetup(\n name='application-insights',\n version=VERSION,\n description='Support for managing Application Insights components and querying metrics, events, and logs from such components.',\n long_description=README + '\\n\\n' + HISTORY,\n license='MIT',\n author='Ace Eldeib',\n author_email='aleldeib@microsoft.com',\n url='https://github.com/Azure/azure-cli-extensions/tree/master/src/application-insights',\n classifiers=CLASSIFIERS,\n packages=find_packages(exclude=[\"tests\"]),\n package_data={'azext_applicationinsights': ['azext_metadata.json']},\n install_requires=DEPENDENCIES\n)\n", "path": "src/application-insights/setup.py"}], "after_files": [{"content": "# --------------------------------------------------------------------------------------------\n# Copyright (c) Microsoft Corporation. All rights reserved.\n# Licensed under the MIT License. See License.txt in the project root for license information.\n# --------------------------------------------------------------------------------------------\n\n# pylint: disable=line-too-long, protected-access\n\nimport datetime\nimport isodate\nfrom knack.util import CLIError\nfrom knack.log import get_logger\nfrom azext_applicationinsights.vendored_sdks.applicationinsights.models import ErrorResponseException\nfrom msrestazure.azure_exceptions import CloudError\nfrom .util import get_id_from_azure_resource, get_query_targets, get_timespan, get_linked_properties\n\nlogger = get_logger(__name__)\nHELP_MESSAGE = \" Please use `az feature register --name AIWorkspacePreview --namespace microsoft.insights` to register the feature\"\n\n\ndef execute_query(cmd, client, application, analytics_query, start_time=None, end_time=None, offset='1h', resource_group_name=None):\n \"\"\"Executes a query against the provided Application Insights application.\"\"\"\n from .vendored_sdks.applicationinsights.models import QueryBody\n targets = get_query_targets(cmd.cli_ctx, application, resource_group_name)\n if not isinstance(offset, datetime.timedelta):\n offset = isodate.parse_duration(offset)\n try:\n return client.query.execute(targets[0], QueryBody(query=analytics_query, timespan=get_timespan(cmd.cli_ctx, start_time, end_time, offset), applications=targets[1:]))\n except ErrorResponseException as ex:\n if \"PathNotFoundError\" in ex.message:\n raise ValueError(\"The Application Insight is not found. Please check the app id again.\")\n raise ex\n\n\ndef get_events(cmd, client, application, event_type, event=None, start_time=None, end_time=None, offset='1h', resource_group_name=None):\n timespan = get_timespan(cmd.cli_ctx, start_time, end_time, offset)\n if event:\n return client.events.get(get_id_from_azure_resource(cmd.cli_ctx, application, resource_group=resource_group_name), event_type, event, timespan=timespan)\n return client.events.get_by_type(get_id_from_azure_resource(cmd.cli_ctx, application, resource_group=resource_group_name), event_type, timespan=get_timespan(cmd.cli_ctx, start_time, end_time, offset))\n\n\ndef get_metric(cmd, client, application, metric, start_time=None, end_time=None, offset='1h', interval=None, aggregation=None, segment=None, top=None, orderby=None, filter_arg=None, resource_group_name=None):\n return client.metrics.get(get_id_from_azure_resource(cmd.cli_ctx, application, resource_group=resource_group_name), metric, timespan=get_timespan(cmd.cli_ctx, start_time, end_time, offset), interval=interval, aggregation=aggregation, segment=segment, top=top, orderby=orderby, filter_arg=filter_arg)\n\n\ndef get_metrics_metadata(cmd, client, application, resource_group_name=None):\n return client.metrics.get_metadata(get_id_from_azure_resource(cmd.cli_ctx, application, resource_group=resource_group_name))\n\n\ndef create_or_update_component(cmd, client, application, resource_group_name, location, tags=None,\n kind=\"web\", application_type='web', workspace_resource_id=None,\n public_network_access_for_ingestion=None, public_network_access_for_query=None):\n # due to service limitation, we have to do such a hack. We must refract the logic later.\n if workspace_resource_id is None:\n from .vendored_sdks.mgmt_applicationinsights.v2018_05_01_preview.models import ApplicationInsightsComponent\n component = ApplicationInsightsComponent(location=location, kind=kind, application_type=application_type, tags=tags,\n public_network_access_for_ingestion=public_network_access_for_ingestion,\n public_network_access_for_query=public_network_access_for_query)\n return client.create_or_update(resource_group_name, application, component)\n\n from .vendored_sdks.mgmt_applicationinsights.v2020_02_02_preview.models import ApplicationInsightsComponent\n component = ApplicationInsightsComponent(location=location, kind=kind, application_type=application_type,\n tags=tags, workspace_resource_id=workspace_resource_id,\n public_network_access_for_ingestion=public_network_access_for_ingestion,\n public_network_access_for_query=public_network_access_for_query)\n from ._client_factory import applicationinsights_mgmt_plane_client\n client = applicationinsights_mgmt_plane_client(cmd.cli_ctx, api_version='2020-02-02-preview').components\n try:\n return client.create_or_update(resource_group_name, application, component)\n except CloudError as ex:\n ex.error._message = ex.error._message + HELP_MESSAGE\n raise ex\n\n\ndef update_component(cmd, client, application, resource_group_name, kind=None, workspace_resource_id=None,\n public_network_access_for_ingestion=None, public_network_access_for_query=None):\n from ._client_factory import applicationinsights_mgmt_plane_client\n existing_component = None\n if workspace_resource_id is not None:\n latest_client = applicationinsights_mgmt_plane_client(cmd.cli_ctx, api_version='2020-02-02-preview').components\n try:\n existing_component = latest_client.get(resource_group_name, application)\n except CloudError as ex:\n ex.error._message = ex.error._message + HELP_MESSAGE\n raise ex\n existing_component.workspace_resource_id = workspace_resource_id or None\n else:\n existing_component = client.get(resource_group_name, application)\n if kind:\n existing_component.kind = kind\n if public_network_access_for_ingestion is not None:\n existing_component.public_network_access_for_ingestion = public_network_access_for_ingestion\n if public_network_access_for_query is not None:\n existing_component.public_network_access_for_query = public_network_access_for_query\n\n if hasattr(existing_component, 'workspace_resource_id') and existing_component.workspace_resource_id is not None:\n client = applicationinsights_mgmt_plane_client(cmd.cli_ctx, api_version='2020-02-02-preview').components\n return client.create_or_update(resource_group_name, application, existing_component)\n\n from .vendored_sdks.mgmt_applicationinsights.v2018_05_01_preview.models import ApplicationInsightsComponent\n component = ApplicationInsightsComponent(**(vars(existing_component)))\n return client.create_or_update(resource_group_name, application, component)\n\n\ndef update_component_tags(client, application, resource_group_name, tags):\n return client.update_tags(resource_group_name, application, tags)\n\n\ndef get_component(client, application, resource_group_name):\n return client.get(resource_group_name, application)\n\n\ndef show_components(cmd, client, application=None, resource_group_name=None):\n from ._client_factory import applicationinsights_mgmt_plane_client\n if application:\n if resource_group_name:\n latest_client = applicationinsights_mgmt_plane_client(cmd.cli_ctx,\n api_version='2020-02-02-preview').components\n try:\n return latest_client.get(resource_group_name, application)\n except CloudError:\n logger.warning(HELP_MESSAGE)\n return client.get(resource_group_name, application)\n raise CLIError(\"Application provided without resource group. Either specify app with resource group, or remove app.\")\n if resource_group_name:\n return client.list_by_resource_group(resource_group_name)\n return client.list()\n\n\ndef delete_component(client, application, resource_group_name):\n return client.delete(resource_group_name, application)\n\n\ndef create_api_key(cmd, client, application, resource_group_name, api_key, read_properties=None, write_properties=None):\n from .vendored_sdks.mgmt_applicationinsights.models import APIKeyRequest\n if read_properties is None:\n read_properties = ['ReadTelemetry', 'AuthenticateSDKControlChannel']\n if write_properties is None:\n write_properties = ['WriteAnnotations']\n linked_read_properties, linked_write_properties = get_linked_properties(cmd.cli_ctx, application, resource_group_name, read_properties, write_properties)\n api_key_request = APIKeyRequest(name=api_key,\n linked_read_properties=linked_read_properties,\n linked_write_properties=linked_write_properties)\n return client.create(resource_group_name, application, api_key_request)\n\n\ndef show_api_key(client, application, resource_group_name, api_key=None):\n if api_key is None:\n return client.list(resource_group_name, application)\n result = list(filter(lambda result: result.name == api_key, client.list(resource_group_name, application)))\n if len(result) == 1:\n return result[0]\n if len(result) > 1:\n return result\n return None\n\n\ndef delete_api_key(client, application, resource_group_name, api_key):\n existing_key = list(filter(lambda result: result.name == api_key, client.list(resource_group_name, application)))\n if existing_key != []:\n return client.delete(resource_group_name, application, existing_key[0].id.split('/')[-1])\n raise CLIError('--api-key provided but key not found for deletion.')\n\n\ndef show_component_billing(client, application, resource_group_name):\n return client.get(resource_group_name=resource_group_name, resource_name=application)\n\n\ndef update_component_billing(client, application, resource_group_name, cap=None, stop_sending_notification_when_hitting_cap=None):\n billing_features = client.get(resource_group_name=resource_group_name, resource_name=application)\n if cap is not None:\n billing_features.data_volume_cap.cap = cap\n if stop_sending_notification_when_hitting_cap is not None:\n billing_features.data_volume_cap.stop_send_notification_when_hit_cap = stop_sending_notification_when_hitting_cap\n return client.update(resource_group_name=resource_group_name,\n resource_name=application,\n data_volume_cap=billing_features.data_volume_cap,\n current_billing_features=billing_features.current_billing_features)\n\n\ndef get_component_linked_storage_account(client, resource_group_name, application):\n return client.get(resource_group_name=resource_group_name, resource_name=application)\n\n\ndef create_component_linked_storage_account(client, resource_group_name, application, storage_account_id):\n return client.create_and_update(resource_group_name=resource_group_name, resource_name=application, linked_storage_account=storage_account_id)\n\n\ndef update_component_linked_storage_account(client, resource_group_name, application, storage_account_id):\n return client.update(resource_group_name=resource_group_name, resource_name=application, linked_storage_account=storage_account_id)\n\n\ndef delete_component_linked_storage_account(client, resource_group_name, application):\n return client.delete(resource_group_name=resource_group_name, resource_name=application)\n", "path": "src/application-insights/azext_applicationinsights/custom.py"}, {"content": "#!/usr/bin/env python\n\n# --------------------------------------------------------------------------------------------\n# Copyright (c) Microsoft Corporation. All rights reserved.\n# Licensed under the MIT License. See License.txt in the project root for license information.\n# --------------------------------------------------------------------------------------------\n\nfrom codecs import open\nfrom setuptools import setup, find_packages\n\nVERSION = \"0.1.8\"\n\nCLASSIFIERS = [\n 'Development Status :: 4 - Beta',\n 'Intended Audience :: Developers',\n 'Intended Audience :: System Administrators',\n 'Programming Language :: Python',\n 'Programming Language :: Python :: 2',\n 'Programming Language :: Python :: 2.7',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.4',\n 'Programming Language :: Python :: 3.5',\n 'Programming Language :: Python :: 3.6',\n 'License :: OSI Approved :: MIT License',\n]\n\nDEPENDENCIES = [\n \"isodate~=0.6.0\"\n]\n\nwith open('README.rst', 'r', encoding='utf-8') as f:\n README = f.read()\nwith open('HISTORY.rst', 'r', encoding='utf-8') as f:\n HISTORY = f.read()\n\nsetup(\n name='application-insights',\n version=VERSION,\n description='Support for managing Application Insights components and querying metrics, events, and logs from such components.',\n long_description=README + '\\n\\n' + HISTORY,\n license='MIT',\n author='Ace Eldeib',\n author_email='aleldeib@microsoft.com',\n url='https://github.com/Azure/azure-cli-extensions/tree/master/src/application-insights',\n classifiers=CLASSIFIERS,\n packages=find_packages(exclude=[\"tests\"]),\n package_data={'azext_applicationinsights': ['azext_metadata.json']},\n install_requires=DEPENDENCIES\n)\n", "path": "src/application-insights/setup.py"}]}
| 3,916 | 420 |
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.