Title: Unknown error when running experiments
Body: /kind bug
Got an unexpected error in the metric collector when using the file collector:
> F1005 06:27:13.600768 14 main.go:396] Failed to Report logs: rpc error: code = Unknown desc = Pepare SQL statement failed: Error 1390: Prepared statement contains too many placeholders
The job finished properly, but the pod status showed an error. I found the error above in the file metric collector log. How shall I debug this?
The output looks like:
```
objective: 0.716
loss_avg: 0.716
acc_avg: 0.500
objective: 0.704
loss_avg: 0.704
acc_avg: 0.540
Epoch eval:
ROCAUC: 0.605
Validation:
objective: 0.671
loss_avg: 0.671
acc_avg: 0.605
Epoch eval:
ROCAUC: 0.609
Validation rocauc: 0.609
time(s): 67894.691
```
**What did you expect to happen:**
Works well
**Anything else you would like to add:**
There's a warning above that seems related to this:
> W1005 06:27:08.292701 14 file-metricscollector.go:71] Metrics will not have timestamp since error parsing time objective:: parsing time "objective:" as "2006-01-02T15:04:05.999999999Z07:00": cannot parse "objective:" as "2006"
W1005 06:27:08.292708 14 file-metricscollector.go:71] Metrics will not have timestamp since error parsing time Validation: parsing time "Validation" as "2006-01-02T15:04:05.999999999Z07:00": cannot parse "Validation" as "2006"
**Environment:**
- Kubeflow version (`kfctl version`): Didn't install kf
- Minikube version (`minikube version`):
- Kubernetes version: (use `kubectl version`): 1.20
- OS (e.g. from `/etc/os-release`): Amazon Linux2
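For context, the "too many placeholders" error comes from MySQL's hard limit of 65,535 placeholders per prepared statement, which suggests the collector tried to report all metric lines in a single statement. A minimal Python sketch of the usual workaround, batching rows so each INSERT stays under the limit (illustrative only; the actual Katib collector is written in Go, and the table and column names here are made up):

```python
# MySQL prepared statements allow at most 65,535 placeholders.
# Chunk rows so each INSERT stays under the limit (illustrative sketch,
# not the actual Katib metrics-collector code, which is written in Go).
MAX_PLACEHOLDERS = 65535

def batch_insert_statements(rows, columns):
    """Yield (sql, params) pairs, each within the placeholder limit."""
    per_row = len(columns)
    rows_per_batch = MAX_PLACEHOLDERS // per_row
    for i in range(0, len(rows), rows_per_batch):
        chunk = rows[i:i + rows_per_batch]
        values = ", ".join(
            "(" + ", ".join(["%s"] * per_row) + ")" for _ in chunk
        )
        sql = (f"INSERT INTO observation_logs ({', '.join(columns)}) "
               f"VALUES {values}")
        params = [v for row in chunk for v in row]
        yield sql, params
```

With four columns per row, up to 16,383 rows fit in one statement; a long training log like the one above would then be split across a handful of statements instead of failing.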
| 0easy
|
Title: Change Requests Duration metric API
Body: The canonical definition is here: https://chaoss.community/?p=3587 | 0easy
|
Title: Good First Issue [Deep Learning]: Add Support for Custom Loss Functions in `create_and_compile` Function
Body: ### Issue Description:
Currently, the `create_and_compile` (ForecasterRNN) function allows specifying predefined loss functions via the `loss` argument, but there is no support for passing custom loss functions directly. To enhance the flexibility of model compilation, the function should be updated to allow users to pass custom loss functions to the `create_and_compile` function.
### Feature Request:
Modify the `create_and_compile` function to accept custom loss functions. This should be done by allowing a callable loss function to be passed as the `loss` argument. The function should validate whether the loss function is either a string (for predefined losses) or a callable (for custom losses).
### Acceptance Criteria:
- [ ] Update the `create_and_compile` function to accept a custom loss function (a callable) in addition to predefined loss strings.
- [ ] Ensure proper validation for the `loss` argument:
- [ ] Maintain the current logic that handles passing compile arguments (`compile_kwargs`), but ensure that custom loss functions are prioritized when passed.
- [ ] Add unit tests to validate that the `create_and_compile` function can handle both predefined loss strings and custom loss functions.
- [ ] Update the documentation and docstrings to describe the new behavior for custom loss functions.
### Example:
```python
import tensorflow as tf

# Define a custom loss function
def custom_loss_function(y_true, y_pred):
    return tf.reduce_mean(tf.square(y_true - y_pred))

# Pass the custom loss function to `create_and_compile`
model = create_and_compile(
    lags=10,
    n_series=5,
    recurrent_units=[32, 64],
    dense_units=[64],
    optimizer="adam",
    loss=custom_loss_function,  # Custom loss function passed here
    compile_kwargs={},
)
```
### Notes:
- Ensure that the custom loss function signature matches the expected format (`y_true`, `y_pred`).
- When a custom loss function is passed, it should override any predefined loss functions.
- Ensure that when a custom loss function is provided, it is correctly handled in conjunction with `compile_kwargs`.
This enhancement will allow users to leverage specialized loss functions, further extending the flexibility and utility of the model creation and compilation process. | 0easy
|
Title: Marketplace - creator page - change the font of section header
Body:
### Describe your issue.
Please change this section header to use the "large-poppins" style that's outlined here: https://www.figma.com/design/Ll8EOTAVIlNlbfOCqa1fG9/Agent-Store-V2?node-id=2759-9596&t=2JI1c3X9fIXeTTbE-1
Font: Poppins
Weight: Semi Bold
Size: 18px
Line height: 28px
<img width="1228" alt="Screenshot 2024-12-16 at 21 13 50" src="https://github.com/user-attachments/assets/7f333b63-0659-4c3a-9f51-53596af9b561" />
### Upload Activity Log Content
_No response_
### Upload Error Log Content
_No response_ | 0easy
|
Title: Include the prompt in /g output
Body: When /g is used, it only shows the output as-is. I'd recommend including the prompt in the output, but in bold, so that if the output is a natural continuation of the prompt, the context is preserved.

Here the prompt is "What would happen if " | 0easy
|
Title: `VAR` syntax doesn't log the variable value like `Set * Variable` does
Body: The new `VAR` syntax introduced in RF7 doesn't log the variable value.
The `Set * Variable` keywords (`Set Variable`, `Set Test Variable`, `Set Suite Variable`, ...) automatically log the variable value, and this is a nice behavior, isn't it? :)
If the `VAR` syntax is intended to replace `Set * Variable` keywords, it would be great if the VAR syntax also performed the same logging.
Code example:
```robotframework
*** Variables ***
${VARIABLE}      ${1}
${VARIABLE_2}    ${2}

*** Test Cases ***
Test With Set Variable
    ${local_variable} =    Set Variable    ${3}
    Set Test Variable    ${VARIABLE}
    Set Suite Variable    ${VARIABLE_2}

Test With VAR Syntax
    VAR    ${local_variable}    ${2}
    VAR    ${VARIABLE}      ${VARIABLE}      scope=TEST
    VAR    ${VARIABLE_2}    ${VARIABLE_2}    scope=SUITE
```
Log result:

| 0easy
|
Title: [DO] Unexpected output in `sky check`
Body: <!-- Describe the bug report / feature request here -->
`sky check` shows a long output when the DO dependencies are not set up. We should avoid the long error message and make it more user-friendly; for comparison, see how other clouds report this below.
```
DO: disabled
Reason: Traceback (most recent call last):
File "/workspaces/skypilot/sky/clouds/do.py", line 263, in check_credentials
do_utils.client().droplets.list()
File "/workspaces/skypilot/sky/provision/do/utils.py", line 94, in client
_client = _init_client()
File "/workspaces/skypilot/sky/provision/do/utils.py", line 56, in _init_client
raise DigitalOceanError(
sky.provision.do.utils.DigitalOceanError: no credentials file found from the following paths ['/home/vscode/Library/Application Support/doctl/config.yaml', '/home/vscode/.config/doctl/config.yaml']
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/workspaces/skypilot/sky/adaptors/common.py", line 31, in load_module
self._module = importlib.import_module(self._module_name)
File "/opt/conda/envs/sky/lib/python3.10/importlib/__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
File "<frozen importlib._bootstrap>", line 1004, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'pydo'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/workspaces/skypilot/sky/check.py", line 41, in check_one_cloud
ok, reason = cloud.check_credentials()
File "/workspaces/skypilot/sky/clouds/do.py", line 264, in check_credentials
except do.exceptions().HttpResponseError as err:
File "/workspaces/skypilot/sky/adaptors/common.py", line 64, in wrapper
m.load_module()
File "/workspaces/skypilot/sky/adaptors/common.py", line 36, in load_module
raise ImportError(self._import_error_message) from e
ImportError: Failed to import dependencies for DO. Try pip install "skypilot[do]"
```
```
OCI: disabled
Reason: Missing credential file at ~/.oci/config. For more details, refer to: https://docs.skypilot.co/en/latest/getting-started/installation.html#oracle-cloud-infrastructure-oci
Paperspace: disabled
Reason: Failed to access Paperspace Cloud with credentials.
To configure credentials, follow the instructions at: https://docs.skypilot.co/en/latest/getting-started/installation.html#paperspace
Generate API key and create a json at `~/.paperspace/config.json` with
{"apiKey": "[YOUR API KEY]"}
Reason: Credentials not found
```
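One way to get the shorter output is to catch the import failure near the credential check and return a one-line `(ok, reason)` tuple instead of letting the traceback propagate. A minimal Python sketch under that assumption (`check_dependency` is an illustrative helper, not the actual SkyPilot API):

```python
import importlib

def check_dependency(module_name, install_hint):
    """Return (ok, reason); keep the failure reason to one short line.

    Illustrative sketch of turning an ImportError traceback into a
    user-friendly message, as the other clouds above already do.
    """
    try:
        importlib.import_module(module_name)
    except ImportError:
        return False, f"Failed to import dependencies. Try: {install_hint}"
    return True, None
```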
_Version & Commit info:_
* `sky -v`: PLEASE_FILL_IN
* `sky -c`: PLEASE_FILL_IN
| 0easy
|
Title: Unable to install requirements
Body: When installing the requirements I am met with a host of "Could not find a version..." errors. I have confirmed that I am indeed using Python 3.5 and followed your instructions on creating and activating a conda environment. See error below:
```
Collecting anaconda-navigator==1.8.7 (from -r requirements.txt (line 7))
Could not find a version that satisfies the requirement anaconda-navigator==1.8.7 (from -r requirements.txt (line 7)) (from versions: )
No matching distribution found for anaconda-navigator==1.8.7 (from -r requirements.txt (line 7))
You are using pip version 10.0.1, however version 19.0.1 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
``` | 0easy
|
Title: Cannot set number of worker processes for run-parallel when no spinner is shown
Body: ## Issue
Cannot set the number of worker processes for the tox `run-parallel` command via the `--parallel` option or the `TOX_PARALLEL` environment variable when the spinner is disabled via the `--parallel-no-spinner` option or the `CI` environment variable. The numeric value of `--parallel` is ignored and the maximum number of workers is always used (probably equal to the number of CPU cores). This results in OOM (out-of-memory) failures on build agents that use `run-parallel` with the `CI` environment variable set.
## Environment
Tox 4.11.4 works as expected. Newer versions exhibit the issue. Git blame points to https://github.com/tox-dev/tox/commit/7bbf89ef04fbb5651aa2e0c46dc020869842fc0e#diff-40a5f9d4fad958c6e6fb3bd9cf117200f0b4a8a23284e850b38a80477f3b2d9dR86

## Output of running tox
Actual behaviour (run-parallel in 2 jobs):
```console
$ tox run-parallel --parallel 2 --parallel-no-spinner -x 'tox.envlist=job-{a,b,c,d,e}'
job-e: OK ✔ in 10.14 seconds
job-b: OK ✔ in 10.15 seconds
job-c: OK ✔ in 10.15 seconds
job-d: OK ✔ in 10.15 seconds
job-a: OK (10.15=setup[0.10]+cmd[10.03,0.02] seconds)
job-b: OK (10.15=setup[0.10]+cmd[10.03,0.02] seconds)
job-c: OK (10.15=setup[0.10]+cmd[10.04,0.02] seconds)
job-d: OK (10.15=setup[0.10]+cmd[10.03,0.02] seconds)
job-e: OK (10.14=setup[0.10]+cmd[10.02,0.02] seconds)
congratulations :) (10.22 seconds)
```
Expected behaviour:
```console
$ tox run-parallel --parallel 2 --parallel-no-spinner -x 'tox.envlist=job-{a,b,c,d,e}'
job-b: OK ✔ in 10.09 seconds
job-a: OK ✔ in 10.1 seconds
job-c: OK ✔ in 10.06 seconds
job-d: OK ✔ in 10.06 seconds
job-a: OK (10.10=setup[0.06]+cmd[10.03,0.02] seconds)
job-b: OK (10.09=setup[0.06]+cmd[10.02,0.02] seconds)
job-c: OK (10.06=setup[0.02]+cmd[10.03,0.02] seconds)
job-d: OK (10.06=setup[0.02]+cmd[10.03,0.02] seconds)
job-e: OK (10.06=setup[0.01]+cmd[10.03,0.02] seconds)
congratulations :) (30.28 seconds)
```
i.e. execution should take around 30 seconds, not 10.
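The expected behaviour amounts to capping the worker pool at the requested `--parallel` value. A rough Python illustration of the difference (not tox internals; the short sleep stands in for a 10-second tox environment):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def run_env(name):
    time.sleep(0.2)  # stand-in for a 10-second tox environment
    return name

envs = ["job-a", "job-b", "job-c", "job-d", "job-e"]
start = time.monotonic()
# Honour the requested limit of 2 workers, analogous to --parallel 2.
with ThreadPoolExecutor(max_workers=2) as pool:
    results = list(pool.map(run_env, envs))
elapsed = time.monotonic() - start
# 5 jobs on 2 workers run in 3 waves, so total time is ~3x one job's time,
# whereas an unbounded pool would finish in ~1x (the buggy behaviour).
```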
## Minimal example
```ini
[tox]

[testenv]
commands =
    job: python -c 'import time; time.sleep(10)'
    a,b,c,d,e: python -c 'print("ok")'
``` | 0easy
|
Title: Problems with `changedir` and carriage return
Body: ## Issue
A recent upgrade to `tox` makes all my installation environments fail. After digging, it turns out that the handling of `changedir` is buggy in the new versions of `tox` (>= 4).
In the following toy example it is creating folders with invalid names (spurious carriage return), which looks like it is trying to access a non-existing folder in my real setup.
* tested and reproduced with `tox-4.0.4.dev1+gaadcbce` (from github) and `tox-4.0.3`.
## Environment
Provide at least:
- OS: Ubuntu 22.04
- python 3.10
- `pip list` of the host Python where `tox` is installed:
```console
Package Version
------------- -------
cachetools 5.2.0
chardet 5.1.0
colorama 0.4.6
distlib 0.3.6
filelock 3.8.2
packaging 22.0
pip 22.0.2
platformdirs 2.6.0
pluggy 1.0.0
pyproject_api 1.2.1
setuptools 59.6.0
tomli 2.0.1
tox 4.0.3
virtualenv 20.17.1
```
(clean venv with only `tox`, switching back and forth to the master version from git).
## Output of running tox
Provide the output of `tox -rvv`:
```console
check_tox: 199 I find interpreter for spec PythonSpec(path=/data/code/reasonal/sandbox/venv_check_tox/bin/python3) [virtualenv/discovery/builtin.py:56]
check_tox: 199 D discover exe for PythonInfo(spec=CPython3.10.6.final.0-64, exe=/data/code/reasonal/sandbox/venv_check_tox/bin/python3, platform=linux, version='3.10.6 (main, Nov 2 2022, 18:53:38) [GCC 11.3.0]', encoding_fs_io=utf-8-utf-8) in /usr [virtualenv/discovery/py_info.py:437]
check_tox: 200 D filesystem is case-sensitive [virtualenv/info.py:24]
check_tox: 201 D Attempting to acquire lock 139710263612496 on /home/raffi/.local/share/virtualenv/py_info/1/8a94588eda9d64d9e9a351ab8144e55b1fabf5113b54e67dd26a8c27df0381b3.lock [filelock/_api.py:172]
check_tox: 201 D Lock 139710263612496 acquired on /home/raffi/.local/share/virtualenv/py_info/1/8a94588eda9d64d9e9a351ab8144e55b1fabf5113b54e67dd26a8c27df0381b3.lock [filelock/_api.py:176]
check_tox: 201 D got python info of /usr/bin/python3.10 from /home/raffi/.local/share/virtualenv/py_info/1/8a94588eda9d64d9e9a351ab8144e55b1fabf5113b54e67dd26a8c27df0381b3.json [virtualenv/app_data/via_disk_folder.py:129]
check_tox: 202 D Attempting to release lock 139710263612496 on /home/raffi/.local/share/virtualenv/py_info/1/8a94588eda9d64d9e9a351ab8144e55b1fabf5113b54e67dd26a8c27df0381b3.lock [filelock/_api.py:209]
check_tox: 202 D Lock 139710263612496 released on /home/raffi/.local/share/virtualenv/py_info/1/8a94588eda9d64d9e9a351ab8144e55b1fabf5113b54e67dd26a8c27df0381b3.lock [filelock/_api.py:212]
check_tox: 202 I proposed PythonInfo(spec=CPython3.10.6.final.0-64, system=/usr/bin/python3.10, exe=/data/code/reasonal/sandbox/venv_check_tox/bin/python3, platform=linux, version='3.10.6 (main, Nov 2 2022, 18:53:38) [GCC 11.3.0]', encoding_fs_io=utf-8-utf-8) [virtualenv/discovery/builtin.py:63]
check_tox: 202 D accepted PythonInfo(spec=CPython3.10.6.final.0-64, system=/usr/bin/python3.10, exe=/data/code/reasonal/sandbox/venv_check_tox/bin/python3, platform=linux, version='3.10.6 (main, Nov 2 2022, 18:53:38) [GCC 11.3.0]', encoding_fs_io=utf-8-utf-8) [virtualenv/discovery/builtin.py:65]
check_tox: 236 I create virtual environment via CPython3Posix(dest=/data/code/sandbox/tox_example/.tox/check_tox, clear=False, no_vcs_ignore=False, global=False) [virtualenv/run/session.py:48]
check_tox: 236 D create folder /data/code/sandbox/tox_example/.tox/check_tox/bin [virtualenv/util/path/_sync.py:9]
check_tox: 236 D create folder /data/code/sandbox/tox_example/.tox/check_tox/lib/python3.10/site-packages [virtualenv/util/path/_sync.py:9]
check_tox: 236 D write /data/code/sandbox/tox_example/.tox/check_tox/pyvenv.cfg [virtualenv/create/pyenv_cfg.py:30]
check_tox: 236 D home = /usr/bin [virtualenv/create/pyenv_cfg.py:34]
check_tox: 236 D implementation = CPython [virtualenv/create/pyenv_cfg.py:34]
check_tox: 237 D version_info = 3.10.6.final.0 [virtualenv/create/pyenv_cfg.py:34]
check_tox: 237 D virtualenv = 20.17.1 [virtualenv/create/pyenv_cfg.py:34]
check_tox: 237 D include-system-site-packages = false [virtualenv/create/pyenv_cfg.py:34]
check_tox: 237 D base-prefix = /usr [virtualenv/create/pyenv_cfg.py:34]
check_tox: 237 D base-exec-prefix = /usr [virtualenv/create/pyenv_cfg.py:34]
check_tox: 237 D base-executable = /usr/bin/python3.10 [virtualenv/create/pyenv_cfg.py:34]
check_tox: 237 D symlink /usr/bin/python3.10 to /data/code/sandbox/tox_example/.tox/check_tox/bin/python [virtualenv/util/path/_sync.py:28]
check_tox: 237 D create virtualenv import hook file /data/code/sandbox/tox_example/.tox/check_tox/lib/python3.10/site-packages/_virtualenv.pth [virtualenv/create/via_global_ref/api.py:89]
check_tox: 237 D create /data/code/sandbox/tox_example/.tox/check_tox/lib/python3.10/site-packages/_virtualenv.py [virtualenv/create/via_global_ref/api.py:92]
check_tox: 238 D ============================== target debug ============================== [virtualenv/run/session.py:50]
check_tox: 238 D debug via /data/code/sandbox/tox_example/.tox/check_tox/bin/python /data/code/reasonal/sandbox/venv_check_tox/lib/python3.10/site-packages/virtualenv/create/debug.py [virtualenv/create/creator.py:197]
check_tox: 238 D {
"sys": {
"executable": "/data/code/sandbox/tox_example/.tox/check_tox/bin/python",
"_base_executable": "/data/code/sandbox/tox_example/.tox/check_tox/bin/python",
"prefix": "/data/code/sandbox/tox_example/.tox/check_tox",
"base_prefix": "/usr",
"real_prefix": null,
"exec_prefix": "/data/code/sandbox/tox_example/.tox/check_tox",
"base_exec_prefix": "/usr",
"path": [
"/usr/lib/python310.zip",
"/usr/lib/python3.10",
"/usr/lib/python3.10/lib-dynload",
"/data/code/sandbox/tox_example/.tox/check_tox/lib/python3.10/site-packages"
],
"meta_path": [
"<class '_virtualenv._Finder'>",
"<class '_frozen_importlib.BuiltinImporter'>",
"<class '_frozen_importlib.FrozenImporter'>",
"<class '_frozen_importlib_external.PathFinder'>"
],
"fs_encoding": "utf-8",
"io_encoding": "utf-8"
},
"version": "3.10.6 (main, Nov 2 2022, 18:53:38) [GCC 11.3.0]",
"makefile_filename": "/usr/lib/python3.10/config-3.10-x86_64-linux-gnu/Makefile",
"os": "<module 'os' from '/usr/lib/python3.10/os.py'>",
"site": "<module 'site' from '/usr/lib/python3.10/site.py'>",
"datetime": "<module 'datetime' from '/usr/lib/python3.10/datetime.py'>",
"math": "<module 'math' (built-in)>",
"json": "<module 'json' from '/usr/lib/python3.10/json/__init__.py'>"
} [virtualenv/run/session.py:51]
check_tox: 274 I add seed packages via FromAppData(download=False, pip=bundle, setuptools=bundle, wheel=bundle, via=copy, app_data_dir=/home/raffi/.local/share/virtualenv) [virtualenv/run/session.py:55]
check_tox: 276 D got embed update of distribution pip from /home/raffi/.local/share/virtualenv/wheel/3.10/embed/3/pip.json [virtualenv/app_data/via_disk_folder.py:129]
check_tox: 279 D got embed update of distribution wheel from /home/raffi/.local/share/virtualenv/wheel/3.10/embed/3/wheel.json [virtualenv/app_data/via_disk_folder.py:129]
check_tox: 280 D got embed update of distribution setuptools from /home/raffi/.local/share/virtualenv/wheel/3.10/embed/3/setuptools.json [virtualenv/app_data/via_disk_folder.py:129]
check_tox: 280 D install pip from wheel /data/code/reasonal/sandbox/venv_check_tox/lib/python3.10/site-packages/virtualenv/seed/wheels/embed/pip-22.3.1-py3-none-any.whl via CopyPipInstall [virtualenv/seed/embed/via_app_data/via_app_data.py:47]
check_tox: 281 D install wheel from wheel /data/code/reasonal/sandbox/venv_check_tox/lib/python3.10/site-packages/virtualenv/seed/wheels/embed/wheel-0.38.4-py3-none-any.whl via CopyPipInstall [virtualenv/seed/embed/via_app_data/via_app_data.py:47]
check_tox: 281 D Attempting to acquire lock 139710168482960 on /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/pip-22.3.1-py3-none-any.lock [filelock/_api.py:172]
check_tox: 281 D install setuptools from wheel /data/code/reasonal/sandbox/venv_check_tox/lib/python3.10/site-packages/virtualenv/seed/wheels/embed/setuptools-65.6.3-py3-none-any.whl via CopyPipInstall [virtualenv/seed/embed/via_app_data/via_app_data.py:47]
check_tox: 281 D Lock 139710168482960 acquired on /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/pip-22.3.1-py3-none-any.lock [filelock/_api.py:176]
check_tox: 282 D Attempting to acquire lock 139710168482192 on /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/wheel-0.38.4-py3-none-any.lock [filelock/_api.py:172]
check_tox: 282 D Attempting to acquire lock 139710168483344 on /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/setuptools-65.6.3-py3-none-any.lock [filelock/_api.py:172]
check_tox: 282 D Attempting to release lock 139710168482960 on /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/pip-22.3.1-py3-none-any.lock [filelock/_api.py:209]
check_tox: 282 D Lock 139710168482192 acquired on /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/wheel-0.38.4-py3-none-any.lock [filelock/_api.py:176]
check_tox: 282 D Lock 139710168483344 acquired on /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/setuptools-65.6.3-py3-none-any.lock [filelock/_api.py:176]
check_tox: 282 D Lock 139710168482960 released on /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/pip-22.3.1-py3-none-any.lock [filelock/_api.py:212]
check_tox: 283 D Attempting to release lock 139710168482192 on /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/wheel-0.38.4-py3-none-any.lock [filelock/_api.py:209]
check_tox: 283 D Lock 139710168482192 released on /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/wheel-0.38.4-py3-none-any.lock [filelock/_api.py:212]
check_tox: 283 D Attempting to release lock 139710168483344 on /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/setuptools-65.6.3-py3-none-any.lock [filelock/_api.py:209]
check_tox: 283 D Lock 139710168483344 released on /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/setuptools-65.6.3-py3-none-any.lock [filelock/_api.py:212]
check_tox: 283 D copy /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/pip-22.3.1-py3-none-any/pip-22.3.1.virtualenv to /data/code/sandbox/tox_example/.tox/check_tox/lib/python3.10/site-packages/pip-22.3.1.virtualenv [virtualenv/util/path/_sync.py:36]
check_tox: 283 D copy directory /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/wheel-0.38.4-py3-none-any/wheel to /data/code/sandbox/tox_example/.tox/check_tox/lib/python3.10/site-packages/wheel [virtualenv/util/path/_sync.py:36]
check_tox: 284 D copy directory /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/pip-22.3.1-py3-none-any/pip-22.3.1.dist-info to /data/code/sandbox/tox_example/.tox/check_tox/lib/python3.10/site-packages/pip-22.3.1.dist-info [virtualenv/util/path/_sync.py:36]
check_tox: 284 D copy directory /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/setuptools-65.6.3-py3-none-any/_distutils_hack to /data/code/sandbox/tox_example/.tox/check_tox/lib/python3.10/site-packages/_distutils_hack [virtualenv/util/path/_sync.py:36]
check_tox: 286 D copy directory /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/setuptools-65.6.3-py3-none-any/setuptools to /data/code/sandbox/tox_example/.tox/check_tox/lib/python3.10/site-packages/setuptools [virtualenv/util/path/_sync.py:36]
check_tox: 288 D copy directory /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/pip-22.3.1-py3-none-any/pip to /data/code/sandbox/tox_example/.tox/check_tox/lib/python3.10/site-packages/pip [virtualenv/util/path/_sync.py:36]
check_tox: 292 D copy directory /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/wheel-0.38.4-py3-none-any/wheel-0.38.4.dist-info to /data/code/sandbox/tox_example/.tox/check_tox/lib/python3.10/site-packages/wheel-0.38.4.dist-info [virtualenv/util/path/_sync.py:36]
check_tox: 296 D copy /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/wheel-0.38.4-py3-none-any/wheel-0.38.4.virtualenv to /data/code/sandbox/tox_example/.tox/check_tox/lib/python3.10/site-packages/wheel-0.38.4.virtualenv [virtualenv/util/path/_sync.py:36]
check_tox: 298 D changing mode of /data/code/sandbox/tox_example/.tox/check_tox/bin/wheel-3.10 to 775 [distlib/util.py:572]
check_tox: 298 D changing mode of /data/code/sandbox/tox_example/.tox/check_tox/bin/wheel3 to 775 [distlib/util.py:572]
check_tox: 299 D changing mode of /data/code/sandbox/tox_example/.tox/check_tox/bin/wheel to 775 [distlib/util.py:572]
check_tox: 299 D changing mode of /data/code/sandbox/tox_example/.tox/check_tox/bin/wheel3.10 to 775 [distlib/util.py:572]
check_tox: 299 D generated console scripts wheel-3.10 wheel3 wheel3.10 wheel [virtualenv/seed/embed/via_app_data/pip_install/base.py:41]
check_tox: 342 D copy directory /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/setuptools-65.6.3-py3-none-any/pkg_resources to /data/code/sandbox/tox_example/.tox/check_tox/lib/python3.10/site-packages/pkg_resources [virtualenv/util/path/_sync.py:36]
check_tox: 352 D copy /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/setuptools-65.6.3-py3-none-any/distutils-precedence.pth to /data/code/sandbox/tox_example/.tox/check_tox/lib/python3.10/site-packages/distutils-precedence.pth [virtualenv/util/path/_sync.py:36]
check_tox: 353 D copy directory /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/setuptools-65.6.3-py3-none-any/setuptools-65.6.3.dist-info to /data/code/sandbox/tox_example/.tox/check_tox/lib/python3.10/site-packages/setuptools-65.6.3.dist-info [virtualenv/util/path/_sync.py:36]
check_tox: 354 D copy /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/setuptools-65.6.3-py3-none-any/setuptools-65.6.3.virtualenv to /data/code/sandbox/tox_example/.tox/check_tox/lib/python3.10/site-packages/setuptools-65.6.3.virtualenv [virtualenv/util/path/_sync.py:36]
check_tox: 355 D generated console scripts [virtualenv/seed/embed/via_app_data/pip_install/base.py:41]
check_tox: 391 D changing mode of /data/code/sandbox/tox_example/.tox/check_tox/bin/pip to 775 [distlib/util.py:572]
check_tox: 392 D changing mode of /data/code/sandbox/tox_example/.tox/check_tox/bin/pip3 to 775 [distlib/util.py:572]
check_tox: 392 D changing mode of /data/code/sandbox/tox_example/.tox/check_tox/bin/pip3.10 to 775 [distlib/util.py:572]
check_tox: 392 D changing mode of /data/code/sandbox/tox_example/.tox/check_tox/bin/pip-3.10 to 775 [distlib/util.py:572]
check_tox: 392 D generated console scripts pip pip3.10 pip-3.10 pip3 [virtualenv/seed/embed/via_app_data/pip_install/base.py:41]
check_tox: 392 I add activators for Bash, CShell, Fish, Nushell, PowerShell, Python [virtualenv/run/session.py:61]
check_tox: 396 D write /data/code/sandbox/tox_example/.tox/check_tox/pyvenv.cfg [virtualenv/create/pyenv_cfg.py:30]
check_tox: 396 D home = /usr/bin [virtualenv/create/pyenv_cfg.py:34]
check_tox: 396 D implementation = CPython [virtualenv/create/pyenv_cfg.py:34]
check_tox: 396 D version_info = 3.10.6.final.0 [virtualenv/create/pyenv_cfg.py:34]
check_tox: 396 D virtualenv = 20.17.1 [virtualenv/create/pyenv_cfg.py:34]
check_tox: 396 D include-system-site-packages = false [virtualenv/create/pyenv_cfg.py:34]
check_tox: 397 D base-prefix = /usr [virtualenv/create/pyenv_cfg.py:34]
check_tox: 397 D base-exec-prefix = /usr [virtualenv/create/pyenv_cfg.py:34]
check_tox: 397 D base-executable = /usr/bin/python3.10 [virtualenv/create/pyenv_cfg.py:34]
check_tox: 401 W install_deps> python -I -m pip install unittest-xml-reporting==3.0.4 [tox/tox_env/api.py:398]
Collecting unittest-xml-reporting==3.0.4
Using cached unittest_xml_reporting-3.0.4-py2.py3-none-any.whl (19 kB)
Installing collected packages: unittest-xml-reporting
Successfully installed unittest-xml-reporting-3.0.4
check_tox: 1760 I exit 0 (1.36 seconds) /data/code/sandbox/tox_example> python -I -m pip install unittest-xml-reporting==3.0.4 pid=230406 [tox/execute/api.py:275]
check_tox: 1762 W commands[0] /data/code/sandbox/tox_example/check_tox> python -c 'import os;print(os.listdir('"'"'.'"'"'))' [tox/tox_env/api.py:398]
['test_tox']
check_tox: 1784 I exit 0 (0.02 seconds) /data/code/sandbox/tox_example/check_tox> python -c 'import os;print(os.listdir('"'"'.'"'"'))' pid=230420 [tox/execute/api.py:275]
check_tox: OK ✔ in 1.59 seconds
check_tox-installation: 1789 I find interpreter for spec PythonSpec(path=/data/code/reasonal/sandbox/venv_check_tox/bin/python3) [virtualenv/discovery/builtin.py:56]
check_tox-installation: 1789 D discover exe from cache /usr - exact False: PythonInfo({'architecture': 64, 'base_exec_prefix': '/usr', 'base_prefix': '/usr', 'distutils_install': {'data': '', 'headers': 'include/python3.10/UNKNOWN', 'platlib': 'lib/python3.10/site-packages', 'purelib': 'lib/python3.10/site-packages', 'scripts': 'bin'}, 'exec_prefix': '/usr', 'executable': '/data/code/reasonal/sandbox/venv_check_tox/bin/python3', 'file_system_encoding': 'utf-8', 'has_venv': True, 'implementation': 'CPython', 'max_size': 9223372036854775807, 'original_executable': '/usr/bin/python3.10', 'os': 'posix', 'path': ['/home/raffi/.local/share/virtualenv/unzip/20.13.0', '/data/programs/pycharm-community-2022.1.3/plugins/python-ce/helpers', '/usr/lib/python310.zip', '/usr/lib/python3.10', '/usr/lib/python3.10/lib-dynload', '/usr/local/lib/python3.10/dist-packages', '/usr/lib/python3/dist-packages'], 'platform': 'linux', 'prefix': '/usr', 'real_prefix': None, 'stdout_encoding': 'utf-8', 'sysconfig': {'makefile_filename': '/usr/lib/python3.10/config-3.10-x86_64-linux-gnu/Makefile'}, 'sysconfig_paths': {'data': '{base}/local', 'include': '{installed_base}/include/python{py_version_short}{abiflags}', 'platlib': '{platbase}/local/lib/python{py_version_short}/dist-packages', 'platstdlib': '{platbase}/lib/python{py_version_short}', 'purelib': '{base}/local/lib/python{py_version_short}/dist-packages', 'scripts': '{base}/local/bin', 'stdlib': '{installed_base}/lib/python{py_version_short}'}, 'sysconfig_scheme': None, 'sysconfig_vars': {'PYTHONFRAMEWORK': '', 'abiflags': '', 'base': '/usr', 'installed_base': '/usr', 'platbase': '/usr', 'py_version_short': '3.10'}, 'system_executable': '/usr/bin/python3.10', 'system_stdlib': '/usr/lib/python3.10', 'system_stdlib_platform': '/usr/lib/python3.10', 'version': '3.10.6 (main, Nov 2 2022, 18:53:38) [GCC 11.3.0]', 'version_info': VersionInfo(major=3, minor=10, micro=6, releaselevel='final', serial=0)}) [virtualenv/discovery/py_info.py:435]
check_tox-installation: 1790 I proposed PythonInfo(spec=CPython3.10.6.final.0-64, system=/usr/bin/python3.10, exe=/data/code/reasonal/sandbox/venv_check_tox/bin/python3, platform=linux, version='3.10.6 (main, Nov 2 2022, 18:53:38) [GCC 11.3.0]', encoding_fs_io=utf-8-utf-8) [virtualenv/discovery/builtin.py:63]
check_tox-installation: 1790 D accepted PythonInfo(spec=CPython3.10.6.final.0-64, system=/usr/bin/python3.10, exe=/data/code/reasonal/sandbox/venv_check_tox/bin/python3, platform=linux, version='3.10.6 (main, Nov 2 2022, 18:53:38) [GCC 11.3.0]', encoding_fs_io=utf-8-utf-8) [virtualenv/discovery/builtin.py:65]
check_tox-installation: 1793 I create virtual environment via CPython3Posix(dest=/data/code/sandbox/tox_example/.tox/check_tox-installation, clear=False, no_vcs_ignore=False, global=False) [virtualenv/run/session.py:48]
check_tox-installation: 1793 D create folder /data/code/sandbox/tox_example/.tox/check_tox-installation/bin [virtualenv/util/path/_sync.py:9]
check_tox-installation: 1794 D create folder /data/code/sandbox/tox_example/.tox/check_tox-installation/lib/python3.10/site-packages [virtualenv/util/path/_sync.py:9]
check_tox-installation: 1794 D write /data/code/sandbox/tox_example/.tox/check_tox-installation/pyvenv.cfg [virtualenv/create/pyenv_cfg.py:30]
check_tox-installation: 1794 D home = /usr/bin [virtualenv/create/pyenv_cfg.py:34]
check_tox-installation: 1794 D implementation = CPython [virtualenv/create/pyenv_cfg.py:34]
check_tox-installation: 1794 D version_info = 3.10.6.final.0 [virtualenv/create/pyenv_cfg.py:34]
check_tox-installation: 1794 D virtualenv = 20.17.1 [virtualenv/create/pyenv_cfg.py:34]
check_tox-installation: 1794 D include-system-site-packages = false [virtualenv/create/pyenv_cfg.py:34]
check_tox-installation: 1794 D base-prefix = /usr [virtualenv/create/pyenv_cfg.py:34]
check_tox-installation: 1795 D base-exec-prefix = /usr [virtualenv/create/pyenv_cfg.py:34]
check_tox-installation: 1795 D base-executable = /usr/bin/python3.10 [virtualenv/create/pyenv_cfg.py:34]
check_tox-installation: 1795 D symlink /usr/bin/python3.10 to /data/code/sandbox/tox_example/.tox/check_tox-installation/bin/python [virtualenv/util/path/_sync.py:28]
check_tox-installation: 1796 D create virtualenv import hook file /data/code/sandbox/tox_example/.tox/check_tox-installation/lib/python3.10/site-packages/_virtualenv.pth [virtualenv/create/via_global_ref/api.py:89]
check_tox-installation: 1796 D create /data/code/sandbox/tox_example/.tox/check_tox-installation/lib/python3.10/site-packages/_virtualenv.py [virtualenv/create/via_global_ref/api.py:92]
check_tox-installation: 1796 D ============================== target debug ============================== [virtualenv/run/session.py:50]
check_tox-installation: 1797 D debug via /data/code/sandbox/tox_example/.tox/check_tox-installation/bin/python /data/code/reasonal/sandbox/venv_check_tox/lib/python3.10/site-packages/virtualenv/create/debug.py [virtualenv/create/creator.py:197]
check_tox-installation: 1796 D {
"sys": {
"executable": "/data/code/sandbox/tox_example/.tox/check_tox-installation/bin/python",
"_base_executable": "/data/code/sandbox/tox_example/.tox/check_tox-installation/bin/python",
"prefix": "/data/code/sandbox/tox_example/.tox/check_tox-installation",
"base_prefix": "/usr",
"real_prefix": null,
"exec_prefix": "/data/code/sandbox/tox_example/.tox/check_tox-installation",
"base_exec_prefix": "/usr",
"path": [
"/usr/lib/python310.zip",
"/usr/lib/python3.10",
"/usr/lib/python3.10/lib-dynload",
"/data/code/sandbox/tox_example/.tox/check_tox-installation/lib/python3.10/site-packages"
],
"meta_path": [
"<class '_virtualenv._Finder'>",
"<class '_frozen_importlib.BuiltinImporter'>",
"<class '_frozen_importlib.FrozenImporter'>",
"<class '_frozen_importlib_external.PathFinder'>"
],
"fs_encoding": "utf-8",
"io_encoding": "utf-8"
},
"version": "3.10.6 (main, Nov 2 2022, 18:53:38) [GCC 11.3.0]",
"makefile_filename": "/usr/lib/python3.10/config-3.10-x86_64-linux-gnu/Makefile",
"os": "<module 'os' from '/usr/lib/python3.10/os.py'>",
"site": "<module 'site' from '/usr/lib/python3.10/site.py'>",
"datetime": "<module 'datetime' from '/usr/lib/python3.10/datetime.py'>",
"math": "<module 'math' (built-in)>",
"json": "<module 'json' from '/usr/lib/python3.10/json/__init__.py'>"
} [virtualenv/run/session.py:51]
check_tox-installation: 1830 I add seed packages via FromAppData(download=False, pip=bundle, setuptools=bundle, wheel=bundle, via=copy, app_data_dir=/home/raffi/.local/share/virtualenv) [virtualenv/run/session.py:55]
check_tox-installation: 1834 D got embed update of distribution wheel from /home/raffi/.local/share/virtualenv/wheel/3.10/embed/3/wheel.json [virtualenv/app_data/via_disk_folder.py:129]
check_tox-installation: 1834 D got embed update of distribution pip from /home/raffi/.local/share/virtualenv/wheel/3.10/embed/3/pip.json [virtualenv/app_data/via_disk_folder.py:129]
check_tox-installation: 1834 D got embed update of distribution setuptools from /home/raffi/.local/share/virtualenv/wheel/3.10/embed/3/setuptools.json [virtualenv/app_data/via_disk_folder.py:129]
check_tox-installation: 1835 D install wheel from wheel /data/code/reasonal/sandbox/venv_check_tox/lib/python3.10/site-packages/virtualenv/seed/wheels/embed/wheel-0.38.4-py3-none-any.whl via CopyPipInstall [virtualenv/seed/embed/via_app_data/via_app_data.py:47]
check_tox-installation: 1835 D install pip from wheel /data/code/reasonal/sandbox/venv_check_tox/lib/python3.10/site-packages/virtualenv/seed/wheels/embed/pip-22.3.1-py3-none-any.whl via CopyPipInstall [virtualenv/seed/embed/via_app_data/via_app_data.py:47]
check_tox-installation: 1836 D Attempting to acquire lock 139710168493424 on /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/pip-22.3.1-py3-none-any.lock [filelock/_api.py:172]
check_tox-installation: 1836 D install setuptools from wheel /data/code/reasonal/sandbox/venv_check_tox/lib/python3.10/site-packages/virtualenv/seed/wheels/embed/setuptools-65.6.3-py3-none-any.whl via CopyPipInstall [virtualenv/seed/embed/via_app_data/via_app_data.py:47]
check_tox-installation: 1837 D Attempting to acquire lock 139710168494336 on /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/wheel-0.38.4-py3-none-any.lock [filelock/_api.py:172]
check_tox-installation: 1837 D Lock 139710168493424 acquired on /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/pip-22.3.1-py3-none-any.lock [filelock/_api.py:176]
check_tox-installation: 1837 D Lock 139710168494336 acquired on /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/wheel-0.38.4-py3-none-any.lock [filelock/_api.py:176]
check_tox-installation: 1837 D Attempting to release lock 139710168493424 on /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/pip-22.3.1-py3-none-any.lock [filelock/_api.py:209]
check_tox-installation: 1837 D Lock 139710168493424 released on /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/pip-22.3.1-py3-none-any.lock [filelock/_api.py:212]
check_tox-installation: 1837 D Attempting to acquire lock 139710168494768 on /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/setuptools-65.6.3-py3-none-any.lock [filelock/_api.py:172]
check_tox-installation: 1837 D Attempting to release lock 139710168494336 on /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/wheel-0.38.4-py3-none-any.lock [filelock/_api.py:209]
check_tox-installation: 1838 D Lock 139710168494768 acquired on /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/setuptools-65.6.3-py3-none-any.lock [filelock/_api.py:176]
check_tox-installation: 1838 D Lock 139710168494336 released on /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/wheel-0.38.4-py3-none-any.lock [filelock/_api.py:212]
check_tox-installation: 1838 D copy /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/pip-22.3.1-py3-none-any/pip-22.3.1.virtualenv to /data/code/sandbox/tox_example/.tox/check_tox-installation/lib/python3.10/site-packages/pip-22.3.1.virtualenv [virtualenv/util/path/_sync.py:36]
check_tox-installation: 1838 D Attempting to release lock 139710168494768 on /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/setuptools-65.6.3-py3-none-any.lock [filelock/_api.py:209]
check_tox-installation: 1838 D Lock 139710168494768 released on /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/setuptools-65.6.3-py3-none-any.lock [filelock/_api.py:212]
check_tox-installation: 1839 D copy directory /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/wheel-0.38.4-py3-none-any/wheel to /data/code/sandbox/tox_example/.tox/check_tox-installation/lib/python3.10/site-packages/wheel [virtualenv/util/path/_sync.py:36]
check_tox-installation: 1839 D copy directory /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/pip-22.3.1-py3-none-any/pip-22.3.1.dist-info to /data/code/sandbox/tox_example/.tox/check_tox-installation/lib/python3.10/site-packages/pip-22.3.1.dist-info [virtualenv/util/path/_sync.py:36]
check_tox-installation: 1840 D copy directory /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/setuptools-65.6.3-py3-none-any/_distutils_hack to /data/code/sandbox/tox_example/.tox/check_tox-installation/lib/python3.10/site-packages/_distutils_hack [virtualenv/util/path/_sync.py:36]
check_tox-installation: 1841 D copy directory /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/setuptools-65.6.3-py3-none-any/setuptools to /data/code/sandbox/tox_example/.tox/check_tox-installation/lib/python3.10/site-packages/setuptools [virtualenv/util/path/_sync.py:36]
check_tox-installation: 1842 D copy directory /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/pip-22.3.1-py3-none-any/pip to /data/code/sandbox/tox_example/.tox/check_tox-installation/lib/python3.10/site-packages/pip [virtualenv/util/path/_sync.py:36]
check_tox-installation: 1848 D copy directory /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/wheel-0.38.4-py3-none-any/wheel-0.38.4.dist-info to /data/code/sandbox/tox_example/.tox/check_tox-installation/lib/python3.10/site-packages/wheel-0.38.4.dist-info [virtualenv/util/path/_sync.py:36]
check_tox-installation: 1851 D copy /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/wheel-0.38.4-py3-none-any/wheel-0.38.4.virtualenv to /data/code/sandbox/tox_example/.tox/check_tox-installation/lib/python3.10/site-packages/wheel-0.38.4.virtualenv [virtualenv/util/path/_sync.py:36]
check_tox-installation: 1853 D changing mode of /data/code/sandbox/tox_example/.tox/check_tox-installation/bin/wheel-3.10 to 775 [distlib/util.py:572]
check_tox-installation: 1853 D changing mode of /data/code/sandbox/tox_example/.tox/check_tox-installation/bin/wheel3 to 775 [distlib/util.py:572]
check_tox-installation: 1854 D changing mode of /data/code/sandbox/tox_example/.tox/check_tox-installation/bin/wheel to 775 [distlib/util.py:572]
check_tox-installation: 1854 D changing mode of /data/code/sandbox/tox_example/.tox/check_tox-installation/bin/wheel3.10 to 775 [distlib/util.py:572]
check_tox-installation: 1854 D generated console scripts wheel3.10 wheel wheel-3.10 wheel3 [virtualenv/seed/embed/via_app_data/pip_install/base.py:41]
check_tox-installation: 1903 D copy directory /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/setuptools-65.6.3-py3-none-any/pkg_resources to /data/code/sandbox/tox_example/.tox/check_tox-installation/lib/python3.10/site-packages/pkg_resources [virtualenv/util/path/_sync.py:36]
check_tox-installation: 1916 D copy /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/setuptools-65.6.3-py3-none-any/distutils-precedence.pth to /data/code/sandbox/tox_example/.tox/check_tox-installation/lib/python3.10/site-packages/distutils-precedence.pth [virtualenv/util/path/_sync.py:36]
check_tox-installation: 1917 D copy directory /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/setuptools-65.6.3-py3-none-any/setuptools-65.6.3.dist-info to /data/code/sandbox/tox_example/.tox/check_tox-installation/lib/python3.10/site-packages/setuptools-65.6.3.dist-info [virtualenv/util/path/_sync.py:36]
check_tox-installation: 1919 D copy /home/raffi/.local/share/virtualenv/wheel/3.10/image/1/CopyPipInstall/setuptools-65.6.3-py3-none-any/setuptools-65.6.3.virtualenv to /data/code/sandbox/tox_example/.tox/check_tox-installation/lib/python3.10/site-packages/setuptools-65.6.3.virtualenv [virtualenv/util/path/_sync.py:36]
check_tox-installation: 1920 D generated console scripts [virtualenv/seed/embed/via_app_data/pip_install/base.py:41]
check_tox-installation: 1952 D changing mode of /data/code/sandbox/tox_example/.tox/check_tox-installation/bin/pip to 775 [distlib/util.py:572]
check_tox-installation: 1953 D changing mode of /data/code/sandbox/tox_example/.tox/check_tox-installation/bin/pip3 to 775 [distlib/util.py:572]
check_tox-installation: 1953 D changing mode of /data/code/sandbox/tox_example/.tox/check_tox-installation/bin/pip3.10 to 775 [distlib/util.py:572]
check_tox-installation: 1953 D changing mode of /data/code/sandbox/tox_example/.tox/check_tox-installation/bin/pip-3.10 to 775 [distlib/util.py:572]
check_tox-installation: 1953 D generated console scripts pip3.10 pip-3.10 pip3 pip [virtualenv/seed/embed/via_app_data/pip_install/base.py:41]
check_tox-installation: 1953 I add activators for Bash, CShell, Fish, Nushell, PowerShell, Python [virtualenv/run/session.py:61]
check_tox-installation: 1955 D write /data/code/sandbox/tox_example/.tox/check_tox-installation/pyvenv.cfg [virtualenv/create/pyenv_cfg.py:30]
check_tox-installation: 1955 D home = /usr/bin [virtualenv/create/pyenv_cfg.py:34]
check_tox-installation: 1955 D implementation = CPython [virtualenv/create/pyenv_cfg.py:34]
check_tox-installation: 1955 D version_info = 3.10.6.final.0 [virtualenv/create/pyenv_cfg.py:34]
check_tox-installation: 1955 D virtualenv = 20.17.1 [virtualenv/create/pyenv_cfg.py:34]
check_tox-installation: 1955 D include-system-site-packages = false [virtualenv/create/pyenv_cfg.py:34]
check_tox-installation: 1955 D base-prefix = /usr [virtualenv/create/pyenv_cfg.py:34]
check_tox-installation: 1955 D base-exec-prefix = /usr [virtualenv/create/pyenv_cfg.py:34]
check_tox-installation: 1955 D base-executable = /usr/bin/python3.10 [virtualenv/create/pyenv_cfg.py:34]
check_tox-installation: 1957 W commands[0] /data/code/sandbox/tox_example/check_tox
/data/code/sandbox/tox_example/check_tox> python -c 'import os;print(os.listdir('"'"'.'"'"'))' [tox/tox_env/api.py:398]
[]
check_tox-installation: 1983 I exit 0 (0.02 seconds) /data/code/sandbox/tox_example/check_tox
/data/code/sandbox/tox_example/check_tox> python -c 'import os;print(os.listdir('"'"'.'"'"'))' pid=230433 [tox/execute/api.py:275]
check_tox: OK (1.59=setup[1.57]+cmd[0.02] seconds)
check_tox-installation: OK (0.20=setup[0.17]+cmd[0.02] seconds)
congratulations :) (1.84 seconds)
```
## Minimal example
If possible, provide a minimal reproducer for the issue:
```console
[tox]
minversion = 4
env_list =
check_tox
check_tox{,-installation}
skipsdist = true
[default]
deps =
unittest-xml-reporting==3.0.4
changedir =
check_tox{,-installation}: {toxinidir}/check_tox
commands_pre =
# nothing
commands =
python -m xmlrunner discover -vvv
setenv =
#
[testenv:check_tox]
skip_install = true
changedir = {[default]changedir}
deps =
{[default]deps}
commands_pre =
{[default]commands_pre}
commands =
python -c "import os;print(os.listdir('.'))"
setenv =
{[default]setenv}
[testenv:check_tox-installation]
skip_install = true
changedir = {[default]changedir}
commands =
python -c "import os;print(os.listdir('.'))"
```
Prior to running `tox`:
```bash
mkdir check_tox
touch check_tox/test_tox
```
The output of the `tox` run looks correct, but it is not:
* the environment `check_tox` properly changes into the folder `check_tox` and prints that folder's content, which is `test_tox`
* the environment `check_tox-installation` prints an empty list instead
Checking the project folder, we have this:
```bash
> ls -la
total 60
drwxrwxr-x 5 raffi raffi 4096 Dec 8 20:31 .
drwxr-xr-x 38 raffi raffi 4096 Dec 8 20:24 ..
drwxrwxr-x 2 raffi raffi 4096 Dec 8 20:28 check_tox
drwxrwxr-x 3 raffi raffi 4096 Dec 8 20:30 'check_tox'$'\n'
-rw-rw-r-- 1 raffi raffi 32983 Dec 8 20:31 output.txt
drwxrwxr-x 4 raffi raffi 4096 Dec 8 20:31 .tox
-rw-rw-r-- 1 raffi raffi 794 Dec 8 20:30 tox_bug.ini
```
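As an aside (not part of the original report): a directory whose name carries a trailing newline, like the one in the listing above, can be removed from bash using ANSI-C quoting. A minimal sketch:

```shell
# recreate a directory whose name ends in a newline, as in the listing above
mkdir -p check_tox$'\n'

# $'...' is bash's ANSI-C quoting: \n expands to a literal newline,
# so the trailing newline becomes part of the argument and matches the odd name
rm -rf check_tox$'\n'
```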
As you can see above, a folder `'check_tox'$'\n'` (that I cannot delete from the terminal :) ) is being created. I believe the environment `check_tox-installation` creates it, which I can confirm by running `tox -e check_tox-installation` on its own. This could also explain why my real project installation is failing with a message like
```bash
/home/runner/work/repo/data_service> pip install '.[vendor]'
dataservice-installation: FAIL code 2 (0.04=setup[0.04]+cmd[0.00] seconds)
evaluation failed :( (0.10 seconds)
Error: Process completed with exit code 2.
```
as it thinks that the folder `data_service` does not exist. | 0easy
|
Title: Add the missing docstrings to the `endpoint_node.py` file
Body: Add the missing docstrings to the [endpoint_node.py](https://github.com/scanapi/scanapi/blob/main/scanapi/tree/endpoint_node.py) file.
[Here](https://github.com/scanapi/scanapi/wiki/First-Pull-Request#7-make-your-changes) you can find instructions on how we create the [docstrings](https://www.python.org/dev/peps/pep-0257/#what-is-a-docstring).
Child of https://github.com/scanapi/scanapi/issues/411 | 0easy
|
Title: Consider setting a fixed minimum Rust version
Body: When I use the procedure in the README to build pydantic-core, the build fails. The issue can be reproduced in the following way:
```bash
git clone -b v2.16.1 https://github.com/pydantic/pydantic-core.git
cd pydantic-core
python3.11 -m venv env
source env/bin/activate
make install
```
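Regarding the title's request, for reference: Cargo supports declaring a minimum supported Rust version through the `rust-version` field of `Cargo.toml`, so builds with an older toolchain fail with a clear message instead of an opaque compile error. A sketch of such a pin; the version number below is illustrative, not pydantic-core's actual MSRV:

```toml
[package]
name = "pydantic-core"
# illustrative value only; the real minimum would need to be determined (e.g. from CI)
rust-version = "1.70"
```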
<details>
<summary>It results in this output</summary>
<pre>
+ git clone -b v2.16.1 https://github.com/pydantic/pydantic-core.git
Cloning into 'pydantic-core'...
Note: switching to '4538190f0e7a47a99ca44351f744007b016511d4'.
You are in 'detached HEAD' state. You can look around, make experimental
changes and commit them, and you can discard any commits you make in this
state without impacting any branches by switching back to a branch.
If you want to create a new branch to retain commits you create, you may
do so (now or later) by using -c with the switch command. Example:
git switch -c <new-branch-name>
Or undo this operation with:
git switch -
Turn off this advice by setting config variable advice.detachedHead to false
+ cd pydantic-core
+ python3.11 -m venv env
+ source env/bin/activate
++ deactivate nondestructive
++ '[' -n '' ']'
++ '[' -n '' ']'
++ '[' -n /usr/bin/bash -o -n '' ']'
++ hash -r
++ '[' -n '' ']'
++ unset VIRTUAL_ENV
++ unset VIRTUAL_ENV_PROMPT
++ '[' '!' nondestructive = nondestructive ']'
++ VIRTUAL_ENV=/tmp/pydantic-core/env
++ export VIRTUAL_ENV
++ _OLD_VIRTUAL_PATH=/home/arnaud/.local/bin:/home/arnaud/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin
++ PATH=/tmp/pydantic-core/env/bin:/home/arnaud/.local/bin:/home/arnaud/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin
++ export PATH
++ '[' -n '' ']'
++ '[' -z '' ']'
++ _OLD_VIRTUAL_PS1=
++ PS1='(env) '
++ export PS1
++ VIRTUAL_ENV_PROMPT='(env) '
++ export VIRTUAL_ENV_PROMPT
++ '[' -n /usr/bin/bash -o -n '' ']'
++ hash -r
+ make install
/usr/bin/which: no maturin in (/tmp/pydantic-core/env/bin:/home/arnaud/.local/bin:/home/arnaud/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin)
/usr/bin/which: no maturin in (/tmp/pydantic-core/env/bin:/home/arnaud/.local/bin:/home/arnaud/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin)
/usr/bin/which: no maturin in (/tmp/pydantic-core/env/bin:/home/arnaud/.local/bin:/home/arnaud/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin)
/usr/bin/which: no maturin in (/tmp/pydantic-core/env/bin:/home/arnaud/.local/bin:/home/arnaud/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin)
/usr/bin/which: no maturin in (/tmp/pydantic-core/env/bin:/home/arnaud/.local/bin:/home/arnaud/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin)
/usr/bin/which: no maturin in (/tmp/pydantic-core/env/bin:/home/arnaud/.local/bin:/home/arnaud/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin)
pip install -U pip wheel pre-commit
Looking in indexes: https://artifacts.internal.inmanta.com/inmanta/dev
Requirement already satisfied: pip in ./env/lib64/python3.11/site-packages (22.3.1)
Collecting pip
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/505/2d7889c1f9d05/pip-23.3.2-py3-none-any.whl (2.1 MB)
Collecting wheel
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/177/f9c9b0d45c478/wheel-0.42.0-py3-none-any.whl (65 kB)
Collecting pre-commit
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/c25/5039ef399049a/pre_commit-3.6.0-py2.py3-none-any.whl (204 kB)
Collecting cfgv>=2.0.0
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/b72/65b1f29fd3316/cfgv-3.4.0-py2.py3-none-any.whl (7.2 kB)
Collecting identify>=1.0.0
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/d40/ce5fcd7628176/identify-2.5.33-py2.py3-none-any.whl (98 kB)
Collecting nodeenv>=0.11.1
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/df8/65724bb3c3adc/nodeenv-1.8.0-py2.py3-none-any.whl (22 kB)
Collecting pyyaml>=5.1
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/d2b/04aac4d386b17/PyYAML-6.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (757 kB)
Collecting virtualenv>=20.10.0
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/423/8949c5ffe6876/virtualenv-20.25.0-py3-none-any.whl (3.8 MB)
Requirement already satisfied: setuptools in ./env/lib64/python3.11/site-packages (from nodeenv>=0.11.1->pre-commit) (65.5.1)
Collecting distlib<1,>=0.3.7
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/034/db59a0b96f8ca/distlib-0.3.8-py2.py3-none-any.whl (468 kB)
Collecting filelock<4,>=3.12.2
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/57d/bda9b35157b05/filelock-3.13.1-py3-none-any.whl (11 kB)
Collecting platformdirs<5,>=3.9.1
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/061/4df2a2f37e1a6/platformdirs-4.2.0-py3-none-any.whl (17 kB)
Installing collected packages: distlib, wheel, pyyaml, platformdirs, pip, nodeenv, identify, filelock, cfgv, virtualenv, pre-commit
Attempting uninstall: pip
Found existing installation: pip 22.3.1
Uninstalling pip-22.3.1:
Successfully uninstalled pip-22.3.1
Successfully installed cfgv-3.4.0 distlib-0.3.8 filelock-3.13.1 identify-2.5.33 nodeenv-1.8.0 pip-23.3.2 platformdirs-4.2.0 pre-commit-3.6.0 pyyaml-6.0.1 virtualenv-20.25.0 wheel-0.42.0
pip install -r tests/requirements.txt
Looking in indexes: https://artifacts.internal.inmanta.com/inmanta/dev
Collecting git+https://github.com/dateutil/dateutil.git@f2293200747fb03d56c6c5997bfebeabe703576f (from -r tests/requirements.txt (line 6))
Cloning https://github.com/dateutil/dateutil.git (to revision f2293200747fb03d56c6c5997bfebeabe703576f) to /tmp/pip-req-build-q77eh6k6
Running command git clone --filter=blob:none --quiet https://github.com/dateutil/dateutil.git /tmp/pip-req-build-q77eh6k6
Running command git rev-parse -q --verify 'sha^f2293200747fb03d56c6c5997bfebeabe703576f'
Running command git fetch -q https://github.com/dateutil/dateutil.git f2293200747fb03d56c6c5997bfebeabe703576f
Running command git checkout -q f2293200747fb03d56c6c5997bfebeabe703576f
Resolved https://github.com/dateutil/dateutil.git to commit f2293200747fb03d56c6c5997bfebeabe703576f
Installing build dependencies: started
Installing build dependencies: finished with status 'done'
Getting requirements to build wheel: started
Getting requirements to build wheel: finished with status 'done'
Preparing metadata (pyproject.toml): started
Preparing metadata (pyproject.toml): finished with status 'done'
Ignoring exceptiongroup: markers 'python_version < "3.11"' don't match your environment
Collecting coverage==7.4.0 (from -r tests/requirements.txt (line 1))
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/485/e9f897cf4856a/coverage-7.4.0-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (237 kB)
Collecting dirty-equals==0.7.1.post0 (from -r tests/requirements.txt (line 2))
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/7fb/9217ea7cd04c0/dirty_equals-0.7.1.post0-py3-none-any.whl (27 kB)
Collecting hypothesis==6.92.9 (from -r tests/requirements.txt (line 3))
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/8c1/ab9f3c883fe63/hypothesis-6.92.9-py3-none-any.whl (431 kB)
Collecting pandas==2.1.3 (from -r tests/requirements.txt (line 8))
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/d5d/ed6ff28abbf0e/pandas-2.1.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (12.2 MB)
Collecting pytest==7.4.4 (from -r tests/requirements.txt (line 9))
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/b09/0cdf5ed60bf4c/pytest-7.4.4-py3-none-any.whl (325 kB)
Collecting pytest-codspeed~=2.2.0 (from -r tests/requirements.txt (line 11))
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/5da/48b842fc46592/pytest_codspeed-2.2.0-py3-none-any.whl (10 kB)
Collecting pytest-examples==0.0.10 (from -r tests/requirements.txt (line 14))
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/3d0/b52424e454846/pytest_examples-0.0.10-py3-none-any.whl (17 kB)
Collecting pytest-speed==0.3.5 (from -r tests/requirements.txt (line 15))
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/ef9/f17d6800b158e/pytest_speed-0.3.5-py3-none-any.whl (9.7 kB)
Collecting pytest-mock==3.11.1 (from -r tests/requirements.txt (line 16))
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/21c/279fff83d7076/pytest_mock-3.11.1-py3-none-any.whl (9.6 kB)
Collecting pytest-pretty==1.2.0 (from -r tests/requirements.txt (line 17))
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/6f7/9122bf53864ae/pytest_pretty-1.2.0-py3-none-any.whl (6.2 kB)
Collecting pytest-timeout==2.2.0 (from -r tests/requirements.txt (line 18))
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/bde/531e096466f49/pytest_timeout-2.2.0-py3-none-any.whl (13 kB)
Collecting pytz==2023.3.post1 (from -r tests/requirements.txt (line 19))
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/ce4/2d816b81b6850/pytz-2023.3.post1-py2.py3-none-any.whl (502 kB)
Collecting numpy==1.26.2 (from -r tests/requirements.txt (line 21))
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/96c/a5482c3dbdd05/numpy-1.26.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (18.2 MB)
Collecting attrs>=22.2.0 (from hypothesis==6.92.9->-r tests/requirements.txt (line 3))
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/99b/87a485a5820b2/attrs-23.2.0-py3-none-any.whl (60 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis==6.92.9->-r tests/requirements.txt (line 3))
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/a16/3dcaede0f1c02/sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting tzdata>=2022.1 (from pandas==2.1.3->-r tests/requirements.txt (line 8))
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/aa3/ace4329eeacda/tzdata-2023.4-py2.py3-none-any.whl (346 kB)
Collecting iniconfig (from pytest==7.4.4->-r tests/requirements.txt (line 9))
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/b6a/85871a79d2e3b/iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Collecting packaging (from pytest==7.4.4->-r tests/requirements.txt (line 9))
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/8c4/91190033a9af7/packaging-23.2-py3-none-any.whl (53 kB)
Collecting pluggy<2.0,>=0.12 (from pytest==7.4.4->-r tests/requirements.txt (line 9))
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/7db/9f7b503d67d1c/pluggy-1.4.0-py3-none-any.whl (20 kB)
Collecting black>=23 (from pytest-examples==0.0.10->-r tests/requirements.txt (line 14))
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/b3d/64db762eae4a5/black-24.1.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.7 MB)
Collecting ruff>=0.0.258 (from pytest-examples==0.0.10->-r tests/requirements.txt (line 14))
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/b17/b93c02cdb6aeb/ruff-0.1.15-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (7.5 MB)
Collecting click>=7 (from pytest-speed==0.3.5->-r tests/requirements.txt (line 15))
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/ae7/4fb96c20a0277/click-8.1.7-py3-none-any.whl (97 kB)
Collecting rich>=12 (from pytest-speed==0.3.5->-r tests/requirements.txt (line 15))
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/6da/14c108c4866ee/rich-13.7.0-py3-none-any.whl (240 kB)
Collecting six>=1.5 (from python-dateutil==2.8.3.dev33+gf229320->-r tests/requirements.txt (line 6))
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/8ab/b2f1d86890a2d/six-1.16.0-py2.py3-none-any.whl (11 kB)
Collecting cffi~=1.15.1 (from pytest-codspeed~=2.2.0->-r tests/requirements.txt (line 11))
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/944/11f22c3985aca/cffi-1.15.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (462 kB)
Collecting filelock~=3.12.2 (from pytest-codspeed~=2.2.0->-r tests/requirements.txt (line 11))
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/08c/21d87ded6e2b9/filelock-3.12.4-py3-none-any.whl (11 kB)
Collecting mypy-extensions>=0.4.3 (from black>=23->pytest-examples==0.0.10->-r tests/requirements.txt (line 14))
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/439/2f6c0eb8a5668/mypy_extensions-1.0.0-py3-none-any.whl (4.7 kB)
Collecting pathspec>=0.9.0 (from black>=23->pytest-examples==0.0.10->-r tests/requirements.txt (line 14))
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/a0d/503e138a4c123/pathspec-0.12.1-py3-none-any.whl (31 kB)
Requirement already satisfied: platformdirs>=2 in ./env/lib64/python3.11/site-packages (from black>=23->pytest-examples==0.0.10->-r tests/requirements.txt (line 14)) (4.2.0)
Collecting pycparser (from cffi~=1.15.1->pytest-codspeed~=2.2.0->-r tests/requirements.txt (line 11))
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/8ee/45429555515e1/pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting markdown-it-py>=2.2.0 (from rich>=12->pytest-speed==0.3.5->-r tests/requirements.txt (line 15))
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/355/216845c60bd96/markdown_it_py-3.0.0-py3-none-any.whl (87 kB)
Collecting pygments<3.0.0,>=2.13.0 (from rich>=12->pytest-speed==0.3.5->-r tests/requirements.txt (line 15))
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/b27/c2826c47d0f32/pygments-2.17.2-py3-none-any.whl (1.2 MB)
Collecting mdurl~=0.1 (from markdown-it-py>=2.2.0->rich>=12->pytest-speed==0.3.5->-r tests/requirements.txt (line 15))
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/840/08a41e51615a4/mdurl-0.1.2-py3-none-any.whl (10.0 kB)
Building wheels for collected packages: python-dateutil
Building wheel for python-dateutil (pyproject.toml): started
Building wheel for python-dateutil (pyproject.toml): finished with status 'done'
Created wheel for python-dateutil: filename=python_dateutil-2.8.3.dev33+gf229320-py2.py3-none-any.whl size=78541 sha256=d7e0b776f15454f56276ca921cc5d8d5e1fb95456eaf891b59906d007a55b118
Stored in directory: /home/arnaud/.cache/pip/wheels/c8/bf/d9/05659cade1e93b6b2474d6824f963c0dfd9a402e3561b5686d
Successfully built python-dateutil
Installing collected packages: sortedcontainers, pytz, tzdata, six, ruff, pygments, pycparser, pluggy, pathspec, packaging, numpy, mypy-extensions, mdurl, iniconfig, filelock, dirty-equals, coverage, click, attrs, python-dateutil, pytest, markdown-it-py, hypothesis, cffi, black, rich, pytest-timeout, pytest-mock, pytest-examples, pytest-codspeed, pandas, pytest-speed, pytest-pretty
Attempting uninstall: filelock
Found existing installation: filelock 3.13.1
Uninstalling filelock-3.13.1:
Successfully uninstalled filelock-3.13.1
Successfully installed attrs-23.2.0 black-24.1.1 cffi-1.15.1 click-8.1.7 coverage-7.4.0 dirty-equals-0.7.1.post0 filelock-3.12.4 hypothesis-6.92.9 iniconfig-2.0.0 markdown-it-py-3.0.0 mdurl-0.1.2 mypy-extensions-1.0.0 numpy-1.26.2 packaging-23.2 pandas-2.1.3 pathspec-0.12.1 pluggy-1.4.0 pycparser-2.21 pygments-2.17.2 pytest-7.4.4 pytest-codspeed-2.2.0 pytest-examples-0.0.10 pytest-mock-3.11.1 pytest-pretty-1.2.0 pytest-speed-0.3.5 pytest-timeout-2.2.0 python-dateutil-2.8.3.dev33+gf229320 pytz-2023.3.post1 rich-13.7.0 ruff-0.1.15 six-1.16.0 sortedcontainers-2.4.0 tzdata-2023.4
pip install -r tests/requirements-linting.txt
Looking in indexes: https://artifacts.internal.inmanta.com/inmanta/dev
Collecting griffe==0.38.1 (from -r tests/requirements-linting.txt (line 1))
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/334/c79d3b5964ade/griffe-0.38.1-py3-none-any.whl (1.0 MB)
Collecting pyright==1.1.345 (from -r tests/requirements-linting.txt (line 2))
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/008/91361baf58698/pyright-1.1.345-py3-none-any.whl (18 kB)
Collecting ruff==0.1.13 (from -r tests/requirements-linting.txt (line 3))
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/9a1/600942485c6e6/ruff-0.1.13-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (7.5 MB)
Collecting mypy==1.8.0 (from -r tests/requirements-linting.txt (line 4))
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/d19/c413b3c07cbec/mypy-1.8.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (12.4 MB)
Collecting colorama>=0.4 (from griffe==0.38.1->-r tests/requirements-linting.txt (line 1))
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/4f1/d9991f5acc0ca/colorama-0.4.6-py2.py3-none-any.whl (25 kB)
Requirement already satisfied: nodeenv>=1.6.0 in ./env/lib64/python3.11/site-packages (from pyright==1.1.345->-r tests/requirements-linting.txt (line 2)) (1.8.0)
Collecting typing-extensions>=4.1.0 (from mypy==1.8.0->-r tests/requirements-linting.txt (line 4))
Using cached https://artifacts.internal.inmanta.com/root/pypi/%2Bf/af7/2aea155e91adf/typing_extensions-4.9.0-py3-none-any.whl (32 kB)
Requirement already satisfied: mypy-extensions>=1.0.0 in ./env/lib64/python3.11/site-packages (from mypy==1.8.0->-r tests/requirements-linting.txt (line 4)) (1.0.0)
Requirement already satisfied: setuptools in ./env/lib64/python3.11/site-packages (from nodeenv>=1.6.0->pyright==1.1.345->-r tests/requirements-linting.txt (line 2)) (65.5.1)
Installing collected packages: typing-extensions, ruff, colorama, pyright, mypy, griffe
Attempting uninstall: ruff
Found existing installation: ruff 0.1.15
Uninstalling ruff-0.1.15:
Successfully uninstalled ruff-0.1.15
Successfully installed colorama-0.4.6 griffe-0.38.1 mypy-1.8.0 pyright-1.1.345 ruff-0.1.13 typing-extensions-4.9.0
pip install -e .
Looking in indexes: https://artifacts.internal.inmanta.com/inmanta/dev
Obtaining file:///tmp/pydantic-core
Installing build dependencies: started
Installing build dependencies: finished with status 'done'
Checking if build backend supports build_editable: started
Checking if build backend supports build_editable: finished with status 'done'
Getting requirements to build editable: started
Getting requirements to build editable: finished with status 'done'
Preparing editable metadata (pyproject.toml): started
Preparing editable metadata (pyproject.toml): finished with status 'done'
Requirement already satisfied: typing-extensions!=4.7.0,>=4.6.0 in ./env/lib64/python3.11/site-packages (from pydantic_core==2.16.1) (4.9.0)
Building wheels for collected packages: pydantic_core
Building editable for pydantic_core (pyproject.toml): started
Building editable for pydantic_core (pyproject.toml): finished with status 'error'
error: subprocess-exited-with-error
× Building editable for pydantic_core (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [154 lines of output]
Running `maturin pep517 build-wheel -i /tmp/pydantic-core/env/bin/python3.11 --compatibility off --editable`
📦 Including license file "/tmp/pydantic-core/LICENSE"
🍹 Building a mixed python/rust project
🔗 Found pyo3 bindings
🐍 Found CPython 3.11 at /tmp/pydantic-core/env/bin/python3.11
📡 Using build options features, bindings from pyproject.toml
warning: unused manifest key `lints` (may be supported in a future version)
this Cargo does not support nightly features, but if you
switch to nightly channel you can pass
`-Zlints` to enable this feature.
Compiling autocfg v1.1.0
Compiling proc-macro2 v1.0.76
Compiling target-lexicon v0.12.9
Compiling unicode-ident v1.0.10
Compiling python3-dll-a v0.2.9
Compiling libc v0.2.147
Compiling once_cell v1.18.0
Compiling version_check v0.9.4
Compiling cfg-if v1.0.0
Compiling heck v0.4.1
Compiling static_assertions v1.1.0
Compiling lexical-util v0.8.5
Compiling num-traits v0.2.16
Compiling ahash v0.8.7
Compiling quote v1.0.35
Compiling pyo3-build-config v0.20.2
Compiling syn v2.0.48
Compiling getrandom v0.2.10
Compiling lock_api v0.4.10
Compiling num-integer v0.1.45
Compiling parking_lot_core v0.9.8
Compiling zerocopy v0.7.32
Compiling rustversion v1.0.13
Compiling memoffset v0.9.0
Compiling num-bigint v0.4.4
Compiling scopeguard v1.1.0
Compiling smallvec v1.11.2
Compiling allocator-api2 v0.2.16
Compiling tinyvec_macros v0.1.1
Compiling tinyvec v1.6.0
Compiling hashbrown v0.14.3
Compiling pyo3-ffi v0.20.2
Compiling pyo3 v0.20.2
Compiling lexical-write-integer v0.8.5
Compiling lexical-parse-integer v0.8.6
Compiling serde v1.0.195
Compiling memchr v2.6.3
Compiling lexical-parse-float v0.8.5
Compiling lexical-write-float v0.8.5
Compiling unicode-normalization v0.1.22
Compiling aho-corasick v1.0.2
Compiling parking_lot v0.12.1
Compiling pyo3-macros-backend v0.20.2
Compiling unicode-bidi v0.3.13
Compiling indoc v2.0.4
Compiling equivalent v1.0.1
Compiling serde_json v1.0.109
Compiling percent-encoding v2.3.1
Compiling regex-syntax v0.8.2
Compiling unindent v0.2.3
Compiling form_urlencoded v1.2.1
Compiling indexmap v2.0.0
Compiling idna v0.5.0
Compiling lexical-core v0.8.5
Compiling pydantic-core v2.16.1 (/tmp/pydantic-core)
Compiling ryu v1.0.14
Compiling serde_derive v1.0.195
Compiling strum_macros v0.25.3
Compiling regex-automata v0.4.3
Compiling pyo3-macros v0.20.2
Compiling itoa v1.0.8
Compiling enum_dispatch v0.3.12
Compiling url v2.5.0
Compiling strum v0.25.0
Compiling speedate v0.13.0
Compiling regex v1.10.2
Compiling uuid v1.6.1
Compiling base64 v0.21.7
Compiling jiter v0.0.6
error[E0446]: crate-private type `extra::Extra<'_>` in public interface
--> src/serializers/fields.rs:149:5
|
149 | / pub fn main_to_python<'py>(
150 | | &self,
151 | | py: Python<'py>,
152 | | main_iter: impl Iterator<Item = PyResult<(&'py PyAny, &'py PyAny)>>,
... |
155 | | extra: Extra,
156 | | ) -> PyResult<&'py PyDict> {
| |______________________________^ can't leak crate-private type
|
::: src/serializers/extra.rs:73:1
|
73 | pub(crate) struct Extra<'a> {
| --------------------------- `extra::Extra<'_>` declared as crate-private
error[E0446]: crate-private type `extra::Extra<'_>` in public interface
--> src/serializers/fields.rs:214:5
|
214 | / pub fn main_serde_serialize<'py, S: serde::ser::Serializer>(
215 | | &self,
216 | | main_iter: impl Iterator<Item = PyResult<(&'py PyAny, &'py PyAny)>>,
217 | | expected_len: usize,
... |
221 | | extra: Extra,
222 | | ) -> Result<S::SerializeMap, S::Error> {
| |__________________________________________^ can't leak crate-private type
|
::: src/serializers/extra.rs:73:1
|
73 | pub(crate) struct Extra<'a> {
| --------------------------- `extra::Extra<'_>` declared as crate-private
error[E0446]: crate-private type `extra::Extra<'_>` in public interface
--> src/serializers/fields.rs:260:5
|
260 | / pub fn add_computed_fields_python(
261 | | &self,
262 | | model: Option<&PyAny>,
263 | | output_dict: &PyDict,
... |
266 | | extra: &Extra,
267 | | ) -> PyResult<()> {
| |_____________________^ can't leak crate-private type
|
::: src/serializers/extra.rs:73:1
|
73 | pub(crate) struct Extra<'a> {
| --------------------------- `extra::Extra<'_>` declared as crate-private
error[E0446]: crate-private type `extra::Extra<'_>` in public interface
--> src/serializers/fields.rs:277:5
|
277 | / pub fn add_computed_fields_json<S: serde::ser::Serializer>(
278 | | &self,
279 | | model: Option<&PyAny>,
280 | | map: &mut S::SerializeMap,
... |
283 | | extra: &Extra,
284 | | ) -> Result<(), S::Error> {
| |_____________________________^ can't leak crate-private type
|
::: src/serializers/extra.rs:73:1
|
73 | pub(crate) struct Extra<'a> {
| --------------------------- `extra::Extra<'_>` declared as crate-private
For more information about this error, try `rustc --explain E0446`.
error: could not compile `pydantic-core` (lib) due to 4 previous errors
💥 maturin failed
Caused by: Failed to build a native library through cargo
Caused by: Cargo build finished with "exit status: 101": `env -u CARGO PYO3_ENVIRONMENT_SIGNATURE="cpython-3.11-64bit" PYO3_PYTHON="/tmp/pydantic-core/env/bin/python3.11" PYTHON_SYS_EXECUTABLE="/tmp/pydantic-core/env/bin/python3.11" "cargo" "rustc" "--features" "pyo3/extension-module" "--message-format" "json-render-diagnostics" "--manifest-path" "/tmp/pydantic-core/Cargo.toml" "--release" "--lib" "--crate-type" "cdylib"`
Error: command ['maturin', 'pep517', 'build-wheel', '-i', '/tmp/pydantic-core/env/bin/python3.11', '--compatibility', 'off', '--editable'] returned non-zero exit status 1
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building editable for pydantic_core
Failed to build pydantic_core
ERROR: Could not build wheels for pydantic_core, which is required to install pyproject.toml-based projects
make: *** [Makefile:18: install] Error 1
</pre>
</details>
Could someone provide me with more insight into what is going wrong here?
System details:
* Python version: 3.11.5
* Rust version: 1.71.1 | 0easy
|
Title: OpenAI episodes vs steps
Body: It seems like this framework trains for N steps rather than N episodes.
When I use `gym.wrappers.Monitor()` to encapsulate the results for upload to the OpenAI Gym site, I'm getting a "can't reset a non-done environment" error
```
Error("Tried to reset environment which is not done. While the monitor is active for {}, you cannot call reset() unless the episode is over.".format(self.env_id))
gym.error.Error: Tried to reset environment which is not done. While the monitor is active for LunarLander-v2, you cannot call reset() unless the episode is over.
```
Is there a way with this framework to train based on a number of episodes rather than just a fixed number of steps? Using 'nb_steps' means that the last episode will likely be in progress when training finishes.
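One hedged workaround (not an official keras-rl feature, and the hook name `on_episode_end` is assumed from keras-rl's `rl.callbacks.Callback` interface): since `fit()` catches `KeyboardInterrupt` to end training gracefully, a callback can raise it after a fixed number of episodes. In a real setup you would subclass `rl.callbacks.Callback` so the other hook methods exist; it is shown dependency-free here:

```python
class EpisodeLimit:
    """Sketch of a keras-rl-style callback that stops training after N episodes.

    Assumes the `on_episode_end` hook name and that the agent's fit() loop
    treats KeyboardInterrupt as a graceful stop -- verify for your version.
    """

    def __init__(self, max_episodes):
        self.max_episodes = max_episodes
        self.episodes_done = 0

    def on_episode_end(self, episode, logs=None):
        self.episodes_done += 1
        if self.episodes_done >= self.max_episodes:
            raise KeyboardInterrupt  # keras-rl ends fit() cleanly on this

# usage sketch: agent.fit(env, nb_steps=10**9, callbacks=[EpisodeLimit(100)])
```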
Perhaps there's some simple setting I overlooked, but I didn't see it. | 0easy
|
Title: Investigate use of cast() in reports.
Body: Investigate whether these instances of `cast()` are needed.
PRs should be small (at most a few files).
reports/graphs.py
- https://github.com/capitalone/DataProfiler/blob/main/dataprofiler/reports/graphs.py#L77
reports/utils.py
- https://github.com/capitalone/DataProfiler/blob/main/dataprofiler/reports/utils.py#L53 | 0easy
|
Title: Update GitHub actions for Postgres 14, and Python 3.10
Body: Python 3.10 and Postgres 14 have both been released. | 0easy
|
Title: Academic Open Source Project Impact metric API
Body: The canonical definition is here: https://chaoss.community/?p=3583 New data sources may need to be developed for this one. | 0easy
|
Title: Not able to set kernel name for notebook
Body: Hi,
I am trying to use my default JupyterLab kernel for executing notebooks via Notebooker, but I don't think this configuration is applied. I've tried the following commands:
`notebooker-cli --notebook-kernel-name=python3`
`notebooker-cli --notebook-kernel-name python3`
Opening the Notebooker output for the executing notebook I see following kernelspec:
` "kernelspec": {
"display_name": "notebooker_kernel",
"language": "python",
"name": "notebooker_kernel"
}`
but I expect it to be
` "kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
}`
| 0easy
|
Title: Donchian Indicator - Miscalculation
Body: Hi, I think the Donchian Channels indicator (DC, a volatility indicator) isn't computing the right thing: the computation should be on the highest highs and the lowest lows, instead of on the closes. The code therefore becomes:
Definition :
def donchian(high, low, lower_length=None, upper_length=None, offset=None, **kwargs):
line 15 :
# Calculate Result
lower = low.rolling(lower_length, min_periods=lower_min_periods).min()
upper = high.rolling(upper_length, min_periods=upper_min_periods).max()
mid = 0.5 * (lower + upper)
| 0easy
|
Title: Add citation inside cookiecutter to be able to find publicly generated projects
Body: By default, add a link back from the README.md so that we can better find examples where people are using the template.
| 0easy
|
Title: Technical Indicators - VWAP?
Body: Noticed VWAP is not included - any chance of adding it? I can attempt to put something together, but someone more familiar with the codebase would probably have more success. | 0easy
|
Title: Truncate descriptions on agent library cards
Body: Descriptions must be clamped to a few lines, to avoid this:
<img src="https://uploads.linear.app/a47946b5-12cd-4b3d-8822-df04c855879f/bcd8a7a2-6ab4-4d0f-9765-0a2697ef70a4/95e4a3ac-08c1-43d8-9f0d-fe2793a6c661?signature=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJwYXRoIjoiL2E0Nzk0NmI1LTEyY2QtNGIzZC04ODIyLWRmMDRjODU1ODc5Zi9iY2Q4YTdhMi02YWI0LTRkMGYtOTc2NS0wYTI2OTdlZjcwYTQvOTVlNGEzYWMtMDhjMS00M2Q4LTlmMGQtZmUyNzkzYTZjNjYxIiwiaWF0IjoxNzQxODcyMjQxLCJleHAiOjMzMzEyNDMyMjQxfQ.dxTtqfMAafZIBSUVJ71JTUgMuG9icC0HjlOWFZPJ4qo " alt="image.png" width="1804" data-linear-height="853" /> | 0easy
|
Title: Webapp: Pagespeed improvements
Body: The pagespeed of the webpage (webapp) could be improved.
## Problem
Currently the reported First Contentful Paint for mobile devices is 10.2 s which is very high.
https://pagespeed.web.dev/report?url=https%3A%2F%2Fdesec.io%2F&form_factor=mobile
## Suggestions by report
* Eliminate render-blocking resources
* Enable text compression
* Reduce unused CSS
* Ensure text remains visible during webfont load
## Comments and possible solutions
Eliminate render-blocking resources:
* enabling text compression also solves this problem
* no explicit solution recommendable
Enable text compression:
* can be enabled in the nginx config
* current nginx supports gzip and brotli (better gzip for modern browsers)
* only for static content, not for the API, because of the BREACH security issue
* biggest impact: **halves the loading time**
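As a hypothetical sketch (directive values are placeholders to tune, and this belongs in the server/location block serving static assets, not the API):

```nginx
# sketch only: compress static assets; keep the API uncompressed (BREACH)
gzip on;
gzip_comp_level 5;
gzip_min_length 1024;
gzip_types text/css application/javascript application/json image/svg+xml;
```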
Reduce unused CSS:
* ~90kb by PR #630
Ensure text remains visible during webfont load:
* no action required, because for font can not be improved
* for font `roboto` improvement was done in #626
## Current pagespeed reports


| 0easy
|
Title: [ENH] Unit Conversion
Body: # Brief Description
I would like to propose a set of `engineering` methods that would convert units (e.g., Fahrenheit to Celsius, or Pascals to mmHg, etc.) as well as prefixes (e.g., milligrams to kilograms, kWh to MWh, etc.).
This could go down a big rabbit hole. But maybe we could rely on [Pint](https://pint.readthedocs.io/en/0.9/), or a similar package.
# Example API
```python
# Convert a column of temperatures in Celsius to Fahrenheit
df.convert_units(column_name='temperature_c',
existing_units='C',
to_units='F',
create_new_column=False)
```
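For the temperature case above, the conversion is just an affine transform; a minimal dependency-free sketch of what `convert_units` could do under the hood (helper name is illustrative — pyjanitor might instead delegate this to Pint, and column handling is omitted):

```python
def celsius_to_fahrenheit(c: float) -> float:
    # affine transform: F = C * 9/5 + 32
    return c * 9.0 / 5.0 + 32.0

assert celsius_to_fahrenheit(100.0) == 212.0  # boiling point check
```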
| 0easy
|
Title: Unify frontend and backend language preference using shared cookie
Body: ## Bug Report
**Problematic behavior**
Here you can see "Doc private" appears in English but the user interface is in French.
https://github.com/user-attachments/assets/0ae0e457-15f7-423c-b4bc-c1c5fc9d48bc
Same when I'm loading a doc I don't have access to. The message only is available in French even if I sent the interface in English and reload.

| 0easy
|
Title: Concatenating variables containing bytes should yield bytes
Body: When joining bytes with ${x}${y} string is returned instead of bytes.
consider following test case:
```robot
*** Test Cases ***
Join bytes
${destination} = Convert To Bytes FF FF FF FF FF FF hex
${source} = Convert To Bytes 00 80 E1 00 00 00 hex
${data} = Convert To Bytes 00 01 08 00 06 04 00 01 hex
${frame} = Set Variable ${destination}${source}${data}
```
currently results in `${frame} -> '\\xff\\xff\\xff\\xff\\xff\\xff\x00\\x80\\xe1\x00\x00\x00\x00\x01\x08\x00\x06\x04\x00\x01'`
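The expected behaviour corresponds to plain Python bytes concatenation, which never produces a string:

```python
# the same three values as in the test case above, built with stdlib bytes.fromhex
destination = bytes.fromhex("FFFFFFFFFFFF")
source = bytes.fromhex("0080E1000000")
data = bytes.fromhex("0001080006040001")

frame = destination + source + data  # stays bytes, no str coercion
assert isinstance(frame, bytes)
assert len(frame) == 20
```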
expected result is bytes | 0easy
|
Title: async is a reserved word
Body: One of the files is named `async` but this is a reserved word in python 3.7+
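The root cause can be confirmed with the stdlib `keyword` module — `async` became a hard keyword in Python 3.7, so `import arctic.async` can no longer even parse:

```python
import keyword

# on Python 3.7+ `async` is a full keyword, so it cannot appear in an import path
assert keyword.iskeyword("async")
assert keyword.iskeyword("await")
```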
```
==================================== ERRORS ====================================
___________ ERROR collecting tests/integration/test_async_arctic.py ____________
../../../virtualenv/python3.7.1/lib/python3.7/site-packages/_pytest/python.py:450: in _importtestmodule
mod = self.fspath.pyimport(ensuresyspath=importmode)
../../../virtualenv/python3.7.1/lib/python3.7/site-packages/py/_path/local.py:668: in pyimport
__import__(modname)
E File "/home/travis/build/manahl/arctic/tests/integration/test_async_arctic.py", line 1
E import arctic.async as aasync
E ^
E SyntaxError: invalid syntax
``` | 0easy
|
Title: Add docs on how to create custom QApplications
Body: Follow up from https://github.com/pytest-dev/pytest-qt/issues/172#issuecomment-415588753 | 0easy
|
Title: tox -vv does not show pip output for successful executions
Body: 🤔 | 0easy
|
Title: Make outputs and index names for Custom Indexes feature look nicer (add embeds)
Body: Use discord embeds to make the outputs returned by the /index commands look nicer and look more cohesive with the theme of our bot.
Also index names that show up in /index compose and /index load and etc can be made to look nicer, maybe there's an easier way for a user to name the index as they make it so we don't have to default to the name of the file + the time? Not sure though, looking for insights on this. | 0easy
|
Title: improve error message when deepcopying dag
Body: In some situations, the dag object may fail to deepcopy, throwing an uninformative error. We should catch it and show something like: "an error occurred when copying your DAG"
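A minimal sketch of the proposed behaviour (the function name and message are illustrative, not ploomber's actual API):

```python
import copy

def safe_deepcopy(dag):
    # hypothetical wrapper: surface a friendly message, keep the original
    # traceback attached via exception chaining
    try:
        return copy.deepcopy(dag)
    except Exception as e:
        raise RuntimeError("An error occurred when copying your DAG") from e
```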
https://github.com/ploomber/ploomber/blob/afadae033e161090610692bfe4f95e941448ea05/src/ploomber/dag/dag.py#L680 | 0easy
|
Title: Image Understanding
Body: use blip, google object detection, google OCR. | 0easy
|
Title: Add optional typed base class for dynamic library API
Body: The main benefit would be automatic method name completion and type validation provided by IDEs.
Usage would be something like this:
```python
from robot.api import DynamicLibrary
class Library(DynamicLibrary):
...
```
The base class should have all mandatory and optional dynamic API methods with appropriate type information and documentation. The mandatory methods should `raise NotImplementedError` and optional should return dummy information that's equivalent to not having these methods at all. We could also add a separate base class for the hybrid API although it only has `get_keyword_names`.
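A rough sketch of what such a base class could look like (names, signatures, and defaults are illustrative, not a final Robot Framework API):

```python
from typing import Any, Mapping, Sequence

class DynamicLibrary:
    """Hypothetical typed base class for the dynamic library API."""

    def get_keyword_names(self) -> Sequence[str]:
        raise NotImplementedError  # mandatory

    def run_keyword(self, name: str, args: Sequence[Any],
                    named: Mapping[str, Any]) -> Any:
        raise NotImplementedError  # mandatory

    def get_keyword_arguments(self, name: str) -> Sequence[str]:
        return ['*varargs', '**kwargs']  # same effect as omitting the method

    def get_keyword_documentation(self, name: str) -> str:
        return ''  # same effect as omitting the method
```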
This is so easy to implement that I believe it could be done already in RF 6.1. The hardest part is deciding what the class name should be and should it be imported directly from `robot.api` or somewhere like `robot.api.libraries`, `robot.api.interface` or `robot.api.base`. A separate submodule could make sense if we add other similar base classes. | 0easy
|
Title: Collections: Support ignoring order in values when comparing dictionaries
Body: Currently (https://github.com/robotframework/robotframework/blob/master/src/robot/libraries/Collections.py#L839) we are missing ignore_order support for dictionaries, which is already done for
```
def lists_should_be_equal(self, list1, list2, msg=None, values=True,
names=None, ignore_order=False, ignore_case=False):
```
Ref : https://github.com/ngoan1608/robotframework/commit/1d852208255d1245cce33a34c700478455cc7298#diff-0bb57664ba32d4e2b91c6435cf0b867d630c2db282af83004073303030775bd1
Adding as well ignore_key would be much appreciated since we might deal with ID's and timestamps, and in such cases those keys are to be ignored.
| 0easy
|
Title: Cleanup Normalizer Converter Code
Body: We have the original draft of the normalizer code uploaded in [this branch](https://github.com/microsoft/hummingbird/tree/cleanup/normalizer). All credit here goes to @scnakandala, the original author and brains behind this.
It contains an un-edited implementation of SklearnNormalizer [here](https://github.com/microsoft/hummingbird/blob/cleanup/normalizer/hummingbird/ml/operator_converters/normalizer.py).
There is a single test file [here](https://github.com/microsoft/hummingbird/blob/cleanup/normalizer/tests/test_sklearn_normalizer_converter.py) that needs to be cleaned up and passing.
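For reference, the operator being converted is sklearn's `Normalizer`; its per-row semantics can be sketched dependency-free as (simplified — zero-row handling omitted):

```python
def normalize_rows(X, norm="l2"):
    # minimal sketch of sklearn Normalizer semantics, row by row
    def denom(row):
        if norm == "l1":
            return sum(abs(v) for v in row)
        if norm == "l2":
            return sum(v * v for v in row) ** 0.5
        return max(abs(v) for v in row)  # "max"
    return [[v / denom(row) for v in row] for row in X]

assert normalize_rows([[3.0, 4.0]]) == [[0.6, 0.8]]
```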
| 0easy
|
Title: Test for DeltaY stopper
Body: We should test the DeltaY stopper with the *_minimize functions. | 0easy
|
Title: Exploding gradients/weights/logits in a fine-tuning notebook
Body: Hello, after I successfully fine-tuned, I run to the last step and when generating text, the following error occurs.
May I ask what is the reason?


| 0easy
|
Title: Support controlling separator when appending current value using `Set Suite Metadata`, `Set Test Documentation` and other such keywords
Body: Two deficiencies identified with the current Set Suite Metadata keyword, which we use to record stuff like version numbers.
- [x] Add argument to define a separator between multiple metadata returned values when append=True. Currently, the default behavior is just to put in a space.
https://robotframework.org/robotframework/latest/libraries/BuiltIn.html#Set%20Suite%20Metadata
> Arguments:
> name
> value
> append = False
> top = False
>
### Example
**current:**
```
FOR ${device} IN @{devices}
Set Suite Metadata | Versions | ${version} | append=True
END
```
**output**
Versions: 1.0 1.2 1.0 1.0 2.3
**proposal:**
```
FOR ${device} IN @{devices}
Set Suite Metadata | Versions | ${version} | append=True | separator=,
END
```
**output**
Versions: 1.0, 1.2, 1.0, 1.0, 2.3
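In plain Python terms, the proposed `separator` behaviour amounts to the following (a sketch only, not BuiltIn's implementation):

```python
def append_metadata(existing, value, separator=" "):
    # hypothetical append semantics with a configurable separator
    return f"{existing}{separator}{value}" if existing else value

meta = ""
for version in ["1.0", "1.2", "1.0", "1.0", "2.3"]:
    meta = append_metadata(meta, version, separator=", ")
assert meta == "1.0, 1.2, 1.0, 1.0, 2.3"
```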
- [ ] Argument to allow unique returned values only
To continue the example above:
```
FOR ${device} IN @{devices}
Set Suite Metadata | Versions | ${version} | append=True | separator=, | unique=True
END
```
**output**
Versions: 1.0, 1.2, 2.3 | 0easy
|
Title: Feature Request -- blog aggregator module
Body: A module like Python Planet.
Add a planet-style blog aggregator module.
Libs:
Python Planet 2.0
http://www.planetplanet.org/
https://people.gnome.org/~jdub/bzr/planet/2.0/
Fetch RSS feeds, convert them to JSON, and import into MongoDB
| 0easy
|
Title: Zoom and pan not working on Linux/Gnome/Wayland
Body: ### Describe the bug
It looks like on some OS/desktop environments/display servers (not sure what's causing the issue) it is not possible to use the Altair zoom and pan functionality that was introduced in https://github.com/marimo-team/marimo/pull/1855.
I'm on Manjaro with Gnome (Wayland), and in none of the tested browsers (Firefox, Chromium, Vivaldi) can I zoom into plots or pan. Unsure which key is the `meta key`, I tried:
- Shift:
- Shift + Scrolling: Nothing
- Shift + Drag: Nothing
- Multi legend item selection: Working
- Strg:
- Strg + Scrolling: Browser scroll is triggered
- Strg + click & drag: Interval selection
- Windows key:
- Windows + Scrolling: Workspace is changed
- Windows + dragging: Browser window goes into window mode
Also tried `Alt Gr`, and combination of keys...
This is a follow up on discussion https://github.com/marimo-team/marimo/discussions/250#discussioncomment-12188920
### Environment
<details>
```
{
"marimo": "0.11.4",
"OS": "Linux",
"OS Version": "6.6.74-1-MANJARO",
"Processor": "",
"Python Version": "3.12.8",
"Binaries": {
"Browser": "--",
"Node": "v23.7.0"
},
"Dependencies": {
"click": "8.1.8",
"docutils": "0.21.2",
"itsdangerous": "2.2.0",
"jedi": "0.19.2",
"markdown": "3.7",
"narwhals": "1.26.0",
"packaging": "24.2",
"psutil": "7.0.0",
"pygments": "2.19.1",
"pymdown-extensions": "10.14.3",
"pyyaml": "6.0.2",
"ruff": "0.9.6",
"starlette": "0.45.3",
"tomlkit": "0.13.2",
"typing-extensions": "4.12.2",
"uvicorn": "0.34.0",
"websockets": "14.2"
},
"Optional Dependencies": {
"altair": "5.5.0",
"pandas": "2.2.3",
"pyarrow": "19.0.0"
},
"Experimental Flags": {}
}
```
</details>
Also tried on https://marimo.app
### Code to reproduce
import marimo as mo
import altair as alt
from vega_datasets import data
cars = data.cars()
mo.ui.altair_chart(alt.Chart(cars).mark_point().encode(
x='Horsepower',
y='Miles_per_Gallon',
color='Origin'
)) | 0easy
|
Title: Minor typo in docs
Body: https://geopandas.org/en/stable/docs/reference/api/geopandas.GeoSeries.boundary.html
"Returns a GeoSeries of lower dimensional objects representing each geometries’s set-theoretic boundary."
"geometries’s" ought to be "geometry's" | 0easy
|
Title: btnG is now btnK
Body: I believe for me btnG is now btnK when I went on google... I was getting error but when I made this changed and it worked. | 0easy
|
Title: [MNT] Add metrics parameter for deep clustering module
Body: ### Describe the issue
The deep clustering module is missing the metrics parameter; in AEResNet it exists but is unused.
### Suggest a potential alternative/fix
It should default to None and then be set to ["mse"] internally if it is None, the same way it's done in deep clustering, but here with mse instead of accuracy.
### Additional context
It's for verbose purposes; it doesn't affect anything else. | 0easy
|
Title: Annoying backshash console spam
Body: https://user-images.githubusercontent.com/34655998/154974450-76427f2c-54dd-4a7e-8164-322c412f37f1.mp4
```
$DOTGLOB = True
$COMPLETION_IN_THREAD = True
$COMPLETIONS_BRACKETS = False
$COMPLETIONS_DISPLAY = "single"
$CASE_SENSITIVE_COMPLETIONS = False
$UPDATE_COMPLETIONS_ON_KEYPRESS = True
$COMMANDS_CACHE_SAVE_INTERMEDIATE = True
$XONSH_HISTORY_BACKEND = "sqlite"
$XONSH_HISTORY_FILE = "/home/ganer/.xonsh_history"
$XONSH_CACHE_EVERYTHING = True
$XONSH_CTRL_BKSP_DELETION = True
$UPDATE_PROMPT_ON_KEYPRESS = True
$XONSH_COLOR_STYLE = 'default'
$XONSH_STYLE_OVERRIDES['Token.Literal.Number'] = '#1111ff'
```
```
+------------------+----------------------+
| xonsh | 0.11.0 |
| Git SHA | 337cf25a |
| Commit Date | Nov 17 15:37:41 2021 |
| Python | 3.10.2 |
| PLY | 3.11 |
| have readline | True |
| prompt toolkit | 3.0.24 |
| shell type | prompt_toolkit |
| history backend | sqlite |
| pygments | 2.11.2 |
| on posix | True |
| on linux | True |
| distro | gentoo |
| on wsl | False |
| on darwin | False |
| on windows | False |
| on cygwin | False |
| on msys2 | False |
| is superuser | False |
| default encoding | utf-8 |
| xonsh encoding | utf-8 |
| encoding errors | surrogateescape |
| on jupyter | False |
| jupyter kernel | None |
| xontrib | [] |
| RC file 1 | /home/ganer/.xonshrc |
+------------------+----------------------+
```
## For community
⬇️ **Please click the 👍 reaction instead of leaving a `+1` or 👍 comment**
| 0easy
|
Title: docs: retry feature requires explicit message_id from client
Body: **Describe the bug**
I set the retry parameter to an integer value, expecting it to stop requeuing the rejected message after the specified number of retries (as described [here](https://faststream.airt.ai/latest/rabbit/ack/)). However, the message is requeued infinitely.
**How to reproduce**
Include source code:
```python
from faststream import FastStream
from faststream.rabbit import RabbitBroker, RabbitQueue
from dataclasses import dataclass
@dataclass
class MessageFormat:
message: str
broker = RabbitBroker("amqp://guest:guest@localhost:5672/")
app = FastStream(broker)
@broker.subscriber(RabbitQueue("testqueue", durable=True, robust=True), retry=3)
async def handle(message: MessageFormat):
raise Exception("Error")
```
**Environment**
```
Running FastStream 0.5.4 with CPython 3.11.7 on Window
```
| 0easy
|
Title: concatenate_columns adds the function of ignoring empty values
Body: # Problem description
When I use the function of concatenating multiple columns, I find that it does not handle null values as expected.
## This is the current output
```Python
df.concatenate_columns(["cat_1","cat_2","cat_3"],"cat",sep=",")
```
<table border="1" class="dataframe"> <thead> <tr style="text-align: right;"> <th></th> <th>cat_1</th> <th>cat_2</th> <th>cat_3</th> <th>cat</th> </tr> </thead> <tbody> <tr> <th>0</th> <td>a</td> <td>b</td> <td>c</td> <td>a,b,c</td> </tr> <tr> <th>1</th> <td>c</td> <td>NaN</td> <td>f</td> <td>c,nan,f</td> </tr> <tr> <th>2</th> <td>a</td> <td>b</td> <td>NaN</td> <td>a,b,nan</td> </tr> </tbody></table>
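A dependency-free sketch of the per-row join that would implement the proposed `ignore_empty` option (the helper name is hypothetical, not pyjanitor's API):

```python
import math

def concat_row(values, sep=",", ignore_empty=False):
    # hypothetical per-row join; NaN/None cells are skipped when ignore_empty=True
    def is_empty(v):
        return v is None or (isinstance(v, float) and math.isnan(v))
    kept = [str(v) for v in values if not (ignore_empty and is_empty(v))]
    return sep.join(kept)

assert concat_row(["c", float("nan"), "f"], ignore_empty=True) == "c,f"
assert concat_row(["c", float("nan"), "f"]) == "c,nan,f"
```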
## Here is the output I want:
```Python
df.concatenate_columns(["cat_1","cat_2","cat_3"],"cat",sep=",",ignore_empty=True)
```
<table border="1" class="dataframe"> <thead> <tr style="text-align: right;"> <th></th> <th>cat_1</th> <th>cat_2</th> <th>cat_3</th> <th>cat</th> </tr> </thead> <tbody> <tr> <th>0</th> <td>a</td> <td>b</td> <td>c</td> <td>a,b,c</td> </tr> <tr> <th>1</th> <td>c</td> <td>NaN</td> <td>f</td> <td>c,f</td> </tr> <tr> <th>2</th> <td>a</td> <td>b</td> <td>NaN</td> <td>a,b</td> </tr> </tbody></table> | 0easy
|
Title: Table prefixes are not picked up by autogenerate
Body: **Describe the bug**
Currently sqlalchemy [`Table.prefixes`](https://docs.sqlalchemy.org/en/13/core/metadata.html#sqlalchemy.schema.Table.params.prefixes) are not picked up by alembic when generating a migration using
`alembic revision --autogenerate`
**Expected behavior**
The `prefixes` are picked up by alembic. I've not tested it with other table keywords, but maybe it's a more common theme.
**To Reproduce**
Please try to provide a [Minimal, Complete, and Verifiable](http://stackoverflow.com/help/mcve) example, with the migration script and/or the SQLAlchemy tables or models involved.
See also [Reporting Bugs](https://www.sqlalchemy.org/participate.html#bugs) on the website.
```py
# Add a new table like this one to the metadata
import sqlalchemy as sa
sa.Table(
"foo",
metadata,
sa.Column("id", sa.Integer(), primary_key=True),
sa.Column("other", sa.Integer),
prefixes=["UNLOGGED"],
)
```
**Error**
```
# The reflected migration code generated is
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.create_table(
"foo",
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("other", sa.Integer(), nullable=True),
sa.PrimaryKeyConstraint("id"),
)
# ### end Alembic commands ###
```
**Versions.**
- OS: windows
- Python: 3.7.7
- Alembic: 1.4.2
- SQLAlchemy: 1.3.18
- Database: postgres
- DBAPI: pycopg2
**Additional context**
Manually adding `prefixes=[...]` to the `create_table` function after autogenerate does work correctly, so this is just a nice to have feature
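For what it's worth, the effect of the manual workaround can be checked directly against SQLAlchemy's DDL compiler (a sketch using the default dialect):

```python
import sqlalchemy as sa
from sqlalchemy.schema import CreateTable

metadata = sa.MetaData()
foo = sa.Table(
    "foo",
    metadata,
    sa.Column("id", sa.Integer(), primary_key=True),
    prefixes=["UNLOGGED"],
)

# the prefix lands between CREATE and TABLE in the emitted DDL
assert "CREATE UNLOGGED TABLE foo" in str(CreateTable(foo))
```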
**Have a nice day!**
| 0easy
|
Title: psycopg2 deprecated
Body: Latest psycopg2 recommends using the pip installation name "psycopg2-binary" instead of the previous "psycopg2" in pipfile/requirements.txt | 0easy
|
Title: Long name in face selection don't work well
Body: 
With a long name it makes something look weird. | 0easy
|
Title: dataset
Body: 
May I ask why the search and template paths both end up as x.jpg? Shouldn't template be z.jpg? | 0easy
|
Title: colorbar option
Body: Currently, there is no way to plot a colorbar.
`hyp.plot(data, colorbar=True)`? | 0easy
|
Title: Add support for optional raise in the CHECK API
Body: ```python
if KORNIA_CHECK_IS_GRAY(guidance, raise=False):
...
```
_Originally posted by @johnnv1 in https://github.com/kornia/kornia/pull/2322#discussion_r1167259191_
| 0easy
|
Title: m2cgen output for xgboost with binary:logistic objective returns raw (not transformed) scores
Body: Our xgboost models use the `binary:logistic` objective function; however, the m2cgen-converted version of the models returns raw scores instead of the transformed scores.
This is fine as long as the user knows this is happening! I didn't, so it took a while to figure out what was going on. I'm wondering if perhaps a useful warning could be raised for users to alert them of this issue? A warning could include a note that they can transform these scores back to the expected probabilities [0, 1] by `prob = logistic.cdf(score - base_score)` where `base_score` is an attribute of the xgboost model.
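The back-transform mentioned above needs nothing beyond the standard logistic CDF; a stdlib-only sketch, where the score and base_score values are purely illustrative:

```python
import math

def logistic_cdf(x: float) -> float:
    # standard logistic CDF (loc=0, scale=1), same as scipy.stats.logistic.cdf
    return 1.0 / (1.0 + math.exp(-x))

score = 1.2       # hypothetical raw score from the m2cgen output
base_score = 0.5  # hypothetical value of the xgboost model's base_score attribute
prob = logistic_cdf(score - base_score)
assert 0.0 < prob < 1.0
```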
In our case, I'd like to minimize unnecessary processing on the device, so I am actually happy with the current m2cgen output and will instead inverse transform our threshold when evaluating the model output from the transpiled model...but it did take me a bit before I figured out what was going on, which is why I'm suggesting that a user friendly message might be raised when an unsupported objective function is encountered.
Thanks for creating & sharing this great tool! | 0easy
|
Title: Add new Pages to Crypto Dashboard
Body: Currently, the crypto dashboard includes a landing page and a logout button. However, two pages are missing: Transactions and Statistics.
For the Statistics page, I have some ideas. It could display the past returns of all coins in the wallet. Users could select a starting point or a range to compare the past returns of all coins or just a selection. This should be represented as a graph with a base of 100, showing compounded returns over time.
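The base-100 compounded-return series described above is a simple rebase of each coin's price series to its value at the chosen starting point; a sketch:

```python
def rebase_to_100(prices):
    # index a price series to 100 at the chosen starting point
    base = prices[0]
    return [100.0 * p / base for p in prices]

# hypothetical closing prices for one coin
series = rebase_to_100([42000.0, 44100.0, 39900.0])
assert series[0] == 100.0
assert round(series[1], 2) == 105.0
```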
I am unsure what to include on the Transactions page.
I would appreciate any ideas or assistance. :) | 0easy
|
Title: Mapper - import from inexistent sklearn.cluster._hierarchical
Body: I wanted to try out giotto mapper by running `Christmas-Mapper.ipynb`, but when I installed `giotto-learn-nightly`, `scikit-learn` was upgraded to 0.22.1, and this new version has renamed `sklearn.cluster._hierarchical` to `sklearn.cluster._hierarchical_fast` (see [https://github.com/scikit-learn/scikit-learn/tree/master/sklearn/cluster](https://github.com/scikit-learn/scikit-learn/tree/master/sklearn/cluster)).
Here giotto.mapper imports from `sklearn.cluster._hierarchical`:
[https://github.com/giotto-ai/giotto-learn/blob/master/giotto/mapper/cluster.py#L7](https://github.com/giotto-ai/giotto-learn/blob/master/giotto/mapper/cluster.py#L7).
| 0easy
|
Title: Document GraphQLField
Body: including its various parameters
e.g. `required`, `is_list`, `source` | 0easy
|
Title: `DjangoValidationCache` and its base class are undocumented
Body: Looking through the codebase, there is something called `DjangoValidationCache` that seemingly can be used instead of `ValidationCache` to use Django's cache backend instead of an in-memory cache. This seems incredibly useful, but I cannot find that this class is documented anywhere. Are we supposed to use it? | 0easy
|
Title: is there any way to calculate atr trailing stops via python for historical ohlc data
Body: @twopirllc Thanks for such a beautiful library for technical indicators, it helps me a lot in many ways.
I am stuck on one point, didn't find any relevant python indicator library or calculation for getting values of ATR Trailing Stops, is there any way please help me out.
Thanks in advance. | 0easy
|
Title: HiLo is not working perfectly; the same numbers print many times.
Body: **Which version are you running? The latest version is on GitHub. Pip is for major releases.**
```python
import pandas_ta as ta
print(ta.version)
```
**Do you have _TA Lib_ also installed in your environment?**
```sh
$ pip list
```
**Did you upgrade? Did the upgrade resolve the issue?**
```sh
$ pip install -U git+https://github.com/twopirllc/pandas-ta
```
**Describe the bug**
A clear and concise description of what the bug is.
**To Reproduce**
Provide sample code.
**Expected behavior**
A clear and concise description of what you expected to happen.
**Screenshots**
If applicable, add screenshots to help explain your problem.
**Additional context**
Add any other context about the problem here.
Thanks for using Pandas TA!
| 0easy
|
Title: More Examples 💪
Body: I would love some help with some more cool `examples/`
If you contribute one, I'll mail you a sticker 😍 | 0easy
|
Title: Is there any way to use puppeteer-real-browser instead of playwright web agent
Body: The main purpose is to pass strong Cloudflare security checks. Plain Playwright cannot do this, but puppeteer-real-browser handles it perfectly | 0easy
|
Title: Add doc strings to included.py files
Body: | 0easy
|
Title: Add ROC curve for binary classification report
Body: It will be fantastic to add ROC curve in a `README.md` report file for the model in a binary classification task. | 0easy
|
Title: Suggestion: Clone another Pelican theme, Malt
Body: based on https://github.com/grupydf/grupybr-template
we can clone Malt theme
It is very easy to clone Pelican themes because they are Jinja2 :smile:
| 0easy
|
Title: Enhancement: Support `stricturl`
Body: Hi, is it true? I got an error:
```sh
E pydantic_factories.exceptions.ParameterError: Unsupported type: <class 'xxx.xxx.models.UrlValue'>
E
E Either extend the providers map or add a factory function for this model field
``` | 0easy
|
Title: Missing exchange config between France and Andorra
Body: ## Description
There is an exchange between France and Andorra, but we are missing it in the app.
The first step to get it added will be to create an exchange config for it. | 0easy
|
Title: New Action: Automatic geographic attribute visualizations
Body: For geo-related attributes (longitude, latitude, state, countries, etc), we should detect it as a special data type and display a choropleth automatically. | 0easy
|
Title: [New feature] Add apply_to_images to HistogramMatching
Body: | 0easy
|
Title: Slider `marks` keys are only strings
Body: we should either enforce that or cast back to integers. See https://community.plot.ly/t/slider-not-working/8471/8?u=chriddyp for context. | 0easy
|
Title: Add examples for BackupService and RestoreService
Body: <!--
Thank you for suggesting an idea to improve this client.
* Please add a :+1: or comment on a similar existing feature request instead of opening a new one.
* https://github.com/influxdata/influxdb-client-python/issues?utf8=%E2%9C%93&q=is%3Aissue+is%3Aopen+is%3Aclosed+sort%3Aupdated-desc+label%3A%22enhancement%22+
-->
__Proposal:__
Add examples demonstrating how to use the `BackupService` and `RestoreService` as they are both currently undocumented.
__Current behavior:__
Examples are missing apart from brief comments in the source code.
__Desired behavior:__
It would be nice to have examples demonstrating how to use both services, possibly in the `examples` directory or in the README. Personally, I would like to use `BackupService` and `RestoreService` to perform migrations but am unsure how to use these services.
__Alternatives considered:__
Other alternatives to the BackupService and RestoreService is using the Influx CLI with Python subprocess calls, but this is less testable and not preferred.
__Use case:__
Performing backups or restores using the InfluxDB Python client would be much easier with documentation.
| 0easy
|
Title: Product stock cannot be set (open-source version V1.8)
Body: I set up a V1.8 instance with Docker. There is no stock option when adding a product in the admin panel, and the stock of existing products cannot be modified either. Does the system not support setting stock, or have I simply not found the way to do it? Thanks to the author for replying. | 0easy
|
Title: Cannot translate text: 'NoneType' object has no attribute 'translate'
Body: On the website ([libretranslate.com](https://libretranslate.com/?source=auto&target=en)) I ran into this error when trying to translate Norwegian text with **Auto Detect (Experimental)** enabled.
It results in a 500 (internal server error) with the following message:
> Cannot translate text: 'NoneType' object has no attribute 'translate'
~~I'm going to assume that the error stems from the auto-detection picking out a language which LibreTranslate doesn't support, but have not verified if this is the case yet. (Just a hypothesis.)~~
I realize now that LibreTranslate doesn't support Norwegian, but this should still probably be handled better regardless.
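A minimal sketch of the kind of guard that would handle this better (the `translators` mapping and function names are mine, standing in for whatever lookup currently returns `None` for unsupported languages):

```python
def safe_translate(text, detected_lang, translators):
    # translators: hypothetical mapping of language code -> translator object.
    translator = translators.get(detected_lang)
    if translator is None:
        # Fail with a clear 4xx-style message instead of an AttributeError.
        raise ValueError(f"detected language '{detected_lang}' is not supported")
    return translator.translate(text)
```

That would turn the 500 into a clear "language not supported" response.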
| 0easy
|
Title: Initial password of a newly created card-shop admin is wrong
Body: 1. For bug reports, please describe the minimal reproduction steps.
2. For general questions: 99% of the answers are in the help docs; please read https://kmfaka.baklib-free.com/ carefully.
3. For new feature/concept submissions: please describe them in text or with annotated screenshots.
The initial password of the newly created card-shop admin is wrong. I am using my own database, but I cannot find the relevant link.
| 0easy
|
Title: Possible future data leak on CUBE and IFISHER
Body: After a quick inspection, if I'm not wrong, there seems to be a future data leak with CUBE and IFISHER when using `study.AllStudy`, as they both have an offset set to -1 by default. Please update the README to point this out. AI-model time-series feature-importance impact evaluation:

| 0easy
|
Title: SpecModifierAlias: multiple modifiers
Body: I'm going to leave this for future. If anybody wants to implement. PR is welcome!
After #5443 (#5494 #5481) we may want to implement multiple modifiers i.e.
```xsh
$(@json @noerr curl https://json.api)
```
## For community
⬇️ **Please click the 👍 reaction instead of leaving a `+1` or 👍 comment**
| 0easy
|
Title: Make CORS configurable from an environment variable
Body: ## Tell us about the problem you're trying to solve
Currently, CORS is enabled by default for the API server. Refer: https://github.com/chaos-genius/chaos_genius/blob/d0d2922ac8bddbf9cc9b20ee6f76d477725fd85c/chaos_genius/app.py#L52
For security reasons, CORS should be disabled by default. Refer: https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS
## Describe the solution you'd like
A variable needs to be added in settings.py (https://github.com/chaos-genius/chaos_genius/blob/d0d2922ac8bddbf9cc9b20ee6f76d477725fd85c/chaos_genius/settings.py) to enable/disable CORS. By default, CORS will be disabled.
The variable will also need to be added in the .env (https://github.com/chaos-genius/chaos_genius/blob/d0d2922ac8bddbf9cc9b20ee6f76d477725fd85c/.env) and all of the docker-compose files.
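A minimal sketch of what the settings flag could look like (the `CORS_ENABLED` variable and helper names are assumptions mirroring how other flags are read, not the actual Chaos Genius code):

```python
import os

def env_flag(name, default="false"):
    # Parse a boolean environment variable; disabled unless explicitly truthy.
    return os.getenv(name, default).strip().lower() in ("1", "true", "yes")

CORS_ENABLED = env_flag("CORS_ENABLED")

def init_cors(app, enabled=CORS_ENABLED):
    # Only wire up flask-cors when explicitly enabled; off by default.
    if enabled:
        from flask_cors import CORS
        CORS(app)
    return app
```

`app.py` would then call `init_cors(app)` instead of enabling CORS unconditionally.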
## Describe alternatives you've considered
N/A
## Additional context
N/A
Please leave a reply or reach out to us on our [community slack](https://github.com/chaos-genius/chaos_genius#octocat-community) if you need any help.
| 0easy
|
Title: [BUG-REPORT] writer_threads is ignored when exporting to hdf5
Body: Thank you for reaching out and helping us improve Vaex!
Before you submit a new Issue, please read through the [documentation](https://docs.vaex.io/en/latest/). Also, make sure you search through the Open and Closed Issues - your problem may already be discussed or addressed.
**Description**
Please provide a clear and concise description of the problem. This should contain all the steps needed to reproduce the problem. A minimal code example that exposes the problem is very appreciated.
**Software information**
- Vaex version (`import vaex; vaex.__version__)`: 4.3.0
- Vaex was installed via: pip
- OS: ubuntu 20.04
**Additional information**
We're offering this option in the API, but `writer_threads` is not actually used.
| 0easy
|
Title: [BUG] - Kafka authentication "check_hostname" false is not beign applied.
Body: ### Mage version
0.9.70
### Describe the bug
I'm trying to authenticate Kafka within a streaming pipeline like this:
```
connector_type: kafka
# bootstrap_server: "{{ env_var('DATAPLATFORM01_KAFKA_SERVERS') }}"
topic: visibilidade-direto.transient-ingestion
consumer_group: mage-to-hwc-kafka-listener
include_metadata: false
api_version: 0.10.2
# Uncomment the config below to use SSL config
# security_protocol: "SSL"
# ssl_config:
# cafile: "/etc/kafka/ssl/ca.pem"
# # certfile: "/etc/kafka/ssl/cert.pem"
# # keyfile: "/etc/kafka/ssl/key.pem"
# # password: "{{ env_var('KAFKA_PASSWORD') }}"
# check_hostname: false
# Uncomment the config below to use SASL_SSL config
security_protocol: "SASL_SSL"
sasl_config:
mechanism: "PLAIN"
username: "{{ env_var('KAFKA_USERNAME') }}"
password: "{{ env_var('KAFKA_PASSWORD') }}"
ssl_config:
cafile: "/etc/kafka/ssl/ca.pem"
certfile: "/etc/kafka/ssl/cert.pem"
keyfile: "/etc/kafka/ssl/key.pem"
password: "{{ env_var('KAFKA_PASSWORD') }}"
check_hostname: false
```
But when running the pipeline it's not matching the ssl certificate.

When trying to verify the handshake in the pod it works.
`openssl s_client -connect <broker_ip> -CAfile /etc/kafka/ssl/ca.pem -cert /etc/kafka/ssl/cert.pem -key /etc/kafka/ssl/key.pem -showcerts`

### To reproduce
_No response_
### Expected behavior
_No response_
### Screenshots
_No response_
### Operating system
_No response_
### Additional context
_No response_ | 0easy
|
Title: df.ta.to_utc unexpected behavior caused by JupyterLab
Body: **Which version are you running? The latest version is on GitHub. Pip is for major releases.**
0.3.14b0
**Do you have _TA Lib_ also installed in your environment?**
yes
**Did you upgrade? Did the upgrade resolve the issue?**
I tried upgrading; it did not solve the issue.
**Describe the bug**
`df.ta.to_utc` is a property, but it is not side-effect free. When you type `df.ta.` and then press `tab` for auto-completion, `df.ta.to_utc` gets called and converts the index. I don't know why this was designed as a property rather than a method.
**To Reproduce**
1. `df = ...` # In JupyterLab, create a dataframe with non-UTC datetime index.

2. Type `df.ta.`, then press `tab`.
3. Show the `df` again. If you can't see the index time zone changed, try step 2 multiple times, then try step 3 again; you will see it.

**Expected behavior**
The `df.ta.to_utc` property should be designed as a method.
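A minimal sketch of the difference (this is not pandas-ta's actual code; class and attribute names are mine). A property runs on attribute access, and Jupyter's tab completion inspects every attribute, so a property with side effects fires during completion; a plain method only runs when called:

```python
class TAAccessorSketch:
    def __init__(self, index_tz):
        self.index_tz = index_tz   # stand-in for df.index.tz
        self.conversions = 0

    def to_utc(self):
        # As a method, dir()/tab completion can no longer trigger this.
        if self.index_tz != "UTC":
            self.index_tz = "UTC"
            self.conversions += 1
        return self
```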
**Screenshots**
https://user-images.githubusercontent.com/4510984/155868723-36fe4bb3-31b2-4221-a3f9-6e392e78b42d.mp4
| 0easy
|
Title: Change the parsing model object produced by `Test Tags` (and `Force Tags`) to `TestTags`
Body: We are going to rename `Force Tags` to `Test Tags` (#4365) and, as a first step, are adding `Test Tags` as an alias for `Force Tags` in RF 5.1 (#4368). There were no other code changes, and currently `Test Tags` (and `Force Tags`) is parsed into a `ForceTags` object. This should be changed to `TestTags` in RF 5.2.
The change doesn't affect test execution at all, but tools working with the parsing model are affected. Luckily it's pretty easy to handle both objects by having both `visit_ForceTags` and `visit_TestTags` in a model visitor.
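A sketch of that pattern (method names follow Robot Framework's `visit_<NodeClass>` convention; the `visit()` dispatcher below is a minimal stand-in for `ModelVisitor`, not the real class):

```python
class TagsVisitor:
    # Both visit_ForceTags (old model) and visit_TestTags (new model)
    # delegate to one handler, so the visitor works across RF versions.
    def __init__(self):
        self.seen = []

    def visit(self, node):
        method = getattr(self, f"visit_{type(node).__name__}", None)
        if method is not None:
            method(node)

    def visit_ForceTags(self, node):
        self._handle_tags(node)

    def visit_TestTags(self, node):
        self._handle_tags(node)

    def _handle_tags(self, node):
        self.seen.append(type(node).__name__)
```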
| 0easy
|
Title: Add roadmap link to the README
Body: It would be good to see a link to our [roadmap](https://wagtail.org/roadmap/) in the readme. While we're at it, it may make sense to make the words 'release notes' into a link to our [docs releases](https://docs.wagtail.org/en/stable/releases/index.html) page.
https://github.com/wagtail/wagtail/blob/912c0881f9bf17adb03a8c7fe83ab36c969b0521/README.md?plain=1#L73

| 0easy
|
Title: Make plotting functions work with array-like inputs
Body: Plots such as `plot_roc_curve` must be able to take any array-like objects. As of today, they only take numpy arrays as input, otherwise an exception is raised. This numpy array conversion must be done inside the function itself.
Example:
```python
skplt.plot_roc_curve([0, 1], [[0.2, 0.8], [0.8, 0.2]])
```
does not work while
```python
skplt.plot_roc_curve(np.array([0, 1]), np.array([[0.2, 0.8], [0.8, 0.2]]))
```
does
| 0easy
|
Title: Add Jupyter notebook examples
Body: It would be nice to have Jupyter notebooks in the "examples" folder showing the different plots as used in a Jupyter notebook. It could contain the same exact code as the examples in the .py files, but adjusted for size (Jupyter notebook plots tend to come out much smaller). | 0easy
|
Title: [MNT] Add joblib backend option and set default to all parallelized estimators
Body: ### Describe the issue
Some of the estimators that use joblib for parallelization use process-based backend, while other use threads-based backend. Ideally, we want this to be a parameter tunable by the users.
### Suggest a potential alternative/fix
Including a `joblib_backend` parameter which would default to `threading` (from discussions with Matthew), and use this parameter to set the joblib `backend` parameter during `Parallel` calls would fix the issue.
| 0easy
|
Title: Edge case: subprocess call in @events.on_post_rc leads to suspending xonsh by OS
Body: <!--- Provide a general summary of the issue in the Title above -->
<!--- If you have a question along the lines of "How do I do this Bash command in xonsh"
please first look over the Bash to Xonsh translation guide: https://xon.sh/bash_to_xsh.html
If you don't find an answer there, please do open an issue! -->
## xonfig
<details>
```
$ xonfig
+------------------+----------------------------------+
| xonsh | 0.14.0 |
| Python | 3.11.6 |
| PLY | 3.11 |
| have readline | True |
| prompt toolkit | 3.0.39 |
| shell type | prompt_toolkit |
| history backend | json |
| pygments | 2.16.1 |
| on posix | True |
| on linux | True |
| distro | unknown |
| on wsl | False |
| on darwin | False |
| on windows | False |
| on cygwin | False |
| on msys2 | False |
| is superuser | False |
| default encoding | utf-8 |
| xonsh encoding | utf-8 |
| encoding errors | surrogateescape |
| xontrib | [] |
| RC file 1 | /etc/xonsh/xonshrc |
| RC file 2 | /home/oskar/.config/xonsh/rc.xsh |
+------------------+----------------------------------+
```
</details>
## Expected Behavior
Xonsh should launch normally and not become a background bash process.
## Current Behavior
When a subprocess call is made in the `@events.on_post_rc` event, xonsh becomes a background process instead of the main process when launched from bash.
```
~ ✦ [18:56:59] ❯ xonsh
[2]+ Stopped xonsh
✦ [18:56:37] ❯
```
## Steps to Reproduce
This rc-file reproduces the issue for me
```
@events.on_post_rc
def __direnv_post_rc():
r = $(ls)
```
Then just start xonsh from bash.
I stumbled upon this issue when loading the xonsh-direnv xontrib. So I have reported it there as well.
## For community
⬇️ **Please click the 👍 reaction instead of leaving a `+1` or 👍 comment**
| 0easy
|
Title: Processbar issues
Body: Looks like this needs to be converted to support python 3:
```
File "/Users/rich/.pyenv/versions/3.5.2/lib/python3.5/site-packages/shop/cascade/processbar.py", line 85, in get_identifier
content = unicode(Truncator(content).words(3, truncate=' ...'))
NameError: name 'unicode' is not defined
```
Option examples:
```
# Python 2 and 3: alternative 1
from builtins import str
templates = [u"blog/blog_post_detail_%s.html" % str(slug)]
# Python 2 and 3: alternative 2
from builtins import str as text
templates = [u"blog/blog_post_detail_%s.html" % text(slug)]
``` | 0easy
|
Title: Creator profile picture - fix margins and the size of the creator profile photo
Body: ### Describe your issue.
Fix the profile photo size and also the margins of the profile photo. The current sizing is all off...
<img width="633" alt="Image" src="https://github.com/user-attachments/assets/2048e61e-fdd0-4248-b615-7549d94ff272" />
| 0easy
|
Title: Migrate integration tests to slack_sdk package and add some legacy tests
Body: Currently, the integration tests still use the `slack` package and verify that the WebClient and other classes under that package work. As the source code is automatically generated from `slack_sdk.web.WebClient`, this is still good enough at this point. However, ideally we would use the `slack_sdk` package for the tests and have additional tests for the `slack` package.
### Category (place an `x` in each of the `[ ]`)
- [ ] **slack_sdk.web.WebClient** (Web API client)
- [ ] **slack_sdk.webhook.WebhookClient** (Incoming Webhook, response_url sender)
- [ ] **slack_sdk.models** (UI component builders)
- [ ] **slack_sdk.oauth** (OAuth Flow Utilities)
- [ ] **slack_sdk.rtm.RTMClient** (RTM client)
- [ ] **slack_sdk.signature** (Request Signature Verifier)
- [x] tests
### Requirements
Please read the [Contributing guidelines](https://github.com/slackapi/python-slack-sdk/blob/main/.github/contributing.md) and [Code of Conduct](https://slackhq.github.io/code-of-conduct) before creating this issue or pull request. By submitting, you are agreeing to those rules.
| 0easy
|
Title: Avoid strict version requirements for requests
Body: DataProfiler has a [strict requirement](https://github.com/capitalone/DataProfiler/blob/main/requirements.txt#L16) for 'requests' which makes it difficult to use as a dependency in large projects. I was wondering if this can be avoided. | 0easy
|
Title: Update websockets dependency requirements
Body: Hey!
I have been using this SDK with a uvicorn server. They have recently upgraded the websockets requirements ([link](https://github.com/encode/uvicorn/pull/1065)), and the version 9.1 includes a fix for a security issue introduced in version 8.
Currently, they require `>=9.1`, while this SDK requires `>=8,<9`. It would be awesome if you could also bump your requirements so that we didn't need to pin the uvicorn version and could also benefit from this security fix.
I'd be happy to open a PR to help with the upgrade if needed, but I couldn't quite figure out how the library is used and what code changes might need to be done for this major upgrade.
### Category (place an `x` in each of the `[ ]`)
- [ ] **slack_sdk.web.WebClient (sync/async)** (Web API client)
- [ ] **slack_sdk.webhook.WebhookClient (sync/async)** (Incoming Webhook, response_url sender)
- [ ] **slack_sdk.models** (UI component builders)
- [ ] **slack_sdk.oauth** (OAuth Flow Utilities)
- [X] **slack_sdk.socket_mode** (Socket Mode client)
- [ ] **slack_sdk.audit_logs** (Audit Logs API client)
- [ ] **slack_sdk.scim** (SCIM API client)
- [ ] **slack_sdk.rtm** (RTM client)
- [ ] **slack_sdk.signature** (Request Signature Verifier)
### Requirements
Please read the [Contributing guidelines](https://github.com/slackapi/python-slack-sdk/blob/main/.github/contributing.md) and [Code of Conduct](https://slackhq.github.io/code-of-conduct) before creating this issue or pull request. By submitting, you are agreeing to those rules. | 0easy
|
Title: Better Topic Modeling Visualization/Saving for Image Only Analysis
Body: ### Feature request
I would like to see a swipe gallery of each topic. Each image would contain the top representative images of that topic in a grid (similar to what is shown at the end of the process described here: https://maartengr.github.io/BERTopic/getting_started/multimodal/multimodal.html#images-only) with the captions underneath. I would like the option to create the grid with the original images since the images right now are low quality in the grid. If creating a swipe gallery is too much, something that just saves the topics in a folder as 1_keyword1_keyword2_keyword3_keyword_4.jpg also works.
### Motivation
Currently, at the end of the image modeling process (https://maartengr.github.io/BERTopic/getting_started/multimodal/multimodal.html#images-only), there is a grid that shows the list of keywords and the grid. This works well for exploratory analysis but is hard to display later. Right now, there is no way to display the results in a better way. In addition, even saving the grid image is inefficient because it is generated from the compressed images rather than the original.
### Your contribution
I have created a save captions and grid function I could add, but I don't think it will help with the larger issue of not being able to utilize the original images to create a grid as I don't have the original dataset. | 0easy
|
Title: Add Ray workflows adapter
Body: # what
Ray workflows seems like something we could easily add too https://docs.ray.io/en/latest/workflows/concepts.html given that we now have GraphAdapters
# Task
- [x] Implement something very similar to the RayGraphAdapter, i.e. RayWorkflowGraphAdapter. The hypothesis is that then we just need to use workflow step function to wrap hamilton functions.
- [ ] implement an integration test for it
- [x] Implement a hello world with it
# Context
_Originally posted by @skrawcz in https://github.com/stitchfix/hamilton/issues/10#issuecomment-1029427745_ | 0easy
|
Title: ENH: implement `np.tri`
Body: ### Is your feature request related to a problem? Please describe
`np.tri` is a widely used function. It is basically a combination of `np.ones` and `np.tril` and should be easy to implement.
Check this out for how to implement a mars operand: https://docs.pymars.org/en/latest/development/operand.html
And the impl of `ones` may be a good example: https://github.com/xprobe-inc/xorbits/blob/main/python/xorbits/_mars/tensor/datasource/ones.py
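A sketch of the semantics the operand should reproduce, literally `np.tril` applied to `np.ones` as suggested above (this is plain numpy for reference, not the tiled Mars/Xorbits operand itself):

```python
import numpy as np

def tri(N, M=None, k=0, dtype=float):
    # Matches numpy's np.tri: an (N, M) array with ones at and below
    # the k-th diagonal and zeros elsewhere.
    if M is None:
        M = N
    return np.tril(np.ones((N, M), dtype=dtype), k=k)
```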
| 0easy
|
Title: Cannot specify `allowed_updates` while using `executor.start_polling`
Body: Context (ru): [@aiogram_ru#524096](https://t.me/aiogram_ru/524096) | 0easy
|
Title: Marketplace - creator page - reduce margins between divider line and section title
Body:
### Describe your issue.
Reduce margins to 25px.
Current margins are at 32px
<img width="1100" alt="Screenshot 2024-12-16 at 21 23 33" src="https://github.com/user-attachments/assets/76759c5d-4d43-4d8d-a71e-e4133eabda51" />
### Upload Activity Log Content
_No response_
### Upload Error Log Content
_No response_ | 0easy
|
Title: EncodingWarning when running with PYTHONWARNDEFAULTENCODING=1
Body: When running with PYTHONWARNDEFAULTENCODING=1, I see a number of warnings:
```
/var/folders/sx/n5gkrgfx6zd91ymxr2sr9wvw00n8zm/T/pip-run-p21p1hcv/xonsh/lazyjson.py:208: EncodingWarning: 'encoding' argument not specified
self._f = open(f, newline="\n")
/var/folders/sx/n5gkrgfx6zd91ymxr2sr9wvw00n8zm/T/pip-run-p21p1hcv/xonsh/prompt/vc.py:243: EncodingWarning: 'encoding' argument not specified.
s = subprocess.check_output(
/var/folders/sx/n5gkrgfx6zd91ymxr2sr9wvw00n8zm/T/pip-run-p21p1hcv/xonsh/procs/specs.py:182: EncodingWarning: 'encoding' argument not specified
return open(fname, mode, buffering=buffering)
/var/folders/sx/n5gkrgfx6zd91ymxr2sr9wvw00n8zm/T/pip-run-p21p1hcv/xonsh/history/json.py:306: EncodingWarning: 'encoding' argument not specified
with open(self.filename, newline="\n") as f:
/var/folders/sx/n5gkrgfx6zd91ymxr2sr9wvw00n8zm/T/pip-run-p21p1hcv/xonsh/history/json.py:317: EncodingWarning: 'encoding' argument not specified
with open(self.filename, "w", newline="\n") as f:
```
In other projects, simply adding `encoding='utf-8'` normalizes the behavior across platforms to the current behavior as observed on Unix and addresses the warnings. That's my recommended approach. | 0easy
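A sketch of the change for one of the call sites above (using a throwaway file here instead of xonsh's actual `self.filename`):

```python
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "history.json")

# Before (warns under PYTHONWARNDEFAULTENCODING=1):
#     open(self.filename, "w", newline="\n")
# After: pin the encoding so behavior is identical on every platform.
with open(path, "w", newline="\n", encoding="utf-8") as f:
    f.write("{}")

with open(path, newline="\n", encoding="utf-8") as f:
    data = f.read()
```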
|
Title: HideFromTaskbar removes minimize and maximize icons permanently
Body: When using dialog.HideFromTaskbar, the minimize and maximize icons of the window are removed and only a small "X"-Icon persists, even after un-hiding the dialog.
Is that intended and maybe related to `self.exstyle() | win32defines.WS_EX_TOOLWINDOW`? | 0easy
|
Title: Export confirm page does not show the list of fields to be exported
Body: **Describe the bug**
Import workflow shows the list of fields that will be imported for each resource:

However the export workflow does not show this:

**To Reproduce**
Click on 'Export' button for the Book model.
**Versions (please complete the following information):**
- Django Import Export: 3.3.3
- Python 3.11
- Django 4.2
**Expected behavior**
A list of fields should be presented to the user for each resource on export.
| 0easy
|
Title: Add support for other LLMs
Body: Investigate adding support for:
* Bard
* Claude
* Llama
It woud also be nice to host the Llama version on HuggingFace such that people don't need an API token | 0easy
|
Title: We would like an `environment.yml` file.
Body: As @lagru pointed out, typically, the requirement spec that pip uses can not be parsed by conda. And because packages might be named differently on conda-forge vs PyPI, there's no way around manually curating the dependencies, just like SciPy does (cf. [`environment.yml`](https://github.com/scipy/scipy/blob/main/environment.yml)).
_Originally posted by @lagru in https://github.com/scikit-image/scikit-image/pull/6781#discussion_r1127587232_ | 0easy
|
Title: api: accept auth action codes without (or with incorrect?) base64 padding (=)
Body: Some users' email agents don't recognize that `==/` may be part of the account activation URL, for example, and receive `400 Bad Request`. We should fix the padding automatically if that happens. | 0easy
|