| id (string, length 4–10) | text (string, length 4–2.14M) | source (2 classes) | created (timestamp[s], 2001-05-16 21:05:09 – 2025-01-01 03:38:30) | added (string date, 2025-04-01 04:05:38 – 2025-04-01 07:14:06) | metadata (dict) |
|---|---|---|---|---|---|
2610560206
|
python312Packages.codecov: drop
Removes the codecov dependency from every package that depends on it, as well as the package itself. Per upstream, this package is deprecated and does not work. Further, we shouldn't be wasting energy and time testing coverage in CI.
Things done
Built on platform(s)
[x] x86_64-linux
[ ] aarch64-linux
[ ] x86_64-darwin
[ ] aarch64-darwin
For non-Linux: Is sandboxing enabled in nix.conf? (See Nix manual)
[ ] sandbox = relaxed
[ ] sandbox = true
[x] Tested, as applicable:
NixOS test(s) (look inside nixos/tests)
and/or package tests
or, for functions and "core" functionality, tests in lib/tests or pkgs/test
made sure NixOS tests are linked to the relevant packages
[x] Tested compilation of all packages that depend on this change using nix-shell -p nixpkgs-review --run "nixpkgs-review rev HEAD". Note: all changes have to be committed, also see nixpkgs-review usage
[x] Tested basic functionality of all binary files (usually in ./result/bin/)
24.11 Release Notes (or backporting 23.11 and 24.05 Release notes)
[ ] (Package updates) Added a release notes entry if the change is major or breaking
[ ] (Module updates) Added a release notes entry if the change is significant
[ ] (Module addition) Added a release notes entry if adding a new NixOS module
[x] Fits CONTRIBUTING.md.
Add a :+1: reaction to pull requests you find important.
nixpkgs-review result
Generated using nixpkgs-review.
Command: nixpkgs-review pr 350862
x86_64-linux
:fast_forward: 4 packages marked as broken and skipped:
krr
krr.dist
opsdroid
opsdroid.dist
:white_check_mark: 74 packages built:
bepasty
bepasty.dist
prowler
prowler.dist
python311Packages.asteroid-filterbanks
python311Packages.asteroid-filterbanks.dist
python311Packages.heudiconv
python311Packages.heudiconv.dist
python311Packages.mkdocs-rss-plugin
python311Packages.mkdocs-rss-plugin.dist
python311Packages.nipype
python311Packages.nipype.dist
python311Packages.niworkflows
python311Packages.niworkflows.dist
python311Packages.pyannote-audio
python311Packages.pyannote-audio.dist
python311Packages.slack-bolt
python311Packages.slack-bolt.dist
python311Packages.slack-sdk
python311Packages.slack-sdk.dist
python311Packages.snakemake
python311Packages.snakemake-executor-plugin-cluster-generic
python311Packages.snakemake-executor-plugin-cluster-generic.dist
python311Packages.snakemake-interface-executor-plugins
python311Packages.snakemake-interface-executor-plugins.dist
python311Packages.snakemake-interface-storage-plugins
python311Packages.snakemake-interface-storage-plugins.dist
python311Packages.snakemake-storage-plugin-fs
python311Packages.snakemake-storage-plugin-fs.dist
python311Packages.snakemake-storage-plugin-s3
python311Packages.snakemake-storage-plugin-s3.dist
python311Packages.snakemake-storage-plugin-xrootd
python311Packages.snakemake-storage-plugin-xrootd.dist
python311Packages.snakemake.dist
python311Packages.throttler
python311Packages.throttler.dist
python311Packages.validator-collection
python311Packages.validator-collection.dist
python312Packages.asteroid-filterbanks
python312Packages.asteroid-filterbanks.dist
python312Packages.heudiconv
python312Packages.heudiconv.dist
python312Packages.mkdocs-rss-plugin
python312Packages.mkdocs-rss-plugin.dist
python312Packages.nipype
python312Packages.nipype.dist
python312Packages.niworkflows
python312Packages.niworkflows.dist
python312Packages.pyannote-audio
python312Packages.pyannote-audio.dist
python312Packages.slack-bolt
python312Packages.slack-bolt.dist
python312Packages.slack-sdk
python312Packages.slack-sdk.dist
snakemake (python312Packages.snakemake)
python312Packages.snakemake-executor-plugin-cluster-generic
python312Packages.snakemake-executor-plugin-cluster-generic.dist
python312Packages.snakemake-interface-executor-plugins
python312Packages.snakemake-interface-executor-plugins.dist
python312Packages.snakemake-interface-storage-plugins
python312Packages.snakemake-interface-storage-plugins.dist
python312Packages.snakemake-storage-plugin-fs
python312Packages.snakemake-storage-plugin-fs.dist
python312Packages.snakemake-storage-plugin-s3
python312Packages.snakemake-storage-plugin-s3.dist
python312Packages.snakemake-storage-plugin-xrootd
python312Packages.snakemake-storage-plugin-xrootd.dist
snakemake.dist (python312Packages.snakemake.dist)
python312Packages.throttler
python312Packages.throttler.dist
python312Packages.validator-collection
python312Packages.validator-collection.dist
whisper-ctranslate2
whisper-ctranslate2.dist
(Apparently we’re going with anywhere‐on‐Earth rather than UTC so this was totally fine!)
While I think it’s probably fine in this case, note that package removals for non‐security reasons are not really allowed right now due to having passed the breaking change freeze point of the release cycle.
So for future removals, until master allows breaking changes again, I should target staging? If I need to remove a package, of course.
You’ll need to wait until approximately November 14; the reason the freeze ends earlier on staging{,-next} is just because those branches won’t get merged until after the branch‐off happens. You can apply the shiny new label to PRs that are waiting for that to happen (and preferably mark them as drafts to avoid accidental merges).
|
gharchive/pull-request
| 2024-10-24T06:02:14 |
2025-04-01T04:55:26.659380
|
{
"authors": [
"Scrumplex",
"emilazy",
"pyrox0"
],
"repo": "NixOS/nixpkgs",
"url": "https://github.com/NixOS/nixpkgs/pull/350862",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
298090907
|
docker-credential-gcr: init at 1.4.3
Add a (Golang-based) binary that provides authentication to Docker repositories hosted on Google Container Registry (GCR). This tool is required when using https://gcr.io based Docker repositories hosted on GCP, and makes use of various Google OAuth features to authenticate.
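For context on how Docker finds such a helper: Docker looks up the `credHelpers` map in `~/.docker/config.json` and invokes the binary named `docker-credential-<value>` for the matching registry. A minimal sketch of that mapping (shown only as an illustration; the tool's `configure-docker` subcommand writes an equivalent entry for you):

```json
{
  "credHelpers": {
    "gcr.io": "gcr"
  }
}
```

With this in place, `docker pull gcr.io/...` will call `docker-credential-gcr` to obtain credentials.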
Things done
[ ] Tested using sandboxing (nix.useSandbox on NixOS, or option build-use-sandbox in nix.conf on non-NixOS)
Built on platform(s)
[x] NixOS
[ ] macOS
[ ] other Linux distributions
[ ] Tested via one or more NixOS test(s) if existing and applicable for the change (look inside nixos/tests)
[ ] Tested compilation of all pkgs that depend on this change using nix-shell -p nox --run "nox-review wip"
[x] Tested execution of all binary files (usually in ./result/bin/)
[x] Fits CONTRIBUTING.md.
I only installed NixOS over the weekend, and I'm planning to get back to using it. Obviously, I've forgotten a lot of how it works. Let me know if there's anything missing or wrong with this package request.
Also, is it possible to add the local nixpkgs repo as some form of source (like a channel) and install this tool declaratively via configuration.nix? For example, I already have a channel set up for unstable, so I can pick out packages from that channel with a prefix (e.g. unstable.spotify). Can I do the same for a local repo with some sort of prefix until this lands in unstable? I'm sure this is possible; I'm just not sure how best to do it as a beginner. Any pointers would be much appreciated.
Something like either of these would be super:
https://nixos.wiki/wiki/Cheatsheet#Upgrading_individual_packages_to_a_different_channel
https://nixos.wiki/wiki/FAQ#How_can_I_install_a_package_from_unstable_while_remaining_on_the_stable_channel.3F
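A non-authoritative sketch of one way to do this (the checkout path and the `local` prefix are placeholders, not anything from this PR): expose a local nixpkgs working tree under its own attribute prefix in configuration.nix, analogous to the `unstable.` prefix mentioned above.

```nix
# Hypothetical sketch — /home/me/src/nixpkgs and the `local` name are
# placeholders for your own checkout and preferred prefix.
{ config, pkgs, ... }:
{
  nixpkgs.config.packageOverrides = pkgs: {
    # Make the local checkout available as pkgs.local.<name>
    local = import /home/me/src/nixpkgs {
      config = config.nixpkgs.config;
    };
  };

  environment.systemPackages = [
    pkgs.local.docker-credential-gcr  # picked from the local checkout
  ];
}
```

This mirrors the "pin a second package set" pattern from the wiki links above, just pointed at a directory instead of a channel.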
@nlewo Updated !
@GrahamcOfBorg build docker-credential-gcr
Success on aarch64-linux (full log)
Partial log (click to expand)
shrinking /nix/store/73717jiqg24iq6kgk49qfdzq9cligv5f-docker-credential-gcr-1.4.3-bin/bin/docker-credential-gcr
strip is /nix/store/skd6ix5ipkyhxzq7naylj4digawakl4j-binutils-2.28.1/bin/strip
stripping (with command strip and flags -S) in /nix/store/73717jiqg24iq6kgk49qfdzq9cligv5f-docker-credential-gcr-1.4.3-bin/bin
patching script interpreter paths in /nix/store/73717jiqg24iq6kgk49qfdzq9cligv5f-docker-credential-gcr-1.4.3-bin
checking for references to /build in /nix/store/73717jiqg24iq6kgk49qfdzq9cligv5f-docker-credential-gcr-1.4.3-bin...
shrinking RPATHs of ELF executables and libraries in /nix/store/ykwhjd8pqix9k6agc4llfz0bfndnl80c-docker-credential-gcr-1.4.3
strip is /nix/store/skd6ix5ipkyhxzq7naylj4digawakl4j-binutils-2.28.1/bin/strip
patching script interpreter paths in /nix/store/ykwhjd8pqix9k6agc4llfz0bfndnl80c-docker-credential-gcr-1.4.3
checking for references to /build in /nix/store/ykwhjd8pqix9k6agc4llfz0bfndnl80c-docker-credential-gcr-1.4.3...
/nix/store/73717jiqg24iq6kgk49qfdzq9cligv5f-docker-credential-gcr-1.4.3-bin
Success on x86_64-linux (full log)
Partial log (click to expand)
shrinking /nix/store/x6gbhs013kygifriqmpfy7fcnzn08sbl-docker-credential-gcr-1.4.3-bin/bin/docker-credential-gcr
strip is /nix/store/adidfx4pa7vmvby0gjqqmiwg2x49yr27-binutils-2.28.1/bin/strip
stripping (with command strip and flags -S) in /nix/store/x6gbhs013kygifriqmpfy7fcnzn08sbl-docker-credential-gcr-1.4.3-bin/bin
patching script interpreter paths in /nix/store/x6gbhs013kygifriqmpfy7fcnzn08sbl-docker-credential-gcr-1.4.3-bin
checking for references to /tmp/nix-build-docker-credential-gcr-1.4.3.drv-0 in /nix/store/x6gbhs013kygifriqmpfy7fcnzn08sbl-docker-credential-gcr-1.4.3-bin...
shrinking RPATHs of ELF executables and libraries in /nix/store/w169xvaq8x60g7g1kw2l8awbn5ahnpcy-docker-credential-gcr-1.4.3
strip is /nix/store/adidfx4pa7vmvby0gjqqmiwg2x49yr27-binutils-2.28.1/bin/strip
patching script interpreter paths in /nix/store/w169xvaq8x60g7g1kw2l8awbn5ahnpcy-docker-credential-gcr-1.4.3
checking for references to /tmp/nix-build-docker-credential-gcr-1.4.3.drv-0 in /nix/store/w169xvaq8x60g7g1kw2l8awbn5ahnpcy-docker-credential-gcr-1.4.3...
/nix/store/x6gbhs013kygifriqmpfy7fcnzn08sbl-docker-credential-gcr-1.4.3-bin
Thanks!
|
gharchive/pull-request
| 2018-02-18T14:58:22 |
2025-04-01T04:55:26.669290
|
{
"authors": [
"GrahamcOfBorg",
"nlewo",
"suvash"
],
"repo": "NixOS/nixpkgs",
"url": "https://github.com/NixOS/nixpkgs/pull/35117",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2621732283
|
factorio: 2.0.11 -> 2.0.12
Things done
Built on platform(s)
[x] x86_64-linux
[ ] aarch64-linux
[ ] x86_64-darwin
[ ] aarch64-darwin
For non-Linux: Is sandboxing enabled in nix.conf? (See Nix manual)
[ ] sandbox = relaxed
[ ] sandbox = true
[ ] Tested, as applicable:
NixOS test(s) (look inside nixos/tests)
and/or package tests
or, for functions and "core" functionality, tests in lib/tests or pkgs/test
made sure NixOS tests are linked to the relevant packages
[ ] Tested compilation of all packages that depend on this change using nix-shell -p nixpkgs-review --run "nixpkgs-review rev HEAD". Note: all changes have to be committed, also see nixpkgs-review usage
[ ] Tested basic functionality of all binary files (usually in ./result/bin/)
24.11 Release Notes (or backporting 23.11 and 24.05 Release notes)
[ ] (Package updates) Added a release notes entry if the change is major or breaking
[ ] (Module updates) Added a release notes entry if the change is significant
[ ] (Module addition) Added a release notes entry if adding a new NixOS module
[ ] Fits CONTRIBUTING.md.
Add a :+1: reaction to pull requests you find important.
2.0.13 released
Tested and working fine
|
gharchive/pull-request
| 2024-10-29T16:11:45 |
2025-04-01T04:55:26.679137
|
{
"authors": [
"greaka",
"oluceps"
],
"repo": "NixOS/nixpkgs",
"url": "https://github.com/NixOS/nixpkgs/pull/352137",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2662222729
|
python312Packages.appthreat-vulnerability-db: 6.1.1 -> 6.2.1
Diff: https://github.com/AppThreat/vulnerability-db/compare/refs/tags/v6.1.1...v6.2.1
Changelog: https://github.com/AppThreat/vulnerability-db/releases/tag/v6.2.1
Things done
Built on platform(s)
[ ] x86_64-linux
[ ] aarch64-linux
[ ] x86_64-darwin
[ ] aarch64-darwin
For non-Linux: Is sandboxing enabled in nix.conf? (See Nix manual)
[ ] sandbox = relaxed
[ ] sandbox = true
[ ] Tested, as applicable:
NixOS test(s) (look inside nixos/tests)
and/or package tests
or, for functions and "core" functionality, tests in lib/tests or pkgs/test
made sure NixOS tests are linked to the relevant packages
[ ] Tested compilation of all packages that depend on this change using nix-shell -p nixpkgs-review --run "nixpkgs-review rev HEAD". Note: all changes have to be committed, also see nixpkgs-review usage
[ ] Tested basic functionality of all binary files (usually in ./result/bin/)
25.05 Release Notes (or backporting 24.11 and 25.05 Release notes)
[ ] (Package updates) Added a release notes entry if the change is major or breaking
[ ] (Module updates) Added a release notes entry if the change is significant
[ ] (Module addition) Added a release notes entry if adding a new NixOS module
[ ] Fits CONTRIBUTING.md.
Add a :+1: reaction to pull requests you find important.
nixpkgs-review result
Generated using nixpkgs-review.
Command: nixpkgs-review pr 356187
x86_64-linux
:white_check_mark: 6 packages built:
dep-scan
dep-scan.dist
python311Packages.appthreat-vulnerability-db
python311Packages.appthreat-vulnerability-db.dist
python312Packages.appthreat-vulnerability-db
python312Packages.appthreat-vulnerability-db.dist
|
gharchive/pull-request
| 2024-11-15T14:51:25 |
2025-04-01T04:55:26.690467
|
{
"authors": [
"fabaff"
],
"repo": "NixOS/nixpkgs",
"url": "https://github.com/NixOS/nixpkgs/pull/356187",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2664134931
|
python312Packages.rzpipe: 0.6.0 -> 0.6.2
Changelog: https://github.com/rizinorg/rizin/releases/tag/v0.6.2
Things done
Built on platform(s)
[ ] x86_64-linux
[ ] aarch64-linux
[ ] x86_64-darwin
[ ] aarch64-darwin
For non-Linux: Is sandboxing enabled in nix.conf? (See Nix manual)
[ ] sandbox = relaxed
[ ] sandbox = true
[ ] Tested, as applicable:
NixOS test(s) (look inside nixos/tests)
and/or package tests
or, for functions and "core" functionality, tests in lib/tests or pkgs/test
made sure NixOS tests are linked to the relevant packages
[ ] Tested compilation of all packages that depend on this change using nix-shell -p nixpkgs-review --run "nixpkgs-review rev HEAD". Note: all changes have to be committed, also see nixpkgs-review usage
[ ] Tested basic functionality of all binary files (usually in ./result/bin/)
25.05 Release Notes (or backporting 24.11 and 25.05 Release notes)
[ ] (Package updates) Added a release notes entry if the change is major or breaking
[ ] (Module updates) Added a release notes entry if the change is significant
[ ] (Module addition) Added a release notes entry if adding a new NixOS module
[ ] Fits CONTRIBUTING.md.
Add a :+1: reaction to pull requests you find important.
nixpkgs-review result
Generated using nixpkgs-review.
Command: nixpkgs-review pr 356435
x86_64-linux
:white_check_mark: 9 packages built:
apkleaks
apkleaks.dist
jadx
python311Packages.rzpipe
python311Packages.rzpipe.dist
python312Packages.rzpipe
python312Packages.rzpipe.dist
quark-engine
quark-engine.dist
|
gharchive/pull-request
| 2024-11-16T10:32:25 |
2025-04-01T04:55:26.701585
|
{
"authors": [
"fabaff"
],
"repo": "NixOS/nixpkgs",
"url": "https://github.com/NixOS/nixpkgs/pull/356435",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2683984939
|
xdg-desktop-portal-termfilechooser: init at 0-unstable-2021-07-14
This is the patch of the termfilechooser package from #277531 applied to current master.
Closes #213438.
Things done
Built on platform(s)
[x] x86_64-linux
[ ] aarch64-linux
[ ] x86_64-darwin
[ ] aarch64-darwin
For non-Linux: Is sandboxing enabled in nix.conf? (See Nix manual)
[ ] sandbox = relaxed
[ ] sandbox = true
[ ] Tested, as applicable:
NixOS test(s) (look inside nixos/tests)
and/or package tests
or, for functions and "core" functionality, tests in lib/tests or pkgs/test
made sure NixOS tests are linked to the relevant packages
[x] Tested compilation of all packages that depend on this change using nix-shell -p nixpkgs-review --run "nixpkgs-review rev HEAD". Note: all changes have to be committed, also see nixpkgs-review usage
[x] Tested basic functionality of all binary files (usually in ./result/bin/)
25.05 Release Notes (or backporting 24.11 and 25.05 Release notes)
[ ] (Package updates) Added a release notes entry if the change is major or breaking
[ ] (Module updates) Added a release notes entry if the change is significant
[ ] (Module addition) Added a release notes entry if adding a new NixOS module
[x] Fits CONTRIBUTING.md.
Add a :+1: reaction to pull requests you find important.
The original repository has not been maintained for a while now.
This fork seems to be active, and it would be great to have it merged instead, since it also offers more customisations and such:
https://github.com/boydaihungst/xdg-desktop-portal-termfilechooser
|
gharchive/pull-request
| 2024-11-22T17:28:46 |
2025-04-01T04:55:26.711545
|
{
"authors": [
"Eddio0141",
"bpeetz"
],
"repo": "NixOS/nixpkgs",
"url": "https://github.com/NixOS/nixpkgs/pull/358205",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
300886653
|
tvheadend: 4.2.4 -> 4.2.5
Semi-automatic update. These checks were performed:
built on NixOS
ran /nix/store/ffbiwxbfdywyhn0ng9lz8zpzjkvaswah-tvheadend-4.2.5/bin/tvheadend -h got 0 exit code
ran /nix/store/ffbiwxbfdywyhn0ng9lz8zpzjkvaswah-tvheadend-4.2.5/bin/tvheadend --help got 0 exit code
ran /nix/store/ffbiwxbfdywyhn0ng9lz8zpzjkvaswah-tvheadend-4.2.5/bin/tvheadend help got 0 exit code
ran /nix/store/ffbiwxbfdywyhn0ng9lz8zpzjkvaswah-tvheadend-4.2.5/bin/tvheadend -V and found version 4.2.5
ran /nix/store/ffbiwxbfdywyhn0ng9lz8zpzjkvaswah-tvheadend-4.2.5/bin/tvheadend -v and found version 4.2.5
ran /nix/store/ffbiwxbfdywyhn0ng9lz8zpzjkvaswah-tvheadend-4.2.5/bin/tvheadend --version and found version 4.2.5
ran /nix/store/ffbiwxbfdywyhn0ng9lz8zpzjkvaswah-tvheadend-4.2.5/bin/tvheadend version and found version 4.2.5
ran /nix/store/ffbiwxbfdywyhn0ng9lz8zpzjkvaswah-tvheadend-4.2.5/bin/tvheadend -h and found version 4.2.5
ran /nix/store/ffbiwxbfdywyhn0ng9lz8zpzjkvaswah-tvheadend-4.2.5/bin/tvheadend --help and found version 4.2.5
ran /nix/store/ffbiwxbfdywyhn0ng9lz8zpzjkvaswah-tvheadend-4.2.5/bin/tvheadend help and found version 4.2.5
ran /nix/store/ffbiwxbfdywyhn0ng9lz8zpzjkvaswah-tvheadend-4.2.5/bin/.tvheadend-wrapped -h got 0 exit code
ran /nix/store/ffbiwxbfdywyhn0ng9lz8zpzjkvaswah-tvheadend-4.2.5/bin/.tvheadend-wrapped --help got 0 exit code
ran /nix/store/ffbiwxbfdywyhn0ng9lz8zpzjkvaswah-tvheadend-4.2.5/bin/.tvheadend-wrapped help got 0 exit code
ran /nix/store/ffbiwxbfdywyhn0ng9lz8zpzjkvaswah-tvheadend-4.2.5/bin/.tvheadend-wrapped -V and found version 4.2.5
ran /nix/store/ffbiwxbfdywyhn0ng9lz8zpzjkvaswah-tvheadend-4.2.5/bin/.tvheadend-wrapped -v and found version 4.2.5
ran /nix/store/ffbiwxbfdywyhn0ng9lz8zpzjkvaswah-tvheadend-4.2.5/bin/.tvheadend-wrapped --version and found version 4.2.5
ran /nix/store/ffbiwxbfdywyhn0ng9lz8zpzjkvaswah-tvheadend-4.2.5/bin/.tvheadend-wrapped version and found version 4.2.5
ran /nix/store/ffbiwxbfdywyhn0ng9lz8zpzjkvaswah-tvheadend-4.2.5/bin/.tvheadend-wrapped -h and found version 4.2.5
ran /nix/store/ffbiwxbfdywyhn0ng9lz8zpzjkvaswah-tvheadend-4.2.5/bin/.tvheadend-wrapped --help and found version 4.2.5
ran /nix/store/ffbiwxbfdywyhn0ng9lz8zpzjkvaswah-tvheadend-4.2.5/bin/.tvheadend-wrapped help and found version 4.2.5
found 4.2.5 with grep in /nix/store/ffbiwxbfdywyhn0ng9lz8zpzjkvaswah-tvheadend-4.2.5
cc "@simonvandel"
Failure on x86_64-darwin (full log)
Partial log (click to expand)
In file included from src/uuid.c:20:
In file included from /private/tmp/nix-build-tvheadend-4.2.5.drv-0/source/src/tvheadend.h:41:
In file included from /private/tmp/nix-build-tvheadend-4.2.5.drv-0/source/src/tvhlog.h:33:
/private/tmp/nix-build-tvheadend-4.2.5.drv-0/source/src/clock.h:29:2: error: "Platforms without monotonic clocks are not supported!"
#error "Platforms without monotonic clocks are not supported!"
^
1 error generated.
make: *** [Makefile:635: /private/tmp/nix-build-tvheadend-4.2.5.drv-0/source/build.darwin/src/uuid.o] Error 1
builder for '/nix/store/rh9nng5x6yfjfig8vmr7grmnnq5yfipp-tvheadend-4.2.5.drv' failed with exit code 2
error: build of '/nix/store/rh9nng5x6yfjfig8vmr7grmnnq5yfipp-tvheadend-4.2.5.drv' failed
Success on x86_64-linux (full log)
Partial log (click to expand)
find /nix/store/ffbiwxbfdywyhn0ng9lz8zpzjkvaswah-tvheadend-4.2.5/share/tvheadend -name .git -exec rm -rf {} \; &>/dev/null || /bin/true
post-installation fixup
shrinking RPATHs of ELF executables and libraries in /nix/store/ffbiwxbfdywyhn0ng9lz8zpzjkvaswah-tvheadend-4.2.5
shrinking /nix/store/ffbiwxbfdywyhn0ng9lz8zpzjkvaswah-tvheadend-4.2.5/bin/.tvheadend-wrapped
gzipping man pages under /nix/store/ffbiwxbfdywyhn0ng9lz8zpzjkvaswah-tvheadend-4.2.5/share/man/
strip is /nix/store/b0zlxla7dmy1iwc3g459rjznx59797xy-binutils-2.28.1/bin/strip
stripping (with command strip and flags -S) in /nix/store/ffbiwxbfdywyhn0ng9lz8zpzjkvaswah-tvheadend-4.2.5/bin
patching script interpreter paths in /nix/store/ffbiwxbfdywyhn0ng9lz8zpzjkvaswah-tvheadend-4.2.5
checking for references to /tmp/nix-build-tvheadend-4.2.5.drv-0 in /nix/store/ffbiwxbfdywyhn0ng9lz8zpzjkvaswah-tvheadend-4.2.5...
/nix/store/ffbiwxbfdywyhn0ng9lz8zpzjkvaswah-tvheadend-4.2.5
Success on aarch64-linux (full log)
Partial log (click to expand)
find /nix/store/pvyd0xh63acgcr5qb67dgyhimflqb3dv-tvheadend-4.2.5/share/tvheadend -name .git -exec rm -rf {} \; &>/dev/null || /bin/true
post-installation fixup
shrinking RPATHs of ELF executables and libraries in /nix/store/pvyd0xh63acgcr5qb67dgyhimflqb3dv-tvheadend-4.2.5
shrinking /nix/store/pvyd0xh63acgcr5qb67dgyhimflqb3dv-tvheadend-4.2.5/bin/.tvheadend-wrapped
gzipping man pages under /nix/store/pvyd0xh63acgcr5qb67dgyhimflqb3dv-tvheadend-4.2.5/share/man/
strip is /nix/store/lvx1acn1ig1j2km8jds5x3ggh3f2wa8v-binutils-2.28.1/bin/strip
stripping (with command strip and flags -S) in /nix/store/pvyd0xh63acgcr5qb67dgyhimflqb3dv-tvheadend-4.2.5/bin
patching script interpreter paths in /nix/store/pvyd0xh63acgcr5qb67dgyhimflqb3dv-tvheadend-4.2.5
checking for references to /build in /nix/store/pvyd0xh63acgcr5qb67dgyhimflqb3dv-tvheadend-4.2.5...
/nix/store/pvyd0xh63acgcr5qb67dgyhimflqb3dv-tvheadend-4.2.5
|
gharchive/pull-request
| 2018-02-28T04:21:42 |
2025-04-01T04:55:26.722682
|
{
"authors": [
"GrahamcOfBorg",
"ryantm"
],
"repo": "NixOS/nixpkgs",
"url": "https://github.com/NixOS/nixpkgs/pull/36072",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
324064082
|
debianutils: 4.8.4 -> 4.8.6
Semi-automatic update generated by https://github.com/ryantm/nixpkgs-update tools.
This update was made based on information from https://repology.org/metapackage/debianutils/versions.
These checks were done:
built on NixOS
/nix/store/lzdwpfz2xm977k12qpkvilaaw3sg6f4l-debianutils-4.8.6/bin/run-parts passed the binary check.
/nix/store/lzdwpfz2xm977k12qpkvilaaw3sg6f4l-debianutils-4.8.6/bin/tempfile passed the binary check.
/nix/store/lzdwpfz2xm977k12qpkvilaaw3sg6f4l-debianutils-4.8.6/bin/ischroot passed the binary check.
Warning: no invocation of /nix/store/lzdwpfz2xm977k12qpkvilaaw3sg6f4l-debianutils-4.8.6/bin/which had a zero exit code or showed the expected version
/nix/store/lzdwpfz2xm977k12qpkvilaaw3sg6f4l-debianutils-4.8.6/bin/savelog passed the binary check.
Warning: no invocation of /nix/store/lzdwpfz2xm977k12qpkvilaaw3sg6f4l-debianutils-4.8.6/bin/add-shell had a zero exit code or showed the expected version
Warning: no invocation of /nix/store/lzdwpfz2xm977k12qpkvilaaw3sg6f4l-debianutils-4.8.6/bin/installkernel had a zero exit code or showed the expected version
Warning: no invocation of /nix/store/lzdwpfz2xm977k12qpkvilaaw3sg6f4l-debianutils-4.8.6/bin/remove-shell had a zero exit code or showed the expected version
4 of 8 passed binary check by having a zero exit code.
1 of 8 passed binary check by having the new version present in output.
found 4.8.6 with grep in /nix/store/lzdwpfz2xm977k12qpkvilaaw3sg6f4l-debianutils-4.8.6
directory tree listing: https://gist.github.com/d23555dbc7c71f2c0b98dcd75d09c013
du listing: https://gist.github.com/58757b8859445f56eba651aa845fe59f
meta.description for debianutils is: "Miscellaneous utilities specific to Debian".
Success on aarch64-linux (full log)
Attempted: debianutils
Partial log (click to expand)
stripping (with command strip and flags -S) in /nix/store/gmfdcjq3hyvb7r1xwwfrasyv2y0ml60c-debianutils-4.8.6/bin /nix/store/gmfdcjq3hyvb7r1xwwfrasyv2y0ml60c-debianutils-4.8.6/sbin
patching script interpreter paths in /nix/store/gmfdcjq3hyvb7r1xwwfrasyv2y0ml60c-debianutils-4.8.6
/nix/store/gmfdcjq3hyvb7r1xwwfrasyv2y0ml60c-debianutils-4.8.6/sbin/remove-shell: interpreter directive changed from "/bin/sh -e" to "/nix/store/l4w7xwjy2nmk31fl5kgyy7gg8z7l9n8z-bash-4.4-p19/bin/sh -e"
/nix/store/gmfdcjq3hyvb7r1xwwfrasyv2y0ml60c-debianutils-4.8.6/sbin/add-shell: interpreter directive changed from "/bin/sh -e" to "/nix/store/l4w7xwjy2nmk31fl5kgyy7gg8z7l9n8z-bash-4.4-p19/bin/sh -e"
/nix/store/gmfdcjq3hyvb7r1xwwfrasyv2y0ml60c-debianutils-4.8.6/sbin/installkernel: interpreter directive changed from "/bin/sh" to "/nix/store/l4w7xwjy2nmk31fl5kgyy7gg8z7l9n8z-bash-4.4-p19/bin/sh"
/nix/store/gmfdcjq3hyvb7r1xwwfrasyv2y0ml60c-debianutils-4.8.6/bin/savelog: interpreter directive changed from " /bin/sh" to "/nix/store/l4w7xwjy2nmk31fl5kgyy7gg8z7l9n8z-bash-4.4-p19/bin/sh"
/nix/store/gmfdcjq3hyvb7r1xwwfrasyv2y0ml60c-debianutils-4.8.6/bin/which: interpreter directive changed from " /bin/sh" to "/nix/store/l4w7xwjy2nmk31fl5kgyy7gg8z7l9n8z-bash-4.4-p19/bin/sh"
checking for references to /build in /nix/store/gmfdcjq3hyvb7r1xwwfrasyv2y0ml60c-debianutils-4.8.6...
moving /nix/store/gmfdcjq3hyvb7r1xwwfrasyv2y0ml60c-debianutils-4.8.6/sbin/* to /nix/store/gmfdcjq3hyvb7r1xwwfrasyv2y0ml60c-debianutils-4.8.6/bin
/nix/store/gmfdcjq3hyvb7r1xwwfrasyv2y0ml60c-debianutils-4.8.6
Success on x86_64-linux (full log)
Attempted: debianutils
Partial log (click to expand)
stripping (with command strip and flags -S) in /nix/store/lzdwpfz2xm977k12qpkvilaaw3sg6f4l-debianutils-4.8.6/bin /nix/store/lzdwpfz2xm977k12qpkvilaaw3sg6f4l-debianutils-4.8.6/sbin
patching script interpreter paths in /nix/store/lzdwpfz2xm977k12qpkvilaaw3sg6f4l-debianutils-4.8.6
/nix/store/lzdwpfz2xm977k12qpkvilaaw3sg6f4l-debianutils-4.8.6/bin/which: interpreter directive changed from " /bin/sh" to "/nix/store/xn5gv3lpfy91yvfy9b0i7klfcxh9xskz-bash-4.4-p19/bin/sh"
/nix/store/lzdwpfz2xm977k12qpkvilaaw3sg6f4l-debianutils-4.8.6/bin/savelog: interpreter directive changed from " /bin/sh" to "/nix/store/xn5gv3lpfy91yvfy9b0i7klfcxh9xskz-bash-4.4-p19/bin/sh"
/nix/store/lzdwpfz2xm977k12qpkvilaaw3sg6f4l-debianutils-4.8.6
/nix/store/lzdwpfz2xm977k12qpkvilaaw3sg6f4l-debianutils-4.8.6/sbin/installkernel: interpreter directive changed from "/bin/sh" to "/nix/store/xn5gv3lpfy91yvfy9b0i7klfcxh9xskz-bash-4.4-p19/bin/sh"
/nix/store/lzdwpfz2xm977k12qpkvilaaw3sg6f4l-debianutils-4.8.6/sbin/remove-shell: interpreter directive changed from "/bin/sh -e" to "/nix/store/xn5gv3lpfy91yvfy9b0i7klfcxh9xskz-bash-4.4-p19/bin/sh -e"
/nix/store/lzdwpfz2xm977k12qpkvilaaw3sg6f4l-debianutils-4.8.6/sbin/add-shell: interpreter directive changed from "/bin/sh -e" to "/nix/store/xn5gv3lpfy91yvfy9b0i7klfcxh9xskz-bash-4.4-p19/bin/sh -e"
checking for references to /build in /nix/store/lzdwpfz2xm977k12qpkvilaaw3sg6f4l-debianutils-4.8.6...
moving /nix/store/lzdwpfz2xm977k12qpkvilaaw3sg6f4l-debianutils-4.8.6/sbin/* to /nix/store/lzdwpfz2xm977k12qpkvilaaw3sg6f4l-debianutils-4.8.6/bin
Success on x86_64-darwin (full log)
Attempted: debianutils
Partial log (click to expand)
strip is /nix/store/kdff2gim6417493yha769kh00n63lnrw-cctools-binutils-darwin/bin/strip
stripping (with command strip and flags -S) in /nix/store/s9p31za5nzm0is7b67qyk7znpilz9myh-debianutils-4.8.6/bin /nix/store/s9p31za5nzm0is7b67qyk7znpilz9myh-debianutils-4.8.6/sbin
patching script interpreter paths in /nix/store/s9p31za5nzm0is7b67qyk7znpilz9myh-debianutils-4.8.6
/nix/store/s9p31za5nzm0is7b67qyk7znpilz9myh-debianutils-4.8.6/bin/savelog: interpreter directive changed from " /bin/sh" to "/nix/store/r8bx3qf1bpncb14i9gzma4vr089pc3pv-bash-4.4-p19/bin/sh"
/nix/store/s9p31za5nzm0is7b67qyk7znpilz9myh-debianutils-4.8.6/bin/which: interpreter directive changed from " /bin/sh" to "/nix/store/r8bx3qf1bpncb14i9gzma4vr089pc3pv-bash-4.4-p19/bin/sh"
/nix/store/s9p31za5nzm0is7b67qyk7znpilz9myh-debianutils-4.8.6/sbin/add-shell: interpreter directive changed from "/bin/sh -e" to "/nix/store/r8bx3qf1bpncb14i9gzma4vr089pc3pv-bash-4.4-p19/bin/sh -e"
/nix/store/s9p31za5nzm0is7b67qyk7znpilz9myh-debianutils-4.8.6/sbin/installkernel: interpreter directive changed from "/bin/sh" to "/nix/store/r8bx3qf1bpncb14i9gzma4vr089pc3pv-bash-4.4-p19/bin/sh"
/nix/store/s9p31za5nzm0is7b67qyk7znpilz9myh-debianutils-4.8.6/sbin/remove-shell: interpreter directive changed from "/bin/sh -e" to "/nix/store/r8bx3qf1bpncb14i9gzma4vr089pc3pv-bash-4.4-p19/bin/sh -e"
moving /nix/store/s9p31za5nzm0is7b67qyk7znpilz9myh-debianutils-4.8.6/sbin/* to /nix/store/s9p31za5nzm0is7b67qyk7znpilz9myh-debianutils-4.8.6/bin
/nix/store/s9p31za5nzm0is7b67qyk7znpilz9myh-debianutils-4.8.6
|
gharchive/pull-request
| 2018-05-17T15:04:14 |
2025-04-01T04:55:26.733407
|
{
"authors": [
"GrahamcOfBorg",
"r-ryantm"
],
"repo": "NixOS/nixpkgs",
"url": "https://github.com/NixOS/nixpkgs/pull/40678",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
324191791
|
elfutils: add patch that fixes compilation under gcc8
Motivation for this change
Was failing: https://hydra.nixos.org/build/73828348/nixlog/2
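Not the actual patch from this PR — just a generic sketch of how an upstream fix is commonly pulled into a nixpkgs derivation with `fetchpatch` (the URL commit and hash are placeholders):

```nix
# Hypothetical sketch only; the real PR carries its own patch.
elfutils = super.elfutils.overrideAttrs (old: {
  patches = (old.patches or [ ]) ++ [
    (super.fetchpatch {
      url = "https://sourceware.org/git/?p=elfutils.git;a=patch;h=<commit>";
      sha256 = "<hash>";
    })
  ];
});
```

Referencing the upstream commit this way keeps the fix self-documenting and easy to drop once a release containing it lands.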
Things done
[ ] Tested using sandboxing (nix.useSandbox on NixOS, or option build-use-sandbox in nix.conf on non-NixOS)
Built on platform(s)
[x] NixOS
[ ] macOS
[ ] other Linux distributions
[ ] Tested via one or more NixOS test(s) if existing and applicable for the change (look inside nixos/tests)
[ ] Tested compilation of all pkgs that depend on this change using nix-shell -p nox --run "nox-review wip"
[ ] Tested execution of all binary files (usually in ./result/bin/)
[x] Fits CONTRIBUTING.md.
Success on x86_64-linux (full log)
Attempted: elfutils
Partial log (click to expand)
shrinking /nix/store/npf2h77rgvj9wpysg0p5fqgfb6abda2p-elfutils-0.170/lib/elfutils/libebl_arm-0.170.so
shrinking /nix/store/npf2h77rgvj9wpysg0p5fqgfb6abda2p-elfutils-0.170/lib/elfutils/libebl_bpf-0.170.so
shrinking /nix/store/npf2h77rgvj9wpysg0p5fqgfb6abda2p-elfutils-0.170/lib/libelf-0.170.so
shrinking /nix/store/npf2h77rgvj9wpysg0p5fqgfb6abda2p-elfutils-0.170/lib/libdw-0.170.so
strip is /nix/store/85wqgd5aj4g57g1fsrnmdbq4mf1kz957-binutils-2.30/bin/strip
stripping (with command strip and flags -S) in /nix/store/npf2h77rgvj9wpysg0p5fqgfb6abda2p-elfutils-0.170/lib /nix/store/npf2h77rgvj9wpysg0p5fqgfb6abda2p-elfutils-0.170/bin
patching script interpreter paths in /nix/store/npf2h77rgvj9wpysg0p5fqgfb6abda2p-elfutils-0.170
/nix/store/npf2h77rgvj9wpysg0p5fqgfb6abda2p-elfutils-0.170/bin/eu-make-debug-archive: interpreter directive changed from "/bin/sh" to "/nix/store/sq6jb5limj5dsxbglam036fnvlfmn81f-bash-4.4-p19/bin/sh"
checking for references to /build in /nix/store/npf2h77rgvj9wpysg0p5fqgfb6abda2p-elfutils-0.170...
/nix/store/npf2h77rgvj9wpysg0p5fqgfb6abda2p-elfutils-0.170
Success on aarch64-linux (full log)
Attempted: elfutils
Partial log (click to expand)
shrinking /nix/store/vzh7alpbbnhd18ymxfr4qbwy746mvxi2-elfutils-0.170/lib/elfutils/libebl_i386-0.170.so
shrinking /nix/store/vzh7alpbbnhd18ymxfr4qbwy746mvxi2-elfutils-0.170/lib/libasm-0.170.so
shrinking /nix/store/vzh7alpbbnhd18ymxfr4qbwy746mvxi2-elfutils-0.170/lib/libdw-0.170.so
shrinking /nix/store/vzh7alpbbnhd18ymxfr4qbwy746mvxi2-elfutils-0.170/lib/libelf-0.170.so
strip is /nix/store/ks7k1wdljx2knaayzr528cwbj6v970km-binutils-2.30/bin/strip
stripping (with command strip and flags -S) in /nix/store/vzh7alpbbnhd18ymxfr4qbwy746mvxi2-elfutils-0.170/lib /nix/store/vzh7alpbbnhd18ymxfr4qbwy746mvxi2-elfutils-0.170/bin
patching script interpreter paths in /nix/store/vzh7alpbbnhd18ymxfr4qbwy746mvxi2-elfutils-0.170
/nix/store/vzh7alpbbnhd18ymxfr4qbwy746mvxi2-elfutils-0.170/bin/eu-make-debug-archive: interpreter directive changed from "/bin/sh" to "/nix/store/xdzakdwslnr0skxr9y6lr475c8sra4h1-bash-4.4-p19/bin/sh"
checking for references to /build in /nix/store/vzh7alpbbnhd18ymxfr4qbwy746mvxi2-elfutils-0.170...
/nix/store/vzh7alpbbnhd18ymxfr4qbwy746mvxi2-elfutils-0.170
Success on x86_64-linux (full log)
Attempted: elfutils
Partial log (click to expand)
shrinking /nix/store/cj8pd8zlpxma7i59msgag11pwdcms5iz-elfutils-0.170/lib/elfutils/libebl_arm-0.170.so
shrinking /nix/store/cj8pd8zlpxma7i59msgag11pwdcms5iz-elfutils-0.170/lib/elfutils/libebl_bpf-0.170.so
shrinking /nix/store/cj8pd8zlpxma7i59msgag11pwdcms5iz-elfutils-0.170/lib/libelf-0.170.so
shrinking /nix/store/cj8pd8zlpxma7i59msgag11pwdcms5iz-elfutils-0.170/lib/libdw-0.170.so
strip is /nix/store/85wqgd5aj4g57g1fsrnmdbq4mf1kz957-binutils-2.30/bin/strip
stripping (with command strip and flags -S) in /nix/store/cj8pd8zlpxma7i59msgag11pwdcms5iz-elfutils-0.170/lib /nix/store/cj8pd8zlpxma7i59msgag11pwdcms5iz-elfutils-0.170/bin
patching script interpreter paths in /nix/store/cj8pd8zlpxma7i59msgag11pwdcms5iz-elfutils-0.170
/nix/store/cj8pd8zlpxma7i59msgag11pwdcms5iz-elfutils-0.170/bin/eu-make-debug-archive: interpreter directive changed from "/bin/sh" to "/nix/store/sq6jb5limj5dsxbglam036fnvlfmn81f-bash-4.4-p19/bin/sh"
checking for references to /build in /nix/store/cj8pd8zlpxma7i59msgag11pwdcms5iz-elfutils-0.170...
/nix/store/cj8pd8zlpxma7i59msgag11pwdcms5iz-elfutils-0.170
Success on aarch64-linux (full log)
Attempted: elfutils
Partial log (click to expand)
shrinking /nix/store/fg9k0dmzvyn2hiwxxdvqvvhxvhfiz664-elfutils-0.170/lib/elfutils/libebl_i386-0.170.so
shrinking /nix/store/fg9k0dmzvyn2hiwxxdvqvvhxvhfiz664-elfutils-0.170/lib/libasm-0.170.so
shrinking /nix/store/fg9k0dmzvyn2hiwxxdvqvvhxvhfiz664-elfutils-0.170/lib/libdw-0.170.so
shrinking /nix/store/fg9k0dmzvyn2hiwxxdvqvvhxvhfiz664-elfutils-0.170/lib/libelf-0.170.so
strip is /nix/store/ks7k1wdljx2knaayzr528cwbj6v970km-binutils-2.30/bin/strip
stripping (with command strip and flags -S) in /nix/store/fg9k0dmzvyn2hiwxxdvqvvhxvhfiz664-elfutils-0.170/lib /nix/store/fg9k0dmzvyn2hiwxxdvqvvhxvhfiz664-elfutils-0.170/bin
patching script interpreter paths in /nix/store/fg9k0dmzvyn2hiwxxdvqvvhxvhfiz664-elfutils-0.170
/nix/store/fg9k0dmzvyn2hiwxxdvqvvhxvhfiz664-elfutils-0.170/bin/eu-make-debug-archive: interpreter directive changed from "/bin/sh" to "/nix/store/xdzakdwslnr0skxr9y6lr475c8sra4h1-bash-4.4-p19/bin/sh"
checking for references to /build in /nix/store/fg9k0dmzvyn2hiwxxdvqvvhxvhfiz664-elfutils-0.170...
/nix/store/fg9k0dmzvyn2hiwxxdvqvvhxvhfiz664-elfutils-0.170
@jtojnar Better?
Success on x86_64-linux (full log)
Attempted: elfutils
Partial log (click to expand)
shrinking /nix/store/mycsi0qc9hjgzvl4j3dkfnnd0y58qflr-elfutils-0.170/bin/eu-ar
shrinking /nix/store/mycsi0qc9hjgzvl4j3dkfnnd0y58qflr-elfutils-0.170/bin/eu-unstrip
shrinking /nix/store/mycsi0qc9hjgzvl4j3dkfnnd0y58qflr-elfutils-0.170/bin/eu-stack
shrinking /nix/store/mycsi0qc9hjgzvl4j3dkfnnd0y58qflr-elfutils-0.170/bin/eu-elfcompress
strip is /nix/store/85wqgd5aj4g57g1fsrnmdbq4mf1kz957-binutils-2.30/bin/strip
stripping (with command strip and flags -S) in /nix/store/mycsi0qc9hjgzvl4j3dkfnnd0y58qflr-elfutils-0.170/lib /nix/store/mycsi0qc9hjgzvl4j3dkfnnd0y58qflr-elfutils-0.170/bin
patching script interpreter paths in /nix/store/mycsi0qc9hjgzvl4j3dkfnnd0y58qflr-elfutils-0.170
/nix/store/mycsi0qc9hjgzvl4j3dkfnnd0y58qflr-elfutils-0.170/bin/eu-make-debug-archive: interpreter directive changed from "/bin/sh" to "/nix/store/sq6jb5limj5dsxbglam036fnvlfmn81f-bash-4.4-p19/bin/sh"
checking for references to /build in /nix/store/mycsi0qc9hjgzvl4j3dkfnnd0y58qflr-elfutils-0.170...
/nix/store/mycsi0qc9hjgzvl4j3dkfnnd0y58qflr-elfutils-0.170
Success on aarch64-linux (full log)
Attempted: elfutils
Partial log (click to expand)
shrinking /nix/store/f8hzbrz6xw822hfdzvw0j8g7m95mcxrv-elfutils-0.170/lib/elfutils/libebl_i386-0.170.so
shrinking /nix/store/f8hzbrz6xw822hfdzvw0j8g7m95mcxrv-elfutils-0.170/lib/libasm-0.170.so
shrinking /nix/store/f8hzbrz6xw822hfdzvw0j8g7m95mcxrv-elfutils-0.170/lib/libdw-0.170.so
shrinking /nix/store/f8hzbrz6xw822hfdzvw0j8g7m95mcxrv-elfutils-0.170/lib/libelf-0.170.so
strip is /nix/store/ks7k1wdljx2knaayzr528cwbj6v970km-binutils-2.30/bin/strip
stripping (with command strip and flags -S) in /nix/store/f8hzbrz6xw822hfdzvw0j8g7m95mcxrv-elfutils-0.170/lib /nix/store/f8hzbrz6xw822hfdzvw0j8g7m95mcxrv-elfutils-0.170/bin
patching script interpreter paths in /nix/store/f8hzbrz6xw822hfdzvw0j8g7m95mcxrv-elfutils-0.170
/nix/store/f8hzbrz6xw822hfdzvw0j8g7m95mcxrv-elfutils-0.170/bin/eu-make-debug-archive: interpreter directive changed from "/bin/sh" to "/nix/store/xdzakdwslnr0skxr9y6lr475c8sra4h1-bash-4.4-p19/bin/sh"
checking for references to /build in /nix/store/f8hzbrz6xw822hfdzvw0j8g7m95mcxrv-elfutils-0.170...
/nix/store/f8hzbrz6xw822hfdzvw0j8g7m95mcxrv-elfutils-0.170
|
gharchive/pull-request
| 2018-05-17T21:26:16 |
2025-04-01T04:55:26.745399
|
{
"authors": [
"GrahamcOfBorg",
"Synthetica9"
],
"repo": "NixOS/nixpkgs",
"url": "https://github.com/NixOS/nixpkgs/pull/40705",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
326730071
|
oauth2_proxy: Handle attributes being derivations
Motivation for this change
Any attribute being a derivation would cause infinite recursion
Things done
Handle derivations separately from sets when mapping the config to CLI arguments.
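A derivation is itself an attribute set, so a naive `lib.isAttrs` check recurses into it (and its drvAttrs) indefinitely; testing `lib.isDerivation` first avoids that. A minimal sketch of the idea — the `mkArg` helper name is illustrative, not the module's actual code:

```nix
{ lib }:
let
  # Map one config value to a CLI argument string.
  # Check isDerivation BEFORE isAttrs: every derivation is also an
  # attribute set, so the attrset branch would recurse into it forever.
  mkArg = name: value:
    if lib.isDerivation value then "--${name}=${value}"
    else if lib.isAttrs value then
      lib.concatStringsSep " "
        (lib.mapAttrsToList (n: v: mkArg "${name}-${n}" v) value)
    else "--${name}=${toString value}";
in
  mkArg
```

Interpolating a derivation in a string yields its `outPath`, which is what the CLI flag needs.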
[ ] Tested using sandboxing (nix.useSandbox on NixOS, or option build-use-sandbox in nix.conf on non-NixOS)
Built on platform(s)
[x] NixOS
[ ] macOS
[ ] other Linux distributions
[ ] Tested via one or more NixOS test(s) if existing and applicable for the change (look inside nixos/tests)
[ ] Tested compilation of all pkgs that depend on this change using nix-shell -p nox --run "nox-review wip"
[x] Tested execution of all binary files (usually in ./result/bin/)
[x] Fits CONTRIBUTING.md.
I'm using toString but apparently that's not quite the same as ${attr}. I know this works because I'm already using it, but maybe someone has an opinion on it? I went with toString because it seemed more expressive to me in this case.
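For reference, the two coercions agree for derivations (both yield the `outPath`) but diverge for other types, which is why they are "not quite the same". A quick illustrative sketch:

```nix
let
  drv = derivation { name = "x"; builder = "/bin/sh"; system = "x86_64-linux"; };
in {
  # For derivations the two are equivalent: both yield the outPath.
  a = toString drv;   # "/nix/store/…-x"
  b = "${drv}";       # same string
  # They diverge elsewhere: toString false is "", while "${false}" is
  # an evaluation error; toString on a path keeps the path as-is,
  # while interpolating a path copies it into the store first.
  c = toString false; # ""
}
```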
|
gharchive/pull-request
| 2018-05-26T11:07:15 |
2025-04-01T04:55:26.750491
|
{
"authors": [
"mkaito"
],
"repo": "NixOS/nixpkgs",
"url": "https://github.com/NixOS/nixpkgs/pull/41098",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
356979173
|
puredata: 0.48-0 -> 0.48-2
Motivation for this change
Things done
[x] Tested using sandboxing (nix.useSandbox on NixOS, or option sandbox in nix.conf on non-NixOS)
Built on platform(s)
[x] NixOS
[ ] macOS
[ ] other Linux distributions
[ ] Tested via one or more NixOS test(s) if existing and applicable for the change (look inside nixos/tests)
[ ] Tested compilation of all pkgs that depend on this change using nix-shell -p nox --run "nox-review wip"
[x] Tested execution of all binary files (usually in ./result/bin/)
[ ] Determined the impact on package closure size (by running nix path-info -S before and after)
[x] Fits CONTRIBUTING.md.
@GrahamcOfBorg build puredata
No attempt on x86_64-darwin (full log)
The following builds were skipped because they don't evaluate on x86_64-darwin: puredata
Partial log (click to expand)
a) For `nixos-rebuild` you can set
{ nixpkgs.config.allowUnsupportedSystem = true; }
in configuration.nix to override this.
b) For `nix-env`, `nix-build`, `nix-shell` or any other Nix command you can add
{ allowUnsupportedSystem = true; }
to ~/.config/nixpkgs/config.nix.
Success on x86_64-linux (full log)
Attempted: puredata
Partial log (click to expand)
shrinking /nix/store/3bz4qjw38sfnfgaachwwfhr1w83fqpb0-puredata-0.48-2/bin/.pd-wrapped
shrinking /nix/store/3bz4qjw38sfnfgaachwwfhr1w83fqpb0-puredata-0.48-2/bin/pdreceive
shrinking /nix/store/3bz4qjw38sfnfgaachwwfhr1w83fqpb0-puredata-0.48-2/bin/pdsend
gzipping man pages under /nix/store/3bz4qjw38sfnfgaachwwfhr1w83fqpb0-puredata-0.48-2/share/man/
strip is /nix/store/h0lbngpv6ln56hjj59i6l77vxq25flbz-binutils-2.30/bin/strip
stripping (with command strip and flags -S) in /nix/store/3bz4qjw38sfnfgaachwwfhr1w83fqpb0-puredata-0.48-2/lib /nix/store/3bz4qjw38sfnfgaachwwfhr1w83fqpb0-puredata-0.48-2/bin
patching script interpreter paths in /nix/store/3bz4qjw38sfnfgaachwwfhr1w83fqpb0-puredata-0.48-2
/nix/store/3bz4qjw38sfnfgaachwwfhr1w83fqpb0-puredata-0.48-2/lib/pd/tcl/pd-gui.tcl: interpreter directive changed from "/bin/sh" to "/nix/store/czx8vkrb9jdgjyz8qfksh10vrnqa723l-bash-4.4-p23/bin/sh"
/nix/store/3bz4qjw38sfnfgaachwwfhr1w83fqpb0-puredata-0.48-2/bin/pd-gui: interpreter directive changed from "/bin/sh" to "/nix/store/czx8vkrb9jdgjyz8qfksh10vrnqa723l-bash-4.4-p23/bin/sh"
checking for references to /build in /nix/store/3bz4qjw38sfnfgaachwwfhr1w83fqpb0-puredata-0.48-2...
Success on aarch64-linux (full log)
Attempted: puredata
Partial log (click to expand)
shrinking /nix/store/5kvq8xs3rw7vm4n9jv5a4jj25qkkw85k-puredata-0.48-2/lib/pd/extra/bob~/bob~.pd_linux
shrinking /nix/store/5kvq8xs3rw7vm4n9jv5a4jj25qkkw85k-puredata-0.48-2/lib/pd/bin/pd-watchdog
gzipping man pages under /nix/store/5kvq8xs3rw7vm4n9jv5a4jj25qkkw85k-puredata-0.48-2/share/man/
strip is /nix/store/y4ymnvgxygpq05h03kyzbj572zmh6zla-binutils-2.30/bin/strip
stripping (with command strip and flags -S) in /nix/store/5kvq8xs3rw7vm4n9jv5a4jj25qkkw85k-puredata-0.48-2/lib /nix/store/5kvq8xs3rw7vm4n9jv5a4jj25qkkw85k-puredata-0.48-2/bin
patching script interpreter paths in /nix/store/5kvq8xs3rw7vm4n9jv5a4jj25qkkw85k-puredata-0.48-2
/nix/store/5kvq8xs3rw7vm4n9jv5a4jj25qkkw85k-puredata-0.48-2/bin/pd-gui: interpreter directive changed from "/bin/sh" to "/nix/store/fqm2x6kiay1q4vg7pqp4wp17bdijlyc3-bash-4.4-p23/bin/sh"
/nix/store/5kvq8xs3rw7vm4n9jv5a4jj25qkkw85k-puredata-0.48-2/lib/pd/tcl/pd-gui.tcl: interpreter directive changed from "/bin/sh" to "/nix/store/fqm2x6kiay1q4vg7pqp4wp17bdijlyc3-bash-4.4-p23/bin/sh"
checking for references to /build in /nix/store/5kvq8xs3rw7vm4n9jv5a4jj25qkkw85k-puredata-0.48-2...
/nix/store/5kvq8xs3rw7vm4n9jv5a4jj25qkkw85k-puredata-0.48-2
|
gharchive/pull-request
| 2018-09-04T21:15:52 |
2025-04-01T04:55:26.759163
|
{
"authors": [
"GrahamcOfBorg",
"magnetophon",
"xeji"
],
"repo": "NixOS/nixpkgs",
"url": "https://github.com/NixOS/nixpkgs/pull/46065",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
377191476
|
nixos/fontconfig/make-fonts-cache: don't fail to clean the cache
Motivation for this change
Today I couldn't rebuild because of:
[...]
/nix/store/lxkrsrlqplz2n75hvpq6vr0gam0gfgrl-font-cursor-misc-1.0.3/lib/X11/fonts/misc: skipping, existing cache is valid: 1 fonts, 0 dirs
/var/cache/fontconfig: cleaning cache directory
/nix/store/xq3c44ha15pfa5a9mv1z9mni3cfghsna-fc-cache: cleaning cache directory
fc-cache: succeeded
rm: cannot remove '/nix/store/xq3c44ha15pfa5a9mv1z9mni3cfghsna-fc-cache/CACHEDIR.TAG': No such file or directory
builder for '/nix/store/zxfmil40n79vhn5hb4flqc76j99a3l7b-fc-cache.drv' failed with exit code 1
Strangely, when investigating this, it seems the file really does exist. No clue why.
Anyway this is a good pretext to add preferLocalBuild=true.
I guess that it's worth backporting, since I experienced this bug on 18.09.
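The shape of the fix can be sketched as follows — a hypothetical derivation, not the exact code from the PR:

```nix
# Illustrative sketch; attribute names follow nixpkgs conventions.
runCommand "fc-cache" {
  # The cache is cheap to build and machine-specific, so build it
  # locally instead of shipping it to remote builders.
  preferLocalBuild = true;
} ''
  mkdir -p $out
  ${fontconfig.bin}/bin/fc-cache -sv
  # Use rm -f so a missing CACHEDIR.TAG no longer aborts the build.
  rm -f $out/CACHEDIR.TAG
''
```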
Things done
[x] Tested using sandboxing (nix.useSandbox on NixOS, or option sandbox in nix.conf on non-NixOS)
Built on platform(s)
[x] NixOS
[ ] macOS
[ ] other Linux distributions
[ ] Tested via one or more NixOS test(s) if existing and applicable for the change (look inside nixos/tests)
[ ] Tested compilation of all pkgs that depend on this change using nix-shell -p nox --run "nox-review wip"
[ ] Tested execution of all binary files (usually in ./result/bin/)
[ ] Determined the impact on package closure size (by running nix path-info -S before and after)
[x] Fits CONTRIBUTING.md.
@GrahamcOfBorg test sddm chromium
No attempt on aarch64-linux (full log)
The following builds were skipped because they don't evaluate on aarch64-linux: tests.sddm, tests.chromium
Partial log (click to expand)
Cannot nix-instantiate `tests.chromium' because:
error: while evaluating 'recursiveUpdate' at /var/lib/gc-of-borg/nix-test-rs-30/repo/38dca4e3aa6bca43ea96d2fcc04e8229/builder/grahamc-aarch64-community-30/lib/attrsets.nix:415:26, called from /var/lib/gc-of-borg/nix-test-rs-30/repo/38dca4e3aa6bca43ea96d2fcc04e8229/builder/grahamc-aarch64-community-30/lib/attrsets.nix:148:28:
while evaluating 'recursiveUpdateUntil' at /var/lib/gc-of-borg/nix-test-rs-30/repo/38dca4e3aa6bca43ea96d2fcc04e8229/builder/grahamc-aarch64-community-30/lib/attrsets.nix:384:37, called from /var/lib/gc-of-borg/nix-test-rs-30/repo/38dca4e3aa6bca43ea96d2fcc04e8229/builder/grahamc-aarch64-community-30/lib/attrsets.nix:416:5:
while evaluating 'zipAttrsWith' at /var/lib/gc-of-borg/nix-test-rs-30/repo/38dca4e3aa6bca43ea96d2fcc04e8229/builder/grahamc-aarch64-community-30/lib/attrsets.nix:347:21, called from /var/lib/gc-of-borg/nix-test-rs-30/repo/38dca4e3aa6bca43ea96d2fcc04e8229/builder/grahamc-aarch64-community-30/lib/attrsets.nix:394:8:
while evaluating 'zipAttrsWithNames' at /var/lib/gc-of-borg/nix-test-rs-30/repo/38dca4e3aa6bca43ea96d2fcc04e8229/builder/grahamc-aarch64-community-30/lib/attrsets.nix:332:33, called from /var/lib/gc-of-borg/nix-test-rs-30/repo/38dca4e3aa6bca43ea96d2fcc04e8229/builder/grahamc-aarch64-community-30/lib/attrsets.nix:347:27:
while evaluating the attribute 'chromium' at /var/lib/gc-of-borg/nix-test-rs-30/repo/38dca4e3aa6bca43ea96d2fcc04e8229/builder/grahamc-aarch64-community-30/nixos/tests/all-tests.nix:40:3:
while evaluating 'handleTestOn' at /var/lib/gc-of-borg/nix-test-rs-30/repo/38dca4e3aa6bca43ea96d2fcc04e8229/builder/grahamc-aarch64-community-30/nixos/tests/all-tests.nix:19:33, called from /var/lib/gc-of-borg/nix-test-rs-30/repo/38dca4e3aa6bca43ea96d2fcc04e8229/builder/grahamc-aarch64-community-30/nixos/tests/all-tests.nix:40:15:
access to path '/nix/store/gin2b1811sa86ni6x9fjb9bdlval79rj-grahamc-aarch64-community-30' is forbidden in restricted mode
Since ofborg's nixos test running is currently broken, I've run the sddm test locally, which worked fine.
|
gharchive/pull-request
| 2018-11-04T20:32:10 |
2025-04-01T04:55:26.767502
|
{
"authors": [
"GrahamcOfBorg",
"lheckemann",
"symphorien"
],
"repo": "NixOS/nixpkgs",
"url": "https://github.com/NixOS/nixpkgs/pull/49762",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
379452027
|
bzflag: 2.4.16 -> 2.4.18
Semi-automatic update generated by the https://github.com/ryantm/nixpkgs-update tool. This update was made based on information from https://repology.org/metapackage/bzflag/versions.
meta.description for bzflag is: '"Multiplayer 3D Tank game"'.
Checks done (click to expand)
built on NixOS
/nix/store/jiqa97rbb97rl953kdpgx945lv31d1bn-bzflag-2.4.18/bin/bzfs passed the binary check.
/nix/store/jiqa97rbb97rl953kdpgx945lv31d1bn-bzflag-2.4.18/bin/bzflag passed the binary check.
Warning: no invocation of /nix/store/jiqa97rbb97rl953kdpgx945lv31d1bn-bzflag-2.4.18/bin/bzadmin had a zero exit code or showed the expected version
2 of 3 passed binary check by having a zero exit code.
0 of 3 passed binary check by having the new version present in output.
found 2.4.18 with grep in /nix/store/jiqa97rbb97rl953kdpgx945lv31d1bn-bzflag-2.4.18
directory tree listing: https://gist.github.com/586cfdc1bedd2c31f4e397791dbc7d8e
du listing: https://gist.github.com/70732c3641e6c724169737b585ec0569
Outpath report (click to expand)
Outpath difference report
3 total rebuild paths
1 package rebuilds
1 x86_64-linux rebuilds
1 i686-linux rebuilds
0 x86_64-darwin rebuilds
1 aarch64-linux rebuilds
First ten rebuilds by attrpath
bzflag
Instructions to test this update (click to expand)
Either download from Cachix:
nix-store -r /nix/store/jiqa97rbb97rl953kdpgx945lv31d1bn-bzflag-2.4.18 \
--option binary-caches 'https://cache.nixos.org/ https://r-ryantm.cachix.org/' \
--option trusted-public-keys '
r-ryantm.cachix.org-1:gkUbLkouDAyvBdpBX0JOdIiD2/DP1ldF3Z3Y6Gqcc4c=
cache.nixos.org-1:6NCHdD59X431o0gWypbMrAURkbJ16ZPMQFGspcDShjY=
'
(r-ryantm's Cachix cache is only trusted for this store-path realization.)
Or, build yourself:
nix-build -A bzflag https://github.com/r-ryantm/nixpkgs/archive/d19b7e487cbbc3a113d88be864710a0912d3c46e.tar.gz
After you've downloaded or built it, look at the files and if there are any, run the binaries:
ls -la /nix/store/jiqa97rbb97rl953kdpgx945lv31d1bn-bzflag-2.4.18
ls -la /nix/store/jiqa97rbb97rl953kdpgx945lv31d1bn-bzflag-2.4.18/bin
cc @fpletz for testing.
Success on aarch64-linux (full log)
Attempted: bzflag
Partial log (click to expand)
shrinking /nix/store/v7sk94c1yd0ikvm0g6cn5jiyyl5pkr7w-bzflag-2.4.18/lib/bzflag/airspawn.so
shrinking /nix/store/v7sk94c1yd0ikvm0g6cn5jiyyl5pkr7w-bzflag-2.4.18/bin/bzadmin
shrinking /nix/store/v7sk94c1yd0ikvm0g6cn5jiyyl5pkr7w-bzflag-2.4.18/bin/bzflag
shrinking /nix/store/v7sk94c1yd0ikvm0g6cn5jiyyl5pkr7w-bzflag-2.4.18/bin/bzfs
gzipping man pages under /nix/store/v7sk94c1yd0ikvm0g6cn5jiyyl5pkr7w-bzflag-2.4.18/share/man/
strip is /nix/store/p9akxn2sfy4wkhqdqa3li97pc6jaz3r1-binutils-2.30/bin/strip
stripping (with command strip and flags -S) in /nix/store/v7sk94c1yd0ikvm0g6cn5jiyyl5pkr7w-bzflag-2.4.18/lib /nix/store/v7sk94c1yd0ikvm0g6cn5jiyyl5pkr7w-bzflag-2.4.18/bin
patching script interpreter paths in /nix/store/v7sk94c1yd0ikvm0g6cn5jiyyl5pkr7w-bzflag-2.4.18
checking for references to /build in /nix/store/v7sk94c1yd0ikvm0g6cn5jiyyl5pkr7w-bzflag-2.4.18...
/nix/store/v7sk94c1yd0ikvm0g6cn5jiyyl5pkr7w-bzflag-2.4.18
Success on x86_64-linux (full log)
Attempted: bzflag
Partial log (click to expand)
shrinking /nix/store/w0ymj7a7ci9g4g0s665sijxsxlfpf54q-bzflag-2.4.18/lib/bzflag/nagware.so
shrinking /nix/store/w0ymj7a7ci9g4g0s665sijxsxlfpf54q-bzflag-2.4.18/bin/bzfs
shrinking /nix/store/w0ymj7a7ci9g4g0s665sijxsxlfpf54q-bzflag-2.4.18/bin/bzadmin
shrinking /nix/store/w0ymj7a7ci9g4g0s665sijxsxlfpf54q-bzflag-2.4.18/bin/bzflag
gzipping man pages under /nix/store/w0ymj7a7ci9g4g0s665sijxsxlfpf54q-bzflag-2.4.18/share/man/
strip is /nix/store/vcc4svb8gy29g4pam2zja6llkbcwsyiq-binutils-2.30/bin/strip
stripping (with command strip and flags -S) in /nix/store/w0ymj7a7ci9g4g0s665sijxsxlfpf54q-bzflag-2.4.18/lib /nix/store/w0ymj7a7ci9g4g0s665sijxsxlfpf54q-bzflag-2.4.18/bin
patching script interpreter paths in /nix/store/w0ymj7a7ci9g4g0s665sijxsxlfpf54q-bzflag-2.4.18
checking for references to /build in /nix/store/w0ymj7a7ci9g4g0s665sijxsxlfpf54q-bzflag-2.4.18...
/nix/store/w0ymj7a7ci9g4g0s665sijxsxlfpf54q-bzflag-2.4.18
|
gharchive/pull-request
| 2018-11-10T19:10:24 |
2025-04-01T04:55:26.777253
|
{
"authors": [
"GrahamcOfBorg",
"r-ryantm"
],
"repo": "NixOS/nixpkgs",
"url": "https://github.com/NixOS/nixpkgs/pull/50205",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
381934278
|
cyrus_sasl: 2.1.26 -> 2.1.27
Motivation for this change
Things done
[x] Tested using sandboxing (nix.useSandbox on NixOS, or option sandbox in nix.conf on non-NixOS)
Built on platform(s)
[x] NixOS
[ ] macOS
[x] other Linux distributions
[ ] Tested via one or more NixOS test(s) if existing and applicable for the change (look inside nixos/tests)
[ ] Tested compilation of all pkgs that depend on this change using nix-shell -p nox --run "nox-review wip"
[ ] Tested execution of all binary files (usually in ./result/bin/)
[ ] Determined the impact on package closure size (by running nix path-info -S before and after)
[ ] Fits CONTRIBUTING.md.
CVE patch is already applied, unsurprisingly.
Many build system fixes; no need for a special workaround with musl.
Success on x86_64-linux (full log)
Attempted: cyrus_sasl
Partial log (click to expand)
gzipping man pages under /nix/store/cgbw9qax8k64i93484icy2f3dszc0c8a-cyrus-sasl-2.1.27-man/share/man/
/nix/store/6b5b8qvgqd40j0khlm6rx2zddj7jw5xr-cyrus-sasl-2.1.27-bin
strip is /nix/store/j13c9z0z495wxdc97if8z451x53m1j25-binutils-2.30/bin/strip
patching script interpreter paths in /nix/store/cgbw9qax8k64i93484icy2f3dszc0c8a-cyrus-sasl-2.1.27-man
checking for references to /build/ in /nix/store/cgbw9qax8k64i93484icy2f3dszc0c8a-cyrus-sasl-2.1.27-man...
shrinking RPATHs of ELF executables and libraries in /nix/store/bfjv4zvszhldrd89gyhzrmszxfm7xgh9-cyrus-sasl-2.1.27-devdoc
gzipping man pages under /nix/store/bfjv4zvszhldrd89gyhzrmszxfm7xgh9-cyrus-sasl-2.1.27-devdoc/share/man/
strip is /nix/store/j13c9z0z495wxdc97if8z451x53m1j25-binutils-2.30/bin/strip
patching script interpreter paths in /nix/store/bfjv4zvszhldrd89gyhzrmszxfm7xgh9-cyrus-sasl-2.1.27-devdoc
checking for references to /build/ in /nix/store/bfjv4zvszhldrd89gyhzrmszxfm7xgh9-cyrus-sasl-2.1.27-devdoc...
Success on aarch64-linux (full log)
Attempted: cyrus_sasl
Partial log (click to expand)
gzipping man pages under /nix/store/n28wkj5ynvk2q4w8hdpfqcyhijal6jzh-cyrus-sasl-2.1.27-man/share/man/
strip is /nix/store/1jprj383qnzd50vn8qq3a0id6a08a9fi-binutils-2.30/bin/strip
patching script interpreter paths in /nix/store/n28wkj5ynvk2q4w8hdpfqcyhijal6jzh-cyrus-sasl-2.1.27-man
checking for references to /build/ in /nix/store/n28wkj5ynvk2q4w8hdpfqcyhijal6jzh-cyrus-sasl-2.1.27-man...
shrinking RPATHs of ELF executables and libraries in /nix/store/hf76z9k52wk3anpdqsp9pkj8bapnaip8-cyrus-sasl-2.1.27-devdoc
gzipping man pages under /nix/store/hf76z9k52wk3anpdqsp9pkj8bapnaip8-cyrus-sasl-2.1.27-devdoc/share/man/
strip is /nix/store/1jprj383qnzd50vn8qq3a0id6a08a9fi-binutils-2.30/bin/strip
patching script interpreter paths in /nix/store/hf76z9k52wk3anpdqsp9pkj8bapnaip8-cyrus-sasl-2.1.27-devdoc
checking for references to /build/ in /nix/store/hf76z9k52wk3anpdqsp9pkj8bapnaip8-cyrus-sasl-2.1.27-devdoc...
/nix/store/acwxngb4mw9ryddyqa33xj5hm9dn3pi0-cyrus-sasl-2.1.27-bin
Timed out, unknown build status on x86_64-darwin (full log)
Attempted: cyrus_sasl
Partial log (click to expand)
cannot build derivation '/nix/store/b5wlvkm0hqfk86liipakgy1qn03ph7l8-nghttp2-1.34.0.drv': 5 dependencies couldn't be built
cannot build derivation '/nix/store/wh704hpxy3ygp59sfq430g1279fciz35-curl-7.62.0.drv': 6 dependencies couldn't be built
cannot build derivation '/nix/store/z412hn5xw2mgx4bqfxmbyprbpfvll3y5-libtool-2.4.6.drv': 4 dependencies couldn't be built
cannot build derivation '/nix/store/l2dm3caganr8iagfl7x9lwkyln5v9ry9-cyrus-sasl-2.1.27.tar.gz.drv': 3 dependencies couldn't be built
cannot build derivation '/nix/store/16is7wic839z0x7l32ngi23mvadgszlv-hook.drv': 5 dependencies couldn't be built
cannot build derivation '/nix/store/b5jqjaxd1s939vv8bhxsyl4zrxnkm8yy-flex-2.6.4.drv': 6 dependencies couldn't be built
cannot build derivation '/nix/store/85hm02vy6lmll4qjn5vlzfmjmmmddl9z-bootstrap_cmds-dev-tools-7.0.drv': 3 dependencies couldn't be built
cannot build derivation '/nix/store/7dplh881hc1l8918y746dxn05d97qc6s-libkrb5-1.15.2.drv': 5 dependencies couldn't be built
cannot build derivation '/nix/store/xc7gdxvpqnkavlnf8s48y0vk148l56v9-cyrus-sasl-2.1.27.drv': 7 dependencies couldn't be built
error: build of '/nix/store/xc7gdxvpqnkavlnf8s48y0vk148l56v9-cyrus-sasl-2.1.27.drv' failed
Release notes:
https://www.cyrusimap.org/sasl/sasl/release-notes/2.1/index.html#new-in-2-1-27
|
gharchive/pull-request
| 2018-11-18T06:47:45 |
2025-04-01T04:55:26.786448
|
{
"authors": [
"GrahamcOfBorg",
"c0bw3b",
"dtzWill"
],
"repo": "NixOS/nixpkgs",
"url": "https://github.com/NixOS/nixpkgs/pull/50546",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
393938479
|
graphite-web: use django-2.0
Don't merge yet, this is WIP!
Motivation for this change
graphite-web uses django-1.8 which is EOL (see: https://github.com/NixOS/nixpkgs/issues/52679). This PR is WIP on upgrading graphite-web to django-2.0.
Things done
The graphite test still fails with the following errors:
$ nix-build nixos/release.nix -A tests.graphite.x86_64-linux
...
one# [ 6.895737] waitress-serve[833]: Fatal Python error: initsite: Failed to import the site module
one# [ 6.905333] waitress-serve[833]: Traceback (most recent call last):
one# [ 6.906536] waitress-serve[833]: File "/nix/store/xayy7lwmdzw6n3mc77gmzna0kjy9ijih-python-2.7.15-env/lib/python2.7/site-packages/site.py", line 73, in <module>
one# [ 6.912432] waitress-serve[833]: __boot()
one# [ 6.920091] waitress-serve[833]: File "/nix/store/xayy7lwmdzw6n3mc77gmzna0kjy9ijih-python-2.7.15-env/lib/python2.7/site-packages/site.py", line 26, in __boot
one# [ 6.926912] waitress-serve[833]: import imp # Avoid import loop in Python 3
one# [ 6.930116] waitress-serve[833]: File "/nix/store/9vcy8i7mcwmcp486krc22v7851spcdqj-python3-3.7.1/lib/python3.7/imp.py", line 27, in <module>
one# [ 6.933937] waitress-serve[833]: import tokenize
one# [ 6.936721] waitress-serve[833]: File "/nix/store/9vcy8i7mcwmcp486krc22v7851spcdqj-python3-3.7.1/lib/python3.7/tokenize.py", line 33, in <module>
one# [ 6.939588] waitress-serve[833]: import re
one# [ 6.941482] waitress-serve[833]: File "/nix/store/9vcy8i7mcwmcp486krc22v7851spcdqj-python3-3.7.1/lib/python3.7/re.py", line 143, in <module>
one# [ 6.944112] waitress-serve[833]: class RegexFlag(enum.IntFlag):
one# [ 6.946966] waitress-serve[833]: AttributeError: module 'enum' has no attribute 'IntFlag'
...
one# [ 8.431924] twistd[824]: /nix/store/vg1295z6rm414zs82w43kbn7j1w7x7ky-python2.7-Twisted-18.9.0/bin/twistd: Unknown command: carbon-cache
...
one# [ 8.841970] z6glijz8wpf06x1s040gx4zaljprdbzc-unit-script-graphiteWeb-pre-start[806]: Could not import graphite_local_settings, using defaults!
...
one# [ 14.738893] z6glijz8wpf06x1s040gx4zaljprdbzc-unit-script-graphiteWeb-pre-start[806]: Copying '/nix/store/10d3rlsfjqd9wfr3pzy76yb9a9pmbh58-python3.7-graphite-web-1.1.5/webapp/content/css/dashboard-white.css'
one# [ 14.751132] z6glijz8wpf06x1s040gx4zaljprdbzc-unit-script-graphiteWeb-pre-start[806]: Traceback (most recent call last):
one# [ 14.752829] z6glijz8wpf06x1s040gx4zaljprdbzc-unit-script-graphiteWeb-pre-start[806]: File "/nix/store/zq1b8rsi84zvqdb4hz55fcd6998v4c16-python3.7-Django-2.0.9/bin/..django-admin.py-wrapped-wrapped", line 7, in <module>
one# [ 14.760313] z6glijz8wpf06x1s040gx4zaljprdbzc-unit-script-graphiteWeb-pre-start[806]: management.execute_from_command_line()
one# [ 14.762681] z6glijz8wpf06x1s040gx4zaljprdbzc-unit-script-graphiteWeb-pre-start[806]: File "/nix/store/yy16rzxsmp6bzmc16n00ksqfndpcm5hn-python3-3.7.1-env/lib/python3.7/site-packages/django/core/management/__init__.py", line 371, in execute_from_command_line
one# [ 14.767967] z6glijz8wpf06x1s040gx4zaljprdbzc-unit-script-graphiteWeb-pre-start[806]: utility.execute()
one# [ 14.776164] z6glijz8wpf06x1s040gx4zaljprdbzc-unit-script-graphiteWeb-pre-start[806]: File "/nix/store/yy16rzxsmp6bzmc16n00ksqfndpcm5hn-python3-3.7.1-env/lib/python3.7/site-packages/django/core/management/__init__.py", line 365, in execute
one# [ 14.784284] z6glijz8wpf06x1s040gx4zaljprdbzc-unit-script-graphiteWeb-pre-start[806]: self.fetch_command(subcommand).run_from_argv(self.argv)
one# [ 14.786918] z6glijz8wpf06x1s040gx4zaljprdbzc-unit-script-graphiteWeb-pre-start[806]: File "/nix/store/yy16rzxsmp6bzmc16n00ksqfndpcm5hn-python3-3.7.1-env/lib/python3.7/site-packages/django/core/management/base.py", line 288, in run_from_argv
one# [ 14.794391] z6glijz8wpf06x1s040gx4zaljprdbzc-unit-script-graphiteWeb-pre-start[806]: self.execute(*args, **cmd_options)
one# [ 14.798830] z6glijz8wpf06x1s040gx4zaljprdbzc-unit-script-graphiteWeb-pre-start[806]: File "/nix/store/yy16rzxsmp6bzmc16n00ksqfndpcm5hn-python3-3.7.1-env/lib/python3.7/site-packages/django/core/management/base.py", line 335, in execute
one# [ 14.803147] z6glijz8wpf06x1s040gx4zaljprdbzc-unit-script-graphiteWeb-pre-start[806]: output = self.handle(*args, **options)
one# [ 14.812308] z6glijz8wpf06x1s040gx4zaljprdbzc-unit-script-graphiteWeb-pre-start[806]: File "/nix/store/yy16rzxsmp6bzmc16n00ksqfndpcm5hn-python3-3.7.1-env/lib/python3.7/site-packages/django/contrib/staticfiles/management/commands/collectstatic.py", line 189, in handle
one# [ 14.821309] z6glijz8wpf06x1s040gx4zaljprdbzc-unit-script-graphiteWeb-pre-start[806]: collected = self.collect()
one# [ 14.824616] z6glijz8wpf06x1s040gx4zaljprdbzc-unit-script-graphiteWeb-pre-start[806]: File "/nix/store/yy16rzxsmp6bzmc16n00ksqfndpcm5hn-python3-3.7.1-env/lib/python3.7/site-packages/django/contrib/staticfiles/management/commands/collectstatic.py", line 114, in collect
one# [ 14.832197] z6glijz8wpf06x1s040gx4zaljprdbzc-unit-script-graphiteWeb-pre-start[806]: handler(path, prefixed_path, storage)
one# [ 14.837422] z6glijz8wpf06x1s040gx4zaljprdbzc-unit-script-graphiteWeb-pre-start[806]: File "/nix/store/yy16rzxsmp6bzmc16n00ksqfndpcm5hn-python3-3.7.1-env/lib/python3.7/site-packages/django/contrib/staticfiles/management/commands/collectstatic.py", line 354, in copy_file
one# [ 14.843898] z6glijz8wpf06x1s040gx4zaljprdbzc-unit-script-graphiteWeb-pre-start[806]: self.storage.save(prefixed_path, source_file)
one# [ 14.848272] z6glijz8wpf06x1s040gx4zaljprdbzc-unit-script-graphiteWeb-pre-start[806]: File "/nix/store/yy16rzxsmp6bzmc16n00ksqfndpcm5hn-python3-3.7.1-env/lib/python3.7/site-packages/django/core/files/storage.py", line 49, in save
one# [ 14.858343] z6glijz8wpf06x1s040gx4zaljprdbzc-unit-script-graphiteWeb-pre-start[806]: return self._save(name, content)
one# [ 14.860681] z6glijz8wpf06x1s040gx4zaljprdbzc-unit-script-graphiteWeb-pre-start[806]: File "/nix/store/yy16rzxsmp6bzmc16n00ksqfndpcm5hn-python3-3.7.1-env/lib/python3.7/site-packages/django/core/files/storage.py", line 236, in _save
one# [ 14.870400] z6glijz8wpf06x1s040gx4zaljprdbzc-unit-script-graphiteWeb-pre-start[806]: os.makedirs(directory)
one# [ 14.872156] z6glijz8wpf06x1s040gx4zaljprdbzc-unit-script-graphiteWeb-pre-start[806]: File "/nix/store/9vcy8i7mcwmcp486krc22v7851spcdqj-python3-3.7.1/lib/python3.7/os.py", line 211, in makedirs
one# [ 14.876310] z6glijz8wpf06x1s040gx4zaljprdbzc-unit-script-graphiteWeb-pre-start[806]: makedirs(head, exist_ok=exist_ok)
one# [ 14.882180] z6glijz8wpf06x1s040gx4zaljprdbzc-unit-script-graphiteWeb-pre-start[806]: File "/nix/store/9vcy8i7mcwmcp486krc22v7851spcdqj-python3-3.7.1/lib/python3.7/os.py", line 221, in makedirs
one# [ 14.886159] z6glijz8wpf06x1s040gx4zaljprdbzc-unit-script-graphiteWeb-pre-start[806]: mkdir(name, mode)
one# [ 14.889284] z6glijz8wpf06x1s040gx4zaljprdbzc-unit-script-graphiteWeb-pre-start[806]: OSError: [Errno 30] Read-only file system: '/nix/store/yy16rzxsmp6bzmc16n00ksqfndpcm5hn-python3-3.7.1-env/lib/python3.7/site-packages/opt/graphite/static'
...
error: unit ‘graphiteWeb.service’ reached state ‘failed’
[ ] Tested using sandboxing (nix.useSandbox on NixOS, or option sandbox in nix.conf on non-NixOS)
Built on platform(s)
[ ] NixOS
[ ] macOS
[ ] other Linux distributions
[ ] Tested via one or more NixOS test(s) if existing and applicable for the change (look inside nixos/tests)
[ ] Tested compilation of all pkgs that depend on this change using nix-shell -p nox --run "nox-review wip"
[ ] Tested execution of all binary files (usually in ./result/bin/)
[ ] Determined the impact on package closure size (by running nix path-info -S before and after)
[ ] Assured whether relevant documentation is up to date
[ ] Fits CONTRIBUTING.md.
Don't include the enum34 package.
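The `AttributeError: module 'enum' has no attribute 'IntFlag'` above is the classic symptom of the enum34 backport shadowing the standard-library `enum` module on Python 3: the backport predates `IntFlag`, so even `import re` fails. Dropping it could look roughly like this hypothetical override (attribute names are illustrative):

```nix
# Illustrative sketch, not the PR's actual change.
graphite-web.overridePythonAttrs (old: {
  # enum34 backports enum to Python 2; on Python 3 it shadows the
  # stdlib enum module, which (unlike enum34) provides IntFlag,
  # so `import re` breaks whenever enum34 wins the import.
  propagatedBuildInputs =
    builtins.filter (p: (p.pname or "") != "enum34")
      old.propagatedBuildInputs;
})
```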
@basvandijk Any updates?
|
gharchive/pull-request
| 2018-12-24T23:36:50 |
2025-04-01T04:55:26.795237
|
{
"authors": [
"FRidh",
"basvandijk",
"dotlambda"
],
"repo": "NixOS/nixpkgs",
"url": "https://github.com/NixOS/nixpkgs/pull/52799",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
419489243
|
pythonPackages.m2crypto: 0.30.1 -> 0.32.0
Motivation for this change
Fixes builds with openssl 1.1 and otherwise just carries bugfixes.
Things done
[X] Tested using sandboxing (nix.useSandbox on NixOS, or option sandbox in nix.conf on non-NixOS)
Built on platform(s)
[X] NixOS
[ ] macOS
[ ] other Linux distributions
[ ] Tested via one or more NixOS test(s) if existing and applicable for the change (look inside nixos/tests)
[ ] Tested compilation of all pkgs that depend on this change using nix-shell -p nox --run "nox-review wip"
[ ] Tested execution of all binary files (usually in ./result/bin/)
[ ] Determined the impact on package closure size (by running nix path-info -S before and after)
[ ] Assured whether relevant documentation is up to date
[X] Fits CONTRIBUTING.md.
@GrahamcOfBorg build python3Packages.m2crypto python27Packages.m2crypto
Closed in favor of #57050
|
gharchive/pull-request
| 2019-03-11T14:09:34 |
2025-04-01T04:55:26.800562
|
{
"authors": [
"andir"
],
"repo": "NixOS/nixpkgs",
"url": "https://github.com/NixOS/nixpkgs/pull/57390",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
435499132
|
brogue: add .desktop file and icon
Motivation for this change
Add XDG .desktop file and icon. Note that the .desktop file included in
the source archive is not used because it uses unsuitable paths and
refers to an old version of the game.
Things done
[X] Tested using sandboxing (nix.useSandbox on NixOS, or option sandbox in nix.conf on non-NixOS)
Built on platform(s)
[X] NixOS
[ ] macOS
[ ] other Linux distributions
[ ] Tested via one or more NixOS test(s) if existing and applicable for the change (look inside nixos/tests)
[X] Tested compilation of all pkgs that depend on this change using nix-shell -p nix-review --run "nix-review wip"
[X] Tested execution of all binary files (usually in ./result/bin/)
[ ] Determined the impact on package closure size (by running nix path-info -S before and after)
[X] Assured whether relevant documentation is up to date
[X] Fits CONTRIBUTING.md.
@GrahamcOfBorg build brogue
|
gharchive/pull-request
| 2019-04-21T11:12:01 |
2025-04-01T04:55:26.805570
|
{
"authors": [
"c0bw3b",
"lightbulbjim"
],
"repo": "NixOS/nixpkgs",
"url": "https://github.com/NixOS/nixpkgs/pull/59954",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
454809776
|
exiftool: 11.48 -> 11.50
Motivation for this change
Latest release.
Things done
[ ] Tested using sandboxing (nix.useSandbox on NixOS, or option sandbox in nix.conf on non-NixOS)
Built on platform(s)
[ ] NixOS
[x] macOS
[x] other Linux distributions
[ ] Tested via one or more NixOS test(s) if existing and applicable for the change (look inside nixos/tests)
[ ] Tested compilation of all pkgs that depend on this change using nix-shell -p nix-review --run "nix-review wip"
[x] Tested execution of all binary files (usually in ./result/bin/)
[ ] Determined the impact on package closure size (by running nix path-info -S before and after)
[ ] Assured whether relevant documentation is up to date
[x] Fits CONTRIBUTING.md.
@GrahamcOfBorg build exiftool
|
gharchive/pull-request
| 2019-06-11T17:16:30 |
2025-04-01T04:55:26.810172
|
{
"authors": [
"bdesham",
"dywedir"
],
"repo": "NixOS/nixpkgs",
"url": "https://github.com/NixOS/nixpkgs/pull/62987",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
504488902
|
oxidized: fix git-crypt dependency
Motivation for this change
Fix dependency (add git) so it will be possible to use git-crypt. Fixes https://github.com/NixOS/nixpkgs/issues/70838
Things done
[ ] Tested using sandboxing (nix.useSandbox on NixOS, or option sandbox in nix.conf on non-NixOS)
Built on platform(s)
[x] NixOS
[ ] macOS
[ ] other Linux distributions
[ ] Tested via one or more NixOS test(s) if existing and applicable for the change (look inside nixos/tests)
[ ] Tested compilation of all pkgs that depend on this change using nix-shell -p nix-review --run "nix-review wip"
[x] Tested execution of all binary files (usually in ./result/bin/)
[ ] Determined the impact on package closure size (by running nix path-info -S before and after)
[ ] Ensured that relevant documentation is up to date
[ ] Fits CONTRIBUTING.md.
Notify maintainers
cc @WilliButz @nicknovitski
@GrahamcOfBorg build oxidized
|
gharchive/pull-request
| 2019-10-09T08:16:54 |
2025-04-01T04:55:26.815881
|
{
"authors": [
"1000101",
"mmahut"
],
"repo": "NixOS/nixpkgs",
"url": "https://github.com/NixOS/nixpkgs/pull/70839",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
66269622
|
haskell: fix taffybar on GHC 7.10
This:
Updates the versions for the gtk2hs subprojects
Fixes HStringTemplate-0.7.3
Fixes taffybar
I've contacted the authors of taffybar and HStringTemplate, and have provided patches for them to review.
@peti: Considering that HStringTemplate-0.8.3 is (at least in part) meant to fix the build on GHC 7.10, and HStringTemplate-0.7.3 doesn't build at all on GHC 7.10, I don't see any harm in committing this as is.
If you feel that merging this isn't acceptable for any reason, feel free to revert it.
|
gharchive/pull-request
| 2015-04-04T03:36:31 |
2025-04-01T04:55:26.818327
|
{
"authors": [
"cstrahan"
],
"repo": "NixOS/nixpkgs",
"url": "https://github.com/NixOS/nixpkgs/pull/7161",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
546588204
|
Deepin desktop modules
Motivation for this change
Let's propose the last step of Deepin outside of #63813, as it has quickly become more complex than it was at first. And that's because the switch to dde-kwin upstream broke everything in NixOS :frowning:
Things done
[ ] Tested using sandboxing (nix.useSandbox on NixOS, or option sandbox in nix.conf on non-NixOS linux)
Built on platform(s)
[ ] NixOS
[ ] macOS
[ ] other Linux distributions
[ ] Tested via one or more NixOS test(s) if existing and applicable for the change (look inside nixos/tests)
[ ] Tested compilation of all pkgs that depend on this change using nix-shell -p nixpkgs-review --run "nixpkgs-review wip"
[ ] Tested execution of all binary files (usually in ./result/bin/)
[ ] Determined the impact on package closure size (by running nix path-info -S before and after)
[ ] Ensured that relevant documentation is up to date
[ ] Fits CONTRIBUTING.md.
Notify maintainers
cc @
This is still important.
Packaging of the Deepin Desktop Environment has been cancelled (https://github.com/NixOS/nixpkgs/issues/94870).
Closing.
|
gharchive/pull-request
| 2020-01-08T01:05:16 |
2025-04-01T04:55:26.824188
|
{
"authors": [
"romildo",
"worldofpeace"
],
"repo": "NixOS/nixpkgs",
"url": "https://github.com/NixOS/nixpkgs/pull/77293",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
573423876
|
elm-format: 0.8.2 -> 0.8.3
Motivation for this change
New release of elm-format. https://github.com/avh4/elm-format/releases/tag/0.8.3 Changelog:
Bug fixes:
performance is improved, allowing excessively long lists to be formatted without crashing
Things done
[ ] Tested using sandboxing (nix.useSandbox on NixOS, or option sandbox in nix.conf on non-NixOS linux)
Built on platform(s)
[ ] NixOS
[x] macOS
[ ] other Linux distributions
[ ] Tested via one or more NixOS test(s) if existing and applicable for the change (look inside nixos/tests)
[x] Tested compilation of all pkgs that depend on this change using nix-shell -p nixpkgs-review --run "nixpkgs-review wip"
[x] Tested execution of all binary files (usually in ./result/bin/)
[ ] Determined the impact on package closure size (by running nix path-info -S before and after)
[ ] Ensured that relevant documentation is up to date
[ ] Fits CONTRIBUTING.md.
@GrahamcOfBorg build elm-format
@GrahamcOfBorg build elmPackages.elm-format
Thanks!
|
gharchive/pull-request
| 2020-02-29T22:36:28 |
2025-04-01T04:55:26.830195
|
{
"authors": [
"avh4",
"marsam"
],
"repo": "NixOS/nixpkgs",
"url": "https://github.com/NixOS/nixpkgs/pull/81413",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
589749677
|
nixos/systemd: remove one DefaultBlockIOAccounting
Motivation for this change
DefaultBlockIOAccounting=yes is set twice in the same file; remove one
copy.
Things done
[x] Tested using sandboxing (nix.useSandbox on NixOS, or option sandbox in nix.conf on non-NixOS linux)
Built on platform(s)
[x] NixOS
[ ] macOS
[ ] other Linux distributions
[ ] Tested via one or more NixOS test(s) if existing and applicable for the change (look inside nixos/tests)
[ ] Tested compilation of all pkgs that depend on this change using nix-shell -p nixpkgs-review --run "nixpkgs-review wip"
[ ] Tested execution of all binary files (usually in ./result/bin/)
[ ] Determined the impact on package closure size (by running nix path-info -S before and after)
[ ] Ensured that relevant documentation is up to date
[x] Fits CONTRIBUTING.md.
Nice catch, thanks!
|
gharchive/pull-request
| 2020-03-29T09:03:19 |
2025-04-01T04:55:26.835510
|
{
"authors": [
"Emantor",
"flokli"
],
"repo": "NixOS/nixpkgs",
"url": "https://github.com/NixOS/nixpkgs/pull/83660",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
605015857
|
[19.09] Use qt5's mkDerivation in packages that otherwise crash
Backport of #84673.
I've built all these packages without problem.
|
gharchive/pull-request
| 2020-04-22T19:40:07 |
2025-04-01T04:55:26.836381
|
{
"authors": [
"mmilata",
"worldofpeace"
],
"repo": "NixOS/nixpkgs",
"url": "https://github.com/NixOS/nixpkgs/pull/85805",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
606354250
|
sd-image: use parallel pbzip2 implementation
Motivation for this change
Using a parallel bzip2 implementation is faster on multi-processor machines. I tried to compress a 2.7 GB aarch64 NixOS image with the following results:
bzip2
output size: 627353716 bytes
duration: 217 seconds
real 3m36.963s
user 3m35.370s
sys 0m1.552s
pbzip2 -p4:
output size: 632445704 bytes (increase of ~0.1%)
duration: 96 seconds (55% faster)
real 1m36.191s
user 6m11.421s
sys 0m8.036s
Things done
[ ] Tested using sandboxing (nix.useSandbox on NixOS, or option sandbox in nix.conf on non-NixOS linux)
Built on platform(s)
[ ] NixOS
[ ] macOS
[ ] other Linux distributions
[ ] Tested via one or more NixOS test(s) if existing and applicable for the change (look inside nixos/tests)
[ ] Tested compilation of all pkgs that depend on this change using nix-shell -p nixpkgs-review --run "nixpkgs-review wip"
[ ] Tested execution of all binary files (usually in ./result/bin/)
[ ] Determined the impact on package closure size (by running nix path-info -S before and after)
[ ] Ensured that relevant documentation is up to date
[ ] Fits CONTRIBUTING.md.
Another option would be to use zstd for final compression of the image (zstd is already a dependency as it decompresses the rootfs image).
zstd -T4:
output size: 614908377 (decrease of 2%)
duration: 10 seconds (95% faster)
real 0m10.069s
user 0m36.025s
sys 0m1.431s
@prusnak Hi!
I agree that using bzip2 is counterproductive as it's extremely slow, especially on more constrained HW. I think pbzip is okay as a stop-gap, but just using zstd is a much better approach that will benefit even single-core systems.
FWIW: I've suggested in the past that we move all our compression to zstd as it has amazing ratio and speed.
I will gladly rework the PR to use zstd instead if that's what we really want.
IIRC the last time I tried to do this there was some discussion on whether or not the output format, in this case the compression algo, of a drv constituted a "public interface" or not; in which case changing it can't be done without further consideration as it breaks things.
IMO: The benefits are significant enough to be worth it, but it needs to be added to the release notes.
Okay, let's close this and pursue a bigger change (zstd) in another PR: https://github.com/NixOS/nixpkgs/pull/85947
|
gharchive/pull-request
| 2020-04-24T14:26:34 |
2025-04-01T04:55:26.845629
|
{
"authors": [
"lovesegfault",
"prusnak"
],
"repo": "NixOS/nixpkgs",
"url": "https://github.com/NixOS/nixpkgs/pull/85941",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
681403809
|
idris2: Enable --install, bugfixes
Motivation for this change
This removes the need for the bin/idris2_app folder
and replaces it with proper links to the Nix store folders.
This also allows the user to override IDRIS2_PREFIX, which
lets them use --install to install packages.
Example of how directories are created by the wrapper when user changes the prefix:
$ IDRIS2_PREFIX=~/.idris2 idris2 --paths
+ Working Directory :: "/home/wchresta/src/nixpkgs"
+ Source Directory :: Nothing
+ Build Directory :: "build"
+ Output Directory :: "build/exec"
+ Installation Prefix :: "/home/wchresta/.idris2"
+ Extra Directories :: [".", "/nix/store/h4851cakq6w4cw3hcy2c65b87l80br96-idris2/idris2-0.2.1/base", "/nix/store/h4851cakq6w4cw3hcy2c65b87l80br96-idris2/idris2-0.2.1/contrib", "/nix/store/h4851cakq6w4cw3hcy2c65b87l80br96-idris2/idris2-0.2.1/network", "/nix/store/h4851cakq6w4cw3hcy2c65b87l80br96-idris2/idris2-0.2.1/prelude", "/home/wchresta/.idris2/idris2-0.2.1/prelude", "/home/wchresta/.idris2/idris2-0.2.1/base"]
+ CG Library Directories :: ["/nix/store/h4851cakq6w4cw3hcy2c65b87l80br96-idris2/idris2-0.2.1/lib", "/home/wchresta/.idris2/idris2-0.2.1/lib", "/home/wchresta/src/nixpkgs"]
+ Data Directories :: ["/nix/store/h4851cakq6w4cw3hcy2c65b87l80br96-idris2/idris2-0.2.1/support", "/home/wchresta/.idris2/idris2-0.2.1/support"]
Things done
Fix: idris2_app/ was exposed in bin/
Remove native Idris2 wrapper that set LD_LIBRARY_PATH
Improve new Idris2 wrapper to set Idris2 paths to out folders
[X] Tested using sandboxing (nix.useSandbox on NixOS, or option sandbox in nix.conf on non-NixOS linux)
Built on platform(s)
[X] NixOS
[ ] macOS
[X] other Linux distributions
[ ] Tested via one or more NixOS test(s) if existing and applicable for the change (look inside nixos/tests)
[X] Tested compilation of all pkgs that depend on this change using nix-shell -p nixpkgs-review --run "nixpkgs-review wip"
[X] Tested execution of all binary files (usually in ./result/bin/)
[ ] Determined the impact on package closure size (by running nix path-info -S before and after)
[X] Ensured that relevant documentation is up to date
[X] Fits CONTRIBUTING.md.
@GrahamcOfBorg build idris2
nixpkgs-review succeeds:
https://github.com/NixOS/nixpkgs/pull/95776
1 package built:
idris2
I think we're ready to merge here.
pinging @veprbl and @Lassulus (as they've merged previous idris2 PRs)
|
gharchive/pull-request
| 2020-08-18T23:33:14 |
2025-04-01T04:55:26.853379
|
{
"authors": [
"bcdarwin",
"wchresta"
],
"repo": "NixOS/nixpkgs",
"url": "https://github.com/NixOS/nixpkgs/pull/95776",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
703819093
|
[20.03] dmidecode version bump to 3.2.8
(cherry picked from commit 0cd61da5f265b3e7e40baff86fad7e9d9cc3bc06)
Motivation for this change
PR #97614 was missing dmidecode's version bump.
Things done
[ ] Tested using sandboxing (nix.useSandbox on NixOS, or option sandbox in nix.conf on non-NixOS linux)
Built on platform(s)
[ ] NixOS
[ ] macOS
[ ] other Linux distributions
[ ] Tested via one or more NixOS test(s) if existing and applicable for the change (look inside nixos/tests)
[ ] Tested compilation of all pkgs that depend on this change using nix-shell -p nixpkgs-review --run "nixpkgs-review wip"
[ ] Tested execution of all binary files (usually in ./result/bin/)
[ ] Determined the impact on package closure size (by running nix path-info -S before and after)
[ ] Ensured that relevant documentation is up to date
[ ] Fits CONTRIBUTING.md.
Converted to draft until #98171 is ready.
|
gharchive/pull-request
| 2020-09-17T18:38:19 |
2025-04-01T04:55:26.859153
|
{
"authors": [
"samueldr",
"superherointj"
],
"repo": "NixOS/nixpkgs",
"url": "https://github.com/NixOS/nixpkgs/pull/98175",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1852423326
|
🛑 rimgo ri.zzls.xyz is down
In 1c6214b, rimgo ri.zzls.xyz (https://ri.zzls.xyz) was down:
HTTP code: 0
Response time: 0 ms
Resolved: rimgo ri.zzls.xyz is back up in a351ce7.
|
gharchive/issue
| 2023-08-16T02:55:18 |
2025-04-01T04:55:26.872653
|
{
"authors": [
"Mine1984Craft"
],
"repo": "NoPlagiarism/services-personal-upptime",
"url": "https://github.com/NoPlagiarism/services-personal-upptime/issues/108",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1820413097
|
🛑 BreezeWiki breezewiki.pussthecat.org is down
In 9f8fa8d, BreezeWiki breezewiki.pussthecat.org (https://breezewiki.pussthecat.org) was down:
HTTP code: 502
Response time: 382 ms
Resolved: BreezeWiki breezewiki.pussthecat.org is back up in b11e90a.
|
gharchive/issue
| 2023-07-25T14:00:17 |
2025-04-01T04:55:26.875827
|
{
"authors": [
"NoPlagiarism"
],
"repo": "NoPlagiarism/services-personal-upptime",
"url": "https://github.com/NoPlagiarism/services-personal-upptime/issues/1732",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1943511892
|
🛑 GotHub gothub.dev.projectsegfau.lt is down
In 9359557, GotHub gothub.dev.projectsegfau.lt (https://gothub.dev.projectsegfau.lt) was down:
HTTP code: 0
Response time: 0 ms
Resolved: GotHub gothub.dev.projectsegfau.lt is back up in 53aa5cc after 8 minutes.
|
gharchive/issue
| 2023-10-14T19:42:09 |
2025-04-01T04:55:26.879023
|
{
"authors": [
"Mine1984Craft"
],
"repo": "NoPlagiarism/services-personal-upptime",
"url": "https://github.com/NoPlagiarism/services-personal-upptime/issues/2242",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1838079018
|
🛑 libreddit libreddit.domain.glass is down
In bf10d6e, libreddit libreddit.domain.glass (https://libreddit.domain.glass) was down:
HTTP code: 503
Response time: 188 ms
Resolved: libreddit libreddit.domain.glass is back up in 6bdefd4.
|
gharchive/issue
| 2023-08-06T06:50:09 |
2025-04-01T04:55:26.881460
|
{
"authors": [
"NoPlagiarism"
],
"repo": "NoPlagiarism/services-personal-upptime",
"url": "https://github.com/NoPlagiarism/services-personal-upptime/issues/2332",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1847286563
|
🛑 BreezeWiki bw.projectsegfau.lt is down
In 01a9c13, BreezeWiki bw.projectsegfau.lt (https://bw.projectsegfau.lt) was down:
HTTP code: 502
Response time: 253 ms
Resolved: BreezeWiki bw.projectsegfau.lt is back up in c2c38d6.
|
gharchive/issue
| 2023-08-11T18:55:24 |
2025-04-01T04:55:26.884978
|
{
"authors": [
"Mine1984Craft"
],
"repo": "NoPlagiarism/services-personal-upptime",
"url": "https://github.com/NoPlagiarism/services-personal-upptime/issues/27",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2471304609
|
About ddpm: No module named 'ddpm.discriminator', 'generative', et al.
Dear author, when I run the code according to your prompts, I encounter the following problem:
Traceback (most recent call last):
File "train_ddpm.py", line 2, in
from ddpm.diffusion import Unet3D,GaussianDiffusion, Trainer
File "/home//Desktop/projects/3d_medical/X-ray2CTPA-main/ddpm/init.py", line 1, in
from ddpm.diffusion import Unet3D, GaussianDiffusion, Trainer
File "/home//Desktop/projects/3d_medical/X-ray2CTPA-main/ddpm/diffusion.py", line 38, in
from ddpm.discriminator import Discriminator, ones_target, zeros_target
ModuleNotFoundError: No module named 'ddpm.discriminator'
The error points to the place shown in the picture. However, I found that ddpm does not seem to have a discriminator or a generative module. I would like to ask how to solve this problem. Thank you
I have solved it. The generative module can be found at: https://github.com/Project-MONAI/GenerativeModels/tree/main
You should also pip install the monai package.
Exactly. Thanks I'll update the requirements.txt file
|
gharchive/issue
| 2024-08-17T03:13:55 |
2025-04-01T04:55:26.889325
|
{
"authors": [
"NoaCahan",
"zfw-cv"
],
"repo": "NoaCahan/X-ray2CTPA",
"url": "https://github.com/NoaCahan/X-ray2CTPA/issues/2",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
726152064
|
Possibility to interact with an opened page?
Hi.
Very nice solution to bypass cloudflare and work great.
I need to interact with an opened page: click buttons, fill forms, etc.
Not using postdata, but really interacting, like with Selenium...
Is it possible for you to add this? Or does it already exist and I just can't find how to use it?
Thanks
Hmm, I guess it could be configured to share the port of the webdriver so that you could connect to it... I don't really have time to look into it atm, but I think it would be a great addition to the project if you had time to work that out.
If anyone submits a PR for this I'd happily review and merge it!
I'm actually working on it with your source... but I must first understand how the puppeteer functions work.
I'll share my results with you.
Thanks
Any update?
You can try https://github.com/yoori/flare-bypasser - it allows writing extensions to interact with the page after the solve (but it uses nodriver)...
|
gharchive/issue
| 2020-10-21T05:41:07 |
2025-04-01T04:55:26.892309
|
{
"authors": [
"NoahCardoza",
"iconmix",
"okaiff"
],
"repo": "NoahCardoza/CloudProxy",
"url": "https://github.com/NoahCardoza/CloudProxy/issues/26",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
193098273
|
Problem using AjaxGrid
Hi,
Sorry, I don't know if I am polluting this space, but there is no other place to ask this. I just cannot make it work in my project. Here are the question and code:
https://github.com/LeonardoDaga/MVC6.Grid.Demo1stAspNetApp/issues/1
Please help!
Thanks,
Vedran
I found the source of this error. Thanks!
|
gharchive/issue
| 2016-12-02T11:48:26 |
2025-04-01T04:55:26.906365
|
{
"authors": [
"vzdesic"
],
"repo": "NonFactors/MVC6.Grid.Web",
"url": "https://github.com/NonFactors/MVC6.Grid.Web/issues/13",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
231196994
|
add ffjson marshal, update deployment yaml & makefile
updates to ffjson for json marshalling
from ffjson https://github.com/pquerna/ffjson
Performance Status:
MarshalJSON is 2x to 3x faster than encoding/json.
UnmarshalJSON is 2x to 3x faster than encoding/json.
|
gharchive/pull-request
| 2017-05-24T23:10:41 |
2025-04-01T04:55:26.929986
|
{
"authors": [
"13scoobie",
"jimcal"
],
"repo": "Nordstrom/choices",
"url": "https://github.com/Nordstrom/choices/pull/102",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
446561358
|
Show minimap with image overlay
Hi,
I'm trying to show minimap with image and markers,
my problem is that the minimap doesn't show all map as expected...
I see all the map on the screen but in the minimap just part of the map, this is a picture:
I attached my code:
let osm2 = this.mapDataService.drawMap(this.mapData);
let markers2 = this.itemsDrawService.draw(this.items);
let layers = new LayerGroup([...osm2.layers, ...markers2]);
let miniMap = new MiniMap(layers, {toggleDisplay: true});
miniMap.addTo(this.map);
Thanks
Yael
Hi,
I have the same issue, is there a solution?
Thanks
Michael
|
gharchive/issue
| 2019-05-21T11:17:58 |
2025-04-01T04:55:26.934846
|
{
"authors": [
"michael-eckhart-woellkart",
"yaelFriedmann"
],
"repo": "Norkart/Leaflet-MiniMap",
"url": "https://github.com/Norkart/Leaflet-MiniMap/issues/151",
"license": "bsd-2-clause",
"license_type": "permissive",
"license_source": "bigquery"
}
|
2279713223
|
Added option to skip caching when enumerating rows
This might be a special request but I have a use case where not adding read rows to the cache would be ideal.
I'm writing a Dalamud plugin that searches for translations of game content, which means I read a lot of rows from many sheets in order to find items matching the search string (e.g. "popoto" will match "popoto", "popoto step" and "Popotoes au Gratin").
In a Dalamud context the sheet cache (and the row cache) are always kept in memory and using ExcelModule.RemoveSheetFromCache<T>() would be detrimental to other users of the cache.
Therefore it would simply be best for me to be able to iterate over rows without loading a gigabyte of data into the cache ^^'
PS. I'm not sure why I had to add another GetEnumerator method, but it wouldn't compile without it
PS. I'm not sure why I had to add another GetEnumerator method, but it wouldn't compile without it
because it breaks the contract with the enumerable interface
it would be better to have a thread-local internal static that is updated by some IDisposable (e.g. using var _ = new NoCache()) and read that in the enumerator, so disposing it restores the original behaviour and you can use it in a scope
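The scoped-disable pattern suggested above can be sketched like this (the project is C#, but the idea translates directly; no_cache and caching_enabled are illustrative names, not Lumina's API):

```python
import threading
from contextlib import contextmanager

# Thread-local flag controlling whether reads populate the cache.
_state = threading.local()

def caching_enabled():
    # Default to caching on when the flag has never been touched.
    return getattr(_state, "cache", True)

@contextmanager
def no_cache():
    # Scoped disable: save and restore the previous value, so nesting works
    # and leaving the block (like disposing the IDisposable) restores the
    # original behaviour for everyone else on this thread.
    prev = caching_enabled()
    _state.cache = False
    try:
        yield
    finally:
        _state.cache = prev
```

An enumerator would then consult caching_enabled() before inserting rows into the cache, leaving other consumers of the shared cache unaffected.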
Hey, just a little bump. Could you look at the updated code? I would really appreciate having this feature upstreamed
hey sorry, this got lost in my notifs! will sort now - thanks for updating the PR
|
gharchive/pull-request
| 2024-05-05T20:23:04 |
2025-04-01T04:55:26.952141
|
{
"authors": [
"NotAdam",
"Olaren15"
],
"repo": "NotAdam/Lumina",
"url": "https://github.com/NotAdam/Lumina/pull/83",
"license": "WTFPL",
"license_type": "permissive",
"license_source": "github-api"
}
|
1907412172
|
Updated the Image of Sales Forecast Issue# 278
Added an illustrative picture from SILK Corp Images.
@Aini-Bashir please review this pull request.
@Aini-Bashir Please check it now.
@Aini-Bashir. Here is the link: [Link] [https://envato-shoebox-0.imgix.net/c57a/ee3b-7f8a-4075-96d7-1b75751d3270/b_grzes31.jpg?auto=compress%2Cformat&fit=max&mark=https%3A%2F%2Felements-assets.envato.com%2Fstatic%2Fwatermark2.png&markalign=center%2Cmiddle&markalpha=18&w=800&s=88e8c6e5a547b4b88320f67048d9f7cf]
Link:
@rahmatzeb, send the link again. Write the "Link" and then press ctrl+k, it will pop up a screen where you'll paste the link and insert it. Thanks
Okay, let me do it again.
Link
Asset 1: Link
|
gharchive/pull-request
| 2023-09-21T17:08:51 |
2025-04-01T04:55:26.960513
|
{
"authors": [
"Aini-Bashir",
"rahmatzeb"
],
"repo": "NoteHive/Silk-Corp-Guide",
"url": "https://github.com/NoteHive/Silk-Corp-Guide/pull/304",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
152795800
|
Remove all references to codeplex (as appropriate on the site)
I see references to CodePlex on our site. Customers have complained that CodePlex results come up in searches, and hopefully reducing links from the gallery will drop their rank.
cleaned it up
|
gharchive/issue
| 2016-05-03T15:00:19 |
2025-04-01T04:55:27.109783
|
{
"authors": [
"harikmenon"
],
"repo": "NuGet/NuGetGallery",
"url": "https://github.com/NuGet/NuGetGallery/issues/3013",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
275842165
|
Update catalog2lucene to add the case-insensitive package version field to all documents
After deploying this, we must run db2lucene so that old packages have this field.
Addressed by https://github.com/NuGet/NuGet.Services.Metadata/pull/256.
Duplicate of https://github.com/NuGet/NuGetGallery/issues/4583.
|
gharchive/issue
| 2017-11-21T20:03:09 |
2025-04-01T04:55:27.111350
|
{
"authors": [
"joelverhagen"
],
"repo": "NuGet/NuGetGallery",
"url": "https://github.com/NuGet/NuGetGallery/issues/5052",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
512700840
|
Confusing forms and lexemes when getting lexemes
If there is no lexeme movies, but there is a lexeme movie with the form movies, "get_or_create_lexeme" will get you the lexeme movie, so if you then want to update the sense "cinema" it will be wrong. I had about 300 lexemes mixed up due to this issue while running a bot.
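The confusion can be illustrated with a toy lookup (purely illustrative; this is not LexData's implementation or API):

```python
# Minimal in-memory model of the reported confusion; names and data are made up.
lexemes = {
    "L1": {"lemma": "movie", "forms": ["movie", "movies"], "senses": ["film"]},
}

def get_or_create_lexeme(query):
    # Reported behaviour: a query matching any *form* returns that lexeme,
    # even though no existing lexeme has the query as its lemma.
    for lexeme_id, lexeme in lexemes.items():
        if query == lexeme["lemma"] or query in lexeme["forms"]:
            return lexeme_id
    new_id = "L%d" % (len(lexemes) + 1)
    lexemes[new_id] = {"lemma": query, "forms": [query], "senses": []}
    return new_id

# Searching for "movies" returns L1 ("movie"), so a sense update intended
# for a separate lexeme "movies" silently lands on the wrong entry.
```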
Do I understand you correctly: you were expecting that "get_or_create_lexeme" should also search for a lexeme that has a form with the given string?
|
gharchive/issue
| 2019-10-25T20:16:33 |
2025-04-01T04:55:27.112585
|
{
"authors": [
"Nudin",
"Uziel302"
],
"repo": "Nudin/LexData",
"url": "https://github.com/Nudin/LexData/issues/2",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
945881639
|
About cross compiling
Can I use Nuitka to generate ARM (arm64-v8a) modules on x86 machines? I want to compile hello.py and run the generated .so file on Android.
Duplicate of #43
|
gharchive/issue
| 2021-07-16T02:12:08 |
2025-04-01T04:55:27.114445
|
{
"authors": [
"kayhayen",
"liukangcc"
],
"repo": "Nuitka/Nuitka",
"url": "https://github.com/Nuitka/Nuitka/issues/1162",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1148909711
|
nuitka 0.7 fails on virtualenv when using --onefile (ERRNO 5)
Nuitka version, full Python version, flavor, OS, etc. as output by this
command (it does more than you think, and we are adding more all the time):
python -m nuitka --version
0.7
Commercial: None
Python: 3.9.6 (tags/v3.9.6:db3ff76, Jun 28 2021, 15:26:21) [MSC v.1929 64 bit (AMD64)]
Flavor: Unknown
Executable: C:\Users\rauln\Desktop\vsm\venv\Scripts\python.exe
OS: Windows
Arch: x86_64
WindowsRelease: 10
How did you install Nuitka and Python
pip install nuitka within virtualenv
The specific PyPI names and versions
Nuitka==0.7
Many times when you get an error from Nuitka, your setup may be special
print("hello world") fails, too, if '--onefile' is provided. If not, it seems to work, so probably my setup is wrong, but as far as I know I did everything correctly and I even left to Nuitka to install the C toolchain, too.
Also supply a Short, Self Contained, Correct, Example
That demonstrates the issue i.e a small piece of code which reproduces
the issue and can be run with out any other (or as few as possible)
external dependencies. Issues without this may get rejected without much
consideration.
Provide in your issue the Nuitka options used
I just used --onefile, and that seems to be the cause of the error. --follow-imports works, too, but --onefile generates the error below.
The resulting error
Detecting used DLLs: 0%| | 0/18Traceback (most recent call last):
File "C:\Users\rauln\Desktop\vsm\venv\lib\site-packages\nuitka\__main__.py", line 137, in <module>
main()
File "C:\Users\rauln\Desktop\vsm\venv\lib\site-packages\nuitka\__main__.py", line 123, in main
MainControl.main()
File "C:\Users\rauln\Desktop\vsm\venv\lib\site-packages\nuitka\MainControl.py", line 963, in main
copyDllsUsed(
File "C:\Users\rauln\Desktop\vsm\venv\lib\site-packages\nuitka\freezer\Standalone.py", line 1371, in copyDllsUsed
used_dlls = _detectUsedDLLs(
File "C:\Users\rauln\Desktop\vsm\venv\lib\site-packages\nuitka\freezer\Standalone.py", line 1143, in _detectUsedDLLs
worker_pool.submit(
File "C:\Users\rauln\Desktop\vsm\venv\lib\site-packages\nuitka\utils\ThreadedExecutor.py", line 45, in submit
self.results.append(function(*args))
File "C:\Users\rauln\Desktop\vsm\venv\lib\site-packages\nuitka\freezer\Standalone.py", line 1103, in addDLLInfo
used_dlls = _detectBinaryDLLs(
File "C:\Users\rauln\Desktop\vsm\venv\lib\site-packages\nuitka\freezer\Standalone.py", line 1070, in _detectBinaryDLLs
return detectBinaryPathDLLsWindowsDependencyWalker(
File "C:\Users\rauln\Desktop\vsm\venv\lib\site-packages\nuitka\freezer\Standalone.py", line 1033, in detectBinaryPathDLLsWindowsDependencyWalker
result = detectDLLsWithDependencyWalker(binary_filename, scan_dirs)
File "C:\Users\rauln\Desktop\vsm\venv\lib\site-packages\nuitka\freezer\DependsExe.py", line 152, in detectDLLsWithDependencyWalker
"scan_dirs": "\n".join(
File "C:\Users\rauln\Desktop\vsm\venv\lib\site-packages\nuitka\freezer\DependsExe.py", line 153, in <genexpr>
"UserDir %s" % getExternalUsePath(dirname) for dirname in scan_dirs
File "C:\Users\rauln\Desktop\vsm\venv\lib\site-packages\nuitka\utils\FileOperations.py", line 759, in getExternalUsePath
filename = getWindowsShortPathName(filename)
File "C:\Users\rauln\Desktop\vsm\venv\lib\site-packages\nuitka\utils\FileOperations.py", line 724, in getWindowsShortPathName
raise WindowsError(
OSError: [Errno 5] Acceso denegado.
The error occurs when running getWindowsShortPathName on the path C:\Program Files\WindowsApps\Microsoft.WindowsTerminal_1.11.3471.0_x64__8wekyb3d8bbwe; I don't know why.
Thanks for the report: you have something in your PATH that you are not allowed to access, which looks like a permission-denied error. I am going to change it so that, in that case, the filename is simply not shortened. Shortening the path is mostly an optional step, because external tools frequently mishandle long paths (Dependency Walker is one of them), but the permission error probably also means DLLs won't be loaded or even loadable from there anyway.
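The fallback described above (keep the long path when shortening fails) can be sketched with ctypes; this is a hedged reconstruction for illustration, not Nuitka's actual patch, which lives in nuitka/utils/FileOperations.py and may differ:

```python
import ctypes
import os

def get_short_path_name_or_self(filename):
    """Return the Windows 8.3 short form of *filename*, falling back to the
    original path when the API fails (e.g. with access denied)."""
    if os.name != "nt":
        # No short names outside Windows; return the path unchanged.
        return filename
    GetShortPathNameW = ctypes.windll.kernel32.GetShortPathNameW
    # First call asks for the required buffer size.
    needed = GetShortPathNameW(filename, None, 0)
    if needed == 0:
        # API failed (permission denied, missing file, ...): keep the long path.
        return filename
    buf = ctypes.create_unicode_buffer(needed)
    if GetShortPathNameW(filename, buf, needed) == 0:
        return filename
    return buf.value
```

Raising on failure, as the pre-fix code did, is what turned a merely inaccessible PATH entry into a hard crash.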
I pushed a fix to factory: https://nuitka.net/doc/factory.html
Thanks for your report, this is something that could happen due to lots of things. Windows terminal is from the App store, and these have very mixed permissions, but this is probably in PATH for a few things to work with it.
This is also going to be in 0.7.1 hotfix once that comes out, probably today/tomorrow.
thanks for the kind words, but obviously Nuitka has to cope with these things, and thanks for reporting it. It's part of the 0.7.1 hotfix I just made.
I can confirm it is solved in the hotfix release. Thanks A LOT for being so incredibly fast 😃
If you are happy, make sure to share Nuitka on your social networks. That is the best non-monetary support I can get.
|
gharchive/issue
| 2022-02-24T06:24:49 |
2025-04-01T04:55:27.124105
|
{
"authors": [
"DervishD",
"kayhayen"
],
"repo": "Nuitka/Nuitka",
"url": "https://github.com/Nuitka/Nuitka/issues/1442",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
794238353
|
tests: Ignore /usr/etc/
Closes #811
Just noting that the latest tag (.3) didnt make it to PyPI again. No drama, not urgent, however I'd like to include this fix in the next packaging release for openSUSE, and possibly also a fix for https://github.com/Nuitka/Nuitka/issues/960 if that is possible.
.3 is now on PyPI. I'll finish off the openSUSE package with that.
|
gharchive/pull-request
| 2021-01-26T13:38:46 |
2025-04-01T04:55:27.127090
|
{
"authors": [
"jayvdb"
],
"repo": "Nuitka/Nuitka",
"url": "https://github.com/Nuitka/Nuitka/pull/954",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1373267867
|
The handling of the avatar image path in the open-recruitment recognition seems to be wrong
In version [v0.5.4],
in nonebot_plugin_arktools/open_recruitment/data_source.py,
line 283:
avatar = Image.open(Path().parent / "avatar" / f"{op[1]}.png").convert("RGBA").resize((128, 128))  # avatar
resolves to the avatar directory under nonebot's working directory; it should probably be changed to
avatar = Image.open(Path(__file__).parent.parent / "_data" / "operator_info" / "image" / "avatar" / f"{op[1]}.png").convert("RGBA").resize((128, 128))  # avatar
which is the plugin's install directory, consistent with the auto-update download path.
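For context, the difference between the two constructions is that Path() resolves against the process working directory while Path(__file__) anchors to the module's own location. A small illustration (the avatar file name is made up, and the stdlib pathlib module stands in for the plugin file):

```python
import pathlib
from pathlib import Path

# Path() with no argument is ".", i.e. relative to the *current working
# directory* of the process -- for a bot, wherever nonebot was started.
cwd_relative = Path().parent / "avatar" / "char_002_amiya.png"

# Inside the plugin, Path(__file__) is data_source.py itself, so paths built
# from it stay anchored to the installed package.  Using the stdlib pathlib
# module's own __file__ here as a stand-in for the plugin file:
module_relative = (Path(pathlib.__file__).parent
                   / "_data" / "operator_info" / "image" / "avatar"
                   / "char_002_amiya.png")

print(cwd_relative)       # avatar/char_002_amiya.png on POSIX, cwd-relative
print(module_relative)    # anchored to the pathlib install location
```

The second form is therefore stable no matter which directory the bot is launched from, which is why it matches the auto-updater's download path.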
Indeed, after making this change the error is resolved.
Fixed.
|
gharchive/issue
| 2022-09-14T16:25:21 |
2025-04-01T04:55:27.137824
|
{
"authors": [
"NumberSir",
"Yangleis",
"liuzj288"
],
"repo": "NumberSir/nonebot_plugin_arktools",
"url": "https://github.com/NumberSir/nonebot_plugin_arktools/issues/11",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2367944038
|
[Question] Does this work with BlockTheSpot?
Since it doesn't have lyrics in its package, I was wondering if this works in conjunction with BlockTheSpot + Spicetify.
I don't really understand this question, but this is made for Spicetify.
If you use BlockTheSpot + Spicetify, this should work too.
|
gharchive/issue
| 2024-06-22T18:15:47 |
2025-04-01T04:55:27.213128
|
{
"authors": [
"Nuzair46",
"Tulip-0333"
],
"repo": "Nuzair46/Lyrixed",
"url": "https://github.com/Nuzair46/Lyrixed/issues/12",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
243888760
|
Languages Tag View (chip)
This PR shouldn't be merged as long as the nyaa website hasn't moved to 1.0.1. It would then need to be tested against the new API.
Furthermore, I have a problem with the Taiwan flag; I can't make it work.
Close #31 after merge
Nyaa is currently 1.0.1
Hey. I can't see those flags
|
gharchive/pull-request
| 2017-07-19T00:23:27 |
2025-04-01T04:55:27.214968
|
{
"authors": [
"akuma06",
"xdk78"
],
"repo": "NyaaPantsu/NyaaPantsu-android-app",
"url": "https://github.com/NyaaPantsu/NyaaPantsu-android-app/pull/40",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
35147395
|
Box Selection Tool
Suggestion from technicians: Implement a selection tool similar to lasso, but that selects using a box.
I'm going to close this issue as it hasn't been requested lately.
|
gharchive/issue
| 2014-06-06T13:54:41 |
2025-04-01T04:55:27.371127
|
{
"authors": [
"AmberSJones"
],
"repo": "ODM2/ODMToolsPython",
"url": "https://github.com/ODM2/ODMToolsPython/issues/120",
"license": "bsd-3-clause",
"license_type": "permissive",
"license_source": "bigquery"
}
|
2012912365
|
Ssh newkeys 6578 v2
Ticket
Redmine ticket: https://redmine.openinfosecfoundation.org/issues/6578
Suricata PR required https://github.com/OISF/suricata/pull/9903
Replaced by https://github.com/OISF/suricata-verify/pull/1507
|
gharchive/pull-request
| 2023-11-27T19:03:45 |
2025-04-01T04:55:27.408317
|
{
"authors": [
"catenacyber"
],
"repo": "OISF/suricata-verify",
"url": "https://github.com/OISF/suricata-verify/pull/1498",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
563186329
|
Remove noisy queue clearing printing
Since these tests get run with all logging enabled, these prints were actually adding a lot of unnecessary noise on successful runs.
Completely purge rabbit after performance test runs
|
gharchive/pull-request
| 2020-02-11T12:51:42 |
2025-04-01T04:55:27.481848
|
{
"authors": [
"AdamHawtin"
],
"repo": "ONSdigital/census-rm-performance-tests",
"url": "https://github.com/ONSdigital/census-rm-performance-tests/pull/20",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1014945851
|
Update dp-mongodb
What
Update dp-mongodb to latest version
Update Go to latest version
How to review
Check the changes make sense, there is no reference to v2 and tests pass
Who can review
Anyone
Some unit tests have failed.
|
gharchive/pull-request
| 2021-10-04T09:31:03 |
2025-04-01T04:55:27.483175
|
{
"authors": [
"cookel2",
"rafahop"
],
"repo": "ONSdigital/dp-permissions-api",
"url": "https://github.com/ONSdigital/dp-permissions-api/pull/34",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1031112760
|
Update to chromedriver version 94
What is the context of this PR?
Update ChromeDriver to 94.
How to review
Check functional tests pass.
Checklist
[ ] New static content marked up for translation
[ ] Newly defined schema content included in eq-translations repo
Update Chromedriver
|
gharchive/pull-request
| 2021-10-20T08:16:29 |
2025-04-01T04:55:27.485369
|
{
"authors": [
"petechd"
],
"repo": "ONSdigital/eq-questionnaire-runner",
"url": "https://github.com/ONSdigital/eq-questionnaire-runner/pull/691",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
991050643
|
Add Concourse task to deploy Notify API credentials
What is the context of this PR?
Added a Concourse task to deploy the Notify API credentials into Secret Manager and give the App Engine service account access to it.
How to review
Review using https://github.com/ONSdigital/eq-pipelines/pull/266
OR
Use the ci/README.md to deploy a secret into Secrets Manager. (Must be an existing env)
Checklist
[ ] Tests updated
@berroar @LJBabbage I've made some changes to read the dev key from the repo instead of k8s after a discussion with Mark. That is what we intended from the beginning I think, just lost track.
|
gharchive/pull-request
| 2021-09-08T11:51:37 |
2025-04-01T04:55:27.487456
|
{
"authors": [
"MebinAbraham"
],
"repo": "ONSdigital/eq-submission-confirmation-consumer",
"url": "https://github.com/ONSdigital/eq-submission-confirmation-consumer/pull/11",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
271864800
|
Allow viewing of submitted answers
What is the context of this PR?
As a respondent,
I need to access my responses to questions
so that I can refer to them if my data is queried by ONS at a later date or used as a reference for next time I complete the same survey
How to review
Easiest to test using docker (docker-compose up -d) to ensure the dynamoDb container is created.
Use test_submitted_responses.json schema which allows answers to be viewed for 45 seconds.
Still need to add tests around this but ready for a high level review. I'd also like to discuss issues around security of answers.
The answers need to be stored encrypted exactly as they are when being stored in Postgres. For this we will need to keep the user signed in (so we can encrypt and decrypt the answers with the user provided keys).
We should look at doing this by adding configurable backends for questionnaire store. This could potentially allow us to switch out Postgres for DynamoDB for survey state during surveys.
The extra information required on the thank you and answers after submission screen e.g. ru name, submission time, should be stored in the server side session.
In doing this we should also be able to store the eq id and form type in the session and remove them from the urls for post submission screens.
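A minimal sketch of the server-side session idea described above, assuming an in-memory store and hypothetical function names (the real runner would use its configured session backend and viewing window):

```python
import time

# Hypothetical server-side session store, keyed by session id, holding the
# metadata needed by the thank-you / view-answers screens so that eq_id and
# form_type no longer need to travel in the URL.
SESSION_STORE = {}

def record_submission(session_id, ru_name, eq_id, form_type):
    """Stash post-submission metadata in the server-side session."""
    SESSION_STORE[session_id] = {
        "ru_name": ru_name,
        "eq_id": eq_id,
        "form_type": form_type,
        "submitted_at": time.time(),
    }

def post_submission_context(session_id, view_window_seconds=45):
    """Return the stored metadata while the viewing window is open, else None.

    The 45-second default mirrors the test_submitted_responses.json schema
    mentioned in this PR; a real deployment would use its own window.
    """
    entry = SESSION_STORE.get(session_id)
    if entry is None or time.time() - entry["submitted_at"] > view_window_seconds:
        return None
    return entry
```

Keeping this data server-side also means the post-submission URLs carry no survey identifiers at all.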
@dwyeradam still getting 5xx's on submission, to do with Boto3's PutItem class
@dwyeradam @shahi645 this is because there is an extra & on line 36 of dev_settings.sh
|
gharchive/pull-request
| 2017-11-07T15:08:12 |
2025-04-01T04:55:27.491437
|
{
"authors": [
"ajmaddaford",
"dwyeradam",
"jonnyshaw89",
"shahi645"
],
"repo": "ONSdigital/eq-survey-runner",
"url": "https://github.com/ONSdigital/eq-survey-runner/pull/1345",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
318404911
|
Implemented create survey feature
can be found at /surveys/create (no link from survey page)
This PR is feature complete but I need to add tests (once I've learned how ...)
US102: INT User to Create a CE - After R17/18 (M)
US101: INT User to Create a Survey - After R17/18 (L)
Codecov Report
Merging #139 into master will decrease coverage by 2.73%.
The diff coverage is 44.28%.
@@ Coverage Diff @@
## master #139 +/- ##
==========================================
- Coverage 95.21% 92.48% -2.74%
==========================================
Files 34 34
Lines 1254 1317 +63
==========================================
+ Hits 1194 1218 +24
- Misses 60 99 +39
Impacted Files | Coverage Δ
response_operations_ui/exceptions/exceptions.py | 100% <100%> (ø) :arrow_up:
response_operations_ui/error_handlers.py | 100% <100%> (ø) :arrow_up:
...se_operations_ui/controllers/survey_controllers.py | 81.39% <11.76%> (-17.16%) :arrow_down:
response_operations_ui/views/surveys.py | 76.19% <51.85%> (-18.94%) :arrow_down:
response_operations_ui/forms.py | 85% <54.16%> (-13.25%) :arrow_down:
Continue to review full report at Codecov.
Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 0e2b966...bf8db22. Read the comment docs.
Can you add these config changes to the confluence doc for new config? https://digitaleq.atlassian.net/wiki/spaces/RASB/pages/406257665/Pending+Release+Config+Changes
|
gharchive/pull-request
| 2018-04-27T13:14:17 |
2025-04-01T04:55:27.506084
|
{
"authors": [
"codecov-io",
"gemmai95",
"madeye-matt"
],
"repo": "ONSdigital/response-operations-ui",
"url": "https://github.com/ONSdigital/response-operations-ui/pull/139",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
296467399
|
Improve the way login errors are generated
This way of displaying errors means that 1) every error will be displayed if there are mulitple and
2) the errors that are generated have links
Codecov Report
Merging #52 into master will not change coverage.
The diff coverage is n/a.
@@ Coverage Diff @@
## master #52 +/- ##
=======================================
Coverage 99.55% 99.55%
=======================================
Files 24 24
Lines 446 446
=======================================
Hits 444 444
Misses 2 2
Powered by Codecov. Last update 21d698e...61acac5. Read the comment docs.
|
gharchive/pull-request
| 2018-02-12T17:51:55 |
2025-04-01T04:55:27.511521
|
{
"authors": [
"codecov-io",
"insacuri"
],
"repo": "ONSdigital/response-operations-ui",
"url": "https://github.com/ONSdigital/response-operations-ui/pull/52",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
267688802
|
Get tx id from header
What? and Why?
What does this pull request change/fix? Why was it necessary?
Get tx_id from header. This makes logging and monitoring of the service easier.
Checklist
[x] CHANGELOG.md updated? (if required)
Codecov Report
Merging #60 into master will increase coverage by 0.01%.
The diff coverage is 100%.
@@ Coverage Diff @@
## master #60 +/- ##
==========================================
+ Coverage 98.53% 98.55% +0.01%
==========================================
Files 8 8
Lines 341 345 +4
==========================================
+ Hits 336 340 +4
Misses 5 5
Impacted Files | Coverage Δ
tests/test_data.py | 100% <ø> (ø) :arrow_up:
tests/test_response_processor.py | 100% <100%> (ø) :arrow_up:
Powered by Codecov. Last update 5e16e76...4488c29. Read the comment docs.
Fix empty tx_id in sdx_receipt_rrm
|
gharchive/pull-request
| 2017-10-23T14:25:27 |
2025-04-01T04:55:27.520362
|
{
"authors": [
"codecov-io",
"gemmai95"
],
"repo": "ONSdigital/sdx-receipt-rrm",
"url": "https://github.com/ONSdigital/sdx-receipt-rrm/pull/60",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2419186797
|
incorrect label for rule BR-BT-00738-0053
Citing the changelog of SDK 1.8.1:
Remove redundant rule checking that preferred publication date is after the dispatch date.
Correct rule BR-BT-00738-0053 to allow the preferred publication date (BT-738) to be equal to the dispatch date (BT-05).
The schematron seems to have been updated accordingly, but in rule_en.xml the error message for BR-BT-00738-0053 still reads:
'Preferred publication date' (BT-738-notice) must be between 2 and 60 days after 'Notice dispatch date'
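The corrected rule from the SDK 1.8.1 changelog (BT-738 between 0 and 60 days after BT-05, equality allowed) can be expressed as a small check; the function name is illustrative, the real validation is a Schematron assertion:

```python
from datetime import date

def check_preferred_publication_date(dispatch_date: date, preferred_date: date) -> bool:
    """Validate rule BR-BT-00738-0053 as corrected in SDK 1.8.1: the
    preferred publication date (BT-738) must fall between 0 and 60 days
    after the notice dispatch date (BT-05), equality allowed."""
    delta = (preferred_date - dispatch_date).days
    return 0 <= delta <= 60

# Equal dates are now allowed (the change the labels failed to pick up) ...
assert check_preferred_publication_date(date(2024, 7, 1), date(2024, 7, 1))
# ... and anything past 60 days is still rejected.
assert not check_preferred_publication_date(date(2024, 7, 1), date(2024, 9, 1))
```

The stale English label still describes the pre-1.8.1 lower bound of 2 days, which is the mismatch reported here.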
Hi,
The English version is indeed not aligned with the translations; this should get fixed with the next labels update.
@YvesJo Thx - as far as I can tell the issue also exists in the other languages.
Hi @mdewinne
Extracting the labels for SDK 1.8.2, I get the following:
Line 17781: <entry key="rule|text|BR-BT-00738-0053">'Предпочитана дата за публикуване' (BT-738-notice) трябва да бъде между 0 и 60 дни след 'Дата на изпращане на известието'</entry>
Line 17781: <entry key="rule|text|BR-BT-00738-0053">'Preferované datum zveřejnění' (BT-738-notice) musí být v rozmezí od 0 do 60 dnů po 'Datum odeslání oznámení'</entry>
Line 17781: <entry key="rule|text|BR-BT-00738-0053">'Foretrukken offentliggørelsesdato' (BT-738-notice) skal være mellem 0 og 60 dage efter 'Afsendelsesdato for bekendtgørelsen'</entry>
Line 17781: <entry key="rule|text|BR-BT-00738-0053">'Bevorzugtes Datum der Veröffentlichung' (BT-738-notice) muss zwischen 0 und 60 Tage nach 'Datum der Übermittlung der Bekanntmachung' sein</entry>
Line 17886: <entry key="rule|text|BR-BT-00738-0053">'Προτιμώμενη ημερομηνία δημοσίευσης' (BT-738-notice) πρέπει να είναι μεταξύ 0 και 60 ημερών μετά 'Ημερομηνία αποστολής της προκήρυξης'</entry>
Line 25975: <entry key="rule|text|BR-BT-00738-0053">'Preferred publication date' (BT-738-notice) must be between 2 and 60 days after 'Notice dispatch date'</entry>
Line 17781: <entry key="rule|text|BR-BT-00738-0053">'Fecha preferida de publicación' (BT-738-notice) debe situarse entre 0 días y 60 días después de 'Fecha de envío del anuncio'</entry>
Line 17798: <entry key="rule|text|BR-BT-00738-0053">'Avaldamise eeliskuupäev' (BT-738-notice) peab olema 0 kuni 60 päeva pärast 'Teate saatmise kuupäev'</entry>
Line 17781: <entry key="rule|text|BR-BT-00738-0053">'Ensisijainen julkaisupäivä' (BT-738-notice) on oltava 0 ja 60 päivän välillä 'Ilmoituksen lähetyspäivä' jälkeen</entry>
Line 20106: <entry key="rule|text|BR-BT-00738-0053">'Date de publication souhaitée' (BT-738-notice) doit être compris entre zéro et soixante jours après 'Date d’envoi de l’avis'</entry>
Line 17781: <entry key="rule|text|BR-BT-00738-0053">'Ní mór 'Dáta foilsithe is rogha leat' (BT-738-notice) a bheith idir 0 agus 60 lá tar éis 'Dáta seolta an fhógra'</entry>
Line 17781: <entry key="rule|text|BR-BT-00738-0053">'Željeni datum objave' (BT-738-notice) mora biti između 0 i 60 dana nakon 'Datum slanja obavijesti'</entry>
Line 17781: <entry key="rule|text|BR-BT-00738-0053">'A hirdetmény közzétételének preferált dátuma' (BT-738-notice) 0–60 nappal 'A hirdetmény megküldésének dátuma' után kell, hogy legyen</entry>
Line 17781: <entry key="rule|text|BR-BT-00738-0053">'Data di pubblicazione preferita' (BT-738-notice) deve essere compreso tra 0 e 60 giorni dopo 'Data di trasmissione dell'avviso'</entry>
Line 17781: <entry key="rule|text|BR-BT-00738-0053">'Pageidaujama paskelbimo data' (BT-738-notice) turi būti 0–60 dienų po 'Skelbimo išsiuntimo data'</entry>
Line 17781: <entry key="rule|text|BR-BT-00738-0053">'Vēlamais publicēšanas datums' (BT-738-notice) jābūt no 0 līdz 60 dienām pēc 'Paziņojuma nosūtīšanas datums'</entry>
Line 17781: <entry key="rule|text|BR-BT-00738-0053">'Data ta’ pubblikazzjoni preferuta' (BT-738-notice) irid ikun bejn jiem żero u 60 jum wara 'Data ta’ meta ntbagħat l-avviż'</entry>
Line 17781: <entry key="rule|text|BR-BT-00738-0053">'Voorkeursdatum voor publicatie' (BT-738-notice) moet tussen 0 en zestig dagen na 'Verzenddatum van de aankondiging' zijn</entry>
Line 17781: <entry key="rule|text|BR-BT-00738-0053">'Preferowana data publikacji' (BT-738-notice) musi wynosić od 0 do 60 dni po 'Ogłoszenie – data wysłania'</entry>
Line 17781: <entry key="rule|text|BR-BT-00738-0053">'Data de publicação preferida' (BT-738-notice) deve estar entre 0 e 60 dias após 'Data de envio do anúncio'</entry>
Line 17781: <entry key="rule|text|BR-BT-00738-0053">'Data preferată pentru publicare' (BT-738-notice) trebuie să fie între 0 și 60 de zile după 'Data notificării expedierii'</entry>
Line 17781: <entry key="rule|text|BR-BT-00738-0053">'Preferovaný dátum uverejnenia' (BT-738-notice) musí byť od 0 do 60 dní po 'Dátum odoslania oznámenia'</entry>
Line 17781: <entry key="rule|text|BR-BT-00738-0053">'Zaželeni datum objave' (BT-738-notice) mora biti od 0 dni do 60 dni po 'Datum pošiljanja obvestila'</entry>
Line 17781: <entry key="rule|text|BR-BT-00738-0053">'Önskat offentliggörandedatum' (BT-738-notice) måste vara mellan 0 och 60 dagar efter 'Avsändningsdatum för meddelandet'</entry>
and may only detect an issue for the EN label. Would you have additional information to share?
@YvesJo Ah yes, you are right. But then there may have been a regression, because in the latest version of the rule_.xml files, the delay once again mentions "between 2 and 60 days".
E.g. rule_fr.xml in SDK 1.12:
'Date de publication souhaitée' (BT-738-notice) doit être compris entre deux et soixante jours après 'Date d’envoi de l’avis'
Indeed, following further investigations, some unexpected and competing operations were carried out and led to a regression in some recent SDKs. This is currently under investigation.
Thanks for the details.
|
gharchive/issue
| 2024-07-19T15:14:36 |
2025-04-01T04:55:27.527579
|
{
"authors": [
"YvesJo",
"mdewinne"
],
"repo": "OP-TED/eForms-SDK",
"url": "https://github.com/OP-TED/eForms-SDK/issues/980",
"license": "CC-BY-4.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2605469866
|
Allow custom base types for structures with optional fields
This a proposed solution to the issue #192.
Extends the templates with a data type template that allows a structure with optional fields to subtype from a custom structure. The template is a copy of the ClassWithOptionalFields.cs with a couple of changes, mainly caused by replacing virtual with override.
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution. You have signed the CLA already but the status is still pending? Let us recheck it.
Release now has support.
|
gharchive/pull-request
| 2024-10-22T13:30:34 |
2025-04-01T04:55:27.567428
|
{
"authors": [
"CLAassistant",
"larws",
"opcfoundation-org"
],
"repo": "OPCFoundation/UA-ModelCompiler",
"url": "https://github.com/OPCFoundation/UA-ModelCompiler/pull/193",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
216516600
|
database_size is a confusing service
The thresholds sent to database_size are applied to the database growth since the last query; this may be slightly unexpected given the name of the service.
Further, the perf-data gives the absolute database size and not the growth. I was staring at the numbers for a while believing I had found a bug, until I read the documentation. I think it's not appropriate that the thresholds apply to one variable while the actual reported data is another variable. And I even have a use-case where I'd actually like to monitor the absolute database size.
Suggestion: mark database_size as deprecated and split it into two: database_size_growth as a rename of the current database_size (but it should return growth values in perfdata), and database_total_size, database_disk_size or something like that for a service that checks for and returns the actual database size. Check the available disk space on the postgres data partition (if possible) and let percentages in the thresholds apply to the ratio database_size / (database_size + free disk space).
Hi,
I agree this service is quite confusing.
Unfortunately, checking the available disk space on the postgres data partition is not possible in a clean or portable way. Moreover, checking the available space/partition size is not a job for a PostgreSQL tool but for a system tool that should be deployed anyway (e.g. check_disk :)).
I'm fine with the idea of renaming database_size to database_size_delta (the db can grow...or shrink :)) and creating a new service database_size which applies to the actual absolute db size.
Right, on second thought the "ratio of database size towards database size plus free disk space" probably doesn't make much sense anyway. I would like to use a percentage there rather than a fixed size, though. The ratio of database size versus the size of all databases could probably work, though in many settings it's always close to 100%. Well, I guess this is yet another threshold variable that has to be hand-tuned, where it's meaningless to try to pick some default fire-and-forget value :-)
+1 for adding a check on the database full size
I propose to allow something like -w "1%,100GB" -c "2%,150GB" to keep compatibility with the current behaviour and to get an alarm when (slowly) going above some threshold?
I'm ok with @Krysztophe's proposal, but not with the syntax. Currently, a raw number, a percentage, or a size are all accepted as the size variation, so it is not clear whether a given size applies to the total size or to the variation. Same for a percentage or raw number.
Maybe we should pick something like -w "delta=1%,size=100GB" -c "delta=10GB,size=150GB". size would only allow a size or a raw number.
Plus, we should probably add a perfdata about the size variation of each database.
Extending the current database_size avoids some code duplication and avoids breaking compatibility with existing setups.
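A sketch of how the delta=/size= syntax proposed above could be parsed. check_pgactivity itself is written in Perl, so this Python is illustrative only, and the key names are just the ones from the discussion:

```python
import re

_UNITS = {"B": 1, "KB": 1024, "MB": 1024**2, "GB": 1024**3, "TB": 1024**4}

def parse_size_threshold(spec):
    """Parse a threshold such as 'delta=1%,size=100GB' into a dict.

    Each value becomes ('percent', float) or ('bytes', int); 'size' only
    accepts an absolute size or raw number, as suggested in the thread.
    """
    result = {}
    for part in spec.split(","):
        key, _, raw = part.partition("=")
        key, raw = key.strip(), raw.strip()
        if raw.endswith("%"):
            if key == "size":
                raise ValueError("'size' does not accept a percentage")
            result[key] = ("percent", float(raw[:-1]))
        else:
            m = re.fullmatch(r"(\d+(?:\.\d+)?)\s*([KMGT]?B)?", raw)
            if m is None:
                raise ValueError("unparsable threshold: %r" % raw)
            value, unit = m.groups()
            result[key] = ("bytes", int(float(value) * _UNITS[unit or "B"]))
    return result

print(parse_size_threshold("delta=1%,size=100GB"))
# {'delta': ('percent', 1.0), 'size': ('bytes', 107374182400)}
```

Tagging each value with its kind is what removes the ambiguity the plain "1%,100GB" form would have kept.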
I need reviewers on PR #225.
Done!
Thanks,
|
gharchive/issue
| 2017-03-23T17:48:41 |
2025-04-01T04:55:27.590014
|
{
"authors": [
"Krysztophe",
"ioguix",
"tobixen"
],
"repo": "OPMDG/check_pgactivity",
"url": "https://github.com/OPMDG/check_pgactivity/issues/98",
"license": "PostgreSQL",
"license_type": "permissive",
"license_source": "github-api"
}
|
1875890
|
Platform-Plugin: Table-Only (headless) mode
It would be great if saiku could be called via url in a mode that only renders the crosstab.
It would be a nice addition for cdf-dashboards.
done
?plugin=true&mode=view#query/open/{queryName} will open a query and just display the result. with options like swap axes etc. plugin=true would be biplugin=true in the biserver
mode=table is the same as above except the query toolbar is removed as well
|
gharchive/issue
| 2011-10-11T15:02:30 |
2025-04-01T04:55:27.617226
|
{
"authors": [
"buggtb",
"pstoellberger"
],
"repo": "OSBI/saiku-ui",
"url": "https://github.com/OSBI/saiku-ui/issues/108",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
136381159
|
Feat flexovl elster flex 1586
FLEX-1586
SonarQube analysis reported 1 issue:
1 major
Watch the comments in this conversation to review them.
All is well.
|
gharchive/pull-request
| 2016-02-25T12:56:20 |
2025-04-01T04:55:27.626305
|
{
"authors": [
"bjornamkreutz",
"jenkins-ip-10-4-24-184"
],
"repo": "OSGP/Shared",
"url": "https://github.com/OSGP/Shared/pull/69",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
1190732985
|
use of doc strings in notebooks for visualization of test results
The notebooks that visualize the test results have background on the test data, tolerances, etc. These are currently copy-pasted (with edits) from the test files. This could probably be made more efficient and automated by using doc strings in the test files.
This could also help with adding remarks to the notebooks, i.e. we have a 'notes' section in each notebook. We could use this to indicate where we needed modifications to the test data to get it to run for each contribution. This might also be relevant information when you want to use the code.
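A sketch of the automation suggested here: pull each test function's docstring out of a module's source with the stdlib ast module, so the notebook can render it instead of relying on copy-paste. All names below are hypothetical:

```python
import ast

def collect_test_docstrings(source):
    """Collect the docstring of every top-level test_* function in a test
    module's source, keyed by function name -- ready to be rendered in a
    results notebook instead of copy-pasting the background text."""
    tree = ast.parse(source)
    docs = {}
    for node in tree.body:
        if isinstance(node, ast.FunctionDef) and node.name.startswith("test_"):
            docs[node.name] = ast.get_docstring(node) or ""
    return docs

# Illustrative test module source; the docstring text is made up.
sample = '''
def test_tofts_model():
    """Tolerance: 0.1% on Ktrans; synthetic test data."""
    pass
'''
print(collect_test_docstrings(sample))
# {'test_tofts_model': 'Tolerance: 0.1% on Ktrans; synthetic test data.'}
```

The notebook would then read each test file once and display the returned docstrings in its background and notes sections.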
|
gharchive/issue
| 2022-04-02T20:32:20 |
2025-04-01T04:55:27.643865
|
{
"authors": [
"petravanhoudt"
],
"repo": "OSIPI/DCE-DSC-MRI_CodeCollection",
"url": "https://github.com/OSIPI/DCE-DSC-MRI_CodeCollection/issues/72",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
64643983
|
Add ttl to SOA records in builds
It seems like SOA records have two different ways of being built... the usual bind_render_record, and zone_builder.render_soa_only. This adds the TTL to the latter.
Resolves https://github.com/OSU-Net/cyder/issues/930
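For illustration only, an SOA render that carries the TTL might look like the sketch below; the field names are generic BIND SOA fields, not cyder's actual zone_builder.render_soa_only signature:

```python
def render_soa_only(zone, ttl, primary, contact, serial,
                    refresh=3600, retry=900, expire=604800, minimum=86400):
    """Render a BIND-style SOA record including an explicit TTL field,
    the value this PR adds to the zone builder's SOA-only output."""
    return ("{zone}.\t{ttl}\tIN\tSOA\t{primary}. {contact}. "
            "({serial} {refresh} {retry} {expire} {minimum})").format(
        zone=zone, ttl=ttl, primary=primary, contact=contact, serial=serial,
        refresh=refresh, retry=retry, expire=expire, minimum=minimum)

print(render_soa_only("example.com", 3600, "ns1.example.com",
                      "hostmaster.example.com", 2015032601))
```

Without the TTL in the SOA-only path, the two build paths would emit inconsistent zone files.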
Is this supposed to be reviewed or not?
i'll deal with this one.
|
gharchive/pull-request
| 2015-03-26T22:05:16 |
2025-04-01T04:55:27.663272
|
{
"authors": [
"akeym",
"drkitty",
"murrown"
],
"repo": "OSU-Net/cyder",
"url": "https://github.com/OSU-Net/cyder/pull/938",
"license": "bsd-3-clause",
"license_type": "permissive",
"license_source": "bigquery"
}
|
631345783
|
Private github repositories are not listed in Github repo selection step
This issue has been migrated from :
https://github.com/mike-goodwin/owasp-threat-dragon/issues/119
and was opened by @rav94 :
The repository was listed only after I changed the repository permissions to public mode. I think this would be an important option to have, because you are providing your GitHub credentials expecting to view your private repos as well. Plus, people would be reluctant to put their threat model diagrams in public.
Definitely we can't make threat models publicly available, Threat Dragon needs to use a private repo.
Agreed @FrankArchitect that this is a limitation of TD, and it should be changed as soon as we can. Do you know what needs to be done with TD + GitHub to provide this? Any suggestions welcome.
The alternative is to use the Threat Dragon desktop version on a locally cloned repo, and then commit and push back up to the private repo when required. I know this is not using the webapp and is a more awkward process, but it may help here.
The desktop release is at :
https://github.com/mike-goodwin/owasp-threat-dragon-desktop/releases/tag/v1.2
This has a comment from @hemedga https://github.com/mike-goodwin/owasp-threat-dragon/commit/305296d10fc1d310613f28503cc050ede802027b#commitcomment-40338515
"this commit was a good shot for threatdragon.org.
However, we are going to self-host it and we are going to use private repositories.
I suggest moving this to an environment variable i.e. process.env.GITHUB_REPO.
We're going to test it in our environment."
|
gharchive/issue
| 2020-06-05T06:27:26 |
2025-04-01T04:55:27.694098
|
{
"authors": [
"FrankArchitect",
"jgadsden"
],
"repo": "OWASP/threat-dragon",
"url": "https://github.com/OWASP/threat-dragon/issues/8",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1868178481
|
Upgrade dependency
Upgrade jsonwebtoken@8.5.1 to version 9.0.1
Please ignore this PR and close it. Generated by TestIM
|
gharchive/pull-request
| 2023-08-26T17:13:43 |
2025-04-01T04:55:27.698190
|
{
"authors": [
"kostya253"
],
"repo": "OX-Security-Demo/Multi-currency-management",
"url": "https://github.com/OX-Security-Demo/Multi-currency-management/pull/222",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2045322266
|
Upgrade dependency
Upgrade jsonwebtoken@8.5.1 to version 9.0.1
Please ignore this PR and close it. Generated by TestIM
|
gharchive/pull-request
| 2023-12-17T18:15:48 |
2025-04-01T04:55:27.699074
|
{
"authors": [
"kostya253"
],
"repo": "OX-Security-Demo/Multi-currency-management",
"url": "https://github.com/OX-Security-Demo/Multi-currency-management/pull/663",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
143280294
|
Multi-file parsing in grid.py does not seem to work
It seems that the multi-file parsing in grid.py does not work. Even when multiple files can be read in the .from_netcdf classmethod, only the last one is returned.
I've made a new branch with a very simple test function to show the issue:
[erik:~/Codes/PARCELScode] python tests/test_multi_filename.py
Generating NEMO grid output with basename: multi_filename0
Generating NEMO grid output with basename: multi_filename1
Generating NEMO grid output with basename: multi_filename2
Generating NEMO grid output with basename: multi_filename3
Generating NEMO grid output with basename: multi_filename4
Generating NEMO grid output with basename: multi_filename5
Grid.time as computed in .from_netcdf [ 0. 86400. 172800. 259200. 345600. 432000.]
Grid.time as returned by .from_netcdf [ 432000.]
Issue fixed; it turned out not to be a bug, but rather a misinterpretation on my part of how to use the Grid.from_netcdf classmethod. Proper usage is
grid = Grid.from_netcdf(filenames, variables, dimensions)
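The misunderstanding generalizes: calling the loader once per file and reassigning the result keeps only the last file's data, while the classmethod is meant to receive the whole file list in one call. A minimal Python sketch of the two patterns, with no parcels dependency (load_one and its fake time axis are hypothetical stand-ins):

```python
# Sketch of why per-file reassignment loses data while one aggregated
# call keeps it. load_one() fakes reading one NetCDF file's time axis.

def load_one(filename):
    index = int(filename[-1])  # hypothetical: file i holds day i
    return [index * 86400.0]

def from_netcdf(filenames):
    # Aggregate every file in a single call, mirroring the intended
    # Grid.from_netcdf(filenames, variables, dimensions) usage.
    times = []
    for f in filenames:
        times.extend(load_one(f))
    return times

files = ["multi_filename%d" % i for i in range(6)]

# Wrong: reassigning inside a loop keeps only the last file
grid = None
for f in files:
    grid = from_netcdf([f])
print(grid)  # [432000.0]

# Right: one call with the full list
print(from_netcdf(files))
```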
|
gharchive/issue
| 2016-03-24T15:38:31 |
2025-04-01T04:55:27.728661
|
{
"authors": [
"erikvansebille"
],
"repo": "OceanPARCELS/parcels",
"url": "https://github.com/OceanPARCELS/parcels/issues/54",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
321530964
|
List Card not found
Hi,
Could you please add samples so that we could try out
https://docs.microsoft.com/en-us/microsoftteams/platform/concepts/cards/cards-reference#list-card
Also i was unable to find the same in botbuilder-teams.d.js. Where is it defined ?
Thanks
Could you please provide an example on how to use the JSON manually ? Or could you please refer me to an example.
Thank you. Got it working.
List Card is not supported in Bot Emulator ?
That's correct - it only works in Teams.
|
gharchive/issue
| 2018-05-09T11:20:48 |
2025-04-01T04:55:27.804375
|
{
"authors": [
"billbliss",
"register9091"
],
"repo": "OfficeDev/BotBuilder-MicrosoftTeams",
"url": "https://github.com/OfficeDev/BotBuilder-MicrosoftTeams/issues/111",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
431524057
|
Is there an official documentation for BotBuilder-MicrosoftTeams?
Hi, I am new to Microsoft Teams and Botbuilder framework. Is there any official documentation and API reference for this botbuilder-teams library?
We don't have a separate SDK documentation for botbuilder-teams (yet).
However, this section of our documentation discusses this in depth: https://docs.microsoft.com/en-us/microsoftteams/platform/concepts/bots/bots-overview
|
gharchive/issue
| 2019-04-10T14:06:01 |
2025-04-01T04:55:27.806000
|
{
"authors": [
"chamathabeysinghe",
"clearab"
],
"repo": "OfficeDev/BotBuilder-MicrosoftTeams",
"url": "https://github.com/OfficeDev/BotBuilder-MicrosoftTeams/issues/188",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
232404992
|
Fix copy-paste error in jsdoc for .fetchMemberList()
It seems the jsdoc was copy-pasted from .fetchChannelList() and the first line wasn't updated
@oneeman,
Thanks for having already signed the Contribution License Agreement. Your agreement was validated by Microsoft. We will now review your pull request.
Thanks,
Microsoft Pull Request Bot
|
gharchive/pull-request
| 2017-05-30T22:51:04 |
2025-04-01T04:55:27.807841
|
{
"authors": [
"msftclas",
"oneeman"
],
"repo": "OfficeDev/BotBuilder-MicrosoftTeams",
"url": "https://github.com/OfficeDev/BotBuilder-MicrosoftTeams/pull/31",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1153766610
|
Tell us more about your awesome experience using MCCA :)
It's great to hear that you enjoyed using our product! Please share your feedback on how we can further improve your MCCA experience.
NOTE: Please do not share any personal or sensitive information related to your organization and/or yourself.
Tell us about:
Why do you use MCCA?
What do you like about MCCA?
What do you want to see more of?
What role (admin, leadership, engineer, etc.) do you belong to, and which other stakeholders in your org use MCCA?
Any other feedback that you want us to work on.
Interested in discussing more?
Leave your email address and we will reach out to you or drop us an email at mccahelp@microsoft.com.
Good day karashah1. I like MCCA a lot. It makes it easier to show people what they have to OK for me to apply in Compliance. My position is Compliance Specialist.
What I believe is necessary is to be able to download the report and share it in Teams or by email. At the moment I have to show it on screen.
The print is not in color, which is OK, but not as good as the color version.
niek@m365compliance.com
@NiekM51 Thanks for taking the time to share feedback!
To your feature request - We are able to print in colour (screenshot below). Are you sure your default print settings are not set to black & white ? Alternatively, you can directly share the HTML report via email/teams as well.
Hi Karashah1, I have looked at my settings and cannot see where the problem may lie. How can I share or email the document directly?
The report is great as it pulls together a lot of information and provides a nice summary and next level of detail. It's a great tool for Microsoft partners as a report to run before, during, and post delivery of a project.
What I'd like to see is an option to export the results in a json file. The option could either have the output only be HTML or Json, or maybe it produces both: -filetype json,html
I could then store these results within blob storage and then use Power BI to show trends over time. HTML is great when you view it one at a time, but makes it harder if I want to compare the summary scores over a period of time. Let's say I automate running the report monthly, and I want to see how we've improved (based on the nice recommendations) during the last six months.
Hi Conrad
This is a great idea. I concur totally with this.
|
gharchive/issue
| 2022-02-28T07:54:55 |
2025-04-01T04:55:27.829487
|
{
"authors": [
"NiekM51",
"conradagramont",
"karashah1"
],
"repo": "OfficeDev/MCCA",
"url": "https://github.com/OfficeDev/MCCA/issues/39",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
106877246
|
PickaDate removal from DatePicker
This addresses #99. Microsoft copyright is retained in the plugin while PickaDate is moved to another file that is loaded in.
I'm going to update this pull request to take out commits from another branch.
Closing this pull request because of unintended changes brought in from another branch.
|
gharchive/pull-request
| 2015-09-16T22:34:29 |
2025-04-01T04:55:27.831056
|
{
"authors": [
"wdo3650"
],
"repo": "OfficeDev/Office-UI-Fabric",
"url": "https://github.com/OfficeDev/Office-UI-Fabric/pull/136",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
120490791
|
Wdo3650/panel commanding usability
Updated with usability enhancements.
Changes:
-Command bar animates down
-First button margin, padding and color changes
-Other buttons in command bar match hover and active styles from ODN usability change
Looks good however the commanding seems broken and not selectable on small screens. Probably a pointer events issue. We might want to add some space between the buttons on small screens as well.
THIS IS APPROVED, PULL REQUEST CAN BE COMPLETED IF USER DESIRES THIS RESULT. THANK YOU FOR YOUR WORK, MAY TEH FORCE BE WITH YOU.
|
gharchive/pull-request
| 2015-12-04T21:35:38 |
2025-04-01T04:55:27.832807
|
{
"authors": [
"battletoilet",
"philworthington",
"wdo3650"
],
"repo": "OfficeDev/Office-UI-Fabric",
"url": "https://github.com/OfficeDev/Office-UI-Fabric/pull/226",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
745753551
|
"error":"This service is not available in your location."
Hi
I'm running the step-by-step tutorial, and once I run the command [npm run deploy] and log in to AAD, I get the following exception:
Fetching zip upload url ...
API response error: {"error":"This service is not available in your location.","correlationId":"40846ac3-b508-476c-ae41-588c58462d58","serverCorrelationId":"226cb7c7-0370-4e8a-a9d5-34b156172bb7"}
Failed to get zip upload url!
Any guidance on how to move forward from here?
Thanks
The Checklist app is currently not supported for customers whose M365 billing region is Canada, South Africa, or the United Arab Emirates, because of data residency issues.
Has anything changed since this was first raised almost a year ago? I am still getting the same issue when trying to test the sample Teams app code.
|
gharchive/issue
| 2020-11-18T15:21:23 |
2025-04-01T04:55:27.876437
|
{
"authors": [
"elbruno",
"pcox-outreach",
"yogesh-MS"
],
"repo": "OfficeDev/microsoft-teams-app-checklist",
"url": "https://github.com/OfficeDev/microsoft-teams-app-checklist/issues/6",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1380005599
|
Company Communicator does not load or open in desktop Teams
Good day, I have a problem with Company Communicator: when I open it from my desktop Teams to send a communication, it remains blank and never loads or opens. What could cause this error?
Hi, problem solved.
|
gharchive/issue
| 2022-09-20T21:16:41 |
2025-04-01T04:55:27.877434
|
{
"authors": [
"maye-vargas"
],
"repo": "OfficeDev/microsoft-teams-apps-company-communicator",
"url": "https://github.com/OfficeDev/microsoft-teams-apps-company-communicator/issues/880",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1138094109
|
Config changes to reflect the default branch from master to main. Please merge to complete the default branch renaming job.
Auto branch renaming content.
Docs Build status updates of commit 922ed66:
:white_check_mark: Validation status: passed
File: .openpublishing.publish.config.json | Status: :white_check_mark: Succeeded
For more details, please refer to the build report.
Note: Broken links written as relative paths are included in the above build report. For broken links written as absolute paths or external URLs, see the broken link report.
For any questions, please try searching the docs.microsoft.com contributor guides or post your question in the Docs support channel.
|
gharchive/pull-request
| 2022-02-15T02:28:30 |
2025-04-01T04:55:27.882934
|
{
"authors": [
"olprod",
"opbld31"
],
"repo": "OfficeDev/office-js-docs-pr.fr-fr",
"url": "https://github.com/OfficeDev/office-js-docs-pr.fr-fr/pull/3",
"license": "CC-BY-4.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
284726364
|
Cannot append 'svg' properly with 'ReactFauxDOM'
I have the following ReactFauxDom implementation:
import React, { Component } from 'react';
import * as d3 from 'd3';
import ReactFauxDOM from 'react-faux-dom';
class Animate extends Component {
render() {
const w = window.innerWidth;
const h = window.innerHeight;
var fauxRoot = ReactFauxDOM.createElement('div');
let nodes = d3.range(200).map(function() {
return {r: Math.random() * 12 + 4};
});
let root = nodes[0];
let color = d3.scaleOrdinal().range(d3.schemeCategory10);
root.radius = 0;
root.fixed = true;
const forceX = d3.forceX(w / 2).strength(0.015);
const forceY = d3.forceY(h / 2).strength(0.015);
let force = d3.forceSimulation()
.velocityDecay(0.2)
.force('x', forceX)
.force('y', forceY)
.force('collide', d3.forceCollide().radius(function(d){
if (d === root) {
return Math.random() * 50 + 100;
}
return d.r + 2;
}).iterations(5))
.nodes(nodes).on('tick', ticked);
let svg = d3.select(fauxRoot).append('svg')
.attr('width', w)
.attr('height', h);
svg.selectAll('circle')
.data(nodes.slice(1))
.enter().append('circle')
.attr('r', function(d) { return d.r; })
.style('fill', function(d, i) { return color(i % 3); });
function ticked(e) {
svg.selectAll('circle')
.attr('cx', function(d) { return d.x; })
.attr('cy', function(d) { return d.y; });
};
svg.on('mousemove', function() {
const p1 = d3.mouse(this);
root.fx = p1[0];
root.fy = p1[1];
force.alphaTarget(0.3).restart();//reheat the simulation
});
return fauxRoot.toReact();
}
}
// indicate which class can be exported, and instantiated via 'require'
export default Animate;
It renders as follows:
But, when I change the above's d3.select(fauxRoot).append('svg') to d3.select('body').append('svg'), it incorrectly renders, duplicated svg's, at the bottom of the <body>:
The following is what I'm trying to get it to look like:
https://jsfiddle.net/jeff1evesque/hgcuy9mc/
Note: the original source code was written with d3.js (version 3). Its update doesn't have the nice effect that the gravity callback provided. More specifically, when the mouse approaches the aggregated circles, they should collectively get repelled. This is less so with the adjusted jsfiddle's version 4 update.
I adjusted my code to use animateFauxDom:
import React from 'react'
import * as d3 from 'd3'
import {withFauxDOM} from 'react-faux-dom'
class MyReactComponent extends React.Component {
componentDidMount() {
const w = window.innerWidth;
const h = window.innerHeight;
const faux = this.props.connectFauxDOM('svg', 'collision');
let nodes = d3.range(200).map(function () {
return { r: Math.random() * 12 + 4 };
});
let root = nodes[0];
let color = d3.scaleOrdinal().range(d3.schemeCategory10);
root.radius = 0;
root.fixed = true;
const forceX = d3.forceX(w / 2).strength(0.015);
const forceY = d3.forceY(h / 2).strength(0.015);
var svg = d3.select(faux)
.attr('width', w)
.attr('height', h)
.append('g');
svg.selectAll('circle')
.data(nodes.slice(1))
.enter()
.append('circle')
.attr('r', function (d) { return d.r; })
.style('fill', function (d, i) { return color(i % 3); });
root.radius = 0;
root.fixed = true;
function ticked(e) {
svg.selectAll('circle')
.attr('cx', function (d) { return d.x; })
.attr('cy', function (d) { return d.y; });
};
let force = d3.forceSimulation()
.velocityDecay(0.2)
.force('x', forceX)
.force('y', forceY)
.force('collide', d3.forceCollide().radius(function (d) {
if (d === root) {
return Math.random() * 50 + 100;
}
return d.r + 2;
}).iterations(5))
.nodes(nodes).on('tick', ticked);
svg.on('mousemove', function () {
root.fx = d3.event.pageX;
root.fy = d3.event.pageY;
force.alphaTarget(0.3).restart();//reheat the simulation
this.props.animateFauxDOM(3500);
});
this.props.animateFauxDOM(3500);
}
render() {
return (
<div>{this.props.collision}</div>
)
}
}
MyReactComponent.defaultProps = {
collision: 'loading'
}
export default withFauxDOM(MyReactComponent)
However, it seems to only last for a few seconds before displaying an Uncaught TypeError:
Got the animation to work without implementing react-faux-dom:
import React, { Component } from 'react';
import ReactDOM from 'react-dom';
import * as d3 from 'd3';
class AnimateCollisions extends React.Component {
constructor() {
super();
const nodes = this.generateNodes(200);
this.state = {
colors: d3.scaleOrdinal().range(d3.schemeCategory10),
nodes: nodes,
root: nodes[0],
width: window.innerWidth,
height: window.innerHeight,
alpha_target: .4,
iterations: 4,
velocity_decay: .1,
forceX: d3.forceX(window.innerWidth / 2).strength(0.015),
forceY: d3.forceY(window.innerHeight / 2).strength(0.015),
}
this.getColor = this.getColor.bind(this);
this.generateNodes = this.generateNodes.bind(this);
this.storeForce = this.storeForce.bind(this);
this.renderD3 = this.renderD3.bind(this);
}
componentDidMount() {
this.renderD3();
}
generateNodes(range_limit) {
return [...Array(range_limit).keys()].map(function() {
return { r: Math.random() * 12 + 4 };
});
}
storeForce(force) {
this.setState({force: force});
}
getColor(i) {
return this.state.colors(i % 3);
}
renderD3() {
const nodes = this.state.nodes;
const forceX = this.state.forceX;
const forceY = this.state.forceY;
const root = nodes[0];
const svg = d3.select(ReactDOM.findDOMNode(this.refs.animation));
const alpha = this.state.alpha_target;
const iterations = this.state.iterations;
root.radius = 0;
root.fixed = true;
svg.selectAll('circle')
.data(nodes.slice(1))
.enter();
function ticked(e) {
svg.selectAll('circle')
.attr('cx', function(d) { return d.x; })
.attr('cy', function(d) { return d.y; });
};
svg.on('mousemove', function() {
const p1 = d3.mouse(this);
root.fx = p1[0];
root.fy = p1[1];
force.alphaTarget(alpha).restart();//reheat the simulation
});
let force = d3.forceSimulation()
.velocityDecay(this.state.velocity_decay)
.force('x', forceX)
.force('y', forceY)
.force('collide', d3.forceCollide().radius(function(d) {
if (d === root) {
return Math.random() * 50 + 100;
}
return d.r + 2;
}).iterations(iterations))
.nodes(nodes).on('tick', ticked);
this.storeForce(force);
}
render() {
// use React to draw all the nodes, d3 calculates the x and y
const nodes = this.state.nodes.slice(1).map((node, index) => {
const color = this.getColor(index);
return (
<circle
fill={color}
cx={node.x}
cy={node.y}
r={node.r}
key={`circle-${index}`}
/>
);
});
return (
<svg width={this.state.width} height={this.state.height}>
<g ref='animation'>{nodes}</g>
</svg>
)
}
}
export default AnimateCollisions;
This is my first D3v4 attempt. If anyone has advice on how to make my animation better, or smoother, please let me know:
https://codesandbox.io/s/ly5zx2p4ym
|
gharchive/issue
| 2017-12-27T13:44:43 |
2025-04-01T04:55:27.999448
|
{
"authors": [
"jeff1evesque"
],
"repo": "Olical/react-faux-dom",
"url": "https://github.com/Olical/react-faux-dom/issues/126",
"license": "unlicense",
"license_type": "permissive",
"license_source": "bigquery"
}
|
2681337928
|
Fix storage cache persistance across build configs
Currently, the storage cache will persist between different build configurations. For example, if a developer compiles in debug mode, then switches to release mode, the cache will hold onto and emit the debug code.
It's more complicated. The storage cache will store the code compiled with O3 (that's what we do by default), either in LLVM IR or object format. We need to define what the expected functionality should be. Also, what does debug mode mean in terms of optimization flags and debug info? Are you referring to CMake build types?
Tagging @koparasy to triage this issue
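One common way to keep a compilation cache from serving entries across build configurations is to fold the configuration (optimization flags, debug info) into the cache key. Proteus itself is C++, so this Python sketch only illustrates the idea; all names are hypothetical:

```python
import hashlib

def cache_key(source_id, opt_flags, debug_info):
    # Hash every input that affects the emitted code, so a debug-mode
    # lookup can never hit a stale -O3 entry.
    h = hashlib.sha256()
    for part in (source_id, opt_flags, str(debug_info)):
        h.update(part.encode())
        h.update(b"\0")  # separator so "a"+"bc" != "ab"+"c"
    return h.hexdigest()

cache = {cache_key("kernel.cpp", "-O3", False): "release object code"}

# A debug build with different flags misses instead of reusing -O3 code.
print(cache_key("kernel.cpp", "-O0 -g", True) in cache)  # False
```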
|
gharchive/issue
| 2024-11-21T23:42:47 |
2025-04-01T04:55:28.007650
|
{
"authors": [
"ggeorgakoudis",
"johnbowen42"
],
"repo": "Olympus-HPC/proteus",
"url": "https://github.com/Olympus-HPC/proteus/issues/37",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2422698675
|
corn simulation
Hi,
I am trying to run pySTICS for corn simulation.
I have taken the "run_pystics_simulation_stics_example.ipynb" and changed species and variety to:
species = 'corn'
variety = 'Cherif'
where 'Cherif' is a variety present in "corn_plt.xml"
the "parametrization_from_stics_example_files" function retrieves the different example parameters correctly, but the execution fails in run_pystics_simulation, where I get the error below:
"""
in senescence_stress(lev_i, ulai, vlaimax, temp_min_prev, tgeljuv10, tgeljuv90, tgelveg10, tgelveg90, tletale, tdebgel, codgeljuv, codgelveg)
else:
fstressgel = 1
return fstressgel
"""
UnboundLocalError: cannot access local variable 'fstressgel' where it is not associated with a value
I would appreciate any insights into resolving this issue. (When switching back to wheat and Talent, everything works fine.)
Best regards
As a workaround I modified the function below by adding an initial value for the return variable fstressgel, which seems to have corrected the issue:
def senescence_stress(lev_i, ulai, vlaimax, temp_min_prev, tgeljuv10, tgeljuv90, tgelveg10, tgelveg90, tletale, tdebgel, codgeljuv, codgelveg):
    fstressgel = 1  # added initialization so the variable is always bound
    if lev_i > 0:
        # Frost stress
        if ulai < vlaimax:  # before stage iamf --> fgeljuv
            if codgeljuv == 2:
                fstressgel = frost_stress(temp_min_prev, tgeljuv90, tgeljuv10, tletale, tdebgel)
        else:  # after stage iamf --> fgelveg
            if codgelveg == 2:
                fstressgel = frost_stress(temp_min_prev, tgelveg90, tgelveg10, tletale, tdebgel)
            else:
                fstressgel = 1
    return fstressgel
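The failure mode is a general Python pitfall: a variable assigned only inside conditional branches raises UnboundLocalError when no branch executes. This self-contained sketch (simplified logic, with names only mirroring the issue) shows the bug and the init-first fix:

```python
def stress_buggy(lev_i, frost):
    if lev_i > 0:
        if frost:
            fstressgel = 0.5
        else:
            fstressgel = 1
    return fstressgel  # UnboundLocalError when lev_i <= 0

def stress_fixed(lev_i, frost):
    fstressgel = 1  # no-stress default bound before any branching
    if lev_i > 0 and frost:
        fstressgel = 0.5
    return fstressgel

print(stress_fixed(0, False))  # 1
try:
    stress_buggy(0, False)
except UnboundLocalError:
    print("buggy version raises UnboundLocalError")
```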
hey @edericqu
Thanks for your warning !
I never had this issue on wheat, but I saw the same kind of problem on other modules if the output has not been initialized and when none of the condition is verified. I think your fix is fine, I will add it to the new version released in few days
pySTICS has not been tested and validated on corn yet, so take your precautions when interpreting your results. Validated species will be gradually mentioned in the readme
OK, thanks for your warning @etienneperez, that might explain why some parameters had very little influence in my initial tests.
Are the weather and soil modules validated in a general manner, or do they need to be adapted to each crop?
The options implemented in pySTICS seem validated in our first tests on wheat, we are currently testing other determinate growth crops and forage crops to see if specific formalisms are needed (for example, codlainet code for barley and cutting for forage crops were implemented). @edericqu
|
gharchive/issue
| 2024-07-22T12:01:03 |
2025-04-01T04:55:28.013687
|
{
"authors": [
"edericqu",
"etienneperez"
],
"repo": "OmbreaPV/pySTICS",
"url": "https://github.com/OmbreaPV/pySTICS/issues/6",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
95815051
|
Watershed mapgen BUGS ! #2
Other bugs about the new watershed mapgen need to be checked/fixed
[x] The papyrus isn't generated on the new mapgen => after check, yes they spawn
[x] The papyrus needs to be generated and grow on default:dirt_with_grass, default:dirt, default:sand, default:desert_sand
[ ] Pyramids spawn in a very strange and bugged way...
And are generated underground?!
Are they cut off if they exceed +110?
[ ] I can see many small ships underwater, but I haven't seen big ships (built with stone); this point needs checking
@crabman77 @gravgun @Gael-de-Sailly, could someone check this little point? Thank you in advance
@Wuzzy2 will maybe be interested in these [pyramids] bugs with the [watershed] Lua mapgen
I can confirm the papyrus grows on dirt only.
I just want to remind you that I am not available for a week; this is probably my only Internet connection before August 1st.
I am sorry, but you will have to take care of it yourselves.
No worries @Gael-de-Sailly!
@gravgun, @crabman77 => did your latest changes fix the pyramid bugs?
No idea, I haven't changed anything.
@Ombridride 41b34841021efc63b8626ac62bad763ed21a6155 delayed Pyramids generation, which will most likely fix the current issues as I guess they're related to timing (since the code is pretty clean). The only "bug" that's actually a(n "overexpressed") feature is pyramids generating underground.
Everything is ok, closed this issue
|
gharchive/issue
| 2015-07-18T11:04:02 |
2025-04-01T04:55:28.020199
|
{
"authors": [
"Cyberpangolin",
"Gael-de-Sailly",
"Ombridride",
"crabman77",
"gravgun"
],
"repo": "Ombridride/minetest-minetestforfun-server",
"url": "https://github.com/Ombridride/minetest-minetestforfun-server/issues/136",
"license": "unlicense",
"license_type": "permissive",
"license_source": "bigquery"
}
|
1092322858
|
Test 04 (deep dag): Issues with multiprocessing
Hi I have problems to get test_04_deep_dag() running correctly.
When running as is, it results into an AttributeError. The error message states problems with local objects, which is common for multiprocessing:
AttributeError: Can't pickle local object 'DagTestCase.test_04_deep_dag.<locals>.run_test'
The line it fails is: https://github.com/OmenApps/django-postgresql-dag/blob/20c47971a56c62aed3616d4d29fafcfb79204efd/tests/test.py#L625
I'm running this on Windows. As far as I know, Windows has pickling issues because multiprocessing uses the spawn start method there. Maybe there is an alternative to using multiprocessing? One also cannot use signal, as it is missing some functionality on Windows.
@Nix3c3r Thank you for pointing this out. I need to add some better CI for this repo to check for these sorts of issues, and will see about how I can resolve this. I've added this to my to-do list.
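The error is a general CPython constraint rather than only a Windows quirk: multiprocessing's spawn start method (the default on Windows) pickles the target callable, and functions defined inside another function cannot be pickled. Moving the worker to module level is the usual fix; this sketch (not the project's actual test code) demonstrates the difference with pickle directly:

```python
import pickle

def run_test(depth):
    # Module-level function: picklable by reference, so spawn-based
    # multiprocessing can ship it to a worker process.
    return depth * 2

def make_local():
    def run_test_local(d):  # local object: cannot be pickled
        return d * 2
    return run_test_local

print(pickle.loads(pickle.dumps(run_test))(21))  # 42
try:
    pickle.dumps(make_local())
except (pickle.PicklingError, AttributeError):
    print("local object cannot be pickled")
```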
|
gharchive/issue
| 2022-01-03T08:42:49 |
2025-04-01T04:55:28.026935
|
{
"authors": [
"Nix3c3r",
"OmenApps"
],
"repo": "OmenApps/django-postgresql-dag",
"url": "https://github.com/OmenApps/django-postgresql-dag/issues/11",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
823736433
|
Trouble to connect with unity
I downloaded and set up Unity properly. Then I wanted to use VS Code to write C# for Unity and began to link them. Everything works properly except the word prediction, and I saw this issue in my OUTPUT terminal:
Starting OmniSharp server at 06/03/2021 à 21:03:42
Target: d:\Unity code\Projet\Test\Test.sln
[ERROR] Error: spawn cmd ENOENT
I tried every solution I found on the net (such as adding C:\Windows\System32 to the Path value, but that changed nothing), so I need help.
This is the OmniSharp server repo; if you have a problem with the C# extension, please open an issue at https://github.com/OmniSharp/omnisharp-vscode and provide the necessary details (OmniSharp log, repro steps and so on). Thanks.
|
gharchive/issue
| 2021-03-06T20:10:26 |
2025-04-01T04:55:28.052708
|
{
"authors": [
"VidsSkids",
"filipw"
],
"repo": "OmniSharp/omnisharp-roslyn",
"url": "https://github.com/OmniSharp/omnisharp-roslyn/issues/2109",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
356049508
|
Don't duplicate find symbols results from C# files
Fixes https://github.com/OmniSharp/omnisharp-roslyn/issues/1281
@mholo65 Done; thanks. It does kind of look like this has been present for as long as cake...
@mholo65 @DustinCampbell Another approach could be to simply remove the "cake" flavored handler. Ctrl-t in VS Code doesn't specify a language, so the C# extension doesn't pass one to OmniSharp, thus running both handlers.
However, the symbol provider (https://code.visualstudio.com/docs/extensionAPI/vscode-api#DocumentSymbolProvider) gets access to the current document, so we could specify a language in the request based on the current document. Of course, that means you could only search for Cake symbols from a Cake file and vice versa. Thoughts?
I think there should be a single FindSymbolsHandler that lives in the OmniSharp.Roslyn.CSharp project that returns results from the entire workspace for all project systems. The cake handlers really just translate locations don't they? This could be done by calling into a method exposed by the project system from the findsymbols endpoint.
In the interest of being able to release today, I'm going to merge this PR. I filed https://github.com/OmniSharp/omnisharp-roslyn/issues/1287 to track cleaning this up further.
|
gharchive/pull-request
| 2018-08-31T16:37:20 |
2025-04-01T04:55:28.057471
|
{
"authors": [
"DustinCampbell",
"rchande"
],
"repo": "OmniSharp/omnisharp-roslyn",
"url": "https://github.com/OmniSharp/omnisharp-roslyn/pull/1282",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
422818672
|
[WIP] Fix renaming in LSP
Blocked by #1411
Still random issues when renaming to a shorter name (the change is too long). Seems to be related to the fact that rename changes are not applied in the server.
//cc @david-driscoll
Ready for review! Everything works ok now!
|
gharchive/pull-request
| 2019-03-19T16:04:18 |
2025-04-01T04:55:28.059030
|
{
"authors": [
"mholo65"
],
"repo": "OmniSharp/omnisharp-roslyn",
"url": "https://github.com/OmniSharp/omnisharp-roslyn/pull/1423",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
987225195
|
Add Custom .NET CLI support to OmniSharp
Add a configuration mechanism to allow users to point OmniSharp to a custom .NET CLI, with a fallback to the DOTNET_ROOT environment variable before falling back to the PATH.
I've updated the tests to use this functionality for locating the .NET CLI instead of using the old test-only code-path.
As this is my first contribution to Omnisharp, I'm not quite sure how this exposes the new setting through to omnisharp.json, so if anyone has any pointers on that, they would be well appreciated.
cc: @333fred
FWIW, it kind of looks like this runs before the omnisharp.json files are loaded. ☹
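The lookup chain described in the PR (explicit setting, then DOTNET_ROOT, then PATH) can be sketched as follows. OmniSharp itself is C#; this Python sketch only illustrates the fallback order, and locate_dotnet is a hypothetical name:

```python
import os

def locate_dotnet(configured_path=None, env=None):
    # An explicit user setting wins; otherwise honor DOTNET_ROOT;
    # otherwise trust the OS to resolve `dotnet` via PATH.
    env = os.environ if env is None else env
    if configured_path:
        return configured_path
    root = env.get("DOTNET_ROOT")
    if root:
        return os.path.join(root, "dotnet")
    return "dotnet"

print(locate_dotnet("/opt/dotnet/dotnet", env={}))  # explicit setting wins
print(locate_dotnet(env={"DOTNET_ROOT": "/usr/share/dotnet"}))
print(locate_dotnet(env={}))
```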
|
gharchive/pull-request
| 2021-09-02T22:38:20 |
2025-04-01T04:55:28.061080
|
{
"authors": [
"SamB",
"jkoritzinsky"
],
"repo": "OmniSharp/omnisharp-roslyn",
"url": "https://github.com/OmniSharp/omnisharp-roslyn/pull/2227",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
335187958
|
Microsoft.Build.Exceptions.InvalidProjectFileException: ON MACBOOK
Environment data
.NET Core SDK (reflecting any global.json):
Version: 2.1.301
Commit: 59524873d6
Runtime Environment:
OS Name: Mac OS X
OS Version: 10.13
OS Platform: Darwin
RID: osx.10.13-x64
Base Path: /usr/local/share/dotnet/sdk/2.1.301/
Host (useful for support):
Version: 2.1.1
Commit: 6985b9f684
.NET Core SDKs installed:
2.1.300 [/usr/local/share/dotnet/sdk]
2.1.301 [/usr/local/share/dotnet/sdk]
.NET Core runtimes installed:
Microsoft.AspNetCore.All 2.1.0 [/usr/local/share/dotnet/shared/Microsoft.AspNetCore.All]
Microsoft.AspNetCore.All 2.1.1 [/usr/local/share/dotnet/shared/Microsoft.AspNetCore.All]
Microsoft.AspNetCore.App 2.1.0 [/usr/local/share/dotnet/shared/Microsoft.AspNetCore.App]
Microsoft.AspNetCore.App 2.1.1 [/usr/local/share/dotnet/shared/Microsoft.AspNetCore.App]
Microsoft.NETCore.App 2.1.0 [/usr/local/share/dotnet/shared/Microsoft.NETCore.App]
Microsoft.NETCore.App 2.1.1 [/usr/local/share/dotnet/shared/Microsoft.NETCore.App]
To install additional .NET Core runtimes or SDKs:
https://aka.ms/dotnet-download
VS Code version: Version 1.24.1 (1.24.1)
C# Extension version: 1.15.2
Steps to reproduce
dotnet new console
and then run the project
Expected behavior
should print Hello World!
Actual behavior
Here is the output:
[Running] scriptcs "/Users/mhermac/Google Drive/projects/console/Program.cs"
Unexpected argument: Drive/projects/console/Program.cs
[Done] exited with code=1 in 0.276 seconds
Here is the OmniSharp log:
Starting OmniSharp server at 2018-6-24 17:24:53
Target: /Users/mhermac/Google Drive/projects/console
OmniSharp server started with Mono 5.12.0
Path: /Users/mhermac/.vscode/extensions/ms-vscode.csharp-1.15.2/.omnisharp/1.30.1/omnisharp/OmniSharp.exe
PID: 12761
[info]: OmniSharp.Stdio.Host
Starting OmniSharp on MacOS 10.13.5 (x64)
[info]: OmniSharp.MSBuild.Discovery.MSBuildLocator
Located 1 MSBuild instance(s)
1: StandAlone 15.0 - "/Users/mhermac/.vscode/extensions/ms-vscode.csharp-1.15.2/.omnisharp/1.30.1/omnisharp/msbuild/15.0/Bin"
[info]: OmniSharp.MSBuild.Discovery.MSBuildLocator
MSBUILD_EXE_PATH environment variable set to '/Users/mhermac/.vscode/extensions/ms-vscode.csharp-1.15.2/.omnisharp/1.30.1/omnisharp/msbuild/15.0/Bin/MSBuild.dll'
[info]: OmniSharp.MSBuild.Discovery.MSBuildLocator
Registered MSBuild instance: StandAlone 15.0 - "/Users/mhermac/.vscode/extensions/ms-vscode.csharp-1.15.2/.omnisharp/1.30.1/omnisharp/msbuild/15.0/Bin"
MSBuildExtensionsPath = /usr/local/Cellar/mono/5.12.0.226/lib/mono/xbuild
BypassFrameworkInstallChecks = true
CscToolPath = /Users/mhermac/.vscode/extensions/ms-vscode.csharp-1.15.2/.omnisharp/1.30.1/omnisharp/msbuild/15.0/Bin/Roslyn
CscToolExe = csc.exe
MSBuildToolsPath = /Users/mhermac/.vscode/extensions/ms-vscode.csharp-1.15.2/.omnisharp/1.30.1/omnisharp/msbuild/15.0/Bin
TargetFrameworkRootPath = /usr/local/Cellar/mono/5.12.0.226/lib/mono/xbuild-frameworks
[info]: OmniSharp.Cake.CakeProjectSystem
Detecting Cake files in '/Users/mhermac/Google Drive/projects/console'.
[info]: OmniSharp.Cake.CakeProjectSystem
Could not find any Cake files
[info]: OmniSharp.DotNet.DotNetProjectSystem
Initializing in /Users/mhermac/Google Drive/projects/console
[info]: OmniSharp.DotNet.DotNetProjectSystem
Auto package restore: False
[info]: OmniSharp.DotNet.DotNetProjectSystem
Update workspace context
[info]: OmniSharp.DotNet.DotNetProjectSystem
Resolving projects references
[info]: OmniSharp.MSBuild.ProjectSystem
No solution files found in '/Users/mhermac/Google Drive/projects/console'
[info]: OmniSharp.MSBuild.ProjectManager
Queue project update for '/Users/mhermac/Google Drive/projects/console/console.csproj'
[info]: OmniSharp.Script.ScriptProjectSystem
Detecting CSX files in '/Users/mhermac/Google Drive/projects/console'.
[info]: OmniSharp.Script.ScriptProjectSystem
Could not find any CSX files
[info]: OmniSharp.Stdio.Host
Invoking Workspace Options Provider: OmniSharp.Roslyn.CSharp.Services.CSharpWorkspaceOptionsProvider
[info]: OmniSharp.Stdio.Host
Configuration finished.
[info]: OmniSharp.Stdio.Host
Omnisharp server running using Stdio at location '/Users/mhermac/Google Drive/projects/console' on host 12747.
[info]: OmniSharp.MSBuild.ProjectManager
Loading project: /Users/mhermac/Google Drive/projects/console/console.csproj
[warn]: OmniSharp.MSBuild.ProjectManager
Failed to load project file '/Users/mhermac/Google Drive/projects/console/console.csproj'.
/Users/mhermac/Google Drive/projects/console/console.csproj(1,1)
Microsoft.Build.Exceptions.InvalidProjectFileException: The imported project "/usr/local/Cellar/mono/5.12.0.226/lib/mono/xbuild/15.0/Microsoft.Common.props" was not found. Also, tried to find "15.0/Microsoft.Common.props" in the fallback search path(s) for $(MSBuildExtensionsPath) - "/Library/Frameworks/Mono.framework/External/xbuild/" . These search paths are defined in "/Users/mhermac/.vscode/extensions/ms-vscode.csharp-1.15.2/.omnisharp/1.30.1/omnisharp/OmniSharp.exe.config". Confirm that the path in the declaration is correct, and that the file exists on disk in one of the search paths. /usr/local/share/dotnet/sdk/2.1.301/Sdks/Microsoft.NET.Sdk/Sdk/Sdk.props
at Microsoft.Build.Shared.ProjectErrorUtilities.ThrowInvalidProject (System.String errorSubCategoryResourceName, Microsoft.Build.Shared.IElementLocation elementLocation, System.String resourceName, System.Object[] args) [0x00042] in <61115f75067146fab35b10183e6ee379>:0
at Microsoft.Build.Shared.ProjectErrorUtilities.ThrowInvalidProject (Microsoft.Build.Shared.IElementLocation elementLocation, System.String resourceName, System.Object[] args) [0x00000] in <61115f75067146fab35b10183e6ee379>:0
at Microsoft.Build.Evaluation.Evaluator`4[P,I,M,D].ThrowForImportedProjectWithSearchPathsNotFound (Microsoft.Build.Evaluation.ProjectImportPathMatch searchPathMatch, Microsoft.Build.Construction.ProjectImportElement importElement) [0x000a2] in <61115f75067146fab35b10183e6ee379>:0
at Microsoft.Build.Evaluation.Evaluator`4[P,I,M,D].ExpandAndLoadImports (System.String directoryOfImportingFile, Microsoft.Build.Construction.ProjectImportElement importElement) [0x001be] in <61115f75067146fab35b10183e6ee379>:0
at Microsoft.Build.Evaluation.Evaluator`4[P,I,M,D].EvaluateImportElement (System.String directoryOfImportingFile, Microsoft.Build.Construction.ProjectImportElement importElement) [0x0000d] in <61115f75067146fab35b10183e6ee379>:0
at Microsoft.Build.Evaluation.Evaluator`4[P,I,M,D].PerformDepthFirstPass (Microsoft.Build.Construction.ProjectRootElement currentProjectOrImport) [0x00209] in <61115f75067146fab35b10183e6ee379>:0
at Microsoft.Build.Evaluation.Evaluator`4[P,I,M,D].EvaluateImportElement (System.String directoryOfImportingFile, Microsoft.Build.Construction.ProjectImportElement importElement) [0x0003a] in <61115f75067146fab35b10183e6ee379>:0
at Microsoft.Build.Evaluation.Evaluator`4[P,I,M,D].PerformDepthFirstPass (Microsoft.Build.Construction.ProjectRootElement currentProjectOrImport) [0x000e6] in <61115f75067146fab35b10183e6ee379>:0
at Microsoft.Build.Evaluation.Evaluator`4[P,I,M,D].Evaluate (Microsoft.Build.BackEnd.Logging.ILoggingService loggingService, Microsoft.Build.Framework.BuildEventContext buildEventContext) [0x000f8] in <61115f75067146fab35b10183e6ee379>:0
at Microsoft.Build.Evaluation.Evaluator`4[P,I,M,D].Evaluate (Microsoft.Build.Evaluation.IEvaluatorData`4[P,I,M,D] data, Microsoft.Build.Construction.ProjectRootElement root, Microsoft.Build.Evaluation.ProjectLoadSettings loadSettings, System.Int32 maxNodeCount, Microsoft.Build.Collections.PropertyDictionary`1[T] environmentProperties, Microsoft.Build.BackEnd.Logging.ILoggingService loggingService, Microsoft.Build.Evaluation.IItemFactory`2[S,T] itemFactory, Microsoft.Build.Evaluation.IToolsetProvider toolsetProvider, Microsoft.Build.Evaluation.ProjectRootElementCache projectRootElementCache, Microsoft.Build.Framework.BuildEventContext buildEventContext, Microsoft.Build.Execution.ProjectInstance projectInstanceIfAnyForDebuggerOnly, Microsoft.Build.BackEnd.SdkResolution.ISdkResolverService sdkResolverService, System.Int32 submissionId) [0x00018] in <61115f75067146fab35b10183e6ee379>:0
at Microsoft.Build.Evaluation.Project.Reevaluate (Microsoft.Build.BackEnd.Logging.ILoggingService loggingServiceForEvaluation, Microsoft.Build.Evaluation.ProjectLoadSettings loadSettings) [0x00046] in <61115f75067146fab35b10183e6ee379>:0
at Microsoft.Build.Evaluation.Project.ReevaluateIfNecessary (Microsoft.Build.BackEnd.Logging.ILoggingService loggingServiceForEvaluation, Microsoft.Build.Evaluation.ProjectLoadSettings loadSettings) [0x00034] in <61115f75067146fab35b10183e6ee379>:0
at Microsoft.Build.Evaluation.Project.ReevaluateIfNecessary (Microsoft.Build.BackEnd.Logging.ILoggingService loggingServiceForEvaluation) [0x00000] in <61115f75067146fab35b10183e6ee379>:0
at Microsoft.Build.Evaluation.Project.ReevaluateIfNecessary () [0x00007] in <61115f75067146fab35b10183e6ee379>:0
at Microsoft.Build.Evaluation.Project.Initialize (System.Collections.Generic.IDictionary`2[TKey,TValue] globalProperties, System.String toolsVersion, System.String subToolsetVersion, Microsoft.Build.Evaluation.ProjectLoadSettings loadSettings) [0x00126] in <61115f75067146fab35b10183e6ee379>:0
at Microsoft.Build.Evaluation.Project..ctor (System.String projectFile, System.Collections.Generic.IDictionary`2[TKey,TValue] globalProperties, System.String toolsVersion, System.String subToolsetVersion, Microsoft.Build.Evaluation.ProjectCollection projectCollection, Microsoft.Build.Evaluation.ProjectLoadSettings loadSettings) [0x0009c] in <61115f75067146fab35b10183e6ee379>:0
at Microsoft.Build.Evaluation.Project..ctor (System.String projectFile, System.Collections.Generic.IDictionary`2[TKey,TValue] globalProperties, System.String toolsVersion, Microsoft.Build.Evaluation.ProjectCollection projectCollection, Microsoft.Build.Evaluation.ProjectLoadSettings loadSettings) [0x00000] in <61115f75067146fab35b10183e6ee379>:0
at Microsoft.Build.Evaluation.Project..ctor (System.String projectFile, System.Collections.Generic.IDictionary`2[TKey,TValue] globalProperties, System.String toolsVersion, Microsoft.Build.Evaluation.ProjectCollection projectCollection) [0x00000] in <61115f75067146fab35b10183e6ee379>:0
at Microsoft.Build.Evaluation.ProjectCollection.LoadProject (System.String fileName, System.Collections.Generic.IDictionary`2[TKey,TValue] globalProperties, System.String toolsVersion) [0x000f5] in <61115f75067146fab35b10183e6ee379>:0
at Microsoft.Build.Evaluation.ProjectCollection.LoadProject (System.String fileName, System.String toolsVersion) [0x00000] in <61115f75067146fab35b10183e6ee379>:0
at OmniSharp.MSBuild.ProjectLoader.EvaluateProjectFileCore (System.String filePath) [0x0003e] in <64f2a38974d54ecaaa93e3697628b738>:0
at OmniSharp.MSBuild.ProjectLoader.BuildProject (System.String filePath) [0x0000d] in <64f2a38974d54ecaaa93e3697628b738>:0
at OmniSharp.MSBuild.ProjectFile.ProjectFileInfo.Load (System.String filePath, OmniSharp.MSBuild.ProjectLoader loader) [0x00014] in <64f2a38974d54ecaaa93e3697628b738>:0
at OmniSharp.MSBuild.ProjectManager+<>c__DisplayClass23_0.b__0 () [0x00000] in <64f2a38974d54ecaaa93e3697628b738>:0
at (wrapper delegate-invoke) System.Func`1[System.ValueTuple`2[OmniSharp.MSBuild.ProjectFile.ProjectFileInfo,System.Collections.Immutable.ImmutableArray`1[OmniSharp.MSBuild.Logging.MSBuildDiagnostic]]].invoke_TResult()
at OmniSharp.MSBuild.ProjectManager.LoadOrReloadProject (System.String projectFilePath, System.Func`1[TResult] loadFunc) [0x0001b] in <64f2a38974d54ecaaa93e3697628b738>:0
@formher Do you have mono installed?
Do you have mono installed?
yes.
Mhers-MacBook-Pro:~ mhermac$ mono --version
Mono JIT compiler version 5.12.0.226 (tarball Thu Jun 14 13:20:28 BST 2018)
Copyright (C) 2002-2014 Novell, Inc, Xamarin Inc and Contributors. www.mono-project.com
TLS: normal
SIGSEGV: altstack
Notification: kqueue
Architecture: amd64
Disabled: none
Misc: softdebug
Interpreter: yes
LLVM: supported, not enabled.
GC: sgen (concurrent by default)
@formher How did you install it?
I haven't installed mono specifically, I am guessing it came with .NET SDK ?
@formher Not sure. I suspect that if you install mono (https://www.mono-project.com/download/stable/#download-mac) that this issue might go away.
almost there... now it says..
Mhers-MacBook-Pro:~ mhermac$ mono --version
-bash: /usr/local/bin/mono: No such file or directory
No, it's not there. Installing from https://www.mono-project.com/download/stable/#download-mac seems to have broken the paths or something.
VS Code tries to run it via scriptcs for some reason. Here is the output:
[Running] scriptcs "/Users/mhermac/Google Drive/projects/console/Program.cs"
/usr/local/bin/scriptcs: line 2: mono: command not found
Tagging @filipw is this related to dotnet script at all?
ok the problem is resolved with mono. now the issue is with scriptcs, which I guess is another topic already
thank you
Tagging @filipw is this related to dotnet script at all?
looks like this is a scriptcs bug, not related to OmniSharp anymore. I will close this and if there is any other issue with OmniSharp we can revisit this.
|
gharchive/issue
| 2018-06-24T15:27:55 |
2025-04-01T04:55:28.089902
|
{
"authors": [
"filipw",
"formher",
"rchande"
],
"repo": "OmniSharp/omnisharp-vscode",
"url": "https://github.com/OmniSharp/omnisharp-vscode/issues/2390",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
243775700
|
Switch to download.visualstudio.microsoft.com part 1
This checkin in the first part of the work to switch to using download.visualstudio.microsoft.com.
This checkin does two things:
It replaces the OmniSharp links with CDN links
It adds a new gulp task to update links in package.json. This will allow us to script the insertion process.
@DustinCampbell I will CC you on the debugger packaging script updates where I make use of gulp updatePackageDependencies so you can leverage something similar for OmniSharp if you would like.
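The link-insertion scripting described above could look something like the sketch below. This is a hypothetical illustration of what a `gulp updatePackageDependencies` task might do, not the extension's actual implementation, and the `runtimeDependencies` field name is an assumption:

```javascript
// Hypothetical sketch: rewrite the host of each download URL listed in
// package.json so inserting CDN links can be scripted.
function rewriteDependencyUrls(pkg, newHost) {
    for (const dep of pkg.runtimeDependencies || []) {
        // Replace only the scheme+host portion, keep the path intact.
        dep.url = dep.url.replace(/^https:\/\/[^/]+/, newHost);
    }
    return pkg;
}

// Example: point an OmniSharp download at the Microsoft CDN.
const pkg = {
    runtimeDependencies: [
        { url: "https://omnisharpdownload.example.net/omnisharp-win-x64.zip" }
    ]
};
rewriteDependencyUrls(pkg, "https://download.visualstudio.microsoft.com");
```

After the rewrite, each entry points at the new host while keeping its original file path.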
|
gharchive/pull-request
| 2017-07-18T16:27:30 |
2025-04-01T04:55:28.093673
|
{
"authors": [
"gregg-miskelly"
],
"repo": "OmniSharp/omnisharp-vscode",
"url": "https://github.com/OmniSharp/omnisharp-vscode/pull/1646",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
1172266113
|
Short solution needed: "How to remove specific header" (nginx)
Please help us write most modern and shortest code solution for this issue:
How to remove specific header (technology: nginx)
Fast way
Just write the code solution in the comments.
Prefered way
Create pull request with a new code file inside inbox folder.
Don't forget to use comments to make solution explained.
Link to this issue in comments of pull request.
How to remove specific header
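A minimal sketch using stock nginx proxy directives (removing headers that nginx itself adds, such as Server, additionally requires the third-party headers-more module; the upstream name below is hypothetical):

```nginx
location / {
    proxy_pass http://backend;          # hypothetical upstream

    # Remove a header from the upstream *response* before it is
    # sent to the client:
    proxy_hide_header X-Powered-By;

    # Remove a header from the client *request* before it reaches
    # the upstream (setting it to an empty value drops it):
    proxy_set_header X-Internal-Token "";
}
```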
|
gharchive/issue
| 2022-03-17T11:36:56 |
2025-04-01T04:55:28.129599
|
{
"authors": [
"nonunicorn"
],
"repo": "Onelinerhub/onelinerhub",
"url": "https://github.com/Onelinerhub/onelinerhub/issues/1123",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
821547688
|
Camera.setRange() function only effective if called after Camera.open()
Python Library versions
TauLidarCamera 0.0.2
TauLidarCommon 0.0.2
Camera Info
If possible, paste the output of camera.info() from Python (after camera has been successfully opened)
ToF camera opened successfully:
model: 4.0
firmware: 3.3
uid: 450225
resolution: 160x60
port: /dev/cu.usbmodem00000000001A1
Describe the bug
When calling Camera.setRange() before Camera.open(), the values passed to setRange() have no impact on the depth map colouring
Steps to Reproduce
In the current version of the distance.py example program, comment out line 19, and run the program
Screenshot the depth map output and kill the program
Go back to distance.py and change Camera.setRange(0, 1200) on line 12 to Camera.setRange(0, 7500)
Screenshot the depth map output and kill the program
What is happening?
Following the steps above, you'll see there is no change to the colouring of the depth map
With Camera.setRange(0, 1200) on line 12:
With Camera.setRange(0, 7500) on line 12:
What is expected?
Expecting to see the depth map colouring change when different values are passed to Camera.setRange(0, 7500)
If the program is changed to call Camera.setRange() AFTER Camera.open(), only then does the colouring change:
With Camera.setRange(0, 1200) on line 19:
With Camera.setRange(0, 7500) on line 19:
@vsemi can you comment on this bug? Should we just always call Camera.setRange() AFTER Camera.open()?
The reason is that the open() method calls setDefaultParameters(), which resets any parameters set before open(). This is done to prevent a crash if no parameters have been set and the user performs other operations without setting the necessary parameters first. I could fix this by not setting those 2 static parameters in setDefaultParameters() and giving them static values instead.
@vsemi since the FrameBuilder constructor sets the range and color mode, it should be safe to remove the setColorMode and setRange calls from Camera.setDefaultParameters().
Please try that out and let me know how it goes?
If everything works fine, commit the changes and we'll make a new release
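The behavior discussed above can be reproduced with a minimal self-contained sketch (this mocks the reset logic; it is not the real TauLidarCamera API):

```python
# Sketch of why Camera.setRange() is only effective AFTER Camera.open():
# open() invokes setDefaultParameters(), which overwrites anything
# configured beforehand.
class Camera:
    _range = None

    @classmethod
    def setRange(cls, lo, hi):
        cls._range = (lo, hi)

    @classmethod
    def open(cls):
        cls._setDefaultParameters()  # resets previously configured values

    @classmethod
    def _setDefaultParameters(cls):
        cls._range = (0, 4500)  # arbitrary illustrative default

Camera.setRange(0, 7500)  # set before open(): silently lost
Camera.open()
assert Camera._range == (0, 4500)

Camera.setRange(0, 7500)  # set after open(): takes effect
assert Camera._range == (0, 7500)
```

Removing the setRange/setColorMode calls from setDefaultParameters(), as suggested, would make the pre-open() call survive.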
|
gharchive/issue
| 2021-03-03T22:40:47 |
2025-04-01T04:55:28.138582
|
{
"authors": [
"greenbreakfast",
"vsemi"
],
"repo": "OnionIoT/tau-lidar-camera",
"url": "https://github.com/OnionIoT/tau-lidar-camera/issues/11",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
267658536
|
Provide flow type definitions
Related to #1.
Coverage remained the same at 100.0% when pulling 4a58d56352b8fae529fd82f111fa63df8d6f3bb6 on feature/flow into 4f2dc56cfde9d93b6c542b0638aa67c769f24bdc on master.
Coverage remained the same at 100.0% when pulling d114088fa719a13e296f08d0ddadd6fd8450c6fe on feature/flow into 4f2dc56cfde9d93b6c542b0638aa67c769f24bdc on master.
Coverage remained the same at 100.0% when pulling 880e5fc959781577567549d05c0e032c5295af40 on feature/flow into 4f2dc56cfde9d93b6c542b0638aa67c769f24bdc on master.
|
gharchive/pull-request
| 2017-10-23T13:00:19 |
2025-04-01T04:55:28.157339
|
{
"authors": [
"Oopscurity",
"coveralls"
],
"repo": "Oopscurity/t8on",
"url": "https://github.com/Oopscurity/t8on/pull/13",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
2052487664
|
Remove device and board from cproject schema
Describe The Problem To Be Solved
The schema allows board and device to be specified in the cproject file:
https://github.com/Open-CMSIS-Pack/devtools/blob/eacdd6c4709b5ec65054364f0eba7a5f6a7d6c22/tools/projmgr/schemas/common.schema.json#L750-L751
This adds unnecessary complexity, since the hardware is best specified in the target types.
(Optional): Suggest A Solution
Suggestion: remove the device and board from the cproject schema. Replace this with a mechanism for selecting the target processor for multi-processor devices.
This would make it easier for users to decide where to put the hardware information. It would also make it easier to implement tools that consume the csolution specification.
Downsides include making a breaking change to the schema and minor loss of flexibility. I think these are outweighed by the usability benefits.
This looks like a bug in the schema.
https://github.com/Open-CMSIS-Pack/cmsis-toolbox/blob/main/docs/YML-Input-Format.md#project lists the available options:
board: is not permitted, hence a bug in the schema file
device: should be only used to select a processor core. Could this be improved in the schema?
In a similar and consistent way, this was always allowed for clayers as well.
Should it also be removed there, as the documentation expects?
https://github.com/Open-CMSIS-Pack/cmsis-toolbox/blob/main/docs/YML-Input-Format.md#layer
@RobertRostohar I believe you were experimentally working on board layers using such board or device nodes, would you agree with their removal?
Being a breaking change it should be scheduled for a major version release.
I believe that clayer should have the same features as cproject regarding device (and board), as already mentioned by @ReinhardKeil:
board: not permitted
device: only allowed to select a processor core (Pname)
We could also consider a new name for the device attribute in the cproject and clayer files to make things clearer. E.g., processorName (it's a shame processor is already taken).
We could change that, but maybe this is not required.
At csolution.yml it is possible to specify:
board: <board-name> // specifies the device indirectly via the device defined by the board
device: :<pname> // adds a processor core for a device defined by a board
device: <device-name> // overwrites the board device or specifies the device of a unknown board
device: <device-name>:<pname> // specifies both device and processor core
At cproject.yml or clayer.yml it is possible to specify just:
device: :<pname> // adds a processor core; device is always defined at csolution.yml level
board: not permitted.
I believe this can be reflected in the schema.
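The split above could look like the following sketch, with hypothetical board and device names (the target-type carries the hardware; the project file at most selects a processor core):

```yaml
# csolution.yml: hardware belongs to the target types
solution:
  target-types:
    - type: Board
      board: Vendor::MyBoard          # hypothetical board name
    - type: Device
      device: MyMultiCoreDevice       # hypothetical device name

# cproject.yml: only the processor core may be selected
project:
  device: :cm0plus                    # Pname only; board: not permitted
```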
I just validated with @ducnguyenarm that the uv2csolution conversion tool does not emit the device: or board: node in cproject.yml files.
We still cannot completely rule out that some user has relied on it, therefore it remains a potentially breaking change.
Still my recommendation is to reflect this in the schema; reasons:
IMHO this is a usage error as the documentation has been clear about it.
Schema checks can be disabled to my knowledge.
@jkrech @ReinhardKeil Specifying the full device name under the device node in *.cproject.yml files has always been allowed. A quick search in NXP packs, for example NXP::EVK-MIMXRT1064_BSP@17.0.0, reveals more than a thousand examples using it.
This is potentially a breaking change and requires a major semantic increment.
|
gharchive/issue
| 2023-12-21T14:05:09 |
2025-04-01T04:55:28.209530
|
{
"authors": [
"ReinhardKeil",
"RobertRostohar",
"brondani",
"jkrech",
"mcgordonite"
],
"repo": "Open-CMSIS-Pack/devtools",
"url": "https://github.com/Open-CMSIS-Pack/devtools/issues/1252",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1237411370
|
Reference Handler Not Cloned
Hello,
I am a user of the Convey project and noticed that they use this project for its JSON serialization. I may have found an issue in your Clone code for cloning serialization options.
public static JsonSerializerOptions Clone(this JsonSerializerOptions options)
{
if (options == null) throw new ArgumentNullException(nameof(options));
Contract.EndContractBlock();
var clone = new JsonSerializerOptions
{
AllowTrailingCommas = options.AllowTrailingCommas,
DefaultBufferSize = options.DefaultBufferSize,
DictionaryKeyPolicy = options.DictionaryKeyPolicy,
Encoder = options.Encoder,
IgnoreNullValues = options.IgnoreNullValues,
IgnoreReadOnlyFields = options.IgnoreReadOnlyFields,
IgnoreReadOnlyProperties = options.IgnoreReadOnlyProperties,
IncludeFields = options.IncludeFields,
MaxDepth = options.MaxDepth,
NumberHandling = options.NumberHandling,
PropertyNameCaseInsensitive = options.PropertyNameCaseInsensitive,
PropertyNamingPolicy = options.PropertyNamingPolicy,
ReadCommentHandling = options.ReadCommentHandling,
WriteIndented = options.WriteIndented
};
foreach (var converter in options.Converters)
clone.Converters.Add(converter);
return clone;
}
It looks like you are not cloning the ReferenceHandler option, unless I am missing something. Could this option, along with any other potentially missing options, be added to the clone?
Currently, when I pass a JsonSerializerOptions object with a reference handler set to the JsonSerializerFactory constructor and then call the GetSerializer method, I get back a serializer with the handler option no longer set.
If you need any further details I can try to provide more. Thanks.
https://www.nuget.org/packages/Open.Serialization.Json.System/2.3.3
Thanks for the PR!
|
gharchive/issue
| 2022-05-16T16:34:51 |
2025-04-01T04:55:28.219640
|
{
"authors": [
"electricessence",
"tstrausbaugh-dev"
],
"repo": "Open-NET-Libraries/Open.Serialization",
"url": "https://github.com/Open-NET-Libraries/Open.Serialization/issues/3",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1754030329
|
[Issue confirmation] On 3.3.6, hmget commands can take a very long time; please check whether the latest code can reproduce this issue
Slow query logs:
pika_client_conn.cc:158] table: db0, command: "HMGET" "d3d01a35ee5a87c909bdd60a52187779" "gnid" "gzh_id" "title_vecstr" "btag_json" "store_news_type" "dupid" "kws_title" "tbtag" "title" "tag" "btag" "cluster" "content_length" "content_pic_count" "image" "pdate" "duration" "store_vvctag" "vbtcls" "store_vctag" "s", command_size: 222, arguments: 23, start_time(s): 1682823831, duration(us): 2161166
pika_client_conn.cc:158] table: db0, command: "HMGET" "d3d01a35ee5a87c909bdd60a52187779" "gnid" "gzh_id" "title_vecstr" "btag_json" "store_news_type" "dupid" "kws_title" "pdate" "cluster" "v_qt" "store_lt", command_size: 133, arguments: 13, start_time(s): 1682823831, duration(us): 2004331
Machine configuration:
Linux 693.el7.x86_64 #1 SMP Tue Aug 22 21:09:27 UTC 2017 x86_64 x86_64 x86_64 GNU/Linux
10 cores, 256 GB RAM
月财: Not reproduced.
|
gharchive/issue
| 2023-06-13T04:47:16 |
2025-04-01T04:55:28.264901
|
{
"authors": [
"luky116",
"yaoyinnan"
],
"repo": "OpenAtomFoundation/pika",
"url": "https://github.com/OpenAtomFoundation/pika/issues/1614",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
445667809
|
Uncaught exception edit
For some reason, the CSR dongle error catch was not working properly. Instead, the error gets caught as an uncaught exception, and every uncaught exception closes the Hub.
Commented out the section that would close the Hub app, and I can still connect using a BLED112 dongle attached to my Windows Computer. Users will still get a warning popup window saying no CSR dongle is attached, then it shows up as an error in the GUI.
@daniellasry Ready for review and merge.
I don't agree with this fix. When an uncaught exception is thrown we should consider the application is in a bad state. Not swallow them silently and continue.
If there is a specific exception that's shutting down the Hub, then catch and handle that specific exception in the place where it is thrown!
It was too difficult to catch this Error in the Hub. Though it was possible in the NodeJS Ganglion repo. Changes will be made in the GUI to remove CSR functionality for future releases.
|
gharchive/pull-request
| 2019-05-18T02:02:33 |
2025-04-01T04:55:28.267323
|
{
"authors": [
"daniellasry",
"retiutut"
],
"repo": "OpenBCI/OpenBCI_Hub",
"url": "https://github.com/OpenBCI/OpenBCI_Hub/pull/96",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
140846042
|
Crowdfunding feature
Imagine an author that wants to release a book/film/music in public domain.
Before releasing it for free to everyone (under a CC0 or CC-BY license), he wants to receive at least 10.0 BTC for his work.
It would be nice to have an option to sell shares of a goal (i.e., a book release into the public domain).
When all 100 shares are sold out, each one costing 0.1 BTC, a download link becomes visible to all users.
This should be doable. Lighthouse has an on-chain way of doing it, we could simply integrate that same thing into OpenBazaar. Should work for both services and digital goods. Physical goods might be interesting too, for a Kickstarter-style thing.
I mean, releasing the funds when the treshold is reached should be easy enough. Releasing when the actual service is completed, like releasing that book/film/music, would require some sort of oracle service.
|
gharchive/issue
| 2016-03-15T01:46:39 |
2025-04-01T04:55:28.282394
|
{
"authors": [
"Dekker3D",
"rhcastilhos"
],
"repo": "OpenBazaar/OpenBazaar-Client",
"url": "https://github.com/OpenBazaar/OpenBazaar-Client/issues/1111",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
256421786
|
Feature Request: Full Screen on Maximize
This is a minor user interface annoyance. The standard behavior on mac os is that the green "maximize / full screen" button takes you directly into full screen view.
However, on OB clicking the green button simply expands the window without taking you to full screen. You have to go to View -> Toggle Full Screen to do that.
This deviates from standard mac behavior and may be a minor annoyance to new users, as it was for me.
This was closed by #905
|
gharchive/issue
| 2017-09-09T07:33:01 |
2025-04-01T04:55:28.284129
|
{
"authors": [
"ekerstein",
"jjeffryes"
],
"repo": "OpenBazaar/openbazaar-desktop",
"url": "https://github.com/OpenBazaar/openbazaar-desktop/issues/824",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
54482211
|
updated csv files
Filled in some gaps and made updates based on the latest ISO standards.
I did not check in the generated JSON files; do you want these too?
This PR supersedes the previous one.
Have a better PR en route.
|
gharchive/pull-request
| 2015-01-15T17:58:45 |
2025-04-01T04:55:28.285560
|
{
"authors": [
"iancrowther"
],
"repo": "OpenBookPrices/country-data",
"url": "https://github.com/OpenBookPrices/country-data/pull/19",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
1970762220
|
Ignore case of MAC addresses and fix postgres log messages
Summary and Scope
Ignore case when dealing with MAC addresses. This is done by storing MAC addresses in lower case and converting any MAC addresses in requests to lower case before comparison. It is not desirable to distinguish MAC addresses as different based on case; therefore, we ignore it.
Also, some log message reformatting is present in this PR. For instance, there was a postgres.Add prefix in the Delete() function. Further, periods at the ends of log messages are removed so that it looks better when concatenated with other errors.
Issues and Related PRs
Migrated from https://github.com/bikeshack/hms-bss/pull/7
Testing
Tested on:
Cable Guys SI Cluster (cg-head and compute nodes)
This is a system of x86-64 Gigabyte 272-Z32-00 machines running Rocky Linux 8.8
Added integration tests that test addition/retrieval/deletion using MAC addresses of differing case.
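The normalization described above can be sketched as follows (hypothetical helper name; the actual bss code may also validate the address format):

```go
package main

import (
	"fmt"
	"strings"
)

// normalizeMAC lowercases a MAC address so that storage and lookup are
// case-insensitive: addresses are stored lower-case, and request values
// are lowered before comparison.
func normalizeMAC(mac string) string {
	return strings.ToLower(mac)
}

func main() {
	// Store lower-cased; normalize incoming request values the same way.
	stored := normalizeMAC("DE:AD:BE:EF:00:01")
	query := normalizeMAC("de:ad:BE:ef:00:01")
	fmt.Println(stored == query) // both normalize to de:ad:be:ef:00:01
}
```

Applying the same normalization on both the write and read paths is what makes case irrelevant for equality.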
Risks and Mitigations
More comprehensive integration tests should be implemented in the future.
Rebased onto new main with re-homing changes.
|
gharchive/pull-request
| 2023-10-31T15:44:16 |
2025-04-01T04:55:28.295915
|
{
"authors": [
"synackd"
],
"repo": "OpenCHAMI/bss",
"url": "https://github.com/OpenCHAMI/bss/pull/1",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.