id (string, length 4 to 10) | text (string, length 4 to 2.14M) | source (2 classes) | created (timestamp[s], 2001-05-16 21:05:09 to 2025-01-01 03:38:30) | added (string date, 2025-04-01 04:05:38 to 2025-04-01 07:14:06) | metadata (dict)
---|---|---|---|---|---
617492882 | Replacing-pipelineresources-with-tasks
PipelineResources didn't make the cut for beta and have been replaced by a combination of Tekton Catalog Tasks and Workspaces.
v1alpha1 => v1beta1
The new odo-based pipelines are already using Tasks and Workspaces. There is no plan to change the existing Appsody-based pipelines until we know when PipelineResources will be removed from Tekton.
| gharchive/issue | 2020-05-13T14:23:44 | 2025-04-01T06:39:15.431469 | {
"authors": [
"brianxjx",
"mchenggit"
],
"repo": "kabanero-io/kabanero-pipelines",
"url": "https://github.com/kabanero-io/kabanero-pipelines/issues/374",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
2581211326 | Error: UPGRADE FAILED: parse error at (kafka-ui/templates/deployment.yaml:57): unclosed action
Issue submitter TODO list
[X] I've looked up my issue in FAQ
[X] I've searched for already existing issues here (legacy) and here
[X] I've tried installing latest charts and the issue still persists there
[X] I'm running a supported version of the application & chart which is listed here
Describe the bug (actual behavior)
It seems the new version of the kafbat Helm chart can no longer be upgraded.
Expected behavior
installation without errors
Your installation details
https://github.com/kafbat/kafka-ui/commit/2956664
kafka-ui-1.4.2
yamlApplicationConfig:
kafka:
clusters:
- name: test
bootstrapServers: test.test:9092
schemaRegistry: http://test1.test1:8081
Steps to reproduce
Just applying these values fails.
Screenshots
No response
Logs
No response
Additional context
No response
https://github.com/kafbat/helm-charts/blob/ab4948ba71b99576b7096a120597989a6c869a53/charts/kafka-ui/templates/deployment.yaml#L57
this change broke master
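For context: Helm templates are Go text/template files, and "unclosed action" is the Go template parser's error for a {{ ... }} action that never gets its closing delimiter; older Helm releases can also raise parse errors on template syntax added after their release, which is consistent with the Helm upgrade that resolves this thread. A purely hypothetical sketch of the failure mode (not the actual chart line):

```yaml
# Hypothetical deployment.yaml fragment, NOT the real kafka-ui chart content.
containers:
  - name: kafka-ui
    # Balanced action: parses fine.
    image: "{{ .Values.image.repository }}:{{ .Values.image.tag }}"
    # The "{{" below is never closed with "}}", so helm template/upgrade
    # fails with: parse error ... unclosed action
    args: ["{{ .Values.extraArg"]
```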
any updates on this?
any updates on this?
Could you kindly share the complete values file and the specific error message you encountered?
command: helm upgrade --install kafka-ui kafbat-ui/kafka-ui -f values.yaml -n kafka-ui
output:
Release "kafka-ui-deleteme" does not exist. Installing it now.
Error: parse error at (kafka-ui/templates/deployment.yaml:57): unclosed action
helm version: version.BuildInfo{Version:"v3.5.4", GitCommit:"1b5edb69df3d3a08df77c9902dc17af864ff05d1", GitTreeState:"clean", GoVersion:"go1.15.11"}
values.yaml:
replicaCount: 1
route:
enabled: true
env:
- name: filtering.groovy.enabled
value: "true"
- name: DYNAMIC_CONFIG_ENABLED
value: "true"
resources:
requests:
cpu: '1'
memory: '1Gi'
yamlApplicationConfig:
management:
health:
ldap:
enabled: false
volumes:
- name: config-data
persistentVolumeClaim:
claimName: config-pv-claim
volumeMounts:
- mountPath: "/etc/kafkaui"
name: config-data
helm version: version.BuildInfo{Version:"v3.5.4", GitCommit:"1b5edb69df3d3a08df77c9902dc17af864ff05d1", GitTreeState:"clean", GoVersion:"go1.15.11"}
Please upgrade helm to the latest version and try again.
yes, it worked ! thanks a lot
| gharchive/issue | 2024-10-11T11:58:53 | 2025-04-01T06:39:15.469361 | {
"authors": [
"azatsafin",
"odorT"
],
"repo": "kafbat/helm-charts",
"url": "https://github.com/kafbat/helm-charts/issues/29",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
2316190045 | Pyaarlo
Please update this to the most current version of pyaarlo. The one you are using doesn't have the most recent fixes, nor does it support the newest devices.
@YpNo it seems like our fearless leader @kaffetorsk has jumped ship and that you are the only one answering questions, trying to update the code, etc. Are you able to fork this project to be able to keep on top of things?
Much obliged for your consideration.
Haha, I have been thinking about it, but we should find a real Python developer and/or a video-stream "expert" to help. I can do some things, but I may hit some limits.
I'll keep you informed.
Hi,
Okay okay, I think I will fork the project :)
@chicknill @bbo76 @RaidMax @xitation I'll keep you informed when the fork is ready. I don't know if I will address all your issues or suggestions, but I'll do my best.
If you know any real Python developers with asynchronous-programming and video-handling skills who could help me, that would be great!
The fork is ready if you want to try it: https://github.com/YpNo/arlo-camera-streamer
It is the same version with small changes. It should run the same as today.
But it has been built with the latest version of pyaarlo ;)
Feel free to (re)open issue/discussion about your needs.
Welcome back @kaffetorsk !
I will close my fork if you're really back. Feel free to find contributors to help you maintain this great app. I can be one of them if you let me build new releases (with a new version of pyaarlo first).
Let us know about the future of this project.
Regards.
Thank you, I will take better care of this repo going forward and appreciate the effort you put in.
As to the future of the project, I think it is best to keep the scope to its original intent (which is enabling Arlo cameras for 3rd-party NVRs, such as Frigate) and keep it maintained.
I also realize the need for maintenance is higher than I expected, mostly due to pyaarlo being based on a reverse-engineered API, subject to change at any given moment :-)
I'll look into adding contributors.
Dropped release v0.7.5 btw; pyaarlo is on the latest bleeding edge (my fork, with a yet-to-be-merged PR).
| gharchive/issue | 2024-05-24T20:15:05 | 2025-04-01T06:39:15.475262 | {
"authors": [
"YpNo",
"chicknlil",
"kaffetorsk"
],
"repo": "kaffetorsk/arlo-streamer",
"url": "https://github.com/kaffetorsk/arlo-streamer/issues/22",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2121341608 | UKI: kcrypt unlock-all doesn't unlock TPM-bound partitions
Currently, manually unlocking TPM-encrypted partitions by calling kcrypt unlock-all doesn't work.
Workaround exists, and documented in https://kairos.io/docs/installation/trustedboot/#mount-partitions-after-install
To reproduce:
In the Kairos config, try to run kcrypt unlock-all in an after-install stage (e.g. to write some files to the disk)
Install Kairos in UKI mode
See installation failing.
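A hedged sketch of the kind of cloud-config that hits this; the after-install stage name comes from the report above, while the partition label, mount point, and file contents are purely illustrative:

```yaml
#cloud-config
stages:
  after-install:
    - name: "Write files to an encrypted partition"
      commands:
        # In UKI mode this does not unlock TPM-bound partitions,
        # so the install fails at this stage.
        - kcrypt unlock-all
        - mount /dev/disk/by-label/COS_PERSISTENT /mnt
        - echo "custom content" > /mnt/custom.txt
```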
Possible solution:
Introduce a new stage/hook (e.g. "after-decrypt") to allow people to run code right after decrypting the disks. This stage will also make sure the disk is encrypted again when the stage is done.
Also, there is a --tpm flag on the kcrypt unlock-all command that might work around the issue.
kcrypt unlock-all could work if we add a --tpm flag so it knows it needs to go through the TPM unlock workflow instead of the usual one.
Introduced in version 0.9.0: https://github.com/kairos-io/kcrypt/compare/v0.7.0...v0.9.0 (cut on December 18th: https://github.com/kairos-io/kcrypt/releases/tag/v0.9.0)
| gharchive/issue | 2024-02-06T17:32:12 | 2025-04-01T06:39:15.494823 | {
"authors": [
"Itxaka",
"jimmykarily",
"mudler"
],
"repo": "kairos-io/kairos",
"url": "https://github.com/kairos-io/kairos/issues/2217",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
2237245668 | Symlink any /boot/Image* to /boot/vmlinuz
The Jetson produces /boot/Image, which wasn't caught by the previous mechanism.
Fixes #2461
Tested manually; these tests don't run on PRs, so merging.
| gharchive/pull-request | 2024-04-11T09:00:21 | 2025-04-01T06:39:15.495992 | {
"authors": [
"mauromorales"
],
"repo": "kairos-io/kairos",
"url": "https://github.com/kairos-io/kairos/pull/2463",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
952250620 | fix: 📝Updated Discord Link
Fixes Issue (#22 )
This PR fixes the following issue: Updated Discord Link In Greeting Action.
hey @vind3v17, I see a different link added there. Kindly add this link (I mentioned this in the issue as well): https://discord.gg/K9kxUXvfND
| gharchive/pull-request | 2021-07-25T11:05:46 | 2025-04-01T06:39:15.506055 | {
"authors": [
"kaiwalyakoparkar",
"vind3v17"
],
"repo": "kaiwalyakoparkar/classroom-monitor-bot",
"url": "https://github.com/kaiwalyakoparkar/classroom-monitor-bot/pull/23",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
347587416 | I want to add "kindness" to the good-points assessment results
I want to add the following result:
'{userName}のいいところは優しさです。あなたの優しい雰囲気や立ち振舞に多くの人が癒やされています。' (English: "{userName}'s good point is kindness. Many people are soothed by your gentle aura and demeanor.")
I'll work on this now.
| gharchive/issue | 2018-08-04T04:50:32 | 2025-04-01T06:39:15.507008 | {
"authors": [
"kakaka71"
],
"repo": "kakaka71/assessment",
"url": "https://github.com/kakaka71/assessment/issues/1",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
890105401 | java.lang.RuntimeException: Unable to start activity ComponentInfo{com.example.xxxx/com.kakao.sdk.flutter.AuthCodeCustomTabsActivity}:
Hello. The package name I registered on Kakao Developers was com.example.xxxx, but I later had to change the package name. I updated the changed package name on Kakao Developers, yet the error still appears. Is there anywhere inside the project that also needs to be updated?
Hello! Which SDK version are you using?
In pubspec.yaml I'm using kakao_flutter_sdk: ^0.5.2. Below is the output of running flutter doctor.
[✓] Flutter (Channel unknown, 1.22.6, on macOS 11.2.3 20D91 darwin-x64, locale en-KR) [✓] Android toolchain - develop for Android devices (Android SDK version 29.0.3) [✓] Xcode - develop for iOS and macOS (Xcode 12.4) [✓] Android Studio (version 3.6)
Given that the pre-change package name (com.example.xxxx) appears when the Activity is started,
it looks like the package name wasn't fully updated. Please double-check.
com.example.xxxx/com.kakao.sdk.flutter.AuthCodeCustomTabsActivity
Thank you for the reply. I checked the AndroidManifest.xml files and confirmed the parts updated to manifest package="com.new.new". Is there anywhere else I need to change?
Following this answer, https://stackoverflow.com/a/58729067, I renamed the folders under app/src/main/kotlin to match the package name, ran flutter clean, and rebuilt; now it works fine. Thank you.
| gharchive/issue | 2021-05-12T13:43:17 | 2025-04-01T06:39:15.510580 | {
"authors": [
"2jeje",
"wonjae-codecabin"
],
"repo": "kakao/kakao_flutter_sdk",
"url": "https://github.com/kakao/kakao_flutter_sdk/issues/77",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
1083027564 | Waiting for animation
When capturing a GeoJSON file we need to wait for the animation. For now we have defined a wait duration according to the activity (map or globe). It would be great to tell Kano to switch off any animation when zooming.
For now, we provide a waitDelay before taking the snapshot. It could be defined using a WAIT_DELAY environment variable or could be provided on the query.
| gharchive/issue | 2021-12-17T08:56:22 | 2025-04-01T06:39:15.538150 | {
"authors": [
"cnouguier"
],
"repo": "kalisio/kapture",
"url": "https://github.com/kalisio/kapture/issues/7",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1503917791 | 'No connection could be made because the target machine actively refused it. (localhost:5050)'
My code:
var baseProfileList = await client.SearchBaseProfilesAsync(deviceType: "desktop", browserProduct: "chrome");
Error:
'No connection could be made because the target machine actively refused it. (localhost:5050)'
Please help!
Are you sure that Kameleo.CLI.exe is running? If yes, please also make sure it runs on port 5050.
Please see this article for how to start Kameleo.CLI.exe.
| gharchive/issue | 2022-12-20T03:09:05 | 2025-04-01T06:39:15.723062 | {
"authors": [
"duongbinh214",
"kameleo-team"
],
"repo": "kameleo-io/local-api-client-csharp",
"url": "https://github.com/kameleo-io/local-api-client-csharp/issues/2",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
216716915 | integrate easyjson
https://github.com/mailru/easyjson
as a result of https://github.com/kamilsk/dzone/issues/31
blocked by https://github.com/kamilsk/shared/issues/144
| gharchive/issue | 2017-03-24T09:35:52 | 2025-04-01T06:39:15.727501 | {
"authors": [
"kamilsk"
],
"repo": "kamilsk/dzone",
"url": "https://github.com/kamilsk/dzone/issues/34",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
55797577 | Specifying worker names with stop command
I am going to implement a feature that allows users to specify the worker to kill on the command line. This is useful if you are using automation/CI tools (Jenkins, Ansible, Puppet, etc.). There are going to be two additional options to the stop command: --worker and --count.
The --worker option allows you to specify a queue name to kill. By default it will kill all workers with that name.
The --count option works in tandem with the --worker option and allows you to specify how many to kill.
I'm not sure if you accept feature additions to your codebase, but if you want, I can make a pull request, implement it and then merge back into your repo. That way others can benefit.
Let me know what you think
I don't quite get what you're trying to do. The stop command already prints a list of workers, and you can choose the worker you want to stop.
I think there's a naming issue, but are you trying to stop a worker by the name of the polled queue? In that case, a --queue option would make more sense.
Sorry, I meant --queue (I've been working very long hours lately, so I'm tripping up all over the place). It does print a list of workers, but then it requires stdin input, and that's a problem if you are using automation tools.
Take my configuration for example: we have a Jenkins project which allows you to specify queues to start. I make a selection and it plugs the variables into a bash script and runs it. You can't do the same with the stop command, and I don't want junior devs having access to our queuing servers. At the moment, the only solution we have is to kill all the workers, which is highly inconvenient.
fresque stop --queue my_queue --count 2 // that will kill two workers from my_queue
Interesting feature :+1:
cool, I'll make a PR tomorrow then :)
| gharchive/issue | 2015-01-28T18:45:30 | 2025-04-01T06:39:15.731751 | {
"authors": [
"kamisama",
"shadyb"
],
"repo": "kamisama/Fresque",
"url": "https://github.com/kamisama/Fresque/issues/45",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
506672585 | Fix typo and improve wording in README
Hello @kamleshchandnani
Great initiative 👏
Just a minor suggestion if you don't mind 😄
I have fixed the following in the README:
Typo
Improved the header. You may refer to hemingwayapp for the same.
You may review the fixes here.
PS: We will also raise a PR soon to add Kiprosh to the list.
Hey @rohandaxini ,
Thank you so much for taking the time and contributing to this project 🎉
| gharchive/pull-request | 2019-10-14T14:00:10 | 2025-04-01T06:39:15.735156 | {
"authors": [
"kamleshchandnani",
"rohandaxini"
],
"repo": "kamleshchandnani/awesome-interview-process",
"url": "https://github.com/kamleshchandnani/awesome-interview-process/pull/10",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2260542420 | Curriculum update needed on how_does_the_web_work.md
The Odin's file how_does_the_web_work.md has been updated. Please update the Kampus' file; check out the file here: how_does_the_web_work.md
Latest commits:
How Does the Web Work?: Update descriptive link text (#27681) (#27689) (additions: 7, deletions: 7) on Mar 26 2024, 14:43 UTC
New commits have been made to the Odin's file. Please update the Kampus' file.
Latest commits:
Surround html tag with lines to follow lint (#28132) (additions: 6, deletions: 4) on Jun 8 2024, 02:29 UTC
| gharchive/issue | 2024-04-24T07:09:01 | 2025-04-01T06:39:15.740871 | {
"authors": [
"filgoBot"
],
"repo": "kamp-us/monorepo",
"url": "https://github.com/kamp-us/monorepo/issues/863",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
175733035 | Getting Server cannot append header after HTTP headers have been sent
Hi,
I am getting "Server cannot append header after HTTP headers have been sent" when I try to sync the roles with Unicorn. Any ideas?
Regards
Danny
Duplicate of #155. This is fixed in the current prerelease version already.
I am assuming that you're using Sitecore 8.1U3 or 8.2. You can work around this by setting the 'X-Frame-Options' header yourself prior to Sitecore trying to set it on request end.
| gharchive/issue | 2016-09-08T11:46:57 | 2025-04-01T06:39:15.744164 | {
"authors": [
"kamsar",
"waaromikniet"
],
"repo": "kamsar/Unicorn",
"url": "https://github.com/kamsar/Unicorn/issues/170",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
1182068606 | Creating azure profile fails with "unable to redefine 'e' shorthand in "azure" flagset"
Describe the bug
I am trying to create an Azure profile, but get the following error: panic: unable to redefine 'e' shorthand in "azure" flagset: it's already used for "storage-env" flag
To Reproduce
Type the command kanctl create profile azure
Press [enter]
The following error appears:
$ kanctl create profile azure
panic: unable to redefine 'e' shorthand in "azure" flagset: it's already used for "storage-env" flag
goroutine 1 [running]:
github.com/spf13/pflag.(*FlagSet).AddFlag(0xc000465300, 0xc0006d9220)
/root/go/pkg/mod/github.com/spf13/pflag@v1.0.5/flag.go:874 +0x3f5
github.com/spf13/pflag.(*FlagSet).AddFlagSet.func1(0xc0006d9220)
/root/go/pkg/mod/github.com/spf13/pflag@v1.0.5/flag.go:887 +0x65
github.com/spf13/pflag.(*FlagSet).VisitAll(0x5, 0xc0006bfce0)
/root/go/pkg/mod/github.com/spf13/pflag@v1.0.5/flag.go:290 +0xe3
github.com/spf13/pflag.(*FlagSet).AddFlagSet(0xc0006e6780, 0x5)
/root/go/pkg/mod/github.com/spf13/pflag@v1.0.5/flag.go:885 +0x3d
github.com/spf13/cobra.(*Command).mergePersistentFlags(0xc0006e6780)
/root/go/pkg/mod/github.com/spf13/cobra@v1.4.0/command.go:1658 +0x65
github.com/spf13/cobra.(*Command).InitDefaultHelpFlag(0xc0004ee4c0)
/root/go/pkg/mod/github.com/spf13/cobra@v1.4.0/command.go:1032 +0x25
github.com/spf13/cobra.(*Command).execute(0xc0006e6780, {0x48434f0, 0x0, 0x0})
/root/go/pkg/mod/github.com/spf13/cobra@v1.4.0/command.go:780 +0x107
github.com/spf13/cobra.(*Command).ExecuteC(0xc00061f400)
/root/go/pkg/mod/github.com/spf13/cobra@v1.4.0/command.go:974 +0x3bc
github.com/spf13/cobra.(*Command).Execute(...)
/root/go/pkg/mod/github.com/spf13/cobra@v1.4.0/command.go:902
github.com/kanisterio/kanister/pkg/kanctl.Execute()
/codefresh/volume/kanister/pkg/kanctl/kanctl.go:46 +0x25
main.main()
/codefresh/volume/kanister/cmd/kanctl/main.go:28 +0x17
Expected behavior
The user gets information about how to create an Azure profile (instead of s3 or gcp).
Environment
Kubernetes Version/Provider: v1.22.5
Storage Provider: Azure
Cluster Size (#nodes): 5
Data Size: A few GBs
thanks for reporting this @TacoNaco47, I am taking a look.
| gharchive/issue | 2022-03-26T20:54:15 | 2025-04-01T06:39:15.806638 | {
"authors": [
"TacoNaco47",
"viveksinghggits"
],
"repo": "kanisterio/kanister",
"url": "https://github.com/kanisterio/kanister/issues/1333",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
160601638 | windows: missing TaskDialog
Fixed in a recent mingw-w64
# github.com/andlabs/ui
../../andlabs/ui/libui_windows_amd64.a(stddialogs.cpp.obj): In function `msgbox':
/home/simon/projects/libui/windows/stddialogs.cpp:113: undefined reference to `__imp_TaskDialog'
collect2: error: ld returned 1 exit status
How can I resolve this?
This seems to only be part of mingw 5+, which has not yet been released.
| gharchive/issue | 2016-06-16T08:08:15 | 2025-04-01T06:39:15.818956 | {
"authors": [
"chenbs",
"emersion",
"karalabe"
],
"repo": "karalabe/xgo",
"url": "https://github.com/karalabe/xgo/issues/55",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
2141107678 | not need to accept
No need to verify; I am just learning git/GitHub.
No need to accept.
| gharchive/pull-request | 2024-02-18T17:35:58 | 2025-04-01T06:39:15.819717 | {
"authors": [
"Karansingh2720"
],
"repo": "karan/Projects",
"url": "https://github.com/karan/Projects/pull/193",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
284171299 | [feature request] Support for #notpresent keyword
Use case: I'm writing tests for a Spring Boot Repository. It supports "projections", so users can request different levels of detail, e.g.
"default" projection: Fields A,B,C
"more-data" projection: Fields A,B,C and array D (it is loaded from separate table)
"complete-data" projection: Fields A,B,C and arrays D,E,F,G (each is loaded from separate table)
I cannot check contains/!contains with a single JSON, so right now my tests look like this:
Scenario "default"
<request>
then match response contains
"""
{
A : '#string',
B : '#string' ,
C : '#string'
}
"""
and match response !contains
"""
{
D : '#array',
E : '#array',
F : '#array',
G : '#array'
}
"""
Scenario "more-data"
<request>
then match response contains
"""
{
A : '#string',
B : '#string' ,
C : '#string',
D : '#array'
}
"""
and match response !contains
"""
{
E : '#array',
F : '#array',
G : '#array'
}
"""
Scenario "complete-data"
<request>
then match response contains
"""
{
A : '#string',
B : '#string' ,
C : '#string',
D : '#array',
E : '#array',
F : '#array',
G : '#array'
}
"""
With new keyword added my scenarios will be cleaner and easier to read
Scenario "default"
<request>
then match response contains
"""
{
A : '#string',
B : '#string' ,
C : '#string'
D : '#undefined',
E : '#undefined',
F : '#undefined,
G : '#undefined'
}
"""
Scenario "more-data"
<request>
then match response contains
"""
{
A : '#string',
B : '#string' ,
C : '#string',
D : '#array',
E : '#undefined',
F : '#undefined,
G : '#undefined'
}
"""
Scenario "complete-data"
<request>
then match response contains
"""
{
A : '#string',
B : '#string' ,
C : '#string',
D : '#array',
E : '#array',
F : '#array',
G : '#array'
}
"""
@avpavlov actually there is an #ignore marker already, can you confirm that it is what works for you: https://github.com/intuit/karate#fuzzy-matching
#ignore excludes the field from verification. I do not want to exclude it; I want to ensure the field is not included.
In other words, #undefined is like a local !contains for a single field.
would #notnull work ?
I meant #null
Thank you, both work: #null and ##null.
However, I would say #null working here is a bug, because if I expect some key with a null value, then a missing key should fail the scenario.
##null is the right fit in this case. Any chance you could expand the "contains / !contains" readme sections to propose it as a way to verify that a key is not present in JSON?
However, I would say #null working here is a bug, because if I expect some key with a null value, then a missing key should fail the scenario.
Yes. Or how about this, to check for a null just use null itself. So this should fail (as of now it passes, but I'm proposing to change this):
* def foo = { }
* match foo == { a: null }
In my experience, most teams assume a null value and a missing key to be the same. Typically people set the JSON marshaller config to 'strip nulls', for example, to reduce payload bloat.
Can you help by suggesting what change you'd like to see in the contains readme section? I'll be happy to add it.
(as of now it passes, but I'm proposing to change this):
Sounds good
most teams assume a null value and a missing key to be the same.
That's true. That's why #undefined could help ;)
Can you help by suggesting what change you'd like to see in the contains readme section,
Let me think
@avpavlov cool, I'm beginning to agree. see I'm quite reasonable :P
how about #notpresent - which could be more clear ? undefined has a certain meaning in JS also ?
notpresent sounds better than undefined. I tried to invent something like na or notavailable or missed, but none of these were clear enough, so finally I borrowed the keyword from JS.
Great! Looking at it now. Are you able to build from source or do you prefer a release?
Created PR for '##null' in README https://github.com/intuit/karate/pull/271
I can build from sources
@avpavlov yes, it is now in the develop branch. I actually decided that both #null and the literal null value will expect the JSON key to be present. Just felt that this is consistent and reduces confusion. Teams may need to use #ignore or #notpresent, which is more clear.
I agree; Karate tests the output/protocol, not how that output could be interpreted by a consumer.
Just tested the project with the develop branch: it works! Thank you!
Is there any way on GitHub to subscribe to notifications about releases?
@avpavlov not sure, but you can watch the project on GitHub. There's a Twitter account if you are into that kind of thing: https://twitter.com/KarateDSL
@avpavlov well, after thinking about it, it made sense to implement #present as well! Thanks for triggering this; I think Karate has become a little better as a result. So now here is what is possible. I recommend that folks don't use != (not-equals) because it gets really confusing, but hey.
* def foo = { }
* match foo != { a: '#present' }
* match foo == { a: '#notpresent' }
* match foo == { a: '#ignore' }
* match foo == { a: '##null' }
* match foo != { a: '#null' }
* match foo != { a: '#notnull' }
* match foo == { a: '##notnull' }
* match foo != { a: null }
* def foo = { a: null }
* match foo == { a: '#null' }
* match foo == { a: '##null' }
* match foo != { a: '#notnull' }
* match foo != { a: '##notnull' }
* match foo == { a: '#present' }
* match foo == { a: '#ignore' }
* match foo != { a: '#notpresent' }
* def foo = { a: 1 }
* match foo == { a: '#notnull' }
* match foo == { a: '##notnull' }
* match foo != { a: '#null' }
* match foo != { a: '##null' }
* match foo == { a: '#present' }
* match foo == { a: '#ignore' }
* match foo != { a: '#notpresent' }
@ptrthomas When I am validating a schema using Karate, one of the fields is defined as String in the schema, but it sometimes returns null and sometimes a string value. When it returns null, my test fails because the schema expects a string. How do I handle this? I want it to pass whether the value is null or a string.
@Shaileshz204 use stack overflow for questions like this please: https://stackoverflow.com/a/71522605/143475
| gharchive/issue | 2017-12-22T14:06:08 | 2025-04-01T06:39:15.839083 | {
"authors": [
"Shaileshz204",
"avpavlov",
"ptrthomas"
],
"repo": "karatelabs/karate",
"url": "https://github.com/karatelabs/karate/issues/270",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
257294817 | Add tests to vendoring only for specific packages
Hi,
I'm sorry if this is trivial, but I was not able to find any hint.
Is there a way to add tests only for specific packages in the vendor folder?
I can ignore the tests for all the packages with "ignore" in the vendor.json file; without it, the tests for all the packages are added.
Nope. There is no way to do a package-specific ignore requirement.
Ok, thanks... and is there a way to avoid having tests ignored in vendor.json after govendor init?
You can always remove that after a govendor init. It is just what I consider a "sane default".
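For reference, a minimal sketch of the vendor/vendor.json that govendor init writes; the field values here are illustrative, and the point is the top-level ignore field. Deleting the test token from it re-enables test files, but only globally, since per-package ignores are not supported:

```json
{
  "comment": "",
  "ignore": "test",
  "package": [],
  "rootPath": "github.com/example/myproject"
}
```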
Yeah, it is a sane default that unfortunately does not apply to my employer's default pipeline. I'm trying to find the best way to deal with it. Thanks a lot.
| gharchive/issue | 2017-09-13T08:18:10 | 2025-04-01T06:39:15.841799 | {
"authors": [
"brunetto",
"kardianos"
],
"repo": "kardianos/govendor",
"url": "https://github.com/kardianos/govendor/issues/359",
"license": "bsd-3-clause",
"license_type": "permissive",
"license_source": "bigquery"
} |
209716546 | Process Terminated Unexpectedly error in windows
I'm getting the below error message while stopping the service on Windows Server 2008 R2:
Error 1067: The process terminated unexpectedly
Here is my code:
package main
import (
"flag"
"net"
"net/http"
"github.com/kardianos/service"
"./mygopackage"
)
var logger service.Logger
var ln net.Listener
type program struct {
exit chan struct{}
}
func (p *program) Start(s service.Service) error {
if service.Interactive() {
logger.Info("Running in terminal.")
} else {
logger.Info("Running under service manager.")
}
p.exit = make(chan struct{})
// Start should not block. Do the actual work async.
go p.run()
return nil
}
func (p *program) run() {
log.Info("Listening on :8081")
ln, err := net.Listen("tcp", ":8081")
if err != nil {
log.Error("Err listening specified port:", err)
}
go mygopackage.do()
http.HandleFunc("/plugins", plugins)
http.Serve(ln, nil)
}
func (p *program) Stop(s service.Service) error {
// Any work in Stop should be quick, usually a few seconds at most.
ln.Close()
log.Info("Stopping SaaSAgent...!")
close(p.exit)
return nil
}
func main() {
svcFlag := flag.String("service", "", "Control the system service.")
flag.Parse()
svcConfig := &service.Config{
Name: "myapp",
DisplayName: "myapp",
Description: "myapp Service :)",
}
prg := &program{}
s, err := service.New(prg, svcConfig)
if err != nil {
log.Error(err)
}
errs := make(chan error, 5)
logger, err = s.Logger(errs)
if err != nil {
log.Error(err)
}
go func() {
for {
err := <-errs
if err != nil {
log.Error(err)
}
}
}()
if len(*svcFlag) != 0 {
err := service.Control(s, *svcFlag)
if err != nil {
log.Info("Valid actions: %q\n", service.ControlAction)
log.Error(err)
}
return
}
err = s.Run()
if err != nil {
logger.Error(err)
}
}
Did I make a mistake somewhere?
The ln package-level global is always nil because it is never assigned to. You use := when assigning, which creates a local variable. Thus the Stop method panics.
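A minimal, runnable sketch of the shadowing bug described above; the names are illustrative, with target standing in for the package-level ln:

```go
package main

import "fmt"

// Package-level variable, playing the role of the `ln` global in the issue.
var target *string

func value() *string {
	s := "set"
	return &s
}

// broken mirrors the bug: `:=` declares a NEW local variable that shadows
// the package-level `target`, which therefore stays nil.
func broken() {
	target := value()
	_ = target // only the local copy was assigned
}

// fixed assigns the package-level variable with plain `=`.
func fixed() {
	target = value()
}

func main() {
	broken()
	fmt.Println(target == nil) // true: the global was never touched
	fixed()
	fmt.Println(*target) // set
}
```

Applied to the issue's run() method, the same fix is to declare var err error and write ln, err = net.Listen("tcp", ":8081"), so the assignment reaches the package-level ln that Stop() later closes.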
Oops! I did not notice that.
Thanks @kardianos
| gharchive/issue | 2017-02-23T10:08:04 | 2025-04-01T06:39:15.856900 | {
"authors": [
"Vigneshr6",
"kardianos"
],
"repo": "kardianos/service",
"url": "https://github.com/kardianos/service/issues/83",
"license": "Zlib",
"license_type": "permissive",
"license_source": "github-api"
} |
2471471473 | Recipe images not retrieved
[x] I have read the FAQ.
Describe the bug
I have just created my Bar Assistant instance. Everything works fine but the images don't appear. However, thumbnails work fine.
After inspecting the HTML, I see that the URI of the picture is "https://bar.mydomain.com/uploads/cocktails/1/1934-cosmo-1_79lTcl.jpg", but it should be "https://bar.dlmw.ch/bar/uploads/cocktails/1/1934-cosmo-1_79lTcl.jpg". My reverse proxy is Caddy and the Caddyfile looks somewhat like this:
bar.mydomain.com {
	handle_path /search/* {
		reverse_proxy meilisearch:7700
	}
	handle_path /bar/* {
		reverse_proxy barassistant:3000
	}
	handle_path /* {
		reverse_proxy saltrim:8080
	}
}
To Reproduce
Create a Caddyfile containing the snippet above
Versions:
Docker: 24.0.7
LibreWolf: 129.0-1
I managed to solve it by modifying the Caddyfile like so:
bar.dlmw.ch {
	# here
	@uploads {
		path_regexp uploads ^/uploads/(.*)
	}
	rewrite @uploads /bar/uploads/{re.uploads.1}

	handle_path /search/* {
		reverse_proxy meilisearch:7700
	}
	handle_path /bar/* {
		reverse_proxy barassistant:3000
	}
	handle_path /* {
		reverse_proxy saltrim:8080
	}
}
| gharchive/issue | 2024-08-17T10:40:16 | 2025-04-01T06:39:15.901544 | {
"authors": [
"dlmw"
],
"repo": "karlomikus/bar-assistant",
"url": "https://github.com/karlomikus/bar-assistant/issues/312",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
75741338 | npm install WARN message
With npm install of karma, npm warns:
npm WARN engine karma@0.12.31: wanted: {"node":"~0.8 || ~0.10"} (current: {"node":"0.12.0","npm":"2.7.2"})
#merge
Will be fixed in the new version of karma.
| gharchive/issue | 2015-05-12T21:15:28 | 2025-04-01T06:39:15.905812 | {
"authors": [
"MetaMemoryT",
"maksimr",
"zzo"
],
"repo": "karma-runner/karma",
"url": "https://github.com/karma-runner/karma/issues/1402",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
162459166 | when karma hangs, how do I debug it?
In my project, https://github.com/cellog/react-selection , I have encountered a bug when running karma remotely on saucelabs. It is caused by code coverage. In short, karma connects to the browser, loads, and then hangs indefinitely. When I remove code coverage, the test runs in about 16 seconds. This happens only remotely; when I run coverage on my local machine, it works great.
How can I debug the hang on saucelabs? I'm new to karma. When using karma remotely with saucelabs, is it loading karma from my machine? If yes, which file can I put debug code into to see if I can figure out where it is failing? Or is there another way?
Thanks much.
Oh, and I should mention the code coverage is accomplished by a babel plugin, so there is nothing inside karma; it just serves the transpiled files directly to saucelabs.
Setting logLevel to DEBUG should give you more details. After that, use regular debugging techniques for Node, like console.log in code or attaching a debugger.
it turns out that any concurrency larger than 1 causes the entire thing to fail with disconnects to every browser. Is this expected behavior?
On saucelabs that is probably caused by rate limiting from their side.
fyi, the problem is in the karma-saucelabs-launcher, which is unmaintained and doesn't work any more. After 50 tests, it just dies. I switched to browserstack and it works perfectly. However, there are some side effects of running more than 1 browser test in the same karma process. I have no idea how to isolate them, but the ONLY way I can get karma to work and generate code coverage is to run each browser in its own karma process. You can see what I mean in https://github.com/cellog/react-selection
Major pain.
Fortunately, I can do local development with karma very quickly, and then push on commit to find those chance browser differences eventually. Test runs now take about 15 minutes each, so it's really a pain, but at least it works.
So to summarize: there are several bugs that I can't fix or even track down. The first is that karma and sauce labs are no longer friends, and so I can't use them together at all. The second is that karma has some kind of strange shared stuff interfering with each other whenever I run more than 1 browser in the same karma process, even if concurrency is set to 1. That's a huge one, and might be worth investigating further, since it means the sandboxing is leaking. Let me know if you want me to try things to debug it.
FWIW, I have noticed that the Angular team has set up karma to use Sauce Labs via shell script in separate processes - if I had to guess, it is to have environmental separation for purer test environments, but maybe they came across some of the same issues with multiple browsers.
Separate issues should probably be opened if there are specific issues found though, that way we can track them better. Going to close this issue, but feel free to open issues for actionable problems you find with karma.
attaching a debugger
How exactly? I tried
ndb ./node_modules/.bin/karma ... but nothing runs, and the opened devtools window is empty, no source files, nothing paused.
The correct answer is to move to jest and forget about it
Unfortunately I've found that ndb fails sometimes. I use node --inspect-brk, but I also don't use the .bin file directly.
| gharchive/issue | 2016-06-27T14:01:39 | 2025-04-01T06:39:15.912869 | {
"authors": [
"cellog",
"dignifiedquire",
"johnjbarton",
"trusktr",
"wesleycho"
],
"repo": "karma-runner/karma",
"url": "https://github.com/karma-runner/karma/issues/2211",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
803491200 | fix(config): check extension before ts-node register
Call require('ts-node').register() only after checking that configFilePath has a .ts extension.
Fixes #3329
cc @devoto13
:white_check_mark: Build karma 2927 completed (commit https://github.com/karma-runner/karma/commit/4157266d7a by @xel23)
@googlebot I signed it!
@xel23 Thanks for the PR!
I guess we can land it as a workaround to address the most painful manifestation of this issue, but I don't think it "fixes" the issue. We really should implement a systematic solution as outlined in https://github.com/karma-runner/karma/issues/3329#issuecomment-772262377.
Technically this is a breaking change. People who might have relied on type-checking their karma.conf.js with allowJS: true will no longer have that type-checking. As this does not prevent them from running Karma, I am willing to do so to resolve the issue for the majority of users. @johnjbarton What do you think?
Ping
Thank you so much for fixing this! 🥳 🎉
| gharchive/pull-request | 2021-02-08T11:52:14 | 2025-04-01T06:39:15.917482 | {
"authors": [
"AppVeyorBot",
"devoto13",
"jimbojw",
"maksimr",
"xel23"
],
"repo": "karma-runner/karma",
"url": "https://github.com/karma-runner/karma/pull/3651",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
1753934976 | Feat/collaboration
This adds Figma -> RN plugin syncing as well as generic editor syncing:
https://feat-collaboration.vslite.pages.dev/#/strait_domestic_heat_dean_patronage
Right now you need to run your own local websocket server:
HOST=localhost PORT=1234 npx y-websocket
All of this is work in progress, but the PoC is working...
Merged initial sync support. Will follow up on this in another PR.
| gharchive/pull-request | 2023-06-13T02:53:36 | 2025-04-01T06:39:15.953840 | {
"authors": [
"TheUltDev"
],
"repo": "kat-tax/vslite",
"url": "https://github.com/kat-tax/vslite/pull/21",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
446050339 | docs: Added missing doc link
Added a link to the existing how-to-use-virtio-fs-with-kata.md.
Fixes #481.
Signed-off-by: James O. D. Hunt james.o.hunt@intel.com
/test
| gharchive/pull-request | 2019-05-20T10:52:45 | 2025-04-01T06:39:15.955203 | {
"authors": [
"jodh-intel"
],
"repo": "kata-containers/documentation",
"url": "https://github.com/kata-containers/documentation/pull/482",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
442627585 | qemu-lite failed to enable vga when launching Kata container
Description of problem
When I reconfigure the qemu-lite launch parameters from "-vga none" to "-vga std" to enable VGA in the Kata runtime, it reports a PCI unavailable error, as below.
Is there any dependency required to enable VGA? I asked some people and there is no such dependency when using KVM + QEMU to launch a CentOS VM.
Expected result
Launch Kata container successfully.
Actual result
May 10 06:31:21 localhost.localdomain kata-runtime[28005]: time="2019-05-10T06:31:21.574044938-04:00" level=info msg="launching /usr/bin/qemu-lite-system-x86_64 with: [-name sandbox-c522a66fa441bb1eb5be1b2fcba79b7829e8922c09d2f51d077a5dba7732987b -uuid 3e6695d3-d12f-4bfe-b1e5-2369c0b96df2 -machine pc,accel=kvm,kernel_irqchip,nvdimm -cpu host -qmp unix:/run/vc/vm/c522a66fa441bb1eb5be1b2fcba79b7829e8922c09d2f51d077a5dba7732987b/qmp.sock,server,nowait -m 2048M,slots=10,maxmem=8752M -device pci-bridge,bus=pci.0,id=pci-bridge-0,chassis_nr=1,shpc=on,addr=2,romfile= -device virtio-serial-pci,disable-modern=false,id=serial0,romfile= -device virtconsole,chardev=charconsole0,id=console0 -chardev socket,id=charconsole0,path=/run/vc/vm/c522a66fa441bb1eb5be1b2fcba79b7829e8922c09d2f51d077a5dba7732987b/console.sock,server,nowait -device nvdimm,id=nv0,memdev=mem0 -object memory-backend-file,id=mem0,mem-path=/usr/share/kata-containers/kata-containers-image_clearlinux_1.5.0_agent_a581aebf473.img,size=536870912 -device virtio-scsi-pci,id=scsi0,disable-modern=false,romfile= -object rng-random,id=rng0,filename=/dev/urandom -device virtio-rng,rng=rng0,romfile= -device virtserialport,chardev=charch0,id=channel0,name=agent.channel.0 -chardev socket,id=charch0,path=/run/vc/vm/c522a66fa441bb1eb5be1b2fcba79b7829e8922c09d2f51d077a5dba7732987b/kata.sock,server,nowait -device virtio-9p-pci,disable-modern=false,fsdev=extra-9p-kataShared,mount_tag=kataShared,romfile= -fsdev local,id=extra-9p-kataShared,path=/run/kata-containers/shared/sandboxes/c522a66fa441bb1eb5be1b2fcba79b7829e8922c09d2f51d077a5dba7732987b,security_model=none -netdev tap,id=network-0,vhost=on,vhostfds=3,fds=4 -device driver=virtio-net-pci,netdev=network-0,mac=02:42:ac:12:00:02,disable-modern=false,mq=on,vectors=4,romfile= -global kvm-pit.lost_tick_policy=discard -vga std -no-user-config -nodefaults -nographic -daemonize -kernel /usr/share/kata-containers/vmlinuz-4.14.67.22-143.1.container -append tsc=reliable no_timer_check rcupdate.rcu_expedited=1 i8042.direct=1 i8042.dumbkbd=1 i8042.nopnp=1 i8042.noaux=1 noreplace-smp reboot=k console=hvc0 console=hvc1 iommu=off cryptomgr.notests net.ifnames=0 pci=lastbus=0 root=/dev/pmem0p1 rootflags=dax,data=ordered,errors=remount-ro rw rootfstype=ext4 debug systemd.show_status=true systemd.log_level=debug panic=1 nr_cpus=8 init=/usr/lib/systemd/systemd systemd.unit=kata-containers.target systemd.mask=systemd-networkd.service systemd.mask=systemd-networkd.socket -smp 1,cores=1,threads=1,sockets=1,maxcpus=8]" arch=amd64 clicreate container=c522a66fa441bb1eb5be1b2fcba79b7829e8922c09d2f51d077a5dba7732987b command=create name=kata-runtime pid=28005 source=virtcontainers subsystem=qmp
level=error msg="Unable to launch /usr/bin/qemu-lite-system-x86_64: exit status 1" arch=amd64 clicreate container=c522a66fa441bb1eb5be1b2fcba79b7829e8922c09d2f51d077a5dba7732987b command=create name=kata-runtime pid=28005 source=virtcontainers subsystem=qmp
May 10 06:31:21 localhost.localdomain kata-runtime[28005]: time="2019-05-10T06:31:21.633469001-04:00" level=error msg="qemu-lite-system-x86_64: -device pci-bridge,bus=pci.0,id=pci-bridge-0,chassis_nr=1,shpc=on,addr=2,romfile=: PCI: slot 2 function 0 not available for pci-bridge, in use by VGA\n" arch=amd64 clicreate container=c522a66fa441bb1eb5be1b2fcba79b7829e8922c09d2f51d077a5dba7732987b command=create name=kata-runtime pid=28005 source=virtcontainers subsystem=qmp
Hi @zhiminghufighting - what are you trying to achieve? Were you hoping that the container would then gain access to the host's VGA controller, or something else? It might help us understand your goal and how to achieve it :-)
For reference, in case it helps, there is a guide on how to pass GPU in/out of Kata at https://github.com/kata-containers/documentation/blob/master/use-cases/GPU-passthrough-and-Kata.md
And there is a link to a container that can help get X11 working in a container (including a kata container) at https://github.com/kata-containers/documentation/wiki/UserGuide#x11-containers
@zhiminghufighting - qemu-lite is extremely minimal (by design), so doesn't have graphical support enabled. We disable all graphics at build time - see:
https://github.com/kata-containers/packaging/blob/master/scripts/configure-hypervisor.sh#L206
@grahamwhaley The background is this: I want to launch an Android-based container image in Kata to support cloud gaming. As you know, a lot of gaming runs on billions of mobile devices with Android OS in China, so there is big demand for enabling Android containers in Kata. After launching a Kata container with an Android-based image together with our OTC Android-in-container team, we found that a key Android process - SurfaceFlinger - cannot start normally because of a wrong frame buffer size. To get the right frame buffer size, I need to enable VGA in qemu-lite so the software renderer can start with a correct frame buffer. The Android container needs a software renderer, so a frame buffer and a virtual VGA are mandatory. The Android container doesn't care whether there is a real VGA or GPU on the host, because it depends on the software rendering library (OpenGL) inside the Android container image.
In the above scenario, there is no need for hardware rendering in the Android container, so it doesn't depend on GPU GVT-g or GVT-d. I know and have checked all the related docs on enabling GPUs for Kata in your link.
Thanks for your quick response.
Are there any other options for me to use VGA?
@jodh-intel please see the detailed background and root cause of this requirement in the above answer to Graham's question.
I think you already answered why the error is reported by qemu-lite.
Can I enable GPU support by rebuilding qemu-lite? If yes, is there a guide doc?
Are there any other ways? I think even if I enable GPU support to solve this issue, it will make qemu-lite heavier and increase the memory footprint and resource consumption.
thanks for your quick response!
By the way, there would be a huge potential user scenario for Kata besides the cloud, as I mentioned in my comment above: gaming in the cloud and cloud gaming based on Android-image containers in China.
Billions of Android mobile devices and thousands of games based on Android OS.
@zhiminghufighting - qemu-lite is a highly optimised version of qemu designed for "standard" sorts of container use-cases so doesn't include graphical support.
To just prove the concept, you could of course simply change your config to use the distro-packaged version of qemu which should contain graphics support:
path = "/usr/bin/qemu-system-x86_64"
Although you could rebuild qemu-lite to include graphic support, I don't think that is the best approach. Depending on the architecture, either NEMU or qemu-vanilla would be better I think.
However, both those options are also built without graphical support so you would have to rebuild them with some of the graphical options we disable using https://github.com/kata-containers/packaging/blob/master/scripts/configure-hypervisor.sh.
However, you may need to make changes to the guest kernel configuration as that too is as minimal as possible for "normal" sorts of workloads.
We don't have documentation for this sort of scenario currently.
Next steps
This is an interesting scenario but since it is going to require changes specifically for graphical support and since those may adversely affect memory density and boot speed, we'd need to study the idea carefully. We would also need additional tests, extra testing infrastructure and documentation.
As such, I suggest you raise an RFC issue using the main project repo (which we tend to use to discuss large features that potentially affect lots of different repos):
https://github.com/kata-containers/kata-containers/issues/new
Then, send a mail to the mailing list referring to the issue and ideally present your idea at the Architecture Committee meeting so we can get more input from the entire community.
@jodh-intel Good suggestion! I will first collect the detailed background and estimate the potential business impact of Android images in Kata containers, and then prepare some material for this user scenario. Thanks for your input!
I will try the approaches you provided to confirm the feasibility of Android in a Kata container in the next few days.
Thanks @zhiminghufighting.
@zhiminghufighting - I realize this is pretty stale now, but was pretty curious about the Android in Kata use case.
Have you made more progress here, or have an end-to-end demo in this space?
| gharchive/issue | 2019-05-10T09:35:01 | 2025-04-01T06:39:15.980828 | {
"authors": [
"egernst",
"grahamwhaley",
"jodh-intel",
"zhiminghufighting"
],
"repo": "kata-containers/runtime",
"url": "https://github.com/kata-containers/runtime/issues/1661",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
953416232 | chore: add linting as public API
As requested by @aladdin-add, this PR implements linting as a public API. Linting can now be done like this:
import { lintModule, aladdin } from 'kataw';
lintModule('eval', /* reporter */ aladdin, { noEval: true});
or
import { lintScript, aladdin } from 'kataw';
lintScript('eval', /* reporter */ aladdin, { noEval: true});
The 'reporter' argument gives the option to plug in any reporter, and end-users can write their own.
I tried to figure out how Babel does this, but I failed in my research.
Note This PR is blocked by #160
This is how it looks on the command line with the aladdin reporter:
@aladdin-add I'm using the aladdin reporter by default here so this can't be merged before you have merged #160.
In the future we will add a 3rd arg into this so the end-user can choose which reporter to use.
You also requested custom rules, right? They are coming soon. I need to think about it first, because I can't do it the same way as ESLint: I refuse to kill performance. ESLint is horribly slow, you know ;)
lintModule('eval', /* reporter */ aladdin, { noEval: true});
the 2nd param can be a linterOptions object:
{
	reporter: ...,
	globals: {...},
	fix: true,
}
@aladdin-add Good idea! But what is Globals in this context? And can you fix the conflict?
@aladdin-add I changed it into what you suggested, but now we've got a performance issue ...
Every time we invoke the reporter we need to do 'options.reporter(...)'. It's unnecessary property access.
Can you get rid of this unnecessary property access?
| gharchive/pull-request | 2021-07-27T01:29:13 | 2025-04-01T06:39:15.997901 | {
"authors": [
"KFlash",
"aladdin-add"
],
"repo": "kataw/kataw",
"url": "https://github.com/kataw/kataw/pull/162",
"license": "ISC",
"license_type": "permissive",
"license_source": "github-api"
} |
660173422 | Server crashing on player connect
Same as https://github.com/urShadow/Pawn.RakNet/issues/35
I can see the server, but as soon as I connect, the server crashes.
There is NOTHING in the log at the moment of the crash (Not even the 'incoming connection' message).
I simply removed the plugin in server.cfg and in the script and everything worked fine instantly.
I will have to run more tests when I have a couple of hours to debug this; I will try to see what other includes/plugins might be interfering.
content of serverlog.txt :
----------
Loaded log file: "server_log.txt".
----------
SA-MP Dedicated Server
----------------------
v0.3.7-R2, (C)2005-2015 SA-MP Team
[2020-07-18 09:40:19]
[2020-07-18 09:40:19] Server Plugins
[2020-07-18 09:40:19] --------------
[2020-07-18 09:40:19] Loading plugin: pawnraknet
[2020-07-18 09:40:19] [Pawn.RakNet]
| Pawn.RakNet 1.4.1 | 2016 - 2020
|--------------------------------
| Author and maintainer: urShadow
| Compiled: Jun 18 2020 at 14:19:11
|--------------------------------------------------------------
| Forum thread: https://forum.sa-mp.com/showthread.php?t=640306
|--------------------------------------------------------------
| Repository: https://github.com/urShadow/Pawn.RakNet
|--------------------------------------------------------------
| Wiki: https://github.com/urShadow/Pawn.RakNet/wiki
[2020-07-18 09:40:19] Loaded.
[2020-07-18 09:40:19] Loading plugin: crashdetect
[2020-07-18 09:40:19] CrashDetect plugin 4.19
[2020-07-18 09:40:19] Loaded.
[2020-07-18 09:40:19] Loading plugin: mysql
[2020-07-18 09:40:19] >> plugin.mysql: R39-6 successfully loaded.
[2020-07-18 09:40:19] Loaded.
[2020-07-18 09:40:19] Loading plugin: streamer
[2020-07-18 09:40:19]
*** Streamer Plugin v2.9.3 by Incognito loaded ***
[2020-07-18 09:40:19] Loaded.
[2020-07-18 09:40:19] Loading plugin: FileFunctions
[2020-07-18 09:40:19] Loaded.
[2020-07-18 09:40:19] Loading plugin: FCNPC
[2020-07-18 09:40:19]
[2020-07-18 09:40:19] -------------------------------------------------
[2020-07-18 09:40:19] FCNPC - Fully Controllable NPC v1.8.2
[2020-07-18 09:40:19] Windows SA-MP 0.3.7 R2
[2020-07-18 09:40:19] Jan 8 2018 at 01:14:48
[2020-07-18 09:40:19]
[2020-07-18 09:40:19] Author: OrMisicL (2013 - 2015)
[2020-07-18 09:40:19] Continued by: ziggi (2016 - present)
[2020-07-18 09:40:19] Contributors: kurta999, Neutralneu
[2020-07-18 09:40:19] -------------------------------------------------
[2020-07-18 09:40:20]
[2020-07-18 09:40:20] Loading...
[2020-07-18 09:40:20] Loaded.
[2020-07-18 09:40:20] Loading plugin: PathFinder
[2020-07-18 09:40:20] =========================================
[2020-07-18 09:40:20] PathFinder Plugin 1.0 MT
[2020-07-18 09:40:20] by Pamdex
[2020-07-18 09:40:20]
[2020-07-18 09:40:20] Using MapAndreas Plugin 1.2.1
[2020-07-18 09:40:20] Waiting for Init...
[2020-07-18 09:40:20] =========================================
[2020-07-18 09:40:20] Loaded.
[2020-07-18 09:40:20] Loading plugin: MapAndreas
[2020-07-18 09:40:20] Loaded.
[2020-07-18 09:40:20] Loading plugin: SKY
[2020-07-18 09:40:20] Loaded.
[2020-07-18 09:40:20] Loading plugin: YSF
[2020-07-18 09:40:20] ARRAY_ConsoleCommands: 4e43d8
[2020-07-18 09:40:20]
[2020-07-18 09:40:20] ===============================
[2020-07-18 09:40:20] YSF - kurta999's version R19 loaded
[2020-07-18 09:40:20] (c) 2008 Alex "Y_Less" Cole - (c) 2010 - 2016 kurta999
[2020-07-18 09:40:20] Server version: 0.3.7 R2-1
[2020-07-18 09:40:20] Operating System: Windows
[2020-07-18 09:40:20] Built on: Mar 11 2017 at 10:32:34
[2020-07-18 09:40:20] ===============================
[2020-07-18 09:40:20] Loaded.
[2020-07-18 09:40:20] Loading plugin: sscanf
[2020-07-18 09:40:20]
[2020-07-18 09:40:20] ===============================
[2020-07-18 09:40:20] sscanf plugin loaded.
[2020-07-18 09:40:20] Version: 2.8.2
[2020-07-18 09:40:20] (c) 2012 Alex "Y_Less" Cole
[2020-07-18 09:40:20] ===============================
[2020-07-18 09:40:20] Loaded.
[2020-07-18 09:40:20] Loaded 11 plugins.
[2020-07-18 09:40:20]
[2020-07-18 09:40:20] Ban list
[2020-07-18 09:40:20] --------
[2020-07-18 09:40:20] Loaded: samp.ban
[2020-07-18 09:40:20]
[2020-07-18 09:40:20]
[2020-07-18 09:40:20] Filterscripts
[2020-07-18 09:40:20] ---------------
[2020-07-18 09:40:20] Loading filterscript 'antiddos.amx'...
[2020-07-18 09:40:20]
[2020-07-18 09:40:20]
[2020-07-18 09:40:20]
[2020-07-18 09:40:20] =======================================
[2020-07-18 09:40:20] | |
[2020-07-18 09:40:20] | YSI version 4.00.0001 |
[2020-07-18 09:40:20] | By Alex "Y_Less" Cole |
[2020-07-18 09:40:20] | |
[2020-07-18 09:40:20] =======================================
[2020-07-18 09:40:20]
[2020-07-18 09:40:20] [TESTING]: SERVER_LOG_FIRSTCHAR:22;
[2020-07-18 09:40:20] Loaded!
[2020-07-18 09:40:20] Loading filterscript 'fac_test.amx'...
[2020-07-18 09:40:20]
[2020-07-18 09:40:20]
[2020-07-18 09:40:20]
[2020-07-18 09:40:20] =======================================
[2020-07-18 09:40:20] | |
[2020-07-18 09:40:20] | YSI version 4.00.0001 |
[2020-07-18 09:40:20] | By Alex "Y_Less" Cole |
[2020-07-18 09:40:20] | |
[2020-07-18 09:40:20] =======================================
[2020-07-18 09:40:20]
[2020-07-18 09:40:20] -black screens initialization...
[2020-07-18 09:40:20] -players variables
[2020-07-18 09:40:20] -black screens initialization completed.
[2020-07-18 09:40:20] -Loading Testing Faction...
[2020-07-18 09:40:20] [MYSQL]: Connection to `saarp` succesful!
[2020-07-18 09:40:20] -Loading Objects...
[2020-07-18 09:40:20] -Loading Pickups...
[2020-07-18 09:40:20] -Testing Faction loaded correctly...
[2020-07-18 09:40:20] Loading filterscript 'vehicles.amx'...
[2020-07-18 09:40:20]
[2020-07-18 09:40:20]
[2020-07-18 09:40:20]
[2020-07-18 09:40:20] =======================================
[2020-07-18 09:40:20] | |
[2020-07-18 09:40:20] | YSI version 4.00.0001 |
[2020-07-18 09:40:20] | By Alex "Y_Less" Cole |
[2020-07-18 09:40:20] | |
[2020-07-18 09:40:20] =======================================
[2020-07-18 09:40:20]
[2020-07-18 09:40:20] -black screens initialization...
[2020-07-18 09:40:20] -players variables
[2020-07-18 09:40:20] -black screens initialization completed.
[2020-07-18 09:40:20]
-----------------------------------------
[2020-07-18 09:40:20] Stefan/Kevin974 - Speedometer | rt-2 - Fuel/Engine system
[2020-07-18 09:40:20] -----------------------------------------
[2020-07-18 09:40:20] [VEHICLES]: Setting vehicle Interior positions
[2020-07-18 09:40:20] [VEHICLES]: Setting vehicle Interior external doors positions
[2020-07-18 09:40:20] [VEHICLES]: SA Driving Assoc
[2020-07-18 09:40:20] [VEHICLES]: Spawning cars from database
[2020-07-18 09:40:20] Loading filterscript 'doors.amx'...
[2020-07-18 09:40:20]
[2020-07-18 09:40:20]
[2020-07-18 09:40:20]
[2020-07-18 09:40:20] =======================================
[2020-07-18 09:40:20] | |
[2020-07-18 09:40:20] | YSI version 4.00.0001 |
[2020-07-18 09:40:20] | By Alex "Y_Less" Cole |
[2020-07-18 09:40:20] | |
[2020-07-18 09:40:20] =======================================
[2020-07-18 09:40:20]
[2020-07-18 09:40:20] -black screens initialization...
[2020-07-18 09:40:20] -players variables
[2020-07-18 09:40:20] -black screens initialization completed.
[2020-07-18 09:40:20] Loading City Planning faction and adresses...
[2020-07-18 09:40:20] -Connecting to database
[2020-07-18 09:40:20] -Acquiring vehicles coords
[2020-07-18 09:40:20] -Initializing doors vars
[2020-07-18 09:40:20] -Other variables and timers
[2020-07-18 09:40:20] Loading filterscript 'public.amx'...
[2020-07-18 09:40:20]
[2020-07-18 09:40:20]
[2020-07-18 09:40:20]
[2020-07-18 09:40:20] =======================================
[2020-07-18 09:40:20] | |
[2020-07-18 09:40:20] | YSI version 4.00.0001 |
[2020-07-18 09:40:20] | By Alex "Y_Less" Cole |
[2020-07-18 09:40:20] | |
[2020-07-18 09:40:20] =======================================
[2020-07-18 09:40:20]
[2020-07-18 09:40:21] -black screens initialization...
[2020-07-18 09:40:21] -players variables
[2020-07-18 09:40:21] -black screens initialization completed.
[2020-07-18 09:40:21] # # # # # # # # # # # # # # # # # # # # # #
[2020-07-18 09:40:21] # # # # The city have a PT systems. # # # #
[2020-07-18 09:40:21] # # # # # # # # # # # # # # # # # # # # # #
[2020-07-18 09:40:21] # # Initializing "PT" System
[2020-07-18 09:40:21] # # Initializing other plugins
[2020-07-18 09:40:21] # # # Initializing FCNPC
[2020-07-18 09:40:21] # # # Initializing SAPT System
[2020-07-18 09:40:21] # # # Loading SAPT Routes
[2020-07-18 09:40:21] Loading filterscript 'testing.amx'...
[2020-07-18 09:40:21] Loaded 6 filterscripts.
[2020-07-18 09:40:22] Filterscript '../scriptfiles/callbackfix.amx' loaded.
[2020-07-18 09:40:22]
[2020-07-18 09:40:22]
[2020-07-18 09:40:22]
[2020-07-18 09:40:22] =======================================
[2020-07-18 09:40:22] | |
[2020-07-18 09:40:22] | YSI version 4.00.0001 |
[2020-07-18 09:40:22] | By Alex "Y_Less" Cole |
[2020-07-18 09:40:22] | |
[2020-07-18 09:40:22] =======================================
[2020-07-18 09:40:22]
[2020-07-18 09:40:25] -AC(new) initialization...
[2020-07-18 09:40:25] -players variables
[2020-07-18 09:40:25] -vehicles variables
[2020-07-18 09:40:25] -AC(new) initialization completed.
[2020-07-18 09:40:25] -black screens initialization...
[2020-07-18 09:40:25] -players variables
[2020-07-18 09:40:25] -black screens initialization completed.
[2020-07-18 09:40:25] -inventories initialization...
[2020-07-18 09:40:25] -general variables
[2020-07-18 09:40:25] -players variables
[2020-07-18 09:40:25] -inventories initialization completed.
[2020-07-18 09:40:26] -Logged Off Players Variables initializations...
[2020-07-18 09:40:26] -melee interactions initialization...
[2020-07-18 09:40:26] -players variables
[2020-07-18 09:40:26] -all players variables
[2020-07-18 09:40:26] -melee interactions initialization completed.
[2020-07-18 09:40:26] -Mailboxs initializations...
[2020-07-18 09:40:26] -SASD faction initializing
[2020-07-18 09:40:26] -Pickup(s)
[2020-07-18 09:40:26] -Setting routes
[2020-07-18 09:40:26] -SAPO faction initializing
[2020-07-18 09:40:26] -Pickup(s)
[2020-07-18 09:40:26] -Setting routes
[2020-07-18 09:40:26] -SATEL faction initializing
[2020-07-18 09:40:26] -Pickup(s)
[2020-07-18 09:40:26] -Setting routes
[2020-07-18 09:40:26] -SPRU faction initializing
[2020-07-18 09:40:26] -Pickup(s)
[2020-07-18 09:40:26] -Setting routes
[2020-07-18 09:40:26] -Missions initializations...
[2020-07-18 09:40:26] -Initializing player status.
[2020-07-18 09:40:26] -Initializing timer(s).
[2020-07-18 09:40:26] -Initializing text strings.
[2020-07-18 09:40:26] -Gang Wars initializations...
[2020-07-18 09:40:26] [GANGWAR]: -Initializing Gang War system.
[2020-07-18 09:40:26] [GANGWAR]: -Initializing Areas.
[2020-07-18 09:40:26] [GANGWAR]: -Initializing Gangs Skins.
[2020-07-18 09:40:26] [GANGWAR]: -Initializing Gangs Cribs.
[2020-07-18 09:40:26] [GANGWAR]: -Initializing Gangs Pickups.
[2020-07-18 09:40:26] [GANGWAR]: -Initializing Territories Wars Timer.
[2020-07-18 09:40:26] [GANGWAR]: -Initializing Other Gangs Vars.
[2020-07-18 09:40:26] -Interactive Menus initializations...
[2020-07-18 09:40:26] -Shops initializations...
[2020-07-18 09:40:26] -Sending query
[2020-07-18 09:40:26] -tutorial initialization...
[2020-07-18 09:40:26] -players variables
[2020-07-18 09:40:26] -office pickup
[2020-07-18 09:40:26] -slides variables
[2020-07-18 09:40:26] -Transport system initializations...
[2020-07-18 09:40:26] -Impex Automatic Sales initialization;
[2020-07-18 09:40:26] -stockid is 53 after load
[2020-07-18 09:40:26] -creating container objects;
[2020-07-18 09:40:26] -creating pickup;
[2020-07-18 09:40:26] -creating container variables;
[2020-07-18 09:40:26] -spawning cargos;
[2020-07-18 09:40:26] -digestion initialization...
[2020-07-18 09:40:26] -players variables
[2020-07-18 09:40:26] -digestion initialization completed.
[2020-07-18 09:40:26] -machines initialization...
[2020-07-18 09:40:26] -players variables
[2020-07-18 09:40:26] -machines variables
[2020-07-18 09:40:26] -machines recipes variables
[2020-07-18 09:40:26] -machine initialization completed.
[2020-07-18 09:40:26] -fire initialization...
[2020-07-18 09:40:26] -vehicles variables
[2020-07-18 09:40:26] -previous fires
[2020-07-18 09:40:26] -fire initialization completed.
[2020-07-18 09:40:26] -underground races initialization...
[2020-07-18 09:40:26] -general variables
[2020-07-18 09:40:26] [TESTING]: urace_racevar_reinit() called.
[2020-07-18 09:40:26] -racers variables
[2020-07-18 09:40:26] -players variables
[2020-07-18 09:40:26] -underground races initialization completed.
[2020-07-18 09:40:26] -fishing initialization...
[2020-07-18 09:40:26] -catch types
[2020-07-18 09:40:26] -players variables
[2020-07-18 09:40:26] -fishing initialization completed.
[2020-07-18 09:40:26] -elevators initialization...
[2020-07-18 09:40:26] -loading elevators...
[2020-07-18 09:40:26] -elevators initialization completed.
[2020-07-18 09:40:26] -Scripted Casinos initializations...
[2020-07-18 09:40:26]
----------------------------------
[2020-07-18 09:40:26]
[2020-07-18 09:40:26] STREAMER_OBJECT_SD = 1133903872
[2020-07-18 09:40:26] STREAMER_OBJECT_DD = 0
[2020-07-18 09:40:26] NB OF OBJ LOADED: 5126/8000
[2020-07-18 09:40:26] ----------------------------------
[2020-07-18 09:40:26] -removed buildings initialization...
[2020-07-18 09:40:26] -vars initializing...
[2020-07-18 09:40:26] -building listing...
[2020-07-18 09:40:26] -testing configuration:
[2020-07-18 09:40:26] -679 building removed;
[2020-07-18 09:40:26] -modelid:923 is removed 3 times:
[2020-07-18 09:40:26] -modelid:1216 is removed 14 times:
[2020-07-18 09:40:26] -modelid:'traffic light'(1283) is removed 239 times:
[2020-07-18 09:40:26] -modelid:1284 is removed 17 times:
[2020-07-18 09:40:26] -modelid:1315 is removed 62 times:
[2020-07-18 09:40:26] -modelid:1350 is removed 20 times:
[2020-07-18 09:40:26] -modelid:1373 is removed 8 times:
[2020-07-18 09:40:26] -modelid:1374 is removed 8 times:
[2020-07-18 09:40:26] -modelid:'interior boxes'(1421) is removed 2 times:
[2020-07-18 09:40:26] -modelid:'interior boxes'(1431) is removed 2 times:
[2020-07-18 09:40:26] -modelid:1440 is removed 2 times:
[2020-07-18 09:40:26] -modelid:1441 is removed 2 times:
[2020-07-18 09:40:26] -modelid:2647 is removed 7 times:
[2020-07-18 09:40:26] -modelid:2663 is removed 6 times:
[2020-07-18 09:40:26] -modelid:2672 is removed 4 times:
[2020-07-18 09:40:26] -modelid:2673 is removed 2 times:
[2020-07-18 09:40:26] -modelid:2674 is removed 2 times:
[2020-07-18 09:40:26] -modelid:2675 is removed 3 times:
[2020-07-18 09:40:26] -modelid:2676 is removed 3 times:
[2020-07-18 09:40:26] -modelid:2677 is removed 3 times:
[2020-07-18 09:40:26] -modelid:3377 is removed 8 times:
[2020-07-18 09:40:26] -modelid:3378 is removed 8 times:
[2020-07-18 09:40:26] -modelid:3474 is removed 2 times:
[2020-07-18 09:40:26] -modelid:3516 is removed 4 times:
[2020-07-18 09:40:26] -modelid:3567 is removed 5 times:
[2020-07-18 09:40:26] -modelid:3569 is removed 5 times:
[2020-07-18 09:40:26] -modelid:'cargos'(3574) is removed 32 times:
[2020-07-18 09:40:26] -modelid:'abandonned car'(3593) is removed 11 times:
[2020-07-18 09:40:26] -modelid:'abandonned car'(3594) is removed 10 times:
[2020-07-18 09:40:26] -modelid:3621 is removed 5 times:
[2020-07-18 09:40:26] -modelid:3625 is removed 7 times:
[2020-07-18 09:40:26] -modelid:3664 is removed 4 times:
[2020-07-18 09:40:26] -modelid:3665 is removed 3 times:
[2020-07-18 09:40:26] -modelid:3688 is removed 5 times:
[2020-07-18 09:40:26] -modelid:3744 is removed 32 times:
[2020-07-18 09:40:26] -modelid:3747 is removed 5 times:
[2020-07-18 09:40:26] -modelid:3769 is removed 6 times:
[2020-07-18 09:40:26] -modelid:3780 is removed 3 times:
[2020-07-18 09:40:26] -Administration initializations...
[2020-07-18 09:40:26] -SAARP_fac_cityp_tcmd initializations...
[2020-07-18 09:40:26]
[2020-07-18 09:40:26] --------------------------------------
[2020-07-18 09:40:26] Anticheat Nex-AC loaded!
[2020-07-18 09:40:26] Anticheat version: 1.9.53
[2020-07-18 09:40:26] Author: Nexius
[2020-07-18 09:40:26] --------------------------------------
[2020-07-18 09:40:26] -Initializing principal game mode.
[2020-07-18 09:40:26] -Mysql connection.
[2020-07-18 09:40:26] -Setting Variables.
[2020-07-18 09:40:26] -Preparing streamer plugin...
[2020-07-18 09:40:26] -tick rate: 50;
[2020-07-18 09:40:26] -max pickups: -1;
[2020-07-18 09:40:26] -cell distance: 1219479552;
[2020-07-18 09:40:26] -cell size: 300.000000;
[2020-07-18 09:40:26] -Preparing Main Textdraws...
[2020-07-18 09:40:26] -Spawning map icons...
[2020-07-18 09:40:26] -Plants initializations...
[2020-07-18 09:40:26] -Races initializations...
[2020-07-18 09:40:26] -Blood Stains initializations...
[2020-07-18 09:40:26] -Cheats initializations...
[2020-07-18 09:40:26] -Noob Path Helper initializations...
[2020-07-18 09:40:26] -Death Reasons initializations...
[2020-07-18 09:40:26] -Objects initializations...
[2020-07-18 09:40:26] -Acquiring object types
[2020-07-18 09:40:26] -Spawning ground objects models
[2020-07-18 09:40:26] -Spawning posts
[2020-07-18 09:40:26] -Vehicles interior initializations...
[2020-07-18 09:40:26] -Acquiring vehicles coords
[2020-07-18 09:40:26] -Spawning vehicle interior pickups
[2020-07-18 09:40:26] -Dialog Menus initializations...
[2020-07-18 09:40:26] -LS Stadiums initializations...
[2020-07-18 09:40:26] -Loading Police Forces faction
[2020-07-18 09:40:26] -Loading objects
[2020-07-18 09:40:26] -Loading pickups
[2020-07-18 09:40:26] [TESTING]: LSPD_Weapons_Pickup = 89
[2020-07-18 09:40:26] [TESTING]: SAMA_Armour_Pickup = 94
[2020-07-18 09:40:26] -Taxi faction initializing
[2020-07-18 09:40:26] -Spawning pickups
[2020-07-18 09:40:26] -Loading SF Airport Management faction
[2020-07-18 09:40:26] -Spawning pickups
[2020-07-18 09:40:26] -Loading LS International Airport faction
[2020-07-18 09:40:26] -Spawning pickups
[2020-07-18 09:40:26] -Loading SF Military Police faction
[2020-07-18 09:40:26] -Spawning pickups
[2020-07-18 09:40:26] -Loading LS Military Police faction.
[2020-07-18 09:40:26] -Spawning pickups
[2020-07-18 09:40:26] -Loading SA Anti Terrorists faction.
[2020-07-18 09:40:26] -Spawning pickups
[2020-07-18 09:40:26] -Spawning objects
[2020-07-18 09:40:26] -Mush faction initializing
[2020-07-18 09:40:26] -Spawning objects
[2020-07-18 09:40:26] -Spawning pickups
[2020-07-18 09:40:26] -Mush faction initializing
[2020-07-18 09:40:26] -Spawning pickups
[2020-07-18 09:40:26] -City Planning faction initializing
[2020-07-18 09:40:26] -Spawning pickups
[2020-07-18 09:40:26] -Spawning pickups
[2020-07-18 09:40:26] -Spawning objects
[2020-07-18 09:40:26] -Gouv faction initializing (SADA/SAAA/SABA/SACB)
[2020-07-18 09:40:26] -Spawning objects
[2020-07-18 09:40:26] -Spawning pickups
[2020-07-18 09:40:26] -Xoomer faction initializing
[2020-07-18 09:40:26] -Spawning pickups
[2020-07-18 09:40:26] [TESTING]: LSPD_Weapons_Pickup = 89
[2020-07-18 09:40:26] -IG Player Variables initializations...
[2020-07-18 09:40:26] -Vehicle Variables initializations...
[2020-07-18 09:40:26] -Other Variables initializations...
[2020-07-18 09:40:26] -Principal game mode initializing completed.
[2020-07-18 09:40:26]
Game mode ready!
[2020-07-18 09:40:26] Number of vehicle models: 0
[2020-07-18 09:40:26] [SERVER]: Vehicles are loading...
[2020-07-18 09:40:26] [MYSQL]: Receiving response on "SpawnVehicles", 861 rows.
[2020-07-18 09:40:36] [VEHICLES]: Vehicle spawned, 861 vehicles.
[2020-07-18 09:40:36] Spawning errors:
-Vehicule id:1240(m:588) does not have a vehicle interior view associated!;
-Vehicule id:1250(m:558) does not have a vehicle interior view associated!;
[2020-07-18 09:40:36] [MYSQL]: Receiving response on "getZonesFromDatabase", 43 rows.
[2020-07-18 09:40:36] [MYSQL]: Receiving response on "SpawnGazstations", 33 rows.
[2020-07-18 09:40:36] [DOORS]: Doors are loading...
[2020-07-18 09:40:36] [MYSQL]: Receiving response on "LoadDoors", 1486 rows.
[2020-07-18 09:40:46] [DOORS]: loaded 16/20 map icons.
[2020-07-18 09:40:46] [MYSQL]: Receiving response on "LoggedOff_InitAllAtStart_res", 857 rows.
[2020-07-18 09:40:46] -All Player Variables initializations...
[2020-07-18 09:40:51] [MYSQL]: Receiving response on "getAreasFromDatabase", 43 rows.
[2020-07-18 09:40:51] [MYSQL]: Receiving response on "ReceiveShopsList", 157 rows.
[2020-07-18 09:40:52] [MYSQL]: Receiving response on "GetGroundCargos", 193 rows.
[2020-07-18 09:40:52] [MYSQL]: Receiving response on "Machines_LoadFromDB", 16 rows.
[2020-07-18 09:40:53] [MYSQL]: Receiving response on "elevators_loadFromDatabase", 2 rows.
[2020-07-18 09:40:53] [MYSQL]: Receiving response on "LoadSAAGFields", 11 rows.
[2020-07-18 09:40:53] [MYSQL]: Receiving response on "LoadSAAGPlants", 73 rows.
[2020-07-18 09:40:53] [MYSQL]: Receiving response on "ReceiveGovOffices", 69 rows.
[2020-07-18 09:40:53] [MYSQL]: Receiving response on "ReceiveObjectTypes", 66 rows.
[2020-07-18 09:40:53] [MYSQL]: Receiving response on "SpawnGroundObjects", 4770 rows.
[2020-07-18 09:41:03] [MYSQL]: Receiving response on "SpawnPosts", 4 rows.
[2020-07-18 09:41:04] [MYSQL]: Receiving response on "GetSasdRouteFromDb(0)", 24 rows.
[2020-07-18 09:41:04] [MYSQL]: Receiving response on "GetSasdRouteFromDb(1)", 61 rows.
[2020-07-18 09:41:04] [MYSQL]: Receiving response on "GetSasdRouteFromDb(2)", 20 rows.
[2020-07-18 09:41:04] [MYSQL]: Receiving response on "GetSasdRouteFromDb(3)", 189 rows.
[2020-07-18 09:41:04] ## SASD ROUTE TOO LARGE ## ## will truncate r:1,t:fac_sasd_jy_lsscres from 189 to 65 ##
[2020-07-18 09:41:04] [MYSQL]: Receiving response on "GetSasdRouteFromDb(4)", 10 rows.
[2020-07-18 09:41:04] [MYSQL]: Receiving response on "GetSasdRouteFromDb(5)", 26 rows.
[2020-07-18 09:41:05] [MYSQL]: Receiving response on "GetSasdRouteFromDb(6)", 2 rows.
[2020-07-18 09:41:05] [MYSQL]: Receiving response on "GetSasdRouteFromDb(7)", 150 rows.
[2020-07-18 09:41:05] ## SASD ROUTE TOO LARGE ## ## will truncate r:2,t:fac_sasd_jy_lsnebiz from 150 to 65 ##
[2020-07-18 09:41:05] [MYSQL]: Receiving response on "GetSasdRouteFromDb(8)", 0 rows.
[2020-07-18 09:41:05] [MYSQL]: Receiving response on "GetSasdRouteFromDb(9)", 31 rows.
[2020-07-18 09:41:05] [MYSQL]: Receiving response on "GetSasdRouteFromDb(10)", 19 rows.
[2020-07-18 09:41:05] [MYSQL]: Receiving response on "GetSasdRouteFromDb(11)", 12 rows.
[2020-07-18 09:41:05] [MYSQL]: Receiving response on "GetSasdRouteFromDb(12)", 9 rows.
[2020-07-18 09:41:05] [TESTING]: Finishing route, thisroute_max_leg:14, thisroute_total_leg:16
[2020-07-18 09:41:05] [TESTING]: Finishing route, thisroute_max_leg:5, thisroute_total_leg:7
[2020-07-18 09:41:05] [TESTING]: Finishing route, thisroute_max_leg:5, thisroute_total_leg:7
[2020-07-18 09:41:05] [MYSQL]: Receiving response on "elevator_floor_loadFromDatabase", 21 rows.
[2020-07-18 09:41:05] [MYSQL]: Receiving response on "elevator_floor_loadFromDatabase", 11 rows.
[2020-07-18 09:41:05] [SERVER]: The time is 9:41
[2020-07-18 09:41:10] fire_toggleVehFire(vehicleid:247, toggle:1) called.
[2020-07-18 09:41:10] fire_toggleVehFire(vehicleid:318, toggle:1) called.
[2020-07-18 09:41:10] fire_toggleVehFire(vehicleid:357, toggle:1) called.
[2020-07-18 09:41:10] fire_toggleVehFire(vehicleid:418, toggle:1) called.
[2020-07-18 09:41:10] fire_toggleVehFire(vehicleid:461, toggle:1) called.
[2020-07-18 09:41:10] fire_toggleVehFire(vehicleid:559, toggle:1) called.
[2020-07-18 09:41:28] [SERVER]: The time is 9:41
[2020-07-18 09:41:51] [SERVER]: The time is 9:41
[2020-07-18 09:42:13] [SERVER]: The time is 9:42
[2020-07-18 09:42:36] [SERVER]: Executing 'serverStart1'.
[2020-07-18 09:42:36]
[2020-07-18 09:42:36]
[2020-07-18 09:42:36]
[2020-07-18 09:42:36] =======================================
[2020-07-18 09:42:36] | |
[2020-07-18 09:42:36] | YSI version 4.00.0001 |
[2020-07-18 09:42:36] | By Alex "Y_Less" Cole |
[2020-07-18 09:42:36] | |
[2020-07-18 09:42:36] =======================================
[2020-07-18 09:42:36]
[2020-07-18 09:42:36] -black screens initialization...
[2020-07-18 09:42:36] -players variables
[2020-07-18 09:42:36] -black screens initialization completed.
[2020-07-18 09:42:36] ° ° ° ° ° ° ° ° ° ° ° ° ° ° ° ° ° ° ° ° ° °
[2020-07-18 09:42:36] ° ° ° ° The state is being infected ° ° ° °
[2020-07-18 09:42:36] ° ° ° ° ° ° ° ° ° ° ° ° ° ° ° ° ° ° ° ° ° °
[2020-07-18 09:42:36] ° ° Initializing "Infected" System
[2020-07-18 09:42:36] ° ° Initializing other plugins
[2020-07-18 09:42:36] ° ° ° Initializing Map Andreas
[2020-07-18 09:42:37] PathFinder Plugin -> Creating New Thread
[2020-07-18 09:42:37] ° ° ° Initializing FCNPC
[2020-07-18 09:42:37] ° ° ° Initializing PathFinder
[2020-07-18 09:42:37] ° ° ° Generating spawn loacations.
[2020-07-18 09:42:37] ° ° ° initializing infected
[2020-07-18 09:42:37] ° ° ° initializing reserves
[2020-07-18 09:42:37] ° ° ° initializing timers
[2020-07-18 09:42:37] ° ° ° initializing SQL connection
[2020-07-18 09:42:37] ° ° Script initialized
[2020-07-18 09:42:37] Filterscript 'infected.amx' loaded.
[2020-07-18 09:42:37]
----------------------------------
[2020-07-18 09:42:37] GOUVERNEMENT OBJECT SCRIPT
[2020-07-18 09:42:37] nz = 12.557999, stream is
[2020-07-18 09:42:37] NB OF OBJ LOADED: 4338/X
[2020-07-18 09:42:37] ----------------------------------
[2020-07-18 09:42:37] Filterscript 'mobjects.amx' loaded.
[2020-07-18 09:42:37]
[2020-07-18 09:42:37]
[2020-07-18 09:42:37]
[2020-07-18 09:42:37] =======================================
[2020-07-18 09:42:37] | |
[2020-07-18 09:42:37] | YSI version 4.00.0001 |
[2020-07-18 09:42:37] | By Alex "Y_Less" Cole |
[2020-07-18 09:42:37] | |
[2020-07-18 09:42:37] =======================================
[2020-07-18 09:42:37]
[2020-07-18 09:42:37] -black screens initialization...
[2020-07-18 09:42:37] -players variables
[2020-07-18 09:42:37] -black screens initialization completed.
[2020-07-18 09:42:37] -LSPD Tickets initialization...
[2020-07-18 09:42:37] -Loading Testing1 Faction...
[2020-07-18 09:42:37] -Loading Testing1337 Faction...
[2020-07-18 09:42:37] [MYSQL]: Connection to `saarp` succesful!
[2020-07-18 09:42:37] -Loading Objects...
[2020-07-18 09:42:37] -Loading Areas...
[2020-07-18 09:42:37] -Loading Vehicles vars...
[2020-07-18 09:42:37] -Testing1 Faction loaded correctly...
[2020-07-18 09:42:37] Filterscript 'fac_test1.amx' loaded.
[2020-07-18 09:42:37]
[2020-07-18 09:42:37]
[2020-07-18 09:42:37]
[2020-07-18 09:42:37] =======================================
[2020-07-18 09:42:37] | |
[2020-07-18 09:42:37] | YSI version 4.00.0001 |
[2020-07-18 09:42:37] | By Alex "Y_Less" Cole |
[2020-07-18 09:42:37] | |
[2020-07-18 09:42:37] =======================================
[2020-07-18 09:42:37]
[2020-07-18 09:42:38] -black screens initialization...
[2020-07-18 09:42:38] -players variables
[2020-07-18 09:42:38] -black screens initialization completed.
[2020-07-18 09:42:38] -setting BIOS environments
[2020-07-18 09:42:38] -computers variables
[2020-07-18 09:42:38] -players variables
[2020-07-18 09:42:38] -computer initialization...
[2020-07-18 09:42:38] -computer initialization completed.
[2020-07-18 09:42:38] -tele-comunication system initializations...
[2020-07-18 09:42:38] -Phone Objects adjustments
[2020-07-18 09:42:38] -Phone Variables
[2020-07-18 09:42:38] -Starting timers
[2020-07-18 09:42:38] -Loading Test2 Script...
[2020-07-18 09:42:38] -Loading Map Andreas...
[2020-07-18 09:42:38] [MYSQL]: Connection to `saarp` succesful!
[2020-07-18 09:42:38] -Test2 Script loaded correctly...
[2020-07-18 09:42:38] Filterscript 'fac_test2.amx' loaded.
[2020-07-18 09:42:38] [MYSQL]: Receiving response on "cptr_LoadFromDatabase", 3 rows.
[2020-07-18 09:42:38] [MYSQL]: Receiving response on "GetDbPhones", 46 rows.
[2020-07-18 09:42:38] [SERVER]: The time is 9:42
[2020-07-18 09:42:41] [SERVER]: Executing 'serverStart2'.
[2020-07-18 09:42:41] Server password has been removed.
[2020-07-18 09:42:41] Filterscript 'testing.amx' unloaded.
[2020-07-18 09:42:44] [INFECTED]: Infected must be spawned.
[2020-07-18 09:42:44] [npc:join] Infected has joined the server (203:127.0.0.1)
[2020-07-18 09:42:44] [INFECTED]: Connecting player is infected.
[2020-07-18 09:42:52] [INFECTED]: Infected must be spawned.
[2020-07-18 09:42:52] [npc:join] Infected has joined the server (202:127.0.0.1)
[2020-07-18 09:42:52] [INFECTED]: Connecting player is infected.
[2020-07-18 09:43:01] [SERVER]: The time is 9:43
[2020-07-18 09:43:01] [INFECTED]: Infected must be spawned.
[2020-07-18 09:43:01] [npc:join] Infected has joined the server (201:127.0.0.1)
[2020-07-18 09:43:01] [INFECTED]: Connecting player is infected.
[2020-07-18 09:43:09] [INFECTED]: Infected must be spawned.
[2020-07-18 09:43:09] [npc:join] Infected has joined the server (200:127.0.0.1)
[2020-07-18 09:43:09] [INFECTED]: Connecting player is infected.
[2020-07-18 09:43:18] [INFECTED]: Infected must be spawned.
[2020-07-18 09:43:18] [npc:join] Infected has joined the server (199:127.0.0.1)
[2020-07-18 09:43:18] [INFECTED]: Connecting player is infected.
[2020-07-18 09:43:24] [SERVER]: The time is 9:43
[2020-07-18 09:43:26] [INFECTED]: Infected must be spawned.
[2020-07-18 09:43:26] [npc:join] Infected has joined the server (198:127.0.0.1)
[2020-07-18 09:43:26] [INFECTED]: Connecting player is infected.
[2020-07-18 09:43:35] [INFECTED]: Infected must be spawned.
[2020-07-18 09:43:35] [npc:join] Infected has joined the server (197:127.0.0.1)
[2020-07-18 09:43:35] [INFECTED]: Connecting player is infected.
[2020-07-18 09:43:43] [INFECTED]: Infected must be spawned.
[2020-07-18 09:43:43] [npc:join] Infected has joined the server (196:127.0.0.1)
[2020-07-18 09:43:43] [INFECTED]: Connecting player is infected.
[2020-07-18 09:43:44] [TESTING]: RefreshDoorMapIcons() called (Initializer, interval=90).
[2020-07-18 09:43:46] [SERVER]: The time is 9:43
[2020-07-18 09:43:51] [INFECTED]: Infected must be spawned.
[2020-07-18 09:43:51] [npc:join] Infected has joined the server (195:127.0.0.1)
[2020-07-18 09:43:51] [INFECTED]: Connecting player is infected.
[2020-07-18 09:44:00] [INFECTED]: Infected must be spawned.
[2020-07-18 09:44:00] [npc:join] Infected has joined the server (194:127.0.0.1)
[2020-07-18 09:44:00] [INFECTED]: Connecting player is infected.
[2020-07-18 09:44:08] [INFECTED]: Infected must be spawned.
[2020-07-18 09:44:08] [npc:join] Infected has joined the server (193:127.0.0.1)
[2020-07-18 09:44:08] [INFECTED]: Connecting player is infected.
[2020-07-18 09:44:09] [SERVER]: The time is 9:44
[2020-07-18 09:44:17] [INFECTED]: Infected must be spawned.
[2020-07-18 09:44:17] [npc:join] Infected has joined the server (192:127.0.0.1)
[2020-07-18 09:44:17] [INFECTED]: Connecting player is infected.
[2020-07-18 09:44:25] [INFECTED]: Infected must be spawned.
[2020-07-18 09:44:25] [npc:join] Infected has joined the server (191:127.0.0.1)
[2020-07-18 09:44:25] [INFECTED]: Connecting player is infected.
[2020-07-18 09:44:31] [SERVER]: The time is 9:44
[2020-07-18 09:44:33] [INFECTED]: Infected must be spawned.
[2020-07-18 09:44:33] [npc:join] Infected has joined the server (190:127.0.0.1)
[2020-07-18 09:44:33] [INFECTED]: Connecting player is infected.
[2020-07-18 09:44:41] [INFECTED]: Infected must be spawned.
[2020-07-18 09:44:41] [npc:join] Infected has joined the server (189:127.0.0.1)
[2020-07-18 09:44:42] [INFECTED]: Connecting player is infected.
[2020-07-18 09:44:50] [INFECTED]: Infected must be spawned.
[2020-07-18 09:44:50] [npc:join] Infected has joined the server (188:127.0.0.1)
[2020-07-18 09:44:50] [INFECTED]: Connecting player is infected.
[2020-07-18 09:44:54] [SERVER]: The time is 9:44
[2020-07-18 09:44:58] [INFECTED]: Infected must be spawned.
[2020-07-18 09:44:58] [npc:join] Infected has joined the server (187:127.0.0.1)
[2020-07-18 09:44:58] [INFECTED]: Connecting player is infected.
[2020-07-18 09:45:06] [INFECTED]: Infected must be spawned.
[2020-07-18 09:45:06] [npc:join] Infected has joined the server (186:127.0.0.1)
[2020-07-18 09:45:06] [INFECTED]: Connecting player is infected.
[2020-07-18 09:45:15] [INFECTED]: Infected must be spawned.
[2020-07-18 09:45:15] [npc:join] Infected has joined the server (185:127.0.0.1)
[2020-07-18 09:45:15] [INFECTED]: Connecting player is infected.
[2020-07-18 09:45:16] [SERVER]: The time is 9:45
[2020-07-18 09:45:23] [INFECTED]: Infected must be spawned.
[2020-07-18 09:45:23] [npc:join] Infected has joined the server (184:127.0.0.1)
Thank you
rt-2
I fixed this problem by experimenting with the include orders. Here is what I did:
I moved the include from line 26 down to almost the bottom of my includes.
Note that I could not place it before "fixes.inc", otherwise "fixes.inc" would give me an ALS error.
Thank you,
rt-2
this fix doesn't work for me, did you try another?
| gharchive/issue | 2020-07-18T14:12:47 | 2025-04-01T06:39:16.024614 | {
"authors": [
"razttt",
"rt-2"
],
"repo": "katursis/Pawn.RakNet",
"url": "https://github.com/katursis/Pawn.RakNet/issues/36",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2509542727 | showAlertUpdate method does not show cancel button.
In your example file, the showAlertUpdate's cancel button is not showing.
Am I missing any props?
await AppVersionUpdate.showAlertUpdate(
appVersionResult: result,
context: context,
mandatory: true,
backgroundColor: Colors.grey[200],
title: '새 버전의 앱으로 업데이트가 가능합니다',
titleTextStyle: const TextStyle(
color: Colors.black, fontWeight: FontWeight.w600, fontSize: 24.0),
content: '새로운 업데이트가 발견되었습니다.',
contentTextStyle: const TextStyle(
color: Colors.black,
fontWeight: FontWeight.w400,
),
updateButtonStyle: const ButtonStyle(backgroundColor: MaterialStatePropertyAll(Colors.amber)),
cancelButtonStyle:const ButtonStyle(backgroundColor: MaterialStatePropertyAll(Colors.redAccent)),
updateButtonText: '업데이트',
cancelButtonText: '나중에 하기',
);
When you use mandatory = true, the cancel button does not appear, since the update is mandatory. If you want to offer it as an option (update now or postpone it for later), set the prop mandatory = false.
| gharchive/issue | 2024-09-06T05:22:37 | 2025-04-01T06:39:16.028219 | {
"authors": [
"hogyun3709",
"kauemurakami"
],
"repo": "kauemurakami/app_version_update",
"url": "https://github.com/kauemurakami/app_version_update/issues/40",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
604009534 | Added call to convert posix path to string to fix TypeError in save_pretrained call
Fixes Issue #200
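The underlying pattern is generic: libraries that do string operations on a path can raise a TypeError when handed a pathlib.PosixPath, so cast before the call. A sketch in Python (names hypothetical; the real change sits at fast-bert's save_pretrained call site):

```python
from pathlib import Path

def as_str_path(path):
    """Return a plain str for APIs that concatenate/split paths as strings."""
    return str(path) if isinstance(path, Path) else path

# e.g. model.save_pretrained(as_str_path(output_dir)) instead of the raw Path
converted = as_str_path(Path("models") / "bert")
passthrough = as_str_path("models/bert")
```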
Merged. Thanks.
| gharchive/pull-request | 2020-04-21T13:43:11 | 2025-04-01T06:39:16.029184 | {
"authors": [
"aaronbriel",
"kaushaltrivedi"
],
"repo": "kaushaltrivedi/fast-bert",
"url": "https://github.com/kaushaltrivedi/fast-bert/pull/205",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
2752738033 | Ship a binary version of the plugin
Since midsurfer only depends on ParaView, creating (and shipping) a binary version of this plugin should be possible by using the tools developed by Kitware:
https://github.com/Kitware/paraview-ci-example
Let me know what you think
Dear Mathieu,
Thank you for the suggestion! This is definitely on my list. The paper is currently under review, we will decide on how to proceed once we receive the reviews. I will keep this issue open for now.
Thomas
| gharchive/issue | 2024-12-20T13:23:35 | 2025-04-01T06:39:16.031088 | {
"authors": [
"mwestphal",
"theusst"
],
"repo": "kaust-vislab/MidSurfer",
"url": "https://github.com/kaust-vislab/MidSurfer/issues/2",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
} |
449392464 | FEATURE property to control truncation state
This allows the truncation to be expanded or collapsed via a property.
Contracted:
<truncate :truncated="true">...</truncate>
Expanded:
<truncate :truncated="false">...</truncate>
Bound:
<truncate :truncated="isExpanded">...</truncate>
data() {
isExpanded: true
}
NOTE: I'm unsure whether the property should be named something truthy by default, like expanded, so that true is expanded and false is collapsed
Could someone merge this? I need it haha
Or @Soviut, did you find an alternative solution?
| gharchive/pull-request | 2019-05-28T17:34:22 | 2025-04-01T06:39:16.033537 | {
"authors": [
"CyrilDebon",
"Soviut"
],
"repo": "kavalcante/vue-truncate-collapsed",
"url": "https://github.com/kavalcante/vue-truncate-collapsed/pull/14",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2190672499 | Some feedback...
Hello, I just wanted to leave some feedback on this library.
Context
I'm developing a tool which needs Git functionalities. In particular, but not limited to, it has to clone a huge number of repositories from different sources.
I started by using JGit but it's so bad it's almost ridiculous. Very often, clone operations are super slow and would end with an EOF exception. Of course, that does not happen with the native command.
Switching to SympleGit
Let's say that the README is quite catchy:
However, JGit's API comes with a learning curve and lacks direct, one-to-one support for CLI actions. Therefore, SympleGit is likely to be a more straightforward option for simple Git integration in many Java projects, particularly those utilizing basic Git functionalities. Let's delve into the details!
In my honest opinion, it could not be more false. JGit is super easy to use, at least in my use case: every git command is a class in JGit. In SympleGit, the clone command class does not exist.
The great thing about SympleGit is that it uses the native command to perform operations, which is so much faster and more performant. The bad thing, again in my opinion, is that the API is not so well thought out, and it's lacking.
The idea of creating custom git commands with executeGitCommand(...) surely is good to cover all cases even those that are not implemented yet, but still having them implemented as classes would make everything much easier to use.
It's so confusing to do this:
SympleGit sympleGit = SympleGit.custom()
.setDirectory(repoDirectoryPath)
.build();
Why do I have to give the directory here instead of giving it to the command directly? Like this for JGit:
CloneCommand cmd = new CloneCommand()
.setDirectory(destDir.resolve(path).toFile())
.setURI(url)
.setRemote(remote)
.setBranch(branch);
// In this case, destDir is the base dir in which I want to store all the projects
// path is where I want to clone the repository; the last part of the path will be the name of the directory
I find it a bit more intuitive
3) I read the README multiple times, but I still don't know how I can track the progress of a command. Probably because, yet again, it's not very intuitive. By nature, working with Process and ProcessBuilder in Java is a cumbersome task; processing the output of a process properly is hard. And for this very reason, a library that uses such APIs should make it as easy as possible for the end user to use it.
The first thing that comes to my mind when I want to track the progress of an external process is something like this:
// I get the why of the check...
if (! gitCommander.isResponseOk()) {
System.out.println("An Error Occured: " + gitCommander.getProcessError());
return;
}
// Then...
while (command.output ...) {
// print output
}
JGit is doing a far better job here; I can modify the above command like this:
CloneCommand cmd = new CloneCommand()
.setProgressMonitor(new TextProgressMonitor()) // And boom, I automatically have output to the console
.setDirectory(destDir.resolve(path).toFile())
.setURI(url)
.setRemote(remote)
.setBranch(branch);
// Not only that, I can even make custom monitor implementations
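The streaming idea itself is language-agnostic: spawn the process, read its combined output line by line as it arrives, and hand each line to a callback. A minimal sketch in Python (illustrative only; SympleGit is Java, and the names here are hypothetical):

```python
import subprocess
import sys

def run_with_progress(cmd, on_line):
    """Spawn cmd and feed each output line to on_line as it arrives."""
    proc = subprocess.Popen(
        cmd,
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,  # merge stderr, where git reports progress
        text=True,
    )
    for line in proc.stdout:       # streams; does not wait for process exit
        on_line(line.rstrip("\n"))
    return proc.wait()             # exit code; 0 means success

# usage: a tiny child process standing in for `git clone --progress ...`
lines = []
code = run_with_progress(
    [sys.executable, "-c", "print('step 1'); print('step 2')"],
    lines.append,
)
```

A custom progress monitor is then just a different `on_line` callback, which is essentially what JGit's `ProgressMonitor` abstracts.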
Conclusion
I believe SympleGit could become a very good alternative to JGit, but it definitely needs to grow, improve and expand the API.
In the meantime, I think I'll implement a custom solution that uses the native command, just like SympleGit, because it's simply much, much better than JGit, which crashes all the time.
Hi,
Sorry for the very late reply, and thanks for the comments.
I was a bit dubious about why it's confusing to do this:
SympleGit sympleGit = SympleGit.custom()
.setDirectory(repoDirectoryPath)
.build();
Could you please elaborate a little? I'm not sure of the importance of the issue.
About the progress monitor, I clearly understand the need, but can you tell when it's important to set up a monitor?
Maybe my projects are not big enough; I never had the need...
Of course, we will do it cleanly and extensibly, as you wish.
Best,
N.
| gharchive/issue | 2024-03-17T13:28:32 | 2025-04-01T06:39:16.043818 | {
"authors": [
"ndepomereu",
"palexdev"
],
"repo": "kawansoft/SympleGit-Java",
"url": "https://github.com/kawansoft/SympleGit-Java/issues/1",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
264812738 | Custom attribute storage
Audit Events https://apereo.github.io/cas/5.1.x/installation/Audits.html
Some information is passed in at login; after a successful login, the related information needs to be recorded:
IP information, and the passed-in parameter information.
Resolved with a custom Credential.
| gharchive/issue | 2017-10-12T04:45:28 | 2025-04-01T06:39:16.045678 | {
"authors": [
"kawhii"
],
"repo": "kawhii/sso",
"url": "https://github.com/kawhii/sso/issues/19",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
324271876 | Test enhancement
Change log
Set multiple PHPUnit versions for the different PHP versions.
Add the whitelist filter in the phpunit.xml settings.
Thank you for the pull request! 🙂
Do you mind explaining the reason for these changes? Especially the added PHP version constraint, since it breaks compatibility with PHP 5. I know PHP 5 is reaching its end of life very soon, but I don't see a reason to break compatibility unless it's required.
I also think the phpunit version string should be "^6.2 || ^7.0" rather than "^6.2|^7.0", at least according to the Composer docs.
As you say, I set the required PHP version to at least 7.0 because PHP 5 is EOL.
How about setting >=5.6 in the composer.json require key?
The PHPUnit version constraint "^6.2 || ^7.0" is the correctly defined version string.
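For reference, the outcome of this thread would land in composer.json roughly as follows (a sketch, not necessarily the package's exact file):

```json
{
    "require": {
        "php": ">=7.0"
    },
    "require-dev": {
        "phpunit/phpunit": "^6.2 || ^7.0"
    }
}
```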
| gharchive/pull-request | 2018-05-18T05:07:51 | 2025-04-01T06:39:16.048738 | {
"authors": [
"kayex",
"peter279k"
],
"repo": "kayex/http-codes",
"url": "https://github.com/kayex/http-codes/pull/3",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1620515761 | 🛑 GITHUB is down
In a0a45f3, GITHUB (https://www.githubstatus.com) was down:
HTTP code: 0
Response time: 0 ms
Resolved: GITHUB is back up in 5e72df3.
| gharchive/issue | 2023-03-12T21:18:52 | 2025-04-01T06:39:16.051411 | {
"authors": [
"kayketeixeira"
],
"repo": "kayketeixeira/upptime",
"url": "https://github.com/kayketeixeira/upptime/issues/15",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
173556237 | projection type
The code is using np.random.randn() times the input vector.
In the LSH paper survey, we are using either (Gaussian Distribution * input + bias)/W or (Uniform Distribution * input). I was wondering if we should change the distribution to uniform in the code?
I have seen that Gaussian Random Projection is one way to implement the random projections. What is this LSH paper survey?
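To make the two candidate hash families concrete, here is a toy sketch in plain Python (not LSHash's actual implementation; `w`, `b`, and `W` follow the notation above, and the seed is fixed only for reproducibility):

```python
import math
import random

def gaussian_projection_hash(x, W=4.0, seed=0):
    """p-stable LSH bucket: floor((w . x + b) / W), with w ~ N(0, 1), b ~ U[0, W)."""
    rng = random.Random(seed)
    w = [rng.gauss(0.0, 1.0) for _ in x]
    b = rng.uniform(0.0, W)
    return math.floor((sum(wi * xi for wi, xi in zip(w, x)) + b) / W)

def uniform_projection_hash(x, seed=0):
    """One bit of a random-projection signature: sign(w . x), with w ~ U[-1, 1)."""
    rng = random.Random(seed)
    w = [rng.uniform(-1.0, 1.0) for _ in x]
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) >= 0 else 0

point = [0.5, -1.2, 3.0]
bucket = gaussian_projection_hash(point)
bit = uniform_projection_hash(point)
```

With the same seed, the same point always lands in the same bucket, and nearby points collide with high probability — the property both families are chosen for.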
| gharchive/issue | 2016-08-26T22:33:03 | 2025-04-01T06:39:16.060860 | {
"authors": [
"ShibiHe",
"eggie5"
],
"repo": "kayzhu/LSHash",
"url": "https://github.com/kayzhu/LSHash/issues/15",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1469309694 | Use materialstore-0.13.0-SNAPSHOT or higher
The materialstore-0.13.0-SNAPSHOT resolved the issue
The pom.xml has no
Now most of the external dependencies for the inspectus project can be resolved by Gradle automatically.
done at 0.6.0
| gharchive/issue | 2022-11-30T09:23:04 | 2025-04-01T06:39:16.064261 | {
"authors": [
"kazurayam"
],
"repo": "kazurayam/inspectus",
"url": "https://github.com/kazurayam/inspectus/issues/40",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
318757159 | Add operations for account to CLI
Adds a command to request all operations relating to a given account
to the CLI application.
Fixes #163
Is there a GIF that reflects how this work made you feel?
@correlator @kbacha can you guys take another look at this? Thanks!
@kbacha can you take a look at this refactor? Took me a bit of Rust-y trial and error to figure out the indent level–aware nesting, but in the end, the solution I got was pretty simple.
| gharchive/pull-request | 2018-04-29T22:51:08 | 2025-04-01T06:39:16.066148 | {
"authors": [
"alexanderreiff"
],
"repo": "kbacha/stellar-sdk",
"url": "https://github.com/kbacha/stellar-sdk/pull/201",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2505377078 | add extraPodLabels, hub access label by default
Ensures network access to the hub with the default hub networkPolicy in the JupyterHub Helm chart.
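Assuming the zero-to-jupyterhub label conventions, the change looks roughly like this from the values side (a sketch; verify the label name against your chart version):

```yaml
# values.yaml sketch: user-supplied labels merged onto every launched pod
extraPodLabels:
  team: batch                # hypothetical example

# resulting pod metadata then includes, by default, the z2jh hub-access label:
# labels:
#   hub.jupyter.org/network-access-hub: "true"
```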
Thanks!
| gharchive/pull-request | 2024-09-04T13:26:21 | 2025-04-01T06:39:16.066985 | {
"authors": [
"TomAugspurger",
"minrk"
],
"repo": "kbatch-dev/helm-chart",
"url": "https://github.com/kbatch-dev/helm-chart/pull/6",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1215578937 | Patch 1
this branch for our open mic sessions:
It should contain the schedule and contents of the seminars
@blcham
Does it make sense to have the open mic schedule publicly visible, when the sessions are internal to KBSS? Or is there a plan to make the sessions public?
Does it make sense to have the open mic schedule publicly visible, when the sessions are internal to KBSS? Or is there a plan to make the sessions public?
Yes, this was the idea, see #7. I would like to publish some of the presentations and possibly invite somebody to join our presentation online.
Good thing is to have some static page with programme and link it every time like three days before.
| gharchive/pull-request | 2022-04-26T08:15:06 | 2025-04-01T06:39:16.117525 | {
"authors": [
"MichalMed",
"ahmadjana",
"blcham",
"ledsoft"
],
"repo": "kbss-cvut/kbss-website",
"url": "https://github.com/kbss-cvut/kbss-website/pull/13",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
321606485 | Create Installer Form
Create an installer form to collect settings information, write the config file, test the database, create objects, and initialize the admin. It should also test whether or not it can send mail and write to the disk for file uploads.
this is not going to happen by me
| gharchive/issue | 2018-05-09T14:59:34 | 2025-04-01T06:39:16.120976 | {
"authors": [
"kc9eye"
],
"repo": "kc9eye/UData",
"url": "https://github.com/kc9eye/UData/issues/5",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
2097506867 | [Feature] Tree-sitter grammar
Discussed in https://github.com/kcl-lang/kcl/discussions/595
Originally posted by matoous July 3, 2023
Hi team!
Are there any plans to offer Tree-sitter grammer for KCL? I would be happy to contribute but wonder whether the repository should be maintained under the kcl-lang organization.
Hi there @octonawish-akcodes If you are willing, you can work in this repo. Thank you! ❤️
| gharchive/issue | 2023-07-03T15:38:34 | 2025-04-01T06:39:16.130758 | {
"authors": [
"Peefy"
],
"repo": "kcl-lang/tree-sitter-kcl",
"url": "https://github.com/kcl-lang/tree-sitter-kcl/issues/6",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
1425760800 | bug:
Describe the bug
In e2e-sharded runs we're seeing an error like:
PROXY W1025 18:20:39.100256 31146 reflector.go:324] k8s.io/client-go@v0.0.0-20221025160842-38c73163e766/tools/cache/reflector.go:167: failed to list *v1alpha1.ClusterWorkspace: Unauthorized
PROXY E1025 18:20:39.101457 31146 reflector.go:138] k8s.io/client-go@v0.0.0-20221025160842-38c73163e766/tools/cache/reflector.go:167: Failed to watch *v1alpha1.ClusterWorkspace: failed to list *v1alpha1.ClusterWorkspace: Unauthorized
After debugging locally, it appears to be because we're using the root shard-admin token for all shards when creating the clients used by the clusterworkspace informer
We can see here in some local debug log output that the shard-admin token hitting kcp-1 is actually the token from kcp-0:
$ grep "invalid bearer token" kcp-1.log | head -n1
E1027 14:04:04.549596 929593 authentication.go:63] "Unable to authenticate the request SHDEBUG" err="invalid bearer token" req=&{Method:GET URL:/apis/tenancy.kcp.dev/v1alpha1/clusterworkspaces:5678dc3ffb5f4156924efc5c9a4732f12e73fadca3592faa5c762f91fd0fc3f5?limit=500&resourceVersion=0 Proto:HTTP/2.0 ProtoMajor:2 ProtoMinor:0 Header:map[Accept:[application/json, */*] Accept-Encoding:[gzip] Authorization:[Bearer 4ae09225-6a3b-4413-a5a7-d105b45a4466] User-Agent:[kcp-front-proxy/v1.24.3+kcp (linux/amd64) kubernetes/574fe23]] Body:0xc0071db1a0 GetBody:<nil> ContentLength:0 TransferEncoding:[] Close:false Host:10.19.133.66:6445 Form:map[] PostForm:map[] MultipartForm:<nil> Trailer:map[] RemoteAddr:10.19.133.66:37122 RequestURI:/clusters/%2A/apis/tenancy.kcp.dev/v1alpha1/clusterworkspaces:5678dc3ffb5f4156924efc5c9a4732f12e73fadca3592faa5c762f91fd0fc3f5?limit=500&resourceVersion=0 TLS:0xc00e689080 Cancel:<nil> Response:<nil> ctx:0xc00fe05d70}
$ grep -B2 4ae09225-6a3b-4413-a5a7-d105b45a4466 ../.kcp-0/admin.kubeconfig
- name: shard-admin
user:
token: 4ae09225-6a3b-4413-a5a7-d105b45a4466
Steps To Reproduce
Check e2e-sharded logs, or run make test-e2e-sharded locally
Expected Behaviour
IIUC we need the proxy informer to be able to list/watch clusterworkspace objects on all shards, so we have to use an authentication method which isn't limited to the root shard.
Some ideas were discussed on slack and it seems like switching to cert based auth may be the best option - this should mean we can authenticate with any shard, and avoid modifying the proxy config when a shard gets added/removed.
Additional Context
No response
/assign
sgtm, thanks for picking it up.
So looking at the history here, I see that the current per-shard client code landed in https://github.com/kcp-dev/kcp/pull/1203 - so it would be helpful to get input from @sttts before I start changing it :)
If we switch this to use client certs, then I think we need to handle the case where each shard has a different cert, and the method to enable that appears to be the mapping file, however in the sharded-test-server case, we seem to only create mappings to the root shard:
$ grep "server:" .kcp-0/admin.kubeconfig
server: https://10.19.133.66:6444
server: https://10.19.133.66:6444/clusters/root
$ grep "server:" .kcp-1/admin.kubeconfig
server: https://10.19.133.66:6445
server: https://10.19.133.66:6445/clusters/root
$ cat .kcp-front-proxy/mapping.yaml
- backend: https://localhost:6444
backend_server_ca: .kcp/serving-ca.crt
path: /services/
proxy_client_cert: .kcp-front-proxy/requestheader.crt
proxy_client_key: .kcp-front-proxy/requestheader.key
- backend: https://localhost:6444
backend_server_ca: .kcp/serving-ca.crt
path: /clusters/
proxy_client_cert: .kcp-front-proxy/requestheader.crt
proxy_client_key: .kcp-front-proxy/requestheader.key
So I think for multi-shard deployments we need to specify each shard as a backend (using the IP, not localhost), then we can look up the necessary client cert/key using the shard BaseURL, does that sound reasonable?
If we switch this to use client certs, then I think we need to handle the case where each shard has a different cert,
I think that should be okay, I think that the proxy could specify a well-known ServerName during bootstrapping a TLS connection to a shard. That would give us a certificate that would much a well-known CA. Does it make sense?
If we switch this to use client certs, then I think we need to handle the case where each shard has a different cert,
I think that should be okay, I think that the proxy could specify a well-known ServerName during bootstrapping a TLS connection to a shard. That would give us a certificate that would much a well-known CA. Does it make sense?
Thanks for the feedback @p0lyn0mial - I get that the ServerName has to match the CN/SAN in the server cert, but I'm unclear if the proxy has to do anything special here - the user provides the server/client certs, and the ServerName will be derived from the ClusterWorkspaceShard spec in the indexer, so the hostname (or potentially IP for CI/dev testing) just has to match?
I think leveraging the client certs from the mapping file is definitely viable, and probably optimal if we want to ensure the informer uses per-shard credentials.
I can't help wondering if we should just pass an admin kubeconfig to the proxy though which would be potentially much simpler, any thoughts on that? The main disadvantage AFAICS is that we'll need to rename/replace the --root-kubeconfig CLI flag
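The ServerName idea discussed above can be sketched with Python's ssl module for illustration only (kcp itself is Go; the name "base.kcp.test" is an assumption): the client pins one well-known server name, so every shard only needs a serving certificate for that single name, issued by a CA the client trusts.

```python
import ssl

# Hedged sketch of the TLS mechanism being discussed, not kcp's actual code:
# pin a single well-known ServerName when dialing any shard, so all shards
# can present a certificate for that one name signed by a shared CA.
ctx = ssl.create_default_context()
well_known = "base.kcp.test"  # assumed well-known name
# On dial one would use: ctx.wrap_socket(raw_sock, server_hostname=well_known)
print(ctx.check_hostname)
```

The default context already enforces hostname checking, which is what makes the pinned name meaningful.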
/cc @ncdc who mentioned he has some work in progress related to this
Summary of some discussion with @ncdc and @p0lyn0mial
@ncdc has some related work in progress which we can possibly revive then use the same client-cert pattern to solve this (convert the informer client to use a proxy->shard client cert instead of the shard-admin token)
We can't easily use an admin kubeconfig to switch to client certs, since in the sharded case the client cert terminates at the proxy, then we use request-header at the shard (so, we can't use the same kubeconfig for the proxy->shard auth)
Accidentally marked as fixed by 2297
/reopen
/assign @p0lyn0mial
This should be resolved via https://github.com/kcp-dev/kcp/pull/2382
| gharchive/issue | 2022-10-27T14:45:36 | 2025-04-01T06:39:16.143584 | {
"authors": [
"hardys",
"ncdc",
"p0lyn0mial"
],
"repo": "kcp-dev/kcp",
"url": "https://github.com/kcp-dev/kcp/issues/2274",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
2262005918 | MongoDB - Order Tests
Need to mark tests with @Order() to avoid intermittent test failure.
PR https://github.com/kdhrubo/db2rest/pull/503 is merged. Closing the ticket.
| gharchive/issue | 2024-04-24T19:17:43 | 2025-04-01T06:39:16.163683 | {
"authors": [
"souravroy"
],
"repo": "kdhrubo/db2rest",
"url": "https://github.com/kdhrubo/db2rest/issues/502",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
170828409 | Use cognito authorizer for auth
This one: https://console.aws.amazon.com/apigateway/home?region=us-east-1#/apis/kw5ugqgvx9/custom-authorizers/k4y2fr
Then the login call will need to return the id token instead of the access token
I'd probably wait until CloudFormation natively supports Cognito; right now it would have to be handled via a custom resource
Configuration:
CognitoAuthorizer:
  Type: AWS::ApiGateway::Authorizer
  Properties:
    IdentitySource: method.request.header.Authorization
    Name: CognitoAuthorizer
    ProviderARNs:
      - "arn:aws:cognito-idp:${file(./env.yml):REGION}:${file(./env.yml):ACCOUNT_ID}:userpool/${file(./env.yml):COGNITO_POOL_ID}"
    Type: TOKEN
    RestApiId:
Outputs:
  AuthorizerId:
    Description: "Authorizer Id"
    Value:
      "Ref": "CognitoAuthorizer"
Direct support in Serverless is needed. It can't be done in CF, because at the time of its execution the API Gateway doesn't exist yet
Viz. https://github.com/SC5/serverless-plugin-cfauthorizer
| gharchive/issue | 2016-08-12T09:03:57 | 2025-04-01T06:39:16.220419 | {
"authors": [
"JakubMatejka"
],
"repo": "keboola/developer-portal",
"url": "https://github.com/keboola/developer-portal/issues/21",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
221067435 | cleanup authentication methods
bearer and url.query are useless:
https://github.com/keboola/generic-extractor/blob/master/src/Keboola/GenericExtractor/Config/Api.php#L125-
https://github.com/keboola/generic-extractor/blob/master/src/Keboola/GenericExtractor/Config/Api.php#L129
@odinuv but url.query is used, isn't it?
https://github.com/keboola/kbc-ui-templates/search?utf8=✓&q=url.query&type=
that's possible, but in any case there are two identical methods and one of them is redundant
bearer removed and url.query marked as deprecated in 9edb5ef34b68a1f93f0716cc9c5ddb8374377865
| gharchive/issue | 2017-04-11T20:14:35 | 2025-04-01T06:39:16.223596 | {
"authors": [
"odinuv",
"ujovlado"
],
"repo": "keboola/generic-extractor",
"url": "https://github.com/keboola/generic-extractor/issues/42",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
368540274 | Add support for state file
Comes up from https://github.com/keboola/db-extractor-mysql/pull/81#discussion_r223825923
Would be nice to have a getter for the state file. Besides, there could also be a method for writing into the output state.
There was already an issue for that: https://github.com/keboola/php-component/issues/46 ;)
| gharchive/issue | 2018-10-10T08:04:32 | 2025-04-01T06:39:16.227083 | {
"authors": [
"Actimel",
"tomasfejfar"
],
"repo": "keboola/php-component",
"url": "https://github.com/keboola/php-component/issues/47",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
750987183 | Update composer.json for latest sapi client
Required for https://github.com/keboola/input-mapping/pull/68
Pending sapi-client release https://github.com/keboola/storage-api-php-client/releases/tag/untagged-3d86260daac18b393369
@odinuv what should we call this, 0.0.3?
hmm, test failed on "branch not found". is there a possible concurrency issue?
they don't run in parallel, it's quite suspicous
feel free to make it 1.0.0 :) it's already in production
| gharchive/pull-request | 2020-11-25T17:07:51 | 2025-04-01T06:39:16.229555 | {
"authors": [
"odinuv",
"pivnicek"
],
"repo": "keboola/storage-api-php-client-branch-wrapper",
"url": "https://github.com/keboola/storage-api-php-client-branch-wrapper/pull/5",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2453042770 | create ScaledObject when triggers type is prometheus metricType is Value err
Report
When the trigger type is prometheus and metricType is Value, creating a ScaledObject causes keda-operator to error:
2024-08-07T09:33:45Z ERROR Reconciler error {"controller": "scaledobject", "controllerGroup": "keda.sh", "controllerKind": "ScaledObject", "scaledObject": {"name":"my-helm-demo","namespace":"crane-system"}, "namespace": "crane-system", "name": "my-helm-demo", "reconcileID": "f750f79f-5a1b-4fc9-88d2-b136a11b65ff", "error": "HorizontalPodAutoscaler.autoscaling \"keda-hpa-my-helm-demo\" is invalid: spec.metrics[0].external.target.type: Invalid value: \"value\": must be either Utilization, Value, or AverageValue"}
apiVersion: keda.sh/v1alpha1
kind: ScaledObject
metadata:
  name: my-helm-demo
  namespace: crane-system
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: my-helm-demo
  pollingInterval: 15
  minReplicaCount: 2
  maxReplicaCount: 50
  advanced:
    horizontalPodAutoscalerConfig:
      behavior:
        scaleDown:
          stabilizationWindowSeconds: 0
          policies:
            - type: Percent
              value: 100
              periodSeconds: 15
        scaleUp:
          stabilizationWindowSeconds: 0
          policies:
            - type: Percent
              value: 500
              periodSeconds: 15
  triggers:
    - type: prometheus
      metricType: Value
      metadata:
        serverAddress: xxx
        metricName: portrait_pod_cpu_predict_compression_all_num
        query: portrait_pod_cpu_predict_compression_all_num{}
        threshold: "50"
Expected Behavior
hpa resources can be created normally
Actual Behavior
hpa is not created
Name: my-helm-demo
Namespace: crane-system
Labels: scaledobject.keda.sh/name=my-helm-demo
Annotations: <none>
API Version: keda.sh/v1alpha1
Kind: ScaledObject
Metadata:
Creation Timestamp: 2024-08-07T09:33:24Z
Finalizers:
finalizer.keda.sh
Generation: 1
Resource Version: 24908
UID: 0d11f0d0-216e-4696-be18-51a49cad0bd3
Spec:
Advanced:
Horizontal Pod Autoscaler Config:
Behavior:
Scale Down:
Policies:
Period Seconds: 15
Type: Percent
Value: 100
Stabilization Window Seconds: 0
Scale Up:
Policies:
Period Seconds: 15
Type: Percent
Value: 500
Stabilization Window Seconds: 0
Max Replica Count: 50
Min Replica Count: 2
Polling Interval: 15
Scale Target Ref:
API Version: apps/v1
Kind: Deployment
Name: my-helm-demo
Triggers:
Metadata:
Metric Name: portrait_pod_cpu_predict_compression_all_num
Query: portrait_pod_cpu_predict_compression_all_num{}
Server Address: xxxx
Threshold: 50
Metric Type: Value
Type: prometheus
Status:
Conditions:
Message: Failed to ensure HPA is correctly created for ScaledObject
Reason: ScaledObjectCheckFailed
Status: False
Type: Ready
Message: ScaledObject check failed
Reason: UnkownState
Status: Unknown
Type: Active
Status: Unknown
Type: Fallback
External Metric Names:
s0-prometheus-portrait_pod_cpu_predict_compression_all_num
Original Replica Count: 8
Scale Target GVKR:
Group: apps
Kind: Deployment
Resource: deployments
Version: v1
Scale Target Kind: apps/v1.Deployment
Events:
Type Reason Age From Message
---- ------ ---- ---- -------
Warning ScaledObjectCheckFailed 59s (x17 over 6m27s) keda-operator Failed to ensure HPA is correctly created for ScaledObject
Steps to Reproduce the Problem
Logs from KEDA operator
2024-08-07T09:33:45Z ERROR Reconciler error {"controller": "scaledobject", "controllerGroup": "keda.sh", "controllerKind": "ScaledObject", "scaledObject": {"name":"my-helm-demo","namespace":"crane-system"}, "namespace": "crane-system", "name": "my-helm-demo", "reconcileID": "f750f79f-5a1b-4fc9-88d2-b136a11b65ff", "error": "HorizontalPodAutoscaler.autoscaling \"keda-hpa-my-helm-demo\" is invalid: spec.metrics[0].external.target.type: Invalid value: \"value\": must be either Utilization, Value, or AverageValue"}
KEDA Version
< 2.11.0
Kubernetes Version
< 1.28
Platform
None
Scaler Details
prometheus
Anything else?
$ kubectl version
Client Version: v1.29.2
Kustomize Version: v5.0.4-0.20230601165947-6ce0bf390ce3
Server Version: v1.21.1
WARNING: version difference between client (1.29) and server (1.21) exceeds the supported minor version skew of +/-1
kubectl apply --server-side -f https://github.com/kedacore/keda/releases/download/v2.8.0/keda-2.8.0.yaml
Hello,
KEDA v2.8 is quite old (almost 2 years). I don't remember any issue related but just in case, have you tried using v2.8.2 -> https://github.com/kedacore/keda/releases/tag/v2.8.2?
Sadly, that version is out of support, and we won't ship any new releases for it. I know that recent versions don't face that issue, so I'd suggest upgrading to a recent version.
Hello, KEDA v2.8 is quite old (almost 2 years). I don't remember any issue related but just in case, have you tried using v2.8.2 -> https://github.com/kedacore/keda/releases/tag/v2.8.2?
Sadly, that version is out of support, and we won't ship any new releases for it. I know that recent versions don't face that issue, so I'd suggest upgrading to a recent version.
The problem was indeed solved by switching to v2.8.2, but isn’t this the recommended version officially provided by keda?
https://keda.sh/docs/2.15/operate/cluster/#kubernetes-compatibility
The problem was indeed solved by switching to v2.8.2
Nice! Happy to read it 😄
I guess that you are using old k8s cluster, so probably v2.8 is the greatest version that you can use (if you are running k8s < 1.23 KEDA > 2.9 won't work). The point is that we don't ship fixes for old versions, we could ship a fix for the previous version but 2.8 is 7 versions far from v2.15 and although we will try to support with misconfigurations, if there is a real bug unresolved, we won't ship any fix for old versions.
The problem was indeed solved by switching to v2.8.2
Nice! Happy to read it 😄
I guess that you are using old k8s cluster, so probably v2.8 is the greatest version that you can use (if you are running k8s < 1.23 KEDA > 2.9 won't work). The point is that we don't ship fixes for old versions, we could ship a fix for the previous version but 2.8 is 7 versions far from v2.15 and although we will try to support with misconfigurations, if there is a real bug unresolved, we won't ship any fix for old versions.
@JorTurFer
OK, I understand what you mean. There is another situation for you: we found that the latest version of KEDA is incompatible with a v1.21 cluster and panics. The reason is that the v2 version of the HPA API cannot be detected.
I also encountered the same problem, which caused the feature not to work properly.
Yes, I said, KEDA >= v2.9 requires k8s >= 1.23. This is because k8s introduced a breaking change when they released HPA v2 and removed HPA v1beta1. v2.8 uses v1beta1 and v2.9 uses v2. There isn't any fix or action to do about this, v2.8 uses v1beta1 and v2.9 uses v2.
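The compatibility rule above can be sketched as a tiny helper — a hedged illustration of the rule stated in this thread (KEDA >= 2.9 needs Kubernetes >= 1.23 because of the HPA v1beta1 → v2 switch), not an official support matrix:

```python
# Hedged sketch of the rule from this thread, not an official matrix:
# KEDA >= 2.9 uses HPA autoscaling/v2, which needs Kubernetes >= 1.23;
# older clusters must stay on the 2.8.x line (HPA v1beta1).
def keda_series(k8s_minor: int) -> str:
    if k8s_minor >= 23:
        return "KEDA >= 2.9 (HPA v2)"
    return "KEDA 2.8.x (HPA v1beta1)"

print(keda_series(21))  # the cluster from this issue is v1.21
```

For the v1.21 cluster in this issue, the helper lands on the 2.8.x line.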
@JorTurFer
OK, thank you for your answer and wish you a happy life
@JorTurFer Excuse me, when I only configure cron, will the replicas of the workload become minReplicaCount if it is not in the cron period?
@JorTurFer Excuse me, when I only configure cron, will the replicas of the workload become minReplicaCount if it is not in the cron period?
@JorTurFer The test is like this. When I manually modify replicas during non-cron periods, it will eventually become minReplicaCount, which is not very friendly to the business. Based on version 1.8.2, is there any way to allow the business to still have the right to make decisions in this situation?
Excuse me, when I only configure cron, will the replicas of the workload become minReplicaCount if it is not in the cron period?
Yes, outside the cron period and assuming the cron as the only scaler, it'll return 0 so it'll scale to minReplicaCount
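For reference, a minimal cron-only trigger sketch (the schedule values and timezone are assumptions, not from this issue); outside the start/end window the scaler contributes 0 and the HPA falls back to minReplicaCount:

```yaml
triggers:
  - type: cron
    metadata:
      timezone: Etc/UTC
      start: 0 8 * * *        # assumed window start
      end: 0 20 * * *         # assumed window end
      desiredReplicas: "10"   # replicas held during the window
```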
@JorTurFer May I ask: when I only configure cron, will the replicas of the workload become minReplicaCount if it is not in the cron period?
@JorTurFer The test went like this: when I manually modify replicas outside the cron period, they eventually become minReplicaCount, which is not very friendly to the business. Based on version 1.8.2, is there any way to let the business still have decision-making power in this situation?
I also encountered the same problem. How do I solve it?
| gharchive/issue | 2024-08-07T09:41:52 | 2025-04-01T06:39:16.246661 | {
"authors": [
"JorTurFer",
"LY-today",
"enaguo",
"guanqinglin"
],
"repo": "kedacore/keda",
"url": "https://github.com/kedacore/keda/issues/6045",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
95511708 | Bower package is empty
I don't know why, but bower install keen-dashboards results in a license, a README, and nothing else.
Which is a shame, really; you could easily install the necessary CSS and whathaveyou by specifying it as such in the main stanza, which would mean folks like me who use Yeoman could use your layouts far more easily.
Might write a PR.
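A hedged sketch of what such a main stanza could look like (the file path is an assumption; the repo's actual built asset names may differ):

```json
{
  "name": "keen-dashboards",
  "main": [
    "dist/keen-dashboards.css"
  ]
}
```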
Thanks for writing this up @aendrew. Admittedly, we haven't spent much time on more involved ways of distributing Keen Dashboards (bower, yeoman, etc.). That would be super cool if you wrote up a PR!
@aendrew just wanted to echo @josephwegner's encouragement toward the PR. We really appreciate your thoughts and any potential contribution!
I don't know anything about bower, but I just came across this SO topic and noticed that there is only a bower.json file in the master branch of this project.
@standaniels Ah! That would make sense why there's nothing when pulling from bower. If the gh-pages branch was merged with master, that would probably resolve this.
Resolved- thanks, everyone!
| gharchive/issue | 2015-07-16T19:43:41 | 2025-04-01T06:39:16.254467 | {
"authors": [
"aendrew",
"dustinlarimer",
"josephwegner",
"standaniels",
"timfalls"
],
"repo": "keen/dashboards",
"url": "https://github.com/keen/dashboards/issues/79",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
216516094 | Can't use --config to define login file location
I use release 0.10 for windows 64.
I try to log in using the --config flag, but I always get error 1 in return.
I created a .ecli.json somewhere as described in the login standard output, but the JSON format in that message is not exact: a " is missing after password, and there is an additional comma after url. But fixing the format didn't allow me to connect.
I type : ecli login platform --config "D:\myfolder.ecli.json"
What should be --config argument syntax, are windows paths accepted ? is it folder of file path ?
Looks like your JSON config file is malformed maybe. It must look like
{"platform": {"login": "yourlogin", "password": "yourpassword", "url": "https://yoururl/api/v2"}}
Just create it in your local directory and call it by filename:
ecli login platform --config myfile.json
You can remove the dot . at the beginning on the file name and use any name like myfile.json.
It's working with this syntax, thx for your support.
| gharchive/issue | 2017-03-23T17:47:06 | 2025-04-01T06:39:16.258400 | {
"authors": [
"amounierlltech",
"matm"
],
"repo": "keeneyetech/ecli",
"url": "https://github.com/keeneyetech/ecli/issues/42",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
} |
770910665 | Liquidity rewards fetching data
Ref: #2216
This PR adds support for fetching and displaying the rewards data from the LPRewards contracts. All data are stored in the redux store.
@michalsmiarowski I'm ready for the second round!
| gharchive/pull-request | 2020-12-18T14:04:38 | 2025-04-01T06:39:16.259592 | {
"authors": [
"r-czajkowski"
],
"repo": "keep-network/keep-core",
"url": "https://github.com/keep-network/keep-core/pull/2224",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
952895281 | Stabilize failing test
The test checking if npm-version-bump action "updates version for
already published environment that don't match initial preid" was
unstable - was refering to actively worked on npm packages versions,
making it necessary to update the action every time new package with
that base version is released on ropsten.
In this commit we change the reference to different version, which is no
longer actively worked on.
@nkuba, I wasn't sure what to pick as a base branch (main or v2) for this PR. I see that main is a couple of commits behind v2 and there's no open PR for merging v2 to main.
Should I keep the base branch as v2 and after the merge of this PR should I tag the v2 branch code with v2 tag?
| gharchive/pull-request | 2021-07-26T13:17:51 | 2025-04-01T06:39:16.262899 | {
"authors": [
"michalinacienciala"
],
"repo": "keep-network/npm-version-bump",
"url": "https://github.com/keep-network/npm-version-bump/pull/2",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
654934909 | tbtc dapp - step 2 console log - RPC Error
DevTools failed to load SourceMap: Could not load content for chrome-extension://fleenceagaplaefnklabikkmocalkcpo/content-script.js.map: HTTP error: status code 404, net::ERR_UNKNOWN_URL_SCHEME
DevTools failed to load SourceMap: Could not load content for chrome-extension://nkbihfbeogaeaoehlefnkodbefgpgknn/sourcemaps/contentscript.js.map: HTTP error: status code 404, net::ERR_UNKNOWN_URL_SCHEME
DevTools failed to load SourceMap: Could not load content for chrome-extension://nkbihfbeogaeaoehlefnkodbefgpgknn/sourcemaps/inpage.js.map: HTTP error: status code 404, net::ERR_UNKNOWN_URL_SCHEME
inpage.js:1 Uncaught TypeError: e is not a function
at inpage.js:1
at inpage.js:1
at i (inpage.js:1)
at inpage.js:1
at inpage.js:1
at c (inpage.js:1)
at inpage.js:1
at We (inpage.js:1)
at Object. (inpage.js:1)
at e.exports._runReturnHandlersUp (inpage.js:1)
(anonymous) @ inpage.js:1
(anonymous) @ inpage.js:1
i @ inpage.js:1
(anonymous) @ inpage.js:1
(anonymous) @ inpage.js:1
c @ inpage.js:1
(anonymous) @ inpage.js:1
We @ inpage.js:1
(anonymous) @ inpage.js:1
_runReturnHandlersUp @ inpage.js:1
(anonymous) @ inpage.js:1
n @ inpage.js:1
i @ inpage.js:1
(anonymous) @ inpage.js:1
(anonymous) @ inpage.js:1
n @ inpage.js:1
i @ inpage.js:1
(anonymous) @ inpage.js:1
(anonymous) @ inpage.js:1
(anonymous) @ inpage.js:1
(anonymous) @ inpage.js:1
c @ inpage.js:1
u @ inpage.js:1
(anonymous) @ inpage.js:1
(anonymous) @ inpage.js:1
a @ inpage.js:1
setTimeout (async)
(anonymous) @ inpage.js:1
write @ inpage.js:1
b @ inpage.js:1
(anonymous) @ inpage.js:1
v.write @ inpage.js:1
y @ inpage.js:1
h @ inpage.js:1
s.emit @ inpage.js:1
_ @ inpage.js:1
w @ inpage.js:1
b.push @ inpage.js:1
_write @ inpage.js:1
b @ inpage.js:1
(anonymous) @ inpage.js:1
v.write @ inpage.js:1
y @ inpage.js:1
h @ inpage.js:1
s.emit @ inpage.js:1
_ @ inpage.js:1
w @ inpage.js:1
b.push @ inpage.js:1
o._onMessage @ inpage.js:1
postMessage (async)
o._write @ contentscript.js:1
b @ contentscript.js:1
(anonymous) @ contentscript.js:1
v.write @ contentscript.js:1
y @ contentscript.js:1
h @ contentscript.js:1
s.emit @ contentscript.js:1
_ @ contentscript.js:1
w @ contentscript.js:1
b.push @ contentscript.js:1
_write @ contentscript.js:1
b @ contentscript.js:1
(anonymous) @ contentscript.js:1
v.write @ contentscript.js:1
y @ contentscript.js:1
h @ contentscript.js:1
s.emit @ contentscript.js:1
_ @ contentscript.js:1
w @ contentscript.js:1
b.push @ contentscript.js:1
_write @ contentscript.js:1
b @ contentscript.js:1
(anonymous) @ contentscript.js:1
v.write @ contentscript.js:1
y @ contentscript.js:1
h @ contentscript.js:1
s.emit @ contentscript.js:1
_ @ contentscript.js:1
w @ contentscript.js:1
b.push @ contentscript.js:1
o._onMessage @ contentscript.js:1
web3.js:49 netId: 3
chainId: 3
inpage.js:1 MetaMask - RPC Error: The method undefined does not exist/is not available {code: -32601, message: "The method undefined does not exist/is not available"}
at We (inpage.js:1)
at Object. (inpage.js:1)
at e.exports._runReturnHandlersUp (inpage.js:1)
(anonymous) @ inpage.js:1
(anonymous) @ inpage.js:1
i @ inpage.js:1
(anonymous) @ inpage.js:1
(anonymous) @ inpage.js:1
c @ inpage.js:1
(anonymous) @ inpage.js:1
We @ inpage.js:1
(anonymous) @ inpage.js:1
_runReturnHandlersUp @ inpage.js:1
(anonymous) @ inpage.js:1
n @ inpage.js:1
i @ inpage.js:1
(anonymous) @ inpage.js:1
(anonymous) @ inpage.js:1
n @ inpage.js:1
i @ inpage.js:1
(anonymous) @ inpage.js:1
(anonymous) @ inpage.js:1
(anonymous) @ inpage.js:1
(anonymous) @ inpage.js:1
c @ inpage.js:1
u @ inpage.js:1
(anonymous) @ inpage.js:1
(anonymous) @ inpage.js:1
a @ inpage.js:1
setTimeout (async)
(anonymous) @ inpage.js:1
write @ inpage.js:1
b @ inpage.js:1
(anonymous) @ inpage.js:1
v.write @ inpage.js:1
y @ inpage.js:1
h @ inpage.js:1
s.emit @ inpage.js:1
_ @ inpage.js:1
w @ inpage.js:1
b.push @ inpage.js:1
_write @ inpage.js:1
b @ inpage.js:1
(anonymous) @ inpage.js:1
v.write @ inpage.js:1
y @ inpage.js:1
h @ inpage.js:1
s.emit @ inpage.js:1
_ @ inpage.js:1
w @ inpage.js:1
b.push @ inpage.js:1
o._onMessage @ inpage.js:1
postMessage (async)
o._write @ contentscript.js:1
b @ contentscript.js:1
(anonymous) @ contentscript.js:1
v.write @ contentscript.js:1
y @ contentscript.js:1
h @ contentscript.js:1
s.emit @ contentscript.js:1
_ @ contentscript.js:1
w @ contentscript.js:1
b.push @ contentscript.js:1
_write @ contentscript.js:1
b @ contentscript.js:1
(anonymous) @ contentscript.js:1
v.write @ contentscript.js:1
y @ contentscript.js:1
h @ contentscript.js:1
s.emit @ contentscript.js:1
_ @ contentscript.js:1
w @ contentscript.js:1
b.push @ contentscript.js:1
_write @ contentscript.js:1
b @ contentscript.js:1
(anonymous) @ contentscript.js:1
v.write @ contentscript.js:1
y @ contentscript.js:1
h @ contentscript.js:1
s.emit @ contentscript.js:1
_ @ contentscript.js:1
w @ contentscript.js:1
b.push @ contentscript.js:1
o._onMessage @ contentscript.js:1
Deposit.js:446 Creating new deposit contract with lot size 1000000 satoshis...
Deposit.js:454 Looking up new deposit with address 0x7564447CaA2155F23a7352565C09793fE90b72eD backed by keep at address 0x0dD61236117c57e8Ad96E1501Fa0f3689ab82ed9...
Deposit.js:1593 Monitoring deposit 0x7564447CaA2155F23a7352565C09793fE90b72eD for transition to ACTIVE.
Deposit.js:1534 Waiting for deposit 0x7564447CaA2155F23a7352565C09793fE90b72eD keep public key...
this is on Windows 10, Version 83.0.4103.116
This appears to have been an issue with the formation of the signer group. We're capturing the related work in #281. Thanks for the report!
| gharchive/issue | 2020-07-10T17:42:24 | 2025-04-01T06:39:16.336974 | {
"authors": [
"Shadowfiend",
"chdru"
],
"repo": "keep-network/tbtc-dapp",
"url": "https://github.com/keep-network/tbtc-dapp/issues/228",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2731986306 | feat: Centralize RiscZero constants definition by creating Rust bindings for py/ts.
Currently, the same constants (CONTROL_ROOT & CONTROL_ID) are defined in multiple locations:
In python : https://github.com/keep-starknet-strange/garaga/blob/073843663fd253ad33a697a6aae285704df45b21/hydra/garaga/starknet/groth16_contract_generator/parsing_utils.py#L20-L24
In rust : https://github.com/keep-starknet-strange/garaga/blob/073843663fd253ad33a697a6aae285704df45b21/tools/garaga_rs/src/calldata/full_proof_with_hints/groth16.rs#L297-L311
In typescript : https://github.com/keep-starknet-strange/garaga/blob/073843663fd253ad33a697a6aae285704df45b21/tools/npm/garaga_ts/src/node/starknet/groth16ContractGenerator/parsingUtils.ts#L7-L9
Keep them in Rust only and create Python / wasm bindings so that updating them is easier.
Note: the tools/garaga_rs package already has multiple examples of Python / wasm bindings.
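For illustration only, a minimal sketch of the single-source-of-truth idea: the constants live once in Rust behind a small accessor that both the pyo3 and wasm-bindgen wrappers would call. All names and values below are placeholders, not the real RISC Zero control root / control id, and the binding attributes are shown only as comments.

```rust
// Hypothetical sketch: constants defined once, exposed through one accessor
// that Python (pyo3) and wasm (wasm-bindgen) wrappers can both re-export.
// The values are placeholders, not the real RISC Zero constants.

pub const RISC0_CONTROL_ROOT: &str = "0x1111";
pub const RISC0_CONTROL_ID: &str = "0x2222";

/// Single accessor both bindings call, so an update happens in one place.
pub fn risc0_constants() -> (&'static str, &'static str) {
    (RISC0_CONTROL_ROOT, RISC0_CONTROL_ID)
}

// The pyo3 module would then add something like:
//   #[pyfunction] fn get_risc0_constants() -> (String, String) { ... }
// and the wasm-bindgen module something like:
//   #[wasm_bindgen] pub fn get_risc0_constants() -> JsValue { ... }

fn main() {
    let (root, id) = risc0_constants();
    println!("control root = {root}, control id = {id}");
}
```

With this shape, the Python and TypeScript copies of the constants become thin re-exports instead of duplicated definitions.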
Hello, I would love to work on this. I'm really excited about Garaga with the latest Noir compatibility update and would be honored to be a part of it.
I have just completed implementing a L1 gas prices sampling oracle to Katana in Rust so I believe to be able to tackle this issue. I'll make sure to communicate if I encounter any problems.
@feltroidprime Thank you, on it!
| gharchive/issue | 2024-12-11T06:53:18 | 2025-04-01T06:39:16.342303 | {
"authors": [
"augustin-v",
"feltroidprime"
],
"repo": "keep-starknet-strange/garaga",
"url": "https://github.com/keep-starknet-strange/garaga/issues/270",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1755228327 | dev: improve testing of to_invoke_tx
Improve the testing of the rpc-core/src/utils.rs file by adding unit tests for to_invoke_tx. The tests are expected to cover edge cases in order to be accepted.
@greged93 I would love to take this!
@greged93 I would love to take this!
Assigned to you
@MdTeach need to wait on #597 to be merged
Hey @MdTeach, #597 was merged and moved the logic to the file primitives/starknet/src/transaction/types.rs, and the implementation changed to impl TryFrom<BroadcastedInvokeTransaction> for InvokeTransaction. It would be great if you are still able to provide tests for this.
sure @greged93
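As a toy illustration of what such edge-case tests could cover, here is a simplified TryFrom conversion with a rejection path. The types, field names, and error variant below are hypothetical stand-ins, not madara's actual API.

```rust
// Simplified stand-ins for the real madara types, just to show the shape
// of edge-case testing for a TryFrom conversion. Not the real API.

#[derive(Debug, PartialEq)]
struct BroadcastedInvokeTx {
    calldata: Vec<u64>,
    max_fee: u64,
}

#[derive(Debug, PartialEq)]
struct InvokeTx {
    calldata: Vec<u64>,
    max_fee: u64,
}

#[derive(Debug, PartialEq)]
enum ConversionError {
    EmptyCalldata,
}

impl TryFrom<BroadcastedInvokeTx> for InvokeTx {
    type Error = ConversionError;

    fn try_from(tx: BroadcastedInvokeTx) -> Result<Self, Self::Error> {
        // One edge case worth a dedicated test: reject empty calldata.
        if tx.calldata.is_empty() {
            return Err(ConversionError::EmptyCalldata);
        }
        Ok(InvokeTx { calldata: tx.calldata, max_fee: tx.max_fee })
    }
}

fn main() {
    let ok = InvokeTx::try_from(BroadcastedInvokeTx { calldata: vec![1], max_fee: 0 });
    println!("{ok:?}");
}
```

Tests for the real conversion would assert both the happy path and each rejection branch in the same way.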
| gharchive/issue | 2023-06-13T16:02:41 | 2025-04-01T06:39:16.344986 | {
"authors": [
"MdTeach",
"abdelhamidbakhta",
"greged93"
],
"repo": "keep-starknet-strange/madara",
"url": "https://github.com/keep-starknet-strange/madara/issues/626",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1767970229 | dev: use Felt252Wrapper instead of [u8; 32]
Pull Request type
Please add the labels corresponding to the type of changes your PR introduces:
Refactoring (no functional changes, no API changes)
What is the current behavior?
Resolves: #716
Does this introduce a breaking change?
Not sure if this counts as a breaking change.
Other information
(OT: we should add clippy to husky hooks as well)
| gharchive/pull-request | 2023-06-21T16:23:39 | 2025-04-01T06:39:16.347869 | {
"authors": [
"lambda-0x"
],
"repo": "keep-starknet-strange/madara",
"url": "https://github.com/keep-starknet-strange/madara/pull/721",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2519390257 | [CONTRACT] setup liquidity addition
Complete this issue according to this article: How to manage liquidity in a P2P ramping app
Please add integration tests
@0xChqrles i
Hi @0xChqrles, can I work on this?
Hi @0xChqrles Can i work on this issue if it's available?
Hey @0xChqrles I would like to work on this so I can work on #67 which is a dependant.
@0xChqrles, can I work on this?
@0xChqrles I am available to work on this. With previous experience working on this codebase, and having read the article, I can implement this within 3 days.
ETA: 72hrs
@0xChqrles I am available to work on this
| gharchive/issue | 2024-09-11T10:48:40 | 2025-04-01T06:39:16.351213 | {
"authors": [
"0xChqrles",
"ikemHood",
"manlikeHB",
"mubarak23",
"ugur-eren"
],
"repo": "keep-starknet-strange/zkramp",
"url": "https://github.com/keep-starknet-strange/zkramp/issues/65",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
446380147 | Reproduce DQN paper results for Atari
Reproduce the following paper's results (at least 1 game, because @keiohta does not have enough computation resources)
Human-level control through deep reinforcement learning
Preprocessing
Working directly with raw Atari 2600 frames, which are 210 x 160 pixel images with a 128-colour palette, can be demanding in terms of computation and memory requirements.
We apply a basic preprocessing step aimed at reducing the input dimensionality and dealing with some artefacts of the Atari 2600 emulator.
First, to encode a single frame we take the maximum value for each pixel colour value over the frame being encoded and the previous frame. This was necessary to remove flickering that is present in games where some objects appear only in even frames while other objects appear only in odd frames, an artefact caused by the limited number of sprites Atari 2600 can display at once.
Second, we then extract the Y channel, also known as luminance, from the RGB frame and rescale it to 84 x 84. The function φ from algorithm 1 described below applies this preprocessing to the m most recent frames and stacks them to produce the input to the Q-function, in which m = 4, although the algorithm is robust to different values of m (for example, 3 or 5).
| gharchive/issue | 2019-05-21T01:50:27 | 2025-04-01T06:39:16.394902 | {
"authors": [
"keiohta"
],
"repo": "keiohta/tf2rl",
"url": "https://github.com/keiohta/tf2rl/issues/21",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1790015887 | Statistics/metrics ought to be gathered at Engine level.
Right now, sources::access::opensky has some stats gathering code. This should be expanded into an Engine-wide system.
tracing might be a good crate to use.
More likely as a separate Actor within fetiched.
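A minimal std-only sketch of the "separate Actor" idea: sources send metric events over a channel, and a dedicated thread aggregates them into counters. The event names below (records fetched, error tags) are illustrative, not fetiched's actual API.

```rust
use std::collections::HashMap;
use std::sync::mpsc;
use std::thread;

enum StatEvent {
    Fetched(u64),        // number of records a source fetched
    Error(&'static str), // error tag from a source
}

/// Spawn a stats actor, feed it events, and collect the final counters.
fn run_stats(events: Vec<StatEvent>) -> HashMap<String, u64> {
    let (tx, rx) = mpsc::channel();
    let actor = thread::spawn(move || {
        let mut counters: HashMap<String, u64> = HashMap::new();
        // The loop ends when every sender has been dropped.
        for ev in rx {
            match ev {
                StatEvent::Fetched(n) => {
                    *counters.entry("fetched".into()).or_insert(0) += n
                }
                StatEvent::Error(tag) => {
                    *counters.entry(format!("error.{tag}")).or_insert(0) += 1
                }
            }
        }
        counters
    });
    for ev in events {
        tx.send(ev).unwrap();
    }
    drop(tx); // closing the channel lets the actor finish
    actor.join().unwrap()
}

fn main() {
    let stats = run_stats(vec![
        StatEvent::Fetched(10),
        StatEvent::Fetched(5),
        StatEvent::Error("timeout"),
    ]);
    println!("{stats:?}");
}
```

A real version would hand out clones of the sender to each source actor and periodically snapshot or export the counters; the `tracing` crate could replace the hand-rolled events.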
| gharchive/issue | 2023-07-05T17:48:18 | 2025-04-01T06:39:16.425009 | {
"authors": [
"keltia"
],
"repo": "keltia/fetiche-rs",
"url": "https://github.com/keltia/fetiche-rs/issues/20",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
322589437 | failed to create the hash value of large files
https://github.com/kemokemo/gckdir/blob/1fb158b19ff1160d2a23244216b076f4c52e3ae1/lib/hash.go#L96-L103
The ioutil.ReadFile() is easy to use, but it reads the whole file into memory.
If you have any large files (e.g. several gigabytes), the function that creates the hash value will be very slow.
In some cases, the gckdir application crashes. 😢
Let's fix it!
The official godoc of the "crypto/sha256" package is most valuable.
https://godoc.org/crypto/sha256#ex-New--File
| gharchive/issue | 2018-05-13T12:23:23 | 2025-04-01T06:39:16.433226 | {
"authors": [
"kemokemo"
],
"repo": "kemokemo/gckdir",
"url": "https://github.com/kemokemo/gckdir/issues/18",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2560016431 | Unsound issue in SvmVec
Hi, thanks for taking the time to read this issue. Our static analysis tool found there might be an unsound issue in your set_len implementation for the buffer:
https://github.com/kenba/opencl3/blob/1d7d74da2ff76c93c7418dd016386a8b907e2bd3/src/svm.rs#L300-L311
As mentioned in the comments, this may introduce uninitialized memory, and reading it is considered undefined behavior in Rust. As a reference, in the std library, all the set_len methods are marked as unsafe:
https://github.com/rust-lang/rust/blob/63a0bdd5622eaf6b9524702f055bb4525acfc9f2/library/alloc/src/vec/mod.rs#L1849-L1853
Thanks again for your time.
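For illustration with a plain std Vec, one sound version of this pattern is to initialize the spare capacity first and only then call set_len, inside an unsafe block that documents the invariant:

```rust
/// Copy `src` into a fresh Vec by writing into its uninitialized tail
/// before exposing it via set_len.
fn read_into_vec(src: &[u8]) -> Vec<u8> {
    let mut v: Vec<u8> = Vec::with_capacity(src.len());
    // Initialize the spare (uninitialized) capacity element by element.
    for (slot, &byte) in v.spare_capacity_mut().iter_mut().zip(src) {
        slot.write(byte);
    }
    // SAFETY: the first src.len() elements were just initialized above,
    // so extending the length over exactly that range is sound.
    unsafe { v.set_len(src.len()) };
    v
}

fn main() {
    println!("{:?}", read_into_vec(&[7, 8, 9]));
}
```

The key point is that set_len itself never creates initialized data; it only asserts that initialization already happened, which is why std marks it unsafe.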
Change incorporated in version 0.10.0.
| gharchive/issue | 2024-10-01T19:24:17 | 2025-04-01T06:39:16.440307 | {
"authors": [
"CXWorks",
"kenba"
],
"repo": "kenba/opencl3",
"url": "https://github.com/kenba/opencl3/issues/69",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
650890714 | feat(frontend): condensed UI for high question count virtual contests
20 問以上あると自動的にモード on します(20 問 OK?)
このモードには Score、Estimated Performance を表示されていません → 表示しますか?
ロジックが複雑。ContestTable のロジックが複雑のでこのように書いておいたけど、マージ前/後 refactor したほうがいい?
使える
プレビュー
Ref. #505.
かっこいいです!
これは LockoutContestTable のように、ContestTable とは別のコンポーネントにした方が良いと思います。
Mode も Normal/Lockout とは別のものを作ったほうが良いと思います(mode は任意の文字列を受けつけます)
Score と Estimate Performance は表示しなくて良いです。ランキングは 1pt / 問 が良いと思います。
-(ContestTable のリファクタは別の PR でやった方が良いと思います)
ContestTable と重なったロジックが多いので、繰り返さないように別のコンポーネントを作ってみます。
Mode は任意を文字列を受け付けるのが知りませんでした。そうすると作れるはずです。
ランキングのロジックを変えたらできますね。
できるだけ重なったロジックをなくしました。
Refactor の時、問題のリンクさえ見えないことを気づいて、追加しました。
| gharchive/pull-request | 2020-07-04T13:21:55 | 2025-04-01T06:39:16.478367 | {
"authors": [
"kenkoooo",
"southball"
],
"repo": "kenkoooo/AtCoderProblems",
"url": "https://github.com/kenkoooo/AtCoderProblems/pull/659",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
659870579 | トロフィー機能の一部変更
resolves #689
未達成のトロフィーを非表示から、名前を"???"にして達成条件のみを表示するように変更しました。
このように表示されます。
概要欄に "resolves #689" みたいなのを書いてもらって良いですか?該当の issue にリンクされます。
説明欄の所に書いてみたんですが、ここで大丈夫でしょうか?
説明欄の所に書いてみたんですが、ここで大丈夫でしょうか?
大丈夫のはずです。
ありがとうございます!
4fc48bc1b32876e40ef1d972ea02db21426bce43
未達成の項目についてはVerifiedマークを非表示にするように変更しました。
表示は画像のようになります。
issue とか PR とかガバガバなので今さら感ありますが、いちおう海外の非日本語話者の開発者の人もいるので commit メッセージは英語で書いてください。
わかりました(Commit メッセージの書き直しって出来ましたっけ……)
| gharchive/pull-request | 2020-07-18T03:46:57 | 2025-04-01T06:39:16.482805 | {
"authors": [
"kenkoooo",
"southball",
"zerokpr"
],
"repo": "kenkoooo/AtCoderProblems",
"url": "https://github.com/kenkoooo/AtCoderProblems/pull/694",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
132283591 | Problem with Content-Length is missing
I'm playing clonk rage but i can't play online because this happend
Content-Length is missing, pls help me
PS: i don't speak english so well
@victor1969 can you please share the code you are using so we can understand what you're asking?
where i get the code?
@victor1969 is this an error you saw while playing a game? Why did you decide to post about it here?
i'm just trying to play online Clonk rage but says on the server list
Internet server on league.clonkspot.org
Invalid server response: Content-Length is missing!
@victor1969 unfortunately, that has nothing to do with project. There's probably somewhere else online for you to report this, though!
:(
| gharchive/issue | 2016-02-08T23:11:45 | 2025-04-01T06:39:16.487044 | {
"authors": [
"kennethreitz",
"victor1969"
],
"repo": "kennethreitz/requests",
"url": "https://github.com/kennethreitz/requests/issues/3000",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
150106470 | Flip conditional in session.send()
Previously we checked that the request being sent was an instance of a
PreparedRequest. If a user somehow created a PreparedRequest using a different
Requests library instance, this check makes the request un-sendable.
(This happened recently - unbeknownst to me, my server was running an outdated
version of pip, vulnerable to this issue - pypa/pip#1489, which creates
multiple subdirectories (src/requests, src/requests/requests) when you rerun
pip install --target. So the PreparedRequest was being created in one version
of the library and compared against the other version of the library, and
throwing this exception, even though they were both PreparedRequest instances!)
It would probably be preferable to check the object's behavior (instead of
its type), but a PreparedRequest has a lot of behavior, and it wouldn't be
really feasible or allow us to provide a helpful error message to check all
of it here. Instead flip the conditional to guard against the user sending an
unprepared Request, which should still give us most of the benefits of the
better error message.
Fixes #3102
I'm happy with this! Go for it @kennethreitz, merge if you'd like to. =D
Not if I merge it first ;)
| gharchive/pull-request | 2016-04-21T15:21:26 | 2025-04-01T06:39:16.490377 | {
"authors": [
"Lukasa",
"kevinburke",
"sigmavirus24"
],
"repo": "kennethreitz/requests",
"url": "https://github.com/kennethreitz/requests/pull/3108",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
267858266 | Update filter handler functions to use FilterOperationInfo objects.
Since we now have a lot of different filter operators, it has become clear that the old filter function signature can use some streamlining. This PR adds a new FilterOperationInfo object that holds all non-contextual data that is relevant to a filtering operation, such as the directive object itself, or the type and field name being filtered on.
This new abstraction:
simplifies the function signature of filtering functions to just 3 arguments, while getting rid of all unused variables; and
simplifies the reasoning for which filters are valid in which situations.
Coverage decreased (-0.5%) to 92.553% when pulling f50fc3a51842cc330bf735048237f23bfecf0b8a on filter_operation_info into da6e03c424abc17cc242341674823ea8c21c060d on edge_degree_filtering.
| gharchive/pull-request | 2017-10-24T00:23:57 | 2025-04-01T06:39:16.502134 | {
"authors": [
"coveralls",
"obi1kenobi"
],
"repo": "kensho-technologies/graphql-compiler",
"url": "https://github.com/kensho-technologies/graphql-compiler/pull/49",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
} |
89564228 | bower was removing all the good stuff
constants, directives, filters and services were not available after installing via bower.
I've merged it, but to be perfectly honest, I don't recommend you use this module. I will not be supporting it. I recommend you copy and paste and modify the code.
| gharchive/pull-request | 2015-06-19T13:25:02 | 2025-04-01T06:39:16.511739 | {
"authors": [
"jdart",
"kentcdodds"
],
"repo": "kentcdodds/kcd-angular",
"url": "https://github.com/kentcdodds/kcd-angular/pull/3",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
175473095 | how to center item and some
this is how I want it to be displayed
this is how far I got
What I want to do is show one center image, and half of the right and left images.
Problem: In my solution, I was able to show one center image, but the right and left images only show a little; I want them to show more
Here is my code:
http://jsfiddle.net/fp1kcah6/
Check out centerPadding
| gharchive/issue | 2016-09-07T10:55:58 | 2025-04-01T06:39:16.516208 | {
"authors": [
"craigcosmo",
"leggomuhgreggo"
],
"repo": "kenwheeler/slick",
"url": "https://github.com/kenwheeler/slick/issues/2508",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1292248425 | Windows CI (Build on MSVC+Mingw64)
Set up --features vendored build CI for following configurations to detect build bustage:
Windows MSVC build
Windows MinGW64 build
This is part of #34 task
Wrote README section about --features vendored information
Many undefined reference in CI environment:
= note: c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(appbase.cpp.obj):appbase.cpp:(.text+0x42c): undefined reference to `std::__cxx11::basic_string<wchar_t, std::char_traits<wchar_t>, std::allocator<wchar_t> >::_M_create(unsigned long long&, unsigned long long)'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(appbase.cpp.obj):appbase.cpp:(.text+0x449): undefined reference to `std::__throw_logic_error(char const*)'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(appbase.cpp.obj):appbase.cpp:(.text+0x56e): undefined reference to `std::current_exception()'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(appbase.cpp.obj):appbase.cpp:(.text+0x590): undefined reference to `std::__exception_ptr::exception_ptr::swap(std::__exception_ptr::exception_ptr&)'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(appbase.cpp.obj):appbase.cpp:(.text+0x598): undefined reference to `std::__exception_ptr::exception_ptr::~exception_ptr()'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(appbase.cpp.obj):appbase.cpp:(.text+0x5a0): undefined reference to `std::__exception_ptr::exception_ptr::~exception_ptr()'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(appbase.cpp.obj):appbase.cpp:(.text+0x5ed): undefined reference to `std::__exception_ptr::exception_ptr::exception_ptr()'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(appbase.cpp.obj):appbase.cpp:(.text+0x622): undefined reference to `std::__exception_ptr::exception_ptr::swap(std::__exception_ptr::exception_ptr&)'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(appbase.cpp.obj):appbase.cpp:(.text+0x62a): undefined reference to `std::__exception_ptr::exception_ptr::~exception_ptr()'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(appbase.cpp.obj):appbase.cpp:(.text+0x64c): undefined reference to `std::__exception_ptr::exception_ptr::swap(std::__exception_ptr::exception_ptr&)'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(appbase.cpp.obj):appbase.cpp:(.text+0x654): undefined reference to `std::__exception_ptr::exception_ptr::~exception_ptr()'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(appbase.cpp.obj):appbase.cpp:(.text+0x65e): undefined reference to `std::__exception_ptr::exception_ptr::~exception_ptr()'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(appbase.cpp.obj):appbase.cpp:(.text+0x669): undefined reference to `std::__exception_ptr::exception_ptr::exception_ptr(std::__exception_ptr::exception_ptr const&)'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(appbase.cpp.obj):appbase.cpp:(.text+0x671): undefined reference to `std::rethrow_exception(std::__exception_ptr::exception_ptr)'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(appbase.cpp.obj):appbase.cpp:(.text+0x67c): undefined reference to `std::__exception_ptr::exception_ptr::~exception_ptr()'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(appbase.cpp.obj):appbase.cpp:(.text+0x684): undefined reference to `std::__exception_ptr::exception_ptr::~exception_ptr()'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(appbase.cpp.obj):appbase.cpp:(.text+0x79c): undefined reference to `std::__cxx11::basic_string<wchar_t, std::char_traits<wchar_t>, std::allocator<wchar_t> >::_M_create(unsigned long long&, unsigned long long)'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(appbase.cpp.obj):appbase.cpp:(.text+0x7b9): undefined reference to `std::__throw_logic_error(char const*)'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(appbase.cpp.obj):appbase.cpp:(.text+0xbb1): undefined reference to `operator delete[](void*)'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(appbase.cpp.obj):appbase.cpp:(.text+0xbc1): undefined reference to `operator delete[](void*)'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(appbase.cpp.obj):appbase.cpp:(.text+0x11b0): undefined reference to `__cxa_guard_acquire'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(appbase.cpp.obj):appbase.cpp:(.text+0x11cc): undefined reference to `__cxa_guard_release'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(appbase.cpp.obj):appbase.cpp:(.text+0x1601): undefined reference to `std::__detail::_List_node_base::_M_hook(std::__detail::_List_node_base*)'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(appbase.cpp.obj):appbase.cpp:(.text+0x1673): undefined reference to `std::__detail::_List_node_base::_M_unhook()'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(appbase.cpp.obj):appbase.cpp:(.text+0x218): undefined reference to `std::__exception_ptr::exception_ptr::~exception_ptr()'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(appbase.cpp.obj):appbase.cpp:(.text$_ZN18wxCmdLineArgsArray8FreeArgsEv[_ZN18wxCmdLineArgsArray8FreeArgsEv]+0xc1): undefined reference to `operator delete[](void*)'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(appbase.cpp.obj):appbase.cpp:(.text$_ZN18wxCmdLineArgsArray8FreeArgsEv[_ZN18wxCmdLineArgsArray8FreeArgsEv]+0xd1): undefined reference to `operator delete[](void*)'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(appbase.cpp.obj):appbase.cpp:(.text$_ZN8wxStringaSEPKc[_ZN8wxStringaSEPKc]+0xbb): undefined reference to `std::__cxx11::basic_string<wchar_t, std::char_traits<wchar_t>, std::allocator<wchar_t> >::_M_replace(unsigned long long, unsigned long long, wchar_t const*, unsigned long long)'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(appbase.cpp.obj):appbase.cpp:(.text.startup+0x57): undefined reference to `std::__exception_ptr::exception_ptr::exception_ptr()'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(cmdline.cpp.obj):cmdline.cpp:(.text+0xec): undefined reference to `std::__cxx11::basic_string<wchar_t, std::char_traits<wchar_t>, std::allocator<wchar_t> >::_M_create(unsigned long long&, unsigned long long)'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(cmdline.cpp.obj):cmdline.cpp:(.text+0x109): undefined reference to `std::__throw_logic_error(char const*)'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(cmdline.cpp.obj):cmdline.cpp:(.text+0x22d): undefined reference to `std::__cxx11::basic_string<wchar_t, std::char_traits<wchar_t>, std::allocator<wchar_t> >::_M_mutate(unsigned long long, unsigned long long, wchar_t const*, unsigned long long)'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(cmdline.cpp.obj):cmdline.cpp:(.text+0x2fc): undefined reference to `std::__cxx11::basic_string<wchar_t, std::char_traits<wchar_t>, std::allocator<wchar_t> >::_M_assign(std::__cxx11::basic_string<wchar_t, std::char_traits<wchar_t>, std::allocator<wchar_t> > const&)'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(cmdline.cpp.obj):cmdline.cpp:(.text+0x46c): undefined reference to `std::__cxx11::basic_string<wchar_t, std::char_traits<wchar_t>, std::allocator<wchar_t> >::_M_assign(std::__cxx11::basic_string<wchar_t, std::char_traits<wchar_t>, std::allocator<wchar_t> > const&)'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(cmdline.cpp.obj):cmdline.cpp:(.text+0xaab): undefined reference to `std::__cxx11::basic_string<wchar_t, std::char_traits<wchar_t>, std::allocator<wchar_t> >::_M_assign(std::__cxx11::basic_string<wchar_t, std::char_traits<wchar_t>, std::allocator<wchar_t> > const&)'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(cmdline.cpp.obj):cmdline.cpp:(.text+0xabb): undefined reference to `std::__cxx11::basic_string<wchar_t, std::char_traits<wchar_t>, std::allocator<wchar_t> >::_M_assign(std::__cxx11::basic_string<wchar_t, std::char_traits<wchar_t>, std::allocator<wchar_t> > const&)'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(cmdline.cpp.obj):cmdline.cpp:(.text+0xb46): undefined reference to `std::__cxx11::basic_string<wchar_t, std::char_traits<wchar_t>, std::allocator<wchar_t> >::_M_assign(std::__cxx11::basic_string<wchar_t, std::char_traits<wchar_t>, std::allocator<wchar_t> > const&)'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(cmdline.cpp.obj):cmdline.cpp:(.text+0xf92): more undefined references to `std::__cxx11::basic_string<wchar_t, std::char_traits<wchar_t>, std::allocator<wchar_t> >::_M_assign(std::__cxx11::basic_string<wchar_t, std::char_traits<wchar_t>, std::allocator<wchar_t> > const&)' follow
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(cmdline.cpp.obj):cmdline.cpp:(.text+0x12fc): undefined reference to `std::__cxx11::basic_string<wchar_t, std::char_traits<wchar_t>, std::allocator<wchar_t> >::_M_create(unsigned long long&, unsigned long long)'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(cmdline.cpp.obj):cmdline.cpp:(.text+0x1887): undefined reference to `std::__throw_logic_error(char const*)'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(cmdline.cpp.obj):cmdline.cpp:(.text+0x1946): undefined reference to `std::__cxx11::basic_string<wchar_t, std::char_traits<wchar_t>, std::allocator<wchar_t> >::_M_assign(std::__cxx11::basic_string<wchar_t, std::char_traits<wchar_t>, std::allocator<wchar_t> > const&)'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(cmdline.cpp.obj):cmdline.cpp:(.text+0x1a04): undefined reference to `std::__cxx11::basic_string<wchar_t, std::char_traits<wchar_t>, std::allocator<wchar_t> >::_M_create(unsigned long long&, unsigned long long)'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(cmdline.cpp.obj):cmdline.cpp:(.text+0x1b8b): undefined reference to `std::__cxx11::basic_string<wchar_t, std::char_traits<wchar_t>, std::allocator<wchar_t> >::reserve(unsigned long long)'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(cmdline.cpp.obj):cmdline.cpp:(.text+0x1cb8): undefined reference to `std::__cxx11::basic_string<wchar_t, std::char_traits<wchar_t>, std::allocator<wchar_t> >::_M_mutate(unsigned long long, unsigned long long, wchar_t const*, unsigned long long)'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(cmdline.cpp.obj):cmdline.cpp:(.text+0x1fda): undefined reference to `std::__cxx11::basic_string<wchar_t, std::char_traits<wchar_t>, std::allocator<wchar_t> >::_M_assign(std::__cxx11::basic_string<wchar_t, std::char_traits<wchar_t>, std::allocator<wchar_t> > const&)'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(cmdline.cpp.obj):cmdline.cpp:(.text+0x2183): undefined reference to `std::__cxx11::basic_string<wchar_t, std::char_traits<wchar_t>, std::allocator<wchar_t> >::_M_append(wchar_t const*, unsigned long long)'
c:/programdata/chocolatey/lib/mingw/tools/install/mingw64/bin/../lib/gcc/x86_64-w64-mingw32/11.2.0/../../../../x86_64-w64-mingw32/bin/ld.exe: C:\Users\runneradmin\.cargo\git\checkouts\wx-x86_64-pc-windows-gnu-e3b66c3b51f90859\b41af99/lib/libwxmsw31u.a(cmdline.cpp.obj):cmdline.cpp:(.text+0x21a1): undefined reference to `std::__cxx11::basic_string<wchar_t, std::char_traits<wchar_t>, std::allocator<wchar_t> >::_M_replace_aux(unsigned long long, unsigned long long, unsigned long long, wchar_t)'
8a384454b0f1ed04e1bacb96a6a9e1f80883d150 didn't fix this.
Github Actions windows-latest environment contains mingw64 from chocolatey which packaged:
https://github.com/brechtsanders/winlibs_mingw/releases/tag/11.2.0-12.0.1-9.0.0-r1
Maybe it is not compatible with @ancwrd1's -gnu repo...
@ancwrd1 Any advice on which CI service or CI setting to use for testing builds with your prebuilt vendored repo?
I think you might want to try explicit RUSTFLAGS="-C link-arg=-lstdc++" instead of build.rs.
Thank you for your advice.
Indeed, since the sample programs don't have any C++ source files, the linker can't link the C++ library automatically...
Thanks, 1e563d215299aa0796d63d478776ee395d8a3177 fixed linking error.
This empty-test failure is another problem, but I'll disable the test for now because there is no useful unit-test code. (I'll fix this and write some tests in the future once API coverage reaches a somewhat useful state.)
Running `D:\a\wxRust2\wxRust2\target\x86_64-pc-windows-gnu\debug\deps\wx-9e0742b7c88531ac.exe`
error: test failed, to rerun pass '-p wx --lib'
Caused by:
process didn't exit successfully: `D:\a\wxRust2\wxRust2\target\x86_64-pc-windows-gnu\debug\deps\wx-9e0742b7c88531ac.exe` (exit code: 0xc0000139, STATUS_ENTRYPOINT_NOT_FOUND)
Error: Process completed with exit code 1.
@ancwrd1 Thank you for all your help supporting this.
I wrote a bit about this feature in the README, with an acknowledgment to you. If you dislike it, I'll fix it. Could you check?
https://github.com/kenz-gelsoft/wxRust2/blob/e0229d20917182b870b013506657c9c9328a8f61/README.md#use-vendored-wx-binary-crate
I wrote a bit about this feature in the README, with an acknowledgment to you. If you dislike it, I'll fix it. Could you check?
It seems you're busy now. I'll assume this is OK and merge it.
Thank you very much. I'm sorry to bother you.
Sorry, I haven't seen the notifications for some reason. The README looks fine. Thank you very much for this project. Highly appreciated!
| gharchive/pull-request | 2022-07-03T08:57:54 | 2025-04-01T06:39:16.528975 | {
"authors": [
"ancwrd1",
"kenz-gelsoft"
],
"repo": "kenz-gelsoft/wxRust2",
"url": "https://github.com/kenz-gelsoft/wxRust2/pull/91",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
750728349 | Keptn Smart Auth: Autofetch keptn endpoint and api-token
Signed-off-by: ankitjain28may ankitjain28may77@gmail.com
[ ] Can you please add the message "CLI is not authenticated" in case of ClusterIP / NodePort (to inform the user about the current status).
Codecov Report
Merging #2733 (18ba47f) into master (9bd9c79) will decrease coverage by 13.64%.
The diff coverage is n/a.
@@ Coverage Diff @@
## master #2733 +/- ##
===========================================
- Coverage 36.26% 22.61% -13.65%
===========================================
Files 161 50 -111
Lines 9288 1110 -8178
Branches 197 197
===========================================
- Hits 3368 251 -3117
+ Misses 5381 832 -4549
+ Partials 539 27 -512
| Flag | Coverage Δ |
| --- | --- |
| moduleA | 22.61% <ø> (-0.27%) :arrow_down: |
Flags with carried forward coverage won't be shown.
| Impacted Files | Coverage Δ |
| --- | --- |
| bridge/client/app/_services/data.service.ts | 0.98% <0.00%> (-0.12%) :arrow_down: |
| ...lient/app/project-board/project-board.component.ts | 2.83% <0.00%> (-0.03%) :arrow_down: |
| bridge/client/app/_models/service.ts | 0.00% <0.00%> (ø) |
| helm-service/controller/action_handler.go | |
| platform-support/openshift-route-service/main.go | |
| api/ws/hub.go | |
| ...ce/pkg/handler/approval_triggered_event_handler.go | |
| helm-service/pkg/helm/helm_helper.go | |
| cli/cmd/update.go | |
| .../pkg/configurationchanger/configuration_changer.go | |

... and 100 more
Continue to review full report at Codecov.
Legend:
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update ba0b2ae...18ba47f. Read the comment docs.
@bacherfl The unit tests are failing because it needs this PR to be merged - https://github.com/keptn/kubernetes-utils/pull/13
@bacherfl The unit tests are failing because it needs this PR to be merged - keptn/kubernetes-utils#13
Ok, got it. Just had one comment about the other PR - once that has been addressed we can merge that one
@bacherfl The unit tests are failing because it needs this PR to be merged - keptn/kubernetes-utils#13
Ok, got it. Just had one comment about the other PR - once that has been addressed we can merge that one
@ankitjain28may I have just merged the kubernetes-utils PR. Can you please update the kubernetes-utils dependency to the latest status from the master branch? When that is done we can merge this PR as well
@bacherfl I have updated the dependency to the latest master. Please review :)
| gharchive/pull-request | 2020-11-25T11:15:19 | 2025-04-01T06:39:16.561042 | {
"authors": [
"ankitjain28may",
"bacherfl",
"codecov-io",
"johannes-b"
],
"repo": "keptn/keptn",
"url": "https://github.com/keptn/keptn/pull/2733",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
1059152027 | fixing issue in GCN example
The small issue prevented the code from running in the Colab environment because of an inconsistency in the dimension of the output layer.
Thanks for the PR. This is fixed by #717.
| gharchive/pull-request | 2021-11-20T14:03:57 | 2025-04-01T06:39:16.569686 | {
"authors": [
"fchollet",
"fmerizzi"
],
"repo": "keras-team/keras-io",
"url": "https://github.com/keras-team/keras-io/pull/716",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
1930825996 | TensorFlow GPU - Fix keras/layers/merging/merging_test.py
Fix the failing test keras/layers/merging/merging_test.py::MergingLayersTest::test_sparse_dot_2d (Fatal Python error: Aborted) and update the TODO in https://github.com/keras-team/keras/blob/master/keras/kokoro/github/ubuntu/gpu/build.sh#L39
https://source.cloud.google.com/results/invocations/9df9ee7e-5666-4644-abd2-01a10771faeb/targets/keras%2Fgithub%2Fubuntu%2Fgpu%2Ftensorflow%2Fpresubmit/log
keras/layers/merging/merging_test.py::MergingLayersTest::test_sparse_dot_2d Fatal Python error: Aborted
Current thread 0x00007f51610f0740 (most recent call first):
File "/tmpfs/venv/lib/python3.9/site-packages/tensorflow/python/ops/linalg/sparse/gen_sparse_csr_matrix_ops.py", line 1114 in sparse_matrix_sparse_mat_mul
File "/tmpfs/src/github/keras/keras/backend/tensorflow/numpy.py", line 119 in sparse_sparse_matmul
File "/tmpfs/src/github/keras/keras/backend/tensorflow/numpy.py", line 156 in matmul
File "/tmpfs/src/github/keras/keras/ops/numpy.py", line 3431 in matmul
File "/tmpfs/src/github/keras/keras/layers/merging/dot.py", line 171 in batch_dot
File "/tmpfs/src/github/keras/keras/layers/merging/dot.py", line 320 in _merge_function
File "/tmpfs/src/github/keras/keras/layers/merging/base_merge.py", line 189 in call
File "/tmpfs/src/github/keras/keras/ops/operation.py", line 47 in __call__
File "/tmpfs/src/github/keras/keras/utils/traceback_utils.py", line 114 in error_handler
File "/tmpfs/src/github/keras/keras/layers/layer.py", line 810 in __call__
File "/tmpfs/src/github/keras/keras/utils/traceback_utils.py", line 114 in error_handler
File "/tmpfs/src/github/keras/keras/testing/test_case.py", line 380 in run_layer_test
File "/tmpfs/src/github/keras/keras/layers/merging/merging_test.py", line 240 in test_sparse
File "/tmpfs/venv/lib/python3.9/site-packages/absl/testing/parameterized.py", line 319 in bound_param_test
File "/usr/lib/python3.9/unittest/case.py", line 550 in _callTestMethod
File "/usr/lib/python3.9/unittest/case.py", line 592 in run
File "/usr/lib/python3.9/unittest/case.py", line 651 in __call__
File "/tmpfs/venv/lib/python3.9/site-packages/_pytest/unittest.py", line 333 in runtest
File "/tmpfs/venv/lib/python3.9/site-packages/_pytest/runner.py", line 169 in pytest_runtest_call
File "/tmpfs/venv/lib/python3.9/site-packages/pluggy/_callers.py", line 77 in _multicall
File "/tmpfs/venv/lib/python3.9/site-packages/pluggy/_manager.py", line 115 in _hookexec
File "/tmpfs/venv/lib/python3.9/site-packages/pluggy/_hooks.py", line 493 in __call__
File "/tmpfs/venv/lib/python3.9/site-packages/_pytest/runner.py", line 262 in <lambda>
File "/tmpfs/venv/lib/python3.9/site-packages/_pytest/runner.py", line 341 in from_call
File "/tmpfs/venv/lib/python3.9/site-packages/_pytest/runner.py", line 261 in call_runtest_hook
File "/tmpfs/venv/lib/python3.9/site-packages/_pytest/runner.py", line 222 in call_and_report
File "/tmpfs/venv/lib/python3.9/site-packages/_pytest/runner.py", line 133 in runtestprotocol
File "/tmpfs/venv/lib/python3.9/site-packages/_pytest/runner.py", line 114 in pytest_runtest_protocol
File "/tmpfs/venv/lib/python3.9/site-packages/pluggy/_callers.py", line 77 in _multicall
File "/tmpfs/venv/lib/python3.9/site-packages/pluggy/_manager.py", line 115 in _hookexec
File "/tmpfs/venv/lib/python3.9/site-packages/pluggy/_hooks.py", line 493 in __call__
File "/tmpfs/venv/lib/python3.9/site-packages/_pytest/main.py", line 350 in pytest_runtestloop
File "/tmpfs/venv/lib/python3.9/site-packages/pluggy/_callers.py", line 77 in _multicall
File "/tmpfs/venv/lib/python3.9/site-packages/pluggy/_manager.py", line 115 in _hookexec
File "/tmpfs/venv/lib/python3.9/site-packages/pluggy/_hooks.py", line 493 in __call__
File "/tmpfs/venv/lib/python3.9/site-packages/_pytest/main.py", line 325 in _main
File "/tmpfs/venv/lib/python3.9/site-packages/_pytest/main.py", line 271 in wrap_session
File "/tmpfs/venv/lib/python3.9/site-packages/_pytest/main.py", line 318 in pytest_cmdline_main
File "/tmpfs/venv/lib/python3.9/site-packages/pluggy/_callers.py", line 77 in _multicall
File "/tmpfs/venv/lib/python3.9/site-packages/pluggy/_manager.py", line 115 in _hookexec
File "/tmpfs/venv/lib/python3.9/site-packages/pluggy/_hooks.py", line 493 in __call__
File "/tmpfs/venv/lib/python3.9/site-packages/_pytest/config/__init__.py", line 169 in main
File "/tmpfs/venv/lib/python3.9/site-packages/_pytest/config/__init__.py", line 192 in console_main
File "/tmpfs/venv/bin/pytest", line 8 in <module>
github/keras/keras/kokoro/github/ubuntu/gpu/build.sh: line 34: 4954 Aborted (core dumped)
The culprit is between 2.15.0.dev20230918 (good) and 2.15.0.dev20230919 (bad).
Tested via: pytest keras/layers/merging/merging_test.py::MergingLayersTest::test_basic_add
Culprit in one of the changes in this range: git log e4a6720f42a..dfcf1d40e46 --oneline
Thanks Ramesh for the repro; we will revisit this during the triage meeting.
On a side note, I didn't find any change on the sparse side between those two dates. Will need to dig deeper for the root cause.
Here's a small code snippet to reproduce the issue in Colab with Keras Master and TF-Nightly -
!pip uninstall -y keras tensorflow
!pip install tf-nightly[and-cuda]==2.15.0.dev20231009 --extra-index-url https://pypi.nvidia.com
!pip uninstall -y keras-nightly
# Install Keras from Master via `python pip_build.py --install`
import numpy as np
import keras

input = keras.layers.Input(shape=(2,))
x1 = keras.layers.Dense(4, activation='relu')(input)
x2 = keras.layers.Dense(4, activation='relu')(input)
added = keras.layers.Add()([x1, x2])
out = keras.layers.Dense(1)(added)
model = keras.models.Model(inputs=input, outputs=out)
x = np.random.randn(8, 2)
y = np.random.randn(8, 1)
model.compile(optimizer='sgd', loss='mse')
model.fit(x, y, epochs=1)
If I replace Add with Concatenate, it also breaks. This is a high-priority error since it breaks a very important layer on TF GPU. The same test fails for JAX GPU as well.
@fchollet - If you have any thoughts or suggestions to try let me know.
The example you provided doesn't even use sparse inputs, which is different from the error on top. The error DNN library initialization failed somehow indicate that its a setup issue for GPU.
The example you provided doesn't even use sparse inputs, which is different from the error on top. The error DNN library initialization failed somehow indicate that its a setup issue for GPU.
There are multiple failures in merging_test.py. I tried to run the basic test case with add and that fails. Initially I reported the sparse test, which actually aborts with a core dump:
keras/layers/merging/merging_test.py::MergingLayersTest::test_sparse_dot_2d Fatal Python error: Aborted
TF Nightly 09/18 works for ALL the tests in merging_test.py. So I think its a common issue due to change in TF on 09/19 between these commits in TF: git log e4a6720f42a..dfcf1d40e46 --oneline
Somehow I wasn't able to produce the on colab with T4 GPU. https://colab.sandbox.google.com/drive/1_hMJieL_6DobTPUbZ6BRZIEVz0YRHhBo#scrollTo=GM2B7qEqNYqk
Maybe I didn't config the GPU properly?
@sampathweb do u have a testable env that I can run with?
Also seems to be failing with JAX-GPU now:
github/keras/keras/kokoro/github/ubuntu/gpu/build.sh: line 57: 4493 Aborted (core dumped) pytest keras --ignore keras/applications --ignore keras/layers/merging/merging_test.py --ignore keras/trainers/data_adapters/py_dataset_adapter_test.py --ignore keras/backend/jax/distribution_lib_test.py --cov=keras
I will work on this tomorrow. I used a Colab V100 as my test env.
Seems to be a cuDNN/TF compilation issue.
2023-10-17 20:23:09.628643: I external/local_xla/xla/service/service.cc:176] StreamExecutor device (0): Tesla V100-SXM2-16GB, Compute Capability 7.0
2023-10-17 20:23:10.277194: E external/local_xla/xla/stream_executor/cuda/cuda_dnn.cc:447] Loaded runtime CuDNN library: 8.7.0 but source was compiled with: 8.9.4. CuDNN library needs to have matching major version and equal or higher minor version. If using a binary install, upgrade your CuDNN library. If building from sources, make sure the library loaded at runtime is compatible with the version specified during compile configuration.
2023-10-17 20:23:10.278786: W tensorflow/core/framework/op_kernel.cc:1839] OP_REQUIRES failed at xla_ops.cc:574 : FAILED_PRECONDITION: DNN library initialization failed. Look at the errors above for more details.
Tested via pytest keras/layers/merging/merging_test.py::MergingLayersTest::test_basic_add
Don't have a resolution yet. But might be related to this change that's within the range - git log e4a6720f42a..dfcf1d40e46 --oneline
commit 3de44168950a5972ba4cfa7e3c6cbf4cffa67fe6
Author: A. Unique TensorFlower <gardener@tensorflow.org>
Date: Mon Sep 18 13:50:11 2023 -0700
Upgrade to LLVM 17, CUDA 12.2, and CuDNN 8.9.4
This is updating TF's default toolchain to LLVM 17, as well as
CUDA and cuDNN to the latest releases.
PiperOrigin-RevId: 566403707
| gharchive/issue | 2023-10-06T19:27:58 | 2025-04-01T06:39:16.589182 | {
"authors": [
"fchollet",
"qlzh727",
"sampathweb"
],
"repo": "keras-team/keras",
"url": "https://github.com/keras-team/keras/issues/18567",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
} |
266412679 | Build error duplicate entry
Installing version 0.4.4 fails reporting this error:
Failed to execute goal org.apache.maven.plugins:maven-shade-plugin:3.0.0:shade (default) on project grobid-core: Error creating shaded jar: duplicate entry: META-INF/services/shadedgrobid.org.apache.lucene.codecs.PostingsFormat
How can I avoid duplicate entries?
Stephan
hi @stzellerhoff could you paste the whole log here?
Thanks
Luca
| gharchive/issue | 2017-10-18T08:58:28 | 2025-04-01T06:39:16.607783 | {
"authors": [
"lfoppiano",
"stzellerhoff"
],
"repo": "kermitt2/grobid",
"url": "https://github.com/kermitt2/grobid/issues/249",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
} |
1022789292 | hacktoberfest
hacktoberfest 2021
What type of PR is this? (check all applicable)
[ ] 🚀 Added Details
[ ] 🌟 starred the repo
[ ] 🐛 Grammatical Error
[ ] 📝 Documentation Update
[ ] 🚩 Other
Description
Add Link of GitHub Profile
A C++ Program to Multiply two Matrices using Divide And Conquer Algorithm
Code :
#include <iostream.h>
#include <stdlib.h>
#include <conio.h>
class Matrix
{
private:
float matrix_a[3][3];
float matrix_b[3][3];
float matrix_c[3][3];
public:
Matrix( );
void get_matrix_a( );
void get_matrix_b( );
void multiply_matrices( );
void show_result_Matrix( );
};
Matrix::Matrix( )
{
for(int i=0;i<3;i++)
{
for(int j=0;j<3;j++)
{
matrix_a[i][j]=0;
matrix_b[i][j]=0;
matrix_c[i][j]=0;
}
}
gotoxy(1,1);
cout<<"*********************************************"<<endl;
cout<<" * * * * * * * * * * * * * * Matrix Multiplication * * * * ** * * * * * * *"<<endl;
cout<<"********************************************"<<endl;
gotoxy(1,25);
cout<<"*****************************************************";
}
void Matrix::get_matrix_a( )
{
gotoxy(1,6);
cout<<" Enter the values of the Matrix A row by row : "<<endl;
cout<<" ┌ ┐"<<endl;
cout<<" │ │"<<endl;
cout<<" │ │"<<endl;
cout<<" │ │"<<endl;
cout<<" └ ┘"<<endl;
gotoxy(18,10);
cout<<" A = "<<endl;
int x=28;
int y=9;
for(int i=0;i<3;i++)
{
for(int j=0;j<3;j++)
{
gotoxy(x,y);
cin>>matrix_a[i][j];
x+=5;
}
x=28;
y++;
}
}
void Matrix::get_matrix_b( )
{
gotoxy(1,15);
cout<<" Enter the values of the Matrix B row by row : "<<endl;
cout<<" ┌ ┐"<<endl;
cout<<" │ │"<<endl;
cout<<" │ │"<<endl;
cout<<" │ │"<<endl;
cout<<" └ ┘"<<endl;
gotoxy(18,19);
cout<<" B = "<<endl;
int x=28;
int y=18;
for(int i=0;i<3;i++)
{
for(int j=0;j<3;j++)
{
gotoxy(x,y);
cin>>matrix_b[i][j];
x+=5;
}
x=28;
y++;
}
}
void Matrix::multiply_matrices( )
{
for(int i=0;i<3;i++)
{
for(int j=0;j<3;j++)
{
float value=0;
float sum=0;
for(int k=0;k<3;k++)
{
value=matrix_a[i][k]*matrix_b[k][j]; // fixed: row index must be i, not j
sum+=value;
}
matrix_c[i][j]=sum;
}
}
}
void Matrix::show_result_Matrix( )
{
clrscr( );
gotoxy(1,1);
cout<<"*********************************************************"<<endl;
cout<<" * * * * * * * * * * * * * * Matrix Multiplication * * * * ** * * * * * * *"<<endl;
cout<<"*************************************************************"<<endl;
gotoxy(1,6);
cout<<" The values of Matrix A and B are :"<<endl;
cout<<" ┌ ┐ ┌ ┐"<<endl;
cout<<" │ │ │ │"<<endl;
cout<<" │ │ │ │"<<endl;
cout<<" │ │ │ │"<<endl;
cout<<" └ ┘ └ ┘"<<endl;
gotoxy(45,9);
cout<<" B = "<<endl;
gotoxy(10,9);
cout<<" A = "<<endl;
gotoxy(1,15);
cout<<" The Product of Matrix A and B is : "<<endl;
cout<<" ┌ ┐"<<endl;
cout<<" │ │"<<endl;
cout<<" │ │"<<endl;
cout<<" │ │"<<endl;
cout<<" └ ┘"<<endl;
gotoxy(13,19);
cout<<" A * B = "<<endl;
int x_1=20;
int y_1=8;
int x_2=55;
int y_2=8;
int x_3=28;
int y_3=18;
for(int i=0;i<3;i++)
{
for(int j=0;j<3;j++)
{
gotoxy(x_1,y_1);
cout<<matrix_a[i][j];
gotoxy(x_2,y_2);
cout<<matrix_b[i][j];
gotoxy(x_3,y_3);
cout<<matrix_c[i][j];
x_1+=5;
x_2+=5;
x_3+=5;
}
x_1=20;
y_1++;
x_2=55;
y_2++;
x_3=28;
y_3++;
}
gotoxy(1,25);
cout<<"*********************************************";
}
int main( )
{
textmode(BW80);
clrscr( );
Matrix Obj;
Obj.get_matrix_a( );
Obj.get_matrix_b( );
Obj.multiply_matrices( );
Obj.show_result_Matrix( );
getch( );
return 0;
}
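The PR title mentions a divide-and-conquer algorithm, but the listing above uses the standard triple loop. For comparison, a minimal divide-and-conquer multiplication — splitting each (power-of-two-sized) square matrix into quadrants and recursing — can be sketched as follows. This is an illustrative sketch; the function names are our own, not from the PR:

```python
# Divide-and-conquer matrix multiplication (n must be a power of two).
def split(m):
    """Split a square matrix into its four quadrants."""
    n = len(m) // 2
    return ([row[:n] for row in m[:n]], [row[n:] for row in m[:n]],
            [row[:n] for row in m[n:]], [row[n:] for row in m[n:]])

def add(a, b):
    """Element-wise sum of two equally sized matrices."""
    return [[x + y for x, y in zip(ra, rb)] for ra, rb in zip(a, b)]

def dac_multiply(a, b):
    n = len(a)
    if n == 1:  # base case: 1x1 product
        return [[a[0][0] * b[0][0]]]
    a11, a12, a21, a22 = split(a)
    b11, b12, b21, b22 = split(b)
    # C11 = A11*B11 + A12*B21, and so on — eight recursive half-size products.
    c11 = add(dac_multiply(a11, b11), dac_multiply(a12, b21))
    c12 = add(dac_multiply(a11, b12), dac_multiply(a12, b22))
    c21 = add(dac_multiply(a21, b11), dac_multiply(a22, b21))
    c22 = add(dac_multiply(a21, b12), dac_multiply(a22, b22))
    top = [r1 + r2 for r1, r2 in zip(c11, c12)]
    bottom = [r1 + r2 for r1, r2 in zip(c21, c22)]
    return top + bottom
```

Note this plain divide-and-conquer form still does eight half-size products (O(n^3) overall); Strassen's method reduces that to seven.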
| gharchive/pull-request | 2021-10-11T14:41:24 | 2025-04-01T06:39:16.632088 | {
"authors": [
"1bertovalente",
"Prashima01"
],
"repo": "keshavsingh4522/HacktoberFest-2021",
"url": "https://github.com/keshavsingh4522/HacktoberFest-2021/pull/360",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2374096242 | [Bug]: namespace includes common variable names
What happened?
In _integrators.jl,
export state
export controls
export timestep
export comps
export dim
Version
stable release
What does this bug affect?
[ ] quantum system construction
[ ] problem setup
[ ] problem solution
[ ] problem performance
[ ] solution analysis
[ ] plotting
[ ] documentation
[ ] tests
[ ] other (please specify below)
Other information
No response
these should all be prefixed with get_ probably
closed by #149
| gharchive/issue | 2024-06-26T03:17:52 | 2025-04-01T06:39:16.652767 | {
"authors": [
"aarontrowbridge",
"andgoldschmidt"
],
"repo": "kestrelquantum/QuantumCollocation.jl",
"url": "https://github.com/kestrelquantum/QuantumCollocation.jl/issues/132",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2121997626 | Bug: Windows 10 Symlink required privilege
Problem
Creating a symlink on Windows 10 requires elevated privileges:
Looks like it hasn't been resolved: https://github.com/golang/go/issues/22874
Solution
You can either require elevated privileges or invoke cmd through Go's os/exec package:
if runtime.GOOS == "windows" {
	// mklink is a cmd builtin, so it must be invoked through `cmd /c`
	if err = exec.Command("cmd", "/c", "mklink", "/d", link, destination).Run(); err == nil {
		return nil
	}
}
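A self-contained sketch of this idea (function and variable names here are illustrative, not gobrew's actual code): try `os.Symlink` first, and only shell out to `cmd` on Windows when it fails. Note that `mklink` is a `cmd` builtin, so it has to be run through `cmd /c`.

```go
package main

import (
	"fmt"
	"os"
	"os/exec"
	"path/filepath"
	"runtime"
)

// makeLink creates `link` pointing at `destination`. On Windows, where
// os.Symlink may fail without elevated privileges, it falls back to
// `cmd /c mklink /d`, which can succeed e.g. when Developer Mode is on.
func makeLink(destination, link string) error {
	if err := os.Symlink(destination, link); err == nil {
		return nil
	} else if runtime.GOOS != "windows" {
		return err
	}
	// mklink is a cmd builtin, so it must be invoked via `cmd /c`.
	return exec.Command("cmd", "/c", "mklink", "/d", link, destination).Run()
}

func main() {
	dir, err := os.MkdirTemp("", "linkdemo")
	if err != nil {
		panic(err)
	}
	defer os.RemoveAll(dir)

	target := filepath.Join(dir, "target")
	if err := os.Mkdir(target, 0o755); err != nil {
		panic(err)
	}
	link := filepath.Join(dir, "link")
	if err := makeLink(target, link); err != nil {
		panic(err)
	}
	resolved, _ := os.Readlink(link)
	fmt.Println(resolved == target)
}
```

On Unix this exercises only the `os.Symlink` path; the `cmd` fallback branch is compiled in but reached only on Windows.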
Thanks for reporting.
Pull req too please?
Ok I'll try
| gharchive/issue | 2024-02-07T01:46:04 | 2025-04-01T06:39:16.681900 | {
"authors": [
"kevincobain2000",
"vunhatchuong"
],
"repo": "kevincobain2000/gobrew",
"url": "https://github.com/kevincobain2000/gobrew/issues/174",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1174476354 | Support previewing vim-fugitive :Gclog entries
Feature description
Currently, when the quickfix window is populated by vim-fugitive's :Gclog command, vim-bqf's preview window is blank unless the highlighted entry has been manually opened first. I'm guessing it has to do with the special file URIs fugitive uses, which look like fugitive:///home/maddy/.config/nvim/.git//f0e7c59ec59a25bfe928b555dd8387242d2b810f.
Describe the solution you'd like
It would be nice if nvim-bqf could properly preview these vim-fugitive quickfix items.
Additional context
If the highlighted entry hasn't been manually opened yet:
Once the highlighted entry has been opened:
nvim-bqf will never support this feature because of the performance issue. You can use `should_preview_cb` to hack in what you want. You can also use the setting below to experience how slow it is.
require('bqf').setup({
    should_preview_cb = function(bufnr)
        local bufname = vim.api.nvim_buf_get_name(bufnr)
        if bufname:match('^fugitive://') then
            vim.api.nvim_buf_call(bufnr, function()
                vim.cmd(('noa do fugitive BufReadCmd %s'):format(bufname))
            end)
        end
        return true
    end
})
Personally, I use https://github.com/rbong/vim-flog, a git log wrapper that can limit the number of commits. More importantly, it is an extension built on fugitive.
Cool, thank you for the tips, and for the wonderful plugin!
I took your hack and hacked it even further - this loads the fugitive buffer asynchronously:
require('bqf').setup {
    preview = {
        should_preview_cb = function(bufnr)
            local bufname = vim.api.nvim_buf_get_name(bufnr)
            if bufname:match '^fugitive://' then
                local pvs = require 'bqf.preview.session'
                local extmark = require 'bqf.preview.extmark'
                local ts = require 'bqf.preview.treesitter'
                vim.schedule(function()
                    vim.api.nvim_buf_call(bufnr, function()
                        local fbufnr = pvs.float_bufnr()
                        vim.cmd(('do fugitive BufReadCmd %s'):format(bufname))
                        pvs.floatbuf_reset()
                        ts.disable_active(fbufnr)
                        extmark.clear_highlight(fbufnr)
                        vim.cmd(('silent lua require"bqf.utils".transfer_buf(%d, %d)'):format(bufnr, fbufnr))
                    end)
                end)
            end
            return true
        end,
    },
}
I copied some code from bqf.preview.handler.open() to do this; I'm sure there's a better way, but this works well enough for me.
Please use the updated code below, which is more convenient to hack.
local bqf_pv_timer
require('bqf').setup {
    preview = {
        should_preview_cb = function(bufnr, qwinid)
            local bufname = vim.api.nvim_buf_get_name(bufnr)
            if bufname:match '^fugitive://' and not vim.api.nvim_buf_is_loaded(bufnr) then
                if bqf_pv_timer and bqf_pv_timer:get_due_in() > 0 then
                    bqf_pv_timer:stop()
                    bqf_pv_timer = nil
                end
                bqf_pv_timer = vim.defer_fn(function()
                    vim.api.nvim_buf_call(bufnr, function()
                        vim.cmd(('do fugitive BufReadCmd %s'):format(bufname))
                    end)
                    require('bqf.preview.handler').open(qwinid, nil, true)
                end, 60)
            end
            return true
        end
    }
}
Awesome, much cleaner. Thank you!
| gharchive/issue | 2022-03-20T08:43:10 | 2025-04-01T06:39:16.688436 | {
"authors": [
"b0o",
"kevinhwang91"
],
"repo": "kevinhwang91/nvim-bqf",
"url": "https://github.com/kevinhwang91/nvim-bqf/issues/60",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
} |
1803128402 | Missing create_data_in_joint_task_format method at DatasetLoader
Hi. Thanks for sharing the code of your papers.
I'm trying to reproduce the training for the joint task, but this error appears:
AttributeError: 'DatasetLoader' object has no attribute 'create_data_in_joint_task_format'
There is any way to solve it with the other methods in the class?
Thanks!
Hi,
There has been some API changes. I have not updated the run_model.py script yet. But for joint task, it should be:
create_data_in_aspe_format()
From here on joint task will be called as ASPE (Aspect Sentiment Pair Extraction), since new tasks are being added.
Great! Thanks for your answer.
| gharchive/issue | 2023-07-13T14:15:19 | 2025-04-01T06:39:16.696267 | {
"authors": [
"kevinscaria",
"noshaq"
],
"repo": "kevinscaria/InstructABSA",
"url": "https://github.com/kevinscaria/InstructABSA/issues/16",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2380755399 | 🛑 FX corretor (kty) is down
In a34b48b, FX corretor (kty) (https://corretores.foxterciaimobiliaria.com.br) was down:
HTTP code: 0
Response time: 0 ms
Resolved: FX corretor (kty) is back up in d23bdfb after 10 minutes.
| gharchive/issue | 2024-06-28T16:16:13 | 2025-04-01T06:39:16.700244 | {
"authors": [
"keviocastro"
],
"repo": "keviocastro/upptime",
"url": "https://github.com/keviocastro/upptime/issues/11216",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2384706832 | 🛑 FX corretor (kty) is down
In 2b2371d, FX corretor (kty) (https://corretores.foxterciaimobiliaria.com.br) was down:
HTTP code: 0
Response time: 0 ms
Resolved: FX corretor (kty) is back up in c0d0354 after 6 minutes.
| gharchive/issue | 2024-07-01T20:47:20 | 2025-04-01T06:39:16.702850 | {
"authors": [
"keviocastro"
],
"repo": "keviocastro/upptime",
"url": "https://github.com/keviocastro/upptime/issues/12112",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1221463714 | 🛑 FX portal 2 (kty) is down
In ad5ca99, FX portal 2 (kty) (https://megafestivalfoxter.com.br) was down:
HTTP code: 404
Response time: 404 ms
Resolved: FX portal 2 (kty) is back up in 990ec2c.
| gharchive/issue | 2022-04-29T19:56:48 | 2025-04-01T06:39:16.705225 | {
"authors": [
"keviocastro"
],
"repo": "keviocastro/upptime",
"url": "https://github.com/keviocastro/upptime/issues/1253",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2398554361 | 🛑 FX corretor (kty) is down
In c968390, FX corretor (kty) (https://corretores.foxterciaimobiliaria.com.br) was down:
HTTP code: 0
Response time: 0 ms
Resolved: FX corretor (kty) is back up in 8af7a55 after 6 minutes.
| gharchive/issue | 2024-07-09T15:37:13 | 2025-04-01T06:39:16.707597 | {
"authors": [
"keviocastro"
],
"repo": "keviocastro/upptime",
"url": "https://github.com/keviocastro/upptime/issues/14146",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2406172883 | 🛑 FX corretor (kty) is down
In 9cb2e25, FX corretor (kty) (https://corretores.foxterciaimobiliaria.com.br) was down:
HTTP code: 0
Response time: 0 ms
Resolved: FX corretor (kty) is back up in e48daab after 7 minutes.
| gharchive/issue | 2024-07-12T18:59:00 | 2025-04-01T06:39:16.709952 | {
"authors": [
"keviocastro"
],
"repo": "keviocastro/upptime",
"url": "https://github.com/keviocastro/upptime/issues/14961",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2415584563 | 🛑 FX corretor (kty) is down
In f6f3242, FX corretor (kty) (https://corretores.foxterciaimobiliaria.com.br) was down:
HTTP code: 0
Response time: 0 ms
Resolved: FX corretor (kty) is back up in a777894 after 25 minutes.
| gharchive/issue | 2024-07-18T07:22:53 | 2025-04-01T06:39:16.712418 | {
"authors": [
"keviocastro"
],
"repo": "keviocastro/upptime",
"url": "https://github.com/keviocastro/upptime/issues/16376",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2416296989 | 🛑 FX blog (kty) is down
In 7f56b43, FX blog (kty) (https://foxter-blogeditor.konecty.com) was down:
HTTP code: 0
Response time: 0 ms
Resolved: FX blog (kty) is back up in c2dc0d9 after 9 minutes.
| gharchive/issue | 2024-07-18T12:57:53 | 2025-04-01T06:39:16.714970 | {
"authors": [
"keviocastro"
],
"repo": "keviocastro/upptime",
"url": "https://github.com/keviocastro/upptime/issues/16437",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1289610946 | 🛑 FX portal 2 (kty) is down
In 964818a, FX portal 2 (kty) (https://megafestivalfoxter.com.br) was down:
HTTP code: 403
Response time: 210 ms
Resolved: FX portal 2 (kty) is back up in f7f800c.
| gharchive/issue | 2022-06-30T05:37:03 | 2025-04-01T06:39:16.717321 | {
"authors": [
"keviocastro"
],
"repo": "keviocastro/upptime",
"url": "https://github.com/keviocastro/upptime/issues/1682",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2422373938 | 🛑 FX blog (kty) is down
In 510ea4d, FX blog (kty) (https://foxter-blogeditor.konecty.com) was down:
HTTP code: 0
Response time: 0 ms
Resolved: FX blog (kty) is back up in 0ecd7ea after 10 minutes.
| gharchive/issue | 2024-07-22T09:18:40 | 2025-04-01T06:39:16.719730 | {
"authors": [
"keviocastro"
],
"repo": "keviocastro/upptime",
"url": "https://github.com/keviocastro/upptime/issues/17408",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1388263830 | 🛑 FX blog (kty) is down
In ab13c22, FX blog (kty) (https://blog.foxterciaimobiliaria.com.br) was down:
HTTP code: 503
Response time: 263 ms
Resolved: FX blog (kty) is back up in b4f9996.
| gharchive/issue | 2022-09-27T19:33:23 | 2025-04-01T06:39:16.722075 | {
"authors": [
"keviocastro"
],
"repo": "keviocastro/upptime",
"url": "https://github.com/keviocastro/upptime/issues/2252",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2475390235 | 🛑 FX corretor (kty) is down
In 80b59df, FX corretor (kty) (https://corretores.foxterciaimobiliaria.com.br) was down:
HTTP code: 0
Response time: 0 ms
Resolved: FX corretor (kty) is back up in 57f4486 after 31 minutes.
| gharchive/issue | 2024-08-20T11:20:04 | 2025-04-01T06:39:16.724478 | {
"authors": [
"keviocastro"
],
"repo": "keviocastro/upptime",
"url": "https://github.com/keviocastro/upptime/issues/23882",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2526593460 | 🛑 FX corretor (kty) is down
In 67b8837, FX corretor (kty) (https://corretores.foxterciaimobiliaria.com.br) was down:
HTTP code: 0
Response time: 0 ms
Resolved: FX corretor (kty) is back up in 452ab19 after 9 minutes.
| gharchive/issue | 2024-09-14T20:17:13 | 2025-04-01T06:39:16.727044 | {
"authors": [
"keviocastro"
],
"repo": "keviocastro/upptime",
"url": "https://github.com/keviocastro/upptime/issues/29102",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2546730180 | 🛑 FX corretor (kty) is down
In 4c89ba9, FX corretor (kty) (https://corretores.foxterciaimobiliaria.com.br) was down:
HTTP code: 0
Response time: 0 ms
Resolved: FX corretor (kty) is back up in 64b56cf after 31 minutes.
| gharchive/issue | 2024-09-25T02:00:14 | 2025-04-01T06:39:16.729463 | {
"authors": [
"keviocastro"
],
"repo": "keviocastro/upptime",
"url": "https://github.com/keviocastro/upptime/issues/30936",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2564605708 | 🛑 FX blog (kty) is down
In 0ad23da, FX blog (kty) (https://foxter-blogeditor.konecty.com) was down:
HTTP code: 0
Response time: 0 ms
Resolved: FX blog (kty) is back up in 30e14b0 after 7 minutes.
| gharchive/issue | 2024-10-03T17:30:37 | 2025-04-01T06:39:16.731811 | {
"authors": [
"keviocastro"
],
"repo": "keviocastro/upptime",
"url": "https://github.com/keviocastro/upptime/issues/32480",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2582457972 | 🛑 FX blog (kty) is down
In deb8ae3, FX blog (kty) (https://foxter-blogeditor.konecty.com) was down:
HTTP code: 0
Response time: 0 ms
Resolved: FX blog (kty) is back up in 54f4d26 after 7 minutes.
| gharchive/issue | 2024-10-12T03:45:25 | 2025-04-01T06:39:16.734187 | {
"authors": [
"keviocastro"
],
"repo": "keviocastro/upptime",
"url": "https://github.com/keviocastro/upptime/issues/33962",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2599938243 | 🛑 FX corretor (kty) is down
In 8b9bb16, FX corretor (kty) (https://corretores.foxterciaimobiliaria.com.br) was down:
HTTP code: 0
Response time: 0 ms
Resolved: FX corretor (kty) is back up in 7cce434 after 6 minutes.
| gharchive/issue | 2024-10-20T04:00:19 | 2025-04-01T06:39:16.736571 | {
"authors": [
"keviocastro"
],
"repo": "keviocastro/upptime",
"url": "https://github.com/keviocastro/upptime/issues/35276",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2625802692 | 🛑 FX corretor (kty) is down
In 474864c, FX corretor (kty) (https://corretores.foxterciaimobiliaria.com.br) was down:
HTTP code: 0
Response time: 0 ms
Resolved: FX corretor (kty) is back up in bf942a9 after 29 minutes.
| gharchive/issue | 2024-10-31T02:09:57 | 2025-04-01T06:39:16.739217 | {
"authors": [
"keviocastro"
],
"repo": "keviocastro/upptime",
"url": "https://github.com/keviocastro/upptime/issues/37306",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2652343025 | 🛑 FX blog (kty) is down
In 6fbc0c9, FX blog (kty) (https://foxter-blogeditor.konecty.com) was down:
HTTP code: 0
Response time: 0 ms
Resolved: FX blog (kty) is back up in 706dfeb after 8 minutes.
| gharchive/issue | 2024-11-12T14:25:28 | 2025-04-01T06:39:16.741586 | {
"authors": [
"keviocastro"
],
"repo": "keviocastro/upptime",
"url": "https://github.com/keviocastro/upptime/issues/39625",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |