id (string, 4 to 10 chars) | text (string, 4 to 2.14M chars) | source (2 classes) | created (timestamp[s], 2001-05-16 21:05:09 to 2025-01-01 03:38:30) | added (string date, 2025-04-01 04:05:38 to 2025-04-01 07:14:06) | metadata (dict)
---|---|---|---|---|---|
598247493
|
Problem with new namespace autocompletion
When I create a new namespace with the C# snippet, I have to click away from the auto focus on the namespace name for namespace autocompletion to occur. This is a mild annoyance, as it would be cool to create a new namespace and immediately have this extension recognize the namespace for autocompletion.
Edit: This is a problem when typing the full namespace, haven't tried namespace-fill, which might work just fine.
Hi,
Sorry for not responding earlier. I didn't notice this issue 😳
I think it's really just how VSCode works. I'm not getting autocompletion from the official C# extension in snippets either.
The good thing is that you don't have to click away; you can pull up the autocompletion list (without exiting the snippet) by pressing Ctrl + Space.
@AdrianWilczynski that's okay :-)
Thank you for the suggestion, I'll try to make a habit of activating autocompletion manually in this case.
|
gharchive/issue
| 2020-04-11T11:31:29 |
2025-04-01T04:54:41.310451
|
{
"authors": [
"AdrianWilczynski",
"niem94"
],
"repo": "AdrianWilczynski/NamespaceAutocompletion",
"url": "https://github.com/AdrianWilczynski/NamespaceAutocompletion/issues/2",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2552762072
|
Fix secret weights
PR description
Fixed the secret weights: added a dot before the heretic weight.
Removed the chance of zombie meteorites for now.
Slightly increased the chances of revolution and heretic at the traitor's expense.
:cl:
tweak: Changed the roll chance of some game modes in secret
Well, a heretic with a weight of 15 is certainly overpowered. As far as I know, the game treats values like .15 as 0.15,
but this is a different matter.
|
gharchive/pull-request
| 2024-09-27T12:06:56 |
2025-04-01T04:54:41.315832
|
{
"authors": [
"1Stepka1",
"PyotrIgn"
],
"repo": "AdventureTimeSS14/space_station_ADT",
"url": "https://github.com/AdventureTimeSS14/space_station_ADT/pull/546",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
423265925
|
Sync with pivotal-cf/pcf-pipeline master
Thanks for submitting a pull request to pcf-pipelines.
To speed up the process of reviewing your pull request please provide us with:
A short explanation of the proposed change:
An explanation of the use cases your change solves:
Expected result after the change:
Current result before the change:
Links to any other associated PRs or issues:
[x] I have viewed, signed, and submitted the Contributor License Agreement
[x] I have made this pull request to the master branch
[x] I have run all the unit tests
Please sync Agile up with Pivotal's latest changes.
|
gharchive/pull-request
| 2019-03-20T13:58:32 |
2025-04-01T04:54:41.353358
|
{
"authors": [
"gerald-dev"
],
"repo": "Agile-Defense/pcf-pipelines",
"url": "https://github.com/Agile-Defense/pcf-pipelines/pull/1",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
609941445
|
PreviewModelDialog x rotation doesn't work
I have tried to use custom model rotation, but it doesn't work either. The x rotation doesn't work with any number, although the y and z rotations work perfectly. The x rotation parameter doesn't do anything.
Any idea about this?
Thank you.
I have fixed it on #42
Thanks
|
gharchive/issue
| 2020-04-30T12:51:13 |
2025-04-01T04:54:41.359875
|
{
"authors": [
"nathanramli"
],
"repo": "Agneese-Saini/SA-MP",
"url": "https://github.com/Agneese-Saini/SA-MP/issues/39",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
1272582849
|
XCFramework binary size too big
Hey all,
Are there any plans to reduce the binary frameworks' size? It seems to me the sizes are way too large.
Currently seeing AgoraRTCKit at ~430MB and BeQuic at 130MB.
Thanks.
@TheCoordinator I think the reason is that it includes the SDK for all architectures. Does it increase your final app product a lot as well? I believe not all files will be linked when building a release app.
Hi @plutoless, yes you are spot on. However when looking at the AppStore bundle size on my device (iPhone 11 Pro) my app size has indeed been increased by 40MB which I still would say is quite a lot. Obviously don't know what the absolute must have dependencies are. But this is still quite big. Wondering if there's a way I can reduce this given my use-case? i.e. only link dependencies that I need via a custom package?
@TheCoordinator
The baseline is that a media SDK will be a lot bigger than a normal SDK, mainly due to its complicated dependencies (e.g. decoder/encoder codecs), so a 40 MB increase does not sound too surprising to me.
Your options are:
See if you are using audio-only features. If you don't need video, this will be a big advantage when you want to reduce your app size. In this case you can choose to use our audio SDK, which is basically a slim SDK version that removes the video modules.
For some advanced features (like beauty), we have designed these features as separate dynamic libraries. If you are not using them, you can choose not to link them in Xcode, which may also help in this case.
Thanks for that @plutoless.
I am using liveBroadcasting mode so would need both video/audio and nothing fancy at all.
What libs do you recommend I drop from the list here so I can make sure I don't break anything?
https://github.com/agorabuilder/AgoraRtcEngine_iOS_Beta/blob/master/Package.swift
@TheCoordinator
this page should solve your concern,
https://docs.agora.io/en/Video/reduce_rtc_app_size?platform=iOS
OK thanks @plutoless.
Not sure if removing the extensions will change things dramatically, but I will give it a go anyway. In the long run, do you have any plans to reduce the video framework size?
Not in the pipeline for now. We'll raise it internally to see if a lightweight version could be possible in the future, as there are many benefits to that.
Thank you @maxxfrazer. That'll be hugely beneficial.
Hey @maxxfrazer, thanks for the update. I see there have been some changes since the beta in how products can be linked via SPM. Is this issue now resolved if I use RtcBasic?
Which, I assume, means I can use the bare minimum and none of the extensions.
You're correct @TheCoordinator, we'll update the README here to make that clearer. The other packages are for various things like background segmentation.
The largest package product, AgoraRtcKit, is still included in RtcBasic though, so won’t solve everything in terms of size.
Thanks @maxxfrazer, I think this is already an improvement. I got 20 MB back just from this change and removing unnecessary extensions, compared to the last beta. 👏🏼
But you're right, it'd be great if we could also have a more basic version of AgoraRtcKit in future updates.
Also, each xcframework contains multiple architectures: at least 1 for the physical device and 2 for the simulator. Only one will actually be bundled when it goes to a device.
They really add up when in the xcframework format!
Obviously the xcframework itself is not a huge issue; as discussed above, what gets linked based on the arch is what matters. If there is room for more improvement over that, it'd be highly appreciated.
@TheCoordinator I believe we've made some huge progress on reducing the app size of the XCFramework for the core SDK (those included in RtcBasic).
Great to hear @maxxfrazer. Will give it a go for our next version.
Going to assume it's better now, please re-open if you're still facing the issue.
|
gharchive/issue
| 2022-06-15T18:20:51 |
2025-04-01T04:54:41.370129
|
{
"authors": [
"TheCoordinator",
"maxxfrazer",
"plutoless"
],
"repo": "AgoraIO/AgoraRtcEngine_iOS",
"url": "https://github.com/AgoraIO/AgoraRtcEngine_iOS/issues/18",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
711737676
|
Error with Firebase User UID
Hi,
I keep getting the error message below. From Firebase I am getting a long value instead of an int; how do I get around this issue?
The log keeps pointing to this code:
await AgoraRtcEngine.joinChannel(
null, widget.selectedClass.classTitle, null, _authenticationService.currentUser.numberId);
E/MethodChannel#agora_rtc_engine(30185): Failed to handle method call
E/MethodChannel#agora_rtc_engine(30185): java.lang.ClassCastException: java.lang.Long cannot be cast to java.lang.Integer
E/MethodChannel#agora_rtc_engine(30185): at io.agora.agorartcengine.AgoraRtcEnginePlugin.onMethodCall(AgoraRtcEnginePlugin.java:138)
E/MethodChannel#agora_rtc_engine(30185): at io.flutter.plugin.common.MethodChannel$IncomingMethodCallHandler.onMessage(MethodChannel.java:226)
E/MethodChannel#agora_rtc_engine(30185): at io.flutter.embedding.engine.dart.DartMessenger.handleMessageFromDart(DartMessenger.java:85)
E/MethodChannel#agora_rtc_engine(30185): at io.flutter.embedding.engine.FlutterJNI.handlePlatformMessage(FlutterJNI.java:631)
E/MethodChannel#agora_rtc_engine(30185): at android.os.MessageQueue.nativePollOnce(Native Method)
E/MethodChannel#agora_rtc_engine(30185): at android.os.MessageQueue.next(MessageQueue.java:331)
E/MethodChannel#agora_rtc_engine(30185): at android.os.Looper.loop(Looper.java:142)
E/MethodChannel#agora_rtc_engine(30185): at android.app.ActivityThread.main(ActivityThread.java:6518)
E/MethodChannel#agora_rtc_engine(30185): at java.lang.reflect.Method.invoke(Native Method)
E/MethodChannel#agora_rtc_engine(30185): at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:438)
E/MethodChannel#agora_rtc_engine(30185): at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:807)
E/flutter (30185): [ERROR:flutter/lib/ui/ui_dart_state.cc(157)] Unhandled Exception: PlatformException(error, java.lang.Long cannot be cast to java.lang.Integer, null)
E/flutter (30185): #0 StandardMethodCodec.decodeEnvelope (package:flutter/src/services/message_codecs.dart:569:7)
E/flutter (30185): #1 MethodChannel._invokeMethod (package:flutter/src/services/platform_channel.dart:156:18)
E/flutter (30185): <asynchronous suspension>
E/flutter (30185): #2 MethodChannel.invokeMethod (package:flutter/src/services/platform_channel.dart:329:12)
E/flutter (30185): #3 AgoraRtcEngine.joinChannel (package:agora_rtc_engine/agora_rtc_engine.dart:459:41)
E/flutter (30185): #4 _WatchLiveStreamViewState.initialize (package:myfitlyfe/ui/views/watch_livestream_view.dart:82:26)
E/flutter (30185): <asynchronous suspension>
E/flutter (30185): #5 _WatchLiveStreamViewState.initState (package:myfitlyfe/ui/views/watch_livestream_view.dart:58:5)
E/flutter (30185): #6 StatefulElement._firstBuild (package:flutter/src/widgets/framework.dart:4640:58)
E/flutter (30185): #7 ComponentElement.mount (package:flutter/src/widgets/framework.dart:4476:5)
E/flutter (30185): #8 Element.inflateWidget (package:flutter/src/widgets/framework.dart:3446:14)
E/flutter (30185): #9 Element.updateChild (package:flutter/src/widgets/framework.dart:3214:18)
E/flutter (30185): #10 SingleChildRenderObjectElement.mount (package:flutter/src/widgets/framework.dart:5830:14)
E/flutter (30185): #11 Element.inflateWidget (package:flutter/src/widgets/framework.dart:3446:14)
E/flutter (30185): #12 Element.updateChild (package:flutter/src/widgets/framework.dart:3214:18)
E/flutter (30185): #13 ComponentElement.performRebuild (package:flutter/src/widgets/framework.dart:4527:16)
E/flutter (30185): #14 Element.rebuild (package:flutter/src/widgets/framework.dart:4218:5)
E/flutter (30185): #15 ComponentElement._firstBuild (package:flutter/src/widgets/framework.dart:4481:5)
E/flutter (30185): #16 ComponentElement.mount (package:flutter/src/widgets/framework.dart:4476:5)
E/flutter (30185): #17 Element.inflateWidget (package:flutter/src/widgets/framework.dart:3446:14)
E/flutter (30185): #18 Element.updateChild (package:flutter/src/widgets/framework.dart:3214:18)
E/flutter (30185): #19 SingleChildRenderObjectElement.mount (package:flutter/src/widgets/framework.dart:5830:14)
E/flutter (30185): #20 Element.inflateWidget (package:flutter/src/widgets/framework.dart:3446:14)
E/flutter (30185): #21 Element.updateChild (package:flutter/src/widgets/framework.dart:3214:18)
E/flutter (30185): #22 SingleChildRenderObjectElement.mount (package:flutter/src/widgets/framework.dart:5830:14)
E/flutter (30185): #23 Element.inflateWidget (package:flutter/src/widgets/framework.dart:3446:14)
E/flutter (30185): #24 Element.updateChild (package:flutter/src/widgets/framework.dart:3214:18)
E/flutter (30185): #25 ComponentElement.performRebuild (package:flutter/src/widgets/framework.dart:4527:16)
E/flutter (30185): #26 StatefulElement.performRebuild (package:flutter/src/widgets/framework.dart:4675:11)
E/flutter (30185): #27 Element.rebuild (package:flutter/src/widgets/framework.dart:4218:5)
E/flutter (30185): #28 ComponentElement._firstBuild (package:flutter/src/widgets/framework.dart:4481:5)
E/flutter (30185): #29 StatefulElement._firstBuild (package:flutter/src/widgets/framework.dart:4666:11)
E/flutter (30185): #30 ComponentElement.mount (package:flutter/src/widgets/framework.dart:4476:5)
E/flutter (30185): #31 Element.inflateWidget (package:flutter/src/widgets/framework.dart:3446:14)
E/flutter (30185): #32 Element.updateChild (package:flutter/src/widgets/framework.dart:3214:18)
E/flutter (30185): #33 SingleChildRenderObjectElement.mount (package:flutter/src/widgets/framework.dart:5830:14)
E/flutter (30185): #34 Element.inflateWidget (package:flutter/src/widgets/framework.dart:3446:14)
E/flutter (30185): #35 Element.updateChild (package:flutter/src/widgets/framework.dart:3214:18)
E/flutter (30185): #36 SingleChildRenderObjectElement.mount (package:flutter/src/widgets/framework.dart:5830:14)
E/flutter (30185): #37 Element.inflateWidget (package:flutter/src/widgets/framework.dart:3446:14)
E/flutter (30185): #38 Element.updateChild (package:flutter/src/widgets/framework.dart:3214:18)
E/flutter (30185): #39 ComponentElement.performRebuild (package:flutter/src/widgets/framework.dart:4527:16)
E/flutter (30185): #40 StatefulElement.performRebuild (package:flutter/src/widgets/framework.dart:4675:11)
E/flutter (30185): #41 Element.rebuild (package:flutter/src/widgets/framework.dart:4218:5)
E/flutter (30185): #42 ComponentElement._firstBuild (package:flutter/src/widgets/framework.dart:4481:5)
E/flutter (30185): #43 StatefulElement._firstBuild (package:flutter/src/widgets/framework.d
The uid must be an int; maybe you can use a string uid via joinChannelWithUserAccount.
Ok, let me try it out, thank you @LichKing-2234
@LichKing-2234 if I call joinChannelWithUserAccount, does that mean I will not need to call the Agora event handlers like AgoraRtcEngine.onJoinChannelSuccess and AgoraRtcEngine.onUserJoined?
Because at the moment, after replacing:
await AgoraRtcEngine.joinChannel(
null, widget.selectedClass.classTitle, null, _authenticationService.currentUser.numberId);
with
var result = {
"userAccount": '${_authenticationService.currentUser.numberId}',
"token": null,
"channelId": widget.selectedClass.classTitle,
};
await AgoraRtcEngine.joinChannelByUserAccount(result);
I am still getting the error.
The error seems to come from FirestoreService.
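For reference, a minimal Dart sketch of the 32-bit constraint behind the exception. This is not the fix recommended above (using a string user account); the masking idea and the helper name below are illustrative assumptions only:
```dart
// Sketch only: the platform channel rejects a java.lang.Long, so the numeric
// uid has to fit in a 32-bit int. Masking keeps the low 31 bits.
// Caution: distinct numberIds can collide after masking; the string-account
// join suggested above avoids that entirely.
int toAgoraUid(int numberId) => numberId & 0x7FFFFFFF;

// Hypothetical usage, mirroring the call from this issue:
// await AgoraRtcEngine.joinChannel(
//     null,
//     widget.selectedClass.classTitle,
//     null,
//     toAgoraUid(_authenticationService.currentUser.numberId));
```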
|
gharchive/issue
| 2020-09-30T08:18:29 |
2025-04-01T04:54:41.377373
|
{
"authors": [
"LichKing-2234",
"chilex111"
],
"repo": "AgoraIO/Flutter-SDK",
"url": "https://github.com/AgoraIO/Flutter-SDK/issues/189",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2291190094
|
Crude Oil Analysis
It involves data analysis and prediction of crude oil
@Akshat111111 Assign me this issue; I will work on it.
Assign this to me, I'm interested in working on this.
Can I work on this? My approach is to build a Random Forest crude oil price prediction model.
Hello, please assign this issue to me; I will give it my best.
Waiting for your response.
I'm interested in this, kindly assign it to me.
I would like to work on this issue. Please assign this task to me.
It involves data analysis and prediction of crude oil under the main folder- Hedging by crude oil
@apooyadv @tejasvinigoel let's give others some tasks; you can then improve upon them.
@amishhaa has done the work. Are you also working on a new feature? @Amarta113
@Akshat111111 Yes, I am working on it; the dataset is too large, therefore I am delayed.
If you are facing any issue in uploading the dataset, convert it to a zip file.
It's not ready to upload yet; I need some time to complete it. Is it necessary to upload it now?
Yes, it is needed.
@Akshat111111 I am interested in working on this issue . Kindly assign it to me if possible
One implementation is already there, so create a new issue with a specific functionality.
|
gharchive/issue
| 2024-05-12T06:32:43 |
2025-04-01T04:54:41.568934
|
{
"authors": [
"Akshat111111",
"Amarta113",
"Satyam0775",
"amishhaa",
"apooyadv",
"diptarup794",
"palayushi293",
"tejasvinigoel"
],
"repo": "Akshat111111/Hedging-of-Financial-Derivatives",
"url": "https://github.com/Akshat111111/Hedging-of-Financial-Derivatives/issues/59",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
1745768961
|
error NameError: name 'video_main_function' is not defined
I am a beginner in Python and NodeJS.
When I am running this app, I am getting this error:
NameError: name 'video_main_function' is not defined
on a backend terminal, I am getting this
INFO: Started server process [9272]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
INFO: 127.0.0.1:52834 - "GET /videoCreate HTTP/1.1" 405 Method Not Allowed
Any idea why I am getting these errors?
Oh, I made a mistake: in main.py, video_main_function was supposed to be video_main. I fixed the bug, so you can try again by cloning, or go to main.py and rename video_main_function to video_main on line 30 (path = ...).
|
gharchive/issue
| 2023-06-07T12:17:20 |
2025-04-01T04:54:41.572256
|
{
"authors": [
"AkshitIreddy",
"Thrqureshi"
],
"repo": "AkshitIreddy/AI-Powered-Video-Tutorial-Generator",
"url": "https://github.com/AkshitIreddy/AI-Powered-Video-Tutorial-Generator/issues/1",
"license": "Unlicense",
"license_type": "permissive",
"license_source": "github-api"
}
|
497082593
|
Demo does not run --syncAllFiles option deprecated {N} CLI 6
Make sure to check the demo app(s) for sample usage
Did this, demo does not run.
Make sure to check the existing issues in this repository
Didn't see this issue mentioned.
If the demo apps cannot help and there is no issue for your problem, tell us about it
The demo app does not run when following the instructions in Readme.md.
npm run demo.android
nativescript-material-components@1.0.2 demo.android /Users/charles/projects/kanayo/mymobile-native-26/current/mymobile-native/nativescript-material-components
cd ./demo && tns run android --syncAllFiles
The option 'syncAllFiles' is not supported.
Run 'tns run android --help' for more information.
Which platform(s) does your issue occur on?
Both
All versions
Emulator and device
Please, provide the following version numbers that your issue occurs with:
CLI: 6.1.2
Please, tell us how to recreate the issue in as much detail as possible.
Describe the steps to reproduce it.
npm i
npm run tsc
npm run demo.ios
npm run demo.android
Is there any code involved?
The solution is simple, unless we want to support loading node_modules in {N} < 6 AND {N} >= 6
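For illustration, the simple route is just dropping the deprecated flag from the demo scripts in package.json; a sketch (the actual script names and paths in the repo may differ):
```json
{
  "scripts": {
    "demo.android": "cd ./demo && tns run android",
    "demo.ios": "cd ./demo && tns run ios"
  }
}
```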
Thanks! scripts have been updated
Thanks, closing 👍
|
gharchive/issue
| 2019-09-23T12:59:04 |
2025-04-01T04:54:41.583220
|
{
"authors": [
"chuckmitchell",
"farfromrefug"
],
"repo": "Akylas/nativescript-material-components",
"url": "https://github.com/Akylas/nativescript-material-components/issues/58",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
985918378
|
feat(banner): banner removed from repo
Alaska Airlines Pull Request
Please include a summary of the change and which issue is fixed. Please also include relevant motivation and context. List any dependencies that are required for this change.
Fixes: #31
Summary:
This change removes auro-banner from auro-card.
Type of change:
Please delete options that are not relevant.
[ ] New capability
[ ] Revision of an existing capability
[ ] Infrastructure change (automation, etc.)
[x] Other - Removes the auro-banner feature set
Checklist:
[x] My update follows the CONTRIBUTING guidelines of this project
[x] I have performed a self-review of my own update
By submitting this Pull Request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
Pull Requests will be evaluated by their quality of update and whether it is consistent with the goals and values of this project. Any submission is to be considered a conversation between the submitter and the maintainers of this project and may require changes to your submission.
Thank you for your submission!
-- Auro Design System Team
Given that https://github.com/AlaskaAirlines/auro-card/pull/52/commits is released, this PR should be closed.
|
gharchive/pull-request
| 2021-09-02T01:03:03 |
2025-04-01T04:54:41.591115
|
{
"authors": [
"blackfalcon",
"jason-capsule42"
],
"repo": "AlaskaAirlines/auro-card",
"url": "https://github.com/AlaskaAirlines/auro-card/pull/34",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2557557381
|
Jjones/beta merge conflicts
Alaska Airlines Pull Request
Please include a summary of the change and which issue is fixed. Please also include relevant motivation and context. List any dependencies that are required for this change.
Resolves: # (issue, if applicable)
Summary:
Please summarize the scope of the changes you have submitted, what the intent of the work is and anything that describes the before/after state of the project.
Type of change:
Please delete options that are not relevant.
[ ] New capability
[ ] Revision of an existing capability
[ ] Infrastructure change (automation, etc.)
[ ] Other (please elaborate)
Checklist:
[ ] My update follows the CONTRIBUTING guidelines of this project
[ ] I have performed a self-review of my own update
By submitting this Pull Request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
Pull Requests will be evaluated by their quality of update and whether it is consistent with the goals and values of this project. Any submission is to be considered a conversation between the submitter and the maintainers of this project and may require changes to your submission.
Thank you for your submission!
-- Auro Design System Team
Summary by Sourcery
Simplify SCSS imports by removing file extensions in various style files to streamline the codebase.
Enhancements:
Simplify SCSS imports by removing file extensions across multiple style files.
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you all sign our Contributor License Agreement before we can accept your contribution. 2 out of 3 committers have signed the CLA.
:white_check_mark: jordanjones243
:white_check_mark: jason-capsule42
:x: semantic-release-bot
You have signed the CLA already but the status is still pending? Let us recheck it.
|
gharchive/pull-request
| 2024-09-30T20:15:07 |
2025-04-01T04:54:41.599390
|
{
"authors": [
"CLAassistant",
"jason-capsule42"
],
"repo": "AlaskaAirlines/auro-input",
"url": "https://github.com/AlaskaAirlines/auro-input/pull/337",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2746178406
|
Force dropdown bib min-width to not expand outside container #225
Alaska Airlines Pull Request
Type of change:
Please delete options that are not relevant.
[ ] New capability
[x] Revision of an existing capability
[ ] Infrastructure change (automation, etc.)
[ ] Other (please elaborate)
Checklist:
[x] My update follows the CONTRIBUTING guidelines of this project
[x] I have performed a self-review of my own update
By submitting this Pull Request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
Pull Requests will be evaluated by their quality of update and whether it is consistent with the goals and values of this project. Any submission is to be considered a conversation between the submitter and the maintainers of this project and may require changes to your submission.
Thank you for your submission!
-- Auro Design System Team
Summary by Sourcery
Bug Fixes:
Fix dropdown menu to prevent it from expanding outside its container by setting a minimum width.
:tada: This PR is included in version 3.3.2 :tada:
The release is available on:
npm package (@latest dist-tag)
GitHub release
Your semantic-release bot :package::rocket:
|
gharchive/pull-request
| 2024-12-17T22:02:25 |
2025-04-01T04:54:41.605229
|
{
"authors": [
"blackfalcon",
"rmenner"
],
"repo": "AlaskaAirlines/auro-select",
"url": "https://github.com/AlaskaAirlines/auro-select/pull/226",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1182874767
|
ForgeBlock encountered an error during the common setup event phase
Built the mod, all good and all, but when loading it up I get this.
So I tried running with Forge version 31.2.36 and I got a new error; perhaps I did not build the mod all good and all.
Hello! Is it possible you could send your full crash log?
Yep, sorry for the late reply: https://paste.ee/p/AhUN9
Ah, I see the mistake: when compiling the mod, the rpc library isn't being added. I'll see what I can do when I get home; otherwise you could mess around with the Gradle build file :)
|
gharchive/issue
| 2022-03-28T04:42:25 |
2025-04-01T04:54:41.653826
|
{
"authors": [
"MasterZitron",
"Si1kn"
],
"repo": "AlephInfinity1/ForgeBlock",
"url": "https://github.com/AlephInfinity1/ForgeBlock/issues/17",
"license": "CC0-1.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1931489789
|
404 xhr on icons.svg
Description
I have a 404 status on the XHR response for icons.svg.
return
send
http://ajax.googleapis.com/ajax/libs/jquery/3.3.1/jquery.min.js:2:80266
ajax
http://ajax.googleapis.com/ajax/libs/jquery/3.3.1/jquery.min.js:2:77118
I have modified the global path:
// SVG Path globally
svgPath : 'libjs/trumbowyg/dist/ui/icons.svg',
svgAbsoluteUseHref: false,
Locally, it's all right.
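For reference, a sketch of where those options go when initialising the editor; the '#editor' selector is a placeholder, and svgPath can also be set globally via $.trumbowyg.svgPath before init:
```js
// Sketch: per-instance options; '#editor' is a placeholder selector.
// The path must be reachable from the served page, otherwise the XHR 404s.
$('#editor').trumbowyg({
    svgPath: 'libjs/trumbowyg/dist/ui/icons.svg',
    svgAbsoluteUseHref: false
});
```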
|
gharchive/issue
| 2023-10-07T20:19:39 |
2025-04-01T04:54:41.668185
|
{
"authors": [
"SebMiao"
],
"repo": "Alex-D/Trumbowyg",
"url": "https://github.com/Alex-D/Trumbowyg/issues/1426",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
2596612535
|
Inconsistent Documentation For Build Versions (AVX/AVX2)
Problem
The repo readme recommends going to https://github.com/Alex313031/Thorium-AVX2 for AVX2 builds, which then redirects to https://github.com/Alex313031/Thorium-Win-AVX2
But according to https://github.com/Alex313031/Thorium-Win-AVX2/releases/tag/M120, AVX2 builds are being provided here from now on.
If unclear, I mean that the latest release in Thorium-Win-AVX2 says that from now on, builds are being provided at Thorium-Win instead.
|
gharchive/issue
| 2024-10-18T06:55:33 |
2025-04-01T04:54:41.679519
|
{
"authors": [
"slycordinator"
],
"repo": "Alex313031/Thorium-Win",
"url": "https://github.com/Alex313031/Thorium-Win/issues/287",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
2049717635
|
Build Issues
System Details
OS: Arch Linux
Thorium Version [e.g.118.0.5993.177]
Problem
Unable to generate ninja build files. All attempts to run gn args out/thorium result in
Waiting for editor on "/mnt/linuxextrassd/chromium/src/out/thorium/args.gn"...
Generating files...
ERROR at //build/config/compiler/BUILD.gn:183:15: Duplicate build argument declaration.
use_cxx17 = false
^----
Here you're declaring an argument that was already declared elsewhere.
You can only declare each argument once in the entire build so there is one
canonical place for documentation and the default value. Either move this
argument to the build config file (for visibility everywhere) or to a .gni file
that you "import" from the files where you need it (preferred).
See //build_overrides/build.gni:61:15: Previous declaration.
use_cxx17 = false
^----
See also "gn help buildargs" for more on how build arguments work.
See //build/config/BUILDCONFIG.gn:334:3: which caused the file to be included.
"//build/config/compiler:afdo_optimize_size",
^-------------------------------------------
Please note that the file referenced on the last line of the error changes each run.
Additional Notes
I have recloned multiple times and followed word for word the instructions in the build guide.
There seems to be an error occurring with circular references, and you need to make sure that there is only one copy of the full clone of the Chromium source code on the device, and that all dependencies are pulled.
Also, it is recommended to use Ubuntu or Debian systems for compilation.
@InventorXtreme
Is there any way to sort out the circular references? I really don't want to spin up a VM and reclone everything for the fourth time. Thanks for the response by the way.
Ok, by manually setting the version in the version script to the version on the releases page, I got past the gn script, but now I am unable to build because clang is not recognizing x86_64-v3 as a real arch. I will probably try to reclone everything tomorrow.
@InventorXtreme What you should do is follow the chromium documentation up to the point where it wants you to do the gn args out. At that point, then: (assuming depot_tools, the thorium repo, and the chromium repo are in $HOME).
Run ./trunk.sh to fetch tags and full git history.
Run ./version.sh to check out the Chromium repo at the revision Thorium is currently at.
Run ./setup.sh --help. This will show you the flags that can be used for the different platforms. For regular linux, you can run setup.sh with no arguments. For an AVX2 build like what you are wanting, use setup.sh --avx2
After this you can manually run gn args out/thorium, and use the args.gn appropriate for the platform. Make sure to update the PGO Profile location and version at the bottom, and you can add API Keys at the top if you wish.
Then you can either use the build scripts (build.sh, build_win.sh, etc.) to build for your platform. The scripts take one integer afterwards to tell ninja how many jobs to spawn.
For example, to build Thorium for Linux on an 8-core system:
./build.sh 8
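Putting those steps together, a rough shell sketch of the flow described above (the working directories are assumptions; the scripts and flags are the ones named in the instructions):
```sh
# Sketch of the sequence described above; directory layout assumes depot_tools,
# the thorium repo, and the chromium checkout are all in $HOME, as stated.
cd ~/thorium
./trunk.sh            # fetch tags and full git history
./version.sh          # check out Chromium at the revision Thorium targets
./setup.sh --avx2     # copy Thorium files over the Chromium tree for an AVX2 build
cd ~/chromium/src
gn args out/thorium   # paste the platform's args.gn; update the PGO profile path/version
cd ~/thorium
./build.sh 8          # build with 8 ninja jobs
```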
@InventorXtreme I updated the docs, see > https://thorium.rocks/docs/building.html
|
gharchive/issue
| 2023-12-20T02:16:19 |
2025-04-01T04:54:41.687196
|
{
"authors": [
"Alex313031",
"InventorXtreme",
"gz83"
],
"repo": "Alex313031/thorium",
"url": "https://github.com/Alex313031/thorium/issues/477",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
1139945111
|
Inline SubscribeInitialize?
Right now even if auto provisioning is enabled, clients need to call SubscribeInitialize to allocate on the broker before subscribing.
Feels like there may be a case for calling this inside Subscribe; not sure. It may be worth caching what's initialized and what's not if the change is made, though NATS lookups seem very fast.
In general, what should probably happen is moving the configuration check into the publisher/subscriber and calling SubscribeInitialize if it's enabled in the subscriber. The publisher can still call it directly.
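For illustration, a rough Go sketch of what the subscriber-side check could look like. The wrapper type, constructor, and config field are assumptions, not this module's actual API; only the watermill interfaces (message.Subscriber, message.SubscribeInitializer) are real:
```go
package jetstreamx

import (
	"context"

	"github.com/ThreeDotsLabs/watermill/message"
)

// autoInitSubscriber is a hypothetical wrapper around a watermill subscriber:
// when auto-provisioning is enabled, it calls SubscribeInitialize once per
// topic before delegating to Subscribe.
type autoInitSubscriber struct {
	sub           message.Subscriber
	autoProvision bool
	initialized   map[string]bool // topics already initialized on the broker
}

func newAutoInitSubscriber(sub message.Subscriber, autoProvision bool) *autoInitSubscriber {
	return &autoInitSubscriber{sub: sub, autoProvision: autoProvision, initialized: map[string]bool{}}
}

func (s *autoInitSubscriber) Subscribe(ctx context.Context, topic string) (<-chan *message.Message, error) {
	if s.autoProvision && !s.initialized[topic] {
		if init, ok := s.sub.(message.SubscribeInitializer); ok {
			if err := init.SubscribeInitialize(topic); err != nil {
				return nil, err
			}
		}
		s.initialized[topic] = true
	}
	return s.sub.Subscribe(ctx, topic)
}

func (s *autoInitSubscriber) Close() error { return s.sub.Close() }
```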
|
gharchive/issue
| 2022-02-16T12:20:56 |
2025-04-01T04:54:41.694257
|
{
"authors": [
"AlexCuse"
],
"repo": "AlexCuse/watermill-jetstream",
"url": "https://github.com/AlexCuse/watermill-jetstream/issues/7",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2065817647
|
Getting scrambled frames for portrait videos
By using the example code to read a video and write the output directly and save the frames, I am getting all frames and the output video scrambled for certain input videos. This seems to only happen in portrait videos as far as I can tell.
This is an example frame, as you can see it looks like the stride is off:
I'll submit a PR with the fix that I am using
I can add a test if you would like, but I will need to add a new test video for this, as it does not happen with the current test videos
|
gharchive/issue
| 2024-01-04T15:03:56 |
2025-04-01T04:54:41.708847
|
{
"authors": [
"lgaribaldi"
],
"repo": "AlexEidt/Vidio",
"url": "https://github.com/AlexEidt/Vidio/issues/10",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2453243190
|
it doesn't work
What Python version are you using?
What Python version are you using?
Python 3.12.4
What's the version of psutil you have?
C:\Users\AlexFlipnote>python --version
Python 3.12.4
>>> import psutil
>>> psutil.__version__
'6.0.0'
Tested with my local machine and 6.0.0 works on my end at least, which seems to be latest version.
What's the version of psutil you have?
C:\Users\AlexFlipnote>python --version
Python 3.12.4
>>> import psutil
>>> psutil.__version__
'6.0.0'
Tested with my local machine and 6.0.0 works on my end at least, which seems to be latest version.
It says I have psutil installed, but when I try to use anything psutil-related it says I don't have psutil.
It feels a bit out of scope for neofetch, since with the same settings that you have, but on my own machine, it works. I'd recommend checking your Windows to see if psutil simply has no access to its DLL; maybe your Python and Windows are on two different partitions, etc.
You could also try pip install psutil==5.9.3 and see if a lower version works for your case.
pip install psutil==5.9.3
Collecting psutil==5.9.3
Downloading psutil-5.9.3.tar.gz (483 kB)
Installing build dependencies ... done
Getting requirements to build wheel ... done
Preparing metadata (pyproject.toml) ... done
Building wheels for collected packages: psutil
Building wheel for psutil (pyproject.toml) ... error
error: subprocess-exited-with-error
× Building wheel for psutil (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [38 lines of output]
running bdist_wheel
running build
running build_py
creating build
creating build\lib.win-amd64-cpython-312
creating build\lib.win-amd64-cpython-312\psutil
copying psutil_common.py -> build\lib.win-amd64-cpython-312\psutil
copying psutil_compat.py -> build\lib.win-amd64-cpython-312\psutil
copying psutil_psaix.py -> build\lib.win-amd64-cpython-312\psutil
copying psutil_psbsd.py -> build\lib.win-amd64-cpython-312\psutil
copying psutil_pslinux.py -> build\lib.win-amd64-cpython-312\psutil
copying psutil_psosx.py -> build\lib.win-amd64-cpython-312\psutil
copying psutil_psposix.py -> build\lib.win-amd64-cpython-312\psutil
copying psutil_pssunos.py -> build\lib.win-amd64-cpython-312\psutil
copying psutil_pswindows.py -> build\lib.win-amd64-cpython-312\psutil
copying psutil_init_.py -> build\lib.win-amd64-cpython-312\psutil
creating build\lib.win-amd64-cpython-312\psutil\tests
copying psutil\tests\runner.py -> build\lib.win-amd64-cpython-312\psutil\tests
copying psutil\tests\test_aix.py -> build\lib.win-amd64-cpython-312\psutil\tests
copying psutil\tests\test_bsd.py -> build\lib.win-amd64-cpython-312\psutil\tests
copying psutil\tests\test_connections.py -> build\lib.win-amd64-cpython-312\psutil\tests
copying psutil\tests\test_contracts.py -> build\lib.win-amd64-cpython-312\psutil\tests
copying psutil\tests\test_linux.py -> build\lib.win-amd64-cpython-312\psutil\tests
copying psutil\tests\test_memleaks.py -> build\lib.win-amd64-cpython-312\psutil\tests
copying psutil\tests\test_misc.py -> build\lib.win-amd64-cpython-312\psutil\tests
copying psutil\tests\test_osx.py -> build\lib.win-amd64-cpython-312\psutil\tests
copying psutil\tests\test_posix.py -> build\lib.win-amd64-cpython-312\psutil\tests
copying psutil\tests\test_process.py -> build\lib.win-amd64-cpython-312\psutil\tests
copying psutil\tests\test_sunos.py -> build\lib.win-amd64-cpython-312\psutil\tests
copying psutil\tests\test_system.py -> build\lib.win-amd64-cpython-312\psutil\tests
copying psutil\tests\test_testutils.py -> build\lib.win-amd64-cpython-312\psutil\tests
copying psutil\tests\test_unicode.py -> build\lib.win-amd64-cpython-312\psutil\tests
copying psutil\tests\test_windows.py -> build\lib.win-amd64-cpython-312\psutil\tests
copying psutil\tests_init_.py -> build\lib.win-amd64-cpython-312\psutil\tests
copying psutil\tests_main_.py -> build\lib.win-amd64-cpython-312\psutil\tests
running build_ext
building 'psutil._psutil_windows' extension
error: Microsoft Visual C++ 14.0 or greater is required. Get it with "Microsoft C++ Build Tools": https://visualstudio.microsoft.com/visual-cpp-build-tools/
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for psutil
Failed to build psutil
ERROR: ERROR: Failed to build installable wheels for some pyproject.toml based projects (psutil)
I ALREADY HAVE C++ 14.0.85 INSTALLED
The error says you apparently do not, so there is not much I can say or do. Essentially, try different versions of psutil and see if that helps; I will look into whether the tool is required in later updates.
|
gharchive/issue
| 2024-08-07T11:24:53 |
2025-04-01T04:54:41.727306
|
{
"authors": [
"AlexFlipnote",
"StupidMurderDroneFanThatUsesArchBtw"
],
"repo": "AlexFlipnote/neofetch-win",
"url": "https://github.com/AlexFlipnote/neofetch-win/issues/27",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2101552847
|
Search Function
As a user
When I enter a query into the searchbar
Then I am redirected to a separate page displaying content which includes my query
Complete
|
gharchive/issue
| 2024-01-26T03:46:55 |
2025-04-01T04:54:41.738205
|
{
"authors": [
"AlexanderJGael",
"Jbrockhoff"
],
"repo": "AlexanderJGael/social-searcher",
"url": "https://github.com/AlexanderJGael/social-searcher/issues/10",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1353076935
|
🛑 Nyam is down
In be4acaa, Nyam ($SECRET_URL_NYAM) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Nyam is back up in 1e23fb7.
|
gharchive/issue
| 2022-08-27T15:42:55 |
2025-04-01T04:54:41.740582
|
{
"authors": [
"AlexanderKlimashevskiy"
],
"repo": "AlexanderKlimashevskiy/upptime",
"url": "https://github.com/AlexanderKlimashevskiy/upptime/issues/138",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1577489002
|
🛑 Home is down
In 157a80a, Home ($SECRET_URL_HOME) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Home is back up in 0e7ee34.
|
gharchive/issue
| 2023-02-09T08:55:17 |
2025-04-01T04:54:41.742657
|
{
"authors": [
"AlexanderKlimashevskiy"
],
"repo": "AlexanderKlimashevskiy/upptime",
"url": "https://github.com/AlexanderKlimashevskiy/upptime/issues/159",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2109385576
|
🛑 Nyam is down
In 6cbec54, Nyam ($SECRET_URL_NYAM) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Nyam is back up in b0d2975 after 7 minutes.
|
gharchive/issue
| 2024-01-31T06:32:29 |
2025-04-01T04:54:41.744769
|
{
"authors": [
"AlexanderKlimashevskiy"
],
"repo": "AlexanderKlimashevskiy/upptime",
"url": "https://github.com/AlexanderKlimashevskiy/upptime/issues/251",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
500342794
|
Added the code for enabling 1080p full screen resolution.
Many here had issues regarding the full screen mode. To enable 1080p resolution on VB try:
cd "c:\Program Files\Oracle\VirtualBox"
VBoxManage setextradata "YOUR VB NAME" VBoxInternal2/EfiGraphicsResolution 1920x1080
This code worked for me. Run this in CMD and then run the virtual machine.
Alternatively, you can set the EFI GOP mode instead:
VBoxManage setextradata "YOUR VB NAME" VBoxInternal2/EfiGopMode 4
Replace the mode number (N) with one of 0, 1, 2, 3, 4, 5. These numbers correspond to the screen resolutions 640x480, 800x600, 1024x768, 1280x1024, 1440x900, and 1920x1200, respectively.
|
gharchive/pull-request
| 2019-09-30T15:12:01 |
2025-04-01T04:54:41.749030
|
{
"authors": [
"dtpoirot",
"gokulmanohar"
],
"repo": "AlexanderWillner/runMacOSinVirtualBox",
"url": "https://github.com/AlexanderWillner/runMacOSinVirtualBox/pull/82",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1634976617
|
SBS96 signature assignment reproducibility using SigProfilerAssignment?
I ran SigProfilerAssignment for 396 HCC samples across a mutational matrix constructed using SigProfilerMatrixGenerator from PCAWG vcfs (lifted over to GRCh38) or vcfs from our own in house mutation calling pipeline (called using GRCh38 as ref.). Twenty-six of the samples were called in both pipelines (our lab submitted the LIRI-JP PCAWG data). Of some concern, the signature profiles sometimes show very large mis-matches for the assigned SBS signatures, so I don't know what values I can "trust". I could understand that would happen if the two pipelines were outputting very different calls for the same sample, but this occurs even when the values in the mutational matrix for a sample's duplicates appear to be almost identical (r=0.9986 for the two columns) and the SigProfilerMatrixGenerator output/plots/SBS_96_plots...pdf appear identical to me for those samples.
I have attached the JOB_METADATA_SPA file and an Excel file for one sample (RK001_T) with the mutational matrix data and SBS96 signatures that were called in either of the two samples.
As one can see looking at the SBS96 signature summary table, SBS5 was assigned in both samples, but SBS8, SBS46, and SBS92 were assigned only in LIRI-JP_RK001_T and SBS12 and SBS40 were assigned only to the data from my current somatic calling pipeline. I did not see such differences for ID or DBS in this sample.
JOB_METADATA_SPA.txt
RK001_T_matrix_and_COSMIC_sigs.xlsx
Dear @toddajohnson,
Please accept my apologies for not getting back to you sooner. Unfortunately, what you described is one of the biggest challenges that we currently face in the field of signature assignment. Considering the large pool of SBS reference signatures available, the possibility of different signature reconstructions having very similar reconstruction similarity is relatively high. Indeed, as you pointed out, almost no difference was observed for ID and DBS, with both having a much smaller number of COSMIC reference signatures.
However, there are several strategies to try to overcome this uncertainty. First, one can focus on those signatures previously related to the cancer type of interest, liver cancer in your particular case. Although this would restrict the analysis to previous knowledge, it would avoid unreliable assignment of signatures, such as SBS46 in your specific case, which is a signature related to a sequencing artifact observed in early releases of TCGA. You can exclude certain reference signatures from the assignment by using the exclude_signature_subgroups parameter (following the instructions on the README) or directly providing a modified signature database using the signature_database parameter. Other options can include checking the transcriptional strand bias of your samples compared to the one observed in the reference signatures (available in the topographical features section of the COSMIC Mutational Signatures website) or excluding certain etiologies based on additional information about the input samples (e.g., restricting the assignment of chemotherapy-related signatures in treatment-naive cases).
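For illustration, a Python sketch of excluding reference signatures during assignment. The cosmic_fit call and most argument names follow my reading of the SigProfilerAssignment README and may differ in your installed version; the paths and subgroup label are placeholders (only exclude_signature_subgroups and signature_database are named in the reply above):
```python
# Sketch: restrict the reference signatures used for assignment.
# Check the README for the exact subgroup strings and argument names.
from SigProfilerAssignment import Analyzer as Analyze

Analyze.cosmic_fit(
    samples="path/to/SBS96_matrix.txt",              # placeholder input matrix
    output="assignment_output",                      # placeholder output folder
    input_type="matrix",
    genome_build="GRCh38",
    exclude_signature_subgroups=["Artifact_signatures"],  # placeholder label
    # signature_database="path/to/custom_signatures.txt", # alternative: custom reference set
)
```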
I hope this helps, and please feel free to reopen the ticket or reach out by email (mdiazgay@health.ucsd.edu) if you have further comments or questions. Thanks for your interest!
|
gharchive/issue
| 2023-03-22T02:57:59 |
2025-04-01T04:54:41.755558
|
{
"authors": [
"marcos-diazg",
"toddajohnson"
],
"repo": "AlexandrovLab/SigProfilerAssignment",
"url": "https://github.com/AlexandrovLab/SigProfilerAssignment/issues/81",
"license": "BSD-2-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
1465185435
|
Missing tray icons for blueberry-tray and some other applications
Hi
I am using waybar on a sway based nixos install and when I try to run blueberry-tray or some other applications (such as youtube-music or mailspring) their tray icons do not show up in waybar even if they are activated.
When starting the apps I don't see any specific logs regarding an error. I don't know if it comes from my nixos install or from waybar itself but I was wondering how I can find the reason why it is not working.
waybar version: 0.9.15
blueberry-tray version: 1.4.8
youtube-music version: 1.17.0
mailspring version: 1.10.5
Closing this as I solved it by correctly installing libappindicator on NixOS.
|
gharchive/issue
| 2022-11-26T15:35:56 |
2025-04-01T04:54:41.758626
|
{
"authors": [
"aacebedo"
],
"repo": "Alexays/Waybar",
"url": "https://github.com/Alexays/Waybar/issues/1834",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1804235269
|
mpris when no player is active generates errors in logs
When no player is currently active, a lot of log noise is generated:
(waybar:5688): playerctl-WARNING **: 22:56:30.449: Spotify does not use the D-Bus property cache, getting properties directly
(waybar:5688): playerctl-WARNING **: 22:56:30.449: Spotify does not use the D-Bus property cache, getting properties directly
(waybar:5688): playerctl-WARNING **: 22:56:30.449: Spotify does not use the D-Bus property cache, getting properties directly
(waybar:5688): playerctl-WARNING **: 22:56:30.450: Spotify does not use the D-Bus property cache, getting properties directly
[2023-07-13 22:56:30.450] [error] mpris[playerctld]: GDBus.Error:com.github.altdesktop.playerctld.NoActivePlayer: No player is being controlled by playerctld
repeating every few seconds
Same issue; the error keeps repeating.
I also have this issue. I suspect that it is logged on this line: https://github.com/Alexays/Waybar/blob/a90e275d5e26226c9e69abbb6f9be4d7391ba3c1/src/modules/mpris/mpris.cpp#L570C1-L570C1
Is there any news ? I also have this problem and I'm afraid of battery impact
Is there any news ? I also have this problem and I'm afraid of battery impact
Since it's just logging, I doubt it has a significant impact on energy usage.
should be fixed by https://github.com/Alexays/Waybar/pull/2622
|
gharchive/issue
| 2023-07-14T05:58:05 |
2025-04-01T04:54:41.762213
|
{
"authors": [
"BijanRegmi",
"MithicSpirit",
"NoahFraiture",
"oniGino",
"robertgzr"
],
"repo": "Alexays/Waybar",
"url": "https://github.com/Alexays/Waybar/issues/2312",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2273979000
|
No json crash with Hyprland
I'm using the hyprland/workspaces module with the following config:
"hyprland/workspaces": {
"format": "{id}"
},
I'm not sure what triggers this, but sometimes (randomly?) the entire waybar crashes with it. It prints the following message in the log:
[2024-05-01 20:02:35.208] [error] Hyprland IPC: Couldn't write (4)
[2024-05-01 20:02:35.208] [error] hyprland/workspaces: Error parsing JSON: * Line 1, Column 1
Syntax error: value, object or array expected.
Can you try to run waybar with the -l trace argument? This should give more info about what happens before the crash.
Which Hyprland and Waybar versions are you running?
This is happening to me too. I am on NixOS (24.05); waybar is on version v0.10.2.
[2024-05-02 08:59:45.196] [debug] Try expanding: $XDG_CONFIG_HOME/waybar/config.jsonc
[2024-05-02 08:59:45.196] [debug] Try expanding: $HOME/.config/waybar/config
[2024-05-02 08:59:45.196] [debug] Found config file: $HOME/.config/waybar/config
[2024-05-02 08:59:45.196] [info] Using configuration file /home/nathannix/.config/waybar/config
[2024-05-02 08:59:45.199] [debug] Try expanding: $XDG_CONFIG_HOME/waybar/style.css
[2024-05-02 08:59:45.199] [debug] Try expanding: $HOME/.config/waybar/style.css
[2024-05-02 08:59:45.199] [debug] Found config file: $HOME/.config/waybar/style.css
[2024-05-02 08:59:45.199] [info] Using CSS file /home/nathannix/.config/waybar/style.css
[2024-05-02 08:59:45.206] [debug] Output detection done: eDP-1 (BOE 0x0819)
[2024-05-02 08:59:45.211] [debug] window-rewrite is not defined or is not an object, using default rules.
[2024-05-02 08:59:45.211] [info] Hyprland IPC starting
[2024-05-02 08:59:45.212] [error] Hyprland IPC: Unable to connect?
[2024-05-02 08:59:45.212] [error] Hyprland IPC: Couldn't connect to /tmp/hypr/ed58cc4c31c21e09ee780d0df818afe935181cd9_1714615262_233001966/.socket.sock. (3)
[2024-05-02 08:59:45.212] [warning] module hyprland/workspaces: Disabling module "hyprland/workspaces", Error parsing JSON: * Line 1, Column 1
Syntax error: value, object or array expected.
[2024-05-02 08:59:45.213] [error] Hyprland IPC: Couldn't connect to /tmp/hypr/ed58cc4c31c21e09ee780d0df818afe935181cd9_1714615262_233001966/.socket.sock. (3)
[2024-05-02 08:59:45.213] [error] hyprland language initLanguage failed with basic_string::substr: __pos (which is 7) > this->size() (which is 0)
[2024-05-02 08:59:45.271] [debug] GTK widget tree:
window#waybar..background.top.eDP-1.mode-default:dir(ltr)
decoration:dir(ltr)
box.horizontal:dir(ltr)
box.horizontal.modules-left:dir(ltr)
widget:dir(ltr)
label#custom-launcher.module:dir(ltr)
widget:dir(ltr)
label#custom-playerctl.backward.module:dir(ltr)
widget:dir(ltr)
label#custom-playerctl.play.module:dir(ltr)
widget:dir(ltr)
label#custom-playerctl.forward.module:dir(ltr)
widget:dir(ltr)
box#tray.horizontal.module:dir(ltr)
box.horizontal.modules-center:dir(ltr)
box.horizontal.modules-right:dir(ltr)
widget:dir(ltr)
label#cpu.module:dir(ltr)
widget:dir(ltr)
label#memory.module:dir(ltr)
widget:dir(ltr)
label#language.module:dir(ltr)
widget:dir(ltr)
label#pulseaudio.module:dir(ltr)
widget:dir(ltr)
label#backlight.module:dir(ltr)
widget:dir(ltr)
label#battery.module:dir(ltr)
widget:dir(ltr)
label#clock.module:dir(ltr)
[2024-05-02 08:59:45.385] [info] Bar configured (width: 1920, height: 47) for output: eDP-1
[2024-05-02 08:59:45.394] [trace] Set tray item property: :1.9.AttentionAccessibleDesc = ''
[2024-05-02 08:59:45.394] [trace] Set tray item property: :1.9.AttentionIconName = ''
[2024-05-02 08:59:45.394] [trace] Set tray item property: :1.9.Category = 'SystemServices'
[2024-05-02 08:59:45.394] [trace] Set tray item property: :1.9.IconAccessibleDesc = 'Wi-Fi network connection “KAM-2” active: KAM-2 (54%)'
[2024-05-02 08:59:45.394] [trace] Set tray item property: :1.9.IconName = 'nm-signal-75'
[2024-05-02 08:59:45.394] [trace] Set tray item property: :1.9.IconThemePath = ''
[2024-05-02 08:59:45.394] [trace] Set tray item property: :1.9.Id = 'nm-applet'
[2024-05-02 08:59:45.394] [trace] Set tray item property: nm-applet.Menu = '/org/ayatana/NotificationItem/nm_applet/Menu'
[2024-05-02 08:59:45.395] [trace] Set tray item property: nm-applet.Status = 'Active'
[2024-05-02 08:59:45.395] [trace] Set tray item property: nm-applet.Title = 'Network'
[2024-05-02 08:59:45.395] [trace] Set tray item property: nm-applet.XAyatanaLabel = ''
[2024-05-02 08:59:45.395] [trace] Set tray item property: nm-applet.XAyatanaLabelGuide = ''
[2024-05-02 08:59:45.395] [trace] Set tray item property: nm-applet.XAyatanaOrderingIndex = 0
[2024-05-02 08:59:45.415] [trace] Set tray item property: :1.50.AttentionIconName = ''
[2024-05-02 08:59:45.415] [trace] Set tray item property: :1.50.AttentionIconPixmap = []
[2024-05-02 08:59:45.416] [trace] Set tray item property: :1.50.AttentionMovieName = ''
[2024-05-02 08:59:45.416] [trace] Set tray item property: :1.50.Category = 'ApplicationStatus'
[2024-05-02 08:59:45.416] [trace] Set tray item property: :1.50.IconName = 'telegram-panel'
[2024-05-02 08:59:45.416] [trace] Set tray item property: :1.50.IconPixmap = a(iiay)
[2024-05-02 08:59:45.416] [trace] Set tray item property: :1.50.Id = 'TelegramDesktop'
[2024-05-02 08:59:45.416] [trace] Set tray item property: TelegramDesktop.ItemIsMenu = false
[2024-05-02 08:59:45.416] [trace] Set tray item property: TelegramDesktop.Menu = '/MenuBar'
[2024-05-02 08:59:45.416] [trace] Set tray item property: TelegramDesktop.OverlayIconName = ''
[2024-05-02 08:59:45.416] [trace] Set tray item property: TelegramDesktop.OverlayIconPixmap = []
[2024-05-02 08:59:45.416] [trace] Set tray item property: TelegramDesktop.Status = 'Active'
[2024-05-02 08:59:45.416] [trace] Set tray item property: TelegramDesktop.Title = 'TelegramDesktop'
[2024-05-02 08:59:45.416] [trace] Set tray item property: TelegramDesktop.ToolTip = ('', [], 'Telegram Desktop', '')
[2024-05-02 08:59:45.421] [trace] Set tray item property: org.kde.StatusNotifierItem-2728-2.AttentionIconName = ''
[2024-05-02 08:59:45.421] [trace] Set tray item property: org.kde.StatusNotifierItem-2728-2.AttentionIconPixmap = []
[2024-05-02 08:59:45.421] [trace] Set tray item property: org.kde.StatusNotifierItem-2728-2.AttentionMovieName = ''
[2024-05-02 08:59:45.422] [trace] Set tray item property: org.kde.StatusNotifierItem-2728-2.Category = 'ApplicationStatus'
[2024-05-02 08:59:45.422] [trace] Set tray item property: org.kde.StatusNotifierItem-2728-2.IconName = ''
[2024-05-02 08:59:45.422] [trace] Set tray item property: org.kde.StatusNotifierItem-2728-2.IconPixmap = a(iiay)
[2024-05-02 08:59:45.422] [trace] Set tray item property: org.kde.StatusNotifierItem-2728-2.Id = 'ViberPC'
[2024-05-02 08:59:45.422] [trace] Set tray item property: ViberPC.ItemIsMenu = false
[2024-05-02 08:59:45.422] [trace] Set tray item property: ViberPC.Menu = '/MenuBar'
[2024-05-02 08:59:45.422] [trace] Set tray item property: ViberPC.OverlayIconName = ''
[2024-05-02 08:59:45.422] [trace] Set tray item property: ViberPC.OverlayIconPixmap = []
[2024-05-02 08:59:45.422] [trace] Set tray item property: ViberPC.Status = 'Active'
[2024-05-02 08:59:45.422] [trace] Set tray item property: ViberPC.Title = 'ViberPC'
[2024-05-02 08:59:45.422] [trace] Set tray item property: ViberPC.ToolTip = ('', [], 'Viber ', '')
[2024-05-02 08:59:45.422] [trace] Set tray item property: org.kde.StatusNotifierItem-2779-1.AttentionIconName = ''
[2024-05-02 08:59:45.422] [trace] Set tray item property: org.kde.StatusNotifierItem-2779-1.AttentionIconPixmap = []
[2024-05-02 08:59:45.422] [trace] Set tray item property: org.kde.StatusNotifierItem-2779-1.AttentionMovieName = ''
[2024-05-02 08:59:45.422] [trace] Set tray item property: org.kde.StatusNotifierItem-2779-1.Category = 'SystemServices'
[2024-05-02 08:59:45.422] [trace] Set tray item property: org.kde.StatusNotifierItem-2779-1.IconName = 'zero-trust-disconnected'
[2024-05-02 08:59:45.422] [trace] Set tray item property: org.kde.StatusNotifierItem-2779-1.IconPixmap = a(iiay)
[2024-05-02 08:59:45.422] [trace] Set tray item property: org.kde.StatusNotifierItem-2779-1.IconThemePath = ''
[2024-05-02 08:59:45.423] [trace] Set tray item property: org.kde.StatusNotifierItem-2779-1.Id = 'zero-trust-client'
[2024-05-02 08:59:45.423] [trace] Set tray item property: zero-trust-client.ItemIsMenu = false
[2024-05-02 08:59:45.423] [trace] Set tray item property: zero-trust-client.Menu = '/MenuBar'
[2024-05-02 08:59:45.423] [trace] Set tray item property: zero-trust-client.OverlayIconName = ''
[2024-05-02 08:59:45.423] [trace] Set tray item property: zero-trust-client.OverlayIconPixmap = []
[2024-05-02 08:59:45.423] [trace] Set tray item property: zero-trust-client.Status = 'Active'
[2024-05-02 08:59:45.423] [trace] Set tray item property: zero-trust-client.Title = 'Zero Trust Client'
[2024-05-02 08:59:45.423] [trace] Set tray item property: zero-trust-client.ToolTip = ('', [], '', '')
[2024-05-02 08:59:45.423] [trace] Set tray item property: zero-trust-client.WindowId = 0
[2024-05-02 08:59:46.556] [trace] Set tray item property: :1.104.Category = 'Hardware'
[2024-05-02 08:59:46.556] [trace] Set tray item property: :1.104.IconName = 'blueman-tray'
[2024-05-02 08:59:46.556] [trace] Set tray item property: :1.104.Id = 'blueman'
[2024-05-02 08:59:46.556] [trace] Set tray item property: blueman.ItemIsMenu = false
[2024-05-02 08:59:46.556] [trace] Set tray item property: blueman.Menu = '/org/blueman/sni/menu'
[2024-05-02 08:59:46.556] [trace] Set tray item property: blueman.Status = 'Active'
[2024-05-02 08:59:46.556] [trace] Set tray item property: blueman.Title = 'blueman'
[2024-05-02 08:59:46.556] [trace] Set tray item property: blueman.ToolTip = ('', [], 'Bluetooth Enabled', '')
I have the same issue, running hyprland-git from the AUR, so currently on Hyprland v0.39.1-124-g56de72f3, with waybar v0.10.2, with the same sort of output as RustLover2910, just 3 times as much because I have 3 screens with separate waybar configs.
Same issue with the current git release (a3ca016) on NixOS.
Just released a new version 0.10.3 :)
|
gharchive/issue
| 2024-05-01T18:34:51 |
2025-04-01T04:54:41.768851
|
{
"authors": [
"AggressiveHayBale",
"Alexays",
"BeauTaapken",
"NobodyForNothing",
"RustLover2910",
"zjeffer"
],
"repo": "Alexays/Waybar",
"url": "https://github.com/Alexays/Waybar/issues/3192",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1421361189
|
Implement About Page
Context
Add the necessary components and implement them on the about page.
@davidsmejia This PR will be created based on the branch nozomione/26-create-global-components-for-app-structure, since it depends on issue #26. Thank you!
|
gharchive/issue
| 2022-10-24T20:01:08 |
2025-04-01T04:54:41.830114
|
{
"authors": [
"davidsmejia",
"nozomione"
],
"repo": "AlexsLemonade/refinebio-web",
"url": "https://github.com/AlexsLemonade/refinebio-web/issues/30",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
1437471955
|
FIX - SONOFF BULB B02-BL seen as switch and no dimmer or temperature options
SONOFF B02-BL seen as switch and no dimmer or temperature option
I found no reference to a fix for this or any mention of a workaround.
SOLUTION
On the device, go to DOWNLOAD DIAGNOSTICS, open the file and find the UIID (mine was 135).
open /config/custom_components/sonoff/core/devices.py
You should find that the entry for UIID 135 is missing.
Add the line:
135: [XLightB02, RSSI], # Sonoff B02-BL
The B02-BL should now be visible as a light in your drop-downs and you will be able to add the dimmer and the cool temperature.
Hope this helps anyone.
DIPDIPOTATOCH1P
This needs to be added/pulled into the code base, please.
Thanks!
Fix worked a treat!
Thanks for sharing.
Just what I needed. Thank you!
A big thank you!!!!!
https://github.com/AlexxIT/SonoffLAN/releases/tag/v3.4.0
|
gharchive/issue
| 2022-11-06T17:02:38 |
2025-04-01T04:54:41.838295
|
{
"authors": [
"AlexxIT",
"Cruzy404",
"ZiggySatdust",
"camsaway",
"dipdipotat0chip",
"santiniuk"
],
"repo": "AlexxIT/SonoffLAN",
"url": "https://github.com/AlexxIT/SonoffLAN/issues/1026",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1225200169
|
Sonoff TH10 slow update interval after 3.0.0
I use two Sonoff TH10s to monitor and control incubator temperature and humidity. I need fast and accurate readings to keep those in check. The only way to have reliable readings and reactions is to use older versions (2.4.7 works great) with these settings in configuration.yaml:
mode: local
force_update: [temperature, humidity]
scan_interval: "00:00:30"
sensors: [temperature, humidity]
Without those settings, temperature and humidity go up or down way too much until an update is received. 5 minutes is bad for my use, but even so the update sometimes takes more than 10 minutes. Another way to update the sensors is by toggling the switch, but that interferes with automations.
Is there any way I can force the update in the new versions? Are the settings in the configuration ignored in the new version?
Added to latest master version. Will be in next release
https://github.com/AlexxIT/SonoffLAN#force-update
|
gharchive/issue
| 2022-05-04T10:42:30 |
2025-04-01T04:54:41.841194
|
{
"authors": [
"AlexxIT",
"lgxmedia"
],
"repo": "AlexxIT/SonoffLAN",
"url": "https://github.com/AlexxIT/SonoffLAN/issues/814",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
679540074
|
Multiple stations.
The question came up of how to connect 2 stations to Telegram. At least with two bots.
https://gist.github.com/AlexxIT/dc42882c44e298d41631720f146e701d
As far as I know, HA can't handle two bots.
|
gharchive/issue
| 2020-08-15T10:14:54 |
2025-04-01T04:54:41.842442
|
{
"authors": [
"AlexxIT",
"Securond"
],
"repo": "AlexxIT/YandexStation",
"url": "https://github.com/AlexxIT/YandexStation/issues/49",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2626402014
|
Divide and Conquer Maximum Subarray Problem
The Maximum Subarray Problem is a classical algorithmic problem that involves finding the contiguous subarray within a one-dimensional array of numbers which has the largest sum. This problem can be efficiently solved using the Divide and Conquer approach.
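For illustration, here is a minimal sketch of the divide-and-conquer approach, written in Python for brevity (the repository itself collects C implementations, and the function names below are mine): the answer is the best of the left half, the right half, and the best sum that crosses the midpoint, giving O(n log n) overall.

```python
def max_crossing_sum(arr, lo, mid, hi):
    # Best sum of a subarray that is forced to cross the midpoint.
    left_best, total = float("-inf"), 0
    for i in range(mid, lo - 1, -1):
        total += arr[i]
        left_best = max(left_best, total)
    right_best, total = float("-inf"), 0
    for i in range(mid + 1, hi + 1):
        total += arr[i]
        right_best = max(right_best, total)
    return left_best + right_best


def max_subarray(arr, lo=0, hi=None):
    # Largest sum of any contiguous subarray (assumes a non-empty list).
    if hi is None:
        hi = len(arr) - 1
    if lo == hi:
        return arr[lo]
    mid = (lo + hi) // 2
    return max(max_subarray(arr, lo, mid),
               max_subarray(arr, mid + 1, hi),
               max_crossing_sum(arr, lo, mid, hi))


print(max_subarray([-2, 1, -3, 4, -1, 2, 1, -5, 4]))  # 6, from [4, -1, 2, 1]
```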
Do not create multiple issues.
|
gharchive/issue
| 2024-10-31T09:34:49 |
2025-04-01T04:54:41.859722
|
{
"authors": [
"Subashree-selvaraj",
"pankaj-bind"
],
"repo": "AlgoGenesis/C",
"url": "https://github.com/AlgoGenesis/C/issues/1525",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2646242667
|
[NEW ALGORITHM] Cheapest Flights Within K Stops Graph Probelm
Issue will be closed if:
the problem is about finding the cheapest way to travel between two cities given a set of flights, while also considering a limit on the number of stops. It tests your ability to work with graphs and optimize paths based on costs and constraints.
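One standard way to respect the stop limit is a Bellman-Ford-style relaxation run at most K+1 times; here is a small illustrative sketch in Python (chosen for brevity even though the repository targets C; the function name and sample data are mine):

```python
def cheapest_flight(n, flights, src, dst, k):
    # flights: list of (u, v, price); at most k intermediate stops allowed.
    INF = float("inf")
    cost = [INF] * n
    cost[src] = 0
    for _ in range(k + 1):              # k stops => at most k + 1 flights
        nxt = cost[:]                   # relax only against last round's values
        for u, v, price in flights:
            if cost[u] + price < nxt[v]:
                nxt[v] = cost[u] + price
        cost = nxt
    return cost[dst] if cost[dst] < INF else -1


flights = [(0, 1, 100), (1, 2, 100), (0, 2, 500)]
print(cheapest_flight(3, flights, 0, 2, 1))  # 200, via city 1
```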
Name:
Cheapest Flights Within K Stops
About:
Propose a new algorithm to be added to the repository
Labels:
new algorithm, gssoc-ext, hacktoberfest, level1
Assignees:
[x] Contributor in GSSoC-ext
[x] Want to work on it
@pankaj-bind pls assign me this issue!
@pankaj-bind !
Do not create multiple issues.
|
gharchive/issue
| 2024-11-09T15:38:54 |
2025-04-01T04:54:41.862853
|
{
"authors": [
"SimranShaikh20",
"pankaj-bind"
],
"repo": "AlgoGenesis/C",
"url": "https://github.com/AlgoGenesis/C/issues/1807",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2596995391
|
[NEW ALGORITHM] Memory Management in C (malloc, calloc, realloc, free)
Description:
A comprehensive understanding of memory management is crucial for effective C programming. This issue proposes the addition of a dedicated section in the repository to cover memory management techniques, specifically focusing on the following:
1. Dynamic Memory Allocation:
Explain the importance of dynamic memory allocation in C programming.
Introduce functions like malloc, calloc, and realloc for allocating memory at runtime.
2. Memory Deallocation:
Describe how to properly deallocate memory using the free function to prevent memory leaks.
3. Best Practices:
Discuss best practices for managing memory in C, including checking for null pointers, avoiding memory leaks, and understanding memory fragmentation.
Name:
[]
About:
Propose a new algorithm to be added to the repository
Labels:
new algorithm, gssoc-ext, hacktoberfest, level1
Assignees:
[x] Contributor in GSSoC-ext
[x] Want to work on it
Can you assign this to me?
Since I have a good grasp of the basics, I was wondering if you'd be up for assigning me a task related to malloc, calloc, realloc, and free.
Do not create unnecessary issues until you complete the previous one.
|
gharchive/issue
| 2024-10-18T09:34:55 |
2025-04-01T04:54:41.866963
|
{
"authors": [
"Swastimp",
"biswajit-sarkar-007",
"hari-dev-003",
"pankaj-bind"
],
"repo": "AlgoGenesis/C",
"url": "https://github.com/AlgoGenesis/C/issues/981",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
198434232
|
full colours
how good of a computer is required for 16777216 colors of one block type?
I wouldn't know, you would at least need "NotEnoughIds" to extend the block id limit.
FCB isn't that demanding, but that would use a sizeable amount of memory just to keep track of the ids and references. I imagine what's in MC to keep track of those blocks would exceed what FCB would use to accomplish said task.
|
gharchive/issue
| 2017-01-03T09:16:05 |
2025-04-01T04:54:41.883124
|
{
"authors": [
"AlgorithmX2",
"Pitigoi"
],
"repo": "AlgorithmX2/FlatColoredBlocks",
"url": "https://github.com/AlgorithmX2/FlatColoredBlocks/issues/35",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
930950332
|
Improve Osano detection
Example site - https://www.hobbycraft.co.uk/
Current detection -
"Osano": {
"cats": [
67
],
"description": "Osano is a data privacy platform that helps your website become compliant with laws such as GDPR and CCPA.",
"icon": "osano.png",
"scripts": "cookieconsent\\.min\\.js",
"website": "https://www.osano.com/"
},
To improve detection, look for JS object 'Osano'. Keep the current script detection as well
Hey there, updated.
|
gharchive/issue
| 2021-06-27T14:56:37 |
2025-04-01T04:54:41.888406
|
{
"authors": [
"nurbek91",
"rockeynebhwani"
],
"repo": "AliasIO/wappalyzer",
"url": "https://github.com/AliasIO/wappalyzer/issues/4072",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
941537997
|
Update i18n
Added missing french translations.
Merci!
|
gharchive/pull-request
| 2021-07-11T21:47:27 |
2025-04-01T04:54:41.889140
|
{
"authors": [
"AliasIO",
"Rom1-J"
],
"repo": "AliasIO/wappalyzer",
"url": "https://github.com/AliasIO/wappalyzer/pull/4181",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
2592593798
|
Description bar size fixing needed
The bar beneath the calculation bar needs a bit of adjustment in width to remove irregularity so that it becomes more visually appealing
Here is a reference of the above case marked in red color:
Kindly assign the issue to me
duplicate with #136
|
gharchive/issue
| 2024-10-16T17:37:27 |
2025-04-01T04:54:41.954254
|
{
"authors": [
"Alitindrawan24",
"shreya-paul-17"
],
"repo": "Alitindrawan24/Binary-Calculator",
"url": "https://github.com/Alitindrawan24/Binary-Calculator/issues/137",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2684887034
|
Investigating validitiy of lick interval warnings
After merging in the lick interval percentage calculation, there have been more than expected instances of same-side and cross-side licks over 10%. We need to investigate whether this is an issue with the scopes or with the code.
Cross side issues per box:
ephys1A: 2
ephys3A: 1
ephys4A: 1
2B: 1
3C: 1
3D: 1
6A: 2
6B: 1
7A: 1
7C: 1
7D: 2
8D: 1
7B: 1
Same side issues per box:
ephys3A: 5
1A: 1
1D: 1
6A: 1
7B: 1
7C: 1
Updated list of issues per box (posted week of 11/25/2024):
Cross side issues per box:
ephys1: 5
ephys3: 1
ephys4: 2
2B: 1
3C: 1
3D: 1
6A: 3
6B: 3
6D: 2
7A: 2
7B: 3
7C: 7
7D: 5
8A: 1
8C: 1
8D: 2
9D: 2
Same side issues per box:
ephys3: 13
ephys4: 1
1A: 1
1C: 1
1D: 1
6A: 2
7B: 2
7C: 1
@ZhixiaoSu recommends changing the definition of cross-side licks
https://github.com/AllenNeuralDynamics/dynamic-foraging-task/issues/1026#issuecomment-2513330895
@JeremiahYCohen suggests decreasing 100ms to 50ms
Updated 12/23/24:
Cross side issues per box:
1B: 2
1C: 1
1D: 2
3C: 3
3D: 1
6A: 2
6B: 3
6D: 3
7A: 2
7B: 5
7C: 4
7D: 6
8A: 2
8B: 2
8C: 3
8D: 2
ephys1: 5
ephys3: 5
ephys4: 3
9D: 2
Same side issues per box:
1A: 3
1C: 3
1D: 3
2B: 1
3C: 2
6A: 2
6D: 2
7B: 3
7C: 1
8B: 1
ephys3: 15
ephys4: 2
This appears to be a very serious issue and might be worth a discussion in a separate place.
While some of the lick detection problems could potentially be mitigated by adjusting parameters and the sensitivity of the lick detector, I’ve observed that the Janelia lick detector often lacks stability and consistency across different mice and days. This seems to be an inherent hardware limitation of the device.
It might be worth discussing with SIPE the possibility of developing a more stable lick detector specifically for the behavior rig. It's ok if it has licking artefacts during ephys recording while we lack an ideal solution; that could be a separate issue.
@XX-Yin I agree we need to discuss more. We are waiting on some feedback from Sue and will likely discuss in a meeting in January.
|
gharchive/issue
| 2024-11-22T23:05:36 |
2025-04-01T04:54:42.041257
|
{
"authors": [
"XX-Yin",
"alexpiet",
"ellahiltonvano",
"micahwoodard"
],
"repo": "AllenNeuralDynamics/dynamic-foraging-task",
"url": "https://github.com/AllenNeuralDynamics/dynamic-foraging-task/issues/1059",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
595368637
|
Remove doubled PK defaults for master client gc threshold
Defaults were set twice accidentally. Removed them.
Merged build finished. Test PASSed.
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/Alluxio-Pull-Request-Builder/9009/
Test PASSed.
alluxio-bot, merge this please
alluxio-bot, cherry-pick this to branch-2.1 please
alluxio-bot, cherry-pick this to branch-2.2 please
|
gharchive/pull-request
| 2020-04-06T19:26:59 |
2025-04-01T04:54:42.048888
|
{
"authors": [
"AmplabJenkins",
"ZacBlanco",
"ns1123"
],
"repo": "Alluxio/alluxio",
"url": "https://github.com/Alluxio/alluxio/pull/11241",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1284314519
|
Print snapshot file size
What changes are proposed in this pull request?
Just print the size of the snapshot file which has just been downloaded or generated.
Why are the changes needed?
Printing this snapshot file size can help to get more info when some issue is encountered.
Does this PR introduce any user facing changes?
No
@jiacheliu3 Would you please take a look at this PR? Thanks!
alluxio-bot, merge this please
|
gharchive/pull-request
| 2022-06-24T23:38:52 |
2025-04-01T04:54:42.050606
|
{
"authors": [
"jiacheliu3",
"maobaolong"
],
"repo": "Alluxio/alluxio",
"url": "https://github.com/Alluxio/alluxio/pull/15775",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1440281312
|
Improve job worker health report
What changes are proposed in this pull request?
Restructure the worker health report class so users of the class can get the worker health report at a point in time, plus it will be easy to add more system information to calculate the worker health.
#16502
Why are the changes needed?
It would be easy to add new system information from the worker, like memory, IO, etc., in calculating the worker health. The new structure of the class provides the worker health report at a point in time.
Does this PR introduce any user facing changes?
No
@vimalKeshu mostly LGTM, please fix the checkStyle. @luzhang6 PTAL since it's observability related
Yes, let me fix the style. Thank you @jja725.
alluxio-bot, merge this please
|
gharchive/pull-request
| 2022-11-08T14:11:39 |
2025-04-01T04:54:42.053030
|
{
"authors": [
"Xenorith",
"jja725",
"vimalKeshu"
],
"repo": "Alluxio/alluxio",
"url": "https://github.com/Alluxio/alluxio/pull/16503",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
137336301
|
[ALLUXIO-1739] Move Dataserver to Alluxio Worker
https://tachyon.atlassian.net/browse/ALLUXIO-1739
This is a step to implementing ALLUXIO-1739. After moving the data server out of the block worker, we can share the data server between different types of workers.
Merged build finished. Test PASSed.
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/Alluxio-Pull-Request-Builder/8709/
Test PASSed.
Merged build finished. Test PASSed.
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/Alluxio-Pull-Request-Builder/8710/
Test PASSed.
LGTM
|
gharchive/pull-request
| 2016-02-29T18:46:28 |
2025-04-01T04:54:42.056727
|
{
"authors": [
"AmplabJenkins",
"aaudiber",
"calvinjia",
"gpang"
],
"repo": "Alluxio/alluxio",
"url": "https://github.com/Alluxio/alluxio/pull/2779",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
170110963
|
[ALLUXIO-1893] Support deploying alluxio on secure yarn cluster
Send a PR because #3060 has been closed.
Thanks for @gpang @apc999 @aaudiber 's review~
https://alluxio.atlassian.net/browse/ALLUXIO-1893
alluxio-bot, check this please
alluxio-bot, check this please
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/Alluxio-Pull-Request-Builder/11058/
Test PASSed.
Merged build finished. Test PASSed.
LGTM
LGTM
|
gharchive/pull-request
| 2016-08-09T08:37:46 |
2025-04-01T04:54:42.059692
|
{
"authors": [
"AmplabJenkins",
"aaudiber",
"apc999",
"lshmouse"
],
"repo": "Alluxio/alluxio",
"url": "https://github.com/Alluxio/alluxio/pull/3806",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
251651432
|
[ALLUXIO-2997] Implement PUT bucket in S3RestServiceHandler.
This PR implemented http://docs.aws.amazon.com/AmazonS3/latest/API/RESTBucketPUT.html in Alluxio Proxy.
The following Python script using AWS S3 Python SDK to create a bucket works.
import boto
import boto.s3.connection
access_key = 'put your access key here!'
secret_key = 'put your secret key here!'
conn = boto.connect_s3(
aws_access_key_id = access_key,
aws_secret_access_key = secret_key,
host = 'localhost',
port = 39999,
path = '/api/v1/s3',
is_secure=False,
calling_format = boto.s3.connection.OrdinaryCallingFormat(),
)
# This will succeed.
conn.create_bucket('not-mount-point')
# This will raise an error formatted as XML with error message:
# "The specified bucket is not a directory directly under a mount point".
try:
conn.create_bucket('not-mount-point:bucket')
except boto.exception.S3ResponseError as e:
print(e)
@uronce-cc @calvinjia FYI, I've updated the PR to target at s3 feature branch. The s3 branch was created based on latest master.
Test FAILed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/Alluxio-Pull-Request-Builder/16483/Build result: FAILURE[...truncated 2778 lines...][JENKINS] Archiving /home/jenkins/workspace/Alluxio-Pull-Request-Builder/core/client/hdfs/pom.xml to org.alluxio/alluxio-core-client-hdfs/1.6.0-SNAPSHOT/alluxio-core-client-hdfs-1.6.0-SNAPSHOT.pom[JENKINS] Archiving /home/jenkins/workspace/Alluxio-Pull-Request-Builder/core/client/hdfs/target/alluxio-core-client-hdfs-1.6.0-SNAPSHOT.jar to org.alluxio/alluxio-core-client-hdfs/1.6.0-SNAPSHOT/alluxio-core-client-hdfs-1.6.0-SNAPSHOT.jar[JENKINS] Archiving /home/jenkins/workspace/Alluxio-Pull-Request-Builder/core/client/hdfs/target/alluxio-core-client-hdfs-1.6.0-SNAPSHOT-sources.jar to org.alluxio/alluxio-core-client-hdfs/1.6.0-SNAPSHOT/alluxio-core-client-hdfs-1.6.0-SNAPSHOT-sources.jar[JENKINS] Archiving /home/jenkins/workspace/Alluxio-Pull-Request-Builder/core/client/hdfs/target/alluxio-core-client-hdfs-1.6.0-SNAPSHOT-tests.jar to org.alluxio/alluxio-core-client-hdfs/1.6.0-SNAPSHOT/alluxio-core-client-hdfs-1.6.0-SNAPSHOT-tests.jar[JENKINS] Archiving /home/jenkins/workspace/Alluxio-Pull-Request-Builder/keyvalue/common/pom.xml to org.alluxio/alluxio-keyvalue-common/1.6.0-SNAPSHOT/alluxio-keyvalue-common-1.6.0-SNAPSHOT.pom[JENKINS] Archiving /home/jenkins/workspace/Alluxio-Pull-Request-Builder/core/client/fs/pom.xml to org.alluxio/alluxio-core-client-fs/1.6.0-SNAPSHOT/alluxio-core-client-fs-1.6.0-SNAPSHOT.pom[JENKINS] Archiving /home/jenkins/workspace/Alluxio-Pull-Request-Builder/core/client/fs/target/alluxio-core-client-fs-1.6.0-SNAPSHOT.jar to org.alluxio/alluxio-core-client-fs/1.6.0-SNAPSHOT/alluxio-core-client-fs-1.6.0-SNAPSHOT.jar[JENKINS] Archiving /home/jenkins/workspace/Alluxio-Pull-Request-Builder/core/client/fs/target/alluxio-core-client-fs-1.6.0-SNAPSHOT-sources.jar to org.alluxio/alluxio-core-client-fs/1.6.0-SNAPSHOT/alluxio-core-client-fs-1.6.0-SNAPSHOT-sources.jar[JENKINS] Archiving /home/jenkins/workspace/Alluxio-Pull-Request-Builder/core/client/fs/target/alluxio-core-client-fs-1.6.0-SNAPSHOT-tests.jar to org.alluxio/alluxio-core-client-fs/1.6.0-SNAPSHOT/alluxio-core-client-fs-1.6.0-SNAPSHOT-tests.jar[JENKINS] Archiving /home/jenkins/workspace/Alluxio-Pull-Request-Builder/core/common/pom.xml to org.alluxio/alluxio-core-common/1.6.0-SNAPSHOT/alluxio-core-common-1.6.0-SNAPSHOT.pom[JENKINS] Archiving /home/jenkins/workspace/Alluxio-Pull-Request-Builder/core/common/target/alluxio-core-common-1.6.0-SNAPSHOT.jar to org.alluxio/alluxio-core-common/1.6.0-SNAPSHOT/alluxio-core-common-1.6.0-SNAPSHOT.jar[JENKINS] Archiving /home/jenkins/workspace/Alluxio-Pull-Request-Builder/core/common/target/alluxio-core-common-1.6.0-SNAPSHOT-sources.jar to org.alluxio/alluxio-core-common/1.6.0-SNAPSHOT/alluxio-core-common-1.6.0-SNAPSHOT-sources.jar[JENKINS] Archiving /home/jenkins/workspace/Alluxio-Pull-Request-Builder/core/common/target/alluxio-core-common-1.6.0-SNAPSHOT-tests.jar to org.alluxio/alluxio-core-common/1.6.0-SNAPSHOT/alluxio-core-common-1.6.0-SNAPSHOT-tests.jar[JENKINS] Archiving /home/jenkins/workspace/Alluxio-Pull-Request-Builder/core/server/worker/pom.xml to org.alluxio/alluxio-core-server-worker/1.6.0-SNAPSHOT/alluxio-core-server-worker-1.6.0-SNAPSHOT.pom[JENKINS] Archiving /home/jenkins/workspace/Alluxio-Pull-Request-Builder/underfs/glusterfs/dependency-reduced-pom.xml to org.alluxio/alluxio-underfs-glusterfs/1.6.0-SNAPSHOT/alluxio-underfs-glusterfs-1.6.0-SNAPSHOT.pom[JENKINS] Archiving 
/home/jenkins/workspace/Alluxio-Pull-Request-Builder/underfs/glusterfs/target/alluxio-underfs-glusterfs-1.6.0-SNAPSHOT.jar to org.alluxio/alluxio-underfs-glusterfs/1.6.0-SNAPSHOT/alluxio-underfs-glusterfs-1.6.0-SNAPSHOT.jar[JENKINS] Archiving /home/jenkins/workspace/Alluxio-Pull-Request-Builder/underfs/glusterfs/target/alluxio-underfs-glusterfs-1.6.0-SNAPSHOT-sources.jar to org.alluxio/alluxio-underfs-glusterfs/1.6.0-SNAPSHOT/alluxio-underfs-glusterfs-1.6.0-SNAPSHOT-sources.jar[JENKINS] Archiving /home/jenkins/workspace/Alluxio-Pull-Request-Builder/core/client/pom.xml to org.alluxio/alluxio-core-client/1.6.0-SNAPSHOT/alluxio-core-client-1.6.0-SNAPSHOT.pom[JENKINS] Archiving /home/jenkins/workspace/Alluxio-Pull-Request-Builder/examples/pom.xml to org.alluxio/alluxio-examples/1.6.0-SNAPSHOT/alluxio-examples-1.6.0-SNAPSHOT.pom[JENKINS] Archiving /home/jenkins/workspace/Alluxio-Pull-Request-Builder/core/pom.xml to org.alluxio/alluxio-core/1.6.0-SNAPSHOT/alluxio-core-1.6.0-SNAPSHOT.pom[JENKINS] Archiving /home/jenkins/workspace/Alluxio-Pull-Request-Builder/core/server/proxy/pom.xml to org.alluxio/alluxio-core-server-proxy/1.6.0-SNAPSHOT/alluxio-core-server-proxy-1.6.0-SNAPSHOT.pom[JENKINS] Archiving /home/jenkins/workspace/Alluxio-Pull-Request-Builder/shell/pom.xml to org.alluxio/alluxio-shell/1.6.0-SNAPSHOT/alluxio-shell-1.6.0-SNAPSHOT.pom[JENKINS] Archiving /home/jenkins/workspace/Alluxio-Pull-Request-Builder/tests/pom.xml to org.alluxio/alluxio-tests/1.6.0-SNAPSHOT/alluxio-tests-1.6.0-SNAPSHOT.pomchannel stopped
Test FAILed.
Merged build finished. Test FAILed.
jenkins, test this please
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/Alluxio-Pull-Request-Builder/16485/
Test PASSed.
Merged build finished. Test PASSed.
jenkins, test this please
Merged build finished. Test PASSed.
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/Alluxio-Pull-Request-Builder/16490/
Test PASSed.
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/Alluxio-Pull-Request-Builder/16493/
Test PASSed.
|
gharchive/pull-request
| 2017-08-21T12:55:56 |
2025-04-01T04:54:42.074418
|
{
"authors": [
"AmplabJenkins",
"ifcharming",
"uronce-cc"
],
"repo": "Alluxio/alluxio",
"url": "https://github.com/Alluxio/alluxio/pull/5914",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2079582431
|
Default transactions to abort on drop
Hey!
Here's an idea for indexed-db-futures v0.5: what if transactions aborted on drop, instead of committing on drop?
This'd make them farther away from the indexed-db standard, for sure, but the indexed-db standard is based on callbacks, which are pretty different from futures anyway. And, in Rust, it's very easy to miss one early-return point, to make sure that returning Err from a function aborts the transaction.
TL;DR:
fn foo() {
let transaction = [...];
transaction.add_key_val(...).unwrap().await.unwrap();
do_some_check(&transaction).await?;
Ok(())
}
This will (AFAICT) commit the transaction if do_some_check were to return an Err. The behavior I'd expect from the code if just reading it intuitively, would be for the transaction to be aborted.
In order to get the behavior I'd instinctively expect, I need to use the following code, which is quite a bit less pleasant to both write and read:
fn foo() {
let transaction = [...];
transaction.add_key_val(...).unwrap().await.unwrap();
if let Err(e) = do_some_check(&transaction).await {
transaction.abort().unwrap();
return Err(e);
}
Ok(())
}
WDYT about adding a commit(self) function to transaction, that'd commit it (ie. just drop it), and to have IdbTransaction's Drop implementation abort the transaction if it has not been explicitly committed?
Anyway, I'm just starting using indexed-db-futures, but it already seems much, much more usable than the web-sys version. So, thank you! :D
After some more investigation, I learned that this would be a nontrivial feature.
So I decided to go ahead and implement it, and this resulted in the indexed-db crate, that has a misuse-resistant API with transactions that always abort upon errors and never commit early.
Do you want to keep this issue open to track this feature inside indexed_db_futures, or should we close it? :)
This issue has been included in v0.6.0 :tada:
- Your friendly neighbourhood :robot: semantic release bot
|
gharchive/issue
| 2024-01-12T19:42:14 |
2025-04-01T04:54:42.109755
|
{
"authors": [
"Alorel",
"Ekleog"
],
"repo": "Alorel/rust-indexed-db",
"url": "https://github.com/Alorel/rust-indexed-db/issues/32",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1886789323
|
exec_aud.sh <- Need Push Fix to Github
Not sure how to correctly execute the Git Push / Commit / Add, feel free to make a working version.
Entering the Danger Zone.
Fixed with @brplcc , with contributions from @TacticalTux and @wes34.
|
gharchive/issue
| 2023-09-08T01:13:28 |
2025-04-01T04:54:42.111156
|
{
"authors": [
"Scrippy"
],
"repo": "Alpha-Authority/Szeebe",
"url": "https://github.com/Alpha-Authority/Szeebe/issues/10",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1213765584
|
🛑 Cañada el Espinar is down
In 0a6f3f0, Cañada el Espinar (https://cañadaespinar.com) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Cañada el Espinar is back up in c2cff19.
|
gharchive/issue
| 2022-04-24T20:52:17 |
2025-04-01T04:54:42.276727
|
{
"authors": [
"Alvarohf"
],
"repo": "Alvarohf/sites-status",
"url": "https://github.com/Alvarohf/sites-status/issues/70",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1590390724
|
footer padding issue
yeah sure !
Hey @Amark19, Can i work on this?? Please assign me.
yeah sure !
Hey, I am not able to run this project locally. Can you please guide me? Here are the error screenshots:
Which Python version are you using?
@rajpatel17-bot which Python version are you using?
Can you explicitly run this command: python -m pip install flake8?
You can simply run pip install -r requirements.txt, but first you need to remove the psycopg2 module from it, then run that command.
Hey @rajpatel17-bot, I'll look at the issue, try it from my end, and then ping you.
Hey @rajpatel17-bot, you can try all the steps now.
|
gharchive/issue
| 2023-02-18T16:14:38 |
2025-04-01T04:54:42.300823
|
{
"authors": [
"Amark19"
],
"repo": "Amark19/calcont.in",
"url": "https://github.com/Amark19/calcont.in/issues/21",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
270027708
|
Is there a way to customize the base map, map size, zoom level?
I would like to be able to customize the base map, map size, and zoom level. AKA set the map height to 600px and zoom level to 8 or something. Is this possible through the API?
Hi Andy, we were checking that out too. Clark is working on a different base map (colored instead of BW). Perhaps there will be an option to set this in the HTTP POST data array.
I found that to fix the size of the map, I had to put it into a div of certain width and height.
The zoom, we found, is set automatically so that none of your regions are cut off. I had to tweak the height of ours because it was a hair too small to fit all the regions into. Once I made it slightly taller, it defaulted to the next zoom level up.
The <div style="width:100%; height:600px;"> works great for the size. Thanks, Jim, that is a nice simple solution! I would still like to set the zoom one level higher than it displays on our map to make our region take up all the space in the map window.
you can check out the 3 options on my dev site if you want to compare them:
http://dev.sierraavalanchecenter.org
Yeah, I see what you mean. Try increasing the height a bit. I think whatever function auto zooms in or out, it likes to have some padding around the advisory area. Go big to find a height that will definitely work, then try dropping it down to the smallest height that will still give you the better zoom level.
See docs for latest update. Client can now supply map height as well as desired zoom level.
|
gharchive/issue
| 2017-10-31T16:30:42 |
2025-04-01T04:54:42.335758
|
{
"authors": [
"andyanderso",
"ccorey",
"jimurl"
],
"repo": "AmericanAvalanche/avalanche.org-API-Client",
"url": "https://github.com/AmericanAvalanche/avalanche.org-API-Client/issues/1",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
417559262
|
Invite someone in the Chat
Describe the solution you'd like
Endpoint:
POST /api/v1//s/chat/thread//member/invite HTTP/1.1
JSON:
{
"uids": ["Array of UserID's to be invited to Chat", "MUST BE ABLE TO (i.e. the invitee needs to follow the inviter)"],
"timestamp": "A JS Timestamp"
}
Got it. Will be adding it in the update.
Did the inviteChat.js module, and wrote the example already. This new feature will be added in the 3.0.0-nightly.
Closing as finished.
|
gharchive/issue
| 2019-03-06T00:02:05 |
2025-04-01T04:54:42.344175
|
{
"authors": [
"akatsukilevi",
"moelrobi"
],
"repo": "AminoJS/Amino.JS",
"url": "https://github.com/AminoJS/Amino.JS/issues/56",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
925236566
|
create normal and uv data where needed so bevy's renderer doesnt complain
I'm not sure if this is the right place to solve this problem (perhaps in an AssetCleanerPlugin), but this adds normal data (which is actually obviously broken; I could compute it from the face orientation) and vertex UV data.
I have only tested this with a file missing UV data, so it is probably horribly broken when trying to inject normal data...
Is this something you would add to the crate? If so I will test it with normal data. Otherwise I will look at creating a AssetCleanerPlugin.
Cheers
I'm definitely interested adding this to the crate. In fact I see this as a bugfix.
I have tested it with both 'UV and Normal' and 'only UV' missing and it seems to work fine.
I haven't looked into how logging works in bevy but it would be nice to write out a small warning maybe.
Also If you could generate some simple normals, it would be even better.
But none of these are required.
|
gharchive/pull-request
| 2021-06-18T22:10:43 |
2025-04-01T04:54:42.346250
|
{
"authors": [
"AmionSky",
"rezural"
],
"repo": "AmionSky/bevy_obj",
"url": "https://github.com/AmionSky/bevy_obj/pull/7",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2650112641
|
[Bug] Education Category, summary text box not auto-sizing. Create Button missing
Is there an existing issue for this?
[X] Yes, I have searched the existing issues and none of them match my problem.
Product Variant
Cloud (https://rxresu.me)
Current Behavior
When you try to add something in the Education category and paste something into the Summary box, it does not auto-size and the Create button disappears. You can't scroll in this window.
Expected Behavior
The summary box autosizes and you can press the button.
Steps To Reproduce
Go to Education Category
Paste something into the Summary Box
Create button missing.
What browsers are you seeing the problem on?
Chrome
What template are you using?
None
Anything else?
No response
I want to try this issue #2098
|
gharchive/issue
| 2024-11-11T18:46:00 |
2025-04-01T04:54:42.358711
|
{
"authors": [
"calinfaja",
"vikash28-cloud"
],
"repo": "AmruthPillai/Reactive-Resume",
"url": "https://github.com/AmruthPillai/Reactive-Resume/issues/2098",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1580474713
|
CFFL
CFFL
Linter check found the following problems:
The following problems have been found:
ERROR: /tmp/abs_cfbortivdo/clone/recipe/meta.yaml:22: missing_wheel: For pypi packages, wheel should be present in the host section
ERROR: /tmp/abs_cfbortivdo/clone/recipe/meta.yaml:30: missing_pip_check: For pypi packages, pip check should be present in the test commands
ERROR: /tmp/abs_cfbortivdo/clone/recipe/meta.yaml:40: missing_dev_url: The recipe is missing a dev_url
WARNING: /tmp/abs_cfbortivdo/clone/recipe/meta.yaml:40: missing_description: The recipe is missing a description
ERROR: /tmp/abs_cfbortivdo/clone/recipe/meta.yaml:40: http_url: http://public.readthedocs.io/ is not https
Errors were found
|
gharchive/pull-request
| 2023-02-10T23:18:49 |
2025-04-01T04:54:42.370436
|
{
"authors": [
"anaconda-pkg-build",
"bkreider"
],
"repo": "AnacondaRecipes/atpublic-feedstock",
"url": "https://github.com/AnacondaRecipes/atpublic-feedstock/pull/1",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
1981781682
|
build 1.4.1
Upstream
Jira ticket
is this a SF package?
is this a SF package?
it's a dependency for pykx. Maybe it should go to SF channel.
is this a SF package?
it's a dependency for pykx. Maybe it should go to SF channel.
it's not on defaults already, so yes, it goes to SF 👌 thanks!
|
gharchive/pull-request
| 2023-11-07T16:42:03 |
2025-04-01T04:54:42.373160
|
{
"authors": [
"boldorider4",
"lorepirri"
],
"repo": "AnacondaRecipes/dlfcn-win32-feedstock",
"url": "https://github.com/AnacondaRecipes/dlfcn-win32-feedstock/pull/1",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
1581624589
|
CFFL
CFFL
Linter check found the following problems:
The following problems have been found:
ERROR: /tmp/abs_a5xx2q0rsr/clone/recipe/meta.yaml:15: pip_install_args: pip install should be run with --no-deps.
ERROR: /tmp/abs_a5xx2q0rsr/clone/recipe/meta.yaml:22: missing_wheel: For pypi packages, wheel should be present in the host section
ERROR: /tmp/abs_a5xx2q0rsr/clone/recipe/meta.yaml:39: missing_pip_check: For pypi packages, pip check should be present in the test commands
ERROR: /tmp/abs_a5xx2q0rsr/clone/recipe/meta.yaml:48: missing_documentation: The recipe is missing a doc_url or doc_source_url
Errors were found
Linter check found the following problems:
The following problems have been found:
ERROR: /tmp/abs_64oznm803a/clone/recipe/meta.yaml:15: pip_install_args: pip install should be run with --no-deps.
ERROR: /tmp/abs_64oznm803a/clone/recipe/meta.yaml:22: missing_wheel: For pypi packages, wheel should be present in the host section
ERROR: /tmp/abs_64oznm803a/clone/recipe/meta.yaml:39: missing_pip_check: For pypi packages, pip check should be present in the test commands
ERROR: /tmp/abs_64oznm803a/clone/recipe/meta.yaml:48: missing_documentation: The recipe is missing a doc_url or doc_source_url
Errors were found
|
gharchive/pull-request
| 2023-02-13T04:13:39 |
2025-04-01T04:54:42.376963
|
{
"authors": [
"anaconda-pkg-build",
"bkreider"
],
"repo": "AnacondaRecipes/duckdb-engine-feedstock",
"url": "https://github.com/AnacondaRecipes/duckdb-engine-feedstock/pull/1",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
2336900757
|
[PKG-4871] 0.3.2
hmmlearn 0.3.2 :snowflake:
Destination channel: Snowflake
Links
PKG-4871
Upstream repository
Upstream changelog/diff
Explanation of changes:
Forced openblas on osx-64 to prevent a segfault error.
Added upstream tests.
The tests require the package to be built in place and editable, so it's currently being rebuilt in the test block. Please let me know if there's a better way to go about that.
Added $RPATH/ld64.so.1 to missing_dso_whitelist for s390x
Linter check found the following problems:
ERROR conda.cli.main_run:execute(125): `conda run conda-lint /tmp/abs_b8iflxnl06/clone` failed. (See above for error)
The following problems have been found:
===== WARNINGS =====
clone/recipe/meta.yaml:61: missing_description: The recipe is missing a description
clone/recipe/meta.yaml:28: host_section_needs_exact_pinnings: Linked libraries host should have exact version pinnings.
===== ERRORS =====
clone/recipe/meta.yaml:28: missing_wheel: For pypi packages, wheel should be present in the host section
===== Final Report: =====
1 Error and 2 Warnings were found
Linter check found the following problems:
ERROR conda.cli.main_run:execute(125): `conda run conda-lint /tmp/abs_04w0vgazwm/clone` failed. (See above for error)
The following problems have been found:
===== WARNINGS =====
clone/recipe/meta.yaml:28: host_section_needs_exact_pinnings: Linked libraries host should have exact version pinnings.
===== Final Report: =====
0 Errors and 1 Warning were found
Linter check found the following problems:
ERROR conda.cli.main_run:execute(125): `conda run conda-lint /tmp/abs_cffhv4rr3j/clone` failed. (See above for error)
The following problems have been found:
===== WARNINGS =====
clone/recipe/meta.yaml:30: host_section_needs_exact_pinnings: Linked libraries host should have exact version pinnings.
===== Final Report: =====
0 Errors and 1 Warning were found
Linter check found the following problems:
ERROR conda.cli.main_run:execute(125): `conda run conda-lint /tmp/abs_2d3roj_dif/clone` failed. (See above for error)
The following problems have been found:
===== WARNINGS =====
clone/recipe/meta.yaml:30: host_section_needs_exact_pinnings: Linked libraries host should have exact version pinnings.
===== Final Report: =====
0 Errors and 1 Warning were found
Linter check found the following problems:
ERROR conda.cli.main_run:execute(125): `conda run conda-lint /tmp/abs_47fy0mrt8y/clone` failed. (See above for error)
The following problems have been found:
===== WARNINGS =====
clone/recipe/meta.yaml:30: host_section_needs_exact_pinnings: Linked libraries host should have exact version pinnings.
===== Final Report: =====
0 Errors and 1 Warning were found
Linter check found the following problems:
ERROR conda.cli.main_run:execute(125): `conda run conda-lint /tmp/abs_e7wyvczymq/clone` failed. (See above for error)
The following problems have been found:
===== WARNINGS =====
clone/recipe/meta.yaml:30: host_section_needs_exact_pinnings: Linked libraries host should have exact version pinnings.
===== Final Report: =====
0 Errors and 1 Warning were found
The tests on osx-64 are triggering a segfault any time _AbstractHMM.fit is called. I suspect it might be connected to the scikit-learn version that is pulled in.
Linter check found the following problems:
ERROR conda.cli.main_run:execute(125): `conda run conda-lint /tmp/abs_34zy_3spyu/clone` failed. (See above for error)
The following problems have been found:
===== WARNINGS =====
clone/recipe/meta.yaml:30: host_section_needs_exact_pinnings: Linked libraries host should have exact version pinnings.
===== ERRORS =====
clone/recipe/meta.yaml:38: version_constraints_missing_whitespace: Packages and their version constraints must be space separated
===== Final Report: =====
1 Error and 1 Warning were found
Linter check found the following problems:
ERROR conda.cli.main_run:execute(125): `conda run conda-lint /tmp/abs_1ckgcfl1nh/clone` failed. (See above for error)
The following problems have been found:
===== WARNINGS =====
clone/recipe/meta.yaml:30: host_section_needs_exact_pinnings: Linked libraries host should have exact version pinnings.
===== Final Report: =====
0 Errors and 1 Warning were found
let's create and merge first 0.3.0 and then this one, otherwise we will have to have a versioned feedstock for 0.3.0
let's create and merge first 0.3.0 and then this one, otherwise we will have to have a versioned feedstock for 0.3.0
It's still unclear whether the client actually needs both versions. The production code is identical aside from a single exception message. Jack asked them about it last week, but there was no response.
Linter check found the following problems:
ERROR conda.cli.main_run:execute(125): `conda run conda-lint /tmp/abs_73g3bjwvqe/clone` failed. (See above for error)
The following problems have been found:
===== WARNINGS =====
clone/recipe/meta.yaml:30: host_section_needs_exact_pinnings: Linked libraries host should have exact version pinnings.
===== Final Report: =====
0 Errors and 1 Warning were found
|
gharchive/pull-request
| 2024-06-05T21:59:13 |
2025-04-01T04:54:42.394408
|
{
"authors": [
"ViridianMelody",
"anaconda-pkg-build",
"lorepirri"
],
"repo": "AnacondaRecipes/hmmlearn-feedstock",
"url": "https://github.com/AnacondaRecipes/hmmlearn-feedstock/pull/2",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
1579842585
|
CFFL
CFFL
Linter check found the following problems:
The following problems have been found:
ERROR: /tmp/abs_e3pm9q3n55/clone/recipe/meta.yaml:18: pip_install_args: pip install should be run with --no-deps.
ERROR: /tmp/abs_e3pm9q3n55/clone/recipe/meta.yaml:23: missing_wheel: For pypi packages, wheel should be present in the host section
ERROR: /tmp/abs_e3pm9q3n55/clone/recipe/meta.yaml:49: missing_documentation: The recipe is missing a doc_url or doc_source_url
ERROR: /tmp/abs_e3pm9q3n55/clone/recipe/meta.yaml:49: missing_dev_url: The recipe is missing a dev_url
WARNING: /tmp/abs_e3pm9q3n55/clone/recipe/meta.yaml:49: missing_description: The recipe is missing a description
Errors were found
|
gharchive/pull-request
| 2023-02-10T15:05:42 |
2025-04-01T04:54:42.396898
|
{
"authors": [
"anaconda-pkg-build",
"bkreider"
],
"repo": "AnacondaRecipes/uri-template-feedstock",
"url": "https://github.com/AnacondaRecipes/uri-template-feedstock/pull/1",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
2336755544
|
Not working
╰─$ ./osm_extract_polygon -f kazakhstan-latest.osm.pbf
error: cannot set both -o (--overwrite) and -s (--skip)!
uff. sorry.. I must have broken the recent build. Will fix asap.
Hey @corop ,
can you try the new release https://github.com/AndGem/osm_extract_polygon/releases/tag/v.0.5.5 ?
I think this fixes the issue you have reported. Many thanks for pointing this out.
Hey @corop ,
can you try the new release https://github.com/AndGem/osm_extract_polygon/releases/tag/v.0.5.5 ?
I think this fixes the issue you have reported. Many thanks for pointing this out.
My test of the 0.0.5 version failed ((( need to check again... i5, Debian 12, zsh
-rw-r--r-- 1 asergeev asergeev 34M Jun 5 02:26 china-latest-admin.osm.pbf
-rwxr-xr-x 1 asergeev docker 2.5M Jun 6 23:39 osm_extract_polygon
-rw-r--r-- 1 asergeev docker 6.2K Jun 5 08:40 README.md
╭─asergeev@uran ~/work/osm/country-borders/china
╰─$ ./osm_extract_polygon --file china-latest-admin.osm.pbf
error: cannot set both -o (--overwrite) and -s (--skip)!
Hey @corop ,
thanks for the update. I understand that you want to check again?
Either 0.5.5 or the newest 0.5.6 should both work. Please let me know if there is an issue, and ideally with a pbf file (or a link where I can download it from) to try it out.
I assume everything is fine now.
|
gharchive/issue
| 2024-06-05T20:13:35 |
2025-04-01T04:54:42.405776
|
{
"authors": [
"AndGem",
"corop"
],
"repo": "AndGem/osm_extract_polygon",
"url": "https://github.com/AndGem/osm_extract_polygon/issues/45",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2501615892
|
Crash for missing deps
Crash on Windows, with some errors about missing deps.
Invalid. It was an issue with Node.js on my Windows system.
|
gharchive/issue
| 2024-09-02T21:50:50 |
2025-04-01T04:54:42.431834
|
{
"authors": [
"AndreaPontrandolfo"
],
"repo": "AndreaPontrandolfo/sheriff",
"url": "https://github.com/AndreaPontrandolfo/sheriff/issues/252",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
567424159
|
Open DataSource without locking it
This allows keeping the data source file writable by the application generating it while parsing.
(Sorry for the GitHub online editing screwing up the file encoding.)
Good call. I'll get this merged in, along with a contributor file. The latest by the end of the week hopefully.
|
gharchive/pull-request
| 2020-02-19T09:16:30 |
2025-04-01T04:54:42.456672
|
{
"authors": [
"AndrewRissing",
"holigo1"
],
"repo": "AndrewRissing/GenericParsing",
"url": "https://github.com/AndrewRissing/GenericParsing/pull/11",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
97428267
|
Font size of the main text
GOST says that the font size should be 12-14 points. Right now the template defaults to 12. I have two arguments for making 14 the default.
From experience: in the overwhelming majority of Russian dissertations I have seen, it was 14 (I don't remember a single one with 12).
GOST requires one-and-a-half line spacing. Traditionally, one-and-a-half spacing is used together with 14 pt. With 12 it looks visually worse.
I don't mind. @Lenchik, what do you think?
From the code point of view, we change one digit in one place, so it's not a problem at all. I also like the 14 pt font myself.
Done in PR https://github.com/AndreyAkinshin/Russian-Phd-LaTeX-Dissertation-Template/pull/49
In your PR #49 it turns out that when 14 is selected, the heading sizes coincide with the size of the main text.
That is how it was intended.
GOST R 7.0.11-2011 SIBID. Dissertation and dissertation abstract. Structure and formatting rules:
5.3.6 The work must be produced in printed form using a computer and printer, on one side of a sheet of white paper of a single grade, A4 format (210x297 mm), with one-and-a-half line spacing and a font size of 12-14 points. The dissertation must have a hard binding.
Those wishing to deviate from GOST can:
change the font sizes in the corresponding places (they are generously annotated with comments);
comment out the block that sets the fixed font size and uncomment the block
%% Below is a commented-out variant that does not conform to GOST 7.0.11, in which headings are larger than the prescribed 12-14 pt
%\ifnum\curtextsize>\bigtextsize % Check the condition that the base font is 14 pt
%\renewcommand{\cfttoctitlefont}{\LARGE\bfseries} % Fix the size of the table of contents heading
%\fi
@ulysses4ever
One could now add a simplified configuration option in the setup file. For example, a headingsize counter for those who are not afraid of deviating from GOST. But then what sizes should everything be changed to for the two main font variants (12 pt and 14 pt)?
I prefer to use the commands \large, \Large and the like.
Could you suggest a combination of them for headings of different levels for the two main font variants (12 pt and 14 pt)?
I have mostly stepped away from the formatting work now; I can only offer the variant I use with 14 pt, but it should also be fine for 12, in principle: these commands were introduced precisely so as not to depend on the base size.
\titleformat{\chapter}[block]
{\filcenter\Large\bfseries\sffamily}
{\chaptername\ \thechapter.}
{\widthof{ }}
{}
\titleformat{\section}[block]
{\filcenter\large\bfseries\sffamily}
{\thesection.}
{\widthof{ }}
{}
\titleformat{\subsection}[block]
{\filcenter\bfseries\sffamily}
{\thesubsection.}
{\widthof{ }}
{}
|
gharchive/issue
| 2015-07-27T09:59:51 |
2025-04-01T04:54:42.470540
|
{
"authors": [
"AndreyAkinshin",
"Lenchik",
"ulysses4ever"
],
"repo": "AndreyAkinshin/Russian-Phd-LaTeX-Dissertation-Template",
"url": "https://github.com/AndreyAkinshin/Russian-Phd-LaTeX-Dissertation-Template/issues/45",
"license": "CC-BY-4.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1048707101
|
Allow mutating the SMP info tag
Allows getting a mutable reference to StivaleSmpTag and mutating the StivaleSmpInfo structs; this can be used to start the application processors.
Closes #4
Thanks @JCapucho!
|
gharchive/pull-request
| 2021-11-09T14:59:06 |
2025-04-01T04:54:42.481892
|
{
"authors": [
"Andy-Python-Programmer",
"JCapucho"
],
"repo": "Andy-Python-Programmer/stivale",
"url": "https://github.com/Andy-Python-Programmer/stivale/pull/5",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1803537740
|
Way to calculate grams?
How can I calculate grams from given GCode?
Thanks.
Built CuraEngine, wontfix
Sorry, missed this issue. You would need to use the diameter of the filament to calculate the cross-sectional area of the filament. Then multiply by the extrusion distance in the G1 command to get volume. Multiply volume by the density of the filament to get mass.
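To make that arithmetic concrete, here is a rough standalone sketch (not part of this library's API; the default diameter and density values and the simplified E/G92 handling are assumptions for illustration):

```python
import math
import re


def grams_from_gcode(gcode_text, filament_diameter_mm=1.75, density_g_cm3=1.24):
    # Sums forward E moves from G1 lines; ignores retractions and assumes
    # absolute extrusion with G92 E0 resets (no M83 handling).
    area_mm2 = math.pi * (filament_diameter_mm / 2) ** 2
    extruded_mm, last_e = 0.0, 0.0
    for line in gcode_text.splitlines():
        m = re.search(r"^\s*G1\b.*?\bE(-?\d+\.?\d*)", line)
        if m:
            e = float(m.group(1))
            if e > last_e:
                extruded_mm += e - last_e
            last_e = e
        elif re.search(r"^\s*G92\b.*\bE", line):
            last_e = 0.0                      # extruder position reset
    volume_cm3 = area_mm2 * extruded_mm / 1000.0  # mm^3 -> cm^3
    return volume_cm3 * density_g_cm3


sample = "G92 E0\nG1 X10 Y10 E5.0\nG1 X20 Y10 E12.5\n"
print(round(grams_from_gcode(sample), 3))  # ~0.037 g with the defaults above
```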
|
gharchive/issue
| 2023-07-13T18:17:22 |
2025-04-01T04:54:42.485078
|
{
"authors": [
"AndyEveritt",
"MajliTech"
],
"repo": "AndyEveritt/GcodeParser",
"url": "https://github.com/AndyEveritt/GcodeParser/issues/4",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
370689021
|
Homepage URL on 404.html is broken
Hey, thanks for your work. I like your theme.
I noticed that on the 404.html page the link back to the homepage is broken.
Instead of a valid URL a link to Page("Homepage name") is created.
I have a fix ready and can submit a pull request if you want.
Thanks for reporting this, and for your kind words!
If you've got a pull request ready, please go ahead and submit it — I'll review it ASAP.
|
gharchive/issue
| 2018-10-16T16:11:10 |
2025-04-01T04:54:42.489476
|
{
"authors": [
"AngeloStavrow",
"sauerj"
],
"repo": "AngeloStavrow/indigo",
"url": "https://github.com/AngeloStavrow/indigo/issues/20",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1634317222
|
Manually Import Model Files?
Hello, could a feature be added to allow importing a .th Demucs model file from a local directory, please?
Not a code-warrior, so I would appreciate an option to import model files manually from my hard drive.
I like this idea too. I've wanted an option to add my own models without issues
Importing your own models is very possible in UVR. For Demucs, only custom v3 and v4 models can be imported. You will need to create a .yaml for them and reference the models accordingly.
Importing your own models is very possible in UVR. For Demucs, only custom v3 and v4 models can be imported. You will need to create a .yaml for them and reference the models accordingly.
How does one create a .yaml file and correlate it to the .th file so that UVR recognizes them as a pair?
And then would the model show up in UVR automatically after relaunch, under one of the Demucs categories?
Very specific directions appreciated. Thank you :)
Importing your own models is very possible in UVR. For Demucs, only custom v3 and v4 models can be imported. You will need to create a .yaml for them and reference the models accordingly.
How does one create a .yaml file and correlate it to the .th file so that UVR recognizes them as a pair?
And then would the model show up in UVR automatically after relaunch, under one of the Demucs categories?
Very specific directions appreciated. Thank you :)
Here's an example:
Download and rename this file and use it as a template yaml file
Change the yaml file name to whatever you want (obviously keep the .yaml)
Edit the yaml file to reference your Demucs model.
Example: Change the "955717e8" in the yaml to the name of the .th model you wish to associate with it.
models: ['955717e8']
Make sure the yaml and .th model are in the following directory - models/Demucs_Models/v3_v4_models
Thank you, this process worked! UVR found the model and was able to begin the separation process.
However, an error occurs straight away: "bag_num" not defined. I am sure there may be subsequent errors that need troubleshooting to get this to work; is there an email I can contact you at for a little bit of guidance through any errors that may pop up?
|
gharchive/issue
| 2023-03-21T16:41:13 |
2025-04-01T04:54:42.529618
|
{
"authors": [
"Anjok07",
"Dyslexicon",
"sabaasa"
],
"repo": "Anjok07/ultimatevocalremovergui",
"url": "https://github.com/Anjok07/ultimatevocalremovergui/issues/460",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1112825888
|
🛑 AnnAngela.cn is down
In d650034, AnnAngela.cn (https://ping.annangela.cn) was down:
HTTP code: 0
Response time: 0 ms
Resolved: AnnAngela.cn is back up in 63de86a.
|
gharchive/issue
| 2022-01-24T15:54:52 |
2025-04-01T04:54:42.535296
|
{
"authors": [
"AnnAngela"
],
"repo": "AnnAngela/annangela.cn-monitor",
"url": "https://github.com/AnnAngela/annangela.cn-monitor/issues/488",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2261855189
|
✨: Create issue forms for respective buttons
Is your feature request related to a problem? Please describe.
Currently, we lack issue forms, which are kinda crucial in making the whole process of raising an issue smooth for newbie contributors.
Describe the solution you'd like
Describe alternatives you've considered
NULL
Additional context
Add any other context or screenshots about the feature request here.
this actually makes sense, thanks for this idea.
if you have the template ready can you raise a PR for this?
@Ansub yes will be raising one shortly.
@dakshsinghrathore This is pretty cool
@dakshsinghrathore Can you still upload photos in the forms?
Yes @epoll31 check this out
That's neat! I didn't realize those were md fields.
@epoll31 github recently added this feature, do give it a read.
#72
Merged PR #72
|
gharchive/issue
| 2024-04-24T17:49:13 |
2025-04-01T04:54:42.544532
|
{
"authors": [
"Ansub",
"dakshsinghrathore",
"epoll31"
],
"repo": "Ansub/SyntaxUI",
"url": "https://github.com/Ansub/SyntaxUI/issues/69",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1192106980
|
Bug: Can't load ESM module in Node.js
Environment:
Node v16.3.0
npm v7.10.0
What I did
Tried to load the plugin in an ESM file like this:
import figuresPlugin from "markdown-it-image-figures";
What happened
Got the following error:
import figuresPlugin from "markdown-it-image-figures";
^^^^^^^^^^^^^
SyntaxError: The requested module 'markdown-it-image-figures' does not provide an export named 'default'
at ModuleJob._instantiate (node:internal/modules/esm/module_job:121:21)
at async ModuleJob.run (node:internal/modules/esm/module_job:171:5)
at async Loader.import (node:internal/modules/esm/loader:178:24)
at async Object.loadESM (node:internal/process/esm_loader:68:5)
Notes
From looking through the dist folder locally, it appears that all of the generated files that should be ESM are actually CommonJS, including the ones marked as .module.js and .mjs. I'm guessing this has something to do with microbundle.
Has been released as 2.0.2, @nzakas could you double-check on your end?
Confirmed! This fixes the issue. Thanks so much.
Thanks for confirming! Closing this!
|
gharchive/issue
| 2022-04-04T17:40:31 |
2025-04-01T04:54:42.603599
|
{
"authors": [
"Antonio-Laguna",
"nzakas"
],
"repo": "Antonio-Laguna/markdown-it-image-figures",
"url": "https://github.com/Antonio-Laguna/markdown-it-image-figures/issues/10",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2117038587
|
Feature/background color
Add ability to enable/disable the background fog
If fog is disabled then default background color is set based on your theme.
Background color can be customized using a hex value eg. #3c3c3c if fog is disabled
Thanks for your contribution, i think we need to add to the setup flow too, not just environment variables, I can work on this in a few weeks, if you dont have time.
I added it in the setup also. please take a good look since I'm rusty on js/html
the setup looks good, one issue though:
default behaviour has now changed fog is now disabled by default, consider calling the flag disable fog, at least internally.
this will change behaviour for people who auto update with docker.
I added it as a "migration" method in IndexService. Can be moved to a separate migration service if needed
looks good will merge soon, just a minor suggestion but doesn't matter.
|
gharchive/pull-request
| 2024-02-04T11:05:56 |
2025-04-01T04:54:42.608252
|
{
"authors": [
"AntonyLeons",
"nicandris"
],
"repo": "AntonyLeons/Ward",
"url": "https://github.com/AntonyLeons/Ward/pull/107",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1719172900
|
[GSSOC' 23 ] HOVERING IS NOT WORKING FOR TWO BUTTONS
I am an Open source Contributor
DESCRIPTION:-
The hovering effect of the Instagram and LinkedIn icons isn't working on the site, so please give me an opportunity to correct those. I would also like to change the UI a bit to make it more impressive.
I CAN FIX THIS AND MAKE THIS MORE GOOD WITH GOOD ICONS
CODE OF CONDUCT
-I Follow Contributing Guidelines of this project
Hello, I already worked on a food recipe app, so I'm good to handle this issue. please assign me this issue.
I'm GSSOC'23 contributor.
|
gharchive/issue
| 2023-05-22T08:27:30 |
2025-04-01T04:54:42.655676
|
{
"authors": [
"Tarun0951",
"chiki012"
],
"repo": "Anupkjha2601/food-recipes-website",
"url": "https://github.com/Anupkjha2601/food-recipes-website/issues/98",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1068480222
|
Fixed #1
feat : added requirements.txt and .gitignore files
fix : removed .idea file as it is specific to pycharm configs
Minor Bug Detected. Will resend after fixing that.
|
gharchive/pull-request
| 2021-12-01T14:33:09 |
2025-04-01T04:54:42.658137
|
{
"authors": [
"iamAbhishekkumar"
],
"repo": "AnuvabSen/Virtual_Key_Board_Using_Opencv",
"url": "https://github.com/AnuvabSen/Virtual_Key_Board_Using_Opencv/pull/3",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
457245296
|
[Urgent] Installing the Apollo development environment in a virtual machine - build failure - [ERROR] Build failed!
Installing Apollo 3.5 in a VMware virtual machine fails. Pulling the Apollo source code succeeds, installing Docker succeeds, and pulling the Apollo image inside Docker succeeds, but the subsequent build fails. The installation guide I followed is "https://www.cnblogs.com/iors/p/10862713.html".
This error is caused by insufficient memory. Do you have a system with at least 8 GB of available memory? @fenghongjian
OK, thank you.
@fenghongjian you are very welcome. Closing this issue for now.
|
gharchive/issue
| 2019-06-18T03:43:08 |
2025-04-01T04:54:42.708020
|
{
"authors": [
"fenghongjian",
"natashadsouza"
],
"repo": "ApolloAuto/apollo",
"url": "https://github.com/ApolloAuto/apollo/issues/8860",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
94850001
|
Impressive
Your bot is so impressive!! It really helped me get to the leader place!!!
That's pretty nice. I guess Terminators aren't so bad after all.
I hope you will add the auto-split feature/ option. It will be a lot better.
I had one hit 6k score. Got no screen sadly. :(
Little sneak peak of what I'll be working on today: ![http://i.imgur.com/x9crfEF.png]
ragegaming.net/Screenshots/2015-07-11_12-38-27.mp4
18k eh
I've showed this numerous amount of times. But it's pretty sweet. 5 hours. Did it while I was asleep.
If I've seen this before, I probably wasn't looking at those numbers. :D
xD
|
gharchive/issue
| 2015-07-14T01:54:38 |
2025-04-01T04:54:42.717874
|
{
"authors": [
"Apostolique",
"KickingRG",
"laxika",
"marcocc"
],
"repo": "Apostolique/Agar.io-bot",
"url": "https://github.com/Apostolique/Agar.io-bot/issues/209",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
153329897
|
Agario update
I think agar updated today, and this brilliant bot no longer works. I have been using it for a few weeks now and it always amazes me how well it did. Could someone please fix it so it works again? Please?
Refer to #616.
|
gharchive/issue
| 2016-05-05T21:25:49 |
2025-04-01T04:54:42.718953
|
{
"authors": [
"Drflash55",
"Stedman42"
],
"repo": "Apostolique/Agar.io-bot",
"url": "https://github.com/Apostolique/Agar.io-bot/issues/623",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
192907415
|
Appscale dashboard cron page
Hi!
I implemented a basic system to view/run cron jobs from the AppScale dashboard. It looks like this:
What do you think about it?
I still have one issue about how to give access to the /etc/cron.d/ and /var/apps/ directories for the appscaledashboard application. I did this by modifying the file
AppServer/google/appengine/tools/devappserver2/python/sandbox.py.
Peter
Hi! I made changes regarding:
removing the unused method
parsing of 7-element cron entries (changes reverted)
2 new tests for 4-element cron entries.
About the cron parser implementation: I think the default AppScale implementation is definitely better, because for a schedule like "every 30 minutes from 6:45 to 7:50" we would usually expect the cron job to be executed at 6:45, 7:15 and 7:45.
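As a quick illustration of that expectation, here is a minimal Python sketch of a plain "every N minutes from start to end" expansion (the start/end handling is an assumption for illustration, not GAE's or AppScale's actual parsing code):

from datetime import datetime, timedelta

def expand_interval(start, end, every_minutes):
    """List the firing times of an 'every N minutes from start to end' schedule."""
    times = []
    t = start
    while t <= end:
        times.append(t.strftime("%H:%M"))
        t += timedelta(minutes=every_minutes)
    return times

day = datetime(2016, 12, 1)
print(expand_interval(day.replace(hour=6, minute=45),
                      day.replace(hour=7, minute=50), 30))
# ['06:45', '07:15', '07:45']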
Anyway, one possible improvement can be made for 7-element cron entries: instead of creating a separate cron entry for the same minutes but different hours, we can use only one cron entry (a rough sketch of this merging follows the example below).
For example:
30,35,40,45,50,55 1 * * *
0,5,10,15,20,25,30,35,40,45,50,55 2 * * *
0,5,10,15,20,25,30,35,40,45,50,55 3 * * *
0,5,10,15,20,25,30,35,40,45,50,55 4 * * *
0 5 * * *
can be replaced with
30,35,40,45,50,55 1 * * *
0,5,10,15,20,25,30,35,40,45,50,55 2-4 * * *
0 5 * * *
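To sketch that merging idea in code: the following is a rough, illustrative Python helper (the function name and the assumption that each entry is a plain 5-field line with a single integer hour field are mine, not AppScale's actual implementation):

def compress_hours(entries):
    """Collapse runs of consecutive single-hour cron entries that share the
    same minute list and trailing fields into one entry with an hour range."""
    parsed = []
    for line in entries:
        minute, hour, rest = line.split(None, 2)
        parsed.append((minute, int(hour), rest))
    merged = []
    i = 0
    while i < len(parsed):
        minute, hour, rest = parsed[i]
        j = i
        # extend the run while the next entry differs only by hour + 1
        while (j + 1 < len(parsed)
               and parsed[j + 1][0] == minute
               and parsed[j + 1][2] == rest
               and parsed[j + 1][1] == parsed[j][1] + 1):
            j += 1
        hour_field = str(hour) if i == j else "%d-%d" % (hour, parsed[j][1])
        merged.append("%s %s %s" % (minute, hour_field, rest))
        i = j + 1
    return merged

entries = [
    "30,35,40,45,50,55 1 * * *",
    "0,5,10,15,20,25,30,35,40,45,50,55 2 * * *",
    "0,5,10,15,20,25,30,35,40,45,50,55 3 * * *",
    "0,5,10,15,20,25,30,35,40,45,50,55 4 * * *",
    "0 5 * * *",
]
print(compress_hours(entries))
# ['30,35,40,45,50,55 1 * * *',
#  '0,5,10,15,20,25,30,35,40,45,50,55 2-4 * * *',
#  '0 5 * * *']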
Hi! Yeah, actually compressing should be done carefully. I think about different one commit. Maybe not even this PR. Is that ok?
Of course! More PRs are always welcome. :)
I will start a build for this branch.
https://ocd.appscale.com:8080/job/Daily Build/2387/
@RaiaN Thank you for your contribution! :)
P.S. Feel free to reach out with more ideas or questions.
|
gharchive/pull-request
| 2016-12-01T17:13:38 |
2025-04-01T04:54:42.755350
|
{
"authors": [
"RaiaN",
"cdonati",
"menivaitsi"
],
"repo": "AppScale/appscale",
"url": "https://github.com/AppScale/appscale/pull/2242",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1751344208
|
Sporadic "key not found errors" in reaching def pass
We need a solution or a workaround for this exception. So far, I have tried adding JAVA_OPTS to increase the heap memory, which didn't work. Merely running the parsedeps command against a repo like scipy or requests is enough to replicate this.
atom parsedeps -l python -o /tmp/atom-deps-sIUOi7/app.atom --slice-outfile /tmp/atom-deps-sIUOi7/slices.json /home/almalinux/work/sandbox/scipy -Dlog4j.configurationFile=/tmp/atom-deps-sIUOi7/log4j2.xml
Data-flow overlay is not detected, applying now
Failure: java.util.NoSuchElementException: key not found: io.shiftleft.codepropertygraph.generated.nodes.Identifier[label=IDENTIFIER; id=2346266]
2023-06-11 00:02:29.712 ERROR CpgPassBase: Pass io.joern.dataflowengineoss.passes.reachingdef.ReachingDefPass failed
java.util.NoSuchElementException: java.util.NoSuchElementException: key not found: io.shiftleft.codepropertygraph.generated.nodes.Identifier[label=IDENTIFIER; id=2346266]
at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77) ~[?:?]
at jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:?]
at java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:499) ~[?:?]
at java.lang.reflect.Constructor.newInstance(Constructor.java:480) ~[?:?]
at java.util.concurrent.ForkJoinTask.getThrowableException(ForkJoinTask.java:562) ~[?:?]
at java.util.concurrent.ForkJoinTask.reportException(ForkJoinTask.java:591) ~[?:?]
at java.util.concurrent.ForkJoinTask.invoke(ForkJoinTask.java:689) ~[?:?]
at java.util.stream.ReduceOps$ReduceOp.evaluateParallel(ReduceOps.java:927) ~[?:?]
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:233) ~[?:?]
at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:693) ~[?:?]
at io.shiftleft.passes.NewStyleCpgPassBase.runWithBuilder(CpgPass.scala:152) ~[io.shiftleft.codepropertygraph_3-1.3.600.jar:1.3.600]
at io.shiftleft.passes.ForkJoinParallelCpgPass.createApplySerializeAndStore(CpgPass.scala:74) ~[io.shiftleft.codepropertygraph_3-1.3.600.jar:1.3.600]
at io.shiftleft.semanticcpg.layers.LayerCreator.runPass(LayerCreator.scala:53) ~[io.joern.semanticcpg_3-1.1.1742.jar:1.1.1742]
at io.joern.dataflowengineoss.layers.dataflows.OssDataFlow.create$$anonfun$1(OssDataFlow.scala:31) ~[io.joern.dataflowengineoss_3-1.1.1742.jar:1.1.1742]
at scala.runtime.function.JProcedure1.apply(JProcedure1.java:15) ~[org.scala-lang.scala3-library_3-3.3.0.jar:3.3.0]
at scala.runtime.function.JProcedure1.apply(JProcedure1.java:10) ~[org.scala-lang.scala3-library_3-3.3.0.jar:3.3.0]
at scala.collection.IterableOnceOps.foreach(IterableOnce.scala:575) ~[org.scala-lang.scala-library-2.13.10.jar:?]
at scala.collection.IterableOnceOps.foreach$(IterableOnce.scala:573) ~[org.scala-lang.scala-library-2.13.10.jar:?]
at scala.collection.AbstractIterator.foreach(Iterator.scala:1300) ~[org.scala-lang.scala-library-2.13.10.jar:?]
at io.joern.dataflowengineoss.layers.dataflows.OssDataFlow.create(OssDataFlow.scala:32) ~[io.joern.dataflowengineoss_3-1.1.1742.jar:1.1.1742]
at io.shiftleft.semanticcpg.layers.LayerCreator.run(LayerCreator.scala:32) ~[io.joern.semanticcpg_3-1.1.1742.jar:1.1.1742]
at io.appthreat.atom.parsedeps.package$.parseDependencies(package.scala:32) ~[io.appthreat.atom-1.0.0.jar:1.0.0]
at io.appthreat.atom.Atom$.generateSlice$$anonfun$2(Atom.scala:250) ~[io.appthreat.atom-1.0.0.jar:1.0.0]
at scala.util.Using$.resource(Using.scala:261) ~[org.scala-lang.scala-library-2.13.10.jar:?]
at io.appthreat.atom.Atom$.generateSlice(Atom.scala:250) ~[io.appthreat.atom-1.0.0.jar:1.0.0]
at io.appthreat.atom.Atom$.run$$anonfun$1$$anonfun$1$$anonfun$1(Atom.scala:114) ~[io.appthreat.atom-1.0.0.jar:1.0.0]
at scala.util.Either.flatMap(Either.scala:352) ~[org.scala-lang.scala-library-2.13.10.jar:?]
at io.appthreat.atom.Atom$.run$$anonfun$1$$anonfun$1(Atom.scala:115) ~[io.appthreat.atom-1.0.0.jar:1.0.0]
at scala.util.Either.flatMap(Either.scala:352) ~[org.scala-lang.scala-library-2.13.10.jar:?]
at io.appthreat.atom.Atom$.run$$anonfun$1(Atom.scala:115) ~[io.appthreat.atom-1.0.0.jar:1.0.0]
at scala.util.Either.flatMap(Either.scala:352) ~[org.scala-lang.scala-library-2.13.10.jar:?]
at io.appthreat.atom.Atom$.run(Atom.scala:115) ~[io.appthreat.atom-1.0.0.jar:1.0.0]
at io.appthreat.atom.Atom$.run(Atom.scala:99) ~[io.appthreat.atom-1.0.0.jar:1.0.0]
at io.appthreat.atom.Atom$.main(Atom.scala:55) ~[io.appthreat.atom-1.0.0.jar:1.0.0]
at io.appthreat.atom.Atom.main(Atom.scala) ~[io.appthreat.atom-1.0.0.jar:1.0.0]
Caused by: java.util.NoSuchElementException: key not found: io.shiftleft.codepropertygraph.generated.nodes.Identifier[label=IDENTIFIER; id=2346266]
at scala.collection.immutable.BitmapIndexedMapNode.apply(HashMap.scala:635) ~[org.scala-lang.scala-library-2.13.10.jar:?]
at scala.collection.immutable.BitmapIndexedMapNode.apply(HashMap.scala:633) ~[org.scala-lang.scala-library-2.13.10.jar:?]
at scala.collection.immutable.HashMap.apply(HashMap.scala:132) ~[org.scala-lang.scala-library-2.13.10.jar:?]
at io.joern.dataflowengineoss.passes.reachingdef.ReachingDefFlowGraph.pred(ReachingDefProblem.scala:72) ~[io.joern.dataflowengineoss_3-1.1.1742.jar:1.1.1742]
at io.joern.dataflowengineoss.passes.reachingdef.ReachingDefFlowGraph.pred(ReachingDefProblem.scala:71) ~[io.joern.dataflowengineoss_3-1.1.1742.jar:1.1.1742]
at io.joern.dataflowengineoss.passes.reachingdef.DataFlowSolver.$anonfun$1(DataFlowSolver.scala:20) ~[io.joern.dataflowengineoss_3-1.1.1742.jar:1.1.1742]
at scala.collection.StrictOptimizedIterableOps.flatMap(StrictOptimizedIterableOps.scala:118) ~[org.scala-lang.scala-library-2.13.10.jar:?]
at scala.collection.StrictOptimizedIterableOps.flatMap$(StrictOptimizedIterableOps.scala:105) ~[org.scala-lang.scala-library-2.13.10.jar:?]
at scala.collection.mutable.ListBuffer.flatMap(ListBuffer.scala:39) ~[org.scala-lang.scala-library-2.13.10.jar:?]
at io.joern.dataflowengineoss.passes.reachingdef.DataFlowSolver.calculateMopSolutionForwards(DataFlowSolver.scala:33) ~[io.joern.dataflowengineoss_3-1.1.1742.jar:1.1.1742]
at io.joern.dataflowengineoss.passes.reachingdef.ReachingDefPass.runOnPart(ReachingDefPass.scala:31) ~[io.joern.dataflowengineoss_3-1.1.1742.jar:1.1.1742]
at io.joern.dataflowengineoss.passes.reachingdef.ReachingDefPass.runOnPart(ReachingDefPass.scala:23) ~[io.joern.dataflowengineoss_3-1.1.1742.jar:1.1.1742]
at io.shiftleft.passes.NewStyleCpgPassBase$$anon$2.accept(CpgPass.scala:147) ~[io.shiftleft.codepropertygraph_3-1.3.600.jar:1.3.600]
at io.shiftleft.passes.NewStyleCpgPassBase$$anon$2.accept(CpgPass.scala:146) ~[io.shiftleft.codepropertygraph_3-1.3.600.jar:1.3.600]
at java.util.stream.ReduceOps$4ReducingSink.accept(ReduceOps.java:220) ~[?:?]
at java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:992) ~[?:?]
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509) ~[?:?]
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499) ~[?:?]
at java.util.stream.ReduceOps$ReduceTask.doLeaf(ReduceOps.java:960) ~[?:?]
at java.util.stream.ReduceOps$ReduceTask.doLeaf(ReduceOps.java:934) ~[?:?]
at java.util.stream.AbstractTask.compute(AbstractTask.java:327) ~[?:?]
at java.util.concurrent.CountedCompleter.exec(CountedCompleter.java:754) ~[?:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:373) ~[?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1182) ~[?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1655) ~[?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1622) ~[?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:165) ~[?:?]
cc: @mpollmeier
Added a temporary workaround by adding custom passes that set the queue length to 1
|
gharchive/issue
| 2023-06-11T09:35:54 |
2025-04-01T04:54:42.758990
|
{
"authors": [
"prabhu"
],
"repo": "AppThreat/atom",
"url": "https://github.com/AppThreat/atom/issues/26",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1654186122
|
Pin click==8.0.4
See: https://github.com/psf/black/issues/2964#issuecomment-1080971383.
This is also the version we pin click to in applied2.
Seeing this error without click pinned
Traceback (most recent call last):
File "/home/young/.local/bin/black", line 8, in <module>
sys.exit(patched_main())
File "/home/young/.local/lib/python3.8/site-packages/black/__init__.py", line 6609, in patched_main
patch_click()
File "/home/young/.local/lib/python3.8/site-packages/black/__init__.py", line 6598, in patch_click
from click import _unicodefun # type: ignore
ImportError: cannot import name '_unicodefun' from 'click' (/home/young/.local/lib/python3.8/site-packages/click/__init__.py)
Does this give substantially better UX than, say, pip install click==8.0.4? We're working on removing our dependency on this fork in favor of using vanilla black and would rather avoid touching this unless necessary.
we're deprecating this fork soon anyway, so closing this
|
gharchive/pull-request
| 2023-04-04T16:14:46 |
2025-04-01T04:54:42.882364
|
{
"authors": [
"jl-applied",
"youngxguo"
],
"repo": "AppliedIntuition/black",
"url": "https://github.com/AppliedIntuition/black/pull/1",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
834023276
|
Couldn't get BlueALSA PCM list: Rejected send message
Hello.
I'm brand new to the world of Bluetooth, and I hope you can point what I (probably) did wrong.
I have my Jabra headset paired to my computer.
I set up my /etc/asound.conf
pcm.btheadset {
type plug
slave {
pcm {
type bluealsa
device 30:50:75:43:F4:AE
profile "a2dp"
}
}
hint {
show on
description "BT Headset"
}
}
Then I run bluealsa ...
% sudo src/bluealsa
src/bluealsa: D: ../../src/bluez.c:497: Creating media endpoint object: /org/bluez/hci0/A2DP/SBC/Source/1
src/bluealsa: D: ../../src/bluez.c:413: Registering media endpoint: /org/bluez/hci0/A2DP/SBC/Source/1
src/bluealsa: D: ../../src/bluez.c:497: Creating media endpoint object: /org/bluez/hci0/A2DP/SBC/Source/2
src/bluealsa: D: ../../src/bluez.c:413: Registering media endpoint: /org/bluez/hci0/A2DP/SBC/Source/2
src/bluealsa: D: ../../src/bluez.c:792: Creating hands-free profile object: /org/bluez/HSP/AudioGateway
src/bluealsa: D: ../../src/bluez.c:726: Registering hands-free profile: /org/bluez/HSP/AudioGateway
src/bluealsa: D: ../../src/bluez.c:792: Creating hands-free profile object: /org/bluez/HFP/AudioGateway
src/bluealsa: D: ../../src/bluez.c:726: Registering hands-free profile: /org/bluez/HFP/AudioGateway
src/bluealsa: D: ../../src/main.c:396: Acquiring D-Bus service name: org.bluealsa
src/bluealsa: D: ../../src/main.c:401: Starting main dispatching loop
But none of the tools are succeeding.
% amixer -D bluealsa
ALSA lib ../../../src/asound/bluealsa-ctl.c:972:(_snd_ctl_bluealsa_open) Couldn't get BlueALSA PCM list: Rejected send message, 1 matched rules; type="method_call", sender=":1.186" (uid=1000 pid=23605 comm="amixer -D bluealsa ") interface="org.bluealsa.Manager1" member="GetPCMs" error name="(unset)" requested_reply="0" destination="org.bluealsa" (uid=0 pid=22707 comm="src/bluealsa ")
amixer: Mixer attach bluealsa error: No such device
Can you give me a push?
OK, I think I found my first mistake. I had to be running bluealso before I paired the headphones. Now I run bluealsa and I have this:
src/bluealsa: D: ../../src/dbus.c:59: Called: org.bluez.Profile1.NewConnection() on /org/bluez/HFP/AudioGateway
src/bluealsa: D: ../../src/ba-rfcomm.c:1285: Created new RFCOMM thread [ba-rfcomm]: HFP Audio Gateway (CVSD)
src/bluealsa: D: ../../src/ba-rfcomm.c:901: Starting RFCOMM loop: HFP Audio Gateway (CVSD)
src/bluealsa: D: ../../src/bluez.c:616: HFP Audio Gateway (CVSD) configured for device 30:50:75:43:F4:AE
src/bluealsa: D: ../../src/ba-transport.c:669: Starting transport: HFP Audio Gateway (CVSD)
src/bluealsa: D: ../../src/sco.c:245: IO loop: START: sco_thread: HFP Audio Gateway (CVSD)
src/bluealsa: D: ../../src/ba-transport.c:1043: Created new transport thread [ba-sco]: HFP Audio Gateway (CVSD)
src/bluealsa: D: ../../src/at.c:161: AT message: SET: command:+BRSF, value:923
src/bluealsa: D: ../../src/ba-rfcomm.c:107: Sending AT message: RESP: command:+BRSF, value:2272
src/bluealsa: D: ../../src/ba-rfcomm.c:107: Sending AT message: RESP: command:(null), value:OK
src/bluealsa: D: ../../src/ba-rfcomm.c:126: RFCOMM: HFP Audio Gateway (CVSD) state transition: 0 -> 2
src/bluealsa: D: ../../src/at.c:161: AT message: TEST: command:+CIND, value:(null)
src/bluealsa: D: ../../src/ba-rfcomm.c:107: Sending AT message: RESP: command:+CIND, value:("service",(0-1)),("call",(0,1)),("callsetup",(0-3)),("callheld",(0-2)),("signal",(0-5)),("roam",(0-1)),("battchg",(0-5))
src/bluealsa: D: ../../src/ba-rfcomm.c:107: Sending AT message: RESP: command:(null), value:OK
src/bluealsa: D: ../../src/ba-rfcomm.c:126: RFCOMM: HFP Audio Gateway (CVSD) state transition: 2 -> 5
src/bluealsa: D: ../../src/at.c:161: AT message: GET: command:+CIND, value:(null)
src/bluealsa: D: ../../src/ba-rfcomm.c:107: Sending AT message: RESP: command:+CIND, value:0,0,0,0,0,0,5
src/bluealsa: D: ../../src/ba-rfcomm.c:107: Sending AT message: RESP: command:(null), value:OK
src/bluealsa: D: ../../src/ba-rfcomm.c:126: RFCOMM: HFP Audio Gateway (CVSD) state transition: 5 -> 7
src/bluealsa: D: ../../src/at.c:161: AT message: SET: command:+CMER, value:3, 0, 0, 1
src/bluealsa: D: ../../src/ba-rfcomm.c:107: Sending AT message: RESP: command:(null), value:OK
src/bluealsa: D: ../../src/ba-rfcomm.c:126: RFCOMM: HFP Audio Gateway (CVSD) state transition: 7 -> 8
src/bluealsa: D: ../../src/ba-rfcomm.c:126: RFCOMM: HFP Audio Gateway (CVSD) state transition: 8 -> 9
src/bluealsa: D: ../../src/at.c:161: AT message: SET: command:+VGS, value:07
src/bluealsa: D: ../../src/ba-rfcomm.c:107: Sending AT message: RESP: command:(null), value:OK
src/bluealsa: D: ../../src/at.c:161: AT message: SET: command:+VGM, value:09
src/bluealsa: D: ../../src/ba-rfcomm.c:107: Sending AT message: RESP: command:(null), value:OK
src/bluealsa: D: ../../src/at.c:161: AT message: SET: command:+XAPL, value:0B0E-BABE-0123,14
src/bluealsa: D: ../../src/ba-rfcomm.c:107: Sending AT message: RESP: command:(null), value:+XAPL=BlueALSA,6
src/bluealsa: D: ../../src/ba-rfcomm.c:107: Sending AT message: RESP: command:(null), value:OK
src/bluealsa: D: ../../src/at.c:161: AT message: GET: command:+BTRH, value:(null)
src/bluealsa: D: ../../src/ba-rfcomm.c:107: Sending AT message: RESP: command:(null), value:OK
src/bluealsa: D: ../../src/at.c:161: AT message: SET: command:+BIA, value:0,1,1,1,0,0,0,0
src/bluealsa: D: ../../src/ba-rfcomm.c:107: Sending AT message: RESP: command:(null), value:OK
src/bluealsa: D: ../../src/dbus.c:59: Called: org.bluez.MediaEndpoint1.SelectConfiguration() on /org/bluez/hci0/A2DP/SBC/Source/1
src/bluealsa: D: ../../src/bluez.c:922: Signal: org.freedesktop.DBus.ObjectManager.InterfacesAdded()
src/bluealsa: D: ../../src/dbus.c:59: Called: org.bluez.MediaEndpoint1.SetConfiguration() on /org/bluez/hci0/A2DP/SBC/Source/1
src/bluealsa: D: ../../src/a2dp.c:723: Selected A2DP SBC bit-pool range: [2, 53]
src/bluealsa: D: ../../src/bluez.c:298: A2DP Source (SBC) configured for device 30:50:75:43:F4:AE
src/bluealsa: D: ../../src/bluez.c:301: Configuration: channels: 2, sampling: 48000
src/bluealsa: D: ../../src/bluez.c:1110: Signal: org.freedesktop.DBus.Properties.PropertiesChanged(): org.bluez.MediaTransport1: Delay
but I'm still having no luck with
% amixer -D bluealsa
ALSA lib ../../../src/asound/bluealsa-ctl.c:972:(_snd_ctl_bluealsa_open) Couldn't get BlueALSA PCM list: Rejected send message, 1 matched rules; type="method_call", sender=":1.202" (uid=1000 pid=25588 comm="amixer -D bluealsa ") interface="org.bluealsa.Manager1" member="GetPCMs" error name="(unset)" requested_reply="0" destination="org.bluealsa" (uid=0 pid=22707 comm="src/bluealsa ")
amixer: Mixer attach bluealsa error: No such device
Rejected send message, 1 matched rules
This is a message from the D-Bus daemon, which has its own ACL system (separate from UNIX's DAC). You need a properly configured /etc/dbus-1/system.d/bluealsa.conf file (assuming that on your distro this config file is in that location). By default only root and users in the audio group are allowed to connect to the bluealsa service.
You're right ... things work fine for root so it must be a permission error.
However, my user is in the audio group, so I'm not sure what the next step is.
% groups
hymie root disk lp floppy dialout audio video cdrom netdev scanner vboxusers
=====
I (think I) figured it out. Even though my groups listing includes "audio", I was not actually listed under the "audio" group in /etc/group. Once I added myself there, things are working.
Thank you for the help!
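For anyone hitting the same confusion, here is a small illustrative Python check (standard library only; an editor's sketch, not part of BlueALSA) that compares the groups of the running session with what /etc/group actually records:

import grp
import os
import pwd

def audio_group_membership(group_name="audio"):
    """Return (in_current_session, listed_in_etc_group) for the current user."""
    user = pwd.getpwuid(os.getuid()).pw_name
    group = grp.getgrnam(group_name)
    in_session = group.gr_gid in os.getgroups()   # groups of this login session
    in_database = user in group.gr_mem            # supplementary members in /etc/group
    return in_session, in_database

print(audio_group_membership())  # e.g. (True, False) matches the situation above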
|
gharchive/issue
| 2021-03-17T17:33:04 |
2025-04-01T04:54:43.410894
|
{
"authors": [
"Arkq",
"hymie0"
],
"repo": "Arkq/bluez-alsa",
"url": "https://github.com/Arkq/bluez-alsa/issues/426",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
1652097994
|
undefined reference to "make_timeout_time_ms"
please
This can be fixed by adding including "pico/time.h" in src/lorawan.c
diff --git a/src/lorawan.c b/src/lorawan.c
index 9e4cec3..dc24e60 100644
--- a/src/lorawan.c
+++ b/src/lorawan.c
@@ -28,6 +28,7 @@
#include <string.h>
#include "pico/lorawan.h"
+#include "pico/time.h"
#include "board.h"
#include "rtc-board.h"
@lingmaple did you get a changes to try @jerryneedell's suggestion above (https://github.com/ArmDeveloperEcosystem/lorawan-library-for-pico/issues/34#issuecomment-1494472699) ?
|
gharchive/issue
| 2023-04-03T13:42:06 |
2025-04-01T04:54:43.421202
|
{
"authors": [
"jerryneedell",
"lingmaple",
"sandeepmistry"
],
"repo": "ArmDeveloperEcosystem/lorawan-library-for-pico",
"url": "https://github.com/ArmDeveloperEcosystem/lorawan-library-for-pico/issues/34",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
461299470
|
Slow compilation during development
[1] DONE Compiled successfully in 18193ms11:51:49
[1]
[1] No type errors found
[1] Version: typescript 3.5.2
[1] Time: 12408ms
Why is a single change so slow to rebuild, 10-20s?
A rebuild taking ten-odd seconds is normal, and hot-reload basically works, so why worry about full recompiles?
Is it hot-reload that's slow, or something else? If it's hot-reload, I can find some time to look into it.
Is it hot-reload that's slow, or something else? If it's hot-reload, I can find some time to look into it.
It's hot-reload. This is on an i7 PC, and I've tested two machines with the same result.
I wanted to trace each module to see timing statistics, but that meant following it into the source code, so I haven't looked further for now.
OK, I'll find time to take a look.
@cjz9032 try it again, it should be quite a bit faster now; probably only 2-5s on average?
@cjz9032 if there are no other problems, please close this issue yourself.
@cjz9032 if there are no other problems, please close this issue yourself.
Thank you! I was in too much of a rush and never saw this until now~~~
|
gharchive/issue
| 2019-06-27T03:53:05 |
2025-04-01T04:54:43.424693
|
{
"authors": [
"Armour",
"cjz9032",
"tyzero"
],
"repo": "Armour/vue-typescript-admin-template",
"url": "https://github.com/Armour/vue-typescript-admin-template/issues/68",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
411429538
|
Design UI/UX
Design the user interface of our application
Designed and uploaded the document containing the main page #21
Uploaded finished prototype for the chat view.
|
gharchive/issue
| 2019-02-18T11:25:28 |
2025-04-01T04:54:43.440764
|
{
"authors": [
"AJunque9",
"emiliocortina"
],
"repo": "Arquisoft/dechat_en2b",
"url": "https://github.com/Arquisoft/dechat_en2b/issues/13",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
873821607
|
Stauffer branch
read. me transfer
README versions merged
further edits and refinement necessary.
|
gharchive/pull-request
| 2021-05-02T02:42:26 |
2025-04-01T04:54:43.447210
|
{
"authors": [
"ArtTucker",
"dagibbins186"
],
"repo": "ArtTucker/mental_health_and_economics",
"url": "https://github.com/ArtTucker/mental_health_and_economics/pull/23",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
347844881
|
Generate UTM vector tile sets for all codes
See https://www.gdal.org/drv_mvt.html
TILING_SCHEME=crs,tile_origin_upper_left_x,tile_origin_upper_left_y, tile_dimension_zoom_0: Define a custom tiling scheme with a CRS (typically given as EPSG:XXXX), the coordinates of the upper-left corner of the upper-left tile (0,0) in the CRS, and the dimension of the tile at zoom level 0. Only available for FORMAT=DIRECTORY. The standard WebMercator tiling scheme would be defined by "EPSG:3857,-20037508.343,20037508.343,40075016.686". A tiling scheme for WGS84 geodetic could be "EPSG:4326,-180,180,360". The tiling scheme for Finnish ETRS-TM35FIN (EPSG:3067) is "EPSG:3067,-548576,8388608,2097152". When using such as custom tiling scheme, the 'crs', 'tile_origin_upper_left_x', 'tile_origin_upper_left_y' and 'tile_dimension_zoom_0' entries are added to the metadata.json, and are honoured by the OGR MVT reader.
MVT in GDAL requires version > 2.3. Not so easy to find binaries for this on Windows. OSGeo4W currently has 2.2.4.
Not so easy to find binaries for this on Windows.
Go to the GDAL home page and click Download. There you will find links for downloading precompiled binaries for Windows.
I built on Linux and that wasn't hard either:
./configure
make
They also have a reasonably good description for Windows at https://trac.osgeo.org/gdal/wiki/BuildingOnWindows
Ready for testing:
https://test.artsdatabanken.no/data/25833/
Is it easy to get these with codes following the same template as elsewhere, with one prefix in each file?
NA, BS, MI, AO...
Looks like something failed during the build or transfer, because only A through BS_7SN-SN is there.. the rest of the alphabet is missing.
Basically we just need to swap out tippecanoe for GDAL (https://github.com/Artsdatabanken/grunnkart-dataflyt/blob/master/bin/export/02_createAllMbtiles.bat), so it should go quickly.
The files are missing because tiles were only created for the first 1000 codes.
|
gharchive/issue
| 2018-08-06T09:23:21 |
2025-04-01T04:54:43.508898
|
{
"authors": [
"bjornreppen",
"jarped"
],
"repo": "Artsdatabanken/grunnkart-dataflyt",
"url": "https://github.com/Artsdatabanken/grunnkart-dataflyt/issues/55",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
294552538
|
Cover images for NiN types
Coverage decreased (-0.1%) to 30.926% when pulling a192e11063bcac8daa0185b14a46aca60d132337 on bjornreppen:nin-foto into 7bf8cb87ee09482ec4ec7c178cae19a41f89ba62 on Artsdatabanken:master.
|
gharchive/pull-request
| 2018-02-05T21:42:47 |
2025-04-01T04:54:43.511786
|
{
"authors": [
"bjornreppen",
"coveralls"
],
"repo": "Artsdatabanken/ratatouille",
"url": "https://github.com/Artsdatabanken/ratatouille/pull/66",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1166662713
|
Nice rat
https://www.virustotal.com/gui/file/4d426e3e4f35223ed1b73d8ec6ae1d9d1c83bbdd0ec5a6c0b6e1495ab97b1182
ofc the virus is gonna triggered by unauthorized software? get some knowledge of files and how it works before posting this ss
can't see anywhere it's our file so keep on making fake bs
can't see anywhere it's our file so keep on making fake bs
I FUCKING SENT THE LINK, HOW RETARDED ARE YOU?
AND I KNOW FILES, I CODE PY ASWELL.
Bro its not malware lol pyinstaller use hook methods to make python runtime run the code without install python xD
|
gharchive/issue
| 2022-03-11T17:13:03 |
2025-04-01T04:54:43.549479
|
{
"authors": [
"AsjadOooO",
"H-nta1",
"Hellscap3d",
"wolfff21123"
],
"repo": "AsjadOooO/Zero-attacker",
"url": "https://github.com/AsjadOooO/Zero-attacker/issues/20",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
356395071
|
Handle Validation onBlur
Hi, do you have any advice on how to handle validation onBlur with this library?
I was thinking of a workaround with handling onBlur myself and then doing something like SetFocussedField("email"), and checking that before showing the validation message. But probably it could be done in a nicer way.
I'm willing to make a PR if this is a feature that more people would like :)
The library itself does not yet provide an onBlur handler; we recommend bringing your own handler for now.
That is indeed interesting to have; we didn't have this use case before (of keeping the focused field state).
probably something like?
<Form etc>
...({ focusedField: FormParams.fields, setFocusedField: FormParams.fields => unit, handleChange } => {
})
</Form>
Yeah, I think focusedField could work. It would let the user decide themselves what they want to do with that info.
My use-case is only showing the validation message if there is an error and the user is already done with the input (not focused anymore).
I'll see if I can set something up soon.
I just came across this exact use case also @happylinks!
I also want my user to be able to lose focus of an email field before they're attacked by an error message, because their first keystroke will trigger that error regardless.
My thought-solution was decoupling validation from onChange but this way is probably more elegant.
This is now released in 5.1.1-beta.6. Closing this issue :)
|
gharchive/issue
| 2018-09-03T07:54:26 |
2025-04-01T04:54:43.616443
|
{
"authors": [
"MarcelCutts",
"grsabreu",
"happylinks"
],
"repo": "Astrocoders/reform",
"url": "https://github.com/Astrocoders/reform/issues/57",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2098453950
|
Scary ReagentDispencer Transfer
PR description
Solution containers in the chemical dispenser have an unlimited supply of reagents.
Since a solution container with any reagent can be stuffed into the chemical dispenser, the dispenser makes it possible to obtain any reagent in unlimited quantities.
Will this be abused? Of course it will.
Is that bad? I don't think so.
Changes
:cl:
tweak: The chemical dispenser does scary things.
I don't get the point, what scary things are these?
|
gharchive/pull-request
| 2024-01-24T14:56:40 |
2025-04-01T04:54:43.619052
|
{
"authors": [
"dttric",
"yglop"
],
"repo": "Astroplex-SS14/space-station-14",
"url": "https://github.com/Astroplex-SS14/space-station-14/pull/5",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
660864874
|
Has anyone begun work to implement support for 4.0.1? Especially for the CMS Final rule?
Do you want to request a feature, report a bug, or improve documentation?
If you are reporting a bug?
What is the current behavior?
What is the expected behavior?
What are the steps to reproduce?
What OS are you using and what version of node.js and @asymmetrik/node-fhir-server-core are you running?
Not that I know of. Do you have documentation on the CMS Final rule I could check out?
High-level details are here: https://www.cms.gov/Regulations-and-Guidance/Guidance/Interoperability/index
Links to specs are about 1/3 of the way down the page.
I would like to follow up on this. Is there any planned support for FHIR 4.0.1?
No answer means this should eventually be forked, in order to support 4.0.1
I opened https://github.com/Asymmetrik/node-fhir-server-core/pull/303 for this
4.0.1 support has been added
|
gharchive/issue
| 2020-07-19T14:22:23 |
2025-04-01T04:54:43.632915
|
{
"authors": [
"hank-lenzi",
"luan-dev",
"matthewjcable",
"mgramigna",
"zeevosec"
],
"repo": "Asymmetrik/node-fhir-server-core",
"url": "https://github.com/Asymmetrik/node-fhir-server-core/issues/239",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
848497499
|
Adding Leadership Society's Enterprise Hackathon
I'm one of the organisers for Leadership Society's Enterprise Hackathon (based in Newcastle University), and we're hosting our Regional Enterprise Hackthon soon. Let me know if there's anything I need to change!
Everything looks fine, so I'll merge it in now!
|
gharchive/pull-request
| 2021-04-01T14:02:16 |
2025-04-01T04:54:43.651913
|
{
"authors": [
"bahorn",
"normansophie789"
],
"repo": "Athons/wiki",
"url": "https://github.com/Athons/wiki/pull/241",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
202808712
|
Minor update of documentation for Translatable
Update Translatable documentation to warn users of performance cap.
https://github.com/Atlantic18/DoctrineExtensions/issues/1502#issuecomment-171011176
https://github.com/Atlantic18/DoctrineExtensions/issues/860#issuecomment-26154908
Well, there are many users who use it in production for larger projects. The point is that by default translatable casts the foreign key to string (which is not optimal if you have an integer). In that case the optimization is to override the mapping of the foreign key and change the field to integer.
It could instead mention that it may handle up to million-row tables optimally with certain optimizations, like using personal translations or overriding the mapping of the foreign key to prevent casting in the database.
@l3pp4rd not millions. 50k translation rows is enough to hit the performance cap mentioned in #1512
Yes, but it mentions exactly that: you need to modify the type of the foreignKey to improve the performance.
Have fun with the blazing speed of your DB queries. (In some cases we have accelerated our queries by more than 500 times: 130 milliseconds instead of 72 seconds)
So it is capable of handling large row counts if the foreignKey is of the correct type to prevent casting. The documentation should mention that and point to that issue, so users know to update the metadata for this column.
If we said that it is capable of handling only small sets of data, that would not be true, because you can apply this optimization.
Hello, and thank you for your contribution.
In the interest of spring cleaning and package modernization, I'm going through and closing old issues and pull requests that have had no recent activity, do not have associated tests, or have failing tests.
If you are still experiencing this issue or wish to contribute an up-to-date pull request, please create a new issue.
Thank you!
|
gharchive/pull-request
| 2017-01-24T12:46:15 |
2025-04-01T04:54:43.658639
|
{
"authors": [
"AkenRoberts",
"TriAnMan",
"l3pp4rd",
"svgrafov"
],
"repo": "Atlantic18/DoctrineExtensions",
"url": "https://github.com/Atlantic18/DoctrineExtensions/pull/1742",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2425464616
|
[Feature] 1.21 Compatibility?
Describe the solution you'd like
I am looking to see if this mod will be updated to 1.21.
From Minecraft 1.21+, I separated the mod project.
WTHIT
curseforge: https://www.curseforge.com/minecraft/mc-mods/your-reputation-wthit
modrinth: https://modrinth.com/mod/your-reputation-wthit
Jade
curseforge: https://www.curseforge.com/minecraft/mc-mods/your-reputation-jade
modrinth: https://modrinth.com/mod/your-reputation-jade
From Minecraft 1.21+, I separated the mod project.
WTHIT
curseforge: https://www.curseforge.com/minecraft/mc-mods/your-reputation-wthit
modrinth: https://modrinth.com/mod/your-reputation-wthit
Jade
curseforge: https://www.curseforge.com/minecraft/mc-mods/your-reputation-jade
modrinth: https://modrinth.com/mod/your-reputation-jade
This is due to issues #33 and #34.
I do apologize for seeing this only now. Thank you for the update!
|
gharchive/issue
| 2024-07-23T15:11:21 |
2025-04-01T04:54:43.715374
|
{
"authors": [
"Aton-Kish",
"Etanarvazac"
],
"repo": "Aton-Kish/your-reputation",
"url": "https://github.com/Aton-Kish/your-reputation/issues/44",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2268018697
|
Oscillator only plays for a short time then stops in a physical device
macOS Version(s) Used to Build
macOS 13 Ventura
Xcode Version(s)
Xcode 14
Description
I've tried writing a very simple Oscillator that just plays a certain frequency. It works on the emulator, but once I deployed it to my iPhone (12 mini, iOS version 17.4.1) it only plays for a short while and then stops (about 1 second). I've tried checking whether the Oscillator really started using isStarted, and it prints out true.
Are there any configuration settings that I need to set up beforehand? I'm fairly new to AudioKit (and Swift in general) and was hoping someone could point me in the right direction. Here's my code:
public class HertzPlugin:CAPPlugin {
let engine = AudioEngine()
let oscillator = Oscillator()
@objc func play(_ call: CAPPluginCall) {
AudioKit.Settings.enableLogging = true
let frequency = call.getFloat("value", 0)
oscillator.stop()
oscillator.frequency = frequency
oscillator.amplitude = 1.0
engine.output = oscillator
do {
try engine.start()
} catch let err {
print("error")
Log(err)
}
print("playing...")
oscillator.start()
print("played \(oscillator.isStarted)") // always returns true
call.resolve(["value": frequency])
}
@objc func stop(_ call: CAPPluginCall) {
if (oscillator.isStarted == true) {
oscillator.stop()
engine.stop()
}
call.resolve()
}
}
I'm running this using an M2 Macbook Air, MacOs Sonoma 14.4.1
Crash Logs, Screenshots or Other Attachments (if applicable)
No response
Try out the Cookbook Oscillator example and see if it is giving you the same issue. https://github.com/AudioKit/Cookbook
My guess is that you need to set the AVAudioSession category in your main app file like this:
init() {
#if os(iOS)
do {
Settings.bufferLength = .medium
try AVAudioSession.sharedInstance().setPreferredIOBufferDuration(Settings.bufferLength.duration)
try AVAudioSession.sharedInstance().setCategory(.playback,
options: [.mixWithOthers, .allowBluetoothA2DP])
try AVAudioSession.sharedInstance().setActive(true)
} catch let err {
print(err)
}
#endif
}
That worked really well! Closing this issue. Thanks for the help! Is the Cookbook the official place to find all the necessary information regarding AudioKit? I'm very new to all of this, and I'm having trouble navigating the documentation. I don't even know how to run the Cookbook; any suggestions on where absolute beginners like me should start?
The Cookbook showcases most of the various pieces of AudioKit and how to implement them. I started learning AudioKit with the Cookbook. You can download the project from GitHub and run it like any other Xcode project (unless you're getting some weird compiler errors). You can also find some tutorials on YouTube that should help in going up the AudioKit learning curve.
|
gharchive/issue
| 2024-04-29T02:49:38 |
2025-04-01T04:54:43.747067
|
{
"authors": [
"NickCulbertson",
"vycoder"
],
"repo": "AudioKit/AudioKit",
"url": "https://github.com/AudioKit/AudioKit/issues/2914",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2447442120
|
Remove PDFs
The amount of PDFs greatly bloats the size of the repository unnecessarily, especially since these resources can be accessed from NTULearn anyway.
ok nigga
well played
|
gharchive/issue
| 2024-08-05T02:00:26 |
2025-04-01T04:54:43.755952
|
{
"authors": [
"cadzchua",
"nonnoxer",
"yuurraa"
],
"repo": "AudricY/dscc-hub",
"url": "https://github.com/AudricY/dscc-hub/issues/2",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
322538837
|
Feedback
Hey all -- wasn't sure where the best place to put this was, so figured I'd just open up an issue!
I spent 30 minutes playing around with Augur today and wanted to share some high level feedback from a first time user. First thing I want to say is: wow! This has come a long way. The UI is beautiful, the sections make sense, the interface feels snappy and exciting. Amazing work!
Some smaller nits:
They distinction between Buy/Sell is confusing. On this trade page, I am very confused about how to bet "No", I do not think they will win the championship. I executed a "Sell" (though I don't think I owned anything?) and it seems to have executed. Does this mean I bet against it? I also issue a "Buy" and the order is still pending? I'm pretty confused by what's going on here. It would be amazing if in the UI there was some Augur 101 stuff that walked me through the basics of how I invest on either side of a prediction.
I am confused by this market. How do I actually pick a degree that I think it will be? I can place a "Buy" or a "Sell", but I can't figure out what this actually means. This is similar feedback to above, but since it's not a binary prediction, I think I require even more UI / UX help.
When I place an order, there's not status or indicator that my transaction is pending. It would be amazing if the little prompt that shows up bottom right also included a link to something like Etherscan where I could track status. Also, it would be great if pending transactions could show up in the My Positions section with a pending status.
On the Porfolio page, I'm not sure what this chart is. It was there before I had an positions, is it the entire market? Fake data?
Anyways, just wanted to dump out thoughts in case they were useful. Really excited to see this continuing to mature 👏 🎉 💛
cc @joeykrug
Thanks! This is [one of] the right place[s]!
Yep, it means you short sold / bet against it. It is kinda confusing; there'll be a tutorial walkthrough explaining it all in the UI eventually. Essentially it's a trading platform first, then there'll be a simpler UI skin that makes it easier / simpler to use for betting, which should also help with this issue. Did your buy order ever finish pending? Or does it instead say "open order" [this is like an open unfilled order on GDAX, if that's what it says]
Ah yeah, so you don't actually pick an exact degree it'll be; what you do is buy / sell or go long / short the market at a given price. So for instance, say the current prediction is 0.27 degrees and you think it'll get up to 20 degrees. Just as in a binary market you would buy up to, say, 60% odds or a price of 0.60 and enter 0.60 as your price in the buy box, here you'd enter 20 as your price, plus the amount you want to buy.
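To put rough numbers on that, here is a small Python sketch of generic scalar-market arithmetic (purely illustrative: the bounds, share count and payout formula are assumptions, not Augur's actual settlement math or fees):

def scalar_long_pnl(entry_price, settle_value, shares, min_value, max_value):
    """Profit/loss of a long position in a generic scalar market:
    each share gains or loses one unit per degree the outcome settles
    above or below the entry price, clamped to the market bounds."""
    settle = min(max(settle_value, min_value), max_value)
    return (settle - entry_price) * shares

# Hypothetical: buy 10 shares at the current prediction of 0.27 degrees,
# market bounded at [-5, 30] degrees, outcome settles at 20 degrees.
print(scalar_long_pnl(0.27, 20.0, 10, -5.0, 30.0))  # 197.3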
I agree, good call x2
Yeah, that's a bug I've reported as well, it's fake data haha
🙏
|
gharchive/issue
| 2018-05-12T20:00:31 |
2025-04-01T04:54:43.762033
|
{
"authors": [
"jessepollak",
"joeykrug"
],
"repo": "AugurProject/augur",
"url": "https://github.com/AugurProject/augur/issues/1574",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
434912810
|
Scroll bar issue - multiple scrollbars
@chwy is seeing 2 individual scrollbars show up in trade history on Windows. Could you supply a screenshot?
I get two types of scroll bar:
1 default:
1 active:
Can we discuss a better solution for this
Closing - this is the default behavior in Chrome. You get that one type of bar when scrolling BUT if you move and hold your mouse inside the SCROLL BAR path Chrome will draw the scroll bar margins.
|
gharchive/issue
| 2019-04-18T18:59:47 |
2025-04-01T04:54:43.765111
|
{
"authors": [
"Chwy5",
"andrewdamelio",
"matt-bullock"
],
"repo": "AugurProject/augur",
"url": "https://github.com/AugurProject/augur/issues/1730",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
523602102
|
Design QA: Chat - Chat screen
Design: https://www.figma.com/file/fLWVwmanAwetVZbujQquEi/Market-Page?node-id=194%3A3832
[x] Centre align text for '205 Online'
[x] Adjust height of chat input box to match design. Currently it looks small leaving a large gap at the bottom
[x] Update username text style
[x] Check user's messages are using colour text/primary
[x] 'Type your message' text. Make colour text/secondary at 75% opacity
[x] Active input text should be colour text/primary at 100% opacity
[x] Check types style for input - might need updating to larger font size
Oops
|
gharchive/issue
| 2019-11-15T17:40:38 |
2025-04-01T04:54:43.768311
|
{
"authors": [
"bconfortin",
"matt-bullock"
],
"repo": "AugurProject/augur",
"url": "https://github.com/AugurProject/augur/issues/4797",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
537658950
|
No alerts confirming REP migration
There's currently no confirmation alert when the REP is migrated to V2. Nothing happens...
Ran it by @Chwy5: show the same spinner proposed for notifications when a transaction is being processed.
show spinning icon when migration transaction is being processed.
Shows up in the bell dropdown but not as a toast. Is toast required?
+1 we should enable the toast for this, it's an important transaction, we should make the confirmation state as visible as possible
toast and bell now there
|
gharchive/issue
| 2019-12-13T16:58:38 |
2025-04-01T04:54:43.770649
|
{
"authors": [
"cillianhunter0",
"matt-bullock"
],
"repo": "AugurProject/augur",
"url": "https://github.com/AugurProject/augur/issues/5238",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
616972186
|
Eth Reserve Top off messaging - See designs in comment
Add the dai equivalent of eth reserve balance to the current tooltip....like "Your total funds does not include the Eth reserve balance of $7.94"
Inform the user when their ETH reserves get topped-off.
[x] bell alert
[x] order form ETH reserve top-off
Edit copy in ETH reserve tooltip in account
[x] tooltip
Augur runs on a peer-to-peer network which requires transaction fees that are paid in ETH. These fees go entirely to the network and Augur doesn’t collect any of these fees.
If your account balance exceeds $40, a portion of this equivilant to 0.04ETH will be held in your ETH reserve to cover transaction fees resulting in cheaper transaction fees.
So long as your available account balance remains over $40 Dai, your ETH reserve will replenish automatically.
Your ETH reserve can be easily cashed out at anytime using the withdraw button in the transactions section of your account summary.
[x] Total Funds tooltip
Design for order form message
Bell & Toast message
In the 3rd paragraph of the tooltip "As long as your available account balance remains over $40 Dai, your ETH reserve will automatically be replenish."
need to add an ed onto replenish....should be replenished
Checked. Closing
|
gharchive/issue
| 2020-05-12T21:27:54 |
2025-04-01T04:54:43.776179
|
{
"authors": [
"Chwy5",
"matt-bullock"
],
"repo": "AugurProject/augur",
"url": "https://github.com/AugurProject/augur/issues/7745",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|