id (string, 4 to 10 chars) | text (string, 4 chars to 2.14M) | source (string, 2 classes) | created (timestamp[s], 2001-05-16 21:05:09 to 2025-01-01 03:38:30) | added (string date, 2025-04-01 04:05:38 to 2025-04-01 07:14:06) | metadata (dict)
---|---|---|---|---|---
1068450793
|
Remove id attribute from RouteChangeAnnouncement component
https://github.com/remix-run/remix/discussions/763
tl;dr: the id attribute is not referenced anywhere; it seems to have no purpose.
we no longer have that in the template, but thank you!
|
gharchive/pull-request
| 2021-12-01T14:07:07 |
2025-04-01T06:40:13.271255
|
{
"authors": [
"matmilbury",
"ryanflorence"
],
"repo": "remix-run/remix",
"url": "https://github.com/remix-run/remix/pull/826",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2575331728
|
Fix error when using modify() with pick, on a schema that doesn't define allOf
What this PR does
Fix an error that occurs if you pick fields using modify() on a schema that doesn't define allOf.
Why it happened / how it's fixed
The code accessed allOf.forEach(...) on the modified schema without checking that allOf exists; that access is now guarded.
How to reproduce and test
The following example no longer errors on this branch. The added test case demonstrates this.
const schema = {
// This schema does not use `allOf`
properties: {
age: {
title: 'Age',
type: 'integer',
},
quantity: {
title: 'Quantity',
type: 'integer',
},
},
'x-jsf-order': ['age', 'quantity'],
};
// TypeError reading properties of allOf
modify(schema, {
pick: 'age'
})
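The guard is the essence of the fix. A minimal sketch of the idea (illustration only, not the actual patch; forEachAllOf is a hypothetical helper name):
// Hypothetical helper illustrating the guard; not the library's internals.
function forEachAllOf(schema, fn) {
  // Before the fix, schema.allOf.forEach(fn) threw a TypeError when allOf was absent.
  if (Array.isArray(schema.allOf)) {
    schema.allOf.forEach(fn);
  }
}
Equivalently, (schema.allOf || []).forEach(...) inlines the same check.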
Thanks @tcrammond this looks good, do you have a corresponding Dragon MR?
Hi @ollyd, internal ticket here.
|
gharchive/pull-request
| 2024-10-09T09:22:57 |
2025-04-01T06:40:13.274301
|
{
"authors": [
"ollyd",
"tcrammond"
],
"repo": "remoteoss/json-schema-form",
"url": "https://github.com/remoteoss/json-schema-form/pull/93",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
247665976
|
Process 'command '/Users/giuliopettenuzzo/Library/Android/sdk/ndk-bundle/ndk-build'' finished with non-zero exit value 2
I have been stuck on this issue for one week. I'm not able to sort it out, please help me!
I followed all the steps in the guide like 10 times and I still have the same error.
On the web I found a lot of people who had this problem and I tried all their solutions, but it still doesn't work.
I have a MacBook Pro
Information:Gradle tasks [:app:generateDevelopDebugSources, :app:generateDevelopDebugAndroidTestSources, :app:prepareDevelopDebugUnitTestDependencies, :app:mockableAndroidJar, :app:compileDevelopDebugSources, :app:compileDevelopDebugAndroidTestSources, :app:compileDevelopDebugUnitTestSources, :tess-two:tess-two:generateDebugSources, :tess-two:tess-two:mockableAndroidJar, :tess-two:tess-two:prepareDebugUnitTestDependencies, :tess-two:tess-two:generateDebugAndroidTestSources, :tess-two:tess-two:compileDebugSources, :tess-two:tess-two:compileDebugUnitTestSources, :tess-two:tess-two:compileDebugAndroidTestSources]
Error:Execution failed for task ':tess-two:tess-two:ndkBuild'.
Process 'command '/Users/giuliopettenuzzo/Library/Android/sdk/ndk-bundle/ndk-build'' finished with non-zero exit value 2
in my case the path of ndk-build is like this:
ndkDir=/Users/giuliopettenuzzo/Library/Android/sdk/ndk-bundle
Running ndk-build in the tess-two jni directory, I get the following error:
Android NDK: android-8 is unsupported. Using minimum supported version android-14.
Android NDK: WARNING: APP_PLATFORM android-14 is higher than android:minSdkVersion 8 in /Users/giuliopettenuzzo/AndroidStudioProjects/textfairy/tess-two/tess-two/AndroidManifest.xml. NDK binaries will not be compatible with devices older than android-14. See https://android.googlesource.com/platform/ndk/+/master/docs/user/common_problems.md for more information.
[armeabi-v7a] Compile : lept <= adaptmap.c
make: /Users/giuliopettenuzzo/Library/Android/sdk/ndk-bundle/toolchains/arm-linux-androideabi-4.8/prebuilt/darwin-x86_64/bin/arm-linux-androideabi-gcc: Command not found
make: *** [/Users/giuliop
See https://github.com/renard314/textfairy/issues/138#issuecomment-287583567. Same fix should work for you too.
should be fixed now
|
gharchive/issue
| 2017-08-03T10:43:21 |
2025-04-01T06:40:13.305891
|
{
"authors": [
"GiulioPettenuzzo",
"alexcohn",
"renard314"
],
"repo": "renard314/textfairy",
"url": "https://github.com/renard314/textfairy/issues/156",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
2232927870
|
[BUG] Find daily notes & find week notes both show all the notes
Please confirm
[X] I am running the latest version of this plugin
Describe the bug
Both finding daily and weekly notes show all the notes that I have in my notes folder.
To Reproduce
Steps to reproduce the behavior:
Start nvim
:Telekasten find_daily_notes (or :Telekasten find_weekly_notes)
Telescope window shows all the notes
Expected behavior
I'd like to only see daily/weekly notes
Operating system (please complete the following information):
OS: Ubuntu 22.04
Do you have the daily/weekly folders set to a different directory than the home directory? As far as I can tell, the pickers launched by find_daily_notes/find_weekly_notes use the configured folder as the cwd for telescope.
@Tonitum No, I have everything lying flat within the '~/notes' folder. Should daily/weekly notes be in some specific place? When I create a daily note it also creates it within the '~/notes' folder.
Then it is normal.
Basically find_daily/weekly opens a picker in the configured dailies and weeklies directory. If you have not specified anything special, it will always return everything in your home.
Not sure if there is a fix for that to be honest.
If it is possible, we could ask ripgrep to only search in files with the appropriate filename convention rather than using the default find_command. But I am not sure if it is possible.
Yeah, I can take a crack at this!
Think this is resolved, PR is here: https://github.com/renerocksai/telekasten.nvim/pull/328
|
gharchive/issue
| 2024-04-09T08:47:12 |
2025-04-01T06:40:13.317515
|
{
"authors": [
"Tonitum",
"lambtho12",
"marad"
],
"repo": "renerocksai/telekasten.nvim",
"url": "https://github.com/renerocksai/telekasten.nvim/issues/321",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
616559239
|
Adds packageNameQuery to limit results on long feeds
<3
|
gharchive/pull-request
| 2020-05-12T10:52:21 |
2025-04-01T06:40:13.319056
|
{
"authors": [
"jessehouwing"
],
"repo": "renevanosnabrugge/vsts-promotepackage-task",
"url": "https://github.com/renevanosnabrugge/vsts-promotepackage-task/pull/55",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1249134231
|
Added support for Kubernetes 1.24.0
The code requires the kubernetes.io/master label in two places:
nodeSelectors
Tolerations
The master label is added right after kubeadm init and before installing the kURL addons. This logic applies only if the current Kubernetes version is >= 1.24.0.
Tested for ekc-operator
$ kubectl get nodes --show-labels=true
NAME STATUS ROLES AGE VERSION LABELS
jalaja-k8s-v1-24-with-control-plane-labels-1 Ready control-plane,master 26h v1.24.0 beta.kubernetes.io/arch=amd64,beta.kubernetes.io/os=linux,kubernetes.io/arch=amd64,kubernetes.io/hostname=jalaja-k8s-v1-24-with-control-plane-labels-1,kubernetes.io/os=linux,kurl.sh/cluster=true,node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.kubernetes.io/exclude-from-external-load-balancers=
What type of PR is this?
type::feature
What this PR does / why we need it:
Which issue(s) this PR fixes:
Fixes #
SC-48706
Does this PR introduce a user-facing change?
No
Added support for Kubernetes 1.24.0
Does this PR require documentation?
Yes
TODO
@emosbaugh added logic for joining masters.
this looks better. can you please test on k8s 1.19, 1.23 and 1.24
yeah, will let you know after testing on 1.19 and 1.23 and 1.24
Kurl: Tested on 1.19 (only with the master label) and 1.24. Going to test 1.23 on staging.
Kots: Built a ttl image and manually edited the Deployment. Then, in Cluster Node, clicked Add Node to trigger the code path.
Ekco: Built a ttl.sh image, manually edited the Deployment, and added code to make sure the new code path executes.
$ kubectl get nodes --show-labels=true -A
NAME STATUS ROLES AGE VERSION LABELS
jalaja-kurl-1-19 Ready master 58m v1.19.16 beta.kubernetes.io/arch=amd64,beta.kubernetes.io/os=linux,kubernetes.io/arch=amd64,kubernetes.io/hostname=jalaja-kurl-1-19,kubernetes.io/os=linux,kurl.sh/cluster=true,node-role.kubernetes.io/master=
jalaja-kurl-1-19-master-3 Ready master 9m v1.19.16 beta.kubernetes.io/arch=amd64,beta.kubernetes.io/os=linux,kubernetes.io/arch=amd64,kubernetes.io/hostname=jalaja-kurl-1-19-master-3,kubernetes.io/os=linux,kurl.sh/cluster=true,node-role.kubernetes.io/master=
$ kubeadm version
kubeadm version: &version.Info{Major:"1", Minor:"19", GitVersion:"v1.19.16", GitCommit:"e37e4ab4cc8dcda84f1344dda47a97bb1927d074", GitTreeState:"clean", BuildDate:"2021-10-27T16:24:44Z", GoVersion:"go1.15.15", Compiler:"gc", Platform:"linux/amd64"}
kubectl get nodes -A
NAME STATUS ROLES AGE VERSION
jalaja-kurl-1-19 Ready master 49m v1.19.16
jalaja-kurl-1-19-master-3 Ready master 23s v1.19.16
Instance: 35.193.60.111
|
gharchive/pull-request
| 2022-05-26T05:56:54 |
2025-04-01T06:40:13.456485
|
{
"authors": [
"jala-dx"
],
"repo": "replicatedhq/kURL",
"url": "https://github.com/replicatedhq/kURL/pull/2938",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2406650414
|
[protocolv2] Fix passing input as first message being backwards incompatible
Why
In v1.1 sometimes the first message is not init, but instead it's the input. Let's handle that.
What changed
If we get an init that doesn't match what we expect, we check whether the client is on v1.1; if so, we pass the first message as input and treat init as an empty object (which is what the first pass of requiring init for stream and upload assumed).
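A rough sketch of that fallback (a sketch only; every name here is an assumption for illustration, not River's actual internals):
// Hypothetical shapes; names are assumptions, not taken from the PR.
const isInitMessage = (msg) => msg != null && msg.type === 'init';
function resolveHandshake(firstMessage, clientProtocolVersion) {
  if (isInitMessage(firstMessage)) {
    return { init: firstMessage.payload, pendingInput: null };
  }
  if (clientProtocolVersion === 'v1.1') {
    // v1.1 clients may send the input first: treat init as an empty object
    // and replay the first message as the stream/upload input.
    return { init: {}, pendingInput: firstMessage };
  }
  throw new Error('expected an init message first');
}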
Versioning
[ ] Breaking protocol change
[ ] Breaking ts/js API change
Will link river-babel test PR shortly
test here https://github.com/replit/river-babel/pull/38/commits/814fe3be94f9d57850821a9d870315dbe8e00099
|
gharchive/pull-request
| 2024-07-13T02:55:40 |
2025-04-01T06:40:13.461914
|
{
"authors": [
"masad-frost"
],
"repo": "replit/river",
"url": "https://github.com/replit/river/pull/236",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
236621583
|
Update set method not working as expected in Kotlin
From the docs
int rows = data.update(Person.class)
.set(Person.ABOUT, "student")
.where(Person.AGE.lt(21)).get().value();
I would expect the Kotlin equivalent to work similarly
val rows = data.update(Person::class)
.set(Person::about, "student")
.where(Person::age lt 21).get().value()
But I get Type mismatch: inferred type is KMutableProperty1<Person, String> but Expression<String> was expected. I realize that it must be an Expression, but I'm unsure of the proper way of converting the property to an Expression.
This is with a KotlinReactiveEntityStore<Persistable> data store
Thanks for reporting. There was a bug in the set extension method: it didn't have the right return type.
Thanks for the quick fix!
Actually, a question for you. I pulled the latest fixes and was still experiencing the problem. Looking at the test suite I saw that the Person class is a data class, but I'm using an interface. I figured out that I could get it to work if I just used the generated PersonEntity rather than the Person interface.
data.update(PersonEntity::class).set(PersonEntity.ABOUT, "student")
Is that intentional, or should you be able to use the interface in the same fashion? It threw me off because I'm using the interface in every other query.
data.update(PersonEntity::class).set(findAttribute(Person::about), "student")
Is this what you're looking for?
@consp1racy That's helpful, thank you! I'm just curious if the API is working as intended. If I should expect the update function to behave the same when used with either an interface or data class, or if the workaround you've described is the intended way.
With the change you should be able to do:
data.update(Person::class).set(Person::about, "student")
I think the issue may be with my datastore which is a KotlinReactiveEntityStore<Persistable>(KotlinEntityDataStore(source.configuration)). Given the following model definition:
@Entity
data class Person constructor (
@get:Key
var id: Int,
var name: String,
var email: String,
var birthday: Date,
var age: Int,
var about: String,
@get:Column(unique = true)
val uuid: UUID,
val homepage: URL,
val picture: String
) : io.requery.Persistable
I get the following errors:
val rowCount = data.update(Person::class)
.set(Person::about, "nothing") // Type mismatch: inferred type is KMutableProperty1<Person, String> but Expression<String> was expected
.set(Person::age, 50) // Type mismatch: inferred type is KMutableProperty1<Person, String> but Expression<String> was expected
.where(Person::age.eq(100)).get().value()
If I don't make Person persistable, the compiler obviously complains, but the errors on those lines go away.
I am having the same issue using KotlinReactiveEntityStore
Cannot choose among the following candidates without completing type inference:
public open fun <E : Any> update(entity: KClass<TreeInfoEntity>): KClass<TreeInfoEntity> defined in io.requery.sql.KotlinEntityDataStore
public open fun <E : Any> update(type: KClass<TreeInfoEntity>): Update<Scalar<Int>> defined in io.requery.sql.KotlinEntityDataStore
when I call update with a class. I'm using KotlinEntityDataStore
I am having the same issue.
|
gharchive/issue
| 2017-06-16T23:33:56 |
2025-04-01T06:40:13.478917
|
{
"authors": [
"consp1racy",
"exitface",
"jeffzoch",
"ninjudd",
"npurushe",
"xxxifan"
],
"repo": "requery/requery",
"url": "https://github.com/requery/requery/issues/592",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
569310181
|
Warning Please fix
warning
"request-promise-native > request-promise-core@1.1.3" has unmet peer dependency "request@^2.34".
warning
" > request-promise-native@1.0.8" has unmet peer dependency "request@^2.34".
Have you tried adding the request dependency to your project?
Please read: https://nodejs.org/ru/blog/npm/peer-dependencies/
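For reference, a minimal sketch of that fix in package.json (version ranges taken from the warning messages above):
"dependencies": {
  "request": "^2.34",
  "request-promise-native": "^1.0.8"
}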
|
gharchive/issue
| 2020-02-22T09:42:30 |
2025-04-01T06:40:13.481462
|
{
"authors": [
"ArtashMardoyan",
"debugmaster"
],
"repo": "request/request-promise-native",
"url": "https://github.com/request/request-promise-native/issues/59",
"license": "isc",
"license_type": "permissive",
"license_source": "bigquery"
}
|
367111519
|
Using the response param from one endpoint in Request of another
Hello,
I want to pass the response received from one API as a request parameter in another API using the Request-Promise Node.js module. Can someone please help me with this? I am giving a brief of the sample code below:
var Sequence = {
    test1: function (param) {
        return request({
            "method": "POST",
            "uri": baseURL + "/v1/" + userID + "/test/info/",
            "json": true,
            "headers": {
                "Accept": "application/json",
            },
        }).then(function (result) {
            return result.pairingInfo; // I want to use this pairingInfo param in another request
        });
    },
    test2: function (param) {
        return request({
            "method": "POST",
            "uri": baseURL + "/v1/passenger/" + userID + "/test/test/",
            "json": true,
            "headers": {
                "Accept": "application/json",
            },
            "qs": {
                "pairingInfo": param, // This pairingInfo would come from the returned result.pairingInfo of the test1 API call
            },
        });
    }
};
function main(param) {
return sequence.test1(param).then(sequence.test2(param))
}
function main(param) {
return sequence.test1(param)
.then(function (pairingInfo) {
return sequence.test2(pairingInfo)
})
}
...should do the trick. Btw, if you want to understand why then read about the .then(...) function for promises in general.
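For what it's worth, the same flow reads more directly with async/await on newer Node versions; a sketch, assuming the Sequence object above:
async function main(param) {
  const pairingInfo = await Sequence.test1(param);
  return Sequence.test2(pairingInfo);
}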
|
gharchive/issue
| 2018-10-05T08:34:07 |
2025-04-01T06:40:13.484277
|
{
"authors": [
"Suvin1987",
"analog-nico"
],
"repo": "request/request-promise",
"url": "https://github.com/request/request-promise/issues/289",
"license": "isc",
"license_type": "permissive",
"license_source": "bigquery"
}
|
258183320
|
End to End TLS + SNI with proxy
Summary.
We are running TLS+SNI with fabio https://github.com/fabiolb/fabio, backed by a Node.js HTTPS server with self-signed certs. We have the application working with curl, but we can't get this working with requests. I tried the toolbelt as well as many other adapters I found on the internet. Fundamentally, what we need to have happen is: the client connects to the proxy on the given physical IP and PORT, then passes the logical host through so that the proxy can route the logical server name to the correct backend. Plz halp.
Expected Result
curl -v -X PUT --resolve "logical.server:10000:$IP" \
"https://logical.server:10000/some/uri" -d '{"value": "hello"}' \
-H "X-hmac: $(_hmac)" \
-H "Content-Type: application/json" \
-H "X-nonce: $(_nonce)"
Actual Result
With PyOpenSSL: (Caused by SSLError(SSLError("bad handshake: SysCallError(-1, 'Unexpected EOF')",),))
Without PyOpenSSL: (Caused by SSLError(SSLEOFError(8, u'EOF occurred in violation of protocol (_ssl.c:661)'),))
Reproduction Steps
However I can't find a reasonable solution in requests to mimic this curl behavior.
My first go was just following the toolbelt SNI example:
s = requests.Session()
s.mount("https://", HostHeaderSSLAdapter())
nonce = _nonce()
data="foo"
r = s.put("https://$IP:10000/some/uri", json={"value": "hello"}, headers={
"Host": "logical.server",
"X-hmac": _hmac(data, SECRET),
"Content-Type": "application/json",
"X-nonce": nonce}, verify=False)
I tried other strategies around forcing TLS and spoofing with transparent proxies: https://github.com/jakubroztocil/httpie/issues/422#issuecomment-236398663. None changed the behavior.
System Information
$ python --version
Python 2.7.13
$ openssl version
OpenSSL 1.0.2l 25 May 2017
OS: OSX 10.12.6 (also tried on linux 14.04)
Without pyOpenSSL
$ pip freeze
asn1crypto==0.22.0
certifi==2017.7.27.1
cffi==1.10.0
chardet==3.0.4
cryptography==2.0.3
enum34==1.1.6
idna==2.6
ipaddress==1.0.18
pycparser==2.18
requests==2.18.4
requests-toolbelt==0.8.0
six==1.10.0
urllib3==1.22
With pyOpenSSL
$ pip freeze
asn1crypto==0.22.0
certifi==2017.7.27.1
cffi==1.10.0
chardet==3.0.4
cryptography==2.0.3
enum34==1.1.6
idna==2.6
ipaddress==1.0.18
pycparser==2.18
pyOpenSSL==17.3.0
requests==2.18.4
requests-toolbelt==0.8.0
six==1.10.0
urllib3==1.22
Openssl connecting to Fabio
$ openssl s_client -connect $IP:$PORT -servername logical.server
CONNECTED(00000003)
depth=0 /C=US/ST=CA/L=San Francisco/O=PubNub/OU=Dev/CN=logical.server
verify error:num=18:self signed certificate
verify return:1
depth=0 /C=US/ST=CA/L=San Francisco/O=PubNub/OU=Dev/CN=logical.server
verify return:1
---
Certificate chain
0 s:/C=US/ST=CA/L=San Francisco/O=PubNub/OU=Dev/CN=logical.server.name
i:/C=US/ST=CA/L=San Francisco/O=PubNub/OU=Dev/CN=logical.server.name
---
Server certificate
-----BEGIN CERTIFICATE-----
MIIDozCCAougAwIBAgIJAM3s1shczk7eMA0GCSqGSIb3DQEBCwUAMGgxCzAJBgNV
BAYTAlVTMQswCQYDVQQIDAJDQTEWMBQGA1UEBwwNU2FuIEZyYW5jaXNjbzEPMA0G
A1UECgwGUHViTnViMQwwCgYDVQQLDANEZXYxFTATBgNVBAMMDHZhdWx0ci5sb2Nh
bDAeFw0xNzA5MTQyMjMzMzJaFw00NTAxMzAyMjMzMzJaMGgxCzAJBgNVBAYTAlVT
MQswCQYDVQQIDAJDQTEWMBQGA1UEBwwNU2FuIEZyYW5jaXNjbzEPMA0GA1UECgwG
UHViTnViMQwwCgYDVQQLDANEZXYxFTATBgNVBAMMDHZhdWx0ci5sb2NhbDCCASIw
DQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBAKPtkscVVph69UEOYneVm9zh3Ru7
DPXihS4+P6U2hHANTpX/FOxQIP+FC5ELG+xDG2caCXbb7oER9MFm+TkytzQqW4ur
lbngnOl5t3QPHfsMB645SCYKmjREOMSwgIx4zi2brBf5GKAu5hOpP84FPSYd06a5
LzIJBEybiFJVKH/j2Ig5ekgvw6F6pjDMIb035G1WMHl341rOyusfAAEiJeTW0c6b
6MF4GJL8WS8tpwUVAGnFvcub6O/uxoniiW6vn53Xs04+zFgecYtklcqFZeUHKLxe
EPFbTXTi0yWUcC4UC/E9iFGVgVzBZxWQf1Bj/R3o1b+irZiYSGnNeX5IQSsCAwEA
AaNQME4wHQYDVR0OBBYEFFL6Npq18TvCkwc8tazezOERo5M9MB8GA1UdIwQYMBaA
FFL6Npq18TvCkwc8tazezOERo5M9MAwGA1UdEwQFMAMBAf8wDQYJKoZIhvcNAQEL
BQADggEBAHtzLceffM1AWh/XXqidtb3fkLovpk9XAADoPetb2ARJBYG/BUJwmmiA
zEQ8IVxS5N3t+sdEC5RBKfqW05DDnW/0OY3eIHLUYqoyyvt6/WzJo08J6kKNLgIm
ukmOfNfs7i2R0YRbfsmq8Mze9Lk3G23Q44wOzx89vGX2mCYkeUznR9m28e+23wLX
aBYUix9XpWrj13R6pc1ljS2O9SqTWjsmi6zr/HhbdGd3QyVK+Q8g4DpJ2tWPzZR4
q7VsxEzJubKXGQb6d20k5DT4ehjQBAbxT4BSrqhyBABox4M/R2sQ4iDColXemXEI
YZl8zbJwpPZxV9mvstWY+NQDxCPe9yM=
-----END CERTIFICATE-----
subject=/C=US/ST=CA/L=San Francisco/O=PubNub/OU=Dev/CN=logical.server
issuer=/C=US/ST=CA/L=San Francisco/O=PubNub/OU=Dev/CN=logical.server
---
No client certificate CA names sent
---
SSL handshake has read 1299 bytes and written 447 bytes
---
New, TLSv1/SSLv3, Cipher is AES256-SHA
Server public key is 2048 bit
Secure Renegotiation IS supported
Compression: NONE
Expansion: NONE
SSL-Session:
Protocol : TLSv1
Cipher : AES256-SHA
Session-ID: DB0923D38B54871EE944C9A206A51B3C7C54BD5723EBB1DBF30D421AB742AD88
Session-ID-ctx:
Master-Key: D32CC66A48B71E577024A0947AC292CC32F374978F0156104D958A9AE24E9AA71B67AFCFD3D26F778139576E2CDF40EA
Key-Arg : None
TLS session ticket lifetime hint: 300 (seconds)
TLS session ticket:
0000 - c3 64 25 44 9c 28 85 bc-a3 80 1c 79 96 e3 aa 45 .d%D.(.....y...E
0010 - 2d dd 07 6d 8e 39 43 95-4c 36 dc 58 d6 ea 9e 2e -..m.9C.L6.X....
0020 - 60 10 f4 ba 83 84 e9 6a-29 25 d6 1e 5e 93 e7 15 `......j)%..^...
0030 - 95 74 03 e5 d1 37 99 8f-c6 2b 15 ab bc ee 99 e4 .t...7...+......
0040 - fd 89 a4 b4 57 d4 0a 0e-1e a6 86 22 ef 46 07 6a ....W......".F.j
0050 - 80 31 97 a7 6b 31 1f 45-f2 4e f7 63 65 e8 a0 d8 .1..k1.E.N.ce...
0060 - 26 3e 5d d6 fa 97 87 3d-f2 dd 25 82 ff f7 d0 95 &>]....=..%.....
0070 - d0 c9 f1 30 00 ba 71 e4-b4 1d 14 27 e9 ce 83 76 ...0..q....'...v
0080 - c2 1e 25 84 1c ad 5c 0d-9f 03 ac a4 6b b8 4b 84 ..%...\.....k.K.
0090 - 1d 38 f8 47 a4 01 c8 a9-79 55 1e 52 e2 0d 1d 42 .8.G....yU.R...B
00a0 - ce 19 e2 ed d3 98 05 87-ba 6a de 45 e3 c7 01 c4 .........j.E....
00b0 - 54 58 98 b0 40 c3 57 d2-96 08 30 41 ca b6 82 94 TX..@.W...0A....
00c0 - 00 3e 1e 36 da 7f f3 44-7d f7 24 08 36 52 eb 69 .>.6...D}.$.6R.i
Start Time: 1505515260
Timeout : 300 (sec)
Verify return code: 18 (self signed certificate)
The problem here is the IP address. When you set that IP address Requests is unable to correctly set the SNI header, which may well be confusing your server.
Unfortunately at this time Requests does not support anything like curl's --resolve. You'd need to edit the hosts file on your machine to get this to work the way you want.
@lukasa is it understood what changes are necessary to get this to work, or is it unknown and requires discovery?
The changes to the hosts file, or to Requests/urllib3?
The changes to requests or urllib3 to support resolve-like functionality
There is a semi-abandoned PR: shadow/urllib3#1209. That needs to be pursued.
Ok thanks! Will follow up if we have time.
|
gharchive/issue
| 2017-09-15T22:49:27 |
2025-04-01T06:40:13.493773
|
{
"authors": [
"Lukasa",
"jshaw86"
],
"repo": "requests/requests",
"url": "https://github.com/requests/requests/issues/4293",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1915715728
|
Update README.md with "multimodal data" messaging
What
Update intro text, code snippet, and screenshot
Checklist
[x] I have read and agree to Contributor Guide and the Code of Conduct
[x] I've included a screenshot or gif (if applicable)
[x] I have tested demo.rerun.io (if applicable)
PR Build Summary
Docs preview
Examples preview
Recent benchmark results
Wasm size tracking
Rendered
What about "About"? :zany_face:
|
gharchive/pull-request
| 2023-09-27T14:27:12 |
2025-04-01T06:40:13.499915
|
{
"authors": [
"emilk",
"nikolausWest",
"teh-cmc"
],
"repo": "rerun-io/rerun",
"url": "https://github.com/rerun-io/rerun/pull/3498",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2119492912
|
Allow SpaceViews to customize the behavior for how default visualizers are chosen
What
Builds on top of: https://github.com/rerun-io/rerun/pull/5047
We don't want to use Scalar as the indicator for LineSeries or PointSeries or else we would get both any time a user logs a Scalar. Instead we now delegate this choice to the space-view.
Additionally, entities are now always included in the query-results if they are visualizable, even if they aren't indicated. Without this there is no way to access the query-result in order to override its visualizers if necessary.
Checklist
[x] I have read and agree to Contributor Guide and the Code of Conduct
[x] I've included a screenshot or gif (if applicable)
[x] I have tested the web demo (if applicable):
Using newly built examples: app.rerun.io
Using examples from latest main build: app.rerun.io
Using full set of examples from nightly build: app.rerun.io
[x] The PR title and labels are set such as to maximize their usefulness for the next release's CHANGELOG
PR Build Summary
Docs preview
Examples preview
Recent benchmark results
Wasm size tracking
nice. if we'd now enable visualizer overrides we could solve
#5010
|
gharchive/pull-request
| 2024-02-05T21:32:34 |
2025-04-01T06:40:13.507163
|
{
"authors": [
"Wumpf",
"jleibs"
],
"repo": "rerun-io/rerun",
"url": "https://github.com/rerun-io/rerun/pull/5050",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
867048118
|
Hover different on ocaml files.
The hover command gives identical results for hover on doc comments in .res and .ml files.
However vscode shows markdown only for those from .res files.
Actually I was testing it wrong -- with the last published extension.
On master it works as expected.
|
gharchive/pull-request
| 2021-04-25T16:06:13 |
2025-04-01T06:40:13.510472
|
{
"authors": [
"cristianoc"
],
"repo": "rescript-lang/rescript-vscode",
"url": "https://github.com/rescript-lang/rescript-vscode/pull/140",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2405474151
|
Long Sort and Copy
Thanks for the sort added in https://github.com/research-ag/vector/pull/15 !
I have started using vectors as buffers in a number of places, even when I don't anticipate them getting too large. Maybe this is a mistake, as I saw that the performance for small collections is a bit better... but where they might get big I'm trying to use Vector.
@timohanke mentioned a long-running sort, which is really interesting to me for one particular use case that I have. This may be a different object altogether, or maybe there is a 'mode' to add to this.
Thought one: How hard would it be to add a "sort as you go" mode to Vector so that it is a SortedVector? Is this just a matter of sorting after each addition (it seems there might be some overhead to that)? Perhaps this is just StableHeapBTree and I should just stick with that, but I'd be interested in your thoughts.
Thought two: Long-running sort... If I do need to sort a huge vector for some reason at semi-regular intervals, it would be great to be able to do that. Use case: Once a month I want to sort all accounts by their balance and award them something (tokens, VP, give the top 10% special rights).
Thought three: Any thoughts on strategies to easily/quickly make copies of large vectors? This seems like it could be a long process as well, but I'm not sure if in Motoko there might be some way to efficiently make a 'quick copy' of a vector (or maybe some other object). I'm guessing that if the vector were static you might be able to use pointers to the original vector and then manipulate from there? I'm thinking of things like snapshots for voting for tokens where there may be millions of accounts.
Perhaps this isn't the right place for this issue and I'd be happy to move it to the forum, but it seems that you all have likely done the most work around these things and it seemed like the right place to put it for now.
Thanks for the sort added in https://github.com/research-ag/vector/pull/15 !
Now published as vector 0.4.0 on mops.
Long running sort means that execution terminates, other messages can reach the canister, and then at some point sorting continues in another execution. There are two ways depending on your needs. The first is that we block (via lock) the use of the vector until sorting is completed. That means new messages can reach the canister but the lock does not allow them to alter the vector. In this case we can choose from a wider selection of sorting algorithms. Some algorithms might still be excluded, for example recursion will not work across executions, but many others are allowed.
The second way is an algorithm that sorts the vector in multiple steps, a little bit at a time, and does not lock the vector between executions. If you sort by balances, for example, it could happen that a balance that has been "sorted" in the first step gets changed in between steps, and then the second step has to "sort" it again. This way likely requires an algorithm that performs better than average on pre-sorted arrays. Otherwise it may never terminate. If the algorithm performs better on pre-sorted arrays, then the hope is that with each step the remaining work will be reduced, even if the vector gets altered between steps.
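A toy sketch of that second approach in JavaScript (illustration only: the real implementation would be Motoko on the IC, and budget here merely stands in for the per-execution instruction limit):
// Incremental insertion sort: cheap on nearly-sorted data, so repeated
// passes converge even if a few elements change between executions.
function makeIncrementalSorter(arr, cmp) {
  let i = 1; // persistent cursor; in a canister this would live in stable state
  return function step(budget) {
    while (i < arr.length && budget-- > 0) {
      // Sink arr[i] into the sorted prefix arr[0..i-1].
      for (let j = i; j > 0 && cmp(arr[j - 1], arr[j]) > 0; j--) {
        [arr[j - 1], arr[j]] = [arr[j], arr[j - 1]];
      }
      i++;
    }
    return i >= arr.length; // true once this pass is complete
  };
}
// If entries were mutated mid-sort, reset the sorter and run another pass
// until a full pass completes without changes.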
Don't know if locking is acceptable for your application.
Re. copy. There already is a clone() function and it is as efficient as possible.
Re long sort. The benchmarks that you can see here show 7-9 billion instructions for sorting 1 million elements. So in 20B instructions, which I believe is the DTS limit, you can sort roughly around 2 million elements. That is true even for Array.sort. This means that long sort isn't anything that is Vector specific. It is needed for Array as well and should be implemented for Array first in my opinion.
|
gharchive/issue
| 2024-07-12T12:36:14 |
2025-04-01T06:40:13.517215
|
{
"authors": [
"skilesare",
"timohanke"
],
"repo": "research-ag/vector",
"url": "https://github.com/research-ag/vector/issues/16",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1867048048
|
Preview server does not support aliases
Describe the Bug
I use aliases in my TypeScript configuration in order to have simpler paths (@components/Test.tsx as an example). With the way the dev server currently works, I can't configure the tsconfig.json of the dev server, so it does not use the proper aliases and does not find my components.
error - ../src/emails/TestEmail.tsx:19:0
Module not found: Can't resolve '@components/Test'
If I update the tsconfig.json used by the dev server, it works. In export mode, no problem.
The current workaround is to not use the paths option, or to edit the tsconfig.json inside the .react-email folder, but that file is auto-generated.
I am available to help with this fix if necessary and if you wish.
Which package is affected (leave empty if unsure)
react-email
Link to the code that reproduces this issue
None
To Reproduce
Add the paths option in compilerOptions in tsconfig.json:
"paths": {
"@components/*": ["components/*"],
"@utils/*": ["utils/*"]
}
Expected Behavior
Dev server should work with aliases, like export mode.
What's your node version? (if relevant)
v20.5.1
I've implemented a workaround to temporarily address this issue by updating paths in the React Email TypeScript configuration file (.react-email/tsconfig.json) and have also added a script entry for automating this process. Here's the code:
// src/utils/updateTsConfigDevServer.mjs
import * as fs from 'fs';
// Read both the dev server's generated tsconfig and the project's own tsconfig.
const devServerTsConfig = JSON.parse(fs.readFileSync('.react-email/tsconfig.json').toString());
const currentTsConfig = JSON.parse(fs.readFileSync('tsconfig.json').toString());
// Prefix every alias target with ../src/ so it resolves from inside .react-email.
const normalizedPaths = Object.entries(currentTsConfig.compilerOptions.paths).reduce((prev, [key, paths]) => {
    return { ...prev, [key]: paths.map(path => `../src/${path}`) };
}, {});
devServerTsConfig.compilerOptions.paths = normalizedPaths;
fs.writeFileSync('.react-email/tsconfig.json', JSON.stringify(devServerTsConfig, null, 2));
In addition, I've included a script in the package.json file to automate the workaround:
"scripts": {
"dev": "node src/utils/updateTsConfigDevServer.mjs && email dev --dir src/emails"
}
Please note that this code may need modifications to suit a different project folder structure.
I'm available and willing to assist with resolving this issue further. I've noticed that React Email seems to be using a similar approach to my workaround, such as merging the package.json for the dev server; could this be applied to tsconfig.json? Please feel free to reach out if I can contribute to resolving the issue.
Any updates on this? It's a real hassle to have to update all path aliases to relative imports in order to preview and test emails. This also includes path aliases inside nested imports. The issue is happening on both the email dev command and the VSCode extension. Any workarounds or suggestions would be greatly appreciated.
I'd love to see a screenshot of how you're organizing the filesystem with aliases for email templates. Given the simplistic and encapsulated nature of email templates, it's unusual that a complex file tree would be needed.
I'd love to see a screenshot of how you're organizing the filesystem with aliases for email templates. Given the simplistic and encapsulated nature of email templates, it's unusual that a complex file tree would be needed.
Hi @shellscape,
Personally, I don't have a particularly complex structure for my emails. However, I'm in a monorepo and each workspace uses more or less the same rules, including aliasing.
If I didn't have to use aliases, it wouldn't be very complicated without them either, but I wouldn't expect a dev server to change this behavior (and therefore my configurations), at least without it being documented.
Here's an example of the structure I use for my emails:
.
└── @acme/email/
├── atoms/
│ └── Link.tsx
├── molecules/
│ └── Article.tsx
└── organisms/
└── ArticlesGrid.tsx
We've just upgraded React Email and aliases no longer work in export mode, just like the dev server. This seems to have been introduced by this PR https://github.com/resend/react-email/pull/1027 which forces the use of a tsconfig.
I'm still of the opinion that React Email should be more flexible about the tools used for build and development and give this responsibility to the user, while facilitating integration with existing stacks. Or at least make this part configurable?
For the moment, I've simply removed the aliases from my project and I'm no longer using my previous workaround, which no longer works.
@Gregory-Gerard
I'm still of the opinion that React Email should be more flexible about the tools used for build and development and give this responsibility to the user, while facilitating integration with existing stacks. Or at least make this part configurable?
That is our intention with the project. What other pain points do you find that make us hard to use on some stacks?
We've just upgraded React Email and aliases no longer work in export mode, just like the dev server. This seems to have been introduced by this PR https://github.com/resend/react-email/pull/1027 which forces the use of a tsconfig.
Sorry to hear that, I'll try getting a fix on this for you soon. For context, we had to use a specific tsconfig due to the esbuild JSX option getting overridden by the tsconfig's, which caused errors for users that had "jsx": "preserve". Just bumping esbuild should fix this for you. If you want, I can get you a quick patch for now, since I'm not sure I'll be able to get this fixed and released very soon.
@gabrielmfern
Thanks for your reply.
Mainly the pain points are with the opinionated build/dev tools, which are great for getting started quickly but obviously less flexible. Maybe we could set up something to be able to configure at least the config files (tsconfig and so on) used by export and build?
There's no real critical issue, I just removed the aliases from my project for now, but thanks for offering me a quick patch.
I just saw that @bukinoshita closed this issue, may I open another issue about export mode?
@Gregory-Gerard Yeah, please do that, after today I think I'll tackle that, if you want I can also get you a quick patch you can use for now.
@Gregory-Gerard Hey, I forgot that I had implemented this, but I think the 2.0.0 also includes a fix for this. Can you check it out?
@gabrielmfern
Hey, thanks for your patience and assistance. The preview server seems to work perfectly with aliases and launches with serious performance improvements. Congrats!
Unfortunately, it seems that export is broken for now. I keep getting this error:
~/projects/stackblitz-starters-bziz8k 3s
❯ email export
✔ Preparing files...
✖ Failed when rendering Test.js
TypeError: component.default is not a function
at eval (file:///home/projects/stackblitz-starters-bziz8k/node_modules/react-email/cli/index.js:777:75)
at step (file:///home/projects/stackblitz-starters-bziz8k/node_modules/react-email/cli/index.js:185:23)
at Object.eval [as next] (file:///home/projects/stackblitz-starters-bziz8k/node_modules/react-email/cli/index.js:126:20)
at asyncGeneratorStep (file:///home/projects/stackblitz-starters-bziz8k/node_modules/react-email/cli/index.js:11:28)
at _next (file:///home/projects/stackblitz-starters-bziz8k/node_modules/react-email/cli/index.js:29:17)
It does not seem to be my configuration because here is a minimal repro with just React Email installed: Minimal Repro
It appears to be related to this line: export.ts#65
By replacing component.default({}) with component.default.default({}), it works without any issues as before. I don't believe the problem is specifically with this line, but rather with the build method that may have changed.
Interesting, I think this might be an ESM issue since we build with CJS format. Maybe using require might fix this, will open a PR for it.
Hi. I can see it's been just 4 days. However, any progress on this?
I just initiated a new project inside turborepo using this guide and also took inspiration from this example and I am running into the TypeError: component.default is not a function issue as well.
The output is a bunch of javascript. I also found this already closed issue which basically has the same result.
We do have various different tsconfigs in our monorepo but don't use aliases within the packages folder that this project resides in.
Our base tsconfig is this:
{
"$schema": "https://json.schemastore.org/tsconfig",
"display": "Default",
"compilerOptions": {
"plugins": [
{ "name": "next" }
],
"target": "es2017",
"lib": ["dom", "dom.iterable", "esnext"],
"allowJs": true,
"checkJs": true,
"skipLibCheck": true,
"strict": true,
"strictNullChecks": true,
"forceConsistentCasingInFileNames": true,
"noEmit": true,
"esModuleInterop": true,
"module": "esnext",
"moduleResolution": "node",
"resolveJsonModule": true,
"isolatedModules": true,
"jsx": "preserve",
"incremental": true,
"noUncheckedIndexedAccess": true
},
"exclude": ["node_modules"]
}
However, I have doubts about this having an influence. I don't extend this base.
So far, I've tried this, but to no avail:
{
"$schema": "https://json.schemastore.org/tsconfig",
"compilerOptions": {
"jsx": "react",
"baseUrl": "."
},
"exclude": ["node_modules", "out"]
}
@tim3w4rp PR is open for this #1214
Hope it gets resolved soon.
Temporarily, I've just added my components (or my whole "src" folder) as a dependency in package.json using local paths:
{
"dependencies": {
"src": "file:./src"
}
}
so I'm able to import my files with an absolute path, like importing a package/library in node_modules:
import { MyComponentLego } from 'src/components'
import { SomeHelper } from 'src/helpers'
|
gharchive/issue
| 2023-08-25T13:20:45 |
2025-04-01T06:40:13.540491
|
{
"authors": [
"Gregory-Gerard",
"RakaDoank",
"gabrielmfern",
"shellscape",
"taylor-lindores-reeves",
"tim3w4rp"
],
"repo": "resendlabs/react-email",
"url": "https://github.com/resendlabs/react-email/issues/895",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
632079440
|
All examples core dump when running
This is built in an Ubuntu 18.04 Docker container, following all of the directions for adding requirements. Each example dies when trying to parse and build from its urdf file. I've tried using DART master as well, which doesn't help. Any ideas on what I could have missed that's causing this issue?
Compiled:
g++ -g -I/usr/include/eigen3 -o /tmp/hex src/examples/hexapod.cpp -l RobotDARTSimu -ldart
Run (gdb /tmp/hex):
Program received signal SIGSEGV, Segmentation fault.
__GI___libc_free (mem=0x3a1) at malloc.c:3103
3103 malloc.c: No such file or directory.
(gdb) bt
#0 __GI___libc_free (mem=0x3a1) at malloc.c:3103
#1 0x00007fa6fc740f5b in dart::common::EmbeddedStateAndPropertiesAspect<dart::dynamics::GenericJoint<dart::math::SE3Space>, dart::dynamics::detail::GenericJointState<dart::math::SE3Space>, dart::dynamics::detail::GenericJointUniqueProperties<dart::math::SE3Space> >* dart::common::SpecializedForAspect<dart::common::EmbeddedStateAndPropertiesAspect<dart::dynamics::GenericJoint<dart::math::SE3Space>, dart::dynamics::detail::GenericJointState<dart::math::SE3Space>, dart::dynamics::detail::GenericJointUniqueProperties<dart::math::SE3Space> > >::_createAspect<dart::dynamics::FreeJoint::Properties const&>(dart::common::SpecializedForAspect<dart::common::EmbeddedStateAndPropertiesAspect<dart::dynamics::GenericJoint<dart::math::SE3Space>, dart::dynamics::detail::GenericJointState<dart::math::SE3Space>, dart::dynamics::detail::GenericJointUniqueProperties<dart::math::SE3Space> > >::type<dart::common::EmbeddedStateAndPropertiesAspect<dart::dynamics::GenericJoint<dart::math::SE3Space>, dart::dynamics::detail::GenericJointState<dart::math::SE3Space>, dart::dynamics::detail::GenericJointUniqueProperties<dart::math::SE3Space> > >, dart::dynamics::FreeJoint::Properties const&) [clone .isra.593] () from /usr/local/lib/libdart.so.6.9
#2 0x00007fa6fc741f27 in dart::dynamics::FreeJoint::FreeJoint(dart::dynamics::FreeJoint::Properties const&) () from /usr/local/lib/libdart.so.6.9
#3 0x00007fa6fcc70ef5 in dart::dynamics::Skeleton::createJointAndBodyNodePair<dart::dynamics::FreeJoint, dart::dynamics::BodyNode> (this=0x55ca0103a9c0,
_parent=0x0, _jointProperties=..., _bodyProperties=...)
at /usr/local/include/dart/dynamics/detail/Skeleton.hpp:82
---Type <return> to continue, or q <return> to quit---
#4 0x00007fa6fb596b0b in dart::utils::DartLoader::modelInterfaceToSkeleton(urdf::ModelInterface const*, dart::common::Uri const&, std::shared_ptr<dart::common::ResourceRetriever> const&) () from /usr/local/lib/libdart-utils-urdf.so.6.9
#5 0x00007fa6fb5974ad in dart::utils::DartLoader::parseSkeleton(dart::common::Uri const&, std::shared_ptr<dart::common::ResourceRetriever> const&) ()
from /usr/local/lib/libdart-utils-urdf.so.6.9
#6 0x00007fa6fcc66e38 in robot_dart::Robot::_load_model (
this=this@entry=0x55ca0102cfe0, filename="res/models/pexod.urdf",
packages=std::vector of length 0, capacity 0,
is_urdf_string=is_urdf_string@entry=false)
at ../src/robot_dart/robot.cpp:1013
#7 0x00007fa6fcc6990b in robot_dart::Robot::Robot (this=0x55ca0102cfe0,
model_file="res/models/pexod.urdf",
packages=std::vector of length 0, capacity 0, robot_name=...,
is_urdf_string=<optimized out>, cast_shadows=<optimized out>,
damages=std::vector of length 0, capacity 0)
at ../src/robot_dart/robot.cpp:151
#8 0x00007fa6fcc69c95 in robot_dart::Robot::Robot (this=0x55ca0102cfe0,
model_file="res/models/pexod.urdf", robot_name="robot",
is_urdf_string=<optimized out>, cast_shadows=<optimized out>, damages=...)
at ../src/robot_dart/robot.cpp:157
#9 0x000055c9ff1d96c0 in __gnu_cxx::new_allocator<robot_dart::Robot>::construct<robot_dart::Robot, char const (&) [22]> (this=0x7ffeb510ad67,
---Type <return> to continue, or q <return> to quit---
__p=0x55ca0102cfe0) at /usr/include/c++/7/ext/new_allocator.h:136
#10 0x000055c9ff1d90d6 in std::allocator_traits<std::allocator<robot_dart::Robot> >::construct<robot_dart::Robot, char const (&) [22]> (__a=...,
__p=0x55ca0102cfe0) at /usr/include/c++/7/bits/alloc_traits.h:475
#11 0x000055c9ff1d8b5c in std::_Sp_counted_ptr_inplace<robot_dart::Robot, std::allocator<robot_dart::Robot>, (__gnu_cxx::_Lock_policy)2>::_Sp_counted_ptr_inplace<char const (&) [22]> (this=0x55ca0102cfd0, __a=...)
at /usr/include/c++/7/bits/shared_ptr_base.h:526
#12 0x000055c9ff1d84a5 in std::__shared_count<(__gnu_cxx::_Lock_policy)2>::__shared_count<robot_dart::Robot, std::allocator<robot_dart::Robot>, char const (&) [22]> (this=0x7ffeb510af28, __a=...)
at /usr/include/c++/7/bits/shared_ptr_base.h:637
#13 0x000055c9ff1d8014 in std::__shared_ptr<robot_dart::Robot, (__gnu_cxx::_Lock_policy)2>::__shared_ptr<std::allocator<robot_dart::Robot>, char const (&) [22]> (this=0x7ffeb510af20, __tag=..., __a=...)
at /usr/include/c++/7/bits/shared_ptr_base.h:1295
#14 0x000055c9ff1d78d3 in std::shared_ptr<robot_dart::Robot>::shared_ptr<std::allocator<robot_dart::Robot>, char const (&) [22]> (this=0x7ffeb510af20,
__tag=..., __a=...) at /usr/include/c++/7/bits/shared_ptr.h:344
#15 0x000055c9ff1d6d48 in std::allocate_shared<robot_dart::Robot, std::allocator<robot_dart::Robot>, char const (&) [22]> (__a=...)
at /usr/include/c++/7/bits/shared_ptr.h:691
#16 0x000055c9ff1d60cc in std::make_shared<robot_dart::Robot, char const (&) [2---Type <return> to continue, or q <return> to quit---
2]> () at /usr/include/c++/7/bits/shared_ptr.h:707
#17 0x000055c9ff1d515d in main () at src/examples/hexapod.cpp:11
(gdb)
Additional info (linking)
ldd /tmp/hex
linux-vdso.so.1 (0x00007fffa57eb000)
libRobotDARTSimu.so => /usr/local/lib/libRobotDARTSimu.so (0x00007f56b6152000)
libdart.so.6.9 => /usr/local/lib/libdart.so.6.9 (0x00007f56b5902000)
libstdc++.so.6 => /usr/lib/x86_64-linux-gnu/libstdc++.so.6 (0x00007f56b5579000)
libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x00007f56b5361000)
libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f56b4f70000)
libdart-utils.so.6.9 => /usr/local/lib/libdart-utils.so.6.9 (0x00007f56b4cba000)
libdart-utils-urdf.so.6.9 => /usr/local/lib/libdart-utils-urdf.so.6.9 (0x00007f56b4a89000)
libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007f56b486a000)
libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007f56b44cc000)
libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007f56b42c8000)
libdart-external-odelcpsolver.so.6.9 => /usr/local/lib/libdart-external-odelcpsolver.so.6.9 (0x00007f56b40ba000)
libboost_system.so.1.65.1 => /usr/lib/x86_64-linux-gnu/libboost_system.so.1.65.1 (0x00007f56b3eb5000)
libboost_filesystem.so.1.65.1 => /usr/lib/x86_64-linux-gnu/libboost_filesystem.so.1.65.1 (0x00007f56b3c9b000)
libboost_regex.so.1.65.1 => /usr/lib/x86_64-linux-gnu/libboost_regex.so.1.65.1 (0x00007f56b3993000)
libfcl.so.0.5 => /usr/lib/x86_64-linux-gnu/libfcl.so.0.5 (0x00007f56b2eab000)
libassimp.so.4 => /usr/lib/x86_64-linux-gnu/libassimp.so.4 (0x00007f56b24e0000)
liboctomap.so.1.8 => /usr/lib/liboctomap.so.1.8 (0x00007f56b229c000)
liboctomath.so.1.8 => /usr/lib/liboctomath.so.1.8 (0x00007f56b2096000)
/lib64/ld-linux-x86-64.so.2 (0x00007f56b65b8000)
libtinyxml2.so.6 => /usr/lib/x86_64-linux-gnu/libtinyxml2.so.6 (0x00007f56b1e82000)
liburdfdom_model.so.1.0 => /usr/lib/x86_64-linux-gnu/liburdfdom_model.so.1.0 (0x00007f56b1c60000)
libicui18n.so.60 => /usr/lib/x86_64-linux-gnu/libicui18n.so.60 (0x00007f56b17bf000)
libicuuc.so.60 => /usr/lib/x86_64-linux-gnu/libicuuc.so.60 (0x00007f56b1407000)
libccd.so.2 => /usr/lib/x86_64-linux-gnu/libccd.so.2 (0x00007f56b11fc000)
libz.so.1 => /lib/x86_64-linux-gnu/libz.so.1 (0x00007f56b0fdf000)
libminizip.so.1 => /usr/lib/x86_64-linux-gnu/libminizip.so.1 (0x00007f56b0dd4000)
librt.so.1 => /lib/x86_64-linux-gnu/librt.so.1 (0x00007f56b0bcc000)
libtinyxml.so.2.6.2 => /usr/lib/x86_64-linux-gnu/libtinyxml.so.2.6.2 (0x00007f56b09b7000)
libconsole_bridge.so.0.4 => /usr/lib/x86_64-linux-gnu/libconsole_bridge.so.0.4 (0x00007f56b07b2000)
libicudata.so.60 => /usr/lib/x86_64-linux-gnu/libicudata.so.60 (0x00007f56aec09000)
Ok, there's probably something missing somehow in a straight build. But after crawling around, I found the files in cmake/example and those worked. So I created an equivalent CMake file for the src/examples and so far all is well.
Closing this.
@danachee thank you for using our library. First of all, as we say in the README, the library is currently under heavy restructuring and many things (including the documentation/installation instructions) might not be up to date.
We will soon make a big v1.0 release with hopefully a proper documentation and surely many new features.
Compiled:
g++ -g -I/usr/include/eigen3 -o /tmp/hex src/examples/hexapod.cpp -l RobotDARTSimu -ldart
If you want to compile it by hand, you would need to add many more flags and link to more libraries. We strongly suggest using waf (our current favorite build system; example usages for finding robot_dart can be found here and here). If you prefer CMake, we recently provided support for this: when you do [sudo] ./waf install, a CMake config file will be installed, so that one can use CMake to find RobotDART along with its dependencies.
I guess you already discovered this, but just in case..
I realize it's in flux, and I might just need to wait, but it seems the python side is "missing"?? I set it up to build the bindings, and it did install the object:
install /usr/local/lib/python3/dist-packages/RobotDART.cpython-38-x86_64-linux-gnu.so (from build/RobotDART.cpython-38-x86_64-linux-gnu.so)
But it didn't install anything else, and I can't find any python modules. Are they missing, or do I need a different branch?
Thanks,
But it didn't install anything else, and I can't find any python modules
You do not need anything else. Most probably you have to change your PYTHONPATH (I guess by default it is not looking at /usr/local/): export PYTHONPATH=$PYTHONPATH:/usr/local/lib/python3/dist-packages/. Alternatively, you can do ./waf configure --prefix=/usr --python and then sudo ./waf install; this will install everything in /usr, which should be okay for the PYTHONPATH. I prefer the first way (install in /usr/local and alter the PYTHONPATH, as this keeps the system cleaner; you can also put the export in your bashrc/zshrc file so that you have it in every terminal by default).
You are correct! Thanks again! (It was a typo in my PYTHONPATH setting that was the issue.)
About waf. Is it something that would be used on a "outside" set of code (not stored in the robot_dart repo)?
So if one were to create their own simulation, do they need to copy the waf and waf_tools into their directory? And if so, what would they need to tell it? Feel free to tell me this will all be in the user guide for the new version and to be patient :-).
You are correct! Thanks again! (It was a typo in my PYTHONPATH setting that was the issue.)
Good!
About waf. Is it something that would be used on a "outside" set of code (not stored in the robot_dart repo)?
You do not need to use waf. We provide the installation of a CMake config for this reason: for people to use robot_dart within their CMake projects easily. So if you are familiar with CMake and using it, feel free to use the library in that way.
So if one were to create their own simulation, do they need to copy the waf and waf_tools into their directory? And if so, what would they need to tell it?
If you want to use waf for building your projects, there are many ways of doing this. I would suggest you to read the waf book where there is an extensive documentation of how to use waf in general.
Feel free to tell me this will all be in the user guide for the new version and to be patient :-).
We will make a small standalone example of how to use waf similar to our use-cases to find robot_dart and compile your projects, so you can be a bit patient until the documentation is finished. If you want to have a look now on possible use-cases, have a look at the whc library that depends on robot_dart and uses waf to get an idea.
|
gharchive/issue
| 2020-06-06T00:16:10 |
2025-04-01T06:40:13.556156
|
{
"authors": [
"costashatz",
"danachee"
],
"repo": "resibots/robot_dart",
"url": "https://github.com/resibots/robot_dart/issues/82",
"license": "BSD-2-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
711305685
|
webmanifest broken?
I tried to re-install Tileboard as an app after the recent restructure. But Firefox does not offer me the install option anymore, only a regular add to home screen. I guess something broke with the webmanifest since I created it...
Hmm... The add to home screen does what I expect. It's working on the iPad. Maybe the icons/menus have just changed in Firefox...
Works for me too on Chrome.
|
gharchive/issue
| 2020-09-29T17:16:10 |
2025-04-01T06:40:13.572073
|
{
"authors": [
"akloeckner",
"rchl"
],
"repo": "resoai/TileBoard",
"url": "https://github.com/resoai/TileBoard/issues/446",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
837247744
|
Vertical Slider Tile
Add Vertical Slider tile.
The tile doesn't require any input from the user; width and height will be calculated automatically based on tile size.
Best to use with tile height > 1.
Also, some other options are available to the user: sliderHeight and sliderWidth. If they are defined inside slider, they will be used first and take precedence over the automatic calculation.
Also, the tile supports an icon; whether an icon is defined is taken into consideration by the automatic calculation.
Example:
{
position: [0, 0],
height: 3,
id: 'light.entity',
type: TYPES.SLIDER_VERTICAL,
unit: '%',
title: 'Slider',
icon: 'mdi-lightbulb',
state: false,
filter: function (value) {
var num = parseFloat(value) / 2.55 ;
return num && !isNaN(num) ? num.toFixed() : 0;
},
value: '@attributes.brightness',
slider: {
max: 255,
min: 0,
step: 5,
field: 'brightness',
// sliderWidth: '60',
// sliderHeight: '270',
request: {
type: "call_service",
domain: "light",
service: "turn_on",
field: "brightness"
},
},
}
@rchl Can you please take a look at the PR?
Thanks in advance.
merged: https://github.com/resoai/TileBoard/pull/672
Thanks
|
gharchive/issue
| 2021-03-22T01:47:40 |
2025-04-01T06:40:13.575624
|
{
"authors": [
"timota"
],
"repo": "resoai/TileBoard",
"url": "https://github.com/resoai/TileBoard/issues/673",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
393977787
|
Reverb is always mono on any browser
Hello!
Reverb is mono, even with the audio demos that are provided in the SDK, like room models.
It's mono on Electron, Safari, Firefox, everywhere.
Any chance to get this fixed?
I'll look into this. We might be able to implement a stereo version of reverb, but that would only work if you don't want ambisonic output.
Reverb on mono sounds really wrong, that’s what I mean. I don’t think mono reverb is a good idea in any case.
I agree. We just don't want to implement something that will be 4x as expensive as the current one, and we'll need to implement decorrelation filters. I'll need to see how cheap we can make them in WebAudio.
This is a very uneducated guess, but wouldn't it be enough to simply use stereo noise for the reverb-tail white noise in the convolver, to at least give the impression? It might not be very accurate, but it would already sound twice as real, if not more, until a better solution can be found that computes fast enough. Maybe it could be ported from the Resonance C library and compiled to WebAssembly. Right now I always set my room materials to transparent, because the reverb actually makes it more confusing to tell where something comes from: it's limited to a mono center channel, and that throws off the perception. Adding a custom convolver with a real room impulse does, sadly, sound better at this point.
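For what it's worth, a minimal sketch of that idea against the plain Web Audio API (assuming a standard AudioContext; makeStereoReverbIR is an illustrative name, not anything from the Resonance SDK):
function makeStereoReverbIR(ctx: AudioContext, seconds = 2): AudioBuffer {
  const length = Math.floor(ctx.sampleRate * seconds);
  const ir = ctx.createBuffer(2, length, ctx.sampleRate); // two channels
  for (let ch = 0; ch < 2; ch++) {
    const data = ir.getChannelData(ch);
    for (let i = 0; i < length; i++) {
      // Independent noise per channel provides the (crude) decorrelation;
      // the cubic envelope shapes the decaying reverb tail.
      data[i] = (Math.random() * 2 - 1) * Math.pow(1 - i / length, 3);
    }
  }
  return ir;
}

const ctx = new AudioContext();
const convolver = ctx.createConvolver();
convolver.buffer = makeStereoReverbIR(ctx);
// Wire it up as: source -> convolver -> ctx.destination
It's no substitute for proper decorrelation filters, but it would at least avoid the mono center-channel collapse described above.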
|
gharchive/issue
| 2018-12-25T07:28:22 |
2025-04-01T06:40:13.592251
|
{
"authors": [
"Ghorthalon",
"drewbitllama",
"ogomez92"
],
"repo": "resonance-audio/resonance-audio-web-sdk",
"url": "https://github.com/resonance-audio/resonance-audio-web-sdk/issues/23",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
6414641
|
allowing before_execution_proc to be in the instance.
The idea is to allow Resource to have its own before_execution_proc in addition to the one that's provided by the class. Let me know what you think.
thank you.
Could it be merged? I'm looking for such functionality.
@ab any chance to merge this? It's been open for over a year, and we really want this functionality.
@ab any update on the status of this issue? do you need help to get it fixed?
Thanks! I ended up taking a slightly different approach here. You can set a :before_execution_proc as an option to any Request or Resource object.
|
gharchive/pull-request
| 2012-08-23T17:41:37 |
2025-04-01T06:40:13.619906
|
{
"authors": [
"ab",
"brbrr",
"fribeiro",
"oschreib",
"simon3z"
],
"repo": "rest-client/rest-client",
"url": "https://github.com/rest-client/rest-client/pull/137",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
530066901
|
"self-update --output" fails to create a new file on Windows
Output of restic version
restic 0.9.6 compiled with go1.13.4 on windows/amd64
How did you run restic exactly? (/Actual behavior/Steps to reproduce the behavior)
Got the following error when running restic self-update:
$ restic self-update --output restic-new.exe
CreateFile restic-new.exe: The system cannot find the file specified.
If I create an empty file named restic-new.exe on the current directory, the command runs successfully:
$ touch restic-new.exe
$ restic self-update --output restic-new.exe
writing restic to restic-new.exe
find latest release of restic at GitHub
restic is up to date
I was able to update from 0.9.4 by creating such empty file.
Expected behavior
Restic should check for a new version even when the output file doesn't exist.
Related Issues
On #2248 (self-update doesn't work on Windows), the suggested workaround is to use "--output".
Did restic help you today? Did it make you happy in any way?
It's the best backup solution I could find!
@guyshapiro Thanks for the report! Can you please update (edit) the initial post in this issue such that both examples (when it doesn't work and when it does work) contain the entire command and all the output from restic?
[I updated the report according to @rawtaz's request.]
This doesn't appear to be specific to Windows; it's also something I'm seeing on my Linux systems: restic self-update --output requires the specified path to be an existing file.
$ mktemp --directory
/tmp/tmp.N7OIm2e9MO
$ /opt/restic/0.9.6/restic self-update --output /tmp/tmp.N7OIm2e9MO/restic
lstat /tmp/tmp.N7OIm2e9MO/restic: no such file or directory
$ touch /tmp/tmp.N7OIm2e9MO/restic
$ /opt/restic/0.9.6/restic self-update --output /tmp/tmp.N7OIm2e9MO/restic
writing restic to /tmp/tmp.N7OIm2e9MO/restic
find latest release of restic at GitHub
latest version is 0.10.0
download SHA256SUMS
download SHA256SUMS.asc
GPG signature verification succeeded
download restic_0.10.0_linux_amd64.bz2
downloaded restic_0.10.0_linux_amd64.bz2
saved 18378752 bytes in /tmp/tmp.N7OIm2e9MO/restic
successfully updated restic to version 0.10.0
$
The newly released restic 0.10.0 shows a corresponding behavior.
$ mktemp --directory
/tmp/tmp.ABJ89sJRmw
$ /opt/restic/0.10.0/restic self-update --output /tmp/tmp.ABJ89sJRmw/restic
lstat /tmp/tmp.ABJ89sJRmw/restic: no such file or directory
$ touch /tmp/tmp.ABJ89sJRmw/restic
$ /opt/restic/0.10.0/restic self-update --output /tmp/tmp.ABJ89sJRmw/restic
writing restic to /tmp/tmp.ABJ89sJRmw/restic
find latest release of restic at GitHub
restic is up to date
$
Unless I'm missing something this definitely feels like a bug, just the right size for me to take a stab at.
just the right size for me to take a stab at
Go pher it ;-)
Ok, here we go! https://github.com/restic/restic/pull/2937
|
gharchive/issue
| 2019-11-28T20:23:55 |
2025-04-01T06:40:13.629761
|
{
"authors": [
"andreaso",
"guyshapiro",
"rawtaz"
],
"repo": "restic/restic",
"url": "https://github.com/restic/restic/issues/2491",
"license": "bsd-2-clause",
"license_type": "permissive",
"license_source": "bigquery"
}
|
154367197
|
fix toJSON behavior
Fix a long-standing bug with double-JSON-stringified errors.
When serializing to JSON, we now call toString(), and VError should return the full context of the error message. That's used as the 'message' when we have nested causes.
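Conceptually the shape is roughly this (a hedged TypeScript sketch, not the actual restify-errors source; VErrorLike and causeError are illustrative names):
class VErrorLike extends Error {
  constructor(message: string, private causeError?: Error) {
    super(message);
    this.name = 'VError';
  }
  toString(): string {
    // Full context, including nested causes.
    return this.causeError
      ? `${this.name}: ${this.message}; caused by ${this.causeError.toString()}`
      : `${this.name}: ${this.message}`;
  }
  toJSON(): object {
    // Returning a plain object keeps JSON.stringify() from
    // double-stringifying; toString() supplies the full message.
    return { name: this.name, message: this.toString() };
  }
}
With that, JSON.stringify(new VErrorLike('outer', new Error('inner'))) yields a single readable message instead of a nested, escaped one.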
Blocker for restify/node-restify#1084
Coverage increased (+0.05%) to 98.02% when pulling 4f9137f4250130280d3316c7949f2f03e4c2ce48 on fix-toJSON into 03804812958d260c1c945498350b2f3980f236d1 on master.
@yunong @micahr incorrect usage of JSON.stringify (including in tests) was causing some integration issues with restify 5.x. This fixes that, along with the tests.
|
gharchive/pull-request
| 2016-05-11T23:42:39 |
2025-04-01T06:40:13.632871
|
{
"authors": [
"DonutEspresso",
"coveralls"
],
"repo": "restify/errors",
"url": "https://github.com/restify/errors/pull/34",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
394805544
|
Fix #170 - set a 'dirty' flag in response to reconciliation
Issue: If a component calls setState, the UI does not get refreshed in response. This is very problematic for a UI, since the interesting interactivity will occur due to state transitions.
Defect: Our setShouldRenderCallback should be smarter - as pointed out in #170 by @OhadRau, there are additional cases we need to handle here.
Fix: The reason-reactify reconciler was intended to help with this. It exposes both onBeginReconcile and onEndReconcile events that are dispatched whenever a sub-tree is updated. For example, if you call setState(...) from some component, that event will be dispatched with the target node as the first argument. For now, we set a dirty flag to true, and clear that dirty flag on render.
There's a test case in reason-reactify that show how this functionality is intended to work:
https://github.com/revery-ui/reason-reactify/blob/cc0dbb453068f3693fc7fb692626c7d2f4ececcb/test/ContainerTest.re#L58
This PR addresses the particular case of updating the UI in response to nodes changing (which handles the style changing case, parent changing, an event happening, etc - as they all go through the reconciler).
Future work:
As called out in #170 - there may be other cases where we need to leverage such a flag, like when a window loses focus.
Incremental rendering - today, we always render the full screen. However, especially with something like #54 , we may be able to do much better in terms of just rendering a subtree.
I suspect there are still reconciler (reason-reactify) bugs we will run into - but that is good, it's expected we'll shake out the bugs in that space as we build out more real-world scenarios.
Fixes #170, but we may want to track some of the corollary work called out (i.e., re-rendering on window losing focus) in a separate issue.
Thanks for the review @OhadRau !
|
gharchive/pull-request
| 2018-12-29T15:51:31 |
2025-04-01T06:40:14.146692
|
{
"authors": [
"bryphe"
],
"repo": "revery-ui/revery",
"url": "https://github.com/revery-ui/revery/pull/171",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1856883982
|
Update requirements.md
Fixed a typo.
Thank you for your first contribution @jignesh-baldha !
|
gharchive/pull-request
| 2023-08-18T15:10:04 |
2025-04-01T06:40:14.195195
|
{
"authors": [
"janosmiko",
"jignesh-baldha"
],
"repo": "rewardenv/reward",
"url": "https://github.com/rewardenv/reward/pull/52",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1343414522
|
Update s3, sdk-core, sqs to 2.17.255
Updates
software.amazon.awssdk:s3
software.amazon.awssdk:sdk-core
software.amazon.awssdk:sqs
from 2.17.191 to 2.17.255.
I'll automatically update this PR to resolve conflicts as long as you don't change it yourself.
If you'd like to skip this version, you can just close this PR. If you have any feedback, just mention me in the comments below.
Configure Scala Steward for your repository with a .scala-steward.conf file.
Have a fantastic day writing Scala!
Adjust future updates
Add this to your .scala-steward.conf file to ignore future updates of this dependency:
updates.ignore = [ { groupId = "software.amazon.awssdk" } ]
Or, add this to slow down future updates of this dependency:
dependencyOverrides = [{
pullRequests = { frequency = "@monthly" },
dependency = { groupId = "software.amazon.awssdk" }
}]
labels: library-update, early-semver-patch, semver-spec-patch, commit-count:1
Superseded by #536.
|
gharchive/pull-request
| 2022-08-18T17:44:21 |
2025-04-01T06:40:14.198904
|
{
"authors": [
"scala-steward"
],
"repo": "rewards-network/pure-aws",
"url": "https://github.com/rewards-network/pure-aws/pull/535",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2754200177
|
Errors not propagated to the user
function connect() {
if (request) {
return
}
readyState = CONNECTING
controller = new AbortController()
request = fetch(url, getRequestOptions())
.then(onFetchResponse)
.catch((err: Error & {type: string}) => {
request = null
// We expect abort errors when the user manually calls `close()` - ignore those
if (err.name === 'AbortError' || err.type === 'aborted') {
return
}
scheduleReconnect()
})
}
https://github.com/rexxars/eventsource-client/blob/d8244e895b0c47bcb15fc4f8dd1f098ed42daa57/src/client.ts#L80C2-L99C4
My understanding, based on the code, is that when there is an error connecting, there is no way to know about it. For my use case, I'd like to display something to the user / have some logic to deal with errors like this.
I am planning on using this library - do you want a PR to add an onConnectionError or some other event?
Additionally, I'd like to update the reconnect logic to support some sort of back-off strategy; do you want that in the library also?
@lukebelbina Did you figure this out?
I ended up pulling it down locally and making a couple of changes, mainly:
function connect() {
if (request) {
return;
}
readyState = CONNECTING;
controller = new AbortController();
request = fetch(url, getRequestOptions())
.then(onFetchResponse)
.catch((err: Error & { type: string }) => {
request = null;
// We expect abort errors when the user manually calls `close()` - ignore those
if (err.name === 'AbortError' || err.type === 'aborted') {
return;
}
onConnectionError(err);
scheduleReconnect();
});
}
And throwing on non-200 statuses in onFetchResponse:
async function onFetchResponse(response: FetchLikeResponse) {
onConnect();
parser.reset();
const { body, redirected, status } = response;
// HTTP 204 means "close the connection, no more data will be sent"
if (status === 204) {
onDisconnect();
close();
return;
}
if (status >= 400) {
// todo better and more informative errors for the user
// todo parse error body for more info
throw new Error(`Unable to connect with HTTP status ${status}`);
}
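Calling code can then surface the failure, e.g. (a hypothetical usage sketch: onConnectionError only exists in my local patch, not in the published API, and showToast stands in for whatever UI feedback you use):
import { createEventSource } from 'eventsource-client';

declare function showToast(message: string): void; // hypothetical UI helper

const client = createEventSource({
  url: 'https://example.com/stream',
  onMessage: (message) => console.log('event:', message.data),
  // Only exists in my local patch.
  onConnectionError: (err: Error) => {
    showToast(`Stream connection failed: ${err.message}`);
  },
});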
I was waiting to hear back from the maintainer if it was worth doing a PR. I'll give it until after the holidays and if I don't hear anything back I'll fork / submit a PR.
Thank you
|
gharchive/issue
| 2024-12-21T16:17:02 |
2025-04-01T06:40:14.442313
|
{
"authors": [
"lukebelbina",
"punkpeye"
],
"repo": "rexxars/eventsource-client",
"url": "https://github.com/rexxars/eventsource-client/issues/8",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
303944264
|
Pass node properties to renderer
This allows us to have an abstract representation of custom nodes and to pass props to the render function, such as:
{
'type': 'linkbar',
'description': 'some text',
'image': 'image.jpg',
}
and render them as:
renderLinkbar = ({description, image}) =>
  <Linkbar description={description} image={image} />
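For reference, the full wiring might look roughly like this (a hypothetical sketch against the renderers/source props react-markdown had at the time; Linkbar is a stand-in component):
import * as React from 'react';
import ReactMarkdown from 'react-markdown';
import { Linkbar } from './Linkbar'; // stand-in component

const renderers = {
  // Keyed by the custom node's `type`; receives the node's properties.
  linkbar: ({ description, image }: { description: string; image: string }) => (
    <Linkbar description={description} image={image} />
  ),
};

export const Page = ({ markdown }: { markdown: string }) => (
  <ReactMarkdown source={markdown} renderers={renderers} />
);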
closing for now. Something is broken.
I'm going to need a test for this one
Closing this as #170 solved this with tests.
|
gharchive/pull-request
| 2018-03-09T18:55:17 |
2025-04-01T06:40:14.444820
|
{
"authors": [
"juangl",
"rexxars"
],
"repo": "rexxars/react-markdown",
"url": "https://github.com/rexxars/react-markdown/pull/152",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
2236874963
|
accept a path or bytes for Single and Parallel
let _ = Single::from_bytes(); // accepts a vector (or array) of bytes
let _ = Single::from_path(); // accepts a path leading to a .jpg image
let _ = Parallel::from_dir(); // accepts a directory containing .jpg images
let _ = Parallel::from_vec(); // accepts a vector of `Vec<u8>`'s or `&[u8]`'s
This allows for broader use of the crate.
It has been decided that this crate will not interact with the filesystem at all; it will only deal with image bytes.
closed in #23
|
gharchive/issue
| 2024-04-11T04:12:30 |
2025-04-01T06:40:14.449875
|
{
"authors": [
"rfdzan"
],
"repo": "rfdzan/jippigy",
"url": "https://github.com/rfdzan/jippigy/issues/21",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
871701040
|
Missing Copyright Info
Hi team,
I couldn't identify the copyright information, and I am not sure whether the copyright info in the License file is accurate. If it is not, would you mind providing the copyright information, maybe in the copyright notice file?
License was added on 45fbb83.
Thanks,
|
gharchive/issue
| 2021-04-30T00:09:55 |
2025-04-01T06:40:14.459978
|
{
"authors": [
"morningwinter",
"rfmoz"
],
"repo": "rfmoz/grafana-dashboards",
"url": "https://github.com/rfmoz/grafana-dashboards/issues/78",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
181421903
|
[noco] Support for playlist
[x] I've verified and I assure that I'm running youtube-dl 2016.10.02
[x] README/FAQ/search
[x] Feature request (request for a new functionality)
Main extractor contributor: @dstftw
I'd like to suggest adding support for Noco's playlists, to allow downloading a full series.
Example, with the "latest episodes" page:
http://noco.tv/emissions/64-4-148/olydri-studio/noob/derniers-episodes
{homepage} /emissions/ {playlist id} / {editor studio} / {series title slug} / {optional sorting filter; default: newest first}
Each episode is inside a <div class="item">, from which the first link goes like:
http://noco.tv/emission/17205/olydri-studio/noob/s06e03-accomplissement
and is already supported by the Noco extractor URL matching. However, only the 30 newest episodes are listed in the HTML code of the page; older episodes get loaded by "infinite scroll".
Sidenote: Noco also uses this URL format:
http://noco.tv/famille/148/olydri-studio/noob
{homepage} / famille / {series id} / {editor studio} / {series title slug}
This is an information page sorted by seasons, related content, most viewed episodes, suggestions... For the actual series, the 15 most recent episodes can be seen in the slide menu; older ones get loaded by "infinite slide".
Pinpointing all episodes from a specific series may be quite challenging :(
+1 for this
|
gharchive/issue
| 2016-10-06T13:56:18 |
2025-04-01T06:40:14.464269
|
{
"authors": [
"Alwaysin",
"mitsukarenai"
],
"repo": "rg3/youtube-dl",
"url": "https://github.com/rg3/youtube-dl/issues/10864",
"license": "Unlicense",
"license_type": "permissive",
"license_source": "github-api"
}
|
274232095
|
Hotstar and Voot livestreamer (i.e. h.exe and v.exe) not downloading video from Voot, as of today.
[x] I've verified and I assure that I'm running youtube-dl 2017.11.15
Before submitting an issue make sure you have:
[x] At least skimmed through the README, most notably the FAQ and BUGS sections
[x] Searched the bugtracker for similar issues including closed ones
What is the purpose of your issue?
[x] Bug report (encountered problems with youtube-dl)
[ ] Site support request (request for adding support for a new site)
[ ] Feature request (request for a new functionality)
[ ] Question
[ ] Other
Carefully read the new issue template and provide all requested information.
I was downloading videos from Voot smoothly using youtube-dl (v.exe), but today I'm not able to download a single file; I am getting this error:
C:\Hotstar and Voot Downloader>v "https://www.voot.com/playlist/bigg-boss-uncut-
scenes/333246/hina-misses-her-pooh/548208/?&trayLayout=playlistMedias"
[Voot] 548208: Downloading JSON metadata
[Kaltura] 0_layzbckh: Downloading video info JSON
[Kaltura] 0_layzbckh: Downloading m3u8 information
ERROR: unable to download video data: HTTP Error 404: Not Found
C:\Hotstar and Voot Downloader>
I have even updated to the current version.
|
gharchive/issue
| 2017-11-15T17:02:56 |
2025-04-01T06:40:14.480061
|
{
"authors": [
"dstftw",
"shank06"
],
"repo": "rg3/youtube-dl",
"url": "https://github.com/rg3/youtube-dl/issues/14757",
"license": "Unlicense",
"license_type": "permissive",
"license_source": "github-api"
}
|
296211606
|
Hey! I have a problem from today with Udemy paid courses. Yesterday everything worked well, but today I get this, from different URLs:
[debug] System config: []
[debug] User config: []
[debug] Custom config: []
[debug] Command-line args: ['-v', '-u', 'PRIVATE', '-p', 'PRIVATE', 'https://www.udemy.com/the-python-mega-course/learn/v4/t/lecture/5170346?start=0']
[debug] Encodings: locale cp1252, fs mbcs, out cp437, pref cp1252
[debug] youtube-dl version 2018.02.11
[debug] Python version 3.4.4 (CPython) - Windows-10-10.0.16299
[debug] exe versions: none
[debug] Proxy map: {}
[udemy] Downloading login popup
ERROR: Unable to download webpage: HTTP Error 403: Unauthorized (caused by HTTPError()); please report this issue on https://yt-dl.org/bug . Make sure you are using the latest version; type youtube-dl -U to update. Be sure to call youtube-dl with the --verbose flag and include its complete output.
File "C:\Users\dst\AppData\Roaming\Build archive\youtube-dl\rg3\tmpy028tx8p\build\youtube_dl\extractor\common.py", line 519, in _request_webpage
File "C:\Users\dst\AppData\Roaming\Build archive\youtube-dl\rg3\tmpy028tx8p\build\youtube_dl\YoutubeDL.py", line 2199, in urlopen
File "C:\Python\Python34\lib\urllib\request.py", line 470, in open
File "C:\Python\Python34\lib\urllib\request.py", line 580, in http_response
File "C:\Python\Python34\lib\urllib\request.py", line 508, in error
File "C:\Python\Python34\lib\urllib\request.py", line 442, in _call_chain
File "C:\Python\Python34\lib\urllib\request.py", line 588, in http_error_default
Me, too :(
Same here :(
|
gharchive/issue
| 2018-02-11T18:15:02 |
2025-04-01T06:40:14.486716
|
{
"authors": [
"Ashroyal",
"Slobodnjak",
"silverbret1709"
],
"repo": "rg3/youtube-dl",
"url": "https://github.com/rg3/youtube-dl/issues/15571",
"license": "Unlicense",
"license_type": "permissive",
"license_source": "github-api"
}
|
355772274
|
Where do I enter --cookies FILE command?
Here is a command that is not working because it's premium content and requires a cookie; the question is: where should I put the --cookies option?
youtube-dl.exe --ffmpeg-location C:\Users\anonimous\Desktop\ffmpeg\ffmpeg-20180404-2accdd3-win64-static\ffmpeg-20180404-2accdd3-win64-static\bin --hls-prefer-native https://cdn-video.pvgna.com/782/original-1487910128-1080p/1080p.m3u8
Same place you enter any other command. Note that passing cookies does not guarantee it will work.
@dstftw is the path this "C:\Users\anonimous\Desktop\ffmpeg\ffmpeg-20180404-2accdd3-win64-static\ffmpeg-20180404-2accdd3-win64-static\bin" or this "F:/twitch download/bin/cookies.txt"?
@dstftw if I put it in quotes, are spaces not a problem anymore? And did I put the --cookies option itself in a valid order?
@dstftw now this error
youtube-dl.exe --ffmpeg-location C:\Users\anonimous\Desktop\ffmpeg\ffmpeg-20180404-2accdd3-win64-static\ffmpeg-20180404-2accdd3-win64-static\bin --cookies "F:/twitch download/bin/cookies.txt" --hls-prefer-native https://cdn-video.pvgna.com/782/original-1487910128-1080p/1080p.m3u8
@dstftw now when I put the path in ' ' instead of " " I get this hardcore error. Do I need to update or something?
youtube-dl.exe --ffmpeg-location C:\Users\anonimous\Desktop\ffmpeg\ffmpeg-20180404-2accdd3-win64-static\ffmpeg-20180404-2accdd3-win64-static\bin --cookies 'F:/twitch download/bin/cookies.txt' --hls-prefer-native https://cdn-video.pvgna.com/782/original-1487910128-1080p/1080p.m3u8
Maybe you'll finally start reading my posts?
Note that passing cookies alone does not guarantee it will work in the first place.
|
gharchive/issue
| 2018-08-30T22:10:32 |
2025-04-01T06:40:14.491827
|
{
"authors": [
"dstftw",
"hasnogaems"
],
"repo": "rg3/youtube-dl",
"url": "https://github.com/rg3/youtube-dl/issues/17389",
"license": "Unlicense",
"license_type": "permissive",
"license_source": "github-api"
}
|
126565814
|
youtube-dl should display ffmpeg errors
Whenever ffmpeg gives out errors, youtube-dl should give that feedback to the user. As it is now, it only gives the last line:
[ffmpeg] Correcting container in "A.m4a"
ERROR: file:A.temp.m4a: Invalid argument
But it would be better if it gave the whole error, so that I would know that the error has to do with the mp4 codec:
Requested output format 'mp4' is not a suitable output format
file:A.temp.m4a: Invalid argument
-hide_banner -loglevel warning (or -loglevel error) can be passed to ffmpeg instead of silencing it completely, unless --verbose was given to youtube-dl. This way you'll get less verbose messages that can even be forwarded to the terminal directly.
|
gharchive/issue
| 2016-01-14T02:45:10 |
2025-04-01T06:40:14.493947
|
{
"authors": [
"fstirlitz",
"illumilore"
],
"repo": "rg3/youtube-dl",
"url": "https://github.com/rg3/youtube-dl/issues/8230",
"license": "Unlicense",
"license_type": "permissive",
"license_source": "github-api"
}
|
206205427
|
[Pornhub] Fix for Issue #12007
Before submitting a pull request make sure you have:
[x] At least skimmed through adding new extractor tutorial and youtube-dl coding conventions sections
[x] Searched the bugtracker for similar pull requests
In order to be accepted and merged into youtube-dl each piece of code must be in public domain or released under Unlicense. Check one of the following options:
[x] I am the original author of this code and I am willing to release it under Unlicense
[ ] I am not the original author of this code but it is in public domain or released under Unlicense (provide reliable evidence)
What is the purpose of your pull request?
[x] Bug fix
[ ] Improvement
[ ] New extractor
[ ] New feature
Description of your pull request and other information
Fixes the Pornhub extractor. Uses exec(), which is not the smartest solution... suggestions are welcome!
UPDATE: Does not use exec() anymore.
Fix for issue #12007 - I'm not comfortable with using exec() but it worked fine...
You should not use exec. Arbitrary code may be executed.
@yan12125 I'm aware of that. Any suggestions on how to solve the problem we're facing?
Pornhub first defines variables:
a = 'http://www'
b = 'pornhub.com/video.mp4'
And than the video url gets merged from the variables:
url = a + b
So I need to find some kind of way to execute this Javascript-Code - I don't think that writing a parser myself would be the solution. I hoped that youtube-dl had some "execute this javascript code"-Function.
There's a jsinterp module. See youtube.py for an example usage.
@yan12125 Fixed it, it's kind of simple parsing, so no need to use JSInterpreter.
All you need to do is request the video with the cookie "platform=tv"; then you get an unobfuscated video URL.
|
gharchive/pull-request
| 2017-02-08T13:58:46 |
2025-04-01T06:40:14.504873
|
{
"authors": [
"MikeRich88",
"ThomasChr",
"yan12125"
],
"repo": "rg3/youtube-dl",
"url": "https://github.com/rg3/youtube-dl/pull/12018",
"license": "Unlicense",
"license_type": "permissive",
"license_source": "github-api"
}
|
373556499
|
[discovery] Added foodnetwork and hgtv (fixes #17947)
Before submitting a pull request make sure you have:
[x] At least skimmed through adding new extractor tutorial and youtube-dl coding conventions sections
[x] Searched the bugtracker for similar pull requests
[x] Checked the code with flake8
In order to be accepted and merged into youtube-dl each piece of code must be in public domain or released under Unlicense. Check one of the following options:
[x] I am the original author of this code and I am willing to release it under Unlicense
[ ] I am not the original author of this code but it is in public domain or released under Unlicense (provide reliable evidence)
What is the purpose of your pull request?
[x] Bug fix
[ ] Improvement
[ ] New extractor
[ ] New feature
Description of your pull request and other information
foodnetwork and hgtv updated their sites to use the discovery format for listing content. Fixes #17947.
The checks are failing because of the HGTV test in scrippsnetworks.py.
Also, captions/subtitles aren't downloading because of some authentication issue with AWS. For example:
$ youtube-dl -v --skip-download --all-subs --cookies cookies.txt https://watch.foodnetwork.com/tv-shows/good-eats/full-episodes/a-bird-in-the-pan
[debug] System config: []
[debug] User config: []
[debug] Custom config: []
[debug] Command-line args: ['-v', '--skip-download', '--all-subs', '--cookies', 'cookies.txt', 'https://watch.foodnetwork.com/tv-shows/good-eats/full-episodes/a-bird-in-the-pan']
[debug] Encodings: locale UTF-8, fs utf-8, out UTF-8, pref UTF-8
[debug] youtube-dl version 2018.11.03
[debug] Python version 3.7.0 (CPython) - Linux-4.4.0-17134-Microsoft-x86_64-with-debian-buster-sid
[debug] exe versions: ffmpeg 3.4.4-0ubuntu0.18.04.1, ffprobe 3.4.4-0ubuntu0.18.04.1
[debug] Proxy map: {}
[Discovery] a-bird-in-the-pan: Downloading webpage
[Discovery] a-bird-in-the-pan: Downloading JSON metadata
[Discovery] a-bird-in-the-pan: Downloading m3u8 information
[debug] Default format spec: bestvideo+bestaudio/best
[info] Writing video subtitles to: A Bird in the Pan-5b3b73546b66d104f34fdb6a.en.scc
WARNING: Unable to download subtitle for "en": Unable to download webpage: HTTP Error 403: Forbidden (caused by <HTTPError 403: 'Forbidden'>); please report this issue on https://yt-dl.org/bug . Make sure you are using the latest version; see https://yt-dl.org/update on how to update. Be sure to call youtube-dl with the --verbose flag and include its complete output.
The captions url is https://s3.amazonaws.com/prod.sni.transcode/Discovery_Go/FOOD12517/FOOD12517_20180823152543.scc.
Might need to port the AWS code from the Scripps extractor too?
I manually added this patch and it does work on hgtv at least. Thank you!
Hey, how can I learn to manually add a patch? Thanks in advance. @tv21
I'd love to know too! If this is working for HGTV, I'd like to test it. I have a login for it and everything.
|
gharchive/pull-request
| 2018-10-24T15:57:47 |
2025-04-01T06:40:14.514580
|
{
"authors": [
"VietTPham",
"nerdily",
"zacklb"
],
"repo": "rg3/youtube-dl",
"url": "https://github.com/rg3/youtube-dl/pull/17959",
"license": "Unlicense",
"license_type": "permissive",
"license_source": "github-api"
}
|
1444765099
|
Cannot enable/disable controller from card
Checklist:
[X] I updated to the latest version available
[X] I cleared the cache of my browser
Release with the issue: Home Assistant 2022.11.2, Irrigation Unlimited 2022.11.0, Irrigation Unlimited Card 2022.10.0
Last working release (if known): Not sure, but I believe it was working a few months ago.
Browser and Operating System: Firefox 106.0.5, Linux (Endeavour OS). Issue also occurs from the Home Assistant iOS companion app.
Description of problem: Cannot "Enable" the irrigation controller from the card. When I try to enable the controller, the toggle button changes state and makes it look like it is enabled, but none of the schedules/timers update. Then if I refresh the page, the toggle button no longer indicates it is enabled. I can enable the controller by calling the Enable Service, and once I do this, the rest of the card will update with the schedules/timers. And vice versa if I then call the Disable Service. Redownloading the card through HACS has not helped.
I can confirm this happens even with Irrigation Unlimited and the Irrigation Unlimited Card set to 2023.6.1.
The whole menu is broken. The fix is in. Will release shortly.
Fixed in release 2023.6.2
Hi, how do I use OpenWeather with it?
Is your question related to the title of the thread? It would be more appropriate, I think, to post in the HA forum.
Hi, I installed Irrigation Unlimited and have most of it working, and also OpenWeather, but not with the irrigation.
|
gharchive/issue
| 2022-11-11T00:54:38 |
2025-04-01T06:40:14.524715
|
{
"authors": [
"Kolia56",
"abcben78",
"rgc99",
"twigonometry1"
],
"repo": "rgc99/irrigation-unlimited-card",
"url": "https://github.com/rgc99/irrigation-unlimited-card/issues/8",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2727508574
|
🛑 Sito Web is down
In f46a661, Sito Web (https://www.comune.preganziol.tv.it) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Sito Web is back up in a34ec3e after 6 minutes.
|
gharchive/issue
| 2024-12-09T15:49:17 |
2025-04-01T06:40:14.529491
|
{
"authors": [
"rglauco"
],
"repo": "rglauco/upptime",
"url": "https://github.com/rglauco/upptime/issues/1873",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1721722768
|
🛑 Cryptoagent is down
In df59216, Cryptoagent (https://cryptoagent.us) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Cryptoagent is back up in d9c0a8f.
|
gharchive/issue
| 2023-05-23T10:13:02 |
2025-04-01T06:40:14.538870
|
{
"authors": [
"rgstephens"
],
"repo": "rgstephens/upptime",
"url": "https://github.com/rgstephens/upptime/issues/337",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2550063879
|
OpenShift Login - inconsistent login error message
Describe the bug
When logging into OpenShift, if the user provides either an incorrect Username or Password, the error message references 'Login' instead of 'Username'.
To Reproduce
Login to OpenShift
Enter an incorrect Username or Password
Click Log in
Expected behavior
The error should be consistent with the UI which references 'Username' and not 'Login'
Screenshots
Not a Parasol workshop, but an OpenShift one. Which I had not noticed in the 10 years I've been using it!
https://issues.redhat.com/browse/OCPBUGS-43609
|
gharchive/issue
| 2024-09-26T09:38:47 |
2025-04-01T06:40:14.542601
|
{
"authors": [
"antowaddle",
"guimou"
],
"repo": "rh-aiservices-bu/parasol-insurance",
"url": "https://github.com/rh-aiservices-bu/parasol-insurance/issues/140",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1287251004
|
fix: Make quay.io use pull secret only
If the remote registry is quay, we don't have a user/pass since we only use the pull secret, so avoid using them.
Fixes #452
:tada: This PR is included in version 1.10.2 :tada:
The release is available on GitHub release
Your semantic-release bot :package::rocket:
|
gharchive/pull-request
| 2022-06-28T12:12:58 |
2025-04-01T06:40:14.544852
|
{
"authors": [
"iranzo"
],
"repo": "rh-ecosystem-edge/ztp-pipeline-relocatable",
"url": "https://github.com/rh-ecosystem-edge/ztp-pipeline-relocatable/pull/454",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2309792654
|
A stack overflow vulnerability was found
The vulnerability error information is as follows:
==272977==ERROR: AddressSanitizer: stack-overflow on address 0x7ffc96d13ac0 (pc 0x55f7a3fd45c5 bp 0x7ffc96d16880 sp 0x7ffc96d13ac0 T0)
#0 0x55f7a3fd45c5 in rhai::eval::stmt::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_stmt::h3f1d68ce37fc6e96 /src/rhai/src/eval/stmt.rs
#1 0x55f7a3fcefeb in rhai::eval::stmt::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_stmt_block::_$u7b$$u7b$closure$u7d$$u7d$::hc172eee184228ff8 /src/rhai/src/eval/stmt.rs:76:17
#2 0x55f7a3fce1d0 in core::iter::traits::iterator::Iterator::try_fold::h0a3b3286061c5141 /rustc/89e2160c4ca5808657ed55392620ed1dbbce78d1/library/core/src/iter/traits/iterator.rs:2462:21
#3 0x55f7a3fce1d0 in rhai::eval::stmt::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_stmt_block::h30e41a366e84e11a /src/rhai/src/eval/stmt.rs:69:9
#4 0x55f7a3fc0e2d in rhai::eval::expr::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_expr::h09fd9ccc181ed335 /src/rhai/src/eval/expr.rs:396:17
#5 0x55f7a3fbfd3c in rhai::eval::expr::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_expr::h09fd9ccc181ed335 /src/rhai/src/eval/expr.rs:261:37
#6 0x55f7a4043112 in rhai::func::call::_$LT$impl$u20$rhai..engine..Engine$GT$::get_arg_value::h40eeaa55050cd41e /src/rhai/src/func/call.rs:717:9
#7 0x55f7a40584f9 in rhai::func::call::_$LT$impl$u20$rhai..engine..Engine$GT$::make_function_call::h6ced0fe022154253 /src/rhai/src/func/call.rs:1377:25
#8 0x55f7a4077eb1 in rhai::func::call::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_fn_call_expr::h9eea1509380a08ec /src/rhai/src/func/call.rs:1902:9
#9 0x55f7a3fbe743 in rhai::eval::expr::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_expr::h09fd9ccc181ed335 /src/rhai/src/eval/expr.rs:246:17
#10 0x55f7a3fd5cb2 in rhai::eval::stmt::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_stmt::h3f1d68ce37fc6e96 /src/rhai/src/eval/stmt.rs:278:33
#11 0x55f7a3fcefeb in rhai::eval::stmt::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_stmt_block::_$u7b$$u7b$closure$u7d$$u7d$::hc172eee184228ff8 /src/rhai/src/eval/stmt.rs:76:17
#12 0x55f7a3fce1d0 in core::iter::traits::iterator::Iterator::try_fold::h0a3b3286061c5141 /rustc/89e2160c4ca5808657ed55392620ed1dbbce78d1/library/core/src/iter/traits/iterator.rs:2462:21
#13 0x55f7a3fce1d0 in rhai::eval::stmt::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_stmt_block::h30e41a366e84e11a /src/rhai/src/eval/stmt.rs:69:9
#14 0x55f7a4091c1a in rhai::func::script::_$LT$impl$u20$rhai..engine..Engine$GT$::call_script_fn::h4a86d7b916e31cb2 /src/rhai/src/func/script.rs:121:39
#15 0x55f7a4047619 in rhai::func::call::_$LT$impl$u20$rhai..engine..Engine$GT$::make_method_call::hc709b7863797ed38 /src/rhai/src/func/call.rs:976:25
#16 0x55f7a3f7e804 in rhai::eval::chaining::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_dot_index_chain_raw::h40ce123181b20b25 /src/rhai/src/eval/chaining.rs:877:25
#17 0x55f7a3f76bf3 in rhai::eval::chaining::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_dot_index_chain::_$u7b$$u7b$closure$u7d$$u7d$::ha0ae2b6cfdd7ea79 /src/rhai/src/eval/chaining.rs:513:25
#18 0x55f7a3f7271b in core::option::Option$LT$T$GT$::map_or_else::h6059df74e72b5602 /rustc/89e2160c4ca5808657ed55392620ed1dbbce78d1/library/core/src/option.rs:1172:24
#19 0x55f7a3f7271b in rhai::eval::chaining::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_dot_index_chain::h6ecf223a52f3fc5a /src/rhai/src/eval/chaining.rs:508:17
#20 0x55f7a3fbe989 in rhai::eval::expr::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_expr::h09fd9ccc181ed335 /src/rhai/src/eval/expr.rs:405:30
#21 0x55f7a3fd5cb2 in rhai::eval::stmt::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_stmt::h3f1d68ce37fc6e96 /src/rhai/src/eval/stmt.rs:278:33
#22 0x55f7a3fcefeb in rhai::eval::stmt::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_stmt_block::_$u7b$$u7b$closure$u7d$$u7d$::hc172eee184228ff8 /src/rhai/src/eval/stmt.rs:76:17
#23 0x55f7a3fce1d0 in core::iter::traits::iterator::Iterator::try_fold::h0a3b3286061c5141 /rustc/89e2160c4ca5808657ed55392620ed1dbbce78d1/library/core/src/iter/traits/iterator.rs:2462:21
#24 0x55f7a3fce1d0 in rhai::eval::stmt::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_stmt_block::h30e41a366e84e11a /src/rhai/src/eval/stmt.rs:69:9
#25 0x55f7a3fc0e2d in rhai::eval::expr::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_expr::h09fd9ccc181ed335 /src/rhai/src/eval/expr.rs:396:17
#26 0x55f7a3fbfd3c in rhai::eval::expr::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_expr::h09fd9ccc181ed335 /src/rhai/src/eval/expr.rs:261:37
#27 0x55f7a4043112 in rhai::func::call::_$LT$impl$u20$rhai..engine..Engine$GT$::get_arg_value::h40eeaa55050cd41e /src/rhai/src/func/call.rs:717:9
#28 0x55f7a40584f9 in rhai::func::call::_$LT$impl$u20$rhai..engine..Engine$GT$::make_function_call::h6ced0fe022154253 /src/rhai/src/func/call.rs:1377:25
#29 0x55f7a4077eb1 in rhai::func::call::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_fn_call_expr::h9eea1509380a08ec /src/rhai/src/func/call.rs:1902:9
#30 0x55f7a3fbe743 in rhai::eval::expr::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_expr::h09fd9ccc181ed335 /src/rhai/src/eval/expr.rs:246:17
#31 0x55f7a3fd5cb2 in rhai::eval::stmt::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_stmt::h3f1d68ce37fc6e96 /src/rhai/src/eval/stmt.rs:278:33
#32 0x55f7a3fcefeb in rhai::eval::stmt::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_stmt_block::_$u7b$$u7b$closure$u7d$$u7d$::hc172eee184228ff8 /src/rhai/src/eval/stmt.rs:76:17
#33 0x55f7a3fce1d0 in core::iter::traits::iterator::Iterator::try_fold::h0a3b3286061c5141 /rustc/89e2160c4ca5808657ed55392620ed1dbbce78d1/library/core/src/iter/traits/iterator.rs:2462:21
#34 0x55f7a3fce1d0 in rhai::eval::stmt::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_stmt_block::h30e41a366e84e11a /src/rhai/src/eval/stmt.rs:69:9
#35 0x55f7a4091c1a in rhai::func::script::_$LT$impl$u20$rhai..engine..Engine$GT$::call_script_fn::h4a86d7b916e31cb2 /src/rhai/src/func/script.rs:121:39
#36 0x55f7a4047619 in rhai::func::call::_$LT$impl$u20$rhai..engine..Engine$GT$::make_method_call::hc709b7863797ed38 /src/rhai/src/func/call.rs:976:25
#37 0x55f7a3f7e804 in rhai::eval::chaining::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_dot_index_chain_raw::h40ce123181b20b25 /src/rhai/src/eval/chaining.rs:877:25
#38 0x55f7a3f76bf3 in rhai::eval::chaining::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_dot_index_chain::_$u7b$$u7b$closure$u7d$$u7d$::ha0ae2b6cfdd7ea79 /src/rhai/src/eval/chaining.rs:513:25
#39 0x55f7a3f7271b in core::option::Option$LT$T$GT$::map_or_else::h6059df74e72b5602 /rustc/89e2160c4ca5808657ed55392620ed1dbbce78d1/library/core/src/option.rs:1172:24
#40 0x55f7a3f7271b in rhai::eval::chaining::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_dot_index_chain::h6ecf223a52f3fc5a /src/rhai/src/eval/chaining.rs:508:17
#41 0x55f7a3fbe989 in rhai::eval::expr::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_expr::h09fd9ccc181ed335 /src/rhai/src/eval/expr.rs:405:30
#42 0x55f7a3fd5cb2 in rhai::eval::stmt::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_stmt::h3f1d68ce37fc6e96 /src/rhai/src/eval/stmt.rs:278:33
#43 0x55f7a3fcefeb in rhai::eval::stmt::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_stmt_block::_$u7b$$u7b$closure$u7d$$u7d$::hc172eee184228ff8 /src/rhai/src/eval/stmt.rs:76:17
#44 0x55f7a3fce1d0 in core::iter::traits::iterator::Iterator::try_fold::h0a3b3286061c5141 /rustc/89e2160c4ca5808657ed55392620ed1dbbce78d1/library/core/src/iter/traits/iterator.rs:2462:21
#45 0x55f7a3fce1d0 in rhai::eval::stmt::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_stmt_block::h30e41a366e84e11a /src/rhai/src/eval/stmt.rs:69:9
#46 0x55f7a3fc0e2d in rhai::eval::expr::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_expr::h09fd9ccc181ed335 /src/rhai/src/eval/expr.rs:396:17
#47 0x55f7a3fbfd3c in rhai::eval::expr::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_expr::h09fd9ccc181ed335 /src/rhai/src/eval/expr.rs:261:37
#48 0x55f7a4043112 in rhai::func::call::_$LT$impl$u20$rhai..engine..Engine$GT$::get_arg_value::h40eeaa55050cd41e /src/rhai/src/func/call.rs:717:9
#49 0x55f7a40584f9 in rhai::func::call::_$LT$impl$u20$rhai..engine..Engine$GT$::make_function_call::h6ced0fe022154253 /src/rhai/src/func/call.rs:1377:25
#50 0x55f7a4077eb1 in rhai::func::call::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_fn_call_expr::h9eea1509380a08ec /src/rhai/src/func/call.rs:1902:9
#51 0x55f7a3fbe743 in rhai::eval::expr::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_expr::h09fd9ccc181ed335 /src/rhai/src/eval/expr.rs:246:17
#52 0x55f7a3fd5cb2 in rhai::eval::stmt::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_stmt::h3f1d68ce37fc6e96 /src/rhai/src/eval/stmt.rs:278:33
#53 0x55f7a3fcefeb in rhai::eval::stmt::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_stmt_block::_$u7b$$u7b$closure$u7d$$u7d$::hc172eee184228ff8 /src/rhai/src/eval/stmt.rs:76:17
#54 0x55f7a3fce1d0 in core::iter::traits::iterator::Iterator::try_fold::h0a3b3286061c5141 /rustc/89e2160c4ca5808657ed55392620ed1dbbce78d1/library/core/src/iter/traits/iterator.rs:2462:21
#55 0x55f7a3fce1d0 in rhai::eval::stmt::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_stmt_block::h30e41a366e84e11a /src/rhai/src/eval/stmt.rs:69:9
#56 0x55f7a4091c1a in rhai::func::script::_$LT$impl$u20$rhai..engine..Engine$GT$::call_script_fn::h4a86d7b916e31cb2 /src/rhai/src/func/script.rs:121:39
#57 0x55f7a4047619 in rhai::func::call::_$LT$impl$u20$rhai..engine..Engine$GT$::make_method_call::hc709b7863797ed38 /src/rhai/src/func/call.rs:976:25
#58 0x55f7a3f7e804 in rhai::eval::chaining::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_dot_index_chain_raw::h40ce123181b20b25 /src/rhai/src/eval/chaining.rs:877:25
#59 0x55f7a3f76bf3 in rhai::eval::chaining::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_dot_index_chain::_$u7b$$u7b$closure$u7d$$u7d$::ha0ae2b6cfdd7ea79 /src/rhai/src/eval/chaining.rs:513:25
#60 0x55f7a3f7271b in core::option::Option$LT$T$GT$::map_or_else::h6059df74e72b5602 /rustc/89e2160c4ca5808657ed55392620ed1dbbce78d1/library/core/src/option.rs:1172:24
#61 0x55f7a3f7271b in rhai::eval::chaining::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_dot_index_chain::h6ecf223a52f3fc5a /src/rhai/src/eval/chaining.rs:508:17
#62 0x55f7a3fbe989 in rhai::eval::expr::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_expr::h09fd9ccc181ed335 /src/rhai/src/eval/expr.rs:405:30
#63 0x55f7a3fd5cb2 in rhai::eval::stmt::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_stmt::h3f1d68ce37fc6e96 /src/rhai/src/eval/stmt.rs:278:33
#64 0x55f7a3fcefeb in rhai::eval::stmt::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_stmt_block::_$u7b$$u7b$closure$u7d$$u7d$::hc172eee184228ff8 /src/rhai/src/eval/stmt.rs:76:17
#65 0x55f7a3fce1d0 in core::iter::traits::iterator::Iterator::try_fold::h0a3b3286061c5141 /rustc/89e2160c4ca5808657ed55392620ed1dbbce78d1/library/core/src/iter/traits/iterator.rs:2462:21
#66 0x55f7a3fce1d0 in rhai::eval::stmt::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_stmt_block::h30e41a366e84e11a /src/rhai/src/eval/stmt.rs:69:9
#67 0x55f7a3fc0e2d in rhai::eval::expr::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_expr::h09fd9ccc181ed335 /src/rhai/src/eval/expr.rs:396:17
#68 0x55f7a3fbfd3c in rhai::eval::expr::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_expr::h09fd9ccc181ed335 /src/rhai/src/eval/expr.rs:261:37
#69 0x55f7a4043112 in rhai::func::call::_$LT$impl$u20$rhai..engine..Engine$GT$::get_arg_value::h40eeaa55050cd41e /src/rhai/src/func/call.rs:717:9
#70 0x55f7a40584f9 in rhai::func::call::_$LT$impl$u20$rhai..engine..Engine$GT$::make_function_call::h6ced0fe022154253 /src/rhai/src/func/call.rs:1377:25
#71 0x55f7a4077eb1 in rhai::func::call::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_fn_call_expr::h9eea1509380a08ec /src/rhai/src/func/call.rs:1902:9
#72 0x55f7a3fbe743 in rhai::eval::expr::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_expr::h09fd9ccc181ed335 /src/rhai/src/eval/expr.rs:246:17
#73 0x55f7a3fd5cb2 in rhai::eval::stmt::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_stmt::h3f1d68ce37fc6e96 /src/rhai/src/eval/stmt.rs:278:33
#74 0x55f7a3fcefeb in rhai::eval::stmt::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_stmt_block::_$u7b$$u7b$closure$u7d$$u7d$::hc172eee184228ff8 /src/rhai/src/eval/stmt.rs:76:17
#75 0x55f7a3fce1d0 in core::iter::traits::iterator::Iterator::try_fold::h0a3b3286061c5141 /rustc/89e2160c4ca5808657ed55392620ed1dbbce78d1/library/core/src/iter/traits/iterator.rs:2462:21
#76 0x55f7a3fce1d0 in rhai::eval::stmt::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_stmt_block::h30e41a366e84e11a /src/rhai/src/eval/stmt.rs:69:9
#77 0x55f7a4091c1a in rhai::func::script::_$LT$impl$u20$rhai..engine..Engine$GT$::call_script_fn::h4a86d7b916e31cb2 /src/rhai/src/func/script.rs:121:39
#78 0x55f7a4047619 in rhai::func::call::_$LT$impl$u20$rhai..engine..Engine$GT$::make_method_call::hc709b7863797ed38 /src/rhai/src/func/call.rs:976:25
#79 0x55f7a3f7e804 in rhai::eval::chaining::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_dot_index_chain_raw::h40ce123181b20b25 /src/rhai/src/eval/chaining.rs:877:25
#80 0x55f7a3f76bf3 in rhai::eval::chaining::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_dot_index_chain::_$u7b$$u7b$closure$u7d$$u7d$::ha0ae2b6cfdd7ea79 /src/rhai/src/eval/chaining.rs:513:25
#81 0x55f7a3f7271b in core::option::Option$LT$T$GT$::map_or_else::h6059df74e72b5602 /rustc/89e2160c4ca5808657ed55392620ed1dbbce78d1/library/core/src/option.rs:1172:24
#82 0x55f7a3f7271b in rhai::eval::chaining::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_dot_index_chain::h6ecf223a52f3fc5a /src/rhai/src/eval/chaining.rs:508:17
#83 0x55f7a3fbe989 in rhai::eval::expr::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_expr::h09fd9ccc181ed335 /src/rhai/src/eval/expr.rs:405:30
#84 0x55f7a3fd5cb2 in rhai::eval::stmt::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_stmt::h3f1d68ce37fc6e96 /src/rhai/src/eval/stmt.rs:278:33
#85 0x55f7a3fcefeb in rhai::eval::stmt::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_stmt_block::_$u7b$$u7b$closure$u7d$$u7d$::hc172eee184228ff8 /src/rhai/src/eval/stmt.rs:76:17
#86 0x55f7a3fce1d0 in core::iter::traits::iterator::Iterator::try_fold::h0a3b3286061c5141 /rustc/89e2160c4ca5808657ed55392620ed1dbbce78d1/library/core/src/iter/traits/iterator.rs:2462:21
#87 0x55f7a3fce1d0 in rhai::eval::stmt::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_stmt_block::h30e41a366e84e11a /src/rhai/src/eval/stmt.rs:69:9
#88 0x55f7a3fc0e2d in rhai::eval::expr::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_expr::h09fd9ccc181ed335 /src/rhai/src/eval/expr.rs:396:17
#89 0x55f7a3fbfd3c in rhai::eval::expr::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_expr::h09fd9ccc181ed335 /src/rhai/src/eval/expr.rs:261:37
#90 0x55f7a4043112 in rhai::func::call::_$LT$impl$u20$rhai..engine..Engine$GT$::get_arg_value::h40eeaa55050cd41e /src/rhai/src/func/call.rs:717:9
#91 0x55f7a40584f9 in rhai::func::call::_$LT$impl$u20$rhai..engine..Engine$GT$::make_function_call::h6ced0fe022154253 /src/rhai/src/func/call.rs:1377:25
#92 0x55f7a4077eb1 in rhai::func::call::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_fn_call_expr::h9eea1509380a08ec /src/rhai/src/func/call.rs:1902:9
#93 0x55f7a3fbe743 in rhai::eval::expr::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_expr::h09fd9ccc181ed335 /src/rhai/src/eval/expr.rs:246:17
#94 0x55f7a3fd5cb2 in rhai::eval::stmt::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_stmt::h3f1d68ce37fc6e96 /src/rhai/src/eval/stmt.rs:278:33
#95 0x55f7a3fcefeb in rhai::eval::stmt::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_stmt_block::_$u7b$$u7b$closure$u7d$$u7d$::hc172eee184228ff8 /src/rhai/src/eval/stmt.rs:76:17
#96 0x55f7a3fce1d0 in core::iter::traits::iterator::Iterator::try_fold::h0a3b3286061c5141 /rustc/89e2160c4ca5808657ed55392620ed1dbbce78d1/library/core/src/iter/traits/iterator.rs:2462:21
#97 0x55f7a3fce1d0 in rhai::eval::stmt::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_stmt_block::h30e41a366e84e11a /src/rhai/src/eval/stmt.rs:69:9
#98 0x55f7a4091c1a in rhai::func::script::_$LT$impl$u20$rhai..engine..Engine$GT$::call_script_fn::h4a86d7b916e31cb2 /src/rhai/src/func/script.rs:121:39
#99 0x55f7a4047619 in rhai::func::call::_$LT$impl$u20$rhai..engine..Engine$GT$::make_method_call::hc709b7863797ed38 /src/rhai/src/func/call.rs:976:25
#100 0x55f7a3f7e804 in rhai::eval::chaining::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_dot_index_chain_raw::h40ce123181b20b25 /src/rhai/src/eval/chaining.rs:877:25
#101 0x55f7a3f76bf3 in rhai::eval::chaining::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_dot_index_chain::_$u7b$$u7b$closure$u7d$$u7d$::ha0ae2b6cfdd7ea79 /src/rhai/src/eval/chaining.rs:513:25
#102 0x55f7a3f7271b in core::option::Option$LT$T$GT$::map_or_else::h6059df74e72b5602 /rustc/89e2160c4ca5808657ed55392620ed1dbbce78d1/library/core/src/option.rs:1172:24
#103 0x55f7a3f7271b in rhai::eval::chaining::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_dot_index_chain::h6ecf223a52f3fc5a /src/rhai/src/eval/chaining.rs:508:17
#104 0x55f7a3fbe989 in rhai::eval::expr::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_expr::h09fd9ccc181ed335 /src/rhai/src/eval/expr.rs:405:30
#105 0x55f7a3fd5cb2 in rhai::eval::stmt::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_stmt::h3f1d68ce37fc6e96 /src/rhai/src/eval/stmt.rs:278:33
#106 0x55f7a3fcefeb in rhai::eval::stmt::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_stmt_block::_$u7b$$u7b$closure$u7d$$u7d$::hc172eee184228ff8 /src/rhai/src/eval/stmt.rs:76:17
#107 0x55f7a3fce1d0 in core::iter::traits::iterator::Iterator::try_fold::h0a3b3286061c5141 /rustc/89e2160c4ca5808657ed55392620ed1dbbce78d1/library/core/src/iter/traits/iterator.rs:2462:21
#108 0x55f7a3fce1d0 in rhai::eval::stmt::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_stmt_block::h30e41a366e84e11a /src/rhai/src/eval/stmt.rs:69:9
#109 0x55f7a3fc0e2d in rhai::eval::expr::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_expr::h09fd9ccc181ed335 /src/rhai/src/eval/expr.rs:396:17
#110 0x55f7a3fbfd3c in rhai::eval::expr::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_expr::h09fd9ccc181ed335 /src/rhai/src/eval/expr.rs:261:37
#111 0x55f7a4043112 in rhai::func::call::_$LT$impl$u20$rhai..engine..Engine$GT$::get_arg_value::h40eeaa55050cd41e /src/rhai/src/func/call.rs:717:9
#112 0x55f7a40584f9 in rhai::func::call::_$LT$impl$u20$rhai..engine..Engine$GT$::make_function_call::h6ced0fe022154253 /src/rhai/src/func/call.rs:1377:25
#113 0x55f7a4077eb1 in rhai::func::call::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_fn_call_expr::h9eea1509380a08ec /src/rhai/src/func/call.rs:1902:9
#114 0x55f7a3fbe743 in rhai::eval::expr::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_expr::h09fd9ccc181ed335 /src/rhai/src/eval/expr.rs:246:17
#115 0x55f7a3fd5cb2 in rhai::eval::stmt::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_stmt::h3f1d68ce37fc6e96 /src/rhai/src/eval/stmt.rs:278:33
#116 0x55f7a3fcefeb in rhai::eval::stmt::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_stmt_block::_$u7b$$u7b$closure$u7d$$u7d$::hc172eee184228ff8 /src/rhai/src/eval/stmt.rs:76:17
#117 0x55f7a3fce1d0 in core::iter::traits::iterator::Iterator::try_fold::h0a3b3286061c5141 /rustc/89e2160c4ca5808657ed55392620ed1dbbce78d1/library/core/src/iter/traits/iterator.rs:2462:21
#118 0x55f7a3fce1d0 in rhai::eval::stmt::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_stmt_block::h30e41a366e84e11a /src/rhai/src/eval/stmt.rs:69:9
#119 0x55f7a4091c1a in rhai::func::script::_$LT$impl$u20$rhai..engine..Engine$GT$::call_script_fn::h4a86d7b916e31cb2 /src/rhai/src/func/script.rs:121:39
[frames #120 through #287 repeat the recursion cycle above (make_method_call, eval_dot_index_chain, eval_expr, eval_stmt_block, get_arg_value, make_function_call, eval_fn_call_expr, call_script_fn) eight more times, verbatim except for the frame numbers]
SUMMARY: AddressSanitizer: stack-overflow /src/rhai/src/eval/stmt.rs in rhai::eval::stmt::_$LT$impl$u20$rhai..engine..Engine$GT$::eval_stmt::h3f1d68ce37fc6e96
==272977==ABORTING
This vulnerability was found when using OSS-Fuzz to test the scripting fuzzer; the crash sample is attached.
crash-c70466c551d3cea97000681f88369f27b3cfff54.zip
Yes, this is a bug. Thanks for catching this.
Please test the latest drop https://github.com/rhaiscript/rhai/pull/881 and see if it fixes the stack overflow.
I recompiled the latest project and then ran it against crash-c70466c551d3cea97000681f88369f27b3cfff54; the fuzzer reported no error, so the issue appears to be fixed.
|
gharchive/issue
| 2024-05-22T07:30:55 |
2025-04-01T06:40:14.565272
|
{
"authors": [
"MageWeiG",
"schungx"
],
"repo": "rhaiscript/rhai",
"url": "https://github.com/rhaiscript/rhai/issues/880",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2547405256
|
Internal panic in the engine
Hello rhai team,
We introduced the ability to edit documents with a (Rhai) function in Meilisearch a few months ago, and a user reported an internal panic in rhai.
I didn’t have the time to try to reproduce the issue on my side yet, but I thought it could be nice for you to get the error with the line number in case it’s trivial to fix:
2024-09-25T07:11:29.563769Z ERROR meilisearch: info=panicked at /usr/local/cargo/registry/src/index.crates.io-6f17d22bba15001f/rhai-1.19.0/src/engine.rs:347:58: called `Option::unwrap()` on a `None` value
And here’s the original issue with the function if that helps: https://github.com/meilisearch/meilisearch/issues/4956
Thanks for the awesome lib 🎉
This seems to happen during strings-cache contention. I can see that, under sync, there is a possibility that the strings cache is still locked by another thread while somebody wants to write to it.
Can you check the latest drop to see if it happens again? I just pushed a fix.
Hey, after a lot of tests from ourselves and our users we didn’t reproduce this bug or anything similar, thanks for the patch!
Do you think you could make a new patch release this week so we’re not relying on the git repo + a rev?
I encountered the same problem.
It points to this function:
pub fn get_interned_string(
&self,
string: impl AsRef<str> + Into<ImmutableString>,
) -> ImmutableString {
match self.interned_strings {
Some(ref interner) => locked_write(interner).unwrap().get(string),
None => string.into(),
}
}
The locked_write(interner) call returns None.
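A minimal sketch of the kind of non-panicking fallback this suggests, mirroring the snippet above (assuming locked_write returns an Option guard as shown; this is an illustration, not necessarily the actual fix):
pub fn get_interned_string(
    &self,
    string: impl AsRef<str> + Into<ImmutableString>,
) -> ImmutableString {
    match self.interned_strings {
        // If the interner lock is held elsewhere, skip interning for
        // this call instead of unwrapping a None guard and panicking.
        Some(ref interner) => match locked_write(interner) {
            Some(mut cache) => cache.get(string),
            None => string.into(),
        },
        None => string.into(),
    }
}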
Please kindly check out the latest drop to see if it solves your problem.
I'm planning to make a new release soonish if everything is stable.
@irevoire hope this is now resolved?
Hey, we've not heard of this issue again.
I cannot say for sure it's 100% solved since it's not our most used feature. But it's definitely probably solved ahah
Great! So closing this for now.
|
gharchive/issue
| 2024-09-25T09:03:37 |
2025-04-01T06:40:14.571815
|
{
"authors": [
"irevoire",
"longzou",
"schungx"
],
"repo": "rhaiscript/rhai",
"url": "https://github.com/rhaiscript/rhai/issues/916",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
185812451
|
ev3dev-lang-python, rpyc and PIL
Hi again! This is not an issue. I am working with rpyc. I need big fonts for the LCD screen!
I got a beautiful example in ev3python (thanks @ndward!)
#!/usr/bin/env python3
from ev3dev.ev3 import *
from PIL import Image, ImageDraw, ImageFont
from time import sleep
lcd = Screen()
f = ImageFont.truetype('/usr/share/fonts/truetype/msttcorefonts/Arial.ttf', 75)
lcd.draw.text((3,0), 'Hello', font=f)
lcd.draw.text((2,55), 'world', font=f)
lcd.update()
sleep(7) # if run from Brickman, need time to see displayed image
The problem is ImageFont. How can I import the remote PIL module (and Image, ImageDraw, ImageFont)? Is it possible? Any example? Any workaround?
DJuego
P.S.: I am seriously considering using ev3dev and ev3dev-lang-python for robotics teaching.
Interactive programming is so appealing... I only wish to get things done as simply as I can.
If you are using the latest development version of ev3dev-lang-python (soon to be released as 0.7.1), then you can use bitmap fonts distributed with the library (see #225 and http://python-ev3dev.readthedocs.io/en/latest/other.html#bitmap-fonts for more details). The following works for me:
import rpyc
conn = rpyc.classic.connect('ev3')
ev3 = conn.modules['ev3dev.ev3']
fonts = conn.modules['ev3dev.fonts']
s = ev3.Screen()
s.draw.text((10,10), 'Hello World', font = fonts.load('luBS14'))
s.update()
If you are using the latest released version (0.7.0), you may still use @ndward's example you referenced. The only change is that you need to import PIL.ImageFont remotely:
ImageFont = conn.modules['PIL.ImageFont']
f = ImageFont.truetype('/usr/share/fonts/truetype/msttcorefonts/Arial.ttf', 75) # EV3 path
s.draw.text((10,10), 'Hello World', font=f)
s.update()
Well, I am overwhelmed. :-) I easily prefer the 0.7.1 option. However, I don't know how to upgrade ev3dev or the Spyder editor to the latest development version. I am an absolute newbie in Python.
Anyway, I have an alternative. :-P
Thank you again, ddemidov for your swift and clear answers. I will be on the watch for the 0.7.1 release.
DJuego
However, I don't know how to upgrade ev3dev to the latest development version
Here is what I do to get the latest version installed (on EV3):
sudo apt-get remove python3-ev3dev
git clone https://github.com/rhempel/ev3dev-lang-python
cd ev3dev-lang-python
python3 setup.py bdist
sudo python3 setup.py install --force
O.o Amazing! I will consider upgrading. In my opinion, this info is worthy of a place in the README.
DJuego
For beginners, in general I'd recommend sticking to the official releases. This is mainly because it ensures that you can share your code with others and have it work properly. It's also helpful because the standard upgrade instructions will only work on your installation if it's based on the official package.
@ddemidov Is there anything that you think is blocking a release (#233)?
@WasabiFan, I'd like to do #230. This looks easy enough, just need to update my kernel :smile_cat:
Also, I am not yet sure about the best way to fix #234. Other than that, 0.7.1 seems ready to be released.
|
gharchive/issue
| 2016-10-28T00:27:05 |
2025-04-01T06:40:14.588700
|
{
"authors": [
"DJuego",
"WasabiFan",
"ddemidov"
],
"repo": "rhempel/ev3dev-lang-python",
"url": "https://github.com/rhempel/ev3dev-lang-python/issues/240",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
155132278
|
Only add line spacing when there are 1 or more lines.
If numberOfLines == 0 and lineSpacing != 0, the dimensions calculated in collectionViewContentSize were incorrect. numberOfLines - 1 would wrap around to a giant int which was causing size.height to become extremely large in my case. The grid layout partially broke down and autolayout complained with this log message:
This NSLayoutConstraint is being configured with a constant that exceeds internal limits. A smaller value will be substituted, but this problem should be fixed. Break on void _NSLayoutConstraintNumberExceedsLimit() to debug. This will be logged only once. This may break in the future.
I was wondering if the guard checks should actually be self.numberOfLines > 1 since line spacing is really only needed if there are 2 or more lines but I didn't want to change any pre-existing behaviour other users may rely on.
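A minimal sketch of the guarded computation (Swift, with hypothetical names; the actual project code differs):
import CoreGraphics

func contentHeight(numberOfLines: UInt, lineHeight: CGFloat, lineSpacing: CGFloat) -> CGFloat {
    // With numberOfLines == 0, (numberOfLines - 1) on an unsigned type
    // would wrap around to a huge value, so only add spacing when lines exist.
    let totalLineSpacing = numberOfLines > 0 ? CGFloat(numberOfLines - 1) * lineSpacing : 0
    return CGFloat(numberOfLines) * lineHeight + totalLineSpacing
}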
Thanks for this fix!
Added tests for this case and merged in.
Cheers,
Rich
|
gharchive/pull-request
| 2016-05-16T22:00:20 |
2025-04-01T06:40:14.600934
|
{
"authors": [
"rhodgkins",
"robmaceachern"
],
"repo": "rhodgkins/RDHCollectionViewGridLayout",
"url": "https://github.com/rhodgkins/RDHCollectionViewGridLayout/pull/8",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
44957272
|
provide a way to specify health checks / availability checks / happy page checks to a JVM with jolokia?
So whenever possible I include jolokia in all JVMs (and docker images with JVMs inside) as a standard REST way to do monitoring and to use the nice hawtio console etc. (I tend to use the java agent approach to avoid modifying the classpath or container in any way)
It'd be awesome if we could auto-generate or manually specify health checks or 'availability checks' that the jolokia agent could use, e.g. that when a JVM is running, certain MBeans should exist, or that certain MBeans should have certain attribute values. I'm thinking of something along the lines of configuring the jolokia agent to point at a folder which contains a bunch of, say, JSON files which contain 'health checks / availability checks'.
Then if this is enabled, jolokia would add some API to do the checks. (It could just be a jolokia MBean that gets enabled, really, so we use the standard jolokia API to query the health / availability checks.)
e.g. if using Apache Camel, we could auto-generate or manually define that an MBean should exist for a CamelContext along with, say, 3 routes. Then if the JVM / docker image starts, we can have jolokia automatically reply with details of whether it's running correctly (giving details of the failures in JSON if the Camel context is not present).
Right now there's no easy way to check jolokia for 'is my stuff really running correctly right now'. By letting folks specify these checks in JMX - as mostly static JSON files - it'd add huge value IMHO.
If folks need really complex health / availability checks - e.g. using dynamic java code and stuff - folks could just implement those as a simple JMX MBean which is then statically referred to in a JSON file for jolokia to query?
i.e. if we just come up with a simple file format for validation checks (e.g. JSON), we should be able to support most things.
e.g. to verify that a tomcat container starts up and the welcome pages for each WAR return a 200-299 status code.
Users might want to specify known local URL paths within their web apps that should return OK values. (If they return a non-OK value we could include the output in the JSON.) Then folks could write their own 'happy pages' in servlets / JSP / JAXRS and we can easily invoke them from jolokia.
Sorry for the long, long delay (there was simply too much going on). And yes, I should not give promises on things which I then forget ;-{
I really can see the value in this. IMO it boils down to two parts: allowing the agent to be extended in a controlled manner, and providing the health checks.
Extending the agent
Jolokia 2.0 already has quite sophisticated hooks for so-called services which can be provided by various means (classpath lookup for the JVM and WAR agents, OSGi service lookup for the OSGi agent). There are different types of services which are already available. A broad overview can be found on the Jolokia Wiki (although the actual implementation is a bit different).
But this probably doesn't help much here, because 2.0 is not really ready for prime time; although I have some ideas about the release roadmap, I can't promise anything since there is no real pressure for doing it (at least nobody has asked for it yet ;-).
But the good thing is that there is some 'lightweight' service framework also integrated in Jolokia 1.x: ServiceObjectFactory. It is currently used for server detectors and simplifiers (for serialization to JSON). My suggestion is to introduce a lifecycle service which is started in a similar way. That way anything could be started during the startup of the agent. This will work for the WAR and JVM agent. For the OSGi stuff a dedicated ServiceTracker could be added for this purpose.
Health Checks
Given that this startup service hook is in place, one could then easily provide a custom service which registers its own MBean for doing the health check. This MBean could be implemented in any way. The only open task is how and in which form this MBean can get access to JMX. One option is to directly provide the MBeanServers detected by the agent, so the health checks can do JMX calls on their own. This would be the simplest and probably best way.
An alternative is to provide access to the JSON parsing layer so that JMX calls could be done like other Jolokia clients. This would probably be easier for implementers of such services, but would not be as performant as direct JMX access.
My overall suggestion is to tackle the two tasks separately: I'm going to implement the lifecycle-service hook directly within Jolokia (available in the next Jolokia release). Then, you could implement this service API and provide an implementation for a HealthCheck the way you like. For providing an appropriately packaged agent, I would suggest using jolokia-extra, where I already have a customized agent for JSR-77 deserialization on WebSphere.
What do you think?
That all sounds good.
I made a little experiment defining a few kinds of assertion we could use
https://github.com/jstrachan/jolokia/tree/liveness/agent/core/src/main/java/org/jolokia/health
e.g. SearchAssertion does a search and asserts the count of mbeans returned; then ReadAssertion makes an assertion on an mbean attribute value.
Here's a few test cases that show them in action:
https://github.com/jstrachan/jolokia/tree/liveness/agent/core/src/test/java/org/jolokia/health
e.g. typical output after simulating some failures is:
Results: [Failure{assertion=ReadAssertion{id=my.check, description=Check that the cpu load is valid, mbeanName='java.lang:type=OperatingSystem', attribute='ProcessCpuLoad', comparison='<', value=-1000}, actualValue=0.0}]
Results: [Failure{assertion=SearchAssertion{id=my.check, description=Check that we have enough cheese beans, mbeanName='foo.cheese:type=*', comparison='>', count=1}, actualValue=0}]
i.e. its clear from the results which assertions failed.
We could then have an mbean that lets folks pass in JSON for assertions to perform ad hoc assertion checks.
I was hoping we could then have support for a folder of files which are each json and can load one or more assertions; evaluate them all and return the failures if there are any. Then if we get that far, we could add some REST API in jolokia for the health check; returning OK if there are no failures; if there are failures it'd be nice to return the JSON of the failures.
Looks good, my suggestion is:
To move the checking code to rhuss/jolokia-extra, which will also use the new, yet-to-come CustomizeService for registering one or more MBeans
One for doing ad-hoc checks by uploading JSON formatted assertions.
And one for using predefined checks, e.g. by looking into a folder or lookup from the classpath. For specifying a folder I will add a way for the CustomizeServices to access the configuration information (which then can be used to specify the server-side folder). That way we could avoid extending the Jolokia protocol, since we could simply use an EXEC request for accessing the MBean which uses the preconfigured checks.
I will now focus on the customizer service so that in the next step we can hook in the health checks.
Hi,
I already had some thoughts about jolokia and application health checks / monitoring and I figured it out that it would be nice to have some features listed below.
For the sake of simplicity I will try to explain it on real world use case: check if used memory is less or equal to 80% maximum allowed.
We need to:
define what data should be considered
read mbean attribute / exec mbean operation
example:
read / java.lang:type=Memory / HeapMemoryUsage
query (filter) necessary info from complex output structure
optimization, we can use "path" but in that case we need to perform multiple reading/executing
example:
usedVar = aboveRead.used
maxVar = aboveRead.max
manipulate values
example:
80PctMaxVar = 0.8 * maxVar
compare values (check criteria)
example:
usedVar > 80PctMaxVar
optional blocks
support for different process states (e.g. on a web container, if an application is deployed we should perform some checks, otherwise not)
Here is example of ad-hoc JSON formatted assertions:
{
"dataCollection": [
{
"id": "first",
"cmd": "read",
"mbean": "java.lang:type=Memory",
"attribute": "HeapMemoryUsage"
},
{
"id": "second",
"cmd": "exec",
"mbean": "com.mycompany:type=Something",
"operation": "someOperation"
}
],
"vars": [
{"usedVar": "first.used"}, // additional filtering
{"maxVar": "first.max"},
{"80PctMaxVar": "0.8 * maxVar"} // values manipulation
],
"assertions": [
{
"id": "request_123",
"check": "usedVar > 80PctMaxVar",
"desc": " ... message ..."
}
],
"conditionalBlocks": [
{
"condition": "second.something != 0",
"dataCollection": [],
"vars": [],
"assertions": [],
"conditionalBlocks": []
}
]
}
What do you guys think?
Regards,
N.
Having a DSL for specifying health-checks (aka monitoring) is a good thing. However, I wouldn't do it in JSON but probably with something more expressive like Groovy, since this is already a language.
Also, as stated above, this is too much for the vanilla agent. I already included some hooks for custom plugins which could implement this kind of health check via their own MBeans, which then communicate over the normal Jolokia protocol (probably EXEC).
I will soon have a POC based on @jstrachan's suggestion in jolokia-extra. Please stay tuned ....
A proof-of-concept is now available at jolokia-extra. You can build the agent with @jstrachan's sample health checks included on your own (mvn -Phealth clean install, which creates the JVM and WAR agents below agents/), or you can download them as Snapshots (JVM or WAR agent).
If used as usual, it will install a MBean jolokia:type=plugin,name=health with two operations:
cpuLoadCheck : to be called with a single int argument, which specifies the alarming threshold. If the CPU load is below this value, null is returned; otherwise, an error description (see the sketch after this list).
mbeanCountCheck : Simple, dumb check which only checks whether there are any MBeans matching the pattern java:lang=*
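For illustration, a minimal sketch of the shape of the CPU check, using only standard JDK management APIs (class name hypothetical; the real plugin additionally registers this as an MBean via the plugin context):
import java.lang.management.ManagementFactory;

public class CpuLoadCheck {
    // Returns null when healthy, otherwise a human-readable error description.
    public String cpuLoadCheck(int thresholdPercent) {
        com.sun.management.OperatingSystemMXBean os =
            (com.sun.management.OperatingSystemMXBean)
                ManagementFactory.getOperatingSystemMXBean();
        double loadPercent = os.getProcessCpuLoad() * 100.0; // negative if unavailable
        if (loadPercent >= 0 && loadPercent > thresholdPercent) {
            return "CPU load " + loadPercent + "% exceeds threshold " + thresholdPercent + "%";
        }
        return null;
    }
}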
Of course this is only a beginning, but it shows nicely how to hook into Jolokia: via a plugin which is called during startup, getting a MBeanPluginContext which can be used to register an MBean (which automatically gets unregistered when the agent goes down) and also for accessing the local MBeanServers (which can be many) in order to perform the health checks. The communication then simply uses the current Jolokia protocol, which is IMO a good thing (as an alternative to introducing a new command health).
Next steps will happen on jolokia-extra. I will open an issue there collecting all the ideas we have so far so that we can implement a feature-rich, flexible health check system.
A POC is done in jolokia-extra, a blog post is written, and I suggest continuing the discussion for the health check implementation in issue rhuss/jolokia-extra#1
I'll add some backlinks in this new issue and will close that one.
|
gharchive/issue
| 2014-10-06T08:07:14 |
2025-04-01T06:40:14.624859
|
{
"authors": [
"jstrachan",
"nevenr",
"rhuss"
],
"repo": "rhuss/jolokia",
"url": "https://github.com/rhuss/jolokia/issues/162",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
95772511
|
Work well with cellular or no network
This app needs to be better behaved if no or slow networks are available:
[ ] automatic sync on wifi
[ ] sync manually on cell
[x] do not attempt to sync if no network is available
[x] #24
This shows no apparent need to sync if no network is available.
2.0 will not require in-app managed networking capabilities; all network requirements will be handled by the appropriate file managers.
|
gharchive/issue
| 2015-07-18T01:00:47 |
2025-04-01T06:40:14.628189
|
{
"authors": [
"rhwood"
],
"repo": "rhwood/roster-decoder",
"url": "https://github.com/rhwood/roster-decoder/issues/14",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
270485282
|
Alerts
Added the alerts component
Note: This was branched from the #12 branch, so this PR includes those changes as well.
Looks fantastic, awesome stuff! And thank you for the fix on the border/background color.
|
gharchive/pull-request
| 2017-11-01T23:26:54 |
2025-04-01T06:40:14.633025
|
{
"authors": [
"Fraham",
"rhyneav"
],
"repo": "rhyneav/papercss",
"url": "https://github.com/rhyneav/papercss/pull/16",
"license": "ISC",
"license_type": "permissive",
"license_source": "github-api"
}
|
273497972
|
Fixes #1
Brief description
Fixes #1
Developer Certificate of Origin
[x] I certify that these changes conform to the Developer Certificate of Origin 1.1 as described at https://developercertificate.org/.
Further details
The trick is to URL-encode the whole SVG and remove the charset. Tested in Chrome, FF, IE9-10-11.
Two beautiful lines of code. Thank you so much for the fix!
|
gharchive/pull-request
| 2017-11-13T16:44:23 |
2025-04-01T06:40:14.634900
|
{
"authors": [
"rhyneav",
"wintercounter"
],
"repo": "rhyneav/papercss",
"url": "https://github.com/rhyneav/papercss/pull/35",
"license": "ISC",
"license_type": "permissive",
"license_source": "github-api"
}
|
271967583
|
Serialization should consider DateTimeKind for UTC and non-UTC DateTimes
Right now serialization doesn't handle DateTimeKind. This makes interop with some libraries (like Telerik's RadSchedule stuff) problematic. If we pay attention to the DateTimeKind, and preserve it during serialization and deserialization, interop will improve, and the library will provide higher-resolution results.
Fixed in nuget version 4.0.0: https://www.nuget.org/packages/Ical.Net/4.0.0
|
gharchive/issue
| 2017-11-07T20:12:41 |
2025-04-01T06:40:14.636500
|
{
"authors": [
"rianjs"
],
"repo": "rianjs/ical.net",
"url": "https://github.com/rianjs/ical.net/issues/331",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
714301942
|
Fix coverage upload to coveralls.io
coveralls.io reports 0% coverage but the coverage reports in the GitHub action report 88%.
Closed via #8
|
gharchive/issue
| 2020-10-04T12:19:49 |
2025-04-01T06:40:14.681149
|
{
"authors": [
"richford"
],
"repo": "richford/groupyr",
"url": "https://github.com/richford/groupyr/issues/6",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
54952950
|
Atom.Object.defineProperty.get is deprecated.
atom.workspaceView is no longer available.
In most cases you will not need the view. See the Workspace docs for
alternatives: https://atom.io/docs/api/latest/Workspace.
If you do need the view, please use atom.views.getView(atom.workspace),
which returns an HTMLElement.
Atom.Object.defineProperty.get (/usr/share/atom/resources/app/src/atom.js:55:11)
HighlightLineView.attach (/home/csozo/.atom/packages/highlight-line/lib/highlight-line-view.coffee:13:9)
Duplicate of #37. Please close.
|
gharchive/issue
| 2015-01-20T22:56:46 |
2025-04-01T06:40:14.683000
|
{
"authors": [
"csozo",
"stramel"
],
"repo": "richrace/highlight-line",
"url": "https://github.com/richrace/highlight-line/issues/40",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
287473019
|
mute the video
At least on iOS this works and has the benefit that the user's music doesn't stop playing, yet the device still doesn't sleep.
I don't have Android to test on
I'm going to test this now on both Android and iOS devices!
According to https://github.com/richtr/NoSleep.js/issues/47 this does not actually mute the video on Android devices. We might want to keep the idea provided in this PR + include a separate video src for Android devices that does not include an audio track (only iOS Safari requires an audio track in the video to enable wake lock).
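A rough sketch of that dual-source idea (the platform detection and file names are hypothetical):
// Use an audio-less webm on Android (where muting is unreliable) and a
// muted mp4 with an audio track on iOS (which needs the track for wake lock).
var video = document.createElement('video');
video.setAttribute('playsinline', '');
if (/Android/i.test(navigator.userAgent)) {
  video.src = 'wake-lock-no-audio.webm'; // hypothetical asset without an audio track
} else {
  video.src = 'wake-lock-with-audio.mp4'; // hypothetical asset with an audio track
  video.muted = true; // keep the user's music playing
}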
I have a similar solution but I unmute the video inside the enable method before playing it so that it works on both iOS and Android.
FYI, commit 615ee56675ae2c334c74a658ddb68b6a09122bcb from this PR has now been cherry-picked on to this PR: https://github.com/richtr/NoSleep.js/pull/58 (branch name: feature/webm-mp4-dual-video)
|
gharchive/pull-request
| 2018-01-10T15:26:55 |
2025-04-01T06:40:14.686104
|
{
"authors": [
"greggman",
"montamal",
"richtr"
],
"repo": "richtr/NoSleep.js",
"url": "https://github.com/richtr/NoSleep.js/pull/45",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1881942373
|
Allow edit selected command before execute it
Description:
This PR allows users to customize the commands generated by the API before execution (#20). For instance, when using the command "shai create new folder," users can now confirm and modify it to their preference, such as changing "mkdir new_folder" to "mkdir myfolder."
Changes:
This PR implements a minor adjustment that enables users to edit the generated command prior to execution, utilizing the inquirer.text method.
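A minimal sketch of the confirm-and-edit step (assuming the python-inquirer package's text shortcut; the prompt wording and suggested command are illustrative):
import subprocess
import inquirer

# Pre-fill the prompt with the generated command so the user can
# accept it as-is or edit it before execution.
suggested = "mkdir new_folder"
final_command = inquirer.text(
    message="Edit the command before executing",
    default=suggested,
)
subprocess.run(final_command, shell=True)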
Very neat!
|
gharchive/pull-request
| 2023-09-05T13:04:46 |
2025-04-01T06:40:14.688056
|
{
"authors": [
"JoseMiguelCh",
"ricklamers"
],
"repo": "ricklamers/shell-ai",
"url": "https://github.com/ricklamers/shell-ai/pull/21",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
453699303
|
Publish 2.13.0
Once 1.14.0 for 2.13 is published, #479 will be green in Travis for the master branch.
This may be a good time to publish Scalajs 1.0.0-M8 for 2.13 and 2.12 from the 1.14.0_sonatype branch, as well, which would require #475.
Tried helping with this along with PR #481.
Fixed with https://github.com/rickynils/scalacheck/pull/481.
🎉
https://repo1.maven.org/maven2/org/scalacheck/scalacheck_2.13/
https://repo1.maven.org/maven2/org/scalacheck/scalacheck_sjs0.6_2.13/
|
gharchive/issue
| 2019-06-07T21:24:37 |
2025-04-01T06:40:14.696648
|
{
"authors": [
"ashawley",
"dwijnand",
"xuwei-k"
],
"repo": "rickynils/scalacheck",
"url": "https://github.com/rickynils/scalacheck/issues/480",
"license": "bsd-3-clause",
"license_type": "permissive",
"license_source": "bigquery"
}
|
237218346
|
Change in folder structure
This PR has lots of changes, but is just a change in the folder structure.
Problem
Currently, if we clone the repository or install the dependency from the repository, the path to the compiled dist file will be nativescript-vue/nativescript-vue/dist/index.js, which feels weird. The npm link trick would solve some cases, but I ran into some issues when using webpack.
Changes
Everything under nativescript-vue is now in the root of the package.
Renamed vue-sample to samples.
Updated samples so they still work in the new structure.
Updated .gitignore and other configuration files to match this new structure.
What do you think?
Does this simplify the development process by not having to run npm-link? (For the samples I mean)
I guess we can do this, easier to work on by not having to go into directories all the time!
Also I was thinking if we should start publishing to npm.
It does avoid having to do npm link. In fact, I made this change also because webpack does not resolve dependencies of linked modules (not without some extra work). The samples are working without npm link too.
I think it also helps when developing with another project. For instance, in the template I created, I just added the nativescript-vue dependency directly from my github fork like this:
"dependencies": {
"nativescript-vue": "tralves/nativescript-vue",
and then import the lib with const Vue = require('nativescript-vue/dist/index'). With the current folder structure, this line would be const Vue = require('nativescript-vue/nativescript-vue/dist/index') which feels weird to me.
Even without these "practical" reasons, I think it is a better structure, more in line with what we see libs do around the world.
+1. It makes sense to me.
Also I was thinking if we should start publishing to npm.
Agreed. It looks like that this change will also facilitate the npm publishing process.
I don't know whether my problems were related to the installation instructions that need to be updated, or an unforeseen problem. But I can tell after reading this that my issue and this PR are very much related
https://github.com/rigor789/nativescript-vue/issues/27
|
gharchive/pull-request
| 2017-06-20T13:56:31 |
2025-04-01T06:40:14.729785
|
{
"authors": [
"gvsrepins",
"rigor789",
"tralves",
"vesper8"
],
"repo": "rigor789/nativescript-vue",
"url": "https://github.com/rigor789/nativescript-vue/pull/26",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
2495558453
|
Feat: Human readable dashboard state
https://www.notion.so/rilldata/Human-Readable-dashboard-state-f851ea32b47e4e1d81d87d6e857f2268?pvs=4
Blocks https://github.com/rilldata/rill/issues/5490
@nishantmonu51 to track the state work
|
gharchive/issue
| 2024-08-29T20:54:44 |
2025-04-01T06:40:14.737786
|
{
"authors": [
"mindspank"
],
"repo": "rilldata/rill",
"url": "https://github.com/rilldata/rill/issues/5561",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
282060932
|
Deploy master release
This PR is limited to linux and mac for now.
I have just one question: can Travis deploy the mkdocs site if there is something in the gh-pages that is not on the master?
|
gharchive/pull-request
| 2017-12-14T11:00:21 |
2025-04-01T06:40:14.746720
|
{
"authors": [
"BotellaA"
],
"repo": "ringmesh/RINGMesh",
"url": "https://github.com/ringmesh/RINGMesh/pull/97",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
2339205655
|
Axelar bridge docs use JS, Python, and Foundry, increasing the skill set and tools needed.
The ethers.js library can be used to both encode and send transactions, eliminating the need for Python and Foundry/cast.
encoding
const { ethers } = require('ethers'); // ethers v5 API

// Define the data type and value
const types = ['string'];
const values = ['hello, world!'];

// ABI encode the data
const abiEncoded = ethers.utils.defaultAbiCoder.encode(types, values);

// Compute the Keccak-256 hash of the ABI encoded data
const keccakHash = ethers.utils.keccak256(abiEncoded);
This was partially addressed in #88, where Python code was replaced with JS using the ethers.js library.
|
gharchive/issue
| 2024-06-06T21:48:32 |
2025-04-01T06:40:14.781874
|
{
"authors": [
"brettmollin",
"k4m4"
],
"repo": "ripple/opensource.ripple.com",
"url": "https://github.com/ripple/opensource.ripple.com/issues/87",
"license": "CC-BY-4.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
926141725
|
Using api.generateAddress to create an account, then getting account information, reports an error (rippled 1.7.2)
Please help me
server start(Start a New Genesis Ledger in Stand-Alone Mode)
rippled -a --start --conf=/etc/opt/ripple/rippled.cfg
jsApi
import { RippleAPI } from 'ripple-lib'
const api = new RippleAPI({
server: 'ws://106.13.47.223:6005', // My rippled server
//server: 'wss://s1.ripple.com' // Public rippled server
})
api
.connect()
.then(async () => {
/* begin custom code ------------------------------------ */
let addressObj = api.generateAddress({
test: true,
})
let account = api.getAccountInfo(addressObj.classicAddress)
return account
})
.then((info) => {
console.log(info)
console.log('getAccountInfo done')
/* end custom code -------------------------------------- */
})
.then(() => {
return api.disconnect()
})
.then(() => {
console.log('done and disconnected.')
})
.catch(console.error)
error:
message:'Account not found.'
`{"name":"RippledError","data":{"account":"r4Lbjg2SbsZDH1mdr4WcSc6reZSuYE9NVz","error":"actNotFound","error_code":19,"error_message":"Account not found.","id":1,"ledger_hash":"F5A8382FEB857BDDE8B71A391B229B3D94193E32FC1B5BA108568C24650ECB25","ledger_index":2,"request":{"account":"r4Lbjg2SbsZDH1mdr4WcSc6reZSuYE9NVz","command":"account_info","id":1,"ledger_index":"validated"},"status":"error","type":"response","validated":true}}`
Does rippled.cfg need to add any configuration?
generateAddress creates a new address, but an account object for that address does not exist on the ledger. To create the account, fund it by sending it a payment. The payment amount must be greater than or equal to the base account reserve. On Mainnet, the base account reserve is currently 20 XRP. On a stand-alone ledger, the amount may be different.
When creating a new ledger in standalone mode, the initial funds are in the genesis address, as documented here: https://xrpl.org/start-a-new-genesis-ledger-in-stand-alone-mode.html
You can send XRP from that genesis address to your newly-generated address in order to fund and activate the account.
thanks @intelliot, is there a ledger_accept equivalent in RippleAPI?
I sent the XRP, but it was never validated.
The api.connection.on("transaction") callback is never invoked.
server: ws://106.13.47.223:6005
const RippleAPI = require('ripple-lib').RippleAPI;
let address = "rHb9CJAWyB4rj91VRWn96DkukG4bwdtyTh"
let secret = "snoPBrXtMeMyMHUVTgbuqAfg1SUTb"
const api = new RippleAPI({server: 'ws://106.13.47.223:6005'})
api.connect()
api.on('connected', async () => {
while (true) {
try {
let ac=await api.request("account_info", {account: address, ledger_index: "validated"})
console.log(ac)
break
} catch(e) {
await new Promise(resolve => setTimeout(resolve, 1000))
}
}
// Prepare transaction -------------------------------------------------------
const preparedTx = await api.prepareTransaction({
"TransactionType": "Payment",
"Account": address,
"Amount": api.xrpToDrops("22"), // Same as "Amount": "22000000"
"Destination": "r351srbb3RP8wxRLXHLy8m4zNJeAvQX9us"
}, {
// Expire this transaction if it doesn't execute within ~5 minutes:
"maxLedgerVersionOffset": 75
})
const maxLedgerVersion = preparedTx.instructions.maxLedgerVersion
console.log("Prepared transaction instructions:", preparedTx.txJSON)
console.log("Transaction cost:", preparedTx.instructions.fee, "XRP")
console.log("Transaction expires after ledger:", maxLedgerVersion)
// Sign prepared instructions ------------------------------------------------
const signed = api.sign(preparedTx.txJSON, secret)
const txID = signed.id
const tx_blob = signed.signedTransaction
console.log("Identifying hash:", txID)
console.log("Signed blob:", tx_blob)
// Submit signed blob --------------------------------------------------------
// The earliest ledger a transaction could appear in is the first ledger
// after the one that's already validated at the time it's *first* submitted.
const earliestLedgerVersion = (await api.getLedgerVersion()) + 1
const result = await api.submit(tx_blob)
console.log("Tentative result code:", result.resultCode)
console.log("Tentative result message:", result.resultMessage)
// Wait for validation -------------------------------------------------------
let has_final_status = false
api.request("subscribe", {accounts: [address]})
api.connection.on("transaction", (event) => {
if (event.transaction.hash == txID) {
console.log("Transaction has executed!", event)
has_final_status = true
}
})
api.on('ledger', ledger => {
if (ledger.ledgerVersion > maxLedgerVersion && !has_final_status) {
console.log("Ledger version", ledger.ledgerVersion, "was validated.")
console.log("If the transaction hasn't succeeded by now, it's expired")
has_final_status = true
}
})
// There are other ways to do this, but they're more complicated.
// See https://xrpl.org/reliable-transaction-submission.html for details.
while (!has_final_status) {
await new Promise(resolve => setTimeout(resolve, 1000))
}
// Check transaction results -------------------------------------------------
try {
const tx = await api.getTransaction(txID, {
minLedgerVersion: earliestLedgerVersion})
console.log("Transaction result:", tx.outcome.result)
console.log("Balance changes:", JSON.stringify(tx.outcome.balanceChanges))
} catch(error) {
console.log("Couldn't get transaction outcome:", error)
}
})
log
Transaction cost: 0.000012 XRP
src/sendXrp.js:33
Transaction expires after ledger: 77
src/sendXrp.js:34
Identifying hash: D5C22448C169A74F9396147EAA209DF604422C6E10A4A585B1A3B0379D4D4156
src/sendXrp.js:40
Signed blob: 12000022800000002400000009201B0000004D6140000000014FB18068400000000000000C73210330E7FC9D56BB25D6893BA3F317AE5BCF33B3291BD63DB32654A313222F7FD0207447304502210096AEF0515024368E019E2FC0D50DE59DE9603E4634F2EB44B830BE84A0F9B82D02201A9ABFFBCA2F7FF09A69FC57CD4BC76B8EFC83ADAA69C0316C719D2AD56CA10C8114B5F762798A53D543A014CAF8B297CFF8F2F937E8831454B3B3A5EE00E45F502037ADDCCF4126CFEDBFAB
src/sendXrp.js:41
Tentative result code: tesSUCCESS
src/sendXrp.js:48
Tentative result message: The transaction was applied. Only final in a validated ledger.
src/sendXrp.js:49
@pwang200 Sorry, I just saw this. Yes, you can call ledger_accept with RippleAPI using the request method:
return api.request('ledger_accept').then(response => {
/* Do something useful with response */
console.log(JSON.stringify(response, null, 2))
}).catch(console.error);
Note: ledger_accept is as admin request, so you'll need to be connected to rippled with an admin connection.
Does that help?
@pwang200 Sorry, I just saw this. Yes, you can call ledger_accept with RippleAPI using the request method:
return api.request('ledger_accept').then(response => {
/* Do something useful with response */
console.log(JSON.stringify(response, null, 2))
}).catch(console.error);
Note: ledger_accept is as admin request, so you'll need to be connected to rippled with an admin connection.
Does that help?
thank you
|
gharchive/issue
| 2021-06-21T11:59:14 |
2025-04-01T06:40:14.797383
|
{
"authors": [
"intelliot",
"pwang200",
"zwht"
],
"repo": "ripple/rippled",
"url": "https://github.com/ripple/rippled/issues/3873",
"license": "ISC",
"license_type": "permissive",
"license_source": "github-api"
}
|
96883589
|
Switch some Ledger instances to ReadView instances
In order to finish the cutover to using open views rather than open ledgers, code that can handle open ledgers must switch to using a ReadView. Here are a few changes in that direction.
Remove ltCURRENT (it was never used anyway)
Change getOwnerInfo
Use ReadView in TransactionSign
Change AcceptedLedger and ProposedTransaction to use ReadView
Change RPC::accounts
Straightforward
:+1:
Rebased against 0.29.1-b1.
Looks good. :+1:
|
gharchive/pull-request
| 2015-07-23T19:00:30 |
2025-04-01T06:40:14.800702
|
{
"authors": [
"JoelKatz",
"vinniefalco",
"ximinez"
],
"repo": "ripple/rippled",
"url": "https://github.com/ripple/rippled/pull/1200",
"license": "ISC",
"license_type": "permissive",
"license_source": "github-api"
}
|
129883838
|
Improve error message when signing fails (RIPD-1066):
With the addition of multisigning there are a variety of reasons a signature may fail. We now return a more descriptive message for the reason certain signature checks fail.
Reviewers: @miguelportilla @seelabs
Current coverage is 62.35%
Merging #1521 into develop will increase coverage by +0.14% as of f750aad
@@ develop #1521 diff @@
=======================================
Files 891 891
Stmts 63094 63169 +75
Branches 0 0
Methods 0 0
=======================================
+ Hit 39254 39391 +137
Partial 0 0
+ Missed 23840 23778 -62
Review entire Coverage Diff as of f750aad
Powered by Codecov. Updated on successful CI builds.
Merged as 2eaf211e9bfe60f351f9d02169c3ecee1aec3e67
|
gharchive/pull-request
| 2016-01-29T21:12:30 |
2025-04-01T06:40:14.804867
|
{
"authors": [
"codecov-io",
"nbougalis",
"scottschurr"
],
"repo": "ripple/rippled",
"url": "https://github.com/ripple/rippled/pull/1521",
"license": "ISC",
"license_type": "permissive",
"license_source": "github-api"
}
|
365183226
|
Allow validators to detect transaction censorship attempts:
The XRP Ledger is designed to be censorship resistant. Any attempt to censor transactions would require coordinated action by a majority of the system's validators.
Importantly, the design of the system is such that such an attempt is detectable and can be easily proven, since every validator must sign the validations it publishes.
This commit adds an automated censorship detector which is active on servers configured as validators. The detector tracks all transactions proposed by the validator and issues warnings of increasing severity for any transactions which, despite being proposed repeatedly by the validator, has not been included.
Jenkins Build Summary
Built from this commit
Built at 20180930 - 03:07:34
Test Results
Build Type | Log | Result | Status
msvc.Debug | logfile | 1033 cases, 0 failed, t: 790s | PASS :white_check_mark:
gcc.Debug -Dcoverage=ON | logfile | 1036 cases, 0 failed, t: 851s | PASS :white_check_mark:
docs | logfile | 1 cases, 0 failed, t: 2s | PASS :white_check_mark:
gcc.Release -Dassert=ON, MANUAL_TESTS=true | logfile | 1101 cases, 0 failed, t: 933s | PASS :white_check_mark:
clang.Debug | logfile | 1036 cases, 0 failed, t: 284s | PASS :white_check_mark:
rpm | logfile | 1035 cases, 0 failed, t: n/a | PASS :white_check_mark:
clang.Debug -Dunity=OFF | logfile | 1036 cases, 0 failed, t: 441s | PASS :white_check_mark:
gcc.Debug | logfile | 1036 cases, 0 failed, t: 290s | PASS :white_check_mark:
msvc.Debug, NINJA_BUILD=true | logfile | 1033 cases, 0 failed, t: 651s | PASS :white_check_mark:
gcc.Debug -Dunity=OFF | logfile | 1036 cases, 0 failed, t: 439s | PASS :white_check_mark:
clang.Release -Dassert=ON | logfile | 1035 cases, 0 failed, t: 405s | PASS :white_check_mark:
gcc.Debug -Dstatic=OFF | logfile | 1036 cases, 0 failed, t: 292s | PASS :white_check_mark:
gcc.Release -Dassert=ON | logfile | 1035 cases, 0 failed, t: 423s | PASS :white_check_mark:
gcc.Debug -Dstatic=OFF -DBUILD_SHARED_LIBS=ON | logfile | 1036 cases, 0 failed, t: 292s | PASS :white_check_mark:
gcc.Debug, NINJA_BUILD=true | logfile | 1036 cases, 0 failed, t: 275s | PASS :white_check_mark:
msvc.Debug -Dunity=OFF | logfile | 1033 cases, 0 failed, t: 1211s | PASS :white_check_mark:
msvc.Release | logfile | 1032 cases, 0 failed, t: 416s | PASS :white_check_mark:
clang.Debug -Dunity=OFF -Dsan=address, PARALLEL_TESTS=false, DEBUGGER=false | logfile | 1036 cases, 0 failed, t: 1068s | PASS :white_check_mark:
clang.Debug -Dunity=OFF -Dsan=undefined, PARALLEL_TESTS=false | logfile | 1036 cases, 0 failed, t: 1174s | PASS :white_check_mark:
Is it possible to do these checks on non-validating nodes too? Some "censorship probes" scattered throughout the network (to have better chances of receiving censored valid transactions) would probably be another use case for this, and maybe even better suited for the task than validators.
@MarkusTeufelberger: you could, I guess, but it could result in false positives, which is something I'd like to avoid. The protocol is such that if a validator proposes some transaction T that's valid but doesn't make it into consensus for round X, it's going to have priority for round X+1. That's something that doesn't happen with regular servers.
For that reason, I think it's better to only perform these checks on validators. If someone is interested in probing this, running a validator is no more expensive or computationally intensive than is running a regular node.
I copied the header from some other file and didn’t really think much about it so, 2016. Will fix. Thanks!
I haven't reviewed yet, but two things stick out. First, typo in commit message, "every validator[s] must". Second, this probably should be enabled on non-validators as well as validators. All participants want to know if censorship is affecting the network.
@JoelKatz: I considered making this available for non-validators, but I believe it would complicate things unnecessarily and not be as good. My rationale, copy-pasted from an earlier message to this thread:
The protocol is such that if a validator proposes some transaction T that's valid but doesn't make it into consensus for round X, it's going to have priority for round X+1. That's something that doesn't happen with regular servers.
For that reason, I think it's better to only perform these checks on validators. If someone is interested in probing this, running a validator is no more expensive or computationally intensive than is running a regular node.
To do what you suggest would require adding more (and more complicated) code which is something I’d like to avoid.
If you believe it's super-important to have this operate on non-validators as well, we'll have to go back to the drawing board and design something more complicated.
I do feel fairly strongly that everyone who uses the system wants to know if they're seeing censorship, whether they're running a validator or not. If you see censorship, whether you're running a validator or not, that's unexpected and indicates something is wrong that a human should look at.
@JoelKatz, you and @MarkusTeufelberger have convinced me that the best thing to do is to have this running on every server and not just validators. I will update this PR accordingly. Thanks.
Rebased against 1.2.0-b5 and modified the code so that every server performs censorship detection as long as it entered a round synced to the network.
That's an interesting point, and I agree that we should do this. But I think I'm going to punt right now. I want to get censorship detection merged so it can be included in 1.2.0. We can augment the code to log in a future PR.
|
gharchive/pull-request
| 2018-09-30T02:04:09 |
2025-04-01T06:40:14.830009
|
{
"authors": [
"JoelKatz",
"MarkusTeufelberger",
"nbougalis",
"ripplelabs-jenkins"
],
"repo": "ripple/rippled",
"url": "https://github.com/ripple/rippled/pull/2700",
"license": "ISC",
"license_type": "permissive",
"license_source": "github-api"
}
|
2564175919
|
SYSSUSP_SUSPEND should allow platform-specific quiescent states, not only WFI
Currently, the SYSSUSP_SUSPEND service description contains the following requirement:
The application processor which called this service must enter into the WFI state by executing the WFI instruction. The platform microcontroller will transition the system to the requested SUSPEND_STATE upon the successful WFI state transition of the application processor.
However, not all implementations use the WFI instruction to enter into a quiescent state. The HSM_HART_STOP service allows for this flexibility:
The hart upon successful acknowledgement can perform the final context saving if required and must enter into a quiesced state such as WFI which can be detected and allow the platform microcontroller to proceed to stop the hart. The mechanism to detect the hart quiesced state by the platform microcontroller is platform specific.
The SYSSUSP_SUSPEND service should be updated to use similar language.
Yes, thanks @SiFiveHolland, let me raise the PR.
|
gharchive/issue
| 2024-10-03T14:12:23 |
2025-04-01T06:40:14.839023
|
{
"authors": [
"SiFiveHolland",
"pathakraul"
],
"repo": "riscv-non-isa/riscv-rpmi",
"url": "https://github.com/riscv-non-isa/riscv-rpmi/issues/65",
"license": "CC-BY-4.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2421860490
|
Vector Test Generator
Write a python script to automatically generate vector json tests. Add support for the script to be able to parse Mavis as well to generate the tests.
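A hedged sketch of what such a generator could look like (the JSON schema, field names, and instruction mnemonics are assumptions, not the project's actual format):
import json
import random

# hypothetical schema: one JSON object per generated vector test
def gen_vector_test(mnemonic, vlen=256):
    return {
        "mnemonic": mnemonic,
        "vlen": vlen,
        "operands": [random.randrange(256) for _ in range(vlen // 8)],
    }

if __name__ == "__main__":
    tests = [gen_vector_test(m) for m in ("vadd.vv", "vmul.vx")]
    print(json.dumps(tests, indent=2))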
hey @aarongchan Can I take up this issue?
@aarongchan , @danbone I’ve completed a script for generating vector JSON tests and parsing Mavis data. Could you advise on the best directory to place this file in the riscv_model project? Should it go in tests, core, or elsewhere? Your guidance on adhering to project conventions would be greatly appreciated.
|
gharchive/issue
| 2024-07-22T03:59:20 |
2025-04-01T06:40:14.840596
|
{
"authors": [
"Shubhf",
"aarongchan"
],
"repo": "riscv-software-src/riscv-perf-model",
"url": "https://github.com/riscv-software-src/riscv-perf-model/issues/186",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2739936778
|
Value of vstopi.IPRIO cannot be 256 for candidate #3
Hi,
The width of vstopi.IPRIO is 8 bits, the maximum value is 255, but candidate 3 says priority number 256 is used.
Read-only CSR vstopi is VSXLEN bits wide and has the same format as stopi:
bits 27:16 IID
bits 7:0 IPRIO
if bit 9 is one in both vsip and vsie, and neither of the first two cases applies:
a supervisor external interrupt (code 9) with priority number 256;
As specified by Table 5.5 on page 66, IPRIO should be set to 255 in this case.
|
gharchive/issue
| 2024-12-14T15:10:25 |
2025-04-01T06:40:14.842673
|
{
"authors": [
"zengderui"
],
"repo": "riscv/riscv-aia",
"url": "https://github.com/riscv/riscv-aia/issues/112",
"license": "CC-BY-4.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2154782445
|
actionlint errors: shellcheck double quote to prevent globbing; deprecated command set-output
I ran actionlint https://github.com/rhysd/actionlint on repo:
kbroch@~/rvi/riscv-docs-base-container-image on main via 🐍 v3.11.2 (venv)
❯ actionlint
.github/workflows/build.yaml:32:9: shellcheck reported issue in this script: SC2086:info:1:33: Double quote to prevent globbing and word splitting [shellcheck]
|
32 | run: echo "NOW=$(date +'%m%d%Y')" >> $GITHUB_ENV
| ^~~~
.github/workflows/weekly_build.yaml:33:14: workflow command "set-output" was deprecated. use `echo "{name}={value}" >> $GITHUB_OUTPUT` instead: https://docs.github.com/en/actions/using-workflows/workflow-commands-for-github-actions [deprecated-commands]
|
33 | run: echo "::set-output name=date::$(date +'%Y-%m-%d')"
| ^~~~
IMO fixing these is useful, and we should consider adding a pre-commit hook.
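For reference, a hedged sketch of the two fixes actionlint is asking for (step shape is illustrative; the GITHUB_OUTPUT form is the one the deprecation notice links to):
# quote the env file path to satisfy shellcheck SC2086
- run: echo "NOW=$(date +'%m%d%Y')" >> "$GITHUB_ENV"
# replace the deprecated ::set-output command with GITHUB_OUTPUT
- id: date
  run: echo "date=$(date +'%Y-%m-%d')" >> "$GITHUB_OUTPUT"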
Fixed with #9
|
gharchive/issue
| 2024-02-26T17:59:08 |
2025-04-01T06:40:14.844548
|
{
"authors": [
"kbroch-rivosinc"
],
"repo": "riscv/riscv-docs-base-container-image",
"url": "https://github.com/riscv/riscv-docs-base-container-image/issues/8",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1872109226
|
What is the step-by-step process for two-stage address translation
In the latest pdf build (5/23/23) of the Privileged Architecture, the step-by-step description in 5.3.2 Virtual Address Translation Process is clear and concise for single stage translation. However, when we move to two-stage translation, the differences specified in 9.5.1 Guest Physical Address Translation (e.g., hgatp substitutes for the usual satp) seem to be incomplete.
For example, applying step 7 in 5.3.2 (as updated by 9.5.1) "if the original memory access is a store, also set pte.d to 1" is a bit confusing when trying to apply it to G-stage translations. The intent is probably for all traversed G-stage PTEs to have A=1 and, if the operation that kicked off the original translation was a STORE, the final G-stage PTE (i.e. pointing to the SPA of the page of the STORE) needs D=1. We should have a clarification of some sort for this.
Furthermore, the updated steps in 5.3.2 do not provide any guidance on how to update the G-stage PTE that points to the VS-stage leaf cell. Presumably, if the VS-stage leaf cell is updated, this G-stage that points to it must have D=1.
It appears to me that the various models (SPIKE, RISC-V, and QEMU) are missing this last step (actually, SPIKE is currently being corrected based on this issue being raised in Are G-stage D bits set correctly as a result of setting a VS-stage D bit? #1447 with the changes in
A/D updates in G-stage PTE #1448)
I suggest that we add clarifications to the Privileged Architecture in the form of a step-by-step Two-stage Virtual Address Translation Process in 9.5.1 Guest Physical Address Translation. Alternatively, we could add a normative description that explicitly calls out of these differences in two-stage translations.
Alternatively, we could add a normative description that explicitly calls out of these differences in two-stage translations.
The section 9.5.1 does call out these differences:
hgatp substitutes for the usual satp;
For the translation to begin, the effective privilege mode must be VS-mode or VU-mode;
When checking the U bit, the current privilege mode is always taken to be U-mode; and
Guest-page-fault exceptions are raised instead of regular page-fault exceptions.
For G-stage address translation, all memory accesses (including those made to access data structures for VS-stage address translation) are considered to be user-level accesses, as though executed in U-mode.
Access type permissions—readable, writable, or executable—are checked during G-stage translation the same as for VS-stage translation. For a memory access made to support VS-stage address translation (such as to read/write a VS-level page table), permissions are checked as though for a load or store, not for the original access type. However, any exception is always reported for the original access type (instruction, load, or store/AMO).
The G bit in all G-stage PTEs is reserved for future standard use.
We agree that section 9.5.1 calls out those differences. However, applying those differences to the single-stage step-by-step virtual address translation process does not yield a complete two-stage step-by-step description. There is no mention in these steps, for example, of when to set the D bit in a G-stage PTE as a result of the setting of an A bit in a VS-stage PTE.
Since people seem to be relying on these step-by-step instructions when creating models or RTL implementations, it would be helpful to create a version that explicitly and completely covers two-stage translation.
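Concretely, the missing rule amounts to something like the following pseudocode sketch (not normative; all helper names are hypothetical):
/* Updating A/D bits in a VS-stage PTE is itself an implicit store through
 * the G-stage, so the G-stage leaf PTE mapping the VS-level page table
 * must end up with A=1 and D=1 before the update is performed. */
void update_vs_pte_ad(pte_t *vs_pte, bool original_is_store, bool is_leaf) {
    pte_t *g_pte = g_stage_translate((uintptr_t)vs_pte, ACCESS_STORE);
    set_a_and_d(g_pte);            /* or raise a guest-page fault, per implementation */
    vs_pte->a = 1;
    if (original_is_store && is_leaf)
        vs_pte->d = 1;             /* final VS-stage leaf for the original store */
}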
Probably would be useful for writing the complete ACT tests as well.
On Tue, Sep 5, 2023 at 12:09 PM Ken Dockser @.***>
wrote:
We agree that section 9.5.1 calls out those differences. However,
applying those differences to the single-stage step-by-step virtual
address translation process does not yield a complete two-stage
step-by-step description. There is no mention in these steps, for example,
of when to set the D bit in a G-stage PTE as a result of the setting of an
A bit in a VS-stage PTE.
Since people seem to be relying on these step-by-step instructions when
creating models or RTL implementations, it would be helpful to create a
version that explicitly and completely covers two-stage translation.
—
Reply to this email directly, view it on GitHub
https://github.com/riscv/riscv-isa-manual/issues/1103#issuecomment-1707175122,
or unsubscribe
https://github.com/notifications/unsubscribe-auth/AHPXVJSQXZJGYILHZVBX4A3XY52HDANCNFSM6AAAAAA4DJNDPI
.
You are receiving this because you are subscribed to this thread.Message
ID: @.***>
Or (to Ken's and Ved's last posts) just add one more bullet to the section 9.5.1 bullet list that explicitly makes clear when to set the D bit in a G-stage PTE as a result of the setting of an A bit in a VS-stage PTE.
Section 9.5.1 states:
For a memory access made to support VS-stage address translation (such as to read/write a VS-level page table), permissions are checked as though for a load or store, not for the original access type.
For added clarity, the phrase "permissions are checked as though for a load or store" could be expanded to "permissions, A, and D bits are checked as though for a load or store".
"permissions, A-bit, and the D-bit are checked" reads in a confusing way. I assume this is trying to refer to setting of G-stage A&D bits as a function of the VS-level PTE read or write access (and not of the original access type)?
If so, then this might read better as:
For a memory access made to support VS-stage address translation (such as to read/write a VS-level page table), permissions and the need to set A and/or D bits at the G-stage level are checked as though for a load or store, not for the original access type.
Also, consider expanding "for a load or store" to explicitly state "for an implicit load or store." The distinction between implicit and explicit guides the decision on the value reported in [h|m]tinst when a guest-page fault trap occurs.
As follows:
For a memory access made to support VS-stage address translation (such as to read/write a VS-level page table), permissions and the need to set A and/or D bits at the G-stage level are checked as though for an implicit load or store, not for the original access type.
The addresses in hgatp and the G-stage paging structures are supervisor physical addresses. Please clarify what the second bullet implies by "to support G-stage address translation".
|
gharchive/issue
| 2023-08-29T17:29:45 |
2025-04-01T06:40:14.862455
|
{
"authors": [
"allenjbaum",
"gfavor",
"kdockser",
"ved-rivos"
],
"repo": "riscv/riscv-isa-manual",
"url": "https://github.com/riscv/riscv-isa-manual/issues/1103",
"license": "CC-BY-4.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
272296410
|
xTVAL missing from MEDELEG/MIDELEG description
It seems like xTVAL delegation rules are missing
https://github.com/riscv/riscv-isa-manual/blob/7cf3673198998d38b7235c879974cdcfa0912031/src/machine.tex#L1044
You're right, thanks.
|
gharchive/issue
| 2017-11-08T18:10:58 |
2025-04-01T06:40:14.864541
|
{
"authors": [
"as-sc",
"aswaterman"
],
"repo": "riscv/riscv-isa-manual",
"url": "https://github.com/riscv/riscv-isa-manual/issues/113",
"license": "CC-BY-4.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1402640394
|
Add Bubble Sort in Algorithm folder
There is a folder "Bubble sort" in DSA/JAVA/Algorithm. You need to add a Java source file (your code) and a README file describing the algorithm in detail.
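For reference, a minimal sketch of what the requested submission could contain (class and variable names are just examples):
public class BubbleSort {
    // repeatedly swap adjacent out-of-order elements; largest values bubble to the end
    public static void bubbleSort(int[] arr) {
        for (int i = 0; i < arr.length - 1; i++) {
            boolean swapped = false;
            for (int j = 0; j < arr.length - 1 - i; j++) {
                if (arr[j] > arr[j + 1]) {
                    int tmp = arr[j];
                    arr[j] = arr[j + 1];
                    arr[j + 1] = tmp;
                    swapped = true;
                }
            }
            if (!swapped) break; // early exit: the array is already sorted
        }
    }

    public static void main(String[] args) {
        int[] data = {5, 1, 4, 2, 8};
        bubbleSort(data);
        System.out.println(java.util.Arrays.toString(data)); // [1, 2, 4, 5, 8]
    }
}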
Would love to work on this @risingstar-bit . Please assign this to me :)
Would love to work on this @risingstar-bit . Please assign this to me :)
Assigned👍
Thanks :)
Please assign. I am done with it, @risingstar-bit. Will make a PR immediately.
Please assign. I am done with it, @risingstar-bit. Will make a PR immediately.
You didn't ask me to assign this.
Bubble sort is already added
|
gharchive/issue
| 2022-10-10T06:05:00 |
2025-04-01T06:40:14.874724
|
{
"authors": [
"Kritika30032002",
"YuvrajRakheja",
"probro27",
"risingstar-bit"
],
"repo": "risingstar-bit/Compose-Community",
"url": "https://github.com/risingstar-bit/Compose-Community/issues/75",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1839632655
|
Support abort state in transaction
Is your feature request related to a problem? Please describe.
In PostgreSQL, a transaction enters an aborted state when an error occurs. In this state, it refuses subsequent requests until a COMMIT or ROLLBACK.
// Postgresql
test_db=> begin read only;
BEGIN
test_db=*> select v1;
ERROR: column "v1" does not exist
LINE 1: select v1;
^
test_db=!> select 1;
ERROR: current transaction is aborted, commands ignored until end of transaction block
// RW
dev=> begin read only;
BEGIN
dev=*> select 1;
?column?
----------
1
(1 row)
dev=*> select v1;
ERROR: QueryError: Bind error: failed to bind expression: v1
Caused by:
Item not found: Invalid column: v1
dev=*> select 1;
?column?
----------
1
(1 row)
Describe the solution you'd like
No response
Describe alternatives you've considered
No response
Additional context
No response
Tracked in #10736
Let's keep it open to track it. 😂
Currently we only support read-only transaction, so it's not a big deal. But for better compatibility with Postgres wire protocol, it's still a good-to-have.
|
gharchive/issue
| 2023-08-07T14:51:22 |
2025-04-01T06:40:14.877998
|
{
"authors": [
"BugenZhao",
"ZENOTME"
],
"repo": "risingwavelabs/risingwave",
"url": "https://github.com/risingwavelabs/risingwave/issues/11510",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2160506883
|
feat(batch): support batch read for file source
I hereby agree to the terms of the RisingWave Labs, Inc. Contributor License Agreement.
What's changed and what's your intention?
This PR introduces batch read for the new file sources (s3_v2, gcs); the next PR will bring PredicatePushdown for them.
Checklist
[ ] I have written necessary rustdoc comments
[ ] I have added necessary unit tests and integration tests
[ ] I have added test labels as necessary. See details.
[ ] I have added fuzzing tests or opened an issue to track them. (Optional, recommended for new SQL features #7934).
[ ] My PR contains breaking changes. (If it deprecates some features, please create a tracking issue to remove them in the future).
[ ] All checks passed in ./risedev check (or alias, ./risedev c)
[ ] My PR changes performance-critical code. (Please run macro/micro-benchmarks and show the results.)
[ ] My PR contains critical fixes that are necessary to be merged into the latest release. (Please check out the details)
Documentation
[ ] My PR needs documentation updates. (Please use the Release note section below to summarize the impact on users)
Release note
If this PR includes changes that directly affect users or other significant modifications relevant to the community, kindly draft a release note to provide a concise summary of these changes. Please prioritize highlighting the impact these changes will have on users.
The e2e test passed. (Screenshot of the passing run omitted.)
|
gharchive/pull-request
| 2024-02-29T06:35:37 |
2025-04-01T06:40:14.884034
|
{
"authors": [
"wcy-fdu"
],
"repo": "risingwavelabs/risingwave",
"url": "https://github.com/risingwavelabs/risingwave/pull/15358",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1591329904
|
refactor: no need to resolve hanging channel
Signed-off-by: Bugen Zhao i@bugenzhao.com
I hereby agree to the terms of the RisingWave Labs, Inc. Contributor License Agreement.
What's changed and what's your intention?
The hanging channel is a channel that's external for a table fragments subgraph. We introduced this for MV on MV long ago, but after https://github.com/risingwavelabs/risingwave/pull/4045 and https://github.com/risingwavelabs/risingwave/pull/6170, this becomes not that necessary: we can simply create a pair of channels if not found in the ChannelPool.
https://github.com/risingwavelabs/risingwave/blob/b1891423a68b98dff39d7220a891cdea8bd67951/src/stream/src/task/mod.rs#L123-L138
This is a preparation step for #7908.
Checklist
[x] I have written necessary rustdoc comments
[x] I have added necessary unit tests and integration tests
[x] I have added fuzzing tests or opened an issue to track them. (Optional, recommended for new SQL features).
[x] I have demonstrated that backward compatibility is not broken by breaking changes and created issues to track deprecated features to be removed in the future. (Please refer to the issue)
[x] All checks passed in ./risedev check (or alias, ./risedev c)
Documentation
[x] My PR DOES NOT contain user-facing changes.
Click here for Documentation
Types of user-facing changes
Please keep the types that apply to your changes, and remove the others.
Installation and deployment
Connector (sources & sinks)
SQL commands, functions, and operators
RisingWave cluster configuration changes
Other (please specify in the release note below)
Release note
Signed-off-by: Bugen Zhao
This PR is so successful...
|
gharchive/pull-request
| 2023-02-20T07:51:28 |
2025-04-01T06:40:14.891188
|
{
"authors": [
"BugenZhao",
"shanicky"
],
"repo": "risingwavelabs/risingwave",
"url": "https://github.com/risingwavelabs/risingwave/pull/8050",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2050023631
|
[Feature Request] onLoad callback
Description
I've noted the web version has an onLoad callback when loading files over the URL. I dug through the code, and I think this may not even be exposed at the native iOS/Android level?
Would be great for us to have this functionality so we could show some sort of loaders while waiting for the Rive animation to be loaded. If this can be done on the native side I would be happy to write the bindings on React native's JS layer :)
Provide a Repro
N/A
Expected behavior
Have an onLoad Callback
I agree. I'm implementing animations with state, and it's quite complicated to figure out the right time to set the initial state.
For the moment this is the workaround, but it seems very hard to maintain:
const BlobAnimation = () => {
const [mood, setMood] = useState(-2);
const [riveRef, setRiveRef] = useState<RiveRef | null>(null);
useEffect(() => {
if (!riveRef) return;
setTimeout(() => {
riveRef.setInputState(blobStateMachineName, 'mood', mood);
}, 300);
}, [mood, riveRef]);
return (
<>
<Rive
ref={setRiveRef}
autoplay
url={blobUrl}
stateMachineName={blobStateMachineName}
style={{ width: 200, height: 200 }}
/>
<Button onPress={() => setMood((value) => value + 1)} label="mood + 1" />
</>
);
};
Any updates?
Any updates?
|
gharchive/issue
| 2023-12-20T07:45:23 |
2025-04-01T06:40:14.921591
|
{
"authors": [
"SergeyYurkevich",
"cosimochellini",
"edgartoomik",
"ridwansameer"
],
"repo": "rive-app/rive-react-native",
"url": "https://github.com/rive-app/rive-react-native/issues/216",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
465975090
|
Feature: Added Telegram to notifiers
As a small addition, I have added Telegram as a possible notifier.
Cool, thanks a lot. Sensible PRs are always welcome 😄
Great work @pandaslide, the feature works wonderfully. It will be available starting with release 5.6.2!
Thanks for the quick merge and the great tool!
|
gharchive/pull-request
| 2019-07-09T20:40:54 |
2025-04-01T06:40:14.931548
|
{
"authors": [
"pandaslide",
"rix1337"
],
"repo": "rix1337/RSScrawler",
"url": "https://github.com/rix1337/RSScrawler/pull/308",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1353309778
|
Add Tech Stack Filter
Problems
The website can only filter people/engineers by their location. Employers might find it hard to search for specific engineers that match their requirements.
Solution
Add the Tech-Stack filter feature
Implementation
Added Svelte-Select library to provide a Select component with common features (e.g. Multi-values)
Added Tech-Stack filter component and function to filter people/engineers by inputted stacks
Repositioned filter layout (Tech-Stack and Location filter) using flex
Refactored the filter function and implemented multiple filters that trigger a single function, making it easy to add further filter features in the future
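A hedged sketch of the composed filter (function and field names are assumptions, not the PR's actual code):
// filter people by location and by every selected tech stack
function applyFilters(people, { location, stacks }) {
  return people.filter((person) =>
    (!location || person.location === location) &&
    (!stacks || stacks.length === 0 ||
      stacks.every((s) => person.techStack.includes(s)))
  );
}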
Wow, this is really great. Thank you 🙏
|
gharchive/pull-request
| 2022-08-28T10:21:16 |
2025-04-01T06:40:14.934353
|
{
"authors": [
"luthfihizb",
"rizafahmi"
],
"repo": "rizafahmi/carikerja",
"url": "https://github.com/rizafahmi/carikerja/pull/111",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
756572925
|
pair programming exercise
This was fun! Turned prompt to function!
looks great! thanks for playing!
|
gharchive/pull-request
| 2020-12-03T20:37:02 |
2025-04-01T06:40:14.973688
|
{
"authors": [
"ChanceHarmon",
"rkgallaway"
],
"repo": "rkgallaway/about-me",
"url": "https://github.com/rkgallaway/about-me/pull/1",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1634555501
|
Consider Supporting Alfred Gallery?
I apologize, as I've just learned about this myself, but apparently we can keep our workflows updated through the Alfred Gallery. I don't know what it takes to do so, but I was going to look at it later as a possible PR.
ooo i was unaware of this, but this is awesome. i've been dying for some official support for something like this given that packal is incredibly outdated. i will definitely look into adding this workflow to alfred gallery at some point
topic added here on official forums to get the ball rolling on getting access to submit the workflow
|
gharchive/issue
| 2023-03-21T19:15:33 |
2025-04-01T06:40:14.981034
|
{
"authors": [
"SeanSith",
"rkoval"
],
"repo": "rkoval/alfred-aws-console-services-workflow",
"url": "https://github.com/rkoval/alfred-aws-console-services-workflow/issues/59",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
2414067742
|
How do I get the test to run with specific environment variables set?
In order for my test to run properly, I need certain environment variables set. As is, I can run the test fine outside of vscode, but when I use this extension, the test fails because a certain environment variable is not set.
You can open a PR to add the ability to use custom env vars :)
sounds like a good addition
|
gharchive/issue
| 2024-07-17T16:39:16 |
2025-04-01T06:40:15.018497
|
{
"authors": [
"fmaddenflx",
"rluvaton"
],
"repo": "rluvaton/vitest-vs-code-plugin",
"url": "https://github.com/rluvaton/vitest-vs-code-plugin/issues/13",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
609649335
|
Do you plan to publish the codes (MAML, RL^2, PEARL) to reproduce the paper?
It would be very helpful if you could make it public.
Hi @emuemuJP, you can find code for the algorithms that you have listed in
https://github.com/rlworkgroup/garage , along with working examples on metaworld.
Please note that the above is not the exact code originally used in the metaworld paper, which is unfortunately not available.
Thank you for letting me know!
|
gharchive/issue
| 2020-04-30T07:03:36 |
2025-04-01T06:40:15.020495
|
{
"authors": [
"avnishn",
"emuemuJP",
"krzentner"
],
"repo": "rlworkgroup/metaworld",
"url": "https://github.com/rlworkgroup/metaworld/issues/81",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
1465569799
|
🛑 Micro.blog is down
In 53dbcca, Micro.blog (https://roberto.mateu.me) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Micro.blog is back up in a29ed6b.
|
gharchive/issue
| 2022-11-27T20:41:07 |
2025-04-01T06:40:15.023114
|
{
"authors": [
"rmateu"
],
"repo": "rmateu/statuspage",
"url": "https://github.com/rmateu/statuspage/issues/210",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1141254339
|
Update README.md
Adds the workaround from https://github.com/rmculpepper/iracket/issues/11 to the readme.
I am not a fan of copying files installed by a package manager to other places on the system. It clutters up your drive and can lead to issues when trying to uninstall.
IMO a better approach is to change the lib-search-dirs Racket uses to find dynamic libraries. This can be achieved by something like:
cat /Applications/Racket\ v8.3/etc/config.rktd
#hash(
(build-stamp . "")
(catalogs . ("https://download.racket-lang.org/releases/8.3/catalog/" #f))
(doc-search-url . "https://download.racket-lang.org/releases/8.3/doc/local-redirect/index.html")
(lib-search-dirs . (#f "/opt/homebrew/lib")))
This has the additional benefit that Racket will be able to find other dynamic libs installed with homebrew.
|
gharchive/pull-request
| 2022-02-17T12:07:22 |
2025-04-01T06:40:15.028408
|
{
"authors": [
"NightMachinary",
"mpcjanssen"
],
"repo": "rmculpepper/iracket",
"url": "https://github.com/rmculpepper/iracket/pull/18",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
965189685
|
Problem when generating audio for new epoch
Traceback (most recent call last):
File "train.py", line 335, in
main()
File "train.py", line 326, in main
verbose=0,
File "/usr/local/lib/python3.7/dist-packages/tensorflow/python/keras/engine/training.py", line 1229, in fit
callbacks.on_epoch_end(epoch, epoch_logs)
File "/usr/local/lib/python3.7/dist-packages/tensorflow/python/keras/callbacks.py", line 435, in on_epoch_end
callback.on_epoch_end(epoch, logs)
File "/content/gdrive/My Drive/prism-samplernn/callbacks.py", line 103, in on_epoch_end
self._generate(ckpt_path, self.generation_args)
File "/content/gdrive/My Drive/prism-samplernn/callbacks.py", line 109, in _generate
args['temperature'], args['seed'], args['seed_offset'])
File "/content/gdrive/My Drive/prism-samplernn/generate.py", line 135, in generate
file_name = f'{file_name}_t={temperature[i][0]}.wav'
IndexError: too many indices for array: array is 0-dimensional, but 1 were indexed
@deccolquitt Hi, looks like a problem with the value being passed to temperature parameter. Could you post how you're calling the train.py script, with the parameters? Thanks.
@deccolquitt OK I can see what's causing it - I'm assuming you weren't explicitly passing a temperature value. I've identified a bug in that circumstances, am fixing it now.
Hi, thanks for getting back so quickly, just in case you need it, here are the parameters I was using anyway:
!python train.py \
  --data_dir ./chunks \
  --num_epochs 100 \
  --batch_size 64 \
  --max_checkpoints 2 \
  --checkpoint_every 1 \
  --output_file_dur 5 \
  --sample_rate 16000
Thanks, that's just as I expected. It would be using the default temperature if you don't explicitly set it (either as a list or as one float value), but the default value was not being passed down internally in the correct format
Should work fine now, just pushed the fix.
brilliant, will try it tomorrow
BTW, we're experimenting with replacing the default config params, you might want to try these:
{
"seq_len": 512,
"frame_sizes": [2,8],
"dim": 1024,
"rnn_type": "gru",
"num_rnn_layers": 1,
"q_type": "mu-law",
"q_levels": 256,
"emb_size": 256
}
These worked well for our Beethoven Piano Sonata model, though as always YMMV.
|
gharchive/issue
| 2021-08-10T17:44:50 |
2025-04-01T06:40:15.066190
|
{
"authors": [
"deccolquitt",
"relativeflux"
],
"repo": "rncm-prism/prism-samplernn",
"url": "https://github.com/rncm-prism/prism-samplernn/issues/20",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
88406285
|
jira.plugin.zsh:52: parse error near `else'
Two else branches in this if......
I'll be darned. Reproduces for me on OS X if I just add jira to my plugins list.
Last login: Mon Jun 15 16:43:18 on ttys001
/Users/janke/.oh-my-zsh/plugins/jira/jira.plugin.zsh:52: parse error near `else'
[@ in ~]
$
Looks like it was caused by a recent commit for #2898, probably as a result of bad merge conflict resolution with the jira-prefix support introduced in #2955. (The fact those are out of order makes me think we're looking at merge conflict problems and not a bad individual PR.)
Agreed, no idea how a conflict resulted that way, but this is bad.
I think it's that #2898 just had the bug in the first place after it was rebased after 2955 was merged (see its comment history), and wasn't re-tested after the rebase. So the "merge conflict" happened inside #2898 when it was rebased, not when you pulled it in to easymerge or anything after that. The bug's present in tsldh/master if you check that out on its own and the diffs are clear.
Oh yes, good catch! So indeed the merge conflict was triggered (and improperly solved) during the rebase.
Taking a look, I'll see if #4041 solves it...
#4041 tests good, let's get that reopened and pulled in.
I have the same problem.
@tresni gj
I have the same problem.
Also experiencing the same issue.
I've been getting this same issue since the last zsh update I had.
are there unit tests on oh-my-zsh?
To all: #4041 should fix this issue.
@diwu1989 not at all, but we are working on improving the update/release process, and any suggestion or contribution is welcome.
Sad, also having this issue still.
Same here:
-> % upgrade_oh_my_zsh --help
Upgrading Oh My Zsh
From https://github.com/robbyrussell/oh-my-zsh
* branch master -> FETCH_HEAD
Current branch master is up to date.
[...]
-> % zsh
/Users/fabian/.oh-my-zsh/plugins/jira/jira.plugin.zsh:52: parse error near `else'
Unit tests with some build tool would be a good option.
oh-my-zsh/plugins/jira/jira.plugin.zsh:52: parse error near `else'
Did the fix never get pushed to master?
Hi all, this will be merged later today. Sorry for the long wait!
On Wed, Jul 1, 2015 at 12:51, Ion Caza (notifications@github.com) wrote:
Did the fix never get pushed to master?
—
Reply to this email directly or view it on GitHub
https://github.com/robbyrussell/oh-my-zsh/issues/4027#issuecomment-117601221
.
|
gharchive/issue
| 2015-06-15T12:11:54 |
2025-04-01T06:40:15.084097
|
{
"authors": [
"JohnCaza",
"JohnSmithX",
"Schrank",
"apjanke",
"arnoldnedap",
"diwu1989",
"ericop",
"feiyan35488",
"francisbautista",
"hasannawaz",
"ladyborg",
"mcornella",
"ncanceill",
"tresni",
"xyalan"
],
"repo": "robbyrussell/oh-my-zsh",
"url": "https://github.com/robbyrussell/oh-my-zsh/issues/4027",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
198365643
|
The only thing vi-mode does is ignore some key presses
I used to use bindkey -v to use vi-like keybindings in zsh. After installing oh-my-zsh, this stopped working. I changed the line to plugins=(vi-mode). Now if I hit escape and try to hit command keys it will ignore some number of key presses and then start inserting again. From what I can tell, it will skip one key press if I start by hitting '$' and two key presses if I start by hitting '0'. Here is a mapping from key sequences to observed behavior:
ESC 0 0 0 => one 0 is inserted
ESC 0 0 $ => one $ is inserted
ESC 0 $ $ => one $ is inserted
ESC $ $ $ => two $'s are inserted
ESC $ $ 0 => one $ followed by one 0 is inserted
ESC $ 0 0 => two 0's are inserted
ESC $ 0 $ => one 0 followed by one $ is inserted
ESC 0 $ 0 => one 0 is inserted
I am running Arch Linux with zsh verison 5.3.1. I have no other oh-my-zsh plugins installed. I have zsh-syntax-highlighting installed, but disabling it did not change the behavior of the program.
Can't reproduce on
zsh 5.3.1
archlinux
urxvt
Can you try to use another terminal, like urxvt?
I found the behavior you're talking about.
I removed all set -o vi, bindkey -v, and the vi-mode plugin from my zsh.
"ESC 0 0 0 => one 0 is inserted" is emacs-mode.
So you should look over your zshrc and everything it includes.
Alright, I found two problems with my zshrc, and after resolving them everything appears to be working as expected.
The first problem that I had was that I thought that the line plugins=(vi-mode) would enable the plugin, not just load it, so I had removed my bindkey -v line when I discovered that this plugin exists.
The second problem that I had was that I sourced oh-my-zsh at the end of my zshrc, after plugins=(vi-mode). I did this because I also use zsh-syntax-highlighting which says that it needs to be the last thing sourced, and it seemed natural to group the sourcing of external files.
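For anyone hitting the same thing, a minimal sketch of the ordering that works (paths are examples):
# ~/.zshrc — order matters
plugins=(vi-mode)                 # declare plugins before sourcing oh-my-zsh
source "$ZSH/oh-my-zsh.sh"        # this is what actually loads the plugin list
# zsh-syntax-highlighting must stay the last thing sourced
source ~/zsh-syntax-highlighting/zsh-syntax-highlighting.zsh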
Thanks for the help, much appreciated.
|
gharchive/issue
| 2017-01-02T17:58:12 |
2025-04-01T06:40:15.089150
|
{
"authors": [
"SaffronSnail",
"slavaGanzin"
],
"repo": "robbyrussell/oh-my-zsh",
"url": "https://github.com/robbyrussell/oh-my-zsh/issues/5746",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
259137151
|
Add robin theme
Just slight modification of agnoster
adding timestamp
icon fixup
New themes are currently not being accepted, but you can add a link to it on the external themes wiki page.
|
gharchive/pull-request
| 2017-09-20T11:57:24 |
2025-04-01T06:40:15.091003
|
{
"authors": [
"robinwolny",
"stevenspasbo"
],
"repo": "robbyrussell/oh-my-zsh",
"url": "https://github.com/robbyrussell/oh-my-zsh/pull/6311",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
804607212
|
adding Productboard
https://www.productboard.com/pricing/
@deethreedouble it appears productboard raised their pricing for enterprise to be $200/user/month (without any discounts)
I just went through the sales process with them and they are able to shave SSO off a la carte, but at the total per-user cost of $167/month.
|
gharchive/pull-request
| 2021-02-09T14:34:06 |
2025-04-01T06:40:15.099039
|
{
"authors": [
"CalebAlbers",
"deethreedouble"
],
"repo": "robchahin/sso-wall-of-shame",
"url": "https://github.com/robchahin/sso-wall-of-shame/pull/196",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
187474573
|
So, where are the icons?
This is the first time I've ever done this.
https://1drv.ms/i/s!AlnGeQSYLUVaoocds4rUwd7LmxHtzg
https://1drv.ms/i/s!AlnGeQSYLUVaoocenxf_W_aZSQsSfg
I'm new to a lot so any way to know what I'm doing wrong based on the screenshots? Everything is updated to the fullest.
I read around and found that the icons were disabled under File Icons in Preferences. The extension description explains how to turn them off or disable them, so I checked, and there it was. I didn't know this was possible, since I could already disable the extension from the extensions list; when I re-enabled the extension I assumed the icons were enabled too. So one has to enable the extension and then make sure the icons are also enabled under File Icons. Sorry, I had no idea this existed. You may delete this issue post, as it is pointless.
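For anyone else who hits this, the toggle corresponds to the icon-theme setting in settings.json (key name as in current VS Code; verify for your version):
{
  "workbench.iconTheme": "vscode-icons"
}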
Sorry to bother and thank you for the cool icons!
|
gharchive/issue
| 2016-11-05T03:45:00 |
2025-04-01T06:40:15.135472
|
{
"authors": [
"DanJ210"
],
"repo": "robertohuertasm/vscode-icons",
"url": "https://github.com/robertohuertasm/vscode-icons/issues/412",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
1943963999
|
🛑 Revista Cómo ves is down
In e1cb84c, Revista Cómo ves (http://www.comoves.unam.mx) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Revista Cómo ves is back up in 111d452 after 5 minutes.
|
gharchive/issue
| 2023-10-15T16:44:47 |
2025-04-01T06:40:15.138252
|
{
"authors": [
"robertormzg"
],
"repo": "robertormzg/upptime-dgdc",
"url": "https://github.com/robertormzg/upptime-dgdc/issues/613",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
775001019
|
Support eoy definition in taskwarrior 2.5.2 (Closes: #87)
taskwarrior 2.5.2 switched the end of year definition:
$ task calc eoy
2000-12-31T23:59:59 (version 2.5.1)
2021-01-01T00:00:00 (version 2.5.2)
Coverage decreased (-1.7%) to 90.488% when pulling 57384842951b36a2201a3a5db3586201ed8991f2 on jspricke:develop into 8b045552526e83e6d592aaa8d64b115d7cc7bb54 on robgolding:develop.
Ah, just realised the usage of six is causing an issue here - no need for that anymore as this library is Python 3 only! If you can fix that up I'll merge this PR!
Fixed. (I wrote it for the Debian version, initially).
Btw. can you push tags for the latest versions? pypi lists 2.2.1 but latest tag is only 1.3.0, or is pypi the official source?
Thanks, and Happy New Year! Merging this now. I've also pushed tags for the missing versions to match PyPI.
|
gharchive/pull-request
| 2020-12-27T08:49:19 |
2025-04-01T06:40:15.143562
|
{
"authors": [
"coveralls",
"jspricke",
"robgolding"
],
"repo": "robgolding/tasklib",
"url": "https://github.com/robgolding/tasklib/pull/88",
"license": "bsd-3-clause",
"license_type": "permissive",
"license_source": "bigquery"
}
|
2583238822
|
Could you update the application, and also advise how to use the quantized version?
I tried to run it, but it did not start right away. After installing the dependencies from the requirements file, it raised an error because it is no longer compatible with the library.
After I got it running with some difficulty using ChatGPT, it swapped to 80 GB. Could you advise how to run it so that it fits into 10 GB of memory?
Thanks in advance!
@olehbozhok, good day. Here is a demo of the new model; you can take the code for running it from there:
https://huggingface.co/spaces/robinhad/UAlpaca
This file contains an example of how to run it:
https://huggingface.co/spaces/robinhad/UAlpaca/blob/main/app.py
|
gharchive/issue
| 2024-10-12T16:26:15 |
2025-04-01T06:40:15.153955
|
{
"authors": [
"olehbozhok",
"robinhad"
],
"repo": "robinhad/kruk",
"url": "https://github.com/robinhad/kruk/issues/6",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
40136988
|
Make the change method the default and comment out up/down
This is potentially a big change moving forwards.
In newly created migrations up and down would be commented out by default and change would become the default method.
This is the opposite of the present behaviour.
Background
The change method is far simpler to use and suffices for most users. The up / down methods are more flexible but should be used less.
Does change() work correctly with data migration, not just schema migration? In other words, if I modify a column in a table and have to adjust existing rows based on a SELECT, won't I have to create a custom down() method? Or if I drop a table/column in a migration and need to recreate it on rollback? Of course, you are just talking about changing the default here and I will still be able to override it by creating an up()/down() method when I need it?
I am looking at moving to phinx on a project and am trying to understand what the approximate limits are to understand the impact of planned changes and how I should organize my initial setup.
@evought see the note about limitations at the end of http://docs.phinx.org/en/latest/migrations.html#the-change-method
"see the note about limitations..."
Understood, thank you. Been experimenting with the behavior this evening and it seems fairly logical.
With regard to the current state, and the proposal to make change() the default operation, would it be advantageous (and more defensive) to validate that the migration can be rolled-back BEFORE the migration takes place? Maybe by using a variant of the adapters that only allow the reversible methods?
So, up() and down() would use the current adapter and I am at will to do whatever I want.
But change() would use a limited adapter that simply does not have any method that cannot be reversed.
For me, the advantage here is that the developer is protected from making non-reversible mistakes. Make their/our lives easier.
Hey @RQuadling I'd prefer to keep Phinx as dumb as possible. Some people run Phinx on massive production systems with hundreds of tables and terabytes of data. I'd sleep much better knowing they are in full control of their migrations (even if they write something nasty). If Phinx started to become opinionated about migrations I'm worried it might lead to problems. Better to keep it simple.
p.s moving to 0.5.0 now.
I agree with @RQuadling: those people would not have full control over their migrations, because they could unknowingly make a change in the change() method that can't be reverted. Full control means having up() and down() and being able to make mistakes because of my code and not phinx's.
What I was saying is that the methods that can be called in the change() method should be limited to reversible ones. If a user requires something that cannot be automatically reversed in a migration then they need to use up() and down().
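To make that concrete, here is a rough sketch of the idea in Python (phinx itself is PHP, so all names here are illustrative, not the real API): change() would receive a wrapper that only exposes operations the tool knows how to invert, and anything else fails fast.

class IrreversibleMigrationError(RuntimeError):
    pass

class ReversibleAdapter:
    # Wraps the real adapter, exposing only auto-reversible operations
    # (an illustrative subset, not the actual phinx adapter API).
    REVERSIBLE = {"create_table", "add_column", "rename_column", "add_index"}

    def __init__(self, adapter):
        self._adapter = adapter

    def __getattr__(self, name):
        if name not in self.REVERSIBLE:
            raise IrreversibleMigrationError(
                name + " cannot be auto-reversed; use up()/down() instead"
            )
        return getattr(self._adapter, name)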
done in 5ffb2324f5789c8919271dd2c6bb0b448a0b02a1
I suppose we need to see a use case of a non-reversible change() call. I'm still not sure that every use of change() IS reversible.
up()/down() IS the "in the hands of the developer" control that is required.
On the other hand, I never use change(), so meh!
|
gharchive/issue
| 2014-08-13T09:24:59 |
2025-04-01T06:40:15.169427
|
{
"authors": [
"RQuadling",
"evought",
"maxgalbu",
"robmorgan",
"rquadling",
"shadowhand"
],
"repo": "robmorgan/phinx",
"url": "https://github.com/robmorgan/phinx/issues/287",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
161520658
|
Deprecate showInterface from Font initialization?
I'm not sure how desirable or immediately necessary this change is, but given that there's potential for a clean break with the roboFab implementation, I'm opening this issue as a point of discussion.
Usually when you see a boolean option for any method, it's either a signal that there is a method that is doing double duty, or domain model that hasn't been defined. I'm wondering if this isn't a place where another object might be introduced in order to better encapsulate the design space that fontParts works within:
Font('foo.ufo', showInterface=True)
# vs. proposed:
Editor().open(Font('foo.ufo'))
I'd propose introducing some kind of Editor or Application concept that would be a generic container that explicitly interacts with whichever application is the subject of a fontParts implementation.
I think that this could be a cleaner separation of responsibilities, and would allow the implementation of Font to be ignorant of the details of manipulating the UI, which I like. There are ramifications for testing (isolating UI concerns to fewer areas), and for conceptually decoupling interface from editor (the latter of which may or may not include an interface that can be shown, but still has the concept of opening a font). It could also mean that a specialized Editor object could open different kinds of resources (e.g. just glyphs, features, etc.).
It would still be possible to maintain the old showInterface API (with a deprecation warning?).
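A rough sketch of what that shim could look like (all names hypothetical, just to illustrate the shape of the API):

import warnings

class Editor:
    # Environment-specific; a no-op in interface-less environments.
    def open(self, font):
        pass  # each implementation decides what "opening" a font means

class Font:
    def __init__(self, path=None, showInterface=None):
        self.path = path
        if showInterface is not None:
            warnings.warn(
                "showInterface is deprecated; use Editor().open(font) instead",
                DeprecationWarning,
                stacklevel=2,
            )
            if showInterface:
                Editor().open(self)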
Thoughts?
My two cents:
Most folks writing scripts will have a much easier time with:
Font('foo.ufo', showInterface=True)
I see the appeal of
Editor().open(Font('foo.ufo'))
on a higher level, but would not want the first to go away. This is based on my experience teaching typeface designers to write code: they are not out to be the world's best programmers, they just want to easily understand something and get on with it.
I guess it comes down somewhat to the philosophy of the library… Keeping both feels like it violates the Pythonic tenet of one right way, so I suppose this could be something that just can't change.
Hm, I kinda like the general idea. I think there are some gnarly backwards compatibility and environment things that we need to think through. For example, not all environments have an interface and not all environments have a no-interface way of opening a font. OpenFont("foo.ufo") in FontLab and RoboFont will show an interface. The same thing in NoneLab will not.
Let me think on it...
I was looking through the OpenFont and friends after I opened this issue. For me these have always felt like convenience functions wrapping the mid-level interfaces, to mask some of the underlying implementation details. Perhaps it's worth considering what the primary interface to the library is, and which APIs are changeable, even if this proposal goes too far in introducing new functionality.
In the case of NoneLab, Editor#open could simply be a no-op.
I also don't want to imply that Editor is the most appropriate name for this, nor Application. I can think of several situations where one or the other may not be the appropriate term for the context. Interface could be general enough, or at least sits at the opposite end of the spectrum from Editor.
I would agree with @benkiel. One thing to consider is that these calls are part of almost all robofab centric scripts. It would break whatever portability they have. "Clean break" is expensive.
You could kind of use the same reasoning against the creation of this library entirely. I guess my question is still, if portability is a goal, then which interfaces are changeable and which ones are frozen? Is there a priority among interfaces? One of the stated goals is to remove cruft, and I think that, arguably, this keyword arg is a wart, if not at least crufty.
The second question that I have would be, if these changes are not possible—or not a priority—now, is it possible they could be in the future? Is there a plan to make a long-term roadmap possible? I think that having this library versioned and available on PyPI would go a long way toward being able to say "now that this is v2, here are the breaking changes" and offer deprecation warnings up to that point.
I guess my question is still, if portability is a goal, then which interfaces are changeable and which ones are frozen?
There isn't a hard and fast rule. It's more of an intuitive, "How many scripts do we think this will break?" vs. "Ugh, I can't live with this mistake from 2003 anymore so everyone can just deal with the break." kind of thing.
The second question that I have would be, if these changes are not possible—or not a priority—now, is it possible they could be in the future?
Yes. Right now the focus is on trying to get reasonable compatibility with RoboFab so we can stop answering RoboFab tech support questions. After that, we're going to need to bring the API up to date with new stuff.
All that said, we need to make sure that we keep in mind that this is supposed to be a simple scripting API for non-programmers. I've been struggling with that from time to time as evidenced by the looks I got when I proposed a couple of weird things at Typographics. 😕
I've been thinking about an Editor object a bit more. It could be useful for abstracting some of the other common interface interactions that we have needed. For example, robofab.interface.all.dialogs. <bigboldtype>This was a huge headache in RoboFab so I'm sweating a bit even bringing it up.</bigboldtype> But, I know how valuable being able to have a common API for triggering extremely simple user interactions from my experiences writing scripts that had to work equally well in RoboFont and Windows FontLab. Something like this:
app = Application()
path = app.showGetFile(...)
go = app.askYesNo(...)
# etc.
<bigboldtype>FontParts would not implement any of this. Ever. At all. It would only define the API.</bigboldtype> Still, I'm shaking a bit just thinking about it. It's a short road from here to <insert horrible "universal" windowing API name here>.
We've put UI things into fontParts.ui, this is explicitly for an implementation of fontParts to handle, fontParts is not going to make ui components. These are simple, for common things that a scripter would likely want, anything fancier can be done with something else (dialogKit, vanilla, etc). As such, I'm going to close this for now, but it may be worth coming back to the idea of an Editor object at some point.
|
gharchive/issue
| 2016-06-21T19:33:28 |
2025-04-01T06:40:15.189828
|
{
"authors": [
"LettError",
"benkiel",
"jackjennings",
"typesupply"
],
"repo": "robofab-developers/fontParts",
"url": "https://github.com/robofab-developers/fontParts/issues/40",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
151790041
|
Finished main development of the DVL drivers
I also moved the drivers directory up a level like was in the refactor notes.
Also need to work on the actual build integration; at the moment the makefile just spits things into the bin directories next to the source, which probably isn't the best idea.
|
gharchive/pull-request
| 2016-04-29T04:15:51 |
2025-04-01T06:40:15.239326
|
{
"authors": [
"gharris1727"
],
"repo": "robotics-at-maryland/qubo",
"url": "https://github.com/robotics-at-maryland/qubo/pull/27",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
650987842
|
Path intersections between (x,y, x,y)
Hi,
The Glyphs API has intersectionBetweenPoints and while looking for a UFO equivalent, I found robofab.path.intersect; I was wondering if this feature is missing from fontParts or if it was intentionally left out as out-of-scope? Or maybe it is implemented with a different name and I just missed it?
A lot of half-working APIs were removed in the transition between robofab and fontParts: easier to maintain, smaller codebase, ... . The path module was not ported over to fontParts.
booleanOperations has getIntersections, which takes a list of contours; see https://github.com/typemytype/booleanOperations
from booleanOperations import getIntersections
f = CurrentFont()
g1 = f["A"]
g2 = f["B"]
result = getIntersections(g1.contours + g2.contours)
print(result)
good luck!
Hi @typemytype - thank you, but I think I did not do a great job wording my question - or possibly I misunderstood robofab.path.intersect - but this is not the kind of intersection I am looking for. Rather than the intersection of glyphs themselves, I'm trying to run a straight line across a glyph from one arbitrary point (x,y) to another arbitrary point (x,y) - pretty much the measurement or slice tool, but without creating new points, and without a GUI
Hopefully this demonstrates the values I am looking for in such an API - imagine I were to run some imaginary code like getLineIntersection((-1000,200), (1000,200)) and would get something like {51,200}, {263, 200}, {446, 200}, {658, 200} as an output.
I would love to know what you recommend! (And if it is something that might make sense in fontParts, I would be happy to try and contribute)
if you are in a RoboFont world then you can use IntersectGlyphWithLine
from mojo.tools import IntersectGlyphWithLine
result = IntersectGlyphWithLine(CurrentGlyph(), ((startX, startY), (endX, endY)))
print(result)
see https://robofont.com/documentation/building-tools/api/mojo/mojo-tools/#mojo.tools.IntersectGlyphWithLine
good luck!
Something along these lines could work in fontShell on the command line:
from booleanOperations import getIntersections
from fontParts.fontshell.glyph import RGlyph

def getLine(sx, sy, ex, ey):
    line = RGlyph()
    line.width = ey
    pen = line.getPen()
    pen.moveTo((sx, sy))
    pen.lineTo((ex, ey))
    pen.closePath()
    return line.contours

def getLineIntersection(glyph, sx, sy, ex, ey):
    return getIntersections(getLine(sx, sy, ex, ey) + glyph.contours)
I've tested a bit, didn't quite get it working, but that's the idea
Thanks again for the quick response! That's exactly the functionality I am looking for, but outside of RoboFont - hopefully I would just be able to point a Python script to a UFO file
@jpt It's not in fontParts, as @typemytype said; it didn't make the move over from roboFab. Hopefully the bit above helps set you down the path to do it with a script.
Thank you @benkiel - my previous reply happened at the same time as yours; I see now how the two libraries can work together with RGlyph as the basis -- I'll get experimenting! Closing the issue, thanks again
@jpt note that that code doesn't pull in a font, but you can do so with an import of RFont.
there’s also MarginPen in fontPens, but it only works for vertical or horizontal lines — see this example
Ah, that's likely easier, @jpt
if you draw it first into a transformPen with the correct angle, then it also works for arbitrary lines!!
and done:
from fontPens.marginPen import MarginPen
from fontTools.pens.transformPen import TransformPen
from fontTools.misc.transform import Transform
import math

def intersectGlyphWithLine(glyph, line):
    # expand the line into separate variables
    (startx, starty), (endx, endy) = line
    # calculate the diff
    yDiff = endy - starty
    xDiff = endx - startx
    # use atan2 to get the angle of the line
    angle = math.atan2(yDiff, xDiff)
    # create a marginPen, starting at starty
    pen = MarginPen(glyph.font, starty, isHorizontal=True)
    # create an empty transform matrix
    matrix = Transform()
    # move the starting point to the origin
    matrix = matrix.translate(startx, starty)
    # rotate the matrix around the origin
    matrix = matrix.rotate(-angle)
    # translate back
    matrix = matrix.translate(-startx, -starty)
    # draw into the margin pen through a transform pen
    glyph.draw(TransformPen(pen, matrix))
    # get the inverse matrix
    inverseMatrix = matrix.inverse()
    # create an empty list of intersections
    intersections = []
    # calculate the min and max of the x values
    minx = min((startx, endx))
    maxx = max((startx, endx))
    # get all margin x values
    for x in pen.getAll():
        # map the point on the straightened line back with the inverse matrix
        x, y = inverseMatrix.transformPoint((x, starty))
        # if the point is between min and max, it's a point on the line
        if minx < x < maxx:
            # add it to the intersections
            intersections.append((x, y))
    return intersections

g = CurrentGlyph()
startx, starty = 94, 356
endx, endy = 60, 142
results = intersectGlyphWithLine(g, ((startx, starty), (endx, endy)))
print(results)
Perhaps we should add this to fontTools?
wow, I can't overstate my thanks for this @gferreira @typemytype @benkiel - it solves the problem, but it's also so generous and educational. I can have trouble locating the right library for this or that, so I appreciate the demonstration of interoperability between them!
|
gharchive/issue
| 2020-07-05T02:30:48 |
2025-04-01T06:40:15.261497
|
{
"authors": [
"benkiel",
"gferreira",
"jpt",
"typemytype"
],
"repo": "robotools/fontParts",
"url": "https://github.com/robotools/fontParts/issues/534",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2724399014
|
Add trevorhauter/gitportal.nvim
Repo URL:
https://github.com/trevorhauter/gitportal.nvim
Checklist:
[x] The plugin is specifically built for Neovim, or if it's a colorscheme, it supports treesitter syntax.
[x] The lines end with a `.`. This is to conform to awesome-list linting and requirements.
[x] The title of the pull request is Add/Update/Remove `username/repo` (notice the backticks around `username/repo`) when adding a new plugin.
[x] The description doesn't mention that it's a Neovim plugin, it's obvious from the rest of the document. No mentions of the word `plugin` unless it's related to something else. No `.` for Neovim.
[x] The description doesn't contain emojis.
[x] Neovim is spelled as Neovim (not nvim, NeoVim or neovim), Vim is spelled as Vim (capitalized), Lua is spelled as Lua (capitalized), Tree-sitter is spelled as Tree-sitter.
[x] Acronyms should be fully capitalized, for example LSP, TS, YAML, etc.
Thanks for the PR!
|
gharchive/pull-request
| 2024-12-07T05:59:46 |
2025-04-01T06:40:15.289791
|
{
"authors": [
"trevorhauter",
"tversteeg"
],
"repo": "rockerBOO/awesome-neovim",
"url": "https://github.com/rockerBOO/awesome-neovim/pull/1382",
"license": "CC0-1.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1862071271
|
Dispute text for GMC operations is not totally clear
There are 3 bullets like this
Anyone MAY file an RPIP disputing a grant, bounty, or retrospective award within two weeks of the announcement of recipients. Such an RPIP SHALL be subject to a snapshot vote.
If a majority of the GMC agrees that a grant recipient is failing to provide the specified services to the protocol in a timely manner (as documented in the original application and in subsequent monthly updates), the GMC SHALL publicly announce such a decision and cease any future payments. This decision MAY be disputed by anyone through the creation of an RPIP within two weeks of the GMC’s announcement. The RPIP SHALL be subject to a snapshot vote.
Any group or individual MAY submit a publicly-available document to the GMC claiming successful completion of the bounty. The GMC SHALL discuss all such applications. If a majority of the GMC agrees then the GMC SHALL announce the award of the bounty. Anyone MAY dispute the awarding of the bounty through the creation of an RPIP within two weeks of the GMC’s announcement. The RPIP SHALL be subject to a snapshot vote.
The term "creation an RPIP", is not as clear as it should be.
In discussion, https://discord.com/channels/405159462932971535/1143603646592987237/1143616540600717332, we thought it should be a merged Draft that was actively being pursued. This could be measured by having a Draft RPIP and an open sentiment poll.
Niche
|
gharchive/issue
| 2023-08-22T19:28:34 |
2025-04-01T06:40:15.293170
|
{
"authors": [
"Nichebiche",
"Valdorff"
],
"repo": "rocket-pool/RPIPs",
"url": "https://github.com/rocket-pool/RPIPs/issues/66",
"license": "CC0-1.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
655250032
|
Enable usage of IOStreams::Pgp with keys that don't have an email.
Description of changes
Allow usage of keys without an email. Also allow key_id to be used for the delete_keys and trust methods.
By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
Excellent work, impressive tests. :100:
Thank you for adding support for using the key_id. Had not encountered keys without an email address before. This enhancement will help anyone that runs into this scenario.
Changes are now available in gem: iostreams v1.3.0
|
gharchive/pull-request
| 2020-07-11T17:44:49 |
2025-04-01T06:40:15.295248
|
{
"authors": [
"andresfcamacho",
"reidmorrison"
],
"repo": "rocketjob/iostreams",
"url": "https://github.com/rocketjob/iostreams/pull/10",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
851339033
|
Can't disable started, processing and rendered logs when using log_level debug
Environment
Ruby Version: 2.6.6
Rails Version: 6.0.3.2
Semantic Logger Version: 4.7.4
Rails Semantic Logger Version: 4.5.0
Puma: 4.3.6
Rails configuration:
config.log_level = :debug
STDOUT.sync = true
config.rails_semantic_logger.rendered = false
config.rails_semantic_logger.add_file_appender = false
config.semantic_logger.add_appender(io: STDOUT, level: config.log_level, formatter: config.rails_semantic_logger.format)
Expected Behavior
Using the above configuration I should not see the ActionView rendered output, but I am.
Actual Behavior
I am still seeing the rendered logs being output; the same goes for the started and processing logs (these are disabled by default as well).
It seems when I set the log_level to :info the started, processing and rendered config settings do take effect.
I believe this is the offending line for the "Processing" log,
https://github.com/rocketjob/rails_semantic_logger/blob/47112b2c9effe7ab72f4f99d46875ed8d67d0965/lib/rails_semantic_logger/action_controller/log_subscriber.rb#L8
which has a comment: # Log as debug to hide Processing messages in production
As for the other "started" and "rendered" logs, you can set their log levels using:
RailsSemanticLogger::Rack::Logger.started_request_log_level = :trace
RailsSemanticLogger::ActionView::LogSubscriber.rendered_log_level = :trace
@andrew-newell Thanks! In the end I chose to go with another gem because this didn't feel like the best solution for our problem.
|
gharchive/issue
| 2021-04-06T11:40:36 |
2025-04-01T06:40:15.300396
|
{
"authors": [
"andrew-newell",
"yourivdlans"
],
"repo": "rocketjob/rails_semantic_logger",
"url": "https://github.com/rocketjob/rails_semantic_logger/issues/125",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|