id (string, 4–10 chars) | text (string, 4–2.14M chars) | source (2 classes) | created (timestamp[s], 2001-05-16 21:05:09 to 2025-01-01 03:38:30) | added (timestamp, 2025-04-01 04:05:38 to 2025-04-01 07:14:06) | metadata (dict)
---|---|---|---|---|---
1455276862
|
feat: AS-Prepends added to direct link
PR summary
Added Gateway AS Prepends endpoints to the SDK:
List AS prepends
Replace existing AS prepends
Fixes:
PR Checklist
Please make sure that your PR fulfills the following requirements:
[x] The commit message follows the Angular Commit Message Guidelines.
[x] Tests for the changes have been added (for bug fixes / features)
[ ] Docs have been added / updated (for bug fixes / features)
PR Type
[ ] Bugfix
[ ] Feature
[ ] Code style update (formatting, local variables)
[ ] Refactoring (no functional changes, no api changes)
[ ] New tests
[ ] Build/CI related changes
[ ] Documentation content changes
[ ] Other (please describe)
What is the current behavior?
What is the new behavior?
Does this PR introduce a breaking change?
[ ] Yes
[x] No
Other information
:tada: This PR is included in version 0.20.0 :tada:
The release is available on GitHub release
Your semantic-release bot :package::rocket:
|
gharchive/pull-request
| 2022-11-18T14:55:56 |
2025-04-01T04:55:10.814746
|
{
"authors": [
"MadhavanPillai",
"MalarvizhiK"
],
"repo": "IBM/networking-python-sdk",
"url": "https://github.com/IBM/networking-python-sdk/pull/69",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
304986351
|
New Data Science Experience accounts do not include object storage or Spark services
The "Create the Project" step within "Create the notebook" needs additional steps for adding a storage instance and a Spark instance, because the current flow when creating a DSX account no longer creates the DSX-ObjectStorage and DSX-Spark services in the user's cloud account.
This has all changed with the new Watson Studio service; see the latest readme.
|
gharchive/issue
| 2018-03-14T00:53:22 |
2025-04-01T04:55:10.816196
|
{
"authors": [
"stevemart",
"timroster"
],
"repo": "IBM/pixiedust-traffic-analysis",
"url": "https://github.com/IBM/pixiedust-traffic-analysis/issues/37",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
212207334
|
PEP 8 naming
I'm very surprised to see a new Python API which doesn't follow PEP 8 naming conventions with, for example, runExperiment instead of run_experiment.
Yes, you are right @rob-smallshire , we will adapt the next version to PEP 8 :). Sorry for the inconvenience.
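The fix amounts to renaming the public API to snake_case. A minimal sketch of the migration pattern (the function body here is illustrative, not the SDK's actual code):

```python
def run_experiment():
    """PEP 8: functions and methods use snake_case names."""
    return "ok"

# Deprecated camelCase alias, kept only so existing callers keep working.
runExperiment = run_experiment
```

Keeping the old name as an alias lets the next version adopt PEP 8 without immediately breaking callers of runExperiment.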
|
gharchive/issue
| 2017-03-06T18:22:15 |
2025-04-01T04:55:10.817940
|
{
"authors": [
"pacomf",
"rob-smallshire"
],
"repo": "IBM/qiskit-api-py",
"url": "https://github.com/IBM/qiskit-api-py/issues/2",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
755269133
|
add support for httpfs
Signed-off-by: CDORON@il.ibm.com cdoron@lnx-arrow.sl.cloud9.ibm.com
Hi Roee,
I accidentally found a way to support httpfs.
Please review and check whether we want this support in the-mesh-for-data-flight-module.
Thanks!
httpfs seems like a one-person project with zero adoption. I'm not saying we can't use it, but we need a requirement here, and I'm not sure that we have one.
Also, technically, can you explain how this works? I don't see an implementation of any of the FSSpecHandler methods in HttpFs or fs.base.FS.
Looking around it seems like pyarrow is compatible with fsspec. See http://arrow.apache.org/docs/python/filesystems.html#using-fsspec-compatible-filesystems.
fsspec has an HTTPFileSystem implementation so you can use it instead (note that it also requires the aiohttp package):
from fsspec.implementations.http import HTTPFileSystem
I tried working with the fsspec HTTPFileSystem implementation.
I got the following error:
Traceback (most recent call last):
File "sample/sample.py", line 39, in <module>
main(args.port, args.repeat)
File "sample/sample.py", line 22, in main
info = client.get_flight_info(
File "pyarrow/_flight.pyx", line 1237, in pyarrow._flight.FlightClient.get_flight_info
File "pyarrow/_flight.pyx", line 80, in pyarrow._flight.check_flight_status
File "pyarrow/error.pxi", line 84, in pyarrow.lib.check_status
pyarrow.lib.ArrowInvalid: 'utf-8' codec can't decode byte 0x86 in position 38: invalid start byte. Detail: Python exception: UnicodeDecodeError
It appears that the fsspec HTTPFileSystem does not play well with the Parquet format. I could only get it to work when I applied the following patch to fsspec/implementations/http.py:
https://github.com/intake/intake-parquet/issues/18#issuecomment-687394384
|
gharchive/pull-request
| 2020-12-02T13:21:18 |
2025-04-01T04:55:10.822743
|
{
"authors": [
"cdoron",
"roee88"
],
"repo": "IBM/the-mesh-for-data-flight-module",
"url": "https://github.com/IBM/the-mesh-for-data-flight-module/pull/27",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2382114835
|
japanese_llama system prompt name typo
unitxt/prepare/system_prompts/models/japanese_llama.py has a typo in its ID string (line 10):
Current: "system_prompt.models.japanese_llama"
Should be: "system_prompts.models.japanese_llama"
Fixed in PR #1056
|
gharchive/issue
| 2024-06-30T06:47:14 |
2025-04-01T04:55:10.824872
|
{
"authors": [
"bnayahu"
],
"repo": "IBM/unitxt",
"url": "https://github.com/IBM/unitxt/issues/964",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2681550255
|
Check if unitxt_pkg is None
When using HuggingFace APIs to load datasets and metrics, the unitxt_pkg is None. Check it before accessing its properties. The current code breaks the installation approach documented here: https://www.unitxt.ai/en/latest/docs/lm_eval.html#installation.
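A minimal sketch of the guard this PR describes; `version` is a hypothetical attribute standing in for whatever property the real code reads from the package object:

```python
# unitxt_pkg ends up as None when datasets/metrics are loaded through
# the HuggingFace APIs rather than from a local unitxt installation.
unitxt_pkg = None

# Guard before touching properties ("version" is a hypothetical attribute):
version = unitxt_pkg.version if unitxt_pkg is not None else None
```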
Thank you @yhwang.
TrustyAI's LMEval is being affected by this https://github.com/trustyai-explainability/trustyai-service-operator/issues/367
Thanks @yhwang . The test fails because the submission was from a fork and not branch, so I copied it to a branch in #1387 and merged.
Thanks!
|
gharchive/pull-request
| 2024-11-22T02:27:06 |
2025-04-01T04:55:10.827424
|
{
"authors": [
"ruivieira",
"yhwang",
"yoavkatz"
],
"repo": "IBM/unitxt",
"url": "https://github.com/IBM/unitxt/pull/1386",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
232462115
|
Home page title.
The initial home page announces
Repository for computational datasets generated on the Imperial College High Performance Computing Service resources.
General practice with this repository has resulted in a lot of instrumental data being deposited, data which was NOT created using the HPC resources.
Can a more general title that reflects this be considered?
That text is now changed to remove reference to HPC. Better page content will follow as part of overarching RCS web improvements
|
gharchive/issue
| 2017-05-31T06:02:40 |
2025-04-01T04:55:10.842037
|
{
"authors": [
"hrzepa",
"mj-harvey"
],
"repo": "ICHPC/hpc-repo",
"url": "https://github.com/ICHPC/hpc-repo/issues/28",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
2085225666
|
Chore: update cors origins
Update list of domains supported in cross-origin requests.
Default to allowing all origins: providing a specific list of just the known domains (localhost:4200, *.vercel.app, *.picsa.app) would still be very permissive, so restricting origins is unlikely to achieve much.
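The reasoning can be illustrated with a small stdlib sketch using the patterns from the list above (`origin_allowed` is a hypothetical helper, not the API's actual code):

```python
from fnmatch import fnmatch

# The "specific" allow-list from the PR description; two of the three
# entries are wildcards, which is why it is still very permissive.
ALLOWED_ORIGINS = ["localhost:4200", "*.vercel.app", "*.picsa.app"]

def origin_allowed(host: str) -> bool:
    # fnmatch gives shell-style wildcard matching (* matches any run).
    return any(fnmatch(host, pattern) for pattern in ALLOWED_ORIGINS)
```

Since any *.vercel.app preview deployment already matches, the list blocks very little, hence the decision to allow all origins.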
accidentally built on #21, rebase onto #23
|
gharchive/pull-request
| 2024-01-17T01:25:56 |
2025-04-01T04:55:10.910797
|
{
"authors": [
"chrismclarke"
],
"repo": "IDEMSInternational/epicsa-climate-api",
"url": "https://github.com/IDEMSInternational/epicsa-climate-api/pull/22",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
451199194
|
Dataset download problem
Hello, could the model and dataset for CBDNet-tensorflow be uploaded some other way? I simply cannot open the download link (after signing in with my Microsoft account it says "Sorry, cyl964156631@hotmail.com (my Microsoft account) cannot be found in the zjueducn-my.sharepoint.com directory"). Many thanks!
I suggest you try again.
When I have some free time I will upload it to another file host (probably Baidu and MEGA), within a few days.
OK, thank you very much, I will try again.
Have you uploaded the dataset.zip to another platform yet? @IDKiro
I keep failing to download through your "dataset.zip" link.
I'm sorry, I deleted the extra link since the model changed slightly and forgot to re-add it. Now you can see the other download link.
Much thanks for your quick reply. ^_^ @IDKiro
|
gharchive/issue
| 2019-06-02T14:41:32 |
2025-04-01T04:55:10.931171
|
{
"authors": [
"CYL0089",
"IDKiro",
"xiaozhi2015"
],
"repo": "IDKiro/CBDNet-tensorflow",
"url": "https://github.com/IDKiro/CBDNet-tensorflow/issues/4",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
226242314
|
java.lang.ArrayIndexOutOfBoundsException: 4096
I'm getting the following error during ePub validation (using v4.0.2):
$: java -jar epubcheck.jar Selbststandig\ machen\ -\ sevDesk\ Lexikon.epub
Verwendung der EPUB 2.0.1 Prüfungen [Using the EPUB 2.0.1 checks]
java.lang.ArrayIndexOutOfBoundsException: 4096
at com.sun.imageio.plugins.gif.GIFImageReader.read(GIFImageReader.java:984)
at javax.imageio.ImageIO.read(ImageIO.java:1448)
at javax.imageio.ImageIO.read(ImageIO.java:1308)
at com.adobe.epubcheck.bitmap.BitmapChecker.getImageSizes(BitmapChecker.java:145)
at com.adobe.epubcheck.bitmap.BitmapChecker.checkImageDimensions(BitmapChecker.java:269)
at com.adobe.epubcheck.bitmap.BitmapChecker.runChecks(BitmapChecker.java:321)
at com.adobe.epubcheck.opf.OPFChecker.checkItemContent(OPFChecker.java:423)
at com.adobe.epubcheck.opf.OPFChecker.runChecks(OPFChecker.java:157)
at com.adobe.epubcheck.ocf.OCFChecker.runChecks(OCFChecker.java:300)
at com.adobe.epubcheck.api.EpubCheck.doValidate(EpubCheck.java:215)
at com.adobe.epubcheck.tool.EpubChecker.validateFile(EpubChecker.java:161)
at com.adobe.epubcheck.tool.EpubChecker.processFile(EpubChecker.java:495)
at com.adobe.epubcheck.tool.EpubChecker.run(EpubChecker.java:255)
at com.adobe.epubcheck.tool.Checker.main(Checker.java:31)
EpubCheck abgeschlossen [EpubCheck finished]
This seems to happen when checking GIF files.
Can you narrow down the problem to a single GIF image and attach this file or a demo epub file containing this GIF?
Still waiting for feedback...
Can you narrow down the problem to a single GIF image and attach this file or a demo epub file containing this GIF?
Hi tofi86,
You can check attachment file which also produces the same error.
Thanks for the demo data, we'll take a look!
@tofi86 sorry for being unresponsive - I was not able to reproduce the issue. @minhductin2 thanks for the demo data!
I can confirm that the error can be reproduced with the above animated GIF.
However, when reading this StackOverflow question I have the impression that there isn't an easy patch for this besides switching the ImageIO GIFImageReader.
|
gharchive/issue
| 2017-05-04T10:47:08 |
2025-04-01T04:55:10.935362
|
{
"authors": [
"marbetschar",
"minhductin2",
"tofi86"
],
"repo": "IDPF/epubcheck",
"url": "https://github.com/IDPF/epubcheck/issues/756",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
2458227739
|
🛑 IDR (well 469267) is down
In e09a7d5, IDR (well 469267) (https://idr.openmicroscopy.org/webclient/?show=well-469267) was down:
HTTP code: 502
Response time: 219 ms
Resolved: IDR (well 469267) is back up in 7f3c8cb after 26 minutes.
|
gharchive/issue
| 2024-08-09T15:59:34 |
2025-04-01T04:55:10.938303
|
{
"authors": [
"snoopycrimecop"
],
"repo": "IDR/upptime",
"url": "https://github.com/IDR/upptime/issues/3643",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1655260135
|
Add all the team
We have a team page with a small part of the team there. It would be awesome if we have everyone added to it.
I am interested in doing it! Can I do this?
Yeah, okay, I'll let you know.
The component part still needs to be made; I have hardcoded the rest of the team.
|
gharchive/issue
| 2023-04-05T09:26:52 |
2025-04-01T04:55:10.944356
|
{
"authors": [
"MrGKanev",
"atilapan",
"seabeePraveen"
],
"repo": "IEEE-Student-Branch-of-Uni-Ruse/website",
"url": "https://github.com/IEEE-Student-Branch-of-Uni-Ruse/website/issues/4",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1326624986
|
about checkpoint file
Hello, would you mind sharing the trained weights?
https://github.com/IIGROUP/MANIQA/releases/tag/PIPAL22-VALID-CKPT
Hello, during training the validation loss keeps decreasing, yet only the first epoch gives the best result. Why does this happen?
I'm not sure what you mean. Could you post a screenshot?
It's the eval_epoch results: the SRCC and PLCC values keep decreasing as the epochs increase.
I haven't trained it myself; I'd expect the pretrained weights to meet most needs. Did you train on the public training set he mentions? From the readme it seems to train for only one epoch, and I don't know why either.
I trained on the PIPAL dataset, and increasing the batch_size makes no difference. I don't think it says to train for only one epoch.
Hello, the PIPAL dataset is prone to overfitting. Our experience from the competition is that the learning rate is inversely related to the number of training epochs: when the epoch count is too large, a large performance drop appears, caused by overfitting. So for training we chose a learning rate that guarantees the best result after training just one epoch.
|
gharchive/issue
| 2022-08-03T03:19:41 |
2025-04-01T04:55:11.051990
|
{
"authors": [
"9vivian88",
"Stephen0808",
"monument-and-sea-all-the-gift"
],
"repo": "IIGROUP/MANIQA",
"url": "https://github.com/IIGROUP/MANIQA/issues/14",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1302185093
|
Formatting Update
formatting update
removed renderText from server and replaced with headers
added comments for readability
Thank you! Looks great
|
gharchive/pull-request
| 2022-07-12T14:57:44 |
2025-04-01T04:55:11.107431
|
{
"authors": [
"jazzy-y",
"yv-h"
],
"repo": "INDIA-CHARLIE-ECHO-MAN/ODC",
"url": "https://github.com/INDIA-CHARLIE-ECHO-MAN/ODC/pull/20",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2328629229
|
🛑 IOGT CLOUD is down
In e1d036a, IOGT CLOUD ($CLOUD_SERVER) was down:
HTTP code: 0
Response time: 0 ms
Resolved: IOGT CLOUD is back up in 0a4c4a1 after 20 minutes.
|
gharchive/issue
| 2024-05-31T21:45:56 |
2025-04-01T04:55:11.116557
|
{
"authors": [
"aalonzolu"
],
"repo": "IOGT/upptime",
"url": "https://github.com/IOGT/upptime/issues/851",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1327716703
|
🛑 IOGT CLOUD is down
In 4c1be12, IOGT CLOUD ($CLOUD_SERVER) was down:
HTTP code: 523
Response time: 3276 ms
Resolved: IOGT CLOUD is back up in d6152c7.
|
gharchive/issue
| 2022-08-03T19:53:47 |
2025-04-01T04:55:11.118594
|
{
"authors": [
"aalonzolu"
],
"repo": "IOGT/upptime",
"url": "https://github.com/IOGT/upptime/issues/90",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1346538736
|
Commenting out CONFIG_KERNEL_NONE = 1 might be switching off the interrupts.
I'm working on a small W806 app (https://github.com/nyh-workshop/w806-i2s) and trying to put a FreeRTOS inside.
From the readme, I need to comment out the "CONFIG_KERNEL_NONE = 1" or else it won't work.
However when I comment out this setting, most of the interrupts are not being called anymore, even the DMA ones.
Someone is also discussing this here: http://ask.winnermicro.com/question/269.html. There is a possible answer at the bottom of that thread; however, my Mandarin is limited and the translation does not seem to be working well today.
It could mean removing the macros, but I'm not sure which ones to remove in these files.
I tried the changes in the discussion,
Comment out CONFIG_KERNEL_NONE = 1 in csi_config.h
Edit function SystemInit() in system.c, change it to
void SystemInit(void)
{
    /* Point the vector base register at the interrupt vector table */
    __set_VBR((uint32_t) & (irq_vectors));
    /* Set the IAE bit in the CHR register */
    __set_CHR(__get_CHR() | CHR_IAE_Msk);
    /* Clear active and pending IRQs */
    VIC->IABR[0] = 0x0;
    VIC->ICPR[0] = 0xFFFFFFFF;
    /* Globally enable exceptions and interrupts */
    __enable_excp_irq();
}
Then the timer interrupts will work.
I am not quite sure about the side effects... I haven't played much with the W806 recently. Sure, the macros will be updated when there is more feedback on this change.
Hello, thanks for the assistance. I have written another piece of example code with this change and so far it is running well. :)
It's really cool~ I will try it when I get a pcm5102.
I played it using a max98357. The music reminds me of the old games, beautiful.
|
gharchive/issue
| 2022-08-22T14:45:25 |
2025-04-01T04:55:11.126949
|
{
"authors": [
"IOsetting",
"nyh-workshop"
],
"repo": "IOsetting/wm-sdk-w806",
"url": "https://github.com/IOsetting/wm-sdk-w806/issues/25",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2434022288
|
🛑 Main is down
In 85334c5, Main (https://ipmgroupuk.com) was down:
HTTP code: 403
Response time: 693 ms
Resolved: Main is back up in d465ede after 9 minutes.
|
gharchive/issue
| 2024-07-28T16:48:51 |
2025-04-01T04:55:11.129557
|
{
"authors": [
"GarethWright"
],
"repo": "IPMGroupLtd/Uptime",
"url": "https://github.com/IPMGroupLtd/Uptime/issues/634",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1718532176
|
Feature/swagger integration
@bellaabdelouahab
I have integrated Swagger docs and added Swagger documentation for the user route. The documentation will be available at the "/docs" route, and you can retrieve it in JSON format at "/docs-json".
Working with YAML data can be a real pain😭😭😭
Thank you, great job! I'm surprised at how much time that took you.
I'm so glad you liked it!
By the way, while documenting your users route I noticed a security vulnerability: for example, in the signup route, 'role' is set by the client side.
Can I fix that?
@bellaabdelouahab
Sure you can, as long as you provide details about what you do, and I'll gladly review them.
When more people start working on this project, would it be possible for me to join as well? It would be great to contribute and learn from others. Plus, I can include this collaboration experience in my resume.
|
gharchive/pull-request
| 2023-05-21T14:53:41 |
2025-04-01T04:55:11.250597
|
{
"authors": [
"bellaabdelouahab",
"muttaqin1"
],
"repo": "ISIL-ESTE/Student-Workflow-Organizer",
"url": "https://github.com/ISIL-ESTE/Student-Workflow-Organizer/pull/6",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1891849284
|
[BE] Add pino as a logger
[ ] Define metrics to be collected for backend events.
[ ] Setup production and development pino log levels
[ ] Store the logs in a .log file, that is ignored on local.
Related: #392
Updating all the controllers and their tests so that errors pass through errorMiddleware.
|
gharchive/issue
| 2023-09-12T07:23:51 |
2025-04-01T04:55:11.268847
|
{
"authors": [
"SergiFont",
"kevinmamaqi"
],
"repo": "IT-Academy-BCN/ita-wiki",
"url": "https://github.com/IT-Academy-BCN/ita-wiki/issues/467",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1140894253
|
🛑 red-circule.com is down
In 64f9cfb, red-circule.com (https://red-circule.com) was down:
HTTP code: 0
Response time: 0 ms
Resolved: red-circule.com is back up in a066531.
|
gharchive/issue
| 2022-02-17T05:27:39 |
2025-04-01T04:55:11.282283
|
{
"authors": [
"ya-erm"
],
"repo": "ITGlobal/upptime",
"url": "https://github.com/ITGlobal/upptime/issues/77",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
479481742
|
replace digit parsing with subtraction
You can avoid the parse machinery here because the function is examining one decimal digit at a time. Conveniently, subtracting the character code for '0' from the character code for another digit gives the value of that digit as an integer.
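In Python terms (the PR itself is Julia; this is just an illustration of the trick):

```python
def digit_value(ch: str) -> int:
    # '0'..'9' occupy contiguous code points, so subtracting ord('0')
    # from a digit's code point yields its integer value directly,
    # with no general-purpose parsing machinery involved.
    return ord(ch) - ord("0")
```

Processing a number one decimal digit at a time then becomes simple arithmetic rather than repeated string parsing.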
Codecov Report
Merging #76 into master will decrease coverage by <.01%.
The diff coverage is 100%.
@@ Coverage Diff @@
## master #76 +/- ##
==========================================
- Coverage 93.29% 93.29% -0.01%
==========================================
Files 26 26
Lines 2416 2415 -1
==========================================
- Hits 2254 2253 -1
Misses 162 162
Impacted Files | Coverage Δ
src/smallstring.jl | 91.66% <100%> (-0.23%) :arrow_down:
Continue to review full report at Codecov.
Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 790a6e3...973fb98. Read the comment docs.
|
gharchive/pull-request
| 2019-08-12T06:11:06 |
2025-04-01T04:55:11.322125
|
{
"authors": [
"JeffreySarnoff",
"codecov-io"
],
"repo": "ITensor/ITensors.jl",
"url": "https://github.com/ITensor/ITensors.jl/pull/76",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1948755184
|
fix: spelling typos
Pull Request: Typo Corrections in Portfolio Text
Description:
This PR addresses the following typo issues in the provided text as described in #11.
"experties" corrected to "expertise."
"Massege" corrected to "Message."
"GAT IN TOUCH" changed to "GET IN TOUCH."
"Frond-End" corrected to "Front-End."
Affected Files:
index.html
projects.html
services.html
about.html
pricing.html
Changes Made:
Fixed the specified typos in the affected files.
Issues:
Closes: Issue #11
This new PR was created because my initial PR did not follow the branch naming convention desired by the maintainer.
Thank you @beingsie.
Next time, wait until you get assigned to the respective issue.
Will do, thank you @Iamdivyak 😄
|
gharchive/pull-request
| 2023-10-18T03:55:46 |
2025-04-01T04:55:11.329644
|
{
"authors": [
"Iamdivyak",
"beingsie"
],
"repo": "Iamdivyak/portfolio",
"url": "https://github.com/Iamdivyak/portfolio/pull/13",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
278464743
|
skipped example. Why?
Why is the PorterDuff example for Chapter 11 not published?
The code that generated the example PorterDuff images from Chapter 11 is remarkably similar to what is built in much more detail in Chapter 12, so I didn't include both versions.
|
gharchive/issue
| 2017-12-01T13:23:02 |
2025-04-01T04:55:11.333566
|
{
"authors": [
"IanGClifton",
"evgen1977777"
],
"repo": "IanGClifton/auid2",
"url": "https://github.com/IanGClifton/auid2/issues/3",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
235717267
|
Pass extra params to auth code request
I would like to be able to pass some metadata/context when the Authorization Code is requested. I can do that with the extra params from the TokenClient. Thoughts on something like this for passing the params from the native client?
What's the use case for that?
We have a call center WPF app. Everything is done in the context of a customer. I want to have the customerId be a claim that is passed to services.
And how does this become a claim when sending it alongside the code request?
By overriding GetAccessTokenClaimsAsync and inspecting values in the request in the IClaimsService.
OK - makes sense.
But I think it needs to be implemented differently. The extraParameters is for the authorize request. We would need an additional one for extra parameters on the back-channel.
You could actually achieve the same today by providing a back-channel handler - just not as elegant.
Are you planning to use LoginAsync or rather the PrepareLogin/ProcessResponse approach?
I think it needs to be implemented differently
Sounds good.
You could actually achieve the same today by providing a back-channel handler - just not as elegant.
We are doing that, but as you said, it isn't very elegant and takes some explaining.
Are you planning to use LoginAsync or rather the PrepareLogin/ProcessResponse approach?
Right now we're using LoginAsync but could do either.
I am leaving for holidays now - When I am back I will think about the best way to enable that scenario.
Sorry - totally forgot about this...
better late than never
https://github.com/IdentityModel/IdentityModel.OidcClient2/commit/a8c92d5af580b2f3b17360a85726e55720c7a40f
|
gharchive/pull-request
| 2017-06-13T23:02:10 |
2025-04-01T04:55:11.386678
|
{
"authors": [
"joemcbride",
"leastprivilege"
],
"repo": "IdentityModel/IdentityModel.OidcClient2",
"url": "https://github.com/IdentityModel/IdentityModel.OidcClient2/pull/30",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
507759349
|
asp net identity UI error
I found an issue with the new samples, specifically in Quickstarts\1_ClientCredentials.
Now listening on: http://localhost:5000
Application started. Press Ctrl+C to shut down.
[11:15:54 Debug] IdentityServer4.Startup
Login Url: /Account/Login
[11:15:54 Debug] IdentityServer4.Startup
Login Return Url Parameter: ReturnUrl
[11:15:54 Debug] IdentityServer4.Startup
Logout Url: /Account/Logout
[11:15:54 Debug] IdentityServer4.Startup
ConsentUrl Url: /consent
[11:15:54 Debug] IdentityServer4.Startup
Consent Return Url Parameter: returnUrl
[11:15:54 Debug] IdentityServer4.Startup
Error Url: /home/error
[11:15:54 Debug] IdentityServer4.Startup
Error Id Parameter: errorId
I didn't find the appsettings.json file in IdentityServer for configuration.
I installed the ASP.NET Identity UI from the NuGet package and got the same problem.
How can we resolve it?
Thank you.
What exactly is the problem?
I want to use the ASP.NET Identity UI with this example, Quickstarts\1_ClientCredentials.
a) this does not make sense because QS1 is about machine to machine and not end-user authentication
b) even then - what's the problem?
Thank you for the answer.
|
gharchive/issue
| 2019-10-16T10:24:07 |
2025-04-01T04:55:11.394910
|
{
"authors": [
"amSaighi",
"leastprivilege"
],
"repo": "IdentityServer/IdentityServer4",
"url": "https://github.com/IdentityServer/IdentityServer4/issues/3740",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
141109043
|
Allow development to affect pops
https://forum.paradoxplaza.com/forum/index.php?threads/the-eu4-to-v2-converter-project.707362/page-147#post-19835132
https://forum.paradoxplaza.com/forum/index.php?threads/the-eu4-to-v2-converter-project.707362/page-147#post-19835355
https://forum.paradoxplaza.com/forum/index.php?threads/the-eu4-to-v2-converter-project.707362/page-154#post-19976429
https://forum.paradoxplaza.com/forum/index.php?threads/the-eu4-to-v2-converter-project.707362/page-154#post-19976602
https://forum.paradoxplaza.com/forum/index.php?threads/the-eu4-to-v2-converter-project.707362/page-159#post-20127741
https://forum.paradoxplaza.com/forum/index.php?threads/the-eu4-to-v2-converter-project.707362/page-159#post-20129497
https://forum.paradoxplaza.com/forum/index.php?threads/the-eu4-to-v2-converter-project.707362/page-159#post-20137258
What does this say?
1: "On a side note, have you considered reading Development to influence population? Possibly even read the individual components to turn out the pop distribution (so, high Manpower development would increase the number of Soldier pops)."
2 (direct reply): "Yes, though I still need to work out the details. I want to address some of the other tickets (and get another release out once someone helps with the province mappings) first, though."
3: "Considering that, I think we should have a discussion on how development should influence the population. I think we should get a bigger discussion going on how it should influence and what formula should be used to calculate it. I have thought about this a bit and have come to the conclusion I believe to be most prudent. We should first gather all the possible real-world data we have to compare, and then contrast that with the base province development that we have at those dates. It will be far from perfect, as most provinces' base development does not change with time, and I highly doubt Paradox has added in development changing with dates, but I haven't looked at the province files in a long time so I might be wrong.
Thus I propose that we take the full development of a province to be the modifier that is added to the calculation once we reach a reasonable one. Then we use base production to figure out the percentage of artisan and craftsmen population in a province , use the base tax to figure out the labourers, farmers and most of the rest of the middle class, assigning weighted modifiers to all of those mentioned previously and then use the base manpower to figure out the amount of soldier pops in a province.
Alternatively we could take the manpower amount given per each province and use that factoring in the manpower development level to figure out how much population does each province development level carry and then use weighted modifiers per level of development for each pop type.
Both of these ways need many more levels of refinement but they could be a basis for a more development level accurate province population. Could we get a discussion going on this, as it would be interesting to see what could be made of this."
4: Quoting an earlier post: "Oh, let me explain how it works.
The converter calculates the yearly production income, yearly tax income and yearly manpower of a province based on base tax, base production, base manpower, buildings and idea groups. It then adds them up for each prov with yearly manpower having a smaller scaling as income values dont reach values of 1000s, i would have to check how much exactly and then to that is added total development and building weight. Building weight is calculated by using cost of the buildings and adding additional weight for buildings that dont affect income, but are still of use in eu4 and building weight is not the biggest part of the weights of a prov, it has roughly same effect as dev with income and manpower having roughly 50% effect.
The weights work in that way that each prov has a certain weight after this and so provs will get pop from pie of total pop based on weight and this is how end formula looks for prov weight :
Weight = Yearly Prod Income + Yearly Tax Income + Province Manpower (scaled down) + 2*Development of a prov + Building Weight"
So that forms the basis for the province population's division into pops, correct? So most of what should be done is to have the population be influenced by province development, get a base number for the development and then utilise that for the weights to get more dynamic provinces, or am I wrong?
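The quoted weight formula can be sketched directly; the manpower scaling factor here is an assumption, since the post only says manpower is "scaled down" without giving the number:

```python
def province_weight(prod_income, tax_income, manpower, development,
                    building_weight, manpower_scale=0.1):
    # Weight = yearly production income + yearly tax income
    #        + province manpower (scaled down) + 2 * development
    #        + building weight
    # manpower_scale is a placeholder: the forum post does not state
    # the actual scaling factor the converter uses.
    return (prod_income + tax_income + manpower * manpower_scale
            + 2 * development + building_weight)
```

Each province's share of the total population pie would then be its weight divided by the sum of all province weights.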
5: Quoting: "Unfortunately I haven't really been playing much EU4 since these changes were released, so I'm not quite sure what the best approach is. I'll trust @dtremenak 's changes, though I'd be interested to hear his methodology :)"
Just to be clear, my changes don't take development into account at all. They just update the old building algorithm to work with the new building system instead of the old one. I'm not sure development is a good way to go for pop composition (the three categories don't seem to map very well to pop types), but I'd be happy to hear ideas if you have them.
6 (direct reply): "You could use a combination of trade goods, buildings and development to map that out. Development in ADM and DIP could determine the size of the pop (in a 30-70 or 40-60 split). Then you could pull a general proportion of each type from the country's technologies and ideas (e.g. innovative countries might have more priests, Quantity more troops, etc) as a flat percentage across the country, modified by the trade goods in a country (producing spices ups aristocrats and farmers, iron ups labourers, etc) and then their buildings (manufacturies produce a factory in the province and some relevant workers, universities up priests, administrators and capitalists, etc). You might even reflect terrain. In terms of cultures, you could pull from the province history - the proportion of new culture v old culture could depend on how many years it's been since culture conversion (with some arbitrary cut off, like 200 years) and the same for religion.
That might produce quite a nice system where player choice is reflected - ultimately you want a system where people go "If I build a university, in V3 I'll be good in terms of teachers" or "if I invest heavily into my capital it'll be one of the richest provinces in V2""
7(reply to 5): "Yeah, you're right that there's no clear mapping, and it's the reason I've not advanced an idea before.
First, we can reasonably infer that manpower development represents the (weighted) percentage of soldier (and officer?) pops that live in that province. That's pretty straightforward.
Second, production development is specifically about the production of that province's trade good. Since you already use buildings to determine the presence of other pops, I'm tempted to conclude that production development actually tells us about the number of RGO pops in the province.
Lastly we have base tax. Base tax can be used to catch everyone else: an abstract number for all of the taxes raised from all the other kinds of pops we missed.
Now assuming my analysis has any validity, the trick is actually converting this all to actual numbers."
|
gharchive/issue
| 2016-03-15T21:50:02 |
2025-04-01T04:55:11.405656
|
{
"authors": [
"Idhrendur",
"derHochmeister",
"kratostatic"
],
"repo": "Idhrendur/paradoxGameConverters",
"url": "https://github.com/Idhrendur/paradoxGameConverters/issues/50",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
788082299
|
feat(comp: typography): add directive typography
fix #130
PR Checklist
Please check if your PR fulfills the following requirements:
[x] The commit message follows our guidelines
[x] Tests for the changes have been added/updated or not needed
[x] Docs and demo have been added/updated or not needed
PR Type
What kind of change does this PR introduce?
[ ] Bugfix
[x] Feature
[ ] Component style update
[ ] Code style update (formatting, local variables)
[ ] Refactoring (no functional changes, no api changes)
[ ] Build related changes
[ ] CI related changes
[ ] Documentation content changes
[ ] Application (the showcase website) / infrastructure changes
[ ] Other... Please describe:
What is the current behavior?
What is the new behavior?
Other information
@danranVm @hele10086 Please re-review.
|
gharchive/pull-request
| 2021-01-18T09:13:07 |
2025-04-01T04:55:11.412291
|
{
"authors": [
"LaamGinghong"
],
"repo": "IduxFE/components",
"url": "https://github.com/IduxFE/components/pull/148",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1084494880
|
how to setup database
No response, I think the problem is solved 🚀
I'm closing this issue.
If you are still having problems I recommend joining our discord server - https://discord.com/invite/bVNNHuQ
|
gharchive/issue
| 2021-12-20T08:23:29 |
2025-04-01T04:55:11.508379
|
{
"authors": [
"IgorKowalczyk",
"PankKRB"
],
"repo": "IgorKowalczyk/majobot",
"url": "https://github.com/IgorKowalczyk/majobot/issues/31",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
482040319
|
Issues with resource manifest
There is an issue since the GUI changes; specifically, adding the resource manifest to vRP is causing problems, something to do with removing entities, most likely to be a FiveM client-side problem as usual.
something to do with removing entities, most likely to be a fivem client side problem as usual.
This is not an issue at all. There's an explanation on FiveM docs website:
Scripts will now be registered as a game network script. This is required for networking entities.
This is how game works. To fix "removal" your script need have to get control over an entity you want to remove (SET_ENTITY_AS_MISSION_ENTITY).
Oh it seems to already be using that in the script
https://github.com/ImagicTheCat/vRP/blob/master/vrp/client/veh_blacklist.lua
@Disquse this is for 05cfa83c-a124-4cfa-a768-c24a5811d8f9, but 44febabe-d386-4d18-afbe-5e627f4af937 is used
@ImagicTheCat Same issues unfortunately
Or you could do NetworkRequestControlOfEntity
=> https://runtime.fivem.net/doc/natives/#_0xB69317BF5E782347 On Tue, Aug 20, 2019 at 1:32am, bradxy < notifications@github.com [notifications@github.com] > wrote:@ImagicTheCat [https://github.com/ImagicTheCat] Same issues unfortunately
—
You are receiving this because you are subscribed to this thread.
Reply to this email directly, view it on GitHub [https://github.com/ImagicTheCat/vRP/issues/558?email_source=notifications&email_token=ABITCMJ4GZLRPMFCFBPIPZLQFMUQVA5CNFSM4IMUFWCKYY3PNVWWK3TUL52HS4DFVREXG43VMVBW63LNMVXHJKTDN5WW2ZLOORPWSZGOD4UTCUI#issuecomment-522793297] , or mute the thread [https://github.com/notifications/unsubscribe-auth/ABITCML32IZQJNQS3TXDMVTQFMUQVANCNFSM4IMUFWCA] .
Odd all worked fine until the manifest was added... vRP was already is using the required SET_ENTITY_AS_MISSION_ENTITY for these things...
The way GTA V handles entities is complicated, even with the previous manifest we have issues. I don't know how other resources are dealing with shared entity consistency. We may need to experiment with the last manifest version and networked entities to find a better solution.
@Disquse @mickeygeecom Are you guys able to add the current resource manifest to the core vrp __resource.lua? Any help with this would be greatly appreciated and will help current members that started using vrp
@ImagicTheCat Is there any way to downgrade vRP to remove the GUI changes and the resource manifest? Everyone new using the current vRP version will have the vehicles-not-storing problem, and if they remove the manifest the vehicle storing issue is resolved, but the hunger/thirst bars stop working because they now rely on the manifest...
You can use any version of vRP you want by checking out a specific commit with git.
|
gharchive/issue
| 2019-08-18T21:40:59 |
2025-04-01T04:55:11.553504
|
{
"authors": [
"Disquse",
"ImagicTheCat",
"bradxy",
"mickeygeecom"
],
"repo": "ImagicTheCat/vRP",
"url": "https://github.com/ImagicTheCat/vRP/issues/558",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
1934948762
|
Include user guide on GitHub Pages
I think this task is optional. It might be worth doing if it's easy.
If we write the user guide in Markdown format, then hopefully we should be able to include it into the mkdocs-based workflow we're using to generate the developer documentation.
Is the user guide already written, or does it need to be written? If it is the latter, this is no small task!
There's a placeholder file there. @Sahil590 worked on the integration with the main program. So this issue is just a case of publishing that file along with the other documentation.
Writing it is a separate issue: #350
|
gharchive/issue
| 2023-10-10T10:00:01 |
2025-04-01T04:55:11.600567
|
{
"authors": [
"alexdewar",
"dalonsoa"
],
"repo": "ImperialCollegeLondon/FINESSE",
"url": "https://github.com/ImperialCollegeLondon/FINESSE/issues/346",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
2041228250
|
vr_run sometimes stalls when combining data files
Describe the bug
Sometimes when vr_run is called it stalls at the point it uses xarray.open_mfdataset() to combine the files for individual time points into a single file. This sometimes happens on CI meaning the tests don't pass (because they don't finish). This appears to be a known bug in xarray.
To Reproduce
By its nature it is hard to reproduce this bug, but when I run vr_run locally (2017 MacBook Pro) it stalls ~1 in 3 times. This is more frequent than on CI, so I suspect the stall is somewhat spec-driven.
Approach to fix
The bug thread suggests setting lock=False as a work around but suggests that this can cause errors as hdf5 is not thread safe. I feel like stopping CI from randomly breaking is probably the top priority for now, so I think we should go ahead and set this option. If setting this option proves to be too error prone for our use case then we will need to think of an alternative way of merging data, e.g. possibly a dedicated tool for merging data
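As a minimal sketch of the workaround (keyword names taken from the xarray documentation current at the time; the `combine` mode and the helper name are assumptions for illustration, since the real call sites live in `vr_run`), the combining step could be written as:

```python
# Keyword arguments for combining the per-timepoint files, with the
# workaround applied: no file lock and no dask-parallel file handling.
MERGE_KWARGS = {
    "combine": "by_coords",  # assumption about how the time points line up
    "parallel": False,       # avoid dask parallelising the file handling
    "lock": False,           # work around the open_mfdataset stall
}

def open_merged(paths, **overrides):
    """Open and combine per-timepoint files with the stall workaround applied."""
    import xarray as xr  # imported lazily so the sketch stands on its own
    return xr.open_mfdataset(paths, **{**MERGE_KWARGS, **overrides})
```

Note that `lock` has since been deprecated in newer xarray releases, so this sketch only applies to the versions in use at the time of the issue.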
The actual halt is now resolved by #355, but the option may cause occasional errors. If we start seeing HDF5 errors with any regularity then we need to revisit our solution to this issue
This cropped up again and I've raised PR #457 to explicitly turn off any attempt to parallelise the file handling in dask. I think that is as much as we can do here - it's a problem all the way down the stack to the NetCDF code, I think.
|
gharchive/issue
| 2023-12-14T08:50:08 |
2025-04-01T04:55:11.604952
|
{
"authors": [
"davidorme",
"jacobcook1995"
],
"repo": "ImperialCollegeLondon/virtual_rainforest",
"url": "https://github.com/ImperialCollegeLondon/virtual_rainforest/issues/354",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
347725564
|
PhysX needs an update - currently doesn't work on Linux!
The version of PhysX we are currently using predates nVidia's release of Linux support in PhysX. This leaves Linux running the rather pathetic "BasicPhysics" implementation I created, which simply allows avatar movement and avatar-ground collisions.
The primary problem is that we've got our own fork of PhysX.Net, a .NET wrapper for PhysX, that should have any relevant changes found and ported as PRs back up to upstream.
So subtasks are as follows:
Find changes to our own fork and get them PR'd upstream if not already there or solved alternately.
Bring in the NuGet package of PhysX.Net and test it.
Iterate any needed changes to the above or Halcyon to make everything happy.
This was a major undertaking for multiple reasons last I investigated:
As already identified, the physx.net package had a bunch of issues that I had to fix where patches were not merged that I know of. Many of these issues were crashers, memory leaks, or O(n) / O(n^2) issues that had to be rectified before the package would perform at all well on long running and demanding simulation processes.
It was not possible to compile C++/CLI on linux. If this has changed that removes the biggest hurdle. Though it is possible to run pure C++ with only CLR calls on linux, this does not describe the bindings to physx.
I had started down an alternate path with swig wrappers to be bound to the current physic code for these reasons, but it was not completed work.
YMMV and things may have changed
Thanks for the history David! BTW, any idea what revision of the upstream repo was forked off of? Since it went through SVN then uploaded back to GitHub from SVN, it's lost that info.
Doing some research it turns out that C++/CLI is still a no-go on anything but Windows with notes that it's likely not going to happen. The entire library would have to be rewritten using P/Invoke which is basically a rewrite - not something a lib maintainer wants to do.
David, was the SWIG work ever checked in anywhere? And yes, C++/CLI isn't a viable option. You'd need P/Invoke wrappers as a replacement.
Hi there,
So I know that Halcyon uses the PhysX fork from StillDesigns so I am not sure if this will work or be of help but I figure at the very minimum it might at least provide some ideas going forward as you figure out the implementation of PhysX for Linux. If it does not help it is okay as I will have only wasted a couple of moments to provide the information. But even information passed on can be helpful even if in the end it does not work.
Before they officially switched over to Halcyon, the M-O-S-E-S project did, in fact, have a working PhysX wrapper for Linux in their fork of OpenSim. My team stumbled onto it as it was mentioned to us by one of our upstream providers. We found it on their projects page. However, since the time we found it they have added it to their GitHub organization as a repository. The link I am passing you is to their README.txt in that repository which contains instructions on how they built the wrapper, but if you click on the repository link you can also see their code. The link is: https://github.com/M-O-S-E-S/physx-wrapper/blob/master/README.txt
As I indicated I am not sure if they had this particular wrapper working on their Halcyon fork or not. At the very minimum, it was for an earlier version of OpenSim. So it might or might not work or even be viable here. I should also note that the versions of OpenSim that did have PhysX in the source code (i don't remember whether it was working, however, but I don't believe it was) likely was using the dlls direct from the Nvidia PhysX repositories which you can get access to but you have to register with Nvidia directly to get access to those repositories.
At any rate I hope this is of some help, even if at the very least it gives you some ideas to look at.
This issue was moved to HalcyonGrid/halcyon#2
|
gharchive/issue
| 2018-08-05T19:48:00 |
2025-04-01T04:55:11.630100
|
{
"authors": [
"ddaeschler",
"emperorstarfinder",
"kf6kjg",
"mdickson"
],
"repo": "InWorldz/halcyon",
"url": "https://github.com/InWorldz/halcyon/issues/463",
"license": "bsd-3-clause",
"license_type": "permissive",
"license_source": "bigquery"
}
|
1376820445
|
Create dataset loader for INDspeech_DIGIT_CDSR
NusaCatalogue: https://indonlp.github.io/nusa-catalogue/card.html?indsp_digit_cdsr
Dataset
indsp_digit_cdsr
Description
INDspeech_DIGIT_CDSR is the first Indonesian speech dataset for connected digit speech recognition (CDSR). The data was developed by TELKOMRisTI (R&D Division, PT Telekomunikasi Indonesia) in collaboration with Advanced Telecommunication Research Institute International (ATR) Japan and Bandung Institute of Technology (ITB) under the Asia-Pacific Telecommunity (APT) project in 2004 [Sakti et al., 2004]. Although it was originally developed for a telecommunication system for hearing and speaking impaired people, it can be used for other applications, i.e., automatic call centers that recognize telephone numbers.
License
CC-BY-NC-SA 4.0
#self-assign
#self-assign
Hi @ziweiji, apparently @IvanHalimP has self-assigned first and he already made a pull request, so there's no need for you to work on this dataloader.
cc: @SamuelCahyawijaya
|
gharchive/issue
| 2022-09-17T16:50:24 |
2025-04-01T04:55:11.650626
|
{
"authors": [
"IvanHalimP",
"SamuelCahyawijaya",
"holylovenia",
"ziweiji"
],
"repo": "IndoNLP/nusa-crowd",
"url": "https://github.com/IndoNLP/nusa-crowd/issues/278",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
250393633
|
X-NuGet-ApiKey is not the correct header for pushing package
See http://inedo.com/support/documentation/proget/administration/security#api-keys
It should be 'X-ApiKey'
X-ApiKey is only used by ProGet, for the APIs we design (like promotion, etc.), but not necessarily for the APIs we implement (like NuGet, npm, etc.).
X-NuGet-ApiKey is used by NuGet servers (including ProGet); it's part of the NuGet protocol, which we don't document as part of ProGet (we just implement it).
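To illustrate the distinction (a minimal sketch; the helper name and parameter are made up for this example and are not part of any ProGet or NuGet client API), the header choice could be encoded as:

```python
def auth_headers(api_key, proget_native_api=False):
    """Return the auth header dict for a push request.

    ProGet's own APIs (promotion, etc.) expect X-ApiKey, while the NuGet
    protocol, which ProGet also implements, uses X-NuGet-ApiKey.
    """
    name = "X-ApiKey" if proget_native_api else "X-NuGet-ApiKey"
    return {name: api_key}
```

The returned dict could then be passed as the headers of whatever HTTP client performs the package push.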
|
gharchive/issue
| 2017-08-15T18:14:58 |
2025-04-01T04:55:11.679608
|
{
"authors": [
"mattkenn4545",
"whatatripp"
],
"repo": "Inedo/bmx-nuget",
"url": "https://github.com/Inedo/bmx-nuget/issues/23",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
2090313012
|
fix: unsaved changes nextjs > 13
Implemented the workaround solution suggested here: https://github.com/vercel/next.js/discussions/47020
Btw, I see that nextjs has added support for pushState and replaceState in the router component:
https://nextjs.org/blog/next-14-1#windowhistorypushstate-and-windowhistoryreplacestate
|
gharchive/pull-request
| 2024-01-19T11:18:14 |
2025-04-01T04:55:11.697761
|
{
"authors": [
"jeffreiffers",
"pooriamehregan"
],
"repo": "Informasjonsforvaltning/catalog-frontend",
"url": "https://github.com/Informasjonsforvaltning/catalog-frontend/pull/585",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
910013516
|
Edit not working in generated api
Hi sir,
I generated an api using the php artisan infyom:api member command.
The command was successful. Add, delete, get and list are working properly.
But edit is not working. Kindly help solve the issue and let me know if I did anything wrong.
I am using laravel version 7 and infyom laravel generator version 7.
I tried it in postman and received the following error. I created the api with only two columns, name and email.
Of those, name is a required field.
Check the screenshot - Click to see screenshot
It shows the "name field is required" error response, but the name field is already filled.
If I remove required from the name column, it just returns the data like a get request, but nothing is updated.
Waiting for your reply
Hope explained clearly
Thank you
@sbn111 i think you trying with form data. Please try with row and pass json in put request from postman. See example here https://images.app.goo.gl/DF2YZQ4eKWZbYaPK7
Thank you, it worked. Really appreciate your quick response. I'm starting to watch your youtube videos.
|
gharchive/issue
| 2021-06-03T01:39:05 |
2025-04-01T04:55:11.739447
|
{
"authors": [
"sbn111",
"shailesh-ladumor"
],
"repo": "InfyOmLabs/laravel-generator",
"url": "https://github.com/InfyOmLabs/laravel-generator/issues/962",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
215520806
|
convert regular DF to simba supported geometries
Does simba have some UDF to support creation of a simbaDF out of a regular data frame? I.e. like magellan's df.withColumn("point", point('x, 'y))
If I am required to manually map all points / polygons to simba Geometry, how can I represent additional fields?
val ps = (0 until 10000).map(x => PointData(Point(Array(x.toDouble, x.toDouble)), x + 1)).toDS
How can I parse WKT polygons to a simba supported geometry format?
Theoretically, you can do anything with a Simba DataFrame that is supported by a Spark SQL DataFrame, as Simba's DataFrame inherits from Spark SQL's.
To represent additional fields, you simply add them to your structure. For example, you can define:
case class PointData(x: Point, payload: Int, tag: String)
And Simba will be able to automatically detect its fields and build the data frame. It will give you a schema like:
-- DataFrame
|----- x : ShapeType
|----- payload : Integer
|----- tag : String
I see. And What about polygons? You seem to use Polygon.apply(Array(Point(Array(-1.0, -1.0)), Point(Array(1.0, -1.0)), If I have WKT polygon strings how could these be converted?
Refer to:
https://github.com/InitialDLab/Simba/blob/standalone-2.1/src/main/scala/org/apache/spark/sql/simba/spatial/Polygon.scala#L117
So assuming a Data frame with Polygons like below
case class MyClass(a:String, b:int, wktString:String)
val df = Seq(MyClass("a", 1, "POLYGON ((30 10, 40 40, 20 40, 10 20, 30 10))"), MyClass("b",2,"POLYGON ((30 1, 40 40, 20 50, 10 20, 30 1))")).toDS()
val dfGeom = df.map(x => SPolygon.fromWKB(x.wktString.toCharArray.map(_.toByte)))
is this how the conversion is supposed to be?
As for me this will fail with a code generator exception when calling dfGeom.show
17/03/20 20:26:50 ERROR CodeGenerator: failed to compile: org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 102, Column 31: Assignment conversion not possible from type "org.apache.spark.sql.simba.spatial.Shape" to type "org.apache.spark.sql.simba.spatial.Polygon"
I think you can try this:
case class MyClass(a:String, b:int, wktString:Polygon)
val df = Seq(("a", 1, "POLYGON ((30 10, 40 40, 20 40, 10 20, 30 10))"),("b",2,"POLYGON ((30 1, 40 40, 20 50, 10 20, 30 1))")).map(x => MyClass(x._1, x._2, Polygon.fromWKB(x._3.toCharArray.map(_.toByte)))).toDS()
df.show()
I don't know if it can work, but you can try.
This would fail with com.vividsolutions.jts.io.ParseException: Unknown WKB type 71 already when trying to parse the WKT.
Well, I think this is a parsing problem of JTS, which is out of my scope now. And just to remind, general geometric objects including polygons are still under development.
What about:
import com.vividsolutions.jts.geom.Polygon
import com.vividsolutions.jts.io.WKTReader
import org.apache.spark.sql.simba.spatial.{Polygon => SPolygon}

def toPolygon(s: String, u: String): SPolygon = {
  val reader = new WKTReader()
  reader.read(s) match {
    case poly: Polygon =>
      poly.setUserData(u)
      SPolygon.fromJTSPolygon(poly)
  }
}
val df = Seq(("a", 1, "POLYGON ((30 10, 40 40, 20 40, 10 20, 30 10))"),("b",2,"POLYGON ((30 1, 40 40, 20 50, 10 20, 30 1))")).map(x => MyClass(x._1, x._2, toPolygon(x._3, "foobar"))).toDS
df.show
not sure if it will join later on, but df.show works.
df.show() should work. There must be something wrong with my fromWKT function.
Nevertheless, I don't think it will work for joins since our current join algorithm does not support polygons, which is technically caused by no partitioner for polygons and it assumes the join keys will be evaluated as Point. This is coming from our legacy hacks for its original prototype. Still, I treat partitioning general geometry objects as a research problem.
|
gharchive/issue
| 2017-03-20T18:53:33 |
2025-04-01T04:55:11.747110
|
{
"authors": [
"Skyprophet",
"geoHeil"
],
"repo": "InitialDLab/Simba",
"url": "https://github.com/InitialDLab/Simba/issues/87",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
777473280
|
BakedCanvasDataProxy needs to be more informative
An issue with using an index is that, if someone were to manipulate the BakedCanvasData, the indices would no longer be correct.
Because I can't use GetInstanceID() or GetHashCode() for persistent unique identifiers, it makes more sense to generate a GUID and serialize it in the BakedCanvasData as a serializable GUID.
It would be nice to show people how the canvas will be batched together.
Okay added buttons to easily bake, update, and remove information from the scriptable object. Will move bullet point 3 to a new issue.
|
gharchive/issue
| 2021-01-02T14:58:38 |
2025-04-01T04:55:11.750319
|
{
"authors": [
"psuong"
],
"repo": "InitialPrefabs/UGUIDOTS",
"url": "https://github.com/InitialPrefabs/UGUIDOTS/issues/70",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
700331412
|
N-UPNP Endpoint is retired
Shade Info:
Version: 1.1.1
How to Reproduce the Issue:
Example:
shade discover
Expected behavior:
This should list the available endpoints.
This also affects the SDK in the same fashion, with a similar stack trace
Actual behavior:
$ shade discover
Unknown Error: HTTP 404
retrofit2.HttpException: HTTP 404
at retrofit2.KotlinExtensions$await$2$2.onResponse(KotlinExtensions.kt:53)
at retrofit2.OkHttpCall$1.onResponse(OkHttpCall.java:161)
at okhttp3.internal.connection.RealCall$AsyncCall.run(RealCall.kt:519)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1130)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:630)
at java.base/java.lang.Thread.run(Thread.java:832)
This is fixed in 1.1.2
|
gharchive/issue
| 2020-09-12T18:36:05 |
2025-04-01T04:55:11.753232
|
{
"authors": [
"ReneeVandervelde"
],
"repo": "InkApplications/Shade",
"url": "https://github.com/InkApplications/Shade/issues/60",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
267920431
|
Tested it!
I tested render node & api server on MacOS Sierra 10.12.6 and Windows Server 2016 with AE CC 2018 Version 15.0.0 (Build 180). Works fine 👍
oh, great news, so it is actually working fine
and was it mp4 rendering, or avi/mov, or maybe any other ?
@Inlife on both systems it was mov
oh i see
and i almost sure new AE doesn't allow to setup the h264 output
|
gharchive/issue
| 2017-10-24T07:08:26 |
2025-04-01T04:55:11.755033
|
{
"authors": [
"Inlife",
"max-kovpak"
],
"repo": "Inlife/nexrender",
"url": "https://github.com/Inlife/nexrender/issues/43",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
863638054
|
Add some cypress test
For the moment the app doesn't have any tests; it would be great to add some tests with cypress.
What tests are you looking for since I can see the website does not have much content as of now!
Do you just want the static component testing of the homepage?
What do you mean by "not much content"? This UI expects a valid api (https://github.com/InseeFrLab/sugoi-api), but there are enough components to test, I think.
For this issue, what is expected is to add what is necessary to start writing tests on react components using cypress, to be more comfortable with changes.
|
gharchive/issue
| 2021-04-21T09:12:16 |
2025-04-01T04:55:11.775941
|
{
"authors": [
"Donatien26",
"ShubhamPalriwala",
"micedre"
],
"repo": "InseeFrLab/sugoi-ui",
"url": "https://github.com/InseeFrLab/sugoi-ui/issues/111",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2098554905
|
Latest CMake produces a warning
cmake version 3.28.1 produces a warning in default ITK configuration.
Description
CMake Deprecation Warning at Utilities/KWSys/CMakeLists.txt:89 (cmake_minimum_required):
Compatibility with CMake < 3.5 will be removed from a future version of
CMake.
Update the VERSION argument <min> value or use a ...<max> suffix to tell
CMake that the project does not need compatibility with older versions.
Additional Information
We need to update KWSys, and maybe update it beforehand too.
This fixes the warning, as demonstrated by https://github.com/dzenanz/ANTsWasm/actions/runs/7642347741/job/20821757352
I accidentally wrote a comment here instead of in https://github.com/InsightSoftwareConsortium/ITKRemoteModuleBuildTestPackageAction/pull/74#issuecomment-1908406096. Sorry for the noise to those who watch.
|
gharchive/issue
| 2024-01-24T15:45:04 |
2025-04-01T04:55:11.778980
|
{
"authors": [
"dzenanz"
],
"repo": "InsightSoftwareConsortium/ITK",
"url": "https://github.com/InsightSoftwareConsortium/ITK/issues/4426",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
539229122
|
BUG: fixes issue# 1498
BUG: Replaced std::defaultfloat with std::resetiosflags to fix issue #1498 on old gcc compilers.
PR Checklist
[ ] :no_entry_sign: Makes breaking changes
[ ] :no_entry_sign: Makes design changes
[ ] :no_entry_sign: Adds the License notice to new files.
[ ] :no_entry_sign: Adds Python wrapping.
[ ] :no_entry_sign: Adds tests and baseline comparison (quantitative).
[ ] :no_entry_sign: Adds test data.
[ ] :no_entry_sign: Adds Examples to ITKExamples
[ ] :no_entry_sign: Adds Documentation.
Refer to the ITK Software Guide for
further development details if necessary.
Hello,
Thanks for following up on this. I just checked the docs for std::ios_base::fmtflags. It appears that std::ios_base::floatfield only affects the float and scientific fields, not the precision.
At any rate, I then checked the itkWarningMacro, and it turns out that its output is a temporary ostringstream, so there is actually no need to restore the formatting!
ok, that would simplify things
Thanks!
|
gharchive/pull-request
| 2019-12-17T18:12:05 |
2025-04-01T04:55:11.786101
|
{
"authors": [
"blowekamp",
"vfonov"
],
"repo": "InsightSoftwareConsortium/ITK",
"url": "https://github.com/InsightSoftwareConsortium/ITK/pull/1499",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
832281620
|
COMP: Forward CMAKE_APPLE_SILICON_PROCESSOR
Forward CMAKE_APPLE_SILICON_PROCESSOR variable to external cmake configuration commands (Eigen3 and FFTW).
This fixes configuration for x86_64 on Apple Silicon, if CMAKE_APPLE_SILICON_PROCESSOR variable was set by user (value should be x86_64 or arm64).
Theoretically, for cross-compile setups, especially on other architectures, more variables should be forwarded, IMHO. But Eigen3 is a header-only library, so this is required only to pass the correct architecture to cmake's "try compile" tests, AFAIK. And the internal FFTW build is marked as a non-production experimental build, so it is probably OK. Note: I didn't test the FFTW build on Apple Silicon, but I think it is straightforward.
Let me know if you think CMAKE_OSX_ARCHITECTURES or other variables should be added too, so temporary set WIP.
S. comments to https://github.com/InsightSoftwareConsortium/ITK/pull/2391
CC @seanm
Edit:
I have tried "internal" FFTW builds (both variants, float and double) with CMAKE_APPLE_SILICON_PROCESSOR x86_64 on Apple Silicon; they are OK.
This variable exists in Cmake >= 3.19.2, BTW.
Not sure I get the point of the existence of CMAKE_APPLE_SILICON_PROCESSOR, but it exists, so passing it down the line seems fine.
Not sure I get the point of the existence of CMAKE_APPLE_SILICON_PROCESSOR, but it exists, so passing it down the line seems fine.
Here https://cmake.org/cmake/help/v3.19/release/3.19.html is some information
Apple Silicon is now supported (since CMake 3.19.2):
The CMAKE_HOST_SYSTEM_PROCESSOR is selected using uname -m. Since this may vary based on CMake’s own architecture and that of the invoking process tree, the CMAKE_APPLE_SILICON_PROCESSOR variable or CMAKE_APPLE_SILICON_PROCESSOR environment variable may be set to specify a host architecture explicitly.
If CMAKE_OSX_ARCHITECTURES is not set, CMake adds explicit flags to tell the compiler to build for the CMAKE_HOST_SYSTEM_PROCESSOR so the toolchain does not have to guess based on the process tree’s architecture.
@seanm
#include <cstdlib>
int main(int, char**)
{
system("uname -a");
return 0;
}
mi@mis-Mac-mini ~ % clang++ --target="x86_64-apple-macos10.14" test1.cpp
mi@mis-Mac-mini ~ % ./a.out
Darwin mis-Mac-mini.fritz.box 20.3.0 Darwin Kernel Version 20.3.0: Thu Jan 21 00:06:51 PST 2021; root:xnu-7195.81.3~1/RELEASE_ARM64_T8101 x86_64
mi@mis-Mac-mini ~ % clang++ test1.cpp
mi@mis-Mac-mini ~ % ./a.out
Darwin mis-Mac-mini.fritz.box 20.3.0 Darwin Kernel Version 20.3.0: Thu Jan 21 00:06:51 PST 2021; root:xnu-7195.81.3~1/RELEASE_ARM64_T8101 arm64
|
gharchive/pull-request
| 2021-03-15T23:35:09 |
2025-04-01T04:55:11.796649
|
{
"authors": [
"issakomi",
"seanm"
],
"repo": "InsightSoftwareConsortium/ITK",
"url": "https://github.com/InsightSoftwareConsortium/ITK/pull/2409",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1993193388
|
Replace std::min and std::max calls with std::clamp, in "Colormap", ProgressTransformer, SliceImageFilter, and MultiThreaderBase.
Follow-up to:
pull request #3992
pull request #4182
Use cases mostly found by Notepad++ regular expression (.+) = std::m..\(.+;\r\n\1 = std::m..\(.
@martinmoene Hope you like 👍 this pulll request as well 😃 ! I honestly didn't realize before that clamping is such a popular thing to do!
|
gharchive/pull-request
| 2023-11-14T17:17:28 |
2025-04-01T04:55:11.799485
|
{
"authors": [
"N-Dekker"
],
"repo": "InsightSoftwareConsortium/ITK",
"url": "https://github.com/InsightSoftwareConsortium/ITK/pull/4328",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1528395283
|
openvpnconnectv3 stopped downloading
Recently noticed that openvpnconnectv3 packages aren't downloading.
I was able to download just fine from a browser with the same URL, so I suspected curl was failing; adding curlOptions makes the packages start downloading again.
curlOptions=( -H "User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/15.1 Safari/605.1.15" )
Will raise a PR
2023-01-11 12:03:42 : REQ : openvpnconnectv3 : ################## Start Installomator v. 10.3beta, date 2023-01-11
2023-01-11 12:03:42 : INFO : openvpnconnectv3 : ################## Version: 10.3beta
2023-01-11 12:03:42 : INFO : openvpnconnectv3 : ################## Date: 2023-01-11
2023-01-11 12:03:42 : INFO : openvpnconnectv3 : ################## openvpnconnectv3
2023-01-11 12:03:42 : DEBUG : openvpnconnectv3 : DEBUG mode 1 enabled.
2023-01-11 12:03:44 : DEBUG : openvpnconnectv3 : name=OpenVPN Connect
2023-01-11 12:03:44 : DEBUG : openvpnconnectv3 : appName=
2023-01-11 12:03:44 : DEBUG : openvpnconnectv3 : type=pkgInDmg
2023-01-11 12:03:44 : DEBUG : openvpnconnectv3 : archiveName=
2023-01-11 12:03:44 : DEBUG : openvpnconnectv3 : downloadURL=https://openvpn.net/downloads/openvpn-connect-v3-macos.dmg
2023-01-11 12:03:44 : DEBUG : openvpnconnectv3 : curlOptions=
2023-01-11 12:03:44 : DEBUG : openvpnconnectv3 : appNewVersion=3.4.1
2023-01-11 12:03:44 : DEBUG : openvpnconnectv3 : appCustomVersion function: Not defined
2023-01-11 12:03:44 : DEBUG : openvpnconnectv3 : versionKey=CFBundleShortVersionString
2023-01-11 12:03:44 : DEBUG : openvpnconnectv3 : packageID=
2023-01-11 12:03:44 : DEBUG : openvpnconnectv3 : pkgName=OpenVPN_Connect_[0-9_()]*_x86_64_Installer_signed.pkg
2023-01-11 12:03:44 : DEBUG : openvpnconnectv3 : choiceChangesXML=
2023-01-11 12:03:44 : DEBUG : openvpnconnectv3 : expectedTeamID=ACV7L3WCD8
2023-01-11 12:03:44 : DEBUG : openvpnconnectv3 : blockingProcesses=
2023-01-11 12:03:44 : DEBUG : openvpnconnectv3 : installerTool=
2023-01-11 12:03:44 : DEBUG : openvpnconnectv3 : CLIInstaller=
2023-01-11 12:03:44 : DEBUG : openvpnconnectv3 : CLIArguments=
2023-01-11 12:03:44 : DEBUG : openvpnconnectv3 : updateTool=
2023-01-11 12:03:44 : DEBUG : openvpnconnectv3 : updateToolArguments=
2023-01-11 12:03:44 : DEBUG : openvpnconnectv3 : updateToolRunAsCurrentUser=
2023-01-11 12:03:45 : INFO : openvpnconnectv3 : BLOCKING_PROCESS_ACTION=tell_user
2023-01-11 12:03:45 : INFO : openvpnconnectv3 : NOTIFY=success
2023-01-11 12:03:45 : INFO : openvpnconnectv3 : LOGGING=DEBUG
2023-01-11 12:03:45 : INFO : openvpnconnectv3 : LOGO=/System/Applications/App Store.app/Contents/Resources/AppIcon.icns
2023-01-11 12:03:45 : INFO : openvpnconnectv3 : Label type: pkgInDmg
2023-01-11 12:03:45 : INFO : openvpnconnectv3 : archiveName: OpenVPN Connect.dmg
2023-01-11 12:03:45 : INFO : openvpnconnectv3 : no blocking processes defined, using OpenVPN Connect as default
2023-01-11 12:03:45 : DEBUG : openvpnconnectv3 : Changing directory to /Users/asri/Documents/dev/Installomator/build
2023-01-11 12:03:45 : INFO : openvpnconnectv3 : App(s) found: /Applications/OpenVPN Connect.app
2023-01-11 12:03:45 : INFO : openvpnconnectv3 : found app at /Applications/OpenVPN Connect.app, version 3.4.1, on versionKey CFBundleShortVersionString
2023-01-11 12:03:45 : INFO : openvpnconnectv3 : appversion: 3.4.1
2023-01-11 12:03:45 : INFO : openvpnconnectv3 : Latest version of OpenVPN Connect is 3.4.1
2023-01-11 12:03:45 : WARN : openvpnconnectv3 : DEBUG mode 1 enabled, not exiting, but there is no new version of app.
2023-01-11 12:03:45 : REQ : openvpnconnectv3 : Downloading https://openvpn.net/downloads/openvpn-connect-v3-macos.dmg to OpenVPN Connect.dmg
2023-01-11 12:03:45 : DEBUG : openvpnconnectv3 : No Dialog connection, just download
2023-01-11 12:03:45 : ERROR : openvpnconnectv3 : error downloading https://openvpn.net/downloads/openvpn-connect-v3-macos.dmg
ls: OpenVPN Connect.dmg: No such file or directory
2023-01-11 12:03:45 : ERROR : openvpnconnectv3 : File list:
2023-01-11 12:03:45 : ERROR : openvpnconnectv3 : File type: OpenVPN Connect.dmg: cannot open `OpenVPN Connect.dmg' (No such file or directory)
2023-01-11 12:03:45 : DEBUG : openvpnconnectv3 : DEBUG mode 1, not reopening anything
2023-01-11 12:03:45 : ERROR : openvpnconnectv3 : ERROR: Error downloading https://openvpn.net/downloads/openvpn-connect-v3-macos.dmg error:
Trying 104.18.110.96:443...
Connected to openvpn.net (104.18.110.96) port 443 (#0)
ALPN, offering h2
ALPN, offering http/1.1
successfully set certificate verify locations:
CAfile: /etc/ssl/cert.pem
CApath: none
(304) (OUT), TLS handshake, Client hello (1):
} [316 bytes data]
(304) (IN), TLS handshake, Server hello (2):
{ [122 bytes data]
(304) (IN), TLS handshake, Unknown (8):
{ [19 bytes data]
(304) (IN), TLS handshake, Certificate (11):
{ [4752 bytes data]
(304) (IN), TLS handshake, CERT verify (15):
{ [264 bytes data]
(304) (IN), TLS handshake, Finished (20):
{ [52 bytes data]
(304) (OUT), TLS handshake, Finished (20):
} [52 bytes data]
SSL connection using TLSv1.3 / AEAD-AES256-GCM-SHA384
ALPN, server accepted to use h2
Server certificate:
subject: C=US; ST=California; O=OpenVPN Inc.; CN=*.openvpn.net
start date: Mar 21 00:00:00 2022 GMT
expire date: Apr 20 23:59:59 2023 GMT
subjectAltName: host "openvpn.net" matched cert's "openvpn.net"
issuer: C=GB; ST=Greater Manchester; L=Salford; O=Sectigo Limited; CN=Sectigo RSA Organization Validation Secure Server CA
SSL certificate verify ok.
Using HTTP2, server supports multiplexing
Connection state changed (HTTP/2 confirmed)
Copying HTTP/2 data in stream buffer to connection buffer after upgrade: len=0
Using Stream ID: 1 (easy handle 0x7fba4200b600)
GET /downloads/openvpn-connect-v3-macos.dmg HTTP/2
Host: openvpn.net
user-agent: curl/7.79.1
accept: */*
Connection state changed (MAX_CONCURRENT_STREAMS == 256)!
< HTTP/2 301
< date: Wed, 11 Jan 2023 04:03:45 GMT
< content-type: text/html; charset=UTF-8
< location: https://swupdate.openvpn.net/downloads/connect/openvpn-connect-3.4.1.4522_signed.dmg
< expires: Wed, 11 Jan 1984 05:00:00 GMT
< cache-control: no-cache, must-revalidate, max-age=0
< x-redirect-by: redirection
< cf-cache-status: MISS
< server: cloudflare
< cf-ray: 787ac3b23c254a1d-SIN
<
{ [0 bytes data]
Connection #0 to host openvpn.net left intact
Issue another request to this URL: 'https://swupdate.openvpn.net/downloads/connect/openvpn-connect-3.4.1.4522_signed.dmg'
Trying 104.18.109.96:443...
Connected to swupdate.openvpn.net (104.18.109.96) port 443 (#1)
ALPN, offering h2
ALPN, offering http/1.1
successfully set certificate verify locations:
CAfile: /etc/ssl/cert.pem
CApath: none
(304) (OUT), TLS handshake, Client hello (1):
} [325 bytes data]
(304) (IN), TLS handshake, Server hello (2):
{ [122 bytes data]
(304) (IN), TLS handshake, Unknown (8):
{ [19 bytes data]
(304) (IN), TLS handshake, Certificate (11):
{ [4752 bytes data]
(304) (IN), TLS handshake, CERT verify (15):
{ [264 bytes data]
(304) (IN), TLS handshake, Finished (20):
{ [52 bytes data]
(304) (OUT), TLS handshake, Finished (20):
} [52 bytes data]
SSL connection using TLSv1.3 / AEAD-AES256-GCM-SHA384
ALPN, server accepted to use h2
Server certificate:
subject: C=US; ST=California; O=OpenVPN Inc.; CN=*.openvpn.net
start date: Mar 21 00:00:00 2022 GMT
expire date: Apr 20 23:59:59 2023 GMT
subjectAltName: host "swupdate.openvpn.net" matched cert's "*.openvpn.net"
issuer: C=GB; ST=Greater Manchester; L=Salford; O=Sectigo Limited; CN=Sectigo RSA Organization Validation Secure Server CA
SSL certificate verify ok.
Using HTTP2, server supports multiplexing
Connection state changed (HTTP/2 confirmed)
Copying HTTP/2 data in stream buffer to connection buffer after upgrade: len=0
Using Stream ID: 1 (easy handle 0x7fba4200b600)
GET /downloads/connect/openvpn-connect-3.4.1.4522_signed.dmg HTTP/2
Host: swupdate.openvpn.net
user-agent: curl/7.79.1
accept: */*
Connection state changed (MAX_CONCURRENT_STREAMS == 256)!
< HTTP/2 403
< date: Wed, 11 Jan 2023 04:03:46 GMT
< content-type: text/plain; charset=UTF-8
< content-length: 16
< x-frame-options: SAMEORIGIN
< referrer-policy: same-origin
< cache-control: private, max-age=0, no-store, no-cache, must-revalidate, post-check=0, pre-check=0
< expires: Thu, 01 Jan 1970 00:00:01 GMT
< server: cloudflare
< cf-ray: 787ac3b57abe4b50-SIN
The requested URL returned error: 403
stopped the pause stream!
Connection #1 to host swupdate.openvpn.net left intact
curl: (22) The requested URL returned error: 403
2023-01-11 12:03:45 : REQ : openvpnconnectv3 : ################## End Installomator, exit code 2
resubmit of #852, branch closed before review/merge
|
gharchive/issue
| 2023-01-11T04:09:21 |
2025-04-01T04:55:11.841209
|
{
"authors": [
"asri-tm"
],
"repo": "Installomator/Installomator",
"url": "https://github.com/Installomator/Installomator/issues/845",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2047731724
|
Added finaldraft12, finaldraft11, evercast
Contributing back some fragments I wrote.
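The Evercast fragment's shape can be reconstructed from the debug output below (a sketch assembled from the logged values, not the actual fragment file):

```shell
# Values below are copied from the DEBUG lines in the transcript; the
# surrounding case statement mimics how Installomator dispatches labels.
label="evercast"
case $label in
evercast)
    name="Evercast"
    type="pkg"
    downloadURL="https://s3.amazonaws.com/files.evercast.us/Evercast.pkg"
    expectedTeamID="H2J9VR2HM2"
    ;;
esac
echo "label $label -> $name ($type), team $expectedTeamID"
```

assemble.sh then stitches fragments like this into the full script, as in the build transcript that follows.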
TPC1076:Installomator ben.macphail$ sudo ./assemble.sh evercast DEBUG=0 && sudo ./assemble.sh finaldraft11 DEBUG=0 && sudo ./assemble.sh finaldraft12 DEBUG=0
Password:
2023-12-18 16:59:18 : REQ : evercast : ################## Start Installomator v. 10.6beta, date 2023-12-18
2023-12-18 16:59:18 : INFO : evercast : ################## Version: 10.6beta
2023-12-18 16:59:18 : INFO : evercast : ################## Date: 2023-12-18
2023-12-18 16:59:18 : INFO : evercast : ################## evercast
2023-12-18 16:59:18 : DEBUG : evercast : DEBUG mode 1 enabled.
2023-12-18 16:59:18 : INFO : evercast : setting variable from argument DEBUG=0
2023-12-18 16:59:18 : DEBUG : evercast : name=Evercast
2023-12-18 16:59:18 : DEBUG : evercast : appName=
2023-12-18 16:59:18 : DEBUG : evercast : type=pkg
2023-12-18 16:59:18 : DEBUG : evercast : archiveName=
2023-12-18 16:59:18 : DEBUG : evercast : downloadURL=https://s3.amazonaws.com/files.evercast.us/Evercast.pkg
2023-12-18 16:59:18 : DEBUG : evercast : curlOptions=
2023-12-18 16:59:18 : DEBUG : evercast : appNewVersion=
2023-12-18 16:59:18 : DEBUG : evercast : appCustomVersion function: Not defined
2023-12-18 16:59:18 : DEBUG : evercast : versionKey=CFBundleShortVersionString
2023-12-18 16:59:18 : DEBUG : evercast : packageID=
2023-12-18 16:59:18 : DEBUG : evercast : pkgName=
2023-12-18 16:59:18 : DEBUG : evercast : choiceChangesXML=
2023-12-18 16:59:18 : DEBUG : evercast : expectedTeamID=H2J9VR2HM2
2023-12-18 16:59:18 : DEBUG : evercast : blockingProcesses=
2023-12-18 16:59:18 : DEBUG : evercast : installerTool=
2023-12-18 16:59:18 : DEBUG : evercast : CLIInstaller=
2023-12-18 16:59:18 : DEBUG : evercast : CLIArguments=
2023-12-18 16:59:18 : DEBUG : evercast : updateTool=
2023-12-18 16:59:18 : DEBUG : evercast : updateToolArguments=
2023-12-18 16:59:18 : DEBUG : evercast : updateToolRunAsCurrentUser=
2023-12-18 16:59:18 : INFO : evercast : BLOCKING_PROCESS_ACTION=tell_user
2023-12-18 16:59:18 : INFO : evercast : NOTIFY=success
2023-12-18 16:59:18 : INFO : evercast : LOGGING=DEBUG
2023-12-18 16:59:18 : INFO : evercast : LOGO=/System/Applications/App Store.app/Contents/Resources/AppIcon.icns
2023-12-18 16:59:18 : INFO : evercast : Label type: pkg
2023-12-18 16:59:18 : INFO : evercast : archiveName: Evercast.pkg
2023-12-18 16:59:18 : INFO : evercast : no blocking processes defined, using Evercast as default
2023-12-18 16:59:18 : DEBUG : evercast : Changing directory to /var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T/tmp.Zh9asjeF
2023-12-18 16:59:18 : INFO : evercast : name: Evercast, appName: Evercast.app
2023-12-18 16:59:18.819 mdfind[22328:369721] [UserQueryParser] Loading keywords and predicates for locale "en_CA"
2023-12-18 16:59:18.820 mdfind[22328:369721] [UserQueryParser] Loading keywords and predicates for locale "en"
2023-12-18 16:59:18.864 mdfind[22328:369721] Couldn't determine the mapping between prefab keywords and predicates.
2023-12-18 16:59:18 : WARN : evercast : No previous app found
2023-12-18 16:59:18 : WARN : evercast : could not find Evercast.app
2023-12-18 16:59:18 : INFO : evercast : appversion:
2023-12-18 16:59:18 : INFO : evercast : Latest version not specified.
2023-12-18 16:59:18 : REQ : evercast : Downloading https://s3.amazonaws.com/files.evercast.us/Evercast.pkg to Evercast.pkg
2023-12-18 16:59:18 : DEBUG : evercast : No Dialog connection, just download
2023-12-18 16:59:33 : DEBUG : evercast : File list: -rw-r--r--@ 1 root wheel 125M 18 Dec 16:59 Evercast.pkg
2023-12-18 16:59:33 : DEBUG : evercast : File type: Evercast.pkg: xar archive compressed TOC: 4509, SHA-1 checksum
2023-12-18 16:59:33 : DEBUG : evercast : curl output was:
* Trying 52.217.137.136:443...
* Connected to s3.amazonaws.com (52.217.137.136) port 443 (#0)
* ALPN: offers h2,http/1.1
* (304) (OUT), TLS handshake, Client hello (1):
} [321 bytes data]
* CAfile: /etc/ssl/cert.pem
* CApath: none
* (304) (IN), TLS handshake, Server hello (2):
{ [106 bytes data]
* TLSv1.2 (IN), TLS handshake, Certificate (11):
{ [5571 bytes data]
* TLSv1.2 (IN), TLS handshake, Server key exchange (12):
{ [333 bytes data]
* TLSv1.2 (IN), TLS handshake, Server finished (14):
{ [4 bytes data]
* TLSv1.2 (OUT), TLS handshake, Client key exchange (16):
} [70 bytes data]
* TLSv1.2 (OUT), TLS change cipher, Change cipher spec (1):
} [1 bytes data]
* TLSv1.2 (OUT), TLS handshake, Finished (20):
} [16 bytes data]
* TLSv1.2 (IN), TLS change cipher, Change cipher spec (1):
{ [1 bytes data]
* TLSv1.2 (IN), TLS handshake, Finished (20):
{ [16 bytes data]
* SSL connection using TLSv1.2 / ECDHE-RSA-AES128-GCM-SHA256
* ALPN: server accepted http/1.1
* Server certificate:
* subject: CN=s3.amazonaws.com
* start date: Oct 10 00:00:00 2023 GMT
* expire date: Jul 10 23:59:59 2024 GMT
* subjectAltName: host "s3.amazonaws.com" matched cert's "s3.amazonaws.com"
* issuer: C=US; O=Amazon; CN=Amazon RSA 2048 M01
* SSL certificate verify ok.
* using HTTP/1.1
> GET /files.evercast.us/Evercast.pkg HTTP/1.1
> Host: s3.amazonaws.com
> User-Agent: curl/8.1.2
> Accept: */*
>
< HTTP/1.1 200 OK
< x-amz-id-2: 1gjMq2rT9PG2AHnkmXEsPi6zstMdNVdd8IFwr38e4JcfZ2WcKMWYKIlgXPaXHujOgMPw+F8yZ2c=
< x-amz-request-id: 5VARA0J6Z8SAFSP1
< Date: Tue, 19 Dec 2023 00:59:20 GMT
< Last-Modified: Mon, 20 Nov 2023 16:58:34 GMT
< ETag: "e0ade2ce4e2846690d54ec22c3c7cd15-8"
< x-amz-server-side-encryption: AES256
< x-amz-version-id: TsKxFHtkyi0X.b9gwPphh1Z1z7RnGvqo
< Accept-Ranges: bytes
< Content-Type: binary/octet-stream
< Server: AmazonS3
< Content-Length: 131505358
<
{ [7617 bytes data]
* Connection #0 to host s3.amazonaws.com left intact
2023-12-18 16:59:33 : REQ : evercast : no more blocking processes, continue with update
2023-12-18 16:59:33 : REQ : evercast : Installing Evercast
2023-12-18 16:59:33 : INFO : evercast : Verifying: Evercast.pkg
2023-12-18 16:59:33 : DEBUG : evercast : File list: -rw-r--r--@ 1 root wheel 125M 18 Dec 16:59 Evercast.pkg
2023-12-18 16:59:33 : DEBUG : evercast : File type: Evercast.pkg: xar archive compressed TOC: 4509, SHA-1 checksum
2023-12-18 16:59:34 : DEBUG : evercast : spctlOut is Evercast.pkg: accepted
2023-12-18 16:59:34 : DEBUG : evercast : source=Notarized Developer ID
2023-12-18 16:59:34 : DEBUG : evercast : override=security disabled
2023-12-18 16:59:34 : DEBUG : evercast : origin=Developer ID Installer: Evercast LLC (H2J9VR2HM2)
2023-12-18 16:59:34 : INFO : evercast : Team ID: H2J9VR2HM2 (expected: H2J9VR2HM2 )
2023-12-18 16:59:34 : INFO : evercast : Installing Evercast.pkg to /
2023-12-18 17:00:17 : DEBUG : evercast : Debugging enabled, installer output was:
Dec 18 16:59:34 installer[22427] <Debug>: Product archive /private/var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T/tmp.Zh9asjeF/Evercast.pkg trustLevel=350
Dec 18 16:59:34 installer[22427] <Debug>: External component packages (1) trustLevel=350
Dec 18 16:59:34 installer[22427] <Debug>: -[IFDInstallController(Private) _buildInstallPlanReturningError:]: location = file://localhost
Dec 18 16:59:34 installer[22427] <Debug>: -[IFDInstallController(Private) _buildInstallPlanReturningError:]: file://localhost/private/var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T/tmp.Zh9asjeF/Evercast.pkg#Evercast.pkg
Dec 18 16:59:34 installer[22427] <Info>: Set authorization level to root for session
Dec 18 16:59:34 installer[22427] <Info>: Authorization is being checked, waiting until authorization arrives.
Dec 18 16:59:34 installer[22427] <Info>: Administrator authorization granted.
Dec 18 16:59:34 installer[22427] <Info>: Packages have been authorized for installation.
Dec 18 16:59:34 installer[22427] <Debug>: Will use PK session
Dec 18 16:59:34 installer[22427] <Debug>: Using authorization level of root for IFPKInstallElement
Dec 18 16:59:34 installer[22427] <Info>: Starting installation:
Dec 18 16:59:34 installer[22427] <Notice>: Configuring volume "Macintosh HD"
Dec 18 16:59:34 installer[22427] <Info>: Preparing disk for local booted install.
Dec 18 16:59:34 installer[22427] <Notice>: Free space on "Macintosh HD": 365.08 GB (365081833472 bytes).
Dec 18 16:59:34 installer[22427] <Notice>: Create temporary directory "/var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T//Install.22427cSXwbl"
Dec 18 16:59:34 installer[22427] <Notice>: IFPKInstallElement (1 packages)
Dec 18 16:59:34 installer[22427] <Info>: Current Path: /usr/sbin/installer
Dec 18 16:59:34 installer[22427] <Info>: Current Path: /bin/zsh
Dec 18 16:59:34 installer[22427] <Info>: Current Path: /usr/bin/sudo
Dec 18 16:59:34 installer[22427] <Notice>: PackageKit: Enqueuing install with framework-specified quality of service (utility)
installer: Package name is Evercast
installer: Upgrading at base path /
installer: Preparing for installation….....
installer: Preparing the disk….....
installer: Preparing Evercast….....
installer: Waiting for other installations to complete….....
installer: Configuring the installation….....
installer:
#
installer: Writing files….....
#
installer: Writing files….....
#
installer: Writing files….....
#
installer: Registering updated components….....
#
installer: Registering updated components….....
#
installer: Registering updated components….....
#
installer: Registering updated components….....
#
installer: Registering updated components….....
#
installer: Registering updated components….....
#
installer: Registering updated components….....
#
installer: Registering updated components….....
#
installer: Registering updated components….....
#
installer: Registering updated components….....
#
installer: Registering updated components….....
#
installer: Registering updated components….....
#Dec 18 17:00:16 installer[22427] <Info>: PackageKit: Registered bundle file:///Applications/Evercast.app/ for uid 0
installer: Registering updated components….....
Last Log repeated 67 times
installer: Validating packages….....
#Dec 18 17:00:16 installer[22427] <Notice>: Running install actions
Dec 18 17:00:16 installer[22427] <Notice>: Removing temporary directory "/var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T//Install.22427cSXwbl"
Dec 18 17:00:16 installer[22427] <Notice>: Finalize disk "Macintosh HD"
Dec 18 17:00:16 installer[22427] <Notice>: Notifying system of updated components
Dec 18 17:00:16 installer[22427] <Notice>:
Dec 18 17:00:16 installer[22427] <Notice>: **** Summary Information ****
Dec 18 17:00:16 installer[22427] <Notice>: Operation Elapsed time
Dec 18 17:00:16 installer[22427] <Notice>: -----------------------------
Dec 18 17:00:16 installer[22427] <Notice>: disk 0.02 seconds
Dec 18 17:00:16 installer[22427] <Notice>: script 0.01 seconds
Dec 18 17:00:16 installer[22427] <Notice>: zero 0.00 seconds
Dec 18 17:00:16 installer[22427] <Notice>: install 42.19 seconds
Dec 18 17:00:16 installer[22427] <Notice>: -total- 42.22 seconds
Dec 18 17:00:16 installer[22427] <Notice>:
installer: Running installer actions…
installer:
installer: Finishing the Installation….....
installer:
#
installer: The software was successfully installed......
installer: The upgrade was successful.
Output of /var/log/install.log below this line.
----------------------------------------------------------
2023-12-18 16:59:34-08 TPC1076 installer[22427]: Product archive /private/var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T/tmp.Zh9asjeF/Evercast.pkg trustLevel=350
2023-12-18 16:59:34-08 TPC1076 installer[22427]: External component packages (1) trustLevel=350
2023-12-18 16:59:34-08 TPC1076 installer[22427]: -[IFDInstallController(Private) _buildInstallPlanReturningError:]: location = file://localhost
2023-12-18 16:59:34-08 TPC1076 installer[22427]: -[IFDInstallController(Private) _buildInstallPlanReturningError:]: file://localhost/private/var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T/tmp.Zh9asjeF/Evercast.pkg#Evercast.pkg
2023-12-18 16:59:34-08 TPC1076 installer[22427]: Set authorization level to root for session
2023-12-18 16:59:34-08 TPC1076 installer[22427]: Authorization is being checked, waiting until authorization arrives.
2023-12-18 16:59:34-08 TPC1076 installer[22427]: Administrator authorization granted.
2023-12-18 16:59:34-08 TPC1076 installer[22427]: Packages have been authorized for installation.
2023-12-18 16:59:34-08 TPC1076 installer[22427]: Will use PK session
2023-12-18 16:59:34-08 TPC1076 installer[22427]: Using authorization level of root for IFPKInstallElement
2023-12-18 16:59:34-08 TPC1076 installer[22427]: Starting installation:
2023-12-18 16:59:34-08 TPC1076 installer[22427]: Configuring volume "Macintosh HD"
2023-12-18 16:59:34-08 TPC1076 installer[22427]: Preparing disk for local booted install.
2023-12-18 16:59:34-08 TPC1076 installer[22427]: Free space on "Macintosh HD": 365.08 GB (365081833472 bytes).
2023-12-18 16:59:34-08 TPC1076 installer[22427]: Create temporary directory "/var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T//Install.22427cSXwbl"
2023-12-18 16:59:34-08 TPC1076 installer[22427]: IFPKInstallElement (1 packages)
2023-12-18 16:59:34-08 TPC1076 installer[22427]: Current Path: /usr/sbin/installer
2023-12-18 16:59:34-08 TPC1076 installer[22427]: Current Path: /bin/zsh
2023-12-18 16:59:34-08 TPC1076 installer[22427]: Current Path: /usr/bin/sudo
2023-12-18 16:59:34-08 TPC1076 installd[2540]: PackageKit: Adding client PKInstallDaemonClient pid=22427, uid=0 (/usr/sbin/installer)
2023-12-18 16:59:34-08 TPC1076 installer[22427]: PackageKit: Enqueuing install with framework-specified quality of service (utility)
2023-12-18 16:59:34-08 TPC1076 installd[2540]: PackageKit: Set reponsibility for install to 15095
2023-12-18 16:59:34-08 TPC1076 installd[2540]: PackageKit: Hosted team responsibility for install set to team:(H2J9VR2HM2)
2023-12-18 16:59:34-08 TPC1076 installd[2540]: PackageKit: ----- Begin install -----
2023-12-18 16:59:34-08 TPC1076 installd[2540]: PackageKit: request=PKInstallRequest <1 packages, destination=/>
2023-12-18 16:59:34-08 TPC1076 installd[2540]: PackageKit: packages=(
2023-12-18 16:59:34-08 TPC1076 installd[2540]: PackageKit: Extracting file:///var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T/tmp.Zh9asjeF/Evercast.pkg#Evercast.pkg (destination=/Library/InstallerSandboxes/.PKInstallSandboxManager/AAC8A1E9-7D72-4E84-88E1-CC228B1F768B.activeSandbox/Root/Applications, uid=0)
2023-12-18 16:59:36-08 TPC1076 installd[2540]: PackageKit: prevent user idle system sleep
2023-12-18 16:59:36-08 TPC1076 installd[2540]: PackageKit: suspending backupd
2023-12-18 16:59:36-08 TPC1076 installd[2540]: PackageKit: Using trashcan path /var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T/PKInstallSandboxTrash/AAC8A1E9-7D72-4E84-88E1-CC228B1F768B.sandboxTrash for sandbox /Library/InstallerSandboxes/.PKInstallSandboxManager/AAC8A1E9-7D72-4E84-88E1-CC228B1F768B.activeSandbox
2023-12-18 16:59:36-08 TPC1076 installd[2540]: PackageKit: Shoving /Library/InstallerSandboxes/.PKInstallSandboxManager/AAC8A1E9-7D72-4E84-88E1-CC228B1F768B.activeSandbox/Root (1 items) to /
2023-12-18 16:59:36-08 TPC1076 install_monitor[22428]: Temporarily excluding: /Applications, /Library, /System, /bin, /private, /sbin, /usr
2023-12-18 16:59:36-08 TPC1076 installd[2540]: PackageKit: Writing receipt for us.evercast.Evercast to /
2023-12-18 16:59:39-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/EvercastLiveRoom.framework/Versions/A/Frameworks/Evercast.WebRTC.framework/Versions/A/Evercast.WebRTC
2023-12-18 16:59:39-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/EvercastNetworking.framework/Versions/A/EvercastNetworking
2023-12-18 16:59:39-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/EvercastNetworking.framework/Versions/A/Frameworks/Evercast.Connections.framework/Versions/A/Evercast.Connections
2023-12-18 16:59:40-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/EvercastNetworking.framework/Versions/A/Frameworks/EvercastLogger.framework/Versions/A/EvercastLogger
2023-12-18 16:59:40-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/EvercastNetworking.framework/Versions/A/Frameworks/EvercastLogger.framework/Versions/A/Frameworks/Evercast.Configuration.framework/Versions/A/Evercast.Configuration
2023-12-18 16:59:40-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/EvercastNetworking.framework/Versions/A/Frameworks/EvercastLogger.framework/Versions/A/Frameworks/Evercast.Logger.framework/Versions/A/Evercast.Logger
2023-12-18 16:59:40-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/EvercastNetworking.framework/Versions/A/Frameworks/Evercast.RESTfulAPI.framework/Versions/A/Evercast.RESTfulAPI
2023-12-18 16:59:40-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/EvercastCore.framework/Versions/A/EvercastCore
2023-12-18 16:59:40-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/EvercastLogin.framework/Versions/A/EvercastLogin
2023-12-18 16:59:40-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/Evercast.Configuration.framework/Versions/A/Evercast.Configuration
2023-12-18 16:59:40-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/EvercastLiveRoom.framework/Versions/A/Frameworks/EvercastLogger.framework/Versions/A/EvercastLogger
2023-12-18 16:59:40-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/EvercastLiveRoom.framework/Versions/A/Frameworks/EvercastLogger.framework/Versions/A/Frameworks/Evercast.Configuration.framework/Versions/A/Evercast.Configuration
2023-12-18 16:59:40-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/EvercastDiagnosticsProvider.framework/Versions/A/EvercastDiagnosticsProvider
2023-12-18 16:59:41-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/EvercastNetworking.framework/Versions/A/Frameworks/Evercast.Telemetry.framework/Versions/A/Evercast.Telemetry
2023-12-18 16:59:41-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/EvercastLiveRoom.framework/Versions/A/Frameworks/Evercast.MediaManager.framework/Versions/A/Evercast.MediaManager
2023-12-18 16:59:41-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/EvercastRecordings.framework/Versions/A/EvercastRecordings
2023-12-18 16:59:41-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/EvercastRecordings.framework/Versions/A/Frameworks/Evercast.SDLinterface.framework/Versions/A/Evercast.SDLinterface
2023-12-18 16:59:42-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/EvercastAccountSettings.framework/Versions/A/EvercastAccountSettings
2023-12-18 16:59:43-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/PixelWeave.framework/Versions/A/PixelWeave
2023-12-18 16:59:43-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/EvercastSubscriptions.framework/Versions/A/EvercastSubscriptions
2023-12-18 16:59:53-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/EvercastLiveRoom.framework/Versions/A/Frameworks/Evercast.LiveRoom.framework/Versions/A/Evercast.LiveRoom
2023-12-18 16:59:54-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/EvercastNetworking.framework/Versions/A/Frameworks/Evercast.GraphQLClient.framework/Versions/A/Evercast.GraphQLClient
2023-12-18 16:59:55-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/EvercastDiagnosticsProvider.framework/Versions/A/Frameworks/Evercast.Diagnostics.framework/Versions/A/Evercast.Diagnostics
2023-12-18 16:59:55-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/EvercastLiveRoom.framework/Versions/A/Frameworks/EvercastLogger.framework/Versions/A/Frameworks/Evercast.Logger.framework/Versions/A/Evercast.Logger
2023-12-18 16:59:55-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/Evercast.SDLinterface.framework/Versions/A/Evercast.SDLinterface
2023-12-18 16:59:55-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/EvercastConfig.framework/Versions/A/EvercastConfig
2023-12-18 16:59:56-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/EvercastConfig.framework/Versions/A/Frameworks/Evercast.Configuration.framework/Versions/A/Evercast.Configuration
2023-12-18 16:59:56-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/EvercastLiveRoom.framework/Versions/A/EvercastLiveRoom
2023-12-18 16:59:57-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/EvercastLiveRoom.framework/Versions/A/Frameworks/Evercast.Connections.framework/Versions/A/Evercast.Connections
2023-12-18 16:59:57-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/EvercastLiveRoom.framework/Versions/A/Frameworks/Evercast.SDLinterface.framework/Versions/A/Evercast.SDLinterface
2023-12-18 16:59:57-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/Evercast.Common.framework/Versions/A/Evercast.Common
2023-12-18 16:59:58-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/Evercast.Diagnostics.framework/Versions/A/Evercast.Diagnostics
2023-12-18 16:59:59-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/Evercast.GraphQLClient.framework/Versions/A/Evercast.GraphQLClient
2023-12-18 17:00:05-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/Evercast.LiveRoom.framework/Versions/A/Evercast.LiveRoom
2023-12-18 17:00:05-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/Evercast.Logger.framework/Versions/A/Evercast.Logger
2023-12-18 17:00:05-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/Evercast.MediaManager.framework/Versions/A/Evercast.MediaManager
2023-12-18 17:00:05-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/Evercast.Telemetry.framework/Versions/A/Evercast.Telemetry
2023-12-18 17:00:07-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/Evercast.WebRTC.framework/Versions/A/Evercast.WebRTC
2023-12-18 17:00:07-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/EvercastLogger.framework/Versions/A/EvercastLogger
2023-12-18 17:00:07-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/EvercastLogger.framework/Versions/A/Frameworks/Evercast.Configuration.framework/Versions/A/Evercast.Configuration
2023-12-18 17:00:07-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/EvercastLogger.framework/Versions/A/Frameworks/Evercast.Logger.framework/Versions/A/Evercast.Logger
2023-12-18 17:00:09-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/libavcodec.dylib
2023-12-18 17:00:09-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/libavdevice.dylib
2023-12-18 17:00:10-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/libavfilter.dylib
2023-12-18 17:00:10-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/libavformat.dylib
2023-12-18 17:00:11-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/libavutil.dylib
2023-12-18 17:00:11-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/libswiftAVFoundation.dylib
2023-12-18 17:00:11-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/libswiftAppKit.dylib
2023-12-18 17:00:11-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/libswiftCore.dylib
2023-12-18 17:00:11-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/libswiftCoreAudio.dylib
2023-12-18 17:00:11-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/libswiftCoreData.dylib
2023-12-18 17:00:11-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/libswiftCoreFoundation.dylib
2023-12-18 17:00:11-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/libswiftCoreGraphics.dylib
2023-12-18 17:00:11-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/libswiftCoreImage.dylib
2023-12-18 17:00:12-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/libswiftCoreMedia.dylib
2023-12-18 17:00:12-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/libswiftDarwin.dylib
2023-12-18 17:00:12-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/libswiftDispatch.dylib
2023-12-18 17:00:12-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/libswiftFoundation.dylib
2023-12-18 17:00:12-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/libswiftGLKit.dylib
2023-12-18 17:00:12-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/libswiftIOKit.dylib
2023-12-18 17:00:12-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/libswiftMetal.dylib
2023-12-18 17:00:12-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/libswiftMetalKit.dylib
2023-12-18 17:00:12-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/libswiftModelIO.dylib
2023-12-18 17:00:13-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/libswiftObjectiveC.dylib
2023-12-18 17:00:13-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/libswiftQuartzCore.dylib
2023-12-18 17:00:13-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/libswiftSpriteKit.dylib
2023-12-18 17:00:13-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/libswiftXPC.dylib
2023-12-18 17:00:13-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/libswiftos.dylib
2023-12-18 17:00:13-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/libswiftsimd.dylib
2023-12-18 17:00:13-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/libswresample.dylib
2023-12-18 17:00:13-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/libswscale.dylib
2023-12-18 17:00:13-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/zimg.dylib
2023-12-18 17:00:14-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/MacOS/Evercast
2023-12-18 17:00:14-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/EvercastUI.framework/Versions/A/EvercastUI
2023-12-18 17:00:15-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/Evercast.Connections.framework/Versions/A/Evercast.Connections
2023-12-18 17:00:15-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/EvercastLiveRoom.framework/Versions/A/Frameworks/Evercast.Common.framework/Versions/A/Evercast.Common
2023-12-18 17:00:15-08 TPC1076 installd[2540]: oahd translated /Applications/Evercast.app/Contents/Frameworks/EvercastDashboard.framework/Versions/A/EvercastDashboard
2023-12-18 17:00:15-08 TPC1076 installd[2540]: PackageKit: Translated 77 binaries.
2023-12-18 17:00:15-08 TPC1076 installd[2540]: PackageKit: Touched bundle /Applications/Evercast.app
2023-12-18 17:00:15-08 TPC1076 installd[2540]: Installed "Evercast" ()
2023-12-18 17:00:15-08 TPC1076 installd[2540]: Successfully wrote install history to /Library/Receipts/InstallHistory.plist
2023-12-18 17:00:15-08 TPC1076 install_monitor[22428]: Re-included: /Applications, /Library, /System, /bin, /private, /sbin, /usr
2023-12-18 17:00:16-08 TPC1076 installd[2540]: PackageKit: releasing backupd
2023-12-18 17:00:16-08 TPC1076 installd[2540]: PackageKit: allow user idle system sleep
2023-12-18 17:00:16-08 TPC1076 installd[2540]: PackageKit: ----- End install -----
2023-12-18 17:00:16-08 TPC1076 installd[2540]: PackageKit: 41.9s elapsed install time
2023-12-18 17:00:16-08 TPC1076 installd[2540]: PackageKit: Cleared responsibility for install from 22427.
2023-12-18 17:00:16-08 TPC1076 installd[2540]: PackageKit: Hosted team responsible for install has been cleared.
2023-12-18 17:00:16-08 TPC1076 installd[2540]: PackageKit: Running idle tasks
2023-12-18 17:00:16-08 TPC1076 installd[2540]: PackageKit: Done with sandbox removals
2023-12-18 17:00:16-08 TPC1076 installer[22427]: PackageKit: Registered bundle file:///Applications/Evercast.app/ for uid 0
2023-12-18 17:00:16-08 TPC1076 installd[2540]: PackageKit: Removing client PKInstallDaemonClient pid=22427, uid=0 (/usr/sbin/installer)
2023-12-18 17:00:16-08 TPC1076 installer[22427]: Running install actions
2023-12-18 17:00:16-08 TPC1076 installer[22427]: Removing temporary directory "/var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T//Install.22427cSXwbl"
2023-12-18 17:00:16-08 TPC1076 installer[22427]: Finalize disk "Macintosh HD"
2023-12-18 17:00:16-08 TPC1076 installer[22427]: Notifying system of updated components
2023-12-18 17:00:16-08 TPC1076 installer[22427]:
2023-12-18 17:00:16-08 TPC1076 installer[22427]: **** Summary Information ****
2023-12-18 17:00:16-08 TPC1076 installer[22427]: Operation Elapsed time
2023-12-18 17:00:16-08 TPC1076 installer[22427]: -----------------------------
2023-12-18 17:00:16-08 TPC1076 installer[22427]: disk 0.02 seconds
2023-12-18 17:00:16-08 TPC1076 installer[22427]: script 0.01 seconds
2023-12-18 17:00:16-08 TPC1076 installer[22427]: zero 0.00 seconds
2023-12-18 17:00:16-08 TPC1076 installer[22427]: install 42.19 seconds
2023-12-18 17:00:16-08 TPC1076 installer[22427]: -total- 42.22 seconds
2023-12-18 17:00:16-08 TPC1076 installer[22427]:
2023-12-18 17:00:17 : INFO : evercast : Finishing...
2023-12-18 17:00:20 : INFO : evercast : App(s) found: /Applications/Evercast.app
2023-12-18 17:00:20 : INFO : evercast : found app at /Applications/Evercast.app, version 3.1.2, on versionKey CFBundleShortVersionString
2023-12-18 17:00:20 : REQ : evercast : Installed Evercast
2023-12-18 17:00:20 : INFO : evercast : notifying
2023-12-18 17:00:21 : DEBUG : evercast : Deleting /var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T/tmp.Zh9asjeF
2023-12-18 17:00:21 : DEBUG : evercast : Debugging enabled, Deleting tmpDir output was:
/var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T/tmp.Zh9asjeF/Evercast.pkg
2023-12-18 17:00:21 : DEBUG : evercast : /var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T/tmp.Zh9asjeF
2023-12-18 17:00:21 : INFO : evercast : Installomator did not close any apps, so no need to reopen any apps.
2023-12-18 17:00:21 : REQ : evercast : All done!
2023-12-18 17:00:21 : REQ : evercast : ################## End Installomator, exit code 0
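Each Installomator run closes with an "End Installomator, exit code N" line like the one above. When auditing a combined log that covers many labels, a small sketch like the following can summarize which labels finished and with what exit code. This is an illustration only, assuming the stock log line format seen in this log; it is not part of Installomator itself.

```shell
#!/bin/sh
# Summarize Installomator runs from a combined log file.
# Assumes the standard line format shown above:
#   <timestamp> : REQ : <label> : ################## End Installomator, exit code <N>
summarize_runs() {
  # Prints one "<label> <exit code>" pair per completed run.
  sed -n 's/.* : REQ : \([^ ]*\) : ################## End Installomator, exit code \([0-9][0-9]*\).*/\1 \2/p' "$1"
}

# Demo against a two-line sample matching the log above.
sample=$(mktemp)
cat > "$sample" <<'EOF'
2023-12-18 17:00:21 : REQ : evercast : ################## End Installomator, exit code 0
2023-12-18 17:05:00 : REQ : finaldraft11 : ################## End Installomator, exit code 0
EOF
result=$(summarize_runs "$sample")
printf '%s\n' "$result"
rm -f "$sample"
```

A nonzero exit code on that closing line is the quickest signal that a label needs attention.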
2023-12-18 17:00:21 : REQ : finaldraft11 : ################## Start Installomator v. 10.6beta, date 2023-12-18
2023-12-18 17:00:21 : INFO : finaldraft11 : ################## Version: 10.6beta
2023-12-18 17:00:21 : INFO : finaldraft11 : ################## Date: 2023-12-18
2023-12-18 17:00:21 : INFO : finaldraft11 : ################## finaldraft11
2023-12-18 17:00:21 : DEBUG : finaldraft11 : DEBUG mode 1 enabled.
2023-12-18 17:00:23 : INFO : finaldraft11 : setting variable from argument DEBUG=0
2023-12-18 17:00:23 : DEBUG : finaldraft11 : name=Final Draft 11
2023-12-18 17:00:23 : DEBUG : finaldraft11 : appName=
2023-12-18 17:00:23 : DEBUG : finaldraft11 : type=pkgInZip
2023-12-18 17:00:23 : DEBUG : finaldraft11 : archiveName=
2023-12-18 17:00:23 : DEBUG : finaldraft11 : downloadURL=https://www.finaldraft.com/downloads/finaldraft1117Mac.zip
2023-12-18 17:00:23 : DEBUG : finaldraft11 : curlOptions=
2023-12-18 17:00:23 : DEBUG : finaldraft11 : appNewVersion=1117
2023-12-18 17:00:23 : DEBUG : finaldraft11 : appCustomVersion function: Not defined
2023-12-18 17:00:23 : DEBUG : finaldraft11 : versionKey=CFBundleShortVersionString
2023-12-18 17:00:23 : DEBUG : finaldraft11 : packageID=
2023-12-18 17:00:23 : DEBUG : finaldraft11 : pkgName=
2023-12-18 17:00:23 : DEBUG : finaldraft11 : choiceChangesXML=
2023-12-18 17:00:23 : DEBUG : finaldraft11 : expectedTeamID=7XUZ8R5736
2023-12-18 17:00:23 : DEBUG : finaldraft11 : blockingProcesses=
2023-12-18 17:00:23 : DEBUG : finaldraft11 : installerTool=
2023-12-18 17:00:23 : DEBUG : finaldraft11 : CLIInstaller=
2023-12-18 17:00:23 : DEBUG : finaldraft11 : CLIArguments=
2023-12-18 17:00:23 : DEBUG : finaldraft11 : updateTool=
2023-12-18 17:00:23 : DEBUG : finaldraft11 : updateToolArguments=
2023-12-18 17:00:23 : DEBUG : finaldraft11 : updateToolRunAsCurrentUser=
2023-12-18 17:00:23 : INFO : finaldraft11 : BLOCKING_PROCESS_ACTION=tell_user
2023-12-18 17:00:23 : INFO : finaldraft11 : NOTIFY=success
2023-12-18 17:00:23 : INFO : finaldraft11 : LOGGING=DEBUG
2023-12-18 17:00:23 : INFO : finaldraft11 : LOGO=/System/Applications/App Store.app/Contents/Resources/AppIcon.icns
2023-12-18 17:00:23 : INFO : finaldraft11 : Label type: pkgInZip
2023-12-18 17:00:23 : INFO : finaldraft11 : archiveName: Final Draft 11.zip
2023-12-18 17:00:23 : INFO : finaldraft11 : no blocking processes defined, using Final Draft 11 as default
2023-12-18 17:00:23 : DEBUG : finaldraft11 : Changing directory to /var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T/tmp.cSEMi8LK
2023-12-18 17:00:23 : INFO : finaldraft11 : name: Final Draft 11, appName: Final Draft 11.app
2023-12-18 17:00:23.923 mdfind[22774:372052] [UserQueryParser] Loading keywords and predicates for locale "en_CA"
2023-12-18 17:00:23.923 mdfind[22774:372052] [UserQueryParser] Loading keywords and predicates for locale "en"
2023-12-18 17:00:23.960 mdfind[22774:372052] Couldn't determine the mapping between prefab keywords and predicates.
2023-12-18 17:00:23 : WARN : finaldraft11 : No previous app found
2023-12-18 17:00:23 : WARN : finaldraft11 : could not find Final Draft 11.app
2023-12-18 17:00:23 : INFO : finaldraft11 : appversion:
2023-12-18 17:00:23 : INFO : finaldraft11 : Latest version of Final Draft 11 is 1117
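At this point the run has an empty `appversion` (no previous app was found) and an `appNewVersion` of 1117, so it proceeds to download. The observable decision can be sketched roughly as follows; the variable names mirror the label variables printed earlier in the log, but this is a simplified illustration, not Installomator's actual code (which also honors options such as forced installs).

```shell
#!/bin/sh
# Rough sketch of the install/skip decision visible in the log above.
# NOT Installomator's actual code -- just the observable logic:
# download when nothing is installed or the versions differ.
needs_install() {
  appversion=$1      # version found on disk ("" if the app is absent)
  appNewVersion=$2   # latest version advertised by the label
  if [ -z "$appversion" ] || [ "$appversion" != "$appNewVersion" ]; then
    echo "install"
  else
    echo "up-to-date"
  fi
}

needs_install ""     "1117"   # app missing, as in this run
needs_install "1117" "1117"   # already current
```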
2023-12-18 17:00:24 : REQ : finaldraft11 : Downloading https://www.finaldraft.com/downloads/finaldraft1117Mac.zip to Final Draft 11.zip
2023-12-18 17:00:24 : DEBUG : finaldraft11 : No Dialog connection, just download
2023-12-18 17:00:25 : DEBUG : finaldraft11 : File list: -rw-r--r--@ 1 root wheel 70M 18 Dec 17:00 Final Draft 11.zip
2023-12-18 17:00:25 : DEBUG : finaldraft11 : File type: Final Draft 11.zip: Zip archive data, at least v2.0 to extract, compression method=deflate
2023-12-18 17:00:25 : DEBUG : finaldraft11 : curl output was:
* Trying 173.249.144.78:443...
* Connected to www.finaldraft.com (173.249.144.78) port 443 (#0)
* ALPN: offers h2,http/1.1
* (304) (OUT), TLS handshake, Client hello (1):
} [323 bytes data]
* CAfile: /etc/ssl/cert.pem
* CApath: none
* (304) (IN), TLS handshake, Server hello (2):
{ [122 bytes data]
* (304) (IN), TLS handshake, Unknown (8):
{ [19 bytes data]
* (304) (IN), TLS handshake, Certificate (11):
{ [2973 bytes data]
* (304) (IN), TLS handshake, CERT verify (15):
{ [264 bytes data]
* (304) (IN), TLS handshake, Finished (20):
{ [52 bytes data]
* (304) (OUT), TLS handshake, Finished (20):
} [52 bytes data]
* SSL connection using TLSv1.3 / AEAD-AES256-GCM-SHA384
* ALPN: server accepted h2
* Server certificate:
* subject: C=US; ST=California; L=Burbank; O=Cast & Crew, LLC; CN=*.finaldraft.com
* start date: Jan 19 00:00:00 2023 GMT
* expire date: Feb 19 23:59:59 2024 GMT
* subjectAltName: host "www.finaldraft.com" matched cert's "*.finaldraft.com"
* issuer: C=US; O=DigiCert Inc; CN=DigiCert TLS RSA SHA256 2020 CA1
* SSL certificate verify ok.
* using HTTP/2
* h2 [:method: GET]
* h2 [:scheme: https]
* h2 [:authority: www.finaldraft.com]
* h2 [:path: /downloads/finaldraft1117Mac.zip]
* h2 [user-agent: curl/8.1.2]
* h2 [accept: */*]
* Using Stream ID: 1 (easy handle 0x126812400)
> GET /downloads/finaldraft1117Mac.zip HTTP/2
> Host: www.finaldraft.com
> User-Agent: curl/8.1.2
> Accept: */*
>
< HTTP/2 200
< server: nginx
< date: Tue, 19 Dec 2023 01:00:24 GMT
< content-type: application/zip
< content-length: 73087960
< x-content-type-options: nosniff
< x-xss-protection: 1; mode=block
< cache-control: public, max-age=604800
< access-control-allow-origin: https://finaldraft.com https://trial.finaldraft.com https://store.finaldraft.com https://manuals.finaldraft.com https://remote.finaldraft.com https://answernet.finaldrat.com https://reg.finaldraft.com https://newsletter.finaldraft.com https://blog.finaldraft.com
< strict-transport-security: max-age=31536000
< referrer-policy: same-origin
< x-frame-options: DENY
< permissions-policy: autoplay=*, geolocation=(self), camera=(self), fullscreen=()
< content-security-policy: default-src *; style-src * 'unsafe-inline'; script-src * 'unsafe-inline' 'unsafe-eval'; font-src * data:; img-src * data:; frame-ancestors * https://app.hubspot.com; frame-src *;
< last-modified: Wed, 04 Oct 2023 21:40:15 GMT
< etag: "45b3bd8-606eada4f2dc0"
< cache-control: max-age=172800
< expires: Thu, 21 Dec 2023 01:00:24 GMT
< x-cache-nxaccel: BYPASS
< accept-ranges: bytes
<
{ [7491 bytes data]
* Connection #0 to host www.finaldraft.com left intact
2023-12-18 17:00:25 : REQ : finaldraft11 : no more blocking processes, continue with update
2023-12-18 17:00:25 : REQ : finaldraft11 : Installing Final Draft 11
2023-12-18 17:00:25 : INFO : finaldraft11 : Unzipping Final Draft 11.zip
2023-12-18 17:00:25 : DEBUG : finaldraft11 : Found pkg(s):
/var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T/tmp.cSEMi8LK/Final Draft 11 Installer.pkg
2023-12-18 17:00:25 : INFO : finaldraft11 : found pkg: /var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T/tmp.cSEMi8LK/Final Draft 11 Installer.pkg
2023-12-18 17:00:25 : INFO : finaldraft11 : Verifying: /var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T/tmp.cSEMi8LK/Final Draft 11 Installer.pkg
2023-12-18 17:00:25 : DEBUG : finaldraft11 : File list: -rw-r--r--@ 1 administrator staff 70M 4 Oct 14:06 /var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T/tmp.cSEMi8LK/Final Draft 11 Installer.pkg
2023-12-18 17:00:25 : DEBUG : finaldraft11 : File type: /var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T/tmp.cSEMi8LK/Final Draft 11 Installer.pkg: xar archive compressed TOC: 4811, SHA-1 checksum
2023-12-18 17:00:25 : DEBUG : finaldraft11 : spctlOut is /var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T/tmp.cSEMi8LK/Final Draft 11 Installer.pkg: accepted
2023-12-18 17:00:25 : DEBUG : finaldraft11 : source=Notarized Developer ID
2023-12-18 17:00:25 : DEBUG : finaldraft11 : override=security disabled
2023-12-18 17:00:25 : DEBUG : finaldraft11 : origin=Developer ID Installer: Cast & Crew Production Software, LLC (7XUZ8R5736)
2023-12-18 17:00:25 : INFO : finaldraft11 : Team ID: 7XUZ8R5736 (expected: 7XUZ8R5736 )
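The Team ID comparison above is derived from the `origin=` line that the package assessment produced a few lines earlier. Extracting the ID from such a line can be sketched like this; the sample string is copied from this log, while the parsing one-liner is an illustration of the check rather than the script's own implementation.

```shell
#!/bin/sh
# Pull the 10-character Team ID out of an spctl "origin=" line and
# compare it to the expected value, mirroring the check logged above.
origin='origin=Developer ID Installer: Cast & Crew Production Software, LLC (7XUZ8R5736)'
expectedTeamID='7XUZ8R5736'

teamID=$(printf '%s\n' "$origin" | sed -n 's/.*(\([A-Z0-9]\{10\}\))$/\1/p')
if [ "$teamID" = "$expectedTeamID" ]; then
  echo "Team ID: $teamID (expected: $expectedTeamID)"
else
  echo "ERROR: Team ID mismatch: $teamID" >&2
fi
```

A mismatch here is what makes Installomator abort before ever running the installer, so the download's signer is verified before anything touches disk.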
2023-12-18 17:00:25 : INFO : finaldraft11 : Installing /var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T/tmp.cSEMi8LK/Final Draft 11 Installer.pkg to /
2023-12-18 17:00:34 : DEBUG : finaldraft11 : Debugging enabled, installer output was:
Dec 18 17:00:25 installer[22882] <Info>: Architecture Translation: Distribution failed architecture check and is about to be re-executed as Intel.
Dec 18 17:00:25 installer[22882] <Warning>: Architecture Translation: Process is about to get executed as Intel.
Dec 18 17:00:25 installer[22882] <Debug>: Product archive /var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T/tmp.cSEMi8LK/Final Draft 11 Installer.pkg trustLevel=350
Dec 18 17:00:25 installer[22882] <Debug>: External component packages (1) trustLevel=350
Dec 18 17:00:25 installer[22882] <Debug>: -[IFDInstallController(Private) _buildInstallPlanReturningError:]: location = file://localhost
Dec 18 17:00:25 installer[22882] <Debug>: -[IFDInstallController(Private) _buildInstallPlanReturningError:]: file://localhost/var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T/tmp.cSEMi8LK/Final%20Draft%2011%20Installer.pkg#finalDraft11.pkg
Dec 18 17:00:25 installer[22882] <Info>: Set authorization level to root for session
Dec 18 17:00:25 installer[22882] <Info>: Authorization is being checked, waiting until authorization arrives.
Dec 18 17:00:25 installer[22882] <Info>: Administrator authorization granted.
Dec 18 17:00:25 installer[22882] <Info>: Packages have been authorized for installation.
Dec 18 17:00:25 installer[22882] <Debug>: Will use PK session
Dec 18 17:00:25 installer[22882] <Debug>: Using authorization level of root for IFPKInstallElement
Dec 18 17:00:25 installer[22882] <Info>: Install request is requesting Rosetta translation.
Dec 18 17:00:25 installer[22882] <Info>: Starting installation:
Dec 18 17:00:25 installer[22882] <Notice>: Configuring volume "Macintosh HD"
Dec 18 17:00:25 installer[22882] <Info>: Preparing disk for local booted install.
Dec 18 17:00:25 installer[22882] <Notice>: Free space on "Macintosh HD": 364.28 GB (364278161408 bytes).
Dec 18 17:00:25 installer[22882] <Notice>: Create temporary directory "/var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T//Install.22882R8W31B"
Dec 18 17:00:25 installer[22882] <Notice>: IFPKInstallElement (1 packages)
Dec 18 17:00:25 installer[22882] <Info>: Current Path: /usr/sbin/installer
Dec 18 17:00:25 installer[22882] <Info>: Current Path: /bin/zsh
Dec 18 17:00:25 installer[22882] <Info>: Current Path: /usr/bin/sudo
Dec 18 17:00:25 installer[22882] <Notice>: PackageKit: Enqueuing install with framework-specified quality of service (utility)
installer: Package name is Final Draft 11
installer: Upgrading at base path /
installer: Preparing for installation….....
installer: Preparing the disk….....
installer: Preparing Final Draft 11….....
installer: Waiting for other installations to complete….....
installer: Configuring the installation….....
installer:
#
installer: Writing files….....
#
installer: Running package scripts….....
#
installer: Registering updated components….....
#
installer: Registering updated components….....
#
installer: Registering updated components….....
#
installer: Registering updated components….....
#
installer: Registering updated components….....
#Dec 18 17:00:32 installer[22882] <Info>: PackageKit: Registered bundle file:///Applications/Final%20Draft%2011.app/ for uid 0
Dec 18 17:00:32 installer[22882] <Info>: PackageKit: Registered bundle file:///Applications/Final%20Draft%2011.app/Contents/Frameworks/Sparkle.framework/Versions/A/Autoupdate.app/ for uid 0
installer: Registering updated components….....
Last Log repeated 5 times
installer: Validating packages….....
#Dec 18 17:00:32 installer[22882] <Notice>: Running install actions
Dec 18 17:00:32 installer[22882] <Notice>: Removing temporary directory "/var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T//Install.22882R8W31B"
Dec 18 17:00:32 installer[22882] <Notice>: Finalize disk "Macintosh HD"
Dec 18 17:00:32 installer[22882] <Notice>: Notifying system of updated components
Dec 18 17:00:32 installer[22882] <Notice>:
Dec 18 17:00:32 installer[22882] <Notice>: **** Summary Information ****
Dec 18 17:00:32 installer[22882] <Notice>: Operation Elapsed time
Dec 18 17:00:32 installer[22882] <Notice>: -----------------------------
Dec 18 17:00:32 installer[22882] <Notice>: disk 0.02 seconds
Dec 18 17:00:32 installer[22882] <Notice>: script 0.00 seconds
Dec 18 17:00:32 installer[22882] <Notice>: zero 0.00 seconds
Dec 18 17:00:32 installer[22882] <Notice>: install 7.07 seconds
Dec 18 17:00:32 installer[22882] <Notice>: -total- 7.09 seconds
Dec 18 17:00:32 installer[22882] <Notice>:
installer: Running installer actions…
installer:
installer: Finishing the Installation….....
installer:
#
installer: The software was successfully installed......
installer: The upgrade was successful.
Output of /var/log/install.log below this line.
----------------------------------------------------------
2023-12-18 17:00:25-08 TPC1076 installer[22882]: Architecture Translation: Distribution failed architecture check and is about to be re-executed as Intel.
2023-12-18 17:00:25-08 TPC1076 installer[22882]: Architecture Translation: Process is about to get executed as Intel.
2023-12-18 17:00:25-08 TPC1076 installer[22882]: Product archive /var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T/tmp.cSEMi8LK/Final Draft 11 Installer.pkg trustLevel=350
2023-12-18 17:00:25-08 TPC1076 installer[22882]: External component packages (1) trustLevel=350
2023-12-18 17:00:25-08 TPC1076 installer[22882]: -[IFDInstallController(Private) _buildInstallPlanReturningError:]: location = file://localhost
2023-12-18 17:00:25-08 TPC1076 installer[22882]: -[IFDInstallController(Private) _buildInstallPlanReturningError:]: file://localhost/var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T/tmp.cSEMi8LK/Final%20Draft%2011%20Installer.pkg#finalDraft11.pkg
2023-12-18 17:00:25-08 TPC1076 installer[22882]: Set authorization level to root for session
2023-12-18 17:00:25-08 TPC1076 installer[22882]: Authorization is being checked, waiting until authorization arrives.
2023-12-18 17:00:25-08 TPC1076 installer[22882]: Administrator authorization granted.
2023-12-18 17:00:25-08 TPC1076 installer[22882]: Packages have been authorized for installation.
2023-12-18 17:00:25-08 TPC1076 installer[22882]: Will use PK session
2023-12-18 17:00:25-08 TPC1076 installer[22882]: Using authorization level of root for IFPKInstallElement
2023-12-18 17:00:25-08 TPC1076 installer[22882]: Install request is requesting Rosetta translation.
2023-12-18 17:00:25-08 TPC1076 installer[22882]: Starting installation:
2023-12-18 17:00:25-08 TPC1076 installer[22882]: Configuring volume "Macintosh HD"
2023-12-18 17:00:25-08 TPC1076 installer[22882]: Preparing disk for local booted install.
2023-12-18 17:00:25-08 TPC1076 installer[22882]: Free space on "Macintosh HD": 364.28 GB (364278161408 bytes).
2023-12-18 17:00:25-08 TPC1076 installer[22882]: Create temporary directory "/var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T//Install.22882R8W31B"
2023-12-18 17:00:25-08 TPC1076 installer[22882]: IFPKInstallElement (1 packages)
2023-12-18 17:00:25-08 TPC1076 installer[22882]: Current Path: /usr/sbin/installer
2023-12-18 17:00:25-08 TPC1076 installer[22882]: Current Path: /bin/zsh
2023-12-18 17:00:25-08 TPC1076 installer[22882]: Current Path: /usr/bin/sudo
2023-12-18 17:00:25-08 TPC1076 installd[2540]: PackageKit: Adding client PKInstallDaemonClient pid=22882, uid=0 (/usr/sbin/installer)
2023-12-18 17:00:25-08 TPC1076 installer[22882]: PackageKit: Enqueuing install with framework-specified quality of service (utility)
2023-12-18 17:00:25-08 TPC1076 installd[2540]: PackageKit: Set reponsibility for install to 15095
2023-12-18 17:00:25-08 TPC1076 installd[2540]: PackageKit: Hosted team responsibility for install set to team:(7XUZ8R5736)
2023-12-18 17:00:25-08 TPC1076 installd[2540]: PackageKit: ----- Begin install -----
2023-12-18 17:00:25-08 TPC1076 installd[2540]: PackageKit: request=PKInstallRequest <1 packages, destination=/>
2023-12-18 17:00:25-08 TPC1076 installd[2540]: PackageKit: packages=(
2023-12-18 17:00:25-08 TPC1076 installd[2540]: PackageKit: Will do receipt-based obsoleting for package identifier com.finaldraft.installerfinaldraft.v11 (prefix path=Applications)
2023-12-18 17:00:26-08 TPC1076 installd[2540]: PackageKit: Extracting file:///var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T/tmp.cSEMi8LK/Final%20Draft%2011%20Installer.pkg#finalDraft11.pkg (destination=/Library/InstallerSandboxes/.PKInstallSandboxManager/4D0AFD15-DB91-4D2F-8C8F-64044BE5D996.activeSandbox/Root/Applications, uid=0)
2023-12-18 17:00:26-08 TPC1076 installd[2540]: PackageKit: prevent user idle system sleep
2023-12-18 17:00:26-08 TPC1076 installd[2540]: PackageKit: suspending backupd
2023-12-18 17:00:26-08 TPC1076 installd[2540]: PackageKit (package_script_service): Preparing to execute script "./preinstall" in /private/tmp/PKInstallSandbox.DLxEQ4/Scripts/com.finaldraft.installerfinaldraft.v11.Q4PJ9B
2023-12-18 17:00:26-08 TPC1076 package_script_service[5019]: PackageKit: Preparing to execute script "preinstall" in /tmp/PKInstallSandbox.DLxEQ4/Scripts/com.finaldraft.installerfinaldraft.v11.Q4PJ9B
2023-12-18 17:00:26-08 TPC1076 package_script_service[5019]: Set responsibility to pid: 15095, responsible_path: /System/Applications/Utilities/Terminal.app/Contents/MacOS/Terminal
2023-12-18 17:00:26-08 TPC1076 package_script_service[5019]: Hosted team responsibility for script set to team:(7XUZ8R5736)
2023-12-18 17:00:26-08 TPC1076 package_script_service[5019]: Preparing to execute with Rosetta Intel Translation: '/tmp/PKInstallSandbox.DLxEQ4/Scripts/com.finaldraft.installerfinaldraft.v11.Q4PJ9B/preinstall'
2023-12-18 17:00:26-08 TPC1076 package_script_service[5019]: PackageKit: Executing script "preinstall" in /tmp/PKInstallSandbox.DLxEQ4/Scripts/com.finaldraft.installerfinaldraft.v11.Q4PJ9B
2023-12-18 17:00:26-08 TPC1076 install_monitor[22884]: Temporarily excluding: /Applications, /Library, /System, /bin, /private, /sbin, /usr
2023-12-18 17:00:27-08 TPC1076 package_script_service[5019]: ./preinstall: Installing licensing service from FNPLicensingService to /Library/Application Support/FLEXnet Publisher/Service/11.15.1
2023-12-18 17:00:27-08 TPC1076 package_script_service[5019]: ./preinstall: Checking system for trusted storage area...
2023-12-18 17:00:27-08 TPC1076 package_script_service[5019]: ./preinstall: Configuring for Mac OS X, Trusted Storage path /Library/Preferences/FLEXnet Publisher...
2023-12-18 17:00:27-08 TPC1076 package_script_service[5019]: ./preinstall: /Library/Preferences/FLEXnet Publisher already exists...
2023-12-18 17:00:27-08 TPC1076 package_script_service[5019]: ./preinstall: Setting permissions on /Library/Preferences/FLEXnet Publisher...
2023-12-18 17:00:27-08 TPC1076 package_script_service[5019]: ./preinstall: Permissions set...
2023-12-18 17:00:27-08 TPC1076 package_script_service[5019]: ./preinstall: Checking system for Replicated Anchor area...
2023-12-18 17:00:27-08 TPC1076 package_script_service[5019]: ./preinstall: Configuring Replicated Anchor area...
2023-12-18 17:00:27-08 TPC1076 package_script_service[5019]: ./preinstall: Replicated Anchor area already exists...
2023-12-18 17:00:27-08 TPC1076 package_script_service[5019]: ./preinstall: Setting permissions on Replicated Anchor area...
2023-12-18 17:00:27-08 TPC1076 package_script_service[5019]: ./preinstall: Replicated Anchor area permissions set...
2023-12-18 17:00:27-08 TPC1076 package_script_service[5019]: ./preinstall: Configuring Temporary area...
2023-12-18 17:00:27-08 TPC1076 package_script_service[5019]: ./preinstall: Temporary area already exists...
2023-12-18 17:00:27-08 TPC1076 package_script_service[5019]: ./preinstall: Setting permissions on Temporary area...
2023-12-18 17:00:27-08 TPC1076 package_script_service[5019]: ./preinstall: Temporary area permissions set...
2023-12-18 17:00:27-08 TPC1076 package_script_service[5019]: ./preinstall: Configuration completed successfully.
2023-12-18 17:00:27-08 TPC1076 package_script_service[5019]: PackageKit: Hosted team responsible for script has been cleared.
2023-12-18 17:00:27-08 TPC1076 package_script_service[5019]: Responsibility set back to self.
2023-12-18 17:00:27-08 TPC1076 installd[2540]: PackageKit: Using trashcan path /var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T/PKInstallSandboxTrash/4D0AFD15-DB91-4D2F-8C8F-64044BE5D996.sandboxTrash for sandbox /Library/InstallerSandboxes/.PKInstallSandboxManager/4D0AFD15-DB91-4D2F-8C8F-64044BE5D996.activeSandbox
2023-12-18 17:00:27-08 TPC1076 installd[2540]: PackageKit: Shoving /Library/InstallerSandboxes/.PKInstallSandboxManager/4D0AFD15-DB91-4D2F-8C8F-64044BE5D996.activeSandbox/Root (1 items) to /
2023-12-18 17:00:27-08 TPC1076 installd[2540]: PackageKit: Writing receipt for com.finaldraft.installerfinaldraft.v11 to /
2023-12-18 17:00:27-08 TPC1076 installd[2540]: oahd translated /Applications/Final Draft 11.app/Contents/Library/Spotlight/FinalDraftImporter.mdimporter/Contents/MacOS/FinalDraftImporter
2023-12-18 17:00:28-08 TPC1076 installd[2540]: oahd translated /Applications/Final Draft 11.app/Contents/Frameworks/Final Draft_libFNP.dylib
2023-12-18 17:00:28-08 TPC1076 installd[2540]: oahd translated /Applications/Final Draft 11.app/Contents/Frameworks/FnpCommsSoap.dylib
2023-12-18 17:00:28-08 TPC1076 installd[2540]: AOT translation failed for /Applications/Final Draft 11.app/Contents/Frameworks/libicudata.dylib
2023-12-18 17:00:28-08 TPC1076 installd[2540]: oahd translated /Applications/Final Draft 11.app/Contents/Frameworks/libicui18n.dylib
2023-12-18 17:00:29-08 TPC1076 installd[2540]: oahd translated /Applications/Final Draft 11.app/Contents/Frameworks/libicuuc.dylib
2023-12-18 17:00:29-08 TPC1076 installd[2540]: oahd translated /Applications/Final Draft 11.app/Contents/Frameworks/librui_5.4.2.x64.dylib
2023-12-18 17:00:31-08 TPC1076 installd[2540]: oahd translated /Applications/Final Draft 11.app/Contents/MacOS/Final Draft 11
2023-12-18 17:00:31-08 TPC1076 installd[2540]: oahd translated /Applications/Final Draft 11.app/Contents/Frameworks/Sparkle.framework/Versions/A/Autoupdate.app/Contents/MacOS/Autoupdate
2023-12-18 17:00:31-08 TPC1076 installd[2540]: oahd translated /Applications/Final Draft 11.app/Contents/Frameworks/Sparkle.framework/Versions/A/Autoupdate.app/Contents/MacOS/fileop
2023-12-18 17:00:32-08 TPC1076 installd[2540]: oahd translated /Applications/Final Draft 11.app/Contents/Frameworks/Sparkle.framework/Versions/A/Sparkle
2023-12-18 17:00:32-08 TPC1076 installd[2540]: PackageKit: Translated 11 binaries.
2023-12-18 17:00:32-08 TPC1076 installd[2540]: PackageKit: Touched bundle /Applications/Final Draft 11.app
2023-12-18 17:00:32-08 TPC1076 installd[2540]: PackageKit: Touched bundle /Applications/Final Draft 11.app/Contents/Frameworks/Sparkle.framework/Versions/A/Autoupdate.app
2023-12-18 17:00:32-08 TPC1076 installd[2540]: Installed "Final Draft 11" ()
2023-12-18 17:00:32-08 TPC1076 installd[2540]: Successfully wrote install history to /Library/Receipts/InstallHistory.plist
2023-12-18 17:00:32-08 TPC1076 install_monitor[22884]: Re-included: /Applications, /Library, /System, /bin, /private, /sbin, /usr
2023-12-18 17:00:32-08 TPC1076 installd[2540]: PackageKit: releasing backupd
2023-12-18 17:00:32-08 TPC1076 installd[2540]: PackageKit: allow user idle system sleep
2023-12-18 17:00:32-08 TPC1076 installd[2540]: PackageKit: ----- End install -----
2023-12-18 17:00:32-08 TPC1076 installd[2540]: PackageKit: 6.7s elapsed install time
2023-12-18 17:00:32-08 TPC1076 installd[2540]: PackageKit: Cleared responsibility for install from 22882.
2023-12-18 17:00:32-08 TPC1076 installd[2540]: PackageKit: Hosted team responsible for install has been cleared.
2023-12-18 17:00:32-08 TPC1076 installd[2540]: PackageKit: Running idle tasks
2023-12-18 17:00:32-08 TPC1076 installd[2540]: PackageKit: Done with sandbox removals
2023-12-18 17:00:32-08 TPC1076 installer[22882]: PackageKit: Registered bundle file:///Applications/Final%20Draft%2011.app/ for uid 0
2023-12-18 17:00:32-08 TPC1076 installer[22882]: PackageKit: Registered bundle file:///Applications/Final%20Draft%2011.app/Contents/Frameworks/Sparkle.framework/Versions/A/Autoupdate.app/ for uid 0
2023-12-18 17:00:32-08 TPC1076 installd[2540]: PackageKit: Removing client PKInstallDaemonClient pid=22882, uid=0 (/usr/sbin/installer)
2023-12-18 17:00:32-08 TPC1076 installer[22882]: Running install actions
2023-12-18 17:00:32-08 TPC1076 installer[22882]: Removing temporary directory "/var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T//Install.22882R8W31B"
2023-12-18 17:00:32-08 TPC1076 installer[22882]: Finalize disk "Macintosh HD"
2023-12-18 17:00:32-08 TPC1076 installer[22882]: Notifying system of updated components
2023-12-18 17:00:32-08 TPC1076 installer[22882]:
2023-12-18 17:00:32-08 TPC1076 installer[22882]: **** Summary Information ****
2023-12-18 17:00:32-08 TPC1076 installer[22882]: Operation Elapsed time
2023-12-18 17:00:32-08 TPC1076 installer[22882]: -----------------------------
2023-12-18 17:00:32-08 TPC1076 installer[22882]: disk 0.02 seconds
2023-12-18 17:00:32-08 TPC1076 installer[22882]: script 0.00 seconds
2023-12-18 17:00:32-08 TPC1076 installer[22882]: zero 0.00 seconds
2023-12-18 17:00:32-08 TPC1076 installer[22882]: install 7.07 seconds
2023-12-18 17:00:32-08 TPC1076 installer[22882]: -total- 7.09 seconds
2023-12-18 17:00:32-08 TPC1076 installer[22882]:
2023-12-18 17:00:34 : INFO : finaldraft11 : Finishing...
2023-12-18 17:00:37 : INFO : finaldraft11 : App(s) found: /Applications/Final Draft 11.app
2023-12-18 17:00:37 : INFO : finaldraft11 : found app at /Applications/Final Draft 11.app, version 11.1.7, on versionKey CFBundleShortVersionString
2023-12-18 17:00:37 : REQ : finaldraft11 : Installed Final Draft 11, version 1117
2023-12-18 17:00:37 : INFO : finaldraft11 : notifying
2023-12-18 17:00:37 : DEBUG : finaldraft11 : Deleting /var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T/tmp.cSEMi8LK
2023-12-18 17:00:37 : DEBUG : finaldraft11 : Debugging enabled, Deleting tmpDir output was:
/var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T/tmp.cSEMi8LK/Final Draft 11.zip
2023-12-18 17:00:37 : DEBUG : finaldraft11 : /var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T/tmp.cSEMi8LK/Final Draft 11 Installer.pkg
2023-12-18 17:00:37 : DEBUG : finaldraft11 : /var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T/tmp.cSEMi8LK
2023-12-18 17:00:37 : INFO : finaldraft11 : Installomator did not close any apps, so no need to reopen any apps.
2023-12-18 17:00:37 : REQ : finaldraft11 : All done!
2023-12-18 17:00:37 : REQ : finaldraft11 : ################## End Installomator, exit code 0
2023-12-18 17:00:37 : REQ : finaldraft12 : ################## Start Installomator v. 10.6beta, date 2023-12-18
2023-12-18 17:00:37 : INFO : finaldraft12 : ################## Version: 10.6beta
2023-12-18 17:00:37 : INFO : finaldraft12 : ################## Date: 2023-12-18
2023-12-18 17:00:37 : INFO : finaldraft12 : ################## finaldraft12
2023-12-18 17:00:37 : DEBUG : finaldraft12 : DEBUG mode 1 enabled.
2023-12-18 17:00:38 : INFO : finaldraft12 : setting variable from argument DEBUG=0
2023-12-18 17:00:38 : DEBUG : finaldraft12 : name=Final Draft 12
2023-12-18 17:00:38 : DEBUG : finaldraft12 : appName=
2023-12-18 17:00:38 : DEBUG : finaldraft12 : type=appInDmgInZip
2023-12-18 17:00:38 : DEBUG : finaldraft12 : archiveName=
2023-12-18 17:00:38 : DEBUG : finaldraft12 : downloadURL=https://www.finaldraft.com/downloads/finaldraft1209Mac.zip
2023-12-18 17:00:38 : DEBUG : finaldraft12 : curlOptions=
2023-12-18 17:00:38 : DEBUG : finaldraft12 : appNewVersion=1209
2023-12-18 17:00:38 : DEBUG : finaldraft12 : appCustomVersion function: Not defined
2023-12-18 17:00:38 : DEBUG : finaldraft12 : versionKey=CFBundleShortVersionString
2023-12-18 17:00:38 : DEBUG : finaldraft12 : packageID=
2023-12-18 17:00:38 : DEBUG : finaldraft12 : pkgName=
2023-12-18 17:00:38 : DEBUG : finaldraft12 : choiceChangesXML=
2023-12-18 17:00:38 : DEBUG : finaldraft12 : expectedTeamID=7XUZ8R5736
2023-12-18 17:00:38 : DEBUG : finaldraft12 : blockingProcesses=
2023-12-18 17:00:38 : DEBUG : finaldraft12 : installerTool=
2023-12-18 17:00:38 : DEBUG : finaldraft12 : CLIInstaller=
2023-12-18 17:00:38 : DEBUG : finaldraft12 : CLIArguments=
2023-12-18 17:00:38 : DEBUG : finaldraft12 : updateTool=
2023-12-18 17:00:38 : DEBUG : finaldraft12 : updateToolArguments=
2023-12-18 17:00:38 : DEBUG : finaldraft12 : updateToolRunAsCurrentUser=
2023-12-18 17:00:38 : INFO : finaldraft12 : BLOCKING_PROCESS_ACTION=tell_user
2023-12-18 17:00:38 : INFO : finaldraft12 : NOTIFY=success
2023-12-18 17:00:38 : INFO : finaldraft12 : LOGGING=DEBUG
2023-12-18 17:00:38 : INFO : finaldraft12 : LOGO=/System/Applications/App Store.app/Contents/Resources/AppIcon.icns
2023-12-18 17:00:38 : INFO : finaldraft12 : Label type: appInDmgInZip
2023-12-18 17:00:38 : INFO : finaldraft12 : archiveName: Final Draft 12.zip
2023-12-18 17:00:38 : INFO : finaldraft12 : no blocking processes defined, using Final Draft 12 as default
2023-12-18 17:00:39 : DEBUG : finaldraft12 : Changing directory to /var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T/tmp.Kwk9q9Nt
2023-12-18 17:00:39 : INFO : finaldraft12 : name: Final Draft 12, appName: Final Draft 12.app
2023-12-18 17:00:39.042 mdfind[23210:373410] [UserQueryParser] Loading keywords and predicates for locale "en_CA"
2023-12-18 17:00:39.043 mdfind[23210:373410] [UserQueryParser] Loading keywords and predicates for locale "en"
2023-12-18 17:00:39.084 mdfind[23210:373410] Couldn't determine the mapping between prefab keywords and predicates.
2023-12-18 17:00:39 : WARN : finaldraft12 : No previous app found
2023-12-18 17:00:39 : WARN : finaldraft12 : could not find Final Draft 12.app
2023-12-18 17:00:39 : INFO : finaldraft12 : appversion:
2023-12-18 17:00:39 : INFO : finaldraft12 : Latest version of Final Draft 12 is 1209
2023-12-18 17:00:39 : REQ : finaldraft12 : Downloading https://www.finaldraft.com/downloads/finaldraft1209Mac.zip to Final Draft 12.zip
2023-12-18 17:00:39 : DEBUG : finaldraft12 : No Dialog connection, just download
2023-12-18 17:00:41 : DEBUG : finaldraft12 : File list: -rw-r--r--@ 1 root wheel 129M 18 Dec 17:00 Final Draft 12.zip
2023-12-18 17:00:41 : DEBUG : finaldraft12 : File type: Final Draft 12.zip: Zip archive data, at least v2.0 to extract, compression method=deflate
2023-12-18 17:00:41 : DEBUG : finaldraft12 : curl output was:
* Trying 173.249.144.78:443...
* Connected to www.finaldraft.com (173.249.144.78) port 443 (#0)
* ALPN: offers h2,http/1.1
* (304) (OUT), TLS handshake, Client hello (1):
} [323 bytes data]
* CAfile: /etc/ssl/cert.pem
* CApath: none
* (304) (IN), TLS handshake, Server hello (2):
{ [122 bytes data]
* (304) (IN), TLS handshake, Unknown (8):
{ [19 bytes data]
* (304) (IN), TLS handshake, Certificate (11):
{ [2973 bytes data]
* (304) (IN), TLS handshake, CERT verify (15):
{ [264 bytes data]
* (304) (IN), TLS handshake, Finished (20):
{ [52 bytes data]
* (304) (OUT), TLS handshake, Finished (20):
} [52 bytes data]
* SSL connection using TLSv1.3 / AEAD-AES256-GCM-SHA384
* ALPN: server accepted h2
* Server certificate:
* subject: C=US; ST=California; L=Burbank; O=Cast & Crew, LLC; CN=*.finaldraft.com
* start date: Jan 19 00:00:00 2023 GMT
* expire date: Feb 19 23:59:59 2024 GMT
* subjectAltName: host "www.finaldraft.com" matched cert's "*.finaldraft.com"
* issuer: C=US; O=DigiCert Inc; CN=DigiCert TLS RSA SHA256 2020 CA1
* SSL certificate verify ok.
* using HTTP/2
* h2 [:method: GET]
* h2 [:scheme: https]
* h2 [:authority: www.finaldraft.com]
* h2 [:path: /downloads/finaldraft1209Mac.zip]
* h2 [user-agent: curl/8.1.2]
* h2 [accept: */*]
* Using Stream ID: 1 (easy handle 0x13c012400)
> GET /downloads/finaldraft1209Mac.zip HTTP/2
> Host: www.finaldraft.com
> User-Agent: curl/8.1.2
> Accept: */*
>
< HTTP/2 200
< server: nginx
< date: Tue, 19 Dec 2023 01:00:39 GMT
< content-type: application/zip
< content-length: 134785619
< x-content-type-options: nosniff
< x-xss-protection: 1; mode=block
< cache-control: public, max-age=604800
< access-control-allow-origin: https://finaldraft.com https://trial.finaldraft.com https://store.finaldraft.com https://manuals.finaldraft.com https://remote.finaldraft.com https://answernet.finaldrat.com https://reg.finaldraft.com https://newsletter.finaldraft.com https://blog.finaldraft.com
< strict-transport-security: max-age=31536000
< referrer-policy: same-origin
< x-frame-options: DENY
< permissions-policy: autoplay=*, geolocation=(self), camera=(self), fullscreen=()
< content-security-policy: default-src *; style-src * 'unsafe-inline'; script-src * 'unsafe-inline' 'unsafe-eval'; font-src * data:; img-src * data:; frame-ancestors * https://app.hubspot.com; frame-src *;
< last-modified: Wed, 27 Sep 2023 19:27:52 GMT
< etag: "808aa53-6065c2ffafe00"
< cache-control: max-age=172800
< expires: Thu, 21 Dec 2023 01:00:39 GMT
< x-cache-nxaccel: BYPASS
< accept-ranges: bytes
<
{ [7491 bytes data]
* Connection #0 to host www.finaldraft.com left intact
2023-12-18 17:00:41 : REQ : finaldraft12 : no more blocking processes, continue with update
2023-12-18 17:00:41 : REQ : finaldraft12 : Installing Final Draft 12
2023-12-18 17:00:41 : INFO : finaldraft12 : Unzipping Final Draft 12.zip
2023-12-18 17:00:41 : INFO : finaldraft12 : found dmg: /var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T/tmp.Kwk9q9Nt/Final Draft 12.dmg
2023-12-18 17:00:41 : INFO : finaldraft12 : Mounting /var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T/tmp.Kwk9q9Nt/Final Draft 12.dmg
2023-12-18 17:00:44 : DEBUG : finaldraft12 : Debugging enabled, dmgmount output was:
Checksumming Protective Master Boot Record (MBR : 0)…
Protective Master Boot Record (MBR :: verified CRC32 $D211E1E1
Checksumming GPT Header (Primary GPT Header : 1)…
GPT Header (Primary GPT Header : 1): verified CRC32 $1689098F
Checksumming GPT Partition Data (Primary GPT Table : 2)…
GPT Partition Data (Primary GPT Tabl: verified CRC32 $518AE603
Checksumming (Apple_Free : 3)…
(Apple_Free : 3): verified CRC32 $00000000
Checksumming disk image (Apple_APFS : 4)…
disk image (Apple_APFS : 4): verified CRC32 $E2874327
Checksumming (Apple_Free : 5)…
(Apple_Free : 5): verified CRC32 $00000000
Checksumming GPT Partition Data (Backup GPT Table : 6)…
GPT Partition Data (Backup GPT Table: verified CRC32 $518AE603
Checksumming GPT Header (Backup GPT Header : 7)…
GPT Header (Backup GPT Header : 7): verified CRC32 $9E4C68FC
verified CRC32 $B923399A
/dev/disk4 GUID_partition_scheme
/dev/disk4s1 Apple_APFS
/dev/disk5 EF57347C-0000-11AA-AA11-0030654
/dev/disk5s1 41504653-0000-11AA-AA11-0030654 /Volumes/Final Draft 12
2023-12-18 17:00:44 : INFO : finaldraft12 : Mounted: /Volumes/Final Draft 12
2023-12-18 17:00:44 : INFO : finaldraft12 : Verifying: /Volumes/Final Draft 12/Final Draft 12.app
2023-12-18 17:00:44 : DEBUG : finaldraft12 : App size: 258M /Volumes/Final Draft 12/Final Draft 12.app
2023-12-18 17:00:46 : DEBUG : finaldraft12 : Debugging enabled, App Verification output was:
/Volumes/Final Draft 12/Final Draft 12.app: accepted
source=Notarized Developer ID
override=security disabled
origin=Developer ID Application: Cast & Crew Production Software, LLC (7XUZ8R5736)
2023-12-18 17:00:46 : INFO : finaldraft12 : Team ID matching: 7XUZ8R5736 (expected: 7XUZ8R5736 )
2023-12-18 17:00:46 : INFO : finaldraft12 : Installing Final Draft 12 version 12.0.9 on versionKey CFBundleShortVersionString.
2023-12-18 17:00:46 : INFO : finaldraft12 : App has LSMinimumSystemVersion: 10.14
2023-12-18 17:00:46 : INFO : finaldraft12 : Copy /Volumes/Final Draft 12/Final Draft 12.app to /Applications
2023-12-18 17:00:47 : DEBUG : finaldraft12 : Debugging enabled, App copy output was:
Copying /Volumes/Final Draft 12/Final Draft 12.app
2023-12-18 17:00:47 : WARN : finaldraft12 : Changing owner to ben.macphail
2023-12-18 17:00:47 : INFO : finaldraft12 : Finishing...
2023-12-18 17:00:50 : INFO : finaldraft12 : App(s) found: /Applications/Final Draft 12.app
2023-12-18 17:00:50 : INFO : finaldraft12 : found app at /Applications/Final Draft 12.app, version 12.0.9, on versionKey CFBundleShortVersionString
2023-12-18 17:00:50 : REQ : finaldraft12 : Installed Final Draft 12, version 12.0.9
2023-12-18 17:00:50 : INFO : finaldraft12 : notifying
2023-12-18 17:00:50 : DEBUG : finaldraft12 : Unmounting /Volumes/Final Draft 12
2023-12-18 17:00:50 : DEBUG : finaldraft12 : Debugging enabled, Unmounting output was:
"disk4" ejected.
2023-12-18 17:00:50 : DEBUG : finaldraft12 : Deleting /var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T/tmp.Kwk9q9Nt
2023-12-18 17:00:50 : DEBUG : finaldraft12 : Debugging enabled, Deleting tmpDir output was:
/var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T/tmp.Kwk9q9Nt/Final Draft 12.zip
2023-12-18 17:00:50 : DEBUG : finaldraft12 : /var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T/tmp.Kwk9q9Nt/Final Draft 12.dmg
2023-12-18 17:00:50 : DEBUG : finaldraft12 : /var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T/tmp.Kwk9q9Nt
2023-12-18 17:00:50 : INFO : finaldraft12 : Installomator did not close any apps, so no need to reopen any apps.
2023-12-18 17:00:50 : REQ : finaldraft12 : All done!
2023-12-18 17:00:50 : REQ : finaldraft12 : ################## End Installomator, exit code 0
Thank you for your submission. In the future, please separate pull requests for individual labels unless the labels are closely related in some way. Among other reasons, this makes it easier for us to request changes to specific label submissions without having to hold up approval of others.
|
gharchive/pull-request
| 2023-12-19T00:53:41 |
2025-04-01T04:55:11.861719
|
{
"authors": [
"BigMacAdmin",
"bmacphail"
],
"repo": "Installomator/Installomator",
"url": "https://github.com/Installomator/Installomator/pull/1378",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
928862549
|
Support variables from script caller
According to the readme, you define a global variable like
./Installomator.sh firefox LOGO=jamf BLOCKING_PROCESS_ACTION=tell_user_then_kill NOTIFY=all
and call the script.
However, in practice, this does not work as expected.
This is because the variables are overwritten inside the script.
Assuming the readme is correct, I made it so that we can set some global variables when calling the script.
And,
./Installomator.sh firefox LOGO=jamf BLOCKING_PROCESS_ACTION=tell_user_then_kill NOTIFY=all
is not recognized by the shell as variable assignments.
Therefore, I changed this project's readme description so that the variables are recognized correctly, like
LOGO=jamf BLOCKING_PROCESS_ACTION=tell_user_then_kill NOTIFY=all ./Installomator.sh firefox
You are misunderstanding how this is supposed to work. The intention of this feature is not to read environment variables.
Some management systems allow the admin to pass a number of arguments when calling a script. Since an admin may want to override any number of arbitrary variables used in the script, even variables that we haven't implemented yet, positional arguments won't work.
None of the management systems I know of allow setting environment variables for the context of a script (though I admit that would be neat).
Instead I put in some code which parses the arguments and, when an argument contains an =, passes it through eval, which sets the variable. Since this happens after everything is initialized, it will override all the presets from earlier in the code.
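The override mechanism described here can be sketched roughly as follows. This is an illustrative Python version only — Installomator itself is zsh and runs each VAR=value argument through eval — and the function and preset names are made up for the example:

```python
def apply_overrides(args, presets):
    """Sketch of Installomator's VAR=value argument handling:
    arguments containing '=' override preset variables after
    initialization; everything else is treated as a label."""
    variables = dict(presets)  # presets initialized earlier in the script
    labels = []
    for arg in args:
        if "=" in arg:
            key, _, value = arg.partition("=")
            variables[key] = value  # overrides the earlier preset
        else:
            labels.append(arg)
    return labels, variables

# Example call mirroring the management-system invocation above:
labels, variables = apply_overrides(
    ["firefox", "LOGO=jamf", "NOTIFY=all"],
    {"LOGO": "appstore", "NOTIFY": "success", "DEBUG": "0"},
)
```

Because the overrides are applied last, they win over any preset, which is why positional arguments alone could not cover arbitrary (even future) variables.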
Thank you for the explanation.
I now understand the behavior of this script.
|
gharchive/pull-request
| 2021-06-24T05:32:03 |
2025-04-01T04:55:11.869315
|
{
"authors": [
"kenchan0130",
"scriptingosx"
],
"repo": "Installomator/Installomator",
"url": "https://github.com/Installomator/Installomator/pull/198",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
367777143
|
WEBAPP-363: Align event, category and poi models and endpoints
This pull request belongs to an issue on our bugtracker.
You can find it there by looking for an issue with the key which is mentioned in the title of this pull request.
It starts with the keyword WEBAPP.
Sorry for the large PR, but the changes are mostly updated tests because of changes in the models. I'm not quite sure if this is a step forward, but I tend to say yes. The endpoints and models are simpler than before. So just say what you think.
As a next step, I would generalize some of our components (EventDetail, Page and the PoiDetail (does not exist yet) and EventList, CategoryList and PoiList(does not exist yet) are mostly the same (with more or less info displayed)).
Haha yeah, in PRs like this, I wish GitHub had a function to provide a pre-defined filter to only show "important" changes...
So my thoughts are the following:
So if you want, we can do something like a class hierarchy, but I don't really like your suggested structure. Your suggestion is:
PageModel
├── PoiModel
│   └── EventModel
└── CategoryModel
And the DisclaimerModel didn't fit in there. Also, imo, a PoiModel is not a superclass of EventModel (in a semantic way -- probably looking at the attributes it kind of is). But in the future, we probably also should have a reference in an EventModel to the corresponding PoiModel (the location of the event); allowing this PoiModel to be an EventModel doesn't make sense.
Another point I don't like is that you generalized all the endpoint types. We should not do that. These types represent the actual data representation we expect to get from the cms, and should neither have unnecessary optional arguments nor mark as optional arguments that we expect to always be present.
My suggested hierarchy would be (although I'm also not 100% convinced of it):
BasePageModel
├── DisclaimerModel
└── ContentPageModel
    ├── EventModel
    ├── CategoriesModel
    └── PoiModel
Yeah, I definitely see your point. On those:
EventModel can definitely also inherit from PageModel, no problem here.
DisclaimerModel does fit in here, it has exactly the same attributes as categories, but I figured it would be too big an overhead to include this in this structure, since we only use three of these (title, content and modified_gmt), so it does not make sense imo to create a BasePageModel and a ContentPageModel just to fit in the Disclaimer. I would rather just use a CategoryModel for the disclaimer.
So the endpoints should just stay the way they were?
I think we should discuss this in a call. A thing that I miss in this issue/PR is a real benefit of changing our models.
We should start to groom tasks again, because there is much more important architectural stuff to change, for example making the routing easier. This would also break the current react-native use of the endpoint package.
There is no real benefit apart from sticking to DRY, since most of the attributes of our models were the same. What is the problem with react-native?
@maxammann My main reason for doing this is that we already have like 5 different Lists, ListElements and also some detail views which are all roughly the same, but with different designs and approaches. I think this is not what it should be like, and since I'm implementing the POI stuff, which will also need a List, ListElement and so on, this is, at least in my opinion, quite important. This PR just makes it easier for me to generalize those components, since the models are more similar than before.
@klinzo You could instead just generalize the lists; so for example you would probably use the same kind of list as we use for the events. Then you could just generalize the eventList (for example call it thumbnailList) and a corresponding model if necessary (see also TileModel -- we use the Tiles both for categories and extras).
BTW I just saw that the excerpt of the news shouldn't be contained in a RemoteContent, since it's a plain string we get from the cms. (Also it looks odd that there are 3 different font styles for each news entry.)
Yes, you should generalize the input of components, then map the models to the input. In Java you sometimes create a View of objects so they have the interface you need. There are also interfaces in flow: https://flow.org/en/docs/types/interfaces/
We should also think about using a framework for components: https://hackernoon.com/23-best-react-ui-component-libraries-and-frameworks-250a81b2ac42
Main reason is accessibility. Also could force us to follow a more strict guideline.
Yeah sure, I'll generalize the lists, that was also my plan. I think it would be good to know what the problem is, though, and why this PR is bad, especially the react-native topic. @maxammann
I think we should remove DisclaimerPage and use a PageModel which fits the disclaimer API. So we have a structure which makes sense.
The disclaimer api looks like this: https://cms.integreat-app.de/nuernberg/de/wp-json/extensions/v3/disclaimer
This would simplify the structure even more and would fit the models in the cms a bit more.
Apart from this I'm okay with merging :+1:
|
gharchive/pull-request
| 2018-10-08T12:56:11 |
2025-04-01T04:55:11.881470
|
{
"authors": [
"Schedulaar",
"klinzo",
"maxammann"
],
"repo": "Integreat/integreat-webapp",
"url": "https://github.com/Integreat/integreat-webapp/pull/237",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1606802526
|
Huge overhead on devcloud linked to dpctl calls
Version: numba_0.20.0dev3 and main
The following three dpctl calls (1, 2, 3) have huge wall times on the edge devcloud (measured at 10 to 30 ms per call by py-spy; see the speedscope report):
On the devcloud this adds about 80 seconds to the k-means benchmark (for an expected 10 seconds).
I didn't see the issue on a local machine, but maybe the remaining small overhead that we reported comes from there.
@oleksandr-pavlyk not sure if this should be considered an unreasonable use in numba_dpex (should those calls be expected to take that long, and be cached?) or a bug in dpctl.
I've been experimenting with caching the values and can confirm that caching those 3 calls completely removes the overhead.
Regarding the scope of the cache, I'll check if a hotfix that consists of storing those values in a WeakKeyDictionary where the keys are val and usm_mem, and wrapping the SyclDevice(device) call in an lru_cache, is enough. (If so, I will monkey-patch it in sklearn_numba_dpex in the meantime.)
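The hotfix idea above can be sketched in plain Python. Here `make_device` is a stand-in for the expensive `dpctl.SyclDevice` constructor (not the real API), and the counter only exists so the example can show the constructor runs once:

```python
import functools
import weakref

calls = {"device": 0}

def make_device(filter_string):
    # Stand-in for an expensive constructor such as dpctl.SyclDevice;
    # the counter lets us observe how often it actually runs.
    calls["device"] += 1
    return ("device", filter_string)

# lru_cache makes repeated lookups of the same filter string free
# and returns the *same* object every time.
cached_device = functools.lru_cache(maxsize=None)(make_device)

# Per-object results (e.g. inferred types) keyed weakly, so cache
# entries disappear together with the objects they describe.
_per_object = weakref.WeakKeyDictionary()

def cached_typeof(obj, compute):
    """Memoize a per-object computation without keeping obj alive."""
    try:
        return _per_object[obj]
    except KeyError:
        result = _per_object[obj] = compute(obj)
        return result
```

The WeakKeyDictionary avoids leaking memory: once an array is garbage collected, its cache entry vanishes with it.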
To avoid confusion, the .svg file extension of the py-spy report file should be a .json extension. The .svg extension only makes sense when py-spy is used to generate a flamegraph report as an SVG file instead of a json speedscope trace.
Zooming into the report, the overhead seems to come from the repeated calls to typeof_usm_ndarray:
but I cannot see the calls to dpctl.SyclDevice(device).
@ogrisel @fcharras could you please verify if https://github.com/IntelPython/numba-dpex/pull/946 fixes the issue?
Unfortunately, it doesn't fix it. Looking at the PR, it doesn't seem to change the instructions that lead to the time-consuming steps in the OP (that is, dpctl.SyclDevice(device) and *.sycl_device.filter_string).
The workaround I posted yesterday doesn't work either.
dpctl.SyclDevice calls the sycl::device constructor, which scores each available device and selects the one with the highest score. SyclDevice.filter_string calls sycl::get_devices() and searches for the given device in that list.
Construction of SYCL devices may thus be expensive, as the RT must talk to the hardware. numba-dpex should not be constructing the device, but rather should capture it from the instance of usm_ndarray that it is inferring the type from. This is forthcoming, but I do not know the ETA.
This suggests that using SYCL_DEVICE_FILTER to limit the number of devices discoverable by RT should improve the timing.
Use sycl-ls to determine the appropriate value to set the environment variable to. For example: with SYCL_DEVICE_FILTER=level_zero:gpu:0 the runtime would only discover one level-zero GPU device.
@oleksandr-pavlyk
You are correct. We should extract the device from the usm_ndarray instead of creating a new one from the filter_string.
But we still have to get *.filter_string since it is part of the type signature. And getting the filter_string is slow.
I would argue that the need to store filter_string as part of type signature would be rendered unnecessary once boxing/unboxing of dpctl.SyclQueue is implemented.
@oleksandr-pavlyk
It doesn't matter whether it is part of the array type signature or the queue type signature.
The device must be part of the signature. We do not just need to get the queue; we need to know which device we are compiling/calling the function for. The most human-friendly form of adding it to the type signature is to use the filter_string. Alternatives are: device name, Python object id, something else?
I've fixed the monkey-patching workaround given in a previous comment. This should work https://github.com/soda-inria/sklearn-numba-dpex/blob/e040e78d2a5492d7b7b0ec79c2576f0df15cb9db/sklearn_numba_dpex/patches/load_numba_dpex.py#L44
This also (almost?) entirely fixes the remaining small overhead that we noticed even on iGPUs on laptops after the caching overhaul (pointed out in https://github.com/IntelPython/numba-dpex/issues/886#issuecomment-1415981746).
So, this issue is exacerbated on the intel edge devcloud, but also noticeable on more ordinary hardware.
I would argue that the need to store filter_string as part of type signature would be rendered unnecessary once boxing/unboxing of dpctl.SyclQueue is implemented.
Absolutely. #930
Using the filter string for compute-follows-data and having it as part of any type signature (DpnpNdArray or SyclQueue) is a no-go. I only did it as a stopgap under time pressure.
We need to know which device we are compiling/calling the function for.
Sure, but that has nothing to do with adding it to any type signature. Moreover, it is conceivable that advanced programmers will target sub-devices and want much finer-grained control. For such cases, a filter string is not supported by SYCL.
The most human-friendly form of adding it to the type signature is to use the filter_string
I agree, but given the performance overhead of generating a filter string, it is not possible. We can perhaps add backend and device type as string attributes for ease of reading typemaps and such. It is the generation of the device number that kills performance.
@diptorupd
I agree, but given the performance overhead of generating a filter string it is not possible.
I really don't see any problem in caching filter string for the device. You need to generate it only once for the created device.
FYI, I have added caching for filter_string property in https://github.com/IntelPython/dpctl/pull/1127
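Caching a property like that is straightforward with functools.cached_property: the first access computes and stores the value, later accesses return it for free. The class below is a simplified stand-in for illustration, not dpctl's actual implementation (the `computed` counter is only there to make the caching observable):

```python
import functools

class Device:
    """Simplified stand-in for a device whose filter string is
    expensive to compute (in dpctl it requires runtime queries)."""

    def __init__(self, backend, device_type, index):
        self._parts = (backend, device_type, index)
        self.computed = 0  # counts how often the expensive path runs

    @functools.cached_property
    def filter_string(self):
        # Expensive work happens only on the first access.
        self.computed += 1
        return ":".join(str(p) for p in self._parts)
```

Since the filter string of a given device never changes, caching it per instance is safe.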
@oleksandr-pavlyk this is half of the fix for this issue, I think? The remaining issue is that since the cache key is a device instance, the cache is not shared for distinct arrays or queues. Would it be possible for all arrays to share the same device instance (i.e. having id(array.sycl_device) == id(dpctl.SyclDevice(array.device.filter_string)) for all arrays) without adding any overhead to the array.sycl_device call? I was trying to look into monkey-patching my way to that from what is exposed to the Python interpreter, but I'm not sure it's possible now.
@fcharras Could you please try https://github.com/IntelPython/numba-dpex/pull/946 again? I've updated it according to your comment, and I think with https://github.com/IntelPython/dpctl/pull/1127 it should solve the problem. I'm not sure if https://github.com/IntelPython/dpctl/pull/1127 is on the dppy/label/dev channel already or not.
I'll look more into that today and reach back.
|
gharchive/issue
| 2023-03-02T13:23:06 |
2025-04-01T04:55:11.915595
|
{
"authors": [
"AlexanderKalistratov",
"diptorupd",
"fcharras",
"ogrisel",
"oleksandr-pavlyk"
],
"repo": "IntelPython/numba-dpex",
"url": "https://github.com/IntelPython/numba-dpex/issues/945",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1060685116
|
Move coverage.py configs into pyproject.toml
We do not need a separate config file for coverage.py and the config options can be included inside pyproject.toml.
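coverage.py (5.0 and later) reads its settings from pyproject.toml under `[tool.coverage.*]` tables (older versions need the `toml` extra installed). The option values below are an illustrative sketch, not this repository's actual configuration:

```toml
[tool.coverage.run]
branch = true
source = ["numba_dppy"]
omit = ["*/tests/*"]

[tool.coverage.report]
show_missing = true
```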
Merging as coverage generation worked as expected and no other code was touched.
|
gharchive/pull-request
| 2021-11-22T23:21:58 |
2025-04-01T04:55:11.917421
|
{
"authors": [
"diptorupd"
],
"repo": "IntelPython/numba-dppy",
"url": "https://github.com/IntelPython/numba-dppy/pull/637",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1402303992
|
Announcement: Introducing the RealSense D457 GMSL/FAKRA stereo camera
Hi everyone,
Intel have introduced a new member of the RealSense stereo camera family called the D457. It is based around the D450 depth module and the new Vision Processor D4 V5.
https://www.intelrealsense.com/depth-camera-d457/
https://www.intel.com/content/www/us/en/newsroom/news/new-realsense-depth-camera-d457.html
Its listing in the official RealSense Store can be found here:
https://store.intelrealsense.com/buy-intel-realsense-depth-camera-d457.html
To quote the store page:
"The Intel® RealSense™ Depth Camera D457 is our first GMSL/FAKRA high bandwidth stereo camera. The D457 has an IP65 grade enclosure protecting it from dust ingress and projected water. D457 has an on-board Maxim serializer chip. A Maxim de-serializer is needed at host side to decode the streams".
Connect Tech provide support for D457 with their hardware products.
https://connecttech.com/connect-tech-offers-built-in-support-on-the-newly-announced-intel-gmsl-realsense-depth-camera-d457/
The RealSense Store retails an 8-pack of the D450 camera module and Vision Processor D4 V5 here:
https://store.intelrealsense.com/intelr-realsensetm-depth-module-d450-vision-processor-d4-board-v5-gmsl-fakra-8-pack.html
The dedicated data sheet document for the D457 can be downloaded here:
https://dev.intelrealsense.com/docs/intel-realsense-d400-series-product-family-datasheet
The official Intel CAD file for the D457 is here:
https://dev.intelrealsense.com/docs/stereo-depth-camera-d400#cad-files
How long is the coax cable that comes with the camera?
What is its bend radius?
Thanks very much @Mat198 for your questions about the D457. I have submitted them to my Intel RealSense colleagues to obtain the requested information.
Hi @Mat198 My Intel RealSense colleagues have provided the following D457 information.
The Fakra cable in the D457 box has a length of 1000 mm +/- 10
According to RG174 coax cable spec, the minimum radius is 9.9 mm.
Users can obtain a longer cable from the market. It is recommended to use “LEONI Dacar 302” for a 10 ~15 m long cable. An Intel RealSense colleague has tried RG174 cable on D457. The maximum length is 11 m, but Dacar 302 can reach 15 m.
Thanks for your fast reply!
Does the GMSL interface need a different implementation, or it's ready to plug and play in this Syslogic computer?
https://www.syslogic.ai/jetson-agx-xavier/rugged-computer-rpc-rml-a3-e2
I saw ConnectTech products, but I'm aiming for an outdoor agriculture application that requires an IP rate of at least 65. https://connecttech.com/connect-tech-offers-built-in-support-on-the-newly-announced-intel-gmsl-realsense-depth-camera-d457/
A separate Maxim de-serializer is needed at the host side to decode the streams. For users who wish to perform tests with the D457, the Connect Tech product is recommended, as no de-serializer is needed in a Connect Tech / D457 setup.
The Intel RealSense sales team can be contacted regarding the de-serializer component. A contact form for doing so is at the link below.
https://www.intelrealsense.com/talk-to-sales/
Understood. Thanks!
Do you supply an interface/adapter to connect it to the NVIDIA AGX devkit somehow?
Hi @AndreV84 If you have obtained a de-serializer to decode the D457's streams at host-side then information about how to obtain the reference driver package for AGX Xavier can be found at https://github.com/IntelRealSense/librealsense/issues/11010#issuecomment-1330218867
In regards to cabling a 1 m FAKRA cable is supplied with the D457, as described at https://github.com/IntelRealSense/librealsense/issues/10984#issuecomment-1292320808
Hi @MartyG-RealSense
Thank you for following up!
However, the default Jetson AGX devkit doesn't have the GMSL port for the cabling. Could you elaborate on how to connect the camera to the default NVIDIA Jetson AGX devkit hardware, please? Are there any particular cable + adapter things, or how else could it be connected?
My understanding is that the cabling connects to the de-serializer board, like the one at the link below that a RealSense user at https://github.com/IntelRealSense/librealsense/issues/11010#issuecomment-1306030505 chose for their D457.
https://www.leopardimaging.com/product/accessories/adapters-carrier-boards/li-gmsl2-ipx-deser/
In the data sheet document for the D457, it states "GMSL is based on SerDes technology, meaning using a
serializer on the transmitter side and a de-serializer on the receiver side".
so for the camera connection we need to procure this board [which is out of stock]:
https://www.leopardimaging.com/product/accessories/adapters-carrier-boards/li-gmsl2-ipx-deser/ ?
are there any other boards we could buy for the default NVIDIA Jetson AGX devkit implementation?
For the initial testing stage of a D457 project, Intel recommend using Connect Tech hardware so that purchasing a separate de-serializer is not required as de-serialization is built into the Connect Tech hardware.
https://connecttech.com/connect-tech-offers-built-in-support-on-the-newly-announced-intel-gmsl-realsense-depth-camera-d457/
If you would prefer to proceed directly to using a separate de-serializer component then it can be obtained through the Intel RealSense sales team, as described at https://github.com/IntelRealSense/librealsense/issues/10984#issuecomment-1292382772
Intel's de-serializer board can be plugged onto the AGX Xavier board directly. The D457 connects to the FAKRA connector on this board through a GMSL cable.
I would not name further de-serializer products at present without confirmation that they are able to function with the D457. The RealSense user who purchased the LI-GMSL2-IPX-DESER product stated though that it contained MAX9296A technology, which should narrow down research for compatible de-serializer products. For example, you could google for maxim deserializer "MAX9296A" (including the quotation marks so that relevant products are included in the search results).
A Google search for the LI-GMSL2-IPX-DESER product shows that it is out of stock at other suppliers found in the search results too at the time of writing this. One of the suppliers had a useful PDF brochure about it though, which I have linked to below.
De-serializer.pdf
@MartyG-RealSense
Can we order a de-serializer from Intel RealSense directly through sales?
Or should we look to order elsewhere, e.g. from Leopard Imaging?
The latter [LI] listed these:
"We have below adapter boards and cable which can connect GMSL2 cameras to Jetson AGX Xavier, but I am not sure if this deserializer adapter (with MAX9296) is compatible with Intel d457 camera."
LI-GMSL2-IPX-DESER
FAW-1233-03 cable
LI-JXAV-MIPI-ADPT-4CAM
But if Intel RealSense can sell the de-serializer directly, that may be an even better option?
Also, are there adapters for NVIDIA Jetson NX devkits?
Contacting Intel sales regarding their de-serializer product that can plug directly onto AGX Xavier would be the safest option, especially in regard to your question of how to interface the de-serializer with the AGX board.
Intel's guidance is to use a Maxim de-serializer rather than a specific model of Maxim de-serializer, suggesting that if the de-serializer supports GMSL then it should be able to work.
My Intel RealSense colleagues confirmed with me that because D457 uses the GMSL protocol, any GMSL cable can work with it.
@MartyG-RealSense
Thank you for following up
I think I have filled in the contact form for sales.
Maybe they will follow up sometime in the future regarding the procurement of the de-serializer. Does it usually take many days to hear back from sales?
Bearing in mind that it is December 23 at the time of writing this, if you have not had a response in a couple of weeks then let me know and I will follow it up.
@MartyG-RealSense
Will these CSI-to-GMSL adapters provided by Intel or LI work with the NVIDIA Jetson Orin devkit? There is no such support for NX devkits, right? But there is support for AGX. How about Orin devkits?
The D457's Jetson reference driver support is designed to work with a de-serializer on AGX Xavier at this time.
Hi!
Is there any documentation for bringup? I have a Rudi-AGX and D457 cameras on my desk but realsense-viewer (v2.53.1) doesn't find them with their Jetpack & BSP 4.6.1 installed (4.6.1 was recommended by ConnectTech). I installed the base-image, do I maybe need to select any of their AR** or IMX 390 images? A USB3 D450 works fine btw with USB 3.2 connection.
Looking forward to any hints how to get them up and running...
Thanks!
Hi @jast1982 I will discuss your issue with my Intel RealSense colleagues. Thanks very much for your patience!
Hi again @jast1982 My colleagues advised that first, you need to use the latest RealSense Viewer version. I believe that you are already doing this as you are using SDK version 2.53.1.
Secondly, you need to obtain the D457 driver from Connect Tech. Intel cannot help with this and you need to contact Connect Tech directly.
@MartyG-RealSense
Could you confirm that, if we order these three units for the Jetson AGX devkit, we will be able to use them for a D457 connection? Thanks
LI-GMSL2-IPX-DESER
FAW-1233-03 cable
LI-JXAV-MIPI-ADPT-4CAM
@MartyG-RealSense
is this the de-serializer board from Intel?
https://store.intelrealsense.com/buy-intel-realsense-des457.html
The de-serializer board on the store link is the Intel one, though I would recommend contacting the Intel RealSense sales team to confirm pricing. As mentioned above at https://github.com/IntelRealSense/librealsense/issues/10984#issuecomment-1364041699 - please allow another week for a response to your submitted query and I will follow it up for you if you have not received a response after that. Thanks very much for your patience!
In regard to the components list, it was confirmed by my Intel RealSense support colleagues at https://github.com/IntelRealSense/librealsense/issues/11010#issuecomment-1306169640 that LI-GMSL2-IPX-DESER could be used with D457.
In regard to a cable, my colleagues recommended at https://github.com/IntelRealSense/librealsense/issues/10984#issuecomment-1292320808 using a LEONI Dacar 302 cable or a cable that is compliant with RG174.
The D457 camera should connect onto the de-serializer board. If using Intel's de-serializer board then it should attach directly onto the AGX Xavier board.
Hi!
So, I was able to progress somewhat: Downloading the Realsense specific BSP from here https://connecttech.com/supported-cameras/ installing it according to the normal instructions and afterwards adding this line to /boot/extlinux/extlinux.conf:
FDT /boot/tegra194-agx-cti-AGX104-INTEL.dtb
enabled at least two out of my 4 connected realsense cameras on my system. I can see /dev/video0 to 11 and /dev/video2 and /dev/video8 play a video stream in VLC. Unfortunately the realsense-viewer outputs this:
29/12 10:34:54,968 INFO [548026171408] (context.cpp:382) Found 2 RealSense devices (mask 0xff)
29/12 10:34:55,251 INFO [548026171408] (rs.cpp:2701) Framebuffer size changed to 2688 x 756
29/12 10:34:55,252 INFO [548026171408] (rs.cpp:2701) Scale Factor is now 1
29/12 10:34:55,350 ERROR [547922886432] (librealsense-exception.h:52) xioctl(VIDIOC_S_EXT_CTRLS) failed Last Error: Invalid argument
29/12 10:34:55,356 WARNING [547922886432] (rs.cpp:310) null pointer passed for argument "device"
29/12 10:34:55,356 WARNING [547922886432] (rs.cpp:2704) Couldn't refresh devices - xioctl(VIDIOC_S_EXT_CTRLS) failed Last Error: Invalid argument
(V2.53.1, tried as user and with sudo)
Any ideas?
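For reference, the extlinux.conf edit described above can be scripted so it is safe to run repeatedly. This is only a sketch: the file path and the .dtb filename are taken from the post above and belong to one specific Connect Tech BSP, so treat both as assumptions to adjust for your own board.

```python
from pathlib import Path

# Values from the post above; adjust for your BSP if they differ.
EXTLINUX = Path("/boot/extlinux/extlinux.conf")
FDT_LINE = "      FDT /boot/tegra194-agx-cti-AGX104-INTEL.dtb"

def add_fdt_line(text: str, fdt_line: str) -> str:
    """Insert the FDT line after each LINUX entry, unless some FDT line
    is already present (in which case the file is left untouched)."""
    lines = text.splitlines()
    if any(l.strip().startswith("FDT ") for l in lines):
        return text  # already configured
    out = []
    for line in lines:
        out.append(line)
        if line.strip().startswith("LINUX "):
            out.append(fdt_line)
    return "\n".join(out) + "\n"

# Usage on the Jetson (as root), not executed here:
#   EXTLINUX.write_text(add_fdt_line(EXTLINUX.read_text(), FDT_LINE))
```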
Hi @jast1982 As it is a xioctl error, it would indicate to me that there is a kernel conflict. You could try building the SDK from source code with CMake with the RSUSB backend installation method to confirm whether or not this is the cause, as RSUSB bypasses the kernel and so is not dependent on Linux versions or kernel versions and does not require kernel patching. RSUSB is not ideally suited to multiple cameras however, as a kernel-patched build works better with multicam.
@MartyG-RealSense That was my hope as well. Already compiling...
@MartyG-RealSense
It appears that the design presumes that the LI de-serializer is connected to the 4-camera adapter.
It doesn't appear that it can be connected directly to the Jetson AGX, can it?
http://www.leopardimaging.com/uploads/LI-IMX390-GMSL2-XAVIER-xxxH_datasheet.pdf
@MartyG-RealSense from LI regarding the purpose
The LI-JXAV-MIPI-ADPT-4CAM is a MIPI adapter which can connect GMSL2 camera and LI-GMSL2-IPX-DESER to the Jetson AGX.
Please make sure your d457 camera includes a GMSL2 interface (MAX9295). You may also need to build the Jetson driver for this camera kit.
@AndreV84 If you have an Intel NDA agreement then I recommend submitting a Zendesk support ticket to Intel for these technical questions about sourcing and attaching together components for use with D457. Alternatively, if you intend to use components that are all made by LI then LI are likely to be able to assist, like with their response above.
@MartyG-RealSense
As far as I understand, LI cannot confirm that their solution works with the D457 specifically, as they did not do such testing. But they say it is compatible with the chip.
Could you explain how to get through the NDA step to Zendesk ticket creation, please?
community.intel.com/t5/Items-with-no-label/Multiple-Camera-Positions-Calibration/m-p/1443270#M17774
@MartyG-RealSense
Could you confirm that "the Intel D457 camera includes a GMSL2 serializer (MAX9295)"?
Thanks
@AndreV84 I will make enquires to my RealSense colleagues about your question. Thanks very much for your patience.
@MartyG-RealSense
I signed for Zendesk account
could you clarify whether a Suite or Support type of subscription is required to create a ticket to Intel from a Zendesk account? Thanks
I have referred your latest enquiry to my colleagues.
@MartyG-RealSense
Thank you for following up
I did not hear back from RealSense sales regarding the Intel de-serializer ordering procedure or price confirmation; is the price listed on the librealsense backorder page valid for the de-serializer? Thanks
@AndreV84 Please send me an email at martyx.grover@intel.com so that I have your email contact to pass to my Intel RealSense colleagues.
@MartyG-RealSense On a Jetson AGX devkit with the Intel de-serializer, which Jetson OS is supported? The latest JetPack 5.x, or only 4.x? Thanks
JetPack 5.0.2 support has only just been introduced into the RealSense SDK in its development branch and will therefore be included in the next SDK release (likely numbered 2.54.0). At the time of writing this, 2.53.1 supports JetPack 4.6, though RealSense users have already been able to unofficially use JetPack 5 successfully with the SDK (though with IMU problems).
https://github.com/IntelRealSense/librealsense/tree/development
NVIDIA has already released JetPack 5.1.
@MartyG-RealSense isn't it supported in the development branch of the SDK? Thanks
SDK support of JetPack is not synchronized with new JetPack releases and support for specific JetPack versions is added over time.
Looks like it is very challenging to work with the GMSL/FAKRA interface.
Has someone in this thread owned and used the RealSense D457? Would you please confirm whether the USB port can be used for development purposes (like on the D455)?
Unfortunately my setup (ConnectTech Rudi + 1-6 Realsenses) still doesn't work due to an init issue with the cameras. Also ConnectTech doesn't seem to have a clue. Does anyone outside of Intel has a working setup?
I have this issue https://github.com/IntelRealSense/librealsense/issues/10984#issuecomment-1367191641 and no idea how to fix it. I tried with master and mipi branch with the same results. Cameras are detected, but open in librealsense fails.
The Leopard Imaging de-serializer board that my Intel RealSense colleagues confirmed is able to be used with D457 is now listed as In Stock at the time of writing this.
https://www.leopardimaging.com/product/accessories/adapters-carrier-boards/li-gmsl2-ipx-deser/
@MartyG-RealSense
did your colleagues use it directly with the AGX devkit somehow, or does it necessarily require the extra 1-to-4 camera MIPI adapter by LI in order to run?
I assume that it was a general statement of compatibility based on the store description rather than it having been actually purchased and tested on specific hardware.
The problem is that my boss has already ordered it, as I passed your prior post on to them.
according to the manufacturer, the other required 4-camera MIPI adapter won't be in stock until mid-March
@AndreV84 Thanks for the information about the adapter.
Advice from my Intel RealSense colleagues at https://github.com/IntelRealSense/librealsense/issues/11010#issuecomment-1301807128 stated that the de-serializer MAX9296A (the Leopard Imaging product uses MAX9296) can be paired with the MAX9295A serializer built inside the D457. As the Leopard Imaging board contains the appropriate de-serializer component, this is likely the basis of confidence that it can work when paired with D457.
Hi everyone, I have received information from my Intel RealSense colleagues that it is fine to use the D457 only in USB mode and it has the same USB performance and the same feature-set as the D455, with the added IP65 protection. So it is absolutely valid to purchase the D457 for the protection and use it as a USB camera instead of in GMSL / FAKRA mode.
How do I make D457 work in USB mode? Just plug-and-play? Realsense-viewer is not finding it, but it works with my D435. Any advice?
Hi @Mat198 Are you using librealsense SDK version 2.53.1 please as this will be required for use with D457.
The SDK version is correct. I can find the D435 with lsusb, but it doesn't show the D457. It doesn't work on Windows or on another Ubuntu 20.04 PC. Is my D457 camera dead? :(
If the D457 is not detected on a USB connection then unplugging the micro-sized end of the USB from the cable, turning it around the opposite way and re-inserting it into the camera can cause the camera to be detected.
I tried both sides without success. :cry: Anything else I can do?
Is your camera firmware driver version 5.14.0.0 ?
I'm not sure. I never changed the firmware so it must be the factory default.
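As an aside, once the camera does enumerate, the firmware string can be compared against the 5.14.0.0 minimum programmatically. The comparison helper below is an illustrative sketch; the pyrealsense2 call shown in the comment (`get_info(rs.camera_info.firmware_version)`) is the standard way to read the firmware string, but the surrounding code is not from the SDK documentation.

```python
def parse_fw(version: str) -> tuple:
    """Parse a dotted RealSense firmware string like '5.14.0.0'
    into a tuple of integers for ordered comparison."""
    return tuple(int(part) for part in version.split("."))

def firmware_at_least(current: str, required: str) -> bool:
    """True if `current` firmware is at or above `required`."""
    return parse_fw(current) >= parse_fw(required)

# With pyrealsense2 installed and the camera detected (not executed here):
#   import pyrealsense2 as rs
#   dev = rs.context().query_devices()[0]
#   fw = dev.get_info(rs.camera_info.firmware_version)
#   print(firmware_at_least(fw, "5.14.0.0"))
```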
I will consult with my Intel RealSense colleagues. Thanks very much for your patience.
I will be waiting for you response. Thank you for the fast replies!
Hi @Mat198 My colleagues informed me that to enable USB mode you need to set the mode switch to USB mode, as the D457's default is MIPI mode. The switch is under a small cap on the camera.
It worked @MartyG-RealSense. Thank you very much! I couldn't find anything about this mode switch in the user manual. The word "switch" is not even mentioned a single time. The documentation only states the hardware sync connector behind that cover. I think the datasheet should have this info.
It's great to hear that the mode switch resolved your problem!
The switch is illustrated on a diagram of the camera in the D457 data sheet on page 65 of its current edition.
Indeed it's on the datasheet. Could be easier to find, the text is actually an image. Anyway, thank you!
@MartyG-RealSense
is JetPack 5.1 supported? In the MIPI driver repo, 5.0.2 is listed as supported by default;
so will we be able to use the camera with Intel's adapter board on a Jetson AGX running an OS from JetPack 5.1?
JetPack 5.0.2 support will be included in the next SDK version, likely numbered 2.54.0.
RealSense users of JetPack 5 have been able to use unsupported JetPack 5 versions successfully if they do not patch the kernel or they build the SDK in RSUSB backend mode.
@MartyG-RealSense when you are talking about patching the kernel, are you referring to patching like in https://github.com/IntelRealSense/librealsense/blob/master/doc/installation_jetson.md#building-from-source-using-rsusb-backend
or to patching like in the MIPI driver for Jetson?
If we are to try JetPack 5.1 on the Jetson, are we to try with e.g.
https://github.com/IntelRealSense/librealsense/blob/master/scripts/libuvc_installation.sh
or do we still need to patch the kernel for the de-serializer driver before applying or trying it?
I mean kernel patching like in installation_jetson.md. The Jetson users who had success with JetPack 5 were not using D457. Those users were of course also using USB instead of GMSL / FAKRA.
When the D457's mode switch is set to USB, it should behave like a D455 and so be able to skip patching the kernel.
I do not currently have information on the effect that not patching the kernel would have on the camera when using it with a de-serializer in its default MIPI mode.
~/realsense_mipi_platform_driver$ ./apply_patches_ext.sh ./Linux_for_tegra/source/public 5.1
Wrong JetPack version (5.1)! Only 5.0.2 and 4.6.1 supported.
How do I apply this patch to 5.1 version?
The MIPI driver does not have a branch for 5.1. As you pointed out, the only 5-version supported is 5.0.2.
I recall that in SDK 2.50.0 a RealSense user added support for JetPack 4.6 via a PR at https://github.com/IntelRealSense/librealsense/pull/9855/commits/13094ea0b8cfb6955ce6e69d18138afc125f56aa by submitting an edit of patch-realsense-ubuntu-L4T.sh. Perhaps you could create a custom 'forked' version of librealsense and make your own edit to patch-realsense-ubuntu-L4T.sh to support 5.1 if you need 5.1 support added immediately to the patch script.
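To illustrate the kind of fork edit meant here: the driver's patch script rejects unlisted JetPack versions, so a custom branch would widen that version check. The helper below is purely hypothetical; the real check in apply_patches_ext.sh is almost certainly shaped differently, so inspect the actual script before editing it.

```python
def extend_supported_versions(script_text: str, new_version: str) -> str:
    """Hypothetical sketch: given shell-script text containing a version
    whitelist test such as
        [ "$VER" != "5.0.2" ] && [ "$VER" != "4.6.1" ]
    append one more clause so `new_version` is also accepted.
    The real script may encode its version check quite differently."""
    anchor = '[ "$VER" != "4.6.1" ]'
    extra = f'{anchor} && [ "$VER" != "{new_version}" ]'
    return script_text.replace(anchor, extra)
```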
@MartyG-RealSense
after patching 5.0.2 with the MIPI driver,
multiple /dev/video devices appeared, but after building the librealsense SDK MIPI branch the Viewer shows no device
any ideas?
which command should be used, with which branch, to get GMSL camera outputs on 5.0.2 after patching the OS with the MIPI driver?
Hi @AndreV84 Has your D457's mode switch under the removable cap been moved from MIPI mode to USB mode since you bought the camera? If it has then the switch should be moved back to its MIPI position.
Hi @MartyG-RealSense
I think the switch hasn't been touched at all, so it should be in the default position as it comes from sales. However, I will check with the remote team to confirm. Is there a particular branch of the SDK, with a particular cmake build sequence, that you could suggest for getting GMSL working? Is using the MIPI SDK branch with RSUSB=ON correct for GMSL?
which branch has to be used?
D457 should be used with a minimum of SDK 2.53.1 and firmware 5.14.0.0.
FORCE_RSUSB_BACKEND relates to USB. The SDK should be kernel patched for use with MIPI, as described in the reference driver's instructions.
https://github.com/IntelRealSense/realsense_mipi_platform_driver
Hi @MartyG-RealSense
the MIPI patch has been applied
it resulted in multiple /dev/video devices
the MIPI branch of the SDK was built as 2.54.0 using the cmake command above
How could I check or update the firmware if the device is not detected in the RealSense Viewer? How do I know which firmware it ships with?
SDK 2.54.0 is not properly released yet and is still on the development branch of the SDK (i.e it is an unfinished SDK whose development is still in progress). At this time, 2.53.1 should be used as this is currently the latest public release of the SDK.
https://github.com/IntelRealSense/librealsense/releases/tag/v2.53.1
The firmware version cannot be checked if the camera is not detected. My own D457 shipped with firmware 5.13.01.0, though the firmware will depend on when a particular camera unit was manufactured and mine is an early pre-production version.
When the mode switch is set to USB, the camera is detected as a D455 as the D457 has all the features, performance and characteristics of a D455 in that mode.
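Because a D457 with its mode switch set to USB reports like a D455 (as noted above), telling the modes apart from a device list comes down to filtering on the reported name. The helper below is a sketch; the pyrealsense2 calls in the comment are real API, but the filtering logic is illustrative.

```python
def pick_d457_candidates(devices):
    """Given (name, firmware) pairs as reported by the SDK, return the
    entries that could be a D457: either an explicit D457 entry, or a
    D455-named entry (a D457 in USB mode enumerates like a D455)."""
    return [
        (name, fw) for name, fw in devices
        if "D457" in name or "D455" in name
    ]

# With pyrealsense2 installed (not executed here):
#   import pyrealsense2 as rs
#   devs = [(d.get_info(rs.camera_info.name),
#            d.get_info(rs.camera_info.firmware_version))
#           for d in rs.context().query_devices()]
#   print(pick_d457_candidates(devs))
```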
@MartyG-RealSense
folks won't be switching to USB mode as they need GMSL working
I built this branch https://github.com/IntelRealSense/librealsense/tree/D4XX_MIPI which is 2.54, as I noticed
do you suggest building the 2.53.1 public release? Which cmake arguments should be used to make sure GMSL will work on Jetson?
Some RealSense users prefer conventional USB over GMSL and the need for a de-serializer, and desire an Intel-made RealSense camera that can work with USB like a D455 but also has IP65 protection. The D457 appeals to customers with those needs.
Yes, I suggest building the 2.53.1 public release.
There is no specific guidance for installing the SDK for GMSL, simply 'install 2.53.1' (presumably with Jetson-compatible instructions for those who are using Jetson).
@MartyG-RealSense
Should it detect the GMSL camera?
realsense-viewer doesn't show anything either, except a warning about missing udev rules
2.53.1 was built with these
how could we get the camera detected?
I would suggest setting DFORCE_RSUSB_BACKEND to Off.
Network Device is a valid instruction in 2.53.1, although network device is removed from the SDK in the next upcoming version as it was an experimental proof of concept rather than an official, validated Intel feature.
The MIPI reference driver is designed for use with Intel's own de-serializer board and with AGX Xavier and the driver should be ported by the customer if using it with a non-Intel de-serializer. Intel are unable to provide assistance with this porting process, unfortunately.
we are using the Intel de-serializer board
with external 12 V power
with the 5.0.2 OS that was pointed out by Intel in the MIPI driver instructions
Have your colleagues correctly plugged the Intel de-serializer board onto the AGX Xavier board and attached the D457 camera to the de-serializer with the 1000 mm official GMSL cable supplied with the camera?
@MartyG-RealSense
from the devs:
do we need to power the camera via the USB line,
or does the GMSL cable provide power?
after rebuilding with RSUSB, the RealSense Viewer seems to have detected the camera
The D457 was designed primarily for use with a GMSL connection, with the USB connector intended for debugging purposes (though it can be used as a full D455-like camera on USB). So USB would not be the main means of powering the camera.
@MartyG-RealSense
the firmware upgrade doesn't go through over the GMSL connection
does it require a USB cable?
how do we upgrade from 5.13 to 5.14?
In general for any RealSense camera, performing firmware updates through a short-length USB cable (preferably an official RealSense one) is recommendable.
If the camera can be detected in the RealSense Viewer then you can update to 5.14.0.0 in the Viewer. I have performed the update on my own D457 today via USB in the Viewer.
feedback
The option to build the MIPI patch natively on AGX could have been more convenient, but it won't run when following Intel's steps on AGX, as the steps are for x86_64. That introduces an extra step and an extra computer into the patching process, which can be avoided by building/applying the patch natively on AGX.
here you may find a reference example to build natively
https://www.viziochron.com/xavier#h.p_aiRTN3O-uTrg
Thanks so much @AndreV84 for sharing the feedback and reference for setting up on AGX Xavier!
Hello @MartyG-RealSense,
I have the Rudi AGX Jetson Xavier from Connect Tech. They provide a BSP with the D4XX MIPI driver, which looks fine in the dmesg logs. I want to use GMSL and I have an issue when building librealsense.
When building from the development branch (which seems to support the D457, GMSL and L4T 32.7.1), applying the patch at ./scripts/patch-realsense-ubuntu-L4T.sh and building with cmake .. -DBUILD_EXAMPLES=true -DCMAKE_BUILD_TYPE=release -DFORCE_RSUSB_BACKEND=false, rs-depth returns the following issue:
There are 1 connected RealSense devices.
rs_error was raised when calling rs2_create_device(info_list:0x55a65045d0, index:0):
xioctl(VIDIOC_S_EXT_CTRLS) failed Last Error: Invalid argument
The BSP uses JetPack 4.6.1 and L4T version 32.7.1. What are the instructions to build librealsense from source in this case?
@lmuffang
did you try to build the MIPI branch of the librealsense SDK available from GitHub?
4.6.1 seems to be a less recent release than the one we tried with the ordinary NVIDIA devkit, though
I applied the patch as mentioned in the build instructions for Jetson devices here. However, not applying them does not make a difference for my issue. I guess I will skip this patch from now on; not doing it could also prevent some conflict with Connect Tech's BSP, I think.
And yes I also already tried the D4XX_MIPI branch with the same outcome.
Connect Tech provides support for the D457 GMSL camera and their instructions are to flash the BSP with JetPack 4.6.1. They also suggested the same branch as you did, @AndreV84.
Should I try using the latest JetPack version from Nvidia SDK manager instead of 4.6.1 for flashing?
I just tried using a newer BSP Connect Tech sent me and it seems to work a bit better: I can get RGB, motion and infrared 1 & 2 using GMSL in realsense-viewer, but when I try to enable the depth stream, no frames are received and I need to reset my device. (I built realsense from the D4XX_MIPI branch)
Here are my realsense-viewer logs when enabling one infrared only (which works) and then the depth stream only (which does not):
realsense-viewer.log
@lmuffang the patch https://github.com/IntelRealSense/librealsense/blob/master/doc/installation_jetson.md#building-from-source-using-native-backend seems not relevant for the GMSL use case, but may be applicable to the D455 mode of the camera [USB connection] if you wish to use it as a USB D455 camera instead of as a D457 via GMSL.
However, if you try using sdkmanager it won't succeed, as you need to use the OS image provided by ConnectTech, which has their device tree changes integrated. sdkmanager only seems suitable for default NVIDIA devkit carrier boards.
Moreover, you may try building the latest public release 2.53.1, as @MartyG-RealSense advised [not the development branch], to see if the outcome is different. In our setup both the 2.53.1 & MIPI branches worked if built without the RSUSB backend.
Hope it helps
So I managed to get it working.
I had CUDA missing before and I did not think it was an issue. So I installed the Jetson components (not the OS) via the Nvidia SDK manager after flashing the ConnectTech OS image. Then I could build realsense with -DBUILD_WITH_CUDA=true on the D4XX_MIPI branch and it worked!
However, on the latest Public release v2.53.1, I could not get the depth stream working using GMSL, only the infrared and color streams worked in that case.
Thanks @AndreV84 for your help
@lmuffang It's great to hear that you were successful :)
@AndreV84 Thanks so much for your help!
@MartyG-RealSense
Could you provide a reference for the camera's LED indicator flashing, please?
A remote colleague pointed out that the flashing of the D457 was unsteady. I am trying to decipher the flashing states so we can map them to the camera's mode.
Thanks
Andrew
Hi @AndreV84 I do not have information on that subject. Please submit a Zendesk ticket to Intel about that query.
@MartyG-RealSense
folks are asking: "what type of barrel jack is used for the GMSL adapter?"
Hi @AndreV84 If you are asking about the long blue GMSL connector on the side of the D457 camera, page 22 of the D457 data sheet document states that there are 14 different layouts for GMSL / FAKRA connectors and the D457 one is a C-coding (blue) connector.
Hi!
I have the same hardware setup and tried again to get the RealSense cameras up and running, but even with CUDA I was not successful. However, I still use JetPack 4.6.1 with the CTI-L4T package V32.7.1, since there doesn't seem to be a matching CTI-L4T package for the newer BSP for JetPack 5.1.
What combination did you use eventually? Were any other kernel patches required?
Regards,
Jan
Hi Jan,
After I flashed the BSP as you did, I installed the Jetpack SDK Components via the Nvidia SDK as mentioned here. Then I built realsense with CUDA on the D4XX_MIPI branch and it worked. I did not apply any kernel patches, I just cloned the repo and simply built it from source.
Hi!
Thank you for your advice! And just to confirm: with OS you meant installing JetPack 4.6.1 + their CTI-L4T V32.7.1 package from the CTI camera homepage, running their script, and selecting the configuration for Rudi + RealSense, correct?
Regards,
Jan
@MartyG-RealSense is it likely that we can run the ZED X using Intel's de-serializer card on AGX?
Plugging it in did not auto-setup, it seems.
Could you remind me of the Maxim de-serializer model of the Intel GMSL AGX adapter?
ZED lists support for the 96712/9626 from LI
Hi @AndreV84 I am unable to provide advice about non-RealSense cameras.
A zoom-in of an image of the Intel deserializer from the RealSense Store listing shows that it is using Maxim 9296A like the LI product does.
Hi!
I wasn't able to get the D457 to run until CTI's support suggested trying the tag https://github.com/IntelRealSense/librealsense/releases/tag/v2.50.40_MIPI_QS_RC2, which then worked right away with Jetpack 4.6.1 and their BSP v32.7.1 from their camera homepage.
Even without CUDA btw.
Regards,
Jan
Thanks so much Jan for sharing the advice!
I purchased the LI-GMSL2-IPX-DESER product from Leopard Imaging.
https://www.leopardimaging.com/product/nvidia-jetson-cameras/nvidia_agx_xavier_gmsl2_camera_kits/li-imx390-gmsl2-xavier/li-imx390-gmsl2-xavier-120h/
And I am trying to develop with the Intel D457 on AGX Xavier.
My hardware environment are:
Nvidia Jetson AGX Xavier EVB * 1
Leopard LI-JXAV-MIPI-ADPT-4CAM board * 1
Leopard LI-GMSL2-IPX-DESER board * 1
Leopard FAW-1233-03 cable * 1
Intel RealSense D457 camera * 1
GMSL FAKA cable * 1
May I use the realsense_mipi_platform_driver from Intel GitHub to develop Intel D457 + LI-GMSL2-IPX-DESER on AGX Xavier?
https://github.com/IntelRealSense/realsense_mipi_platform_driver
Hi @sammychentw The RealSense MIPI driver for D457 is designed for AGX Xavier and Intel's own brand of deserializer. Whilst you can use your own choice of deserializer, to use the driver with your own hardware (a Leopard deserializer) you will need to port (convert) the MIPI driver for use on your equipment. Intel are unable to provide assistance with this porting process unfortunately.
May I confirm that the Intel RealSense D4 GMSL De-serializer board below could meet the requirements of the RealSense MIPI driver for the D457?
Intel® RealSense™ Vision Processor D4 GMSL/FAKRA De-serializer Board
https://store.intelrealsense.com/buy-intel-realsense-des457.html
The MIPI driver was designed for that specific Intel deserializer. It is not available for purchase in single or small quantities though but rather for manufacturer-scale orders, so it is necessary to instead purchase a non-Intel deserializer and port the MIPI driver.
An alternative is to use the D457-compatible AGX Orin hardware product by the company Connect Tech, which has built-in support for D457 and does not require the purchase of a separate deserializer.
https://connecttech.com/connect-tech-offers-built-in-support-on-the-newly-announced-intel-gmsl-realsense-depth-camera-d457/
Does anyone have any info on the Leopard Imaging boards + AGX Orin gmsl/fakra setup? Seems like there should be drivers out there.
Hello. I just received my D457 camera and I cannot get it working with the USB Type-C port.
When I connect with Type-C, the lsusb command shows no change. I have also tried it in Windows 10 after installing the latest SDK. Nothing appears in Device Manager. Your help is very much appreciated.
Hi @MhmRhm To use the camera with a USB cable you need to move a physical switch sideways that is under a removable cap on top of the camera. Information about this can be found at the link below.
https://support.intelrealsense.com/hc/en-us/community/posts/14840675121043/comments/14843232242835
@MartyG-RealSense
There has again emerged some issue with the D457 not being detected, on the AGX Nvidia devkit.
Since I do not see how to troubleshoot it easily, it was decided that the unit will get reflashed.
The question is what the currently newest supported OS is, and what the procedure for installing the driver is, if it has changed.
Thanks
AV
Hi @AndreV84 The newest supported OS at the time of writing this is Ubuntu 22.04, though support for the recent 24.04 Noble Numbat and ROS2 Jazzy is planned.
If you mean the D457's MIPI reference driver AGX Xavier, installation instructions can be found on the driver's page at the link below.
https://github.com/IntelRealSense/realsense_mipi_platform_driver
@MartyG-RealSense Thanks, but what version of the RealSense SDK should be built for GMSL? The latest one?
Yes, 2.55.1 supports D457 / MIPI, as described in the 2.55.1 release notes at the link below. The notes recommend reinstalling the udev rules and the MIPI reference driver though.
https://github.com/IntelRealSense/librealsense/wiki/Release-Notes#release-2551
The problem is that while the GitHub repo lists support for the JetPack 5.1.2 OS,
the release you pointed to could only support 5.0.2 of the 5.x releases.
After installing the MIPI driver, the log looked like this:
[ 70.435540] systemd[1]: Starting udev Kernel Device Manager...
[ 70.436732] max9295 30-0040: max9295_write_reg:i2c write failed, 0x0 = 84
[ 70.438661] max9295 30-0042: max9295_write_reg:i2c write failed, 0x10 = 21
[ 70.438796] max9295 30-0042: max9295_setup_control: ERROR: ser device not found
[ 70.438802] d4xx 30-001a: gmsl serializer setup failed
[ 70.439418] d4xx 30-001a: ds5_serdes_setup gmsl serdes setup failed
[ 70.453457] d4xx: probe of 30-001a failed with error -121
[ 70.454459] systemd[1]: Started Journal Service.
[ 70.458753] d4xx 33-001a: Probing driver for D45x
[ 70.468979] d4xx 33-001a: ds5_read(): i2c read failed -121, 0x5020
[ 70.469355] d4xx 33-001a: ds5_read(): i2c read failed -121, 0x5020
[ 70.480755] d4xx 33-001a: ds5_read(): i2c read failed -121, 0x5020
[ 70.486885] d4xx 33-001a: ds5_read(): i2c read failed -121, 0x5020
[ 70.493316] d4xx 33-001a: ds5_read(): i2c read failed -121, 0x5020
[ 70.493673] d4xx 33-001a: ds5_read(): i2c read failed -121, 0x5020
[ 70.493700] d4xx 33-001a: ds5_probe(): cannot communicate with D4XX: -121 on addr: 0x1a
[ 70.493940] d4xx: probe of 33-001a failed with error -121
[ 70.493995] d4xx 34-001a: Probing driver for D45x
[ 70.494764] d4xx 34-001a: ds5_read(): i2c read failed -121, 0x5020
[ 70.530152] d4xx 34-001a: ds5_read(): i2c read failed -121, 0x5020
[ 70.536197] d4xx 34-001a: ds5_read(): i2c read failed -121, 0x5020
[ 70.536578] d4xx 34-001a: ds5_read(): i2c read failed -121, 0x5020
[ 70.536918] d4xx 34-001a: ds5_read(): i2c read failed -121, 0x5020
[ 70.537282] d4xx 34-001a: ds5_read(): i2c read failed -121, 0x5020
[ 70.537288] d4xx 34-001a: ds5_probe(): cannot communicate with D4XX: -121 on addr: 0x1a
[ 70.537508] d4xx: probe of 34-001a failed with error -121
[ 70.574162] d4xx 35-001a: Probing driver for D45x
[ 70.575039] d4xx 35-001a: ds5_read(): i2c read failed -121, 0x5020
[ 70.585328] d4xx 35-001a: ds5_read(): i2c read failed -121, 0x5020
[ 70.585699] d4xx 35-001a: ds5_read(): i2c read failed -121, 0x5020
[ 70.586213] d4xx 35-001a: ds5_read(): i2c read failed -121, 0x5020
[ 70.604094] d4xx 35-001a: ds5_read(): i2c read failed -121, 0x5020
[ 70.610565] d4xx 35-001a: ds5_read(): i2c read failed -121, 0x5020
[ 70.610571] d4xx 35-001a: ds5_probe(): cannot communicate with D4XX: -121 on addr: 0x1a
[ 70.610817] d4xx: probe of 35-001a failed with error -121
[ 70.628948] nvmap_heap_init: nvmap_heap_init: created heap
It doesn't seem to have worked; no /dev/video devices are present. @MartyG-RealSense
Does the SDK have to be built from sources, or should the apt installation do with 2.55.1?
what does the red led indicate?
@AndreV84 I do not recognize the hardware in the image. Can you provide information about what the component is, please?
SDK 2.55.1 introduces support for JetPack 6.0. JetPack 5.0.2 is also supported. Some Jetson users have built the SDK from source code and not applied the kernel patch script if they have not had an officially supported JetPack version.
@MartyG-RealSense isn't it the Intel Jetson AGX GMSL board in the image?
Is JetPack 5.1.2 supported by 2.55.1?
It blinks with a green LED twice shortly during the boot process, then pauses, then blinks once again.
After that only the red LED radiates.
https://store.intelrealsense.com/buy-intel-realsense-des457.html
@MartyG-RealSense
It seems linux-image-5.10.120 with the D457 won't work, due to a lack of driver integration or something else.
Following the instructions does not work:
https://github.com/IntelRealSense/realsense_mipi_platform_driver/issues/207
I will consult with my Intel RealSense colleagues about your issues with the official Intel deserializer board and let you know as soon as I have information.
@MartyG-RealSense Thank you for following up!
Now it has green LEDs: one radiates constantly, while another blinks between red and green.
Is there a way to read from the camera without the RealSense SDK, maybe?
In GMSL mode it needs the MIPI reference driver.
The camera would likely work in USB mode as a D455 if you moved the mode switch in the casing to the USB position but that would not provide useful information about your GMSL problem, unfortunately.
The best place to seek support on this issue is likely to be the Zendesk support channel, which you have used before for previous D457 issues such as https://github.com/IntelRealSense/librealsense/issues/10560#issuecomment-1400689652
@MartyG-RealSense may I know the URL for the Zendesk support ticketing system, please?
The claims for verification control have not been verified.
The code has been verified. You can now continue.
that is what I got logging in
I do not have information about submitting tickets to Zendesk. From https://github.com/IntelRealSense/librealsense/issues/10560#issuecomment-1400689652 it sounds as though you have successfully contacted Intel's Zendesk channel in the past.
there was some instruction at the url, as it seems:
https://forums.intel.com/s/question/0D70P000006GkkASAS
somehow created a request with the RealSense Zendesk
thanks
(22669) request
That's great to hear!
@MartyG-RealSense is it correct to use this script to build the sdk from sources?
https://github.com/IntelRealSense/librealsense/blob/master/scripts/libuvc_installation.sh
@AndreV84 I do not recall this build script being previously used for a MIPI connection but the script has been updated to include a MIPI version of the udev device handling rules.
https://github.com/IntelRealSense/librealsense/blob/master/scripts/libuvc_installation.sh#L43
What are the power input requirements for the board? I see 12V-13.7V written on the board; what amperage does it expect on the input port?
@MartyG-RealSense
This component seems to have got lost even though it was present for the GMSL card, so are there specific power input requirements beyond the 12V-13.7V range?
@MartyG-RealSense
@AndreV84 There is not any publicly available technical information about the Intel deserializer board. It will be necessary to enquire to Intel's Zendesk support channel for that information.
I asked them, but there was no specific answer yet, only a general reference to the GitHub repository.
Is this error pointing to an issue with the GMSL board or with the camera?
max9295 30-0042: max9295_setup_control: ERROR: ser device not found
Folks changed the power source, and after that the driver issue got resolved; the 2.53 SDK also worked. Now the only thing is to figure out how to build the latest SDK.
It's great to hear that you were completely successful :)
Folks are asking to check: "Exactly what size stand-offs do we need to secure it to the AGX unit?"
@MartyG-RealSense
@AndreV84 I will ask my colleagues about that question too.
@AndreV84 My colleagues advised that the stack space is 11 mm. They added that there should have been 4 pieces of stand-off included in the Xavier carrier board package. Did your package have these pieces, please?
thank you for following up!
folks pointed out "We didn’t get the standoffs"
Okay, I will follow up with my colleagues about the missing stand-offs.
@AndreV84 My colleagues advised that the package should include the following items as a kit:
Board
Power supply
GMSL cable
Stand offs / screws in a small ziplock bag
Can you ask your team, please, if they received the board, power supply and GMSL cable, and whether the stand-offs and screws were the only items missing?
Hello @MartyG-RealSense, the post mentions mostly ConnectTech's hardware. Do you have by chance a list of carrier boards that supports D457 GMSL, both hardware and software?
Hi @tonynajjar There is not a list available, unfortunately. The most compatible boards will be Nvidia Jetson AGX Xavier or Orin, as Intel's D457 MIPI driver has installation instructions for those specific boards at the link below.
https://github.com/IntelRealSense/realsense_mipi_platform_driver
The MIPI driver can be converted ("ported") for other boards but you need to perform this conversion yourself as Intel are unable to provide assistance for doing so.
If you are not using Connect Tech's product then you will also need to purchase an additional deserializer board. Leopard Imaging's LI-GMSL2-IPX-DESER deserializer board is compatible with D457.
https://leopardimaging.com/product/accessories/adapters-carrier-boards/li-gmsl2-ipx-deser/
Other brands of deserializer board should also work though as long as they contain MAX9296A deserializer technology like the Leopard Imaging LI-GMSL2-IPX-DESER product does.
|
gharchive/issue
| 2022-10-09T15:10:26 |
2025-04-01T04:55:12.057320
|
{
"authors": [
"AndreV84",
"MartyG-RealSense",
"Mat198",
"MhmRhm",
"jast1982",
"javier-recasens",
"jrecasens",
"lmuffang",
"robotom",
"sammychentw",
"tonynajjar"
],
"repo": "IntelRealSense/librealsense",
"url": "https://github.com/IntelRealSense/librealsense/issues/10984",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
971983226
|
rs-server on Windows
Required Info
Camera Model
D435i
Operating System & Version
Win 10
Platform
PC
Issue Description
https://dev.intelrealsense.com/docs/open-source-ethernet-networking-for-intel-realsense-depth-cameras
I would like to apply the contents of this article to a solution I am currently designing.
However, this article seems to assume a Linux PC (Raspberry Pi 4) as the server machine, but I want to use Windows 10 for the server.
Is there any plan to make rs-server workable on Windows?
I built rs-server on Windows to try:
git clone https://github.com/IntelRealSense/librealsense.git
cd librealsense\tools\rs-server
Open CMakeLists.txt and delete 'if(WIN32) message(...)'
mkdir build; cd build; cmake ..
Open rs-server.sln, add several paths and build
I got a lot of link errors for live555
Do these link errors have anything to do with why rs-server is not supported on Windows?
Is there any possibility to build completely to the end without link errors on Windows?
Thank you.
Hi @terranas rs-server currently can only be compiled on Linux. I do not have information about the reason for this.
Would it be practical for you to set up your remote computer to be dual boot (both Windows and Linux installed on it) so that you can set up rs-server on the Linux OS and boot into Linux mode when running the computer as a remote server?
https://opensource.com/article/18/5/dual-boot-linux
Hi @terranas Do you require further assistance with this case, please? Thanks!
Case closed due to no further comments received.
Sorry for the delay.
After that, I made some changes to the original code and successfully ran rs-server on Windows 10.
Thank you very much for releasing useful whitepaper.
That's excellent news @terranas that you were successful - thanks very much for the update :)
Hi @terranas,
I am trying to build the rs-server on Windows 10 and it does not work for me, could you tell what changes you have made to make it work?
Best regards,
Julian
Hi @jbruno79
Please apply the following patch to the tag v2.49.0.
The most important part is linking ws2_32.lib. (I wonder why the development team didn't do this)
I hope it works :)
tools/CMakeLists.txt | 6 +--
tools/rs-server/CMakeLists.txt | 83 ++++++++++++++++++----------------
2 files changed, 46 insertions(+), 43 deletions(-)
diff --git a/tools/CMakeLists.txt b/tools/CMakeLists.txt
index b1b967af2..13f05a984 100644
--- a/tools/CMakeLists.txt
+++ b/tools/CMakeLists.txt
@@ -19,10 +19,8 @@ add_subdirectory(terminal)
add_subdirectory(recorder)
add_subdirectory(fw-update)
-if(NOT WIN32)
- if(BUILD_NETWORK_DEVICE)
- add_subdirectory(rs-server)
- endif()
+if(BUILD_NETWORK_DEVICE)
+ add_subdirectory(rs-server)
endif()
endif()
diff --git a/tools/rs-server/CMakeLists.txt b/tools/rs-server/CMakeLists.txt
index 617a72379..bbb5a0be0 100644
--- a/tools/rs-server/CMakeLists.txt
+++ b/tools/rs-server/CMakeLists.txt
@@ -20,46 +20,51 @@ SET(CMAKE_EXE_LINKER_FLAGS ${CMAKE_EXE_LINKER_FLAGS} "-pthread")
set(LIVE ${CMAKE_BINARY_DIR}/third-party/live)
+file(GLOB LIVE_SOURCES LIST_DIRECTORIES false CONFIGURE_DEPENDS
+"../../src/ipDeviceCommon/*.c*"
+"${LIVE}/*.c*"
+"${LIVE}/groupsock/*.c*"
+"${LIVE}/BasicUsageEnvironment/*.c*"
+"${LIVE}/liveMedia/*.c*"
+"${LIVE}/UsageEnvironment/*.c*"
+)
+source_group("LIVE555" FILES ${LIVE_SOURCES})
+file(GLOB RS_SERVER_SOURCES LIST_DIRECTORIES false CONFIGURE_DEPENDS
+"*.c*"
+)
+list(APPEND RS_SERVER_SOURCES ${LIVE_SOURCES})
+list(REMOVE_ITEM RS_SERVER_SOURCES "${LIVE}/liveMedia/ADTSAudioStreamDiscreteFramer.cpp")
+if (${BUILD_SHARED_LIBS} AND ${BUILD_EASYLOGGINGPP})
+ list(APPEND RS_SERVER_SOURCES ../../third-party/easyloggingpp/src/easylogging++.cc)
+endif()
+add_executable(${PROJECT_NAME} ${RS_SERVER_SOURCES})
+add_definitions(-DELPP_NO_DEFAULT_LOG_FILE)
+
+include_directories(
+../../common
+../../src
+../../src/ipDeviceCommon
+../../include/librealsense2
+../../third-party/tclap/include
+../../third-party/easyloggingpp/src
+${LIVE}/groupsock/include
+${LIVE}/liveMedia/include
+${LIVE}/UsageEnvironment/include
+${LIVE}/BasicUsageEnvironment/include
+)
+
+set_property(TARGET ${PROJECT_NAME} PROPERTY CXX_STANDARD 11)
+
+set(DEPENDENCIES ${DEPENDENCIES} realsense2 Threads::Threads realsense2-compression ${ZLIB_LIBRARIES} ${JPEG_LIBRARIES})
+
if(WIN32)
- message(SEND_ERROR "rs-server supports Linux only")
-else()
- file(GLOB RS_SERVER_SOURCES LIST_DIRECTORIES false CONFIGURE_DEPENDS
- "*.c*"
- "../../src/ipDeviceCommon/*.c*"
- "${LIVE}/*.c*"
- "${LIVE}/groupsock/*.c*"
- "${LIVE}/BasicUsageEnvironment/*.c*"
- "${LIVE}/liveMedia/*.c*"
- "${LIVE}/UsageEnvironment/*.c*"
- )
- list(REMOVE_ITEM RS_SERVER_SOURCES "${LIVE}/liveMedia/ADTSAudioStreamDiscreteFramer.cpp")
- add_executable(${PROJECT_NAME} ${RS_SERVER_SOURCES})
- add_definitions(-DELPP_NO_DEFAULT_LOG_FILE)
-
- include_directories(
- ../../common
- ../../src
- ../../src/ipDeviceCommon
- ../../include/librealsense2
- ../../third-party/tclap/include
- ../../third-party/easyloggingpp/src
- ${LIVE}/groupsock/include
- ${LIVE}/liveMedia/include
- ${LIVE}/UsageEnvironment/include
- ${LIVE}/BasicUsageEnvironment/include
- )
-
- set_property(TARGET ${PROJECT_NAME} PROPERTY CXX_STANDARD 11)
-
- set(DEPENDENCIES ${DEPENDENCIES} realsense2 Threads::Threads realsense2-compression ${ZLIB_LIBRARIES} ${JPEG_LIBRARIES})
-
- target_link_libraries(${PROJECT_NAME} ${DEPENDENCIES})
-
- set_target_properties(${PROJECT_NAME} PROPERTIES FOLDER "Tools")
-
- target_compile_definitions(${PROJECT_NAME} PUBLIC RESPONSE_BUFFER_SIZE=100000)
-
- install(TARGETS ${PROJECT_NAME} RUNTIME DESTINATION ${CMAKE_INSTALL_BINDIR})
+ set(DEPENDENCIES ${DEPENDENCIES} ws2_32)
endif()
+target_link_libraries(${PROJECT_NAME} ${DEPENDENCIES})
+
+set_target_properties(${PROJECT_NAME} PROPERTIES FOLDER "Tools")
+
+target_compile_definitions(${PROJECT_NAME} PUBLIC RESPONSE_BUFFER_SIZE=100000)
+install(TARGETS ${PROJECT_NAME} RUNTIME DESTINATION ${CMAKE_INSTALL_BINDIR})
|
gharchive/issue
| 2021-08-16T18:13:02 |
2025-04-01T04:55:12.072745
|
{
"authors": [
"MartyG-RealSense",
"jbruno79",
"terranas"
],
"repo": "IntelRealSense/librealsense",
"url": "https://github.com/IntelRealSense/librealsense/issues/9637",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
499092572
|
rs-ar-basic: fix extrinsic pose to camera transformation
Found by user @JBBee: https://github.com/IntelRealSense/librealsense/issues/4883#issuecomment-535597865.
@radfordi Could you also update this comment:
// This is the pose of the fisheye sensor relative to the T265 coordinate system.
It should say the opposite:
// This is the pose of the T265 coordinate system relative to the fisheye sensor
Thanks @dorian3d. I changed the variable name to replace the comment. That way it makes the usage site clearer as well.
Thanks @radfordi. I forgot about the readme. Do you mind updating it too?
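The direction confusion this PR fixes (is the extrinsic the pose of the sensor relative to the T265 coordinate system, or the other way around?) can be made concrete by inverting the transform. Below is a minimal, self-contained sketch in Java; it is not code from this PR, and the column-major rotation[9] / translation[3] layout is an assumption mirroring librealsense's rs2_extrinsics C struct.

```java
public class Main {
    // An extrinsic maps a point as p_to = R * p_from + t, with the 3x3 rotation R
    // stored column-major in a float[9] (element (row i, col j) at index j*3+i),
    // mirroring librealsense's rs2_extrinsics layout (an assumption here).
    // The inverse transform is p_from = R^T * p_to - R^T * t.

    /** Apply p_out = R * p + t (R column-major). */
    public static float[] apply(float[] r, float[] t, float[] p) {
        float[] out = new float[3];
        for (int i = 0; i < 3; i++)
            out[i] = r[0 * 3 + i] * p[0] + r[1 * 3 + i] * p[1] + r[2 * 3 + i] * p[2] + t[i];
        return out;
    }

    /** Invert the transform: rOut = R^T (still column-major), tOut = -R^T * t. */
    public static void invert(float[] r, float[] t, float[] rOut, float[] tOut) {
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                rOut[j * 3 + i] = r[i * 3 + j]; // transpose
        for (int i = 0; i < 3; i++)
            tOut[i] = -(r[i * 3 + 0] * t[0] + r[i * 3 + 1] * t[1] + r[i * 3 + 2] * t[2]);
    }

    public static void main(String[] args) {
        // 90-degree rotation about Z (column-major) plus a translation.
        float[] r = {0, 1, 0, -1, 0, 0, 0, 0, 1};
        float[] t = {1, 2, 3};
        float[] q = apply(r, t, new float[]{1, 0, 0}); // one direction of the extrinsic
        float[] ri = new float[9], ti = new float[3];
        invert(r, t, ri, ti);
        float[] back = apply(ri, ti, q);               // round-trips to the original point
        System.out.printf("%.1f %.1f %.1f%n", back[0], back[1], back[2]); // 1.0 0.0 0.0
    }
}
```

Swapping which of the two transforms you apply is exactly the bug the PR fixed: using the forward transform where the inverse is needed gives a wrong pose unless R is the identity.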
|
gharchive/pull-request
| 2019-09-26T20:12:14 |
2025-04-01T04:55:12.075463
|
{
"authors": [
"dorian3d",
"radfordi"
],
"repo": "IntelRealSense/librealsense",
"url": "https://github.com/IntelRealSense/librealsense/pull/4945",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2079986871
|
Add entity API
PLEASE READ THE GUIDELINES BEFORE MAKING A CONTRIBUTION
Please check if the PR fulfills these requirements
[x] The commit message are well described
[ ] Docs have been added / updated (for features or maybe bugs which were noted). If not, please update the needed documentation here. This is not mandatory
[x] All changes have fully been tested
What kind of change does this PR introduce? (Bug fix, feature, ...)
Feature.
What is the current behavior? (You can also link to an open issue here)
Calling EnvironmentDetectorPeripheral.scanEntities() gives only a small amount of information on each entity. I felt like this could be used to a greater potential.
What is the new behavior (if this is a feature change)?
Instead of getting a table with a small amount of data when calling EnvironmentDetectorPeripheral.scanEntities(), UUIDs for entities are now the only thing that is returned. These can be used with the new entity API. For example, entity.getNBT(uuid) returns the NBT data of the entity with the given UUID.
The entity API itself can be disabled in config, as well as the individual functions within it that could be deemed exploitative on servers.
A PlayerDetectorPeripheral.getPlayerUUID(username) Lua function has also been added so that you can make use of the entity API when using a PlayerDetectorPeripheral.
I have also expanded the amount of entity information you get when calling LuaConverter.completeEntityToLua(entity). This function can be called via the entity API with entity.getData(uuid).
Does this PR introduce a breaking change? (What changes might users need to make in their scripts due to this PR?)
Yes, any calls to EnvironmentDetectorPeripheral.scanEntities() will need to use the entity API to get the data that it used to give you.
Calls to AutomataEntityHandPlugin.searchAnimals() now have position information in a table with the key relativePos. This was a side-effect of cleaning up LuaConverter.completeEntityToLua(entity) to not require an ItemStack to be passed to it.
Other information:
I can make a PR for the docs if this PR is to be merged.
Here's a script that I've been testing with: https://pastebin.com/uCH39VEm
Before I start reviewing
Since this adds a wider palette of features and due to the fact that this includes a breaking change, could you change the target branch to dev/0.8 so this features will be implemented to the new major version?
Done
Greetings
The build seems to fail, I am not sure if the issues came with the change of the target branch, but they are there
Could you fix them? I would also recommend running gradle check afterward to check for checkstyle rule violations.
You can either open the build log from TeamCity by clicking on "Details", or you could just try to build it locally
My bad, it builds now
You still didn't fix the checkstyle violations
Either check the teamcity build log or run gradle check locally
Foremost, thanks for the PR and I really like your idea
I also discussed that internally with some collaborators, and we agree that switching to a Lua-API-based entity API is a great idea
The changes look good, but I didn't inspect them too deeply. I will take some time this week to review your changes and test them a bit for me
Would it have made more sense to implement this with peripheral methods instead of polluting the global namespace? You could return a userdata from the scan method, return all the information by default, or just add another method to the peripheral for looking up data by uuid
Would it have made more sense to implement this with peripheral methods instead of polluting the global namespace?
Originally I started with adding a getNBT(UUID) function to EnvironmentDetectorPeripheral, since I didn't want to put the huge NBT information in the small table returned by scanEntities, but if you think about it, it makes no sense to have a getNBT function in EnvironmentDetectorPeripheral. It's either have a massive table returned or separate this into an Entity API which can have more fine-grained functions, so I opted for the latter. It also isn't clear which functions in different peripherals return what entity data, but if you get a UUID from EnvironmentDetectorPeripheral.scanEntities and a UUID from PlayerDetectorPeripheral.getPlayerUUID, you know that Entity.getData(UUID) will return the same type of information for both entities.
You could return a userdata from the scan method
It was already the case before that some functions returned different entity data than others for no real reason. scanEntities only gave you like 10 fields per entity so it wasn't very useful, whereas searchAnimals gave you a lot of information, but still didn't include NBT data. It is easier to maintain and more consistent if this is all in one place.
return all the information by default
You could return all data, but there is a lot of information available now. It feels almost wrong to lump almost all information on the entity into one huge table. This would be NBT, almost all runtime data, and all persistent data. If you just need to get the positions of entities then this is all useless information.
or just add another method to the peripheral for looking up data by uuid
Something like getEntityData(UUID) wouldn't make sense in any one peripheral
After looking at some of my changes again though, I think it would make sense for AutomataEntityHandPlugin.searchAnimals and AutomataEntityTransferPlugin.getCapturedAnimal to both return UUIDs instead of completeEntityToLuaWithShearable.
Would it have made more sense to implement this with peripheral methods instead of polluting the global namespace? You could return a userdata from the scan method, return all the information by default, or just add another method to the peripheral for looking up data by uuid
I am currently discussing that a bit with some of the CC community in the CC discord. I know that additions to the global API are not welcome in the community. AP already does unconventional things that I want to eliminate in 0.8, and I'd hate to add more.
I am currently discussing that a bit with some of the CC community in the CC discord. I know that additions to the global API are not welcome in the community. AP already does unconventional things that I want to eliminate in 0.8, and I'd hate to add more.
Damn thats unfortunate, I wasn't aware of that. It's an issue of usability and code quality vs global pollution then. It's a shame that an entity API isn't available in CC itself
Another alternative is turning the Player Detector peripheral into an Entity Detector peripheral. All functionality of the entity API could be put in the Entity Detector. scanEntities could be removed from the Environment Detector and put into the Entity Detector.
I support that idea
Adding all the entity related features to one block would be the ideal compromise imo
You could also participate in the small discussion in the #modmaking chat on the CC Discord
One idea I like is to still add all the entity API functions to the environmental detector but to separate them a bit, put them into a table which you can retrieve using getEntityAPI.
This would be simpler than "adding the API to the global environment, but only when the peripheral is attached" imo
So maybe you want to go with that?
One idea I like is to still add all the entity API functions to the environmental detector but to separate them a bit, put them into a table which you can retrieve using getEntityAPI. This would be simpler than "adding the API to the global environment, but only when the peripheral is attached" imo
So maybe you want to go with that?
I will also convert this to a draft until we agree on a method
For the idea of turning the Player Detector into an Entity Detector:
Player Detector has four player-specific things: the playerClick, playerJoin, playerLeave events, and the getOnlinePlayers function. Everything else can be made to work for any entity (1 event and 10 functions). The getPlayers functions currently return the names of players, but could become getEntities and return entity UUIDs instead. The functions migrated from the EntityAPI like getName(UUID) or getNBT(UUID) would have checks within them to see if the entity is still within the range of the Entity Detector so that these can't be abused to get information about entities anywhere.
I think a getEntityAPI on the Environment Detector is a possibility, although you'd then have almost duplicate functions across peripherals (such as EnvironmentDetector.EntityAPI.getPos and PlayerDetector.getPlayerPos). An Entity Detector could replace all functions that get entity data in all other peripherals.
Merging the two peripherals is an interesting idea, but I don't really like it. Merging them would create one peripheral with maybe too many functions imo which wouldn't keep it simple
Developing the Player Detector and the Environment Detector separately would be my way to go. I would simply accept the duplicates, as there are apparently only 2 functions (getPlayerPos and getOnlinePlayers)
Keeping the player related functions (isPlayerInX... getPlayersInX...) and events to the player detector would make it simpler for the user imo to just have player related functions in one dedicated block while there would be the more advanced entity api in the Environment Detector
Sweet. I'm going away for a week but I'll work on that when I get back
Greetings @lonevox
Do you have any plans on continuing this?
Greetings @lonevox Do you have any plans on continuing this?
I took quite a hiatus from making my pack which is why I completely disappeared, sorry about that. I'll try getting the PR ready with the discussed changes within the next few days. I'll still be working on adding it to 1.20.1.
@SirEndii I'm having trouble exposing the EntityAPI class as a table through the EnvironmentDetectorPeripheral like you suggested. Here's what I assumed would work, but doesn't:
@LuaFunction(mainThread = true)
public final EntityAPI getEntityAPI() {
return new EntityAPI();
}
It successfully returns a table, but the table is empty, even though there are 6 LuaFunctions in EntityAPI. Any ideas?
Can you check the log?
CC throws some debug messages when it's unable to parse functions
I'm pretty sure the function arguments and return types are fine, especially as I was using these methods fine when using EntityAPI as a CC API.
Even this for the EntityAPI isn't working:
public class EntityAPI {
@LuaFunction
public final String test() {
return "Hello World!";
}
}
For testing this in Lua I have:
local entityAPI = environment.getEntityAPI();
print(entityAPI)
print(entityAPI.test())
This results in:
table: 1f87cadd
/scan.lua:10: attempt to call field 'test' (a nil value)
Hopefully I'm doing something simple in the wrong way.
I think he solved the conflicts a bit wrong there
Thanks for your contribution! ❤️
Everything looks good now. However, I'll still do more client tests before merging it, just to be responsible ...
I may finish testing this today, or tomorrow.
Still don't know if querying with UUID is a good move, because this change forces users to do more main-thread invokes, which increases the response time and may decrease the ease of use.
It does decrease ease of use in that you need to manage UUIDs in Lua, but it allows for the EnvironmentDetectorPeripheral, PlayerDetectorPeripheral, and AutomataEntityHandPlugin to use a shared API for getting entity info instead of having duplicate functions.
Ok, so when an entity no longer exists, the entity API will throw a Java error. We should not expect that; instead we should return nil, "ENTITY_NOT_EXISTS" or some other message
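As a minimal sketch of that "return nil plus message" convention (the registry map and the array-based return shape here are assumptions for illustration, not AP's actual API; they mirror Lua's `value, err` multi-return style):

```java
import java.util.Map;
import java.util.UUID;

public class EntityLookup {
    /**
     * Returns {entityData} on success, or {null, "ENTITY_NOT_EXISTS"} when the
     * UUID no longer resolves to an entity, instead of throwing a Java error.
     */
    public static Object[] getEntityData(Map<UUID, Object> registry, UUID id) {
        Object entity = registry.get(id);
        if (entity == null) {
            return new Object[]{null, "ENTITY_NOT_EXISTS"};
        }
        return new Object[]{entity};
    }

    public static void main(String[] args) {
        // Empty registry: the lookup fails gracefully with a message.
        Object[] result = getEntityData(Map.of(), UUID.randomUUID());
        System.out.println(result[1]); // ENTITY_NOT_EXISTS
    }
}
```

On the Lua side this would surface as `local data, err = entity.getData(uuid)`, letting scripts handle despawned entities without a crash.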
Hm, and with the entity API you can get any entity's data once you know its UUID, no matter where it is. That is probably a little too powerful, may be abused, and not what we expect.
I'd ask @SirEndii's opinion for that.
I'm doing slightly weird validation by using Pair, but it is better than before.
I'd ask @SirEndii's opinion for that.
I would prefer your option one, seems to be the simplest, and it would follow the same logic as with other functions and peripherals
I'm currently opting for having EntityAPI be constructed by EnvironmentDetectorPeripheral::getEntityAPI like so (it used to return a static EntityAPI instance):
@LuaFunction
public final EntityAPI getEntityAPI() {
return new EntityAPI(getPeripheralOwner());
}
This means that within the EntityAPI it can check the distance from the entity to the peripheral whenever a function is called. However, I'm unsure of what range the EntityAPI should have, and how to handle range. It could just be a flat max distance value such as 5 blocks, or it could be a SphereOperation. If the right way to go is a SphereOperation, then here's some options I'm considering:
I could add a new SphereOperation value for each EntityAPI function (as EntityAPI::getPos should probably be cheaper than EntityAPI::getNBT based on how much less data is returned) but this would pollute SphereOperation, as it currently has 2 values and I would be adding 6 more.
I could add just one SphereOperation value called GET_ENTITY_DATA and have the free radius the same as the cost radius so it is a free operation. The idea behind it being free is the player already needed to run EnvironmentDetectorPeripheral::scanEntities to get the entity UUIDs, however this might as well be free because you can now use those UUIDs to call EntityAPI functions for that entity forever for no extra cost.
A compromise is to add 2 SphereOperation values:
GET_SMALL_ENTITY_DATA (for getName, getPos, getBoundingBox, and getPersistentData).
GET_LARGE_ENTITY_DATA (for getNBT and getData).
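A rough sketch of that compromise as a standalone enum (the value names come from the list above, but the radii and the cost are made-up placeholders, not AP's actual SphereOperation implementation):

```java
public enum EntityDataOperation {
    // Cheap calls: getName, getPos, getBoundingBox, getPersistentData
    GET_SMALL_ENTITY_DATA(16, 0),
    // Expensive calls: getNBT, getData
    GET_LARGE_ENTITY_DATA(8, 100);

    private final int maxRadius;
    private final int cost;

    EntityDataOperation(int maxRadius, int cost) {
        this.maxRadius = maxRadius;
        this.cost = cost;
    }

    /** True when the entity is close enough to the peripheral for this operation. */
    public boolean inRange(double distance) {
        return distance <= maxRadius;
    }

    public int cost() {
        return cost;
    }

    public static void main(String[] args) {
        // An entity 10 blocks away is in range for small data, but not large data.
        System.out.println(GET_SMALL_ENTITY_DATA.inRange(10.0));
        System.out.println(GET_LARGE_ENTITY_DATA.inRange(10.0));
    }
}
```

The point is just that a per-function-group operation keeps SphereOperation to two new values while still letting the large-data calls be both shorter-ranged and costlier.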
Or SphereOperation might not be the way to do this at all. I'd like to know what you guys think.
There is also a whole other issue of the fact that a malicious player could get their hands on a victims EntityAPI instance (since they're instanced by EnvironmentDetectorPeripheral now) and call expensive functions on it from anywhere. This lends itself to unfortunately ditching the EntityAPI and moving the functions to EnvironmentDetectorPeripheral, which is becoming a more attractive option.
Sorry for not answering, I completely forgot this PR
I would prefer the use of SphereOperation, your second option sounds more reasonable to me
There is also a whole other issue of the fact that a malicious player could get their hands on a victims EntityAPI instance (since they're instanced by EnvironmentDetectorPeripheral now)
If the function creates a new EntityAPI for every getEntityAPI() call, wouldn't a user need to save the entity API in a local field so the cooldowns and other stuff are stored? And if a user always creates a new EntityAPI instance via the function, wouldn't they also create new cooldowns/reset them?
Maybe you're right, and the best way is to move all the functions into the EnvDetector itself. I at least would now prefer that
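That per-call-instance problem can be illustrated with a tiny sketch: if the peripheral caches a single instance, per-instance state (cooldowns etc.) survives across calls, whereas a fresh instance per call would reset it. All class and field names here are made up for illustration:

```java
public class EnvironmentDetectorSketch {
    public static class EntityAPI {
        private int calls = 0;
        // Stand-in for stateful cooldown tracking on the API instance.
        public int use() { return ++calls; }
    }

    private EntityAPI entityApi;

    /** Lazily creates one EntityAPI and returns it for the peripheral's lifetime. */
    public EntityAPI getEntityAPI() {
        if (entityApi == null) {
            entityApi = new EntityAPI();
        }
        return entityApi;
    }

    public static void main(String[] args) {
        EnvironmentDetectorSketch peripheral = new EnvironmentDetectorSketch();
        peripheral.getEntityAPI().use();
        // State persists: the second call sees the first one's counter.
        System.out.println(peripheral.getEntityAPI().use()); // 2
    }
}
```

With `return new EntityAPI(getPeripheralOwner());` in the accessor instead, the counter would restart at 1 on every call, which is exactly the cooldown-reset concern above.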
Greetings @lonevox
I would just like to know how things are going for you in terms of the time you have and what else you are planning for the PR
I just have an idea: you should not try to use the UUID to index entity data at all. You can just return a table that includes methods taking no arguments for extra information, such as:
{
id = x,
uuid = xx,
name = xxx,
type = xxxx,
getAdvancedInfo = function() return <wrappedJavaMap> end,
getNBT = function() return <convertedNBTData> end,
}
|
gharchive/pull-request
| 2024-01-13T02:14:23 |
2025-04-01T04:55:12.136195
|
{
"authors": [
"SirEndii",
"lonevox",
"tehgreatdoge",
"zyxkad"
],
"repo": "IntelligenceModding/AdvancedPeripherals",
"url": "https://github.com/IntelligenceModding/AdvancedPeripherals/pull/549",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1205980638
|
Fixed alignment of checkboxes and label in refinement component
This PR aligns labels and checkboxes of the refinement component.
For comparison see the rendered result before: https://play.tailwindcss.com/ByvL8yGQ6b
and after: https://play.tailwindcss.com/1UofU3K5ac
Awesome. Can you switch indentation to tabs?
Done
|
gharchive/pull-request
| 2022-04-16T01:51:59 |
2025-04-01T04:55:12.143391
|
{
"authors": [
"inxilpro",
"martin-ro"
],
"repo": "InterNACHI/blade-alpine-instantsearch",
"url": "https://github.com/InterNACHI/blade-alpine-instantsearch/pull/3",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
321300798
|
Establish and document versioning and release policies
Once the library starts being published to a public Maven repository (see issue #6), some versioning and branching/release strategies should be established, documented and enforced in the repository so that the project can evolve safely, while being used in production by the public.
As far as versioning is concerned, Semantic Versioning (https://semver.org/) is the best choice IMO.
And when it comes to the branching/release strategy, a simple yet effective one (especially in concert with semantic versioning) is to maintain a master branch holding the newest, working snapshot as well as release branches for every major (or minor) version (ex. release/1.x, release/2.x). When such a branch is updated, artifacts built from it can be safely published to a public Maven repository. This kind of branching policy also facilitates easier backporting of critical changes/fixes to legacy versions, which can prove handy for this project.
Either way, it would greatly help developers who would like to make their systems compliant with the new regulations by using this library, if both versioning and branching/release policies were established and clearly communicated.
Any thoughts/updates on that?
+1 for Semantic versioning and release branches. Also, tag for each published release.
Done
|
gharchive/issue
| 2018-05-08T18:40:56 |
2025-04-01T04:55:12.146038
|
{
"authors": [
"lanusau",
"turu"
],
"repo": "InteractiveAdvertisingBureau/Consent-String-SDK-Java",
"url": "https://github.com/InteractiveAdvertisingBureau/Consent-String-SDK-Java/issues/7",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2238920823
|
Fix load single safetensor file
https://huggingface.co/deepseek-ai/deepseek-vl-1.3b-chat/tree/main
@zhulinJulia24 May I ask why the PR test failed?
@zhulinJulia24 May I ask why the PR test failed? The memorized asserts are still not robust enough; let me think about how to optimize them.
AssertionError: 成都 shouldn't exist in:当然可以!以下是一些美食的介绍:\n\n1. 麻婆豆腐:这是四川省成都市的传统名菜,主要原料是豆腐和牛肉末,口感麻辣,是四川菜系中的代表之一。\n\n2. 小笼包:这是上海的特色美食,外皮薄而有弹性,内馅是鲜美的猪肉,汁水丰富,味道十分鲜美。\n\n3. 北京烤鸭:这是北京的传统名菜,以鸭子为原料,经过独特的烤制工艺,使得烤鸭皮脆肉嫩,香味四溢。\n\n4. 叉烧:这是广东省的传统名菜,主要原料是猪肉,经过腌制和烤制,口感香甜,肉质鲜嫩。\n\n5. 担担面:这是四川省成都市的传统小吃,主要原料是面条和肉末,口感麻辣,是四川菜系中的代表之一。\n\n6. 水饺:这是中国传统的小吃,主要原料是面粉和馅料,口感鲜美,营养丰富,是中国人喜爱的传统美食之一。\n\n以上这些美食都有着各自独特的风味和口感,如果你有机会品尝,一定不要错过哦!
Doesn't the sentence contain "四川省成都市"?
|
gharchive/pull-request
| 2024-04-12T02:27:42 |
2025-04-01T04:55:12.175592
|
{
"authors": [
"AllentDan",
"lvhan028",
"zhulinJulia24"
],
"repo": "InternLM/lmdeploy",
"url": "https://github.com/InternLM/lmdeploy/pull/1427",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2044839963
|
TEST PR: Update README.md
Description
Checklist
[ ] Commit sequence broadly makes sense and commits have useful messages
[ ] New tests are added if needed and existing tests are updated
[ ] When applicable, versions are updated in .cabal and CHANGELOG.md files according to the
versioning process.
[ ] The version bounds in .cabal files for all affected packages are updated. If you change the bounds in a cabal file, that package itself must have a version increase. (See RELEASING.md)
[ ] All visible changes are prepended to the latest section of a CHANGELOG.md for the affected packages. New section is never added with the code changes. (See RELEASING.md)
[x] Code is formatted with fourmolu (use scripts/fourmolize.sh)
[x] Cabal files are formatted (use scripts/cabal-format.sh)
[x] hie.yaml has been updated (use scripts/gen-hie.sh)
[ ] Self-reviewed the diff
:heavy_check_mark: Test passed :slightly_smiling_face:
|
gharchive/pull-request
| 2023-12-16T16:08:36 |
2025-04-01T04:55:12.208093
|
{
"authors": [
"angerman",
"lehins"
],
"repo": "IntersectMBO/cardano-ledger",
"url": "https://github.com/IntersectMBO/cardano-ledger/pull/3937",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1038618318
|
initial updates to network documentation
Mainly tidying and adding some additional context for the multiplexer
cc @njd42 @coot what about this PR, should it be closed?
It might be still relevant, we haven't changed anything fundamental in the connection manager since it was designed.
Merged in #5001.
|
gharchive/pull-request
| 2021-10-28T14:55:42 |
2025-04-01T04:55:12.209680
|
{
"authors": [
"bolt12",
"coot",
"njd42"
],
"repo": "IntersectMBO/ouroboros-network",
"url": "https://github.com/IntersectMBO/ouroboros-network/pull/3464",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
222163587
|
Image rotation
Hi, I am simply trying to resize and image, everything is working fine but don't know why sometimes I get rotated (90 deg) images.
$height = 500;
$width = 500;
$img = Image::make('uploads/user00012.jpg');
$img->resize($height, $width);
$img->save('uploads/thumbnail_user00012.jpg');
Am I doing something wrong?
Thanks
Some photos include Exif data that indicates the orientation.
Try applying orientate().
|
gharchive/issue
| 2017-04-17T16:08:30 |
2025-04-01T04:55:12.211533
|
{
"authors": [
"jagroop",
"olivervogel"
],
"repo": "Intervention/image",
"url": "https://github.com/Intervention/image/issues/714",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
2070506381
|
feat!: devOnly fix unbuild configs
Title says everything.
There are other potential unbuild configs that could see some improvements, will dig into them once I better understand how nuxt/module-builder works.
browser crash made me not realize I was pushing to the wrong branch
|
gharchive/pull-request
| 2024-01-08T13:52:07 |
2025-04-01T04:55:12.212653
|
{
"authors": [
"Sandros94"
],
"repo": "Intevel/nuxt-directus",
"url": "https://github.com/Intevel/nuxt-directus/pull/230",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2758690014
|
🛑 all-rf.no is down
In 4ee7d95, all-rf.no (https://all-rf.no) was down:
HTTP code: 0
Response time: 0 ms
Resolved: all-rf.no is back up in c40304b after 33 minutes.
|
gharchive/issue
| 2024-12-25T10:00:06 |
2025-04-01T04:55:12.216113
|
{
"authors": [
"KindCoder-no"
],
"repo": "Intus-AS/Types-status",
"url": "https://github.com/Intus-AS/Types-status/issues/5313",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1419526044
|
Test for the editable page title when a random number of spaces is entered at the front of the title, like '    xxxxx'
There is a special edge case where the user may enter a random number of spaces before they enter the core title.
Like '    translation'
To build a random number of spaces, I first initialize a random number. Then I searched Stack Overflow for how to create a string with multiple spaces.
Source can be found here: https://stackoverflow.com/questions/33539797/how-to-create-string-with-multiple-spaces-in-javascript
component.setTitle( Array(num).fill('\xa0').join('') + 'new editable Page');
describe("DemoComponent-pagetitle-edittwice", () => {
let component: DemoComponent;
let fixture: ComponentFixture<DemoComponent>;
let num:number = Math.ceil(Math.random()*10);
beforeEach(async () => {
await TestBed.configureTestingModule({
imports: [MaterialModule],
declarations: [DemoComponent],
schemas: [CUSTOM_ELEMENTS_SCHEMA],
}).compileComponents();
fixture = TestBed.createComponent(DemoComponent);
component = fixture.componentInstance;
component.setTitle( Array(num).fill('\xa0').join('') + 'new editable Page');
fixture.detectChanges();
});
it("test editable page title after random times of edit ", () => {
const { debugElement } = fixture;
const title1 = debugElement.query(By.css('#spanid')).nativeElement;
expect(title1.textContent).toContain('\xa0'.repeat(num) + 'new editable Page');
});
});
|
gharchive/issue
| 2022-10-22T22:02:46 |
2025-04-01T04:55:12.242707
|
{
"authors": [
"crystaljesica"
],
"repo": "Iostream3100/Studio-Web",
"url": "https://github.com/Iostream3100/Studio-Web/issues/20",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
952277189
|
Nerf speed boots
Decrease speed, decrease player health further
Change the generic max health back. That'll kill them for wearing it. You just made the severity of the already harsh health penalty more severe, with less reward
Generic max health is per half heart afaik
|
gharchive/pull-request
| 2021-07-25T13:41:13 |
2025-04-01T04:55:12.261837
|
{
"authors": [
"CocoTheOwner",
"NextdoorPsycho"
],
"repo": "IrisDimensions/overworld",
"url": "https://github.com/IrisDimensions/overworld/pull/167",
"license": "Unlicense",
"license_type": "permissive",
"license_source": "github-api"
}
|
327511750
|
ctypes\test\test_prototypes.py is failing
.F
Unhandled Exception: System.Reflection.TargetInvocationException: Exception has been thrown by the target of an invocation. ---> System.AccessViolationException: Attempted to read or write protected memory. This is often an indication that other memory is corrupt.
at Microsoft.Win32.Win32Native.WideCharToMultiByte(UInt32 cp, UInt32 flags, Char* pwzSource, Int32 cchSource, Byte* pbDestBuffer, Int32 cbDestBuffer, IntPtr null1, IntPtr null2)
at System.String.ConvertToAnsi(Byte* pbNativeBuffer, Int32 cbNativeBuffer, Boolean fBestFit, Boolean fThrowOnUnmappableChar)
at System.Runtime.InteropServices.Marshal.StringToHGlobalAnsi(String s)
at InteropInvoker(IntPtr , Object , Object[] )
at CallSite.Target(Closure , CallSite , Object , Object )
at Microsoft.Scripting.Interpreter.DynamicInstruction`3.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Instructions\DynamicInstructions.Generated.cs:line 163
at Microsoft.Scripting.Interpreter.Interpreter.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Interpreter.cs:line 105
at Microsoft.Scripting.Interpreter.LightLambda.Run4[T0,T1,T2,T3,TRet](T0 arg0, T1 arg1, T2 arg2, T3 arg3) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\LightLambda.Generated.cs:line 165
at System.Dynamic.UpdateDelegates.UpdateAndExecute3[T0,T1,T2,TRet](CallSite site, T0 arg0, T1 arg1, T2 arg2)
at Microsoft.Scripting.Interpreter.FuncCallInstruction`6.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Instructions\CallInstruction.Generated.cs:line 784
at Microsoft.Scripting.Interpreter.Interpreter.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Interpreter.cs:line 105
at Microsoft.Scripting.Interpreter.LightLambda.Run4[T0,T1,T2,T3,TRet](T0 arg0, T1 arg1, T2 arg2, T3 arg3) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\LightLambda.Generated.cs:line 165
at IronPython.Compiler.Ast.CallExpression.Invoke1Instruction.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\IronPython\Compiler\Ast\CallExpression.cs:line 267
at Microsoft.Scripting.Interpreter.Interpreter.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Interpreter.cs:line 105
at Microsoft.Scripting.Interpreter.LightLambda.Run2[T0,T1,TRet](T0 arg0, T1 arg1) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\LightLambda.Generated.cs:line 97
at IronPython.Runtime.Method.MethodBinding.SelfTarget(CallSite site, CodeContext context, Object target) in C:\Users\USER\Code\_ipy\ironpython2\Src\IronPython\Runtime\Method.Generated.cs:line 130
at Microsoft.Scripting.Interpreter.DynamicInstruction`3.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Instructions\DynamicInstructions.Generated.cs:line 163
at Microsoft.Scripting.Interpreter.Interpreter.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Interpreter.cs:line 105
at Microsoft.Scripting.Interpreter.LightLambda.Run3[T0,T1,T2,TRet](T0 arg0, T1 arg1, T2 arg2) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\LightLambda.Generated.cs:line 130
at Microsoft.Scripting.Interpreter.FuncCallInstruction`5.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Instructions\CallInstruction.Generated.cs:line 760
at Microsoft.Scripting.Interpreter.Interpreter.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Interpreter.cs:line 105
at Microsoft.Scripting.Interpreter.LightLambda.Run6[T0,T1,T2,T3,T4,T5,TRet](T0 arg0, T1 arg1, T2 arg2, T3 arg3, T4 arg4, T5 arg5) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\LightLambda.Generated.cs:line 241
at Microsoft.Scripting.Interpreter.DynamicInstruction`6.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Instructions\DynamicInstructions.Generated.cs:line 238
at Microsoft.Scripting.Interpreter.Interpreter.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Interpreter.cs:line 105
at Microsoft.Scripting.Interpreter.LightLambda.Run5[T0,T1,T2,T3,T4,TRet](T0 arg0, T1 arg1, T2 arg2, T3 arg3, T4 arg4) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\LightLambda.Generated.cs:line 202
at Microsoft.Scripting.Interpreter.DynamicInstruction`5.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Instructions\DynamicInstructions.Generated.cs:line 213
at Microsoft.Scripting.Interpreter.Interpreter.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Interpreter.cs:line 105
at Microsoft.Scripting.Interpreter.LightLambda.Run4[T0,T1,T2,T3,TRet](T0 arg0, T1 arg1, T2 arg2, T3 arg3) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\LightLambda.Generated.cs:line 165
at Microsoft.Scripting.Interpreter.FuncCallInstruction`6.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Instructions\CallInstruction.Generated.cs:line 784
at Microsoft.Scripting.Interpreter.Interpreter.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Interpreter.cs:line 105
at Microsoft.Scripting.Interpreter.LightLambda.Run5[T0,T1,T2,T3,T4,TRet](T0 arg0, T1 arg1, T2 arg2, T3 arg3, T4 arg4) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\LightLambda.Generated.cs:line 202
at System.Dynamic.UpdateDelegates.UpdateAndExecute4[T0,T1,T2,T3,TRet](CallSite site, T0 arg0, T1 arg1, T2 arg2, T3 arg3)
at IronPython.Runtime.Method.MethodBinding`1.SelfTarget(CallSite site, CodeContext context, Object target, T0 arg0) in C:\Users\USER\Code\_ipy\ironpython2\Src\IronPython\Runtime\Method.Generated.cs:line 161
at CallSite.Target(Closure , CallSite , CodeContext , Object , Object )
at System.Dynamic.UpdateDelegates.UpdateAndExecute3[T0,T1,T2,TRet](CallSite site, T0 arg0, T1 arg1, T2 arg2)
at IronPython.Compiler.Ast.CallExpression.Invoke1Instruction.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\IronPython\Compiler\Ast\CallExpression.cs:line 267
at Microsoft.Scripting.Interpreter.Interpreter.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Interpreter.cs:line 105
at Microsoft.Scripting.Interpreter.LightLambda.Run4[T0,T1,T2,T3,TRet](T0 arg0, T1 arg1, T2 arg2, T3 arg3) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\LightLambda.Generated.cs:line 165
at Microsoft.Scripting.Interpreter.FuncCallInstruction`6.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Instructions\CallInstruction.Generated.cs:line 784
at Microsoft.Scripting.Interpreter.Interpreter.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Interpreter.cs:line 105
at Microsoft.Scripting.Interpreter.LightLambda.Run6[T0,T1,T2,T3,T4,T5,TRet](T0 arg0, T1 arg1, T2 arg2, T3 arg3, T4 arg4, T5 arg5) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\LightLambda.Generated.cs:line 241
at System.Dynamic.UpdateDelegates.UpdateAndExecute5[T0,T1,T2,T3,T4,TRet](CallSite site, T0 arg0, T1 arg1, T2 arg2, T3 arg3, T4 arg4)
at Microsoft.Scripting.Interpreter.FuncCallInstruction`8.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Instructions\CallInstruction.Generated.cs:line 832
at Microsoft.Scripting.Interpreter.Interpreter.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Interpreter.cs:line 105
at Microsoft.Scripting.Interpreter.LightLambda.Run6[T0,T1,T2,T3,T4,T5,TRet](T0 arg0, T1 arg1, T2 arg2, T3 arg3, T4 arg4, T5 arg5) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\LightLambda.Generated.cs:line 241
at Microsoft.Scripting.Interpreter.DynamicInstruction`6.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Instructions\DynamicInstructions.Generated.cs:line 238
at Microsoft.Scripting.Interpreter.Interpreter.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Interpreter.cs:line 105
at Microsoft.Scripting.Interpreter.LightLambda.Run5[T0,T1,T2,T3,T4,TRet](T0 arg0, T1 arg1, T2 arg2, T3 arg3, T4 arg4) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\LightLambda.Generated.cs:line 202
at Microsoft.Scripting.Interpreter.DynamicInstruction`5.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Instructions\DynamicInstructions.Generated.cs:line 213
at Microsoft.Scripting.Interpreter.Interpreter.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Interpreter.cs:line 105
at Microsoft.Scripting.Interpreter.LightLambda.Run4[T0,T1,T2,T3,TRet](T0 arg0, T1 arg1, T2 arg2, T3 arg3) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\LightLambda.Generated.cs:line 165
at Microsoft.Scripting.Interpreter.FuncCallInstruction`6.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Instructions\CallInstruction.Generated.cs:line 784
at Microsoft.Scripting.Interpreter.Interpreter.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Interpreter.cs:line 105
at Microsoft.Scripting.Interpreter.LightLambda.Run5[T0,T1,T2,T3,T4,TRet](T0 arg0, T1 arg1, T2 arg2, T3 arg3, T4 arg4) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\LightLambda.Generated.cs:line 202
at System.Dynamic.UpdateDelegates.UpdateAndExecute4[T0,T1,T2,T3,TRet](CallSite site, T0 arg0, T1 arg1, T2 arg2, T3 arg3)
at IronPython.Runtime.Method.MethodBinding`1.SelfTarget(CallSite site, CodeContext context, Object target, T0 arg0) in C:\Users\USER\Code\_ipy\ironpython2\Src\IronPython\Runtime\Method.Generated.cs:line 161
at CallSite.Target(Closure , CallSite , CodeContext , Object , Object )
at System.Dynamic.UpdateDelegates.UpdateAndExecute3[T0,T1,T2,TRet](CallSite site, T0 arg0, T1 arg1, T2 arg2)
at Microsoft.Scripting.Interpreter.FuncCallInstruction`6.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Instructions\CallInstruction.Generated.cs:line 784
at Microsoft.Scripting.Interpreter.Interpreter.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Interpreter.cs:line 105
at Microsoft.Scripting.Interpreter.LightLambda.Run4[T0,T1,T2,T3,TRet](T0 arg0, T1 arg1, T2 arg2, T3 arg3) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\LightLambda.Generated.cs:line 165
at IronPython.Compiler.Ast.CallExpression.Invoke1Instruction.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\IronPython\Compiler\Ast\CallExpression.cs:line 267
at Microsoft.Scripting.Interpreter.Interpreter.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Interpreter.cs:line 105
at Microsoft.Scripting.Interpreter.LightLambda.Run4[T0,T1,T2,T3,TRet](T0 arg0, T1 arg1, T2 arg2, T3 arg3) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\LightLambda.Generated.cs:line 165
at IronPython.Compiler.PythonCallTargets.OriginalCallTarget3(PythonFunction function, Object arg0, Object arg1, Object arg2) in C:\Users\USER\Code\_ipy\ironpython2\Src\IronPython\Compiler\PythonCallTargets.cs:line 52
at Microsoft.Scripting.Interpreter.FuncCallInstruction`6.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Instructions\CallInstruction.Generated.cs:line 784
at Microsoft.Scripting.Interpreter.Interpreter.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Interpreter.cs:line 105
at Microsoft.Scripting.Interpreter.LightLambda.Run6[T0,T1,T2,T3,T4,T5,TRet](T0 arg0, T1 arg1, T2 arg2, T3 arg3, T4 arg4, T5 arg5) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\LightLambda.Generated.cs:line 241
at System.Dynamic.UpdateDelegates.UpdateAndExecute5[T0,T1,T2,T3,T4,TRet](CallSite site, T0 arg0, T1 arg1, T2 arg2, T3 arg3, T4 arg4)
at Microsoft.Scripting.Interpreter.DynamicInstruction`6.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Instructions\DynamicInstructions.Generated.cs:line 238
at Microsoft.Scripting.Interpreter.Interpreter.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Interpreter.cs:line 105
at Microsoft.Scripting.Interpreter.LightLambda.Run5[T0,T1,T2,T3,T4,TRet](T0 arg0, T1 arg1, T2 arg2, T3 arg3, T4 arg4) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\LightLambda.Generated.cs:line 202
at System.Dynamic.UpdateDelegates.UpdateAndExecute4[T0,T1,T2,T3,TRet](CallSite site, T0 arg0, T1 arg1, T2 arg2, T3 arg3)
at Microsoft.Scripting.Interpreter.DynamicInstruction`5.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Instructions\DynamicInstructions.Generated.cs:line 213
at Microsoft.Scripting.Interpreter.Interpreter.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Interpreter.cs:line 105
at Microsoft.Scripting.Interpreter.LightLambda.Run4[T0,T1,T2,T3,TRet](T0 arg0, T1 arg1, T2 arg2, T3 arg3) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\LightLambda.Generated.cs:line 165
at IronPython.Compiler.PythonCallTargets.OriginalCallTarget3(PythonFunction function, Object arg0, Object arg1, Object arg2) in C:\Users\USER\Code\_ipy\ironpython2\Src\IronPython\Compiler\PythonCallTargets.cs:line 52
at Microsoft.Scripting.Interpreter.FuncCallInstruction`6.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Instructions\CallInstruction.Generated.cs:line 784
at Microsoft.Scripting.Interpreter.Interpreter.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Interpreter.cs:line 105
at Microsoft.Scripting.Interpreter.LightLambda.Run5[T0,T1,T2,T3,T4,TRet](T0 arg0, T1 arg1, T2 arg2, T3 arg3, T4 arg4) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\LightLambda.Generated.cs:line 202
at System.Dynamic.UpdateDelegates.UpdateAndExecute4[T0,T1,T2,T3,TRet](CallSite site, T0 arg0, T1 arg1, T2 arg2, T3 arg3)
at IronPython.Runtime.Method.MethodBinding`1.SelfTarget(CallSite site, CodeContext context, Object target, T0 arg0) in C:\Users\USER\Code\_ipy\ironpython2\Src\IronPython\Runtime\Method.Generated.cs:line 161
at System.Dynamic.UpdateDelegates.UpdateAndExecute3[T0,T1,T2,TRet](CallSite site, T0 arg0, T1 arg1, T2 arg2)
at CallSite.Target(Closure , CallSite , CodeContext , Object , Object )
at System.Dynamic.UpdateDelegates.UpdateAndExecute3[T0,T1,T2,TRet](CallSite site, T0 arg0, T1 arg1, T2 arg2)
at IronPython.Compiler.Ast.CallExpression.Invoke1Instruction.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\IronPython\Compiler\Ast\CallExpression.cs:line 267
at Microsoft.Scripting.Interpreter.Interpreter.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Interpreter.cs:line 105
at Microsoft.Scripting.Interpreter.LightLambda.Run3[T0,T1,T2,TRet](T0 arg0, T1 arg1, T2 arg2) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\LightLambda.Generated.cs:line 130
at IronPython.Runtime.Method.MethodBinding`1.SelfTarget(CallSite site, CodeContext context, Object target, T0 arg0) in C:\Users\USER\Code\_ipy\ironpython2\Src\IronPython\Runtime\Method.Generated.cs:line 161
at System.Dynamic.UpdateDelegates.UpdateAndExecute3[T0,T1,T2,TRet](CallSite site, T0 arg0, T1 arg1, T2 arg2)
at IronPython.Compiler.Ast.CallExpression.Invoke1Instruction.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\IronPython\Compiler\Ast\CallExpression.cs:line 267
at Microsoft.Scripting.Interpreter.Interpreter.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Interpreter.cs:line 105
at Microsoft.Scripting.Interpreter.LightLambda.Run2[T0,T1,TRet](T0 arg0, T1 arg1) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\LightLambda.Generated.cs:line 97
at System.Dynamic.UpdateDelegates.UpdateAndExecute3[T0,T1,T2,TRet](CallSite site, T0 arg0, T1 arg1, T2 arg2)
at IronPython.Runtime.Method.MethodBinding.SelfTarget(CallSite site, CodeContext context, Object target) in C:\Users\USER\Code\_ipy\ironpython2\Src\IronPython\Runtime\Method.Generated.cs:line 130
at System.Dynamic.UpdateDelegates.UpdateAndExecute2[T0,T1,TRet](CallSite site, T0 arg0, T1 arg1)
at IronPython.Compiler.Ast.CallExpression.Invoke0Instruction.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\IronPython\Compiler\Ast\CallExpression.cs:line 246
at Microsoft.Scripting.Interpreter.Interpreter.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Interpreter.cs:line 105
at Microsoft.Scripting.Interpreter.LightLambda.Run12[T0,T1,T2,T3,T4,T5,T6,T7,T8,T9,T10,T11,TRet](T0 arg0, T1 arg1, T2 arg2, T3 arg3, T4 arg4, T5 arg5, T6 arg6, T7 arg7, T8 arg8, T9 arg9, T10 arg10, T11 arg11) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\LightLambda.Generated.cs:line 517
at IronPython.Compiler.PythonCallTargets.OriginalCallTarget11(PythonFunction function, Object arg0, Object arg1, Object arg2, Object arg3, Object arg4, Object arg5, Object arg6, Object arg7, Object arg8, Object arg9, Object arg10) in C:\Users\USER\Code\_ipy\ironpython2\Src\IronPython\Compiler\PythonCallTargets.cs:line 92
--- End of inner exception stack trace ---
at System.RuntimeMethodHandle.InvokeMethod(Object target, Object[] arguments, Signature sig, Boolean constructor)
at System.Reflection.RuntimeMethodInfo.UnsafeInvokeInternal(Object obj, Object[] parameters, Object[] arguments)
at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture)
at Microsoft.Scripting.Interpreter.MethodInfoCallInstruction.InvokeWorker(Object[] args) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Instructions\CallInstruction.cs:line 262
at Microsoft.Scripting.Interpreter.MethodInfoCallInstruction.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Instructions\CallInstruction.cs:line 279
at Microsoft.Scripting.Interpreter.Interpreter.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Interpreter.cs:line 105
at Microsoft.Scripting.Interpreter.LightLambda.Run4[T0,T1,T2,T3,TRet](T0 arg0, T1 arg1, T2 arg2, T3 arg3) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\LightLambda.Generated.cs:line 165
at System.Dynamic.UpdateDelegates.UpdateAndExecute3[T0,T1,T2,TRet](CallSite site, T0 arg0, T1 arg1, T2 arg2)
at IronPython.Runtime.Types.LateBoundInitBinder.FastInitSite.CallTarget(CallSite site, CodeContext context, Object inst) in C:\Users\USER\Code\_ipy\ironpython2\Src\IronPython\Runtime\Types\PythonType.Generated.cs:line 836
at System.Dynamic.UpdateDelegates.UpdateAndExecute2[T0,T1,TRet](CallSite site, T0 arg0, T1 arg1)
at IronPython.Runtime.Types.PythonType.FastTypeSite.CallTarget(CallSite site, CodeContext context, Object type) in C:\Users\USER\Code\_ipy\ironpython2\Src\IronPython\Runtime\Types\PythonType.Generated.cs:line 267
at System.Dynamic.UpdateDelegates.UpdateAndExecute2[T0,T1,TRet](CallSite site, T0 arg0, T1 arg1)
at IronPython.Compiler.Ast.CallExpression.Invoke0Instruction.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\IronPython\Compiler\Ast\CallExpression.cs:line 246
at Microsoft.Scripting.Interpreter.Interpreter.Run(InterpretedFrame frame) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\Interpreter.cs:line 105
at Microsoft.Scripting.Interpreter.LightLambda.Run1[T0,TRet](T0 arg0) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Interpreter\LightLambda.Generated.cs:line 66
at IronPython.Compiler.RuntimeScriptCode.InvokeTarget(Scope scope) in C:\Users\USER\Code\_ipy\ironpython2\Src\IronPython\Compiler\RuntimeScriptCode.cs:line 75
at IronPython.Hosting.PythonCommandLine.RunFileWorker(String fileName) in C:\Users\USER\Code\_ipy\ironpython2\Src\IronPython\Hosting\PythonCommandLine.cs:line 552
at IronPython.Hosting.PythonCommandLine.RunFile(String fileName) in C:\Users\USER\Code\_ipy\ironpython2\Src\IronPython\Hosting\PythonCommandLine.cs:line 537
at Microsoft.Scripting.Hosting.Shell.CommandLine.Run() in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Hosting\Shell\CommandLine.cs:line 142
at IronPython.Hosting.PythonCommandLine.Run() in C:\Users\USER\Code\_ipy\ironpython2\Src\IronPython\Hosting\PythonCommandLine.cs:line 134
at Microsoft.Scripting.Hosting.Shell.CommandLine.Run(ScriptEngine engine, IConsole console, ConsoleOptions options) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Hosting\Shell\CommandLine.cs:line 110
at Microsoft.Scripting.Hosting.Shell.ConsoleHost.RunCommandLine() in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Hosting\Shell\ConsoleHost.cs:line 397
at Microsoft.Scripting.Hosting.Shell.ConsoleHost.ExecuteInternal() in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Hosting\Shell\ConsoleHost.cs:line 332
at Microsoft.Scripting.Hosting.Shell.ConsoleHost.Run(String[] args) in C:\Users\USER\Code\_ipy\ironpython2\Src\DLR\Src\Microsoft.Dynamic\Hosting\Shell\ConsoleHost.cs:line 208
at PythonConsoleHost.Main(String[] args) in C:\Users\USER\Code\_ipy\ironpython2\Src\IronPythonConsole\Console.cs:line 103
Collected in #457
|
gharchive/issue
| 2018-05-29T22:37:48 |
2025-04-01T04:55:12.272175
|
{
"authors": [
"slide"
],
"repo": "IronLanguages/ironpython2",
"url": "https://github.com/IronLanguages/ironpython2/issues/404",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
111983864
|
ui-bootstrap update
Thanks for this great repository. You may want to check whether updating ui-bootstrap to 14.0.2 causes any problems.
I'll check into it. Everything will be blown up shortly and updated to Bootstrap 4 though so I am not making major updates to themes and templates at the moment, I'm just working locally on getting everything Bootstrap 4 ready!
|
gharchive/issue
| 2015-10-17T20:45:45 |
2025-04-01T04:55:12.274125
|
{
"authors": [
"davidtmiller",
"newmesiss"
],
"repo": "IronSummitMedia/startbootstrap-sb-admin-2",
"url": "https://github.com/IronSummitMedia/startbootstrap-sb-admin-2/issues/81",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
2632320827
|
Added support for MidiCo LRC and improved robustness of Enhanced LRC
Added support for MidiCo LRC and improved robustness of Enhanced LRC
In my next release (this weekend) I'll at least get a BOM fix (that will still allow use of non-UTF8 .lrc files).
Hopefully will also get the MidiCo lrc support added, but want to make sure it does something sane for duets, ideally setting them to different styles, but at least keeping the lines consistent instead of interleaving them.
I'll leave this open until those updates are complete, whether or not they use some of this code.
|
gharchive/pull-request
| 2024-11-04T09:45:24 |
2025-04-01T04:55:12.367760
|
{
"authors": [
"ItMightBeKaraoke",
"beveradb"
],
"repo": "ItMightBeKaraoke/kbputils",
"url": "https://github.com/ItMightBeKaraoke/kbputils/pull/4",
"license": "BSD-2-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
2690647074
|
Bed
Includes changes that allow the user to download a BED file, currently limited to a single-gene BED. Connects to Variant Validator and outputs the relevant information.
Currently only accepts HGNC ID
Now has the ability to download a bed file based on Panel ID or Rcode
|
gharchive/pull-request
| 2024-11-25T12:59:32 |
2025-04-01T04:55:12.375829
|
{
"authors": [
"Mani-varma1"
],
"repo": "Itebbs22/SoftwareDevelopmentVIMMO",
"url": "https://github.com/Itebbs22/SoftwareDevelopmentVIMMO/pull/17",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1215645183
|
WebP Rich Media Support
I've noticed that when using an iOS rich media URL that refers to a WebP image, the attachment is not shown as part of the notification. However, the URL is correctly pulled down in attachment-url when printing the push payload to the Xcode console, and changing the image format from WebP to PNG (for example) works and is displayed as expected.
Is WebP rich media a supported use case? If not, are there any plans or ETAs on when this would be supported? We'd like to use WebP as the file sizes for our use case are a lot smaller 😄
Thanks!
+1
|
gharchive/issue
| 2022-04-26T09:09:53 |
2025-04-01T04:55:12.388003
|
{
"authors": [
"Grimlock257",
"jacao"
],
"repo": "Iterable/swift-sdk",
"url": "https://github.com/Iterable/swift-sdk/issues/552",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2583794686
|
rules expressions documentation
First of all, thank you very much for this amazing project.
A big request: could you please provide a little more detailed information on the rule expressions?
I mean:
rules:
  example_rule:
    name: Example Rule
    expression: or (eq .Message.Subject "cam-1") (eq .Message.Subject "cam-2") <---- This is
What variables are available? A few complex examples would be very helpful
And is it possible to have a rule for *@script.test where the asterisk is any string?
|
gharchive/issue
| 2024-10-13T08:48:32 |
2025-04-01T04:55:12.389744
|
{
"authors": [
"anshibanov"
],
"repo": "ItsNotGoodName/smtpbridge",
"url": "https://github.com/ItsNotGoodName/smtpbridge/issues/1",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1753356489
|
[ENHANCEMENT] Is It Even Possible To Make Alpaca Connect To The Internet?
Hey, Alpaca makes up some answers, which can be frustrating when you depend on it for project work, so please make it connectable to the internet. It could be revolutionary and will undoubtedly help me and millions of other (soon-to-be) users a ton, so please add this feature. Thank you.
Been wondering about the same thing... but I'm just going to ask you a question.
It was made the way it is, but can it be modified in that manner?
Don't you think that people with half a clue in their minds would like to get their hands on a ready-made thing?
Just consider it a free demo of sorts.
If anyone made what you ask, they would probably get sued by some company for some moronic reason, or just because.
That's just my own input, but I've seen lots of stories going around in the news about AI going haywire and some morons wanting to abuse it to destroy the world, like trying to make SkyNet from the Terminator films.
I added it a while back. But I guess duck-duck-scrape broke or something. I'll have a look when I have time.
|
gharchive/issue
| 2023-06-12T18:48:44 |
2025-04-01T04:55:12.392030
|
{
"authors": [
"BL4K-H4CK3R",
"ItsPi3141",
"kdkikosz"
],
"repo": "ItsPi3141/alpaca-electron",
"url": "https://github.com/ItsPi3141/alpaca-electron/issues/89",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
132572613
|
Building iOS framework with build_framework.py fails
Attempting to build iOS framework with build_framework.py gives the following error:
The following build commands failed:
CompileC /Users/steve/Development/GitHub/ios/build/arm64-iPhoneOS/3rdparty/zlib/OpenCV.build/Release-iphoneos/zlib.build/Objects-normal/arm64/gzread.o 3rdparty/zlib/gzread.c normal arm64 c com.apple.compilers.llvm.clang.1_0.compiler
(1 failure)
============================================================
ERROR: Command '['xcodebuild', 'IPHONEOS_DEPLOYMENT_TARGET=6.0', 'ARCHS=arm64', '-sdk', 'iphoneos', '-configuration', 'Release', '-parallelizeTargets', '-jobs', '4', '-target', 'ALL_BUILD', 'build']' returned non-zero exit status 65
============================================================
Traceback (most recent call last):
File "opencv/platforms/ios/build_framework.py", line 87, in build
self._build(outdir)
File "opencv/platforms/ios/build_framework.py", line 81, in _build
self.buildOne(t[0], t[1], mainBD, cmake_flags)
File "opencv/platforms/ios/build_framework.py", line 139, in buildOne
execute(buildcmd + ["-target", "ALL_BUILD", "build"], cwd = builddir)
File "opencv/platforms/ios/build_framework.py", line 34, in execute
retcode = check_call(cmd, cwd = cwd)
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/subprocess.py", line 540, in check_call
raise CalledProcessError(retcode, cmd)
CalledProcessError: Command '['xcodebuild', 'IPHONEOS_DEPLOYMENT_TARGET=6.0', 'ARCHS=arm64', '-sdk', 'iphoneos', '-configuration', 'Release', '-parallelizeTargets', '-jobs', '4', '-target', 'ALL_BUILD', 'build']' returned non-zero exit status 65
Command:
python opencv/platforms/ios/build_framework.py ios
OS X 10.11.3
MacBook Pro/2.8 GHz Core 2 Duo
XCode 7.2.1
There is no real error reason in this log cut. Please add more info.
BTW, We can't reproduce this problem with XCode 7.2: http://pullrequest.opencv.org/buildbot/builders/master_iOS-mac/builds/194
Perhaps it's hardware-related? Compiles fine on my iMac/4 GHz i7, OS X 10.11.2, XCode 7.1.1. I'll update this machine to OS X 10.11.3/XCode 7.2 and see what happens.
Have you ever tried compiling with a Core 2 Duo? Is the Core 2 Duo supported for this purpose?
I have the same problem.
OS X 10.11.3
MacBook Pro/2.5 GHz Intel Core i5
XCode 7.2.1
Please, provide full build logs (via attachment, pastebin or gist). Try to make clean build, try to install all updates and restart computer.
Hi mshabunin,
The error log is attached. implicit_declaration_of_function_is_invalid_in_C99.txt
The error shows
gzwrite.c:90:21: error:
implicit declaration of function 'write' is invalid in C99
gzwrite.c:579:9: error:
implicit declaration of function 'close' is invalid in C99
If I add "#include <unistd.h>" in gzguts.h, this error can be fixed.
Actually, there is already an include <unistd.h> statement, but it is guarded by a condition set at the configuration stage.
Could someone put CMake output with these lines?
-- Looking for fseeko
-- Looking for fseeko - found
-- Looking for unistd.h
-- Looking for unistd.h - found
-- Looking for sys/types.h
-- Looking for sys/types.h - found
-- Looking for stdint.h
-- Looking for stdint.h - found
-- Looking for stddef.h
-- Looking for stddef.h - found
-- Check size of off64_t
-- Check size of off64_t - failed
|
gharchive/issue
| 2016-02-10T00:02:13 |
2025-04-01T04:55:12.399716
|
{
"authors": [
"SSteve",
"alalek",
"alb423",
"mshabunin"
],
"repo": "Itseez/opencv",
"url": "https://github.com/Itseez/opencv/issues/6093",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
59084562
|
Implementation of Ni-Black thresholding algorithm.
This is in response to the feature request: 4024
Could someone please review this?
Thanks!
@samyak-268, thanks! This is new algorithm and it should be contributed to http://github.com/itseez/opencv_contrib instead (to the module ximgproc).
|
gharchive/pull-request
| 2015-02-26T15:00:55 |
2025-04-01T04:55:12.401872
|
{
"authors": [
"samyak-268",
"vpisarev"
],
"repo": "Itseez/opencv",
"url": "https://github.com/Itseez/opencv/pull/3762",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
83929978
|
Bug fix for feature extraction
According to the CartToPolar() function documentation, result angles can be in the range (0..360). This check is important to prevent index overflow.
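The kind of guard being added can be sketched outside OpenCV (a Python sketch; the bin count and function name are illustrative, not taken from the patch):

```python
# cartToPolar() may return an angle of exactly 360 degrees, so a histogram
# index computed as angle / bin_width can land one past the last bin unless
# it is clamped.
def angle_bin(angle_deg: float, n_bins: int = 8) -> int:
    idx = int(angle_deg / (360.0 / n_bins))
    return min(idx, n_bins - 1)  # prevent index overflow at angle == 360
```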
+1
|
gharchive/pull-request
| 2015-06-02T09:12:54 |
2025-04-01T04:55:12.402703
|
{
"authors": [
"Aljaksandr",
"bmagyar"
],
"repo": "Itseez/opencv_contrib",
"url": "https://github.com/Itseez/opencv_contrib/pull/242",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2356477757
|
While running for the first time, got this error
11.04 Retrying 1/3...
16.51 ✓ Compiled successfully
16.51 Linting and checking validity of types ...
20.26 Failed to compile.
20.26
20.26 ./components/EmptyChat.tsx:15:22
20.26 Type error: Type '{ size: number; className: string; }' is not assignable to type 'IntrinsicAttributes & { className?: string | undefined; }'.
20.26 Property 'size' does not exist on type 'IntrinsicAttributes & { className?: string | undefined; }'.
20.26
20.26 13 | return (
20.26 14 |
20.26 > 15 |
20.26 | ^
20.26 16 |
20.26 17 |
20.26 18 |
20.29 error Command failed with exit code 1.
20.29 info Visit https://yarnpkg.com/en/docs/cli/run for documentation about this command.
failed to solve: process "/bin/sh -c yarn build" did not complete successfully: exit code: 1
Follow these steps to fix:
cd ui/components/theme/
nano Switcher.tsx
Change the component declaration as follows:
const ThemeSwitcher = ({ className }: { className?: string, size?: number })
This should fix the problem.
+1 @mystogan99's solution works for me!
Fixed in the latest release
|
gharchive/issue
| 2024-06-17T06:17:00 |
2025-04-01T04:55:12.410023
|
{
"authors": [
"7990satyam200",
"ItzCrazyKns",
"mystogan99",
"rb81"
],
"repo": "ItzCrazyKns/Perplexica",
"url": "https://github.com/ItzCrazyKns/Perplexica/issues/202",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1378430072
|
Slightly adjusted the peach size
The peach styles from the official VK template were taken as a base and lightly edited for better visibility. Now the image with it no longer overflows the edges on mobile devices.
Before
After
Actually, this is just an example, since this is a template. But this way the idea of the structure and the use of styles is clearer, so thank you!
|
gharchive/pull-request
| 2022-09-19T19:32:06 |
2025-04-01T04:55:12.414081
|
{
"authors": [
"ItzNeviKat",
"SecondThundeR"
],
"repo": "ItzNeviKat/vkma-template",
"url": "https://github.com/ItzNeviKat/vkma-template/pull/6",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
60328758
|
Can't fully turn off shaders
Hey Ivorforce,
Because some friends have potato GPUs, they were experiencing heavy FPS drops when looking towards the sky / seeing shader effects.
So I told them to disable the shaders in the config; however, after I did some testing on my own, we found out that even with this configuration the shaders are still enabled:
visual {
B:biomeHeatDistortion=false
B:bypassPingPongBuffer=false
D:digitalEffectPixelRescaleX=0.05000000074505806
D:digitalEffectPixelRescaleY=0.05000000074505806
B:disableDepthBuffer=true
# The strength of DoF blur away from the player.
D:dofFocalBlurFar=0.0
# The strength of DoF blur towards the player.
D:dofFocalBlurNear=0.0
# The point at which DoF starts blurring the screen, away from the player, in blocks.
D:dofFocalPointFar=512.0
# The point at which DoF starts blurring the screen, towards the player, in blocks.
D:dofFocalPointNear=0.0
B:hurtOverlayEnabled=true
B:motionBlur=false
# If a fake skybox should be rendered to make shaders affect the fog line.
B:renderFakeSkybox=true
B:shader2DEnabled=false
B:shaderEnabled=false
D:sunFlareIntensity=0.28
B:waterDistortion=false
B:waterOverlayEnabled=false
}
Screenshot with this config: https://i.imgur.com/dV7QcAh.jpg
(the clouds are from another mod, ignore these)
As the owner of a fairly powerful GPU, I did not even notice this bug myself, thanks to the potatoes! :)
No worries - that's not even a shader, it's just a bit of plain OGL magic. It won't have noticeable impact on the performance, but you can still disable it by setting sunFlareIntensity=0.0 :)
Thank you for the immediate reply, I will pass this information to them and report back.
Interestingly the FPS seem to be better than before, guess I won't tell them that this should not have had any impact on that ^^
|
gharchive/issue
| 2015-03-09T11:15:36 |
2025-04-01T04:55:12.418722
|
{
"authors": [
"Ivorforce",
"ultrasn0w"
],
"repo": "Ivorforce/Psychedelicraft",
"url": "https://github.com/Ivorforce/Psychedelicraft/issues/7",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1995896549
|
enum value config_group and its string name do not match
The enum config_group and the string array config_group_names are in one-to-one correspondence,
so after adding COMPAT_ORACLE_OPTIONS we also need to add gettext_noop("Compat Oracle Options");
furthermore, config_group_names must end with NULL.
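The invariant can be sketched like this (a Python sketch of the C parallel-array pattern; the array names come from the PR description, while the checking helper is hypothetical):

```python
# In the C sources, config_group enum values index into config_group_names,
# so the two arrays must stay in one-to-one correspondence, and the name
# array must be NULL-terminated (modeled here with None).
config_group = ["COMPAT_ORACLE_OPTIONS"]              # enum values (others elided)
config_group_names = ["Compat Oracle Options", None]  # gettext_noop'd names + NULL

def names_match_groups(groups, names):
    """Check the one-to-one correspondence and the NULL terminator."""
    return len(names) == len(groups) + 1 and names[-1] is None
```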
Summary by CodeRabbit
New Features
Introduced "Compat Oracle Options" in the configuration settings.
Added a new configuration group type CONFIG_GROUP_NULL_TERMINATED.
This part shouldn't need fixing.
Without this modification, the sizes of the two arrays no longer correspond and compilation fails.
|
gharchive/pull-request
| 2023-11-16T02:02:00 |
2025-04-01T04:55:12.425767
|
{
"authors": [
"Monkey-LaoHan",
"tanyang-star"
],
"repo": "IvorySQL/IvorySQL",
"url": "https://github.com/IvorySQL/IvorySQL/pull/568",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1342498973
|
Having images next to each other in LaTeX
Dear @IzakMarais,
Could I ask how you get images to sit next to each other when iterating through the panels in Grafana?
Thank you.
The reporter is provided as-is and does not allow fine control over the layout.
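For anyone willing to edit the generated .tex by hand afterwards, placing two images side by side is plain LaTeX (a generic sketch, not a reporter feature; the file names are placeholders):

```latex
\begin{figure}[h]
  \begin{minipage}[t]{0.48\textwidth}
    \includegraphics[width=\textwidth]{panel1.png}
  \end{minipage}\hfill
  \begin{minipage}[t]{0.48\textwidth}
    \includegraphics[width=\textwidth]{panel2.png}
  \end{minipage}
\end{figure}
```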
|
gharchive/issue
| 2022-08-18T02:52:09 |
2025-04-01T04:55:12.427148
|
{
"authors": [
"IzakMarais",
"rllyryan"
],
"repo": "IzakMarais/reporter",
"url": "https://github.com/IzakMarais/reporter/issues/331",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
102696267
|
It is too easy to destroy lists.
Please make it harder to delete all of the items on the list! Right now an accidental press of the top-right button can wreck the list I've been building. It's too easy to press: it should either show an undo toast popup or a confirmation dialog.
I will :) Same as something mentioned in #11
|
gharchive/issue
| 2015-08-24T03:44:35 |
2025-04-01T04:55:12.428176
|
{
"authors": [
"J-8",
"ainola"
],
"repo": "J-8/ShoppingList",
"url": "https://github.com/J-8/ShoppingList/issues/15",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1730420731
|
Addition of Text with Heading in Homepage
Is your feature request related to a problem? Please describe.
No. It's enhancement in the existing design. The homepage is too clear which makes user doubt on whether he/she is on actual web or not.
Describe the solution you'd like
Adding web name heading to the center of page as : Welcome to Jarvis
Then 2-3 line informational guideline explaining what this website about.
This text will be above the "Connect Wallet" button.
This will make homepage look even better.
Describe alternatives you've considered
Addition of text and images on the homepage
Additional context
N/A
please look at #49
|
gharchive/issue
| 2023-05-29T09:31:17 |
2025-04-01T04:55:12.430632
|
{
"authors": [
"J0SAL",
"iamRabia-N"
],
"repo": "J0SAL/Decentralized-Expense-Tracker",
"url": "https://github.com/J0SAL/Decentralized-Expense-Tracker/issues/77",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
788793575
|
Training problems on a small dataset
Hello,
I ran into problems when training on my own small dataset.
First, I reproduced your res18 and res50 experimental results with CenterX on the COCO dataset.
After switching to my own small dataset, the AP with the original CenterNet code is around 0.59, but with CenterX, after many attempts (e.g. adjusting the learning-rate schedule, data augmentation, etc.), the best AP is only around 0.49. My analysis is that the data augmentation may not be quite right.
I noticed that in your comparison experiments the result with data augmentation is slightly lower than the original one. Do you have any thoughts or suggestions about this problem?
Thank you very much for the feedback.
I don't have a good way to debug private datasets, but the usual pitfalls affecting detection test performance are: the test image size, whether the data preprocessing (mean subtraction, dividing by the variance, etc.) matches training, and whether the metric code is consistent.
You can first eyeball whether the two outputs differ much on the original images, to rule out inconsistent metric code,
then check each side's test image size and data preprocessing for bugs, to rule out test-time problems.
If those are all fine, then the problem probably happened during training. For a fairly small dataset, any single hyperparameter can have a big impact on the result, so it needs careful checking: for example, whether the BN statistics are frozen on the small dataset, the learning rate, the batch size, the data augmentation method, and so on.
One more suggestion I saw in the CenterNet repo, which seems to fix the sudden mAP drop when training centerX:
"For other pytorch version, you can manually open torch/nn/functional.py and find the line with torch.batch_norm and replace the torch.backends.cudnn.enabled with False. We observed slight worse training results without doing so.
"
@CPFLAME Thanks! Following your suggestions, the problem is solved, and the pipeline with multiple teacher models distilling a small student also runs now.
Two more questions:
In CenterNet the images are all resized and then padded to 512x512, but in centerX the long side of the images in each batch is fixed at 512 while the short side varies. Can this operation be understood as data augmentation?
Is there no fixed random seed in centerX? The results differ a bit on every training run.
@XiaoCode-er Thank you very much for the reply. Could you share where the problem turned out to be?
Answers to the two questions below:
1. centerX directly reuses detectron2's data processing; it does not count as data augmentation, it is just a data preprocessing method. Also, the long side is not necessarily fixed at 512; rather, the longest side will not exceed 512. See the ResizeShortestEdge function in detectron2 for details.
2. I did not fix a random seed in centerX. If you need one, you can set it yourself in the main function.
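A minimal seeding helper for the main function might look like this (a sketch; centerX does not ship one, and the framework-specific lines are commented out to keep the sketch dependency-free):

```python
import os
import random

def seed_everything(seed: int = 42) -> int:
    """Fix Python-level randomness; uncomment the numpy/torch lines in a real run."""
    random.seed(seed)
    os.environ["PYTHONHASHSEED"] = str(seed)
    # np.random.seed(seed)
    # torch.manual_seed(seed)
    # torch.cuda.manual_seed_all(seed)
    # torch.backends.cudnn.deterministic = True
    return seed
```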
@CPFLAME At first I configured everything following your centernet_res18_coco.yaml. Since the metric tests were all normal when running on COCO, I only changed the DATASETS part of the config. The metrics on my own small dataset were rather poor, and I kept assuming the training process was at fault. After your reply I suspected the test input instead, looked at INPUT.MIN_SIZE_TEST: 0, found it a bit odd, and after changing MIN_SIZE_TEST and MAX_SIZE_TEST to 512 the test metrics went up.
I've been busy rushing the pipeline to completion lately, so I didn't look closely at many parameters. detectron2's data processing is quite confusing to me; I don't really understand the benefit of handling it this way. I also never figured out MIN_SIZE_TEST, MAX_SIZE_TEST, MIN_SIZE_TRAIN, and MAX_SIZE_TRAIN: some of your config files give them multiple values and some only one. I probably need to read the ResizeShortestEdge function carefully.
I'll be making some more changes to centerX over the next couple of days and may need to ask you about a few more issues; thanks in advance for your guidance.
@CPFLAME Hi, I've run into a problem. When training with multiple teacher models distilling one student, I adjusted the outputs of the teacher models. At first the training loss decreased normally and the test AP rose normally, but after a dozen or so epochs the loss suddenly increased, the AP plummeted to nearly 0, and with further training the loss no longer changed at all. What might be causing this?
@XiaoCode-er How exactly did you adjust the teacher model outputs?
Also, you could try this:
"For other pytorch version, you can manually open torch/nn/functional.py and find the line with torch.batch_norm and replace the torch.backends.cudnn.enabled with False. We observed slight worse training results without doing so.
"
@CPFLAME After lowering the learning rate, this problem was solved as well. One more question: in the cls branch, after the sigmoid the values should all be positive already, so why pass them through a ReLU again during KD? And why is a leaky ReLU used when building wh_mask from the teacher's cls?
The relu and leakyrelu are leftovers from earlier experiments; they are redundant operations and don't actually do anything. Just ignore them.
|
gharchive/issue
| 2021-01-19T07:53:36 |
2025-04-01T04:55:12.470322
|
{
"authors": [
"CPFLAME",
"XiaoCode-er"
],
"repo": "JDAI-CV/centerX",
"url": "https://github.com/JDAI-CV/centerX/issues/23",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
309800057
|
Error should be throw when a 403 is returned from the server
What I'm seeing (via strace) with jfrog rt dl --url http://privateserver:8081/artifactory --threads 5 /a.spec is that a 403 is being returned by the server and jfrog-cli-go just hangs with [Info] Searching items to download...
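What this issue asks for, in effect, is fail-fast behavior on the HTTP status. A generic Python sketch of the idea (not jfrog-cli-go code; the function name is made up):

```python
import urllib.error
import urllib.request

def fetch_or_fail(url: str) -> bytes:
    """Surface 4xx/5xx responses as errors instead of hanging or retrying silently."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.read()
    except urllib.error.HTTPError as err:
        raise RuntimeError(f"server returned HTTP {err.code} for {url}") from err
```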
I was experiencing the same kind of hanging and jfrog rt search command actually in my case gave more information why (some authentication problem). Anyways my workaround was to use jfrog client version 1.13.1.
@shawnbutts & @elhigu
Thanks for reporting this issue.
We fix this issue in the latest release v1.16.0. Feel free to upgrade.
We'll appreciate you feedback for that.
|
gharchive/issue
| 2018-03-29T15:27:07 |
2025-04-01T04:55:12.490063
|
{
"authors": [
"elhigu",
"shawnbutts",
"yahavi"
],
"repo": "JFrogDev/jfrog-cli-go",
"url": "https://github.com/JFrogDev/jfrog-cli-go/issues/145",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
784923076
|
Class 'controllers\IndexController' not found; the officially deployed site also 404s
Hi, author!
My environment is php:7.4-fpm.
After configuring nginx according to the docs, running composer install, and so on, I ran into the following problem:
{"file":"/data/www/photo-map/public/index.php","message":"Class 'controllers\IndexController' not found","line":25,"trace":[]}
I suspect it is a path/class-loading problem; after manually requiring a few classes, other problems show up.
Please take a look when you have time; I'm quite fond of this little app.
Hi, you can use the previous release version https://github.com/JIAOBANTANG/photo-map/releases/tag/1.4 for now.
The recently updated code has problems; I'll update and fix it when I have time.
|
gharchive/issue
| 2021-01-13T09:08:24 |
2025-04-01T04:55:12.505617
|
{
"authors": [
"JIAOBANTANG",
"jjshare"
],
"repo": "JIAOBANTANG/photo-map",
"url": "https://github.com/JIAOBANTANG/photo-map/issues/3",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
588445603
|
VI Tester includes project dependencies into search for test-classes
The VI "source/Utilities/Get All Project Class Paths.vi" makes an attempt to only look at libraries and classes that are "before dependencies", but fails to do so because of what looks like a typo in a regular expression.
The VI searches for
<Item Name="Dependencies" Type="Dependencies"/>
but the actual string in the LabVIEW project file is
"<Item Name="Dependencies" Type="Dependencies">"
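The mismatch is easy to demonstrate, and a pattern that tolerates both the self-closing and the open form avoids it (a Python sketch; the actual VI uses LabVIEW's string matching, not Python):

```python
import re

# The VI searches for the self-closing form '.../>', but project files use
# the open form '...>'. Making the '/' optional covers both.
DEPENDENCIES_ITEM = re.compile(r'<Item Name="Dependencies" Type="Dependencies"\s*/?>')

def is_dependencies_item(line: str) -> bool:
    return DEPENDENCIES_ITEM.search(line) is not None
```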
In my case the "Get All Project Class Paths.vi" finds some packed project libraries in dependencies which it cannot open and returns error 7
Error 7 occurred at Read from Text File in VITesterUtilities.lvlib:Get All Project Class Paths__jki_vi_tester.vi->VITesterUtilities.lvlib:Find Project Test Objects__jki_vi_tester.vi->Graphical Test Runner - Main UI - .vi
I may add that this is only a problem if the library that cannot be opened is the last one processed by the while loop in "Get All Project Class Paths.vi". This is because only the last error propagates out of the loop and the VI.
My current workaround is to add a dummy library to the project that will appear higher up in the LabVIEW project file and thus become the last one to be opened. Effectively stopping the earlier error from flowing out of the loop.
I've named it something like "AAAAAA Dummy library.lvlib", just to make sure it ends up as high up as possible even if it would end up in the dependency section and not explicitly referenced in the project.
But really, just having any "lvlib" in the project above the problematic one should work as a workaround.
|
gharchive/issue
| 2020-03-26T14:03:29 |
2025-04-01T04:55:12.518968
|
{
"authors": [
"JonasEr"
],
"repo": "JKISoftware/JKI-VI-Tester",
"url": "https://github.com/JKISoftware/JKI-VI-Tester/issues/53",
"license": "bsd-3-clause",
"license_type": "permissive",
"license_source": "bigquery"
}
|
1855643281
|
Update all dependencies to point to the latest JNoSQL 1.0.1 version
Enhancement
Update all maven snippet dependency declarations to point to the latest JNoSQL 1.0.1 version;
I've updated the dependencies to point to the JNoSQL 1.1.0 version with the commit: https://github.com/JNOSQL/jnosql.github.io/commit/ebd769eb03c376bc9f77523227b0286ab5bb14ed
|
gharchive/issue
| 2023-08-17T20:14:33 |
2025-04-01T04:55:12.545907
|
{
"authors": [
"dearrudam"
],
"repo": "JNOSQL/jnosql.github.io",
"url": "https://github.com/JNOSQL/jnosql.github.io/issues/67",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1316487301
|
🛑 Grafana is down
In 9fb09b3, Grafana ($SERVER_BASE/grafana/) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Grafana is back up in 9cb283a.
|
gharchive/issue
| 2022-07-25T08:52:17 |
2025-04-01T04:55:12.555148
|
{
"authors": [
"JSAnyone"
],
"repo": "JSAnyone/upptime",
"url": "https://github.com/JSAnyone/upptime/issues/306",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1110502709
|
Weather-sensing umbrella stand for snow and rain,
with prevention of floor dirtying
Proposal background
After it snows, I've seen the entryway get dirty from the water left behind when the snow melts after I come home. Even when I tried to lay tissue or newspaper on the floor before putting my shoes down, by the time I brought it over the snow had already melted and the entryway was dirty.
That's why I thought it would be good if the weather were monitored, so that the newspaper or tissue was already on the floor by the time I got home.
Main idea
When it has snowed or rained, a machine automatically dispenses something onto the floor when you get home to keep it from getting wet. Through an app you can check how many sheets of tissue are currently laid out on the floor, and from outside you can remotely dispense more tissue onto the floor.
With only the feature above it would look too plain, so on days when rain is expected (or already falling) it announces the forecast, recommends taking an umbrella, and can dry an umbrella placed in the stand.
Key features
Weather monitoring
Dispensing tissue or similar and laying it on the floor
Umbrella drying
Core technologies
Arduino
App development
Expected benefits
Convenience in daily life…?
Application areas
Same as the expected benefits
Here is some reference material on an umbrella-dryer graduation project!
|
gharchive/issue
| 2022-01-21T14:01:25 |
2025-04-01T04:55:12.562791
|
{
"authors": [
"JSY8869",
"hypeinthe"
],
"repo": "JSY8869/Senier-Project",
"url": "https://github.com/JSY8869/Senier-Project/issues/54",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1899875222
|
🛑 Glitch is down
In 0dc70f3, Glitch ($GLITCH) was down:
HTTP code: 503
Response time: 301 ms
Resolved: Glitch is back up in 0664383 after 44 minutes.
|
gharchive/issue
| 2023-09-17T18:31:21 |
2025-04-01T04:55:12.586336
|
{
"authors": [
"JYFUX"
],
"repo": "JYFUX/upptime",
"url": "https://github.com/JYFUX/upptime/issues/1421",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1967012825
|
🛑 Glitch is down
In bf72eac, Glitch ($GLITCH) was down:
HTTP code: 503
Response time: 356 ms
Resolved: Glitch is back up in 8b8abf5 after 2 hours, 16 minutes.
|
gharchive/issue
| 2023-10-29T14:27:16 |
2025-04-01T04:55:12.588370
|
{
"authors": [
"JYFUX"
],
"repo": "JYFUX/upptime",
"url": "https://github.com/JYFUX/upptime/issues/2264",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2445426466
|
Cannot use spoilers in tags
Describe the bug
You are not able to use spoilers as content in tags.
To Reproduce
Create a tag with a content that contains a ||spoiler||.
Use the tag.
Expected behavior
Tag output should contain the spoiler.
Screenshots
as a workaround you can escape the | with
(i think it interprets it as an argument separator {name:arg|arg|arg|arg})
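A hedged sketch of the suspected behavior — this is not assyst's actual parser, just an illustration of how splitting arguments on raw `|` characters would consume the spoiler markers, and why escaping them helps:

```python
# Illustrative only: a naive argument split on "|", as in {name:arg|arg|arg}.
content = "before ||spoiler|| after"

naive_args = content.split("|")
print(naive_args)  # ['before ', '', 'spoiler', '', ' after'] -- markers consumed

# Escaping the pipes (the workaround described above) keeps them intact.
escaped = content.replace("|", "\\|")
print("\\|" in escaped)  # True
```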
|
gharchive/issue
| 2024-08-02T17:06:52 |
2025-04-01T04:55:12.693699
|
{
"authors": [
"Tolga1452",
"meqativ"
],
"repo": "Jacherr/assyst2",
"url": "https://github.com/Jacherr/assyst2/issues/25",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
208225600
|
Feature Suggestion: Marker Cluster Layer Support
Hey Jack, thanks for this awesome plugin!
I was wondering if it is possible to add this feature to Leaflet Maps with Google Sheets: https://github.com/ghybs/Leaflet.MarkerCluster.LayerSupport
As I don't really have development skills, I haven't yet managed to do it on my own.
Greets!
Malte
Thanks @ilyankou for resolving this issue in our current dev version, to be released here as v.1 very soon
|
gharchive/issue
| 2017-02-16T19:46:11 |
2025-04-01T04:55:12.698060
|
{
"authors": [
"JackDougherty",
"wukkwukk"
],
"repo": "JackDougherty/leaflet-maps-with-google-sheets",
"url": "https://github.com/JackDougherty/leaflet-maps-with-google-sheets/issues/90",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
1687687211
|
OSError: [WinError 126] Module not found.
I've been using this module for over a year now, and today it started failing.
I can see that previous successful builds pulled PyInstaller 5.9.0, whereas this one pulled PyInstaller 5.10.1; I'm not sure if that is the cause.
Traceback (most recent call last):
File "c:\Python37\lib\runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "c:\Python37\lib\runpy.py", line 85, in _run_code
exec(code, run_globals)
File "C:\Python37\Scripts\pyinstaller.exe\__main__.py", line 7, in <module>
File "c:\Python37\lib\site-packages\PyInstaller\__main__.py", line 194, in _console_script_run
run()
File "c:\Python37\lib\site-packages\PyInstaller\__main__.py", line 180, in run
run_build(pyi_config, spec_file, **vars(args))
File "c:\Python37\lib\site-packages\PyInstaller\__main__.py", line 61, in run_build
PyInstaller.building.build_main.main(pyi_config, spec_file, **kwargs)
File "c:\Python37\lib\site-packages\PyInstaller\building\build_main.py", line 978, in main
build(specfile, distpath, workpath, clean_build)
File "c:\Python37\lib\site-packages\PyInstaller\building\build_main.py", line 900, in build
exec(code, spec_namespace)
File "build-on-win.spec", line 14, in <module>
cipher=block_cipher)
File "c:\Python37\lib\site-packages\PyInstaller\building\build_main.py", line 424, in __init__
self.__postinit__()
File "c:\Python37\lib\site-packages\PyInstaller\building\datastruct.py", line 173, in __postinit__
self.assemble()
File "c:\Python37\lib\site-packages\PyInstaller\building\build_main.py", line 696, in assemble
isolated.call(find_binary_dependencies, list(self.binaries), self.binding_redirects, collected_packages)
File "c:\Python37\lib\site-packages\PyInstaller\isolated\_parent.py", line 372, in call
return isolated.call(function, *args, **kwargs)
File "c:\Python37\lib\site-packages\PyInstaller\isolated\_parent.py", line 302, in call
raise RuntimeError(f"Child process call to {function.__name__}() failed with:\n" + output)
RuntimeError: Child process call to find_binary_dependencies() failed with:
File "c:\Python37\lib\site-packages\win32ctypes\pywin32\pywintypes.py", line 35, in pywin32error
yield
File "c:\Python37\lib\site-packages\win32ctypes\pywin32\win32api.py", line 43, in LoadLibraryEx
return _dll._LoadLibraryEx(fileName, 0, flags)
File "c:\Python37\lib\site-packages\win32ctypes\core\ctypes\_util.py", line 42, in check_null
raise make_error(function, function_name)
OSError: [WinError 126] Module not found.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "c:\Python37\lib\site-packages\PyInstaller\isolated\_child.py", line 63, in run_next_command
output = function(*args, **kwargs)
File "c:\Python37\lib\site-packages\PyInstaller\building\build_main.py", line 195, in find_binary_dependencies
return bindepend.Dependencies(binaries, redirects=binding_redirects, xtrapath=extra_libdirs)
File "c:\Python37\lib\site-packages\PyInstaller\depend\bindepend.py", line 287, in Dependencies
for ftocnm, fn in getAssemblyFiles(pth, manifest, redirects):
File "c:\Python37\lib\site-packages\PyInstaller\depend\bindepend.py", line 461, in getAssemblyFiles
for assembly in getAssemblies(pth):
File "c:\Python37\lib\site-packages\PyInstaller\depend\bindepend.py", line 415, in getAssemblies
res = winmanifest.GetManifestResources(pth)
File "c:\Python37\lib\site-packages\PyInstaller\utils\win32\winmanifest.py", line 979, in GetManifestResources
return winresource.GetResources(filename, [RT_MANIFEST], names, languages)
File "c:\Python37\lib\site-packages\PyInstaller\utils\win32\winresource.py", line 155, in GetResources
hsrc = win32api.LoadLibraryEx(filename, 0, LOAD_LIBRARY_AS_DATAFILE)
File "c:\Python37\lib\site-packages\win32ctypes\pywin32\win32api.py", line 43, in LoadLibraryEx
return _dll._LoadLibraryEx(fileName, 0, flags)
File "c:\Python37\lib\contextlib.py", line 130, in __exit__
self.gen.throw(type, value, traceback)
File "c:\Python37\lib\site-packages\win32ctypes\pywin32\pywintypes.py", line 37, in pywin32error
raise error(exception.winerror, exception.function, exception.strerror)
win32ctypes.pywin32.pywintypes.error: (126, 'LoadLibraryExW', 'Module not found.')
I'd appreciate any insights you might have.
Thanks
I'd say it's very likely an error to do with the Docker image, and is out of my bandwidth at the moment, so PR's are most definitely welcome for this :)
I've started a new branch here where I'll try to push an image up that this action can reference python3-10-pyinstaller-5-3
Sure, I'd be happy to.
Thanks
Can you try your workflow please
Example:
- name: PyInstaller Windows
uses: JackMcKew/pyinstaller-action-windows@python3-10-pyinstaller-5-3
with:
path: src
Cool, it worked.
https://github.com/badabing2005/PixelFlasher/actions/runs/4847280857/jobs/8637394909
Many thanks for the prompt fix.
I'd say it's very likely an error to do with the Docker image, and is out of my bandwidth at the moment, so PR's are most definitely welcome for this :)
I'll still try to help with this. Didn't mean to leave the last issue related to this hanging so long with no progress / inputs / PRs. 😴
Keep in mind - I've been trying to take a mental break last few weeks & trying to take in the moment of advancements we are experiencing in humanity.
That said - I'll take the feet down shortly and put this on my to-do list.
Other than that, I hope you're doing well @JackMcKew ✌️
https://github.com/JackMcKew/pyinstaller-action-windows/issues/34#issuecomment-1529196967 fixed this issue for me as well.
Much thanks!
|
gharchive/issue
| 2023-04-27T23:45:13 |
2025-04-01T04:55:12.705522
|
{
"authors": [
"DaemonDude23",
"JackMcKew",
"MarketingPip",
"badabing2005"
],
"repo": "JackMcKew/pyinstaller-action-windows",
"url": "https://github.com/JackMcKew/pyinstaller-action-windows/issues/34",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
427275478
|
Question: what do the values on the MindViewer interface mean?
Please help — my graduation project is in crisis!
I didn't expect anyone would actually look at this — it hasn't been updated in a long time. I'll start finishing it now.
Meow meow meow???
Good luck!
Best Regards!
May I ask what you'd find convenient for the interface? I'm considering qchart, qpaint, and HTML — for now I'll just use qpainter.
All of the software's features are now complete.
|
gharchive/issue
| 2019-03-30T12:03:06 |
2025-04-01T04:55:12.776831
|
{
"authors": [
"JackeyLea",
"MoonFullx"
],
"repo": "JackeyLea/MindViewer",
"url": "https://github.com/JackeyLea/MindViewer/issues/1",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
698453249
|
v1.72 Beta 11
You can't get the 37th endorsement anymore :)
I have made this change on my end so there are no merge conflicts, thanks for suggesting, this change will come out with the next commit to the beta.
|
gharchive/pull-request
| 2020-09-10T20:24:04 |
2025-04-01T04:55:12.802486
|
{
"authors": [
"Jacorb90",
"pg132"
],
"repo": "Jacorb90/DistInc.github.io",
"url": "https://github.com/Jacorb90/DistInc.github.io/pull/33",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
309102593
|
Allow returning a Job
For POST or PATCH requests, there is no real value to return.
Therefore I'd like to suggest allowing a kotlinx.coroutines.experimental.Job as the return type.
As Deferred implements job, the adapter could just use a Deferred<Unit> internally.
This is a follow-up of https://github.com/JakeWharton/retrofit2-kotlin-coroutines-adapter/issues/2
Would you accept a PR? My current implementation looks like this:
override fun get(
returnType: Type,
annotations: Array<out Annotation>,
retrofit: Retrofit
): CallAdapter<*, *>? {
val rawType = getRawType(returnType)
if (rawType == Job::class.java) {
return JobCallAdapter()
}
if (Deferred::class.java != rawType) {
return null
}
...
private class JobCallAdapter : CallAdapter<Unit, Job> {
override fun responseType() = Void::class.java
override fun adapt(call: Call<Unit>): Job {
val deferred = CompletableDeferred<Unit>()
deferred.invokeOnCompletion {
if (deferred.isCancelled) {
call.cancel()
}
}
call.enqueue(object : Callback<Unit> {
override fun onFailure(call: Call<Unit>, t: Throwable) {
deferred.completeExceptionally(t)
}
override fun onResponse(call: Call<Unit>, response: Response<Unit>) {
if (response.isSuccessful) {
deferred.complete(Unit)
} else {
deferred.completeExceptionally(HttpException(response))
}
}
})
return deferred
}
}
I'm about to send a PR 😢
You don't need a separate CallAdapter for this because a Deferred is a Job.
Job is a weird choice here. It doesn't throw when you call join(). How are you expecting to be notified of failure when you are returned a Job?
True :(
|
gharchive/issue
| 2018-03-27T19:28:58 |
2025-04-01T04:55:12.862006
|
{
"authors": [
"JakeWharton",
"PaulWoitaschek"
],
"repo": "JakeWharton/retrofit2-kotlin-coroutines-adapter",
"url": "https://github.com/JakeWharton/retrofit2-kotlin-coroutines-adapter/issues/8",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
218756333
|
install error
When I ran npm install devtool -g, I got this error:
/usr/local/bin/devtool -> /usr/local/lib/node_modules/devtool/bin/index.js
> fsevents@1.1.1 install /usr/local/lib/node_modules/devtool/node_modules/fsevents
> node install
[fsevents] Success: "/usr/local/lib/node_modules/devtool/node_modules/fsevents/lib/binding/Release/node-v48-darwin-x64/fse.node" already installed
Pass --update-binary to reinstall or --build-from-source to recompile
> electron@1.4.15 postinstall /usr/local/lib/node_modules/devtool/node_modules/electron
> node install.js
/usr/local/lib/node_modules/devtool/node_modules/electron/install.js:46
throw err
^
Error: connect ETIMEDOUT 54.231.115.43:443
at Object.exports._errnoException (util.js:1022:11)
at exports._exceptionWithHostPort (util.js:1045:20)
at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1087:14)
/usr/local/lib
└── (empty)
npm ERR! Darwin 16.5.0
npm ERR! argv "/usr/local/bin/node" "/usr/local/bin/npm" "install" "devtool" "-g"
npm ERR! node v6.10.0
npm ERR! npm v3.10.10
npm ERR! code ELIFECYCLE
npm ERR! electron@1.4.15 postinstall: `node install.js`
npm ERR! Exit status 1
npm ERR!
npm ERR! Failed at the electron@1.4.15 postinstall script 'node install.js'.
npm ERR! Make sure you have the latest version of node.js and npm installed.
npm ERR! If you do, this is most likely a problem with the electron package,
npm ERR! not with npm itself.
npm ERR! Tell the author that this fails on your system:
npm ERR! node install.js
npm ERR! You can get information on how to open an issue for this project with:
npm ERR! npm bugs electron
npm ERR! Or if that isn't available, you can get their info via:
npm ERR! npm owner ls electron
npm ERR! There is likely additional logging output above.
npm ERR! Please include the following file with any support request:
npm ERR! /Users/wangying/npm-debug.log
npm ERR! code 1
I think it's an Electron problem, so I tried installing Electron again, but it still won't install.
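The ETIMEDOUT above comes from Electron's postinstall step failing to download its binary from S3, not from npm itself. A hedged workaround sketch — ELECTRON_MIRROR is an environment variable honored by Electron's installer, but the mirror URL below is only an example and may need to be swapped for one reachable from your network:

```shell
# Point Electron's postinstall download at an alternative mirror
# (example mirror URL; substitute one you can actually reach).
export ELECTRON_MIRROR="https://npmmirror.com/mirrors/electron/"
echo "ELECTRON_MIRROR=$ELECTRON_MIRROR"

# Then retry the install (commented out so this sketch stands alone):
# npm install devtool -g
```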
solved
|
gharchive/issue
| 2017-04-02T10:44:52 |
2025-04-01T04:55:12.865300
|
{
"authors": [
"RifeWang"
],
"repo": "Jam3/devtool",
"url": "https://github.com/Jam3/devtool/issues/117",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
97774940
|
source maps in entry file
example.js
var fs = require('fs')
debugger
no source maps:
however, it works when you require('./foo.js') from an entry.
This happens when index.html points to the same file (e.g. test.js) as we are serving and bundling.
A fix would be to just always use bundle.js -- of course, if somebody does hihat bundle.js the problem will appear again. Maybe using a unique hash or something? Or maybe there is a better solution.
|
gharchive/issue
| 2015-07-28T18:51:08 |
2025-04-01T04:55:12.867669
|
{
"authors": [
"mattdesl"
],
"repo": "Jam3/hihat",
"url": "https://github.com/Jam3/hihat/issues/17",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|