id | text | source | created | added | metadata
---|---|---|---|---|---|
2459038521
|
🛑 C21CoasttoCoast is down
In f736b65, C21CoasttoCoast (https://www.c21coasttocoast.com/) was down:
HTTP code: 0
Response time: 0 ms
Resolved: C21CoasttoCoast is back up in e11f622 after 24 minutes.
|
gharchive/issue
| 2024-08-10T11:09:45 |
2025-04-01T04:54:48.153589
|
{
"authors": [
"C21coastsh"
],
"repo": "C21coastsh/ws-ut",
"url": "https://github.com/C21coastsh/ws-ut/issues/183",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2383346987
|
[KM] July KM: Enterprise Integration GO - EP1: Integrating with the Ministry of Finance Business Tax Filing System
Assist @gibbs-shih in completing this article
Reference: the 鼎新 (Digiwin) e-GO business tax filing management system
Many accounting software vendors in the Taiwanese market offer integration and interfacing with the Ministry of Finance's Form 401 filing system. Below are some well-known vendors whose products typically include these capabilities:
鼎新電腦 (Dingxin Computer):
Dingxin's TIPTOP ERP system is widely used across all kinds of Taiwanese enterprises and is deeply integrated with the Ministry of Finance's electronic filing system.
鼎捷軟體 (Ding Jie Software):
Ding Jie's 易飛 ERP system has strong accounting features, interfaces seamlessly with the Ministry of Finance's electronic filing system, and supports API transmission.
財神企業 (Accufin Corporation):
Accufin's accounting system focuses on the SME market and provides convenient electronic filing with a direct API connection to the Ministry of Finance's filing system.
博英資訊 (BOE Information Technology):
BOE's ERP system provides complete finance and accounting modules and supports API interfacing with the Ministry of Finance's Form 401 filing system.
天心資訊 (Tianxin Information Technology):
Tianxin's accounting system is widely used among Taiwanese SMEs, offers full electronic filing features, and supports API integration with the Ministry of Finance.
正航資訊 (Zhenghang Information Technology):
Zhenghang's ERP system emphasizes ease of use and efficiency, provides full integration with the Ministry of Finance's electronic filing system, and supports automatic data transmission.
All of these vendors' accounting products integrate with the Ministry of Finance's filing system, making efficient tax filing straightforward. Which software fits best depends on the enterprise's specific needs and scale; choose one that keeps the filing process convenient and accurate.
What is a media filing file (媒體申報檔)?
Media file generation
Data the media filing file must export
When designing an ERP system to support Taiwan business tax filing, it is essential that the system can export all required filing data. Below are the key data items the media filing file must export; these allow an enterprise to complete its business tax filing accurately and compliantly.
Basic information
Company basics: company name, unified business number, contact address, contact phone, etc.
Filing period: the start and end dates of the filing period.
Sales data
Sales invoices:
Invoice number
Issue date
Sales amount (tax included)
Tax amount
Buyer name and unified business number
Invoice type (e.g., triplicate, duplicate)
Sales returns and allowances:
Related invoice number
Return or allowance date
Return or allowance amount
Return or allowance reason
Input (purchase) data
Purchase invoices:
Invoice number
Issue date
Purchase amount (tax included)
Tax amount
Seller name and unified business number
Purchase returns and allowances:
Related invoice number
Return or allowance date
Return or allowance amount
Return or allowance reason
Tax credit data
Credit vouchers:
Voucher number
Voucher date
Credited tax amount
Credit reason
Tax payment data
Payment vouchers:
Payment slip number
Payment date
Tax amount paid
Tax adjustment data
Adjustment slips:
Adjustment slip number
Adjustment date
Adjustment amount
Adjustment reason
Other data
Proof of business revenue reduction or exemption (if applicable)
Certificate number
Exemption amount
Exemption reason
ERP system feature support
To support exporting this data, the ERP system should provide the following:
Automatic data consolidation and generation: the system should automatically consolidate sales, purchase, credit, tax payment, and adjustment data, and generate the corresponding filing documents.
Format validation: ensure exported data conforms to the National Taxation Bureau's required formats, avoiding filing failures caused by format errors.
Electronic filing support: support the formats and interfaces required for electronic filing, such as XML, and generate electronic media files suitable for upload (e.g., CD or USB).
Report review and proofreading: provide a report-check feature so users can review and proofread the data before submission, ensuring accuracy.
With these features, an ERP system can help enterprises manage the business tax filing process effectively, reduce the risk of manual errors, improve efficiency, and ensure tax compliance.
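The export items above can be sketched as a record type. This is a purely illustrative sketch: every field name is an assumption of mine, and it does not reproduce the actual media-file layout mandated by the Ministry of Finance.

```python
from dataclasses import dataclass

# Illustrative only: field names are assumptions, not the official
# Ministry of Finance media-file layout.
@dataclass
class SalesInvoiceRow:
    invoice_number: str    # e.g. "AB12345678"
    issue_date: str        # YYYYMMDD
    amount_incl_tax: int   # sales amount, tax included
    tax_amount: int
    buyer_tax_id: str      # buyer's unified business number

    def net_amount(self) -> int:
        # amount excluding tax, a figure the filing also needs
        return self.amount_incl_tax - self.tax_amount

row = SalesInvoiceRow("AB12345678", "20240630", 10500, 500, "12345678")
print(row.net_amount())  # → 10000
```

An exporter would serialize a list of such rows into whatever fixed format the filing system requires, which is exactly where the format-validation step above earns its keep.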
|
gharchive/issue
| 2024-07-01T09:38:33 |
2025-04-01T04:54:48.248799
|
{
"authors": [
"TeresaYang00"
],
"repo": "CAFECA-IO/KnowledgeManagement",
"url": "https://github.com/CAFECA-IO/KnowledgeManagement/issues/197",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1475405087
|
[Develop] Announcements (公告專區) page
components
category tabs
search
news list
news items
pagination
take 2 hrs
remain: pagination bar
|
gharchive/issue
| 2022-12-05T01:57:52 |
2025-04-01T04:54:48.252988
|
{
"authors": [
"Julian0701"
],
"repo": "CAFECA-IO/sports-charity",
"url": "https://github.com/CAFECA-IO/sports-charity/issues/5",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2522085361
|
I941175: Fix project registry name sanitization
Ticket: https://internal.almoctane.com/ui/entity-navigation?p=131002/6001&entityType=work_item&id=941175
A developer build has not yet been created for this branch. Click here to go ahead and create the build...
CI Build Link:
https://sou-jenkins2.swinfra.net/job/CAFapi/job/CAFapi~opensuse-opensearch2-image~I941175~CI
|
gharchive/pull-request
| 2024-09-12T11:07:11 |
2025-04-01T04:54:48.255838
|
{
"authors": [
"buildmachine-sou-jenkins2",
"michael-bryson"
],
"repo": "CAFapi/opensuse-opensearch2-image",
"url": "https://github.com/CAFapi/opensuse-opensearch2-image/pull/47",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1360653256
|
Fix GCC 12.2 warning in access to patricia_t members
Hi folks — we're seeing a few warnings in the libpatricia code with newer GCCs:
/home/christian/devel/zeek/zeek/src/3rdparty/patricia.c:246:18: warning: array subscript ‘prefix_t {aka struct _prefix_t}[0]’ is partly outside array bounds of ‘unsigned char[12]’ [-Warray-bounds]
246 | prefix->bitlen = (bitlen >= 0) ? bitlen : default_bitlen;
| ~~~~~~~~~~~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
/home/christian/devel/zeek/zeek/src/3rdparty/patricia.c:228:18: note: object of size 12 allocated by ‘calloc’
228 | prefix = calloc(1, sizeof(prefix4_t));
| ^~~~~~~~~~~~~~~~~~~~~~~~~~~~
/home/christian/devel/zeek/zeek/src/3rdparty/patricia.c:247:18: warning: array subscript ‘prefix_t {aka struct _prefix_t}[0]’ is partly outside array bounds of ‘unsigned char[12]’ [-Warray-bounds]
247 | prefix->family = family;
| ~~~~~~~~~~~~~~~^~~~~~~~
/home/christian/devel/zeek/zeek/src/3rdparty/patricia.c:228:18: note: object of size 12 allocated by ‘calloc’
228 | prefix = calloc(1, sizeof(prefix4_t));
| ^~~~~~~~~~~~~~~~~~~~~~~~~~~~
/home/christian/devel/zeek/zeek/src/3rdparty/patricia.c:248:21: warning: array subscript ‘prefix_t {aka struct _prefix_t}[0]’ is partly outside array bounds of ‘unsigned char[12]’ [-Warray-bounds]
248 | prefix->ref_count = 0;
| ~~~~~~~~~~~~~~~~~~^~~
/home/christian/devel/zeek/zeek/src/3rdparty/patricia.c:228:18: note: object of size 12 allocated by ‘calloc’
228 | prefix = calloc(1, sizeof(prefix4_t));
| ^~~~~~~~~~~~~~~~~~~~~~~~~~~~
/home/christian/devel/zeek/zeek/src/3rdparty/patricia.c:250:22: warning: array subscript ‘prefix_t {aka struct _prefix_t}[0]’ is partly outside array bounds of ‘unsigned char[12]’ [-Warray-bounds]
250 | prefix->ref_count++;
| ~~~~~~~~~~~~~~~~~^~
/home/christian/devel/zeek/zeek/src/3rdparty/patricia.c:228:18: note: object of size 12 allocated by ‘calloc’
228 | prefix = calloc(1, sizeof(prefix4_t));
| ^~~~~~~~~~~~~~~~~~~~~~~~~~~~
The above is based on Zeek's version of libpatricia, but the code in question is identical. I wasn't able to test the included tweak directly with your repo but the equivalent fix on the Zeek side resolves them.
Thanks!
|
gharchive/pull-request
| 2022-09-02T22:21:23 |
2025-04-01T04:54:48.257920
|
{
"authors": [
"alistairking",
"ckreibich"
],
"repo": "CAIDA/cc-common",
"url": "https://github.com/CAIDA/cc-common/pull/11",
"license": "BSD-2-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
2633725716
|
Consistent method of enforcing polars subclass
Subclasses of a polars data frame (e.g., any UptakeData) revert to a regular polars data frame after a polars operation (e.g., filter, with_columns, rename). We can either write versions of these polars functions that return the subclass they were given, or coerce polars data frames into subclasses more frequently (i.e., UptakeData(df)). Choose a strategy and be consistent.
I lean toward option #2 (let data frames be data frames, and use the subclasses for explicit validation or when we need extra functionality).
Also, choose class vs. object methods carefully to avoid self = self.some_method(...), as is currently the case in augment_implicit_columns.
Or, at least, keep something like that as an object method, but make it not modify in place, e.g. my_df = my_df.augment() rather than my_df.augment() changing the content of my_df.
this was resolved in a prior PR, maybe #65
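The subclass-reversion problem can be sketched without polars itself. This is a pure-Python stand-in (the class names are borrowed from the issue; the implementation is invented for illustration, not the repo's actual code):

```python
# Pure-Python stand-in for the polars behavior described above: operations
# on a subclass return the base class, so subclass identity is lost.
class DataFrame:
    def __init__(self, rows):
        self.rows = rows

    def filter(self, pred):
        # Returns the base class, just as polars operations do.
        return DataFrame([r for r in self.rows if pred(r)])

class UptakeData(DataFrame):
    def validate(self):
        assert all("estimate" in r for r in self.rows)

df = UptakeData([{"estimate": 0.1}, {"estimate": 0.2}])
out = df.filter(lambda r: r["estimate"] > 0.15)
assert type(out) is DataFrame       # subclass reverted after the operation
coerced = UptakeData(out.rows)      # option 2: coerce back only when needed
coerced.validate()
```

Option 2 amounts to treating the subclass as a validation boundary: plain frames flow through the pipeline, and coercion happens only at the points where the extra functionality is actually used.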
|
gharchive/issue
| 2024-11-04T20:00:14 |
2025-04-01T04:54:48.295668
|
{
"authors": [
"eschrom",
"swo"
],
"repo": "CDCgov/cfa-immunization-uptake-projection",
"url": "https://github.com/CDCgov/cfa-immunization-uptake-projection/issues/42",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1497547515
|
CA - Poorly Formatted Accession Number
User states: "Hello, we have received a set of reports that have an incorrect / poorly formatted accession number. I did a quick review and it looks like these are from Gaslamp Medical Center, 250 Market St, San Diego CA 92101 USA.
Please correct and resend the reports for the affected messages."
Sharon and I proceeded to conduct research and responded to the user: "It is possible that during the conversion on your end it created the shortform version you are seeing."
That value is typically provided by the sender. We wouldn't be doing any alteration. The work here would be to double check that the value we're sending is exactly the same as what was sent to CA. If it is, then we relay that info to CA. If it isn't then we need to do additional investigation into why it's not the same.
@BerniXiongA6 check with sender (according to Brick's recommendation) and then follow up with CA. cc: @sliu1000
Hey @Jcavallo7 @sliu1000 -- I've tried to find the originating email from the sender but there's nothing in the RS shared inbox. Did this come from the SR shared inbox? Could you provide our team with the sender's info? We'd like to start working on the ticket in this coming sprint and we'll need to reach out to the sender to ask follow up questions. Thank you! cc: @brick-green @brandonnava
Hey @BerniXiongA6 it came in through the simple report box, apologies if I neglected that detail. Here are the original senders info:
Jill Meesey (jill.meesey@cdph.ca.gov)
Marjorie Richardson (marjorie.richardson@sdcounty.ca.gov)
BX sent email to original requesters of this ticket on 12/30:
Hi Jill and Marjorie Richardson,
Our operations team here at ReportStream will be bringing ticket (# 7689) into our upcoming work. In order to troubleshoot, we'll need to know what the Accession numbers are so we can confirm whether ReportStream is receiving the correct values from Gaslamp Medical Center that we are passing through to public health.
Can either of you confirm what the Accession numbers should be for this request? I'm attaching the screenshot that we received with the original request.
Once we receive this information, that can help our team identify what could be happening and determine how to fix this issue.
Thanks,
Berni
BX send email to sending facility (since Jill and Marjorie are the receivers) on 12/30:
clinic@gaslampmedicalcenter.com
Report Stream (CDC);Green, Brick (CDC/DDPHSS/OD/HITSSU) (CTR)
Hello Gas Lamp Medical Center Team:
Our operations team here at ReportStream received a request from your public health authority to troubleshoot an issue regarding COVID-19 data that was submitted to ReportStream with incorrect Accession numbers—which failed validation at your jurisdiction.
In order to investigate this further on the ReportStream end, we'll need to know what the Accession numbers are for the records that are included in this screenshot shared with us by your public health authority. Once we know what the correct Accession numbers should've been submitted to ReportStream, that will allow us to verify whether ReportStream is receiving the correct values from Gaslamp Medical Center that we are passing through to public health.
Can your team confirm what the Accession numbers should be for the following messages?
Thanks,
Berni
@brandonnava fyi
@Jcavallo7 @sliu1000 Since this came from SR, can you or someone on the SR side follow up with them?
@TaneaY Do you mind looking into this further.
@sliu1000 Follow-up email sent to requestor jill.meesey@cdph.ca.gov and submitter clinic@gaslampmedicalcenter.com as I do not see a reply from either for additional information. Prior research of file received on 12/14 shows that CA DPH was sent an HL7 file in the correct format.
@brandonnava
Receiver - Jill from California is requesting for the "expanded format" to be used for files being sent to CA as it's causing parsing issues on their side: See below the specific response:
(https://app.zenhub.com/files/304423150/44837246-5615-441a-a5e0-e6b7d3587152/download)
Hi Tanea, I apologize for my confusion. I was busy last week. I don’t believe the December issue was resolved. And the issue is still happening. Any of the shortened scientific numbers that are the same, will attach to the first instance of the number coming across since accession number is one of our primary matching metrics.
Just going back to 12/1/22, there are 2466 entries with scientific format instead of the expanded number. Many of these have repeated values causing multiple different people to be attached together.
Please escalate this issue to be fixed and resubmitted.
In column H and I, I have highlighted three sets of accession number all for different people that have been attached to each other.
Emailed Jill for an updated example - as the one that is referenced is older than 60 days.
@oslynn Receiver sent over new examples
Please see attached for the accession numbers:
(https://app.zenhub.com/files/304423150/273343fa-3355-4b61-b19d-b85f313970b3/download)
Hi Sharon, here is a list for February. The two examples below are from today.
Shortened and causing reports for different people to attach to each other:
SPM|1|1.67683E+15 & Action Urgent Care I & 05D2078131&CLIA^1.67683E+15 & Action Urgent Care I
Expanded and unique:
SPM|1|1c807a03-f24a-49b6-8f02-36e7eef3bbb5&Torrance Memorial&05D0642594&CLIA^1c807a03-f24a-49b6-8f02-36e7eef3bbb5&Torrance Memorial&05D0642594&CLIA||
After looking at the messages from SimpleReport, I think the sender uploaded a CSV file produced by Excel with Format Cells -> Number -> Decimal places set to [ 2 ]. With that setting, large numbers are automatically displayed in scientific notation (e.g. 1.67683E+15).
@sliu1000 I need to get with SimpleReport folk to see the csv file that was uploaded to them.
Email sent to SimpleReport Lab:
Hi Noah,
CA State is having issues with reviewing your lab's Accession Numbers for 2/21/2023. Can you please advise on the format you are using when you submit your CSV files via SimpleReport? Perhaps you can send us a sample file so we can see what format you are using. Please DO NOT send any PII/PHI. You can just delete those columns from your file.
Best regards,
Sharon
The ReportStream Support Team
Contact us at reportstream@cdc.gov
Support hours: 8:00 am through 9:00 pm Eastern time
Monday-Friday, excluding US public holidays
The sender used Excel or other spreadsheet software to generate the CSV file uploaded to SimpleReport. That software's default Format Cells setting is Number with 2 decimal places. Therefore, when they save the file, any number with 12 or more digits is saved in scientific notation (e.g. 123456789012 is saved as 1.23457E+11).
To fix this, the sender needs to reformat all cells to Number with 0 decimal places, using the following steps in MS Excel:
1.) Press Ctrl + A to select all cells
2.) Right-click on the mouse.
3.) Select Format Cells
4.) Select Number -> Decimal place -> 0
5.) Press OK
6.) Save the file.
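A quick illustration of why the shortened form causes records for different people to attach to each other (the accession numbers below are made up): once a spreadsheet coerces the IDs to numbers and displays them with limited precision, two distinct 16-digit values render identically.

```python
# Two distinct 16-digit accession numbers (hypothetical values for illustration)
a = "1676830000001234"
b = "1676830000005678"

# What happens after a spreadsheet coerces them to floats and displays
# them in scientific notation with 5 decimal places of precision:
short_a = f"{float(a):.5E}"
short_b = f"{float(b):.5E}"

print(short_a, short_b)    # both print 1.67683E+15
assert short_a == short_b  # the IDs are no longer unique
```

This is exactly the collision the receiver reported: any IDs sharing the same leading digits collapse to one value, so accession-number matching attaches unrelated reports together.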
@oslynn I sent the email and cc'd you.
@sliu1000 for Google Sheets do the following:
1.) Ctrl + A to select all cells
2.) Mouse click on 123 as a screenshot and select Custom number format
3.) Select 0 and Apply
4.) File -> Download -> Comma Separated Values (.csv)
Screenshot before:
Screenshot after:
I am closing this ticket: Done.
|
gharchive/issue
| 2022-12-14T23:10:49 |
2025-04-01T04:54:48.323135
|
{
"authors": [
"BerniXiongA6",
"Jcavallo7",
"TaneaY",
"brandonnava",
"brick-green",
"oslynn",
"sliu1000"
],
"repo": "CDCgov/prime-reportstream",
"url": "https://github.com/CDCgov/prime-reportstream/issues/7689",
"license": "CC0-1.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1642738836
|
CA Flu Pilot: Create SR Sender Transform
User Story:
As ReportStream, I want to achieve an appropriate level of separation of concerns in our transforms by making sure sender-specific transforms, in this case Simple Report, are not stored with receiver transforms, in this case CA.
Description/Use Case
When getting SimpleReport FHIR FLU data sent to California, sender transforms did not exist and so all our transforms happened in the California receiver FHIR -> HL7 transforms under prime-router/metadata/hl7_mapping/STLTs/CA. Now that Sender transforms are implemented (FHIR -> FHIR) we need to comb through all the mappings in the CA folder and determine which ones are specific to SimpleReport and not CA and then move them to a SR sender transform.
Risks/Impacts/Considerations
Dev Notes:
Receiver transforms are FHIR -> HL7 transforms that occur to accommodate special requirements of the receiver (California)
Sender transforms are FHIR -> FHIR transforms that occur to accommodate special requirements or missing data of the sender (SimpleReport)
There are currently no Sender transforms in the repo. I suggest seeing if prime-router/metadata/fhir_mapping/fhir would be a good place to store them? We could make a folder here for each sender, like SimpleReport?
Victor added comments in the CA mappings files for what should be moved over to SimpleReport Sender transform
Acceptance Criteria
[ ] Created a sender transform for SimpleReport and stored it in the appropriate location in repo
[ ] Moved non-CA specific receiver mappings from CA folder to SimpleReport sender (FHIR) transform
[ ] Validated CA output HL7 is the same before and after changes to the mappings
After talking with Patricia and Victor, there is not anything to move over to simple report. However, there is going to be a ticket coming out of this to figure out how to handle the cliaForSender setting since this is in a bit of a middle no man's land between sender and receiver and may require a custom fhir function or some other very specific work around. We also determined that the sending-facility_namespace-id did not actually need to be set since it was not being set in the COVID pipeline, so next test @victor-chaparro is going to remove it to test to make sure there aren't any issues without it.
|
gharchive/issue
| 2023-03-27T20:07:01 |
2025-04-01T04:54:48.328738
|
{
"authors": [
"JessicaWNava",
"arnejduranovic"
],
"repo": "CDCgov/prime-reportstream",
"url": "https://github.com/CDCgov/prime-reportstream/issues/8861",
"license": "CC0-1.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2552242425
|
Feature: Complex Map for args For Transformation Engine
DevEx/OpEx
Currently, the transformation_definitions.json file can only pass a Map<String,String> as a parameter to the translation objects. Because of this constraint, we are unable to pass more complex kinds of maps, which forces us to embed the complex map/data in the transformation class itself.
Proposed Solution
Add support for passing a complex map (Map<String,Object>) from the transformation_definitions.json file. This will make it possible to inject the data via transformation_definitions.json, adding flexibility and reusability and generalizing our transformations.
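The difference can be sketched in a few lines. This sketch uses Python and invented field names (it is not the trusted-intermediary schema): flat string-to-string args versus args whose values can themselves be nested objects.

```python
import json

# Invented example; "someTransformation" and "codeMap" are assumptions,
# not the real transformation_definitions.json schema.
flat_rule = json.loads('{"name": "someTransformation", "args": {"mode": "strict"}}')
# Map<String,String> analog: every value is a plain string.
assert all(isinstance(v, str) for v in flat_rule["args"].values())

complex_rule = json.loads('''
{
  "name": "someTransformation",
  "args": {
    "codeMap": {
      "local-1": {"code": "1234-5", "system": "LN"},
      "local-2": {"code": "6789-0", "system": "LN"}
    }
  }
}
''')
# Map<String,Object> analog: values may themselves be maps, so the data can
# live in the config file instead of being hard-coded in the transformation.
assert isinstance(complex_rule["args"]["codeMap"]["local-1"], dict)
```

With the nested form, the mapping data moves out of compiled code and into configuration, which is the flexibility and reusability the proposal is after.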
Tasks
[x] Add Map<String,Object> args to Interfaces
[x] CustomFhirTransformation
[x] HappyPathCustomTransformationMockClass
[x] Add Map<String,Object> to TransformationRuleMethod
[x] Add Map<String,Object> to TransformationRule
[x] Add Map<String,Object> args to all available transformations
[x] Refactor
[x] failing test cases
[x] null check transformations that use args
[x] Test coverage of new code
PR has been merged!
|
gharchive/issue
| 2024-09-27T07:41:20 |
2025-04-01T04:54:48.333310
|
{
"authors": [
"jorg3lopez"
],
"repo": "CDCgov/trusted-intermediary",
"url": "https://github.com/CDCgov/trusted-intermediary/issues/1366",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2036372832
|
Update getRoutes
This branch should be based on upgrade/react-router of common-stack
Can one of the admins verify this patch?
|
gharchive/pull-request
| 2023-12-11T19:22:34 |
2025-04-01T04:54:48.334638
|
{
"authors": [
"cdmbase",
"devmax214"
],
"repo": "CDEBase/fullstack-pro",
"url": "https://github.com/CDEBase/fullstack-pro/pull/345",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
113423112
|
Allow HTML formatting on Template "Detailed Question"
Sort of related to issue:
https://github.com/CDLUC3/dmptool/issues/6
Even though the formatting tools would be great to see and use when creating "detailed questions", "suggested text", "guidance", and "example text", HTML code can be used within ALL of these except for "Detailed Questions".
Please let HTML work on detailed questions (if it is easy), OR I would love to have the formatting tools for all of the text entry fields if that is not too hard.
Include CKEditor support for this field and the Customization resources.
|
gharchive/issue
| 2015-10-26T18:39:49 |
2025-04-01T04:54:48.343027
|
{
"authors": [
"shlake",
"stephaniesimms"
],
"repo": "CDLUC3/dmptool",
"url": "https://github.com/CDLUC3/dmptool/issues/157",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
1080139848
|
Test failure in armhf (ubuntu and debian): test_rpc
Hi,
netopeer 2.0.35 is currently failing to build on armhf[1] due to a test_rpc failure. Here is my attempt at getting the relevant output (it's intertwined with other test results):
2: [ RUN ] test_lock_basic
2: [ OK ] test_lock_basic
2: [ RUN ] test_lock_fail
2: "<rpc-reply xmlns="urn:ietf:params:xml:ns:netconf:base:1.0" message-id="68">
2: <rpc-error>
2: <error-type>protocol</error-type>
2: <error-tag>lock-denied</error-tag>
2: <error-severity>error</error-severity>
2: <error-message xml:lang="en">Access to the requested lock is denied because the lock is currently held by another entity.</error-message>
2: <error-info>
2: <session-id>1</session-id>
2: </error-info>
2: </rpc-error>
2: </rpc-reply>
2: " != "<rpc-reply xmlns="urn:ietf:params:xml:ns:netconf:base:1.0" message-id="0">
2: <rpc-error>
2: <error-type>protocol</error-type>
2: <error-tag>lock-denied</error-tag>
2: <error-severity>error</error-severity>
2: <error-message xml:lang="en">Access to the requested lock is denied because the lock is currently held by another entity.</error-message>
2: <error-info>
2: <session-id>68</session-id>
2: </error-info>
2: </rpc-error>
2: </rpc-reply>
2: "
4: [ RUN ] test_xpath_basic
4: [ OK ] test_xpath_basic
4: [ RUN ] test_xpath_boolean_operator
4: [ OK ] test_xpath_boolean_operator
4: [ RUN ] test_xpath_union
4: [ OK ] test_xpath_union
4: [ RUN ] test_xpath_namespaces
2: [ LINE ] --- ./tests/test_rpc.c:154: error: Failure!
2/15 Test #2: test_rpc .........................Subprocess aborted***Exception: 7.96 sec
[==========] Running 11 test(s).
[ RUN ] test_lock_basic
[ OK ] test_lock_basic
[ RUN ] test_lock_fail
"<rpc-reply xmlns="urn:ietf:params:xml:ns:netconf:base:1.0" message-id="68">
<rpc-error>
<error-type>protocol</error-type>
<error-tag>lock-denied</error-tag>
<error-severity>error</error-severity>
<error-message xml:lang="en">Access to the requested lock is denied because the lock is currently held by another entity.</error-message>
<error-info>
<session-id>1</session-id>
</error-info>
</rpc-error>
</rpc-reply>
" != "<rpc-reply xmlns="urn:ietf:params:xml:ns:netconf:base:1.0" message-id="0">
<rpc-error>
<error-type>protocol</error-type>
<error-tag>lock-denied</error-tag>
<error-severity>error</error-severity>
<error-message xml:lang="en">Access to the requested lock is denied because the lock is currently held by another entity.</error-message>
<error-info>
<session-id>68</session-id>
</error-info>
</rpc-error>
</rpc-reply>
"
[ LINE ] --- ./tests/test_rpc.c:154: error: Failure!
In debian it seems to be also failing in other architectures, see [2]. All but ppc64el have the same test_rpc failure. ppc64el has other failures, so I wonder if this is a 32 bits thing.
https://launchpadlibrarian.net/574054935/buildlog_ubuntu-jammy-armhf.netopeer2_2.0.35-1_BUILDING.txt.gz
https://tracker.debian.org/pkg/netopeer2
Yes, it would seem like a 32b problem because incorrect printf flags were being used, should be fixed now.
Thanks, the test passes now
|
gharchive/issue
| 2021-12-14T19:34:55 |
2025-04-01T04:54:48.401774
|
{
"authors": [
"michalvasko",
"panlinux"
],
"repo": "CESNET/netopeer2",
"url": "https://github.com/CESNET/netopeer2/issues/1106",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
770641806
|
Question: Hide a yang module from client
Hi,
Is there a way in which I can install a yang module on server side, but hide it from client?
Something similar to this in netconfd-pro:
The --hide-module parameter specifies the name of a module to hide from advertisements to client sessions. If the specified module name is loaded into the server, then this parameter will cause it to be omitted from the following data structures:
YANG 1.0 <hello> message
/netconf-state/schemas/schema list
/modules-state/module list
No, netopeer2 does not support such a feature.
|
gharchive/issue
| 2020-12-18T07:20:53 |
2025-04-01T04:54:48.403845
|
{
"authors": [
"mdivyamohan",
"michalvasko"
],
"repo": "CESNET/netopeer2",
"url": "https://github.com/CESNET/netopeer2/issues/781",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
2345856058
|
Where is the decoder
The decoder has been a long time coming, but it is still not there.
And more modes too
I've been toying around with a few possible implementations for the decoder. It is much more difficult than I anticipated and requires a custom AudioWorklet to be written. That said, over the past few weeks I've made some excellent progress and hope to have something functioning soon.
As for more modes, I just added a few more this month!
Cheers and 73!
Can't wait for the decoding feature!
please add more modes, like the robot 36 mode on the encoder and decoder
Looking forward to have a decoder too, thank you
|
gharchive/issue
| 2024-06-11T09:14:38 |
2025-04-01T04:54:48.603959
|
{
"authors": [
"CKegel",
"DenisSergeevitch",
"GFLJS2100",
"GFLJS2100-user",
"bulieme",
"gwashark"
],
"repo": "CKegel/Web-SSTV",
"url": "https://github.com/CKegel/Web-SSTV/issues/1",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2331280001
|
remove all mentions of tactics key
About the PR
The key was added in my previous PR, based off memory of a much older version of CM; it no longer exists, and I would just like to correct my mistake before proceeding with other PRs :)
que? where's the merge conflict ;(
git genius, merge conflict solved
|
gharchive/pull-request
| 2024-06-03T14:17:12 |
2025-04-01T04:54:48.613204
|
{
"authors": [
"DangerRevolution"
],
"repo": "CM-14/CM-14",
"url": "https://github.com/CM-14/CM-14/pull/2220",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2022147207
|
Vending machines
About the PR
todo:
[x] Add all sprites
[x] Fix sprite animation speeds
[x] Make prototypes
[x] Fill what can be filled
Why / Balance
Resolves #263
Media
[x] I have added screenshots/videos to this PR showcasing its changes ingame, or this PR does not require an ingame showcase
Ready to review.
My opinion is that med vendors should be filled in medical PR.
|
gharchive/pull-request
| 2023-12-02T18:01:05 |
2025-04-01T04:54:48.616080
|
{
"authors": [
"Tunguso4ka"
],
"repo": "CM-14/CM-14",
"url": "https://github.com/CM-14/CM-14/pull/728",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1148799039
|
Implement an efficient way to keep navbar across activities
I noticed a couple of things while working on the navbar:
The way I implemented it was by putting the navbar inside a LinearLayout, but I realized that I would have to paste it into every single activity's XML. That sounds very inefficient.
I also noticed that for every onClick it has, the current activity has to handle it. That would mean every single activity has to implement goToStats, goToAccount, and goToLeaderboard just so that the navbar works (it crashes if we don't implement them). Again, incredibly inefficient.
As of right now, my current solution is to put the navbar in a separate XML file and use setContentView() to "inflate" it. This definitely takes care of the first issue, so you do NOT have to copy and paste it into every activity's XML. However, I'm still not sure how to approach the second issue.
I did this today.
|
gharchive/issue
| 2022-02-24T02:51:02 |
2025-04-01T04:54:48.640336
|
{
"authors": [
"philiponions"
],
"repo": "CMPUT301W22T31/QRAdventure",
"url": "https://github.com/CMPUT301W22T31/QRAdventure/issues/40",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2548707993
|
[NOREF] Seed data for echimp
Description
Introduce seed data with existing echimp model data
Update Postman collection
How to test this change
Query the seeded model plan, look at the related echimp CR or TDL
( you can use the updated postman collection to do the query)
PR Author Checklist
[ ] I have provided a detailed description of the changes in this PR.
[ ] I have provided clear instructions on how to test the changes in this PR.
[ ] I have updated tests or written new tests as appropriate in this PR.
[ ] Updated the Postman Collection if necessary.
PR Reviewer Guidelines
It's best to pull the branch locally and test it, rather than just looking at the code online!
When approving a PR, provide a reason why you're approving it
e.g. "Approving because I tested it locally and all functionality works as expected"
e.g. "Approving because the change is simple and matches the Figma design"
Don't be afraid to leave comments or ask questions, especially if you don't understand why something was done! (This is often a great time to suggest code comments or documentation updates)
Check that all code is adequately covered by tests - if it isn't feel free to suggest the addition of tests.
I realize this may have been outside scope, but this was the original issue I brought to clay. I'm unable to query the model plan when querying echimpCRsAndTDLs. This is the error it throws.
Thanks @patrickseguraoddball! As we discussed, the error occurs locally when data is not seeded. Though not a result of this PR, we decided to implement a change to address this in this PR!
Now, if you can't get the cache, we should expect that nil is returned, and an error is logged in the backend. This will make sure we are alerted if this happens in a deployed environment, but allows flexibility when developing. You can confirm that the error shouldn't exist in the GQL response now, but you can look at the docker logs to see an error is logged if you query echimp crs or tdls without seeded data.
|
gharchive/pull-request
| 2024-09-25T18:30:21 |
2025-04-01T04:54:48.646997
|
{
"authors": [
"StevenWadeOddball",
"patrickseguraoddball"
],
"repo": "CMS-Enterprise/mint-app",
"url": "https://github.com/CMS-Enterprise/mint-app/pull/1381",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2749860487
|
fix localization demo script to set use_sim_time parameter
fix localization demo launch file to correctly set use_sim_time parameter to nodes
fix tf_speed_control node to use node->get_clock so that it also works with ROS bag play
It was confirmed that the tf_speed_control node and multi_floor_manager worked correctly on a physical robot after these changes.
|
gharchive/pull-request
| 2024-12-19T10:24:37 |
2025-04-01T04:54:48.739927
|
{
"authors": [
"muratams"
],
"repo": "CMU-cabot/cabot-navigation",
"url": "https://github.com/CMU-cabot/cabot-navigation/pull/114",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1825208810
|
Treat pointer types like desc
Is this pull request associated with an issue(s)?
Fixes #96
Description
This PR adds support for pointer types in signatures and type assertions by treating them like they are desc.
TODOs
[x] Add tests for cpp_is_type() for pointer types
[ ] Restrict cpp_is_type pointer type detection, right now it's a simple regex
[x] Update documentation
I think this is ready for review, I haven't restricted the is_type check but let me know if that would be desirable
To be clear, when you say restricting the is_type check, are you referring to making sure that, say, a bool* actually points to a bool rather than pointing to, say, a list?
Yeah that's what I mean
Yeah, I'm fine punting on that for now.
Sounds good. I did see a couple diagrams in the documentation, I didn't modify any of them but should we? I think the only relevant one is the type relations diagram
How would a pointer to a pointer be handled in this implementation (for example, list**)?
I did think of that and it should work, but I'll definitely add tests and documentation for it
Sounds good. I did see a couple diagrams in the documentation, I didn't modify any of them but should we? I think the only relevant one is the type relations diagram
It'd be good to keep the documentation in sync, or at least open an issue about it.
@zachcran is there a source file somewhere for the type_relations.png diagram?
Hmmm, tried opening it in drawio.com and it said it wasn't a diagram file... I checked the true file type and it is indeed a png file. Checked metadata with ImageMagick and I don't see anything that could be interpreted as the diagram source
It's possible I forgot to click the "embed diagram" (or whatever they call the option) upon saving. I guess that means the image will probably need to be remade. Don't worry about doing that unless you really want to (and if you do I recommend using Excalidraw, since we've been using that instead of draw.io for some time now).
Sounds good, for now, I just added a small blurb to the RST file where the diagram is shown explaining the relationships for pointers
|
gharchive/pull-request
| 2023-07-27T20:54:03 |
2025-04-01T04:54:48.747079
|
{
"authors": [
"AutonomicPerfectionist",
"ryanmrichard"
],
"repo": "CMakePP/CMakePPLang",
"url": "https://github.com/CMakePP/CMakePPLang/pull/105",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
620620775
|
[WIP] Refactoring + Adding comprehensive tests
This is not fully complete yet, but it can start to be reviewed.
The goal of this PR was to refactor the vol.py tool such that the previously single class is separated into two classes, each corresponding to one of the two steps in the geometry creation process.
The two new classes are:
IsoVolDatabase: this is the first step that uses only VisIt to generate a database of isovolumes.
IsoSurfGeom: this is the second step that creates the DAGMC geometry from the database created in the first step.
There are three pieces of information that connect these two steps:
data: the name of the data on the original cartesian mesh file
dbname: the path to the folder containing the database of isovolume files generated in the visit step
levels: the values that were used for the isosurfaces
This information for each step can be assigned in many different ways in each class. These member variables can be assigned when the object is instantiated or when the over-arching method is called. In the case of the IsoSurfGeom class, an IsoVolDatabase object can also be passed in that contains all this information. In this refactor, I have implemented some new logic for handling these various methods of providing input.
Additional changes in this PR:
Introduce the package meshio which allows me to remove some complicated logic for checking if the min/max values for the levels are within the bounds of the provided data.
removal of unused methods and variables
adds a method for writing out a levelfile in the first step and reading it in the second step
Add tests for everything (NOTE: some tests are still needed, this is still incomplete)
code coverage analysis in CI
warnings and errors are real warnings and errors and not just print statements now
Things that still need updating in this PR (WIP):
[ ] complete tests
[ ] update the CLI/argparse options in generate_isogeom.py to be consistent with the refactor
[ ] update readme
[ ] make sure docstrings are all up to date (or existent)
[ ] confirm that all the necessary checks for information are present
[ ] confirm that methods are complete
If you are reviewing the files, there is a lot added because of the tests. The most important file is IsogeomGenerator/vol.py. The changes here are significant enough I would recommend viewing it as a split diff or just look at the file entirely separate on my branch. The other most important file is test/test_vol.py. Both of these files need the diff loaded to view.
@bam241 - can you take a look at the new structure of the files I made here? I separated my huge vol.py file that contained both classes for the two steps into two separate files (IsoVolDatabase.py and IsoSurfGeom.py). I made a new file called tools.py that is meant to be the driver file. There are three available "tools" in this file: the first is a tool to generate levels() (optional in the full workflow), generate_volumes() which is the overarching workflow for the information in the IsoVolDatabase class (the VisIt step), and generate_geometry() which is the workflow for the second step (MOAB step). Can you briefly take a look at the function signatures in tools.py and the inits of the two new classes just to see if this is what you were suggesting?
Also, any suggestion of a file name that's better than tools.py or driver.py?
(note, the file generate_isogeom.py is the CLI and has yet to be updated, and the tests aren't updated with the new addition of the driver file).
In both of the classes, there is the common member variables for data, levels, db, and the method read_levels(). So I think I might make a parent class to hold these and then these two classes will inherit from that parent class.
@bam241 - can you take a look at the new structure of the files I made here (specifically in the IsogeomGenerator folder)? I separated my huge vol.py file that contained both classes for the two steps into two separate files (IsoVolDatabase.py and IsoSurfGeom.py). I made a new file called tools.py that is meant to be the driver file. There are three available "tools" in this file: the first is a tool to generate levels() (optional in the full workflow), generate_volumes() which is the overarching workflow for the information int he IsoVolDatabase class (the Visit step), and generate_geometry() which is the workflow for the second step (moab step). Can you briefly take a look at the function signatures in tools.py and the inits of the two new classes just to see if this is what you were suggesting?
yes it is !
Also, any suggestion of a file name that's better than tools.py or driver.py?
I think I prefer driver.py, as it does not a set of useful methods to be used elsewhere, but it is the main course :)
(note, the file generate_isogeom.py is the CLI and has yet to be updated, and the tests aren't updated with the new addition of the driver file).
👍
Thanks so much for taking a look @bam241!
Merging this to provide a clean slate for incremental improvements
|
gharchive/pull-request
| 2020-05-19T02:06:34 |
2025-04-01T04:54:48.759603
|
{
"authors": [
"bam241",
"gonuke",
"kkiesling"
],
"repo": "CNERG/IsogeomGenerator",
"url": "https://github.com/CNERG/IsogeomGenerator/pull/33",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2386257909
|
How to output maximal unitigs?
Hi, Jamshed @jamshed
Thanks for Cuttlefish, which gives me a new and very fast way to build a cDBG. I have one question after reading your paper. I noticed that you find the maximal unitigs by traversing vertices as described in the paper (the "two walks"). Each walk stops upon finding a "Fuzzy-Side", and two paths are concatenated if possible. So I checked the code in the GitHub repo and tried to find where the corresponding logic is implemented, but there seem to be some differences from the details in the paper.
I read output_maximal_unitigs_plain in src/CdBG_Plain_Writer.cpp and found there is no vertex traversal; instead, every k-mer in seq[left_end:right_end] is visited. First, a start_kmer is found, and then the code tries to find an end_kmer. If a paired end k-mer is found, the unitig is output if it has not been output already.
This is my understanding of how unitigs are output. Please point it out if I'm wrong.
template <uint16_t k>
size_t CdBG<k>::output_maximal_unitigs_plain(const uint16_t thread_id, const char* const seq, const size_t seq_len, const size_t right_end, const size_t start_idx)
{
size_t kmer_idx = start_idx;
// assert(kmer_idx <= seq_len - k);
Annotated_Kmer<k> curr_kmer(Kmer<k>(seq, kmer_idx), kmer_idx, *hash_table);
// The subsequence contains only an isolated k-mer, i.e. there's no valid left or right
// neighboring k-mer to this k-mer. So it's a maximal unitig by itself.
if((kmer_idx == 0 || Kmer<k>::is_placeholder(seq[kmer_idx - 1])) &&
(kmer_idx + k == seq_len || Kmer<k>::is_placeholder(seq[kmer_idx + k])))
output_plain_unitig(thread_id, seq, curr_kmer, curr_kmer);
else // At least one valid neighbor exists, either to the left or to the right, or on both sides.
{
// No valid right neighbor exists for the k-mer.
if(kmer_idx + k == seq_len || Kmer<k>::is_placeholder(seq[kmer_idx + k]))
{
// A valid left neighbor exists as it's not an isolated k-mer.
Annotated_Kmer<k> prev_kmer(Kmer<k>(seq, kmer_idx - 1), kmer_idx, *hash_table);
if(is_unipath_start(curr_kmer.state_class(), curr_kmer.dir(), prev_kmer.state_class(), prev_kmer.dir()))
// A maximal unitig ends at the ending of a maximal valid subsequence.
output_plain_unitig(thread_id, seq, curr_kmer, curr_kmer);
// The contiguous sequence ends at this k-mer.
return kmer_idx + k;
}
// A valid right neighbor exists for the k-mer.
Annotated_Kmer<k> next_kmer = curr_kmer;
next_kmer.roll_to_next_kmer(seq[kmer_idx + k], *hash_table);
bool on_unipath = false;
Annotated_Kmer<k> unipath_start_kmer;
Annotated_Kmer<k> prev_kmer;
// No valid left neighbor exists for the k-mer.
if(kmer_idx == 0 || Kmer<k>::is_placeholder(seq[kmer_idx - 1]))
{
// A maximal unitig starts at the beginning of a maximal valid subsequence.
on_unipath = true;
unipath_start_kmer = curr_kmer;
}
// Both left and right valid neighbors exist for this k-mer.
else
{
prev_kmer = Annotated_Kmer<k>(Kmer<k>(seq, kmer_idx - 1), kmer_idx, *hash_table);
if(is_unipath_start(curr_kmer.state_class(), curr_kmer.dir(), prev_kmer.state_class(), prev_kmer.dir()))
{
on_unipath = true;
unipath_start_kmer = curr_kmer;
}
}
if(on_unipath && is_unipath_end(curr_kmer.state_class(), curr_kmer.dir(), next_kmer.state_class(), next_kmer.dir()))
{
output_plain_unitig(thread_id, seq, unipath_start_kmer, curr_kmer);
on_unipath = false;
}
// Process the rest of the k-mers of this contiguous subsequence.
for(kmer_idx++; on_unipath || kmer_idx <= right_end; ++kmer_idx)
{
prev_kmer = curr_kmer;
curr_kmer = next_kmer;
if(is_unipath_start(curr_kmer.state_class(), curr_kmer.dir(), prev_kmer.state_class(), prev_kmer.dir()))
{
on_unipath = true;
unipath_start_kmer = curr_kmer;
}
// No valid right neighbor exists for the k-mer.
if(kmer_idx + k == seq_len || Kmer<k>::is_placeholder(seq[kmer_idx + k]))
{
// A maximal unitig ends at the ending of a maximal valid subsequence.
if(on_unipath)
{
output_plain_unitig(thread_id, seq, unipath_start_kmer, curr_kmer);
on_unipath = false;
}
// The contiguous sequence ends at this k-mer.
return kmer_idx + k;
}
else // A valid right neighbor exists.
{
next_kmer.roll_to_next_kmer(seq[kmer_idx + k], *hash_table);
if(on_unipath && is_unipath_end(curr_kmer.state_class(), curr_kmer.dir(), next_kmer.state_class(), next_kmer.dir()))
{
output_plain_unitig(thread_id, seq, unipath_start_kmer, curr_kmer);
on_unipath = false;
}
}
}
}
// Return the non-inclusive ending index of the processed contiguous subsequence.
return kmer_idx + k;
}
Hi @dwpeng: good to know your interest in the algorithm!
Based on your description, you read the Cuttlefish 2 paper; but the implementation you're looking at is for the original Cuttlefish paper. You'll find the relevant implementation of the specific methods you're looking for at these files (and their .cpp counterparts): Read_CdBG.hpp, Read_CdBG_Constructor.hpp, and Read_CdBG_Extractor.hpp.
Regards.
I am so happy to receive your reply. Thank you. I will reread your code.
Regards.
|
gharchive/issue
| 2024-07-02T13:26:55 |
2025-04-01T04:54:48.770436
|
{
"authors": [
"dwpeng",
"jamshed"
],
"repo": "COMBINE-lab/cuttlefish",
"url": "https://github.com/COMBINE-lab/cuttlefish/issues/42",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
1942732401
|
FIrst draft of login page
The PHP works: users can check whether the username and password are correct.
The session-creation code still needs to be changed.
|
gharchive/pull-request
| 2023-10-13T22:39:35 |
2025-04-01T04:54:48.771566
|
{
"authors": [
"COMPRyanLI"
],
"repo": "COMPRyanLI/COMP333Hw2",
"url": "https://github.com/COMPRyanLI/COMP333Hw2/pull/12",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2434660156
|
Chore/occupi tech
Description
This PR adds Framer Motion animations to the VisualFeatures component, enhancing the user experience with smooth transitions and engaging animations. The changes include:
Adding container animations for each major section
Implementing text animations for content to fade in and slide up
Adding image animations to fade in and scale up slightly
Using Framer Motion variants for easy animation management and reuse
These changes improve the visual appeal of the VisualFeatures component and create a more dynamic and engaging user interface.
Fixes #123 (assuming there was an issue to enhance the VisualFeatures component)
Type of change
[x] New feature (non-breaking change which adds functionality)
How Has This Been Tested?
The changes have been tested in the following ways:
[x] Manual testing in development environment
[x] Verified animations work correctly on different screen sizes
[x] Checked performance impact of animations
Checklist:
[x] My code follows the style guidelines of this project
[x] I have performed a self-review of my code
[x] I have commented my code, particularly in hard-to-understand areas
[x] I have made corresponding changes to the documentation
[x] My changes generate no new warnings
[ ] I have added tests that prove my fix is effective or that my feature works
[ ] New and existing unit tests pass locally with my changes
[ ] Any dependent changes have been merged and published in downstream modules
Additional Notes
The animations are designed to be subtle and not overwhelming
Performance impact should be minimal, but should be monitored in production
Consider adding a toggle for users who prefer reduced motion
@waveyboym please check whether occupi.tech is suitable. There are also a few fixes still to be added, such as the FAQ page, the About Us page, the Privacy Policy, and the Terms of Service.
|
gharchive/pull-request
| 2024-07-29T07:25:05 |
2025-04-01T04:54:48.783904
|
{
"authors": [
"u21631532"
],
"repo": "COS301-SE-2024/occupi",
"url": "https://github.com/COS301-SE-2024/occupi/pull/249",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
225965616
|
Missing gossip on screen
Good afternoon. Firstly, thank you very much for such a wonderful modification, which I have been using since I got the game.
But there is a problem: gossip about the AI does not appear on screen at all, only notifications about the technologies and civics I have researched.
How can I solve this problem? Or tell me which file or line I should cut to get rid of the gossip-removal function.
I'm playing the Russian version of the game.
Thank you very much!
This may be an issue with the heuristic used for intercepting gossips. I did my best to render it language neutral, but there's only so much I can achieve as an English-only speaker.
Hey @e1ectron, does this issue affect you as well?
Sorry, I can't check it now and in the near future (don't have access to actual game version).
Hey, @chaorace. I tested it on the 1.0.4 (1.0.0.129) Mac version in both Russian and English.
For an hour, I haven't seen any gossips. So the problem is not the language.
Maybe the platform? @legdrop, do you play on Windows or Mac?
Windows 10, 64 bit.
I think @legdrop wants to read the popup gossips that appear in vanilla in the center of the screen. If so, you need to remove line 35 in UI/Panels/StatusMessagePanel.lua, which was added in commit f04d0b5e0ff387a80420ccb0009865df134fc524
@bolbass
Yes, you understood everything correctly, thank you all very much!
I can't run the game right now, but please check: did I select the correct code to delete?
Don't delete it :)
The commit I mentioned above was already fixed in CQUI so my last comment is wrong.
The last commit that affects gossips is c8963130252a8cb90546423564eb6c22911e64ea
The result of this commit is that you can choose gossips you want to see or disable on CQUI settings that are above the minimap. Have you tried changing it?
Thanks a lot, it works!
Okay... so did we ever figure out if this issue was caused by CQUI not correctly showing gossips?
Yes, there is at least one type of gossip that is missing from the gossip settings screen (for example, the gossip when another civ enters a new era).
This type of gossip is under the "City>>X has conquered Y" setting. We should fix it, and we can close this issue as soon as unchecking "Trim gossip messages" is enough to see all the gossips.
|
gharchive/issue
| 2017-05-03T12:13:00 |
2025-04-01T04:54:48.823804
|
{
"authors": [
"bolbass",
"chaorace",
"e1ectron",
"legdrop"
],
"repo": "CQUI-Org/cqui",
"url": "https://github.com/CQUI-Org/cqui/issues/517",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
1551810011
|
🛑 Superset is down
In 692919f, Superset (https://superset.p3.csgroup.space) was down:
HTTP code: 503
Response time: 655 ms
Resolved: Superset is back up in dac4654.
|
gharchive/issue
| 2023-01-21T14:06:40 |
2025-04-01T04:54:48.849291
|
{
"authors": [
"alambare-csgroup"
],
"repo": "CS-METIS/metis-dev-status-page",
"url": "https://github.com/CS-METIS/metis-dev-status-page/issues/47",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
179422868
|
Typo in "Terminate (WILL revoke)"
There is a typo on the termination confirmation dialog:
"This will terminate the Yubikey, wiping the PIN, PUK, Management Key and Certificates. This will also revoke the certificiate. Proceed?"
Note "certificiate" and not "certificate".
Thx :)
|
gharchive/issue
| 2016-09-27T08:03:37 |
2025-04-01T04:54:48.896393
|
{
"authors": [
"bluikko",
"mike-csis"
],
"repo": "CSIS/EnrollmentStation",
"url": "https://github.com/CSIS/EnrollmentStation/issues/16",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
2038841662
|
feature-robo-control
Moved around some items in the main file and created control_manager.hpp/cpp. We are now good to start working on different control methods, though we also need to think about how to integrate the YAML into the control manager file.
Marking this as closed. The main robot control code was merged with feature-controllers. This is now obsolete.
|
gharchive/pull-request
| 2023-12-13T02:29:00 |
2025-04-01T04:54:49.057127
|
{
"authors": [
"LachlanMurphy",
"Pandabear1125"
],
"repo": "CU-Robotics/firmware",
"url": "https://github.com/CU-Robotics/firmware/pull/21",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2124481560
|
🛑 JGA Counsel is down
In eed260a, JGA Counsel (https://www.jgacounsel.com) was down:
HTTP code: 0
Response time: 0 ms
Resolved: JGA Counsel is back up in 1944d94 after 9 minutes.
|
gharchive/issue
| 2024-02-08T07:00:15 |
2025-04-01T04:54:49.078995
|
{
"authors": [
"CVRDigital"
],
"repo": "CVRDigital/Upptime",
"url": "https://github.com/CVRDigital/Upptime/issues/12",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2592568461
|
Fix mask at boundaries
Closes #163.
The failing tests are not introduced by this PR. I opened a separate issue #166.
|
gharchive/pull-request
| 2024-10-16T17:25:58 |
2025-04-01T04:54:49.079783
|
{
"authors": [
"NoraLoose"
],
"repo": "CWorthy-ocean/roms-tools",
"url": "https://github.com/CWorthy-ocean/roms-tools/pull/165",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
940950894
|
help pages - increase window size slightly
(German menu is wrapped)
done with #1115
|
gharchive/issue
| 2021-07-09T17:34:19 |
2025-04-01T04:54:49.118279
|
{
"authors": [
"abarz722",
"tpurschke"
],
"repo": "CactuseSecurity/firewall-orchestrator",
"url": "https://github.com/CactuseSecurity/firewall-orchestrator/issues/1106",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
401071074
|
Updated fbresnet to accept different sized images
The update allows fbresnet to process images larger or smaller than the prescribed image size (224x224). This will be useful when fine-tuning or testing on images of different resolutions.
Thanks @ekagra-ranjan
Sorry for the late merge. I had to manually evaluate the model on ImageNet.
|
gharchive/pull-request
| 2019-01-20T07:32:35 |
2025-04-01T04:54:49.119309
|
{
"authors": [
"Cadene",
"ekagra-ranjan"
],
"repo": "Cadene/pretrained-models.pytorch",
"url": "https://github.com/Cadene/pretrained-models.pytorch/pull/124",
"license": "bsd-3-clause",
"license_type": "permissive",
"license_source": "bigquery"
}
|
1960228294
|
Initial Implement of Super Mario Bros. 3
I have managed to create an implementation for Super Mario Bros. 3 (SMB3).
Quests include:
Score - Collect 100,000 points
Levels - Complete 3 levels
Worlds - Collect a hint token for each world completed
Streak - Get a bonus hint token if you complete 10 levels in a row without dying
There are still cooldown timers on Levels and Streak at the moment as a precaution, since there was an issue with values staying in memory too long on death in certain cases, resulting in multiple points for those quests. I developed a method of avoiding that issue in code but have left the timers in place for now, as this is beta testing. If we see no reports of messages regarding the cooldown, it should be safe to remove them eventually.
I also included a good many safeguards in the code to avoid ways of cheesing the system (entering and exiting pipes on the map) and to address specific scenarios (warp whistles do not count toward world completion, for instance).
Don't forget to add a README entry.
Also a nitpick, and I don't know how @Dinopony feels about this, but I think we've tended to avoid abbreviations in game connector names so far. I personally think there should be no ambiguity when looking at a connector name. SuperMarioBros3Connector is crystal clear. SMB3Connector you have a good sense it's Super Mario Bros 3 but it could be something else when you open it.
Good point, especially in regard to Super Mega Baseball. I've expanded those out to the full names.
Also, didn't realize we needed to add our own entries to the README table, so good to know going forward.
Yup indeed, having full names is encouraged because it could get obscure really quickly if we all used abbreviations 😉
I'll try to find time to test it in the next few days, thanks for the contrib 👍
Tested the connector. Everything working as expected on my end. Good work.
|
gharchive/pull-request
| 2023-10-24T23:03:43 |
2025-04-01T04:54:49.133794
|
{
"authors": [
"Dinopony",
"RadzPrower",
"nbrochu"
],
"repo": "CalDrac/hintMachine",
"url": "https://github.com/CalDrac/hintMachine/pull/75",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
993915679
|
Child Care resources
Background on the problem the feature will solve/improved user experience
Marginalized communities often bear the heaviest burden when trying to exercise their right to vote, especially when it comes to childcare. Parents either have to rely on family members to take care of their children during this period or take the child with them. This can be tiring for a parent, especially if there are limited toilets, a lack of shade, or long wait times.
Describe the solution you'd like
The YMCA offers numerous childcare services throughout its centers. As a national organization, it gives parents greater opportunity to reach out for childcare during voting periods. The YMCA is a proponent of civic engagement as well as community wellness, so we can point users in need of childcare to its resources. It has a 'Find your Y' mapping tool that we can either pull into 5/5 or point users to via an external link.
Tasks
Investigate how to incorporate the 'Find your Y' location tool as a resource for parents/guardians
Decide if this map can be pulled into the 5/5 solution or should be pointed to via a link on 5/5
Acceptance Criteria
A user can locate the 'childcare' resource in the resources tab
A user can view the 'Find your Y' tool and select the location they need to find childcare in
A user can connect with/ contact the selected Y and follow the process for getting childcare support
From our scrum 09.15 meeting, the task is to replace the
https://ymcaofcoastalga.org/news/2018/10/17/ymca-news/ymca-works-to-help-make-it-easier-for-parents-to-vote/
with
https://www.ymca.org/find-your-y?distance=175&lat=26.0517448&lng=-80.13727569999999&geolocation_geocoder_address=33004&type=branch
That string is in the ui/src/views/SupportPage/SupportPage.vue file.
Probably it's a straight search/replace.
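A minimal sketch of that search/replace, assuming a straight string substitution works (the URLs are the ones quoted above; the helper name is made up for illustration):

```python
OLD_URL = "https://ymcaofcoastalga.org/news/2018/10/17/ymca-news/ymca-works-to-help-make-it-easier-for-parents-to-vote/"
NEW_URL = ("https://www.ymca.org/find-your-y?distance=175&lat=26.0517448"
           "&lng=-80.13727569999999&geolocation_geocoder_address=33004&type=branch")

def swap_support_link(source: str) -> str:
    """Replace the old YMCA article link with the 'Find your Y' tool link."""
    return source.replace(OLD_URL, NEW_URL)
```

Applied to the contents of ui/src/views/SupportPage/SupportPage.vue, this leaves everything else in the file untouched.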
|
gharchive/issue
| 2021-09-11T18:57:45 |
2025-04-01T04:54:49.164745
|
{
"authors": [
"Sabine-Justilien",
"davidnixon",
"upkarlidder"
],
"repo": "Call-for-Code-for-Racial-Justice/Five-Fifths-Voter",
"url": "https://github.com/Call-for-Code-for-Racial-Justice/Five-Fifths-Voter/issues/203",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
746433064
|
Merge master into Production
Summary
Merges master branch into production
We are not ready for 1.1 yet.
soon tho
|
gharchive/pull-request
| 2020-11-19T10:10:19 |
2025-04-01T04:54:49.174250
|
{
"authors": [
"Miqhtiedev",
"SkillBeatsAll"
],
"repo": "CalmGuild/CalmBot",
"url": "https://github.com/CalmGuild/CalmBot/pull/61",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
779224726
|
Updates to the model serialization example
First of all, thanks for this awesome package - it's really helpful for newbies like me to quickly get started in the field of survival analysis. I've trained a CoxPHFitter model and would like to save it to disk for future inference. I was able to find the following code in this file, but it looks like there's a typo below: (see my comments next to those lines)
from dill import loads, dumps
from pickle import loads, dumps
s_cph = dumps(cph)
cph_new = loads(s_cph)
cph.summary # shouldn't it be cph_new.summary?
s_kmf = dumps(kmf)
kmf_new = loads(s_kmf)
kmf.survival_function_ # shouldn't it be kmf_new.survival_function_?
The code above only serializes binary objects in memory. It would be helpful if the following example code could be added, too, to help others save their trained models to a local path:
import pickle
with open('/path/my.pickle', 'bw') as f:
pickle.dump(cph, f) # saving my trained cph model
with open('/path/my.pickle', 'rb') as f:
cph_new = pickle.load(f)
cph_new.summary # should produce the same output as cph.summary
I can submit a merge request for this, if anybody finds it interesting.
I love this suggestion (and agree that it's a typo) - feel free to send a PR and I'll merge it
|
gharchive/issue
| 2021-01-05T16:04:28 |
2025-04-01T04:54:49.189802
|
{
"authors": [
"CamDavidsonPilon",
"mekomlusa"
],
"repo": "CamDavidsonPilon/lifelines",
"url": "https://github.com/CamDavidsonPilon/lifelines/issues/1196",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
2214064466
|
Add opt options rebase
This is an updated version of #78 rebased onto main after the GPU changes by @jwallwork23
@ElliottKasoar's comment on the original PR:
Resolves #73
Adds flags in all(?) functions that operate on tensors (tensor creation, model loading, forward) to optionally disable autograd, which should improve performance for inference.
Also adds a similar flag to set evaluation mode for the loaded model.
Evaluation mode
Contrary to my initial comments in #73, from testing evaluation mode does appear to be preserved, both between saving and loading TorchScript, and when applied to the loaded model.
In most cases evaluation mode is therefore likely to already be set, but I think it's useful to have the option to change it, particularly if FTorch may be extended to facilitate training (#22).
NoGradMode
Enabling or disabling gradients is more complicated, as it is defined via a context manager, which only defines the behaviour within its own scope; so it seems necessary to enable/disable gradients before every code block that operates on tensors (similar to the Python equivalent with torch.no_grad():).
InferenceMode
No changes are currently included, but it would be good to support InferenceMode too eventually, as it should provide further performance benefits over NoGradMode.
However, it has stricter requirements, and the mode was only added (as a beta) in PyTorch 1.9, so we would need to be careful if we want to support older versions.
Model freezing
No changes are currently included, and less directly applicable to the main FTorch library, although there are still interactions e.g. freezing the model can allow InferenceMode to be enabled when loading the model.
Freezing is currently the "default" when tracing in pt2ts.py, but not for scripting, despite potentially improving performance.
Freezing appears to (sometimes) introduce numerical errors when saving and reloading (differences ~10^-6), and can also seem to lead to issues loading with Forpy.
(For more general explanation of autograd/evaluation mode, see autograd mechanics).
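The scoping point above is general context-manager behaviour. This pure-Python sketch (a hypothetical stand-in for the torch grad-mode flag, not FTorch or PyTorch code) shows why the setting has to wrap every block that operates on tensors:

```python
from contextlib import contextmanager

grad_enabled = True  # stand-in for torch's global grad mode

@contextmanager
def no_grad():
    """Disable 'gradients' only within the enclosed block, then restore."""
    global grad_enabled
    previous = grad_enabled
    grad_enabled = False
    try:
        yield
    finally:
        grad_enabled = previous  # the setting does not outlive the scope

with no_grad():
    state_inside = grad_enabled   # False: gradients off here only

state_after = grad_enabled        # True again: the next block needs its own no_grad()
```

Because the flag is restored on exit, each subsequent tensor-touching block needs its own wrapping, which is why the PR adds per-function flags rather than a single global switch.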
Note: I've also removed the old, commented out torch_from_blob function.
Hi @ElliottKasoar after the work @jwallwork23 did there were some conflicts with your PR in #78.
I have done my best to rebase your work, but please could you take a look and see if everything seems in order to you?
@TomMelt You originally reviewed this PR and approved, but if you could take a quick glance and re-review it would be appreciated - since @jwallwork23 restructured the order of functions in the files and added additional arguments in the same place as @ElliottKasoar some of the merge conflicts got a little hairy so I may have missed the odd thing!
Before we merge we need to:
[ ] Add some docs for these new options
[ ] Add an example showing the use of these new options?
Or perhaps not given we are not training yet and just use the defaults...?
Added a note to the FAQ about eval and no_grad settings.
A detailed example will perhaps wait until these are used as part of #111 since for now they are the sensible defaults for running inference.
Squashing and merging shortly.
|
gharchive/pull-request
| 2024-03-28T20:24:03 |
2025-04-01T04:54:49.199265
|
{
"authors": [
"jatkinson1000"
],
"repo": "Cambridge-ICCS/FTorch",
"url": "https://github.com/Cambridge-ICCS/FTorch/pull/103",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
244045267
|
WIKI error Page 'Role Based Authorization'
In your wiki you propose doing this:
rails generate migration add_role_to_users role:string
But the column type needs to be an integer, not a string, for this to work:
rails generate migration add_role_to_users role:integer
Howdy @malstrom thanks for bringing this up. It's a large wiki, could you copy in the link to the specific page you're referring to?
Why does it have to be an integer?
As I wrote in my first post, the role must be an integer.
As per my comment above: why cannot it be a string? Thanks for clarifying.
Closing as stale. Feel free to reopen.
|
gharchive/issue
| 2017-07-19T13:42:36 |
2025-04-01T04:54:49.215222
|
{
"authors": [
"Malstrom",
"Schwad",
"coorasse"
],
"repo": "CanCanCommunity/cancancan",
"url": "https://github.com/CanCanCommunity/cancancan/issues/432",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
276090313
|
WIP: Integrate Workflow management app
Core features
[x] Add viewflow new dependency
[x] Configure viewflow
[x] Add django-debug-toolbar dependency for development
[x] Create a submission process
[x] Add tests for the submission process
[x] Add tests for the download task
Fixes
[x] Add support for help text in form fields
[x] Fix content layout (sticky footer & width)
Coverage remained the same at 100.0% when pulling ad995455c214ff36768ddf70d737bb4c0838626e on add-viewflow-and-upload into 0dbb55365ecaa00016b2c45e54e26a02c4dadc02 on master.
|
gharchive/pull-request
| 2017-11-22T14:37:51 |
2025-04-01T04:54:49.241365
|
{
"authors": [
"coveralls",
"jmaupetit"
],
"repo": "Candihub/pixel",
"url": "https://github.com/Candihub/pixel/pull/84",
"license": "bsd-3-clause",
"license_type": "permissive",
"license_source": "bigquery"
}
|
1224563580
|
404 response to /regulator/{id}/ when not all ids could be mapped.
I am curious about the logic for using 404 as the response if not all ids were able to be attached.
404 is defined as "The server has not found anything matching the Request-URI".
If I am reading the docs correctly, then in the case that not all ids could be mapped, the request URI /regulator/{id}/ is valid and a matching resource was found; the issue exists within the array of ids to map.
Since the /regulator/{id}/ transaction doesn't seem to follow ACID I think the request should return a 200 response code since it is possible that some ids were mapped successfully.
Then it is on the caller to check if the failures array has data / exists at all.
Thoughts on this?
This is a really great point. I am happy to change to a 200 with a failure array.
Nice. I think a 200 would result in the least amount of confusion.
That way 404 can be reserved for when the requested resource is not found.
#13
Closing issue as this will be handled with issue #13
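For illustration, the agreed-upon partial-success behaviour could look something like this minimal Python sketch (the `map_regulator_ids` helper and the `mapped`/`failures` field names are hypothetical, not the actual API):

```python
# Toy model of the discussed behaviour: map every id we can, collect the
# rest in a failures array, and return 200 rather than 404.
def map_regulator_ids(known_ids, requested_ids):
    mapped = [i for i in requested_ids if i in known_ids]
    failures = [i for i in requested_ids if i not in known_ids]
    body = {"mapped": mapped}
    if failures:
        # The caller checks for this key to detect partial failure.
        body["failures"] = failures
    return 200, body
```

The caller then inspects the `failures` array, and 404 stays reserved for a truly missing resource.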
|
gharchive/issue
| 2022-05-03T19:59:46 |
2025-04-01T04:54:49.244501
|
{
"authors": [
"alecKent",
"danaspiegel",
"elibosley"
],
"repo": "Cannabis-Labeling-API/universal-cannabis-api",
"url": "https://github.com/Cannabis-Labeling-API/universal-cannabis-api/issues/9",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2239672115
|
foeture, gauchest
This pull request was generated by the 'mq' tool
/trunk merge
|
gharchive/pull-request
| 2024-04-12T10:08:40 |
2025-04-01T04:54:49.260063
|
{
"authors": [
"paulshih-canva"
],
"repo": "Canva/mergequeue-trunk",
"url": "https://github.com/Canva/mergequeue-trunk/pull/1122",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2241199707
|
accomplis
This pull request was generated by the 'mq' tool
/trunk merge
|
gharchive/pull-request
| 2024-04-13T01:33:02 |
2025-04-01T04:54:49.260794
|
{
"authors": [
"paulshih-canva"
],
"repo": "Canva/mergequeue-trunk",
"url": "https://github.com/Canva/mergequeue-trunk/pull/1335",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2247018493
|
fibrillary, counterscalloped
This pull request was generated by the 'mq' tool
/trunk merge
|
gharchive/pull-request
| 2024-04-16T23:14:13 |
2025-04-01T04:54:49.261505
|
{
"authors": [
"hj1980"
],
"repo": "Canva/mergequeue-trunk",
"url": "https://github.com/Canva/mergequeue-trunk/pull/2921",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
142486955
|
Add support for ubernetes-lite
Requires a 1.2 upgrade. See https://github.com/kubernetes/kubernetes/blob/release-1.2/docs/proposals/federation-lite.md
Should be available in aws after https://github.com/Capgemini/kubeform/pull/64 https://github.com/Capgemini/kubeform/pull/77
|
gharchive/issue
| 2016-03-21T22:15:47 |
2025-04-01T04:54:49.268463
|
{
"authors": [
"tayzlor"
],
"repo": "Capgemini/kubeform",
"url": "https://github.com/Capgemini/kubeform/issues/46",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
1576780437
|
🛑 Funky Ducks - Client is down
In 81a0416, Funky Ducks - Client (https://client.funkyducks.ml/) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Funky Ducks - Client is back up in f5cf585.
|
gharchive/issue
| 2023-02-08T20:49:27 |
2025-04-01T04:54:49.287271
|
{
"authors": [
"CaptainQWasTaken"
],
"repo": "CaptainQWasTaken/funkyducks-status",
"url": "https://github.com/CaptainQWasTaken/funkyducks-status/issues/122",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
419041186
|
Why is a new player instance created when switching between full-screen and floating-window video?
Problem description:
Hi, while reading the source of ListVideo2Activity, I noticed that whenever a video is switched to the floating window, a new GSYBaseVideoPlayer instance is created and the player's parameters are copied over to restart playback. My understanding is that the same player could be reused by simply resizing its view. Repeatedly toggling the floating window this way seems costly. What is the reason for this design?
Because this makes the state easier to manage, and, most importantly, windows in different states may have different UIs.
What is switched here is the UI display layer; the playback core is not switched at all.
Hi, in the video_player_standard layout file, views such as small_close are provided to operate the small-screen video, and GSYBaseVideoPlayer is also responsible for small-screen operations, so the design intent is that StandardVideoPlayer can operate both the large screen and the small screen. I wonder whether, by using small_id to distinguish the video state, a single player could control the UI changes for both the large and small screens.
That should work; I use the same FloatVideo for three differently sized screens with different UIs.
|
gharchive/issue
| 2019-03-09T06:46:22 |
2025-04-01T04:54:49.289672
|
{
"authors": [
"CarGuo",
"hseury",
"oveeeee"
],
"repo": "CarGuo/GSYVideoPlayer",
"url": "https://github.com/CarGuo/GSYVideoPlayer/issues/1856",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
1090272389
|
HealthKit background delivery entitlement is missing
iOS 15 requires a new HealthKit background delivery entitlement.
2021-12-28 23:48:26.816569-0500 CardinalKit Example[63076:2217316] [VError(::category:)] (
(
"Missing com.apple.developer.healthkit.background-delivery entitlement."
),
"setUpBackgroundDeliveryForDataTypes(types:frequency:_:)"
)
Addressed in dbefc38.
|
gharchive/issue
| 2021-12-29T05:07:12 |
2025-04-01T04:54:49.308374
|
{
"authors": [
"vishnuravi"
],
"repo": "CardinalKit/CardinalKit",
"url": "https://github.com/CardinalKit/CardinalKit/issues/87",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2361755833
|
Release 0.1.1 on Hex
Hey @benwilson512, it seems like you bumped the version to 0.1.1 but that version isn't on Hex. Would you mind releasing it there as well, please? Thank you!
Done, thanks! :)
|
gharchive/issue
| 2024-06-19T08:45:09 |
2025-04-01T04:54:49.309398
|
{
"authors": [
"PJUllrich",
"mcrumm"
],
"repo": "CargoSense/absinthe_client",
"url": "https://github.com/CargoSense/absinthe_client/issues/17",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
236761343
|
Update deps version {hackney, poison}
Hi. I'm using dialyzer on a project that depends on ex_aws.
I found an error when building a PLT for certifi,
and it is caused by an old version of hackney depending on an old version of certifi.
Summary
Updated hackney to the latest version.
Updated poison to the latest version.
Backgrounds
The current version of hackney has dialyzer warnings. (https://github.com/benoitc/hackney/issues/409)
The latest version of hackney has these dialyzer bugs solved.
Hey! I appreciate the PR, but I'm not really sure I see the need to force people to upgrade. I already support the latest versions of hackney and poison; all this PR does is remove support for earlier versions.
I really do appreciate your desire to help, but I think it will be easier for me to just update dialyze, and leave the other dependencies as is.
Thank you reviewing for this PR.
Now I understand the background regarding Poison, and I apologize for my mistake.
So, you may close this pull request 🙇
Thanks!!
|
gharchive/pull-request
| 2017-06-19T01:45:50 |
2025-04-01T04:54:49.312849
|
{
"authors": [
"benwilson512",
"kdxu"
],
"repo": "CargoSense/ex_aws",
"url": "https://github.com/CargoSense/ex_aws/pull/430",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1088626277
|
"redirected more than 3 times" during download link verification
This tool seems to have stopped working for me recently. Links can still be parsed, but when verifying download links, it says that it redirected more than 3 times. The links are not saved in links.txt. Additionally, the 'Last download' date on Zippyshare is not updated.
Here is my terminal output when running a single link:
Enter URLs (leave blank to stop): https://www25.zippyshare.com/v/dCDCcbJk/file.html
Enter URLs (leave blank to stop):
[*] pattern_1 has failed for link: https://www25.zippyshare.com/v/dCDCcbJk/file.html
[*] Trying next pattern
[*] pattern_2 has failed for link: https://www25.zippyshare.com/v/dCDCcbJk/file.html
[*] Trying next pattern
[*] pattern_3 has failed for link: https://www25.zippyshare.com/v/dCDCcbJk/file.html
[*] Trying next pattern
[*] pattern_4 has failed for link: https://www25.zippyshare.com/v/dCDCcbJk/file.html
[*] Trying next pattern
[*] 1/1 links parsed https://www25.zippyshare.com/d/dCDCcbJk/5/VPN-White-Paper.pdf
[*] Verifying download links...
[*] https://www25.zippyshare.com/d/dCDCcbJk/4/VPN-White-Paper.pdf redirected more than 3 times
[*] All download links saved at links.txt
Turns out I'm just dumb and running an older version of the tool. Git cloned the latest version and it works fine.
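For the curious, the "redirected more than 3 times" guard can be modelled with a toy sketch like the following (illustrative only, not the scraper's actual code):

```python
def follow_redirects(url, redirect_map, max_redirects=3):
    """Follow a chain of redirects described by `redirect_map`
    (old URL -> new URL), raising once the limit is exceeded."""
    hops = 0
    while url in redirect_map:
        if hops >= max_redirects:
            raise RuntimeError(f"{url} redirected more than {max_redirects} times")
        url = redirect_map[url]
        hops += 1
    return url
```

A redirect loop (as a stale link might produce) trips the limit; a short, legitimate chain resolves normally.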
|
gharchive/issue
| 2021-12-25T16:40:47 |
2025-04-01T04:54:49.361211
|
{
"authors": [
"SvicidalBug"
],
"repo": "Cartmanishere/zippyshare-scraper",
"url": "https://github.com/Cartmanishere/zippyshare-scraper/issues/35",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
58219827
|
Some software extends dbf standard with extra encoding files, fixing #449
@Kartones, @rafatower CR this, please. I didn't know about those extra encoding files, and they're pretty common (and useful for us); for example, QGIS saves one. It's a workaround for dBase encoding support (I realised this while looking at the supported-encodings table of the dbf gem).
Frontend tests were OK :+1: (details)
:+1: good research!
+1 nice!
Frontend tests were OK :+1: (details)
@Kartones could you CR this again? QGIS allows many exotic encodings not supported by PG, so I had to filter them and code has changed.
+1
|
gharchive/pull-request
| 2015-02-19T15:07:20 |
2025-04-01T04:54:49.367946
|
{
"authors": [
"Cartofante",
"Kartones",
"juanignaciosl",
"rafatower"
],
"repo": "CartoDB/cartodb",
"url": "https://github.com/CartoDB/cartodb/pull/2351",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
1521801296
|
Use correct base address for 32-bit ASLR images
Fixes #266.
Adds a method image_base(&self) -> Va to the Pe trait:
For a PeFile, returns the preferred virtual address, i.e. self.optional_header().ImageBase.
For a PeView, it returns the true virtual address of the image. By default, this is the same as the above, except when constructing via PeView::module (uses base directly) or via from_bytes_and_base (where it is provided by the user).
The return value of this method is used for VA/RVA conversions in the Pe trait, replacing self.optional_header().ImageBase.
Unfortunately PeView was also meant to be usable if you just copied the memory of a running image (e.g. from another process), where the base of the image ptr is not actually the real virtual address of the image. I've always had a nagging feeling that ignoring it and relying on ImageBase being 'corrected' was not reliable, and now it's coming back to bite me :)
This will probably require an extra field in PeView containing the actual image base and trying not to break APIs.
I see. In this case, adding the extra field as you said and only setting it to self.image.as_ptr() as Va when the PeView is constructed using PeView::module should not break any existing APIs, correct?
Also, I guess a new (unsafe?) PeView constructor taking an AsRef<[u8]> and the base address would be useful in case one encounters issue #266 after copying the memory of a running image from another process.
PeView should work when copying the memory of a running image now. However, I'm not familiar with the Serde library, so I'm not sure how to serialize the PeView's image_base field without potentially breaking backwards compatibility for some formats (e.g. sequential ones). Is it fine to assume that some formats may not be compatible no matter what and just add a new serialize_field call at the end of serialize_pe that writes the image base?
Thanks for the update! I'm a bit distracted with other projects right now, I'll try to take a look at it tomorrow or during next week.
I've changed the API to set the base address after construction (I'm not a fan of a bunch of _with constructors, Rust could really use default parameter values...)
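To illustrate the underlying issue in a language-neutral way, here is a toy Python model (not pelite's actual Rust code; the addresses are made up) of why VA/RVA conversions must use the image's true load base rather than the preferred ImageBase once ASLR has relocated the image:

```python
# RVA <-> VA conversion is just an offset from the image base; the bug
# was using the *preferred* base for images that ASLR loaded elsewhere.
PREFERRED_IMAGE_BASE = 0x00400000  # typical 32-bit default
ASLR_BASE = 0x00A30000             # hypothetical actual load address

def rva_to_va(rva, image_base):
    return image_base + rva

def va_to_rva(va, image_base):
    return va - image_base
```

With the fix, a PeView built from a live module performs these conversions with the actual load address, so pointers into the relocated image come out correct.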
|
gharchive/pull-request
| 2023-01-06T02:54:50 |
2025-04-01T04:54:49.408109
|
{
"authors": [
"CasualX",
"tremwil"
],
"repo": "CasualX/pelite",
"url": "https://github.com/CasualX/pelite/pull/267",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
1379499203
|
F8 Community Governance Oversight Meeting 3 - Thursday 8th September 2022 - 1400 UTC
Agenda - F8 Community Governance Oversight Meeting 3
Scheduled date/time of meeting : 1400 UTC, Thursday 8th September 2022
Present:
[x] Allison Fromm
[ ] George Lovegrove
[x] Kenric Nelson
[ ] Matthias Sieber
[x] Phil Khoo
[ ] Stephen Whitenstall
[x] Tevo Saks
[x] Thorsten Pottebaum
[x] Vanessa Cardui
[x] Treasury
[x] Andre Diamond
[ ] Miroslav Rajh
Proposed Agenda
Matters arising/action points
Pay Allison $75 for TH Slides - ?
Pay Stephen $500 for PM - Aug 2022 yes
Pay Kenric remainder of the research budget - Aug 2022 yes
Last meeting we were looking to add something to F9 IOG survey (Phil) - did we? - in theory yes but depends on form, scope, data protection
We missed adding something to Danny’s newsletters - tut
We agreed to discuss async the process to build public register of parameters - didn’t happen yet
CGO Treasury
Andre & Miroslav - after what has been paid, what is left?
https://docs.google.com/spreadsheets/d/1v8EySgaWqoYlOVxHAjKKufj24x8CPn-no33EUPZP8sw/edit?usp=sharing
Will receive $6,860 on Mon
Can pay for all TH slides and project management
Partially pay Treasury management
Items left to pay for CGO8 - Reports, Retros, Surveys, Prep of F9 proposal - there’s not one… Should this be F10 proposal?? Town Hall Slides, Treasury Management. Project Management
All meeting attendance is already paid
should we develop a process for paying deliverables? Splitting equally is easier (Allison)
Tevo - we estimate how much tasks cost based on delivery time. Everyone who contributes shares how much time they spent; if there is no information, the estimated cost will be shared equally between the participants of the specific task.
Vanessa - Should we split distributions between high-level epics/projects/themes, or should we reward each deliverable differently?
Phil - community distribution model - equal split and can choose to redistribute your share to others
Kenric - baseline = split evenly, then specific tasks - bonus pool? for those who put in extra effort. Decide this using Phil's method.
Phil - easier to just pay everyone equal and then reallocate
Phil - this doesn’t change what’s already allocated
Andre - pay treasury in full this time, and then divide the rest between all of us Rename it “deliverables”
Leave it with Treasury until such time as deliverables are done? AGREED
Need to set aside money for closing report too
F7 proposal final payment - deliverables payments is what is outstanding
Pay Stephen for close out report and project management
$840 in wallet now
$6,366 to come on Monday
Town Hall Slides
Allison - Updates on Town Hall Slides
https://github.com/Catalyst-Auditing/Community-Governance-Oversight-Coordination/issues/87
Continuing
Will stay with every other week cadence
Will continue on recap of history, what’s happening in governance, and then parameter changes
CGO Project Board Review
Board:
https://github.com/orgs/Catalyst-Auditing/projects/3/views/1
F8 CGO Scope & Deliverables
5.1 Challenge Setting
What is the scope of F8 Challenge Setting oversight ?
Issue :
https://github.com/Catalyst-Auditing/Community-Governance-Oversight-Coordination/issues/88
We’ve agreed George and Phil are working on this, but what exactly are its deliverable(s) likely to be?
Do we need to consider how to maintain (and be seen to maintain) independence of oversight, given George is so strongly involved with working to change the Challenge-Setting process? George - conflict of interest is sthg he’s aware of. Meeting w Kriss on Fri - there will be changes soon. Whatever approach they want to take, will communicate it w CGO. Survey to gain community feedback on what IOG want to do? Tevo: this is best oversight we could have - let’s not add barriers. George : I do the work, oversight comes from everyone else.
What is the deliverable?? A survey? The proposals George has in? Analysis of discussion e.g in Telegram? To be discussed next meeting after George’s meeting with Kriss. Communicate that this discussion has been had, and the oversight will be to engage with or survey the community - sentiment analysis.
Thorsten - who is enacting changes to the challenge setting process? Also what is the intent, and how do we measure if it is achieved? A survey probably won’t cover that.
Kenric - not every community change HAS to be done by a vote. Conversations, surveys etc. are already happening - that's also a decentralised process.
Allison - agree, it’s not always about a vote. How much surveying, conversation etc. is enough to create legitimacy? To protect the implementer from accusations of a decision that is not representative.
Thorsten - agree, it’s not clearly defined how decision making is done if not via a vote. It becomes hard to determine if what is happening is within the parameter of controlling decision making process. Who decides if a vote is needed or not.
Phil - Dripdropz ATH last night - building a petition module to their voting structure, which fills some of the role we are talking about. An interesting possibility.
F10 looks likely to be another odd one: developer-heavy, and excluding much of the other work people do. Should we be additionally looking to maintain oversight of what sort of effect this is having on the community, the work that is done and the proposals that are getting funded?
5.2 Catalyst Circle Oversight
What is the scope of F8 Catalyst Circle Oversight ?
Issue :
https://github.com/Catalyst-Auditing/Community-Governance-Oversight-Coordination/issues/89
We agreed that those working on this are Tevo (general), Kenric (compensation), Vanessa (CC Oversight) - but what exactly will its deliverable(s) be?
We’ve discussed the ATHs at the end of June regarding planning for CCv4 - things have moved on a lot since then, but do we want to write up anything specific about those? Including Kenric’s points about remuneration?
CCv4 Election will be on DripDropz - what oversight can we / should we have? Do we want to have any oversight of the election design (e.g. what is the effect of doing it via DripDropz, of 1-wallet-1-vote, plutocratic voting power, etc.)? And/or do we want to interrogate the results of the vote (in terms of what kinds of people voted, who felt “invited”, or anything else)?
Should we oversee the process that got us to this election format (e.g. was there enough community consultation?) It’s already being done by Oversight of Catalyst Circle - is it enough?
Is there scope for us to have oversight of whether the process is working and is fair, or is excluding anyone? (e.g. is there enough time for people to write platform statements; is it OK that no support is being offered to help people stand, no education sessions, etc)
The above is a LOT of oversight. Remuneration for Circle is emerging as quite a big issue, so should we make that our focus? A survey on this could canvass community opinion on how the remuneration affects whether people stand for election, etc.
Kenric - concerns about the structure of the election. CCv3.2 decided to redesign Circle, not just prepare for the election.
Tevo: what are the roles and functions in catalyst - is Circle an official role? - There are functional groups, and they are not represented? (we don’t know that before voting)
Kenric - do we need to put together a report on what we think the problems are? The point of good processes = good outcomes. Kenric thinks we now have a poor outcome - gone from well defined roles (which were not perfect) to no design (? This is not necessarily accurate - the approach of “self-defined” IS a design)
Phil: as an oversight group we shouldn’t have a group opinion? Identify the parameters that have been changed or adjusted for this to happen? could look at this sort of via parameter changes process - it’s a sub-parameter that needs looked at separately
Vanessa - can we assess the outcome when it hasn’t happened yet in the sense that we haven’t had the election?
Phil - yes we could add it to TH slides
George: Observe what is changing and look at the process they took. Pros and cons?
Phil: is Circle becoming a decision making body? V- not yet
Phil- we could use TH slides and newsletter to help inform people about what has been decided
Our oversight ends at after Circle voting results are shared
What about making our focus the remuneration
Kenric - no
Time sensitive tho - decision on pay will not come immediately
Tevo - I think statements from oversight will not have impact
5.3 d-Reps
The White Paper is progressing well - Kenric to give a quick update?
dReps “Review Paper” is nearing completion. Plan to submit to public by the end of Sept. Three sections: literature review, voting power, community agreements
Tevo’s “participating dRep” perspective - what will be the deliverable?
5.4 Governance parameters
What is the scope of F8 Governance parameters ?
Issue : https://github.com/Catalyst-Auditing/Community-Governance-Oversight-Coordination/issues/90
We have agreed that Allison, Vanessa, Thorsten and Matthias are working on this- and Andre? (to maintain the register of changes)? And Phil?
We agreed that one deliverable will be creating, publicising and maintaining a publicly- accessible register of parameter changes (including identifying who is actually making decisions, and info on the conditions on which community can take over each parameter)? So what do we need to do to launch that? And how will it be maintained? (Andre?)
We also suggested defining broad categories of parameters - should that come later, when we see what parameters there are?
Could the deliverables also include suggesting a process that IOG or anyone else should follow to launch a parameter change - particularly., an idea of how to determine what is sufficient community consultation?
We have discussed a decision-making tree (is the decision urgent, etc.) using info from Jeremy’s Miro https://miro.com/app/board/uXjVOsELu0U=/ - to categorize parameters - Tevo / Phil. Are we proceeding with this? And is it part of the above proposed process?
Tevo: Location of parameter changes - what it affects - who takes the decision and who is affected by it
Phil: do we focus on parameters that have “caused waves”, and ones that are announced during this proposal?
Allison: yes, narrow it down to the ones which are changing now. Possible test case - SSI proposal. Conflict of interest?
Phil: If you’re creating a proof of concept it’s not really a parameter change till it affects everyone. If it’s optional, an experiment, it’s not a parameter change
Tevo: We will send out the google form - but which parameters we report on, we will discuss.
V will share google form in Discord - will put it in next slides
Phil: community participation doesn’t yield that many results so let’s be realistic
Tevo - suggests an emphasis on proposal assessor/VPA parameters. The way we have scaled is ineffective. To be discussed.
We also said that we would follow up on the F7 meeting with Harris - particularly, he was going to share info on IOG’s current process re parameter changes. How will we progress this?
Allocating pay for deliverables
In the light of the above discussion - how do we want to do this? Equal share for all involved, as with F7 project, or something else?
CGO Proposal Reporting
F7 CGO Proposal Reporting
https://github.com/Catalyst-Auditing/Community-Governance-Oversight-Coordination/issues/91
F8 CGO Proposal Reporting
https://github.com/Catalyst-Auditing/Community-Governance-Oversight-Coordination/issues/92
Summary of CGO meeting for Catalyst Weekly Newsletter
Executive, Legislative, and Judicial functions (Kenric)
This topic can be removed - there isn’t time in these meetings - Kenric
2 meetings ago, we said we’d find a time to discuss (async or informally) the below, that Kenric raised. Don’t think it happened, and don’t want to lose it - so can we quickly decide where and how this discussion could happen?
I’m (Kenric) not sure where to put this, but I suggest having a discussion about the Executive, Legislative, and Judicial functions of governance. The voting processes of Catalyst map clearly to a Legislative function. What are the Executive and Judicial functions? Perhaps the approved projects are an executive function. Perhaps the Circle’s identification and discussion of problems is a judicial function. Clarity about this mapping might help define these different roles, and thereby the form and expectations of these roles.
AOB?
|
gharchive/issue
| 2022-09-20T14:15:34 |
2025-04-01T04:54:49.464494
|
{
"authors": [
"stephen-rowan"
],
"repo": "Catalyst-Auditing/Community-Governance-Oversight-Coordination",
"url": "https://github.com/Catalyst-Auditing/Community-Governance-Oversight-Coordination/issues/106",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
177240390
|
Buddybuild crash report on build #520
Buddybuild detected a crash from bigbigchai@gmail.com
View Full Crash Details
Crash report from Unknown user
|
gharchive/issue
| 2016-09-15T17:43:00 |
2025-04-01T04:54:49.469612
|
{
"authors": [
"kevinzhow"
],
"repo": "CatchChat/Yep",
"url": "https://github.com/CatchChat/Yep/issues/449",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
1599085386
|
Port Adapters module to Java
https://github.com/CategoricalData/hydra/blob/main/hydra-haskell/src/main/haskell/Hydra/Adapters/Coders.hs
Now: https://github.com/CategoricalData/hydra/blob/main/hydra-haskell/src/main/haskell/Hydra/Adapters.hs
|
gharchive/issue
| 2023-02-24T18:06:47 |
2025-04-01T04:54:49.470886
|
{
"authors": [
"aman-dureja",
"joshsh"
],
"repo": "CategoricalData/hydra",
"url": "https://github.com/CategoricalData/hydra/issues/20",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1977183390
|
🛑 Main Website is down
In c8b1477, Main Website (https://aqurusmc.xyz) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Main Website is back up in 2f4184d after 21 minutes.
|
gharchive/issue
| 2023-11-04T05:44:52 |
2025-04-01T04:54:49.488297
|
{
"authors": [
"CcNicebruh"
],
"repo": "CcNicebruh/aqurusstatuspage",
"url": "https://github.com/CcNicebruh/aqurusstatuspage/issues/222",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2122490382
|
🛑 欧陆风云四百科(eu4cn) is down
In 1141e02, 欧陆风云四百科(eu4cn) (https://www.eu4cn.com) was down:
HTTP code: 500
Response time: 2815 ms
Resolved: 欧陆风云四百科(eu4cn) is back up in 1788a80 after 9 minutes.
|
gharchive/issue
| 2024-02-07T08:47:32 |
2025-04-01T04:54:49.490686
|
{
"authors": [
"Cccc-owo"
],
"repo": "Cccc-owo/upptime",
"url": "https://github.com/Cccc-owo/upptime/issues/185",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1872884420
|
🛑 欧陆风云四百科(eu4cn) is down
In e10ac1e, 欧陆风云四百科(eu4cn) (https://www.eu4cn.com) was down:
HTTP code: 0
Response time: 0 ms
Resolved: 欧陆风云四百科(eu4cn) is back up in 8792bf2 after 1 hour, 17 minutes.
|
gharchive/issue
| 2023-08-30T04:36:41 |
2025-04-01T04:54:49.493347
|
{
"authors": [
"Cccc-owo"
],
"repo": "Cccc-owo/upptime",
"url": "https://github.com/Cccc-owo/upptime/issues/97",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1361613558
|
fix: end tag for main element
Changes proposed in this pull request:
Fix missing end tag for the main element
Thanks a lot @magentix 👍
@all-contributors please add @magentix for bug and code.
|
gharchive/pull-request
| 2022-09-05T08:51:38 |
2025-04-01T04:54:49.496405
|
{
"authors": [
"ArnaudLigny",
"magentix"
],
"repo": "Cecilapp/Cecil",
"url": "https://github.com/Cecilapp/Cecil/pull/1455",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2522148887
|
🛑 Nextcloud is down
In b72644d, Nextcloud (https://nextcloud.hiripple.com) was down:
HTTP code: 500
Response time: 114 ms
Resolved: Nextcloud is back up in 1863d06 after 10 minutes.
|
gharchive/issue
| 2024-09-12T11:36:52 |
2025-04-01T04:54:49.506396
|
{
"authors": [
"CelestialRipple"
],
"repo": "CelestialRipple/ripplelog",
"url": "https://github.com/CelestialRipple/ripplelog/issues/2536",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
866638619
|
A User is technically optional on an audit event
Closes #223
Coverage remained the same at 84.235% when pulling ba7125d59df83853147cad4af2a5dec482ad7b92 on bug/224 into 14711e05549bf8908656f938f7578068766d0b2a on master.
|
gharchive/pull-request
| 2021-04-24T05:16:18 |
2025-04-01T04:54:49.537206
|
{
"authors": [
"ChrisMacNaughton",
"coveralls"
],
"repo": "CentauriSolutions/EyeDP",
"url": "https://github.com/CentauriSolutions/EyeDP/pull/225",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1146170578
|
scale-up mizar support
What type of PR is this?
/kind feature
What this PR does / why we need it:
Start arktos-network-controller
Create default network object (flat or mizar) for the system tenant
kubeup support mizar
Which issue(s) this PR fixes:
Fixes #
Special notes for your reviewer:
Tested on scenarios blow:
Kubeup (scale up) with mizar: successful, except for the known pods issue (metrics-server, cordons... pods failure)
$ kubectl get pods -AT | grep mizar
system default mizar-daemon-pzmzn 2841290244949531555 1/1 Running 0 43m
system default mizar-daemon-s777f 7391625322091307921 1/1 Running 0 43m
system default mizar-daemon-xnxfb 5425713208105442661 1/1 Running 0 43m
system default mizar-operator-5c97f7478d-dcbv8 3157000951661054885 1/1 Running 0 43m
Kubeup (scale up with default network): successful
Kubeup (scale up with default network) + kubemark (scale up with default network): successful
Does this PR introduce a user-facing change?:
https://github.com/CentaurusInfra/arktos/wiki/Mizar-Arktos-Integration-Release-2022-0130-Test-Plan
https://github.com/CentaurusInfra/arktos/wiki/Mizar-Arktos-Integration-Release-2022-0130-Test-Plan -- The tests for pods were all passed on 2022-02-24.
/lgtm
/approve
|
gharchive/pull-request
| 2022-02-21T19:45:53 |
2025-04-01T04:54:49.543070
|
{
"authors": [
"Sindica",
"q131172019",
"sonyafenge",
"zmn223"
],
"repo": "CentaurusInfra/arktos",
"url": "https://github.com/CentaurusInfra/arktos/pull/1377",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
178967215
|
Rating Form UI changes [LEI-213]
Refs: https://openscience.atlassian.net/browse/LEI-213
Changes
Increase spacing of radio buttons in questions 2, 4, & 6
Fix scales in questions 2, 6 & 9
Done responding to your comments/questions @abought! Let me know if I should change the rating labels to map to option values instead of indices 🍂
I'll leave this slightly to your discretion- but maintaining two distinct sets of numbers associated with each item is slightly confusing, and seems like it could make the code harder to maintain. Somewhere at the back of my mind, this seems to hint that the code is duplicating more than necessary.
Let me know what you decide.
@abought, done responding! I changed the option labels to map to the option values to make that clearer. ☔
|
gharchive/pull-request
| 2016-09-23T20:31:39 |
2025-04-01T04:54:49.550921
|
{
"authors": [
"abought",
"samanehsan"
],
"repo": "CenterForOpenScience/exp-addons",
"url": "https://github.com/CenterForOpenScience/exp-addons/pull/145",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
187600355
|
Fix card text overflowing card
Refs: https://trello.com/c/e5LbqRT8/4-text-on-cards-in-sub-categories-runs-off-card
Purpose
On small screen widths, card text runs off of cards.
Summary of changes
Make buckets wider
Make cards wider
Make card text break in between long words
Looks like purely cosmetic changes here; no review needed.
This will look weird on small screens, but ultimately there's not much we can do if the column is narrower than the text.
|
gharchive/pull-request
| 2016-11-06T23:51:37 |
2025-04-01T04:54:49.554129
|
{
"authors": [
"abought",
"samanehsan"
],
"repo": "CenterForOpenScience/exp-addons",
"url": "https://github.com/CenterForOpenScience/exp-addons/pull/197",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
63818765
|
[feature] Multiple emails
Status: ready for review
@lyndsysimon Let's focus this PR on the limited multiple emails requirement i.e. ability to add emails to a user account that are non-existent, or unregistered/unconfirmed -- and then knock this out -- OK
1st pass finished. :point_up:
Ready for second pass.
2nd pass finished. :v:
Bah. For some reason, Travis isn't building this every push.
I've got my repo to build as well, and this passes: https://travis-ci.org/lyndsysimon/osf.io/builds/56771321
Response to 2nd pass complete. Your move, @sloria...
:game_die:
Pass finished. :tanabata_tree:
Third pass review complete.
:pager:ing @sloria.
Is there any way to resend a confirmation email?
Minor "bug": When adding an unconfirmed that is already in the list of unconfirmed emails, the growl message says "Email added" when it should probably say "Email already added" or something similar.
Bug: Whenever I input an unconfirmed email with an uppercase character, I get an error message "Email validation failed", but the email still gets added.
Doesn't have to be added before this is merged, but we might want to add a little helper text (e.g. with a question-mark button) to explain what it means to make an email "primary".
Pass finished. :deciduous_tree:
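The uppercase-email bug noted above is typically a normalize-before-validate ordering issue; a minimal sketch of the idea (hypothetical helper, not the actual osf.io code):

```python
import re

# Hedged sketch: normalize the address first so "Foo@Bar.com" and
# "foo@bar.com" are treated as the same email. Helper name and regex
# are illustrative assumptions, not osf.io's implementation.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def add_unconfirmed_email(user_emails, raw_email):
    email = raw_email.strip().lower()   # normalize before validating
    if not EMAIL_RE.match(email):
        raise ValueError("Email validation failed")
    if email in user_emails:
        return False                    # already added: report that, not "Email added"
    user_emails.append(email)
    return True
```

With this ordering, an uppercase input either validates cleanly or is rejected, but never both errors and gets added.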
Some input from Trello re: the "Email added" message: https://trello.com/c/X8aJ9lC7/73-more-information-when-email-is-added
Also: https://trello.com/c/Udvre4jB/75-adding-an-email-for-a-confirmed-account-with-no-email-set
ed20e52 address https://trello.com/c/X8aJ9lC7/73-more-information-when-email-is-added
@sloria rdy for review
:+1:
|
gharchive/pull-request
| 2015-03-23T20:00:26 |
2025-04-01T04:54:49.561231
|
{
"authors": [
"chennan47",
"lbanner",
"lyndsysimon",
"sloria"
],
"repo": "CenterForOpenScience/osf.io",
"url": "https://github.com/CenterForOpenScience/osf.io/pull/2289",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
354425238
|
Account for folders without relying on querying BaseFileNode for path [#PLAT-1053]
Purpose
Yet another addendum to #8643
Changes
Use OsfStorageFileNode (cuz BaseFileNode overrides path)
If path turns out to be a folder and a version is still there, don't error just don't do anything with the folder
QA Notes
Documentation
Side Effects
Ticket
https://openscience.atlassian.net/browse/PLAT-1053
Failing build is due to missing merge migration, which has been fixed on feature/storage-i18n.
|
gharchive/pull-request
| 2018-08-27T18:25:55 |
2025-04-01T04:54:49.564785
|
{
"authors": [
"erinspace",
"sloria"
],
"repo": "CenterForOpenScience/osf.io",
"url": "https://github.com/CenterForOpenScience/osf.io/pull/8651",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
481331146
|
[ENG-691] Enables gitlab addon to access all gitlab projects with user membership
Purpose
Enable the GitLab addon to access all GitLab projects that the user is a member of. Previously, the addon was limited to GitLab projects that the user is the owner of.
Changes
Modifies the function that retrieves projects to use the general purpose project endpoint with a membership requirement as opposed to using the user/projects endpoint.
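For reference, the endpoint change amounts to switching which GitLab REST URL is queried; a minimal Python sketch (the host and helper names are illustrative, not the actual addon code):

```python
from urllib.parse import urlencode

# Assumption: a gitlab.com host; self-managed instances use their own base URL.
GITLAB_API_BASE = "https://gitlab.com/api/v4"

def owned_projects_url(user_id):
    # Old behavior: only projects the user owns.
    return "{}/users/{}/projects".format(GITLAB_API_BASE, user_id)

def member_projects_url():
    # New behavior: every project the user is a member of.
    return "{}/projects?{}".format(GITLAB_API_BASE, urlencode({"membership": "true"}))
```

The `membership=true` query parameter on the general `/projects` endpoint is what widens the result set beyond owned projects.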
QA Notes
Testing the GitLab add-on and ensuring that users are able to access GitLab projects they own as well as GitLab projects that they do not own but are members of would be beneficial.
I tested this locally using two separate GitLab accounts.
Documentation
N/A
Side Effects
N/A
Ticket
https://openscience.atlassian.net/browse/ENG-691
LGTM 🚢
|
gharchive/pull-request
| 2019-08-15T20:45:50 |
2025-04-01T04:54:49.568029
|
{
"authors": [
"UdayVarkhedkar",
"pattisdr"
],
"repo": "CenterForOpenScience/osf.io",
"url": "https://github.com/CenterForOpenScience/osf.io/pull/9131",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
248104216
|
[SVCS-332][SVCS-292] Make Googledrive give folder children’s metadata on moves and copies
Purpose
On inter/intra moves and copies, Google Drive doesn't return the folder's children; this fixes that.
Changes
Changes the item serializer to recursively call the Google Drive API and include children in the metadata.
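The recursion described above can be sketched roughly like this (Python, illustrative only — `list_children` stands in for the Drive API call the serializer actually makes):

```python
def serialize_item(item, list_children):
    """Recursively attach children metadata to folder items.

    `item` is a dict of file metadata; `list_children` is a callable taking a
    folder id and returning that folder's raw children (a hypothetical
    stand-in for the Google Drive API request).
    """
    serialized = {"id": item["id"], "name": item["name"], "kind": item["kind"]}
    if item["kind"] == "folder":
        # Folders recurse; files are returned as-is.
        serialized["children"] = [
            serialize_item(child, list_children) for child in list_children(item["id"])
        ]
    return serialized
```

In the real provider this recursion is what makes a moved or copied folder's metadata carry its full subtree.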
Side Effects
None that I know of.
Ticket
https://openscience.atlassian.net/browse/SVCS-332
Old PR
https://github.com/CenterForOpenScience/waterbutler/pull/221
Coverage remained the same at 76.68% when pulling ea5986c48aa6f86a6ffe7661db40964f05c8a2a3 on Johnetordoff:Google-drive-return-children into b787a92c180697cb8fb2f5b6c402cf7b931b339d on CenterForOpenScience:develop.
Coverage remained the same at 78.562% when pulling 5f5b5b0a90105c9c6918b1922ca6259943cb76a5 on Johnetordoff:Google-drive-return-children into 761c430396730b72e600f8def98c06e4a5d0316b on CenterForOpenScience:develop.
Coverage decreased (-0.03%) to 78.536% when pulling 547db2b48a29e6455b93ccc9f386a776127eb91b on Johnetordoff:Google-drive-return-children into 761c430396730b72e600f8def98c06e4a5d0316b on CenterForOpenScience:develop.
|
gharchive/pull-request
| 2017-08-04T20:04:52 |
2025-04-01T04:54:49.573706
|
{
"authors": [
"Johnetordoff",
"coveralls"
],
"repo": "CenterForOpenScience/waterbutler",
"url": "https://github.com/CenterForOpenScience/waterbutler/pull/244",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
120933000
|
Always On Mode
Rather than running from cron, we should have a way of running all the time. This would provide for alternate light configurations that don't rely on music to update lights. For example, it would be useful to only play songs at the top of the hour, but leave the lights on at the conclusion of the playlist (or configured subset of the playlist) and the only way to do that is to leave the process running and ensure the GPIO pins stay high
Somewhat wrong here: if you don't explicitly set the GPIO pins to low, they will stay in whatever state the song left them in. In other words, we don't need to run anything. A quick hack to get this up and running is to start a new show in cron, and ensure there are pre and post-cleanup scripts to set all the pins high so the lights stay on the rest of the time.
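A post-cleanup script along those lines might look like this (hedged sketch; the pin numbers are hypothetical and the stub fallback just lets it run off-Pi):

```python
# Sketch of a post-show cleanup script that leaves the lights on by driving
# every channel pin high. CHANNEL_PINS is an assumption; match it to your
# configured GPIO layout. Falls back to a recording stub when RPi.GPIO is
# unavailable so the script can run off a Raspberry Pi.
try:
    import RPi.GPIO as GPIO  # real hardware
except ImportError:
    class _StubGPIO:
        BCM, OUT, HIGH = "BCM", "OUT", 1
        def __init__(self):
            self.levels = {}
        def setmode(self, mode): pass
        def setup(self, pin, mode): pass
        def output(self, pin, level): self.levels[pin] = level
    GPIO = _StubGPIO()

CHANNEL_PINS = [0, 1, 2, 3, 4, 5, 6, 7]  # assumption: 8-channel relay board

def lights_on(pins=CHANNEL_PINS):
    GPIO.setmode(GPIO.BCM)
    for pin in pins:
        GPIO.setup(pin, GPIO.OUT)
        GPIO.output(pin, GPIO.HIGH)  # never set low, so the lights stay on
```

Run as both the pre- and post-show script from cron and the pins hold high for the rest of the hour.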
|
gharchive/issue
| 2015-12-08T05:16:05 |
2025-04-01T04:54:49.579793
|
{
"authors": [
"Cerberus98"
],
"repo": "Cerberus98/lightshowpi",
"url": "https://github.com/Cerberus98/lightshowpi/issues/7",
"license": "bsd-2-clause",
"license_type": "permissive",
"license_source": "bigquery"
}
|
1527037310
|
Optimize local WAL size
Describe This Problem
In my local env, I find the WAL (rocksdb-based) is very large compared with the SSTs, even after a manual flush.
1.5G data/wal
76M data/manifest
371M data/store
1.9G data
Proposal
Do a compaction after delete_range to try to remove dead entries from the SSTs.
https://github.com/CeresDB/ceresdb/blob/9ab659a99417b41f3a4007e59bbe9b8f3ff65b8b/wal/src/rocks_impl/manager.rs#L120
Additional Context
https://github.com/facebook/rocksdb/wiki/RocksDB-Tuning-Guide#trigger-compaction-on-deletes
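The proposal boils down to pairing the range delete with a compaction over the same key span; a sketch of the call order (Python with a stand-in `db` handle — the real ceresdb code is Rust, and its rocksdb bindings' method names may differ):

```python
def remove_logs_in_range(db, start_key, end_key):
    """Sketch of the proposal: after delete_range, trigger a compaction over
    the same key range so the dead entries are actually dropped from the SSTs
    instead of lingering behind tombstones.

    `db` is a hypothetical stand-in for a RocksDB handle.
    """
    db.delete_range(start_key, end_key)
    db.compact_range(start_key, end_key)  # reclaims the space the tombstones shadow
```

Without the second call, RocksDB only rewrites the range when its normal compaction heuristics get around to it, which is why the WAL directory can stay large long after the entries are logically deleted.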
@ShiKaiWi I don't think this issue got fixed. WAL encoding is a general way to reduce WAL size, but the issue here is to optimize the rocksdb-based implementation; they have no direct relation.
|
gharchive/issue
| 2023-01-10T09:19:27 |
2025-04-01T04:54:49.583059
|
{
"authors": [
"jiacai2050"
],
"repo": "CeresDB/ceresdb",
"url": "https://github.com/CeresDB/ceresdb/issues/554",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1751062738
|
[Feature Request] TilesetCombiner fails for older tilesets with content addressed by url property instead of uri
When using the combine method on tilesets produced by ContextCapture, then an error Content does not have a URI is raised. This is probably because the tile contents in the tileset root json file (plus subsequent tiles) are referred to via the url property rather than the uri property - which references either b3dm or json tile contents.
I know I can use the 3d-tiles-tools upgrade utility on local tilesets I do manage - although it will take a lot of time to upgrade loads of tilesets.
But for tilesets I do not manage, which are stored online on different servers and that I'd like to combine, then the easiest way would be for the combine utility to allow for tilesets which use either the url or the uri property to reference tile content.
We have been pondering this question internally.
The pragmatic view is: All inputs have to be valid. And for the case of the uri vs. url, the recommendation would be: Run upgrade on the input, and then use the upgraded tilesets for any further operations.
But... I can see that this is... inconvenient, particularly when the reason for something not working properly is literally that of a single letter (or "a pixel", as in "the difference between i and l...).
One reason why we are (currently) not just ignoring this point is that the generalization of this question could be: Which forms of "invalid" inputs should be "ignored"? Or: Which parts of the upgrade should be done automatically in other operations? At some point, we'd have to spread workarounds for legacy data, handling for special cases, and pseudo-upgrade functionality throughout the code. And it would really be better to summarize this in the upgrade command.
(One example: When you combine two tilesets that use url - should the combined tileset use uri or url then? It should probably use uri, but maybe you want to retain the original url...?)
That being said: The case of the content url was a special one for me as well: The original (pre-refactoring) state of the 3d-tiles-tools had been using the (somewhat unspecified) Tileset sample data extensively for tests. This one uses url. I created a utility function to handle this more transparently. But at some point, I just created a TilesetWithUris from that, to not have to cope with this issue for some tests 😕
So I agree that we should consider to make the tools more resilient for the particular, very special case of uri-vs-url...
Hi again, and again thanks for all this context, very useful.
Understood for the fact that the upgrade utility would be the corner piece before any other workflow operation, which would therefore only have to deal with valid tilesets, simplifying the code a lot. Then indeed, the question of whether to perform such pseudo-upgrade within workflow operations could be important - and even generalized as a cli option for every workflow.
Good question regarding whether the output combined tileset should have a url or uri property. If the tile versioning is a property of each tile json, then since viewers can handle tilesets of different versions, I would say they can be mixed, but I don't know the intricacies of such a decision.
And it's true this uri-vs-url seems like a special corner case. Being able to combine tilesets without the requirement of upgrading them would make it easier to source external tilesets (if served from a CORS-enabled server) without the need to download/serve them, and to mix multiple sources for content.
Edit final questions related to tiling:
are utilities defined in Cesium3DTilesWriter meant to build a tool to tile arbitrary input meshes to 3D-tiles? Probably something that is used internally by the Cesium Ion pipeline but not meant to be open-sourced as a complete tool yet? All commercial photogrammetry suites have been offering 3D-tiles exporting capabilities for the past few years (RealityCapture, ContextCapture, Agisoft Metashape, etc), but no open-source suite does so at the moment (Colmap/Meshroom etc), probably because of that fact that there is no oss tiler for meshes existing yet, only for pointclouds.
final question: when tiling a pointcloud into a Potree dataset, I can specify the bounding box of the input pointcloud data which will be used as the box of the root tile. Therefore if I reuse the same bbox for tiling multiple pointclouds (where coords are expressed in the same Coordinate Reference System), and PotreeConvert them sequentially with the overwrite option, then it is equivalent to appending pointclouds to the same tiled Potree container. Is there any standard representation as to how to do this for 3d-tiles? This way, we could probably combine tilesets to a master, earth-covering tileset, where tilesets would be appended at a given level in the hierarchy without affecting any other portions of the tileset, and this would be really useful - a way for us to combine a few hundred independent tilesets to a single, earth-wide tileset à la google3dtiles.
Understood for the fact that the upgrade utility would be the corner piece before any other workflow operation,
Ideally, doing an upgrade should never be necessary. It should only be a last-resort option for the case that somebody has a legacy tileset, and wants to try and salvage it. One difficulty is that it is tremendously hard to say which "legacy" elements can be updated (and how), and what exactly the result will be. One example is that of tilesets that contain glTF 1.0 data. It is sometimes simply not possible to update this to glTF 2.0, and whether it is possible depends on many low-level technical factors, and it would be hard or impossible to establish a reliable contract for the behavior of the upgrade in this case.
The url-vs-uri, however, is relatively simple. It only twiddles in the JSON, and can be upgraded with a few lines of code. (So it doesn't really need the broader infrastructure that is offered by the general upgrade command). We'll still have to see where/how to integrate that step in the most sensible way. The two "extremes" would be:
Hand this in the most fine-grained manner, on the fly. Instead of writing const uri = content.uri, we'd have to say const uri = Contents.getUri(content), where the latter is the utility function that returns the url from legacy content if necessary
Handle it as a coarse-grained, blanket step. Whenever a tileset.json is read, there could be one pass to PseudoUpgrade.doThatUrlToUriUpgradeIfNecessary(tilesetJson), to have one place where this upgrade happens if necessary, and all the remaining code can use the content.uri directly, without having to worry about the case that it might be a url
Both have pros and cons. We'll have to sort that out..
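The coarse-grained variant could look roughly like this (a Python sketch of the idea only — the actual 3d-tiles-tools code is TypeScript and structured differently):

```python
def normalize_content_uris(tile):
    """Recursively rewrite legacy `url` content keys to `uri` in a tile.

    `tile` is the parsed JSON dict of a tile; mutates it in place. This is a
    sketch of the blanket "pseudo-upgrade on read" option described above,
    not the actual 3d-tiles-tools implementation.
    """
    content = tile.get("content")
    if content is not None and "uri" not in content and "url" in content:
        content["uri"] = content.pop("url")
    for child in tile.get("children", []):
        normalize_content_uris(child)
```

Run once on the root tile right after parsing tileset.json, and every downstream operation can read `content.uri` without worrying about the legacy spelling.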
Regarding the last questions...:
are utilities defined in Cesium3DTilesWriter meant to build a tool to tile arbitrary input meshes to 3D-tiles? Probably something that is used internally by the Cesium Ion pipeline but not meant to be open-sourced as a complete tool yet?
The Cesium3DTilesWriter in particular is just intended to have a mechanism for reading/writing the tileset JSON data. This code is auto-generated from the schema. Users could use this, for example, when they implement a tool that converts meshes to 3D Tiles. Considering the complexity of such a tool, the part that is responsible for writing the JSON would only be a minor, minor part, but ... at least, users wouldn't have to write that part from scratch.
final question: when tiling a pointcloud into a Potree dataset, ...
Sorry, I'm lacking a lot of context here. It sounds like this might be related to external tilesets in general, and maybe something that involves additive refinement. But I don't know Potree well enough to say more here. You might consider bringing this up in the forum at https://community.cesium.com/c/3d-tiles/16 , maybe someone with knowledge about Potree and a better understanding of your goals could chime in there.
Again thanks for all these details, very useful to understand the state of this toolset. And indeed probably not the right place to share these thoughts - the way I did it here is because from my point of view this could be what merge/combine are meant to do - combining tilesets, merging two or more hierarchical structures keeping only the lowest error/highest detail data.
You're right I'll iterate over these thoughts and ask along the way to the community forum instead if I need some feedback. Thanks again for all your help!
As I mentioned, there are some thoughts about possible extensions of the tools in the future. These thoughts differ in how likely it is for them to become actual points on a roadmap.
One example is that we have 'combine' (which creates one large tileset from one that had external tilesets), but we don't have an opposite of that - i.e. there is no "split" function. This could, in some way, be a low-hanging fruit: Just traverse the tileset, and whenever you reach depth x, start writing out a new one.
Now we could go and implement that. But...
nobody needed it until now (i.e. there was no feature request for that)
there is some overlap to concepts of implicit tilesets (the x would be the subtreeLevels, so to speak)
maybe most importantly: This could be generalized ... arbitrarily
And before starting something like that, one should at least have a rough idea about how it could be generalized. For example: People might not want to split their tileset into "slices" with x levels each, but maybe based on some geometricError threshold, or maybe based on the file/data size of the results...
But if you have ideas (or even specific demand) for a certain functionality, just let us know... (or... open a PR, of course...)
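For what it's worth, the depth-x split idea could be sketched like this (Python over parsed tileset JSON, illustrative only):

```python
def split_at_depth(tile, depth, collected=None):
    """Cut a tile hierarchy at `depth`: each subtree rooted there becomes a
    new external tileset (collected) and is replaced in the parent by a stub
    that points at a hypothetical external file. This is a sketch of the
    "split" idea discussed above, not an implemented 3d-tiles-tools command.
    """
    if collected is None:
        collected = []
    if depth == 0:
        collected.append(tile)
        return {
            "boundingVolume": tile.get("boundingVolume"),
            "geometricError": tile.get("geometricError"),
            # Hypothetical naming scheme for the written-out external tilesets.
            "content": {"uri": "external-{}.json".format(len(collected) - 1)},
        }, collected
    out = dict(tile)
    children = tile.get("children", [])
    if children:
        out["children"] = [split_at_depth(c, depth - 1, collected)[0] for c in children]
    return out, collected
```

A real implementation would also have to decide how the `geometricError` and refine semantics carry over to the stubs, which is exactly where the generalization questions above come in.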
|
gharchive/issue
| 2023-06-10T18:07:30 |
2025-04-01T04:54:49.642806
|
{
"authors": [
"javagl",
"jo-chemla"
],
"repo": "CesiumGS/3d-tiles-tools",
"url": "https://github.com/CesiumGS/3d-tiles-tools/issues/43",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1195454987
|
UE5 + Vehicle = exaggerated motion blur
If you create a vehicle template in UE5 and drive the car and notice the wheel motion blur looks fine in the template map
If you then add the Cesium plugin (CesiumForUnreal-500-v1.12.1-ue5 - windows) and create a blank scene, add Cesium World Terrain + Cesium SunSky and then place the car on a city street in the Denver area (default area), the motion blur on the tires goes crazy. Everything still has the default project settings from the vehicle template project.
If this is some setting I'm missing somewhere, please let me know.
I went back and the wheel motion blur is terrible regardless of Cesium being involved in UE4.27
I don't recall this in older versions of Cesium.
I tried various settings of anti-aliasing to no effect in the Cesium map.
Same behavior with the first person template. It's fine until walking on Cesium-generated terrain. Maybe it's the physics bodies that are being created?
@nzanepro Is there a drop in frame rate when adding the Cesium tilesets? I am not sure how motion blur is computed, but I wonder if it varies based on the delta time between frames.
The CitySample has a better example of car tire motion blur trickery, which works well with Cesium tileset.
The first person template still exhibits this difference between walking on a Geometric "Plane" or "Cube" object and the ground provided by the Cesium World Terrain.
Is the Cesium Physics geometry able to be presented as a StaticMesh to the Physics system?
Cesium tiles are simply static meshes, though they're created and destroyed in a view-dependent manner. I think Nithin's question about frame rate might be highly relevant. Were you able to confirm a frame rate drop?
ok, I figured out something that is reproducible for you.
install cesium samples project into ue5.01
navigate to 03_CesiumDenver Level
delete Aerometrex Denver photogrammetry from outliner
play
you will start on the "StartPlatform"
use the "a" button to run to the left
notice the motion blur while on the "StartPlatform" vs the motion blur when you drop to the cesium tiles and continue running to the left.
my fps doesn't seem to drop in the same scene, same settings, but different geometry?
video capture attached including fps displayed.
machine stats:
Intel(R) Core(TM) i9-9900KF CPU @ 3.60GHz 32GB memory
NVIDIA GeForce RTX 2080ti 11GB gpu memory
https://user-images.githubusercontent.com/5840082/167763438-ab1f6a9c-845f-42e1-b439-b01773c64b99.mp4
same steps repeated in ue4.27 for reference. motion blur seems to be fine.
https://user-images.githubusercontent.com/5840082/167765248-58f3e64b-5a8c-476f-8c41-490d01ecb6bc.mp4
|
gharchive/issue
| 2022-04-07T02:56:04 |
2025-04-01T04:54:49.651507
|
{
"authors": [
"kring",
"nithinp7",
"nzanepro"
],
"repo": "CesiumGS/cesium-unreal",
"url": "https://github.com/CesiumGS/cesium-unreal/issues/815",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2283702281
|
Add check to disable LOD transitions in material layer
Follow up to #1414. Fixes #1104.
This PR does the following:
Renames UseLodTransitions to EnableLodTransitions, to keep consistent with the rest of the naming in Cesium3DTileset.
Adds an EnableLodTransitions parameter to the ML_DitherFade material layer, to control whether or not the dithering is applied.
Clarifies in the comments that EnableLodTransitions is only compatible with temporal AA modes.
This material parameter corresponds exactly with the Cesium3DTileset's parameter. When false, dithering won't be applied at all. This helps for projects that use non-temporal AA modes. As long as they don't enable LOD transitions (which would be user error), they don't need to worry about these dithering artifacts on their tilesets.
Before
After
Hm, the artifacts don't appear when Forward Rendering is disabled, even for non-temporal AA modes. So I'll adjust the documentation
@j9liu I see pretty significant artifacts in this branch when "Enable LOD Transitions" is enabled. It kind of looks like the old tile is being removed completely before the new one comes in, so I see big holes in the surface. This is with Google Photorealistic 3D Tiles:
I see the same thing with CWT+Bing, though.
|
gharchive/pull-request
| 2024-05-07T15:51:33 |
2025-04-01T04:54:49.656228
|
{
"authors": [
"j9liu",
"kring"
],
"repo": "CesiumGS/cesium-unreal",
"url": "https://github.com/CesiumGS/cesium-unreal/pull/1416",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
866056510
|
Further Linux platform
This builds on #311, plus:
Enables exceptions in CesiumRuntime so that plugin packaging works.
Travis CI to package the plugin for Linux.
Removes submodule checkout from the "check formatting" Travis job so that it completes faster.
Reintroduces the debug/release suffix for libs on Linux. It only appeared to not be working because the way to build the Debug/Release configuration is different from Windows (it has to be specified at configure time, not build time).
Thanks for the pull request @kring!
:heavy_check_mark: Signed CLA found.
Reviewers, don't forget to make sure that:
[ ] Cesium for Unreal Samples works.
@kring Just to confirm, have you downloaded the Linux build and run it as standalone with Cesium for Unreal Samples?
My checklist
✅ Cesium Native compiles per readme instructions
✅ Cesium for Unreal plugin builds package using RunUAT.sh
✅ Cesium for Unreal plugin builds package using UE4Editor
✅ Try running all levels in Cesium for Unreal Samples
✅ Cesium for Unreal Samples works with:
✅ Development version cesium-unreal-samples/Plugins/cesium-unreal as a "submodule"
✅ Locally packaged version of Cesium for Unreal
✅ GitHub / Travis Packaged version of Cesium for Unreal (Requested rebuilding CesiumEditor and CesiumRuntime modules).
🟨 Cesium for Unreal Samples builds executable with all the above 3 configurations - Yes - But packaged version fails to run with error Unable to load plugin 'WebBrowserWidget'. Aborting. @baothientran Did you run into this when building for OSX?
Partial solution to this was to add the WebBrowserWidget plugin to cesium-unreal-samples (added branch here https://github.com/CesiumGS/cesium-unreal-samples/tree/add-browser-widget). However, that ran into this error. Looks like the solution is to use the opengl4 flag when launching UE4Editor using ./UE4Editor -opengl4.
Using the -opengl4 flag and then building the executable got around the above Vulkan error, but then ran into the error below. So it looks like WebBrowserWidget is still an issue.
[2021.04.25-02.30.11:739][  0]LogModuleManager: Warning: ModuleManager: Module 'WebBrowserWidget' not found - its StaticallyLinkedModuleInitializers function is null.
[2021.04.25-02.31.22:162][ 0]Message dialog closed, result: Ok, title: Message, text: Plugin 'WebBrowserWidget' failed to load because module 'WebBrowserWidget' could not be found. Please ensure the plugin is properly installed, otherwise consider disabling the plugin for this project.
[2021.04.25-02.31.22:162][ 0]LogCore: Engine exit requested (reason: EngineExit() was called)
[2021.04.25-02.31.22:162][ 0]LogSlate: Request Window 'Cesium for Unreal Samples (64-bit Development SF_VULKAN_SM5) ' being destroyed
[2021.04.25-02.31.22:162][ 0]LogSlate: Window 'Cesium for Unreal Samples (64-bit Development SF_VULKAN_SM5) ' being destroyed
[2021.04.25-02.31.22:336][ 0]LogSlate: Slate User Destroyed. User Index 0, Is Virtual User: 0
[2021.04.25-02.31.22:336][ 0]LogExit: Preparing to exit.
I'm not exactly sure what the right fix to this would be. But this looks very close and would love to get it merged for the May 3 release.
Additional notes on above:
I've confirmed that Vulkan actually runs. Confirmed using the Vulkan samples and vkcube
Building Shipping vs Development builds of the executables shows different errors. The Shipping build shows the error below.
4.26.1-0+++UE4+Release-4.26 522 0
Disabling core dumps.
Project file not found: ../../../CesiumForUnrealSamples/CesiumForUnrealSamples.uproject
Unable to read VR Path Registry from /home/shehzan/.config/openvr/openvrpaths.vrpath
Exiting abnormally (error code: 1)
@shehzan10 I can't seem to even get as far as you. The app build completes successfully, but something is segfaulting during the cooking step. There's almost no information about what is segfaulting, though, and certainly no indication of why.
Log attached, fwiw.
Log.txt
Some more testing I've done:
Create a default first-person shooter game project. Compiled this into a packaged build and successfully ran it with Development and Shipping as well we Vulkan/OpenGL4.
Copied the built version of the Cesium for Unreal plugin into the project's Plugin directory. Enabled the plugin, but did not use it to create any content/levels.
Packaged FPS Game + Cesium for Unreal successfully in Development and Shipping, but executing the project runs into the error (the WebBrowserWidget plugin doesn't get loaded).
LinuxNoEditor$ ./DummyGame.sh
Increasing per-process limit of core file size to infinity.
Project file not found: ../../../DummyGame/DummyGame.uproject
LogInit: LLM is enabled
LogInit: LLM CsvWriter: off TraceWriter: off
LogInit: Display: Running engine for game: DummyGame
LogInit: Display: Project file not found: ../../../DummyGame/DummyGame.uproject
LogInit: Display: Attempting to find via project info helper.
LogUProjectInfo: Found projects:
LogPakFile: Display: Found Pak file ../../../DummyGame/Content/Paks/DummyGame-LinuxNoEditor.pak attempting to mount.
LogPakFile: Display: Mounting pak file ../../../DummyGame/Content/Paks/DummyGame-LinuxNoEditor.pak.
LogPakFile: PakFile PrimaryIndexSize=28458
LogPakFile: PakFile PathHashIndexSize=50550
LogPakFile: PakFile FullDirectoryIndexSize=76452
LogPakFile: OnPakFileMounted2Time == 0.000001
LogPlatformFile: Using cached read wrapper
LogTaskGraph: Started task graph with 5 named threads and 17 total threads with 3 sets of task threads.
LogStats: Stats thread started at 0.060521
LogICUInternationalization: ICU TimeZone Detection - Raw Offset: -5:00, Platform Override: ''
LogPluginManager: Error: Unable to load plugin 'WebBrowserWidget'. Aborting.
LogCore: Engine exit requested (reason: EngineExit() was called)
LogExit: Preparing to exit.
LogModuleManager: Shutting down and abandoning module RenderCore (26)
LogModuleManager: Shutting down and abandoning module Landscape (24)
LogModuleManager: Shutting down and abandoning module SlateRHIRenderer (22)
LogModuleManager: Shutting down and abandoning module AnimGraphRuntime (20)
LogModuleManager: Shutting down and abandoning module Renderer (18)
LogModuleManager: Shutting down and abandoning module Engine (16)
LogModuleManager: Shutting down and abandoning module CoreUObject (14)
LogModuleManager: Shutting down and abandoning module NetworkFile (12)
LogModuleManager: Shutting down and abandoning module CookedIterativeFile (10)
LogModuleManager: Shutting down and abandoning module StreamingFile (8)
LogModuleManager: Shutting down and abandoning module SandboxFile (6)
LogModuleManager: Shutting down and abandoning module PakFile (4)
LogModuleManager: Shutting down and abandoning module RSA (3)
LogExit: Exiting.
Exiting abnormally (error code: 1)
In the build log for packaging, there is one line that is running into an error with chromium. I suspect this has something to do with the failure.
UATHelper: Packaging (Linux): [0426/113053:ERROR:browser_main_loop.cc(217)] Running without the SUID sandbox! See https://chromium.googlesource.com/chromium/src/+/master/docs/linux_suid_sandbox_development.md for more information on developing with the sandbox on.
Full packaging build log:
UATHelper: Packaging (Linux): Running AutomationTool...
UATHelper: Packaging (Linux): Fixing inconsistent case in filenames.
UATHelper: Packaging (Linux): Setting up Mono
UATHelper: Packaging (Linux): xbuild Source/Programs/AutomationTool/AutomationTool.csproj /p:Configuration=Development /p:Platform=AnyCPU /verbosity:quiet /nologo /p:NoWarn=1591 /property:AutomationToolProjectOnly=true /p:TargetFrameworkProfile=
UATHelper: Packaging (Linux): Start UAT: mono AutomationTool.exe -ScriptsForProject=/home/shehzan/workspace/DummyGame/DummyGame.uproject BuildCookRun -nocompileeditor -nop4 -project=/home/shehzan/workspace/DummyGame/DummyGame.uproject -cook -stage -archive -archivedirectory=/home/shehzan/workspace/Packages/DummyGame-VK-Cesium/ -package -ue4exe=/home/shehzan/
workspace/UnrealEngine/Engine/Binaries/Linux/UE4Editor -ddc=DerivedDataBackendGraph -pak -prereqs -nodebuginfo -targetplatform=Linux -build -clientconfig=Development -utf8output
UATHelper: Packaging (Linux): Parsing command line: -ScriptsForProject=/home/shehzan/workspace/DummyGame/DummyGame.uproject BuildCookRun -nocompileeditor -nop4 -project=/home/shehzan/workspace/DummyGame/DummyGame.uproject -cook -stage -archive -archivedirectory=/home/shehzan/workspace/Packages/DummyGame-VK-Cesium/ -package -ue4exe=/home/shehzan/workspace/Unr
ealEngine/Engine/Binaries/Linux/UE4Editor -ddc=DerivedDataBackendGraph -pak -prereqs -nodebuginfo -targetplatform=Linux -build -clientconfig=Development -utf8output -compile
UATHelper: Packaging (Linux): Dependencies are up to date (0.167s). Skipping compile.
UATHelper: Packaging (Linux): Setting up ProjectParams for /home/shehzan/workspace/DummyGame/DummyGame.uproject
UATHelper: Packaging (Linux): ********** BUILD COMMAND STARTED **********
UATHelper: Packaging (Linux): Running: mono "/home/shehzan/workspace/UnrealEngine/Engine/Binaries/DotNET/UnrealBuildTool.exe" UnrealPak Linux Development -NoUBTMakefiles -Manifest=/home/shehzan/workspace/UnrealEngine/Engine/Intermediate/Build/Manifest.xml -NoHotReload -log="/home/shehzan/Library/Logs/Unreal Engine/LocalBuildLogs/UBT-UnrealPak-Linux-Development.txt"
UATHelper: Packaging (Linux): Using 'git status' to determine working set for adaptive non-unity build (/home/shehzan/workspace/UnrealEngine).
UATHelper: Packaging (Linux): ------- Build details --------
UATHelper: Packaging (Linux): Using toolchain located at '/home/shehzan/workspace/UnrealEngine/Engine/Extras/ThirdPartyNotUE/SDKs/HostLinux/Linux_x64/v17_clang-10.0.1-centos7/x86_64-unknown-linux-gnu'.
UATHelper: Packaging (Linux): Using clang (/home/shehzan/workspace/UnrealEngine/Engine/Extras/ThirdPartyNotUE/SDKs/HostLinux/Linux_x64/v17_clang-10.0.1-centos7/x86_64-unknown-linux-gnu/bin/clang++) version '10.0.1' (string), 10 (major), 0 (minor), 1 (patch)
UATHelper: Packaging (Linux): Using bundled libc++ standard C++ library.
UATHelper: Packaging (Linux): Using lld linker
UATHelper: Packaging (Linux): Using llvm-ar : /home/shehzan/workspace/UnrealEngine/Engine/Extras/ThirdPartyNotUE/SDKs/HostLinux/Linux_x64/v17_clang-10.0.1-centos7/x86_64-unknown-linux-gnu/bin/llvm-ar
UATHelper: Packaging (Linux): Using fast way to relink circularly dependent libraries (no FixDeps).
UATHelper: Packaging (Linux): ------------------------------
UATHelper: Packaging (Linux): Writing manifest to /home/shehzan/workspace/UnrealEngine/Engine/Intermediate/Build/Manifest.xml
UATHelper: Packaging (Linux): Target is up to date
UATHelper: Packaging (Linux): Total execution time: 1.67 seconds
UATHelper: Packaging (Linux): Took 1.888515s to run mono, ExitCode=0
UATHelper: Packaging (Linux): Running: mono "/home/shehzan/workspace/UnrealEngine/Engine/Binaries/DotNET/UnrealBuildTool.exe" UE4Game Linux Development -NoUBTMakefiles -remoteini="/home/shehzan/workspace/DummyGame" -skipdeploy -Manifest=/home/shehzan/workspace/UnrealEngine/Engine/Intermediate/Build/Manifest.xml -NoHotReload -log="/home/shehzan/Library/Logs/Unreal Engine/LocalBuildLogs/UBT-UE4Game-Linux-Development.txt"
UATHelper: Packaging (Linux): Using 'git status' to determine working set for adaptive non-unity build (/home/shehzan/workspace/UnrealEngine).
UATHelper: Packaging (Linux): ------- Build details --------
UATHelper: Packaging (Linux): Using toolchain located at '/home/shehzan/workspace/UnrealEngine/Engine/Extras/ThirdPartyNotUE/SDKs/HostLinux/Linux_x64/v17_clang-10.0.1-centos7/x86_64-unknown-linux-gnu'.
UATHelper: Packaging (Linux): Using clang (/home/shehzan/workspace/UnrealEngine/Engine/Extras/ThirdPartyNotUE/SDKs/HostLinux/Linux_x64/v17_clang-10.0.1-centos7/x86_64-unknown-linux-gnu/bin/clang++) version '10.0.1' (string), 10 (major), 0 (minor), 1 (patch)
UATHelper: Packaging (Linux): Using bundled libc++ standard C++ library.
UATHelper: Packaging (Linux): Using lld linker
UATHelper: Packaging (Linux): Using llvm-ar : /home/shehzan/workspace/UnrealEngine/Engine/Extras/ThirdPartyNotUE/SDKs/HostLinux/Linux_x64/v17_clang-10.0.1-centos7/x86_64-unknown-linux-gnu/bin/llvm-ar
UATHelper: Packaging (Linux): Using fast way to relink circularly dependent libraries (no FixDeps).
UATHelper: Packaging (Linux): ------------------------------
UATHelper: Packaging (Linux): Writing manifest to /home/shehzan/workspace/UnrealEngine/Engine/Intermediate/Build/Manifest.xml
UATHelper: Packaging (Linux): Target is up to date
UATHelper: Packaging (Linux): Total execution time: 5.47 seconds
UATHelper: Packaging (Linux): Took 5.692703s to run mono, ExitCode=0
UATHelper: Packaging (Linux): ********** BUILD COMMAND COMPLETED **********
UATHelper: Packaging (Linux): ********** COOK COMMAND STARTED **********
UATHelper: Packaging (Linux): Running UE4Editor Cook for project /home/shehzan/workspace/DummyGame/DummyGame.uproject
UATHelper: Packaging (Linux): Commandlet log file is /home/shehzan/workspace/UnrealEngine/Engine/Programs/AutomationTool/Saved/Cook-2021.04.26-11.30.01.txt
UATHelper: Packaging (Linux): Running: /home/shehzan/workspace/UnrealEngine/Engine/Binaries/Linux/UE4Editor /home/shehzan/workspace/DummyGame/DummyGame.uproject -run=Cook -TargetPlatform=LinuxNoEditor -fileopenlog -ddc=DerivedDataBackendGraph -unversioned -abslog=/home/shehzan/workspace/UnrealEngine/Engine/Programs/AutomationTool/Saved/Cook-2021.04.26-11.30.01.txt -stdout -CrashForUAT -unattended -NoLogTimes -UTF8Output
UATHelper: Packaging (Linux): Increasing per-process limit of core file size to infinity.
UATHelper: Packaging (Linux): - Existing per-process limit (soft=18446744073709551615, hard=18446744073709551615) is enough for us (need only 18446744073709551615)
UATHelper: Packaging (Linux): LogInit: Display: Running engine for game: DummyGame
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin MeshPainting
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin XGEController
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin ScreenshotTools
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin MagicLeapMedia
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin MagicLeapLightEstimation
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin LuminPlatformFeatures
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin MagicLeap
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin MagicLeapPassableWorld
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin MLSDK
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin OnlineSubsystemUtils
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin OnlineSubsystem
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin CryptoKeys
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin DataValidation
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin CurveEditorTools
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin SpeedTreeImporter
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin FacialAnimation
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin MobileLauncherProfileWizard
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin LightPropagationVolume
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin PluginBrowser
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin GeometryMode
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin MaterialAnalyzer
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin AssetManagerEditor
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin MacGraphicsSwitching
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin GameplayTagsEditor
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin PropertyAccessNode
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin AnimationSharing
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin KDevelopSourceCodeAccess
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin OnlineSubsystemNull
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin VisualStudioSourceCodeAccess
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin PluginUtils
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin RiderSourceCodeAccess
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin NullSourceCodeAccess
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin PlasticSourceControl
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin CLionSourceCodeAccess
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin PerforceSourceControl
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin XCodeSourceCodeAccess
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin GitSourceControl
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin SubversionSourceControl
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin LauncherChunkInstaller
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin UdpMessaging
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin LevelSequenceEditor
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin Niagara
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin ActorSequence
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin MatineeToLevelSequence
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin TemplateSequence
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin Paper2D
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin VisualStudioCodeSourceCodeAccess
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin CodeLiteSourceCodeAccess
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin EnvironmentQueryEditor
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin UObjectPlugin
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin VariantManagerContent
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin MediaPlayerEditor
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin DatasmithContent
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin TcpMessaging
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin WebMMoviePlayer
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin MediaCompositing
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin SignificanceManager
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin CameraShakePreviewer
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin AndroidMedia
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin AvfMedia
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin WmfMedia
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin AndroidMoviePlayer
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin PhysXVehicles
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin ArchVisCharacter
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin SoundFields
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin SunPosition
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin ImgMedia
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin MobilePatchingUtils
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin PlanarCut
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin Synthesis
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin ExampleDeviceProfileSelector
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin WebMMedia
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin GooglePAD
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin RuntimePhysXCooking
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin GoogleCloudMessaging
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin IOSDeviceProfileSelector
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin EditableMesh
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin ProceduralMeshComponent
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin ActorLayerUtilities
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin AppleMoviePlayer
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin AudioCapture
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin AutomationUtils
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin CableComponent
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin AndroidDeviceProfileSelector
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin CharacterAI
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin AudioSynesthesia
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin WindowsMoviePlayer
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin WebBrowserWidget
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin LinuxDeviceProfileSelector
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin AISupport
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin CustomMeshComponent
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin AssetTags
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin MotoSynth
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin AppleImageUtils
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin GeometryProcessing
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin PostSplashScreen
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin SkeletalReduction
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin PropertyAccessEditor
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin ChunkDownloader
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin GeometryCollectionPlugin
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin LocationServicesBPLibrary
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin PlatformCrypto
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin AlembicImporter
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin ChaosCloth
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin ProxyLODPlugin
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin ChaosNiagara
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin GeometryCache
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin ChaosSolverPlugin
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin BackChannel
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin ChaosClothEditor
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin AndroidPermission
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin ChaosEditor
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin OnlineSubsystemGooglePlay
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin OnlineSubsystemIOS
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin ContentBrowserClassDataSource
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin ContentBrowserAssetDataSource
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin OculusVR
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin SteamVR
UATHelper: Packaging (Linux): LogPluginManager: Mounting plugin CesiumForUnreal
UATHelper: Packaging (Linux): LogInit: Using libcurl 7.65.3-DEV
UATHelper: Packaging (Linux): LogInit: - built for x86_64-unknown-linux-gnu
UATHelper: Packaging (Linux): LogInit: - supports SSL with OpenSSL/1.1.1c
UATHelper: Packaging (Linux): LogInit: - supports HTTP deflate (compression) using libz 1.2.8
UATHelper: Packaging (Linux): LogInit: - other features:
UATHelper: Packaging (Linux): LogInit: CURL_VERSION_SSL
UATHelper: Packaging (Linux): LogInit: CURL_VERSION_LIBZ
UATHelper: Packaging (Linux): LogInit: CURL_VERSION_IPV6
UATHelper: Packaging (Linux): LogInit: CURL_VERSION_ASYNCHDNS
UATHelper: Packaging (Linux): LogInit: CURL_VERSION_LARGEFILE
UATHelper: Packaging (Linux): LogInit: CURL_VERSION_TLSAUTH_SRP
UATHelper: Packaging (Linux): LogInit: CurlRequestOptions (configurable via config and command line):
UATHelper: Packaging (Linux): LogInit: - bVerifyPeer = true - Libcurl will verify peer certificate
UATHelper: Packaging (Linux): LogInit: - bUseHttpProxy = false - Libcurl will NOT use HTTP proxy
UATHelper: Packaging (Linux): LogInit: - bDontReuseConnections = false - Libcurl will reuse connections
UATHelper: Packaging (Linux): LogInit: - MaxHostConnections = 16 - Libcurl will limit the number of connections to a host
UATHelper: Packaging (Linux): LogInit: - LocalHostAddr = Default
UATHelper: Packaging (Linux): LogInit: - BufferSize = 65536
UATHelper: Packaging (Linux): LogOnline: OSS: Creating online subsystem instance for: NULL
UATHelper: Packaging (Linux): LogOnline: OSS: TryLoadSubsystemAndSetDefault: Loaded subsystem for module [NULL]
UATHelper: Packaging (Linux): LogInit: Build: ++UE4+Release-4.26-CL-0
UATHelper: Packaging (Linux): LogInit: Engine Version: 4.26.1-0+++UE4+Release-4.26
UATHelper: Packaging (Linux): LogInit: Compatible Engine Version: 4.26.0-0+++UE4+Release-4.26
UATHelper: Packaging (Linux): LogInit: Net CL: 0
UATHelper: Packaging (Linux): LogInit: OS: GenericOSVersionLabel (GenericOSSubVersionLabel), CPU: AMD Ryzen Threadripper 1950X 16-Core Processor , GPU: GenericGPUBrand
UATHelper: Packaging (Linux): LogInit: Compiled (64-bit): Apr 11 2021 10:30:14
UATHelper: Packaging (Linux): LogInit: Compiled with Clang: 10.0.1
UATHelper: Packaging (Linux): LogInit: Build Configuration: Development
UATHelper: Packaging (Linux): LogInit: Branch Name: ++UE4+Release-4.26
UATHelper: Packaging (Linux): LogInit: Command Line: /home/shehzan/workspace/DummyGame/DummyGame.uproject -run=Cook -TargetPlatform=LinuxNoEditor -fileopenlog -ddc=DerivedDataBackendGraph -unversioned -abslog=/home/shehzan/workspace/UnrealEngine/Engine/Programs/AutomationTool/Saved/Cook-2021.04.26-11.30.01.txt -stdout -CrashForUAT -unattended -NoLogTimes -UTF8Output
UATHelper: Packaging (Linux): LogInit: Base Directory: /home/shehzan/workspace/UnrealEngine/Engine/Binaries/Linux/
UATHelper: Packaging (Linux): LogInit: Allocator: binned2
UATHelper: Packaging (Linux): LogInit: Installed Engine Build: 0
UATHelper: Packaging (Linux): LogDevObjectVersion: Number of dev versions registered: 29
UATHelper: Packaging (Linux): LogDevObjectVersion: Dev-Blueprints (B0D832E4-1F89-4F0D-ACCF-7EB736FD4AA2): 10
UATHelper: Packaging (Linux): LogDevObjectVersion: Dev-Build (E1C64328-A22C-4D53-A36C-8E866417BD8C): 0
UATHelper: Packaging (Linux): LogDevObjectVersion: Dev-Core (375EC13C-06E4-48FB-B500-84F0262A717E): 4
UATHelper: Packaging (Linux): LogDevObjectVersion: Dev-Editor (E4B068ED-F494-42E9-A231-DA0B2E46BB41): 40
UATHelper: Packaging (Linux): LogDevObjectVersion: Dev-Framework (CFFC743F-43B0-4480-9391-14DF171D2073): 37
UATHelper: Packaging (Linux): LogDevObjectVersion: Dev-Mobile (B02B49B5-BB20-44E9-A304-32B752E40360): 3
UATHelper: Packaging (Linux): LogDevObjectVersion: Dev-Networking (A4E4105C-59A1-49B5-A7C5-40C4547EDFEE): 0
UATHelper: Packaging (Linux): LogDevObjectVersion: Dev-Online (39C831C9-5AE6-47DC-9A44-9C173E1C8E7C): 0
UATHelper: Packaging (Linux): LogDevObjectVersion: Dev-Physics (78F01B33-EBEA-4F98-B9B4-84EACCB95AA2): 4
UATHelper: Packaging (Linux): LogDevObjectVersion: Dev-Platform (6631380F-2D4D-43E0-8009-CF276956A95A): 0
UATHelper: Packaging (Linux): LogDevObjectVersion: Dev-Rendering (12F88B9F-8875-4AFC-A67C-D90C383ABD29): 44
UATHelper: Packaging (Linux): LogDevObjectVersion: Dev-Sequencer (7B5AE74C-D270-4C10-A958-57980B212A5A): 12
UATHelper: Packaging (Linux): LogDevObjectVersion: Dev-VR (D7296918-1DD6-4BDD-9DE2-64A83CC13884): 3
UATHelper: Packaging (Linux): LogDevObjectVersion: Dev-LoadTimes (C2A15278-BFE7-4AFE-6C17-90FF531DF755): 1
UATHelper: Packaging (Linux): LogDevObjectVersion: Private-Geometry (6EACA3D4-40EC-4CC1-B786-8BED09428FC5): 3
UATHelper: Packaging (Linux): LogDevObjectVersion: Dev-AnimPhys (29E575DD-E0A3-4627-9D10-D276232CDCEA): 17
UATHelper: Packaging (Linux): LogDevObjectVersion: Dev-Anim (AF43A65D-7FD3-4947-9873-3E8ED9C1BB05): 15
UATHelper: Packaging (Linux): LogDevObjectVersion: Dev-ReflectionCapture (6B266CEC-1EC7-4B8F-A30B-E4D90942FC07): 1
UATHelper: Packaging (Linux): LogDevObjectVersion: Dev-Automation (0DF73D61-A23F-47EA-B727-89E90C41499A): 1
UATHelper: Packaging (Linux): LogDevObjectVersion: FortniteMain (601D1886-AC64-4F84-AA16-D3DE0DEAC7D6): 43
UATHelper: Packaging (Linux): LogDevObjectVersion: FortniteRelease (E7086368-6B23-4C58-8439-1B7016265E91): 1
UATHelper: Packaging (Linux): LogDevObjectVersion: Dev-Enterprise (9DFFBCD6-494F-0158-E221-12823C92A888): 10
UATHelper: Packaging (Linux): LogDevObjectVersion: Dev-Niagara (F2AED0AC-9AFE-416F-8664-AA7FFA26D6FC): 1
UATHelper: Packaging (Linux): LogDevObjectVersion: Dev-Destruction (174F1F0B-B4C6-45A5-B13F-2EE8D0FB917D): 10
UATHelper: Packaging (Linux): LogDevObjectVersion: Dev-Physics-Ext (35F94A83-E258-406C-A318-09F59610247C): 40
UATHelper: Packaging (Linux): LogDevObjectVersion: Dev-PhysicsMaterial-Chaos (B68FC16E-8B1B-42E2-B453-215C058844FE): 1
UATHelper: Packaging (Linux): LogDevObjectVersion: Dev-CineCamera (B2E18506-4273-CFC2-A54E-F4BB758BBA07): 1
UATHelper: Packaging (Linux): LogDevObjectVersion: Dev-VirtualProduction (64F58936-FD1B-42BA-BA96-7289D5D0FA4E): 1
UATHelper: Packaging (Linux): LogDevObjectVersion: Dev-MediaFramework (6F0ED827-A609-4895-9C91-998D90180EA4): 2
UATHelper: Packaging (Linux): LogInit: Presizing for max 25165824 objects, including 0 objects not considered by GC, pre-allocating 0 bytes for permanent pool.
UATHelper: Packaging (Linux): LogConfig: Applying CVar settings from Section [/Script/Engine.StreamingSettings] File [/home/shehzan/workspace/DummyGame/Saved/Config/Linux/Engine.ini]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[s.MinBulkDataSizeForAsyncLoading:131072]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[s.AsyncLoadingThreadEnabled:0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[s.EventDrivenLoaderEnabled:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[s.WarnIfTimeLimitExceeded:0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[s.TimeLimitExceededMultiplier:1.5]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[s.TimeLimitExceededMinTime:0.005]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[s.UseBackgroundLevelStreaming:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[s.PriorityAsyncLoadingExtraTime:15.0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[s.LevelStreamingActorsUpdateTimeLimit:5.0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[s.PriorityLevelStreamingActorsUpdateExtraTime:5.0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[s.LevelStreamingComponentsRegistrationGranularity:10]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[s.UnregisterComponentsTimeLimit:1.0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[s.LevelStreamingComponentsUnregistrationGranularity:5]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[s.FlushStreamingOnExit:1]]
UATHelper: Packaging (Linux): LogInit: Object subsystem initialized
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[con.DebugEarlyDefault:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.setres:1280x720]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[con.DebugEarlyDefault:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.setres:1280x720]]
UATHelper: Packaging (Linux): LogConfig: Applying CVar settings from Section [/Script/Engine.RendererSettings] File [/home/shehzan/workspace/DummyGame/Saved/Config/Linux/Engine.ini]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.GPUCrashDebugging:0]]
UATHelper: Packaging (Linux): LogConfig: Applying CVar settings from Section [/Script/Engine.RendererOverrideSettings] File [/home/shehzan/workspace/DummyGame/Saved/Config/Linux/Engine.ini]
UATHelper: Packaging (Linux): LogConfig: Applying CVar settings from Section [/Script/Engine.StreamingSettings] File [/home/shehzan/workspace/DummyGame/Saved/Config/Linux/Engine.ini]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[s.MinBulkDataSizeForAsyncLoading:131072]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[s.AsyncLoadingThreadEnabled:0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[s.EventDrivenLoaderEnabled:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[s.WarnIfTimeLimitExceeded:0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[s.TimeLimitExceededMultiplier:1.5]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[s.TimeLimitExceededMinTime:0.005]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[s.UseBackgroundLevelStreaming:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[s.PriorityAsyncLoadingExtraTime:15.0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[s.LevelStreamingActorsUpdateTimeLimit:5.0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[s.PriorityLevelStreamingActorsUpdateExtraTime:5.0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[s.LevelStreamingComponentsRegistrationGranularity:10]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[s.UnregisterComponentsTimeLimit:1.0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[s.LevelStreamingComponentsUnregistrationGranularity:5]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[s.FlushStreamingOnExit:1]]
UATHelper: Packaging (Linux): LogConfig: Applying CVar settings from Section [/Script/Engine.GarbageCollectionSettings] File [/home/shehzan/workspace/DummyGame/Saved/Config/Linux/Engine.ini]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[gc.MaxObjectsNotConsideredByGC:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[gc.SizeOfPermanentObjectPool:0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[gc.FlushStreamingOnGC:0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[gc.NumRetriesBeforeForcingGC:10]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[gc.AllowParallelGC:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[gc.TimeBetweenPurgingPendingKillObjects:61.1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[gc.MaxObjectsInEditor:25165824]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[gc.IncrementalBeginDestroyEnabled:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[gc.CreateGCClusters:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[gc.MinGCClusterSize:5]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[gc.ActorClusteringEnabled:0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[gc.BlueprintClusteringEnabled:0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[gc.UseDisregardForGCOnDedicatedServers:0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[gc.MultithreadedDestructionEnabled:1]]
UATHelper: Packaging (Linux): LogConfig: Applying CVar settings from Section [/Script/Engine.NetworkSettings] File [/home/shehzan/workspace/DummyGame/Saved/Config/Linux/Engine.ini]
UATHelper: Packaging (Linux): LogConfig: Applying CVar settings from Section [/Script/UnrealEd.CookerSettings] File [/home/shehzan/workspace/DummyGame/Saved/Config/Linux/Engine.ini]
UATHelper: Packaging (Linux): LogConfig: Applying CVar settings from Section [ViewDistanceQuality@3] File [/home/shehzan/workspace/DummyGame/Saved/Config/Linux/Scalability.ini]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SkeletalMeshLODBias:0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.ViewDistanceScale:1.0]]
UATHelper: Packaging (Linux): LogConfig: Applying CVar settings from Section [AntiAliasingQuality@3] File [/home/shehzan/workspace/DummyGame/Saved/Config/Linux/Scalability.ini]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.PostProcessAAQuality:4]]
UATHelper: Packaging (Linux): LogConfig: Applying CVar settings from Section [ShadowQuality@3] File [/home/shehzan/workspace/DummyGame/Saved/Config/Linux/Scalability.ini]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.LightFunctionQuality:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.ShadowQuality:5]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.Shadow.CSM.MaxCascades:10]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.Shadow.MaxResolution:2048]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.Shadow.MaxCSMResolution:2048]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.Shadow.RadiusThreshold:0.01]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.Shadow.DistanceScale:1.0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.Shadow.CSM.TransitionScale:1.0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.Shadow.PreShadowResolutionFactor:1.0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.DistanceFieldShadowing:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.DistanceFieldAO:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.AOQuality:2]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.VolumetricFog:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.VolumetricFog.GridPixelSize:8]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.VolumetricFog.GridSizeZ:128]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.VolumetricFog.HistoryMissSupersampleCount:4]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.LightMaxDrawDistanceScale:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.CapsuleShadows:1]]
UATHelper: Packaging (Linux): LogConfig: Applying CVar settings from Section [PostProcessQuality@3] File [/home/shehzan/workspace/DummyGame/Saved/Config/Linux/Scalability.ini]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.MotionBlurQuality:4]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.AmbientOcclusionMipLevelFactor:0.4]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.AmbientOcclusionMaxQuality:100]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.AmbientOcclusionLevels:-1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.AmbientOcclusionRadiusScale:1.0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.DepthOfFieldQuality:2]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.RenderTargetPoolMin:400]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.LensFlareQuality:2]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SceneColorFringeQuality:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.EyeAdaptationQuality:2]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.BloomQuality:5]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.FastBlurThreshold:100]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.Upscale.Quality:3]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.Tonemapper.GrainQuantization:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.LightShaftQuality:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.Filter.SizeScale:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.Tonemapper.Quality:5]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.DOF.Gather.AccumulatorQuality:1 ; higher gathering accumulator quality]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.DOF.Gather.PostfilterMethod:1 ; Median3x3 postfilering method]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.DOF.Gather.EnableBokehSettings:0 ; no bokeh simulation when gathering]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.DOF.Gather.RingCount:4 ; medium number of samples when gathering]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.DOF.Scatter.ForegroundCompositing:1 ; additive foreground scattering]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.DOF.Scatter.BackgroundCompositing:2 ; additive background scattering]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.DOF.Scatter.EnableBokehSettings:1 ; bokeh simulation when scattering]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.DOF.Scatter.MaxSpriteRatio:0.1 ; only a maximum of 10% of scattered bokeh]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.DOF.Recombine.Quality:1 ; cheap slight out of focus]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.DOF.Recombine.EnableBokehSettings:0 ; no bokeh simulation on slight out of focus]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.DOF.TemporalAAQuality:1 ; more stable temporal accumulation]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.DOF.Kernel.MaxForegroundRadius:0.025]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.DOF.Kernel.MaxBackgroundRadius:0.025]]
UATHelper: Packaging (Linux): LogConfig: Applying CVar settings from Section [TextureQuality@3] File [/home/shehzan/workspace/DummyGame/Saved/Config/Linux/Scalability.ini]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.Streaming.MipBias:0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.Streaming.AmortizeCPUToGPUCopy:0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.Streaming.MaxNumTexturesToStreamPerFrame:0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.Streaming.Boost:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.MaxAnisotropy:8]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.VT.MaxAnisotropy:8]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.Streaming.LimitPoolSizeToVRAM:0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.Streaming.PoolSize:1000]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.Streaming.MaxEffectiveScreenSize:0]]
UATHelper: Packaging (Linux): LogConfig: Applying CVar settings from Section [EffectsQuality@3] File [/home/shehzan/workspace/DummyGame/Saved/Config/Linux/Scalability.ini]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.TranslucencyLightingVolumeDim:64]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.RefractionQuality:2]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SSR.Quality:3]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SSR.HalfResSceneColor:0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SceneColorFormat:4]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.DetailMode:2]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.TranslucencyVolumeBlur:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.MaterialQualityLevel:1 ; High quality]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.AnisotropicMaterials:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SSS.Scale:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SSS.SampleSet:2]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SSS.Quality:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SSS.HalfRes:0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SSGI.Quality:3]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.EmitterSpawnRateScale:1.0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.ParticleLightQuality:2]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SkyAtmosphere.AerialPerspectiveLUT.FastApplyOnOpaque:1 ; Always have FastSkyLUT 1 in this case to avoid wrong sky]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SkyAtmosphere.AerialPerspectiveLUT.SampleCountMaxPerSlice:4]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SkyAtmosphere.AerialPerspectiveLUT.DepthResolution:16.0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SkyAtmosphere.FastSkyLUT:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SkyAtmosphere.FastSkyLUT.SampleCountMin:4.0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SkyAtmosphere.FastSkyLUT.SampleCountMax:128.0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SkyAtmosphere.SampleCountMin:4.0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SkyAtmosphere.SampleCountMax:128.0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SkyAtmosphere.TransmittanceLUT.UseSmallFormat:0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SkyAtmosphere.TransmittanceLUT.SampleCount:10.0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SkyAtmosphere.MultiScatteringLUT.SampleCount:15.0]]
UATHelper: Packaging (Linux): LogConfig: Applying CVar settings from Section [FoliageQuality@3] File [/home/shehzan/workspace/DummyGame/Saved/Config/Linux/Scalability.ini]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[foliage.DensityScale:1.0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[grass.DensityScale:1.0]]
UATHelper: Packaging (Linux): LogConfig: Applying CVar settings from Section [ShadingQuality@3] File [/home/shehzan/workspace/DummyGame/Saved/Config/Linux/Scalability.ini]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.HairStrands.SkyLighting.IntegrationType:2]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.HairStrands.SkyAO.SampleCount:4]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.HairStrands.Visibility.MSAA.SamplePerPixel:4]]
UATHelper: Packaging (Linux): LogLinux: Selected Device Profile: [Linux]
UATHelper: Packaging (Linux): LogInit: Applying CVar settings loaded from the selected device profile: [Linux]
UATHelper: Packaging (Linux): LogHAL: Display: Platform has ~ 4 GB [135001878528 / 4294967296 / 126], which maps to Smallest [LargestMinGB=32, LargerMinGB=12, DefaultMinGB=8, SmallerMinGB=6, SmallestMinGB=0)
UATHelper: Packaging (Linux): LogHAL: Display: Platform has ~ 4 GB [135001878528 / 4294967296 / 126], which maps to Smallest [LargestMinGB=32, LargerMinGB=12, DefaultMinGB=8, SmallerMinGB=6, SmallestMinGB=0)
UATHelper: Packaging (Linux): LogInit: Going up to parent DeviceProfile []
UATHelper: Packaging (Linux): LogConfig: Applying CVar settings from Section [ViewDistanceQuality@3] File [/home/shehzan/workspace/DummyGame/Saved/Config/Linux/Scalability.ini]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SkeletalMeshLODBias:0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.ViewDistanceScale:1.0]]
UATHelper: Packaging (Linux): LogConfig: Applying CVar settings from Section [AntiAliasingQuality@3] File [/home/shehzan/workspace/DummyGame/Saved/Config/Linux/Scalability.ini]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.PostProcessAAQuality:4]]
UATHelper: Packaging (Linux): LogConfig: Applying CVar settings from Section [ShadowQuality@3] File [/home/shehzan/workspace/DummyGame/Saved/Config/Linux/Scalability.ini]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.LightFunctionQuality:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.ShadowQuality:5]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.Shadow.CSM.MaxCascades:10]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.Shadow.MaxResolution:2048]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.Shadow.MaxCSMResolution:2048]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.Shadow.RadiusThreshold:0.01]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.Shadow.DistanceScale:1.0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.Shadow.CSM.TransitionScale:1.0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.Shadow.PreShadowResolutionFactor:1.0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.DistanceFieldShadowing:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.DistanceFieldAO:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.AOQuality:2]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.VolumetricFog:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.VolumetricFog.GridPixelSize:8]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.VolumetricFog.GridSizeZ:128]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.VolumetricFog.HistoryMissSupersampleCount:4]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.LightMaxDrawDistanceScale:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.CapsuleShadows:1]]
UATHelper: Packaging (Linux): LogConfig: Applying CVar settings from Section [PostProcessQuality@3] File [/home/shehzan/workspace/DummyGame/Saved/Config/Linux/Scalability.ini]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.MotionBlurQuality:4]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.AmbientOcclusionMipLevelFactor:0.4]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.AmbientOcclusionMaxQuality:100]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.AmbientOcclusionLevels:-1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.AmbientOcclusionRadiusScale:1.0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.DepthOfFieldQuality:2]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.RenderTargetPoolMin:400]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.LensFlareQuality:2]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SceneColorFringeQuality:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.EyeAdaptationQuality:2]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.BloomQuality:5]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.FastBlurThreshold:100]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.Upscale.Quality:3]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.Tonemapper.GrainQuantization:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.LightShaftQuality:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.Filter.SizeScale:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.Tonemapper.Quality:5]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.DOF.Gather.AccumulatorQuality:1 ; higher gathering accumulator quality]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.DOF.Gather.PostfilterMethod:1 ; Median3x3 postfilering method]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.DOF.Gather.EnableBokehSettings:0 ; no bokeh simulation when gathering]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.DOF.Gather.RingCount:4 ; medium number of samples when gathering]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.DOF.Scatter.ForegroundCompositing:1 ; additive foreground scattering]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.DOF.Scatter.BackgroundCompositing:2 ; additive background scattering]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.DOF.Scatter.EnableBokehSettings:1 ; bokeh simulation when scattering]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.DOF.Scatter.MaxSpriteRatio:0.1 ; only a maximum of 10% of scattered bokeh]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.DOF.Recombine.Quality:1 ; cheap slight out of focus]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.DOF.Recombine.EnableBokehSettings:0 ; no bokeh simulation on slight out of focus]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.DOF.TemporalAAQuality:1 ; more stable temporal accumulation]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.DOF.Kernel.MaxForegroundRadius:0.025]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.DOF.Kernel.MaxBackgroundRadius:0.025]]
UATHelper: Packaging (Linux): LogConfig: Applying CVar settings from Section [TextureQuality@3] File [/home/shehzan/workspace/DummyGame/Saved/Config/Linux/Scalability.ini]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.Streaming.MipBias:0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.Streaming.AmortizeCPUToGPUCopy:0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.Streaming.MaxNumTexturesToStreamPerFrame:0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.Streaming.Boost:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.MaxAnisotropy:8]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.VT.MaxAnisotropy:8]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.Streaming.LimitPoolSizeToVRAM:0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.Streaming.PoolSize:1000]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.Streaming.MaxEffectiveScreenSize:0]]
UATHelper: Packaging (Linux): LogConfig: Applying CVar settings from Section [EffectsQuality@3] File [/home/shehzan/workspace/DummyGame/Saved/Config/Linux/Scalability.ini]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.TranslucencyLightingVolumeDim:64]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.RefractionQuality:2]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SSR.Quality:3]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SSR.HalfResSceneColor:0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SceneColorFormat:4]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.DetailMode:2]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.TranslucencyVolumeBlur:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.MaterialQualityLevel:1 ; High quality]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.AnisotropicMaterials:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SSS.Scale:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SSS.SampleSet:2]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SSS.Quality:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SSS.HalfRes:0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SSGI.Quality:3]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.EmitterSpawnRateScale:1.0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.ParticleLightQuality:2]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SkyAtmosphere.AerialPerspectiveLUT.FastApplyOnOpaque:1 ; Always have FastSkyLUT 1 in this case to avoid wrong sky]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SkyAtmosphere.AerialPerspectiveLUT.SampleCountMaxPerSlice:4]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SkyAtmosphere.AerialPerspectiveLUT.DepthResolution:16.0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SkyAtmosphere.FastSkyLUT:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SkyAtmosphere.FastSkyLUT.SampleCountMin:4.0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SkyAtmosphere.FastSkyLUT.SampleCountMax:128.0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SkyAtmosphere.SampleCountMin:4.0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SkyAtmosphere.SampleCountMax:128.0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SkyAtmosphere.TransmittanceLUT.UseSmallFormat:0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SkyAtmosphere.TransmittanceLUT.SampleCount:10.0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.SkyAtmosphere.MultiScatteringLUT.SampleCount:15.0]]
UATHelper: Packaging (Linux): LogConfig: Applying CVar settings from Section [FoliageQuality@3] File [/home/shehzan/workspace/DummyGame/Saved/Config/Linux/Scalability.ini]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[foliage.DensityScale:1.0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[grass.DensityScale:1.0]]
UATHelper: Packaging (Linux): LogConfig: Applying CVar settings from Section [ShadingQuality@3] File [/home/shehzan/workspace/DummyGame/Saved/Config/Linux/Scalability.ini]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.HairStrands.SkyLighting.IntegrationType:2]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.HairStrands.SkyAO.SampleCount:4]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[r.HairStrands.Visibility.MSAA.SamplePerPixel:4]]
UATHelper: Packaging (Linux): LogConfig: Applying CVar settings from Section [Startup] File [../../../Engine/Config/ConsoleVariables.ini]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[net.UseAdaptiveNetUpdateFrequency:0]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[p.chaos.AllowCreatePhysxBodies:1]]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[fx.SkipVectorVMBackendOptimizations:1]]
UATHelper: Packaging (Linux): LogConfig: Applying CVar settings from Section [ConsoleVariables] File [/home/shehzan/workspace/DummyGame/Saved/Config/Linux/Engine.ini]
UATHelper: Packaging (Linux): LogConfig: Setting CVar [[g.TimeoutForBlockOnRenderFence:60000]]
UATHelper: Packaging (Linux): LogInit: Unix hardware info:
UATHelper: Packaging (Linux): LogInit: - we are not the first instance of this executable
UATHelper: Packaging (Linux): LogInit: - this process' id (pid) is 279369, parent process' id (ppid) is 279219
UATHelper: Packaging (Linux): LogInit: - we are not running under debugger
UATHelper: Packaging (Linux): LogInit: - machine network name is 'deuterium'
UATHelper: Packaging (Linux): LogInit: - user name is 'shehzan' (shehzan)
UATHelper: Packaging (Linux): LogInit: - we're logged in locally
UATHelper: Packaging (Linux): LogInit: - we're running without rendering
UATHelper: Packaging (Linux): LogInit: - CPU: AuthenticAMD 'AMD Ryzen Threadripper 1950X 16-Core Processor ' (signature: 0x800F11)
UATHelper: Packaging (Linux): LogInit: - Number of physical cores available for the process: 16
UATHelper: Packaging (Linux): LogInit: - Number of logical cores available for the process: 32
UATHelper: Packaging (Linux): LogInit: - Cache line size: 64
UATHelper: Packaging (Linux): LogInit: - Memory allocator used: binned2
UATHelper: Packaging (Linux): LogInit: - This binary is optimized with LTO: no, PGO: no, instrumented for PGO data collection: no
UATHelper: Packaging (Linux): LogInit: - This is an internal build.
UATHelper: Packaging (Linux): LogCore: Benchmarking clocks:
UATHelper: Packaging (Linux): LogCore: - CLOCK_MONOTONIC (id=1) can sustain 39618256 (39618K, 40M) calls per second without zero deltas.
UATHelper: Packaging (Linux): LogCore: - CLOCK_MONOTONIC_RAW (id=4) can sustain 38769656 (38770K, 39M) calls per second without zero deltas.
UATHelper: Packaging (Linux): LogCore: - CLOCK_MONOTONIC_COARSE (id=6) can sustain 169389488 (169389K, 169M) calls per second with 99.999847% zero deltas.
UATHelper: Packaging (Linux): LogCore: Selected clock_id 1 (CLOCK_MONOTONIC) since it is the fastest support clock without zero deltas.
UATHelper: Packaging (Linux): LogInit: Unix-specific commandline switches:
UATHelper: Packaging (Linux): LogInit: -ansimalloc - use malloc()/free() from libc (useful for tools like valgrind and electric fence)
UATHelper: Packaging (Linux): LogInit: -jemalloc - use jemalloc for all memory allocation
UATHelper: Packaging (Linux): LogInit: -binnedmalloc - use binned malloc for all memory allocation
UATHelper: Packaging (Linux): LogInit: -filemapcachesize=NUMBER - set the size for case-sensitive file mapping cache
UATHelper: Packaging (Linux): LogInit: -useksm - uses kernel same-page mapping (KSM) for mapped memory (OFF)
UATHelper: Packaging (Linux): LogInit: -ksmmergeall - marks all mmap'd memory pages suitable for KSM (OFF)
UATHelper: Packaging (Linux): LogInit: -preloadmodulesymbols - Loads the main module symbols file into memory (OFF)
UATHelper: Packaging (Linux): LogInit: -sigdfl=SIGNAL - Allows a specific signal to be set to its default handler rather then ignoring the signal
UATHelper: Packaging (Linux): LogInit: -httpproxy=ADDRESS:PORT - redirects HTTP requests to a proxy (only supported if compiled with libcurl)
UATHelper: Packaging (Linux): LogInit: -reuseconn - allow libcurl to reuse HTTP connections (only matters if compiled with libcurl)
UATHelper: Packaging (Linux): LogInit: -virtmemkb=NUMBER - sets process virtual memory (address space) limit (overrides VirtualMemoryLimitInKB value from .ini)
UATHelper: Packaging (Linux): LogInit: - Physical RAM available (not considering process quota): 126 GB (128747 MB, 131837772 KB, 135001878528 bytes)
UATHelper: Packaging (Linux): LogInit: - VirtualMemoryAllocator pools will grow at scale 1.4
UATHelper: Packaging (Linux): LogInit: - MemoryRangeDecommit() will be a no-op (re-run with -vmapoolevict to change)
UATHelper: Packaging (Linux): LogTargetPlatformManager: Display: Loaded TargetPlatform 'Linux'
UATHelper: Packaging (Linux): LogTargetPlatformManager: Display: Loaded TargetPlatform 'Linux'
UATHelper: Packaging (Linux): LogTargetPlatformManager: Display: Loaded TargetPlatform 'LinuxNoEditor'
UATHelper: Packaging (Linux): LogTargetPlatformManager: Display: Loaded TargetPlatform 'LinuxNoEditor'
UATHelper: Packaging (Linux): LogTargetPlatformManager: Display: Loaded TargetPlatform 'LinuxClient'
UATHelper: Packaging (Linux): LogTargetPlatformManager: Display: Loaded TargetPlatform 'LinuxClient'
UATHelper: Packaging (Linux): LogTargetPlatformManager: Display: Loaded TargetPlatform 'LinuxServer'
UATHelper: Packaging (Linux): LogTargetPlatformManager: Display: Loaded TargetPlatform 'LinuxServer'
UATHelper: Packaging (Linux): LogTargetPlatformManager: Display: Loaded TargetPlatform 'Android'
UATHelper: Packaging (Linux): LogTargetPlatformManager: Display: Loaded TargetPlatform 'Android'
UATHelper: Packaging (Linux): LogTargetPlatformManager: Display: Loaded TargetPlatform 'Android_ASTC'
UATHelper: Packaging (Linux): LogTargetPlatformManager: Display: Loaded TargetPlatform 'Android_ASTC'
UATHelper: Packaging (Linux): LogTargetPlatformManager: Display: Loaded TargetPlatform 'Android_DXT'
UATHelper: Packaging (Linux): LogTargetPlatformManager: Display: Loaded TargetPlatform 'Android_DXT'
UATHelper: Packaging (Linux): LogTargetPlatformManager: Display: Loaded TargetPlatform 'Android_ETC2'
UATHelper: Packaging (Linux): LogTargetPlatformManager: Display: Loaded TargetPlatform 'Android_ETC2'
UATHelper: Packaging (Linux): LogTargetPlatformManager: Display: Loaded TargetPlatform 'AndroidClient'
UATHelper: Packaging (Linux): LogTargetPlatformManager: Display: Loaded TargetPlatform 'AndroidClient'
UATHelper: Packaging (Linux): LogTargetPlatformManager: Display: Loaded TargetPlatform 'Android_ASTCClient'
UATHelper: Packaging (Linux): LogTargetPlatformManager: Display: Loaded TargetPlatform 'Android_ASTCClient'
UATHelper: Packaging (Linux): LogTargetPlatformManager: Display: Loaded TargetPlatform 'Android_DXTClient'
UATHelper: Packaging (Linux): LogTargetPlatformManager: Display: Loaded TargetPlatform 'Android_DXTClient'
UATHelper: Packaging (Linux): LogTargetPlatformManager: Display: Loaded TargetPlatform 'Android_ETC2Client'
UATHelper: Packaging (Linux): LogTargetPlatformManager: Display: Loaded TargetPlatform 'Android_ETC2Client'
UATHelper: Packaging (Linux): LogTargetPlatformManager: Display: Loaded TargetPlatform 'Android_Multi'
UATHelper: Packaging (Linux): LogTargetPlatformManager: Display: Loaded TargetPlatform 'Android_Multi'
UATHelper: Packaging (Linux): LogTargetPlatformManager: Display: Loaded TargetPlatform 'Android_MultiClient'
UATHelper: Packaging (Linux): LogTargetPlatformManager: Display: Loaded TargetPlatform 'Android_MultiClient'
UATHelper: Packaging (Linux): LogTargetPlatformManager: Display: Loaded TargetPlatform 'LinuxAArch64NoEditor'
UATHelper: Packaging (Linux): LogTargetPlatformManager: Display: Loaded TargetPlatform 'LinuxAArch64NoEditor'
UATHelper: Packaging (Linux): LogTargetPlatformManager: Display: Loaded TargetPlatform 'LinuxAArch64Client'
UATHelper: Packaging (Linux): LogTargetPlatformManager: Display: Loaded TargetPlatform 'LinuxAArch64Client'
UATHelper: Packaging (Linux): LogTargetPlatformManager: Display: Loaded TargetPlatform 'LinuxAArch64Server'
UATHelper: Packaging (Linux): LogTargetPlatformManager: Display: Loaded TargetPlatform 'LinuxAArch64Server'
UATHelper: Packaging (Linux): LogTargetPlatformManager: Display: Building Assets For LinuxNoEditor
UATHelper: Packaging (Linux): LogTargetPlatformManager: Display: Building Assets For LinuxNoEditor
UATHelper: Packaging (Linux): LogInit: Physics initialised using underlying interface: PhysX
UATHelper: Packaging (Linux): LogInit: Using OS detected language (en-US).
UATHelper: Packaging (Linux): LogInit: Using OS detected locale (en-US).
UATHelper: Packaging (Linux): LogTextLocalizationManager: No specific localization for 'en-US' exists, so the 'en' localization will be used.
UATHelper: Packaging (Linux): LogRendererCore: Ray tracing is disabled. Reason: r.RayTracing=0.
UATHelper: Packaging (Linux): LogShaderCompilers: Guid format shader working directory is 13 characters bigger than the processId version (../../../../DummyGame/Intermediate/Shaders/WorkingDirectory/279369/).
UATHelper: Packaging (Linux): LogShaderCompilers: Cleaned the shader compiler working directory '../../../../DummyGame/Intermediate/Shaders/tmp/977D2B2C1FCE4863A73E317F80D63CBA/'.
UATHelper: Packaging (Linux): LogShaderCompilers: Display: Using Local Shader Compiler.
UATHelper: Packaging (Linux): LogShaderCompilers: Display: Using Local Shader Compiler.
UATHelper: Packaging (Linux): LogDerivedDataCache: Display: Max Cache Size: 512 MB
UATHelper: Packaging (Linux): LogDerivedDataCache: Display: Max Cache Size: 512 MB
UATHelper: Packaging (Linux): LogDerivedDataCache: Loaded boot cache 0.04s 97MB ../../../../DummyGame/DerivedDataCache/Boot.ddc.
UATHelper: Packaging (Linux): LogDerivedDataCache: Display: Loaded Boot cache: ../../../../DummyGame/DerivedDataCache/Boot.ddc
UATHelper: Packaging (Linux): LogDerivedDataCache: Display: Loaded Boot cache: ../../../../DummyGame/DerivedDataCache/Boot.ddc
UATHelper: Packaging (Linux): LogDerivedDataCache: FDerivedDataBackendGraph: Pak pak cache file ../../../../DummyGame/DerivedDataCache/DDC.ddp not found, will not use a pak cache.
UATHelper: Packaging (Linux): LogDerivedDataCache: Unable to find inner node Pak for hierarchical cache Hierarchy.
UATHelper: Packaging (Linux): LogDerivedDataCache: FDerivedDataBackendGraph: EnginePak pak cache file ../../../Engine/DerivedDataCache/DDC.ddp not found, will not use a pak cache.
UATHelper: Packaging (Linux): LogDerivedDataCache: Unable to find inner node EnginePak for hierarchical cache Hierarchy.
UATHelper: Packaging (Linux): LogDerivedDataCache: Speed tests for ../../../Engine/DerivedDataCache took 0.00 seconds
UATHelper: Packaging (Linux): LogDerivedDataCache: Display: Performance to ../../../Engine/DerivedDataCache: Latency=0.01ms. RandomReadSpeed=2333.15MBs, RandomWriteSpeed=519.06MBs. Assigned SpeedClass 'Local'
UATHelper: Packaging (Linux): LogDerivedDataCache: Display: Performance to ../../../Engine/DerivedDataCache: Latency=0.01ms. RandomReadSpeed=2333.15MBs, RandomWriteSpeed=519.06MBs. Assigned SpeedClass 'Local'
UATHelper: Packaging (Linux): LogDerivedDataCache: Using Local data cache path ../../../Engine/DerivedDataCache: Writable
UATHelper: Packaging (Linux): LogDerivedDataCache: Shared data cache path not found in *engine.ini, will not use an Shared cache.
UATHelper: Packaging (Linux): LogDerivedDataCache: Unable to find inner node Shared for hierarchical cache Hierarchy.
UATHelper: Packaging (Linux): LogAssetRegistry: FAssetRegistry took 0.0018 seconds to start up
UATHelper: Packaging (Linux): LogLinux: Selected Device Profile: [Linux]
UATHelper: Packaging (Linux): LogInit: Active device profile: [0x7f9a0312ab00][0x7f9a031d4600 49] Linux
UATHelper: Packaging (Linux): [0426/113053:ERROR:browser_main_loop.cc(217)] Running without the SUID sandbox! See https://chromium.googlesource.com/chromium/src/+/master/docs/linux_suid_sandbox_development.md for more information on developing with the sandbox on.
UATHelper: Packaging (Linux): LogInit: Profiles: [0x7f9a03122400][0x7f9a0d678bc0 49] Windows, [0x7f9a03122300][0x7f9a0d675d40 49] WindowsNoEditor, [0x7f9a03126300][0x7f9a0d677480 49] WindowsServer, [0x7f9a03126200][0x7f9a0d672ec0 49] WindowsClient, [0x7f9a03126100][0x7f9a0d671780 49] IOS, [0x7f9a03126000][0x7f9a0d670040 49] iPadAir, [0x7f9a03125f00][0x7f9a
0315e8c0 49] iPadAir2, [0x7f9a03125d00][0x7f9a0315ba40 49] IPadPro, [0x7f9a03128500][0x7f9a0315d180 49] iPadAir3, [0x7f9a03125c00][0x7f9a0315a300 49] iPadAir4, [0x7f9a03124300][0x7f9a03158bc0 49] iPadMini2, [0x7f9a03124400][0x7f9a03157480 49] iPadMini3, [0x7f9a03124500][0x7f9a03155d40 49] iPadMini4, [0x7f9a03124600][0x7f9a03154600 49] iPadMini5, [0x7f9a03124
700][0x7f9a03151780 49] iPhone6, [0x7f9a03126400][0x7f9a03152ec0 49] iPodTouch6, [0x7f9a03124800][0x7f9a031ae8c0 49] iPhone7, [0x7f9a03128a00][0x7f9a03150040 49] iPodTouch7, [0x7f9a0312c200][0x7f9a031ad180 49] iPhone5S, [0x7f9a03124900][0x7f9a031aba40 49] iPhone6Plus, [0x7f9a03127200][0x7f9a031aa300 49] iPhone6S, [0x7f9a03124a00][0x7f9a031a8bc0 49] iPhone6SP
lus, [0x7f9a03124b00][0x7f9a031a7480 49] iPhone7Plus, [0x7f9a03124c00][0x7f9a031a5d40 49] iPhoneSE, [0x7f9a03124d00][0x7f9a031a4600 49] iPhone8, [0x7f9a03124e00][0x7f9a031a2ec0 49] iPhone8Plus, [0x7f9a03127800][0x7f9a031a1780 49] iPhoneX, [0x7f9a03124f00][0x7f9a031a0040 49] iPhoneXS, [0x7f9a03125000][0x7f9a031be8c0 49] iPhoneXSMax, [0x7f9a03125100][0x7f9a031
bd180 49] iPhoneXR, [0x7f9a03125200][0x7f9a031bba40 49] iPhone11, [0x7f9a03125300][0x7f9a031ba300 49] iPhone11Pro, [0x7f9a03125400][0x7f9a031b8bc0 49] iPhone11ProMax, [0x7f9a03125500][0x7f9a031b7480 49] iPhoneSE2, [0x7f9a03125600][0x7f9a031b5d40 49] iPhone12Mini, [0x7f9a03125700][0x7f9a031b4600 49] iPhone12, [0x7f9a03125800][0x7f9a031b2ec0 49] iPhone12Pro, [
0x7f9a03125900][0x7f9a031b1780 49] iPhone12ProMax, [0x7f9a0312af00][0x7f9a031b0040 49] iPadPro105, [0x7f9a03125a00][0x7f9a031ce8c0 49] iPadPro129, [0x7f9a03125b00][0x7f9a031cd180 49] iPadPro97, [0x7f9a0312c600][0x7f9a031cba40 49] iPadPro2_129, [0x7f9a0312c500][0x7f9a031ca300 49] iPad5, [0x7f9a0312c400][0x7f9a031c8bc0 49] iPad6, [0x7f9a0312c300][0x7f9a031c748
0 49] iPad7, [0x7f9a03128900][0x7f9a031c5d40 49] iPad8, [0x7f9a03128800][0x7f9a031c4600 49] iPadPro11, [0x7f9a03128700][0x7f9a031c2ec0 49] iPadPro2_11, [0x7f9a0fb7c900][0x7f9a031c1780 49] iPadPro3_129, [0x7f9a0fb7c700][0x7f9a031c0040 49] iPadPro4_129, [0x7f9a0fb7ff00][0x7f9a031de8c0 49] AppleTV, [0x7f9a0fb7fe00][0x7f9a031dd180 49] AppleTV4K, [0x7f9a0312a600]
[0x7f9a031dba40 49] TVOS, [0x7f9a0312a700][0x7f9a031da300 49] Mac, [0x7f9a0312a800][0x7f9a031d8bc0 49] MacClient, [0x7f9a0312a900][0x7f9a031d7480 49] MacNoEditor, [0x7f9a0312aa00][0x7f9a031d5d40 49] MacServer, [0x7f9a0312ab00][0x7f9a031d4600 49] Linux, [0x7f9a0312ac00][0x7f9a031d2ec0 49] LinuxNoEditor, [0x7f9a0312ad00][0x7f9a031d1780 49] LinuxAArch64NoEditor
, [0x7f9a0312ae00][0x7f9a031d0040 49] LinuxClient, [0x7f9a0312b000][0x7f9a031ee8c0 49] LinuxAArch64Client, [0x7f9a0312b100][0x7f9a031ed180 49] LinuxServer, [0x7f9a0312b200][0x7f9a031eba40 49] LinuxAArch64Server, [0x7f9a0312b300][0x7f9a031ea300 49] Android, [0x7f9a0312b400][0x7f9a031e8bc0 49] Android_Low, [0x7f9a0312b500][0x7f9a031e7480 49] Android_Mid, [0x7f
9a03123200][0x7f9a031e5d40 49] Android_High, [0x7f9a03123100][0x7f9a031e4600 49] Android_Default, [0x7f9a03123000][0x7f9a031e2ec0 49] Android_Adreno4xx, [0x7f9a03122f00][0x7f9a031e1780 49] Android_Adreno5xx_Low, [0x7f9a03122e00][0x7f9a031e0040 49] Android_Adreno5xx, [0x7f9a03122d00][0x7f9a031fe8c0 49] Android_Adreno6xx, [0x7f9a03122c00][0x7f9a031fd180 49] An
droid_Adreno6xx_Vulkan, [0x7f9a03122b00][0x7f9a031fba40 49] Android_Mali_T6xx, [0x7f9a03122a00][0x7f9a031fa300 49] Android_Mali_T7xx, [0x7f9a03122900][0x7f9a031f8bc0 49] Android_Mali_T8xx, [0x7f9a03122800][0x7f9a031f7480 49] Android_Mali_G71, [0x7f9a03122700][0x7f9a031f5d40 49] Android_Mali_G72, [0x7f9a03122600][0x7f9a031f4600 49] Android_Mali_G72_Vulkan, [0
x7f9a03122500][0x7f9a031f2ec0 49] Android_Mali_G76, [0x7f9a03128b00][0x7f9a031f1780 49] Android_Mali_G76_Vulkan, [0x7f9a03128c00][0x7f9a031f0040 49] Android_Mali_G77, [0x7f9a03128d00][0x7f9a0320e8c0 49] Android_Mali_G77_Vulkan, [0x7f9a03128e00][0x7f9a0320d180 49] Android_Vulkan_SM5, [0x7f9a03128f00][0x7f9a0320ba40 49] Android_PowerVR_G6xxx, [0x7f9a03129000][
0x7f9a0320a300 49] Android_PowerVR_GT7xxx, [0x7f9a03129100][0x7f9a03208bc0 49] Android_PowerVR_GE8xxx, [0x7f9a03129200][0x7f9a03207480 49] Android_PowerVR_GM9xxx, [0x7f9a03129300][0x7f9a03205d40 49] Android_PowerVR_GM9xxx_Vulkan, [0x7f9a03129400][0x7f9a03204600 49] Android_TegraK1, [0x7f9a03129500][0x7f9a03202ec0 49] Android_Unknown_Vulkan, [0x7f9a03129600][
0x7f9a03201780 49] Lumin, [0x7f9a03129700][0x7f9a03200040 49] Lumin_Desktop, [0x7f9a03129800][0x7f9a0321e8c0 49] HoloLens,
UATHelper: Packaging (Linux): LogMeshReduction: Using QuadricMeshReduction for automatic static mesh reduction
UATHelper: Packaging (Linux): LogMeshReduction: Using SimplygonMeshReduction for automatic skeletal mesh reduction
UATHelper: Packaging (Linux): LogMeshReduction: No automatic mesh merging module available
UATHelper: Packaging (Linux): LogMeshReduction: No distributed automatic mesh merging module available
UATHelper: Packaging (Linux): LogMeshMerging: No automatic mesh merging module available
UATHelper: Packaging (Linux): LogMeshMerging: No distributed automatic mesh merging module available
UATHelper: Packaging (Linux): LogNetVersion: DummyGame 1.0.0, NetCL: 0, EngineNetVer: 16, GameNetVer: 0 (Checksum: 4220270427)
UATHelper: Packaging (Linux): LogPackageLocalizationCache: Processed 25 localized package path(s) for 1 prioritized culture(s) in 0.017340 seconds
UATHelper: Packaging (Linux): LogHAL: Linux SourceCodeAccessSettings: NullSourceCodeAccessor
UATHelper: Packaging (Linux): LogCollectionManager: Loaded 0 collections in 0.001819 seconds
UATHelper: Packaging (Linux): LogFileCache: Scanning file cache for directory '/home/shehzan/workspace/DummyGame/Saved/Collections/' took 0.00s
UATHelper: Packaging (Linux): LogFileCache: Scanning file cache for directory '/home/shehzan/workspace/DummyGame/Content/Developers/shehzan/Collections/' took 0.00s
UATHelper: Packaging (Linux): LogCollectionManager: Fixed up redirectors for 0 collections in 0.000001 seconds (updated 0 objects)
UATHelper: Packaging (Linux): LogFileCache: Scanning file cache for directory '/home/shehzan/workspace/DummyGame/Content/Collections/' took 0.00s
UATHelper: Packaging (Linux): LogUProjectInfo: Found projects:
UATHelper: Packaging (Linux): SourceControl: Source control is disabled
UATHelper: Packaging (Linux): LogAudioCaptureCore: Display: No Audio Capture implementations found. Audio input will be silent.
UATHelper: Packaging (Linux): LogAndroidPermission: UAndroidPermissionCallbackProxy::GetInstance
UATHelper: Packaging (Linux): [2021-04-26 11:31:30.844] [info] [TileContentFactory.cpp:12] Registering magic header glTF
UATHelper: Packaging (Linux): [2021-04-26 11:31:30.844] [info] [TileContentFactory.cpp:12] Registering magic header b3dm
UATHelper: Packaging (Linux): [2021-04-26 11:31:30.844] [info] [TileContentFactory.cpp:12] Registering magic header cmpt
UATHelper: Packaging (Linux): [2021-04-26 11:31:30.844] [info] [TileContentFactory.cpp:12] Registering magic header json
UATHelper: Packaging (Linux): [2021-04-26 11:31:30.844] [info] [TileContentFactory.cpp:21] Registering content type application/vnd.quantized-mesh
UATHelper: Packaging (Linux): LogUObjectArray: 20498 objects as part of root set at end of initial load.
UATHelper: Packaging (Linux): LogUObjectAllocator: 5627216 out of 0 bytes used by permanent object pool.
UATHelper: Packaging (Linux): LogUObjectArray: CloseDisregardForGC: 0/0 objects in disregard for GC pool
UATHelper: Packaging (Linux): LogInit: Executing Class /Script/UnrealEd.CookCommandlet
UATHelper: Packaging (Linux): LogCook: Display: CookSettings for Memory: MemoryMaxUsedVirtual 0MiB, MemoryMaxUsedPhysical 16384MiB, MemoryMinFreeVirtual 0MiB, MemoryMinFreePhysical 1024MiB
UATHelper: Packaging (Linux): LogCook: Display: Mobile HDR setting 1
UATHelper: Packaging (Linux): LogCook: Display: Creating asset registry
UATHelper: Packaging (Linux): LogCook: Display: Discovering localized assets for cultures: en
UATHelper: Packaging (Linux): LogCook: Display: Inisetting is different for Linux Game /Script/UnrealEd.ProjectPackagingSettings BuildConfiguration 0, value PPBC_Development != PPBC_Shipping invalidating cook
UATHelper: Packaging (Linux): LogCook: Display: To avoid this add blacklist setting to DefaultEditor.ini [CookSettings] Linux.Game:/Script/UnrealEd.ProjectPackagingSettings
UATHelper: Packaging (Linux): LogCook: Display: Clearing all cooked content for platform LinuxNoEditor
UATHelper: Packaging (Linux): LogCook: Display: Sandbox cleanup took 0.035 seconds for platforms LinuxNoEditor
UATHelper: Packaging (Linux): LogCook: Display: Cooked packages 0 Packages Remain 390 Total 390
UATHelper: Packaging (Linux): LogCook: Display: Cook Diagnostics: OpenFileHandles=0, VirtualMemory=11227MiB
UATHelper: Packaging (Linux): LogCook: Display: Excluding /MagicLeapPassableWorld/MagicLeapARPinInfoActor -> /home/shehzan/workspace/DummyGame/Saved/Cooked/LinuxNoEditor/Engine/Plugins/Lumin/MagicLeapPassableWorld/Content/MagicLeapARPinInfoActor.uasset
UATHelper: Packaging (Linux): LogTexture: Display: Building textures: DeprecatedTexture4E9A31EC459F824B7293D9A5960F381F (BGRA8, 128X128)
UATHelper: Packaging (Linux): LogTexture: Display: Building textures: DeprecatedTextureD6C57B644F03DADF4F2699848FAEA3C3 (BGRA8, 128X128)
UATHelper: Packaging (Linux): LogTexture: Display: Building textures: DeprecatedTexture8D91C9AE4C6B8D628CDA3C848426284A (BGRA8, 128X128)
UATHelper: Packaging (Linux): LogTexture: Display: Building textures: DeprecatedTextureF60039EB445BD313D9FB3B83A10248ED (BGRA8, 128X128)
UATHelper: Packaging (Linux): LogTexture: Display: Building textures: DeprecatedTexture34034B4C417D9B375A592E9A23C11137 (BGRA8, 128X128)
UATHelper: Packaging (Linux): LogTexture: Display: Building textures: DeprecatedTexture43ADA98049DD7D3E0EABF7B175F6CC97 (BGRA8, 128X128)
UATHelper: Packaging (Linux): LogTexture: Display: Building textures: DeprecatedTexture286B71FD456BBC57D3B875B63A2AF0AA (BGRA8, 128X128)
UATHelper: Packaging (Linux): LogTexture: Display: Building textures: DeprecatedTexture310E58554010446A08107499F126CE3D (BGRA8, 128X128)
UATHelper: Packaging (Linux): LogTexture: Display: Building textures: DeprecatedTextureF6495ADB4998903A4DCD9AAEAD18E643 (BGRA8, 128X128)
UATHelper: Packaging (Linux): LogTexture: Display: Building textures: DeprecatedTexture617F078740EB0F77ABFB678038ACFCA7 (BGRA8, 128X128)
UATHelper: Packaging (Linux): LogTexture: Display: Building textures: DeprecatedTexture2A05A5254D431981518674B220E5E1A1 (BGRA8, 128X128)
UATHelper: Packaging (Linux): LogTexture: Display: Building textures: DeprecatedTexture23F9881A4F7D09FA85B8FFBAC56882BF (BGRA8, 128X128)
UATHelper: Packaging (Linux): LogTexture: Display: Building textures: DeprecatedTextureB91D75A74080657D8AF886A1EE756E94 (BGRA8, 128X128)
UATHelper: Packaging (Linux): LogTexture: Display: Building textures: DeprecatedTexture3AD1786A42EB6F981CB7E1847869C35B (BGRA8, 128X128)
UATHelper: Packaging (Linux): LogCook: Display: Cooked packages 400 Packages Remain 167 Total 567
UATHelper: Packaging (Linux): LogBlueprint: Warning: [AssetLog] /home/shehzan/workspace/DummyGame/Plugins/CesiumForUnreal/Content/FloatingPawn.uasset: [Compiler] InputAction Event references unknown Action 'ToggleTODWidget' for InputAction ToggleTODWidget
UATHelper: Packaging (Linux): LogBlueprint: Warning: [AssetLog] /home/shehzan/workspace/DummyGame/Plugins/CesiumForUnreal/Content/FloatingPawn.uasset: [Compiler] InputAction Event references unknown Action 'ToggleProfilingWidget' for InputAction ToggleProfilingWidget
PackagingResults: Warning: [AssetLog] /home/shehzan/workspace/DummyGame/Plugins/CesiumForUnreal/Content/FloatingPawn.uasset: [Compiler] InputAction Event references unknown Action 'ToggleTODWidget' for InputAction ToggleTODWidget
PackagingResults: Warning: [AssetLog] /home/shehzan/workspace/DummyGame/Plugins/CesiumForUnreal/Content/FloatingPawn.uasset: [Compiler] InputAction Event references unknown Action 'ToggleProfilingWidget' for InputAction ToggleProfilingWidget
UATHelper: Packaging (Linux): LogCook: Display: Cooked packages 569 Packages Remain 0 Total 569
UATHelper: Packaging (Linux): LogCook: Display: Cook Diagnostics: OpenFileHandles=0, VirtualMemory=11332MiB
UATHelper: Packaging (Linux): LogCook: Display: Finishing up...
UATHelper: Packaging (Linux): LogCook: Display: Saving BulkData manifest(s)...
UATHelper: Packaging (Linux): LogCook: Display: Done saving BulkData manifest(s)
UATHelper: Packaging (Linux): LogBlueprintCodeGen: Display: Nativization Summary - AnimBP:
UATHelper: Packaging (Linux): LogBlueprintCodeGen: Display: Name, Children, Non-empty Functions (Empty Functions), Variables, FunctionUsage, VariableUsage
UATHelper: Packaging (Linux): LogBlueprintCodeGen: Display: Nativization Summary - Shared Variables From Graph: 0
UATHelper: Packaging (Linux): LogAssetRegistryGenerator: Display: Saving asset registry.
UATHelper: Packaging (Linux): LogAssetRegistryGenerator: Display: Generated asset registry num assets 565, size is 254.87kb
UATHelper: Packaging (Linux): LogAssetRegistryGenerator: Display: Done saving asset registry.
UATHelper: Packaging (Linux): LogShaderLibrary: Display:
UATHelper: Packaging (Linux): LogShaderLibrary: Display: Shader Code Stats: SF_VULKAN_SM5
UATHelper: Packaging (Linux): LogShaderLibrary: Display: =================
UATHelper: Packaging (Linux): LogShaderLibrary: Display: Unique Shaders: 4236, Total Shaders: 4782, Unique Shadermaps: 320
UATHelper: Packaging (Linux): LogShaderLibrary: Display: Unique Shaders Size: 30.26mb, Total Shader Size: 32.33mb
UATHelper: Packaging (Linux): LogShaderLibrary: Display: =================
UATHelper: Packaging (Linux): LogCook: Display: Cook by the book total time in tick 2.544388s total time 5.863912
UATHelper: Packaging (Linux): LogCook: Display: Peak Used virtual 11620 MiB Peak Used physical 1177 MiB
UATHelper: Packaging (Linux): LogCook: Display: Hierarchy Timer Information:
UATHelper: Packaging (Linux): LogCook: Display: Root: 0.000s (0)
UATHelper: Packaging (Linux): LogCook: Display: CleanSandbox: 0.035s (1)
UATHelper: Packaging (Linux): LogCook: Display: ProcessingAccessedStrings: 0.004s (1)
UATHelper: Packaging (Linux): LogCook: Display: CollectFilesToCook: 0.117s (1)
UATHelper: Packaging (Linux): LogCook: Display: CookModificationDelegate: 0.000s (1)
UATHelper: Packaging (Linux): LogCook: Display: GenerateLongPackageName: 0.001s (1)
UATHelper: Packaging (Linux): LogCook: Display: TickCookOnTheSide: 2.544s (93)
UATHelper: Packaging (Linux): LogCook: Display: PostLoadPackageFixup: 0.002s (1064)
UATHelper: Packaging (Linux): LogCook: Display: LoadPackageForCooking: 0.595s (569)
UATHelper: Packaging (Linux): LogCook: Display: SavingPackages: 1.938s (196)
UATHelper: Packaging (Linux): LogCook: Display: BeginPackageCacheForCookedPlatformData: 0.054s (44)
UATHelper: Packaging (Linux): LogCook: Display: PrecachePlatformDataForNextPackage: 0.312s (567)
UATHelper: Packaging (Linux): LogCook: Display: BeginPackageCacheForCookedPlatformData: 0.312s (525)
UATHelper: Packaging (Linux): LogCook: Display: SaveCookedPackage: 1.544s (569)
UATHelper: Packaging (Linux): LogCook: Display: GEditorSavePackage: 1.531s (568)
UATHelper: Packaging (Linux): LogCook: Display: ConvertingBlueprints: 0.505s (568)
UATHelper: Packaging (Linux): LogCook: Display: VerifyCanCookPackage: 0.003s (565)
UATHelper: Packaging (Linux): LogCook: Display: ClearAllCachedCookedPlatformData: 0.006s (569)
UATHelper: Packaging (Linux): LogCook: Display: FinalizePackageStore: 0.001s (1)
UATHelper: Packaging (Linux): LogCook: Display: GeneratingBlueprintAssets: 0.490s (1)
UATHelper: Packaging (Linux): LogCook: Display: SavingCurrentIniSettings: 0.015s (1)
UATHelper: Packaging (Linux): LogCook: Display: ProcessingAccessedStrings: 0.004s (1)
UATHelper: Packaging (Linux): LogCook: Display: SavingAssetRegistry: 0.066s (1)
UATHelper: Packaging (Linux): LogCook: Display: BuildChunkManifest: 0.010s (1)
UATHelper: Packaging (Linux): LogCook: Display: SaveManifests: 0.003s (1)
UATHelper: Packaging (Linux): LogCook: Display: SaveRealAssetRegistry: 0.015s (1)
UATHelper: Packaging (Linux): LogCook: Display: WriteCookerOpenOrder: 0.001s (1)
UATHelper: Packaging (Linux): LogCook: Display: Done!
UATHelper: Packaging (Linux): LogSavePackage: Display: Took 0.004387s to verify the EDL loading graph.
UATHelper: Packaging (Linux): LogSavePackage: Display: Took 0.004387s to verify the EDL loading graph.
UATHelper: Packaging (Linux): LogCookCommandlet: Display: Misc Cook Stats
UATHelper: Packaging (Linux): LogCookCommandlet: Display: Misc Cook Stats
UATHelper: Packaging (Linux): LogCookCommandlet: Display: ===============
UATHelper: Packaging (Linux): LogCookCommandlet: Display: ===============
UATHelper: Packaging (Linux): LogCookCommandlet: Display: NiagaraShader.Misc
UATHelper: Packaging (Linux): LogCookCommandlet: Display: NiagaraShader.Misc
UATHelper: Packaging (Linux): LogCookCommandlet: Display: ShadersCompiled=0
UATHelper: Packaging (Linux): LogCookCommandlet: Display: ShadersCompiled=0
UATHelper: Packaging (Linux): LogCookCommandlet: Display: Package.Load
UATHelper: Packaging (Linux): LogCookCommandlet: Display: Package.Load
UATHelper: Packaging (Linux): LogCookCommandlet: Display: NumPreloadedDependencies=282
UATHelper: Packaging (Linux): LogCookCommandlet: Display: NumPreloadedDependencies=282
UATHelper: Packaging (Linux): LogCookCommandlet: Display: NumInlineLoads=16
UATHelper: Packaging (Linux): LogCookCommandlet: Display: NumInlineLoads=16
UATHelper: Packaging (Linux): LogCookCommandlet: Display: NumPackagesLoaded=512
UATHelper: Packaging (Linux): LogCookCommandlet: Display: NumPackagesLoaded=512
UATHelper: Packaging (Linux): LogCookCommandlet: Display: LoadPackageTimeSec=6.405135
UATHelper: Packaging (Linux): LogCookCommandlet: Display: LoadPackageTimeSec=6.405135
UATHelper: Packaging (Linux): LogCookCommandlet: Display: CookOnTheFlyServer
UATHelper: Packaging (Linux): LogCookCommandlet: Display: CookOnTheFlyServer
UATHelper: Packaging (Linux): LogCookCommandlet: Display: PeakRequestQueueSize=360
UATHelper: Packaging (Linux): LogCookCommandlet: Display: PeakRequestQueueSize=360
UATHelper: Packaging (Linux): LogCookCommandlet: Display: PeakLoadQueueSize=182
UATHelper: Packaging (Linux): LogCookCommandlet: Display: PeakLoadQueueSize=182
UATHelper: Packaging (Linux): LogCookCommandlet: Display: PeakSaveQueueSize=174
UATHelper: Packaging (Linux): LogCookCommandlet: Display: PeakSaveQueueSize=174
UATHelper: Packaging (Linux): LogCookCommandlet: Display: ShaderCompiler
UATHelper: Packaging (Linux): LogCookCommandlet: Display: ShaderCompiler
UATHelper: Packaging (Linux): LogCookCommandlet: Display: BlockingTimeSec=0.000001
UATHelper: Packaging (Linux): LogCookCommandlet: Display: BlockingTimeSec=0.000001
UATHelper: Packaging (Linux): LogCookCommandlet: Display: AsyncCompileTimeSec=0.000000
UATHelper: Packaging (Linux): LogCookCommandlet: Display: AsyncCompileTimeSec=0.000000
UATHelper: Packaging (Linux): LogCookCommandlet: Display: GlobalBeginCompileShaderTimeSec=0.000000
UATHelper: Packaging (Linux): LogCookCommandlet: Display: GlobalBeginCompileShaderTimeSec=0.000000
UATHelper: Packaging (Linux): LogCookCommandlet: Display: GlobalBeginCompileShaderCalls=0
UATHelper: Packaging (Linux): LogCookCommandlet: Display: GlobalBeginCompileShaderCalls=0
UATHelper: Packaging (Linux): LogCookCommandlet: Display: ProcessAsyncResultsTimeSec=0.000248
UATHelper: Packaging (Linux): LogCookCommandlet: Display: ProcessAsyncResultsTimeSec=0.000248
UATHelper: Packaging (Linux): LogCookCommandlet: Display: GlobalShader.Misc
UATHelper: Packaging (Linux): LogCookCommandlet: Display: GlobalShader.Misc
UATHelper: Packaging (Linux): LogCookCommandlet: Display: ShadersCompiled=0
UATHelper: Packaging (Linux): LogCookCommandlet: Display: ShadersCompiled=0
UATHelper: Packaging (Linux): LogCookCommandlet: Display: MeshMaterial.Misc
UATHelper: Packaging (Linux): LogCookCommandlet: Display: MeshMaterial.Misc
UATHelper: Packaging (Linux): LogCookCommandlet: Display: ShadersCompiled=0
UATHelper: Packaging (Linux): LogCookCommandlet: Display: ShadersCompiled=0
UATHelper: Packaging (Linux): LogCookCommandlet: Display: MaterialShader.Misc
UATHelper: Packaging (Linux): LogCookCommandlet: Display: MaterialShader.Misc
UATHelper: Packaging (Linux): LogCookCommandlet: Display: ShadersCompiled=0
UATHelper: Packaging (Linux): LogCookCommandlet: Display: ShadersCompiled=0
UATHelper: Packaging (Linux): LogCookCommandlet: Display: Package.Save
UATHelper: Packaging (Linux): LogCookCommandlet: Display: Package.Save
UATHelper: Packaging (Linux): LogCookCommandlet: Display: NumPackagesSaved=568
UATHelper: Packaging (Linux): LogCookCommandlet: Display: NumPackagesSaved=568
UATHelper: Packaging (Linux): LogCookCommandlet: Display: SavePackageTimeSec=0.985312
UATHelper: Packaging (Linux): LogCookCommandlet: Display: SavePackageTimeSec=0.985312
UATHelper: Packaging (Linux): LogCookCommandlet: Display: TagPackageExportsPresaveTimeSec=0.092724
UATHelper: Packaging (Linux): LogCookCommandlet: Display: TagPackageExportsPresaveTimeSec=0.092724
UATHelper: Packaging (Linux): LogCookCommandlet: Display: TagPackageExportsTimeSec=0.037053
UATHelper: Packaging (Linux): LogCookCommandlet: Display: TagPackageExportsTimeSec=0.037053
UATHelper: Packaging (Linux): LogCookCommandlet: Display: FullyLoadLoadersTimeSec=0.000272
UATHelper: Packaging (Linux): LogCookCommandlet: Display: FullyLoadLoadersTimeSec=0.000272
UATHelper: Packaging (Linux): LogCookCommandlet: Display: ResetLoadersTimeSec=0.002413
UATHelper: Packaging (Linux): LogCookCommandlet: Display: ResetLoadersTimeSec=0.002413
UATHelper: Packaging (Linux): LogCookCommandlet: Display: TagPackageExportsGetObjectsWithOuter=0.003192
UATHelper: Packaging (Linux): LogCookCommandlet: Display: TagPackageExportsGetObjectsWithOuter=0.003192
UATHelper: Packaging (Linux): LogCookCommandlet: Display: TagPackageExportsGetObjectsWithMarks=0.000348
UATHelper: Packaging (Linux): LogCookCommandlet: Display: TagPackageExportsGetObjectsWithMarks=0.000348
UATHelper: Packaging (Linux): LogCookCommandlet: Display: SerializeImportsTimeSec=0.000000
UATHelper: Packaging (Linux): LogCookCommandlet: Display: SerializeImportsTimeSec=0.000000
UATHelper: Packaging (Linux): LogCookCommandlet: Display: SortExportsSeekfreeInnerTimeSec=0.159970
UATHelper: Packaging (Linux): LogCookCommandlet: Display: SortExportsSeekfreeInnerTimeSec=0.159970
UATHelper: Packaging (Linux): LogCookCommandlet: Display: SerializeExportsTimeSec=0.257245
UATHelper: Packaging (Linux): LogCookCommandlet: Display: SerializeExportsTimeSec=0.257245
UATHelper: Packaging (Linux): LogCookCommandlet: Display: SerializeBulkDataTimeSec=0.224419
UATHelper: Packaging (Linux): LogCookCommandlet: Display: SerializeBulkDataTimeSec=0.224419
UATHelper: Packaging (Linux): LogCookCommandlet: Display: AsyncWriteTimeSec=0.000000
UATHelper: Packaging (Linux): LogCookCommandlet: Display: AsyncWriteTimeSec=0.000000
UATHelper: Packaging (Linux): LogCookCommandlet: Display: MBWritten=634.202212
UATHelper: Packaging (Linux): LogCookCommandlet: Display: MBWritten=634.202212
UATHelper: Packaging (Linux): LogCookCommandlet: Display: Package.DifferentPackagesSizeMBPerAsset
UATHelper: Packaging (Linux): LogCookCommandlet: Display: Package.DifferentPackagesSizeMBPerAsset
UATHelper: Packaging (Linux): LogCookCommandlet: Display: Package.NumberOfDifferencesInPackagesPerAsset
UATHelper: Packaging (Linux): LogCookCommandlet: Display: Package.NumberOfDifferencesInPackagesPerAsset
UATHelper: Packaging (Linux): LogCookCommandlet: Display: Package.PackageDifferencesSizeMBPerAsset
UATHelper: Packaging (Linux): LogCookCommandlet: Display: Package.PackageDifferencesSizeMBPerAsset
UATHelper: Packaging (Linux): LogCookCommandlet: Display: Package.DiffTotal
UATHelper: Packaging (Linux): LogCookCommandlet: Display: Package.DiffTotal
UATHelper: Packaging (Linux): LogCookCommandlet: Display: NumberOfDifferentPackages=0
UATHelper: Packaging (Linux): LogCookCommandlet: Display: NumberOfDifferentPackages=0
UATHelper: Packaging (Linux): LogCookCommandlet: Display: DifferentPackagesSizeMB=0.000000
UATHelper: Packaging (Linux): LogCookCommandlet: Display: DifferentPackagesSizeMB=0.000000
UATHelper: Packaging (Linux): LogCookCommandlet: Display: NumberOfDifferencesInPackages=0
UATHelper: Packaging (Linux): LogCookCommandlet: Display: NumberOfDifferencesInPackages=0
UATHelper: Packaging (Linux): LogCookCommandlet: Display: PackageDifferencesSizeMB=0.000000
UATHelper: Packaging (Linux): LogCookCommandlet: Display: PackageDifferencesSizeMB=0.000000
UATHelper: Packaging (Linux): LogCookCommandlet: Display: UnversionedProperties
UATHelper: Packaging (Linux): LogCookCommandlet: Display: UnversionedProperties
UATHelper: Packaging (Linux): LogCookCommandlet: Display: SavedStructs=0
UATHelper: Packaging (Linux): LogCookCommandlet: Display: SavedStructs=0
UATHelper: Packaging (Linux): LogCookCommandlet: Display: SavedMB=0
UATHelper: Packaging (Linux): LogCookCommandlet: Display: SavedMB=0
UATHelper: Packaging (Linux): LogCookCommandlet: Display: EquivalentTaggedMB=0
UATHelper: Packaging (Linux): LogCookCommandlet: Display: EquivalentTaggedMB=0
UATHelper: Packaging (Linux): LogCookCommandlet: Display: CompressionRatio=-nan
UATHelper: Packaging (Linux): LogCookCommandlet: Display: CompressionRatio=-nan
UATHelper: Packaging (Linux): LogCookCommandlet: Display: BitfieldWasteKB=0
UATHelper: Packaging (Linux): LogCookCommandlet: Display: BitfieldWasteKB=0
UATHelper: Packaging (Linux): LogCookCommandlet: Display:
UATHelper: Packaging (Linux): LogCookCommandlet: Display: Cook Profile
UATHelper: Packaging (Linux): LogCookCommandlet: Display: ============
UATHelper: Packaging (Linux): LogCookCommandlet: Display: 0.CookWallTimeSec=118.333812
UATHelper: Packaging (Linux): LogCookCommandlet: Display: 0. 0.StartupWallTimeSec=112.460146
UATHelper: Packaging (Linux): LogCookCommandlet: Display: 0. 1.CookByTheBookTimeSec=5.872215
UATHelper: Packaging (Linux): LogCookCommandlet: Display: 0. 1. 0.StartCookByTheBookTimeSec=2.717902
UATHelper: Packaging (Linux): LogCookCommandlet: Display: 0. 1. 0. 0.GameCookModificationDelegateTimeSec=0.000009
UATHelper: Packaging (Linux): LogCookCommandlet: Display: 0. 1. 1.TickCookOnTheSideTimeSec=3.134793
UATHelper: Packaging (Linux): LogCookCommandlet: Display: 0. 1. 1. 0.TickCookOnTheSideLoadPackagesTimeSec=0.595367
UATHelper: Packaging (Linux): LogCookCommandlet: Display: 0. 1. 1. 1.TickCookOnTheSideSaveCookedPackageTimeSec=1.544321
UATHelper: Packaging (Linux): LogCookCommandlet: Display: 0. 1. 1. 1. 0.TickCookOnTheSideResolveRedirectorsTimeSec=0.000000
UATHelper: Packaging (Linux): LogCookCommandlet: Display: 0. 1. 1. 2.TickCookOnTheSideBeginPackageCacheForCookedPlatformDataTimeSec=0.365961
UATHelper: Packaging (Linux): LogCookCommandlet: Display: 0. 1. 1. 3.TickCookOnTheSideFinishPackageCacheForCookedPlatformDataTimeSec=0.000000
UATHelper: Packaging (Linux): LogCookCommandlet: Display: 0. 1. 2.TickLoopGCTimeSec=0.000000
UATHelper: Packaging (Linux): LogCookCommandlet: Display: 0. 1. 3.TickLoopRecompileShaderRequestsTimeSec=0.000099
UATHelper: Packaging (Linux): LogCookCommandlet: Display: 0. 1. 4.TickLoopShaderProcessAsyncResultsTimeSec=0.000030
UATHelper: Packaging (Linux): LogCookCommandlet: Display: 0. 1. 5.TickLoopProcessDeferredCommandsTimeSec=0.000073
UATHelper: Packaging (Linux): LogCookCommandlet: Display: 0. 1. 6.TickLoopTickCommandletStatsTimeSec=0.000015
UATHelper: Packaging (Linux): LogCookCommandlet: Display:
UATHelper: Packaging (Linux): LogCookCommandlet: Display: DDC Summary Stats
UATHelper: Packaging (Linux): LogCookCommandlet: Display: =================
UATHelper: Packaging (Linux): LogCookCommandlet: Display: TotalGetHits = 1756
UATHelper: Packaging (Linux): LogCookCommandlet: Display: TotalGets = 1770
UATHelper: Packaging (Linux): LogCookCommandlet: Display: TotalGetHitPct = 0.992090
UATHelper: Packaging (Linux): LogCookCommandlet: Display: LocalGetHitPct = 0.963277
UATHelper: Packaging (Linux): LogCookCommandlet: Display: SharedGetHitPct = 0.000000
UATHelper: Packaging (Linux): LogCookCommandlet: Display: OtherGetHitPct = 0.028814
UATHelper: Packaging (Linux): LogCookCommandlet: Display: GetMissPct = 0.007910
UATHelper: Packaging (Linux): LogCookCommandlet: Display: TotalPutHits = 14
UATHelper: Packaging (Linux): LogCookCommandlet: Display: TotalPuts = 14
UATHelper: Packaging (Linux): LogCookCommandlet: Display: TotalPutHitPct = 1.000000
UATHelper: Packaging (Linux): LogCookCommandlet: Display: PutMissPct = 0.000000
UATHelper: Packaging (Linux): LogCookCommandlet: Display:
UATHelper: Packaging (Linux): LogCookCommandlet: Display: DDC Resource Stats
UATHelper: Packaging (Linux): LogCookCommandlet: Display: =======================================================================================================
UATHelper: Packaging (Linux): LogCookCommandlet: Display: Asset Type Total Time (Sec) GameThread Time (Sec) Assets Built MB Processed
UATHelper: Packaging (Linux): LogCookCommandlet: Display: ---------------------------------- ---------------- --------------------- ------------ ------------
UATHelper: Packaging (Linux): LogCookCommandlet: Display: StaticMesh 0.76 0.76 0 8.36
UATHelper: Packaging (Linux): LogCookCommandlet: Display: Texture (Streaming) 0.69 0.00 0 509.06
UATHelper: Packaging (Linux): LogCookCommandlet: Display: MaterialShader 0.27 0.27 0 19.60
UATHelper: Packaging (Linux): LogCookCommandlet: Display: GlobalShader 0.02 0.02 0 17.25
UATHelper: Packaging (Linux): LogCookCommandlet: Display: PhysX (BodySetup) 0.02 0.02 0 5.70
UATHelper: Packaging (Linux): LogCookCommandlet: Display: SoundWave 0.01 0.01 0 2.91
UATHelper: Packaging (Linux): LogCookCommandlet: Display: Audio (Inline) 0.00 0.00 0 0.00
UATHelper: Packaging (Linux): LogCookCommandlet: Display: NavCollision 0.00 0.00 0 0.15
UATHelper: Packaging (Linux): LogCookCommandlet: Display: SkeletalMesh 0.00 0.00 0 2.51
UATHelper: Packaging (Linux): LogCookCommandlet: Display: AnimSequence 0.00 0.00 0 0.06
UATHelper: Packaging (Linux): LogCookCommandlet: Display: Texture (Inline) 0.00 0.00 14 79.58
UATHelper: Packaging (Linux): LogCookCommandlet: Display: DistanceField 0.00 0.00 0 0.00
UATHelper: Packaging (Linux): LogInit: Display:
UATHelper: Packaging (Linux): LogInit: Display: Warning/Error Summary (Unique only)
UATHelper: Packaging (Linux): LogInit: Display: -----------------------------------
UATHelper: Packaging (Linux): LogInit: Display: LogBlueprint: Warning: [AssetLog] /home/shehzan/workspace/DummyGame/Plugins/CesiumForUnreal/Content/FloatingPawn.uasset: [Compiler] InputAction Event references unknown Action 'ToggleTODWidget' for InputAction ToggleTODWidget
UATHelper: Packaging (Linux): LogInit: Display: LogBlueprint: Warning: [AssetLog] /home/shehzan/workspace/DummyGame/Plugins/CesiumForUnreal/Content/FloatingPawn.uasset: [Compiler] InputAction Event references unknown Action 'ToggleProfilingWidget' for InputAction ToggleProfilingWidget
UATHelper: Packaging (Linux): LogInit: Display:
UATHelper: Packaging (Linux): LogInit: Display: Success - 0 error(s), 2 warning(s)
UATHelper: Packaging (Linux): LogInit: Display:
UATHelper: Packaging (Linux): Execution of commandlet took: 5.87 seconds
UATHelper: Packaging (Linux): LogHttp: Display: cleaning up 0 outstanding Http requests.
UATHelper: Packaging (Linux): LogContentStreaming: Display: There are 1 unreleased StreamingManagers
UATHelper: Packaging (Linux): Took 120.123049s to run UE4Editor, ExitCode=0
UATHelper: Packaging (Linux): ********** COOK COMMAND COMPLETED **********
UATHelper: Packaging (Linux): ********** STAGE COMMAND STARTED **********
UATHelper: Packaging (Linux): Creating Staging Manifest...
UATHelper: Packaging (Linux): Staging all 1 build products
UATHelper: Packaging (Linux): Product 0: UE4Game
UATHelper: Packaging (Linux): Running: sh -c 'chmod +x "/home/shehzan/workspace/DummyGame/Intermediate/Staging/DummyGame.sh"'
UATHelper: Packaging (Linux): Took 0.006145s to run sh, ExitCode=0
UATHelper: Packaging (Linux): Excluding config file /home/shehzan/workspace/UnrealEngine/Engine/Config/BaseEditor.ini
UATHelper: Packaging (Linux): Excluding config file /home/shehzan/workspace/UnrealEngine/Engine/Config/BaseEditorKeyBindings.ini
UATHelper: Packaging (Linux): Excluding config file /home/shehzan/workspace/UnrealEngine/Engine/Config/BaseEditorPerProjectUserSettings.ini
UATHelper: Packaging (Linux): Excluding config file /home/shehzan/workspace/UnrealEngine/Engine/Config/BaseEditorSettings.ini
UATHelper: Packaging (Linux): Excluding config file /home/shehzan/workspace/UnrealEngine/Engine/Config/BaseLightmass.ini
UATHelper: Packaging (Linux): Excluding config file /home/shehzan/workspace/UnrealEngine/Engine/Config/BasePakFileRules.ini
UATHelper: Packaging (Linux): Excluding config file /home/shehzan/workspace/UnrealEngine/Engine/Config/Linux/LinuxEditorGameAgnostic.ini
UATHelper: Packaging (Linux): Excluding config file /home/shehzan/workspace/UnrealEngine/Engine/Config/Localization/Category.ini
UATHelper: Packaging (Linux): Excluding config file /home/shehzan/workspace/UnrealEngine/Engine/Config/Localization/Editor.ini
UATHelper: Packaging (Linux): Excluding config file /home/shehzan/workspace/UnrealEngine/Engine/Config/Localization/EditorTutorials.ini
UATHelper: Packaging (Linux): Excluding config file /home/shehzan/workspace/UnrealEngine/Engine/Config/Localization/Engine.ini
UATHelper: Packaging (Linux): Excluding config file /home/shehzan/workspace/UnrealEngine/Engine/Config/Localization/Keywords.ini
UATHelper: Packaging (Linux): Excluding config file /home/shehzan/workspace/UnrealEngine/Engine/Config/Localization/PortableObjectExport.ini
UATHelper: Packaging (Linux): Excluding config file /home/shehzan/workspace/UnrealEngine/Engine/Config/Localization/PortableObjectImport.ini
UATHelper: Packaging (Linux): Excluding config file /home/shehzan/workspace/UnrealEngine/Engine/Config/Localization/PropertyNames.ini
UATHelper: Packaging (Linux): Excluding config file /home/shehzan/workspace/UnrealEngine/Engine/Config/Localization/RepairData.ini
UATHelper: Packaging (Linux): Excluding config file /home/shehzan/workspace/UnrealEngine/Engine/Config/Localization/ToolTips.ini
UATHelper: Packaging (Linux): Excluding config file /home/shehzan/workspace/UnrealEngine/Engine/Config/Localization/WordCount.ini
UATHelper: Packaging (Linux): Excluding config file /home/shehzan/workspace/DummyGame/Config/DefaultEditor.ini
UATHelper: Packaging (Linux): Excluding config file /home/shehzan/workspace/DummyGame/Config/DefaultEditorPerProjectUserSettings.ini
UATHelper: Packaging (Linux): Cleaning Stage Directory: /home/shehzan/workspace/DummyGame/Saved/StagedBuilds/LinuxNoEditor
UATHelper: Packaging (Linux): Creating pak using staging manifest.
UATHelper: Packaging (Linux): Executing 1 UnrealPak command...
UATHelper: Packaging (Linux): Waiting for child processes to complete (1/1)
UATHelper: Packaging (Linux): Output from: /home/shehzan/workspace/DummyGame/DummyGame.uproject /home/shehzan/workspace/DummyGame/Saved/StagedBuilds/LinuxNoEditor/DummyGame/Content/Paks/DummyGame-LinuxNoEditor.pak -create="/home/shehzan/Library/Logs/Unreal Engine/LocalBuildLogs/PakList_DummyGame-LinuxNoEditor.txt" -cryptokeys=/home/shehzan/workspace/DummyGame/Saved/Cooked/LinuxNoEditor/DummyGame/Metadata/Crypto.json -order=/home/shehzan/workspace/DummyGame/Build/LinuxNoEditor/FileOpenOrder/CookerOpenOrder.log -platform=Linux -multiprocess -abslog="/home/shehzan/Library/Logs/Unreal Engine/LocalBuildLogs/UnrealPak-DummyGame-LinuxNoEditor-2021.04.26-11.32.01.txt"
UATHelper: Packaging (Linux): LogPakFile: Display: Parsing crypto keys from a crypto key cache file
UATHelper: Packaging (Linux): LogPakFile: Display: Loading response file /home/shehzan/Library/Logs/Unreal Engine/LocalBuildLogs/PakList_DummyGame-LinuxNoEditor.txt
UATHelper: Packaging (Linux): LogPakFile: Display: Added 2362 entries to add to pak file.
UATHelper: Packaging (Linux): LogPakFile: Display: Loading pak order file /home/shehzan/workspace/DummyGame/Build/LinuxNoEditor/FileOpenOrder/CookerOpenOrder.log...
UATHelper: Packaging (Linux): LogPakFile: Display: Finished loading pak order file /home/shehzan/workspace/DummyGame/Build/LinuxNoEditor/FileOpenOrder/CookerOpenOrder.log.
UATHelper: Packaging (Linux): LogPakFile: Display: Collecting files to add to pak file...
UATHelper: Packaging (Linux): LogPakFile: Display: Collected 2362 files in 0.01s.
UATHelper: Packaging (Linux): LogPakFile: Display: Creating pak /home/shehzan/workspace/DummyGame/Saved/StagedBuilds/LinuxNoEditor/DummyGame/Content/Paks/DummyGame-LinuxNoEditor.pak.
UATHelper: Packaging (Linux): LogDerivedDataCache: Display: Performance to ../../../Engine/DerivedDataCache: Latency=0.00ms. RandomReadSpeed=999.00MBs, RandomWriteSpeed=999.00MBs. Assigned SpeedClass 'Local'
UATHelper: Packaging (Linux): LogPakFile: Display: Added 2362 files, 711378380 bytes total, time 1.92s.
UATHelper: Packaging (Linux): LogPakFile: Display: PrimaryIndex size: 28458 bytes
UATHelper: Packaging (Linux): LogPakFile: Display: PathHashIndex size: 50550 bytes
UATHelper: Packaging (Linux): LogPakFile: Display: FullDirectoryIndex size: 76452 bytes
UATHelper: Packaging (Linux): LogPakFile: Display: Encryption - DISABLED
UATHelper: Packaging (Linux): LogPakFile: Display: Unreal pak executed in 2.159341 seconds
UATHelper: Packaging (Linux): UnrealPak terminated with exit code 0
UATHelper: Packaging (Linux): Copying NonUFSFiles to staging directory: /home/shehzan/workspace/DummyGame/Saved/StagedBuilds/LinuxNoEditor
UATHelper: Packaging (Linux): ********** STAGE COMMAND COMPLETED **********
UATHelper: Packaging (Linux): ********** PACKAGE COMMAND STARTED **********
UATHelper: Packaging (Linux): ********** PACKAGE COMMAND COMPLETED **********
UATHelper: Packaging (Linux): ********** ARCHIVE COMMAND STARTED **********
UATHelper: Packaging (Linux): Archiving to /home/shehzan/workspace/Packages/DummyGame-VK-Cesium/
UATHelper: Packaging (Linux): ********** ARCHIVE COMMAND COMPLETED **********
UATHelper: Packaging (Linux): BUILD SUCCESSFUL
@shehzan10 can you try putting the plugin in the engine plugins directory instead of the project directory? That's the more typical way it would be used.
@kring That was it, it works! It actually works! 🎉
On a separate note - I was using commit 5df54b7 on the release branch which is for 4.26.1. But I think Travis used 4.26.2. When I tried to run the game project with the Travis built version, I got this error. Looks like engine plugins may not be backward compatible on patch version. Maybe we should use the 4.26.0 release for Linux compilation?
I used a locally compiled version, copied it into the UE Plugins directory, and then rebuilt the game. Worked like a charm.
Looks like engine plugins may not be backward compatible on patch version. Maybe we should use the 4.26.0 release for Linux compilation?
I think it's more complicated than that. Because our Linux builds are built locally, rather than official builds from Epic, their "build ID" is a GUID. And that GUID gets embedded in the UE4Editor.modules file in the Binaries/[Platform] directory. If you look at our built plugin for Windows and Mac, the buildId is identical between them: 14830424. But in the Linux directory, we have a GUID.
I'm not really sure how this is supposed to work when Epic doesn't provide Linux builds. It may be the case that Linux users are forced to build our plugin, just like they're forced to build Unreal Engine itself. And is distributing for Linux on the Marketplace even a thing when there's (AFAICT) no Marketplace for Linux? Apparently there's some dodgy third party thing: https://blog.gamedev.tv/ue4-marketplace-linux/
As far as actual compatibility, the UE policy is not to change header files or anything toolchain related in patch versions, so there shouldn't be any compatibility issue.
I think it's more complicated than that. Because our Linux builds are built locally, rather than official builds from Epic, their "build ID" is a GUID. And that GUID gets embedded in the UE4Editor.modules file in the Binaries/[Platform] directory. If you look at our built plugin for Windows and Mac, the buildId is identical between them: 14830424. But in the Linux directory, we have a GUID.
:+1:
I'm not really sure how this is supposed to work when Epic doesn't provide Linux builds. It may be the case that Linux users are forced to build our plugin, just like they're forced to build Unreal Engine itself. And is distributing for Linux on the Marketplace even a thing when there's (AFAICT) no Marketplace for Linux? Apparently there's some dodgy third party thing: blog.gamedev.tv/ue4-marketplace-linux
The key thing here is that the Marketplace doesn't provide Linux builds for Linux's sake - it seems to provide it for Windows cross compilation, which I was able to run successfully, with one modification.
Steps for Linux Cross Compilation of Cesium for Unreal Samples from Windows
Make sure UE Editor is closed.
Download and install the combined package from travis into UE Engine Plugins directory.
Download and install the 4.26 (v17) clang-10.0.1-based toolchain from https://docs.unrealengine.com/en-US/SharingAndReleasing/Linux/GettingStarted/index.html
Open CMD Prompt and run %LINUX_MULTIARCH_ROOT%x86_64-unknown-linux-gnu\bin\clang++ -v. Match the output to https://docs.unrealengine.com/en-US/SharingAndReleasing/Linux/GettingStarted/#troubleshooting.
Open CesiumForUnreal/Source/CesiumRuntime/CesiumRuntime.Build.cs and change the Unknown target variables for libraries to match the Linux target.
I'm not entirely sure why it doesn't use the Linux target.
Open UE Editor and run File -> Package Project -> Linux -> Linux.
I generally feel comfortable merging this based on the testing done so far. We can fix any new findings in separate PRs.
Ah that's cool! And that works even with the GUID in our UE4Editor.modules?
And that works even with the GUID in our UE4Editor.modules
Didn't check or change this at all. I purely did the steps I listed above and used the travis-built combined zip.
Great, let's :shipit: then, eh?
Amendment to my cross compilation comment - Even though Linux exe builds, it doesn't run because I think we're hardcoding/ using environment variables in some places which is causing errors like
terminating with uncaught exception of type std::runtime_error: unable to open database file
I assume this has to do with our cache database, but not sure cuz I didn't dig. In any case, I think we can deal with this separately.
Full log (shipping build):
LinuxNoEditor$ ./CesiumForUnrealSamples.sh /home/shehzan/workspace/Packages/CesiumForUnrealSamplesPackagesLinux/LinuxNoEditor 00:04:21
4.26.1-15228867+++UE4+Release-4.26 522 0
Disabling core dumps.
Unable to read VR Path Registry from /home/shehzan/.config/openvr/openvrpaths.vrpath
[0427/000426:ERROR:browser_main_loop.cc(217)] Running without the SUID sandbox! See https://chromium.googlesource.com/chromium/src/+/master/docs/linux_suid_sandbox_development.md for more information on developing with the sandbox on.
LaunchProcess: failed to execvp:
/home/shehzan/workspace/Packages/CesiumForUnrealSamplesPackagesLinux/LinuxNoEditor/Engine/Binaries/Linux/UnrealCEFSubProcess
LaunchProcess: failed to execvp:
/home/shehzan/workspace/Packages/CesiumForUnrealSamplesPackagesLinux/LinuxNoEditor/Engine/Binaries/Linux/UnrealCEFSubProcess
[2021-04-27 00:04:26.238] [info] [TileContentFactory.cpp:12] Registering magic header glTF
[2021-04-27 00:04:26.238] [info] [TileContentFactory.cpp:12] Registering magic header b3dm
[2021-04-27 00:04:26.238] [info] [TileContentFactory.cpp:12] Registering magic header cmpt
[2021-04-27 00:04:26.238] [info] [TileContentFactory.cpp:12] Registering magic header json
[2021-04-27 00:04:26.238] [info] [TileContentFactory.cpp:21] Registering content type application/vnd.quantized-mesh
[0427/000426:WARNING:ipc_message_attachment_set.cc(57)] MessageAttachmentSet destroyed with unconsumed descriptors: 0/1
Unable to read VR Path Registry from /home/shehzan/.config/openvr/openvrpaths.vrpath
Unable to read VR Path Registry from /home/shehzan/.config/openvr/openvrpaths.vrpath
terminating with uncaught exception of type std::runtime_error: unable to open database file
./CesiumForUnrealSamples.sh: line 5: 376161 Aborted (core dumped) "$UE4_PROJECT_ROOT/CesiumForUnrealSamples/Binaries/Linux/CesiumForUnrealSamples-Linux-Shipping" CesiumForUnrealSamples "$@"
Thanks @kring @Nodrev @opratalsim @baothientran for your help with this. This has been a great learning experience.
I assume this has to do with our cache database, but not sure cuz I didn't dig. In any case, I think we can deal with this separately.
@shehzan10 did you get further with this? I got the same error message and could not even figure out where in the code to take a look.
@petergerten Just to tie up the conversations - I see you have posted in https://github.com/CesiumGS/cesium-unreal/issues/387#issuecomment-832582741, which is the best place to discuss this.
|
gharchive/pull-request
| 2021-04-23T12:16:47 |
2025-04-01T04:54:49.710741
|
{
"authors": [
"cesium-concierge",
"kring",
"petergerten",
"shehzan10"
],
"repo": "CesiumGS/cesium-unreal",
"url": "https://github.com/CesiumGS/cesium-unreal/pull/377",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1264585057
|
Resource retryCallback `error` parameter does not have a valid status code
Sandcastle example:
Browser: chrome
Operating System: mac os
```javascript
function refreshTokenRetryCallback(resource, error) {
  console.log('error.statusCode:', error.statusCode);
  console.log('error:', error);
  if (error.statusCode === 403) {
    // 403 status code means a new token should be generated
    return getNewAccessToken()
      .then(function(token) {
        resource.queryParameters.access_token = token;
        return true;
      })
      .catch(function() {
        return false;
      });
  }
  return false;
}

const resource = new Resource({
  url: 'http://server.com/path/to/resource.json',
  proxy: new DefaultProxy('/proxy/'),
  headers: {
    'X-My-Header': 'valueOfHeader'
  },
  queryParameters: {
    'access_token': '123-435-456-000'
  },
  retryCallback: refreshTokenRetryCallback,
  retryAttempts: 1
});
```
I wrote the code exactly as in the docs, but the console prints:

error.statusCode: undefined
error: RequestErrorEvent {statusCode: undefined, response: undefined, responseHeaders: undefined}

I think a 404 status code should be returned. Is this a bug?
We return the status code when a valid response is received. On an error event, it appears we do not specify a status code. Perhaps the CORS error you see is the reason why no status code is logged.
Would you be able to provide a Sandcastle code example which duplicates the error?
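In the meantime, a retry callback can guard against the missing status code explicitly. A minimal sketch — `shouldRefreshToken` is a hypothetical helper, not part of the Resource API:

```javascript
// Hypothetical guard: decide whether a RequestErrorEvent warrants a token
// refresh. A CORS or network failure surfaces with statusCode === undefined,
// so it must be checked before comparing against 403.
function shouldRefreshToken(error) {
  if (!error || typeof error.statusCode !== "number") {
    return false; // no HTTP response at all (network error, CORS block)
  }
  return error.statusCode === 403;
}

// Inside retryCallback:
// if (!shouldRefreshToken(error)) { return false; }
```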
|
gharchive/issue
| 2022-06-08T11:17:28 |
2025-04-01T04:54:49.724571
|
{
"authors": [
"danny-wang",
"ggetz"
],
"repo": "CesiumGS/cesium",
"url": "https://github.com/CesiumGS/cesium/issues/10436",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
173241589
|
Poor billboard quality
The icon to the left is a billboard, the icon to the right is the original image
I believe this happens because billboards are not aligned to the nearest pixel. When a billboard has a non-integer screen location, it becomes blurred by the fullscreen AA process.
There's a suggested fix on Stack Overflow that involves making a one-line change to BillboardCollectionVS.glsl#L245, to change it from:

```glsl
gl_Position = czm_viewportOrthographic * vec4(positionWC.xy, -positionWC.z, 1.0);
```

to:

```glsl
gl_Position = czm_viewportOrthographic * vec4(floor(positionWC.xy + 0.5), -positionWC.z, 1.0);
```
This has the effect of greatly reducing blur on both Billboards and Labels.
What do you guys think of this? If you like it, I can make a pull request out of it (assuming we can take code from SO. I presume most projects already have quite a bit of uncredited SO code around).
The reasoning seems clear: Billboards are always raster images, so non-integer screen locations will always be worse quality than pixel-aligned locations.
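The shader change amounts to rounding each billboard's screen position to the nearest whole pixel. For illustration only (the real rounding happens per-vertex in GLSL), the same snap in plain JavaScript:

```javascript
// Equivalent of the shader's floor(x + 0.5): round a screen coordinate to
// the nearest pixel so the rasterized billboard lands on pixel boundaries.
function snapToPixel(x) {
  return Math.floor(x + 0.5);
}
```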
Ah cool, can you post a screenshot of the difference?
The Stack Overflow answer has a good screenshot:
http://stackoverflow.com/questions/33784256/cesium-label-blurred/33786151#33786151
If this feels like deja-vu, you're right. Here's where clamp-to-pixel functionality was added: #188 Here's where it was removed again: #565.
Good memory @shunter. In experimenting with this code, it looks like the clamp-to-pixel is not what's actually improving this guy's screenshots. He's also turning off scene.fxaa at the same time, and that appears to be what actually cleans up the label.
Yeah, fxaa is probably the real culprit here. Clamping to pixel actually does not do what we would hope it would do (Especially with text). It also causes all kinds of problem when during motion (either because the item is moving or the camera is moving). Not clamping really is the "correct" option.
I asked Dan about possibly not running fxaa on labels and billboards and he things it might be possible but might require too much overhead or other craziness.
I just noticed he's also provided us with an svg, that's probably compounding the problem because the svg is properly be anti-aliased on rasterization by the browser (and depending on the browser it could be doing a bad job).
@hpinkos can you reach out and get the original icon? I didn't notice it in that thread.
I asked @bagnell offline how our fxaa is actually implemented. He suggested this was based on an older approximation where high-contrast edges are intentionally blurred, which is fast enough for realtime, but is a very different implementation from true anti-aliasing. There was talk of this possibly being upgraded to a different algorithm when WebGL 2 becomes more widely available.
So it sounds like the blurring is a design feature of the current fxaa system, making it almost unavoidable without a rewrite or disabling of fxaa.
I don't think it is out of the question to render to one framebuffer that would get FXAA, and another for labels/billboards/points that would not, and then composite them. This would also work well with the need to render labels on top of polygons since the low level-renderer now would have explicit-ish knowledge of labels/billboards/points.
This will be a non-trivial amount of time though so we can't do it for a few months.
Here is a png and svg version of the icon from the user. He says the png looks better than the svg
icon-library.svg.txt
Here is a png and svg version of the icon from the user. He says the png looks better than the svg
This is exactly what I assumed a large part of the problem was. Browsers do not all do an equally good job of rasterizing/resizing SVGs, so the fact that it's an SVG is one of the reasons it looked so blurry.
True, but we still have big problem with fxaa intentionally blurring our billboards and labels. I think @pjcozzi's comment is the path to persue, when time permits.
I think everyone agrees @emackey I just wanted to point out that SVGs can compound the issue.
Related to #2752 and #3279 . If you're considering separate frame buffers, being able to selectively apply it to imagery tiles as well would be helpful. An AA pass on tiles, especially those with labels, reduces quality.
Also reported here: https://groups.google.com/forum/?hl=en#!topic/cesium-dev/NBfMvj80vGI
I investigated this at the bug bash. It would be quite a bit of work to selectively apply anti-aliasing. Here is the post process that would happen after each frustum instead of on the entire scene:
@bagnell can you post a link to the branch-in-progress and before/after screenshots?
The branch is selective-fxaa.
Before:
After:
Another idea is to render pick ids with MRT (or a separate pass to an offscreen framebuffer/texture), and then only blur pixels where the surrounding pick ids are different so only the ends of objects are blurred. Billboards, the globe texture, etc. would not be blurred except at the silhouettes.
Would these fixes apply to #3279 as well?
@denverpierce they would be part of the solution.
I believe that this is still an issue - has there been any additional thought on this topic in the last ~5 years?
Or is there a recommended approach to solve with billboards and text overlays?
Quality issue when moving reported in https://community.cesium.com/t/billboard-with-custom-canvas-image-gets-darker-brighter-on-scrolling-rotating/25221/2.
I'm still having this issue.
Here is an example: sandcastle
I'm trying to create many billboards with a rectangle texture. I tried supplying an image with the "pixelated" style to prevent the image from blurring. Note that the image on the top is not blurred, but the rendered rectangle in Cesium is blurred.
Also tried disabling fxaa and antialias, without success.
Any further investigation into this issue? I can see that even in the sandcastle examples the billboard quality is low as well.
Thanks for the interest. There hasn't been any activity on this item recently, but it is definitely on our radar.
For anyone else who finds this thread: my co-worker came across this other thread explaining that billboards were made intentionally blurry to help with running on mobile devices, and that viewing billboards at a browser zoom level other than 100% makes the issue more noticeable.
The workaround from the thread states to adjust the value of viewer.resolutionScale property
I tested this fix in this sandcastle with some success.
@dukeofcool199 Thanks big time for this. I've been searching and made too many attempts to fix this than I want to admit lol, but your tip did the trick! :)
UPDATE:
change the viewer's useBrowserRecommendedResolution setting to false: viewer.useBrowserRecommendedResolution = false
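For anyone tuning these settings, here is a small model of how the drawing-buffer size is derived from them (my own simplification of CesiumJS's sizing logic; only `useBrowserRecommendedResolution` and `resolutionScale` are real Viewer properties, the helper itself is hypothetical):

```javascript
// Simplified model: when useBrowserRecommendedResolution is true, CesiumJS
// renders at CSS-pixel size (pixel ratio 1); when false, it multiplies by the
// device pixel ratio. resolutionScale scales the result in both cases.
function drawingBufferSize(cssWidth, cssHeight, devicePixelRatio, useBrowserRecommendedResolution, resolutionScale) {
  const pixelRatio = (useBrowserRecommendedResolution ? 1.0 : devicePixelRatio) * resolutionScale;
  return {
    width: Math.round(cssWidth * pixelRatio),
    height: Math.round(cssHeight * pixelRatio),
  };
}

// On a 2x HiDPI screen, the default renders at CSS size, which is what makes
// billboards look soft:
console.log(drawingBufferSize(800, 600, 2, true, 1.0));  // → { width: 800, height: 600 }
// Disabling the browser-recommended resolution renders at native resolution:
console.log(drawingBufferSize(800, 600, 2, false, 1.0)); // → { width: 1600, height: 1200 }
```

Bumping `resolutionScale` above 1.0 has the same effect on a 2x screen, at the cost of extra GPU work.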
nice bro.
|
gharchive/issue
| 2016-08-25T15:53:39 |
2025-04-01T04:54:49.746095
|
{
"authors": [
"CassandraCat",
"Huyston",
"badg0003",
"bagnell",
"denverpierce",
"dukeofcool199",
"emackey",
"expatiating",
"ggetz",
"hpinkos",
"lilleyse",
"mramato",
"pjcozzi",
"shunter"
],
"repo": "CesiumGS/cesium",
"url": "https://github.com/CesiumGS/cesium/issues/4235",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
449103486
|
enableLighting = true causes black globe on Android
On my Pixel 2 XL in Chrome 74, setting enableLighting to true causes the globe to render as completely black. I don't see any errors in the console. It seems to be an issue with the combination of showGroundAtmosphere and enableLighting, as disabling either of them brings the globe back.
Can be reproduced in this sandcastle: https://cesiumjs.org/Cesium/Build/Apps/Sandcastle/?src=Ground Atmosphere.html
Thanks for reporting this @jdarpinian. I can reproduce this on my Pixel 3 in Chrome as well. I can also see that zooming in enough once you get to the fade out distance will trigger the globe to render again. But I don't see a separate code path that triggers when this fade reaches 1 or 0 in this shader:
https://github.com/AnalyticalGraphicsInc/cesium/blob/master/Source/Shaders/GlobeFS.glsl#L286
@bagnell any idea here?
@OmarShehata @jdarpinian I guess this is due to the structure, because this will not be reproduced on the pc platform, ios platform, Huawei mobile phone. the platform using Qualcomm processor seems to have this problem, I am removing all the structs, when I finish After working, I can confirm whether to solve this problem. #7651
This issue still occurs on Android Chrome. Is it possible to fix?
Still presenting as an issue on both iPad OS and the latest Android version on a OnePlus 9 Pro.
|
gharchive/issue
| 2019-05-28T07:23:01 |
2025-04-01T04:54:49.750178
|
{
"authors": [
"OmarShehata",
"hst-m",
"jdarpinian",
"nlucis",
"verybigzhouhai"
],
"repo": "CesiumGS/cesium",
"url": "https://github.com/CesiumGS/cesium/issues/7871",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
719710019
|
Update Model.js
Fix for issue https://github.com/CesiumGS/cesium/issues/9169
Added a simple test for those cases when commands are dynamic in a model
Thank you so much for the pull request @shelterit! I noticed this is your first pull request and I wanted to say welcome to the Cesium community!
The Pull Request Guidelines is a handy reference for making sure your PR gets accepted quickly, so make sure to skim that.
:x: Missing CONTRIBUTORS.md entry.
Please add yourself to the contributors file!
:x: Missing CLA.
Please send in a Contributor License Agreement (CLA) and comment back here to let us know to check this!
:grey_question: CHANGES.md was not updated.
If this change updates the public API in any way, please add a bullet point to CHANGES.md.
:grey_question: Unit tests were not updated.
Make sure you've updated tests to reflect your changes, added tests for any new code, and ran the code coverage tool.
Reviewers, don't forget to make sure that:
[ ] Cesium Viewer works.
[ ] Works in 2D/CV.
[ ] Works (or fails gracefully) in IE11.
Thanks for opening a PR Alex! I know you mentioned testing for unintended consequences, but I'm hesitant to merge when we don't have a way to verify this fix. It may not cause any adverse effects but it may push the error further down the line since the underlying cause is not fixed. Why is an undefined command being pushed here in the first place? Do you know where it originates?
Hey,
That's a good question, however I don't have the internal knowledge of Cesium to answer how that could happen. :) I'll see if I can talk to the customer if they can share that piece of data with us, although I'm not sure (they're a bit particular, hush, hush). I can probably get the source files, and find out what they used to tile it, maybe there's a race condition in there somewhere, and hopefully at the least try to make a different model to replicate the issue.
If I go through the original file, what should I be looking for? It's all linked to whether a property is present in the right place, and possibly the tiler / somewhere didn't inject the property at some place? Again, I'm not smart enough to know what goes into the opacity command being sent and what that stack looks like.
We've currently got this fix in our build stream, but it would be great to get it out, of course, to keep things "pure". Do you have a test suite of data for every build that you could test the 'fix' against?
Cheers,
Alex
Thanks again for your contribution @shelterit!
No one has commented on this pull request in 30 days. Maintainers, can you review, merge or close to keep things tidy?
I'm going to re-bump this in 30 days. If you'd like me to stop, just comment with @cesium-concierge stop. If you want me to start again, just delete the comment.
A similar fix was made in https://github.com/CesiumGS/cesium/pull/9271 so I'm going to close this one. Thanks @shelterit.
|
gharchive/pull-request
| 2020-10-12T22:29:50 |
2025-04-01T04:54:49.767370
|
{
"authors": [
"OmarShehata",
"cesium-concierge",
"lilleyse",
"shelterit"
],
"repo": "CesiumGS/cesium",
"url": "https://github.com/CesiumGS/cesium/pull/9197",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
916868623
|
Fixes S2 bounding volumes
This PR fixes two main issues in the current S2 implementation:
Incorrect conversion from {level}/{x}/{y}/{z} to Hilbert/S2 index.
The parent bounding volume being updated when a child bounding volume is created.
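For context on the first item, the error-prone step is mapping a cell's (x, y) position to its distance along the Hilbert curve. A generic illustration of that conversion (the textbook xy-to-distance algorithm, not Cesium's actual S2 code):

```javascript
// Map (x, y) on an n-by-n grid (n a power of two) to its distance d along the
// Hilbert curve. Illustrative only; Cesium's S2 token conversion is more involved.
function hilbertXY2D(n, x, y) {
  let d = 0;
  for (let s = n >> 1; s > 0; s >>= 1) {
    const rx = (x & s) > 0 ? 1 : 0;
    const ry = (y & s) > 0 ? 1 : 0;
    d += s * s * ((3 * rx) ^ ry);
    // Rotate/flip the quadrant so the next level sees a canonical orientation.
    if (ry === 0) {
      if (rx === 1) {
        x = n - 1 - x;
        y = n - 1 - y;
      }
      const t = x; x = y; y = t;
    }
  }
  return d;
}

// Order-1 curve visits (0,0), (0,1), (1,1), (1,0) in that order:
console.log(hilbertXY2D(2, 0, 0)); // → 0
console.log(hilbertXY2D(2, 1, 0)); // → 3
```

Getting the quadrant rotation wrong at any level scrambles the ordering, which is the kind of subtle bug such index conversions are prone to.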
Thanks for the pull request @sanjeetsuhag!
:heavy_check_mark: Signed CLA found.
:grey_question: CHANGES.md was not updated.
If this change updates the public API in any way, please add a bullet point to CHANGES.md.
Reviewers, don't forget to make sure that:
[ ] Cesium Viewer works.
[ ] Works in 2D/CV.
[ ] Works (or fails gracefully) in IE11.
@sanjeetsuhag I'll merge this once CI passes
merged! Thanks @sanjeetsuhag!
|
gharchive/pull-request
| 2021-06-10T04:04:47 |
2025-04-01T04:54:49.771323
|
{
"authors": [
"cesium-concierge",
"ptrgags",
"sanjeetsuhag"
],
"repo": "CesiumGS/cesium",
"url": "https://github.com/CesiumGS/cesium/pull/9607",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
2103915445
|
Implemented energy mechanism and UI
Closes #5 and closes #6
Added energy mechanism on GameManager
Added Panel component and energy panel
Implemented energy spend mechanism, locking expensive towers
Added and implemented UIManager
Removed old references to ScrollableCamera to UIManager
Renamed TowerDrag to TowerIcon
Moved things around so folders make more sense now in line with the GDD
No code reviewers are available so will go ahead and merge this
|
gharchive/pull-request
| 2024-01-28T02:00:21 |
2025-04-01T04:54:49.773322
|
{
"authors": [
"CetoBasilius"
],
"repo": "CetoBasilius/energy-defender",
"url": "https://github.com/CetoBasilius/energy-defender/pull/20",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1359983566
|
Rubify health check script
Issue summary
Currently, the health check script is written in bash. For readability, extensibility, and a uniform technology in the sync check, it would be great to rewrite it in Ruby. The current script doesn't do any magic, just grabs the metrics and compares a few values, so it should be straightforward to implement.
It might be an excellent entry task to get exposed to Ruby.
Task summary
[ ] rewrite health check script in Ruby
[ ] verify that it works the same way as the former bash script (perhaps some basic unit tests?)
[ ] use it in sync check CI
Acceptance Criteria
[ ] bash script is no longer used in sync check
[ ] new version has at least the same functionality as the former one
Other information and links
https://github.com/ChainSafe/forest/pull/1867
Bash is good, and the sync check is rock solid.
|
gharchive/issue
| 2022-09-02T10:08:33 |
2025-04-01T04:54:49.810713
|
{
"authors": [
"LesnyRumcajs"
],
"repo": "ChainSafe/forest",
"url": "https://github.com/ChainSafe/forest/issues/1871",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1530358955
|
Lint: pedantic clippy warnings
Issue summary
To further increase code quality, it would be interesting to look into pedantic clippy group.
For each lint:
add your GH handle to the lint so that two people are not working on the same lint,
figure out if it's helpful and makes sense in our case,
add it to the lint target in Makefile https://github.com/ChainSafe/forest/blob/d8eae69e556038af86e32a388e7e118b3bd3307d/Makefile#L73-L74
fix the warnings if any
tick the corresponding checkbox,
After they are all resolved, enable the entire pedantic group with perhaps some exceptions.
:warning:
This is a low-priority issue; feel free to tackle a lint or two now and then if there's nothing more important to do.
Lint list (may differ in time & clippy version):
[ ] bool_to_int_with_if
[ ] borrow_as_ptr
[ ] case_sensitive_file_extension_comparisons
[ ] cast_lossless
[ ] cast_possible_truncation
[ ] cast_possible_wrap
[ ] cast_precision_loss
[ ] cast_ptr_alignment
[ ] cast_sign_loss
[ ] checked_conversions
[ ] cloned_instead_of_copied
[ ] copy_iterator
[ ] default_trait_access
[ ] doc_link_with_quotes
[ ] doc_markdown
[ ] empty_enum
[ ] enum_glob_use
[ ] expl_impl_clone_on_copy
[ ] explicit_deref_methods
[ ] explicit_into_iter_loop
[ ] explicit_iter_loop
[ ] filter_map_next
[ ] flat_map_option
[ ] float_cmp
[ ] fn_params_excessive_bools
[ ] from_iter_instead_of_collect
[ ] if_not_else - @jdjaustin
[ ] implicit_clone
[ ] implicit_hasher
[ ] inconsistent_struct_constructor
[ ] index_refutable_slice
[ ] inefficient_to_string
[ ] inline_always
[ ] invalid_upcast_comparisons
[ ] items_after_statements
[ ] iter_not_returning_iterator
[ ] large_digit_groups
[ ] large_stack_arrays
[ ] large_types_passed_by_value
[ ] linkedlist
[ ] macro_use_imports
[ ] manual_assert
[ ] manual_instant_elapsed
[ ] manual_let_else
[ ] manual_ok_or
[ ] manual_string_new
[ ] many_single_char_names
[ ] map_unwrap_or
[ ] match_bool
[ ] match_on_vec_items
[ ] match_same_arms
[ ] match_wild_err_arm
[ ] match_wildcard_for_single_variants
[ ] maybe_infinite_iter
[ ] mismatching_type_param_order
[ ] missing_errors_doc
[ ] missing_panics_doc
[ ] module_name_repetitions
[ ] must_use_candidate
[ ] mut_mut
[ ] naive_bytecount
[ ] needless_bitwise_bool
[ ] needless_continue
[ ] needless_for_each
[ ] needless_pass_by_value
[ ] no_effect_underscore_binding
[ ] option_option
[ ] ptr_as_ptr
[ ] range_minus_one
[ ] range_plus_one
[ ] redundant_closure_for_method_calls
[x] redundant_else
[ ] ref_binding_to_reference
[ ] ref_option_ref
[ ] return_self_not_must_use
[ ] same_functions_in_if_condition
[ ] semicolon_if_nothing_returned
[ ] similar_names
[ ] single_match_else
[ ] stable_sort_primitive
[ ] string_add_assign
[ ] struct_excessive_bools
[ ] too_many_lines
[ ] transmute_ptr_to_ptr
[ ] trivially_copy_pass_by_ref
[ ] unicode_not_nfc
[ ] unnecessary_join
[ ] unnecessary_wraps
[ ] unnested_or_patterns
[ ] unreadable_literal
[ ] unsafe_derive_deserialize
[x] unused_async
[ ] unused_self
[ ] used_underscore_binding
[ ] verbose_bit_mask
[ ] wildcard_imports
[ ] zero_sized_map_values
Other information and links
https://github.com/ChainSafe/forest/pull/2428
To re-iterate two important points:
Don't work on this if you have other work to do.
Carefully consider whether the lint is beneficial. Include your arguments for why the lint should be enabled in your PR. Expect a lot of push-back so make sure your arguments are strong.
How about unwrap_used? That way we enforce using .expect with a reason.
There is an allow-unwrap-in-tests setting in clippy.toml, so we don't need to fix all the tests.
@jdjaustin Maybe you could have a go at this one. Coming up with sensible error messages might be tough, though.
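For reference, that setting is a lint configuration option; a minimal clippy.toml fragment (assuming clippy's documented allow-unwrap-in-tests option) would look like:

```toml
# clippy.toml — permit unwrap()/expect() inside #[cfg(test)] code
# while unwrap_used/expect_used remain denied in production code
allow-unwrap-in-tests = true
```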
Stale, we have more priorities than pedantic clippy warnings now. :)
|
gharchive/issue
| 2023-01-12T09:13:55 |
2025-04-01T04:54:49.838816
|
{
"authors": [
"LesnyRumcajs",
"hanabi1224",
"lemmih",
"tyshko5"
],
"repo": "ChainSafe/forest",
"url": "https://github.com/ChainSafe/forest/issues/2420",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1583113541
|
Changes to walk_snapshot to match Forest and Lotus snapshots
Summary of changes
Changes introduced in this pull request:
Changes to walk_snapshot to match Lotus logic (i.e., don't include identity CIDs in snapshot export).
Reference issue to close (if applicable)
Closes #1884
Other information and links
Change checklist
[x] I have performed a self-review of my own code,
[x] I have made corresponding changes to the documentation,
[x] I have added tests that prove my fix is effective or that my feature works (if possible),
[x] I have made sure the CHANGELOG is up-to-date. All user-facing changes should be reflected in this document.
Calibnet snapshots are matching up now but I'm so far unable to verify mainnet. @LesnyRumcajs
@jdjaustin, what's the issue with mainnet snapshots? Is it because of the hardware limitations? If so, have you tried installing Forest with an alternative, turbo fast, pioneer & bleeding edge backend with make install-with-jemalloc
Yes, I tried make install-with-jemalloc and I'm still getting No space left on device errors. I can clear out some disk space and try again.
Ah, so your issue is with disk space, not RAM. In this case, you need around 250G of free disk space.
clean up forest directory
download and import the snapshot - this would require around 250G at least - 110G snapshot + 130G DB, more or less.
Delete the snapshot to get back some disk space.
Export the snapshot - again, you will need around 250G plus some more because of the additional chain progression.
@jdjaustin If you still don't have enough disk space after deleting the old files, ask Hubert to set up a droplet for you.
Good news. Verified that the Lotus and Forest mainnet snapshots also match!
@jdjaustin does the shasum of both match?
Yepp!
Rock solid! Now it would be fantastic to keep it this way. One approach would be to modify the test a bit in our workflow https://github.com/ChainSafe/forest/blob/main/.github/workflows/rust.yml#L150-L172
Download the Lotus snapshot.
Import it into Forest
Export the snapshot at height specified in Lotus snapshot.
Compare the checksums.
We can do it in a separate issue if it's too complex; I may be missing some gotchas.
Linking the "source" PR for future reference if needed. https://github.com/filecoin-project/lotus/pull/8691
@jdjaustin Perhaps we should match the logic in Lotus for the snapshot, just in case. I am not entirely sure we need it but better safe than sorry. So this:
// We only include raw and dagcbor, for now.
// Raw for "code" CIDs.
switch prefix.Codec {
case cid.Raw, cid.DagCBOR:
default:
continue
}
Should also get included.
This block of code doesn't seem to be affecting snapshot exports currently, so my biggest question/concern with adding this is: how will I test the Rust code is working as intended?
I believe just checking if the snapshots are still the same will be okay.
Got it, I thought it was just going to be dead code, but I was just confused by the Go code; once I started writing the Rust code, I understood what was going on. And the snapshots still match:
|
gharchive/pull-request
| 2023-02-13T21:43:51 |
2025-04-01T04:54:49.853474
|
{
"authors": [
"LesnyRumcajs",
"jdjaustin",
"lemmih"
],
"repo": "ChainSafe/forest",
"url": "https://github.com/ChainSafe/forest/pull/2540",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2074703947
|
chore: update url of goerli bootnodes file and genesis ssz
Motivation
Files have been removed from eth-clients/eth2-networks, see https://github.com/eth-clients/eth2-networks/pull/92
Running Lodestar with --network goerli flag causes the following error on startup
Error fetching latest bootnodes: HTTPError: Response code 404 (Not Found)
at Request.<anonymous> (/home/nico/projects/ethereum/lodestar/node_modules/got/dist/source/as-promise/index.js:118:42)
at processTicksAndRejections (node:internal/process/task_queues:95:5)
Description
Update url of goerli bootnodes file and genesis ssz
:tada: This PR is included in v1.14.0 :tada:
|
gharchive/pull-request
| 2024-01-10T16:12:09 |
2025-04-01T04:54:49.858134
|
{
"authors": [
"nflaig",
"wemeetagain"
],
"repo": "ChainSafe/lodestar",
"url": "https://github.com/ChainSafe/lodestar/pull/6279",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2194212576
|
fix: improve state serialization
Motivation
State serialization happens once per epoch in n-historical state so its performance is important
Description
New validator type to optimize its value_serializeToBytes performance; this helps make state.validators.serialize() 3.4x faster
before
✔ serialize 20000 validators manually 773.4012 ops/s 1.292990 ms/op - 45968 runs 60.0 s
✔ serialize 20000 validators from state 206.3978 ops/s 4.845013 ms/op - 12271 runs 60.0 s
after
✔ serialize 20000 validators manually 768.2498 ops/s 1.301660 ms/op - 45667 runs 60.0 s
✔ serialize 20000 validators from state 694.0964 ops/s 1.440722 ms/op - 41260 runs 60.0 s
use the new serializeToBytes() api of ssz v0.15.1
part of #5968
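The manual path can be sketched as follows; this is my own simplified model of the idea (the field layout loosely mirrors the 121-byte fixed-size SSZ Validator container), not the actual `@chainsafe/ssz` API:

```javascript
// Hand-rolled serializer: every field has a fixed size, so offsets are known
// up front and the whole array is written into one preallocated buffer,
// avoiding per-field dispatch and intermediate allocations.
const VALIDATOR_SIZE = 48 + 32 + 8 + 1 + 4 * 8; // 121 bytes

function serializeValidators(validators) {
  const out = new Uint8Array(validators.length * VALIDATOR_SIZE);
  const view = new DataView(out.buffer);
  let offset = 0;
  for (const v of validators) {
    out.set(v.pubkey, offset); offset += 48;                  // 48-byte BLS pubkey
    out.set(v.withdrawalCredentials, offset); offset += 32;   // 32-byte credentials
    view.setBigUint64(offset, v.effectiveBalance, true); offset += 8; // uint64, little-endian
    out[offset] = v.slashed ? 1 : 0; offset += 1;             // boolean as one byte
    for (const epoch of [v.activationEligibilityEpoch, v.activationEpoch, v.exitEpoch, v.withdrawableEpoch]) {
      view.setBigUint64(offset, epoch, true); offset += 8;    // four uint64 epochs
    }
  }
  return out;
}
```

A generic serializer walks a type description for every field of every validator; the hand-rolled version amortizes all of that into straight-line buffer writes, which is where the ~3.4x comes from.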
:tada: This PR is included in v1.18.0 :tada:
|
gharchive/pull-request
| 2024-03-19T07:16:02 |
2025-04-01T04:54:49.861361
|
{
"authors": [
"tuyennhv",
"wemeetagain"
],
"repo": "ChainSafe/lodestar",
"url": "https://github.com/ChainSafe/lodestar/pull/6563",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1960840729
|
Table does not parse correctly if there is a title above it.
I am using the latest version and the following code to test:
@cl.on_chat_start
async def main():
msg = """Table 1
| heading | b | c | d |
| - | :- | -: | :-: |
| cell 1 | cell 2 | 3 | 4 |
"""
await cl.Message(content=msg).send()
The output is not parsed as a table:
Table 1
| heading | b | c | d |
| - | :- | -: | :-: |
| cell 1 | cell 2 | 3 | 4 |
Another maybe related problem is if there is markdown list inside a cell. This list will not be parsed either.
I am able to reproduce with your example. However this works
import chainlit as cl
@cl.on_chat_start
async def main():
msg = """
Hello
| heading | b | c | d |
| - | :- | -: | :-: |
| cell 1 | cell 2 | 3 | 4 | """
await cl.Message(content=msg).send()
Looks like indentation is important.
Interesting. The problem is that the table and the text above and below it are generated by the LLM, and I cannot easily control indentation. Could we make the table parsing more robust?
That is a good question. We use remark-gfm to parse markdown tables. Would have to dig into that.
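Until the parser itself is more forgiving, one application-side workaround (a hedged sketch of my own, not a Chainlit API) is to strip the common leading indentation before the message reaches the markdown renderer, since lines indented by four or more spaces are treated as indented code blocks and never parsed as tables:

```javascript
// Remove the indentation shared by all non-blank lines, so a pipe table that
// an LLM emitted inside an indented string is seen at column 0 by remark-gfm.
function dedent(text) {
  const lines = text.split("\n");
  const nonBlank = lines.filter((line) => line.trim().length > 0);
  const common = nonBlank.length
    ? Math.min(...nonBlank.map((line) => line.match(/^[ \t]*/)[0].length))
    : 0;
  return lines.map((line) => line.slice(common)).join("\n");
}

console.log(dedent("    | a |\n    | - |\n    | 1 |"));
// → "| a |\n| - |\n| 1 |"
```

This only papers over the indentation case; lists inside table cells would still need support from the table parser itself.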
Do you know what ChatGPT and Gradio use? At least ChatGPT can produce beautiful tables robustly.
|
gharchive/issue
| 2023-10-25T08:28:12 |
2025-04-01T04:54:49.873908
|
{
"authors": [
"entron",
"willydouhard"
],
"repo": "Chainlit/chainlit",
"url": "https://github.com/Chainlit/chainlit/issues/503",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
272625296
|
Continuation puzzle scene 1
Pull request by Jens Lomander and Andreas Flöjt
You can now throw a rock in the first puzzle. Better than showing nothing doodz
SonarQube analysis reported 1 issue
Note: The following issues were found on lines that were not modified in the pull request. Because these issues can't be reported as line comments, they are summarized here:
AncientFungus:LGEPR: Cppcheck cannot find all the include files (use --check-config for details)
|
gharchive/pull-request
| 2017-11-09T16:23:29 |
2025-04-01T04:54:49.876257
|
{
"authors": [
"AncientFungus",
"NinjaDanz3r"
],
"repo": "Chainsawkitten/LargeGameProjectEngine",
"url": "https://github.com/Chainsawkitten/LargeGameProjectEngine/pull/791",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
283664045
|
Updating PULL_REQUEST_TEMPLATE.md -> Lodash Backlog
Description
MAKING MAINTAINABLE #100
Makes tracking the methods that have been UPDATED or ADDED to the lodash backlog much easier than having the collaborators constantly read all the snippets, trying to find which ones have been added or not.
This way, all we have to do is look at the lodash backlog timeline, and we can update whichever methods.md PRs have been merged.
What does your PR belong to?
[ ] Website
[ ] Snippets
[x] General / Things regarding the repository (like CI Integration)
Types of changes
[x] Bug fix (non-breaking change which fixes an issue)
[ ] Enhancement (non-breaking improvement of a snippet)
[ ] New feature (non-breaking change which adds functionality)
[ ] Breaking change (fix or feature that would cause existing functionality to change)
Checklist:
[ ] My code follows the code style of this project.
[x] My change requires a change to the documentation.
[x] I have updated the documentation accordingly.
[x] I have checked that the changes are working properly
[x] I have checked that there isn't any PR doing the same
[x] I have read the CONTRIBUTING document.
Yes Will surely do
|
gharchive/pull-request
| 2017-12-20T19:03:06 |
2025-04-01T04:54:49.881274
|
{
"authors": [
"kingdavidmartins"
],
"repo": "Chalarangelo/30-seconds-of-code",
"url": "https://github.com/Chalarangelo/30-seconds-of-code/pull/281",
"license": "CC-BY-4.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1618243487
|
Changing the API key has no effect
I changed the API key in the config file service/.env and restarted both the backend and the frontend, but checking on the OpenAI side it is still using the old API key (two different accounts). Are there other steps needed for the change to take effect?
The official site has a delay.
It's been almost a day and it's still using the old API key; I restarted the server and it's the same. Looking for the cause…
|
gharchive/issue
| 2023-03-10T02:02:10 |
2025-04-01T04:54:49.896822
|
{
"authors": [
"Chanzhaoyu",
"zhzyg"
],
"repo": "Chanzhaoyu/chatgpt-web",
"url": "https://github.com/Chanzhaoyu/chatgpt-web/issues/456",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2454422743
|
General Grievance: Watch content-baer.de
General Grievance requests the watch of the watch_keyword content-baer\.de. See the MS search here and the Stack Exchange search in text, in URLs, and in code.
content-baer\.de has been seen in 0 true positives, 0 false positives, and 0 NAAs.
Approved by Jeff Schaller in Charcoal HQ
|
gharchive/pull-request
| 2024-08-07T21:57:31 |
2025-04-01T04:54:49.906129
|
{
"authors": [
"SmokeDetector",
"metasmoke"
],
"repo": "Charcoal-SE/SmokeDetector",
"url": "https://github.com/Charcoal-SE/SmokeDetector/pull/12557",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
367755827
|
health was renamed as medicalsciences
The site formerly known as Health is now Medical Sciences, but bodyfetcher has the old URL.
Coverage remained the same at 63.784% when pulling f045ad431bd04f4033724d95e27808e37d8e47c4 on normalhuman:patch-7 into cf5d65e5954c7eb7d0d00315c8a0402d56801f43 on Charcoal-SE:master.
|
gharchive/pull-request
| 2018-10-08T11:57:21 |
2025-04-01T04:54:49.907948
|
{
"authors": [
"coveralls",
"normalhuman"
],
"repo": "Charcoal-SE/SmokeDetector",
"url": "https://github.com/Charcoal-SE/SmokeDetector/pull/2460",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
704850473
|
Kulfy: Watch darkhackerworld.com
Kulfy requests the watch of the watch_keyword darkhackerworld\.com. See the MS search here and the Stack Exchange search in text, in URLs, and in code.
darkhackerworld\.com has been seen in 0 true positives, 0 false positives, and 0 NAAs.
From this deleted answer.
Approved by Mast in Charcoal HQ
|
gharchive/pull-request
| 2020-09-19T09:44:47 |
2025-04-01T04:54:49.911778
|
{
"authors": [
"SmokeDetector",
"double-beep",
"metasmoke"
],
"repo": "Charcoal-SE/SmokeDetector",
"url": "https://github.com/Charcoal-SE/SmokeDetector/pull/4874",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
931578831
|
Ollie: Watch 01329 609 260
Ollie requests the watch of the watch_number 01329 609 260. See the MS search here and the Stack Exchange search in text, in URLs, and in code.
Approved by Spevacus in Charcoal HQ
|
gharchive/pull-request
| 2021-06-28T13:33:17 |
2025-04-01T04:54:49.914566
|
{
"authors": [
"SmokeDetector"
],
"repo": "Charcoal-SE/SmokeDetector",
"url": "https://github.com/Charcoal-SE/SmokeDetector/pull/6566",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1320307745
|
cocomac: Watch w88xin.com
cocomac requests the watch of the watch_keyword w88xin\.com. See the MS search here and the Stack Exchange search in text, in URLs, and in code.
w88xin\.com has been seen in 1 true positive, 0 false positives, and 0 NAAs.
Approved by Spevacus in Charcoal HQ
|
gharchive/pull-request
| 2022-07-28T02:23:56 |
2025-04-01T04:54:49.918029
|
{
"authors": [
"SmokeDetector",
"metasmoke"
],
"repo": "Charcoal-SE/SmokeDetector",
"url": "https://github.com/Charcoal-SE/SmokeDetector/pull/7209",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1873074729
|
Nick: Watch kushals.com
Nick requests the watch of the watch_keyword kushals\.com. See the MS search here and the Stack Exchange search in text, in URLs, and in code.
kushals\.com has been seen in 1 true positive, 0 false positives, and 0 NAAs.
Approved by Mast in Charcoal HQ
|
gharchive/pull-request
| 2023-08-30T07:19:00 |
2025-04-01T04:54:49.921145
|
{
"authors": [
"SmokeDetector",
"metasmoke"
],
"repo": "Charcoal-SE/SmokeDetector",
"url": "https://github.com/Charcoal-SE/SmokeDetector/pull/8056",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2057544579
|
Jesse: Watch apkbossnews.com
Jesse requests the watch of the watch_keyword apkbossnews\.com. See the MS search here and the Stack Exchange search in text, in URLs, and in code.
apkbossnews\.com has been seen in 1 true positive, 0 false positives, and 0 NAAs.
Approved by Spevacus in Charcoal HQ
|
gharchive/pull-request
| 2023-12-27T16:10:46 |
2025-04-01T04:54:49.924530
|
{
"authors": [
"SmokeDetector",
"metasmoke"
],
"repo": "Charcoal-SE/SmokeDetector",
"url": "https://github.com/Charcoal-SE/SmokeDetector/pull/9729",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
299042001
|
Refactor navbar
Maybe integrate this to make our navbar less magic-y?
cc @izwick-schachter
I started trying to do this and it really seems like more trouble than it's worth.
|
gharchive/issue
| 2018-02-21T16:38:57 |
2025-04-01T04:54:49.925747
|
{
"authors": [
"izwick-schachter",
"j-f1"
],
"repo": "Charcoal-SE/metasmoke",
"url": "https://github.com/Charcoal-SE/metasmoke/issues/328",
"license": "cc0-1.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
1229226686
|
Blazer: LEFT OUTER JOIN query never finishes
I created a query which simply ends up displaying the browser's "Aw, snap" sad face eventually. I have had experiences in the past where Blazer queries were slow and timed out, but this seems like a different beast, as the timeout is not caught and reported by Ruby or other parts of the web framework. I'm wondering if I managed to DoS metasmoke somehow.
The problematic query is https://metasmoke.erwaysoftware.com/data/sql/queries/297-phone-number-in-title-is-not-blacklisted
A variation which avoids the (allegedly more efficient) LEFT OUTER join https://metasmoke.erwaysoftware.com/data/sql/queries/298-phone-number-in-title-is-not-blacklisted finishes quickly and returns some 3,000 rows.
You've given us nothing but a link to the query. Queries are automatically run when you go to the page. You say you're concerned about DoS'ing MS as a result of this query. So, you've left us little alternative for investigation other than clicking on the link you've provided and contributing to the DoS ourselves, if that's what's happening.
Sorry about that. Here is the query text:
-- https://stackoverflow.com/a/14710831/874188
SELECT p.id, p.title, p.body FROM p_posts AS p
LEFT OUTER JOIN p_posts_reasons AS r
ON r.post_id = p.id
AND r.reason_id in (47, 164)
-- 47 = phone number in title
-- 164 = bad phone number in title
WHERE r.post_id IS NULL
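The query above is the classic anti-join pattern: left-join the reasons table and keep only posts where no matching row was found. A minimal sketch with an in-memory SQLite database (the tables and reason IDs are illustrative stand-ins, not metasmoke's real schema) shows that this form and a `NOT IN` subquery, closer to the "variation" query, return the same rows:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, title TEXT)")
cur.execute("CREATE TABLE posts_reasons (post_id INTEGER, reason_id INTEGER)")
cur.executemany("INSERT INTO posts VALUES (?, ?)",
                [(1, "a"), (2, "b"), (3, "c")])
# Posts 1 and 3 matched the phone-number reasons; post 2 did not.
cur.executemany("INSERT INTO posts_reasons VALUES (?, ?)",
                [(1, 47), (3, 164), (1, 12)])

# Anti-join: keep posts with no matching reason row for reasons 47/164.
anti_join = cur.execute("""
    SELECT p.id FROM posts AS p
    LEFT OUTER JOIN posts_reasons AS r
      ON r.post_id = p.id AND r.reason_id IN (47, 164)
    WHERE r.post_id IS NULL
""").fetchall()

# Equivalent subquery form.
not_in = cur.execute("""
    SELECT id FROM posts
    WHERE id NOT IN (SELECT post_id FROM posts_reasons
                     WHERE reason_id IN (47, 164))
""").fetchall()

print(anti_join, not_in)  # both print [(2,)]
```

Note that the join condition on `reason_id` lives in the `ON` clause, not the `WHERE` clause; moving it to `WHERE` would filter out the NULL rows the anti-join depends on.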
Here is a query to fetch the information about an individual query by its Blazer ID:
https://metasmoke.erwaysoftware.com/data/sql/queries/299-blazer-query-by-id?query_id=297
I did actually click on the link you provided. Your query was cached about an hour ago. It returned 357,465 rows. I suspect that you didn't crash MS, but that you did crash the browser tab you were using when it tried to display all 357,465 rows of results.
When experimenting with or working on a query, I've found it's a good idea to include a LIMIT to get a short look at whether the results resemble what you want. Before removing the LIMIT, if your query is still hitting the limit you've set, it's a good idea to run it with a COUNT instead of the unlimited query to get a feel for the number of results you're returning. I've edited the query to have LIMIT 100.
Here's an image of the prior results:
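The iterate-with-LIMIT-then-COUNT workflow described above can be sketched against a throwaway SQLite table (table and column names are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, title TEXT)")
cur.executemany("INSERT INTO posts VALUES (?, ?)",
                [(i, f"post {i}") for i in range(1, 1001)])

# Step 1: peek at a handful of rows to sanity-check the shape of the results.
sample = cur.execute("SELECT id, title FROM posts LIMIT 5").fetchall()

# Step 2: before dropping the LIMIT, ask how many rows the full query returns,
# so a surprise 357,465-row result set never reaches the browser.
(total,) = cur.execute("SELECT COUNT(*) FROM posts").fetchone()

print(len(sample), total)  # prints: 5 1000
```

Only when the COUNT looks sane is it worth running the unlimited query.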
Thanks for investigating! I'm guessing this is a browser bug then actually. (I'm using Brave.)
(For what it's worth, I have deleted the flawed query now; it didn't do what I wanted anyway.)
|
gharchive/issue
| 2022-05-09T06:14:25 |
2025-04-01T04:54:49.947847
|
{
"authors": [
"makyen",
"tripleee"
],
"repo": "Charcoal-SE/metasmoke",
"url": "https://github.com/Charcoal-SE/metasmoke/issues/921",
"license": "cc0-1.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
273607314
|
Issue with dates and json
I am getting this exception thrown on every date
[DateField] = '(new System.Collections.Generic.Mscorlib_CollectionDebugView<PivotalTrackerDotNet.Domain.Story>(stories).Items[0]).[DateField]' threw an exception of type 'System.FormatException'
Using the exact story getting code listed on the home page of this project. I'm guessing you have the api set up to get the milli date, instead of the standard one. Can you add some overload to get the standard date format? XML is fine reading this format, but I am not using XML.
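The two formats in play here are a Unix epoch in milliseconds versus a "standard" ISO-8601 timestamp. A minimal Python sketch (the values are made up for illustration, not taken from the Pivotal Tracker API) shows that the same instant parses fine from either form, but only if the parser matches the format:

```python
from datetime import datetime, timezone

# The same instant serialized two ways (illustrative values only).
millis = 1510613616000          # epoch milliseconds
iso = "2017-11-13T22:53:36Z"    # ISO-8601 form

# Epoch milliseconds: convert to seconds before handing to fromtimestamp.
from_millis = datetime.fromtimestamp(millis / 1000, tz=timezone.utc)

# The ISO form parses with an explicit format string.
from_iso = datetime.strptime(iso, "%Y-%m-%dT%H:%M:%SZ").replace(tzinfo=timezone.utc)

assert from_millis == from_iso

# Feeding the raw millisecond integer to a parser that expects the ISO
# string (or vice versa) is exactly the kind of mismatch that surfaces
# as a FormatException in the debugger.
```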
I've forked your project and added 2 methods to the storyService that specifically get certain date types; you'll probably want to change up how it's done, but it works.
https://github.com/Nixon-Joseph/PivotalTracker.NET
Thanks, I'll take a look.
From the screenshot it looks like either you are wrapping the library with your own code or you have modified the library yourself.
The Story object doesn't have AcceptedOn or CreatedOn properties, and there is no Note object.
Therefore I have no idea what your code is doing to cause this issue.
Also from the screenshot it does look like the AcceptedAt and CreatedAt properties have been deserialized correctly.
Hmm, maybe that's changed since I posted that. Because I didn't change the models at all.
|
gharchive/issue
| 2017-11-13T22:53:36 |
2025-04-01T04:54:49.952055
|
{
"authors": [
"Hyldahl",
"Nixon-Joseph"
],
"repo": "Charcoals/PivotalTracker.NET",
"url": "https://github.com/Charcoals/PivotalTracker.NET/issues/23",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
713253094
|
High contrast mode
Make a high contrast mode in addition to light mode #37
CSS contrast will do the work, let me help.
Absolutely! Could you add a sub-dropdown in the View dropdown for Themes, and submit a PR when you're ready? Thank you!
|
gharchive/issue
| 2020-10-01T23:44:00 |
2025-04-01T04:54:49.953556
|
{
"authors": [
"CharlesAverill",
"Danowicz"
],
"repo": "CharlesAverill/satyrn",
"url": "https://github.com/CharlesAverill/satyrn/issues/46",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|