Dataset columns:

| column   | type                | stats                                       |
|----------|---------------------|---------------------------------------------|
| id       | string              | length 4 to 10                              |
| text     | string              | length 4 to 2.14M                           |
| source   | string              | 2 classes                                   |
| created  | timestamp[s] (date) | 2001-05-16 21:05:09 to 2025-01-01 03:38:30  |
| added    | string (date)       | 2025-04-01 04:05:38 to 2025-04-01 07:14:06  |
| metadata | dict                |                                             |
1564891493
generate-table-partitions: Extend to support non-numeric partition keys Currently generate-table-partitions only supports numeric, monotonically increasing partition keys. We should extend support to non-numeric keys with the use of NTILE. This issue would involve building a query using NTILE in the partition_row_builder and generating a new function (similar to _get_partition_key_filters) to run the query and build filters. Closed by #889
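To make the idea concrete, here is a small, self-contained sketch of the NTILE approach using an in-memory SQLite table. The table and column names are made up for illustration, and the real implementation would go through the tool's query builder rather than raw SQL.

```python
# Self-contained sketch of the NTILE idea; table/column names are illustrative,
# not the validator's actual schema or API.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (station_id TEXT)")
conn.executemany("INSERT INTO items VALUES (?)",
                 [("alpha",), ("bravo",), ("charlie",), ("delta",), ("echo",), ("foxtrot",)])

partition_count = 3
rows = conn.execute(f"""
    SELECT bucket, MIN(station_id), MAX(station_id)
    FROM (
        SELECT station_id,
               NTILE({partition_count}) OVER (ORDER BY station_id) AS bucket
        FROM items
    )
    GROUP BY bucket ORDER BY bucket
""").fetchall()

# One filter per partition, usable in a WHERE clause even for non-numeric keys.
filters = [f"station_id >= '{lo}' AND station_id <= '{hi}'" for _, lo, hi in rows]
print(filters)
```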
gharchive/issue
2023-01-31T19:37:23
2025-04-01T06:37:02.164768
{ "authors": [ "nehanene15" ], "repo": "GoogleCloudPlatform/professional-services-data-validator", "url": "https://github.com/GoogleCloudPlatform/professional-services-data-validator/issues/688", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
1749994591
Prometheus UI is broken after adding authentication After implementing #242, the Prometheus UI is broken. It doesn't pass any basicAuth headers to GMP. I solved it by adding a browser extension. Here's how to reproduce: Configure Prometheus UI to use AUTH_USERNAME and AUTH_PASSWORD. Run Prometheus UI: kubectl port-forward --namespace prometheus svc/infra-monitoring-prometheus-frontend-production 8080:9090. Open the browser at http://localhost:8080, and make sure it's failing to load metrics. But Grafana access works fine. I found a way to overcome it: Install the Modify Header Value addon in your browser. In the addon, set up a basic auth header with the values from your prometheus chart. Make sure your prometheus dashboard is now working (i.e. no errors, metrics can be searched, etc.). Are you running the latest frontend? I believe this was fixed in #339, which was released in v0.7.0 of the frontend.
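For anyone debugging the same setup, here is a rough way to confirm the behaviour outside the browser. It assumes the port-forward from the report is running on localhost:8080 and uses placeholder credentials; the endpoint is the standard Prometheus HTTP API, but everything else is taken from the report above rather than from the frontend's documentation.

```python
# Rough check against the port-forwarded frontend; host/port and the credential
# placeholders below are assumptions taken from the report, not fixed values.
import requests

base = "http://localhost:8080"

# With auth enabled, an anonymous query should be rejected (e.g. HTTP 401).
anon = requests.get(f"{base}/api/v1/query", params={"query": "up"})
print("anonymous:", anon.status_code)

# The same query with basic auth should succeed, which is what the browser
# extension workaround above achieves by injecting the Authorization header.
authed = requests.get(
    f"{base}/api/v1/query",
    params={"query": "up"},
    auth=("AUTH_USERNAME", "AUTH_PASSWORD"),  # replace with your chart values
)
print("authenticated:", authed.status_code)
```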
gharchive/issue
2023-06-09T14:21:28
2025-04-01T06:37:02.169701
{ "authors": [ "iamkarlson", "mhoran" ], "repo": "GoogleCloudPlatform/prometheus-engine", "url": "https://github.com/GoogleCloudPlatform/prometheus-engine/issues/487", "license": "apache-2.0", "license_type": "permissive", "license_source": "bigquery" }
1363566087
As an arete create user I am running into a missing wait state between project creation and api enablement - running on an existing target project or running create twice is a workaround running arete create after manually deleting the CC cluster - cluster creation does not kick in without deleting the .arete cache running into existing .arete config - deleting - separate issue in https://github.com/GoogleCloudPlatform/pubsec-declarative-toolkit/issues/94 error - see end of details (timing/wait step between project creation and services enablement) 2:48PM INF Creating Config Controller Cluster.... 2:48PM FTL error="API [krmapihosting.googleapis.com] not enabled on project [153970848512]. Would you like to enable and retry (this will take a few minutes)? (y/N)? ERROR: (gcloud.anthos.config.controller.create) PERMISSION_DENIED: KRM API Hosting API has not been used in project 153970848512 before or it is disabled. Enable it by visiting https://console.developers.google.com/apis/api/krmapihosting.googleapis.com/overview?project=153970848512 then retry. If you enabled this API recently, wait a few minutes for the action to propagate to our systems and retry.- '@type': type.googleapis.com/google.rpc.Help links: - description: Google developers console API activation url: https://console.developers.google.com/apis/api/krmapihosting.googleapis.com/overview?project=153970848512- '@type': type.googleapis.com/google.rpc.ErrorInfo domain: googleapis.com metadata: consumer: projects/153970848512 service: krmapihosting.googleapis.com reason: SERVICE_DISABLED" details admin_root@cloudshell:~ (landing-zone-controller-w8hwa)$ arete create pdt-cno-kcc --region=northamerica-northeast1 --project=pubsec-declarative-toolkit-cno 2:37PM INF Config Controller setup complete admin_root@cloudshell:~ (landing-zone-controller-w8hwa)$ rm -rf .arete/ .bashrc .docker/ .npm/ .redhat/ .bash_history .cache/ gopath/ .profile .theia/ .bash_logout .config/ .kube/ README-cloudshell.txt wse_github/ admin_root@cloudshell:~ (landing-zone-controller-w8hwa)$ ls -la .arete/ total 20 drwxr--r-- 2 admin_root admin_root 4096 Sep 2 18:15 . drwxr-xr-x 12 admin_root admin_root 4096 Sep 6 14:37 .. -rw-r--r-- 1 admin_root admin_root 46 Aug 31 15:06 config.yaml -rw------- 1 admin_root admin_root 100 Aug 31 15:32 .create -rw-r--r-- 1 admin_root admin_root 1318 Sep 2 18:15 solutions.yaml admin_root@cloudshell:~ (landing-zone-controller-w8hwa)$ rm -rf .arete/ admin_root@cloudshell:~ (landing-zone-controller-w8hwa)$ arete create pdt-cno-kcc --region=northamerica-northeast1 --project=pubsec-declarative-toolkit-cno 2:40PM INF Enabling required services... 2:40PM INF Operation "operations/acat.p2-491974186555-2e6beaa9-f3df-4413-9a28-419db485c8e0" finished successfully. 2:41PM INF Creating Config Controller Cluster.... 2:41PM FTL error="ERROR: (gcloud.anthos.config.controller.create) ALREADY_EXISTS: Resource 'projects/pubsec-declarative-toolkit-cno/locations/northamerica-northeast1/krmApiHosts/pdt-cno-kcc' already exists- '@type': type.googleapis.com/google.rpc.ResourceInfo resourceName: projects/pubsec-declarative-toolkit-cno/locations/northamerica-northeast1/krmApiHosts/pdt-cno-kcc" deleting project - attempt to reuse may fail on 30 day deleted cache - will try admin_root@cloudshell:~ (landing-zone-controller-w8hwa)$ gcloud projects delete pubsec-declarative-toolkit-cno Your project will be deleted. Do you want to continue (Y/n)? y Deleted [https://cloudresourcemanager.googleapis.com/v1/projects/pubsec-declarative-toolkit-cno]. 
You can undo this operation for a limited period by running the command below. $ gcloud projects undelete pubsec-declarative-toolkit-cno See https://cloud.google.com/resource-manager/docs/creating-managing-projects for information on shutting down projects. admin_root@cloudshell:~ (landing-zone-controller-w8hwa)$ arete create pdt-cno-kcc --region=northamerica-northeast1 --project=pubsec-declarative-toolkit-cno ✔ My Billing Account - 019..3D ✔ nuage-cloud.org - 471..7 ✔ Folder Level ✔ pdt - 346..8 2:44PM FTL error="ERROR: (gcloud.projects.create) Project creation failed. The project ID you specified is already in use by another project. Please try an alternative ID." admin_root@cloudshell:~ (landing-zone-controller-w8hwa)$ arete create pdt-cno-kcc --region=northamerica-northeast1 --project=pubsec-declarative-toolkit-cno2 ✔ My Billing Account - 01..3D ✔ nuage-cloud.org - 471924274947 ✔ Folder Level ✔ pdt - 346242644868 2:45PM FTL error="ERROR: (gcloud.projects.create) argument PROJECT_ID: Bad value [pubsec-declarative-toolkit-cno2]: Project IDs are immutable and can be set only during project creation. They must start with a lowercase letter and can have lowercase ASCII letters, digits or hyphens. Project IDs must be between 6 and 30 characters.Usage: gcloud projects create [PROJECT_ID] [optional flags] optional flags may be --enable-cloud-apis | --folder | --help | --labels | --name | --organization | --set-as-defaultFor detailed information on this command and its flags, run: gcloud projects create --help" 30 char limit admin_root@cloudshell:~ (landing-zone-controller-w8hwa)$ arete create pdt-cno-kcc --region=northamerica-northeast1 --project=pubsec-declarative-tk-cno2 ✔ My Billing Account - 019952-0D0AAC-777E3D ✔ nuage-cloud.org - 471924274947 ✔ Folder Level ✔ pdt - 346242644868 2:48PM INF Create in progress for [https://cloudresourcemanager.googleapis.com/v1/projects/pubsec-declarative-tk-cno2].Waiting for [operations/cp.7885851846085518239] to finish.....done.Enabling service [cloudapis.googleapis.com] on project [pubsec-declarative-tk-cno2]...Operation "operations/acat.p2-153970848512-8ffc1200-8c5a-42fd-b142-e11cdaf69191" finished successfully.Updated property [core/project] to [pubsec-declarative-tk-cno2]. 2:48PM INF Creating Config Controller Cluster.... 2:48PM FTL error="API [krmapihosting.googleapis.com] not enabled on project [153970848512]. Would you like to enable and retry (this will take a few minutes)? (y/N)? ERROR: (gcloud.anthos.config.controller.create) PERMISSION_DENIED: KRM API Hosting API has not been used in project 153970848512 before or it is disabled. Enable it by visiting https://console.developers.google.com/apis/api/krmapihosting.googleapis.com/overview?project=153970848512 then retry. If you enabled this API recently, wait a few minutes for the action to propagate to our systems and retry.- '@type': type.googleapis.com/google.rpc.Help links: - description: Google developers console API activation url: https://console.developers.google.com/apis/api/krmapihosting.googleapis.com/overview?project=153970848512- '@type': type.googleapis.com/google.rpc.ErrorInfo domain: googleapis.com metadata: consumer: projects/153970848512 service: krmapihosting.googleapis.com reason: SERVICE_DISABLED" rerun on recently created project - or run on an existing project to avoid the service enablement missing wait timer arete create pdt-cno-kcc --region=northamerica-northeast1 --project=pubsec-declarative-tk-cno2 4:39PM INF Enabling required services... 
4:40PM INF Operation "operations/acf.p2-153970848512-b3d4a2a6-fe02-4a5b-8f5d-d27d917f6527" finished successfully. 4:40PM INF Creating Network... ........................................................................................done.Created instance [pdt-cno-kcc].Fetching cluster endpoint and auth data.kubeconfig entrgenerated for krmapihost-pdt-cno-kcc. 5:09PM INF Add SA to roles/owner role... 5:09PM INF Config Controller setup complete reviewing https://cloud.google.com/anthos-config-management/docs/tutorials/landing-zone#removing_resources Breaking out periodic timeout issue (when we break the 30 min limit (we are usually within 22-25 min admin_root@cloudshell:~ (landing-zone-controller-w8hwa)$ arete create pdt-cno-kcc --region=northamerica-northeast1 --project=pubsec-declarative-tk-cno2 ✔ My Billing Account - 019952-0D0AAC-777E3D ✔ nuage-cloud.org - 471924274947 ✔ Folder Level ✔ pdt - 346242644868 2:48PM INF Create in progress for [https://cloudresourcemanager.googleapis.com/v1/projects/pubsec-declarative-tk-cno2].Waiting for [operations/cp.7885851846085518239] to finish.....done.Enabling service [cloudapis.googleapis.com] on project [pubsec-declarative-tk-cno2]...Operation "operations/acat.p2-153970848512-8ffc1200-8c5a-42fd-b142-e11cdaf69191" finished successfully.Updated property [core/project] to [pubsec-declarative-tk-cno2]. 2:48PM INF Creating Config Controller Cluster.... 2:48PM FTL error="API [krmapihosting.googleapis.com] not enabled on project [153970848512]. Would you like to enable and retry (this will take a few minutes)? (y/N)? ERROR: (gcloud.anthos.config.controller.create) PERMISSION_DENIED: KRM API Hosting API has not been used in project 153970848512 before or it is disabled. Enable it by visiting https://console.developers.google.com/apis/api/krmapihosting.googleapis.com/overview?project=153970848512 then retry. If you enabled this API recently, wait a few minutes for the action to propagate to our systems and retry.- '@type': type.googleapis.com/google.rpc.Help links: - description: Google developers console API activation url: https://console.developers.google.com/apis/api/krmapihosting.googleapis.com/overview?project=153970848512- '@type': type.googleapis.com/google.rpc.ErrorInfo domain: googleapis.com metadata: consumer: projects/153970848512 service: krmapihosting.googleapis.com reason: SERVICE_DISABLED" rerun on recently created project - or run on an existing project to avoid the service enablement missing wait timer arete create pdt-cno-kcc --region=northamerica-northeast1 --project=pubsec-declarative-tk-cno2 4:39PM INF Enabling required services... 4:40PM INF Operation "operations/acf.p2-153970848512-b3d4a2a6-fe02-4a5b-8f5d-d27d917f6527" finished successfully. 4:40PM INF Creating Network... ........................................................................................done.Created instance [pdt-cno-kcc].Fetching cluster endpoint and auth data.kubeconfig entrgenerated for krmapihost-pdt-cno-kcc. 5:09PM INF Add SA to roles/owner role... 5:09PM INF Config Controller setup complete @fmichaelobrien I'm a little confused on this one but I suspect what is going on is that you are getting stuck on the arete create cache here. arete will enable the APIs and it waits until that operation is complete before moving onto the next steps. I think the resolution here will be the addition to the arete create cache tracking that will bucket the cache per create command for multiple runs. 
Will add this feature, and then if you can test again on that branch we can determine if this solves the issue. All good, this one is not related to arete - it is the underlying anthos config controller create under the covers - there are periodic timeouts occurring that we are working out with the CC team. Closing
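For reference, the missing wait step being asked for amounts to enabling the API on the freshly created project and polling until it is actually reported as enabled before calling the cluster create. The sketch below wraps standard gcloud commands via subprocess; the retry counts and delays are arbitrary, and it is not arete's actual implementation.

```python
# Sketch of a wait step between project creation and cluster creation; retry
# counts and delays are arbitrary, and this is not arete's actual code.
import subprocess
import time

def wait_for_service(project, service="krmapihosting.googleapis.com",
                     retries=30, delay=10):
    # Enable the API on the new project...
    subprocess.run(["gcloud", "services", "enable", service, "--project", project],
                   check=True)
    # ...then poll until it actually shows up as enabled before proceeding.
    for _ in range(retries):
        out = subprocess.run(
            ["gcloud", "services", "list", "--enabled", "--project", project,
             f"--filter=config.name={service}", "--format=value(config.name)"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        if out == service:
            return True
        time.sleep(delay)  # propagation can take a few minutes on a fresh project
    return False
```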
gharchive/issue
2022-09-06T16:53:53
2025-04-01T06:37:02.176780
{ "authors": [ "fmichaelobrien", "shaunmitchellve" ], "repo": "GoogleCloudPlatform/pubsec-declarative-toolkit", "url": "https://github.com/GoogleCloudPlatform/pubsec-declarative-toolkit/issues/93", "license": "apache-2.0", "license_type": "permissive", "license_source": "bigquery" }
2366656429
Cloud Functions broken by security fix for default Cloud Build SA Issue: Cloud Functions v2 relies on Cloud Build in the background to deploy AR resources, so it is silently impacted by the recent platform-wide security fix with the Cloud Build Service Account Change. Many customers also use the org policy iam.automaticIamGrantsForDefaultServiceAccounts, which is a recommended security best practice, and also now enforced by default for all new customer orgs. The combination of these two policies means that this terraform module will always fail. Deploying Cloud Functions depends on a Cloud Build job in the background, using the default compute SA, which must have permissions like Storage Object Viewer on the bucket gcf-v2-sources-$PROJECTNUMBER-$REGION. See example issue at https://github.com/terraform-google-modules/terraform-example-foundation/issues/1269 Recommended fix: Document the dependency and troubleshooting guidance Expose a control for the user to specify which service account is used for the underlying cloud build job. (Not sure whether the Functions API itself already supports that, or if it would require an API-level fix.) Actually on further testing, this might be an issue with the upstream Functions gen 2 API rather than the terraform modules. Trying to deploy on console with both policies in place will consistently fail, with error messages like the following: This function has failed to deploy and will not work correctly. Please edit and redeploy. Cloud Run service projects/$PROJECTID/locations/us-central1/services/function-1 for the function was not found. The function will not work correctly. Please redeploy. Build failed with status: FAILURE and message: failed to Fetch: failed to download archive gs://gcf-v2-sources-$NUMBER-us-central1/function-1/function-source.zip: Access to bucket gcf-v2-sources-$NUMBER-us-central1 denied. You must grant Storage Object Viewer permission to $NUMBER-compute@developer.gserviceaccount.com. . For more details see the logs at https://console.cloud.google.com/cloud-build/builds;region=us-central1/a5ed7bdf-db90-4578-97fe-1c54a713eb44?project=$NUMBER. Fixed in #132 @prabhu34 The fix unblocks the technical issue, but I think there's still a big challenge for usage and understanding. Going forward, this module will fail by default for all new customers (the overlapping result of changes to default IAM grants to the compute service account + changes to the default Cloud Build service account). So an extra and non-intuitive step is required to make this work, even if the user thinks that the principal running the module has all the necessary permissions, one of: add permissions to the default compute SA, override org policies to use legacy behavior for the cloud build SA, or manually specify a build service account with sufficient privilege. I've suggested #133, WDYT?
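As a stop-gap, the first workaround listed above (granting the default compute service account read access to the staging bucket's objects) can be scripted. The project values below are placeholders, and the snippet simply wraps the usual gcloud IAM binding command rather than anything specific to this module.

```python
# Stop-gap for the first workaround above; PROJECT_ID and PROJECT_NUMBER are
# placeholders, and this just wraps the usual gcloud IAM binding command.
import subprocess

PROJECT_ID = "my-project"        # placeholder
PROJECT_NUMBER = "123456789012"  # placeholder

subprocess.run(
    [
        "gcloud", "projects", "add-iam-policy-binding", PROJECT_ID,
        f"--member=serviceAccount:{PROJECT_NUMBER}-compute@developer.gserviceaccount.com",
        "--role=roles/storage.objectViewer",
    ],
    check=True,
)
```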
gharchive/issue
2024-06-21T14:23:06
2025-04-01T06:37:02.199229
{ "authors": [ "eeaton", "prabhu34" ], "repo": "GoogleCloudPlatform/terraform-google-cloud-functions", "url": "https://github.com/GoogleCloudPlatform/terraform-google-cloud-functions/issues/129", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
1768146553
test: stricter regex for gke operation filter Since we were not matching the end of the string, this filter could return operations for multiple clusters. The stricter regex should result in operations for a unique cluster name. /assign @haiyanmeng
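A toy illustration of the difference, with made-up cluster names: an unanchored pattern matches any name that merely contains the prefix, while anchoring both ends restricts it to the one intended cluster.

```python
# Toy illustration with made-up cluster names; not the test's actual filter.
import re

names = ["cluster-1", "cluster-1-backup", "cluster-10"]

loose = re.compile(r"cluster-1")      # no end anchor: matches all three
strict = re.compile(r"^cluster-1$")   # anchored: matches only the intended cluster

print([n for n in names if loose.search(n)])   # ['cluster-1', 'cluster-1-backup', 'cluster-10']
print([n for n in names if strict.search(n)])  # ['cluster-1']
```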
gharchive/pull-request
2023-06-21T18:09:13
2025-04-01T06:37:02.218525
{ "authors": [ "sdowell" ], "repo": "GoogleContainerTools/kpt-config-sync", "url": "https://github.com/GoogleContainerTools/kpt-config-sync/pull/703", "license": "apache-2.0", "license_type": "permissive", "license_source": "bigquery" }
989441232
error: unknown command "cfg" for "kpt" Expected behavior I want to configure kpt setter values for the name, project, and location of a management cluster in Google Cloud. kpt cfg set -R . name "${MGMT_NAME}" kpt cfg set -R . gcloud.core.project "${MGMT_PROJECT}" kpt cfg set -R . location "${LOCATION}" Actual behavior I tried running the first line in bash, and got this in return: error: unknown command "cfg" for "kpt" Did you mean this? fn pkg Information I followed the installation instructions from kpt's website, using docker to install both kpt and kpt-gcloud. I'm running kpt version 1.0.0-beta.1 with kubectl version GitVersion:"v1.22.1" and kustomize/v4.3.0. I am following Kubeflow's tutorial on deploying a management cluster on Google Cloud. I understand that someone has posted about this exact issue 4 days ago, but the issue was closed without a resolution. I am a beginner with kpt/Kubeflow, so apologies if the resolution is obvious. Kubeflow's tutorial has a note saying: Note: kpt v1.0.0-beta.1 or above doesn’t work due to a known issue: https://github.com/kubeflow/pipelines/issues/6100. Please downgrade gcloud or install kpt separately https://github.com/GoogleContainerTools/kpt/releases/tag/v0.39.2 for now. So it looks like kubeflow hasn't migrated to kpt 1.0+ yet, so you will have to use the https://github.com/GoogleContainerTools/kpt/releases/tag/v0.39.2 release. @droot that resolved my issue; many thanks! Just goes to show that you can get so absorbed in solving an issue that you fail to see the obvious.... I'm adding some additional details below for any n00b like myself in a similar predicament: I had the kpt v1.0+ binary installed, so I removed it by running: sudo apt-get remove google-cloud-sdk-kpt Make sure to later run gcloud components update I reinstalled the v0.39 binary using the instructions from: https://cloud.google.com/service-mesh/docs/environment-setup Important: make sure to install kpt within the directory where you want it to execute. Within the context of Kubeflow's management setup tutorial, that would be within the gcp-blueprints folder: https://www.kubeflow.org/docs/distributions/gke/deploy/management-setup/ Verify that the install worked properly with kpt version The kpt cfg set command should now be working. As I understand it, this command is deprecated in the latest beta binary: https://kpt.dev/installation/migration?id=changes-to-setters
gharchive/issue
2021-09-06T21:19:49
2025-04-01T06:37:02.227146
{ "authors": [ "HP-Nunes", "droot" ], "repo": "GoogleContainerTools/kpt", "url": "https://github.com/GoogleContainerTools/kpt/issues/2481", "license": "apache-2.0", "license_type": "permissive", "license_source": "bigquery" }
1111989800
Fixed 'Revert new Nitro icon by setting item' Fixed the Nitro logo not being hidden and added back the old blurple for the text. Also changed the text to white when the tab is selected and made the text blurple when not selected. @CanadaHonk Please check this PR
gharchive/pull-request
2022-01-23T21:22:04
2025-04-01T06:37:02.257107
{ "authors": [ "Dabs-Rulez" ], "repo": "Goose-Nest/GT-RevertRebrand", "url": "https://github.com/Goose-Nest/GT-RevertRebrand/pull/76", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
2064609899
[Feature Request]: Breadcrumb component Is your feature request related to a problem? Please describe Currently in the Early Childhood Development System portal (Child Care Portal), there are a couple of different services (accessed through the sidenav) which feature breadcrumbing on their respective pages. Each breadcrumb pattern is different. We will soon want all the services in this portal to utilize a consistent breadcrumb component/pattern. Describe the solution you'd like Would like to see a GoA breadcrumb component which can go down to at least 5 to 6 page levels Works on desktop/tablet and mobile breakpoints Is fully accessible Accounts for position/padding in relation to page templates or the first item of content below the breadcrumb on a page All states for the active links and active page in the breadcrumb Some guidelines around breadcrumb label character length and recommended naming convention if possible (brevity and matching the page title and considerate of the url if possible) Provide evidence this is a needed component Currently 3 of 6 Child Care Services in the Early Childhood Development System portal utilize breadcrumbing and all use a different pattern treatment. All services in the portal will use breadcrumbing in early 2024. Consistency is key as this is a single portal for our external users. The Design System team is already exploring a breadcrumb component for their Design System website. https://www.figma.com/file/Vw1LNUBsNeFkToPTWqXAzR/Component---Breadcrumb?type=design&node-id=1401-3285&mode=design&t=7I1OkZM3WExILE2j-4 Describe alternatives you've considered A good article which may prove helpful when mapping out the component and its guidelines: https://www.smashingmagazine.com/2022/04/breadcrumbs-ux-design/ Do you have anything already created for this that we can use? yes Are you currently using this proposal inside your own service yes Are you able to assist to bring the feature to reality? yes Additional context Here are some live breadcrumb examples from the ECDS portal Hi @ArakTaiRoth do you need more context/info from me? Cheers Land Titles Office has a component designed and in development for this: Figma Design guidelines Development blueprint Service designer: @garnison-goa Development Developer: Daryl Chiew @Spark450 Gather the relevant info and convert into a design issue This needs to have a complete design definition before it can be worked on in development, see Jira Issue to follow its progress. A new issue will be created once the design work has been completed.
gharchive/issue
2024-01-03T20:31:06
2025-04-01T06:37:02.312299
{ "authors": [ "ArakTaiRoth", "MikeS700", "Spark450", "twjeffery" ], "repo": "GovAlta/ui-components", "url": "https://github.com/GovAlta/ui-components/issues/1548", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
2001956161
Add a button to return to the request list from the custom signature Add a button to return to the request list from the custom signature. This has been reviewed, and in the custom signature it is possible to return to the list at every step. A button to return to the request list is missing from the flow template list. Another issue will be created for this.
gharchive/issue
2023-11-20T11:17:09
2025-04-01T06:37:02.315974
{ "authors": [ "ptrias-fundaciobit" ], "repo": "GovernIB/enviafib", "url": "https://github.com/GovernIB/enviafib/issues/360", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
2077572063
🛑 Grammle auf Codeberg is down In 5765d81, Grammle auf Codeberg (https://codeberg.org/Grammle/Grammle) was down: HTTP code: 0 Response time: 0 ms Resolved: Grammle auf Codeberg is back up in 6e11e66 after 4 hours, 7 minutes.
gharchive/issue
2024-01-11T20:56:52
2025-04-01T06:37:02.361624
{ "authors": [ "realpixelcode" ], "repo": "Grammle/status", "url": "https://github.com/Grammle/status/issues/329", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1612907815
java: Add option to include method modifiers Description Added an option for java profiling: --java-include-method-modifiers, disabled by default. When the flag is provided we use jvmti->GetMethodModifiers in async-profiler to get method modifiers. Related Issue https://github.com/Granulate/gprofiler/issues/570 How Has This Been Tested? Added a test that covers this case. Needs this: https://github.com/Granulate/async-profiler/pull/5 to be merged first, and then a change needs to be made in async_profiler_shared_build.sh Can update the commit, and we're good :) Pushed tag v2.9g6 Looks good, I will merge once tests pass.
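For context, an opt-in flag like this is typically plumbed through as a plain boolean option. The sketch below only takes the flag name from the PR description; everything else is illustrative rather than gprofiler's actual code.

```python
# Illustrative wiring of an opt-in flag; only the flag name comes from the PR.
import argparse

parser = argparse.ArgumentParser()
parser.add_argument(
    "--java-include-method-modifiers",
    action="store_true",
    default=False,  # disabled by default, as described above
    help="Report Java method modifiers (via JVMTI GetMethodModifiers).",
)
args = parser.parse_args(["--java-include-method-modifiers"])

# Only ask async-profiler for modifiers when the user opted in.
if args.java_include_method_modifiers:
    print("forwarding the modifiers option to async-profiler")
```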
gharchive/pull-request
2023-03-07T07:47:14
2025-04-01T06:37:02.365189
{ "authors": [ "Jongy", "mpozniak95" ], "repo": "Granulate/gprofiler", "url": "https://github.com/Granulate/gprofiler/pull/712", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
2480105552
Interface is still pending, but the display is incorrect https://github.com/GraphScope/portal/pull/481
gharchive/issue
2024-08-22T08:00:30
2025-04-01T06:37:02.366380
{ "authors": [ "pomelo-nwu", "totoago" ], "repo": "GraphScope/portal", "url": "https://github.com/GraphScope/portal/issues/479", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
72100388
Updated README.md to add in authentication This needs to be reflected on the gh-pages too Thanks for the pull request
gharchive/pull-request
2015-04-30T08:36:00
2025-04-01T06:37:02.370249
{ "authors": [ "kbastani", "rbramley" ], "repo": "Graphify/graphify", "url": "https://github.com/Graphify/graphify/pull/21", "license": "apache-2.0", "license_type": "permissive", "license_source": "bigquery" }
1229131981
Snapping system is not precise From here: It's hard to tell, but I think the handle points, when being dragged and snapped to things, are snapping a few pixels vertically offset from true. Try dragging a handle point and snapping it to the bottom of a neighboring shape and notice how it's below the bottom. This might also be the case for anchor points, but instead of vertically it's some other sort of offset. Try out the snapping of anchor and control points to other shapes then zoom in to see if they are precisely located in the same exact location (down to many decimal places before floating point error comes into play). Reply by @0HyperCube about this: Yes, if you drag a point with the path tool, it will snap your mouse position and then add the offset dictated by how far your mouse was originally away from the centre of the point. Also would cause more merge conflicts for the path refactor. So I'm marking this as blocked on #605. This has been fixed as the path tool now puts the centre of the square at the mouse position. @0HyperCube do you know which PR or commit fixed this, for the record?
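A one-dimensional toy model of the behaviour described in the reply, with made-up numbers: re-applying the grab offset after snapping leaves the point off the snap target, whereas treating the point centre as the thing being snapped lands exactly on it.

```python
# One-dimensional toy model of the described behaviour; numbers are made up.
def snap(value, grid=10):
    return round(value / grid) * grid

point_center = 23                 # where the dragged point actually is
mouse = 26                        # cursor grabbed the point slightly off-centre
grab_offset = mouse - point_center

old_result = snap(mouse) + grab_offset  # snap the mouse, re-add the offset -> 33, off target
new_result = snap(mouse)                # point centre follows the mouse -> 30, on target

print(old_result, new_result)
```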
gharchive/issue
2022-05-09T03:43:15
2025-04-01T06:37:02.372981
{ "authors": [ "0HyperCube", "Keavon" ], "repo": "GraphiteEditor/Graphite", "url": "https://github.com/GraphiteEditor/Graphite/issues/630", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
1236429454
Viewport zoom/rotation while snapping doesn't show the snapped values In the viewport, the top right number inputs display the current precise values of the ongoing zoom/rotation, but they should show the snapped value which is what the viewport is rendering. I'll try to solve this.
gharchive/issue
2022-05-15T21:53:23
2025-04-01T06:37:02.373962
{ "authors": [ "Keavon", "RahulHi" ], "repo": "GraphiteEditor/Graphite", "url": "https://github.com/GraphiteEditor/Graphite/issues/643", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
1353381736
Rotating object via property panel eliminates shear I think the best way to prioritize this is to wait until we have the node-based transform system using the Transform node. Probably isn't worth the expendable effort of a temporary fix when that won't matter after we have nodes. Upon further thought, maybe this can be fixed without that much effort and it would probably be a quality of life improvement for users who run into this. Do you have any thoughts about this and its prioritization @0HyperCube? Upon further thought, maybe this can be fixed without that much effort and it would probably be a quality of life improvement for users who run into this. Do you have any thoughts about this and its prioritization @0HyperCube? This would probably be somewhat ugly because of how skew combines with rotation and scale in the affine2.
gharchive/issue
2022-08-28T15:09:05
2025-04-01T06:37:02.376336
{ "authors": [ "0HyperCube", "Keavon", "dchiasson" ], "repo": "GraphiteEditor/Graphite", "url": "https://github.com/GraphiteEditor/Graphite/issues/770", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
1956170555
[Feature Request] Automatically Sync It would be nice if Cultivation could automatically sync with data from official Genshin servers, or at least showcased servers so maybe we could try out builds with new characters that we want to pull for Cultivation is just a launcher, it sounds like you're asking for something related to the server-side like Enka plugin (public showcase copying). There also is absolutely no way to auto sync official data as neither Grasscutter nor Cultivation have such access to official databases or data. Using InventoryKamera or other 3rd party scanning tools on your own PC, or Enka GC plugin for public showcase only is the closest you will get. Either way, that's something to do be done on your server-side (Grasscutter), not on Cultivation which is just a launcher.
gharchive/issue
2023-10-23T01:22:47
2025-04-01T06:37:02.379248
{ "authors": [ "NotThorny", "a-w-a-y" ], "repo": "Grasscutters/Cultivation", "url": "https://github.com/Grasscutters/Cultivation/issues/212", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
116337727
Add visible reference to GitHub repository Add a reference to the GitHub repository so that all users can see it directly (maybe with the help of a bottom fixed div) in an unobtrusive way. But don't use the "Fork me on GitHub" banner. +1, maybe add footer text (copyright, made with <3 by etc. :D) to this fixed div on the bottom of the layout and add a link to the github repo there. A fixed bottom div is a great idea because it won't be buried down there under all the posts :+1:
gharchive/issue
2015-11-11T13:41:59
2025-04-01T06:37:02.381865
{ "authors": [ "michaltakac", "pmuens" ], "repo": "GravityProject/gravity", "url": "https://github.com/GravityProject/gravity/issues/44", "license": "mit", "license_type": "permissive", "license_source": "bigquery" }
2742387866
🛑 Server Gaming is down In eb8c742, Server Gaming (https://game.greathost.ro) was down: HTTP code: 0 Response time: 0 ms Resolved: Server Gaming is back up in 849e391 after 3 hours, 15 minutes.
gharchive/issue
2024-12-16T13:23:37
2025-04-01T06:37:02.449585
{ "authors": [ "GreathostRo" ], "repo": "GreathostRo/upptime", "url": "https://github.com/GreathostRo/upptime/issues/4340", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1874409077
🛑 Server Gaming is down In b1d6b4f, Server Gaming (https://game.greathost.ro) was down: HTTP code: 403 Response time: 1501 ms Resolved: Server Gaming is back up in a8c06e6 after 9 minutes.
gharchive/issue
2023-08-30T21:42:08
2025-04-01T06:37:02.451930
{ "authors": [ "GreathostRo" ], "repo": "GreathostRo/upptime", "url": "https://github.com/GreathostRo/upptime/issues/558", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
2154957148
JOSS publication Submit CATS to the Journal of Open Source Software Brief update on this - I now have a very rough first draft (on google docs). I'll ask @sadielbartholomew and/or @Llannelongue to take a look before I migrate it to the repository (and to markdown) in a couple of weeks. In drafting, I've found a couple of issues / potential things we could improve. I'll open new issues for these. Oh, and we should also check the JOSS requirement checklist for anything else we need to do to the codebase.
gharchive/issue
2024-02-26T19:31:00
2025-04-01T06:37:02.470020
{ "authors": [ "andreww", "colinsauze" ], "repo": "GreenScheduler/cats", "url": "https://github.com/GreenScheduler/cats/issues/69", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1793680963
Update project structure The promised PR focusing on project structure, I've copied the figure below from the previous PR (the class names may be obsolete as I haven't changed any functions here but the target structure remains the same). Overall quite light changes as some improvements have now been merged into main already :) List of changes: Have a dedicated file to check the validity of all the arguments (more checks to be added in the future). Moved validation of --duration and --jobinfo to this file. The carbon intensity API calls is a separate step as part of main(). Starttime optimiser is also a separate step (rather than API and starttime being intertwined) Some other light renaming of functions/files as needed (looking at the files changed, git seems to have lost track of timeseries_conversion.py, but it was just renamed optimise_starttime.py, same content otherwise). As usual, thoughts/suggestions welcome! Thanks for the speedy review @tlestang! Looks good to me. I wonder if that diagram needs to make it into the documentation somewhere? Thanks @andreww and @tlestang for the reviews! I'll merge it into main now, and further edits can have their own PRs @Llannelongue Not a big deal, but next time we can squash those typos and fixup commits, and group related changes together. For big PRs like this it can really help keeping a readable history.
gharchive/pull-request
2023-07-07T14:49:49
2025-04-01T06:37:02.474350
{ "authors": [ "Llannelongue", "andreww", "tlestang" ], "repo": "GreenScheduler/cats", "url": "https://github.com/GreenScheduler/cats/pull/49", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
346124572
question: pagesize dynamic (rest of viewport) This is a question I could not find in the docs: I basically want Griddle to use the remaining screen that it can use, so that it will always display as many rows as it can. My page has a menu on top, so griddle can use viewport height minus header pixels. Is this a stupid question? I believe if there is lots of data to be displayed, then I want to show the maximum that I can to the user. Thanks! This very much depends on how your data is available, but I would look at combining LocalPlugin and PositionPlugin to manage a fixed-height viewport.
gharchive/issue
2018-07-31T10:19:11
2025-04-01T06:37:02.554480
{ "authors": [ "awb99", "dahlbyk" ], "repo": "GriddleGriddle/Griddle", "url": "https://github.com/GriddleGriddle/Griddle/issues/825", "license": "mit", "license_type": "permissive", "license_source": "bigquery" }
103535051
Include tests Downstream projects like Linux distributions like to test the packages during installation. Please include the tests needed for the test suite in your releases on PyPI. @jlec Uploaded version 1.4.1 to PyPI, do you think that will be enough for testing? If you need any other files just ping. Cool, that is perfectly fine.
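For illustration, one common way to get the test suite into the sdist uploaded to PyPI looks roughly like the following; the package name and paths assume a conventional tests/ layout and are not taken from this project's actual packaging config.

```python
# Sketch of shipping tests in the source release; names and paths are
# assumptions based on a conventional tests/ layout, not this project's setup.
from setuptools import setup, find_packages

setup(
    name="example-package",
    version="0.0.1",
    packages=find_packages(),
)

# Alongside setup.py, a MANIFEST.in such as:
#   include tests/*.py
#   recursive-include tests/files *.yaml *.json
# makes `python setup.py sdist` pack the tests and their fixtures, so that
# downstream packagers can run the suite at build/install time.
```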
gharchive/issue
2015-08-27T15:24:55
2025-04-01T06:37:02.568902
{ "authors": [ "Grokzen", "jlec" ], "repo": "Grokzen/pykwalify", "url": "https://github.com/Grokzen/pykwalify/issues/20", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
799700252
Shield won't build with Crystal 0.36.1 Shield won't build with the new Crystal release. This new version appears to detect duplicate named arguments better than before, OR duplicate named arguments are happening where they did not before. They appear to be duplicates in attribute lists that are passed to operations. You should be able to see the problems as soon as you attempt to build. You must be on Lucky master. Shield will support Crystal v0.36 when it is supported by Lucky, hopefully in the next version. I intend to begin work on this soon.
gharchive/issue
2021-02-02T21:22:10
2025-04-01T06:37:02.570963
{ "authors": [ "BrucePerens", "akadusei" ], "repo": "GrottoPress/shield", "url": "https://github.com/GrottoPress/shield/issues/34", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
296620610
IndexOutOfRangeException: Array index is out of range. at PlayerStats.HurtPlayer (HitInfo info, UnityEngine.GameObject go) [0x00000] in :0 IndexOutOfRangeException: Array index is out of range. at PlayerStats.HurtPlayer (HitInfo info, UnityEngine.GameObject go) [0x00000] in <filename unknown>:0 at PlayerStats.Explode () [0x00000] in <filename unknown>:0 at AlphaWarheadDetonationController.ExplodePlayers () [0x00000] in <filename unknown>:0 at AlphaWarheadDetonationController.Explode () [0x00000] in <filename unknown>:0 at AlphaWarheadDetonationController.FixedUpdate () [0x00000] in <filename unknown>:0 == STACKTRACK STATS == Times seen: 1 Last reported: 2018-02-13 05:43:40.420612 Fixed in new game version
gharchive/issue
2018-02-13T05:43:40
2025-04-01T06:37:02.576732
{ "authors": [ "Grover-c13", "StacktrackSubmission" ], "repo": "Grover-c13/MultiAdmin", "url": "https://github.com/Grover-c13/MultiAdmin/issues/60", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
2313889675
[Request]: Turn off Single Page Application mode What feature would you like to add? Right now spa mode is on by default, and I don't see any way to turn it off. I've tried KnowledgeBasePanel::configureUsing( fn (KnowledgeBasePanel $panel) => $panel->spa(false)); But that doesn't work. The problem is that when one panel has spa mode and others don't, you get weird behavior when going back and forth. When going from the knowledge base to another panel without spa mode on, the first time you navigate to the other panel, the global search is still the KB search until there is another page refresh. Notes No response Fixed in #24
gharchive/issue
2024-05-23T21:34:33
2025-04-01T06:37:02.608630
{ "authors": [ "iAmKevinMcKee", "lukas-frey" ], "repo": "GuavaCZ/filament-knowledge-base", "url": "https://github.com/GuavaCZ/filament-knowledge-base/issues/23", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
2119487267
sticky menu Feedback vielfältig: The sticky menus should not overlap the content – that means we avoid this by making the content width smaller in all landscape modes (and maybe for tablet portrait mode) For mobile here is the suggestion: Sticky menus are only visible at the top, and are not fixed or fade out while scrolling (I quickly tried: https://xd.adobe.com/view/770b8a67-668a-40e0-93f5-65f260cbb802-2258/) What about the hamburger menu? In one slide it is on top and in another it is on the bottom? The sticky menus should not overlap the content – that means we avoid this by making the content width smaller in all landscape modes (and maybe for tablet portrait mode) this means there is padding on the right? What about the hamburger menu? In one slide it is on top and in another it is on the bottom? I don't understand exactly, but if you mean for mobile, if you mean the sticky menu I thought of switching them on mobile: Auf's Ohr and Jobs to the top and the hamburger to the bottom, maybe we first have to ask the client before getting active. The sticky menus should not overlap the content – that means we avoid this by making the content width smaller in all landscape modes (and maybe for tablet portrait mode) this means there is padding on the right? The padding left and right should be the same, but for some screens they can be wider. (Could the size refer to the sticky elements? But how does this work on mobile? Because then there will be a huge space on the left and right? I'm not understanding 100% ok and the switching I don't do because it is still not sure? But how does this work on mobile? Because then there will be a huge space on the left and right? I'm not understanding 100% ok and the switching I don't do because it is still not sure? For mobile, the solution is to accept the overlapping and fade out the sticky header after scrolling / not fix them I think this is not possible. We would have to build it from the start as being in the middle of the content. I can put it stationary somewhere, but this still means it will be on top of the content and it is even worse because it doesn't move. https://github.com/GuidaGG/vielfaeltig/assets/9657908/9a23e6ff-534d-47e3-bfd7-974a4f176791 At least when it is fixed, it sometimes is on top of the content but when we scroll it goes away. This might be good, but we have to give the sticky menus a nice position in the beginning for mobile. And: they committed to the option with the menu at the bottom. What about keeping them sticky but on top, and giving a top padding to the content on mobile so we can be sure people can always see the content? Even then it is hard to make it work very well because there could be 1, 2 or 3 sticky buttons. Otherwise I have to wait until I have an idea of how to implement that the content makes space for the sticky buttons. Otherwise I have to wait until I have an idea of how to implement that the content makes space for the sticky buttons. I don't understand, will this reduce the height of the main content? The hamburger at the bottom and the sticky are not done yet, will you still edit this?
gharchive/issue
2024-02-05T21:28:42
2025-04-01T06:37:02.633462
{ "authors": [ "GuidaGG", "ah-bas-und-aer" ], "repo": "GuidaGG/vielfaeltig", "url": "https://github.com/GuidaGG/vielfaeltig/issues/26", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
2016057789
Logging and debug mode It would be good to have the ability to output logs to Serial in order to catch connection problems. Yesterday I spent the whole evening trying to figure out the bot and still couldn't =( and it's not clear where exactly the problem is or at what moment it occurs. (The wifi definitely connects and is reachable.) tick() returns the connection status
gharchive/issue
2023-11-29T08:23:28
2025-04-01T06:37:02.666610
{ "authors": [ "FedorTheUnkle", "GyverLibs" ], "repo": "GyverLibs/FastBot", "url": "https://github.com/GyverLibs/FastBot/issues/60", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
2269484656
Problem with the MAX7219.clear() method Problem description After using mtrx.print(), calling mtrx.clear() does not reset the print position; printing continues at subsequent positions until the text goes beyond the displayable area of the LED matrix. Steps to reproduce Use mtrx.print() to print text. Call mtrx.clear() to clear the contents of the LED matrix. Use mtrx.print() again to print text. Expected behavior After calling mtrx.clear(), the print position is expected to be reset to the initial position on the LED matrix. Actual behavior The print position is not reset after calling mtrx.clear(); printing continues at subsequent positions, which causes the text to go beyond the displayable area of the LED matrix. Additional information Library version used: v1.5 Hardware used: ESP8266 Example code: #include <MAX7219.h> MAX7219<8, 2, D2, D1, D3> mtrx; // Using the MAX7219 template declaration void setup() { mtrx.begin(); // Initialize the display } void loop() { mtrx.clear(); // Clear the display contents mtrx.print("Привет, мир!"); // Print new text delay(1000); // Pause for one second } Why would clear be expected to reset the cursor? Then how can I reset the cursor position? void setCursor(int x, int y); thank you
gharchive/issue
2024-04-29T16:39:08
2025-04-01T06:37:02.672007
{ "authors": [ "GyverLibs", "Sci-fiBrain" ], "repo": "GyverLibs/GyverMAX7219", "url": "https://github.com/GyverLibs/GyverMAX7219/issues/5", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1966243078
⚠️ Homepage has degraded performance In d306779, Homepage (https://h-edu.cz) experienced degraded performance: HTTP code: 200 Response time: 486 ms Resolved: Homepage performance has improved in 8fb9dbe after 29 minutes.
gharchive/issue
2023-10-27T22:02:08
2025-04-01T06:37:02.675418
{ "authors": [ "MilanLempera" ], "repo": "H-edu-dev/upptime", "url": "https://github.com/H-edu-dev/upptime/issues/68", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
175836234
Chips: why are value entries objects with "text" key ...and not simply strings? @zemirco A single chip can have an icon as well. See https://github.com/HBM/react-components/pull/41 for an example.
gharchive/issue
2016-09-08T19:04:29
2025-04-01T06:37:02.718178
{ "authors": [ "lipp", "zemirco" ], "repo": "HBM/react-components", "url": "https://github.com/HBM/react-components/issues/39", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1641059943
🛑 SkipTheCommericals is down In a1fdf88, SkipTheCommericals ($STC) was down: HTTP code: 0 Response time: 0 ms Resolved: SkipTheCommericals is back up in 1181c0f.
gharchive/issue
2023-03-26T19:54:47
2025-04-01T06:37:02.736547
{ "authors": [ "HDVinnie" ], "repo": "HDVinnie/TrackerHub", "url": "https://github.com/HDVinnie/TrackerHub/issues/15244", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1644949425
🛑 SkipTheCommericals is down In 7436836, SkipTheCommericals ($STC) was down: HTTP code: 0 Response time: 0 ms Resolved: SkipTheCommericals is back up in f73c7d4.
gharchive/issue
2023-03-29T02:56:05
2025-04-01T06:37:02.738676
{ "authors": [ "HDVinnie" ], "repo": "HDVinnie/TrackerHub", "url": "https://github.com/HDVinnie/TrackerHub/issues/15522", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1680110947
🛑 SkipTheTrailers is down In 09d86d0, SkipTheTrailers ($STT) was down: HTTP code: 0 Response time: 0 ms Resolved: SkipTheTrailers is back up in 1b01595.
gharchive/issue
2023-04-23T17:54:11
2025-04-01T06:37:02.740756
{ "authors": [ "HDVinnie" ], "repo": "HDVinnie/TrackerHub", "url": "https://github.com/HDVinnie/TrackerHub/issues/19209", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1369097373
⚠️ PrivateHD has degraded performance In c67d66f, PrivateHD ($PHD) experienced degraded performance: HTTP code: 200 Response time: 3036 ms Resolved: PrivateHD performance has improved in 3adf634.
gharchive/issue
2022-09-11T23:29:14
2025-04-01T06:37:02.742855
{ "authors": [ "HDVinnie" ], "repo": "HDVinnie/TrackerHub", "url": "https://github.com/HDVinnie/TrackerHub/issues/1997", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1689693519
🛑 SkipTheTrailers is down In 35b7a43, SkipTheTrailers ($STT) was down: HTTP code: 0 Response time: 0 ms Resolved: SkipTheTrailers is back up in 0fa3834.
gharchive/issue
2023-04-29T21:29:53
2025-04-01T06:37:02.744974
{ "authors": [ "HDVinnie" ], "repo": "HDVinnie/TrackerHub", "url": "https://github.com/HDVinnie/TrackerHub/issues/20020", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1294420994
⚠️ SkipTheCommericals has degraded performance In 3d19e34, SkipTheCommericals ($STC) experienced degraded performance: HTTP code: 200 Response time: 2493 ms Resolved: SkipTheCommericals performance has improved in b270460.
gharchive/issue
2022-07-05T15:00:17
2025-04-01T06:37:02.747321
{ "authors": [ "HDVinnie" ], "repo": "HDVinnie/TrackerHub", "url": "https://github.com/HDVinnie/TrackerHub/issues/215", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
2067960634
⚠️ TorrentLeech has degraded performance In abda01d, TorrentLeech ($TL) experienced degraded performance: HTTP code: 200 Response time: 5970 ms Resolved: TorrentLeech performance has improved in cd2f841 after 8 minutes.
gharchive/issue
2024-01-05T20:00:20
2025-04-01T06:37:02.749437
{ "authors": [ "HDVinnie" ], "repo": "HDVinnie/TrackerHub", "url": "https://github.com/HDVinnie/TrackerHub/issues/25881", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
2317714996
⚠️ Empornium has degraded performance In 0e6af8b, Empornium ($EMP) experienced degraded performance: HTTP code: 500 Response time: 1285 ms Resolved: Empornium performance has improved in 9a59654 after 40 minutes.
gharchive/issue
2024-05-26T13:29:48
2025-04-01T06:37:02.751897
{ "authors": [ "HDVinnie" ], "repo": "HDVinnie/TrackerHub", "url": "https://github.com/HDVinnie/TrackerHub/issues/27982", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
2433813994
⚠️ HawkeUno has degraded performance In 67e0e8b, HawkeUno ($UNO) experienced degraded performance: HTTP code: 403 Response time: 1342 ms Resolved: HawkeUno performance has improved in 310f691 after 6 minutes.
gharchive/issue
2024-07-28T08:24:23
2025-04-01T06:37:02.754026
{ "authors": [ "HDVinnie" ], "repo": "HDVinnie/TrackerHub", "url": "https://github.com/HDVinnie/TrackerHub/issues/28731", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1533542527
🛑 SkipTheTrailers is down In b1ddcb7, SkipTheTrailers ($STT) was down: HTTP code: 0 Response time: 0 ms Resolved: SkipTheTrailers is back up in 45a43f0.
gharchive/issue
2023-01-14T23:03:08
2025-04-01T06:37:02.756233
{ "authors": [ "HDVinnie" ], "repo": "HDVinnie/TrackerHub", "url": "https://github.com/HDVinnie/TrackerHub/issues/5712", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1535544112
🛑 Telly is down In 7c42348, Telly ($TLY) was down: HTTP code: 0 Response time: 0 ms Resolved: Telly is back up in 8811e0f.
gharchive/issue
2023-01-16T22:11:54
2025-04-01T06:37:02.758490
{ "authors": [ "HDVinnie" ], "repo": "HDVinnie/TrackerHub", "url": "https://github.com/HDVinnie/TrackerHub/issues/5984", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1536592590
⚠️ SceneTime has degraded performance In 54047b5, SceneTime ($ST) experienced degraded performance: HTTP code: 200 Response time: 3130 ms Resolved: SceneTime performance has improved in 8798c66.
gharchive/issue
2023-01-17T15:36:42
2025-04-01T06:37:02.760588
{ "authors": [ "HDVinnie" ], "repo": "HDVinnie/TrackerHub", "url": "https://github.com/HDVinnie/TrackerHub/issues/6081", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1537082081
🛑 SkipTheTrailers is down In 19e6d63, SkipTheTrailers ($STT) was down: HTTP code: 0 Response time: 0 ms Resolved: SkipTheTrailers is back up in f6abf55.
gharchive/issue
2023-01-17T21:52:30
2025-04-01T06:37:02.762937
{ "authors": [ "HDVinnie" ], "repo": "HDVinnie/TrackerHub", "url": "https://github.com/HDVinnie/TrackerHub/issues/6126", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1571112412
🛑 SkipTheCommericals is down In 8de3077, SkipTheCommericals ($STC) was down: HTTP code: 0 Response time: 0 ms Resolved: SkipTheCommericals is back up in 9a3cd34.
gharchive/issue
2023-02-04T20:27:12
2025-04-01T06:37:02.764969
{ "authors": [ "HDVinnie" ], "repo": "HDVinnie/TrackerHub", "url": "https://github.com/HDVinnie/TrackerHub/issues/8678", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
404612122
Multiple variable derivation using autograd How to find a Jacobian using autograd for multiple variable. For example: R1 = x(1)*y(1) + 2y(2)x(2) R2 = -y(1) + y(2)(2)**2 R1 and R2 is a transpose matrix And I need to find [ [dR1/dx1 dR2/dx1], [dR1/dx2 dR2/dx2] ] Does Example 4 here help? @jermwatt Yes, it did give me a better concept. Thank you so much. Just a quick question. The code is so far working for this: f = lambda x,y: np.array([y[0]*x[0]**2, y[1]*x[0]+x[1]]) f_jac = jacobian(f, argnum = 0) But can you suggest me how to get the output for the jacobian: j = f_jac(np.array([[1.,1.], [2.,2.]])) Thanks. I was having the same problem. The above link helped alot. I tried to look at the tests for autograd.jacobian for hints about how to do this. Maybe one could add something similar the below? import autograd.numpy as np # Thinly-wrapped version of Numpy import autograd A = np.array([[1., 2.], [3., 4.]]) def f(x, y): """ Return vector function: f(x, y) = /1 2\ /x\ = /1x + 2y\ \3 4/ \y/ \3x + 4y/ """ return np.tensordot(A, np.array([x, y]), [[1], [0]]) import numpy as np0 # wrapped numpy does not have testing methods x0 = 1; y0 = 1 np0.testing.assert_array_equal(f(x0, y0), np.array([1*x0+2*y0, 3*x0 + 4*y0])) J_autograd = np.array([autograd.jacobian(f, k)(0., 0.) for k in [0, 1]]) np0.testing.assert_array_equal(J_autograd, A.T) In the doc string for jacobian: Should x (in the code) be argnum (mentioned in the doc string)? The code does not reference argnum (?) Or def f(*vector): """ Return vector function: f(x, y) = /1 2\ /x\ = /1x + 2y\ \3 4/ \y/ \3x + 4y/ """ return np.tensordot(A, np.array(vector), [[1], [0]])
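Picking up the snippet from the question: f_jac takes the same two arguments as f rather than a single stacked array, so the earlier call can be rewritten as below. The sample values are arbitrary.

```python
# Following on from the snippet in the question: f_jac takes the same two
# arguments as f, not one stacked array. Sample values are arbitrary.
import autograd.numpy as np
from autograd import jacobian

f = lambda x, y: np.array([y[0] * x[0] ** 2, y[1] * x[0] + x[1]])
f_jac = jacobian(f, argnum=0)     # derivative with respect to x

x0 = np.array([1.0, 1.0])
y0 = np.array([2.0, 2.0])

J = f_jac(x0, y0)                 # shape (2, 2): J[i, j] = d f_i / d x_j
print(J)
# [[4. 0.]
#  [2. 1.]]
```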
gharchive/issue
2019-01-30T05:46:30
2025-04-01T06:37:02.814923
{ "authors": [ "jermwatt", "matiasdahl", "subhanjan21" ], "repo": "HIPS/autograd", "url": "https://github.com/HIPS/autograd/issues/470", "license": "mit", "license_type": "permissive", "license_source": "bigquery" }
1014425708
Other resources Language Period Type Link lat 16c MSS https://zenodo.org/record/4780947 Done !
gharchive/issue
2021-10-03T15:07:24
2025-04-01T06:37:02.875983
{ "authors": [ "PonteIneptique" ], "repo": "HTR-United/htr-united", "url": "https://github.com/HTR-United/htr-united/issues/29", "license": "CC0-1.0", "license_type": "permissive", "license_source": "github-api" }
1281638875
Preflight Checklist [X] I agree to follow the Code of Conduct that this project adheres to. [X] I have searched the issue tracker for an issue that matches the one I want to file, without success. [X] I am not looking for support or already pursued the available support channels without success. Version now Installation Type Other (specify below) Service Name DongTai-WebAPI Describe the details of the bug and the steps to reproduce it The metric data reported in the Agent list is inconsistent with the server; please confirm whether the data source is wrong. Additional Information No response Logs No response Fixed in the newest version
gharchive/issue
2022-06-23T03:30:07
2025-04-01T06:37:02.882147
{ "authors": [ "Bidaya0", "songzhibin97" ], "repo": "HXSecurity/DongTai", "url": "https://github.com/HXSecurity/DongTai/issues/722", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
1435389257
Verusca : Behavior, Strategy and Implementation, 3 Weeks Learning Objectives
Priorities: 🥚, 🐣, 🐥, 🐔 (click to learn more) There is a lot to learn in this repository. If you can't master all the material at once, that's expected! Anything you don't master now will always be waiting for you to review when you need it. These 4 emojis will help you prioritize your study time and measure your progress: 🥚: Understanding this material is required; it covers the base skills you'll need to move on. You do not need to finish all of them but should feel comfortable that you could with enough time. 🐣: You have started all of these exercises and feel you could complete them all if you just had more time. It may not be easy for you but with effort, you can make it through. 🐥: You have studied the examples and started some exercises if you had time. You should have a big-picture understanding of these concepts/skills, but may not be confident completing the exercises. 🐔: These concepts or skills are not necessary but are related to this module. If you are finished with 🥚, 🐣 and 🐥 you can use the 🐔 exercises to push yourself without getting distracted from the module's main objectives.
1. Remix Practice studying and remixing other people's solutions to coding challenges. Create your own solutions by mixing and matching pieces from other people's code. 🐣 Reconstructing: You can reconstruct a variety of solutions to the same challenge when they are presented as Parsons problems. 🐣 Analyzing: You can analyze a function written at your level. This includes: [ ] Behavior: Write documentation, test cases and use cases to describe the function's behaviour. [ ] Strategy: Describe the function's strategy using plain English. [ ] Implementation: List the language features in a function and explain how each is used. [ ] Small Changes: You can think of 2+ changes to the function's implementation that would not change its strategy. 🐣 Remixing: You can analyze several solutions to the same challenge, then ... [ ] Write: Your own solution by remixing the ones you studied. [ ] Explain: How the other solutions inspired yours; what ideas did you take from them? what ideas did you not take? [ ] Analyze: Complete a write-up of your own solution as though someone else wrote it.
2. Write 🐣 Function Design: [ ] Writing Tests: Given a working function, you can write passing test cases to describe its behavior. [ ] Writing Functions: You can design multiple solutions to the same code challenge, keeping notes about different experiments you tried along the way. [ ] 🐣 Generating Documentation: You can write a JSDoc comment for your solutions and run a script to generate markdown documentation. [ ] 🐣 Fuzz Testing: You write solutions that pass randomly generated test cases. 🐥 Test Driven Development: You can solve open-ended, ambiguous coding challenges at your level: [ ] Reading Docs: You can understand what the function is supposed to do by reading its JSDoc description. [ ] Writing Tests: You can write test cases before there is a function to test. [ ] Writing Functions: You can write one function that passes the test cases you have prepared (even if it's just 1 test case!). [ ] Refactoring: You can improve your function's implementation without failing any test cases that were passing. [ ] Iterative Development: You can repeat the TDD process until you are satisfied with your test cases and solution. [ ] 🐔 Code Golf: Write your solutions with the fewest characters possible! This won't help you write readable code, but it will make you think deeply about JS, your strategy and implementation.
3. Review 🐣 Continuous Integration: You can check your code's quality before pushing so your CI checks all pass. [ ] Formatting [ ] Linting [ ] Testing [ ] 🐣 Code Review: You can use a checklist to give a thorough, positive and constructive review of your classmates' solutions. [ ] 🐔 Code Coverage: You can explain what code coverage is, why it's important, and can write unit tests with 100% code coverage.
Behavior, Strategy, and Implementation - week 1 [x] I have pushed my progress to my fork of exercises repo - template write-up I Need Help With: I solved an exercise but I couldn't go backward after the final submission on Edabit. I re-wrote the solution in my VS Code but I couldn't check it on Edabit to be sure because it moved to the previous exercise. After solving an exercise on Edabit, is it possible to go backwards to copy the solution or see previous work done? General remarks I'm getting to understand a bit more about JS from this module. Sunday Prep Work I am working on the exercises on Slack to be completed by Sunday
Have you tried just opening the Edabit challenge through the provided link again and going straight to the Code tab? If I do that, I can submit a solution again, even though I finished the challenge before. Thank you, it worked.
Behavior, Strategy, and Implementation - week 1 [x] I have pushed my progress to my fork of exercises repo - template write-up I Need Help With: Nothing for now General remarks I'm getting to understand a bit more about JS from this module. Sunday Prep Work I am working on the easy exercises
gharchive/issue
2022-11-04T00:07:30
2025-04-01T06:37:02.902700
{ "authors": [ "Verousca", "vkoldus" ], "repo": "HYF-Class19/home", "url": "https://github.com/HYF-Class19/home/issues/295", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
2000713442
🛑 ConUHacks Registration is down In a09c4bd, ConUHacks Registration (https://register.conuhacks.io) was down: HTTP code: 0 Response time: 0 ms Resolved: ConUHacks Registration is back up in bbde6ec after 3 hours, 1 minute.
gharchive/issue
2023-11-19T06:07:24
2025-04-01T06:37:02.924901
{ "authors": [ "DucNgn" ], "repo": "HackConcordia/HC-Upptime", "url": "https://github.com/HackConcordia/HC-Upptime/issues/110", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
711907871
getting last message of recent channels-backend Description Get the last message and the title of the channels from the recent channels for a specific user that is defined in the endpoint. Fixes #4 How to test? Run "npm run server" and go to "http://localhost:5000/api/channels-message/2" Checklist [x] I have performed a self-review of my own code [x] I have commented my code, particularly in hard-to-understand areas [x] I have made corresponding changes to the documentation [x] This PR is ready to be merged and does not break any other functionality Hi Benjamin, thank you for your feedback. Yes, you are right, I need to have a better overview of my classmates' work before I start coding! I will try to use the API that you mentioned and complete my task with it.
gharchive/pull-request
2020-09-30T12:34:43
2025-04-01T06:37:02.928677
{ "authors": [ "afrouzhakim" ], "repo": "HackYourFuture-CPH/chattie", "url": "https://github.com/HackYourFuture-CPH/chattie/pull/185", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1126388226
Frontend-ContactPage-Container Description Frontend Contact Page container. I have added the Header, Menu, Contact Us, and Footer components to this page. I removed these components from App.js and added a Route path for the Contact Us page. Fixes #71 How to test? npm run dev After you run this command, it will show the Contact page. Please remember it will not show the header, menu, and footer on other pages because I removed those components from App.js. Please provide a short summary of how your changes can be tested. Checklist [x] I have performed a self-review of my own code [x] I have followed the name conventions for CSS Classnames and filenames, Components names and filenames, Style filenames, if you are in doubt check the project README.MD and here https://github.com/HackYourFuture-CPH/curriculum/blob/master/review/review-checklist.md [x] I have commented my code, particularly in hard-to-understand areas, if your code was simple enough mark the box anyway [x] I have made corresponding changes to the documentation, if your code was simple enough mark the box anyway [x] This PR is ready to be merged and does not break any other functionality It's looking nice on my machine as well. 👍
gharchive/pull-request
2022-02-07T19:11:55
2025-04-01T06:37:02.932784
{ "authors": [ "Divyajg", "santhoshboinapally" ], "repo": "HackYourFuture-CPH/fp-class19", "url": "https://github.com/HackYourFuture-CPH/fp-class19/pull/91", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
371442380
Create java sum What does the program do? Sum. In what programming language is it written? Java. This pull request does not contain files in specific folders. Please fix it and reopen. For help, see https://help.github.com/articles/moving-a-file-to-a-new-location/
gharchive/pull-request
2018-10-18T09:35:17
2025-04-01T06:37:02.948815
{ "authors": [ "Aniket965", "iamcodeking" ], "repo": "Hacktoberfest-2018/Hello-world", "url": "https://github.com/Hacktoberfest-2018/Hello-world/pull/4714", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1421687493
Update README.md It seems you renamed the class. Thanks for that. Sorry for the late reply; you got me just after I went on holiday.
gharchive/pull-request
2022-10-25T01:10:05
2025-04-01T06:37:03.001635
{ "authors": [ "Googlproxer", "wf9a5m75" ], "repo": "HalleyAssist/ion-range-calendar", "url": "https://github.com/HalleyAssist/ion-range-calendar/pull/9", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
2352888605
add evaluation frameworks LangSmith and Ragas Description Added a few frameworks (LangSmith and Ragas) to help with LLM evaluation tasks. Appreciated!
gharchive/pull-request
2024-06-14T08:57:16
2025-04-01T06:37:03.027416
{ "authors": [ "Hannibal046", "fsantosg" ], "repo": "Hannibal046/Awesome-LLM", "url": "https://github.com/Hannibal046/Awesome-LLM/pull/138", "license": "CC0-1.0", "license_type": "permissive", "license_source": "github-api" }
2045299053
🛑 German is down In ff63a53, German (https://de.scratch-wiki.info) was down: HTTP code: 508 Response time: 1150 ms Resolved: German is back up in bc4e8e0 after 9 minutes.
gharchive/issue
2023-12-17T17:16:43
2025-04-01T06:37:03.033756
{ "authors": [ "Auto5958" ], "repo": "Hans5958/Scratch-Wiki-Upptime", "url": "https://github.com/Hans5958/Scratch-Wiki-Upptime/issues/1570", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
2045856367
🛑 German is down In 7a322e0, German (https://de.scratch-wiki.info) was down: HTTP code: 0 Response time: 0 ms Resolved: German is back up in ba60337 after 7 minutes.
gharchive/issue
2023-12-18T06:49:17
2025-04-01T06:37:03.036161
{ "authors": [ "Auto5958" ], "repo": "Hans5958/Scratch-Wiki-Upptime", "url": "https://github.com/Hans5958/Scratch-Wiki-Upptime/issues/1576", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
698101231
I have new material for you! I would like to propose the strongest line I could come up with for the "Code Review Lady". I will write it here, so please do add it. "The beauty of your formatting is merely ordinary, but as long as it runs without defects, that is quite acceptable. A sturdy knife is worth more than a beautiful statue, you see!" Well then, farewell. LGTM!
gharchive/issue
2020-09-10T15:08:55
2025-04-01T06:37:03.042152
{ "authors": [ "HansRobo", "KANAIHIROYUKI" ], "repo": "HansRobo/CodeReviewLady", "url": "https://github.com/HansRobo/CodeReviewLady/issues/10", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
2280649134
Questions about how to train the model Excellent work! I have some confusion. I would like to know whether the final model obtained is a single "delete" model capable of executing all four subtasks of lead optimization (trained with seven masking strategies?), or whether each subtask of lead optimization corresponds to a separate "delete" model (each subtask model being trained with four masks, including three enhancement masks and one task-specific mask?). I noticed you mentioned only one checkpoint model. What subtask does this model correspond to? I look forward to your clarification. Thank you very much. Hi, Yes, I only provided a ckpt for usage, but it is recommended to use a task-specific ckpt to generate molecules; you can also use mixed training to obtain a general ckpt. I will open-source it as soon as possible! Thanks for your interest! Best, Odin So, does each subtask of lead optimization correspond to a separate "delete" model? (Each subtask model being trained with four masks, including three enhancement masks and one task-specific mask?) But, author, you only provided one model. Does this model correspond to the fragment-growth subtask? Could you clarify whether the model generates atoms one by one? Is it an autoregressive model? Does the generation of the next atom depend on the previously generated atoms?
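For reference on the last question: below is a minimal, generic sketch of what "autoregressive" generation means, namely that each new element is sampled conditioned on everything generated so far. It is only an illustration of the concept being asked about, not code from Delete, and every name in it is a placeholder.

```python
import random

def sample_next(context):
    # Stand-in for a learned model: a real autoregressive generator would score
    # candidate next elements given the context (everything generated so far).
    candidates = ["C", "N", "O", "<stop>"]
    return random.choice(candidates)

def generate_autoregressively(max_len=10):
    generated = []                       # grows one element at a time
    for _ in range(max_len):
        nxt = sample_next(generated)     # the next element depends on the previous ones
        if nxt == "<stop>":
            break
        generated.append(nxt)
    return generated

print(generate_autoregressively())
```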
gharchive/issue
2024-05-06T11:21:59
2025-04-01T06:37:03.048986
{ "authors": [ "HaotianZhangAI4Science", "zh2417" ], "repo": "HaotianZhangAI4Science/Delete", "url": "https://github.com/HaotianZhangAI4Science/Delete/issues/7", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
571813910
"Device is available but not used by distribute strategy" Hi, Haoyu ! I got this in when I run run_cust_classifier.py. Is that normal? 然后跑着会出现这种 error: "Resource exhausted: OOM when allocating tensor with shape[4096,3072] and type float on /job:localhost/replica:0/task:0/device:GPU:0 by allocator GPU_0_bfc' GPU资源不够OOM了 GPU资源不够OOM了 为什么会出现这种情况呢,有 8 张 Tesla K80 的显卡,运行的时候需要做哪些限制吗?,单 GPU 跑的时候没有问题啊 这个 log 感觉有问题,这是说没有用 mirror strategy 吗? 这个 log 感觉有问题,这是说没有用 mirror strategy 吗? 开始训练后可以用nvidia-smi看看GPU有没有被占用 开始训练之后 GPU Util 是很高的 90%~100%, 就是log 让人疑惑,不论是把 log_n_every_steps 改成多少,log 上面 的是 这个数字的 2 倍,用你代码里 的 8,就是 16,改为 10 就是 20. I0228 14:58:39.682965 139945045247808 basic_session_run_hooks.py:260] loss = 0.4445804, step = 940 (16.402 sec) I0228 14:58:46.150598 139945045247808 basic_session_run_hooks.py:692] global_step/sec: 1.22899 I0228 14:58:55.966473 139945045247808 basic_session_run_hooks.py:692] global_step/sec: 1.2225 I0228 14:58:55.967319 139945045247808 basic_session_run_hooks.py:260] loss = 0.4284298, step = 960 (16.284 sec) I0228 14:59:05.747961 139945045247808 basic_session_run_hooks.py:692] global_step/sec: 1.22681 I0228 14:59:12.184175 139945045247808 basic_session_run_hooks.py:260] loss = 0.33931226, step = 980 (16.217 sec) I0228 14:59:15.447099 139945045247808 basic_session_run_hooks.py:692] global_step/sec: 1.23722 I0228 14:59:25.238059 139945045247808 basic_session_run_hooks.py:692] global_step/sec: 1.22566 I0228 14:59:28.448702 139945045247808 basic_session_run_hooks.py:260] loss = 0.2939254, step = 1000 (16.265 sec) 开始训练之后 GPU Util 是很高的 90%~100%, 就是log 让人疑惑,不论是把 log_n_every_steps 改成多少,log 上面的都是这个数字的 double,用你代码里 的 8,就是 16,改为 10 就是 20. 你是8卡一起训练吗?之前有人提类似的issue。 大佬,多卡训练速度是不是不一定比单卡训练速度快很多?,下面这种情况正常吗 sigle gpu : 2.66 global step /sec 8 gpus: 2.9 glocal step /sec 开始训练之后 GPU Util 是很高的 90%~100%, 就是log 让人疑惑,不论是把 log_n_every_steps 改成多少,log 上面的都是这个数字的 double,用你代码里 的 8,就是 16,改为 10 就是 20. 你是8卡一起训练吗?之前有人提类似的issue。 是 8 卡同时,看到了那个 ISSUE,上面没有解答。换成4卡之后速度提升明显。。估计是 GPU 连接的问题,或者是 CPU 的瓶颈? @Jhangsy 这个日志问题确实困扰很久了,好在不影响训练。有可能是CPU瓶颈。 @Jhangsy 这个日志问题确实困扰很久了,好在不影响训练。有可能是CPU瓶颈。 哈哈哈,是呀,你回复的好及时,十分感谢~ @Jhangsy 这个日志问题确实困扰很久了,好在不影响训练。有可能是CPU瓶颈。 哈哈哈,是呀,你回复的好及时,十分感谢~ 不客气:P 大佬,多卡训练速度是不是不一定比单卡训练速度快很多?,下面这种情况正常吗 sigle gpu : 2.66 global step /sec 8 gpus: 2.9 glocal step /sec 我的理解:多卡训练很多时候是针对长序列文本。因为如果使用单卡,那么允许的batch_size很小,否则会出现OOM问题。为了是模型训练收敛,必须使用多卡,以达到增加batch_size的效果(global_batch_size = num_gpu * batch_size)。至于训练速度,可能跟负载的平衡, strategy,CPU适配有关。
gharchive/issue
2020-02-27T04:30:40
2025-04-01T06:37:03.059484
{ "authors": [ "HaoyuHu", "Jhangsy", "pengxia24", "shishishu" ], "repo": "HaoyuHu/bert-multi-gpu", "url": "https://github.com/HaoyuHu/bert-multi-gpu/issues/24", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
1299769934
🛑 Happy Vibes Bot 4 is down In b0eedd2, Happy Vibes Bot 4 ($BOT4) was down: HTTP code: 0 Response time: 0 ms Resolved: Happy Vibes Bot 4 is back up in 50b123e.
gharchive/issue
2022-07-09T21:24:38
2025-04-01T06:37:03.062163
{ "authors": [ "samosaman73" ], "repo": "Happy-Vibes-Bot/status", "url": "https://github.com/Happy-Vibes-Bot/status/issues/2516", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1419880509
🛑 Happy Vibes Bot 3 is down In 1ee359d, Happy Vibes Bot 3 ($BOT3) was down: HTTP code: 0 Response time: 0 ms Resolved: Happy Vibes Bot 3 is back up in 4ba4e16.
gharchive/issue
2022-10-23T18:18:14
2025-04-01T06:37:03.064705
{ "authors": [ "samosaman73" ], "repo": "Happy-Vibes-Bot/status", "url": "https://github.com/Happy-Vibes-Bot/status/issues/3778", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1456184773
🛑 Happy Vibes Bot 3 is down In 9066719, Happy Vibes Bot 3 ($BOT3) was down: HTTP code: 0 Response time: 0 ms Resolved: Happy Vibes Bot 3 is back up in e1fe544.
gharchive/issue
2022-11-19T03:59:10
2025-04-01T06:37:03.066814
{ "authors": [ "samosaman73" ], "repo": "Happy-Vibes-Bot/status", "url": "https://github.com/Happy-Vibes-Bot/status/issues/4247", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1451429919
Twitch Alerts bot is offline For several days now, the Twitch Alerts bot has been offline and doesn't work anymore (error message "application is not working" and status down). Hi, thank you for your report. I've restarted the server where the bot is running, but I forgot to restart the bot 😅
gharchive/issue
2022-11-16T11:39:05
2025-04-01T06:37:03.108821
{ "authors": [ "Harfeur", "Sarvagon" ], "repo": "Harfeur/TwitchAlerts", "url": "https://github.com/Harfeur/TwitchAlerts/issues/1", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
412759077
Added live example This PR adds configuration to run and develop PlotNeuralNet in Gitpod, a free dev environment for GitHub we have been working on. The goal is to make contributions super easy by providing a single-click code experience. You can open Gitpod workspaces on any GitHub repository by prefixing the GitHub URL with 'gitpod.io#/'. Depending on the URL it does the right thing. For instance, prefixing a pull request like this one will open it in code review mode. For your project I added a Dockerfile with the needed dependencies and made the start script run test/unet.py and open the resulting PDF. The Python editor is also opened so that people can play around with it. I hope you find it useful; let me know if you have any questions. It seems that your code does not load the Docker image created for this repository out of the box. It would be nice if you added some guidelines or made it run out of the box. How did you try it? If you start a Gitpod workspace on this branch (or my fork), the Dockerfile will be picked up: https://gitpod.io/#https://github.com/HarisIqbal88/PlotNeuralNet/pull/15
gharchive/pull-request
2019-02-21T05:47:38
2025-04-01T06:37:03.112319
{ "authors": [ "HarisIqbal88", "svenefftinge" ], "repo": "HarisIqbal88/PlotNeuralNet", "url": "https://github.com/HarisIqbal88/PlotNeuralNet/pull/15", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1854813504
[Feature] 🌈 GitHub Action deployment See this repository: Harry-zklcdc/go-proxy-bingai-ga, use it while it lasts. Thanks a lot! Doesn't an Action have a time limit? Yes, that's why I added that scheduled task to start it again automatically every 6 hours. But Actions also seem to have a limit on how long they can run each month. How do I use this? That said, abusing Actions isn't good either; one day GitHub might just shut Actions down. Indeed, which is why deploying this way is not recommended.
gharchive/issue
2023-08-17T11:28:24
2025-04-01T06:37:03.116226
{ "authors": [ "2531178480", "Harry-zklcdc", "Xqh-Cxy", "luckyEason", "whx1024" ], "repo": "Harry-zklcdc/go-proxy-bingai", "url": "https://github.com/Harry-zklcdc/go-proxy-bingai/issues/145", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
818243526
Tool Icons We don't have an icon for the rainbow brush. The icon for the line tool only resembles the tool partially and is actually a graph. We need either custom-made icons or some icons from somewhere. I'll see what I can find. Thanks
gharchive/issue
2021-02-28T16:14:59
2025-04-01T06:37:03.121747
{ "authors": [ "Crimson-Blade", "HarshKhandeparkar" ], "repo": "HarshKhandeparkar/rainbow-board", "url": "https://github.com/HarshKhandeparkar/rainbow-board/issues/9", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1950051471
Added a Number Guessing Game Hello there, I have added a Number Guessing game to this repository. This is a game where the device automatically selects a number between 1 and 100. The user has to guess the correct number out of the 100 numbers. Hope these changes find you well. Thank you. Regards. @Jayu1214 Follow the code of conduct by starring the repo! Too small a commit.
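A minimal sketch of the game described above, written in Python purely for illustration; the actual code in this PR is not shown here, so the structure and names below are assumptions.

```python
import random

def guessing_game(low=1, high=100):
    """The device picks a number; the user keeps guessing until it is found."""
    secret = random.randint(low, high)    # number selected automatically
    attempts = 0
    while True:
        guess = int(input(f"Guess a number between {low} and {high}: "))
        attempts += 1
        if guess < secret:
            print("Too low, try again.")
        elif guess > secret:
            print("Too high, try again.")
        else:
            print(f"Correct! You needed {attempts} attempts.")
            return attempts

if __name__ == "__main__":
    guessing_game()
```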
gharchive/pull-request
2023-10-18T15:39:03
2025-04-01T06:37:03.123994
{ "authors": [ "HarshwardhanPatil07", "Jayu1214" ], "repo": "HarshwardhanPatil07/HactoberFest2023", "url": "https://github.com/HarshwardhanPatil07/HactoberFest2023/pull/170", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
151528317
clang changes for rio. Signed-off-by: Ronald G. Minnich rminnich@gmail.com Rio will chase me until death. LGTM
gharchive/pull-request
2016-04-28T01:24:57
2025-04-01T06:37:03.127654
{ "authors": [ "elbing", "rminnich" ], "repo": "Harvey-OS/harvey", "url": "https://github.com/Harvey-OS/harvey/pull/140", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
171802317
What is the right way to create a module and add a new route? Hi everyone, I just know the basics of React and Redux, and I need your help :) I found MERN and I am using it to make a small app to learn. I cloned mern-starter and used mern-cli to create a Test module. I just render a simple Hello World text.
class Test extends Component { render() { return ( <div>Hello World</div> ); } }
Then I went to /client/route.js and added a new route under the post route.
<Route path="/test" getComponent={(nextState, cb) => { require.ensure([], require => { cb(null, require('./modules/Test/Test').default); }); }} />
I restarted from the console and got this error:
C:\Users\kieus\Desktop\MERN\client\modules\Auth\Auth.js:17 var _Auth = content.locals; ^ ReferenceError: content is not defined at Object.<anonymous> (Auth.js:6:1) at Module._compile (module.js:541:32) at loader (C:\Users\kieus\Desktop\MERN\node_modules\babel-register\lib\node.js:148:5) at Object.require.extensions.(anonymous function) [as .js] (C:\Users\kieus\Desktop\MERN\node_modules\babel-register\lib\node.js:158:7) at Module.load (module.js:458:32) at tryModuleLoad (module.js:417:12) at Function.Module._load (module.js:409:3) at Module.require (module.js:468:17) at require (internal/module.js:20:19) at Object.<anonymous> (routes.js:21:3) at Module._compile (module.js:541:32) at loader (C:\Users\kieus\Desktop\MERN\node_modules\babel-register\lib\node.js:148:5) at Object.require.extensions.(anonymous function) [as .js] (C:\Users\kieus\Desktop\MERN\node_modules\babel-register\lib\node.js:158:7) at Module.load (module.js:458:32) at tryModuleLoad (module.js:417:12) at Function.Module._load (module.js:409:3) [nodemon] app crashed - waiting for file changes before starting...
Am I missing something? What do I have to do to make it right :( Also, the CSS background is not working in development mode, right? Is there any tutorial about MERN for beginners, like: creating a basic module, JWT authentication? I know, too many questions. Please help me. Thanks.
@kieusonlam Did you find a fix for this? I'm seeing the same thing after running mern init & merng module Editor using the latest mern-cli. Removing the unused style imports fixed this for me. import styles from './Editor.css';
Thanks @Morganjackson, the error has gone away. :+1:
Wish I had found this issue before spending half a day debugging. I wonder what's ACTUALLY going on here. An empty styles import should not cause a cryptic "segmentation fault"-like error message like this. Downgrading to Webpack ^1.13.3 solved it. Strange bug, though.
gharchive/issue
2016-08-18T02:08:48
2025-04-01T06:37:03.151229
{ "authors": [ "Morganjackson", "alanhorizon", "kieusonlam", "sham3k" ], "repo": "Hashnode/mern-starter", "url": "https://github.com/Hashnode/mern-starter/issues/224", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1756658138
🛑 Mojang's Public API is down In c7b65f0, Mojang's Public API (https://api.mojang.com/) was down: HTTP code: 500 Response time: 362 ms Resolved: Mojang's Public API is back up in ef56968.
gharchive/issue
2023-06-14T11:18:52
2025-04-01T06:37:03.153965
{ "authors": [ "drori200" ], "repo": "HavenCoreNetwork/havencore-status-page", "url": "https://github.com/HavenCoreNetwork/havencore-status-page/issues/142", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1077781319
🛑 Mojang Accounts Website is down In 1af1f8a, Mojang Accounts Website (https://account.mojang.com/) was down: HTTP code: 0 Response time: 0 ms Resolved: Mojang Accounts Website is back up in b470710.
gharchive/issue
2021-12-12T11:24:56
2025-04-01T06:37:03.156572
{ "authors": [ "drori200" ], "repo": "HavenCoreNetwork/havencore-status-page", "url": "https://github.com/HavenCoreNetwork/havencore-status-page/issues/2", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
2297509913
🛑 Mojang's Public API is down In c434b2c, Mojang's Public API (https://api.mojang.com/) was down: HTTP code: 429 Response time: 27 ms Resolved: Mojang's Public API is back up in f12f86c after 6 minutes.
gharchive/issue
2024-05-15T10:36:04
2025-04-01T06:37:03.159005
{ "authors": [ "drori200" ], "repo": "HavenCoreNetwork/havencore-status-page", "url": "https://github.com/HavenCoreNetwork/havencore-status-page/issues/432", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
362093060
The fabu_nginx.conf configuration in the README file is missing a semicolon. The original is root /home/ubuntu/fabulove/upload, while the correct form is root /home/ubuntu/fabulove/upload; (note the trailing semicolon). I had a look at the code, and the same place in that configuration file is also missing a semicolon. Thanks for the feedback, fixed.
gharchive/issue
2018-09-20T09:17:52
2025-04-01T06:37:03.247005
{ "authors": [ "yangguangchao", "zakiso" ], "repo": "HeadingMobile/fabu.love", "url": "https://github.com/HeadingMobile/fabu.love/issues/24", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
186093599
If one chooses regression and pred col is binary, throw error This is the current unhelpful message: ValueError: Unable to parse string "N" at position 0 Set an error (with a helpful message) when classification is chosen and the pred col is numeric. While working on a fix for this problem I discovered a few related problems: a classification run on a non-binary column; a classification run on a binary column whose unique values are not 'Y' and 'N' (for example: GenderFLG); a regression run on a column containing non-numeric data. Relates to #265
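A minimal sketch of the kind of validation being asked for, assuming a pandas DataFrame; the function and column names are made up for illustration and this is not the healthcareai-py implementation.

```python
import pandas as pd

def validate_prediction_column(df: pd.DataFrame, column: str, model_type: str) -> None:
    """Raise a readable error when the prediction column does not match the model type."""
    series = df[column]
    if model_type == "regression" and not pd.api.types.is_numeric_dtype(series):
        raise ValueError(
            f"Regression was chosen, but column '{column}' is not numeric. "
            "Did you mean classification?")
    if model_type == "classification":
        uniques = set(series.dropna().unique())
        if not uniques <= {"Y", "N"}:
            raise ValueError(
                f"Classification was chosen, but column '{column}' contains values "
                f"{sorted(map(str, uniques))} instead of only 'Y'/'N'.")

# Example: a helpful error instead of a cryptic parse failure.
df = pd.DataFrame({"ThirtyDayReadmitFLG": ["Y", "N", "N"]})
try:
    validate_prediction_column(df, "ThirtyDayReadmitFLG", "regression")
except ValueError as err:
    print(err)
```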
gharchive/issue
2016-10-29T18:32:24
2025-04-01T06:37:03.249002
{ "authors": [ "Aylr", "levithatcher" ], "repo": "HealthCatalyst/healthcareai-py", "url": "https://github.com/HealthCatalyst/healthcareai-py/issues/52", "license": "mit", "license_type": "permissive", "license_source": "bigquery" }
176971210
Want to use the ObjectMapper for Swift 2.2 using Git submodule Hello Hearst-DD, I want to use this library with Swift 2.2 and would like to integrate it using a Git submodule. Any help would be highly appreciated. Hi there, you will need to use v.1.3.0 or earlier for Swift 2.2. With regards to including the project as a submodule, you should be able to do this like you would with any other submodule. Hope this helps. Thanks @tristanhimmelman 👍
gharchive/issue
2016-09-14T17:34:45
2025-04-01T06:37:03.279156
{ "authors": [ "kamarshad", "tristanhimmelman" ], "repo": "Hearst-DD/ObjectMapper", "url": "https://github.com/Hearst-DD/ObjectMapper/issues/574", "license": "mit", "license_type": "permissive", "license_source": "bigquery" }
2759410047
🛑 Tommy is down In 487bce0, Tommy ($TOMMY_UPTIME) was down: HTTP code: 0 Response time: 0 ms HelioHost staff have been notified and we're investigating the cause of this downtime. Resolved: Tommy is back up in acd4a7f after 7 minutes.
gharchive/issue
2024-12-26T07:36:46
2025-04-01T06:37:03.348671
{ "authors": [ "Kryd0s" ], "repo": "HelioNetworks/status", "url": "https://github.com/HelioNetworks/status/issues/968", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1679436265
How should I start this project for local development? yarn dev @exposir Problems you may run into after starting it: Cross-origin (CORS) issues: please check that the frontend service starts on port 3000 with host localhost / 127.0.0.1. Images not displaying: add the line 127.0.0.1 dev.hg.com at the end of your local hosts file, then visit http://dev.hg.com:3000/ If that does not solve it, you can post a screenshot of the API responses from the browser's developer tools.
gharchive/issue
2023-04-22T09:17:37
2025-04-01T06:37:03.370985
{ "authors": [ "521xueweihan", "exposir" ], "repo": "HelloGitHub-Team/geese", "url": "https://github.com/HelloGitHub-Team/geese/issues/104", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
853528587
Use a variable for the default channel before using it
In commands and events,
// instead of doing this every time:
client.channels.cache.get(config.defaultChannel).send();
// use this:
const defaultChannel = client.channels.cache.get(config.defaultChannel);
defaultChannel.send();
Maybe not needed if the channel isn't fetched from client each time.
gharchive/issue
2021-04-08T14:22:45
2025-04-01T06:37:03.428683
{ "authors": [ "Helmasaur" ], "repo": "Helmasaur/Bioman", "url": "https://github.com/Helmasaur/Bioman/issues/159", "license": "mit", "license_type": "permissive", "license_source": "bigquery" }
1798696513
problem building on mac Hi, I've been using this wonderful beamer template for some time. Rather recently, I discovered a problem when building on macOS: (./theme/beamerfontthemeHelmholtzAI.sty ! Package beamerhelmholtzai Error: Patching original frame title failed. See the beamerhelmholtzai package documentation for explanation. Type H <return> for immediate help. ... l.202 ...tching original frame title failed}\@ehc} ) (./theme/beamercolorthemeHelmholtzAI.sty) Any ideas? For completeness, here is the full log file of the gitlab CI runner: Running with gitlab-runner 16.1.0 (b72e108d)  on mac-virtual Znzopzz4, system ID: r_LrNRdEzCnajm section_start:1689065118:prepare_executor Preparing the "virtualbox" executor Using VirtualBox version 6.1.40r154048 executor... Creating new VM... Waiting for VM to become responsive... Starting SSH command... section_end:1689065148:prepare_executor section_start:1689065148:prepare_script Preparing environment Running on macos-11-xcode-12.local via fwcmac10.fz-rossendorf.de... section_end:1689065149:prepare_script section_start:1689065149:get_sources Getting source from Git repository Fetching changes with git depth set to 20... Initialized empty Git repository in /Users/gitlabrunner/builds/haicu/presentations/20230613-helmholtzaicon-enn/.git/ Created fresh repository. Checking out c47b45bf as detached HEAD (ref is main)... Skipping Git submodules setup section_end:1689065152:get_sources section_start:1689065152:step_script Executing "step_script" stage of the job script $ which brew /usr/local/bin/brew $ brew install texlive ==> Fetching dependencies for texlive: libpng, freetype, fontconfig, glib, xorgproto, libxau, libxdmcp, libxcb, libx11, libxext, libxrender, lzo, pixman, cairo, libsigsegv, clisp, jpeg-turbo, giflib, highway, imath, libtiff, little-cms2, openexr, webp, jpeg-xl, libvmaf, aom, libavif, gd, jbig2dec, libidn, openjpeg, ghostscript, graphite2, harfbuzz, libxft, lua, luajit, openjdk, openssl@3, berkeley-db, perl, potrace, libde265, shared-mime-info, x265, libheif, liblqr, jasper, libomp, libraw, m4, libtool, imagemagick, plotutils, pstoedit, pygments and python@3.11 ==> Fetching libpng ==> Downloading https://ghcr.io/v2/homebrew/core/libpng/manifests/1.6.40 ==> Downloading https://ghcr.io/v2/homebrew/core/libpng/blobs/sha256:c4f83c1860a79daac87a140dce046a16bafae60f064c4f5b6d25d568db2bf695 ==> Fetching freetype ==> Downloading https://ghcr.io/v2/homebrew/core/freetype/manifests/2.13.1 ==> Downloading https://ghcr.io/v2/homebrew/core/freetype/blobs/sha256:ae5f6d23acb94cd01039a950cdcc99917641fbdb1171f5aa3c78bafc5317c3c4 ==> Fetching fontconfig ==> Downloading https://ghcr.io/v2/homebrew/core/fontconfig/manifests/2.14.2 ==> Downloading https://ghcr.io/v2/homebrew/core/fontconfig/blobs/sha256:337bbb8f41116814b2060eccd4b08f8df7021453b204551afad230ef9f067661 ==> Fetching glib ==> Downloading https://ghcr.io/v2/homebrew/core/glib/manifests/2.76.3 ==> Downloading https://ghcr.io/v2/homebrew/core/glib/blobs/sha256:0cf1ed67fcb0ed20bde83c4c7f7fa954ed5b7399c753f4b78c37730bd4cd2d22 ==> Fetching xorgproto ==> Downloading https://ghcr.io/v2/homebrew/core/xorgproto/manifests/2023.2 ==> Downloading https://ghcr.io/v2/homebrew/core/xorgproto/blobs/sha256:de818c35cca25c4b2286a5642d5d1748320f6031039ec46b375fd11e935ef7e3 ==> Fetching libxau ==> Downloading https://ghcr.io/v2/homebrew/core/libxau/manifests/1.0.11 ==> Downloading https://ghcr.io/v2/homebrew/core/libxau/blobs/sha256:306524aec65e6ea22e5d18fbf5b09f1a544fce2a9bc37349b3bc5d98a14d7984 ==> Fetching 
libxdmcp ==> Downloading https://ghcr.io/v2/homebrew/core/libxdmcp/manifests/1.1.4 ==> Downloading https://ghcr.io/v2/homebrew/core/libxdmcp/blobs/sha256:2ed240f04f505a9472bc3f1988ba9be5edb9107795ab72f02a2ed7608d7de918 ==> Fetching libxcb ==> Downloading https://ghcr.io/v2/homebrew/core/libxcb/manifests/1.15_1 ==> Downloading https://ghcr.io/v2/homebrew/core/libxcb/blobs/sha256:ba2af806eddb9db3f65ab2c462d749fbadd03dd30d1ee6c5493ee466855dcae2 ==> Fetching libx11 ==> Downloading https://ghcr.io/v2/homebrew/core/libx11/manifests/1.8.6 ==> Downloading https://ghcr.io/v2/homebrew/core/libx11/blobs/sha256:e321aa3d735de0a2f4f7b2d5c6dde6184cc8590c348c3d1dd773fac892db6ecd ==> Fetching libxext ==> Downloading https://ghcr.io/v2/homebrew/core/libxext/manifests/1.3.5 ==> Downloading https://ghcr.io/v2/homebrew/core/libxext/blobs/sha256:674376b0e5cbcfb9278cdcf97e4ccc3910bf134cb1e45bf9e190d614a7f3a200 ==> Fetching libxrender ==> Downloading https://ghcr.io/v2/homebrew/core/libxrender/manifests/0.9.11 ==> Downloading https://ghcr.io/v2/homebrew/core/libxrender/blobs/sha256:be6b3af9fd07f7a95bf2a70e1673383dca5ba972a6984ba359ea0c36be2dee44 ==> Fetching lzo ==> Downloading https://ghcr.io/v2/homebrew/core/lzo/manifests/2.10 ==> Downloading https://ghcr.io/v2/homebrew/core/lzo/blobs/sha256:fcd3c9f7042104ca13be96fd0ec53acdc7da1480c16140441b2e66d4e7c5eb78 ==> Fetching pixman ==> Downloading https://ghcr.io/v2/homebrew/core/pixman/manifests/0.42.2-1 ==> Downloading https://ghcr.io/v2/homebrew/core/pixman/blobs/sha256:9c50d2fadad622cf5b80f24dffb5e5b2edfd0ff91927a2143ca27bbcd392a4c5 ==> Fetching cairo ==> Downloading https://ghcr.io/v2/homebrew/core/cairo/manifests/1.16.0_5 ==> Downloading https://ghcr.io/v2/homebrew/core/cairo/blobs/sha256:cb16c1bb070a7cdca7aaf8899a70e407d73636116d62225626b2c8d31aa8d2ff ==> Fetching libsigsegv ==> Downloading https://ghcr.io/v2/homebrew/core/libsigsegv/manifests/2.14 ==> Downloading https://ghcr.io/v2/homebrew/core/libsigsegv/blobs/sha256:6cefa3529425fcbd306c53d975bc0a727b34d8a3c636c664a1785f67202b2377 ==> Fetching clisp ==> Downloading https://ghcr.io/v2/homebrew/core/clisp/manifests/2.49.92_1 ==> Downloading https://ghcr.io/v2/homebrew/core/clisp/blobs/sha256:4b81399840c98918cda6447d86852ffcb96294f228cb26f6c289f22d90df5a7a ==> Fetching jpeg-turbo ==> Downloading https://ghcr.io/v2/homebrew/core/jpeg-turbo/manifests/2.1.5.1 ==> Downloading https://ghcr.io/v2/homebrew/core/jpeg-turbo/blobs/sha256:88632579a1730a7be4ad57d23e46b54c522d2ae511c9184fae81612fc349e596 ==> Fetching giflib ==> Downloading https://ghcr.io/v2/homebrew/core/giflib/manifests/5.2.1 ==> Downloading https://ghcr.io/v2/homebrew/core/giflib/blobs/sha256:dc23500f50d599c4dbfcea0107b643bef41538c2f5fd162b049f82d21e3d32d5 ==> Fetching highway ==> Downloading https://ghcr.io/v2/homebrew/core/highway/manifests/1.0.4 ==> Downloading https://ghcr.io/v2/homebrew/core/highway/blobs/sha256:9c3214e645ed27aa0c3de5f15b1675f06bbaed2f4019427153ea17c7e524ff34 ==> Fetching imath ==> Downloading https://ghcr.io/v2/homebrew/core/imath/manifests/3.1.9 ==> Downloading https://ghcr.io/v2/homebrew/core/imath/blobs/sha256:4b20276f002c8cf184c4c2ad8d1eb78ee5933e9429424321e2002e583ba25fed ==> Fetching libtiff ==> Downloading https://ghcr.io/v2/homebrew/core/libtiff/manifests/4.5.1 ==> Downloading https://ghcr.io/v2/homebrew/core/libtiff/blobs/sha256:18bd9c73f730afa03c4c5dd3c9b23d810a827e32464d325beafd1499161e47ab ==> Fetching little-cms2 ==> Downloading https://ghcr.io/v2/homebrew/core/little-cms2/manifests/2.15 ==> Downloading 
https://ghcr.io/v2/homebrew/core/little-cms2/blobs/sha256:c7cb39e28b14011c8ccf73c5de99c77328b6e41626bdfa400265e83911dd2070 ==> Fetching openexr ==> Downloading https://ghcr.io/v2/homebrew/core/openexr/manifests/3.1.9 ==> Downloading https://ghcr.io/v2/homebrew/core/openexr/blobs/sha256:8e35254af423f989877b99f2728f09c170375217fd96cef3d94eea674b8f193d ==> Fetching webp ==> Downloading https://ghcr.io/v2/homebrew/core/webp/manifests/1.3.0_1 ==> Downloading https://ghcr.io/v2/homebrew/core/webp/blobs/sha256:b65c4d06df31960977bd4822cedf8018f66059006b114ab97514599630776eef ==> Fetching jpeg-xl ==> Downloading https://ghcr.io/v2/homebrew/core/jpeg-xl/manifests/0.8.2 ==> Downloading https://ghcr.io/v2/homebrew/core/jpeg-xl/blobs/sha256:d049131b513c305652f382ecb91ec073ffb4cd8366b6427dc7dcd0f4f491ce63 ==> Fetching libvmaf ==> Downloading https://ghcr.io/v2/homebrew/core/libvmaf/manifests/2.3.1 ==> Downloading https://ghcr.io/v2/homebrew/core/libvmaf/blobs/sha256:60fb7784b39ae2aff9a836f08190637e9c7f2ac32755ed24ec3f5ddbac916c64 ==> Fetching aom ==> Downloading https://ghcr.io/v2/homebrew/core/aom/manifests/3.6.1 ==> Downloading https://ghcr.io/v2/homebrew/core/aom/blobs/sha256:acab19ae318c71f1c02c99fb60a2743a84396cb6cf53c51dca5f96bffdcc1971 ==> Fetching libavif ==> Downloading https://ghcr.io/v2/homebrew/core/libavif/manifests/0.11.1 ==> Downloading https://ghcr.io/v2/homebrew/core/libavif/blobs/sha256:cd42d557da81863120e2a542200c5bebc56b0fec042758157b7c2dec6ac557c7 ==> Fetching gd ==> Downloading https://ghcr.io/v2/homebrew/core/gd/manifests/2.3.3_5 ==> Downloading https://ghcr.io/v2/homebrew/core/gd/blobs/sha256:14ebdc0e93c087250ee130dc67de4082f26cd74428dddc047cfa834cc750230e ==> Fetching jbig2dec ==> Downloading https://ghcr.io/v2/homebrew/core/jbig2dec/manifests/0.19 ==> Downloading https://ghcr.io/v2/homebrew/core/jbig2dec/blobs/sha256:44aa9639d58ac2e176c37538c3fe652e077bcbf82264b756b4ba9db041e9273c ==> Fetching libidn ==> Downloading https://ghcr.io/v2/homebrew/core/libidn/manifests/1.41 ==> Downloading https://ghcr.io/v2/homebrew/core/libidn/blobs/sha256:464812fe81d7bafe7c25fe5d4e7348b603e5ded35410ff52b9933db76e6e5724 ==> Fetching openjpeg ==> Downloading https://ghcr.io/v2/homebrew/core/openjpeg/manifests/2.5.0_1 ==> Downloading https://ghcr.io/v2/homebrew/core/openjpeg/blobs/sha256:cbcf47bf1a1dcae0b4331f2f213d278a042e717eafdcbc54f21c920fdc0b56bd ==> Fetching ghostscript ==> Downloading https://ghcr.io/v2/homebrew/core/ghostscript/manifests/10.01.2 ==> Downloading https://ghcr.io/v2/homebrew/core/ghostscript/blobs/sha256:a52592ba7398d8ddd6d9e174c5547b282294bcf5fe15d6685e97428be13d46d1 ==> Fetching graphite2 ==> Downloading https://ghcr.io/v2/homebrew/core/graphite2/manifests/1.3.14 ==> Downloading https://ghcr.io/v2/homebrew/core/graphite2/blobs/sha256:ddc468a1eec491aed5d5b05b22d0cffa38b6059d87eab747301011507fcf6366 ==> Fetching harfbuzz ==> Downloading https://ghcr.io/v2/homebrew/core/harfbuzz/manifests/7.3.0_1 ==> Downloading https://ghcr.io/v2/homebrew/core/harfbuzz/blobs/sha256:b59130fcc25f4cc7ed063e1f22f92789b2dc47f4c7072ecf99e5a811bddf12d5 ==> Fetching libxft ==> Downloading https://ghcr.io/v2/homebrew/core/libxft/manifests/2.3.8 ==> Downloading https://ghcr.io/v2/homebrew/core/libxft/blobs/sha256:9998b2dcd6f3248a13e9b9a8d74c9efe66c45b7528d09867d5b24e144baba315 ==> Fetching lua ==> Downloading https://ghcr.io/v2/homebrew/core/lua/manifests/5.4.6 ==> Downloading https://ghcr.io/v2/homebrew/core/lua/blobs/sha256:e16b7d9390a3906a71d419b840ff81486607dcf557d1c35a03aee2f4b20b538b ==> Fetching 
luajit ==> Downloading https://ghcr.io/v2/homebrew/core/luajit/manifests/2.1.0-beta3-20230612.1 ==> Downloading https://ghcr.io/v2/homebrew/core/luajit/blobs/sha256:fbd76215acdbd7ba500a82fde8c1331c729fbb874b0550928ace7be2bbb9ed94 ==> Fetching openjdk ==> Downloading https://ghcr.io/v2/homebrew/core/openjdk/manifests/20.0.1 ==> Downloading https://ghcr.io/v2/homebrew/core/openjdk/blobs/sha256:0c0f98e27bd0a59acbcd04097fdeb798327d99db24ee5b16e4928be7fa5e87ca ==> Fetching openssl@3 ==> Downloading https://ghcr.io/v2/homebrew/core/openssl/3/manifests/3.1.1_1 ==> Downloading https://ghcr.io/v2/homebrew/core/openssl/3/blobs/sha256:50c7448f726762394d63abe2722acee2a426d8fd1a8101504ad7c8dfc44bca31 ==> Fetching berkeley-db ==> Downloading https://ghcr.io/v2/homebrew/core/berkeley-db/manifests/18.1.40_2 ==> Downloading https://ghcr.io/v2/homebrew/core/berkeley-db/blobs/sha256:5f4917a225a5986f682c85dbcfb6503024738d6eadb637161210ae621c26f457 ==> Fetching perl ==> Downloading https://ghcr.io/v2/homebrew/core/perl/manifests/5.36.1 ==> Downloading https://ghcr.io/v2/homebrew/core/perl/blobs/sha256:61baac39c1834ec0f7a026d639b6008eaf893090ebb57f947068e143e29ee556 ==> Fetching potrace ==> Downloading https://ghcr.io/v2/homebrew/core/potrace/manifests/1.16 ==> Downloading https://ghcr.io/v2/homebrew/core/potrace/blobs/sha256:3b5294deed86179a4e496236fb882eb0b7ad3c020741e2a1398861b545062712 ==> Fetching libde265 ==> Downloading https://ghcr.io/v2/homebrew/core/libde265/manifests/1.0.12 ==> Downloading https://ghcr.io/v2/homebrew/core/libde265/blobs/sha256:af1c29ef42925e64e1f5d7ca7edde8561f3a78ecfd3f89d6c6443bb7f0e41088 ==> Fetching shared-mime-info ==> Downloading https://ghcr.io/v2/homebrew/core/shared-mime-info/manifests/2.2 ==> Downloading https://ghcr.io/v2/homebrew/core/shared-mime-info/blobs/sha256:3287f34793705e039a140e2614d3aafad8de654e5829515ffd3b77d024de6551 ==> Fetching x265 ==> Downloading https://ghcr.io/v2/homebrew/core/x265/manifests/3.5-1 ==> Downloading https://ghcr.io/v2/homebrew/core/x265/blobs/sha256:55bb46a5dc1924e59b7fa7bc800a21c0cf21355e48cb38b941d8e786427c70a0 ==> Fetching libheif ==> Downloading https://ghcr.io/v2/homebrew/core/libheif/manifests/1.16.2-1 ==> Downloading https://ghcr.io/v2/homebrew/core/libheif/blobs/sha256:29c122901654eb74af43a8e349a0b544c35ab0b1a4b732666fbe3c51bd5627a0 ==> Fetching liblqr ==> Downloading https://ghcr.io/v2/homebrew/core/liblqr/manifests/0.4.2_1-2 ==> Downloading https://ghcr.io/v2/homebrew/core/liblqr/blobs/sha256:a0e88a1ce13ce43c2bf0fb0e4bdd7e9d33a367245d2ebf0f0d8dfe283666be7c ==> Fetching jasper ==> Downloading https://ghcr.io/v2/homebrew/core/jasper/manifests/4.0.0 ==> Downloading https://ghcr.io/v2/homebrew/core/jasper/blobs/sha256:637581823b9568caaa9eb6ddfc59d469c7ec1968c0ef3feab9e3d93a36e0bca5 ==> Fetching libomp ==> Downloading https://ghcr.io/v2/homebrew/core/libomp/manifests/16.0.6 ==> Downloading https://ghcr.io/v2/homebrew/core/libomp/blobs/sha256:4ae172c013b17cde11b708443b02a88605c10789717d781de860f01737ec8e0b ==> Fetching libraw ==> Downloading https://ghcr.io/v2/homebrew/core/libraw/manifests/0.21.1 ==> Downloading https://ghcr.io/v2/homebrew/core/libraw/blobs/sha256:4e3c0f783afeac2b9da275168846a87621d95663e592c3225869b1412b155137 ==> Fetching m4 ==> Downloading https://ghcr.io/v2/homebrew/core/m4/manifests/1.4.19 ==> Downloading https://ghcr.io/v2/homebrew/core/m4/blobs/sha256:b22472f659112cf12163bba770d891618b3ada5aaf5baa01516d80fef6214617 ==> Fetching libtool ==> Downloading https://ghcr.io/v2/homebrew/core/libtool/manifests/2.4.7-1 ==> 
Downloading https://ghcr.io/v2/homebrew/core/libtool/blobs/sha256:deffadfecec61da06dde9edf5eae19381f80f99ae78e57607732fd54be366b8a ==> Fetching imagemagick ==> Downloading https://ghcr.io/v2/homebrew/core/imagemagick/manifests/7.1.1-12-1 ==> Downloading https://ghcr.io/v2/homebrew/core/imagemagick/blobs/sha256:44dfa459f260a1829e7e9bd2d814086115259df6b4e21afd02dc30b1957bc638 ==> Fetching plotutils ==> Downloading https://ghcr.io/v2/homebrew/core/plotutils/manifests/2.6_1-1 ==> Downloading https://ghcr.io/v2/homebrew/core/plotutils/blobs/sha256:3ca14b49804af8b7364087731097dc992816d16a82fb6da2afeae18c1772e886 ==> Fetching pstoedit ==> Downloading https://ghcr.io/v2/homebrew/core/pstoedit/manifests/3.78-1 ==> Downloading https://ghcr.io/v2/homebrew/core/pstoedit/blobs/sha256:8cf73733366948cd732643dd90b9f8122eb4a3c170961386f8a16f0d3438aa1b ==> Fetching pygments ==> Downloading https://ghcr.io/v2/homebrew/core/pygments/manifests/2.15.1-1 ==> Downloading https://ghcr.io/v2/homebrew/core/pygments/blobs/sha256:4f9ad4f7947192121e59a218ef21d9416eb2162a936945c60335778fbce3939a ==> Fetching python@3.11 ==> Downloading https://ghcr.io/v2/homebrew/core/python/3.11/manifests/3.11.4_1 ==> Downloading https://ghcr.io/v2/homebrew/core/python/3.11/blobs/sha256:aeaf5d77ce1aad2cd39e7acebb78076bd0417459ff4825e1377a3bcee1f4b098 ==> Fetching texlive ==> Downloading https://ghcr.io/v2/homebrew/core/texlive/manifests/20230313_2 ==> Downloading https://ghcr.io/v2/homebrew/core/texlive/blobs/sha256:8618d95d4161d83e59e2ea99d35c5cb389789d48ba2227a5bcb578d963c98b20 ==> Installing dependencies for texlive: libpng, freetype, fontconfig, glib, xorgproto, libxau, libxdmcp, libxcb, libx11, libxext, libxrender, lzo, pixman, cairo, libsigsegv, clisp, jpeg-turbo, giflib, highway, imath, libtiff, little-cms2, openexr, webp, jpeg-xl, libvmaf, aom, libavif, gd, jbig2dec, libidn, openjpeg, ghostscript, graphite2, harfbuzz, libxft, lua, luajit, openjdk, openssl@3, berkeley-db, perl, potrace, libde265, shared-mime-info, x265, libheif, liblqr, jasper, libomp, libraw, m4, libtool, imagemagick, plotutils, pstoedit, pygments and python@3.11 ==> Installing texlive dependency: libpng ==> Pouring libpng--1.6.40.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/libpng/1.6.40: 27 files, 1.3MB ==> Installing texlive dependency: freetype ==> Pouring freetype--2.13.1.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/freetype/2.13.1: 67 files, 2.5MB ==> Installing texlive dependency: fontconfig ==> Pouring fontconfig--2.14.2.big_sur.bottle.tar.gz ==> Regenerating font cache, this may take a while ==> /usr/local/Cellar/fontconfig/2.14.2/bin/fc-cache -frv 🍺 /usr/local/Cellar/fontconfig/2.14.2: 88 files, 2.3MB ==> Installing texlive dependency: glib ==> Pouring glib--2.76.3.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/glib/2.76.3: 455 files, 21.2MB ==> Installing texlive dependency: xorgproto ==> Pouring xorgproto--2023.2.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/xorgproto/2023.2: 267 files, 3.9MB ==> Installing texlive dependency: libxau ==> Pouring libxau--1.0.11.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/libxau/1.0.11: 21 files, 121.5KB ==> Installing texlive dependency: libxdmcp ==> Pouring libxdmcp--1.1.4.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/libxdmcp/1.1.4: 11 files, 129.8KB ==> Installing texlive dependency: libxcb ==> Pouring libxcb--1.15_1.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/libxcb/1.15_1: 2,461 files, 6.9MB ==> Installing texlive dependency: libx11 ==> Pouring libx11--1.8.6.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/libx11/1.8.6: 1,054 files, 7.0MB 
==> Installing texlive dependency: libxext ==> Pouring libxext--1.3.5.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/libxext/1.3.5: 87 files, 427.2KB ==> Installing texlive dependency: libxrender ==> Pouring libxrender--0.9.11.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/libxrender/0.9.11: 12 files, 198.3KB ==> Installing texlive dependency: lzo ==> Pouring lzo--2.10.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/lzo/2.10: 31 files, 570.9KB ==> Installing texlive dependency: pixman ==> Pouring pixman--0.42.2.big_sur.bottle.1.tar.gz 🍺 /usr/local/Cellar/pixman/0.42.2: 11 files, 1.3MB ==> Installing texlive dependency: cairo ==> Pouring cairo--1.16.0_5.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/cairo/1.16.0_5: 126 files, 6.3MB ==> Installing texlive dependency: libsigsegv ==> Pouring libsigsegv--2.14.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/libsigsegv/2.14: 11 files, 167KB ==> Installing texlive dependency: clisp ==> Pouring clisp--2.49.92_1.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/clisp/2.49.92_1: 59 files, 15MB ==> Installing texlive dependency: jpeg-turbo ==> Pouring jpeg-turbo--2.1.5.1.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/jpeg-turbo/2.1.5.1: 44 files, 3.9MB ==> Installing texlive dependency: giflib ==> Pouring giflib--5.2.1.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/giflib/5.2.1: 19 files, 599.8KB ==> Installing texlive dependency: highway ==> Pouring highway--1.0.4.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/highway/1.0.4: 65 files, 4MB ==> Installing texlive dependency: imath ==> Pouring imath--3.1.9.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/imath/3.1.9: 49 files, 930.6KB ==> Installing texlive dependency: libtiff ==> Pouring libtiff--4.5.1.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/libtiff/4.5.1: 473 files, 7.8MB ==> Installing texlive dependency: little-cms2 ==> Pouring little-cms2--2.15.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/little-cms2/2.15: 21 files, 1.3MB ==> Installing texlive dependency: openexr ==> Pouring openexr--3.1.9.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/openexr/3.1.9: 194 files, 7.7MB ==> Installing texlive dependency: webp ==> Pouring webp--1.3.0_1.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/webp/1.3.0_1: 63 files, 2.6MB ==> Installing texlive dependency: jpeg-xl ==> Pouring jpeg-xl--0.8.2.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/jpeg-xl/0.8.2: 43 files, 19.4MB ==> Installing texlive dependency: libvmaf ==> Pouring libvmaf--2.3.1.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/libvmaf/2.3.1: 234 files, 7.2MB ==> Installing texlive dependency: aom ==> Pouring aom--3.6.1.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/aom/3.6.1: 23 files, 13MB ==> Installing texlive dependency: libavif ==> Pouring libavif--0.11.1.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/libavif/0.11.1: 19 files, 489.6KB ==> Installing texlive dependency: gd ==> Pouring gd--2.3.3_5.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/gd/2.3.3_5: 33 files, 1.5MB ==> Installing texlive dependency: jbig2dec ==> Pouring jbig2dec--0.19.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/jbig2dec/0.19: 13 files, 373.6KB ==> Installing texlive dependency: libidn ==> Pouring libidn--1.41.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/libidn/1.41: 73 files, 1MB ==> Installing texlive dependency: openjpeg ==> Pouring openjpeg--2.5.0_1.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/openjpeg/2.5.0_1: 536 files, 13.8MB ==> Installing texlive dependency: ghostscript ==> Pouring ghostscript--10.01.2.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/ghostscript/10.01.2: 645 files, 151.9MB ==> Installing texlive dependency: graphite2 ==> Pouring 
graphite2--1.3.14.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/graphite2/1.3.14: 18 files, 291.9KB ==> Installing texlive dependency: harfbuzz ==> Pouring harfbuzz--7.3.0_1.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/harfbuzz/7.3.0_1: 76 files, 9.6MB ==> Installing texlive dependency: libxft ==> Pouring libxft--2.3.8.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/libxft/2.3.8: 92 files, 356.3KB ==> Installing texlive dependency: lua ==> Pouring lua--5.4.6.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/lua/5.4.6: 29 files, 745.6KB ==> Installing texlive dependency: luajit ==> Pouring luajit--2.1.0-beta3-20230612.1.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/luajit/2.1.0-beta3-20230612.1: 57 files, 2.0MB ==> Installing texlive dependency: openjdk ==> Pouring openjdk--20.0.1.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/openjdk/20.0.1: 636 files, 322.7MB ==> Installing texlive dependency: openssl@3 ==> Pouring openssl@3--3.1.1_1.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/openssl@3/3.1.1_1: 6,495 files, 30MB ==> Installing texlive dependency: berkeley-db ==> Pouring berkeley-db--18.1.40_2.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/berkeley-db/18.1.40_2: 44 files, 6.2MB ==> Installing texlive dependency: perl ==> Pouring perl--5.36.1.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/perl/5.36.1: 2,492 files, 67.7MB ==> Installing texlive dependency: potrace ==> Pouring potrace--1.16.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/potrace/1.16: 16 files, 382.4KB ==> Installing texlive dependency: libde265 ==> Pouring libde265--1.0.12.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/libde265/1.0.12: 21 files, 1.6MB ==> Installing texlive dependency: shared-mime-info ==> Pouring shared-mime-info--2.2.big_sur.bottle.tar.gz ==> /usr/local/Cellar/shared-mime-info/2.2/bin/update-mime-database /usr/local/s 🍺 /usr/local/Cellar/shared-mime-info/2.2: 86 files, 4.6MB ==> Installing texlive dependency: x265 ==> Pouring x265--3.5.big_sur.bottle.1.tar.gz 🍺 /usr/local/Cellar/x265/3.5: 11 files, 35.8MB ==> Installing texlive dependency: libheif ==> Pouring libheif--1.16.2.big_sur.bottle.1.tar.gz ==> /usr/local/opt/shared-mime-info/bin/update-mime-database /usr/local/share/mi 🍺 /usr/local/Cellar/libheif/1.16.2: 26 files, 2.0MB ==> Installing texlive dependency: liblqr ==> Pouring liblqr--0.4.2_1.big_sur.bottle.2.tar.gz 🍺 /usr/local/Cellar/liblqr/0.4.2_1: 113 files, 300.2KB ==> Installing texlive dependency: jasper ==> Pouring jasper--4.0.0.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/jasper/4.0.0: 44 files, 1.6MB ==> Installing texlive dependency: libomp ==> Pouring libomp--16.0.6.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/libomp/16.0.6: 7 files, 1.7MB ==> Installing texlive dependency: libraw ==> Pouring libraw--0.21.1.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/libraw/0.21.1: 73 files, 5.7MB ==> Installing texlive dependency: m4 ==> Pouring m4--1.4.19.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/m4/1.4.19: 13 files, 724.4KB ==> Installing texlive dependency: libtool ==> Pouring libtool--2.4.7.big_sur.bottle.1.tar.gz 🍺 /usr/local/Cellar/libtool/2.4.7: 75 files, 3.8MB ==> Installing texlive dependency: imagemagick ==> Pouring imagemagick--7.1.1-12.big_sur.bottle.1.tar.gz 🍺 /usr/local/Cellar/imagemagick/7.1.1-12: 809 files, 30.8MB ==> Installing texlive dependency: plotutils ==> Pouring plotutils--2.6_1.big_sur.bottle.1.tar.gz 🍺 /usr/local/Cellar/plotutils/2.6_1: 74 files, 6.8MB ==> Installing texlive dependency: pstoedit ==> Pouring pstoedit--3.78.big_sur.bottle.1.tar.gz 🍺 /usr/local/Cellar/pstoedit/3.78: 41 files, 2.3MB ==> Installing texlive dependency: pygments 
==> Pouring pygments--2.15.1.all.bottle.1.tar.gz 🍺 /usr/local/Cellar/pygments/2.15.1: 614 files, 8MB ==> Installing texlive dependency: python@3.11 ==> Pouring python@3.11--3.11.4_1.big_sur.bottle.tar.gz ==> /usr/local/Cellar/python@3.11/3.11.4_1/bin/python3.11 -m ensurepip ==> /usr/local/Cellar/python@3.11/3.11.4_1/bin/python3.11 -m pip install -v --no 🍺 /usr/local/Cellar/python@3.11/3.11.4_1: 3,286 files, 62.0MB ==> Installing texlive ==> Pouring texlive--20230313_2.big_sur.bottle.tar.gz 🍺 /usr/local/Cellar/texlive/20230313_2: 172,065 files, 3.7GB ==> Running `brew cleanup texlive`... Disable this behaviour by setting HOMEBREW_NO_INSTALL_CLEANUP. Hide these hints with HOMEBREW_NO_ENV_HINTS (see `man brew`). $ make Rc files read: NONE Latexmk: This is Latexmk, John Collins, 7 Jan. 2023. Version 4.79. No existing .aux file, so I'll make a simple one, and require run of *latex. Latexmk: applying rule 'xelatex'... Rule 'xelatex': Reasons for rerun Category 'other': Rerun of 'xelatex' forced or previously required ------------ Run number 1 of rule 'xelatex' ------------ ------------ Running 'xelatex -no-pdf -synctex=1 -interaction=nonstopmode -recorder "talk.tex"' ------------ This is XeTeX, Version 3.141592653-2.6-0.999995 (TeX Live 2023/Homebrew) (preloaded format=xelatex) restricted \write18 enabled. entering extended mode (./talk.tex LaTeX2e <2022-11-01> patch level 1 L3 programming layer <2023-02-22> (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/beamer/beamer. cls Document Class: beamer 2023/02/20 v3.69 A class for typesetting presentations (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/beamer/beamerb asemodes.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/etoolbox/etool box.sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/beamer/beamerb asedecode.sty)) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/iftex/iftex. 
sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/beamer/beamerb aseoptions.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/graphics/keyva l.sty)) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/geometry/geome try.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/iftex/ifvtex .sty)) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/pgf/math/pgfma th.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/pgf/utilities/ pgfrcs.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/utilitie s/pgfutil-common.tex) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/utilitie s/pgfutil-latex.def) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/utilitie s/pgfrcs.code.tex (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/pgf.revi sion.tex))) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/pgf/utilities/ pgfkeys.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/utilitie s/pgfkeys.code.tex (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/utilitie s/pgfkeyslibraryfiltered.code.tex))) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/math/pgf math.code.tex (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/math/pgf mathutil.code.tex) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/math/pgf mathparser.code.tex) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/math/pgf mathfunctions.code.tex) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/math/pgf mathfunctions.basic.code.tex) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/math/pgf mathfunctions.trigonometric.code.tex) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/math/pgf mathfunctions.random.code.tex) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/math/pgf mathfunctions.comparison.code.tex) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/math/pgf mathfunctions.base.code.tex) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/math/pgf mathfunctions.round.code.tex) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/math/pgf mathfunctions.misc.code.tex) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/math/pgf mathfunctions.integerarithmetics.code.tex) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/math/pgf mathcalc.code.tex) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/math/pgf mathfloat.code.tex))) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/base/fleqn.clo ) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/base/size11.cl o) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/pgf/basiclayer /pgfcore.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/graphics/graph icx.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/graphics/graph ics.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/graphics/trig. 
sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/graphics-cfg/g raphics.cfg) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/graphics-def/x etex.def))) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/pgf/systemlaye r/pgfsys.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/systemla yer/pgfsys.code.tex (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/systemla yer/pgf.cfg) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/systemla yer/pgfsys-xetex.def (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/systemla yer/pgfsys-dvipdfmx.def (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/systemla yer/pgfsys-common-pdf.def)))) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/systemla yer/pgfsyssoftpath.code.tex) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/systemla yer/pgfsysprotocol.code.tex)) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/xcolor/xcolor. sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/graphics-cfg/c olor.cfg) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/graphics/mathc olor.ltx)) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/basiclay er/pgfcore.code.tex (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/math/pgf int.code.tex) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/basiclay er/pgfcorepoints.code.tex) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/basiclay er/pgfcorepathconstruct.code.tex) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/basiclay er/pgfcorepathusage.code.tex) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/basiclay er/pgfcorescopes.code.tex) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/basiclay er/pgfcoregraphicstate.code.tex) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/basiclay er/pgfcoretransformations.code.tex) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/basiclay er/pgfcorequick.code.tex) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/basiclay er/pgfcoreobjects.code.tex) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/basiclay er/pgfcorepathprocessing.code.tex) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/basiclay er/pgfcorearrows.code.tex) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/basiclay er/pgfcoreshade.code.tex) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/basiclay er/pgfcoreimage.code.tex) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/basiclay er/pgfcoreexternal.code.tex) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/basiclay er/pgfcorelayers.code.tex) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/basiclay er/pgfcoretransparency.code.tex) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/basiclay er/pgfcorepatterns.code.tex) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/basiclay er/pgfcorerdf.code.tex))) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/pgf/utilities/ xxcolor.sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/base/atbegshi- ltx.sty) 
(/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/hyperref/hyper ref.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/ltxcmds/ltxc mds.sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pdftexcmds/p dftexcmds.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/infwarerr/in fwarerr.sty)) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/kvsetkeys/kvse tkeys.sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/kvdefinekeys /kvdefinekeys.sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pdfescape/pd fescape.sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/hycolor/hycolo r.sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/letltxmacro/le tltxmacro.sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/auxhook/auxhoo k.sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/hyperref/namer ef.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/refcount/refco unt.sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/gettitlestri ng/gettitlestring.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/kvoptions/kvop tions.sty))) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/hyperref/pd1en c.def) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/intcalc/intc alc.sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/etexcmds/ete xcmds.sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/hyperref/puenc .def) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/url/url.sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/bitset/bitse t.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/bigintcalc/b igintcalc.sty)) Package hyperref Message: Stopped early. 
) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/hyperref/hxete x.def (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/stringenc/st ringenc.sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/rerunfilecheck /rerunfilecheck.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/base/atveryend -ltx.sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/uniquecounte r/uniquecounter.sty))) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/beamer/beamerb aserequires.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/beamer/beamerb asecompatibility.sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/beamer/beamerb asefont.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/amsfonts/amssy mb.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/amsfonts/amsfo nts.sty)) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/sansmathaccent /sansmathaccent.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/koma-script/sc rlfile.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/koma-script/sc rlfile-hook.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/koma-script/sc rlogo.sty))))) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/beamer/beamerb asetranslator.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/translator/tra nslator.sty)) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/beamer/beamerb asemisc.sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/beamer/beamerb asetwoscreens.sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/beamer/beamerb aseoverlay.sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/beamer/beamerb asetitle.sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/beamer/beamerb asesection.sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/beamer/beamerb aseframe.sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/beamer/beamerb aseverbatim.sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/beamer/beamerb aseframesize.sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/beamer/beamerb aseframecomponents.sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/beamer/beamerb asecolor.sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/beamer/beamerb asenotes.sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/beamer/beamerb asetoc.sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/beamer/beamerb asetemplates.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/beamer/beamerb aseauxtemplates.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/beamer/beamerb aseboxes.sty))) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/beamer/beamerb aselocalstructure.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/tools/enumerat e.sty)) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/beamer/beamerb asenavigation.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/beamer/beamerb asenavigationsymbols.tex)) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/beamer/beamerb asetheorems.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/amsmath/amsmat h.sty For additional 
information on amsmath, use the `?' option. (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/amsmath/amstex t.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/amsmath/amsgen .sty)) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/amsmath/amsbsy .sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/amsmath/amsopn .sty)) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/amscls/amsthm. sty)) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/beamer/beamerb asethemes.sty)) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/beamer/beamert hemedefault.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/beamer/beamerf ontthemedefault.sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/beamer/beamerc olorthemedefault.sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/beamer/beameri nnerthemedefault.sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/beamer/beamero uterthemedefault.sty))) (./helmholtzai.sty (./theme/beamerthemeHelmholtzAI.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/booktabs/bookt abs.sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/base/fontenc.s ty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/lm/t1lmss.fd)) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/fontspec/fonts pec.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/l3packages/xpa rse/xparse.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/l3kernel/expl3 .sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/l3backend/l3ba ckend-xetex.def))) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/fontspec/fonts pec-xetex.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/base/fontenc.s ty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/fontspec/fonts pec.cfg))) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/base/inputenc. sty Package inputenc Warning: inputenc package ignored with utf8 based engines. 
) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/listings/listi ngs.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/listings/lstmi sc.sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/listings/listi ngs.cfg)) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/pgf/frontendla yer/tikz.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/pgf/basiclayer /pgf.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/modules/ pgfmoduleshapes.code.tex) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/modules/ pgfmoduleplot.code.tex) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/pgf/compatibil ity/pgfcomp-version-0-65.sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/pgf/compatibil ity/pgfcomp-version-1-18.sty)) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/pgf/utilities/ pgffor.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/utilitie s/pgffor.code.tex)) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/frontend layer/tikz/tikz.code.tex (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/librarie s/pgflibraryplothandlers.code.tex) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/modules/ pgfmodulematrix.code.tex) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/frontend layer/tikz/libraries/tikzlibrarytopaths.code.tex))) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/frontend layer/tikz/libraries/tikzlibrarycalc.code.tex) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/frontend layer/tikz/libraries/tikzlibraryturtle.code.tex) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/frontend layer/tikz/libraries/tikzlibrarypositioning.code.tex) (./theme/beamerfontthemeHelmholtzAI.sty ! Package beamerhelmholtzai Error: Patching original frame title failed. See the beamerhelmholtzai package documentation for explanation. Type H <return> for immediate help. ... l.202 ...tching original frame title failed}\@ehc} ) (./theme/beamercolorthemeHelmholtzAI.sty) (./theme/beamerinnerthemeHelmholtzAI.sty) (./theme/beamerouterthemeHelmholtzAI.sty))) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/fontawesome5/f ontawesome5.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/l3packages/l3k eys2e/l3keys2e.sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/fontawesome5/f ontawesome5-utex-helper.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/fontawesome5/t ufontawesomefree.fd) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/fontawesome5/t ufontawesomebrands.fd))) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/natbib/natbib. 
sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/appendixnumber beamer/appendixnumberbeamer.sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/pdfpages/pdfpa ges.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/base/ifthen.st y) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/tools/calc.sty ) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/eso-pic/eso-pi c.sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/pdfpages/ppxet ex.def)) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/animate/animat e.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/oberdiek/ifdra ft.sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/media9/pdfbase .sty) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/zref/zref-absp age.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/zref/zref-base .sty)) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/ocgx2/ocgbase. sty)) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/librarie s/pgflibraryarrows.meta.code.tex) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/frontend layer/tikz/libraries/tikzlibraryarrows.code.tex (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/librarie s/pgflibraryarrows.code.tex)) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/frontend layer/tikz/libraries/tikzlibrarydecorations.text.code.tex (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/frontend layer/tikz/libraries/tikzlibrarydecorations.code.tex (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/modules/ pgfmoduledecorations.code.tex)) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/generic/pgf/librarie s/decorations/pgflibrarydecorations.text.code.tex)) Package hyperref Warning: Token not allowed in a PDF string (Unicode): (hyperref) removing `\noindent' on input line 38. Package hyperref Warning: Token not allowed in a PDF string (Unicode): (hyperref) removing `\inserttitle' on input line 43. Package hyperref Warning: Token not allowed in a PDF string (Unicode): (hyperref) removing `\insertsubtitle' on input line 43. (./talk.aux) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/base/ts1cmr.fd ) *geometry* driver: auto-detecting *geometry* detected driver: xetex Package hyperref Warning: Rerun to get /PageLabels entry. 
(/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/translator/tra nslator-basic-dictionary-English.dict) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/translator/tra nslator-bibliography-dictionary-English.dict) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/translator/tra nslator-environment-dictionary-English.dict) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/translator/tra nslator-months-dictionary-English.dict) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/translator/tra nslator-numbers-dictionary-English.dict) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/translator/tra nslator-theorem-dictionary-English.dict) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/pdflscape/pdfl scape.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/pdflscape/pdfl scape-nometadata.sty (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/graphics/lscap e.sty))) No file talk.nav. [1] (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/amsfonts/umsa. fd) (/usr/local/Cellar/texlive/20230313_2/share/texmf-dist/tex/latex/amsfonts/umsb. fd) Overfull \vbox (2.91675pt too high) detected at line 105 Overfull \vbox (10.2738pt too high) has occurred while \output is active [2] Overfull \vbox (2.91675pt too high) detected at line 105 Overfull \vbox (10.2738pt too high) has occurred while \output is active [3] Overfull \vbox (2.91675pt too high) detected at line 105 Overfull \vbox (10.2738pt too high) has occurred while \output is active [4] Package natbib Warning: Citation `olah2020naturally' on page 5 undefined on inp ut line 120. Overfull \vbox (10.2738pt too high) has occurred while \output is active [5] Package natbib Warning: Citation `olah2020naturally' on page 6 undefined on inp ut line 120. Overfull \vbox (10.2738pt too high) has occurred while \output is active [6] Package natbib Warning: Citation `bekkers-lecture21' on page 7 undefined on inp ut line 166. Package natbib Warning: Citation `pmlr-v48-cohenc16' on page 7 undefined on inp ut line 166. Package natbib Warning: Citation `cesa2019' on page 7 undefined on input line 1 66. 
Overfull \vbox (3.60536pt too high) detected at line 166 Overfull \vbox (10.2738pt too high) has occurred while \output is active [7] Overfull \vbox (10.2738pt too high) has occurred while \output is active [8] Overfull \vbox (10.2738pt too high) has occurred while \output is active [9] Overfull \vbox (10.2738pt too high) has occurred while \output is active [10] Overfull \vbox (10.2738pt too high) has occurred while \output is active [11] Overfull \hbox (17.07181pt too wide) in paragraph at lines 238--238 [][] Overfull \vbox (22.37025pt too high) detected at line 238 Overfull \vbox (10.2738pt too high) has occurred while \output is active [12] Overfull \hbox (7.2992pt too wide) in paragraph at lines 238--238 [][] Overfull \vbox (22.37025pt too high) detected at line 238 Overfull \vbox (10.2738pt too high) has occurred while \output is active [13] Overfull \hbox (7.2992pt too wide) in paragraph at lines 238--238 [][] Overfull \vbox (22.37025pt too high) detected at line 238 Overfull \vbox (10.2738pt too high) has occurred while \output is active [14] Overfull \hbox (7.2992pt too wide) in paragraph at lines 238--238 [][] Overfull \vbox (22.37025pt too high) detected at line 238 Overfull \vbox (10.2738pt too high) has occurred while \output is active [15] Overfull \hbox (7.2992pt too wide) in paragraph at lines 238--238 [][] Overfull \vbox (10.2738pt too high) has occurred while \output is active [16] Overfull \vbox (21.55809pt too high) detected at line 251 Overfull \vbox (10.2738pt too high) has occurred while \output is active [17] Overfull \vbox (21.55809pt too high) detected at line 251 Overfull \vbox (10.2738pt too high) has occurred while \output is active [18] Overfull \vbox (10.2738pt too high) has occurred while \output is active [19] Overfull \vbox (10.2738pt too high) has occurred while \output is active [20] Overfull \vbox (10.2738pt too high) has occurred while \output is active [21] Overfull \vbox (10.2738pt too high) has occurred while \output is active [22] Overfull \vbox (10.2738pt too high) has occurred while \output is active [23] Overfull \vbox (10.2738pt too high) has occurred while \output is active [24] Overfull \vbox (10.2738pt too high) has occurred while \output is active [25] Overfull \vbox (10.2738pt too high) has occurred while \output is active [26] Overfull \vbox (10.2738pt too high) has occurred while \output is active [27] Overfull \vbox (10.2738pt too high) has occurred while \output is active [28] Overfull \vbox (10.2738pt too high) has occurred while \output is active [29] Overfull \vbox (10.2738pt too high) has occurred while \output is active [30] Overfull \vbox (10.2738pt too high) has occurred while \output is active [31] Overfull \vbox (10.2738pt too high) has occurred while \output is active [32] Overfull \vbox (10.2738pt too high) has occurred while \output is active [33] Overfull \vbox (10.2738pt too high) has occurred while \output is active [34] Overfull \vbox (10.2738pt too high) has occurred while \output is active [35] Overfull \vbox (10.2738pt too high) has occurred while \output is active [36] Overfull \vbox (10.2738pt too high) has occurred while \output is active [37] Overfull \vbox (10.2738pt too high) has occurred while \output is active [38] Overfull \vbox (10.2738pt too high) has occurred while \output is active [39] Overfull \vbox (10.2738pt too high) has occurred while \output is active [40] Overfull \vbox (10.2738pt too high) has occurred while \output is active [41] Overfull \vbox (10.2738pt too high) has occurred while \output is 
active [42] Overfull \vbox (10.2738pt too high) has occurred while \output is active [43] Overfull \vbox (10.2738pt too high) has occurred while \output is active [44] Overfull \vbox (10.2738pt too high) has occurred while \output is active [45] Overfull \vbox (10.2738pt too high) has occurred while \output is active [46] No file talk.bbl. Overfull \vbox (10.2738pt too high) has occurred while \output is active [47] Package hyperref Warning: Token not allowed in a PDF string (Unicode): (hyperref) removing `\translate ' on input line 368. Overfull \vbox (10.2738pt too high) has occurred while \output is active [48] Overfull \vbox (2.55429pt too high) detected at line 379 Overfull \vbox (10.2738pt too high) has occurred while \output is active [49] Overfull \vbox (10.2738pt too high) has occurred while \output is active [50] Overfull \vbox (50.90778pt too high) detected at line 415 Overfull \vbox (10.2738pt too high) has occurred while \output is active [51] Overfull \vbox (50.90778pt too high) detected at line 415 Overfull \vbox (10.2738pt too high) has occurred while \output is active [52] Package natbib Warning: There were undefined citations. LaTeX Warning: Hook 'shipout/lastpage' executed on wrong page (1 not 52). Rerun to correct this. (./talk.aux) LaTeX Warning: Label(s) may have changed. Rerun to get cross-references right. Package rerunfilecheck Warning: File `talk.out' has changed. (rerunfilecheck) Rerun to get outlines right (rerunfilecheck) or use package `bookmark'. ) (see the transcript file for additional information) Output written on talk.xdv (52 pages, 418208 bytes). SyncTeX written on talk.synctex.gz. Transcript written on talk.log. Latexmk: Missing input file 'talk.nav' (or dependence on it) from following: No file talk.nav. Latexmk: Missing bbl file 'talk.bbl' in following: No file talk.bbl. Latexmk: Getting log file 'talk.log' Latexmk: Examining 'talk.fls' Latexmk: Examining 'talk.log' Latexmk: References changed. Latexmk: References changed. Latexmk: References changed. Latexmk: Log file says output to 'talk.xdv' Latexmk: ====List of undefined refs and citations: Citation `olah2020naturally' on page 5 undefined on input line 120 Citation `olah2020naturally' on page 6 undefined on input line 120 Citation `bekkers-lecture21' on page 7 undefined on input line 166 Citation `pmlr-v48-cohenc16' on page 7 undefined on input line 166 Citation `cesa2019' on page 7 undefined on input line 166 Latexmk: If appropriate, the -f option can be used to get latexmk to try to force complete processing. Latexmk: Found bibliography file(s): ./references.bib Latexmk: Summary of warnings from last run of *latex: Latex failed to resolve 5 citation(s) Latexmk: Errors, so I did not complete making targets Collected error summary (may duplicate other messages): xelatex: Command for 'xelatex' gave return code 1 Refer to 'talk.log' and/or above output for details make: *** [talk.pdf] Error 12 section_end:1689066286:step_script section_start:1689066286:cleanup_file_variables Cleaning up project directory and file based variables section_end:1689066288:cleanup_file_variables ERROR: Job failed: Process exited with status 1  I now see this problem on linux as well: (/usr/local/texlive/2022/texmf-dist/tex/generic/pgf/frontendlayer/tikz/librarie s/tikzlibrarypositioning.code.tex) (./theme/beamerfontthemeHelmholtzAI.sty ! Package beamerhelmholtzai Error: Patching original frame title failed. See the beamerhelmholtzai package documentation for explanation. Type H <return> for immediate help. ... 
l.202 ...tching original frame title failed}\@ehc} ) (./theme/beamercolorthemeHelmholtzAI.sty) This is with texlive-full on fedora 38. Digging a bit further, I added the following statement: \ifpatchable*{\beamer@@frametitle}% {%true \PackageInfo{beamerhelmholtzai}{Patching beamer--frametitle is POSSIBLE}\@ehc }{%false \PackageError{beamerhelmholtzai}{Patching beamer--frametitle is impossible}\@ehc} Just before \patchcmd{\beamer@@frametitle}% in ./theme/beamerfontthemeHelmholtzAI.sty which produces the error stated above. The false path was triggered which hints to the fact why the entire patchcmd call will fail. Fun fact: all CI builds fail for me locally on fc38, the error is NOT triggered when I remove the ifpatchable call I am building with: xelatex -no-pdf -synctex=1 -interaction=nonstopmode --shell-escape -recorder "example.tex" at the version: $ xelatex --version XeTeX 3.141592653-2.6-0.999994 (TeX Live 2022/CVE-2023-32700 patched) kpathsea version 6.3.4 Copyright 2022 SIL International, Jonathan Kew and Khaled Hosny. There is NO warranty. Redistribution of this software is covered by the terms of both the XeTeX copyright and the Lesser GNU General Public License. For more information about these matters, see the file named COPYING and the XeTeX source. Primary author of XeTeX: Jonathan Kew. Compiled with ICU version 72.1; using 72.1 Compiled with zlib version 1.2.13; using 1.2.13 Compiled with FreeType2 version 2.13.0; using 2.13.0 Compiled with Graphite2 version 1.3.14; using 1.3.14 Compiled with HarfBuzz version 7.1.0; using 7.1.0 Compiled with libpng version 1.6.37; using 1.6.37 Compiled with pplib version v2.05 less toxic i hope Compiled with fontconfig version 2.14.2; using 2.14.2 It seems that new TexLive stacks are broken in combination with the template. It seems that only the legacy stacks are working. We do not now yet what the reason is.
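For anyone hitting the same error, a minimal standalone probe along the lines of the `\ifpatchable*` check quoted above can tell whether the installed beamer still exposes a patchable `\beamer@@frametitle`. The document below is only an illustration of that diagnostic (it assumes nothing beyond beamer and etoolbox); it is not part of the theme and not the eventual fix.

```latex
% Minimal probe: compile with xelatex and check the log/terminal output.
\documentclass{beamer}
\usepackage{etoolbox}
\makeatletter
% Starred form: test patchability without specifying a search pattern.
\ifpatchable*{\beamer@@frametitle}{%
  \typeout{beamer@@frametitle IS patchable on this TeX Live}%
}{%
  \typeout{beamer@@frametitle is NOT patchable on this TeX Live}%
}
\makeatother
\begin{document}
\begin{frame}{Probe}
  Check the log for the patchability message.
\end{frame}
\end{document}
```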
gharchive/issue
2023-07-11T11:12:14
2025-04-01T06:37:03.451574
{ "authors": [ "Markus-Goetz", "psteinb" ], "repo": "Helmholtz-AI-Energy/beamer-template", "url": "https://github.com/Helmholtz-AI-Energy/beamer-template/issues/31", "license": "BSD-3-Clause", "license_type": "permissive", "license_source": "github-api" }
1416971207
Garbage in translation (http & Trados & OmegaT) Hello, I'm getting "@ @ " garbage in the translations. ENVIRONMENT: win7 ult english OpusCAT v1.2.0.0 (tested v1 & v1.2.3 as well) OmegaT 5.7.1 Trados 2021 (& tested sr2) Firefox ESR In OpusCatMTEngine's window, I do "translate with model" and I get no garbage in the result. I managed to hook Opus in OmegaT & Trados, get lots of "@ @ "... Tested with browser: same bad result. I tested with different versions of the plugin, and engine... always the same. Am I missing something ? Any ideas anyone ? Thank you. Just couldn't stop my brain from juggling with that issue... (since it did work once with OmegaT)... so I came back to the "workbench" and here's what I did, step by step. deleted c_users_admin_local_opuscat c_users_admin_local_OpusCatMTEngine run OpusCATMTEngine_v1.1.0.7 install en-es zip HTTP request -> no garbage exit app run OpusCATMTEngine_v1.2.0 (previously imported en-es still there) HTTP request -> no garbage exit app run OpusCATMTEngine_v1.2.3 dbase update - ok HTTP request -> no garbage exit app run OpusCATMTEngine_v1.2.3 (again) HTTP request -> no garbage test OmegaT -> no garbage test Trados -> no garbage test memoQ -> no result (at this time, enough testing, don't even like memoQ) ** My current conclusion. Deleting the "c_users_admin_local...opus folders" and restart resolved the issue... for now. (time will tell) ** Note: I wanted to save data in installation folder ("Store OPUS-CAT MT Engine data in..." UNCHECKED) and I got exception erros in both v.1.1.0.7 & v1.2.0 did the same test with v1.2.3... no errors but previous en-es gone. exited OpusCATMTEngine_v1.2.3 then moved "c:\users\admin\local\opuscat" folder to the running opus folder, restarted opus and en-es was there tested HTTP again -> no garbage & perfect translation ** Suggestion improvements: - add, under "Store OPUS-CAT MT engine...", a field containing the current path of storage. - When changing the location of data folder (tick-untick "Store OPUS-CAT..."), if OPUS is not doing the move, display a warning/instruction message that the user can/must move the folder to maintain the list of model (with source path to new destination path). - in the translate model tab, beside the "translate" button, add a button/function to jump to browser & test the same string but with the "http://localhost...". It would allow to confirm the results are the same through HTTP (which relate to plugin results, if I'm not mistaking). Note: I know a lot of it is in the documentation... spreaded here and there (you have to admit), still these are details that are time savers if included as I suggest. Note 2: I (I) would have benefited from readiing your debug procedure/tools when it come to plugins. (ie, I was getting good results in the tranalstion tab of OPUS, but not in trados...)(how do you debug that???) That's it! Relieved that it now works... because when it works, hell it works nice. Opus is a awesome tool... be proud. :) Post note. My issue is resolved. "thank god" as would say my mexican friends. Kind regards Claude. I thought my issue was solved, but no. Finally, pinpointed the issue: the 2019 model files. The issue can be reproduced by adding a 2019 model. (at first I downloaded what I could find on opus, then went to tatoeba, that's why I had 2019 & 2020 model files) Furthermore, got a InvalidOperationException every time I attempted to "delete selected model", Opus crash, then I re-run, then "deleted" again the same model without error. 
But no issues at all deleting non-2019 models. Included the log file. opuscat_log_DELETING-MODEL.txt I leave the ticket open for you to see. cheers. Thanks for your thorough testing, I'll keep this open as an enhancement issue, since the UI fixes you mention should be fairly simple to implement. The root cause of the garbage output seems to be a fix I made in v1.1.0.8 to get rid of batch file post-processing (this was causing problems for some users). Unfortunately this broke some older models, which used BPE subword segmentation. All the newer models use SentencePiece subword segmentation, and since I've only done testing on them recently, this bug went unnoticed. I will either fix it or remove the BPE models from the model download list. A workaround is to use only the newest models, since I think all language pairs have SentencePiece models available for them now. Thx for your reply, genuinely appreciated. Regarding downloading models... Some thoughts on enhancements: show the size of the models; add a cancel "x" in/beside the progress bar (instead of having to exit Opus) (where I am, in a small village in the "3rd world", I'm limited to 3Mbps. With the DL size, I may decide for a more appropriate time to download.) BTW, really happy with Opus, it works really great, with great results. A jewel. :) Take care. Claude.
gharchive/issue
2022-10-20T17:03:36
2025-04-01T06:37:03.476026
{ "authors": [ "TommiNieminen", "claude-ws01" ], "repo": "Helsinki-NLP/OPUS-CAT", "url": "https://github.com/Helsinki-NLP/OPUS-CAT/issues/50", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1918772365
[OTHER] Change donwload files button text What would you like to share? I'd like to know, where i can change the text from the download button for files from hemmelig_files to something different like "Download Files" Regards, FuckingToasters Additional information No response If you are creating your own version of this application you can figure this out yourself. I am not tech support. I notice a trend with all your issues that it is more or less tech support questions. I do not mind answering, but, when it comes to things like this that people either can easily figure out themselves, or if they are forking this product, then I will not spend my time answering or solving this. If you are creating your own version of this application you can figure this out yourself. I am not tech support. I notice a trend with all your issues that it is more or less tech support questions. I do not mind answering, but, when it comes to things like this that people either can easily figure out themselves, or if they are forking this product, then I will not spend my time answering or solving this. Well these are Questions a user might have, you are the creator of it so you should know where the things are xd. I looked at every file but couldn't find it.
gharchive/issue
2023-09-29T07:48:41
2025-04-01T06:37:03.479950
{ "authors": [ "FuckingToasters", "bjarneo" ], "repo": "HemmeligOrg/Hemmelig.app", "url": "https://github.com/HemmeligOrg/Hemmelig.app/issues/217", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
258587632
Improvements for Server Profiles handling of Templates Description Improvements for Server Profiles handling of Templates SP resources which have a serverProfileTemplateUri declared now, will inherit all undeclared attributes from its template Added load_resource internal method to improve retrieval of different resources Added load_template method to common to be used by SP and SPTs at the moment <NOT READY YET, likely failing on tests and definitely coverage> Issues Resolved #156 Check List [ ] New functionality includes testing. [x] All tests pass ($ rake test). [ ] New functionality has been documented in the README if applicable. [ ] New functionality has been thoroughly documented in the examples (please include helpful comments). [ ] Changes are documented in the CHANGELOG. Codecov Report :exclamation: No coverage uploaded for pull request base (master@9039fd2). Click here to learn what that means. The diff coverage is 45.45%. @@ Coverage Diff @@ ## master #164 +/- ## ========================================= Coverage ? 97.33% ========================================= Files ? 128 Lines ? 2286 Branches ? 0 ========================================= Hits ? 2225 Misses ? 61 Partials ? 0 Impacted Files Coverage Δ ...ib/puppet/provider/oneview_server_profile/c7000.rb 100% <100%> (ø) lib/puppet/provider/common.rb 70.88% <21.42%> (ø) lib/puppet/provider/oneview_resource.rb 94.66% <85.71%> (ø) Continue to review full report at Codecov. Legend - Click here to learn more Δ = absolute <relative> (impact), ø = not affected, ? = missing data Powered by Codecov. Last update 9039fd2...d3772e8. Read the comment docs.
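To illustrate the inheritance behaviour this PR describes, a rough sketch is shown below. The helper name and the exact Ruby OneView SDK calls are assumptions made for illustration only, not the module's actual implementation.

```ruby
# Sketch only: profile attributes not declared in the manifest fall back to
# the values of the referenced server profile template.
def load_template(client, data)
  template_uri = data['serverProfileTemplateUri']
  return data unless template_uri

  template = OneviewSDK::ServerProfileTemplate.find_by(client, uri: template_uri).first
  raise "Server profile template not found: #{template_uri}" unless template

  # Undeclared attributes come from the template; anything explicitly
  # declared on the profile still wins in the merge.
  template.data.merge(data)
end
```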
gharchive/pull-request
2017-09-18T19:07:04
2025-04-01T06:37:03.510255
{ "authors": [ "codecov-io", "fgbulsoni" ], "repo": "HewlettPackard/oneview-puppet", "url": "https://github.com/HewlettPackard/oneview-puppet/pull/164", "license": "apache-2.0", "license_type": "permissive", "license_source": "bigquery" }
251304626
Implement a schema loader As a dev, I access a class with a pre-loaded schema for each redfish structure according to properties specified in a configuration file. The user should be able to specify the folder and the JSON file for each of these structures. Submitted a PR. Don't know how to link this US with the PR. A new US number 28 was created by ??? 'wafflebot'
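A minimal sketch of the loader described in this user story might look like the following. The class name and the config layout (a schema_dir option plus a [schemas] section mapping structure names to JSON files) are assumptions for illustration, not the toolkit's actual design.

```python
import configparser
import json
import os


class SchemaLoader:
    """Pre-loads one JSON schema per Redfish structure listed in a config file."""

    def __init__(self, config_path):
        config = configparser.ConfigParser()
        config.read(config_path)
        schema_dir = config.get("redfish", "schema_dir")
        self.schemas = {}
        # Each entry in [schemas] maps a structure name to its JSON file name.
        for name, filename in config.items("schemas"):
            with open(os.path.join(schema_dir, filename)) as f:
                self.schemas[name] = json.load(f)

    def get(self, name):
        return self.schemas[name]
```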
gharchive/issue
2017-08-18T17:04:45
2025-04-01T06:37:03.511544
{ "authors": [ "ffdarkpenguin", "ricardoas" ], "repo": "HewlettPackard/oneview-redfish-toolkit", "url": "https://github.com/HewlettPackard/oneview-redfish-toolkit/issues/16", "license": "apache-2.0", "license_type": "permissive", "license_source": "bigquery" }
1097969735
The instance details page is not displayed when the spoc is not defined. Done and merged in dev branch
gharchive/issue
2022-01-10T14:30:44
2025-04-01T06:37:03.512583
{ "authors": [ "EliasBoulharts", "Sispheor" ], "repo": "HewlettPackard/squest", "url": "https://github.com/HewlettPackard/squest/issues/317", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
1698625520
SLICK [x] #32 [x] #33 ref @naedmi if possible, please also: [ ] usable error messages, and don't ignore everything in the try block - otherwise there is no way to debug anything [ ] if feasible, update existing records instead of creating a new one every time (auto increment causes problems) [ ] make the queries more readable, via names where possible ref please, thanks ^^
gharchive/issue
2023-05-06T13:07:37
2025-04-01T06:37:03.527452
{ "authors": [ "Ostabo" ], "repo": "HexxagonHTWG/Hexxagon", "url": "https://github.com/HexxagonHTWG/Hexxagon/issues/34", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1988337594
Add toggle-reverse described in issue 16 Describe the pull request Implement the feature described in issue #16. Screenshots If applicable, add screenshots to help explain what is being modified. This is a draft. For example, I see there are test files. I did not write tests for this new feature. Feel free to modify anything. Thanks for the contribution, super appreciated :raised_hands: This looks pretty straightforward and working as expected, I just added the following changes: Refactor to use an explicit direction (forward or backward) instead of a boolean reverse flag. Add a test to ensure it's working and properly loops back to the end of the list. Update the CHANGELOG file. Thanks again. Sorry for the late reply, did not see your message earlier. I think there is a difference between your case and the one from this PR you linked. The direction explicitly had 2 options that can be easily described (forward and backward), whereas in your case, it may not be that explicit with the use of a password or not. It's difficult to say without knowing or writing the code, but my first guess would be, if this happens more than a few times, to not use RequestOpenSession(true) and maybe have an extra wrapper function, e.g. RequestOpenSessionWithPassword(), which would be way more explicit for the reader of the code that would call the underlying function with the correct parameter.
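As a rough illustration of the refactor discussed above, an explicit direction type keeps call sites self-describing and makes the wrap-around to the end of the list easy to test; the same reasoning applies to the RequestOpenSessionWithPassword() wrapper idea mentioned at the end. The names below are invented for this sketch and are not the extension's real API.

```typescript
// Sketch only: `Direction` and `nextToggle` are illustrative names.
type Direction = 'forward' | 'backward'

function nextToggle(words: string[], current: string, direction: Direction): string {
  const offset = direction === 'forward' ? 1 : -1
  const index = words.indexOf(current)
  // Adding words.length before the modulo keeps the backward case positive,
  // so toggling backward from the first word loops to the end of the list.
  return words[(index + offset + words.length) % words.length]
}

// nextToggle(['true', 'false'], 'true', 'backward') === 'false'
```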
gharchive/pull-request
2023-11-10T20:25:35
2025-04-01T06:37:03.540527
{ "authors": [ "HiDeoo", "Kotsuha" ], "repo": "HiDeoo/toggler-vscode", "url": "https://github.com/HiDeoo/toggler-vscode/pull/21", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
352088010
Size mismatch in the morphing example I am trying to run the morphing example from the docs. When I get to morphing in the STRAIGHT and run the following line: mObject3 = directSTRAIGHTmorphing(neutralHai,angryHai,0.5,'linear'); I get the following error: Subscripted assignment dimension mismatch. Error in directSTRAIGHTmorphing (line 45) ap(1:nr1,1:nc1) = (1-mRate)*mObject1.aperiodicityIndex; Debugging shows that nr1 = 2049 while size(mObject1.aperiodicityIndex, 1) = 1025. I am confused as to what might cause this. Thank you for pointing this out. This problem is caused by a mismatch in the default parameter settings in exstraightsource.m and exstraightspec.m. Please edit line 230 of exstraightsource.m by changing 40 to 80: prm.F0defaultWindowLength = 80; % default frame length for pitch extraction (ms) This fixes the problem. I will check the other side effects. If it is OK, then I will update the code. I checked the side effects and found it is OK.
gharchive/issue
2018-08-20T11:06:44
2025-04-01T06:37:03.561514
{ "authors": [ "HidekiKawahara", "kalenkovich" ], "repo": "HidekiKawahara/legacy_STRAIGHT", "url": "https://github.com/HidekiKawahara/legacy_STRAIGHT/issues/1", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
209045337
Link Directly to Issue / Section Provide a link (in email output) directly to the section, where applicable. For instance, in SharePoint database monitoring, provide a link to Central Admin's database page Done, in next release Still needs some further tests
gharchive/issue
2017-02-21T05:22:45
2025-04-01T06:37:03.566031
{ "authors": [ "HiltonGiesenow" ], "repo": "HiltonGiesenow/PoShMon", "url": "https://github.com/HiltonGiesenow/PoShMon/issues/110", "license": "mit", "license_type": "permissive", "license_source": "bigquery" }
266513937
Expose the camera direction This adds the direction of the camera to the readCamera function #70 and might be useful for the expose camera rotation issue #67. cool! thx
gharchive/pull-request
2017-10-18T14:27:11
2025-04-01T06:37:03.585791
{ "authors": [ "cropd", "macrozone" ], "repo": "HippoAR/react-native-arkit", "url": "https://github.com/HippoAR/react-native-arkit/pull/93", "license": "mit", "license_type": "permissive", "license_source": "bigquery" }
325372472
Latest version 2018.2 isn't supported? Executing memory patches... Initializing modules... Processing... Cleanup... Applied patches: StringFixer: Failed (Exception has been thrown by the target of an invocation.) ResourceResolver: Failed (Init error: Could not find resolver type) AssemblyResolver: Failed (Init error: Could not find resolver type) Writing new assembly... Done. Could you double-check that the program you're deobfuscating is indeed using EazFuscator, and then upload your sample? I can confirm it's Eaz 2018.2 (Latest Release) Upload: https://www105.zippyshare.com/v/KOwGRCis/file.html Found that the issue was some naive comparisons on my part. I'm using SigComparer to compare methods to the expected string decrypter, but one of the methods in your target assembly has this code: \u0008\u2007\u2000.\u0002(1996868267) + \u0006\u2005\u2000.\u0002(16) My code thinks both of these are encrypted strings, but the second one isn't (even though it has the same signature and method name). Pushing a fix in a few minutes.
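For illustration only, the kind of extra check described above (rejecting a look-alike method that shares the decrypter's name and signature but lives in another type) could look roughly like this using dnlib types. This is an assumption-based sketch, not the fix that was actually pushed.

```csharp
using dnlib.DotNet;

internal static class DecrypterMatcher
{
    // Returns true only if the candidate matches the known decrypter on name,
    // signature, and declaring type. The look-alike method reported above
    // shares the first two but not the third, so it gets rejected here.
    public static bool IsStringDecrypterCall(MethodDef candidate, MethodDef knownDecrypter)
    {
        if (candidate.Name != knownDecrypter.Name)
            return false;
        if (!new SigComparer().Equals(candidate.MethodSig, knownDecrypter.MethodSig))
            return false;
        return candidate.DeclaringType == knownDecrypter.DeclaringType;
    }
}
```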
gharchive/issue
2018-05-22T16:30:29
2025-04-01T06:37:03.597854
{ "authors": [ "HoLLy-HaCKeR", "PR4GM4" ], "repo": "HoLLy-HaCKeR/EazFixer", "url": "https://github.com/HoLLy-HaCKeR/EazFixer/issues/9", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1554943143
"You can only submit max 100 commands pr. sync operation. Please split them up in smaller chunks" Getting the following when running python3 autodoist.py -a *** -l next -hf 2 2023-01-24 12:56:47 INFO You are running with the following functionalities: Next action labelling mode: Enabled Regenerate sub-tasks mode: Disabled Shifted end-of-day mode: Disabled 2023-01-24 12:56:49 INFO Autodoist has successfully connected to Todoist! 2023-01-24 12:56:50 INFO SQLite DB has successfully initialized! 2023-01-24 12:57:04 ERROR Error trying to sync with Todoist API: 400 Client Error: Bad Request for url: https://api.todoist.com/sync/v9/sync Traceback (most recent call last): File "/srv/dev-disk-by-uuid-12b417ae-a39e-46cb-aae6-35bf23871f11/dockerdata/autodoist/autodoist/autodoist.py", line 521, in sync response.raise_for_status() File "/usr/local/lib/python3.9/dist-packages/requests/models.py", line 1021, in raise_for_status raise HTTPError(http_error_msg, response=self) requests.exceptions.HTTPError: 400 Client Error: Bad Request for url: https://api.todoist.com/sync/v9/sync Hm, that's indeed strange. I'll give it a look too this weekend. Looks like the API key has some issues when passed to the sync API. I was not able to recreate the issue, so I've added a few additional debug logs and pushed it to a new branch '34_extra_logs'. Could you please run this version with the --debug flag on, and send me a copy with the sensitive information removed? I'm only interested in the last part where it caches the sync API errors. Many thanks. Looks like it could be a limit issue? 2023-02-11 11:51:56 DEBUG response: { "error": "You can only submit max 100 commands pr. sync operation. Please split them up in smaller chunks", "error_code": 36, "error_extra": { "event_id": "1eab52c58829436faa9d0d2a319b6052", "retry_after": 7 }, "error_tag": "LIMITS_REACHED_COMMANDS", "http_code": 400 } Hey @Hoffelhas Just checking in, if you have any thoughts on this one? Many thanks! Dug a little into this, as I understand, the following is the code that sends the batched requests off and is what will fail if there are more than 100. data = 'sync_token=' + api.sync_token + \ '&commands=' + json.dumps(api.queue) Many people will not hit this, but as I have quite a lot of tasks and not been able to use autodoist for some time, there are now too many requests to send Having a look to see if I can figure out how to only send 100 at a time and will contribute back if I manage it! Found code on splitting lists into chunks, but not fully sure how to integrate with these lines without making a mess Hi there, sorry it took a while for me to respond. Todoist indeed got quite a bit stricter with the max. amount of syncs you're allowed to send in a given period. Basically the documentation says the following: For each user, you can make a maximum of 450 partial sync requests within a 15 minute period. For each user, you can make a maximum of 45 full sync requests within a 15 minute period. The maximum number of commands is 100 per request. When adding new over time tasks, it's indeed difficult to reach these numbers. However if you have a project with >100 items, and you would activate or change labelling on the project level, then you indeed would get a batch that's too big. However, you're thinking in the right direction; it should then be split up in multiple batches with max. 100. 
This should be relatively simple: when we enter the 'if api.queue' at line 1523, we have to check if api.queue>100, if so, split it up and run each block through sync(api) currently at line 1524. However do note that if you reach the 450 changes within 15 minutes, then Todoist will hard block your connection. So even we implement this work-around, you should not label and un-label your project with >100 items more than a few times per hour. Yes, I saw that other limitation. I don't think that would really be an issue normally, even the 100 commands per request is a bit unique to initial syncs I think, or as you say, large parallel processed projects Appreciate the reply! OK, I've come up with: # Sync all queued up changes if len(api.queue) < 100: sync(api) else: start = 0 end = len(api.queue) step = 100 for i in range(start, end, step): x = i sync(api.queue[x:x+step]) But I don't think the last line is right, doesn't work when testing at least anyway. Looks like it needs to be "sync(api)", but needs to be called on the list batch Will keep trying anyway! I've done something similar to moorsey on my own project: _COMMAND_CHUNK_SIZE = 99 def _write_some_changes( headers: CaseInsensitiveDict[str], commands: list[Command] ) -> str: """Write changes to the Todoist API. :param headers: Headers for the request (produced by headers.get_headers) :param commands: list of dictionaries (commands) to add to the API :return: sync_token from the API """ resp = requests.post( SYNC_URL, headers=headers, data=json.dumps({"commands": commands}) ) resp.raise_for_status() return str(resp.json()["sync_token"]) def write_changes( sync_token: str, headers: CaseInsensitiveDict[str], commands: list[Command] ) -> str: """Write the changes to the Todoist API, one chunk at a time. :param sync_token: current sync_token, will be updated if any commands are sent :param headers: Headers for the request (produced by headers.get_headers) :param commands: list of dictionaries (commands) to add to the API :return: sync_token from the API I don't know what the soft limit is, but I get lot of bad request errors if I send 1000 commands at once. """ if not commands: return sync_token try: sync_token = _write_some_changes(headers, commands[:_COMMAND_CHUNK_SIZE]) except Exception: # give up and start the whole main loop over return "*" time.sleep(1) return write_changes(sync_token, headers, commands[_COMMAND_CHUNK_SIZE:]) It's not isolated enough to be pasted into autodoist unfortunately, but it might give clues to someone in the thread. It does work. I hit the limit often when hiding / unhiding large projects with autotagging. Just put some code together for this also. Disclaimer, had help from my friend Google Bard on this. We muddled our way through together! First pull request after being on the internet for some time, hoping @Hoffelhas is well and able to look through the other contributions soon Hoping to get my GTD game back in order after a few years lost in the ocean, now hopefully have my next action labelling back! Adding "learn python" to my projects list!
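For reference, a generic sketch of the batching idea discussed in this thread is shown below. It assumes, as in the snippet quoted earlier, that sync() serialises whatever is currently in api.queue; it is an illustration of the approach, not the change that was eventually merged.

```python
# Send queued commands in batches of at most 100, the Sync API limit
# described above.
MAX_COMMANDS_PER_SYNC = 100


def sync_in_chunks(api, sync_fn, chunk_size=MAX_COMMANDS_PER_SYNC):
    queued = list(api.queue)
    for start in range(0, len(queued), chunk_size):
        # Temporarily expose only the current batch to the existing sync logic.
        api.queue = queued[start:start + chunk_size]
        sync_fn(api)  # assumes sync() reads api.queue, as in the quoted snippet
    api.queue = []
```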
gharchive/issue
2023-01-24T13:01:32
2025-04-01T06:37:03.607974
{ "authors": [ "Hoffelhas", "ShayHill", "moorsey" ], "repo": "Hoffelhas/autodoist", "url": "https://github.com/Hoffelhas/autodoist/issues/34", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1950025145
How to select object via code Hey, thanks for this great package! I've got a 3D view and an external button to delete the selected object (like in your examples). When the object is deleted i want to preselect the nearest object in the scene. Is there a way to trigger selection (highlighting) in code? Thanks! Hi See Example1 private async Task OnSelectObjectByUUIDClick() { if (selectObjectGuid.HasValue) { await View3D1.SelectByUuidAsync(selectObjectGuid.Value); } } Ohh sorry, I overlooked that. Thank you very much!
gharchive/issue
2023-10-18T15:25:14
2025-04-01T06:37:03.623088
{ "authors": [ "ArnoSchiller", "simutaroman" ], "repo": "HomagGroup/Blazor3D", "url": "https://github.com/HomagGroup/Blazor3D/issues/32", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
2406843215
ESP32 Arduino Smart Home Project With HomeSpan(HomeKit protocol) VS. MATTER protocol Hi there! First and foremost, I'd like to express my gratitude for creating such an amazing project! I am currently working on a smart home project involving smart lights with Apple Home support. I am contemplating the pros and cons of integrating an esp32 Arduino project with Apple Home using either HomeKit or Matter protocol. Could you please provide some insights into the advantages and disadvantages of these two protocols for this specific integration? HomeSpan is better in all respects, but there is one problem that forced me to move to Matter: the MDNS issue that happens randomly. Matter is slower than HomeSpan and harder to work with than HomeSpan, but it never gives "No Response", even though it has a memory leak. I think that's right. MDNS is not universally supported the same way by every router, so in some cases it's just too unpredictable. A good case study is the commercial company Rachio - they build really nice irrigation controllers and automatic water hose valves. A few years ago they announced HomeKit compatibility, but ran into the same problems. It works really well for some people, but not very well for others. They eventually decided to stop supporting HomeKit, though they retained the software and the pairing codes so you can still use it if you'd like (just don't call them if it doesn't work). In my own home I originally used a Linksys Velop mesh network for a year and it worked fine. Then the Velop started having problems as a result of firmware updates. At first the problems did not impact HomeSpan stuff, but with each new update I had more and more problems with HomeSpan as well as commercial HomeKit devices. I eventually threw the Velop system in the garbage and switched to a NetGear Orbi mesh network and it's been rock solid for 3+ years. Not sure why some routers work better than others. I recommend starting with HomeSpan. If you load example 1 and it is stable for a few weeks, that likely means your routers fully support the MDNS protocol (at least the version used by Espressif). If unstable (using Example 1), then you may want to try Matter. If you are a developer and have good experience, I would advise you to go for the Espressif framework and not Arduino. You can control everything in Espressif. But in Arduino you will get some compiled files that can't really be changed and some flags that can't be changed. So I recommend using esp_matter over the Arduino Matter stack. But overall, HomeSpan is better: simpler and faster for development and use.
gharchive/issue
2024-07-13T10:47:23
2025-04-01T06:37:03.627873
{ "authors": [ "0100101101001011", "HomeSpan", "ronny-antoon" ], "repo": "HomeSpan/HomeSpan", "url": "https://github.com/HomeSpan/HomeSpan/issues/885", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
817801172
sshfs.c:15:10: fatal error: 'fuse_lowlevel.h' file not found brew config output when brew install datawire/blackbird/telepresence sshfs.c:15:10: fatal error: 'fuse_lowlevel.h' file not found #include <fuse_lowlevel.h> ^~~~~~~~~~~~~~~~~ 1 error generated. make[1]: *** [sshfs-sshfs.o] Error 1 make[1]: *** Waiting for unfinished jobs.... make: *** [install-recursive] Error 1 Do not report this issue to Homebrew/brew or Homebrew/core! Please create pull requests instead of asking for help on Homebrew's GitHub, Twitter or any other official channels. brew doctor output sshfs.c:15:10: fatal error: 'fuse_lowlevel.h' file not found #include <fuse_lowlevel.h> ^~~~~~~~~~~~~~~~~ 1 error generated. make[1]: *** [sshfs-sshfs.o] Error 1 make[1]: *** Waiting for unfinished jobs.... make: *** [install-recursive] Error 1 Do not report this issue to Homebrew/brew or Homebrew/core! Please create pull requests instead of asking for help on Homebrew's GitHub, Twitter or any other official channels. [X] The brew doctor above contains no "Warning" lines. What were you trying to do (and why)? sshfs.c:15:10: fatal error: 'fuse_lowlevel.h' file not found #include <fuse_lowlevel.h> ^~~~~~~~~~~~~~~~~ 1 error generated. make[1]: *** [sshfs-sshfs.o] Error 1 make[1]: *** Waiting for unfinished jobs.... make: *** [install-recursive] Error 1 Do not report this issue to Homebrew/brew or Homebrew/core! Please create pull requests instead of asking for help on Homebrew's GitHub, Twitter or any other official channels. What happened (include all command output)? sshfs.c:15:10: fatal error: 'fuse_lowlevel.h' file not found #include <fuse_lowlevel.h> ^~~~~~~~~~~~~~~~~ 1 error generated. make[1]: *** [sshfs-sshfs.o] Error 1 make[1]: *** Waiting for unfinished jobs.... make: *** [install-recursive] Error 1 Do not report this issue to Homebrew/brew or Homebrew/core! Please create pull requests instead of asking for help on Homebrew's GitHub, Twitter or any other official channels. What did you expect to happen? sshfs.c:15:10: fatal error: 'fuse_lowlevel.h' file not found #include <fuse_lowlevel.h> ^~~~~~~~~~~~~~~~~ 1 error generated. make[1]: *** [sshfs-sshfs.o] Error 1 make[1]: *** Waiting for unfinished jobs.... make: *** [install-recursive] Error 1 Do not report this issue to Homebrew/brew or Homebrew/core! Please create pull requests instead of asking for help on Homebrew's GitHub, Twitter or any other official channels. Step-by-step reproduction instructions (by running brew commands) sshfs.c:15:10: fatal error: 'fuse_lowlevel.h' file not found #include <fuse_lowlevel.h> ^~~~~~~~~~~~~~~~~ 1 error generated. make[1]: *** [sshfs-sshfs.o] Error 1 make[1]: *** Waiting for unfinished jobs.... make: *** [install-recursive] Error 1 Do not report this issue to Homebrew/brew or Homebrew/core! Please create pull requests instead of asking for help on Homebrew's GitHub, Twitter or any other official channels. Do not report this issue to Homebrew/brew or Homebrew/core! Please report this to whomever runs datawire/blackbird.
gharchive/issue
2021-02-27T02:40:23
2025-04-01T06:37:03.640883
{ "authors": [ "jonchang", "wangliguang517" ], "repo": "Homebrew/brew", "url": "https://github.com/Homebrew/brew/issues/10721", "license": "BSD-2-Clause", "license_type": "permissive", "license_source": "github-api" }
1585140777
brew irb errors immediately when HOMEBREW_PRY=1

brew doctor output

Your system is ready to brew.

Verification

[X] My "brew doctor output" above says Your system is ready to brew. and am still able to reproduce my issue.
[X] I ran brew update twice and am still able to reproduce my issue.
[X] This issue's title and/or description do not reference a single formula e.g. brew install wget. If they do, open an issue at https://github.com/Homebrew/homebrew-core/issues/new/choose instead.

brew config output

HOMEBREW_VERSION: 3.6.21-132-g724e3e6
ORIGIN: https://github.com/Homebrew/brew
HEAD: 724e3e646abf04474b083f581a963a52e39fe47d
Last commit: 34 minutes ago
Core tap origin: https://github.com/Homebrew/homebrew-core
Core tap HEAD: 0bbb89420e74756a5a5c145ed7efa4a32f7e7e7c
Core tap last commit: 2 days ago
Core tap branch: master
Core tap JSON: 15 Feb 03:07 UTC
HOMEBREW_PREFIX: /home/linuxbrew/.linuxbrew
HOMEBREW_CASK_OPTS: []
HOMEBREW_GITHUB_API_TOKEN: set
HOMEBREW_MAKE_JOBS: 2
HOMEBREW_PRY: set
Homebrew Ruby: 2.6.8 => /home/linuxbrew/.linuxbrew/Homebrew/Library/Homebrew/vendor/portable-ruby/2.6.8_1/bin/ruby
CPU: dual-core 64-bit skylake
Clang: N/A
Git: 2.39.1 => /bin/git
Curl: 7.81.0 => /bin/curl
Kernel: Linux 5.4.0-1103-azure x86_64 GNU/Linux
OS: Ubuntu 22.04.1 LTS (jammy)
Host glibc: 2.35
/usr/bin/gcc: 11.3.0
/usr/bin/ruby: N/A
glibc: N/A
gcc@11: N/A
gcc: N/A
xorg: N/A

What were you trying to do (and why)?

Use HOMEBREW_PRY=1 brew irb to programmatically examine the contents of formulae.

What happened (include all command output)?

linuxbrew@codespaces-07ca68:/workspaces/brew$ HOMEBREW_PRY=1 brew irb
Error: cannot load such file -- method_source
Please report this issue: https://docs.brew.sh/Troubleshooting
/home/linuxbrew/.linuxbrew/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/pry-0.14.2/lib/pry/pry_instance.rb:3:in `require'
/home/linuxbrew/.linuxbrew/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/pry-0.14.2/lib/pry/pry_instance.rb:3:in `<top (required)>'
/home/linuxbrew/.linuxbrew/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/pry-0.14.2/lib/pry.rb:61:in `require'
/home/linuxbrew/.linuxbrew/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/pry-0.14.2/lib/pry.rb:61:in `<top (required)>'
/home/linuxbrew/.linuxbrew/Homebrew/Library/Homebrew/dev-cmd/irb.rb:54:in `require'
/home/linuxbrew/.linuxbrew/Homebrew/Library/Homebrew/dev-cmd/irb.rb:54:in `irb'
/home/linuxbrew/.linuxbrew/Homebrew/Library/Homebrew/brew.rb:93:in `<main>'

What did you expect to happen?

For a Ruby REPL to start.

Step-by-step reproduction instructions (by running brew commands)

HOMEBREW_PRY=1 brew irb

I think this is caused by how we remove vendored gems via git. Workaround is to delete your local pry gem in vendor/bundle/ruby/... and re-trigger the install.

Hmm, yea, makes sense:

39c6f7d6fbaba464798751c2c42f8f881da00e80 is the first bad commit
commit 39c6f7d6fbaba464798751c2c42f8f881da00e80
Author: apainintheneck <apainintheneck@gmail.com>
Date: Sun Feb 5 20:54:15 2023 -0800

    Stop including `*flight` block source in cask API

    Originally we were going to try and load the *flight blocks from the API
    but we ended up going with downloading the caskfile for the subset of
    casks that need this functionality for consisty's sake.

    This reverts the following commits from most recent to oldest:
    - ffc74a51fb32b66a4cd8bc41dbd076dd23d9100e
    - e5616e94fe42505434c330be35eeafef2739944f
    - d1490c3d5c087d00f2bca1787cce331202b195c5
    - 7ca5a5d9a71a73f21bbb8555a38048f027bee89b
    - 2d5d132713d0701d02d5ff33e9918812d13d2a83

    It also changes how *flight blocks are handled in `.to_h`. Essentially,
    when *flight blocks exist they are just included as a hash of the
    artifact to nil to indicate that they exist. More information isn't
    necessary since we don't evaluate the current source code in the
    *flight artifacts that we get from the API.

 .gitignore | 1 +
 Library/Homebrew/Gemfile | 1 -
 Library/Homebrew/Gemfile.lock | 1 -
 .../cask/artifact/abstract_flight_block.rb | 6 -
 Library/Homebrew/cask/cask.rb | 3 +-
 Library/Homebrew/cask/cask_loader.rb | 6 +-
 Library/Homebrew/test/cask/cask_spec.rb | 48 ------
 .../fixtures/cask/Casks/conditional-flight.rb | 21 ---
 .../gems/method_source-1.0.0/lib/method_source.rb | 141 -------------------
 .../lib/method_source/code_helpers.rb | 154 ---------------------
 .../lib/method_source/source_location.rb | 138 ------------------
 .../lib/method_source/version.rb | 3 -
 12 files changed, 4 insertions(+), 519 deletions(-)
 delete mode 100644 Library/Homebrew/test/support/fixtures/cask/Casks/conditional-flight.rb
 delete mode 100644 Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/method_source-1.0.0/lib/method_source.rb
 delete mode 100644 Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/method_source-1.0.0/lib/method_source/code_helpers.rb
 delete mode 100644 Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/method_source-1.0.0/lib/method_source/source_location.rb
 delete mode 100644 Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/method_source-1.0.0/lib/method_source/version.rb

Ok, that worked. Thanks for the tip! Would be nice for brew to handle this better somehow, but not really sure how that would work.

I suppose having pry a part of the Gemfile probably would improve things here, though I don't think there was a universal agreement to the approach (I still think gem groups makes sense for stuff like that). There's been various other issues caused by vendor gem changes over the years (including brew doctor failures in homebrew-core), but it seems we've ironed that out a little. I'd be game to start including all these sorts of development gems that are installed optionally and/or we remove those that aren't really used.

@apainintheneck given your recent PR: is this still a problem?

This is something completely different than what was covered in my PR. I actually ran into the exact same error as @carlocab when working with pry and this thread was very helpful. I agree that it'd be nice to have a better way of handling dev dependencies but it's a relatively easy fix in this case.

This was fixed in https://github.com/Homebrew/brew/commit/d69c3ef3df7c332fdc646e0533b898b661f5fccf by vendoring pry. Thanks for cleaning this up @apainintheneck 🙇🏻
gharchive/issue
2023-02-15T03:12:27
2025-04-01T06:37:03.650647
{ "authors": [ "Bo98", "MikeMcQuaid", "apainintheneck", "carlocab" ], "repo": "Homebrew/brew", "url": "https://github.com/Homebrew/brew/issues/14634", "license": "BSD-2-Clause", "license_type": "permissive", "license_source": "github-api" }
2267012052
Documentation is replaced by auto-generated RBI comments.

brew doctor output

N/A

Verification

[X] My "brew doctor output" above says Your system is ready to brew. and am still able to reproduce my issue.
[X] I ran brew update twice and am still able to reproduce my issue.
[X] This issue's title and/or description do not reference a single formula e.g. brew install wget. If they do, open an issue at https://github.com/Homebrew/homebrew-core/issues/new/choose instead.

brew config output

N/A

What were you trying to do (and why)?

Documentation for e.g. RuboCops only shows auto-generated comments from RBI files:

From https://rubydoc.brew.sh/RuboCop/Cop/Homebrew/Blank.html:

    DO NOT EDIT MANUALLY
    This is an autogenerated file for dynamic methods in RuboCop::Cop::Homebrew::Blank.
    Please instead update this file by running bin/tapioca dsl RuboCop::Cop::Homebrew::Blank.

What happened (include all command output)?

I assume the reason is that yard-sorbet is run last. It should be changed to skip any comments in RBI files.

What did you expect to happen?

Original documentation should be shown.

Step-by-step reproduction instructions (by running brew commands)

1. `brew rubydoc --open`
2. Look at `RuboCop::Cop::Homebrew::Blank` documentation.

I opened a PR upstream: https://github.com/Shopify/tapioca/pull/1885 🤞

Thanks for issues @reitermarkus and @dduugg! We can also patch Tapioca if upstream doesn't accept the patch (early feedback isn't positive) 🙈. Something like: https://github.com/Homebrew/brew/compare/tapioca-patch?expand=1

> We can also patch Tapioca if upstream doesn't accept the patch (early feedback isn't positive) 🙈. Something like: tapioca-patch?expand=1 (compare)

@dduugg This would work fine for me, monkey-patching like this is IMO fairly inconsequential when we're not exposing it to end-users.

@dduugg, fine by me. We can always revert it once it is fixed upstream.

Upstream PR was accepted, this will be fixed in the next tapioca release.
gharchive/issue
2024-04-27T13:17:23
2025-04-01T06:37:03.659263
{ "authors": [ "MikeMcQuaid", "dduugg", "reitermarkus" ], "repo": "Homebrew/brew", "url": "https://github.com/Homebrew/brew/issues/17167", "license": "BSD-2-Clause", "license_type": "permissive", "license_source": "github-api" }
182763939
All things must come to an end eventually.

[x] Have you followed the guidelines in our Contributing document?
[x] Have you checked to ensure there aren't other open Pull Requests for the same change?
[x] Have you successfully run brew tests with your changes locally?

😢 Thanks for all your contributions and all the help you provided me!

Why not as former maintainer?

> Why not as former maintainer?

@retokromer He will be going there whether he likes it or not 😉

Yes, thanks for your contributions!

Thank you @DomT4 for your dedicated and professional work on Homebrew, it will be missed. Not to forget all the help and support you gave along my PRs. ❤️

@DomT4 I've appreciated all of your work very much. Thank you! ❤️

Wasn't quite expecting so many people to notice the PR 😅. Left a message in the team chat for the maintainer side of things, but thank you to the folks who commented here, it is very much appreciated. I guess people always say that, but it remains true to me ❤️.

Getting to work with some of you & being involved in the Homebrew community in general has been a pleasure, sometimes rather frustrating as FOSS can be, but Homebrew is one of the best open-source communities you can stumble into and contribute towards, especially if you're new or returning to the whole open-source thing and want to help out but aren't really sure where to start. I have no idea really why Mike & the team that was here at the time, a few of which remain active, trusted me with maintaining Homebrew however long ago it was now, but I managed to not completely burn the metaphorical house down so I guess it didn't work out too awfully.

Folks are welcome to ping me if there's stuff you stumble across later and think "Why?", this account will remain active even if I haven't decided how much it'll be used going forwards and your pings will get filtered into my GH mailbox. Just don't ping me to tell me you think what I did was terrible 😉.

Up to @MikeMcQuaid when this gets merged. I'll leave that in your hands.

And belatedly, congratulations on becoming a maintainer @tschoonj. Your work on the GNOME stuff has always been 💯 and I've no idea how you consistently find the patience for it.

@DomT4 As I stated in Slack but will restate here: it's been a pleasure working with you and you should be proud of all the great work you've done.

@tschoonj @reitermarkus @jawshooah Just a note that I used this chance to add you to the README/manpage too.

:sob:

@MikeMcQuaid Thanks Mike. Appreciate all the support you've given me around Homebrew over my time here. Fairly frequently you had more faith in me than I had in myself. Let me know if I need to sign or post anything to step down from the SFC PLC.

@bfontaine ❤️❤️.

Thanks for all your work Dom! It's been a pleasure working with you.

Thanks for everything Dom, from the amazing and mind-blowing amount work you contributed to Homebrew to the pleasure of working with you and the discussions we had! You've certainly played a key role in me getting more involved with Homebrew and feeling truly welcome here. :heart:

😔

@DomT4 I didn't have too much time to work with you, however I remember you finding some tricky bugs in my code and me fixing them, and it stays as a good memory for me. Thanks for all of your work on Homebrew and best of luck on all your endeavors.

@apjanke Thanks Andrew. My sincere apologies that my departure likely leaves the way Homebrew handles Java sandboxing heavily on your shoulders 🙈. Maybe not the best leaving gift in the world.

@UniqMartin I might have lasted longer if I hadn't tried to do as much of absolutely everything as I could heh, which is perhaps advice I pass on to any new maintainer in future months or years who stumbles upon this thread. Sometimes you need a break from things and I'm not very good at giving myself them. Thank you for the many discussions we had; I'm glad and thankful for any influence I had in making you feel welcome around Homebrew and a part of the team.

@vladshablinsky I learnt quite a bit from keeping an eye on your PRs. Occasionally finding interesting ways to break things was something I formed a habit of doing 😆, but that being useful is always a relief. Although we didn't get the chance to do much work together thrilled to see you join the maintainers team, Homebrew is lucky to have you around.
gharchive/pull-request
2016-10-13T11:26:51
2025-04-01T06:37:03.669442
{ "authors": [ "DomT4", "MikeMcQuaid", "UniqMartin", "apjanke", "bfontaine", "dunn", "mistydemeo", "retokromer", "scpeters", "tschoonj", "vladshablinsky", "vszakats" ], "repo": "Homebrew/brew", "url": "https://github.com/Homebrew/brew/pull/1283", "license": "BSD-2-Clause", "license_type": "permissive", "license_source": "github-api" }
2354287268
sorbet: Update RBI files.

Autogenerated by the sorbet workflow.

@Bo98 guess this is related to the bootsnap changes recently.

Yeah bit of an oversight on my part. Tapioca can't autogen the stuff in Portable Ruby, but we only use one method here so will just provide that manually (and more strongly typed than what Tapioca provides anyway).
gharchive/pull-request
2024-06-15T00:20:58
2025-04-01T06:37:03.671417
{ "authors": [ "Bo98", "BrewTestBot", "MikeMcQuaid" ], "repo": "Homebrew/brew", "url": "https://github.com/Homebrew/brew/pull/17513", "license": "BSD-2-Clause", "license_type": "permissive", "license_source": "github-api" }