1353538055
# Merge output from evolutionary search into one document

**Describe the desired feature**

As an evolutionary search :slot_machine: progresses, a series of summary files are produced, including:

- convergence__staged_relaxations.html
- convergence__time_vs_energy_per_atom.html
- distribution_of_energy_per_atom.html
- distribution_of_subworkflow_times.html
- history_of_the_best_individuals.md

Rather than writing these to five different files, I think it'd be useful if they were all written to a single html file. That way I only have to open/close one file, not five.

**Additional context**

No response

---

**To-do items** (notes by @jacksund):

- [x] condense workflow outputs to database objects
- [x] streamline Figure registration from a database table/object
- [x] allow figures to be written to file and html div with a single implementation
- [x] add default method for writing summary output from db object
- [x] link to website URL for viewing all results on a single page
- [ ] add templates and views for structure-prediction flows

---

Agreed, I'd like to move in this direction as well. 👍 I also think the same can be requested for the other summary files too (i.e. merge all outputs into a single "report" file). If we're building a larger html file, we might as well include everything we need.

Keep running with this idea, and we end up with a website/django view. Ideally, I think the only "output" files should be (1) a URL link to the website view (even a local server link) and (2) a "static html" of the website view. This addresses the many output files and moves toward a single "report" html (which is also a view in the website).

I've planned on doing this for all workflows (not just the evo search) but haven't had much time to put towards website templates. However, I can set up the basics to address this issue. Since you've built django templates before, you'll easily be able to update things once I have the main template set up.
Below are just notes on how I would write a static html locally for the view:

```python
from django.template.loader import get_template

template = get_template(template_src)
# Note: with modern Django (>= 1.10), render() takes a plain dict,
# not a django.template.Context object.
html = template.render(context_dict)
```

https://docs.djangoproject.com/en/4.1/ref/templates/api/#loading-a-template
https://docs.djangoproject.com/en/4.1/ref/templates/api/#rendering-a-context

Also see the render_to_string method:

```python
from django.template.loader import render_to_string

rendered = render_to_string('my_template.html', {'foo': 'bar'})
```

https://docs.djangoproject.com/en/4.1/topics/templates/#module-django.template.loader
https://docs.djangoproject.com/en/4.1/topics/templates/#django.template.loader.render_to_string

---

@scott-materials I've streamlined how results are used when writing output files + making the web UI. While the last few PRs might seem like overkill, they're pretty important for streamlining our code -- and not having to implement things multiple times & in multiple places. Before these PRs, the processes were isolated for results + output files and database + web UI. This was easier when starting out Simmate, but it has now reached the point where maintaining two implementations was too difficult.

Here's how it looked before:

```mermaid
graph LR
  A[Calculation run] --> B[Results];
  B --> C[Save to output files];
  B --> D[Convert to database object];
  D --> E[Save to database];
  E --> F[Build into html template];
```

Here's how things work now:

```mermaid
graph LR
  A[Calculation run] --> B[Results];
  B --> C[Convert to database object];
  C --> D[Save to database];
  D --> E[Save to output files];
  D --> F[Build into html template];
```

**Adding new plots**

Now if you have a script that builds a plot from a database table, I can immediately add it to both output files AND the website UI. This will be very important as Simmate grows.
How to build a new plot:

```python
from simmate.visualization.plotting import PlotlyFigure

class MyNewPlot(PlotlyFigure):
    def get_plot(table, chemical_system):
        # grab data from the table
        data = table.filter(...).all()
        # .... make a plotly figure and return the object ....
        return plot
```

**End goal**

Ultimately, there will be a link to the website pages, but no local "report" file. The simmary_summary.yaml file will contain everything you need -- URL link, database table, database id. Saving a local html has complications:

1. links and static assets will not work
2. building the html will be slow and affect workflow speed
3. building a minimal template falls back to the original issue (maintaining two implementations of the same thing)

So this issue will be closed once there is a website view for structure-prediction workflows. There won't be a single file as you originally requested, but there will be the (even better) option of viewing things in the website UI.

---

Impressive! I like it!

---

@scott-materials I haven't done any html formatting, but everything is in a single page now. You can check out the html here. In the template, calculation is the database table object (so FixedCompositionSearch.objects.get(id=123))
gharchive/issue
2022-08-29T01:04:45
2025-04-01T06:39:09.769963
{ "authors": [ "jacksund", "scott-materials" ], "repo": "jacksund/simmate", "url": "https://github.com/jacksund/simmate/issues/262", "license": "BSD-3-Clause", "license_type": "permissive", "license_source": "github-api" }
1338005028
# Links to notes with apostrophes ' in the name are inactive

**Describe the bug**

If you (wiki)link a note that has a ' in its name, it will be shown as inactive. The note can be found via search, but the note's address does not reflect the presence of the ' either (http://localhost:1313/Links-inactive/).

**To Reproduce**

1. Create a note with ' in its name
2. Link it from another note (e.g. [[what's happening?]])

**Screenshots**

You can also see doubled letters in the search. This happens sometimes, but I haven't figured out when exactly.

**Desktop:**

- OS: happens when I build locally (HUGO v0.101.0, latest hugo-obsidian) and through the Quartz GitHub action.
- Browser: tested on Brave (mobile) and Firefox (desktop)

---

Suspect this is due to ' being converted to ’ on Hugo render, will look into it

---

The same goes for `.`. I had notes that were called plural.sh or restack.io; these were not working as file names. The workaround was renaming them.
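The suspected cause above can be illustrated with a minimal sketch (the `slugify` helper below is a hypothetical stand-in, not Hugo's or hugo-obsidian's actual slug logic): the wikilink text keeps the straight apostrophe `'`, while the rendered title has been smart-quoted to `’`, so the two sides produce different slugs and the link never resolves.

```python
def slugify(name: str) -> str:
    # Hypothetical slug helper, stand-in for the real slug logic.
    return name.lower().replace(" ", "-")

link_text = "what's happening?"                    # straight apostrophe, as typed
rendered_title = link_text.replace("'", "\u2019")  # Hugo smart-quotes ' to ’

# The slugs no longer match, so the wikilink is treated as inactive.
assert slugify(link_text) != slugify(rendered_title)
```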
gharchive/issue
2022-08-13T16:39:42
2025-04-01T06:39:09.786860
{ "authors": [ "demicuz", "jackyzha0", "sspaeti" ], "repo": "jackyzha0/quartz", "url": "https://github.com/jackyzha0/quartz/issues/176", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1634231927
# Validation

Great work! How can I trust the confidence intervals it returns - did you do any kind of validation - maybe comparing results with the standard R implementations? It would be cool to have a (small) test report for some examples - in case you are in a regulated environment.

---

Hi,

The validation I did do was comparing the results of the bootstrap method with the analytical implementation. I didn't compare with R (I'm also not an R user).

A test against R is a very good idea! What R functions/packages do you recommend testing this against? I guess it would be cool to have a unit test where the results are compared against R, maybe using https://rpy2.github.io

(Btw, any contribution around this would be much appreciated!)
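The bootstrap-vs-analytical cross-check described above can be sketched roughly as follows (a toy illustration using only the standard library, not the package's actual API): compute a Wald interval for an accuracy score and a percentile-bootstrap interval from the same 0/1 outcomes, then check that they roughly agree.

```python
import random
import statistics

def analytic_ci(p_hat, n, z=1.96):
    """Normal-approximation (Wald) interval for a proportion."""
    se = (p_hat * (1.0 - p_hat) / n) ** 0.5
    return p_hat - z * se, p_hat + z * se

def bootstrap_ci(outcomes, n_boot=2000, alpha=0.05, seed=0):
    """Percentile-bootstrap interval for the mean of 0/1 outcomes."""
    rng = random.Random(seed)
    means = sorted(
        statistics.mean(rng.choices(outcomes, k=len(outcomes)))
        for _ in range(n_boot)
    )
    lo = means[int(n_boot * alpha / 2)]
    hi = means[int(n_boot * (1 - alpha / 2)) - 1]
    return lo, hi

# Both intervals for 80 correct predictions out of 100:
outcomes = [1] * 80 + [0] * 20
a_lo, a_hi = analytic_ci(0.8, 100)
b_lo, b_hi = bootstrap_ci(outcomes)
```

A unit test along these lines (asserting the two intervals agree to within sampling noise) is the kind of check that could later be extended to compare against R via rpy2.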
gharchive/issue
2023-03-21T16:01:36
2025-04-01T06:39:09.792409
{ "authors": [ "hubtub2", "jacobgil" ], "repo": "jacobgil/confidenceinterval", "url": "https://github.com/jacobgil/confidenceinterval/issues/1", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
681503588
# Hope to provide more detailed content about embeddings.json.gz

Hello, I am very interested in your research. I hope to get the details about embeddings.json.gz: the correspondence among words - word embeddings - instruction_tokens. I would be very grateful if I could get your reply.

---

Hi @Dominique-github,

Thank you for your interest! In the data format displayed here https://jacobkrantz.github.io/vlnce/data, instruction_text is the raw R2R instruction string. We then performed some preprocessing to derive instruction_tokens:

1. We used this tokenization function https://github.com/facebookresearch/habitat-lab/blob/v0.1.5/habitat/datasets/utils.py#L25 to get a list of words for each instruction.
2. We then checked if each word was present in the pre-trained GloVe embedding file glove.6B.50d.txt available here https://nlp.stanford.edu/projects/glove/. If the word was present in the GloVe file, we mapped it to a unique integer >= 2. If it was not, we mapped that word to a value of 1 (representing unknown).
3. We padded each instruction with values of 0 to reach a length of 200.

The mapping between word and integer can be found in the instruction_vocab accompanying each dataset split. We computed this vocab for all splits together (train, val_seen, val_unseen), so instruction_vocab is identical for each split.

The embeddings file is a list of word embeddings. Indices in the list correspond to mapped word integer tokens in instruction_tokens and instruction_vocab. For example, embeddings[0] is the 50-dimensional zero vector for the pad token. We set the embedding for unknown words (index 1) to the mean of the other word embeddings that exist in R2R. All the rest of the embeddings are from glove.6B.50d.txt.

---

Thank you for your detailed explanation, it is of great help to me.
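The preprocessing steps described above can be sketched as follows (a minimal illustration; the function names are mine, not those of the actual VLN-CE code):

```python
PAD_TOKEN = 0  # padding
UNK_TOKEN = 1  # word not found in glove.6B.50d.txt

def build_vocab(tokenized_instructions, glove_words):
    """Assign each GloVe-covered word a unique integer >= 2."""
    vocab = {"<pad>": PAD_TOKEN, "<unk>": UNK_TOKEN}
    for words in tokenized_instructions:
        for word in words:
            if word in glove_words and word not in vocab:
                vocab[word] = len(vocab)
    return vocab

def to_instruction_tokens(words, vocab, max_len=200):
    """Map words to integer tokens (unknown -> 1), pad with 0 to max_len."""
    tokens = [vocab.get(word, UNK_TOKEN) for word in words]
    return tokens + [PAD_TOKEN] * (max_len - len(tokens))
```

The embeddings list is then indexed by these tokens: index 0 holds the zero vector for padding, index 1 the mean of the known vectors, and each index >= 2 the GloVe vector for the corresponding instruction_vocab word.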
gharchive/issue
2020-08-19T02:43:02
2025-04-01T06:39:09.796986
{ "authors": [ "Dominique-github", "jacobkrantz" ], "repo": "jacobkrantz/VLN-CE", "url": "https://github.com/jacobkrantz/VLN-CE/issues/2", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
262156407
# DESKLINK.DAT is not created if missing

If DESKLINK.DAT is missing, it should be created automatically with the default icons for built-in apps, since there is no way to add these from within the UI.

Possible solution:

1. Get the data of the current standard icons in DESKLINK.DAT.
2. Make a SUB that resets the icon arrays, then adds the standard elements to the array. At the end of the SUB, call the SaveLink SUB.
3. Make sure DESKLINK.DAT will be deleted - if this is not already in SaveLink, do it in the new SUB before SaveLink is called.
4. The LoadLink SUB already checks for errors. Simply call the new SUB when errors are found.

---

Implemented in commit d5b8672
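The reset-and-save flow above, sketched in Python for illustration (the names below are placeholders for the QBasic SUBs and data, not the actual Costa code):

```python
import os

DESKLINK_PATH = "DESKLINK.DAT"
# Placeholder default icon data, not the real built-in app entries.
DEFAULT_ICONS = [("My Computer", "computer.ico"), ("Trash", "trash.ico")]

def save_links(links):
    """Stand-in for SaveLink: write the icon list to DESKLINK.DAT."""
    with open(DESKLINK_PATH, "w") as f:
        for name, icon in links:
            f.write(f"{name};{icon}\n")

def reset_links():
    """Stand-in for the new SUB: reset to defaults, delete the old file, save."""
    links = list(DEFAULT_ICONS)
    if os.path.exists(DESKLINK_PATH):
        os.remove(DESKLINK_PATH)  # make sure the old file is gone before saving
    save_links(links)
    return links

def load_links():
    """Stand-in for LoadLink: on error (file missing), recreate the defaults."""
    try:
        with open(DESKLINK_PATH) as f:
            return [tuple(line.rstrip("\n").split(";")) for line in f]
    except OSError:
        return reset_links()
```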
gharchive/issue
2017-10-02T17:11:37
2025-04-01T06:39:09.799399
{ "authors": [ "jacobpalm" ], "repo": "jacobpalm/costa", "url": "https://github.com/jacobpalm/costa/issues/3", "license": "mit", "license_type": "permissive", "license_source": "bigquery" }
2564664048
# LRCLIB causing crash despite being disabled

**Steps to reproduce**

1. Manually edit lyrics for a song
2. Wait for the backend to attempt to make a search request to LRCLIB
3. Experience crash

**Expected behavior**

foobar2000 version: 2.2
OpenLyrics version: 1.11

**Debug logs**

Illegal operation: Code: C0000005h, flags: 00000000h, address: 00007FFE5A536107h Access violation, operation: read, address: 0000000000000090h Call path not available. Code bytes (00007FFE5A536107h): 00007FFE5A5360C7h: E6 FF FF E9 10 FD FF FF 90 68 5F 00 00 1A 5D 00 00007FFE5A5360D7h: 00 1A 5D 00 00 1A 5D 00 00 20 5D 00 00 1A 5D 00 00007FFE5A5360E7h: 00 CC CC CC CC CC CC CC CC 40 53 48 83 EC 30 48 00007FFE5A5360F7h: 8B D9 48 85 C9 74 1E 0F B6 C3 24 0F 3C 01 74 1E 00007FFE5A536107h: 81 39 B0 01 00 00 72 0D 81 79 04 51 55 55 55 0F 00007FFE5A536117h: 84 E0 00 00 00 33 C0 48 83 C4 30 5B C3 CC 48 C1 00007FFE5A536127h: EB 04 48 83 FB 46 73 ED 48 89 6C 24 48 48 89 74 00007FFE5A536137h: 24 50 48 89 7C 24 58 4C 89 7C 24 20 4C 89 74 24 Registers: RAX: 0000000000000000, RBX: 0000000000000090, RCX: 0000000000000090, RDX: 0000000000000000 RSI: 00000010E2CFF4B0, RDI: 00007FFE5A551000, RBP: 00000010E2CFF4E0, RSP: 00000010E2CFF350 Timestamp: 232141ms Crash location: Module: bcrypt Offset: 6107h Symbol: "BCryptGetProperty" (+507h) Loaded modules: foobar2000 loaded at 00007FF62F130000h - 00007FF62F57D000h ntdll loaded at 00007FFE5D7E0000h - 00007FFE5DA43000h KERNEL32 loaded at 00007FFE5C090000h - 00007FFE5C157000h KERNELBASE loaded at 00007FFE5AC90000h - 00007FFE5B041000h SHLWAPI loaded at 00007FFE5CB30000h - 00007FFE5CB8D000h msvcrt loaded at 00007FFE5B6E0000h - 00007FFE5B789000h COMCTL32 loaded at 00007FFE41ED0000h - 00007FFE42160000h WINMM loaded at 00007FFE49D50000h - 00007FFE49D86000h GDI32 loaded at 00007FFE5B790000h - 00007FFE5B7BA000h USER32 loaded at 00007FFE5CD60000h - 00007FFE5CF23000h ucrtbase loaded at 00007FFE5B390000h - 00007FFE5B4DB000h win32u loaded at 00007FFE5B050000h - 00007FFE5B077000h gdi32full loaded at
00007FFE5B260000h - 00007FFE5B385000h ADVAPI32 loaded at 00007FFE5C930000h - 00007FFE5C9E2000h msvcp_win loaded at 00007FFE5ABE0000h - 00007FFE5AC83000h UxTheme loaded at 00007FFE57FD0000h - 00007FFE5807D000h sechost loaded at 00007FFE5BD60000h - 00007FFE5BE06000h combase loaded at 00007FFE5B870000h - 00007FFE5BBEC000h RPCRT4 loaded at 00007FFE5D0B0000h - 00007FFE5D1C6000h SHELL32 loaded at 00007FFE5C1B0000h - 00007FFE5C8A8000h ole32 loaded at 00007FFE5BED0000h - 00007FFE5C069000h OLEAUT32 loaded at 00007FFE5BC80000h - 00007FFE5BD56000h CRYPT32 loaded at 00007FFE5B4E0000h - 00007FFE5B656000h zlib1 loaded at 00007FFE57A70000h - 00007FFE57A8D000h sqlite3 loaded at 00007FFE1DB70000h - 00007FFE1DC64000h shared loaded at 00007FFE3C260000h - 00007FFE3C288000h MSVCP140 loaded at 00007FFE3C1D0000h - 00007FFE3C25D000h MSVCP140_ATOMIC_WAIT loaded at 00007FFE3C1B0000h - 00007FFE3C1C4000h MSIMG32 loaded at 00007FFE42340000h - 00007FFE42348000h OLEACC loaded at 00007FFE41C30000h - 00007FFE41CAC000h imagehlp loaded at 00007FFE5BEA0000h - 00007FFE5BEC0000h AVRT loaded at 00007FFE54D00000h - 00007FFE54D0B000h COMDLG32 loaded at 00007FFE5D1D0000h - 00007FFE5D2BD000h shcore loaded at 00007FFE5C9F0000h - 00007FFE5CAC3000h gdiplus loaded at 00007FFE42170000h - 00007FFE4233A000h WINHTTP loaded at 00007FFE54750000h - 00007FFE54876000h VCRUNTIME140_1 loaded at 00007FFE535C0000h - 00007FFE535CC000h Secur32 loaded at 00007FFE57BE0000h - 00007FFE57BED000h VCRUNTIME140 loaded at 00007FFE3C170000h - 00007FFE3C18E000h dbghelp loaded at 00007FFE47740000h - 00007FFE47981000h dbgcore loaded at 00007FFE28E60000h - 00007FFE28E99000h SSPICLI loaded at 00007FFE59DD0000h - 00007FFE59E18000h IMM32 loaded at 00007FFE5D760000h - 00007FFE5D78F000h kernel.appcore loaded at 00007FFE59B20000h - 00007FFE59B3A000h bcryptPrimitives loaded at 00007FFE5B140000h - 00007FFE5B1D9000h windows.storage loaded at 00007FFE58910000h - 00007FFE5913B000h MSCTF loaded at 00007FFE5CF30000h - 00007FFE5D08A000h atlthunk loaded 
at 00007FFE19630000h - 00007FFE1963D000h textinputframework loaded at 00007FFE49300000h - 00007FFE49444000h TextShaping loaded at 00007FFE25540000h - 00007FFE255EC000h CoreMessaging loaded at 00007FFE57620000h - 00007FFE57745000h CoreUIComponents loaded at 00007FFE53130000h - 00007FFE53413000h wintypes loaded at 00007FFE53C70000h - 00007FFE53DD8000h CRYPTBASE loaded at 00007FFE5A310000h - 00007FFE5A31C000h foo_ui_std loaded at 00007FFDDB730000h - 00007FFDDB96F000h dwmapi loaded at 00007FFE584A0000h - 00007FFE584CE000h foo_fileops loaded at 00007FFDF2E90000h - 00007FFDF2F18000h foo_quicksearch loaded at 0000000180000000h - 000000018008E000h WindowsCodecs loaded at 00007FFE57DA0000h - 00007FFE57FCA000h foo_metronome loaded at 00007FFE21E90000h - 00007FFE21E9E000h foo_play_next loaded at 00007FFE1FA70000h - 00007FFE1FA7F000h foo_dsp_utility loaded at 00007FFE14370000h - 00007FFE14398000h foo_uie_lyrics3 loaded at 00000123AB100000h - 00000123AB1B0000h WININET loaded at 00007FFE40490000h - 00007FFE40713000h foo_quicktag loaded at 00007FFE13530000h - 00007FFE13555000h foo_dsp_std loaded at 00007FFDF99A0000h - 00007FFDF99E2000h foo_openlyrics loaded at 00007FFDDAC10000h - 00007FFDDADFF000h d2d1 loaded at 00007FFE565B0000h - 00007FFE56BEA000h bcrypt loaded at 00007FFE5A530000h - 00007FFE5A556000h d3d11 loaded at 00007FFE56BF0000h - 00007FFE56E4D000h DWrite loaded at 00007FFE56340000h - 00007FFE565A5000h dxgi loaded at 00007FFE58250000h - 00007FFE58371000h directxdatabasehelper loaded at 00007FFE58100000h - 00007FFE58157000h foo_input_std loaded at 00007FFDDA9C0000h - 00007FFDDAC0A000h MSACM32 loaded at 00007FFDF9B80000h - 00007FFDF9BA1000h avformat-fb2k-60 loaded at 00007FFDF9B40000h - 00007FFDF9B71000h avcodec-fb2k-60 loaded at 00007FFDDA600000h - 00007FFDDA828000h avutil-fb2k-58 loaded at 00007FFDDA400000h - 00007FFDDA5F7000h foo_audioscrobbler loaded at 00000123AB1B0000h - 00000123AB1D6000h foo_run_main loaded at 00007FFE1F6B0000h - 00007FFE1F6BE000h foo_freedb2 loaded 
at 00007FFDF9910000h - 00007FFDF995B000h foo_beefweb loaded at 00007FFDDA2C0000h - 00007FFDDA3FF000h WS2_32 loaded at 00007FFE5C8B0000h - 00007FFE5C924000h MSWSOCK loaded at 00007FFE5A030000h - 00007FFE5A098000h foo_ui_columns loaded at 00007FFDC2F20000h - 00007FFDC335B000h USP10 loaded at 00007FFDF3780000h - 00007FFDF3799000h urlmon loaded at 00007FFE40A40000h - 00007FFE40C18000h iertutil loaded at 00007FFE40750000h - 00007FFE40A12000h srvcli loaded at 00007FFE40720000h - 00007FFE40749000h netutils loaded at 00007FFE593E0000h - 00007FFE593ED000h foo_dsp_eq loaded at 00007FFDF1D90000h - 00007FFDF1E18000h foo_vis_spectrum_analyzer loaded at 00007FFDF1B80000h - 00007FFDF1BF5000h foo_dsp_effect loaded at 00000123AB250000h - 00000123AB2C3000h foo_unpack loaded at 00007FFDEFAF0000h - 00007FFDEFB87000h foo_vis_milk2 loaded at 00007FFDB9C10000h - 00007FFDBA064000h D3DCOMPILER_47 loaded at 00007FFE55E90000h - 00007FFE5630F000h cryptsp loaded at 00007FFE5A2F0000h - 00007FFE5A30C000h foo_queue_viewer loaded at 00007FFDF36D0000h - 00007FFDF3729000h foo_whatsnew loaded at 00007FFDF31F0000h - 00007FFDF3226000h foo_uie_typefind loaded at 00000123AB420000h - 00000123AB463000h foo_taskbar_playback_progress_b loaded at 00007FFE1DAD0000h - 00007FFE1DADC000h clbcatq loaded at 00007FFE5B7C0000h - 00007FFE5B868000h explorerframe loaded at 00007FFE19710000h - 00007FFE19998000h foo_uie_console loaded at 00007FFDF1D40000h - 00007FFDF1D85000h foo_uie_tagger_mod loaded at 00000123AB4B0000h - 00000123AB52E000h foo_uie_eslyric loaded at 00007FFDB95F0000h - 00007FFDB9C04000h VERSION loaded at 00007FFE54C90000h - 00007FFE54C9B000h foo_httpcontrol loaded at 00007FFDEB790000h - 00007FFDEB803000h MPR loaded at 00007FFE3BA90000h - 00007FFE3BAB1000h foo_converter loaded at 00007FFDDA8E0000h - 00007FFDDA9B7000h foo_wave_minibar_mod loaded at 00007FFDEB880000h - 00007FFDEB8D2000h foo_out_upnp loaded at 00007FFDEB640000h - 00007FFDEB6E2000h IPHLPAPI loaded at 00007FFE59420000h - 00007FFE59451000h 
foo_playcount loaded at 00007FFDF1B40000h - 00007FFDF1B80000h foo_dsp_audiostretch loaded at 00007FFE1BD20000h - 00007FFE1BD36000h foo_loop loaded at 00007FFE1D9C0000h - 00007FFE1D9CE000h foo_scrobble loaded at 00007FFDD9020000h - 00007FFDD9145000h CONCRT140 loaded at 00007FFDE4650000h - 00007FFDE469D000h DPAPI loaded at 00007FFE5A920000h - 00007FFE5A92A000h Windows.UI loaded at 00007FFE418C0000h - 00007FFE41A14000h Windows.UI.Immersive loaded at 00007FFE27330000h - 00007FFE2747C000h twinapi.appcore loaded at 00007FFE4EB40000h - 00007FFE4ED77000h profapi loaded at 00007FFE5AB00000h - 00007FFE5AB29000h dataexchange loaded at 00007FFE263A0000h - 00007FFE263FA000h MMDevApi loaded at 00007FFE4F2C0000h - 00007FFE4F350000h DEVOBJ loaded at 00007FFE5A860000h - 00007FFE5A88D000h cfgmgr32 loaded at 00007FFE5A8B0000h - 00007FFE5A90F000h NSI loaded at 00007FFE5D0A0000h - 00007FFE5D0AA000h dhcpcsvc6 loaded at 00007FFE54D10000h - 00007FFE54D2E000h dhcpcsvc loaded at 00007FFE54CD0000h - 00007FFE54CF5000h DNSAPI loaded at 00007FFE594B0000h - 00007FFE595D1000h tiptsf loaded at 00007FFDE1FA0000h - 00007FFDE203F000h msiltcfg loaded at 00007FFE3FC20000h - 00007FFE3FC2B000h msi loaded at 00007FFE3B650000h - 00007FFE3B997000h sxs loaded at 00007FFE5AA30000h - 00007FFE5AAD1000h UIAutomationCore loaded at 00007FFE29770000h - 00007FFE29BA2000h dxcore loaded at 00007FFE580B0000h - 00007FFE580F0000h nvldumdx loaded at 00007FFE53420000h - 00007FFE534E4000h msasn1 loaded at 00007FFE5A360000h - 00007FFE5A373000h cryptnet loaded at 00007FFE52FC0000h - 00007FFE52FFB000h wldp loaded at 00007FFE5A3D0000h - 00007FFE5A42D000h drvstore loaded at 00007FFE52E40000h - 00007FFE52FB1000h wintrust loaded at 00007FFE5B1E0000h - 00007FFE5B25A000h PP-UWP-Interop loaded at 00007FFE1CFC0000h - 00007FFE1CFCB000h vccorlib140 loaded at 00007FFDE2420000h - 00007FFDE2475000h Windows.Media.Playback.Backgrou loaded at 00007FFDC3640000h - 00007FFDC3716000h MFPlat loaded at 00007FFE54FF0000h - 00007FFE551F6000h RTWorkQ 
loaded at 00007FFE54DB0000h - 00007FFE54DE6000h Windows.Media.MediaControl loaded at 00007FFDD9150000h - 00007FFDD91DD000h MFMediaEngine loaded at 00007FFDB8B80000h - 00007FFDB8FA4000h powrprof loaded at 00007FFE599D0000h - 00007FFE59A1E000h XmlLite loaded at 00007FFE55200000h - 00007FFE5523B000h UMPDC loaded at 00007FFE599B0000h - 00007FFE599C4000h AUDIOSES loaded at 00007FFE49550000h - 00007FFE49707000h Windows.Media.Devices loaded at 00007FFE472D0000h - 00007FFE473BE000h rsaenh loaded at 00007FFE59A80000h - 00007FFE59AB8000h Windows.Media.Playback.ProxyStu loaded at 00007FFE13890000h - 00007FFE138AA000h nvgpucomp64 loaded at 00007FFE4F730000h - 00007FFE52440000h OneCoreUAPCommonProxyStub loaded at 00007FFE4E280000h - 00007FFE4E8BE000h nvwgf2umx loaded at 00007FFE49D90000h - 00007FFE4E20C000h nvspcap64 loaded at 00007FFE25050000h - 00007FFE2534D000h ntmarta loaded at 00007FFE59C40000h - 00007FFE59C75000h nvppex loaded at 00007FFE24EF0000h - 00007FFE25045000h resourcepolicyclient loaded at 00007FFE58530000h - 00007FFE58544000h PROPSYS loaded at 00007FFE55750000h - 00007FFE55844000h rasadhlp loaded at 00007FFE4F5F0000h - 00007FFE4F5FB000h fwpuclnt loaded at 00007FFE53890000h - 00007FFE53916000h WINNSI loaded at 00007FFE57D10000h - 00007FFE57D1E000h webio loaded at 00007FFE39600000h - 00007FFE396C3000h schannel loaded at 00007FFE598E0000h - 00007FFE599A4000h ncrypt loaded at 00007FFE5A4F0000h - 00007FFE5A520000h NTASN1 loaded at 00007FFE5A4A0000h - 00007FFE5A4DF000h ncryptsslp loaded at 00007FFE39FA0000h - 00007FFE39FCD000h Stack dump analysis: Address: 00007FFE3C267F1Eh (shared+7F1Eh), symbol: "uCallStackTracker::uCallStackTracker" (+13Eh) Address: 00007FF62F4DCFA8h (foobar2000+3ACFA8h) Address: 00007FFE5A534E40h (bcrypt+4E40h), symbol: "BCryptCloseAlgorithmProvider" (+30h) Address: 00007FFE5B39DDABh (ucrtbase+DDABh), symbol: "free_base" (+1Bh) Address: 00007FF62F4DCFA8h (foobar2000+3ACFA8h) Address: 00007FFDDAC59ED2h (foo_openlyrics+49ED2h), symbol: 
"run_mvtf_tests" (+92B2h) Address: 00007FFDDAC5B000h (foo_openlyrics+4B000h), symbol: "run_mvtf_tests" (+A3E0h) Address: 00007FF62F4DCFA8h (foobar2000+3ACFA8h) Address: 00007FFDDAC5B066h (foo_openlyrics+4B066h), symbol: "run_mvtf_tests" (+A446h) Address: 00007FFDDADD7DC8h (foo_openlyrics+1C7DC8h), symbol: "foobar2000_get_interface" (+14EE98h) Address: 00007FFE5B4C88B0h (ucrtbase+1388B0h), symbol: "mbcasemap" (+D0h) Address: 00007FFE5B3B7F19h (ucrtbase+27F19h), symbol: "malloc_base" (+39h) Address: 00007FFE5B4C88B0h (ucrtbase+1388B0h), symbol: "mbcasemap" (+D0h) Address: 00007FFDDACD1F7Bh (foo_openlyrics+C1F7Bh), symbol: "foobar2000_get_interface" (+4904Bh) Address: 00007FFDDACD1F7Bh (foo_openlyrics+C1F7Bh), symbol: "foobar2000_get_interface" (+4904Bh) Address: 00007FFDDAC2CF5Eh (foo_openlyrics+1CF5Eh), symbol: "cJSON_free" (+1406Eh) Address: 00007FFDDAC2C86Dh (foo_openlyrics+1C86Dh), symbol: "cJSON_free" (+1397Dh) Address: 00007FFDDAC2C2EDh (foo_openlyrics+1C2EDh), symbol: "cJSON_free" (+133FDh) Address: 00007FF62F4DCFA8h (foobar2000+3ACFA8h) Address: 00007FF62F4DD0C8h (foobar2000+3AD0C8h) Address: 00007FFDDADD7DC8h (foo_openlyrics+1C7DC8h), symbol: "foobar2000_get_interface" (+14EE98h) Address: 00007FFDDADD7DC8h (foo_openlyrics+1C7DC8h), symbol: "foobar2000_get_interface" (+14EE98h) Address: 00007FFDDAC5B000h (foo_openlyrics+4B000h), symbol: "run_mvtf_tests" (+A3E0h) Address: 00007FFDDADD7DC8h (foo_openlyrics+1C7DC8h), symbol: "foobar2000_get_interface" (+14EE98h) Address: 00007FFDDAC3168Eh (foo_openlyrics+2168Eh), symbol: "cJSON_free" (+1879Eh) Address: 00007FF62F4DCFA8h (foobar2000+3ACFA8h) Address: 00007FF62F4DCFA8h (foobar2000+3ACFA8h) Address: 00007FF62F4DD0C8h (foobar2000+3AD0C8h) Address: 00007FFE5B3B56EFh (ucrtbase+256EFh), symbol: "towlower_l" (+E4Fh) Address: 00007FFDDAC9206Fh (foo_openlyrics+8206Fh), symbol: "foobar2000_get_interface" (+913Fh) Address: 00007FFE5B3A4EA0h (ucrtbase+14EA0h), symbol: "wcsrchr" (+1F0h) Address: 00007FFE5C0BDBE7h 
(KERNEL32+2DBE7h), symbol: "BaseThreadInitThunk" (+17h) Address: 00007FFE5D865A4Ch (ntdll+85A4Ch), symbol: "RtlUserThreadStart" (+2Ch) Address: 00007FFE5ADAD360h (KERNELBASE+11D360h), symbol: "UnhandledExceptionFilter" (+0h) Environment: App: foobar2000 v2.2 preview 2024-09-11 Arch: x64 UI: Columns UI 2.1.0 Components: Core (2024-09-11 14:11:10 UTC) foobar2000 core 2.2 preview 2024-09-11 foo_audioscrobbler (2022-09-06 23:33:52 UTC) Audioscrobbler 1.5.0 foo_beefweb (2023-09-03 13:10:02 UTC) Beefweb Remote Control 0.8 foo_converter (2024-09-11 14:11:42 UTC) Converter 2.2 preview 2024-09-11 foo_dsp_audiostretch (2023-04-21 17:50:58 UTC) Audio-Stretch 0.2 foo_dsp_effect (2024-05-03 06:52:38 UTC) Effect DSP 0.51 foo_dsp_eq (2024-09-11 14:11:46 UTC) Equalizer 1.2.3 foo_dsp_std (2024-09-11 14:11:48 UTC) Standard DSP Array 2.2 preview 2024-09-11 foo_dsp_utility (2023-02-23 23:27:50 UTC) Utility DSP Array 1.3.2 foo_fileops (2024-09-11 14:11:52 UTC) File Operations 2.2 preview 2024-09-11 foo_freedb2 (2024-09-11 14:11:56 UTC) Online Tagger 0.10 foo_httpcontrol (2023-05-21 12:54:40 UTC) HTTP Control 0.97.28 foo_input_std (2024-09-11 14:11:38 UTC) CD Audio Decoder 2.2 preview 2024-09-11 FFmpeg Decoders 6.0 FLAC Decoder 1.4.3 Monkey's Audio Decoder 10.61 Opus Decoder 1.5.2 Standard Input Array 2.2 preview 2024-09-11 WavPack Decoder 5.7.0 foo_loop (2023-04-30 20:11:30 UTC) Loop 1.6 foo_metronome (2022-10-22 21:06:24 UTC) Metronome 1.2 foo_openlyrics (2024-09-06 00:28:24 UTC) OpenLyrics 1.11 foo_out_upnp (2022-08-29 21:33:36 UTC) UPnP MediaRenderer Output 1.4 foo_play_next (2023-03-16 19:06:30 UTC) Play Next 0.2.3 foo_playcount (2023-03-14 19:04:18 UTC) Playback Statistics 3.1.5 foo_queue_viewer (2023-04-28 03:43:14 UTC) Queue Viewer 1.0.22 foo_quicksearch (2024-06-17 16:09:24 UTC) Quick Search Toolbar 3.9 foo_quicktag (2022-09-22 23:42:30 UTC) Quick Tagger 1.1.1 foo_run_main (2022-11-14 00:16:56 UTC) Run Main 1.0.2 foo_scrobble (2022-09-06 04:43:00 UTC) Scrobble 1.6.0.22456 
foo_taskbar_playback_progress_bar (2022-09-25 14:01:16 UTC) Taskbar Playback Progress Bar 1.1.3 foo_ui_columns (2023-09-27 02:19:32 UTC) Columns UI 2.1.0 foo_ui_std (2024-09-11 14:11:24 UTC) Album List 2.2 preview 2024-09-11 Decoding Speed Test 2.2 preview 2024-09-11 Default User Interface 2.2 preview 2024-09-11 File Integrity Verifier 2.2 preview 2024-09-11 foo_uie_console (2023-05-07 01:46:38 UTC) Console panel 3.0.0 foo_uie_eslyric (2023-12-25 02:32:36 UTC) ESLyric 0.5.4.1028 (Beta) foo_uie_lyrics3 (2023-11-18 15:53:36 UTC) Lyric Show Panel 3 0.6 foo_uie_tagger_mod (2023-04-22 16:35:44 UTC) Tagger Panel 2.0.0 foo_uie_typefind (2023-11-03 00:43:42 UTC) Typefind 0.4.0 foo_unpack (2024-09-11 14:12:04 UTC) ZIP/GZIP/RAR/7-Zip Reader 2.2 preview 2024-09-11 foo_vis_milk2 (2024-09-15 01:09:10 UTC) MilkDrop 2 Visualisation 0.1.0-beta foo_vis_spectrum_analyzer (2024-04-18 01:50:24 UTC) Spectrum Analyzer 0.7.6.2 foo_wave_minibar_mod (2024-01-16 17:24:36 UTC) Waveform Minibar (mod) 1.2.58 foo_whatsnew (2023-05-04 18:05:24 UTC) Feature Watcher 1.1.2 Recent events: [36000ms] INFO-OpenLyrics: Save file name format '[%artist% - ][%title%]' with directory class 'ConfigDirectory' evaluated to 'file://C:\Users\abudd\AppData\Roaming\foobar2000-v2\lyrics\sewerperson - CHAPTER9_HOMINUS NOCTURNA (SKELLINGTON)' [36000ms] INFO-OpenLyrics: Querying for lyrics in file://C:\Users\abudd\AppData\Roaming\foobar2000-v2\lyrics\sewerperson - CHAPTER9_HOMINUS NOCTURNA (SKELLINGTON).lrc... [36000ms] INFO-OpenLyrics: Querying for lyrics in file://C:\Users\abudd\AppData\Roaming\foobar2000-v2\lyrics\sewerperson - CHAPTER9_HOMINUS NOCTURNA (SKELLINGTON).txt... [36000ms] INFO-OpenLyrics: Found 2 lyrics in local files: file://C:\Users\abudd\AppData\Roaming\foobar2000-v2\lyrics\sewerperson - CHAPTER9_HOMINUS NOCTURNA (SKELLINGTON) [36000ms] INFO-OpenLyrics: Lookup local-file file://C:\Users\abudd\AppData\Roaming\foobar2000-v2\lyrics\sewerperson - CHAPTER9_HOMINUS NOCTURNA (SKELLINGTON).lrc for lyrics... 
[36000ms] INFO-OpenLyrics: Successfully retrieved lyrics from file://C:\Users\abudd\AppData\Roaming\foobar2000-v2\lyrics\sewerperson - CHAPTER9_HOMINUS NOCTURNA (SKELLINGTON).lrc [36000ms] INFO-OpenLyrics: Successfully looked-up lyrics from source: Local files [36000ms] INFO-OpenLyrics: Parsing lyrics text... [36000ms] INFO-OpenLyrics: Loaded lyrics already form a valid UTF-8 sequence [36000ms] INFO-OpenLyrics: Parsing LRC lyric text... [36000ms] setConfigFloat(core.totalTimePlayed,530886.1286268) [36000ms] Automatic resampling: using Resampler (dBpoweramp/SSRC): 192000 Hz, Resampler (RetroArch): 192000 Hz [36000ms] INFO-OpenLyrics: Lyric loading complete [36000ms] INFO-OpenLyrics: New album art data retrieved [36016ms] Device: Realtek Digital Output (Realtek(R) Audio) Mix format: 192000 Hz / 32-bit float / 2 channels (0x3) [36063ms] Sending stream: 192000 Hz / 32-bit float / 2 channels (0x3) [36125ms] Audioscrobbler: Handshake successful. [36188ms] INFO-OpenLyrics: LyricPanel::compute_background_image took 101539us [36188ms] INFO-OpenLyrics: LyricPanel::on_album_art_retrieved took 183747us [36297ms] INFO-OpenLyrics: Skipping lyric save. Type: 1, Local: yes, Timestamped: yes, Autosave: 1 [38578ms] INFO-OpenLyrics: Spawning editor window... [38578ms] INFO-OpenLyrics: Expanding lyric text... [38594ms] INFO-OpenLyrics: Initializing editor window... [66156ms] INFO-OpenLyrics: Saving lyrics from editor... [66156ms] INFO-OpenLyrics: Parsing LRC lyric text... [66156ms] INFO-OpenLyrics: Expanding lyric text... [66156ms] INFO-OpenLyrics: Saving lyrics to a local file... [66156ms] INFO-OpenLyrics: Save file name format '[%artist% - ][%title%]' with directory class 'ConfigDirectory' evaluated to 'file://C:\Users\abudd\AppData\Roaming\foobar2000-v2\lyrics\sewerperson - CHAPTER9_HOMINUS NOCTURNA (SKELLINGTON)' [66156ms] INFO-OpenLyrics: Saving lyrics to file://C:\Users\abudd\AppData\Roaming\foobar2000-v2\lyrics\sewerperson - CHAPTER9_HOMINUS NOCTURNA (SKELLINGTON).lrc... 
[66156ms] INFO-OpenLyrics: Successfully saved lyrics to file://C:\Users\abudd\AppData\Roaming\foobar2000-v2\lyrics\sewerperson - CHAPTER9_HOMINUS NOCTURNA (SKELLINGTON).lrc [73469ms] INFO-OpenLyrics: Synchronising editor line... [73469ms] INFO-OpenLyrics: Parsing LRC lyric text... [82016ms] INFO-OpenLyrics: Synchronising editor line... [82016ms] INFO-OpenLyrics: Parsing LRC lyric text... [83688ms] INFO-OpenLyrics: Synchronising editor line... [83688ms] INFO-OpenLyrics: Parsing LRC lyric text... [83922ms] INFO-OpenLyrics: Synchronising editor line... [83922ms] INFO-OpenLyrics: Parsing LRC lyric text... [89625ms] INFO-OpenLyrics: Synchronising editor line... [89625ms] INFO-OpenLyrics: Parsing LRC lyric text... [92063ms] INFO-OpenLyrics: Synchronising editor line... [92063ms] INFO-OpenLyrics: Parsing LRC lyric text... [97250ms] INFO-OpenLyrics: Synchronising editor line... [97250ms] INFO-OpenLyrics: Parsing LRC lyric text... [99531ms] INFO-OpenLyrics: Synchronising editor line... [99531ms] INFO-OpenLyrics: Parsing LRC lyric text... [100438ms] INFO-OpenLyrics: Saving lyrics from editor... [100453ms] INFO-OpenLyrics: Parsing LRC lyric text... [100453ms] INFO-OpenLyrics: Expanding lyric text... [100453ms] INFO-OpenLyrics: Saving lyrics to a local file... [100453ms] INFO-OpenLyrics: Save file name format '[%artist% - ][%title%]' with directory class 'ConfigDirectory' evaluated to 'file://C:\Users\abudd\AppData\Roaming\foobar2000-v2\lyrics\sewerperson - CHAPTER9_HOMINUS NOCTURNA (SKELLINGTON)' [100453ms] INFO-OpenLyrics: Saving lyrics to file://C:\Users\abudd\AppData\Roaming\foobar2000-v2\lyrics\sewerperson - CHAPTER9_HOMINUS NOCTURNA (SKELLINGTON).lrc... [100453ms] INFO-OpenLyrics: Successfully saved lyrics to file://C:\Users\abudd\AppData\Roaming\foobar2000-v2\lyrics\sewerperson - CHAPTER9_HOMINUS NOCTURNA (SKELLINGTON).lrc [101938ms] INFO-OpenLyrics: Synchronising editor line... [101938ms] INFO-OpenLyrics: Parsing LRC lyric text... 
[102797ms] INFO-OpenLyrics: Synchronising editor line... [102797ms] INFO-OpenLyrics: Parsing LRC lyric text... [104031ms] INFO-OpenLyrics: Saving lyrics from editor... [104031ms] INFO-OpenLyrics: Parsing LRC lyric text... [104031ms] INFO-OpenLyrics: Expanding lyric text... [104031ms] INFO-OpenLyrics: Saving lyrics to a local file... [104031ms] INFO-OpenLyrics: Save file name format '[%artist% - ][%title%]' with directory class 'ConfigDirectory' evaluated to 'file://C:\Users\abudd\AppData\Roaming\foobar2000-v2\lyrics\sewerperson - CHAPTER9_HOMINUS NOCTURNA (SKELLINGTON)' [104031ms] INFO-OpenLyrics: Saving lyrics to file://C:\Users\abudd\AppData\Roaming\foobar2000-v2\lyrics\sewerperson - CHAPTER9_HOMINUS NOCTURNA (SKELLINGTON).lrc... [104031ms] INFO-OpenLyrics: Successfully saved lyrics to file://C:\Users\abudd\AppData\Roaming\foobar2000-v2\lyrics\sewerperson - CHAPTER9_HOMINUS NOCTURNA (SKELLINGTON).lrc [104906ms] INFO-OpenLyrics: Synchronising editor line... [104906ms] INFO-OpenLyrics: Parsing LRC lyric text... [105750ms] INFO-OpenLyrics: Saving lyrics from editor... [105750ms] INFO-OpenLyrics: Parsing LRC lyric text... [105750ms] INFO-OpenLyrics: Expanding lyric text... [105750ms] INFO-OpenLyrics: Saving lyrics to a local file... [105750ms] INFO-OpenLyrics: Save file name format '[%artist% - ][%title%]' with directory class 'ConfigDirectory' evaluated to 'file://C:\Users\abudd\AppData\Roaming\foobar2000-v2\lyrics\sewerperson - CHAPTER9_HOMINUS NOCTURNA (SKELLINGTON)' [105750ms] INFO-OpenLyrics: Saving lyrics to file://C:\Users\abudd\AppData\Roaming\foobar2000-v2\lyrics\sewerperson - CHAPTER9_HOMINUS NOCTURNA (SKELLINGTON).lrc... [105750ms] INFO-OpenLyrics: Successfully saved lyrics to file://C:\Users\abudd\AppData\Roaming\foobar2000-v2\lyrics\sewerperson - CHAPTER9_HOMINUS NOCTURNA (SKELLINGTON).lrc [106922ms] INFO-OpenLyrics: Synchronising editor line... [106922ms] INFO-OpenLyrics: Parsing LRC lyric text... [107656ms] INFO-OpenLyrics: Saving lyrics from editor... 
[107656ms] INFO-OpenLyrics: Parsing LRC lyric text... [107656ms] INFO-OpenLyrics: Expanding lyric text... [107656ms] INFO-OpenLyrics: Saving lyrics to a local file... [107656ms] INFO-OpenLyrics: Save file name format '[%artist% - ][%title%]' with directory class 'ConfigDirectory' evaluated to 'file://C:\Users\abudd\AppData\Roaming\foobar2000-v2\lyrics\sewerperson - CHAPTER9_HOMINUS NOCTURNA (SKELLINGTON)' [107656ms] INFO-OpenLyrics: Saving lyrics to file://C:\Users\abudd\AppData\Roaming\foobar2000-v2\lyrics\sewerperson - CHAPTER9_HOMINUS NOCTURNA (SKELLINGTON).lrc... [107656ms] INFO-OpenLyrics: Successfully saved lyrics to file://C:\Users\abudd\AppData\Roaming\foobar2000-v2\lyrics\sewerperson - CHAPTER9_HOMINUS NOCTURNA (SKELLINGTON).lrc [108985ms] INFO-OpenLyrics: Synchronising editor line... [108985ms] INFO-OpenLyrics: Parsing LRC lyric text... [109688ms] INFO-OpenLyrics: Saving lyrics from editor... [109688ms] INFO-OpenLyrics: Parsing LRC lyric text... [109688ms] INFO-OpenLyrics: Expanding lyric text... [109688ms] INFO-OpenLyrics: Saving lyrics to a local file... [109688ms] INFO-OpenLyrics: Save file name format '[%artist% - ][%title%]' with directory class 'ConfigDirectory' evaluated to 'file://C:\Users\abudd\AppData\Roaming\foobar2000-v2\lyrics\sewerperson - CHAPTER9_HOMINUS NOCTURNA (SKELLINGTON)' [109688ms] INFO-OpenLyrics: Saving lyrics to file://C:\Users\abudd\AppData\Roaming\foobar2000-v2\lyrics\sewerperson - CHAPTER9_HOMINUS NOCTURNA (SKELLINGTON).lrc... [109688ms] INFO-OpenLyrics: Successfully saved lyrics to file://C:\Users\abudd\AppData\Roaming\foobar2000-v2\lyrics\sewerperson - CHAPTER9_HOMINUS NOCTURNA (SKELLINGTON).lrc [111594ms] INFO-OpenLyrics: Synchronising editor line... [111594ms] INFO-OpenLyrics: Parsing LRC lyric text... [112406ms] INFO-OpenLyrics: Saving lyrics from editor... [112406ms] INFO-OpenLyrics: Parsing LRC lyric text... [112406ms] INFO-OpenLyrics: Expanding lyric text... [112406ms] INFO-OpenLyrics: Saving lyrics to a local file... 
[112406ms] INFO-OpenLyrics: Save file name format '[%artist% - ][%title%]' with directory class 'ConfigDirectory' evaluated to 'file://C:\Users\abudd\AppData\Roaming\foobar2000-v2\lyrics\sewerperson - CHAPTER9_HOMINUS NOCTURNA (SKELLINGTON)' [112406ms] INFO-OpenLyrics: Saving lyrics to file://C:\Users\abudd\AppData\Roaming\foobar2000-v2\lyrics\sewerperson - CHAPTER9_HOMINUS NOCTURNA (SKELLINGTON).lrc... [112406ms] INFO-OpenLyrics: Successfully saved lyrics to file://C:\Users\abudd\AppData\Roaming\foobar2000-v2\lyrics\sewerperson - CHAPTER9_HOMINUS NOCTURNA (SKELLINGTON).lrc [114422ms] INFO-OpenLyrics: Synchronising editor line... [114422ms] INFO-OpenLyrics: Parsing LRC lyric text... [115078ms] INFO-OpenLyrics: Saving lyrics from editor... [115078ms] INFO-OpenLyrics: Parsing LRC lyric text... [115078ms] INFO-OpenLyrics: Expanding lyric text... [115078ms] INFO-OpenLyrics: Saving lyrics to a local file... [115078ms] INFO-OpenLyrics: Save file name format '[%artist% - ][%title%]' with directory class 'ConfigDirectory' evaluated to 'file://C:\Users\abudd\AppData\Roaming\foobar2000-v2\lyrics\sewerperson - CHAPTER9_HOMINUS NOCTURNA (SKELLINGTON)' [115078ms] INFO-OpenLyrics: Saving lyrics to file://C:\Users\abudd\AppData\Roaming\foobar2000-v2\lyrics\sewerperson - CHAPTER9_HOMINUS NOCTURNA (SKELLINGTON).lrc... [115094ms] INFO-OpenLyrics: Successfully saved lyrics to file://C:\Users\abudd\AppData\Roaming\foobar2000-v2\lyrics\sewerperson - CHAPTER9_HOMINUS NOCTURNA (SKELLINGTON).lrc [116563ms] INFO-OpenLyrics: Synchronising editor line... [116563ms] INFO-OpenLyrics: Parsing LRC lyric text... [118281ms] INFO-OpenLyrics: Saving lyrics from editor... [118281ms] INFO-OpenLyrics: Parsing LRC lyric text... [118281ms] INFO-OpenLyrics: Expanding lyric text... [118281ms] INFO-OpenLyrics: Saving lyrics to a local file... 
[118281ms] INFO-OpenLyrics: Save file name format '[%artist% - ][%title%]' with directory class 'ConfigDirectory' evaluated to 'file://C:\Users\abudd\AppData\Roaming\foobar2000-v2\lyrics\sewerperson - CHAPTER9_HOMINUS NOCTURNA (SKELLINGTON)' [118281ms] INFO-OpenLyrics: Saving lyrics to file://C:\Users\abudd\AppData\Roaming\foobar2000-v2\lyrics\sewerperson - CHAPTER9_HOMINUS NOCTURNA (SKELLINGTON).lrc... [118281ms] INFO-OpenLyrics: Successfully saved lyrics to file://C:\Users\abudd\AppData\Roaming\foobar2000-v2\lyrics\sewerperson - CHAPTER9_HOMINUS NOCTURNA (SKELLINGTON).lrc [118938ms] INFO-OpenLyrics: Synchronising editor line... [118938ms] INFO-OpenLyrics: Parsing LRC lyric text... [121313ms] INFO-OpenLyrics: Saving lyrics from editor... [121313ms] INFO-OpenLyrics: Parsing LRC lyric text... [121313ms] INFO-OpenLyrics: Expanding lyric text... [121313ms] INFO-OpenLyrics: Saving lyrics to a local file... [121313ms] INFO-OpenLyrics: Save file name format '[%artist% - ][%title%]' with directory class 'ConfigDirectory' evaluated to 'file://C:\Users\abudd\AppData\Roaming\foobar2000-v2\lyrics\sewerperson - CHAPTER9_HOMINUS NOCTURNA (SKELLINGTON)' [121313ms] INFO-OpenLyrics: Saving lyrics to file://C:\Users\abudd\AppData\Roaming\foobar2000-v2\lyrics\sewerperson - CHAPTER9_HOMINUS NOCTURNA (SKELLINGTON).lrc... [121313ms] INFO-OpenLyrics: Successfully saved lyrics to file://C:\Users\abudd\AppData\Roaming\foobar2000-v2\lyrics\sewerperson - CHAPTER9_HOMINUS NOCTURNA (SKELLINGTON).lrc [121953ms] INFO-OpenLyrics: Synchronising editor line... [121953ms] INFO-OpenLyrics: Parsing LRC lyric text... [124016ms] INFO-OpenLyrics: Synchronising editor line... [124016ms] INFO-OpenLyrics: Parsing LRC lyric text... [126172ms] INFO-OpenLyrics: Skipping lyric upload for sewerperson//sewerperson because there is a more recent upload pending for that track [126688ms] INFO-OpenLyrics: Synchronising editor line... [126688ms] INFO-OpenLyrics: Parsing LRC lyric text... 
[128985ms] INFO-OpenLyrics: Synchronising editor line... [128985ms] INFO-OpenLyrics: Parsing LRC lyric text... [131313ms] INFO-OpenLyrics: Synchronising editor line... [131313ms] INFO-OpenLyrics: Parsing LRC lyric text... [133625ms] INFO-OpenLyrics: Synchronising editor line... [133625ms] INFO-OpenLyrics: Parsing LRC lyric text... [137141ms] INFO-OpenLyrics: Synchronising editor line... [137141ms] INFO-OpenLyrics: Parsing LRC lyric text... [145422ms] INFO-OpenLyrics: Synchronising editor line... [145422ms] INFO-OpenLyrics: Parsing LRC lyric text... [149906ms] INFO-OpenLyrics: Synchronising editor line... [149906ms] INFO-OpenLyrics: Parsing LRC lyric text... [154813ms] INFO-OpenLyrics: Synchronising editor line... [154813ms] INFO-OpenLyrics: Parsing LRC lyric text... [159360ms] INFO-OpenLyrics: Synchronising editor line... [159360ms] INFO-OpenLyrics: Parsing LRC lyric text... [160453ms] INFO-OpenLyrics: Skipping lyric upload for sewerperson//sewerperson because there is a more recent upload pending for that track [164047ms] INFO-OpenLyrics: Skipping lyric upload for sewerperson//sewerperson because there is a more recent upload pending for that track [165766ms] INFO-OpenLyrics: Skipping lyric upload for sewerperson//sewerperson because there is a more recent upload pending for that track [167672ms] INFO-OpenLyrics: Skipping lyric upload for sewerperson//sewerperson because there is a more recent upload pending for that track [169688ms] INFO-OpenLyrics: Skipping lyric upload for sewerperson//sewerperson because there is a more recent upload pending for that track [170625ms] INFO-OpenLyrics: Saving lyrics from editor... [170625ms] INFO-OpenLyrics: Parsing LRC lyric text... [170625ms] INFO-OpenLyrics: Expanding lyric text... [170625ms] INFO-OpenLyrics: Saving lyrics to a local file... 
[170625ms] INFO-OpenLyrics: Save file name format '[%artist% - ][%title%]' with directory class 'ConfigDirectory' evaluated to 'file://C:\Users\abudd\AppData\Roaming\foobar2000-v2\lyrics\sewerperson - CHAPTER9_HOMINUS NOCTURNA (SKELLINGTON)' [170625ms] INFO-OpenLyrics: Saving lyrics to file://C:\Users\abudd\AppData\Roaming\foobar2000-v2\lyrics\sewerperson - CHAPTER9_HOMINUS NOCTURNA (SKELLINGTON).lrc... [170625ms] INFO-OpenLyrics: Successfully saved lyrics to file://C:\Users\abudd\AppData\Roaming\foobar2000-v2\lyrics\sewerperson - CHAPTER9_HOMINUS NOCTURNA (SKELLINGTON).lrc [172422ms] INFO-OpenLyrics: Skipping lyric upload for sewerperson//sewerperson because there is a more recent upload pending for that track [175094ms] INFO-OpenLyrics: Skipping lyric upload for sewerperson//sewerperson because there is a more recent upload pending for that track [178297ms] INFO-OpenLyrics: Skipping lyric upload for sewerperson//sewerperson because there is a more recent upload pending for that track [181328ms] INFO-OpenLyrics: Skipping lyric upload for sewerperson//sewerperson because there is a more recent upload pending for that track [195735ms] INFO-OpenLyrics: Synchronising editor line... [195735ms] INFO-OpenLyrics: Parsing LRC lyric text... [196953ms] INFO-OpenLyrics: Synchronising editor line... [196953ms] INFO-OpenLyrics: Parsing LRC lyric text... [198110ms] INFO-OpenLyrics: Synchronising editor line... [198110ms] INFO-OpenLyrics: Parsing LRC lyric text... [198953ms] INFO-OpenLyrics: Synchronising editor line... [198953ms] INFO-OpenLyrics: Parsing LRC lyric text... [203547ms] INFO-OpenLyrics: Synchronising editor line... [203547ms] INFO-OpenLyrics: Parsing LRC lyric text... [204078ms] INFO-OpenLyrics: Synchronising editor line... [204078ms] INFO-OpenLyrics: Parsing LRC lyric text... [204797ms] INFO-OpenLyrics: Synchronising editor line... [204797ms] INFO-OpenLyrics: Parsing LRC lyric text... [207031ms] INFO-OpenLyrics: Synchronising editor line... 
[207031ms] INFO-OpenLyrics: Parsing LRC lyric text...
[209469ms] INFO-OpenLyrics: Synchronising editor line...
[209469ms] INFO-OpenLyrics: Parsing LRC lyric text...
[214063ms] INFO-OpenLyrics: Synchronising editor line...
[214063ms] INFO-OpenLyrics: Parsing LRC lyric text...
[216422ms] INFO-OpenLyrics: Synchronising editor line...
[216422ms] INFO-OpenLyrics: Parsing LRC lyric text...
[219078ms] INFO-OpenLyrics: Synchronising editor line...
[219078ms] INFO-OpenLyrics: Parsing LRC lyric text...
[224000ms] INFO-OpenLyrics: Synchronising editor line...
[224000ms] INFO-OpenLyrics: Parsing LRC lyric text...
[228406ms] INFO-OpenLyrics: Synchronising editor line...
[228406ms] INFO-OpenLyrics: Parsing LRC lyric text...
[230641ms] INFO-OpenLyrics: Retrieving lyrics from https://lrclib.net/api/get?artist_name=sewerperson&album_name=&track_name=CHAPTER9_HOMINUS%20NOCTURNA&duration=159
[230969ms] INFO-OpenLyrics: Synchronising editor line...
[230969ms] INFO-OpenLyrics: Parsing LRC lyric text...
[231219ms] WARN-OpenLyrics: Failed to make LRCLIB search request to https://lrclib.net/api/get?artist_name=sewerperson&album_name=&track_name=CHAPTER9_HOMINUS%20NOCTURNA&duration=159: Object not found
[231219ms] INFO-OpenLyrics: Requesting a challenge for LRCLIB upload...
[231969ms] INFO-OpenLyrics: Solved challenge SHA256(i3qIWoCl5p88EjCbpdABlQyYT05sWj8C) < 000000FF00000000000000000000000000000000000000000000000000000000 with nonce 0 in 0.00s

Machine specifications:
OS: Windows 10.0.26120 x64
CPU: 11th Gen Intel(R) Core(TM) i5-11400F @ 2.60GHz, features: MMX SSE SSE2 SSE3 SSE4.1 SSE4.2 AVX LZCNT
CPU threads: 12
Audio: Realtek Digital Output (Realtek(R) Audio)

-- Additional information
To me it appears that the search request to LRCLIB, despite LRCLIB being disabled, is causing a crash. I have messed with loads of settings and still experience this within minutes of using the plugin. Thanks.
Here are the raw crash reports:
failure_00000026.dmp
failure_00000026.txt
failure_00000027.dmp
failure_00000027.txt

Hmmm. The code is fairly simple and just checks that it's not set to "Never" so I'm surprised this is happening even with it disabled. I know you said you'd tried various options but just to check the obvious, can you confirm that this persists even when you have uploads disabled in Preferences -> OpenLyrics -> Uploading (IE "Upload lyrics to LRCLIB" set to "Never")?

Also I don't see any crashes like this at all in the crash tracker. When it crashes and fb2k asks if you want to submit a crash report, have you ever said "yes"?

Hmmm. The code is fairly simple and just checks that it's not set to "Never" so I'm surprised this is happening even with it disabled. I know you said you'd tried various options but just to check the obvious, can you confirm that this persists even when you have uploads disabled in Preferences -> OpenLyrics -> Uploading (IE "Upload lyrics to LRCLIB" set to "Never")?

Also I don't see any crashes like this at all in the crash tracker. When it crashes and fb2k asks if you want to submit a crash report, have you ever said "yes"?

Yes, I can confirm that it is set to Never. I have also hit yes when it crashes. Thanks for your help.
gharchive/issue
2024-10-03T17:56:53
2025-04-01T06:39:09.818597
{ "authors": [ "aidenszolosi", "jacquesh" ], "repo": "jacquesh/foo_openlyrics", "url": "https://github.com/jacquesh/foo_openlyrics/issues/411", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
279417303
codahale/hdrhistogram github repository is archived by the owner I just noticed this while looking at the dependencies added by this lib: We shouldn't have it as a direct dependency This could be labeled as "good first issue". @yurishkuro, @jpkrohling: what solution do you have in mind? Copy just the necessary code from codahale/hdrhistogram into jaeger-lib/metrics (which is where it is used), or copy the entire repo and continue maintaining it - but where and how? Isn't it just a matter of removing the dependency from glide.lock? I think it was removed from the YAML but not from the lock... @pohly, would you be willing to give it a shot? @jpkrohling no, the package still gets used here: https://github.com/jaegertracing/jaeger-lib/blob/master/metrics/local.go#L23 So the solution isn't just a simple change to glide.lock. Sorry, I thought this was the main repo :-) @jpkrohling do you still think that this is a "good first issue"? codahale/hdrhistogram has some unfixed issues open, so whoever does something probably also needs to have a good understanding of whether those bugs are relevant when copying code. Doesn't look trivial to me. I'd also like to add that this is a show-stopper for me for using Jaeger - depending on unmaintained, potentially buggy components just isn't good and won't pass a closer review. do you still think that this is a "good first issue"? For someone who knows Go, I think this might still be a good first issue. codahale/hdrhistogram has some unfixed issues open, so whoever does something probably also needs to have a good understanding of whether those bugs are relevant when copying code. Doesn't look trivial to me. @yurishkuro can provide more details about this, but I think it would be acceptable to switch to a more modern backend, like Prometheus. Pretty much all providers nowadays are able to scrape Prometheus data. 
For the main backend, we are using expvar, Prometheus and a noop implementation: https://github.com/jaegertracing/jaeger/blob/master/pkg/metrics/builder.go#L52 So, the real solution is to not depend on this unmaintained library.

depending on unmaintained, potentially buggy components just isn't good and won't pass a closer review

+1, I don't think anyone would disagree with that, but different issues have different priorities to different people. The local backend is only used in unit tests. We can probably simulate it via the Prometheus client if we don't use the default registrar (which is global and will persist across tests).

If the local backend is only used for testing, why does it get pulled into production clients? That probably should be changed first. Once that's resolved, depending on an unmaintained component becomes less of a problem.

It should be moved to a package

As of v2.3.0 we're not using codahale (#82).
gharchive/issue
2017-12-05T15:29:48
2025-04-01T06:39:09.840374
{ "authors": [ "jpkrohling", "pohly", "tbarbugli", "yurishkuro" ], "repo": "jaegertracing/jaeger-lib", "url": "https://github.com/jaegertracing/jaeger-lib/issues/32", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
420546907
jaeger-query service throws "no Elasticsearch node available" very often when we use ES operator provided ES cluster

Steps to reproduce:

1. Install jaeger services with the following CR:

apiVersion: jaegertracing.io/v1
kind: Jaeger
metadata:
  name: jaegerqe
spec:
  ingress:
    security: none
  strategy: production
  collector:
    replicas: 1
    image: jaegertracing/jaeger-collector:1.11
    options:
      metrics-backend: prometheus
      collector:
        num-workers: "50"
        queue-size: "2000"
      es:
        bulk:
          size: "5000000"
          workers: "1"
          flush-interval: "200ms"
  query:
    replicas: 1
    image: jaegertracing/jaeger-query:1.11
    options:
      metrics-backend: prometheus
      query:
        port: 16686
  agent:
    strategy: sidecar
    image: jaegertracing/jaeger-agent:1.11
    options:
      metrics-backend: prometheus
  storage:
    type: elasticsearch
    esIndexCleaner:
      enabled: false
    sparkDependencies:
      enabled: false
    elasticsearch:
      image: quay.io/openshift/origin-logging-elasticsearch5:latest
      nodeCount: 3
      resources:

2. Launch the Jaeger UI and select a service (example: jaeger-query)
3. Click on Find Traces
4. Often, it throws "HTTP Error: Search service failed: no available connection: no Elasticsearch node available"

Log files:
jaeger-query: jaegerqe-query-84ddcc9654-kph2f_jaeger-query.log
jaeger-agent: jaegerqe-query-84ddcc9654-kph2f_jaeger-agent.log
jaeger-collector: jaegerqe-collector-6976478cc5-54ngv.log
elasticsearch-clientdatamaster-0-1: elasticsearch-clientdatamaster-0-1-9b575665-pfrgb_elasticsearch.log
elasticsearch-clientdatamaster-0-1 proxy: elasticsearch-clientdatamaster-0-1-9b575665-pfrgb_proxy.log

Other files: other files.zip

I have seen this before. I am wondering whether this is related to resource limits. What are the resource limits for ES? Could you please give it more juice and see if the issue still happens?

@pavolloffay ES memory and CPU resource limits? I do not specify anything specifically. Just went with default settings. Looks like ES is using up to 4GiB and up to 4 cores.

First I tried port-forwarding the query port and never got this error.
Increasing es.timeout (default is 0s) to 10s solved the issue. I think we should increase the default timeout to, let's say, 5s?

@jkandasa are you modifying es.timeout when running perf tests on OCP? There is also the route timeout: https://docs.openshift.com/container-platform/3.5/install_config/configuring_routing.html. I didn't get a timeout on the route so far.

@jkandasa if you run into this in tests just increase the timeout high enough.

Summary:
I don't get any timeouts when jaeger is connected to ECL ES via certs (note that we use token auth by default)
I don't get any timeouts when using the ES deployment from tests (make es)

I think the root cause of the problem is that we are using token auth with ES - specifically https://github.com/fabric8io/openshift-elasticsearch-plugin. It might do a request to the k8s API (apis/authorization.k8s.io/v1/selfsubjectaccessreviews) per Jaeger request to Elasticsearch, whereas when using certs all information is already present in the ES container.

cc @jcantrill @ewolinetz

Summary:
I don't get any timeouts when jaeger is connected to ECL ES via certs (note that we use token auth by default)
I don't get any timeouts when using the ES deployment from tests (make es)

I think the root cause of the problem is that we are using token auth with ES - specifically https://github.com/fabric8io/openshift-elasticsearch-plugin. It might do a request to the k8s API (apis/authorization.k8s.io/v1/selfsubjectaccessreviews) per Jaeger request to Elasticsearch

This is correct. There is no caching mechanism.

whereas when using certs all information is already present in ES container.

Certs would be faster, especially if you do not additionally identify a token on the request. It will skip all of the token auth logic when the token is not there.

cc @jcantrill @ewolinetz How do you suggest moving forward? Implement some caching or switch to client certs? Either way we will need a change to the ES image.

@jkandasa if you run into this in tests just increase the timeout high enough.

@pavolloffay Sure, I will modify and do a recheck. When I faced this issue I had only ~10 traces in ES. It was a fresh installation.

@jkandasa the es.timeout is just a workaround. The query is still very slow; we will need a faster authorization mechanism.

Just to confirm: does this affect the whole query API, or just the UI? Bearer tokens come only when the user is authenticated via the UI, right?

@jpkrohling It affects the whole query API. The query-service backend failed to get data from the ES cluster. I have disabled UI authentication:
ingress:
  security: none
gharchive/issue
2019-03-13T14:57:10
2025-04-01T06:39:09.856212
{ "authors": [ "jcantrill", "jkandasa", "jpkrohling", "pavolloffay" ], "repo": "jaegertracing/jaeger-operator", "url": "https://github.com/jaegertracing/jaeger-operator/issues/310", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
412319546
Bump Jaeger to 1.10 Similar to https://github.com/jaegertracing/jaeger-operator/pull/176/files Signed-off-by: Pavol Loffay ploffay@redhat.com This change is 
gharchive/pull-request
2019-02-20T09:13:59
2025-04-01T06:39:09.858600
{ "authors": [ "jpkrohling", "pavolloffay" ], "repo": "jaegertracing/jaeger-operator", "url": "https://github.com/jaegertracing/jaeger-operator/pull/212", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
531506203
initial test, multiprocessor mode not working, single process works

I am getting the following error. This may be due to my setup; the --single-process flag works fine. Without --single-process the following error is emitted.

RuntimeError: cuda runtime error (801) : operation not supported at C:\w\1\s\tmp_conda_3.7_104508\conda\conda-bld\pytorch_1572950778684\work\torch/csrc/generic/StorageSharing.cpp:245
Traceback (most recent call last):
  File "C:\test\lib\multiprocessing\queues.py", line 236, in _feed
    obj = _ForkingPickler.dumps(obj)
  File "C:\test\lib\multiprocessing\reduction.py", line 51, in dumps
    cls(buf, protocol).dump(obj)
  File "C:\test\lib\site-packages\torch\multiprocessing\reductions.py", line 242, in reduce_tensor
    event_sync_required) = storage.share_cuda()
RuntimeError: cuda runtime error (801) : operation not supported at C:\w\1\s\tmp_conda_3.7_104508\conda\conda-bld\pytorch_1572950778684\work\torch/csrc/generic/StorageSharing.cpp:245
Traceback (most recent call last):
  File "C:\test\lib\multiprocessing\queues.py", line 236, in _feed
    obj = _ForkingPickler.dumps(obj)
  File "C:\test\lib\multiprocessing\reduction.py", line 51, in dumps
    cls(buf, protocol).dump(obj)
  File "C:\test\lib\site-packages\torch\multiprocessing\reductions.py", line 242, in reduce_tensor
    event_sync_required) = storage.share_cuda()
RuntimeError: cuda runtime error (801) : operation not supported at C:\w\1\s\tmp_conda_3.7_104508\conda\conda-bld\pytorch_1572950778684\work\torch/csrc/generic/StorageSharing.cpp:245
Process _PredictWorker-1:
Traceback (most recent call last):
  File "C:\test\lib\multiprocessing\process.py", line 297, in _bootstrap
    self.run()
  File "C:\g\vn\lc\detectron2-pipeline\pipeline\libs\async_predictor.py", line 32, in run
    task = self.task_queue.get()
  File "C:\test\lib\multiprocessing\queues.py", line 94, in get
    res = self._recv_bytes()
  File "C:\test\lib\multiprocessing\connection.py", line 216, in recv_bytes
    buf = self._recv_bytes(maxlength)
  File "C:\test\lib\multiprocessing\connection.py", line 306, in _recv_bytes
    [ov.event], False, INFINITE)

It's hard to say what's going on. It looks like there is something wrong with your CUDA setup or Detectron2 setup with GPU. To be sure, you can switch off the GPU by adding the --gpus 0 --cpus 1 options.
gharchive/issue
2019-12-02T20:02:39
2025-04-01T06:39:09.871985
{ "authors": [ "apiszcz", "jagin" ], "repo": "jagin/detectron2-pipeline", "url": "https://github.com/jagin/detectron2-pipeline/issues/1", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1584731207
Can the Patcher patch framework-res and framework-ext-res? Same as above: can I edit framework-res and framework-ext-res? Sorry if this is a bad question. Thanks for your work... Curious about this as well...
gharchive/issue
2023-02-14T19:46:43
2025-04-01T06:39:09.892806
{ "authors": [ "nickelnine", "trinhloivn" ], "repo": "jairaj08/SystemUI-Patcher", "url": "https://github.com/jairaj08/SystemUI-Patcher/issues/1", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
2074569440
Password endpoints fix

email screenshot:

Now the password needs to be entered only once while changing it. Also, only the password/reset and password/reset/confirm endpoints are now used to reset the password as an unauthorised user. Added two functions for encoding and decoding data via base64. Reorganised the validate method for the PasswordForm and PasswordChange serializers.

1. Add email screenshot
2. Update registration username validators (min=2, max=128, not unique)
3. Remove stripping of the base64 string

@sound1ust Let's try to replace the simple text message with an HTML template:

<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>Password Reset - Meetups</title>
    <style>
        body {
            font-family: 'Arial', sans-serif;
            margin: 0;
            padding: 0;
            display: flex;
            align-items: center;
            justify-content: center;
            height: 100vh;
        }
        .email-container {
            background-color: #FFFAFA;
            border-radius: 8px;
            padding: 20px;
            text-align: center;
            color: #312E70;
            width: 80%;
            max-width: 400px;
            box-shadow: 0 4px 8px rgba(0, 0, 0, 0.1);
        }
        p {
            margin-top: 20px;
            margin-bottom: 30px;
        }
        .reset-button {
            display: inline-block;
            background-color: #5754C6;
            color: #fff;
            padding: 10px 20px;
            text-decoration: none;
            border-radius: 4px;
            font-weight: bold;
        }
        .footer {
            margin-top: 20px;
            color: #312E70;
        }
        .text-small {
            font-size: 34px;
            font-weight: 700;
            line-height: 43px;
            letter-spacing: 0em;
            text-align: center;
            color: #312E70;
        }
        .header {
            font-size: 50px;
            font-weight: 700;
            line-height: 63px;
            letter-spacing: 0em;
            text-align: center;
            color: #312E70;
        }
    </style>
</head>
<body>
    <div class="email-container">
        <div class="header">Meetups</div>
        <div class="text-small">Password Reset</div>
        <p>We received a request to reset your password. If you didn't make this request, you can ignore this email.</p>
        <a class="reset-button" href="{% verification_link %}">Reset Password</a>
        <p class="footer">If the button above doesn't work, you can also copy and paste the following link into your browser:</p>
        <p><span style="color: #5754C6;">{% verification_link %}</span></p>
    </div>
</body>
</html>
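The two base64 helpers mentioned in the PR description aren't shown in the diff; a minimal sketch of what they might look like (the function names here are hypothetical, not taken from the PR):

```python
import base64

def encode_data(value: str) -> str:
    # URL-safe base64, suitable for embedding data (e.g. a user id) in a
    # password-reset link without characters that need percent-encoding.
    return base64.urlsafe_b64encode(value.encode("utf-8")).decode("ascii")

def decode_data(encoded: str) -> str:
    # Inverse of encode_data; raises binascii.Error on malformed input.
    return base64.urlsafe_b64decode(encoded.encode("ascii")).decode("utf-8")

print(encode_data("hi"))  # → aGk=
```

In a Django project, django.utils.http.urlsafe_base64_encode/urlsafe_base64_decode do essentially the same thing (with padding stripped) and may be preferable to hand-rolled helpers.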
gharchive/pull-request
2024-01-10T15:04:23
2025-04-01T06:39:10.041527
{ "authors": [ "sakalex", "sound1ust" ], "repo": "jakkemomo/meetups", "url": "https://github.com/jakkemomo/meetups/pull/50", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
119061245
Also participate in IntelliSense Using the completion provider API, this extension could be part of IntelliSense, allowing me to complete commonjs module names or html script tags, etc. I'll try it out. Don't wanna have performance issues while writing code because of a large workspace though. Maybe there's a way around it.
gharchive/issue
2015-11-26T15:08:34
2025-04-01T06:39:10.042799
{ "authors": [ "jakob101", "jrieken" ], "repo": "jakob101/RelativePath", "url": "https://github.com/jakob101/RelativePath/issues/1", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
312727451
Is this project still maintained? Hi @jakubknejzlik, are you still maintaining this project? I don't see any more activity and my PRs are still pending. Hi @expobrain, sorry for the delay. I missed the notifications and didn't notice the PRs. I'll check them ASAP.
gharchive/issue
2018-04-09T23:56:34
2025-04-01T06:39:10.049812
{ "authors": [ "expobrain", "jakubknejzlik" ], "repo": "jakubknejzlik/sql-condition-builder", "url": "https://github.com/jakubknejzlik/sql-condition-builder/issues/7", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
112528759
Updated installation and a new example Update the readme to use the jupyter-kernelspec command instead of the deprecated ipython kernelspec command. Add an example to show off the screenshot capabilities of Sark + IDA IPython. Nice work.
gharchive/pull-request
2015-10-21T07:40:52
2025-04-01T06:39:10.060815
{ "authors": [ "james91b", "tmr232" ], "repo": "james91b/ida_ipython", "url": "https://github.com/james91b/ida_ipython/pull/17", "license": "mit", "license_type": "permissive", "license_source": "bigquery" }
588575206
The Spring version used by HAPI FHIR has a security vulnerability

Describe the bug
The Spring version (5.2.1) used by HAPI FHIR (4.2.0) has a security vulnerability -- it needs to be updated to 5.2.3.

An edited version of our automated report:

Spring Framework Reflected File Download Vulnerability (CVE-2020-5398)
Path: server/target/hapi-fhir-jpaserver/WEB-INF/lib/spring-core-5.2.1.RELEASE.jar
Installed version: 5.2.1.RELEASE
Fixed version: 5.2.3
Feb 26, 2020 05:23:27 EST

The remote Windows host contains a web application framework library that is affected by a reflected file download vulnerability. The remote host contains a Spring Framework library version that is 5.0.x prior to 5.0.16, 5.1.x prior to 5.1.13, or 5.2.x prior to 5.2.3. It is, therefore, affected by a reflected file download vulnerability. An attacker can exploit this by tricking a user into clicking on a URL for a trusted domain. Upon clicking on the malicious link, the victim will be presented with a download which appears to have originated from a trusted domain. Once downloaded, the malicious payload can execute arbitrary code and potentially completely take over a system.

Upgrade to Spring Framework version 5.0.16, 5.1.13, 5.2.3, or later.

CVE-2020-5398
Jan 16, 2020 12:00:00 EST

Environment (please complete the following information):
HAPI FHIR Version: 4.2.0
OS: CentOS (but doesn't matter)

There is no part of HAPI FHIR that uses Spring's ContentDisposition class to generate a filename, so I don't see this CVE as having any direct risk in HAPI FHIR (please do comment if you feel that this assessment is incorrect though). Nonetheless, no reason not to bump to a version without known vulnerabilities. Will fix.

Actually - we've already bumped to this version. See: https://github.com/jamesagnew/hapi-fhir/blob/master/pom.xml#L670

Closing the ticket, please comment if you disagree. Thank you.
gharchive/issue
2020-03-26T16:58:48
2025-04-01T06:39:10.067431
{ "authors": [ "RadixSeven", "jamesagnew" ], "repo": "jamesagnew/hapi-fhir", "url": "https://github.com/jamesagnew/hapi-fhir/issues/1779", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
1718443480
Concatenate error with multimodal input Description Hello, I have a couple of clinical variables (categorical and float) that I want to use as additional input. I list them as: multi_input = ["age", "sex", "height", "weight"]. Anyway, when I try to train a (very basic) model, I receive an error saying that tensor shapes don't match: ConcatOp : Ranks of all input tensors should match: shape[0] = [16,4,1] vs. shape[1] = [16,2048] [[{{node model/input_merge/concat}}]] [Op:__inference_train_function_35947] I believe there's something wrong with the way the clinical variables are introduced into the other layers (slide_feature_input)? How can I change it? Or am I missing something trivial? Would be great if anybody could help! Cheers. To Reproduce Steps to reproduce the behavior: 1. Commands
import slideflow as sf

P = sf.load_project('project')
hp = sf.ModelParams(
    tile_px=299,
    tile_um=100,
)
multi_input = ["age", "sex", "height", "weight"]
P.train(
    'category',
    params=hp,
    val_strategy='none',
    input_header=multi_input,
)
2. Output [11:45:54] INFO Training model category-HP0... 
INFO Hyperparameters: { "augment": "xyrj", "batch_size": 16, "drop_images": false, "dropout": 0, "early_stop": false, "early_stop_method": "loss", "early_stop_patience": 0, "epochs": [ 3 ], "hidden_layer_width": 500, "hidden_layers": 0, "include_top": true, "l1": 0.0, "l1_dense": 0.0, "l2": 0.0, "l2_dense": 0.0, "learning_rate": 0.0001, "learning_rate_decay": 0, "learning_rate_decay_steps": 100000, "loss": "sparse_categorical_crossentropy", "manual_early_stop_batch": null, "manual_early_stop_epoch": null, "model": "xception", "normalizer": null, "normalizer_source": null, "optimizer": "Adam", "pooling": "max", "tile_px": 299, "tile_um": 100, "toplayer_epochs": 0, "trainable_layers": 0, "training_balance": "category", "uq": false, "validation_balance": "none" } INFO Val settings: { "strategy": "none", "k_fold": 3, "k": null, "k_fold_header": null, "fraction": null, "source": null, "annotations": null, "filters": null, "dataset": null } INFO Using 687 training TFRecords, 0 validation INFO Adding input variable age as float INFO Adding input variable sex as float INFO Adding input variable height as float INFO Adding input variable weight as float [11:46:28] INFO Training with both images and 4 categories of slide-level input 2023-05-21 11:46:28.822013: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1613] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 30948 MB memory: -> device: 0, nam e: Tesla V100-SXM2-32GB, pci bus id: 0000:61:00.0, compute capability: 7.0 [11:46:29] INFO Using pretraining from imagenet Model: "model" Layer (type) Output Shape Param # Connected to tile_image (InputLayer) [(None, 299, 299, 3 0 [] )] xception (Functional) (None, 2048) 20861480 ['tile_image[0][0]'] slide_feature_input (InputLaye [(None, 4)] 0 [] r) post_convolution (Activation) (None, 2048) 0 ['xception[0][0]'] input_merge (Concatenate) (None, 2052) 0 ['slide_feature_input[0][0]', 'post_convolution[0][0]'] logits-0 (Dense) (None, 2) 4106 ['input_merge[0][0]'] 
out-0 (Activation) (None, 2) 0 ['logits-0[0][0]'] Total params: 20,865,586 Trainable params: 20,811,058 Non-trainable params: 54,528 [11:46:37] INFO Beginning training Epoch 1/3 2023-05-21 11:46:51.242360: I tensorflow/compiler/xla/stream_executor/cuda/cuda_dnn.cc:428] Loaded cuDNN version 8201 Traceback (most recent call last): File "", line 1, in File "/data/jjung23/miniconda3/envs/sf_2/lib/python3.9/site-packages/slideflow/project.py", line 3378, in train self._train_hp( File "/data/jjung23/miniconda3/envs/sf_2/lib/python3.9/site-packages/slideflow/project.py", line 709, in _train_hp self._train_split(dataset, hp, val_settings, s_args) File "/data/jjung23/miniconda3/envs/sf_2/lib/python3.9/site-packages/slideflow/project.py", line 933, in _train_split project_utils._train_worker( File "/data/jjung23/miniconda3/envs/sf_2/lib/python3.9/site-packages/slideflow/project_utils.py", line 147, in _train_worker results = trainer.train(train_dts, val_dts, **training_kw) File "/data/jjung23/miniconda3/envs/sf_2/lib/python3.9/site-packages/slideflow/model/tensorflow.py", line 1925, in train self.model.fit( File "/data/jjung23/miniconda3/envs/sf_2/lib/python3.9/site-packages/keras/utils/traceback_utils.py", line 70, in error_handler raise e.with_traceback(filtered_tb) from None File "/data/jjung23/miniconda3/envs/sf_2/lib/python3.9/site-packages/tensorflow/python/eager/execute.py", line 52, in quick_execute tensors = pywrap_tfe.TFE_Py_Execute(ctx._handle, device_name, op_name, tensorflow.python.framework.errors_impl.InvalidArgumentError: Graph execution error: Detected at node 'model/input_merge/concat' defined at (most recent call last): File "", line 1, in File "/data/jjung23/miniconda3/envs/sf_2/lib/python3.9/site-packages/slideflow/project.py", line 3378, in train self._train_hp( File "/data/jjung23/miniconda3/envs/sf_2/lib/python3.9/site-packages/slideflow/project.py", line 709, in _train_hp self._train_split(dataset, hp, val_settings, s_args) File 
"/data/jjung23/miniconda3/envs/sf_2/lib/python3.9/site-packages/slideflow/project.py", line 933, in _train_split project_utils._train_worker( File "/data/jjung23/miniconda3/envs/sf_2/lib/python3.9/site-packages/slideflow/project_utils.py", line 147, in _train_worker results = trainer.train(train_dts, val_dts, **training_kw) File "/data/jjung23/miniconda3/envs/sf_2/lib/python3.9/site-packages/slideflow/model/tensorflow.py", line 1925, in train self.model.fit( File "/data/jjung23/miniconda3/envs/sf_2/lib/python3.9/site-packages/keras/utils/traceback_utils.py", line 65, in error_handler return fn(*args, **kwargs) File "/data/jjung23/miniconda3/envs/sf_2/lib/python3.9/site-packages/keras/engine/training.py", line 1650, in fit tmp_logs = self.train_function(iterator) File "/data/jjung23/miniconda3/envs/sf_2/lib/python3.9/site-packages/keras/engine/training.py", line 1249, in train_function return step_function(self, iterator) File "/data/jjung23/miniconda3/envs/sf_2/lib/python3.9/site-packages/keras/engine/training.py", line 1233, in step_function outputs = model.distribute_strategy.run(run_step, args=(data,)) File "/data/jjung23/miniconda3/envs/sf_2/lib/python3.9/site-packages/keras/engine/training.py", line 1222, in run_step outputs = model.train_step(data) File "/data/jjung23/miniconda3/envs/sf_2/lib/python3.9/site-packages/keras/engine/training.py", line 1023, in train_step y_pred = self(x, training=True) File "/data/jjung23/miniconda3/envs/sf_2/lib/python3.9/site-packages/keras/utils/traceback_utils.py", line 65, in error_handler return fn(*args, **kwargs) File "/data/jjung23/miniconda3/envs/sf_2/lib/python3.9/site-packages/keras/engine/training.py", line 561, in call return super().call(*args, **kwargs) File "/data/jjung23/miniconda3/envs/sf_2/lib/python3.9/site-packages/keras/utils/traceback_utils.py", line 65, in error_handler return fn(*args, **kwargs) File "/data/jjung23/miniconda3/envs/sf_2/lib/python3.9/site-packages/keras/engine/base_layer.py", line 1132, 
in call outputs = call_fn(inputs, *args, **kwargs) File "/data/jjung23/miniconda3/envs/sf_2/lib/python3.9/site-packages/keras/utils/traceback_utils.py", line 96, in error_handler return fn(*args, **kwargs) File "/data/jjung23/miniconda3/envs/sf_2/lib/python3.9/site-packages/keras/engine/functional.py", line 511, in call return self._run_internal_graph(inputs, training=training, mask=mask) File "/data/jjung23/miniconda3/envs/sf_2/lib/python3.9/site-packages/keras/engine/functional.py", line 668, in _run_internal_graph outputs = node.layer(*args, **kwargs) File "/data/jjung23/miniconda3/envs/sf_2/lib/python3.9/site-packages/keras/utils/traceback_utils.py", line 65, in error_handler return fn(*args, **kwargs) File "/data/jjung23/miniconda3/envs/sf_2/lib/python3.9/site-packages/keras/engine/base_layer.py", line 1132, in call outputs = call_fn(inputs, *args, **kwargs) File "/data/jjung23/miniconda3/envs/sf_2/lib/python3.9/site-packages/keras/utils/traceback_utils.py", line 96, in error_handler return fn(*args, **kwargs) File "/data/jjung23/miniconda3/envs/sf_2/lib/python3.9/site-packages/keras/layers/merging/base_merge.py", line 196, in call return self._merge_function(inputs) File "/data/jjung23/miniconda3/envs/sf_2/lib/python3.9/site-packages/keras/layers/merging/concatenate.py", line 134, in _merge_function return backend.concatenate(inputs, axis=self.axis) File "/data/jjung23/miniconda3/envs/sf_2/lib/python3.9/site-packages/keras/backend.py", line 3572, in concatenate return tf.concat([to_dense(x) for x in tensors], axis) Node: 'model/input_merge/concat' ConcatOp : Ranks of all input tensors should match: shape[0] = [16,4,1] vs. shape[1] = [16,2048] [[{{node model/input_merge/concat}}]] [Op:__inference_train_function_35947] Expected behavior Successful training of a multimodal model. 
Environment: Slideflow Version (e.g., 1.0): 2.0.3-post1 OS (e.g., Ubuntu): Ubuntu 20.04.5 How you installed Slideflow (pip, source): pip install slideflow[tf] cucim cupy-cuda11x Python version: 3.9 CUDA/cuDNN version: 11.6 GPU models and configuration: Tesla V100-SXM2-32GB Any other relevant information: Additional context Thanks for raising this issue - we'll build a test dataset over the next day or so and work on reproducing the error, so we can find the source of the problem. In the meantime, do you see the same error when you train with only a single additional clinical variable? Try training 4 different models, one with each clinical variable as a single additional input, to see if the problem can be isolated to one of the variables. Thanks for replying! I tried to feed the model only one clinical variable, for example "age". What I get is: /data/jjung23/miniconda3/envs/sf_tensfl/lib/python3.10/site-packages/keras/engine/functional.py:638: UserWarning: Input dict contained keys ['slide_feature_input'] which did not match any model input. They will be ignored by the model. After that it looks like it's training normally based on the slides but without the clinical variable. Quick update - I was able to reproduce the problem when using continuous input variables (like the ones you're using here). Categorical slide inputs (either single or multiple) are working as expected, but there seems to be an issue with continuous inputs. Our automatic testing included testing categorical clinical variables as additional inputs only, which is why this wasn't caught by our testing protocol. I should have a patch out shortly that fixes the problem, and I'll expand our testing to include continuous input variables, as well. Ok - patch has been applied for the Tensorflow backend. If you have the ability to run from source, let me know if it works on your end, as well. Still working on a fix for the PyTorch backend. 
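The ConcatOp failure reported above is a rank mismatch: the slide-level features reach the merge layer as shape [16, 4, 1] while the image features are [16, 2048], and tf.concat requires all inputs to have the same rank. A minimal pure-Python sketch of that rule (no TensorFlow; `concat_last_axis` is only an illustration, not slideflow code):

```python
def concat_last_axis(a_shape, b_shape):
    """Mimic tf.concat's requirement on the last axis: all inputs must
    have the same rank, and all dims except the concat axis must agree."""
    if len(a_shape) != len(b_shape):
        raise ValueError(
            f"Ranks of all input tensors should match: {a_shape} vs. {b_shape}"
        )
    if a_shape[:-1] != b_shape[:-1]:
        raise ValueError(f"Leading dims differ: {a_shape} vs. {b_shape}")
    return a_shape[:-1] + [a_shape[-1] + b_shape[-1]]

# The failing case from the traceback: rank 3 vs rank 2.
try:
    concat_last_axis([16, 4, 1], [16, 2048])
except ValueError as e:
    print(e)

# After flattening the trailing axis of the slide input, the ranks match:
print(concat_last_axis([16, 4], [16, 2048]))  # [16, 2052]
```

This matches the model summary above, where input_merge produces shape (None, 2052) once both inputs are rank 2.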
If this resolves the issue, I'll incorporate it into the next patch release. Patch has been released as version 2.0.5. Hi, sorry for the late replay. Thank you so much for the patch and the messages! However, I receive the following error: [22:07:38] INFO Beginning training [31/1476] Epoch 1/3 Traceback (most recent call last): File "/data/jjung23/23_04_30/3_train_1.py", line 29, in <module> P.train( File "/data/jjung23/miniconda3/envs/sf_tensorflow/lib/python3.9/site-packages/slideflow/project.py", line 3426, in train self._train_hp( File "/data/jjung23/miniconda3/envs/sf_tensorflow/lib/python3.9/site-packages/slideflow/project.py", line 713, in _train_hp self._train_split(dataset, hp, val_settings, s_args) File "/data/jjung23/miniconda3/envs/sf_tensorflow/lib/python3.9/site-packages/slideflow/project.py", line 937, in _train_split project_utils._train_worker( File "/data/jjung23/miniconda3/envs/sf_tensorflow/lib/python3.9/site-packages/slideflow/project_utils.py", line 147, in _train_worker results = trainer.train(train_dts, val_dts, **training_kw) File "/data/jjung23/miniconda3/envs/sf_tensorflow/lib/python3.9/site-packages/slideflow/model/tensorflow.py", line 1924, in train self.model.fit( File "/data/jjung23/miniconda3/envs/sf_tensorflow/lib/python3.9/site-packages/keras/utils/traceback_utils.py", line 70, in error_handler raise e.with_traceback(filtered_tb) from None File "/tmp/__autograph_generated_filevedfejsj.py", line 15, in tf__train_function retval_ = ag__.converted_call(ag__.ld(step_function), (ag__.ld(self), ag__.ld(iterator)), None, fscope) File "/data/jjung23/miniconda3/envs/sf_tensorflow/lib/python3.9/site-packages/tensorflow/core/function/trace_type/trace_type_builder.py", line 129, in from_value return default_types.Tuple(*(from_value(c, context) for c in value)) File "/data/jjung23/miniconda3/envs/sf_tensorflow/lib/python3.9/site-packages/tensorflow/core/function/trace_type/trace_type_builder.py", line 129, in <genexpr> return 
default_types.Tuple(*(from_value(c, context) for c in value)) File "/data/jjung23/miniconda3/envs/sf_tensorflow/lib/python3.9/site-packages/tensorflow/core/function/trace_type/trace_type_builder.py", line 129, in from_value return default_types.Tuple(*(from_value(c, context) for c in value)) File "/data/jjung23/miniconda3/envs/sf_tensorflow/lib/python3.9/site-packages/tensorflow/core/function/trace_type/trace_type_builder.py", line 129, in <genexpr> return default_types.Tuple(*(from_value(c, context) for c in value)) File "/data/jjung23/miniconda3/envs/sf_tensorflow/lib/python3.9/site-packages/tensorflow/core/function/trace_type/trace_type_builder.py", line 152, in from_value raise TypeError( TypeError: in user code: File "/data/jjung23/miniconda3/envs/sf_tensorflow/lib/python3.9/site-packages/keras/engine/training.py", line 1249, in train_function * return step_function(self, iterator) File "/data/jjung23/miniconda3/envs/sf_tensorflow/lib/python3.9/site-packages/keras/engine/training.py", line 1233, in step_function ** outputs = model.distribute_strategy.run(run_step, args=(data,)) File "/data/jjung23/miniconda3/envs/sf_tensorflow/lib/python3.9/site-packages/keras/engine/training.py", line 1222, in run_step ** outputs = model.train_step(data) File "/data/jjung23/miniconda3/envs/sf_tensorflow/lib/python3.9/site-packages/keras/engine/training.py", line 1027, in train_step self.optimizer.minimize(loss, self.trainable_variables, tape=tape) File "/data/jjung23/miniconda3/envs/sf_tensorflow/lib/python3.9/site-packages/keras/optimizers/optimizer_experimental/optimizer.py", line 527, in minimize self.apply_gradients(grads_and_vars) File "/data/jjung23/miniconda3/envs/sf_tensorflow/lib/python3.9/site-packages/keras/mixed_precision/loss_scale_optimizer.py", line 1331, in apply_gradients tf.__internal__.smart_cond.smart_cond( File "/data/jjung23/miniconda3/envs/sf_tensorflow/lib/python3.9/site-packages/keras/mixed_precision/loss_scale_optimizer.py", line 1329, in apply_fn 
return self._apply_gradients(grads, wrapped_vars) File "/data/jjung23/miniconda3/envs/sf_tensorflow/lib/python3.9/site-packages/keras/mixed_precision/loss_scale_optimizer.py", line 1361, in _apply_gradients self._optimizer.apply_gradients( File "/data/jjung23/miniconda3/envs/sf_tensorflow/lib/python3.9/site-packages/keras/optimizers/optimizer_experimental/optimizer.py", line 1140, in apply_gradients return super().apply_gradients(grads_and_vars, name=name) File "/data/jjung23/miniconda3/envs/sf_tensorflow/lib/python3.9/site-packages/keras/optimizers/optimizer_experimental/optimizer.py", line 634, in apply_gradients iteration = self._internal_apply_gradients(grads_and_vars) File "/data/jjung23/miniconda3/envs/sf_tensorflow/lib/python3.9/site-packages/keras/optimizers/optimizer_experimental/optimizer.py", line 1166, in _internal_apply_gradients return tf.__internal__.distribute.interim.maybe_merge_call( File "/data/jjung23/miniconda3/envs/sf_tensorflow/lib/python3.9/site-packages/keras/optimizers/optimizer_experimental/optimizer.py", line 1216, in _distributed_apply_gradients_fn distribution.extended.update( File "/data/jjung23/miniconda3/envs/sf_tensorflow/lib/python3.9/site-packages/keras/optimizers/optimizer_experimental/optimizer.py", line 1211, in apply_grad_to_update_var return self._update_step_xla(grad, var, id(self._var_key(var))) File "/data/jjung23/miniconda3/envs/sf_tensorflow/lib/python3.9/site-packages/tensorflow/core/function/trace_type/trace_type_builder.py", line 129, in from_value return default_types.Tuple(*(from_value(c, context) for c in value)) File "/data/jjung23/miniconda3/envs/sf_tensorflow/lib/python3.9/site-packages/tensorflow/core/function/trace_type/trace_type_builder.py", line 129, in <genexpr> return default_types.Tuple(*(from_value(c, context) for c in value)) File "/data/jjung23/miniconda3/envs/sf_tensorflow/lib/python3.9/site-packages/tensorflow/core/function/trace_type/trace_type_builder.py", line 129, in from_value return 
default_types.Tuple(*(from_value(c, context) for c in value)) File "/data/jjung23/miniconda3/envs/sf_tensorflow/lib/python3.9/site-packages/tensorflow/core/function/trace_type/trace_type_builder.py", line 129, in <genexpr> return default_types.Tuple(*(from_value(c, context) for c in value)) File "/data/jjung23/miniconda3/envs/sf_tensorflow/lib/python3.9/site-packages/tensorflow/core/function/trace_type/trace_type_builder.py", line 152, in from_value raise TypeError( TypeError: Python object could not be represented through the generic tracing type. Consider implementing the Tracing Protocol for it: <AutoCastVariable 'block1_conv1/kernel:0' shape =(3, 3, 3, 32) dtype=float32 dtype_to_cast_to=float32> I couldn't really find anything when I googled the error. Maybe (or hopefully) it's something easy to fix? Hmmm - let me investigate. This looks like a separate issue. Can you paste the contents of the model params.json here? Yes of course: { "slideflow_version": "2.0.5", "project": "MyProject", "backend": "tensorflow", "git_commit": "ae6ad0e8937207efe60d23a400e88bf12f5db719", "model_name": "category-HP0", "full_model_name": "category-HP0", "stage": "training", "img_format": "jpeg", "tile_px": 299, "tile_um": 100, "max_tiles": 0, "min_tiles": 0, "model_type": "categorical", "outcomes": [ "category" ], "input_features": [ "age" ], "input_feature_sizes": [ 1 ], "input_feature_labels": { "age": "float" }, "outcome_labels": { "0": "major", "1": "minor" }, "dataset_config": "project/datasets.json", "sources": [ "MyProject" ], "annotations": "project/annotations.csv", "validation_strategy": "none", "validation_fraction": null, "validation_k_fold": 3, "k_fold_i": null, "filters": null, "hp": { "augment": "xyrj", "batch_size": 16, "drop_images": false, "dropout": 0, "early_stop": false, "early_stop_method": "loss", "early_stop_patience": 0, "epochs": [ 3 ], "hidden_layer_width": 500, "hidden_layers": 0, "include_top": true, "l1": 0.0, "l1_dense": 0.0, "l2": 0.0, "l2_dense": 
0.0, "learning_rate": 0.0001, "learning_rate_decay": 0, "learning_rate_decay_steps": 100000, "loss": "sparse_categorical_crossentropy", "manual_early_stop_batch": null, "manual_early_stop_epoch": null, "model": "xception", "normalizer": null, "normalizer_source": null, "optimizer": "Adam", "pooling": "max", "tile_px": 299, "tile_um": 100, "toplayer_epochs": 0, "trainable_layers": 0, "training_balance": "category", "uq": false, "validation_balance": "none" }, "training_kwargs": { "save_predictions": "csv" } } This is btw only with one clinical variable (age). Thank you so much for taking care! Nevermind, i think i got it working and am currently training a model with multiple clinical variables. Apparently it had nothing to do with the patch but rather with the conda environment that i re-installed (and obviously in some wrong way). Thank you so much for your help! I will let you know how training and testing turns out. BTW: Is it possible to have clinical variables and tfrecords both as input - and train for a linear outcome? I know that the keyword argument "input_header" is available in things like Project.train or Project.evaluate... But is there a way to pass that input_header argument to the sf.model.LinearTrainer? Or do I have to use the keyword argument "slide_input"? Apparently, it's supposed to be a dictionary... can i then just do a list of dictionaries? 
Such as: csv = 'project/annotations.csv' df = pd.read_csv(csv) age_dict = df.set_index('slide').to_dict()['age'] sex_dict = df.set_index('slide').to_dict()['sex'] asa_dict = df.set_index('slide').to_dict()['asa'] height_dict = df.set_index('slide').to_dict()['height'] weight_dict = df.set_index('slide').to_dict()['weight'] multi_input = [age_dict, sex_dict, asa_dict, height_dict, weight_dict] my_trainer = sf.model.LinearTrainer( hp=hp, slide_input=multi_input, outdir='outputs', labels=labels, ) my_trainer.train(dataset1, None) It looks like it's running, but i'm not sure if the clinical variables are really being processed... Do you know what I mean? Glad to hear it! Training to linear outcomes is super easy. All you have to do is choose a linear loss function in the hyperparameters (eg "mean_squared_error"), and an outcome that can be interpreted as a continuous variable, and it should just work. You can still use the same P.train() and P.evaluate() interface, and clinical variable input will still work, as well. Alright, thanks for your help! Yeah, i didn't see that i don't really need the LinearTrainer for linear outcome. Quick question: is it possible to train the MIL and CLAM models for a linear outcome measure? I've seen that there's this keyword argument bag_loss (Primary loss function) which can be either ‘ce’ or ‘svm’... it's not possible to change it to something like rmse / mean_squared_error, is it? Training MIL models with linear outcomes is under development! (see PR https://github.com/jamesdolezal/slideflow/pull/287). The plan is to add this in version 2.1, which is still 1-2 months out. I have another question: Apparently, all clinical variables that are used as input for the neural network are treated as float variables. 
For example, when I look into the params.json it looks like this: "input_features": [ "age", "sex", "asa", "bmi" ], "input_feature_sizes": [ 1, 1, 1, 1 ], "input_feature_labels": { "age": "float", "sex": "float", "asa": "float", "bmi": "float" ], Would it make sense to change the parameters into telling the network that for example things like sex or asa unlike age or bmi are actually categorical (or ordinal) variables and not float? If so, how can I change it? You can definitely mix float and categorical variables. Any variable that can be interpreted as a continuous variable (eg coded with 0 and 1) will be interpreted as float. Is this how "sex" and "ama" are encoded? If so, you can force categorical interpretation by changing "0" and "1" to "M" and "F", for example. Hello James, I have a very quick question: So I trained a model with default 3-fold cross-validation. It automatically created the splits.json with data[0]['strategy'] stating the strategy method, data[0]['patients'] summing up all patients and data[0]['tfrecords']['k-fold-1'], ['k-fold-2'] and ['k-fold-3']. Now this might be a stupid question, but: how do I know which two thirds were used for training and which last third was used for validation? There's no such information in the splits.json stating something like: first run is a+b, val on c. second run is a+c, val on b. third run is b+c, val on a. It could be any order and I couldn't figure from the documentation that you provided. Do you know what I mean? The reason I want to know this is because I eventually want to generate heatmaps only for the validation group. Because I want to understand what parts of the tissue were relevant related to the validation results. For like the first fold, I would have to take the model that was trained on some two thirds - but I would have to exactly locate the last third of patients that was not used for training. Does that make sense? Maybe you could comment on this as well. Would greatly appreciate it! 
Thank you so much so far. Cheers Hi Jinny - thanks for the question, this could be better clarified in the documentation. The best way to determine what data was used for training/validation is to view the slide_manifest.csv file created in the model folder during training. This is a CSV file with three columns - the slide name, the outcome label, and the dataset (training/validation). You can quickly pull a list of slides that were used for model training or validation using sf.util.get_slides_from_model_manifest(), specifying whether you want to retrieve the training or validation slides using the parameter dataset: import slideflow as sf model_path = '/path/to/saved_model' val_slides = sf.util.get_slides_from_model_manifest(model_path, dataset='validation') You can then use create a dataset from only those slides, and use that dataset for generating heatmaps, or other downstream tasks: P = sf.Project(...) val_dataset = P.dataset(..., filters={'slide': val_slides}) To answer your question more directly though, the splits.json has the slides/tfrecords split into the number of groups equal to your cross-fold (in your case, 3). For k-fold 1, the first group (A) is validation, and the remainder is training (B+C). For k-fold 2, the second group (B) is validation, and the remainder (A+C) is training. And so on. I appreciate you asking, I realize now that I failed to include this information in the documentation. I'll add a section in the documentation explaining this more clearly. Let me know if that makes sense or if I can help clarify further! Thank you, James, this was super fast! Yes, it totally makes sense =)
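The fold assignment described above (for k-fold i, group i is validation and the remaining groups are training) can be sketched in plain Python. Reading the groups out of splits.json is omitted, and `kfold_membership` is a hypothetical helper, not part of slideflow:

```python
def kfold_membership(groups, k):
    """Given the k lists of slides stored in splits.json, return
    (validation_slides, training_slides) for 1-indexed fold k:
    fold k validates on group k and trains on all other groups."""
    val = list(groups[k - 1])
    train = [slide for i, grp in enumerate(groups) if i != k - 1 for slide in grp]
    return val, train

# Toy example with three groups (A, B, C) of slide names:
groups = [["a1", "a2"], ["b1", "b2"], ["c1", "c2"]]
val, train = kfold_membership(groups, 1)
print(val)    # ['a1', 'a2']  -> group A is validation for k-fold 1
print(train)  # ['b1', 'b2', 'c1', 'c2']  -> B + C are training
```

In practice, reading slide_manifest.csv or calling sf.util.get_slides_from_model_manifest as shown above is the more robust route, since it reflects exactly what the trained model used.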
gharchive/issue
2023-05-21T10:00:16
2025-04-01T06:39:10.144403
{ "authors": [ "jamesdolezal", "jinnyjuice" ], "repo": "jamesdolezal/slideflow", "url": "https://github.com/jamesdolezal/slideflow/issues/282", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
573154714
chore: refactor This commit mainly does two things: it simplifies components (for example, instead of Source and Target there is now a single CodeEditor), and it uses effector for managing the state. As per our discussion, I removed the use of effector but kept the other changes I made. We have an initial CSS snippet that is intended to show up initially so that users can get an idea of the usage; the same should also apply when the textarea field is cleared (left empty). This should be fixed now. Thanks
gharchive/pull-request
2020-02-29T04:06:41
2025-04-01T06:39:10.149478
{ "authors": [ "jamesgeorge007", "renjithgr" ], "repo": "jamesgeorge007/csstox", "url": "https://github.com/jamesgeorge007/csstox/pull/2", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
245727016
[Android 4.4.4 + NETStandard] TakePhotoAsync throws exception Bug Information Version Number of Plugin: 3.0.1 (latest stable) Device Tested On: Google Nexus 10 Simulator Tested On: (tested it on real device, see above) Version of VS: Visual Studio 2017 Version of Xamarin: Xamarin Android 7.3.1.2 Versions of other things you are using: Steps to reproduce the Behavior call TakePhotoAsync Expected Behavior No app crash Actual Behavior App crashes with following call stack. Before migrating my core project from PCL to .NETStandard 1.6, it worked fine I think. 07-26 16:35:19.644 I/MonoDroid(13349): UNHANDLED EXCEPTION: 07-26 16:35:19.684 I/MonoDroid(13349): Java.Lang.NullPointerException: Exception of type 'Java.Lang.NullPointerException' was thrown. 07-26 16:35:19.684 I/MonoDroid(13349): at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw () [0x0000c] in <3fd174ff54b146228c505f23cf75ce71>:0 07-26 16:35:19.684 I/MonoDroid(13349): at Java.Interop.JniEnvironment+StaticMethods.CallStaticObjectMethod (Java.Interop.JniObjectReference type, Java.Interop.JniMethodInfo method, Java.Interop.JniArgumentValue* args) [0x00069] in <bd30a18775d94dc8b6263aecd1ca9077>:0 07-26 16:35:19.684 I/MonoDroid(13349): at Android.Runtime.JNIEnv.CallStaticObjectMethod (System.IntPtr jclass, System.IntPtr jmethod, Android.Runtime.JValue* parms) [0x0000e] in <d855bac285f44dda8a0d8510b679b1e2>:0 07-26 16:35:19.684 I/MonoDroid(13349): at Android.Support.V4.Content.FileProvider.GetUriForFile (Android.Content.Context context, System.String authority, Java.IO.File file) [0x00078] in <3e239b9681084d42bb949c1e01ef500e>:0 07-26 16:35:19.684 I/MonoDroid(13349): at Plugin.Media.MediaPickerActivity.OnCreate (Android.OS.Bundle savedInstanceState) [0x0023f] in C:\projects\mediaplugin\src\Media.Plugin.Android\MediaPickerActivity.cs:162 07-26 16:35:19.684 I/MonoDroid(13349): --- End of stack trace from previous location where exception was thrown --- 07-26 16:35:19.684 I/MonoDroid(13349): at 
System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw () [0x0000c] in <3fd174ff54b146228c505f23cf75ce71>:0 07-26 16:35:19.684 I/MonoDroid(13349): at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Threading.Tasks.Task task) [0x0003e] in <3fd174ff54b146228c505f23cf75ce71>:0 07-26 16:35:19.684 I/MonoDroid(13349): at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Threading.Tasks.Task task) [0x00028] in <3fd174ff54b146228c505f23cf75ce71>:0 07-26 16:35:19.684 I/MonoDroid(13349): at System.Runtime.CompilerServices.TaskAwaiter.ValidateEnd (System.Threading.Tasks.Task task) [0x00008] in <3fd174ff54b146228c505f23cf75ce71>:0 07-26 16:35:19.684 I/MonoDroid(13349): at System.Runtime.CompilerServices.TaskAwaiter`1[TResult].GetResult () [0x00000] in <3fd174ff54b146228c505f23cf75ce71>:0 07-26 16:35:19.684 I/MonoDroid(13349): at Plugin.Media.MediaImplementation+<TakePhotoAsync>d__16.MoveNext () [0x000c7] in C:\projects\mediaplugin\src\Media.Plugin.Android\MediaImplementation.cs:119 07-26 16:35:19.684 I/MonoDroid(13349): --- End of stack trace from previous location where exception was thrown --- 07-26 16:35:19.684 I/MonoDroid(13349): at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw () [0x0000c] in <3fd174ff54b146228c505f23cf75ce71>:0 07-26 16:35:19.684 I/MonoDroid(13349): at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Threading.Tasks.Task task) [0x0003e] in <3fd174ff54b146228c505f23cf75ce71>:0 07-26 16:35:19.684 I/MonoDroid(13349): at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Threading.Tasks.Task task) [0x00028] in <3fd174ff54b146228c505f23cf75ce71>:0 07-26 16:35:19.684 I/MonoDroid(13349): at System.Runtime.CompilerServices.TaskAwaiter.ValidateEnd (System.Threading.Tasks.Task task) [0x00008] in <3fd174ff54b146228c505f23cf75ce71>:0 07-26 16:35:19.684 I/MonoDroid(13349): at 
System.Runtime.CompilerServices.TaskAwaiter`1[TResult].GetResult () [0x00000] in <3fd174ff54b146228c505f23cf75ce71>:0 07-26 16:35:19.684 I/MonoDroid(13349): at MyApp.Services.Impl.FilePicker+<TakePhotoAsync>d__1.MoveNext () [0x0005b] in XXX Code snippet Screenshots Oh, I think I know what's going on. In my Android project I have "Target Android Version" set to Android 7.1 (Level 25). I need this because I am using another plugin of yours, Permissions, which requires it, as you also noted in the readme: You MUST set your Target version to API 24+ and Compile against API 24+: But why does the Media plugin crash when Target Android Version is set to Android 7.1? It seems like it hits an API which is not available on Android 4.4. Works fine on my sample app that I have included in this repo, on my 4.4 device. Ensure you follow all the setup with xml files on android for file permissions. Thanks. I'm 100% sure it is because of setting "Target Android Version" to Android 7.1 (Level 25). If I leave it at the default "Use Compile using SDK version", it doesn't crash. I don't understand why this would have anything to do with setting Android file permissions, since it works when using "Use Compile using SDK version"? To be clear, right now, I am not even using the Permissions plugin. I was actually preparing to use it. I am only using the Media plugin. To summarize: I am only using the Media plugin. Everything works great on all Android versions. If I set the Android project to have "Target Android Version" set to Android 7.1 (Level 25), it crashes on all Android below 7.0. Did you do all this: https://github.com/jamesmontemagno/mediaplugin#android-n ? 
But I have one question: In step #2, you instruct to have the file_paths.xml with the following content:

<?xml version="1.0" encoding="utf-8"?>
<paths xmlns:android="http://schemas.android.com/apk/res/android">
    <external-files-path name="my_images" path="Pictures" />
    <external-files-path name="my_movies" path="Movies" />
</paths>

and you mention YOUR_APP_PACKAGE_NAME must be set to your app package name! But note there's no YOUR_APP_PACKAGE_NAME in your XML content. However, going to https://developer.android.com/training/camera/photobasics.html they suggest a different name attribute:

<?xml version="1.0" encoding="utf-8"?>
<paths xmlns:android="http://schemas.android.com/apk/res/android">
    <external-path name="my_images" path="Android/data/**com.example.package.name**/files/Pictures" />
</paths>

Which one is correct?
That is their package name... Whatever your package name is in the android manifest is what you should put in there. Usually it is "com.business.company". It creates a private area that can be shared between apps for security reasons.
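For what it's worth, the difference between the two snippets can be summarized in one file. `<external-files-path>` resolves relative to the app's own external files directory, which is already package-specific, so no package name is needed in the path; `<external-path>` from the Android docs is rooted at the top of external storage, which is why the package name appears in its path attribute. A combined sketch for reference — the package name com.mycompany.myapp is a placeholder, not from this thread:

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- file_paths.xml; "com.mycompany.myapp" below is a placeholder for whatever
     package= is declared in your AndroidManifest.xml -->
<paths xmlns:android="http://schemas.android.com/apk/res/android">
    <!-- Relative to the app's external files dir (already package-specific). -->
    <external-files-path name="my_images" path="Pictures" />
    <external-files-path name="my_movies" path="Movies" />
    <!-- The Android-docs form: rooted at external storage, so the package
         name must be spelled out in the path. -->
    <external-path name="my_images_docs_style"
                   path="Android/data/com.mycompany.myapp/files/Pictures" />
</paths>
```

Either form can work; they just anchor the shared path at different roots.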
gharchive/issue
2017-07-26T13:44:51
2025-04-01T06:39:10.160977
{ "authors": [ "jamesmontemagno", "opcodewriter" ], "repo": "jamesmontemagno/MediaPlugin", "url": "https://github.com/jamesmontemagno/MediaPlugin/issues/312", "license": "mit", "license_type": "permissive", "license_source": "bigquery" }
535226483
Just a question regarding focus
When using an iPad or iPhone, users can tap the screen and the camera will attempt to focus. We have some users saying their cameras aren't focusing well in our app, which uses this plugin, but outside the app the camera focuses well. Is there a way to show a focus slider or something that we're not aware of?
This library just pops up the native Camera itself, so it is out of our hands at that point and the OS takes full control. :(
gharchive/issue
2019-12-09T19:52:49
2025-04-01T06:39:10.162495
{ "authors": [ "BlackLine-maker", "jamesmontemagno" ], "repo": "jamesmontemagno/MediaPlugin", "url": "https://github.com/jamesmontemagno/MediaPlugin/issues/782", "license": "mit", "license_type": "permissive", "license_source": "bigquery" }
112145688
Geolocator: Crash on iOS 8.4 on simulator when Location is set to None
If I make the following call

await CrossGeolocator.Current.GetPositionAsync(10000).ConfigureAwait(false);

while running it on the simulator, setting the Location to None on iOS 8.4 makes it crash with a -[CLLocationManager allowsBackgroundLocationUpdates]: unrecognized selector sent to instance <some hex number>
I haven't been able to create a minimal test case, but it does happen consistently on my app. However, the error seems to be gone if I modify the Failed method of GeolocationSingleUpdateDelegate:

public override void Failed(CLLocationManager manager, NSError error)
{
    switch ((CLError)(int)error.Code)
    {
        case CLError.Network:
            StopListening();
            this.tcs.TrySetException(new GeolocationException(GeolocationError.PositionUnavailable));
            break;
        case CLError.LocationUnknown:
            StopListening();
            this.tcs.TrySetException(new GeolocationException(GeolocationError.PositionUnavailable));
            break;
    }
}

I can add that case in there, but there is no possible way that allowsBackgroundLocationUpdates could be called on iOS 9, I am doing a system number check when setting it and it would only be enabled if you set it to true too. Committing and pushing today
Awesome, thanks James!
gharchive/issue
2015-10-19T13:36:05
2025-04-01T06:39:10.165692
{ "authors": [ "gnola14", "jamesmontemagno" ], "repo": "jamesmontemagno/Xamarin.Plugins", "url": "https://github.com/jamesmontemagno/Xamarin.Plugins/issues/119", "license": "mit", "license_type": "permissive", "license_source": "bigquery" }
2374152025
Set AIProxy DeviceCheck bypass token as env variable
The DeviceCheck bypass token is used for AIProxy customers to make requests to AIProxy from the iOS simulator, where DeviceCheck is not available. If the token leaks into a production build of the app, then attackers can use the bypass token themselves to skip one layer of security that AIProxy provides.
This patch adjusts the way that developers set the bypass token, removing it from the source code and adding it instead as an env variable. Env variables are not packaged up in distribution app bundles, and are therefore harder to leak into a production release of your app.
Updated the README with new instructions for adding the AIPROXY_DEVICE_CHECK_BYPASS env variable
@lzell oops, this LGTM. Let's resolve the conflict and I can merge :)
gharchive/pull-request
2024-06-26T04:04:43
2025-04-01T06:39:10.167745
{ "authors": [ "jamesrochabrun", "lzell" ], "repo": "jamesrochabrun/SwiftOpenAI", "url": "https://github.com/jamesrochabrun/SwiftOpenAI/pull/55", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1844319395
Deployment does not have minimum availability
Pods "homepage-77cdf87bbc-" is forbidden: error looking up service account default/homepage: serviceaccount "homepage" not found:Deployment does not have minimum availability.

kubectl create serviceaccount homepage
serviceaccount/homepage created

Fix this bug
I also found this bug on a fresh deploy. Simply creating the SA fixed it like @xinmans said.
Since it seems like a misconfiguration, I'm closing this.
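The one-line kubectl fix from this thread can also be kept declaratively alongside the chart. A minimal manifest sketch — the default namespace is an assumption; match whatever namespace the Helm release is deployed into:

```yaml
# ServiceAccount that the homepage Deployment's pod spec references.
# Namespace "default" is an assumption; use your release's namespace.
apiVersion: v1
kind: ServiceAccount
metadata:
  name: homepage
  namespace: default
```

Applying this with `kubectl apply -f` has the same effect as the imperative `kubectl create serviceaccount homepage`, but survives cluster rebuilds as part of your manifests.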
gharchive/issue
2023-08-10T02:46:37
2025-04-01T06:39:10.169523
{ "authors": [ "jameswynn", "theeternalrat", "xinmans" ], "repo": "jameswynn/helm-charts", "url": "https://github.com/jameswynn/helm-charts/issues/14", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
1394884204
The plugin will no longer work after Nov 1, 2022
Describe the bug
Today the plugin stopped working, I'm getting:
Oh no, something went wrong! Error: Internal Server Error
Screenshots
Desktop (please complete the following information):
Plugin version: 1.9.0
Obsidian version: 0.15.9
Related to: #177 #157
@pprazzi You are right. It was related to the maintenance.
Any idea if this maintenance is related to #176 ? As they stated that the v8 will no longer work after November 1
The earlier will no longer be available after November 1
Any idea if this maintenance is related to Todoist api v2 #176 ?
https://groups.google.com/a/doist.com/g/todoist-api/c/33g1sC_ov3Q
This confirms my assumptions that the V1 API will no longer work but it's actually after November 30.
There's an open PR #176 by @gnapse addressing the API migration to V2. @jamiebrynes7 Any plans for migrating to V2 ? Thanks to all the people who made this plugin possible.
Thanks for the heads up, I should probably join that google group! I'll take a look at #176 on the weekend, or earlier if I find some time :)
Hey all, this was released in v1.10.0!
gharchive/issue
2022-10-03T14:58:51
2025-04-01T06:39:10.183906
{ "authors": [ "aziham", "jamiebrynes7" ], "repo": "jamiebrynes7/obsidian-todoist-plugin", "url": "https://github.com/jamiebrynes7/obsidian-todoist-plugin/issues/179", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
354107966
Add TextMappingFormat Extends MappingFormat and allows reading text mapping formats from readers/writers instead of just binary input/output streams. Merged with https://github.com/jamiemansfield/Lorenz/commit/fa120163db7c81d06e1878ccc41644760a63577d 👍
gharchive/pull-request
2018-08-26T16:28:45
2025-04-01T06:39:10.187630
{ "authors": [ "Minecrell", "jamierocks" ], "repo": "jamiemansfield/Lorenz", "url": "https://github.com/jamiemansfield/Lorenz/pull/6", "license": "mit", "license_type": "permissive", "license_source": "bigquery" }
632484977
Add audio loading and audio source playing
Resolves #25, resolves #22
Codecov Report
Merging #123 into master will increase coverage by 0.28%. The diff coverage is 75.86%.

@@            Coverage Diff             @@
##           master     #123      +/-   ##
==========================================
+ Coverage   76.83%   77.11%   +0.28%
==========================================
  Files          78       87       +9
  Lines        2357     2618     +261
  Branches      213      233      +20
==========================================
+ Hits         1811     2019     +208
- Misses        373      410      +37
- Partials     173       189      +16

Flag          Coverage Δ
#unittests    77.11% <75.86%> (+0.28%) :arrow_up:

Impacted Files                                      Coverage Δ
src/fake/audio_context.ts                           53.12% <53.12%> (ø)
src/fake/response.ts                                62.50% <62.50%> (ø)
src/fake/gain_node.ts                               71.42% <71.42%> (ø)
src/fake/audio_buffer_source_node.ts                77.77% <77.77%> (ø)
src/standard/audio_source/audio_source_system.ts    82.08% <82.08%> (ø)
src/standard/http_audio/http_audio_system.ts        89.79% <89.79%> (ø)
src/standard/audio_source/audio_source.ts           90.00% <90.00%> (ø)
src/audio/audio_asset.ts                            100.00% <100.00%> (ø)
src/audio/audio_request.ts                          100.00% <100.00%> (ø)
... and 9 more
gharchive/pull-request
2020-06-06T14:06:31
2025-04-01T06:39:10.202234
{ "authors": [ "codecov-commenter", "jthomperoo" ], "repo": "jamjarlabs/JamJar", "url": "https://github.com/jamjarlabs/JamJar/pull/123", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
370824042
Make other background colors than (0/"black") possible for fonts
When I tested the embedded-graphics library today with my eink I realised that only black background colors for fonts were possible. This PR makes different background colors for fonts possible and adds a default stroke_color (a default for fill_color was already set before, so it's just the opposite) so the panic can be removed. That means the following changes (only shows the differences):
If no stroke and fill color is set: it doesn't panic anymore and uses 1u8 as stroke color
If stroke color is set but fill color is not: no change
If stroke and fill color are set: now fill color is used instead of the default 0u8 ("black")
Can you also add or edit an example in the simulator examples folder to demonstrate this behaviour?
Yes, I am gonna add an example and some tests. In the font_builder we are currently using Style::default() which returns None for both of the colors. Should we change this so it's more visible that we use a default fill and stroke color for None values in the next iterator?
(https://github.com/jamwaffles/embedded-graphics/blob/master/embedded-graphics/src/fonts/font_builder.rs#L61)
New:

fn render_str(text: &'a str) -> Self {
    Self {
        pos: Coord::new(0, 0),
        text,
        style: Style::default()
            .with_fill(0u8.into())
            .with_stroke(1u8.into()),
        _conf: Default::default(),
    }
}

When I was trying to make a testcase for inverted I might have found a bug in the current implementation:

// produced result:
[0, 0, 1, 1, 1, 0, 1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[1, 0, 0, 1, 0, 0, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[1, 0, 1, 0, 1, 0, 1, 0, 0, 1, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[1, 0, 1, 1, 1, 0, 1, 0, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[1, 0, 1, 1, 1, 0, 1, 0, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[1, 0, 1, 1, 1, 0, 1, 0, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]

// what it should look like (at least that's what I thought)
[1, 0, 1, 1, 1, 0, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[1, 0, 0, 1, 0, 0, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[1, 0, 1, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[1, 0, 1, 1, 1, 0, 1, 0, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[1, 0, 1, 1, 1, 0, 1, 0, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[1, 0, 1, 1, 1, 0, 1, 0, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]

That's a bug in master, thanks for surfacing it! Running the simulator with a font where the pixel colour is always set, I see this in master:
The black rectangle to the middle left has its first row shifted by one which obviously shouldn't happen. I'll look at getting a fix into master asap.
where the pixel colour is always set
What do you mean by that? Can you maybe paste your code in here for that example? That looks like the old behaviour before this PR and shouldn't happen.
I've just merged #53. Can you update from master? The off-by-one error should be gone now.
What do you mean by that?
Sorry for the confusion, it was just a quick temporary change to help debugging by always setting the pixel black :slightly_smiling_face:
Travis is also happy now :-)
0.4.2 released with these changes in it!
gharchive/pull-request
2018-10-16T22:23:31
2025-04-01T06:39:10.212753
{ "authors": [ "Caemor", "jamwaffles" ], "repo": "jamwaffles/embedded-graphics", "url": "https://github.com/jamwaffles/embedded-graphics/pull/51", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
789219831
Nonblocking implementation
MCU/other hardware in use: STM32h7
Display resolution and interface: I2C, [128x64]
Nonblocking I2C support
Hello, I'm attempting to build an application that utilizes this library in conjunction with doing real-time DAC/ADC functions. I have noticed that my DAC output, which is executed from a Timer producing a simple sine-wave, is blocked by the I2C communication that this library executes, which distorts the output of the DAC. I know that the embedded-hal doesn't support non-blocking for various reasons. What I would like to do is write my own interface adapter and implement I2C interrupt handling within my app to unblock the communication to the display. Looking at the test_helpers.rs, I see this:

#[allow(dead_code)]
#[derive(Debug, Clone, Copy)]
pub struct StubInterface;

impl WriteOnlyDataCommand for StubInterface {
    fn send_commands(
        &mut self,
        _cmd: display_interface::DataFormat<'_>,
    ) -> Result<(), DisplayError> {
        Ok(())
    }
    fn send_data(&mut self, _buf: display_interface::DataFormat<'_>) -> Result<(), DisplayError> {
        Ok(())
    }
}

I assume that I could use the WriteOnlyDataCommand trait to achieve this?
Kind regards
Oliver
Sorry for the delay! @therealprof maintains the crate that WriteOnlyDataCommand comes from, so might be able to offer more insight, but yes I think it's enough to add a custom impl of WriteOnlyDataCommand and pass that into Builder::new().connect(interface).into() instead of the provided blocking implementations.
There's an embedded-hal-async now that provides async/await-compatible bus interfaces. See https://github.com/jamwaffles/ssd1331/pull/13 for a PR adding support to the ssd1331 crate and follow https://github.com/embedded-graphics/embedded-graphics/issues/622 for an upstream embedded-graphics trait.
cc https://github.com/jamwaffles/ssd1306/pull/178
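To sketch what such a custom adapter could look like: the types below (DataFormat, DisplayError, and the trait itself) are simplified local stand-ins for the real display-interface crate so the example is self-contained, and the queue hand-off to an I2C interrupt handler is a hypothetical design, not something the crate provides:

```rust
use std::collections::VecDeque;

// Simplified stand-ins for the display-interface types quoted above;
// the real crate has more DataFormat variants and targets no_std.
#[derive(Debug)]
pub enum DisplayError {
    BusWriteError,
}

pub enum DataFormat<'a> {
    U8(&'a [u8]),
}

pub trait WriteOnlyDataCommand {
    fn send_commands(&mut self, cmd: DataFormat<'_>) -> Result<(), DisplayError>;
    fn send_data(&mut self, buf: DataFormat<'_>) -> Result<(), DisplayError>;
}

/// Hypothetical non-blocking adapter: instead of writing on the bus and
/// waiting, it copies bytes into a queue and returns immediately. An I2C
/// interrupt handler would drain the queue one byte at a time. On a real
/// MCU target the queue would be a fixed-capacity ring buffer (e.g. from
/// the heapless crate), not a heap-allocated VecDeque.
pub struct NonBlockingI2cInterface {
    pub queue: VecDeque<u8>,
}

impl NonBlockingI2cInterface {
    pub fn new() -> Self {
        Self { queue: VecDeque::new() }
    }

    pub fn queued_len(&self) -> usize {
        self.queue.len()
    }
}

impl WriteOnlyDataCommand for NonBlockingI2cInterface {
    fn send_commands(&mut self, cmd: DataFormat<'_>) -> Result<(), DisplayError> {
        let DataFormat::U8(bytes) = cmd;
        self.queue.extend(bytes.iter().copied()); // enqueue, don't block
        Ok(())
    }

    fn send_data(&mut self, buf: DataFormat<'_>) -> Result<(), DisplayError> {
        let DataFormat::U8(bytes) = buf;
        self.queue.extend(bytes.iter().copied());
        Ok(())
    }
}

fn main() {
    let mut iface = NonBlockingI2cInterface::new();
    iface.send_commands(DataFormat::U8(&[0xAE])).unwrap();
    iface.send_data(DataFormat::U8(&[0x00, 0xFF])).unwrap();
    assert_eq!(iface.queued_len(), 3); // all bytes queued, nothing blocked
    println!("queued {} bytes", iface.queued_len());
}
```

The part the maintainer confirms is only that a custom WriteOnlyDataCommand impl can be passed to Builder::new().connect(interface); whether a deferred write is safe depends on the display's command/data timing, so treat the queueing scheme as a starting point.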
gharchive/issue
2021-01-19T17:35:02
2025-04-01T06:39:10.218181
{ "authors": [ "bugadani", "jamwaffles", "ostenning", "quentinmit" ], "repo": "jamwaffles/ssd1306", "url": "https://github.com/jamwaffles/ssd1306/issues/146", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
1403286534
Latest release notes / Project visibility Hello, I have been asked to evaluate and install this Jenkins plugin, but I cannot find release notes for the latest version 1.13. The Jenkins plugins directory entry does not provide any issue tracker, and I could only find an indirect link to this GH repo in the Documentation text. Please, can you enrich the plugin metadata, and add a changelog for newer versions? IMHO it should require a little effort, but will provide greater visibility to the plugin development health status 😉 I could only find an indirect link to this GH repo in the Documentation text. The problem is that the update-center generation code is expecting the plugin source-code to be hosted in the jenkinsci organization in GitHub. This is described here: https://www.jenkins.io/doc/developer/publishing/requesting-hosting/#open-hosting-request. However, this plugin was not transferred to that organization. In the update-center code (https://github.com/jenkins-infra/update-center2/blob/7d77cd45525fe5f9ddbc9ec11b1968c00952a2fa/src/main/java/io/jenkins/update_center/HPI.java#L461), the repository of the plugin is excluded. The issue tracker problem is because none is documented in https://github.com/jenkins-infra/repository-permissions-updater/blob/master/permissions/plugin-build-monitor-plugin.yml. add a changelog for newer versions? The repository is not using release-drafter nor any manual release note file. @jan-molak would it be ok to transfer the plugin to the jenkinsci organization? The plugin could benefit from dependabot, jep-229, release-drafter, better integration with plugins.jenkins.io.. @alecharp - last time I've spoken with CloudBees regarding transfer there were some challenges that prevented the transfer; happy to get back to this conversation over email though last time I've spoken with CloudBees To be specific, this hosting process has nothing to do with CloudBees but with the Jenkins project. 
I was here speaking as a Jenkins community member and because I'm working on https://github.com/jenkins-infra/plugin-health-scoring/ which shows that the plugins is not hosted correctly. Right, thanks for the context. There are several contributors from CloudBees helping out with Jenkins Build Monitor at the moment, so if there are PRs you'd like to propose to improve integration with Jenkins ecosystem we'll be happy to review them? if there are PRs you'd like to propose to improve integration with Jenkins ecosystem the transfer of the repository cannot be done with a pull request. For the release-drafter, cd etc., there configurations are easier once in the jenkinsci organization as they can simply extend basic configuration from https://github.com/jenkinsci/.github Thanks for the explanation, I'll reach out to CloudBees and the Jenkins project to see what's changed since we last discussed this some (long) time ago #418 I'll reach out to CloudBees I'm not sure why you need to reach out. If so, I can help you, as employee of CloudBees. For the Jenkins Project, I can also help, as a long time contributor and I'm not also part of the hosting process. From https://github.com/jan-molak/jenkins-build-monitor-plugin/issues/418, I don't know what you need for your end-to-end tests. For the release process, it's still up to you to use the decide to use semantic versioning or not, but that was also the case back then. To be clear, this issue would not exist if this repository followed the standard conventions for the Jenkins project: Repository hosted in the jenkinsci GitHub organization CI build done on https://ci.jenkins.io CD done with JEP-229 if there are PRs you'd like to propose to improve integration with Jenkins ecosystem we'll be happy to incorporate them? 
We will not be proposing PRs to improve integration with the Jenkins ecosystem for repositories not hosted in the jenkinsci GitHub organization, using GitHub Actions for CI, and using something other than JEP-229 for CD. If you would like to transfer this repository to the jenkinsci GitHub organization, we will be happy to help move the CI to https://ci.jenkins.io and the CD to JEP-229, which will resolve issues like this. @alecharp @basil - I appreciate your offering to help and support Jenkins Build Monitor. I'll reach out to people I've spoken with originally to discuss the details of any transfers. Thanks @jan-molak. To transfer the repository from the jan-molak GitHub organization to the jenkinsci GitHub organization, you can file a ticket at https://github.com/jenkins-infra/helpdesk. Thanks @basil I'm on holiday at the moment with limited access to the Internet. I'll look into it when I'm back home next week. Thanks for all the details! Any update on this? As @fabricat I was also looking into this plugin to evaluate it and found the info lacking. It makes the process a bit more difficult and the plugin also looks less attractive (it doesn't look maintained). From the Jenkins project's perspective, we are still happy to help normalize this plugin's hosting and release process. The only step that is necessary to begin the process is for Jan Molak to transfer the plugin to the jenkinsci GitHub organization. Build Monitor View is now hosted in the jenkinsci GitHub organization, built on ci.jenkins.io, and deployed with our standard CD process (including changelogs generated with Release Drafter).
gharchive/issue
2022-10-10T14:39:29
2025-04-01T06:39:10.234583
{ "authors": [ "alecharp", "andham", "basil", "fabricat", "jan-molak" ], "repo": "jan-molak/jenkins-build-monitor-plugin", "url": "https://github.com/jan-molak/jenkins-build-monitor-plugin/issues/635", "license": "mit", "license_type": "permissive", "license_source": "bigquery" }
358395510
Running :Test* using dotnettest in Windows adds a backslash \
Hello, first thanks for this vim plugin. It has been very useful. I'm having the following issue in Windows:
OS: Windows 10
_vimrc:
let test#strategy='dispatch'
let g:test#csharp#runner='dotnettest'
Running :Test* using dotnettest in Windows adds a backslash \:
Do not escape ~ on win32
https://github.com/janko-m/vim-test/blob/0941cfc91cdaa896f16f5e32d20940aab902f88c/autoload/test/csharp/dotnettest.vim#L22
https://github.com/janko-m/vim-test/blob/0941cfc91cdaa896f16f5e32d20940aab902f88c/autoload/test/csharp/dotnettest.vim#L24
https://github.com/janko-m/vim-test/blob/0941cfc91cdaa896f16f5e32d20940aab902f88c/autoload/test/csharp/dotnettest.vim#L27
Sorry, but I don't use Windows anymore and I can't help with this one, unfortunately.
IIRC escaping ~ was indeed not working for me but I would not say that it is something vim-test specific (I am using neovim, however, so it could have been a neovim issue). @jhonnyslpz, can you confirm what the following command outputs? echo expand('~')
I wonder if the PR #328 may provide a solution to this
@aignas The command output is: C:\users\jhonnys.lopez my home path.
@codeinabox Do you mean to implement a similar approach? I believe the issue is in vim-test. The filter format is --filter FullyQualifiedName~xyz; in OSX/Linux it is necessary to escape ~ but this is not working in Windows because it just adds a \ which is a wrong filter.
Windows CMD Screenshot:
gharchive/issue
2018-09-09T17:28:39
2025-04-01T06:39:10.275296
{ "authors": [ "aignas", "codeinabox", "jhonnyslpz" ], "repo": "janko-m/vim-test", "url": "https://github.com/janko-m/vim-test/issues/321", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
2179168897
feat(bulk-import): create bulk-import frontend plugin Rebased https://github.com/janus-idp/backstage-plugins/pull/1271 :+1: /lgtm /approve
gharchive/pull-request
2024-03-11T13:42:19
2025-04-01T06:39:10.293008
{ "authors": [ "gashcrumb", "invincibleJai" ], "repo": "janus-idp/backstage-plugins", "url": "https://github.com/janus-idp/backstage-plugins/pull/1327", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
1947306712
CI tab: UXD improvement
What do you want to improve?
I would like to see all CI tools configured for a service as a radio option if there are multiple, instead of showing all in the same screen with scroll.
Screenshots:
I don't think a component will use several CI tools. So if the component is using Jenkins or Tekton (or others), the corresponding plugin will be displayed based on the annotation in the catalog-info yaml file. If we go that route, it will be a huge amount of work to have all the CI plugins having the same UI.
I don't think a component will use several CI tools. So if the component is using Jenkins or Tekton (or others), the corresponding plugin will be displayed based on the annotation in the catalog-info yaml file. If we go that route, it will be a huge amount of work to have all the CI plugins having the same UI.
Hi @christophe-f, ideally there will be one, but if catalog-info.yaml has multiple then we now show one after another as below, and thus UX came up with this suggestion to show a radio, i.e. one at a time if there are multiple. Let me know if this is not a valid use case or not a priority. cc @ShiranHi
Unassigned myself for now as this is still under exploration if it should be implemented or not.
Closing it for now based on the above discussions and we'll revisit if there is more ask for it.
gharchive/issue
2023-10-17T12:31:50
2025-04-01T06:39:10.296781
{ "authors": [ "christophe-f", "divyanshiGupta", "invincibleJai" ], "repo": "janus-idp/backstage-showcase", "url": "https://github.com/janus-idp/backstage-showcase/issues/621", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
2319509811
chore(docker): set the NODE_OPTIONS=--no-node-snapshot env variable
Description
Adds the NODE_OPTIONS=--no-node-snapshot env variable so that the scaffolder is usable in nodejs 20.
Which issue(s) does this PR fix
Fixes RHIDP-2436
PR acceptance criteria
Please make sure that the following steps are complete:
[ ] GitHub Actions are completed and successful
[ ] Unit Tests are updated and passing
[ ] E2E Tests are updated and passing
[ ] Documentation is updated if necessary (requirement for new features)
[ ] Add a screenshot if the change is UX/UI related
How to test changes / Special notes to the reviewer
/retest
/test e2e-tests
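In Dockerfile terms, the change described above presumably boils down to a single instruction like the following — the exact placement within the showcase image is an assumption:

```dockerfile
# Disable the V8 startup snapshot so the Backstage scaffolder works on Node 20.
ENV NODE_OPTIONS="--no-node-snapshot"
```

Setting it via ENV (rather than per-invocation) means every node process in the container inherits the flag.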
gharchive/pull-request
2024-05-27T16:43:41
2025-04-01T06:39:10.300248
{ "authors": [ "Zaperex", "rm3l" ], "repo": "janus-idp/backstage-showcase", "url": "https://github.com/janus-idp/backstage-showcase/pull/1277", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
1581355061
Provide a GPT to create a Go application managed by Argo, using GH Actions
Goal
Provide a GPT to create a Go application managed by Argo, using GH Actions. Platform engineers can either use this as is or use it as a sample to provide a starting point for their own GPT.
What problem does this solve?
As a Developer
I want a guided UI to create a GH repo for a Go application, which will be managed by Argo using GH Actions
So that I can get started quickly with a Go application
Scenario:
Given: A Developer and an existing Backstage application
When: The developer wants to add a new component to the application
Then: A Go GPT is available to select
Given: A Developer and an existing Backstage application
When: The developer adds a new Go component to the application
Then: A starting point application is committed to a source repository
completed by https://github.com/janus-idp/software-templates/pull/65
gharchive/issue
2023-02-12T18:07:50
2025-04-01T06:39:10.304567
{ "authors": [ "schultzp2020", "serenamarie125" ], "repo": "janus-idp/software-templates", "url": "https://github.com/janus-idp/software-templates/issues/42", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
427790801
dmenu bind? Hi! Do you have a way to get the paste in the clipboard like clipd menu? https://mpov.timmorgan.org/clipboard-history-in-sway-window-manager/ Thanks!
gharchive/issue
2019-04-01T16:06:34
2025-04-01T06:39:10.308068
{ "authors": [ "kaihendry" ], "repo": "janza/wl-clipboard-history", "url": "https://github.com/janza/wl-clipboard-history/issues/1", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1984449752
Cannot open the discussion for the new feature Package version 2.0.1 Describe the bug I would like to suggest adding the ability to specify the path to the directory for snapshots in options. I wanted to do this via discussion as the manual says. However, the current page for creating issues does not allow me to do so (image). Because it's a link to a repository (different from this one) that has no discussions. https://github.com/japa/runner/issues/new?title=Discussion%20for%20a%20new%20feature%20-%20%3CYOUR%20FEATURE%20NAME%3E Marked this as a bug, as I don't see any other way to communicate. Reproduction repo No response @thetutlage will be able to tell what we should do here. Probably open a discussion forum like on the Adonis organisation? In the meantime, feel free to open a feature request on this repo if needed. The ability to specify a directory path is something we already have, via resolveSnapshotPath, see: https://japa.dev/docs/plugins/snapshot#configuration-options I think we can use the AdonisJS discussions forum for the same. We even have a Japa category there for the same. https://github.com/adonisjs/core/discussions/categories/japa
gharchive/issue
2023-11-08T21:47:40
2025-04-01T06:39:10.317940
{ "authors": [ "Julien-R44", "mrmlnc", "thetutlage" ], "repo": "japa/snapshot", "url": "https://github.com/japa/snapshot/issues/3", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1083976320
v1.3.3
Fixes
Breaks up field validation on APIServicesDHCPdUpdate.inc to validate each field within its own method.
Addresses issue that prevented DHCP configurations from being updated when the default interface DHCP configuration was not initialized. (#178)
Adds staticarp field to APIServicesDHCPdUpdate.inc and adds notes regarding issues with static ARP via API (#129).
Updates documentation for range_from and range_to fields on /api/v1/services/dhcpd to state their conditional requirement. (briefly mentioned in #178)
Updates copyright for 2022 year
Fixes typo in documentation for /api/v1/services/dhcpd endpoint that stated incorrect required privilege name
Beta builds are available:
pfSense 2.5: pkg add https://github.com/jaredhendrickson13/pfsense-api/files/7747370/pfSense-2.5-pkg-API-1.3_3beta_1.zip && /etc/rc.restart_webgui
pfSense 2.6: pkg add https://github.com/jaredhendrickson13/pfsense-api/files/7747371/pfSense-2.6-pkg-API-1.3_3beta_1.zip && /etc/rc.restart_webgui
pfSense-2.5-pkg-API-1.3_3beta_1.zip
pfSense-2.6-pkg-API-1.3_3beta_1.zip
gharchive/pull-request
2021-12-19T00:43:51
2025-04-01T06:39:10.341759
{ "authors": [ "jaredhendrickson13" ], "repo": "jaredhendrickson13/pfsense-api", "url": "https://github.com/jaredhendrickson13/pfsense-api/pull/184", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
481239454
?actionBarTheme for Toolbar seems not correct before changing the theme
In the demo-main module, we can see that the DrawerActivity includes a Toolbar. But when first running it (without changing the theme), the Toolbar's text color seems incorrect. Here are the screenshots:
It's okay, MainActivity doesn't use a Toolbar. The title text color of the Toolbar and the other menu icons' color is black rather than white.
Devices:
Genymotion Android 8.0
Xiaomi Mix 2s MIUI 10 Android 9.0
Have the same problem. Launch demo-simple-java. Go to settings. Choose primary color - yellow. Go back. And here we go:
How can I fix this??
gharchive/issue
2019-08-15T16:49:06
2025-04-01T06:39:10.347613
{ "authors": [ "ivan200", "tianma8023" ], "repo": "jaredrummler/Cyanea", "url": "https://github.com/jaredrummler/Cyanea/issues/68", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
276743727
Positional parameters cannot follow a no-parameter option
If you have the following scenario:
Option -c that takes no parameters
Positional parameters
Then the following command line works:
./example blah -c
However this command line fails:
./example -c blah
Argument ‘blah’ failed to parse
Since the -c option takes no parameters, it would be nice if it could be used anywhere in the command line and not just at the end.
In case it's relevant, if you have another option that takes a parameter then that does work:
./example -c -t test blah
So if a no-parameter option is followed by another dashed option then it works, but when it's followed by a potential positional parameter then it fails.
Just so you know what's going on: this was caused by 6c9bae4a071d6892069e6bd998fbae47193df0a8, which added parsing for boolean values, so that you can write things like --foo=false to explicitly disable an option. The problem is that it treated booleans as taking a parameter now, so -c blah tries to parse blah as a boolean value. It works in a lot of use cases because you might write something like -c -f file, and the -f looks like an option, so it skips it.
The short story is that the handling of implicit values is a bit broken, and the entire parser loop is a bit of a mess. So I'll have to work out the best way to patch or rewrite this so that it behaves in a sane way.
No worries at all, thanks for the update! Sounds like it could be tricky if you consider -c true as equivalent to -c. I guess if only the long variant is allowed and only with an equals sign (--foo=false) then you could get away with it, but as soon as you allow a space I think you'll end up with ambiguity. Take this for example:
(1) $ ./power-on true true # Turn on devices 1 and 2
(2) $ ./power-on --delay=true true # Turn on only device 1 with a delay
(3) $ ./power-on --delay true true # Turn on devices 1 and 2 with a delay (or maybe only device 1 with a delay?)
I think (3) is always going to be ambiguous unless you require the equals as in (2). Yes I think that's how I'll fix it. It's too ambiguous otherwise, especially when --delay --foo is parsed as two arguments because of the - at the start of --foo.
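The fix settled on above — a bare flag never consumes the following token, and an explicit boolean value must be attached with an equals sign — can be sketched as a toy model. This is an illustrative Python sketch of the disambiguation rule, not cxxopts' actual parser loop:

```python
# Toy model of the disambiguation rule (not the cxxopts implementation):
# a boolean flag never consumes the next token, so an explicit value must
# be attached with '=', as in --delay=false.
def parse(args):
    flags, positionals = {}, []
    for a in args:
        if a.startswith("--"):
            name, _, value = a[2:].partition("=")
            flags[name] = value != "false"   # bare '--flag' means implicit true
        elif a.startswith("-") and len(a) > 1:
            flags[a[1:]] = True              # short flag: implicit true, next token untouched
        else:
            positionals.append(a)            # anything else is positional
    return flags, positionals

print(parse(["-c", "blah"]))          # → ({'c': True}, ['blah'])
print(parse(["--delay=false", "x"]))  # → ({'delay': False}, ['x'])
```

Under this rule, `./example -c blah` parses cleanly as a flag plus a positional, and case (3) never arises, because `--delay true` always means "delay enabled, positional true".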
gharchive/issue
2017-11-25T11:18:30
2025-04-01T06:39:10.365943
{ "authors": [ "Malvineous", "jarro2783" ], "repo": "jarro2783/cxxopts", "url": "https://github.com/jarro2783/cxxopts/issues/84", "license": "mit", "license_type": "permissive", "license_source": "bigquery" }
519674926
reduce test time, fix spelling, add Week() Reduced the testing time by lowering some of the iterations and sleeps Spelling fixes Added Week(), which was missing Use the interval constants' named functions @Streppel Cool @JohnRoesler! Ideally, working on #88 would fix the long testing time, as we'd have to wrap the clock and use a stub instead. I'm having some thoughts on the best way to implement it.
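The clock-stubbing idea referenced from #88 can be illustrated with a minimal, language-neutral sketch. Everything here (`FakeClock`, `Scheduler`) is a hypothetical stand-in, not gocron's API — the point is that injecting a clock lets tests advance time instantly instead of sleeping:

```python
import time

class RealClock:                       # production: real wall-clock time
    def now(self):
        return time.time()

class FakeClock:                       # tests: time only moves when told to
    def __init__(self, start=0.0):
        self.t = start
    def now(self):
        return self.t
    def advance(self, seconds):
        self.t += seconds

class Scheduler:
    def __init__(self, clock):
        self.clock = clock
        self.jobs = []                 # each job: [next_due, interval, fn]
    def every(self, seconds, fn):
        self.jobs.append([self.clock.now() + seconds, seconds, fn])
    def run_pending(self):
        for job in self.jobs:
            if self.clock.now() >= job[0]:
                job[2]()
                job[0] = self.clock.now() + job[1]

clock = FakeClock()
sched = Scheduler(clock)
runs = []
sched.every(60, lambda: runs.append(1))  # hypothetical one-minute job
sched.run_pending()                      # nothing due yet
clock.advance(61)                        # "a minute passes" with no real sleep
sched.run_pending()
print(len(runs))                         # → 1
```

A test built this way runs in microseconds regardless of how long the scheduled intervals are, which is exactly what lowering iterations and sleeps only approximates.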
gharchive/pull-request
2019-11-08T03:56:43
2025-04-01T06:39:10.454937
{ "authors": [ "JohnRoesler", "Streppel" ], "repo": "jasonlvhit/gocron", "url": "https://github.com/jasonlvhit/gocron/pull/125", "license": "BSD-2-Clause", "license_type": "permissive", "license_source": "github-api" }
54876135
@import '_file.scss'; not working Might be a noobish misunderstanding, but importing a dependency file doesn't work for me. When trying to import a partial scss file I'm getting the 'file to import not found or unreadable' error. The main scss file and the partial file are in the same directory. No need to add the underscore or the file extension for partials. Try @import "file". Thanks Jason! It still gives me the error but at least it generates the *.css file correctly. Also in live preview the changes in the imported *.scss file don't apply until the importing file is changed (re-saved). Also in live preview the changes in the imported *.scss file don't apply until the importing file is changed (re-saved). Same problem here. This makes developing and styling tedious Could you try syncing to the latest https://github.com/jasonsanjose/bourbon-example. I updated it to include a local partial file for importing. @Belkar also please try the latest 1.0.4-83 update. Are you on mac or win? Updated the extension, still getting the same behavior. Bourbon has the same behavior too. I am running Windows 8.1. Hmm. I'm on windows and I'm still not seeing an issue. Have you tried quitting and restarting Brackets? @Belkar also please try the latest 1.0.4-83 update. Are you on mac or win? Thanks it works now :smile: @jasonsanjose yes I restarted Brackets several times. Haven't tried a reinstall though. That usually cuts it with programs on Windows. Here is a demo of my problem. @Belkar what OS are you running? I'm using Windows 8.1. So I tried several things. I uninstalled Brackets; removed the installation folder from Program Files; cleaned the registries, the temp files and the Roaming folder; Installed Brackets; Installed brackets-sass. Not sure which one of these things helped but it works for me too. I'm still getting the false error but at least it updates the partial styles in live preview. @Belkar it worked for me with the default config. Thanks for the help guys. 
Glad it's working for you. Sorry for the extra trouble. Closing.
gharchive/issue
2015-01-20T12:14:40
2025-04-01T06:39:10.463186
{ "authors": [ "Belkar", "aivascu", "jasonsanjose" ], "repo": "jasonsanjose/brackets-sass", "url": "https://github.com/jasonsanjose/brackets-sass/issues/79", "license": "mit", "license_type": "permissive", "license_source": "bigquery" }
485933148
unable to configure Phonetisaurus hello, I'm trying to install Phonetisaurus on Ubuntu 19.04 amd64 to run the Jasper voice assistant http://jasperproject.github.io/documentation/installation/ Phonetisaurus is a Jasper project dependency, but OpenFst isn't found. Here is my configuration: uname -r 5.0.0-25-generic ubuntu 19.04 Here is Phonetisaurus's configure log: ./configure checking for a BSD-compatible install... /usr/bin/install -c checking whether build environment is sane... yes checking for a thread-safe mkdir -p... /usr/bin/mkdir -p checking for gawk... gawk checking whether make sets $(MAKE)... yes checking whether make supports nested variables... yes checking whether to enable maintainer-specific portions of Makefiles... no checking for style of include used by make... GNU checking for gcc... gcc checking whether the C compiler works... yes checking for C compiler default output file name... a.out checking for suffix of executables... checking whether we are cross compiling... no checking for suffix of object files... o checking whether we are using the GNU C compiler... yes checking whether gcc accepts -g... yes checking for gcc option to accept ISO C89... none needed checking whether gcc understands -c and -o together... yes checking dependency style of gcc... gcc3 checking for ar... ar checking the archiver (ar) interface... ar checking build system type... x86_64-pc-linux-gnu checking host system type... x86_64-pc-linux-gnu checking how to print strings... printf checking for a sed that does not truncate output... /usr/bin/sed checking for grep that handles long lines and -e... /usr/bin/grep checking for egrep... /usr/bin/grep -E checking for fgrep... /usr/bin/grep -F checking for ld used by gcc... /usr/bin/ld checking if the linker (/usr/bin/ld) is GNU ld... yes checking for BSD- or MS-compatible name lister (nm)... /usr/bin/nm -B checking the name lister (/usr/bin/nm -B) interface... BSD nm checking whether ln -s works... 
yes checking the maximum length of command line arguments... 1572864 checking how to convert x86_64-pc-linux-gnu file names to x86_64-pc-linux-gnu format... func_convert_file_noop checking how to convert x86_64-pc-linux-gnu file names to toolchain format... func_convert_file_noop checking for /usr/bin/ld option to reload object files... -r checking for objdump... objdump checking how to recognize dependent libraries... pass_all checking for dlltool... no checking how to associate runtime and link libraries... printf %s\n checking for archiver @file support... @ checking for strip... strip checking for ranlib... ranlib checking command to parse /usr/bin/nm -B output from gcc object... ok checking for sysroot... no checking for a working dd... /usr/bin/dd checking how to truncate binary pipes... /usr/bin/dd bs=4096 count=1 checking for mt... mt checking if mt is a manifest tool... no checking how to run the C preprocessor... gcc -E checking for ANSI C header files... yes checking for sys/types.h... yes checking for sys/stat.h... yes checking for stdlib.h... yes checking for string.h... yes checking for memory.h... yes checking for strings.h... yes checking for inttypes.h... yes checking for stdint.h... yes checking for unistd.h... yes checking for dlfcn.h... yes checking for objdir... .libs checking if gcc supports -fno-rtti -fno-exceptions... no checking for gcc option to produce PIC... -fPIC -DPIC checking if gcc PIC flag -fPIC -DPIC works... yes checking if gcc static flag -static works... yes checking if gcc supports -c -o file.o... yes checking if gcc supports -c -o file.o... (cached) yes checking whether the gcc linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes checking whether -lc should be explicitly linked in... no checking dynamic linker characteristics... GNU/Linux ld.so checking how to hardcode library paths into programs... immediate checking whether stripping libraries is possible... yes checking if libtool supports shared libraries... 
yes checking whether to build shared libraries... yes checking whether to build static libraries... yes checking for g++... g++ checking whether we are using the GNU C++ compiler... yes checking whether g++ accepts -g... yes checking dependency style of g++... gcc3 checking how to run the C++ preprocessor... g++ -E checking for ld used by g++... /usr/bin/ld -m elf_x86_64 checking if the linker (/usr/bin/ld -m elf_x86_64) is GNU ld... yes checking whether the g++ linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes checking for g++ option to produce PIC... -fPIC -DPIC checking if g++ PIC flag -fPIC -DPIC works... yes checking if g++ static flag -static works... yes checking if g++ supports -c -o file.o... yes checking if g++ supports -c -o file.o... (cached) yes checking whether the g++ linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes checking dynamic linker characteristics... (cached) GNU/Linux ld.so checking how to hardcode library paths into programs... immediate checking whether g++ supports C++11 features by default... yes checking for gcc... (cached) gcc checking whether we are using the GNU C compiler... (cached) yes checking whether gcc accepts -g... (cached) yes checking for gcc option to accept ISO C89... (cached) none needed checking whether gcc understands -c and -o together... (cached) yes checking dependency style of gcc... (cached) gcc3 checking how to run the C preprocessor... gcc -E checking whether ln -s works... yes checking for getgid in -lc... yes checking for dlopen in -ldl... yes checking for cos in -lm... yes checking for pthread_mutex_init in -lpthread... yes checking stddef.h usability... yes checking stddef.h presence... yes checking for stddef.h... yes checking for stdlib.h... (cached) yes checking for string.h... (cached) yes checking for stdbool.h that conforms to C99... no checking for _Bool... no checking for inline... inline checking for size_t... yes checking for ssize_t... 
yes checking for ptrdiff_t... yes checking for working strtod... yes checking for memmove... yes checking for strchr... yes checking for strrchr... yes checking for strspn... yes checking fst/fst.h usability... yes checking fst/fst.h presence... yes checking for fst/fst.h... yes checking for openfst libraries... no configure: error: Can't find OpenFST or one or more of its extensions. Use --with-openfst-includes and --with-openfst-libs to specify where you have installed OpenFst. OpenFst should have been configured with the following flags: --enable-static --enable-shared --enable-far --enable-ngram-fsts note: I have installed the OpenFst library in /usr/local/lib. Could you help me please? Thanks for your support Best regards Battant The work on Jasper - specifically making it work as-is and refactoring to Python 3 is being conducted at https://github.com/aplawson/jasper-client -- including a tutorial on how to build it and/or deploy it with a custom Raspbian ISO image. //adam
gharchive/issue
2019-08-27T17:17:11
2025-04-01T06:39:10.553674
{ "authors": [ "Battant", "aplawson" ], "repo": "jasperproject/jasper-client", "url": "https://github.com/jasperproject/jasper-client/issues/716", "license": "mit", "license_type": "permissive", "license_source": "bigquery" }
1467629638
errors:trying to send message larger than max(2179626 vs. 2097152) I have an error message: i.joperator.processing.eventDispatcher: Kubernetes exception: 500 failure executing PUT at http://xxx. Message: rpc error: code = trying to send message larger than max (2179622 vs. 2097152) I did not find a configuration item to solve the problem. Please help me. Hi @wangchenglonggithub , this issue seems to me to be a generic one with the Kubernetes API server. Probably the best place to ask is the sig-api-machinery channel on Kubernetes Slack. see: https://kubernetes.slack.com/archives/C0EG7JC6T/p1669669822416379 will close this for now. @wangchenglonggithub pls let us know if you found out the problem.
gharchive/issue
2022-11-29T08:09:38
2025-04-01T06:39:10.586339
{ "authors": [ "csviri", "wangchenglonggithub" ], "repo": "java-operator-sdk/java-operator-sdk", "url": "https://github.com/java-operator-sdk/java-operator-sdk/issues/1634", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
72020097
Little code clean up Reviewed code style. Sorted methods. Android methods go first, then user actions and finally bus events. Added static ChatService accessor to Application. Started ChatService in background to pass StrictMode. Ensured UI events are handled in the main thread. Refactored common stuff into a BaseActivity. Looks fantastic! +1 Good work +1 Perhaps we should create some style rules at some point to get everyone on the same page Maybe we can use Checkstyle
gharchive/pull-request
2015-04-29T23:53:59
2025-04-01T06:39:10.589363
{ "authors": [ "arreche", "jayseejc", "jephillips" ], "repo": "javabrewery/brew-chat", "url": "https://github.com/javabrewery/brew-chat/pull/27", "license": "mit", "license_type": "permissive", "license_source": "bigquery" }
564573734
when i use obfuscator@0.24.5 will cause very strange error. I have 4000 lines of Node code; version 0.18.7 works fine, but 0.24.5 causes a very strange error. Like: const { title, permission, group} = req.body; sometimes group, or some other property from req.body, becomes undefined, even though the body I post is fine. I have hundreds of const {arg1, arg2, arg3...} = req.body; statements, but when I submit, some args become undefined. my config: compact code: true identifier Names Generator: hexadecimal self defending: true control flow flattening: true control flow flattening threshold: 0.8 dead code injection: true dead code injection threshold: 0.4 string array: true rotate string array: true string array encoding: rc4 string array threshold: 0.8 unicode escape sequence: true disable console output: true debug protection: true debug protection interval: true seed:0 target: node const vipbuys = await Vipbuy.find().populate('group'); causes TypeError: Cannot read property 'refPath' of undefined. It's as if 'group' were undefined. Hmm. Transform object keys is disabled? Try to remove all options — does the error still happen? I need an example of the code, like: exports.postadddownload = async (req, res) => { const { name, url, type } = req.body; await Download.create({ name, url, path: './download/' + name + '.mp4', status: '等待下载', type, }) res.redirect('/admin/download'); } After posting, the type property is lost. I don't know what happened; maybe somewhere another type conflicts with this 'type'. Can you debug: const { title, permission, group} = req.body; after this line, is group already undefined? I can fix it by obfuscating more times. I think this is because more of my functions include a type property. 
like: exports.posteditad = async (req, res) => { const id = req.body.id; const { title, type } = req.body; let types = []; types = types.concat(type); console.log(types); await Ad.updateOne({ _id: id }, { $set: { title, type: types.join(',') } }); res.redirect('/admin/ad'); }; exports.postaddapp = async (req, res) => { const { title, theimg, link, duration, type } = req.body; await App.create({ title, img: theimg, link, duration, type }); res.redirect('/admin/app'); }; I can't reproduce it. Sorry. I already fixed it by obfuscating more times.
gharchive/issue
2020-02-13T10:10:22
2025-04-01T06:39:10.633124
{ "authors": [ "bookyo", "sanex3339" ], "repo": "javascript-obfuscator/javascript-obfuscator", "url": "https://github.com/javascript-obfuscator/javascript-obfuscator/issues/556", "license": "bsd-2-clause", "license_type": "permissive", "license_source": "bigquery" }
1226219201
🛑 Rodriguez Marin, 88 is down In 27c5df0, Rodriguez Marin, 88 (http://159.65.204.200:7057/) was down: HTTP code: 502 Response time: 18618 ms Resolved: Rodriguez Marin, 88 is back up in 39b99d1.
gharchive/issue
2022-05-05T05:42:16
2025-04-01T06:39:10.654752
{ "authors": [ "javisu" ], "repo": "javisu/monitor", "url": "https://github.com/javisu/monitor/issues/1603", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1470934488
🛑 Paseo de Extremadura, 298 is down In cd43365, Paseo de Extremadura, 298 (http://159.65.204.200:7046/) was down: HTTP code: 0 Response time: 0 ms Resolved: Paseo de Extremadura, 298 is back up in 9a7918e.
gharchive/issue
2022-12-01T09:02:20
2025-04-01T06:39:10.657331
{ "authors": [ "javisu" ], "repo": "javisu/monitor", "url": "https://github.com/javisu/monitor/issues/4568", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1572581605
🛑 Rodriguez Marin, 88 is down In 3796207, Rodriguez Marin, 88 (http://159.65.204.200:7057/) was down: HTTP code: 502 Response time: 18557 ms Resolved: Rodriguez Marin, 88 is back up in fe70218.
gharchive/issue
2023-02-06T13:34:03
2025-04-01T06:39:10.659901
{ "authors": [ "javisu" ], "repo": "javisu/monitor", "url": "https://github.com/javisu/monitor/issues/6059", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1580686433
🛑 Antonio Acuña, 10 is down In 097c08c, Antonio Acuña, 10 (http://159.65.204.200:7047/) was down: HTTP code: 502 Response time: 18668 ms Resolved: Antonio Acuña, 10 is back up in 52191ab.
gharchive/issue
2023-02-11T06:29:14
2025-04-01T06:39:10.662180
{ "authors": [ "javisu" ], "repo": "javisu/monitor", "url": "https://github.com/javisu/monitor/issues/6167", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1157915333
🛑 Norma, 357 is down In c593771, Norma, 357 (http://159.65.204.200:7044/) was down: HTTP code: 502 Response time: 18620 ms Resolved: Norma, 357 is back up in 4ae971d.
gharchive/issue
2022-03-03T03:05:51
2025-04-01T06:39:10.664533
{ "authors": [ "javisu" ], "repo": "javisu/monitor", "url": "https://github.com/javisu/monitor/issues/664", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1645064892
🛑 Colegio Malvar Piscina is down In 95eecd7, Colegio Malvar Piscina (http://159.65.204.200:7041/) was down: HTTP code: 0 Response time: 0 ms Resolved: Colegio Malvar Piscina is back up in 025fa6b.
gharchive/issue
2023-03-29T05:32:46
2025-04-01T06:39:10.666890
{ "authors": [ "javisu" ], "repo": "javisu/monitor", "url": "https://github.com/javisu/monitor/issues/7123", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1164869897
🛑 Brescia, 4 is down In 0e68c76, Brescia, 4 (http://159.65.204.200:7051/) was down: HTTP code: 0 Response time: 0 ms Resolved: Brescia, 4 is back up in 8ff1787.
gharchive/issue
2022-03-10T07:50:02
2025-04-01T06:39:10.669194
{ "authors": [ "javisu" ], "repo": "javisu/monitor", "url": "https://github.com/javisu/monitor/issues/825", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
20921951
Added support for flat data structures I added support for key value instances, where the li element's text is the value and no complex data structure is used. Sorry, but this is not a feature I want to add. Thanks anyways!
gharchive/pull-request
2013-10-13T05:01:34
2025-04-01T06:39:10.671489
{ "authors": [ "ermagana", "javve" ], "repo": "javve/list.js", "url": "https://github.com/javve/list.js/pull/157", "license": "mit", "license_type": "permissive", "license_source": "bigquery" }
2063701034
Unable to get http3 request against pie.dev Hi there, I was directed here by @Ousret from a merge request on httpie - hopefully this is the right location to file this bug. Please note, I would be happy to provide any additional information and try out experiments if that helps debug this issue. Summary I expected to get a http3 connection to pie.dev ❯ python Python 3.12.1 (main, Dec 8 2023, 18:57:37) [Clang 14.0.3 (clang-1403.0.22.14.1)] on darwin Type "help", "copyright", "credits" or "license" for more information. >>> from niquests import Session >>> >>> with Session() as s: ... print(s.get("https://pie.dev/get")) ... <Response HTTP/2 [200]> >>> from urllib3.contrib.hface import HTTPProtocolFactory, HTTP3Protocol >>> >>> print(HTTPProtocolFactory.new(HTTP3Protocol)) Traceback (most recent call last): File "<stdin>", line 1, in <module> File "/Users/dwt/.virtualenvs/tempenv-714d1721928c9/lib/python3.12/site-packages/urllib3/contrib/hface/protocols/_factories.py", line 126, in new return implementation_target(**kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ TypeError: HTTP3ProtocolAioQuicImpl.__init__() missing 3 required keyword-only arguments: 'remote_address', 'server_name', and 'tls_config' Reproduction Steps See above System Information $ python -m niquests.help ❯ python --version Python 3.12.1 ~/C/P/httpie 🐍 tempenv-714d1721928c9 🌱 feature-tryout-niquests ❯ macosver 10.16 ~/C/P/httpie 🐍 tempenv-714d1721928c9 🌱 feature-tryout-niquests ❯ pip list Package Version Editable project location ------------------ ---------- ------------------------------- certifi 2023.11.17 cffi 1.16.0 charset-normalizer 3.3.2 cryptography 41.0.7 defusedxml 0.7.1 h11 0.14.0 h2 4.1.0 hpack 4.0.0 httpie 4.0.0b1 /Users/dwt/Code/Projekte/httpie hyperframe 6.0.1 idna 3.6 kiss-headers 2.4.3 markdown-it-py 3.0.0 mdurl 0.1.2 multidict 6.0.4 niquests 3.4.0 pip 23.3.2 pycparser 2.21 Pygments 2.17.2 python-socks 2.4.4 qh3 0.14.0 requests 2.31.0 requests-toolbelt 1.0.0 rich 13.7.0 setuptools 69.0.3 
urllib3 2.1.0 urllib3-future 2.4.902 wassima 1.0.3 ~/C/P/httpie 🐍 tempenv-714d1721928c9 🌱 feature-tryout-niquests ❯ python -m niquests.help { "charset_normalizer": { "version": "3.3.2" }, "cryptography": { "version": "41.0.7" }, "http1": { "h11": "0.14.0" }, "http2": { "h2": "4.1.0" }, "http3": { "enabled": true, "qh3": "0.14.0" }, "idna": { "version": "3.6" }, "implementation": { "name": "CPython", "version": "3.12.1" }, "niquests": { "version": "3.4.0" }, "ocsp": { "enabled": true }, "platform": { "release": "22.6.0", "system": "Darwin" }, "system_ssl": { "version": "30200000" }, "urllib3.future": { "version": "2.4.902" }, "wassima": { "certifi_fallback": false, "enabled": true, "version": "1.0.3" } } OK, let's try to get more intel. import logging from niquests import Session logger = logging.getLogger() logger.setLevel(logging.DEBUG) explain_handler = logging.StreamHandler() explain_handler.setFormatter( logging.Formatter("%(asctime)s | %(levelname)s | %(message)s") ) logger.addHandler(explain_handler) with Session() as s: print(s.get("https://pie.dev/get")) ❯ python Python 3.12.1 (main, Dec 8 2023, 18:57:37) [Clang 14.0.3 (clang-1403.0.22.14.1)] on darwin Type "help", "copyright", "credits" or "license" for more information. >>> import logging >>> from niquests import Session >>> >>> logger = logging.getLogger() >>> logger.setLevel(logging.DEBUG) >>> explain_handler = logging.StreamHandler() >>> explain_handler.setFormatter( ... logging.Formatter("%(asctime)s | %(levelname)s | %(message)s") ... ) >>> logger.addHandler(explain_handler) >>> >>> with Session() as s: ... print(s.get("https://pie.dev/get")) ... 
2024-01-03 12:07:11,846 | DEBUG | Converted retries value: 0 -> Retry(total=0, connect=None, read=None, redirect=None, status=None) 2024-01-03 12:07:11,847 | DEBUG | Converted retries value: 0 -> Retry(total=0, connect=None, read=None, redirect=None, status=None) 2024-01-03 12:07:11,859 | DEBUG | Starting new HTTPS connection (1): pie.dev:443 2024-01-03 12:07:12,136 | DEBUG | Converted retries value: 0 -> Retry(total=0, connect=None, read=None, redirect=None, status=None) 2024-01-03 12:07:12,136 | DEBUG | Converted retries value: 0 -> Retry(total=0, connect=None, read=None, redirect=None, status=None) 2024-01-03 12:07:12,138 | DEBUG | Starting new HTTP connection (1): e1.o.lencr.org:80 2024-01-03 12:07:12,250 | DEBUG | http://e1.o.lencr.org:80 "POST / HTTP/1.1" 200 344 2024-01-03 12:07:12,253 | DEBUG | Adding (b':method', b'GET') to the header table, sensitive:False, huffman:True 2024-01-03 12:07:12,253 | DEBUG | Encoding 2 with 7 bits 2024-01-03 12:07:12,253 | DEBUG | Adding (b':scheme', b'https') to the header table, sensitive:False, huffman:True 2024-01-03 12:07:12,253 | DEBUG | Encoding 7 with 7 bits 2024-01-03 12:07:12,253 | DEBUG | Adding (b':path', b'/get') to the header table, sensitive:False, huffman:True 2024-01-03 12:07:12,253 | DEBUG | Encoding 4 with 6 bits 2024-01-03 12:07:12,253 | DEBUG | Encoding 3 with 7 bits 2024-01-03 12:07:12,253 | DEBUG | Adding (b':authority', b'pie.dev') to the header table, sensitive:False, huffman:True 2024-01-03 12:07:12,253 | DEBUG | Encoding 1 with 6 bits 2024-01-03 12:07:12,253 | DEBUG | Encoding 5 with 7 bits 2024-01-03 12:07:12,253 | DEBUG | Adding (b'user-agent', b'niquests/3.4.0') to the header table, sensitive:False, huffman:True 2024-01-03 12:07:12,253 | DEBUG | Encoding 58 with 6 bits 2024-01-03 12:07:12,253 | DEBUG | Encoding 10 with 7 bits 2024-01-03 12:07:12,253 | DEBUG | Adding (b'accept-encoding', b'gzip, deflate') to the header table, sensitive:False, huffman:True 2024-01-03 12:07:12,254 | DEBUG | Encoding 
16 with 7 bits 2024-01-03 12:07:12,254 | DEBUG | Adding (b'accept', b'*/*') to the header table, sensitive:False, huffman:True 2024-01-03 12:07:12,254 | DEBUG | Encoding 19 with 6 bits 2024-01-03 12:07:12,254 | DEBUG | Encoding 3 with 7 bits 2024-01-03 12:07:12,254 | DEBUG | Encoded header block to b'\x82\x87D\x83bb\xa7A\x85\xac\xc5^B\xf7z\x8a\xa8\xdd\xad*\x12\x86\x19]\xa5\xc1\x90S\x83\xf9c\xe7' 2024-01-03 12:07:12,305 | DEBUG | Decoding b'?\xe1\x1f\x88a\x96\xe4Y>\x94\x03*e\x1dJ\x08\x02i@\x86\xe0\x1d\xb8\x11)\x8bF\xff_\x8b\x1du\xd0b\r&=LtA\xeaT\x01*@\x96\x19\x08T!b\x1e\xa4\xd8z\x16\x1d\x14\x1f\xc2\xc4\xb0\xb2\x16\xa4\x98t#\x83M\x96\x97@\x8a$\xab\x10d\x9c\xab!#M\xa8\x86\xbf\xcfL:2^@\x87\xb0\xb5\x9e\xc4\xac\x93\xff\xffH\xff\xfd\xfc\x96\xa9+9\xaaJ?\x9b\x9f\xfb\xff\xfd\xfc\xdbe\x1f\xcd\xcf\xe6t\xa6\xb4\\\xff\xfe\x0c\x7f\xff\x06\x06\xbdE\xa1rP{d\x96\x81\xd8U\xc8z\x7f\xff\x83\x16\x16\xb3\xd8\x9f\xff\xe0\xc7v\x7f\xc4A\x9f\x81U\x15\xd1\x8b\x898\xee\xbe\x87\xf0\xe2\xd6a\xdb\xd02*+\xbbp\xf1\xd5+\x93TuE\x86\x95Eu\x00\x9e\x91\xef\xcaX\xe6\xd7\xeb~\x9f\n\x8b\x0e\t\xaf\xedg\x9b\xdc3\xf1\xf5\x82\xee\xc0>4\xfe~\xf6j\xfe\xb9\xe3\x7f\xd8=\x14\x98a\x0c\xc3\xb3\x87z\xc8\x8f^\xf68\xe7EE\x87Sx\x80\xdf\xefLCCro4\x8e\xbe\xdf\x1f\xe7\xff\xdf\xfe}\x7f3X{k\xfen\x7f$\x95j\x8bG\xf3\xf5\xfc\xd2?1\x0eb\xff7\x1c\x03O\x00\x1f\xfe\xff@\x03nel\xb1\xff\xfd\xfc\xa2\xd2\x10\xa8DR\xd82$\xc7\xab\xf9\xb8\x0f\xaf\xe6\xc2\xd6{\x13\x12O\xfc\xdc\xfeI*\xd5\x16\x8f\xe7\xeb\xf9\xa4~b\x1c\xc5\xfen8\x06\x9e\x00?\xfdv\x87%\x07\xb6Ih\x1d\x85@\x85$\xabX?_\x8fy\x99FG$|\xa5}\xd8\xdfm\xa5\xa1\xd1\xbbZ\x83\x9b\xd9\xab@\x85\x1d\tY\x1d\xc9\x90\x9d\x98?\x9b\x8d4\xcf\xf3\xf6\xa5#\x81\xe7\x1a\x00?' 
2024-01-03 12:07:12,305 | DEBUG | Decoded 4096, consumed 3 bytes 2024-01-03 12:07:12,305 | DEBUG | Resizing header table to 4096 from 4096 2024-01-03 12:07:12,305 | DEBUG | Decoded 8, consumed 1 bytes 2024-01-03 12:07:12,305 | DEBUG | Decoded (b':status', b'200'), consumed 1 2024-01-03 12:07:12,305 | DEBUG | Decoded 33, consumed 1 bytes 2024-01-03 12:07:12,305 | DEBUG | Decoded 22, consumed 1 bytes 2024-01-03 12:07:12,305 | DEBUG | Decoded (b'date', b'Wed, 03 Jan 2024 11:07:12 GMT'), total consumed 24 bytes, indexed True 2024-01-03 12:07:12,305 | DEBUG | Decoded 31, consumed 1 bytes 2024-01-03 12:07:12,305 | DEBUG | Decoded 11, consumed 1 bytes 2024-01-03 12:07:12,305 | DEBUG | Decoded (b'content-type', b'application/json'), total consumed 13 bytes, indexed True 2024-01-03 12:07:12,305 | DEBUG | Decoded 20, consumed 1 bytes 2024-01-03 12:07:12,305 | DEBUG | Decoded 1, consumed 1 bytes 2024-01-03 12:07:12,305 | DEBUG | Decoded (b'access-control-allow-origin', <memory at 0x102dfb700>), total consumed 3 bytes, indexed True 2024-01-03 12:07:12,305 | DEBUG | Decoded 22, consumed 1 bytes 2024-01-03 12:07:12,305 | DEBUG | Decoded 3, consumed 1 bytes 2024-01-03 12:07:12,305 | DEBUG | Decoded (b'access-control-allow-credentials', b'true'), total consumed 28 bytes, indexed True 2024-01-03 12:07:12,305 | DEBUG | Decoded 10, consumed 1 bytes 2024-01-03 12:07:12,306 | DEBUG | Decoded 6, consumed 1 bytes 2024-01-03 12:07:12,306 | DEBUG | Decoded (b'cf-cache-status', b'DYNAMIC'), total consumed 19 bytes, indexed True 2024-01-03 12:07:12,306 | DEBUG | Decoded 7, consumed 1 bytes 2024-01-03 12:07:12,306 | DEBUG | Decoded 199, consumed 2 bytes 2024-01-03 12:07:12,306 | DEBUG | Decoded (b'report-to', b'{"endpoints":[{"url":"https:\\/\\/a.nel.cloudflare.com\\/report\\/v3?s=LUe%2Ba2VcVSDs9FGPiauj1d%2BRFVOf6gno%2Fm%2Bs0hmaTJebgPyTNw%2FEgDR3Y8ULVyEBQ09atXZq4DPhb9z0yecFA1garUvpcsyzQ66j%2FO5G05ZjGas5dTid795V"}],"group":"cf-nel","max_age":604800}'), total consumed 210 bytes, indexed True 
2024-01-03 12:07:12,306 | DEBUG | Decoded 3, consumed 1 bytes 2024-01-03 12:07:12,306 | DEBUG | Decoded 49, consumed 1 bytes 2024-01-03 12:07:12,306 | DEBUG | Decoded (<memory at 0x102dfb1c0>, b'{"success_fraction":0,"report_to":"cf-nel","max_age":604800}'), total consumed 55 bytes, indexed True 2024-01-03 12:07:12,306 | DEBUG | Decoded 54, consumed 1 bytes 2024-01-03 12:07:12,306 | DEBUG | Decoded 7, consumed 1 bytes 2024-01-03 12:07:12,306 | DEBUG | Decoded (b'server', b'cloudflare'), total consumed 9 bytes, indexed True 2024-01-03 12:07:12,306 | DEBUG | Decoded 5, consumed 1 bytes 2024-01-03 12:07:12,306 | DEBUG | Decoded 15, consumed 1 bytes 2024-01-03 12:07:12,306 | DEBUG | Decoded (b'cf-ray', b'83fac6d9ee97b954-AMS'), total consumed 23 bytes, indexed True 2024-01-03 12:07:12,306 | DEBUG | Decoded 26, consumed 1 bytes 2024-01-03 12:07:12,306 | DEBUG | Decoded 3, consumed 1 bytes 2024-01-03 12:07:12,306 | DEBUG | Decoded (b'content-encoding', b'gzip'), total consumed 5 bytes, indexed True 2024-01-03 12:07:12,306 | DEBUG | Decoded 5, consumed 1 bytes 2024-01-03 12:07:12,306 | DEBUG | Decoded 16, consumed 1 bytes 2024-01-03 12:07:12,306 | DEBUG | Decoded (b'alt-svc', b'h3=":443"; ma=86400'), total consumed 24 bytes, indexed True 2024-01-03 12:07:12,306 | DEBUG | https://pie.dev:443 "GET /get HTTP/2.0" 200 None <Response HTTP/2 [200]> >>> Everything seems fine on the Niquests side. I have a reasonable explanation as to why. Forget about the HTTP/1.1 request; it is an OCSP request made to ensure you're not getting MITM-hacked. from niquests import Session with Session() as s: print(s.get("https://pie.dev/get")) print(s.get("https://pie.dev/get")) Then run from niquests import Session with Session(resolver="doh+google://") as s: print(s.get("https://pie.dev/get")) On the HTTPie side, verify your config directory is actually writable. Run (twice): https pie.dev/get and https --resolver "doh+google://" pie.dev/get You should have a file in /Users/dwt/.config/httpie named quic.json. 
Finally, I did find a bug in the PR. The caching layer wasn't properly injected in the custom HTTPSAdapter, and made the --http3 flag useless. I fixed it. Thank you for the report and debug. Tell me if this (recent push) fixed the problem. Do you confirm it's fixed? If so, feel free to close this issue. Re: I am going to suppose everything is OK on your side. If not, do not hesitate to let me know. Regards, Sorry, I was a bit away the last few days - I have retested now and had these findings: ❯ python Python 3.12.1 (main, Dec 8 2023, 18:57:37) [Clang 14.0.3 (clang-1403.0.22.14.1)] on darwin Type "help", "copyright", "credits" or "license" for more information. >>> from niquests import Session with Session() as s: print(s.get("https://pie.dev/get")) print(s.get("https://pie.dev/get")) >>> >>> with Session() as s: ... print(s.get("https://pie.dev/get")) ... print(s.get("https://pie.dev/get")) ... <Response HTTP/2 [200]> <Response HTTP/3 [200]> >>> ^D ~/C/P/httpie 🐍 tempenv-714d1721928c9 🌱 feature-tryout-niquests 6s ❯ python Python 3.12.1 (main, Dec 8 2023, 18:57:37) [Clang 14.0.3 (clang-1403.0.22.14.1)] on darwin Type "help", "copyright", "credits" or "license" for more information. >>> from niquests import Session with Session(resolver="doh+google://") as s: print(s.get("https://pie.dev/get")) >>> >>> with Session(resolver="doh+google://") as s: ... print(s.get("https://pie.dev/get")) ... <Response HTTP/3 [200]> >>> ^D This seems to work - could you perhaps elaborate why the first request didn't work but the second did? Does the session need to collect the information that the host is http3 capable before trying it? 
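The HTTP/2-then-HTTP/3 pattern in the session above is the Alt-Svc discovery mechanism (RFC 7838): the first request has no cached knowledge that the origin speaks HTTP/3, so it negotiates HTTP/2, but the response carries `Alt-Svc: h3=":443"; ma=86400` advertising the QUIC endpoint. The client caches that hint (hence the `quic.json` file in the HTTPie config directory) and can attempt HTTP/3 on subsequent requests. A hedged, stdlib-only sketch of parsing such a header — the function name is illustrative, not part of the Niquests API:

```python
import re

def parse_alt_svc(header: str) -> dict:
    """Parse an Alt-Svc value such as 'h3=":443"; ma=86400' (RFC 7838)."""
    services = {}
    for entry in header.split(","):
        m = re.match(r'\s*([\w-]+)="([^"]*)"', entry)
        if not m:
            continue
        ma = re.search(r"ma=(\d+)", entry)
        services[m.group(1)] = {
            "authority": m.group(2),  # host:port; an empty host keeps the origin's host
            "max_age": int(ma.group(1)) if ma else 86400,  # RFC 7838 default: 24h
        }
    return services

print(parse_alt_svc('h3=":443"; ma=86400'))
# → {'h3': {'authority': ':443', 'max_age': 86400}}
```

So the second request only "upgrades" because the first one populated this cache — without a persisted cache, every fresh process would start on HTTP/2 again.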
On the shell I was not so lucky: ❯ https pie.dev/get HTTP/2 200 OK Access-Control-Allow-Credentials: true Access-Control-Allow-Origin: * Alt-Svc: h3=":443"; ma=86400 Cf-Cache-Status: DYNAMIC Cf-Ray: 842cb3d0b8d70baa-AMS Content-Encoding: gzip Content-Type: application/json Date: Tue, 09 Jan 2024 12:32:20 GMT Nel: {"success_fraction":0,"report_to":"cf-nel","max_age":604800} Report-To: {"endpoints":[{"url":"https:\/\/a.nel.cloudflare.com\/report\/v3?s=2oJxp%2FRmNZ11y7hhOr9bkcI3MRmBHh0wyDGi5ZbvJbFo7tkxRJ7ULTPR%2F4EJslQGJhR4B2jh2zwO9BkNA8Jl55t2n%2FAHrW99AC7QjOTWDWqvQhgGvSdkuZhV"}],"group":"cf-nel","max_age":604800} Server: cloudflare { "args": {}, "headers": { "Accept": "*/*", "Accept-Encoding": "gzip", "Cdn-Loop": "cloudflare", "Cf-Connecting-Ip": "91.65.247.116", "Cf-Ipcountry": "DE", "Cf-Ray": "842cb3d0b8d70baa-FRA", "Cf-Visitor": "{\"scheme\":\"https\"}", "Connection": "Keep-Alive", "Host": "pie.dev", "User-Agent": "HTTPie/4.0.0.b1" }, "origin": "91.65.247.116", "url": "https://pie.dev/get" } ~/C/P/httpie 🐍 tempenv-714d1721928c9 🌱 feature-tryout-niquests ❯ https pie.dev/get HTTP/2 200 OK Access-Control-Allow-Credentials: true Access-Control-Allow-Origin: * Alt-Svc: h3=":443"; ma=86400 Cf-Cache-Status: DYNAMIC Cf-Ray: 842cb3e7be386fab-CDG Content-Encoding: gzip Content-Type: application/json Date: Tue, 09 Jan 2024 12:32:24 GMT Nel: {"success_fraction":0,"report_to":"cf-nel","max_age":604800} Report-To: {"endpoints":[{"url":"https:\/\/a.nel.cloudflare.com\/report\/v3?s=x7%2BYUHaNka9LBnCLM6ndKctXAAokJ1BGvBYvJF9pdbc0yjLZowYIT68ULkJ2sQyw50STbTjTWS8uE2wWAb0%2BmMkbE4JWrCczanMqBw%2BN8Mw0EH24c7eG1RVY"}],"group":"cf-nel","max_age":604800} Server: cloudflare { "args": {}, "headers": { "Accept": "*/*", "Accept-Encoding": "gzip", "Cdn-Loop": "cloudflare", "Cf-Connecting-Ip": "91.65.247.116", "Cf-Ipcountry": "DE", "Cf-Ray": "842cb3e7be386fab-FRA", "Cf-Visitor": "{\"scheme\":\"https\"}", "Connection": "Keep-Alive", "Host": "pie.dev", "User-Agent": "HTTPie/4.0.0.b1" }, "origin": 
"91.65.247.116", "url": "https://pie.dev/get" } ~/C/P/httpie 🐍 tempenv-714d1721928c9 🌱 feature-tryout-niquests ❯ https --resolver "doh+google://" pie.dev/get HTTP/2 200 OK Access-Control-Allow-Credentials: true Access-Control-Allow-Origin: * Alt-Svc: h3=":443"; ma=86400 Cf-Cache-Status: DYNAMIC Cf-Ray: 842cb4360fa76627-AMS Content-Encoding: gzip Content-Type: application/json Date: Tue, 09 Jan 2024 12:32:36 GMT Nel: {"success_fraction":0,"report_to":"cf-nel","max_age":604800} Report-To: {"endpoints":[{"url":"https:\/\/a.nel.cloudflare.com\/report\/v3?s=Fp5K55yzrboR%2FvkSr%2FPSGCQDM4Ghoo0SrBN%2B6c4aGvLtcMZ%2Byb4YFsyN5cXTPwzXXNQ4pQuOSO95TJcA%2FscaiqVFzaZhOREMhbW4n3atnpWkDsFXVNXKsK1B"}],"group":"cf-nel","max_age":604800} Server: cloudflare { "args": {}, "headers": { "Accept": "*/*", "Accept-Encoding": "gzip", "Cdn-Loop": "cloudflare", "Cf-Connecting-Ip": "91.65.247.116", "Cf-Ipcountry": "DE", "Cf-Ray": "842cb4360fa76627-FRA", "Cf-Visitor": "{\"scheme\":\"https\"}", "Connection": "Keep-Alive", "Host": "pie.dev", "User-Agent": "HTTPie/4.0.0.b1" }, "origin": "91.65.247.116", "url": "https://pie.dev/get" } ~/C/P/httpie 🐍 tempenv-714d1721928c9 🌱 feature-tryout-niquests ❯ https --resolver "doh+google://" pie.dev/get HTTP/2 200 OK Access-Control-Allow-Credentials: true Access-Control-Allow-Origin: * Alt-Svc: h3=":443"; ma=86400 Cf-Cache-Status: DYNAMIC Cf-Ray: 842cb44ccf350a6f-AMS Content-Encoding: gzip Content-Type: application/json Date: Tue, 09 Jan 2024 12:32:40 GMT Nel: {"success_fraction":0,"report_to":"cf-nel","max_age":604800} Report-To: {"endpoints":[{"url":"https:\/\/a.nel.cloudflare.com\/report\/v3?s=23Q18VEt4d3YuApMUBS0NTExzN7TV5dpwqZTzifTgpck9dNsMyCCsyur8MPNw%2BIJy361tQXa3hGY%2F%2FzQzq3%2Fx1kF2FooPIdblwTZQLSIeDDSxchRi6mKuwmO"}],"group":"cf-nel","max_age":604800} Server: cloudflare { "args": {}, "headers": { "Accept": "*/*", "Accept-Encoding": "gzip", "Cdn-Loop": "cloudflare", "Cf-Connecting-Ip": "91.65.247.116", "Cf-Ipcountry": "DE", "Cf-Ray": 
"842cb44ccf350a6f-FRA", "Cf-Visitor": "{\"scheme\":\"https\"}", "Connection": "Keep-Alive", "Host": "pie.dev", "User-Agent": "HTTPie/4.0.0.b1" }, "origin": "91.65.247.116", "url": "https://pie.dev/get" } ~/C/P/httpie 🐍 tempenv-714d1721928c9 🌱 feature-tryout-niquests ❯ https --http3 --resolver "doh+google://" pie.dev/get HTTP/2 200 OK Access-Control-Allow-Credentials: true Access-Control-Allow-Origin: * Alt-Svc: h3=":443"; ma=86400 Cf-Cache-Status: DYNAMIC Cf-Ray: 842cb5b37abe6f04-CDG Content-Encoding: gzip Content-Type: application/json Date: Tue, 09 Jan 2024 12:33:37 GMT Nel: {"success_fraction":0,"report_to":"cf-nel","max_age":604800} Report-To: {"endpoints":[{"url":"https:\/\/a.nel.cloudflare.com\/report\/v3?s=6R7RN875gKI2cArgg3Mw9EHgmOQdoP5FgQUhkDgs%2F4A6QPxZjnzvdei1aMvyxovWIxvVNdYjScTDrYr%2BtTnHdMKOTePN6X0O4bqBgDplqFoLr1L3%2BRV54SZp"}],"group":"cf-nel","max_age":604800} Server: cloudflare { "args": {}, "headers": { "Accept": "*/*", "Accept-Encoding": "gzip", "Cdn-Loop": "cloudflare", "Cf-Connecting-Ip": "91.65.247.116", "Cf-Ipcountry": "DE", "Cf-Ray": "842cb5b37abe6f04-FRA", "Cf-Visitor": "{\"scheme\":\"https\"}", "Connection": "Keep-Alive", "Host": "pie.dev", "User-Agent": "HTTPie/4.0.0.b1" }, "origin": "91.65.247.116", "url": "https://pie.dev/get" } ~/C/P/httpie 🐍 tempenv-714d1721928c9 🌱 feature-tryout-niquests ❯ https --http3 pie.dev/get HTTP/2 200 OK Access-Control-Allow-Credentials: true Access-Control-Allow-Origin: * Alt-Svc: h3=":443"; ma=86400 Cf-Cache-Status: DYNAMIC Cf-Ray: 842cb5db7d00f110-CDG Content-Encoding: gzip Content-Type: application/json Date: Tue, 09 Jan 2024 12:33:44 GMT Nel: {"success_fraction":0,"report_to":"cf-nel","max_age":604800} Report-To: {"endpoints":[{"url":"https:\/\/a.nel.cloudflare.com\/report\/v3?s=Jhqw1wkYiH55ne8D4%2B4qGnyyHbbx01QNCTZcCcqQYFuXv%2FwQb3ZGu%2FYmZI6VBiglfw4GXmo2XhG0F26m9ZdSgur4U3Gl68Rz83DOglkIAgtv4NtTYxWz9aor"}],"group":"cf-nel","max_age":604800} Server: cloudflare { "args": {}, "headers": { "Accept": "*/*", 
"Accept-Encoding": "gzip", "Cdn-Loop": "cloudflare", "Cf-Connecting-Ip": "91.65.247.116", "Cf-Ipcountry": "DE", "Cf-Ray": "842cb5db7d00f110-FRA", "Cf-Visitor": "{\"scheme\":\"https\"}", "Connection": "Keep-Alive", "Host": "pie.dev", "User-Agent": "HTTPie/4.0.0.b1" }, "origin": "91.65.247.116", "url": "https://pie.dev/get" } All the while the config directory seems to be perfectly writeable, but no config file shows up there. ~/.config/httpie ❯ ls -al . total 8 drwxr-xr-x 3 dwt staff 96 3 Jan 10:43 ./ drwxr-xr-x 23 dwt staff 736 3 Jan 10:43 ../ -rw-r--r-- 1 dwt staff 221 3 Jan 10:43 version_info.json If you give me a point where to start, I can try to debug the library to see why it can't seem to write in that directory? OK. Let's try to further understand your case. Immediately I would try to remove httpie completely and re do the installation of the fork/patch and try again. I am suspicious on how --http3 did not work. I think it will resolve your case definitely. Let me know. could you perhaps elaborate why the first request didn't work but the second did? Does the session need to collect the information that the host is http3 capable before trying it? without a custom DNS, you cannot reach a HTTP/3 endpoint without prior establishing a HTTP/1 or HTTP/2 link, that's why. Not sure, it seems that your assesmen that the quic config file is not written still seems correct. Even when recreating the full venv and trying the requests multiple times that file is not written - and of course the requests stay at http/2 Details ❯ cd Code/Projekte/httpie/ ~/C/P/httpie 🐍 🌱 feature-tryout-niquests ❯ vf tmp Creating tempenv-51b5170562de8 via ~/.local/pipx/venvs/virtualfish/bin/python … ~/C/P/httpie 🐍 tempenv-51b5170562de8 🌱 feature-tryout-niquests 3s ❯ git pull Already up to date. 
~/C/P/httpie 🐍 tempenv-51b5170562de8 🌱 feature-tryout-niquests 2s ❯ git remote -v origin git@github.com:Ousret/httpie.git (fetch) origin git@github.com:Ousret/httpie.git (push) ~/C/P/httpie 🐍 tempenv-51b5170562de8 🌱 feature-tryout-niquests ❯ pip install --editable . Obtaining file:///Users/dwt/Code/Projekte/httpie Installing build dependencies ... done Checking if build backend supports build_editable ... done Getting requirements to build editable ... done Preparing editable metadata (pyproject.toml) ... done Requirement already satisfied: pip in /Users/dwt/.virtualenvs/tempenv-51b5170562de8/lib/python3.12/site-packages (from httpie==4.0.0b1) (23.3.2) Collecting charset-normalizer>=2.0.0 (from httpie==4.0.0b1) Using cached charset_normalizer-3.3.2-cp312-cp312-macosx_11_0_arm64.whl.metadata (33 kB) Collecting defusedxml>=0.6.0 (from httpie==4.0.0b1) Using cached defusedxml-0.7.1-py2.py3-none-any.whl (25 kB) Collecting niquests<4,>=3.4.0 (from niquests[socks]<4,>=3.4.0->httpie==4.0.0b1) Downloading niquests-3.4.1-py3-none-any.whl.metadata (6.4 kB) Collecting Pygments>=2.5.2 (from httpie==4.0.0b1) Using cached pygments-2.17.2-py3-none-any.whl.metadata (2.6 kB) Collecting setuptools (from httpie==4.0.0b1) Using cached setuptools-69.0.3-py3-none-any.whl.metadata (6.3 kB) Collecting rich>=9.10.0 (from httpie==4.0.0b1) Using cached rich-13.7.0-py3-none-any.whl.metadata (18 kB) Collecting idna<4,>=2.5 (from niquests<4,>=3.4.0->niquests[socks]<4,>=3.4.0->httpie==4.0.0b1) Using cached idna-3.6-py3-none-any.whl.metadata (9.9 kB) Collecting kiss-headers<4,>=2 (from niquests<4,>=3.4.0->niquests[socks]<4,>=3.4.0->httpie==4.0.0b1) Using cached kiss_headers-2.4.3-py3-none-any.whl.metadata (13 kB) Collecting urllib3-future<3,>=2.4.901 (from niquests<4,>=3.4.0->niquests[socks]<4,>=3.4.0->httpie==4.0.0b1) Downloading urllib3_future-2.4.903-py3-none-any.whl.metadata (6.1 kB) Collecting wassima<2,>=1.0.1 (from niquests<4,>=3.4.0->niquests[socks]<4,>=3.4.0->httpie==4.0.0b1) Using 
cached wassima-1.0.3-cp37-abi3-macosx_11_0_arm64.whl.metadata (3.8 kB) Collecting markdown-it-py>=2.2.0 (from rich>=9.10.0->httpie==4.0.0b1) Using cached markdown_it_py-3.0.0-py3-none-any.whl.metadata (6.9 kB) Collecting mdurl~=0.1 (from markdown-it-py>=2.2.0->rich>=9.10.0->httpie==4.0.0b1) Using cached mdurl-0.1.2-py3-none-any.whl (10.0 kB) Collecting h11<1.0.0,>=0.11.0 (from urllib3-future<3,>=2.4.901->niquests<4,>=3.4.0->niquests[socks]<4,>=3.4.0->httpie==4.0.0b1) Using cached h11-0.14.0-py3-none-any.whl (58 kB) Collecting h2<5.0.0,>=4.0.0 (from urllib3-future<3,>=2.4.901->niquests<4,>=3.4.0->niquests[socks]<4,>=3.4.0->httpie==4.0.0b1) Using cached h2-4.1.0-py3-none-any.whl (57 kB) Collecting qh3<1.0.0,>=0.14.0 (from urllib3-future<3,>=2.4.901->niquests<4,>=3.4.0->niquests[socks]<4,>=3.4.0->httpie==4.0.0b1) Using cached qh3-0.14.0-cp37-abi3-macosx_11_0_arm64.whl.metadata (4.8 kB) Collecting python-socks<3.0,>=2.0 (from urllib3-future[socks]<3,>=2.4.901; extra == "socks"->niquests[socks]<4,>=3.4.0->httpie==4.0.0b1) Using cached python_socks-2.4.4-py3-none-any.whl.metadata (7.1 kB) Collecting hyperframe<7,>=6.0 (from h2<5.0.0,>=4.0.0->urllib3-future<3,>=2.4.901->niquests<4,>=3.4.0->niquests[socks]<4,>=3.4.0->httpie==4.0.0b1) Using cached hyperframe-6.0.1-py3-none-any.whl (12 kB) Collecting hpack<5,>=4.0 (from h2<5.0.0,>=4.0.0->urllib3-future<3,>=2.4.901->niquests<4,>=3.4.0->niquests[socks]<4,>=3.4.0->httpie==4.0.0b1) Using cached hpack-4.0.0-py3-none-any.whl (32 kB) Collecting cryptography<42.0.0,>=41.0.0 (from qh3<1.0.0,>=0.14.0->urllib3-future<3,>=2.4.901->niquests<4,>=3.4.0->niquests[socks]<4,>=3.4.0->httpie==4.0.0b1) Using cached cryptography-41.0.7-cp37-abi3-macosx_10_12_universal2.whl.metadata (5.2 kB) Collecting cffi>=1.12 (from cryptography<42.0.0,>=41.0.0->qh3<1.0.0,>=0.14.0->urllib3-future<3,>=2.4.901->niquests<4,>=3.4.0->niquests[socks]<4,>=3.4.0->httpie==4.0.0b1) Using cached cffi-1.16.0-cp312-cp312-macosx_11_0_arm64.whl.metadata (1.5 kB) Collecting 
pycparser (from cffi>=1.12->cryptography<42.0.0,>=41.0.0->qh3<1.0.0,>=0.14.0->urllib3-future<3,>=2.4.901->niquests<4,>=3.4.0->niquests[socks]<4,>=3.4.0->httpie==4.0.0b1) Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB) Using cached charset_normalizer-3.3.2-cp312-cp312-macosx_11_0_arm64.whl (119 kB) Downloading niquests-3.4.1-py3-none-any.whl (94 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 94.5/94.5 kB 1.2 MB/s eta 0:00:00 Using cached pygments-2.17.2-py3-none-any.whl (1.2 MB) Using cached rich-13.7.0-py3-none-any.whl (240 kB) Using cached setuptools-69.0.3-py3-none-any.whl (819 kB) Using cached idna-3.6-py3-none-any.whl (61 kB) Using cached kiss_headers-2.4.3-py3-none-any.whl (43 kB) Using cached markdown_it_py-3.0.0-py3-none-any.whl (87 kB) Downloading urllib3_future-2.4.903-py3-none-any.whl (337 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 338.0/338.0 kB 6.1 MB/s eta 0:00:00 Using cached wassima-1.0.3-cp37-abi3-macosx_11_0_arm64.whl (261 kB) Using cached python_socks-2.4.4-py3-none-any.whl (52 kB) Using cached qh3-0.14.0-cp37-abi3-macosx_11_0_arm64.whl (263 kB) Using cached cryptography-41.0.7-cp37-abi3-macosx_10_12_universal2.whl (5.3 MB) Using cached cffi-1.16.0-cp312-cp312-macosx_11_0_arm64.whl (177 kB) Building wheels for collected packages: httpie Building editable for httpie (pyproject.toml) ... 
done Created wheel for httpie: filename=httpie-4.0.0b1-0.editable-py3-none-any.whl size=7312 sha256=d77ed3eca64b9478e633b272b9e9142a5ec2ba03fab00b6a83a31c456fef829b Stored in directory: /private/var/folders/hc/ss_swh2s7rscgx4402mjxtv00000gn/T/pip-ephem-wheel-cache-nppdx0j0/wheels/60/bd/76/3d063ddffe013d962cd599bbfd12c29d9a91313e761fc7f76e Successfully built httpie Installing collected packages: python-socks, wassima, setuptools, Pygments, pycparser, mdurl, kiss-headers, idna, hyperframe, hpack, h11, defusedxml, charset-normalizer, markdown-it-py, h2, cffi, rich, cryptography, qh3, urllib3-future, niquests, httpie Successfully installed Pygments-2.17.2 cffi-1.16.0 charset-normalizer-3.3.2 cryptography-41.0.7 defusedxml-0.7.1 h11-0.14.0 h2-4.1.0 hpack-4.0.0 httpie-4.0.0b1 hyperframe-6.0.1 idna-3.6 kiss-headers-2.4.3 markdown-it-py-3.0.0 mdurl-0.1.2 niquests-3.4.1 pycparser-2.21 python-socks-2.4.4 qh3-0.14.0 rich-13.7.0 setuptools-69.0.3 urllib3-future-2.4.903 wassima-1.0.3 ~/C/P/httpie 🐍 tempenv-51b5170562de8 🌱 feature-tryout-niquests 14s ❯ pip list Package Version Editable project location ------------------ ------- ------------------------------- cffi 1.16.0 charset-normalizer 3.3.2 cryptography 41.0.7 defusedxml 0.7.1 h11 0.14.0 h2 4.1.0 hpack 4.0.0 httpie 4.0.0b1 /Users/dwt/Code/Projekte/httpie hyperframe 6.0.1 idna 3.6 kiss-headers 2.4.3 markdown-it-py 3.0.0 mdurl 0.1.2 niquests 3.4.1 pip 23.3.2 pycparser 2.21 Pygments 2.17.2 python-socks 2.4.4 qh3 0.14.0 rich 13.7.0 setuptools 69.0.3 urllib3-future 2.4.903 wassima 1.0.3 ~/C/P/httpie 🐍 tempenv-51b5170562de8 🌱 feature-tryout-niquests ❯ which https /Users/dwt/.virtualenvs/tempenv-51b5170562de8/bin/https ~/C/P/httpie 🐍 tempenv-51b5170562de8 🌱 feature-tryout-niquests ❯ https --http3 --resolver "doh+google://" pie.dev/get HTTP/2 200 OK Access-Control-Allow-Credentials: true Access-Control-Allow-Origin: * Alt-Svc: h3=":443"; ma=86400 Cf-Cache-Status: DYNAMIC Cf-Ray: 842e2f7a4e881c82-AMS Content-Encoding: gzip 
Content-Type: application/json Date: Tue, 09 Jan 2024 16:51:31 GMT Nel: {"success_fraction":0,"report_to":"cf-nel","max_age":604800} Report-To: {"endpoints":[{"url":"https:\/\/a.nel.cloudflare.com\/report\/v3?s=yU2NQKMxVXTqBiwWJaDtlkEUH3k5p4Z6g7%2BhFgmAFrgFXzR1KMqb98mx7Mu5x4LHu3NPz8Wm7NxYanPM4o18syEXQWm9ug2%2BOo6pOjFaMt%2BHmvRAunSgO%2FND"}],"group":"cf-nel","max_age":604800} Server: cloudflare { "args": {}, "headers": { "Accept": "*/*", "Accept-Encoding": "gzip", "Cdn-Loop": "cloudflare", "Cf-Connecting-Ip": "91.65.247.116", "Cf-Ipcountry": "DE", "Cf-Ray": "842e2f7a4e881c82-FRA", "Cf-Visitor": "{\"scheme\":\"https\"}", "Connection": "Keep-Alive", "Host": "pie.dev", "User-Agent": "HTTPie/4.0.0.b1" }, "origin": "91.65.247.116", "url": "https://pie.dev/get" } ~/C/P/httpie 🐍 tempenv-51b5170562de8 🌱 feature-tryout-niquests 3s ❯ https --http3 pie.dev/get HTTP/2 200 OK Access-Control-Allow-Credentials: true Access-Control-Allow-Origin: * Alt-Svc: h3=":443"; ma=86400 Cf-Cache-Status: DYNAMIC Cf-Ray: 842e2feaadf365df-FRA Content-Encoding: gzip Content-Type: application/json Date: Tue, 09 Jan 2024 16:51:49 GMT Nel: {"success_fraction":0,"report_to":"cf-nel","max_age":604800} Report-To: {"endpoints":[{"url":"https:\/\/a.nel.cloudflare.com\/report\/v3?s=oymUHuIvuaPYe9IGgqJnfHIrH6FGoLgcwv%2Bt3Mrkv%2F0%2F4ejdbfgXqUEDsCRESYYv6MSRxbvr84jZnsPyVc5K12ZiaZB%2FCssOdpPV08UNm%2BwNqpAlr7Z1PXL7"}],"group":"cf-nel","max_age":604800} Server: cloudflare { "args": {}, "headers": { "Accept": "*/*", "Accept-Encoding": "gzip", "Cdn-Loop": "cloudflare", "Cf-Connecting-Ip": "91.65.247.116", "Cf-Ipcountry": "DE", "Cf-Ray": "842e2feaadf365df-FRA", "Cf-Visitor": "{\"scheme\":\"https\"}", "Connection": "Keep-Alive", "Host": "pie.dev", "User-Agent": "HTTPie/4.0.0.b1" }, "origin": "91.65.247.116", "url": "https://pie.dev/get" } ~/C/P/httpie 🐍 tempenv-51b5170562de8 🌱 feature-tryout-niquests ❯ ls ~/.config/httpie/ version_info.json OK. 
I could collect some data using https://github.com/Ousret/httpie-test/actions/runs/7470350480/job/20328900920 I am going to be able to propose a fix for this, hopefully soon. It has to do with the args parser, something seems off. Oh great, it is reproducible! Please ping me if I can help. Also: sorry if it might take me a few days to respond. Wasn't easy, but ultimately found why. It depended on the system openssl default cipher list, that excluded by accident QUIC ciphers, therefore disabled http/3... It is fixed. You may try again. yay! ❯ https --http3 pie.dev/get HTTP/3 200 OK Access-Control-Allow-Credentials: true Access-Control-Allow-Origin: * Alt-Svc: h3=":443"; ma=86400 Cf-Cache-Status: DYNAMIC Cf-Ray: 84678dcdbdec0bdc-AMS Content-Encoding: gzip Content-Type: application/json Date: Tue, 16 Jan 2024 15:57:23 GMT Nel: {"success_fraction":0,"report_to":"cf-nel","max_age":604800} Report-To: {"endpoints":[{"url":"https:\/\/a.nel.cloudflare.com\/report\/v3?s=g7XjPz9ElzpYpSXcIk4CyBHzEoVn4VOu80R65aLy6UJ9bLs3H9zT%2FK%2BNiuYRKsqn%2BWQgG6VHnUtChRB9JzUL39l9Tk11yplURAJR3pqdteMkkiytQEEvPDR6"}],"group":"cf-nel","max_age":604800} Server: cloudflare { "args": {}, "headers": { "Accept": "*/*", "Accept-Encoding": "gzip", "Cdn-Loop": "cloudflare", "Cf-Connecting-Ip": "91.65.247.116", "Cf-Ipcountry": "DE", "Cf-Ray": "84678dcdbdec0bdc-FRA", "Cf-Visitor": "{\"scheme\":\"https\"}", "Connection": "Keep-Alive", "Host": "pie.dev", "User-Agent": "HTTPie/4.0.0.b1" }, "origin": "91.65.247.116", "url": "https://pie.dev/get" } Indeed this fixes it for me. :)
gharchive/issue
2024-01-03T10:20:17
2025-04-01T06:39:10.699507
{ "authors": [ "Ousret", "dwt" ], "repo": "jawah/niquests", "url": "https://github.com/jawah/niquests/issues/60", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
1385063342
[Page addition] Cloud9 I felt it would be good if the Cloud9 instructions could also be viewed on the same site. What do you think? I'll incorporate this ahead of the rehearsal!
gharchive/pull-request
2022-09-25T15:10:01
2025-04-01T06:39:10.714156
{ "authors": [ "horsewin", "yamatatsu" ], "repo": "jaws-ug-cdk/aws-cdk-intro-workshop", "url": "https://github.com/jaws-ug-cdk/aws-cdk-intro-workshop/pull/28", "license": "MIT-0", "license_type": "permissive", "license_source": "github-api" }
523761042
gam org operations fail when orgnames contain spaces I have upgraded to the latest GAM release from https://git.io/gamreleases and I still have this issue. I am typing the command as described in the GAM Wiki at https://github.com/jay0lee/gam/wiki Full steps to reproduce the issue: create an OU path that includes spaces in the OU and the sub OU create a user try to move the user to the sub OU you created, quoting the name of the OU to avoid spaces, or using the id, to circumvent the whole spaces issue altogether example: gam update org id:someorgid users someuser reports back: ERROR: 400: Invalid Input: INVALID_OU_ID - invalid trying gam update user someuser org "Some Orgname With Spaces/Suborg With Spaces/SubSubOrg With Spaces" reports back: ERROR: Spaces/Suborg is not a valid argument for "gam update user" Expected outcome (what are you trying to do?): The user is successfully moved to the OU Actual outcome (what errors or bad behavior do you see instead?): error message resulting from spaces in the name It's working for me. Ross $ gam version GAM 4.96 - https://git.io/gam Jay Lee <jay0lee@gmail.com> Python 3.8.0 64-bit final google-api-python-client 1.7.11 MacOS Sierra 10.12.6 x86_64 Path: /Users/admin/Documents/GoogleApps/GAM $ gam info ou "/Test Space/Sub Space/Sub Sub Space" name: Sub Sub Space description: Sub Sub Space orgUnitPath: /Test Space/Sub Space/Sub Sub Space orgUnitId: id:03ph8a2z24acgk2 parentOrgUnitPath: /Test Space/Sub Space parentOrgUnitId: id:03ph8a2z3h9j9wu Got 0 Users: - Users: $ gam update user testuser7 org "/Test Space/Sub Space/Sub Sub Space" updating user testuser7@rdschool.org... 
$ gam info user testuser7 nogroups nolicenses
User: testuser7@rdschool.org
First Name: Test
Last Name: User7
Is a Super Admin: False
Is Delegated Admin: False
2-step enrolled: False
2-step enforced: False
Has Agreed to Terms: False
IP Whitelisted: False
Account Suspended: False
Is Archived: False
Must Change Password: False
Google Unique ID: 666925742703179281666
Customer ID: C012345678
Mailbox is setup: True
Included in GAL: True
Creation Time: 2019-11-09T00:35:09.000Z
Last login time: Never
Google Org Unit Path: /Test Space/Sub Space/Sub Sub Space

$ gam info ou "/Test Space/Sub Space/Sub Sub Space"
name: Sub Sub Space
description: Sub Sub Space
orgUnitPath: /Test Space/Sub Space/Sub Sub Space
orgUnitId: id:03ph8a2z24acgk2
parentOrgUnitPath: /Test Space/Sub Space
parentOrgUnitId: id:03ph8a2z3h9j9wu
Got 1 Users: testuser7@rdschool.org
- testuser7@rdschool.org
Users: testuser7@rdschool.org

This is not supported by Gam
$ gam update user testuser7 org id:03ph8a2z24acgk2
ERROR: id:03ph8a2z24acgk2 is not valid in this context

Found the problem, it had to do with how I was executing gam:

#!/bin/bash
pushd ~/Downloads/gam
./gam
popd

Just calling gam directly negates the issue.
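For what it's worth, the earlier error pattern (ERROR: Spaces/Suborg is not a valid argument) is exactly what shell word splitting inside a wrapper script produces. A minimal sketch (hypothetical: the wrapper's actual argument-forwarding line isn't shown above) of how an OU path with spaces survives, or doesn't, depending on whether a wrapper forwards its arguments with an unquoted `$*` or with `"$@"`:

```shell
# Hypothetical wrapper-forwarding demo: simulate the arguments a wrapper
# script would receive for a gam call with a quoted OU path.
set -- update user someuser org "Some Orgname With Spaces/Suborg With Spaces"

# Unquoted $* re-splits the quoted path into separate words.
resplit=$(for a in $*; do printf '[%s]' "$a"; done)
# Quoted "$@" preserves each original argument intact.
preserved=$(for a in "$@"; do printf '[%s]' "$a"; done)

echo "$resplit"
echo "$preserved"
```

The re-split form is what makes gam see `Spaces/Suborg` as a stray argument; a wrapper that forwards with `"$@"` would not have this problem.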
gharchive/issue
2019-11-16T00:39:37
2025-04-01T06:39:10.719545
{ "authors": [ "charmparticle", "taers232c" ], "repo": "jay0lee/GAM", "url": "https://github.com/jay0lee/GAM/issues/1045", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
962776501
🛑 UpMob-API NL is down
In 33c165f, UpMob-API NL ($UPMOB_API) was down:
HTTP code: 0
Response time: 0 ms
Resolved: UpMob-API NL is back up in e2fe13d.
gharchive/issue
2021-08-06T14:13:28
2025-04-01T06:39:10.722697
{ "authors": [ "jayantkatia" ], "repo": "jayantkatia/status", "url": "https://github.com/jayantkatia/status/issues/72", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1031339662
support in-line comments #5 but there are performance concerns Thank you for a very good library. When I looked at the issues, there were some that were not supported, so I sent a Pull Request. However, there are performance concerns with this implementation. It will be a bit complicated, but should I change the processing between the cases with an in-line comment and without one? Hi Bootjp, Thanks for your kind words :) Can you elaborate a bit on the performance concerns? How large of hosts files are you trying to consume with this? Not sure if I'm missing something, because this PR seems reasonable AFAICT. @bootjp LGTM, thank you!
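Regarding the performance concern: the usual low-overhead approach is to cut the comment off with a single byte scan before any field splitting, so lines without a `#` pay only one `IndexByte` call. A sketch of that idea (an illustration, not the library's actual code):

```go
package main

import (
	"fmt"
	"strings"
)

// stripInlineComment drops everything from the first '#' onward, then trims
// surrounding whitespace. The single IndexByte scan keeps the common
// no-comment case cheap, which addresses the performance worry in the PR.
func stripInlineComment(line string) string {
	if i := strings.IndexByte(line, '#'); i >= 0 {
		line = line[:i]
	}
	return strings.TrimSpace(line)
}

func main() {
	fmt.Println(stripInlineComment("127.0.0.1 localhost # loopback"))
	fmt.Println(stripInlineComment("192.168.0.1 router"))
}
```

This also handles full-line comments for free, since cutting at `#` leaves an empty string that the caller can skip.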
gharchive/pull-request
2021-10-20T12:16:36
2025-04-01T06:39:10.740263
{ "authors": [ "bootjp", "jaytaylor" ], "repo": "jaytaylor/go-hostsfile", "url": "https://github.com/jaytaylor/go-hostsfile/pull/6", "license": "mit", "license_type": "permissive", "license_source": "bigquery" }
96226068
Better support for self signed certificates
From mjenk...@idbs.com on June 29, 2012 14:05:03
What steps will reproduce the problem?
1. test an ssl connection with a self signed certificate (no hostname specified)
2.
3.
What is the expected output? What do you see instead?
I would like to be able to do this but am prevented because I cannot specify the default host name verifier
What version of the product are you using? On what operating system?
1.6.2 (windows)
Please provide any additional information below.
I would like something like this to work. Many thanks!

/*
 * The following code basically nullifies all SSL checks - this is not recommended to be copied without thought
 * of the consequences!! Sadly rest assured ignores all this so we may well have to ditch rest assured
 */
TrustManager[] certs = new TrustManager[] {
    new X509TrustManager() {
        @Override
        public X509Certificate[] getAcceptedIssuers() {
            return null;
        }

        @Override
        public void checkServerTrusted(X509Certificate[] chain, String authType) throws CertificateException {
        }

        @Override
        public void checkClientTrusted(X509Certificate[] chain, String authType) throws CertificateException {
        }
    }
};
SSLContext ctx = null;
try {
    ctx = SSLContext.getInstance("TLS");
    ctx.init(null, certs, new SecureRandom());
} catch (java.security.GeneralSecurityException ex) {
}
if (ctx != null) {
    HttpsURLConnection.setDefaultSSLSocketFactory(ctx.getSocketFactory());
}
HttpsURLConnection.setDefaultHostnameVerifier(new HostnameVerifier() {
    public boolean verify(String hostname, SSLSession session) {
        return true;
    }
});
Status: Accepted
Labels: -Type-Defect Type-Enhancement
From kmin...@gmail.com on October 22, 2012 06:17:18
I just posted to the forum a HttpClient+Jetty sample that might help Johan implement this. Included in that post is a suggestion for how I think the API should work. The forum post hasn't shown up yet so I can't link it here. I have however attached the sample.
Attachment: self-signed-certs.zip
From webust...@gmail.com on December 06, 2013 06:20:10
Also see https://github.com/jayway/rest-assured/pull/22
From johan.ha...@gmail.com on December 07, 2013 09:42:25
Great! I've now merged the pull request.
From johan.ha...@gmail.com on December 07, 2013 11:03:35
I've actually modified the API (it's not backward compatible). You can now specify the host name verification check using "CertificateAuthSettings" that you may pass in to the "certificate" method. Please try this out and tell me if it works and if you like the API. Depend on version 2.0.2-SNAPSHOT after having added the following Maven repo:

<repositories>
  <repository>
    <id>sonatype</id>
    <url>https://oss.sonatype.org/content/repositories/snapshots/</url>
    <snapshots />
  </repository>
</repositories>

Status: Fixed
gharchive/issue
2015-07-21T04:48:03
2025-04-01T06:39:10.751567
{ "authors": [ "johanhaleby" ], "repo": "jayway/rest-assured", "url": "https://github.com/jayway/rest-assured/issues/336", "license": "apache-2.0", "license_type": "permissive", "license_source": "bigquery" }
743316526
Custom username validator is not respected
A custom username validator defined in the allauth settings is not respected on PUT/PATCH requests to UserDetailsView, although it is respected for registration.
This seems valid to me as we are validating the same during the registration process.
I think I can give a hand on this, shall I? @iMerica 👍
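The fix direction boils down to running the same username validator chain on update as on registration. A framework-free sketch of that pattern (the validator names here are hypothetical, not allauth's or dj-rest-auth's actual API):

```python
# Hypothetical validator chain shared by registration and profile updates.
def no_spaces(value):
    if " " in value:
        raise ValueError("username may not contain spaces")

def min_length(value):
    if len(value) < 3:
        raise ValueError("username too short")

USERNAME_VALIDATORS = [no_spaces, min_length]

def clean_username(value):
    # Running the identical chain for PUT/PATCH means an update can no
    # longer bypass rules that registration enforces.
    for validate in USERNAME_VALIDATORS:
        validate(value)
    return value
```

In the actual fix the serializer would delegate to whatever validator chain the project configures, so both code paths stay in sync automatically.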
gharchive/issue
2020-11-15T18:35:21
2025-04-01T06:39:10.755849
{ "authors": [ "iMerica", "jerinpetergeorge", "rphlo" ], "repo": "jazzband/dj-rest-auth", "url": "https://github.com/jazzband/dj-rest-auth/issues/168", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
2039266988
Unmodified value isn't set to new default when default is changed
Describe the problem
I didn't find in the docs what to expect when the default value is changed for some key while the value was not modified before the change. I did find two distinct behaviours, based on whether or not the value has been modified or read in the past.
Steps to reproduce
First case: Key has not been modified or read
Let's add a new entry to the constance config:

CONSTANCE_CONFIG = {
    'SOME_PARAMETER': ("old default", 'Test key', str),
}

Now, modify the configuration with a new default value:

CONSTANCE_CONFIG = {
    'SOME_PARAMETER': ("new default", 'Test key', str),
}

Conclusion, SOME_PARAMETER has been set to the new default value, as it was not modified before. Checking what is going on on the database side seems relevant in this issue:

# SELECT * FROM constance_constance;
 id | key | value
----+-----+-------
(0 rows)

Second case: Read key before modifying default value
Start with a similar configuration:

CONSTANCE_CONFIG = {
    'SECOND_PARAMETER': ("old default", 'Test key', str),
}

Now, read it from the console:

# python manage.py constance get SECOND_PARAMETER
old default

Let's check the database now:

# SELECT * FROM constance_constance;
 id |       key        |            value
----+------------------+------------------------------
  7 | SECOND_PARAMETER | gAJYCwAAAG9sZCBkZWZhdWx0cQAu
(1 row)

Reading the key made it write to the database. Let's try to change the default value with new default:
Conclusion: SECOND_PARAMETER kept the old default value while it was never modified.
Third case: Modify and reset value
Again, let's start with a similar configuration:

CONSTANCE_CONFIG = {
    'THIRD_PARAMETER': ("old default", 'Test key', str),
}

From the Admin page, or the console, set it to any other value. On the database side, the value is inscribed to it at this point. And use 'Reset to default'.
Now, as in the second case, change the configuration to modify the default value.
CONSTANCE_CONFIG = {
    'THIRD_PARAMETER': ("new default", 'Test key', str),
}

Conclusion, resetting it to the default value doesn't make it act as if it has never been modified.
Conclusion
I do like the idea of unmodified values "following" the changes of default values, as in the first case. But I guess the current django-constance version doesn't feature it.
Is there any reason for the lack of behaviour consistency as it is right now? Would it be possible to have unmodified variables auto-setting themselves to new default values? It could also be configured with some CONSTANCE_XXX setting.
System configuration
Django version: 3.2.19
Python version: 3.9.12
Django-Constance version: 3.1.0
This behavior comes from this line https://github.com/jazzband/django-constance/blob/master/constance/base.py#L22
So I did a bit of digging. What is happening when the constance isn't set inside the database? Getting a new constance resulted in:
Receiving the default value
Having it written to the database.
This comes from this line: https://github.com/jazzband/django-constance/blob/master/constance/base.py#L22
How to handle the cases in the issue
Since the behaviour I was looking for is "Whenever a constance is set to the default value, I would like it to change when I change the corresponding default value", here's what I did:

from constance.backends.database import DatabaseBackend
from constance import settings, signals, config
from django.db import (
    OperationalError,
    ProgrammingError,
)


class CustomDatabaseBackend(DatabaseBackend):
    def set(self, key, value):
        if key not in settings.CONFIG:
            raise AttributeError(key)
        default_value = settings.CONFIG[key][0]
        if value != default_value:
            return super().set(key, value)
        queryset = self._model._default_manager.all()
        # Set _for_write attribute as get_or_create method does
        # https://github.com/django/django/blob/2.2.11/django/db/models/query.py#L536
        queryset._for_write = True
        try:
            constance = queryset.get(key=key)
        except (OperationalError, ProgrammingError):
            # database is not created, noop
            return
        except self._model.DoesNotExist:
            # Key doesn't exist, then do nothing
            return
        old_value = constance.value
        # Constance should be modified to default value -> delete it
        constance.delete(using=queryset.db)
        if self._cache:
            self._cache.set(key, value)
        signals.config_updated.send(
            sender=config, key=key, old_value=old_value, new_value=value
        )

@robin-blanchard you are 100% correct. I am unsure, but this behavior might be added directly in django-constance. I'd consider a pull request with such a change. Perhaps it contains some additional constraints, but I don't see them at first glance.
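The base.py behavior referenced above reduces to a lazy write-default-on-first-read pattern. A minimal stand-in (a plain dict in place of the database table) reproduces the second case from the report, where reading pins the old default:

```python
DEFAULTS = {"SOME_PARAMETER": "old default"}
store = {}  # stands in for the constance_constance table

def get_config(key):
    # The first read materializes the default into the store. From then
    # on, changing DEFAULTS[key] no longer affects what get_config returns.
    if key not in store:
        store[key] = DEFAULTS[key]
    return store[key]

print(get_config("SOME_PARAMETER"))   # writes "old default" into store
DEFAULTS["SOME_PARAMETER"] = "new default"
print(get_config("SOME_PARAMETER"))   # still "old default"
```

If the default had been changed before any read, the first case applies: nothing is in the store yet, so the new default is returned.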
gharchive/issue
2023-12-13T09:09:18
2025-04-01T06:39:10.768681
{ "authors": [ "Mogost", "robin-blanchard" ], "repo": "jazzband/django-constance", "url": "https://github.com/jazzband/django-constance/issues/535", "license": "bsd-3-clause", "license_type": "permissive", "license_source": "bigquery" }
2483340691
Missing migration in 4.0.0
Describe the problem
When using django-constance with django>5, the command to check for missing migrations (makemigrations --check --dry-run) tells me that there is a missing migration. I guess this is because of the editable field, which is set here: https://github.com/jazzband/django-constance/blob/master/constance/migrations/0001_initial.py#L16, but it is not stated on the model (https://github.com/jazzband/django-constance/blob/master/constance/models.py#L7). The created migration looks like this:

# Generated by Django 5.1 on 2024-08-23 15:01
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('constance', '0003_drop_pickle'),
    ]

    operations = [
        migrations.AlterField(
            model_name='constance',
            name='value',
            field=models.TextField(blank=True, null=True),
        ),
    ]

In case you can reproduce this, I am happy to help.
Steps to reproduce
In the example-project, install Django > 5 and run python manage.py makemigrations --check --dry-run
Run tox as configured here: https://github.com/sebastianmanger/django-constance/commit/2357a61e64f698f18189508d0317806092f5c0f0. Example output: https://github.com/sebastianmanger/django-constance/actions/runs/10527827693/job/29171898640#step:5:542
System configuration
Django version: >5
Python version: any
Django-Constance version: latest (e428387)
That was quick - thanks a lot! Maybe also include this check in tox.ini?
django-admin makemigrations --check --dry-run
Good idea! Could you suggest a PR?
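The suggested guard could be wired into tox roughly like this (a sketch only; the env name is a placeholder and the repo's actual tox.ini layout differs). The command exits non-zero whenever model state and committed migrations diverge, which fails the build:

```ini
[testenv:migrations-check]
commands =
    python manage.py makemigrations --check --dry-run
```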
gharchive/issue
2024-08-23T15:03:01
2025-04-01T06:39:10.774790
{ "authors": [ "Mogost", "sebastianmanger" ], "repo": "jazzband/django-constance", "url": "https://github.com/jazzband/django-constance/issues/570", "license": "bsd-3-clause", "license_type": "permissive", "license_source": "bigquery" }
2120408373
Fix: fix constraint creation using the wrong table and column name

Subject: Fix a bug regarding foreign key constraint creation

Feature or Bugfix

Bugfix

Detail

This caused issues when a foreign key pointed to a field other than the remote table's id, e.g.:

```python
class Table1(models.Model):
    remote_column = models.IntegerField(unique=True)

class Table2(models.Model):
    local_column = models.ForeignKey(to=Table1, to_field='remote_column')
```

During the migrations, you would get an error saying Table2 doesn't have a column named remote_column, which is absurd since that column is not supposed to be in that table.

@shimizukawa I see you merged the last PR, though I admit it was some time ago. Can you have a look? PS: I apologise for pinging you if you are the wrong person to address this.

Ha cool, I'm happy to see this was merged, thank you
gharchive/pull-request
2024-02-06T10:11:35
2025-04-01T06:39:10.777498
{ "authors": [ "BlueMagma2" ], "repo": "jazzband/django-redshift-backend", "url": "https://github.com/jazzband/django-redshift-backend/pull/118", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
128746663
parse error when data/@expr, assign/@expr is an Object literal

Given an SCXML document containing the following datamodel:

```xml
<datamodel>
  <data id="o1" expr="{p1: 'v1', p2: 'v2'}"/>
</datamodel>
```

the scion compiler produces `Unexpected token :` by way of esprima. This seems consistent with esprima's validator given a document containing just the object literal:

```
bash-4.2$ cat literal.js
{'p1': 'v1', 'p2': 'v2'}
bash-4.2$ node_modules/esprima/bin/esvalidate.js literal.js
Error: Line 1: Unexpected token :
```

When provided with a complete assignment esprima is happy:

```
bash-4.2$ cat assignment.js
o = {'p1': 'v1', 'p2': 'v2'}
bash-4.2$ node_modules/esprima/bin/esvalidate.js assignment.js
bash-4.2$
```

Neither the scion compiler nor esprima take issue with an Array literal:

```
bash-4.2$ cat array.js
['a', 42, true, 3.14]
bash-4.2$ node_modules/esprima/bin/esvalidate.js array.js
```

Complete scxml document that reproduces the Object literal issue:

```xml
<scxml xmlns="http://www.w3.org/2005/07/scxml" version="1.0" initial="s1">
  <datamodel>
    <data id="o1" expr="{p1: 'v1', p2: 'v2'}"/>
  </datamodel>
  <state id="uber">
    <transition event="*" target="fail">
      <log expr="'unhandled input ' + JSON.stringify(_event)" label="TEST"/>
    </transition>
    <state id="s1">
      <transition event="pass"/>
      <onentry>
        <log expr="'Starting session ' + _sessionid" label="TEST"/>
        <raise event="pass"/>
      </onentry>
    </state>
  </state>
  <final id="pass">
    <onentry>
      <log expr="'RESULT: pass'" label="TEST"/>
    </onentry>
  </final>
  <final id="fail">
    <onentry>
      <log expr="'RESULT: fail'" label="TEST"/>
    </onentry>
  </final>
</scxml>
```

Same song, second verse using `<assign>`:

```xml
<scxml xmlns="http://www.w3.org/2005/07/scxml" version="1.0" initial="s1">
  <datamodel>
    <data id="o1"/>
  </datamodel>
  <state id="uber">
    <transition event="*" target="fail">
      <log expr="'unhandled input ' + JSON.stringify(_event)" label="TEST"/>
    </transition>
    <state id="s1">
      <transition event="pass"/>
      <onentry>
        <log expr="'Starting session ' + _sessionid" label="TEST"/>
        <assign location="o1" expr="{p1: 'v1', p2: 'v2'}"/>
        <raise event="pass"/>
      </onentry>
    </state>
  </state>
  <final id="pass">
    <onentry>
      <log expr="'RESULT: pass'" label="TEST"/>
    </onentry>
  </final>
  <final id="fail">
    <onentry>
      <log expr="'RESULT: fail'" label="TEST"/>
    </onentry>
  </final>
</scxml>
```

https://github.com/jbeard4/SCION/pull/367 https://github.com/jbeard4/scxml-test-framework/pull/24
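A likely explanation (my reading, not from the SCION source): at statement position JavaScript treats a leading `{` as the start of a block, so a bare object literal is parsed as a block statement and the `:` after a string key becomes a syntax error. Array literals are unambiguous because `[` can only begin an expression. Wrapping the expression in parentheses forces expression context:

```javascript
// A bare {'p1': 'v1', 'p2': 'v2'} at statement position is parsed as a block,
// not an object — hence esprima's "Unexpected token :". Parens fix it.
const literal = ({p1: 'v1', p2: 'v2'}); // parses as an object expression
const arr = ['a', 42, true, 3.14];      // arrays never have this ambiguity
console.log(literal.p1, arr.length);    // → v1 4
```

This suggests the compiler could wrap each `expr` attribute in parentheses (or in an assignment, as the `assignment.js` experiment above shows) before handing it to esprima.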
gharchive/issue
2016-01-26T07:32:07
2025-04-01T06:39:10.789815
{ "authors": [ "mattoshry" ], "repo": "jbeard4/SCION", "url": "https://github.com/jbeard4/SCION/issues/366", "license": "apache-2.0", "license_type": "permissive", "license_source": "bigquery" }
115609844
something broke

jbenet @ lorien : ~/git/ipfs/specs * overviews % bsdash

```
*** IPFS Bitswap Dash ***
Active Requests 0:
Provide Workers 0:
Task Workers 0:
Rebroadcast Worker:
Provider Connector:
events: 0/1 prints: 1 (25ms delay)

/Users/jbenet/git/node-bsdash/state.js:23
    var parts = e.event.split(".")
                        ^
TypeError: Cannot read property 'split' of undefined
    at updateState (/Users/jbenet/git/node-bsdash/state.js:23:22)
    at Object.State.s.update (/Users/jbenet/git/node-bsdash/state.js:14:5)
    at DestroyableTransform._transform (/Users/jbenet/git/node-bsdash/index.js:26:13)
    at DestroyableTransform.Transform._read (/Users/jbenet/git/node-bsdash/node_modules/through2/node_modules/readable-stream/lib/_stream_transform.js:172:10)
    at DestroyableTransform.Transform._write (/Users/jbenet/git/node-bsdash/node_modules/through2/node_modules/readable-stream/lib/_stream_transform.js:160:12)
    at doWrite (/Users/jbenet/git/node-bsdash/node_modules/through2/node_modules/readable-stream/lib/_stream_writable.js:323:12)
    at writeOrBuffer (/Users/jbenet/git/node-bsdash/node_modules/through2/node_modules/readable-stream/lib/_stream_writable.js:309:5)
    at DestroyableTransform.Writable.write (/Users/jbenet/git/node-bsdash/node_modules/through2/node_modules/readable-stream/lib/_stream_writable.js:236:11)
    at write (/Users/jbenet/git/node-bsdash/node_modules/ndjson/node_modules/through2/node_modules/readable-stream/lib/_stream_readable.js:623:24)
    at flow (/Users/jbenet/git/node-bsdash/node_modules/ndjson/node_modules/through2/node_modules/readable-stream/lib/_stream_readable.js:632:7)
```

@diasdavid @whyrusleeping any ideas?
Seems that logs are different now, `.event` is not guaranteed to always exist:

example 1 - didn't break:

```
{ event: 'log API client connected',
  session: 'b2b765ac-48b1-459c-8e1f-9fcd83868a3f',
  system: 'core/server',
  time: '2015-11-07T18:28:26.632599724Z' }
```

example 2 - error thrown:

```
{ id: 5649,
  level: 5,
  message: 'bitswap net handleNewStream from <peer.ID bqE6Uf>',
  module: 'bitswap_network',
  time: '2015-11-07T18:28:30.288928034Z' }
```

:////// need tests even on the logs
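A minimal defensive fix — a sketch, not the actual bsdash patch (`eventParts` is an invented helper name) — is to guard on the field before splitting, since the log stream now mixes both record shapes:

```javascript
// Both record shapes seen in the logs above; only the first has `event`.
const records = [
  { event: 'log API client connected', system: 'core/server' },
  { id: 5649, level: 5, message: 'bitswap net handleNewStream', module: 'bitswap_network' }
];

function eventParts(e) {
  // Skip records without a string `event` field instead of crashing on split().
  if (typeof e.event !== 'string') return [];
  return e.event.split('.');
}

records.forEach(e => console.log(eventParts(e)));
```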
gharchive/issue
2015-11-06T23:14:51
2025-04-01T06:39:10.808883
{ "authors": [ "diasdavid", "jbenet" ], "repo": "jbenet/node-bsdash", "url": "https://github.com/jbenet/node-bsdash/issues/2", "license": "mit", "license_type": "permissive", "license_source": "bigquery" }
2263539195
feat: gated sparse autoencoders Description Implements gated sparse autoencoders, which were recently found to solve the shrinkage problem and achieve equal reconstruction loss with 2x average feature sparsity Fixes #103 Type of change Please delete options that are not relevant. [x] New feature (non-breaking change which adds functionality) Checklist: [x] I have commented my code, particularly in hard-to-understand areas [ ] I have made corresponding changes to the documentation [ ] My changes generate no new warnings [x] I have added tests that prove my fix is effective or that my feature works [x] New and existing unit tests pass locally with my changes [x] I have not rewritten tests relating to key interfaces which would affect backward compatibility You have tested formatting, typing and unit tests (acceptance tests not currently in use) [x] I have run make check-ci to check format and linting. (you can run make format to format code if needed.) Performance Check. If you have implemented a training change, please indicate precisely how performance changes with respect to the following metrics: [ ] L0 [ ] CE Loss [ ] MSE Loss [ ] Feature Dashboard Interpretability Please links to wandb dashboards with a control and test group. Wandb run ongoing: https://wandb.ai/dtch1997/gpt2?nw=nwuserdtch1997 Cluster is being annoying. Here's a colab to run the training. https://colab.research.google.com/drive/1q_X46e0eO9Q1_t283USX1QG41Q5fgDAZ?usp=sharing Comment from Neel: Awesome! I asked Sen (the lead author) for thoughts: The sparsity penalty isn't comparable between the two setups (or even the relationship between L1 and L0 norms IIRC). So the best way to compare them is to plot the Pareto curves. Very excited about this! 
Report from initial training runs: https://wandb.ai/sae-experiments/sae-experiments/reports/Gated-SAE-analysis--Vmlldzo3ODE5OTAw?accessToken=wcmc1fc13wlz3dze7zl26cmkblfdt6984fi20n79fgaa4x79l4moygyclx31mx3q

Final WandB report: https://wandb.ai/sae-experiments/sae-experiments/reports/Gated-SAE-analysis--Vmlldzo3ODE5OTAw?accessToken=wcmc1fc13wlz3dze7zl26cmkblfdt6984fi20n79fgaa4x79l4moygyclx31mx3q

Pareto plot:

TODO:
- Clean up the cluster files
- Merge latest changes from main

Thanks for the work so far! I read through the Gated SAE loss func implementation and it seemed right. If setting off runs is cheap, it would be great to get more Gated and Baseline runs in the L0 range 10-100 -- that's the most important region and there are not too many points there

Thanks for this! I just merged a separate PR for this but will look into your transcoder PR shortly!
gharchive/pull-request
2024-04-25T13:05:21
2025-04-01T06:39:10.821505
{ "authors": [ "ArthurConmy", "dtch1997", "jbloomAus" ], "repo": "jbloomAus/SAELens", "url": "https://github.com/jbloomAus/SAELens/pull/104", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1063271976
Async Notifications dbContext disposed

Hi, I have a problem with DbContext in async notification handlers. This is my code for running async notifications:

```csharp
public class ParallelNoWaitPublishMediator : Mediator
{
    private Func<IEnumerable<Func<INotification, CancellationToken, Task>>, INotification, CancellationToken, Task> _publish;

    public ParallelNoWaitPublishMediator(ServiceFactory serviceFactory) : base(serviceFactory)
    {
        _publish = ParallelNoWait;
    }

    protected override Task PublishCore(IEnumerable<Func<INotification, CancellationToken, Task>> allHandlers, INotification notification, CancellationToken cancellationToken)
    {
        return _publish(allHandlers, notification, cancellationToken);
    }

    private Task ParallelNoWait(IEnumerable<Func<INotification, CancellationToken, Task>> handlers, INotification notification, CancellationToken cancellationToken)
    {
        foreach (var handler in handlers)
        {
            Task.Run(() => handler(notification, cancellationToken), cancellationToken);
        }

        return Task.CompletedTask;
    }
}
```

And every time I use a DbContext in a notification handler, it is already disposed. The solution is to use the service provider, which creates a new instance of the DbContext when I call it, but I wanted something more generic. Is there any way to provide a new scope for the handler run in Task.Run()?

This is because the DbContext is registered with DI as a scoped service by default, and ASP.NET Core creates and closes the scope on every request. There are several ways to fix this behavior:

1. Register all services used in notification handlers as transient or singleton
2. Override ServiceFactory, create a custom scope and close it after all handlers complete
3. Use IHostedService for executing your async notifications
4. Use external libraries like Quartz.NET or Hangfire for executing your async notifications

For option 2 there, here's some docs from MS on the subject: https://docs.microsoft.com/en-us/aspnet/core/fundamentals/host/hosted-services?view=aspnetcore-6.0&tabs=visual-studio#consuming-a-scoped-service-in-a-background-task
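A sketch of the scope-per-background-task pattern from the MS docs linked above, adapted to this fire-and-forget publisher. The names `NotificationDispatcher` and `MyNotificationHandler` are invented for illustration; this is not MediatR API:

```csharp
using System.Threading;
using System.Threading.Tasks;
using MediatR;
using Microsoft.Extensions.DependencyInjection;

// Each background task owns its own DI scope, so scoped services such as a
// DbContext are created fresh and disposed when the scope ends — instead of
// dying with the request scope.
public class NotificationDispatcher
{
    private readonly IServiceScopeFactory _scopeFactory; // injected by DI

    public NotificationDispatcher(IServiceScopeFactory scopeFactory)
        => _scopeFactory = scopeFactory;

    public Task FireAndForget(INotification notification, CancellationToken ct)
        => Task.Run(async () =>
        {
            using var scope = _scopeFactory.CreateScope();
            // Resolve the handler (and its scoped dependencies) from this
            // scope's provider, not from the request's provider.
            var handler = scope.ServiceProvider
                .GetRequiredService<MyNotificationHandler>(); // hypothetical handler type
            await handler.Handle(notification, ct);
        }, ct);
}
```

The key point is that the handler is resolved from `scope.ServiceProvider`; resolving it before `Task.Run` would still capture request-scoped dependencies.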
gharchive/issue
2021-11-25T08:09:43
2025-04-01T06:39:10.826152
{ "authors": [ "ZeSzymi", "denisenko93", "jbogard" ], "repo": "jbogard/MediatR", "url": "https://github.com/jbogard/MediatR/issues/675", "license": "apache-2.0", "license_type": "permissive", "license_source": "bigquery" }
2612609089
🛑 https://goto.srsbsns.lol is down In a24f126, https://goto.srsbsns.lol (https://goto.srsbsns.lol) was down: HTTP code: 502 Response time: 128 ms Resolved: https://goto.srsbsns.lol is back up in a1d2b2e after 7 minutes.
gharchive/issue
2024-10-24T21:42:44
2025-04-01T06:39:10.841313
{ "authors": [ "jbowdre" ], "repo": "jbowdre/upptime", "url": "https://github.com/jbowdre/upptime/issues/237", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
146702445
Gemfile.lock issue when installing on Bitnami stack

I'm following all the steps from 'get started' but I get stuck on the 'bundle install' in step 2. I get a "you are trying to install in deployment mode after changing your Gemfile". I tried the suggestion at the bottom of step 2 but still get the same message.

Environment:
Redmine version 3.2.1.stable
Ruby version 2.1.8-p440 (2015-12-16) [x86_64-linux]
Rails version 4.2.5.2
Environment production
Database adapter Mysql2
SCM: Subversion 1.8.13, Cvs 1.12.13, Git 2.6.1, Filesystem
Redmine plugins: redmine_stats 0.0.3
gharchive/issue
2016-04-07T18:09:36
2025-04-01T06:39:10.845088
{ "authors": [ "aalmada" ], "repo": "jbox-web/redmine_git_hosting", "url": "https://github.com/jbox-web/redmine_git_hosting/issues/589", "license": "mit", "license_type": "permissive", "license_source": "bigquery" }
1275395345
🛑 SF Bahá'í is down In 4f3c933, SF Bahá'í (https://www.sfbahai.org) was down: HTTP code: 403 Response time: 41 ms Resolved: SF Bahá'í is back up in 4b84a84.
gharchive/issue
2022-06-17T19:40:29
2025-04-01T06:39:10.847573
{ "authors": [ "jbraithwaite" ], "repo": "jbraithwaite/uptime", "url": "https://github.com/jbraithwaite/uptime/issues/21", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1403830733
use useArray in useRewards What does this PR do and why? Provide a description of what this PR does. Link to any relevant GitHub issues, Notion tasks or Discord discussions. Screenshots or screen recordings If applicable, provide screenshots or screen recordings to demonstrate the changes. Acceptance checklist [ ] I have evaluated the Approval Guidelines for this PR. [ ] I have tested this PR in all supported browsers. [ ] I have tested this PR in dark mode and light mode (if applicable). Current dependencies on/for this PR: main PR #2156 PR #2169 PR #2170 PR #2171 PR #2172 PR #2181 PR #2182 PR #2183 Graphite Merge Job Current status: ⏳ Queued to merge This pull request is currently queued to be merged as part of a stack. This comment was auto-generated by Graphite. Job Reference: BcMq9niLYsok9Kx1gVMU
gharchive/pull-request
2022-10-11T00:06:01
2025-04-01T06:39:10.877360
{ "authors": [ "wraeth-eth" ], "repo": "jbx-protocol/juice-interface", "url": "https://github.com/jbx-protocol/juice-interface/pull/2188", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1331915360
Add Julia example

Example of how to use proxqp in Julia via PyCall (https://github.com/JuliaPy/PyCall.jl).

For now, if I do cmake .. -DCMAKE_PREFIX_PATH=/home/fschramm/workspace/julia-1.7.3 it finds the Julia executable (not installed in the usual search path on my machine) and adds the unit test, but when I run make test I get an error:

```
Start 112: example-jl-overview-simple.jl
Process not started
/home/fschramm/workspace/proxqp/examples/julia/overview-simple.jl
[permission denied]
112/112 Test #112: example-jl-overview-simple.jl ... ***Not Run 0.00 sec

99% tests passed, 1 tests failed out of 112

Total Test time (real) = 26.15 sec

The following tests FAILED:
	112 - example-jl-overview-simple.jl (BAD_COMMAND)
Errors while running CTest
```

I think the problem is this permission denied we see here. On my machine the command that the test should execute works without problems. So I'm still trying to figure out how to have the permission also when I run it here in ctest.
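The `[permission denied]` strongly suggests ctest is exec()ing the script directly, which requires both the executable bit and a shebang line. A quick local check of that precondition (paths here are illustrative):

```shell
# Reproduce what ctest needs: a script is only directly runnable when it is
# executable and starts with a shebang.
script=/tmp/overview-simple.jl
printf '%s\n' '#!/usr/bin/env julia' 'println("ok")' > "$script"
chmod +x "$script"
ls -l "$script"   # the x bits should now be set
```

If the repo's `.jl` example lacks the executable bit in git (or the shebang), that would explain why it runs by hand with `julia file.jl` but fails under ctest.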
gharchive/pull-request
2022-08-08T13:57:17
2025-04-01T06:39:10.907634
{ "authors": [ "fabinsch" ], "repo": "jcarpent/proxqp-unofficial", "url": "https://github.com/jcarpent/proxqp-unofficial/pull/6", "license": "BSD-2-Clause", "license_type": "permissive", "license_source": "github-api" }
141539798
Stateful RNN mod in sampling

Will just removing the `self:resetStates()` lines in the sample function in langmodel.lua make it a stateful RNN, thus preserving long-term context?

I removed these lines, and the loss immediately dropped by 15%. Thanks for the suggestion! But it can be very data-dependent.

Interesting, every checkpoint saving resets the state. I wonder if this is the reason why the training loss jumps a bit up after every save:

```
Epoch 1.24 / 50, i = 20 / 4250, loss = 1.025447
Checkpoint = cv/checkpoint_20.t7
Epoch 1.25 / 50, i = 21 / 4250, loss = 1.042428
. . .
Epoch 1.35 / 50, i = 30 / 4250, loss = 1.013393
Checkpoint = cv/checkpoint_30.t7
Epoch 1.36 / 50, i = 31 / 4250, loss = 1.036500
. . .
Epoch 1.47 / 50, i = 40 / 4250, loss = 1.003105
Checkpoint = cv/checkpoint_40.t7
Epoch 1.48 / 50, i = 41 / 4250, loss = 1.020587
```

Over a year since last comment, closing stale issue.
2016-03-17T10:24:30
2025-04-01T06:39:10.931101
{ "authors": [ "AlekzNet", "aliabbasjp", "dgcrouse" ], "repo": "jcjohnson/torch-rnn", "url": "https://github.com/jcjohnson/torch-rnn/issues/38", "license": "mit", "license_type": "permissive", "license_source": "bigquery" }
199071699
User-supplied routes for chown() and chmod() Certain filesystems like Syndicate need to be able to handle chown and chmod themselves. Add routes for this, as well as tests. Added.
gharchive/issue
2017-01-05T22:05:01
2025-04-01T06:39:10.954536
{ "authors": [ "jcnelson" ], "repo": "jcnelson/fskit", "url": "https://github.com/jcnelson/fskit/issues/17", "license": "isc", "license_type": "permissive", "license_source": "bigquery" }
2176929254
Added Instructions for MathCad

I've added the instructions for MathCad / PTC accounts. Please be aware there is another section about a PTC product listing "ptc.com" under its domains. I would appreciate some clarification on how this duplication is handled. If one or the other entry takes priority upon visiting ptc.com, I suggest we combine "Vuforia Chalk" (the other product) and MathCad into a "PTC Accounts" section instead and combine their URL lists, since PTC acquired Vuforia. The Vuforia section has spelling mistakes and, to my knowledge, an outdated support article anyway. In that case give me a ping and I'll fix it :)

That said, Vuforia developer accounts (developer.vuforia.com) are still separate from PTC accounts and would require their own section.

It should all be taken care of now! At least until they change their account infrastructure again :) Vuforia Chalk accounts were changed to easy, since a "Delete Account" button now exists for the vast majority of end users and business accounts. Only organisation admins should need to use the "hard" fallback I have provided.

Thanks for your contribution!
gharchive/pull-request
2024-03-08T23:42:33
2025-04-01T06:39:11.017284
{ "authors": [ "Dynamotivation", "tupaschoal" ], "repo": "jdm-contrib/jdm", "url": "https://github.com/jdm-contrib/jdm/pull/2003", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
36670419
Required fields I'm having some issues figuring out how to make a field required. I've tried the following, which adds "required": ["comment"] to the object after defining the properties, but that doesn't seem to work. Any guidance would be most appreciated. Thanks! Hello, @jdorn I want to spend some time thinking about the best way to handle the asterisks before implementing that. Has it been implemented? @gnom7 https://github.com/jdorn/json-editor#deprecation-notice https://github.com/json-editor/json-editor/issues/44 I am not too familiar with the code, I'd recommend creating a new issue at https://github.com/json-editor/json-editor/issues The documentation says you can use custom styles but does not say you can use custom classes. When you using barebones JSON editor it uses almost no classes that can override your styles. I looked in the source code looking for a feature that does what you want to do but I did not find anything. A solution could be, to extend the theme class and create a new theme for JSON Editor. You can look in the original repository wiki: https://github.com/jdorn/json-editor/wiki#theme-srcthemejs @germanbisurgi, thank you for the suggestion, but it seems that I can't customize form elements this way, only DOM ones (like tables etc.) and it might happen that I'd like to pull one table to the right and another to the left in my form depending on the element which is rendered, or even something more sophisticated, so it would be nice if we had feature to add classes to json form elements, thanks for reply
gharchive/issue
2014-06-27T14:45:52
2025-04-01T06:39:11.023552
{ "authors": [ "bmcbride", "germanbisurgi", "gnom7", "schmunk42" ], "repo": "jdorn/json-editor", "url": "https://github.com/jdorn/json-editor/issues/158", "license": "mit", "license_type": "permissive", "license_source": "bigquery" }
410712737
Killing notification on App swipe

Duplicate of https://github.com/jeancsanchez/JcPlayer/issues/22, but the problem there is that onDestroy doesn't get called when we clear the app from the recent tasks. This is causing crashes when we click on the notification again. The onDestroy trick works well on exiting the app, but not when swiping it away from the recent tasks.

That's a problem that I can't solve yet. I didn't find a solution for removing the notifications when the app is popped from the tasks list.

@jeancsanchez I have actually managed to resolve it in your sample code, but it is not working in my code when I add the library as a Gradle module. Once I am able to fix it, I will let you know.

That problem only occurs on the emulator; on a real device it works fine.

@rohankandwal @SunryTeang I have the same problem on multiple real devices.

ERROR: Failed to resolve: com.github.jeancsanchez:JcPlayer:2.6.16 — sir, how can I solve it?
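On the final `Failed to resolve` error: artifacts with `com.github.*` coordinates are typically served from JitPack, so a missing repository entry would be my first guess (unconfirmed — this may not be the reporter's actual problem):

```groovy
// Root build.gradle — make sure JitPack is listed as a repository
allprojects {
    repositories {
        google()
        mavenCentral()
        maven { url 'https://jitpack.io' }
    }
}

// Module build.gradle
dependencies {
    implementation 'com.github.jeancsanchez:JcPlayer:2.6.16'
}
```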
gharchive/issue
2019-02-15T10:54:38
2025-04-01T06:39:11.044220
{ "authors": [ "SunryTeang", "abhishek-singh-a4", "jeancsanchez", "rohankandwal" ], "repo": "jeancsanchez/JcPlayer", "url": "https://github.com/jeancsanchez/JcPlayer/issues/79", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
319793499
openwrt high virtual memory size usage (%VSZ)

device: TP-Link TL-MR3420 v1
firmware: Pulpstone v3.1 OpenWrt Chaos Calmer 15.05.1 r49389 (http://pulpstone.pw/exroot-ar71xx/)
kernel: Linux PULPSTONE 3.18.45 #2 Thu Mar 30 17:35:40 WIB 2017 mips GNU/Linux
dnscrypt-proxy: dnscrypt-proxy-linux_mips-2.0.11.tar.gz

```
Mem: 26856K used, 1624K free, 320K shrd, 1388K buff, 2908K cached
CPU:  0% usr  1% sys  0% nic 98% idle  0% io  0% irq  0% sirq
Load average: 0.03 0.05 0.01 2/51 5081
  PID  PPID USER     STAT   VSZ  %VSZ %CPU COMMAND
 5000  2863 root     R     1364    5%   0% top
   69     2 root     SW       0    0%   0% [kworker/0:1]
 4830     1 root     S     649m 2332%   0% dnscrypt-proxy-v2 -config /etc/config
```

compared to v1 (5% VSZ) is that normal?

```
Mem: 16700K used, 11780K free, 320K shrd, 1036K buff, 2652K cached
CPU:  9% usr  4% sys  0% nic 85% idle  0% io  0% irq  0% sirq
Load average: 0.00 0.02 0.00 1/44 10701
  PID  PPID USER     STAT   VSZ  %VSZ %CPU COMMAND
 8642     2 root     SW       0    0%   0% [kworker/u2:2]
  967     1 root     S     1612    6%   0% /usr/sbin/uhttpd -f -h /www -r PULPST
 2852   926 root     S     1220    4%   0% /usr/sbin/dropbear -F -P /var/run/dro
  494     1 root     S      892    3%   0% /sbin/ubusd
 1786     1 root     S     1664    6%   0% /usr/sbin/hostapd -P /var/run/wifi-ph
 1813     1 root     S     1580    6%   0% /usr/sbin/wpa_supplicant -B -P /var/r
  859     1 root     S     1564    5%   0% /sbin/netifd
  825     1 root     S     1532    5%   0% /sbin/rpcd
    1     0 root     S     1408    5%   0% /sbin/procd
  872     1 root     S     1372    5%   0% {ping_loop} /bin/sh /usr/bin/ping_loo
 2110     1 nobody   S     1364    5%   0% /usr/sbin/dnscrypt-proxy -d -a 127.0.
 2863  2852 root     S     1364    5%   0% -ash
10475  2863 root     R     1364    5%   0% top
```

This is virtual memory, not the actual memory usage. But yes, it requires more memory than the legacy version because it is written in a different programming language and has many more features.
2018-05-03T05:39:15
2025-04-01T06:39:11.046875
{ "authors": [ "jedisct1", "kepeto" ], "repo": "jedisct1/dnscrypt-proxy", "url": "https://github.com/jedisct1/dnscrypt-proxy/issues/425", "license": "ISC", "license_type": "permissive", "license_source": "github-api" }
1251754356
Help: ddns-go in Docker on Synology cannot obtain an IPv6 address (using Cloudflare)

Problem description

I'm running a bootleg Synology (DSM) VM under Proxmox VE, with Docker installed on it and ddns-go running inside Docker. Fetching IPv6 through the API produces the errors below, yet opening https://myip6.ipip.net, https://speed.neu6.edu.cn/getIP.php, or https://v6.ident.me directly does return an IPv6 address. It worked when I first installed it yesterday, and I have changed nothing since, but it suddenly started failing. My Cloudflare settings and token are definitely correct, so I have no idea what the cause is. Opening https://myip6.ipip.net, https://speed.neu6.edu.cn/getIP.php, or https://v6.ident.me on their own returns my public IPv6 address, and everything was normal yesterday — it broke without any file changes.

```
2022/05/29 08:27:48 Failed to get IPv6 result! API: https://myip6.ipip.net, return value:
2022/05/29 08:27:48 Connection failed! Click to check whether the API can return an IPv6 address; official docs: click to visit
2022/05/29 08:27:48 Connection failed! Click to check whether the API can return an IPv6 address; official docs: click to visit
2022/05/29 08:27:48 Could not obtain an IPv6 address; no update will be performed
```

Check the Synology's network. Inside the Docker container, does `curl https://myip6.ipip.net` return a result?

Check the Synology's network. Does `curl https://myip6.ip.net` return a result from inside the Docker container?

Yes, it does — all the settings are correct; I just don't know the cause. I suspect it's because I didn't bind the apex domain: I bound only a subdomain directly. With another script I also got errors at first when only the subdomain was bound, and once I bound the apex domain as well it went back to normal. I've now achieved the result I wanted; once I've tested whether ddns-go behaves the same way, I'll report back. Thanks to the author for the reply.

The Synology Package Center has a ddns-go package — add the imnks ("矿神") package source and install it directly; no need for Docker.

I've tested the Synology package version for a few days and found no problems; no idea what the cause is with Docker. Thanks to the author for the replies.
gharchive/issue
2022-05-29T00:36:34
2025-04-01T06:39:11.183171
{ "authors": [ "QASDJKL5", "hy817120", "jeessy2" ], "repo": "jeessy2/ddns-go", "url": "https://github.com/jeessy2/ddns-go/issues/290", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1222954873
Update README.md: add title, add content
gharchive/pull-request
2022-05-02T14:24:22
2025-04-01T06:39:11.190683
{ "authors": [ "jeevkola" ], "repo": "jeevkola/github-pages-with-jekyll", "url": "https://github.com/jeevkola/github-pages-with-jekyll/pull/2", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1590623783
MPI_TYPE_GET_CONTENTS with nested user types is broken From https://github.com/jeffhammond/mukautuva/issues/11. If we do e.g. vector of vectors, MPI_TYPE_GET_CONTENTS does not work. https://github.com/jeffhammond/mukautuva/commit/4539c1c4711fec8cae10977b8527f2931a75c450 fixes this
gharchive/issue
2023-02-19T09:39:42
2025-04-01T06:39:11.200120
{ "authors": [ "jeffhammond" ], "repo": "jeffhammond/mukautuva", "url": "https://github.com/jeffhammond/mukautuva/issues/12", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1927454797
[Snyk] Upgrade prismjs from 1.25.0 to 1.29.0

This PR was automatically created by Snyk using the credentials of a real user. Snyk has created this PR to upgrade prismjs from 1.25.0 to 1.29.0.

:information_source: Keep your dependencies up-to-date. This makes it easier to fix existing vulnerabilities and to more quickly identify and fix newly disclosed vulnerabilities when they affect your project.

- The recommended version is 4 versions ahead of your current version.
- The recommended version was released a year ago, on 2022-08-23.

The recommended version fixes:

Severity | Issue | Priority Score (*) | Exploit Maturity
Improper Input Validation | SNYK-JS-POSTCSS-5926692 | 551/1000 (Why? Recently disclosed, Has a fix available, CVSS 5.3) | No Known Exploit

(*) Note that the real score may have changed since the PR was raised.

Release notes
Package name: prismjs
- 1.29.0 - 2022-08-23: Release 1.29.0
- 1.28.0 - 2022-04-17: Release 1.28.0
- 1.27.0 - 2022-02-17: Release 1.27.0
- 1.26.0 - 2022-01-06: Release 1.26.0
- 1.25.0 - 2021-09-16: Release 1.25.0
from prismjs GitHub release notes

Commit messages
Package name: prismjs
59e5a34 1.29.0
cd080f2 Updated npmignore to include new MD files (#3534)
751664b Added PR stop notice (#3532)
248f6ab Added changelog for v1.29.0 (#3533)
098e300 Line Highlight: Account for offset when clamping ranges (#3518)
6b824d4 Bash: Added "sh" alias (#3509)
15272f7 Website: Added third-party tutorial for Pug template (#3459)
c8462a2 Cilk: Add support for Cilk (with C/C++) (#3522)
859f99a Added bqn language support (#3515)
0cad9ae BBj: Improve regexes (#3512)
1134bdf BBj Langauge Support (#3511)
342a003 Java: Added support for constants (#3507)
05ee042 Added security policy (#3070)
866b302 Added list of maintainers (#3410)
a090d06 Scala: Updated keywords to support Scala 3 (#3506)
b9512b2 Bash: Added support for parameters and the `java` and `sysctl` commands.
(#3505)
b0c2a9b NSIS: Added missing commands (#3504)
3a53cf0 Bump moment from 2.29.2 to 2.29.4 (#3503)
761b32c Bump terser from 5.12.1 to 5.14.2 (#3502)
2aed9ce SCSS: Fix casing in title of the `scss` lang (#3501)
9d603ef Docs: Add missing word (#3489)
3e93713 Bash: Added `cargo` command (#3488)
c4cbeea AsciiDoc: Some regexes are too greedy (#3481)
5665930 Bump shell-quote from 1.7.2 to 1.7.3 (#3483)
Compare

Note: You are seeing this because you or someone else with access to this repository has authorized Snyk to open upgrade PRs. For more information: 🧐 View latest project report 🛠 Adjust upgrade PR settings 🔕 Ignore this dependency or unsubscribe from future upgrade PRs

👷 Deploy Preview for sparkly-horse-283115 processing.
Name Link
🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421
🔍 Latest deploy log https://app.netlify.com/sites/sparkly-horse-283115/deploys/651e536d3077970008d5c832
Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/willowy-selkie-d94984/deploys/651e536d14a2080009423f36 👷 Deploy Preview for subtle-meerkat-333693 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/subtle-meerkat-333693/deploys/651e536dc512c0000844582f 👷 Deploy Preview for jovial-stardust-b4ad7d processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/jovial-stardust-b4ad7d/deploys/651e536d6e0d1e0008f104be 👷 Deploy Preview for inspiring-biscuit-2e1ad5 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/inspiring-biscuit-2e1ad5/deploys/651e536e81015f00079c28a9 👷 Deploy Preview for magnificent-cassata-c47b17 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/magnificent-cassata-c47b17/deploys/651e536eb8d0b30008f1cda1 👷 Deploy Preview for sparkly-toffee-66d6c6 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/sparkly-toffee-66d6c6/deploys/651e536eb8d0b30008f1cda3 👷 Deploy Preview for kaleidoscopic-lolly-054d2b processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/kaleidoscopic-lolly-054d2b/deploys/651e536efcfb36000845cee3 👷 Deploy Preview for scintillating-scone-17a256 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/scintillating-scone-17a256/deploys/651e536eab74d30007ae41ba 👷 Deploy Preview for resplendent-centaur-3e5b9a processing. 
Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/resplendent-centaur-3e5b9a/deploys/651e536e2caa450008078de3 👷 Deploy Preview for courageous-shortbread-c7b432 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/courageous-shortbread-c7b432/deploys/651e536e3777be00089c5fc8 👷 Deploy Preview for quiet-pavlova-ee1984 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/quiet-pavlova-ee1984/deploys/651e536e034be600081a6fe4 👷 Deploy Preview for fancy-creponne-346f7c processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/fancy-creponne-346f7c/deploys/651e536e14a2080009423f3b 👷 Deploy Preview for lustrous-torrone-6f6275 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/lustrous-torrone-6f6275/deploys/651e536e14a2080009423f3f 👷 Deploy Preview for stellular-macaron-012785 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/stellular-macaron-012785/deploys/651e536ef660ea0008849958 👷 Deploy Preview for jovial-kataifi-3cc39c processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/jovial-kataifi-3cc39c/deploys/651e536e3777be00089c5fcd 👷 Deploy Preview for snazzy-kitsune-c70cd6 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/snazzy-kitsune-c70cd6/deploys/651e536edd9c7700086cd9d1 👷 Deploy Preview for elaborate-brioche-1a6825 processing. 
Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/elaborate-brioche-1a6825/deploys/651e536e14a2080009423f45 👷 Deploy Preview for charming-cheesecake-1dabeb processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/charming-cheesecake-1dabeb/deploys/651e536e6e0d1e0008f104c3 👷 Deploy Preview for poetic-otter-16f4e5 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/poetic-otter-16f4e5/deploys/651e536ef1a97600081d2b5b 👷 Deploy Preview for harmonious-palmier-623323 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/harmonious-palmier-623323/deploys/651e536e3fca9d000969889d 👷 Deploy Preview for prismatic-marzipan-26cf98 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/prismatic-marzipan-26cf98/deploys/651e536e9cb2d90008529adb 👷 Deploy Preview for resonant-gumdrop-ffc145 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/resonant-gumdrop-ffc145/deploys/651e536e034be600081a6fe9 👷 Deploy Preview for storied-marshmallow-845fa7 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/storied-marshmallow-845fa7/deploys/651e536ed75a4f0008e0d8b6 👷 Deploy Preview for relaxed-speculoos-9355cf processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/relaxed-speculoos-9355cf/deploys/651e536edd9c7700086cd9d6 👷 Deploy Preview for quiet-jalebi-d385ef processing. 
Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/quiet-jalebi-d385ef/deploys/651e536e6e0d1e0008f104c8 👷 Deploy Preview for resplendent-puppy-228238 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/resplendent-puppy-228238/deploys/651e536e5cdc270008ce0a47 👷 Deploy Preview for mellow-kheer-f64241 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/mellow-kheer-f64241/deploys/651e536ec512c00008445834 👷 Deploy Preview for sweet-granita-eb4fde processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/sweet-granita-eb4fde/deploys/651e536e9510240007ef33f9 👷 Deploy Preview for gleeful-dragon-9ccd2f processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/gleeful-dragon-9ccd2f/deploys/651e536eb8d0b30008f1cda9 👷 Deploy Preview for unrivaled-biscochitos-babab3 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/unrivaled-biscochitos-babab3/deploys/651e536eb14d6600082635e7 👷 Deploy Preview for enchanting-sopapillas-995803 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/enchanting-sopapillas-995803/deploys/651e536e878ee3000841c97d 👷 Deploy Preview for teal-lokum-369903 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/teal-lokum-369903/deploys/651e536ec04e7200074148a7 👷 Deploy Preview for dashing-churros-4c6f00 processing. 
Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/dashing-churros-4c6f00/deploys/651e536e7801450008f235af 👷 Deploy Preview for zesty-jelly-ea2fe1 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/zesty-jelly-ea2fe1/deploys/651e536ecf65460008293f01 👷 Deploy Preview for animated-souffle-19791b processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/animated-souffle-19791b/deploys/651e536e57c70d000877e4c0 👷 Deploy Preview for visionary-boba-89f977 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/visionary-boba-89f977/deploys/651e536e7801450008f235b1 👷 Deploy Preview for zesty-caramel-5e8224 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/zesty-caramel-5e8224/deploys/651e536ed7f4a10008b0d3b2 👷 Deploy Preview for frolicking-halva-a1dad9 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/frolicking-halva-a1dad9/deploys/651e536e29b2c30008fbf726 👷 Deploy Preview for cerulean-squirrel-103e3d processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/cerulean-squirrel-103e3d/deploys/651e536eedd5270008e2f035 👷 Deploy Preview for beautiful-twilight-c313c9 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/beautiful-twilight-c313c9/deploys/651e536e6e0d1e0008f104cd 👷 Deploy Preview for superb-crisp-68aa87 processing. 
Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/superb-crisp-68aa87/deploys/651e536e48e46d000739510d 👷 Deploy Preview for lovely-cuchufli-871162 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/lovely-cuchufli-871162/deploys/651e536ecc3ee5000896a789 👷 Deploy Preview for ubiquitous-tartufo-41e4dc processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/ubiquitous-tartufo-41e4dc/deploys/651e536eae05e500082d50e2 👷 Deploy Preview for bright-alfajores-1fc557 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/bright-alfajores-1fc557/deploys/651e536e41eac500085c6f51 👷 Deploy Preview for playful-conkies-50ac93 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/playful-conkies-50ac93/deploys/651e536e6e0d1e0008f104d1 👷 Deploy Preview for clever-syrniki-c8c842 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/clever-syrniki-c8c842/deploys/651e536e48e46d0007395112 👷 Deploy Preview for exquisite-brioche-7a7975 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/exquisite-brioche-7a7975/deploys/651e536eae05e500082d50e6 👷 Deploy Preview for brilliant-sopapillas-e444df processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/brilliant-sopapillas-e444df/deploys/651e536ece1e250008b92d13 👷 Deploy Preview for stupendous-gaufre-65714f processing. 
Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/stupendous-gaufre-65714f/deploys/651e536e8314170008d22a43 👷 Deploy Preview for warm-liger-39b63b processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/warm-liger-39b63b/deploys/651e536ece1e250008b92d17 👷 Deploy Preview for mellifluous-meerkat-55d1b0 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/mellifluous-meerkat-55d1b0/deploys/651e536e878ee3000841c982 👷 Deploy Preview for exquisite-haupia-77d21e processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/exquisite-haupia-77d21e/deploys/651e536e7801450008f235b7 👷 Deploy Preview for eclectic-belekoy-4e5b23 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/eclectic-belekoy-4e5b23/deploys/651e536eb8d0b30008f1cdae 👷 Deploy Preview for glittering-alfajores-5d0e23 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/glittering-alfajores-5d0e23/deploys/651e536ec8dc5200088e2234 👷 Deploy Preview for fastidious-brigadeiros-f3f904 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/fastidious-brigadeiros-f3f904/deploys/651e536e7801450008f235bd 👷 Deploy Preview for preeminent-fairy-dc7466 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/preeminent-fairy-dc7466/deploys/651e536e7801450008f235bf 👷 Deploy Preview for chipper-starlight-17aafc processing. 
Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/chipper-starlight-17aafc/deploys/651e536e878ee3000841c987 👷 Deploy Preview for guileless-cassata-3d515e processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/guileless-cassata-3d515e/deploys/651e536e7ba258000834f881 👷 Deploy Preview for jolly-sable-dcee29 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/jolly-sable-dcee29/deploys/651e536e2b733600088dcb1c 👷 Deploy Preview for dulcet-puppy-b5c366 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/dulcet-puppy-b5c366/deploys/651e536e034be600081a6fee 👷 Deploy Preview for cerulean-fox-1a7b38 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/cerulean-fox-1a7b38/deploys/651e536ec8dc5200088e2239 👷 Deploy Preview for super-haupia-ffa0ed processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/super-haupia-ffa0ed/deploys/651e536eed048000089f0e4d 👷 Deploy Preview for effortless-trifle-256fc1 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/effortless-trifle-256fc1/deploys/651e536ec5e1ff0008515aa0 👷 Deploy Preview for sparkly-cupcake-61cc67 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/sparkly-cupcake-61cc67/deploys/651e536e8314170008d22a48 👷 Deploy Preview for courageous-lolly-8ddc07 processing. 
Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/courageous-lolly-8ddc07/deploys/651e536ec04e7200074148ac 👷 Deploy Preview for marvelous-pie-d42d33 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/marvelous-pie-d42d33/deploys/651e536e0bcb0e000865e4a2 👷 Deploy Preview for startling-narwhal-741826 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/startling-narwhal-741826/deploys/651e536ea528f900096190fe 👷 Deploy Preview for spectacular-genie-bd2a02 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/spectacular-genie-bd2a02/deploys/651e536e6e0d1e0008f104d7 👷 Deploy Preview for gregarious-griffin-907e2b processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/gregarious-griffin-907e2b/deploys/651e536eb0ba490007f0b4d8 👷 Deploy Preview for endearing-youtiao-538e5c processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/endearing-youtiao-538e5c/deploys/651e536e29b2c30008fbf72b 👷 Deploy Preview for sparkly-cuchufli-2d5098 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/sparkly-cuchufli-2d5098/deploys/651e536edd9c7700086cd9dc 👷 Deploy Preview for celadon-kitsune-8d2590 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/celadon-kitsune-8d2590/deploys/651e536ec5e1ff0008515aa5 👷 Deploy Preview for nimble-souffle-732447 processing. 
Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/nimble-souffle-732447/deploys/651e536eedd5270008e2f03a 👷 Deploy Preview for jovial-entremet-7bdcdd processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/jovial-entremet-7bdcdd/deploys/651e536e7ba258000834f886 👷 Deploy Preview for phenomenal-sunflower-128494 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/phenomenal-sunflower-128494/deploys/651e536ef9815a0008545cbc 👷 Deploy Preview for vocal-souffle-227f6c processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/vocal-souffle-227f6c/deploys/651e536eb14d6600082635ec 👷 Deploy Preview for aquamarine-puffpuff-4776f4 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/aquamarine-puffpuff-4776f4/deploys/651e536e29b2c30008fbf730 👷 Deploy Preview for fanciful-cobbler-cc5308 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/fanciful-cobbler-cc5308/deploys/651e536e81015f00079c28ae 👷 Deploy Preview for iridescent-donut-a77d46 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/iridescent-donut-a77d46/deploys/651e536ef1a97600081d2b60 👷 Deploy Preview for lucky-sunflower-5709a9 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/lucky-sunflower-5709a9/deploys/651e536e7c84c200088c5a75 👷 Deploy Preview for gorgeous-basbousa-bcc1c0 processing. 
Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/gorgeous-basbousa-bcc1c0/deploys/651e536e7c84c200088c5a77 👷 Deploy Preview for glittery-torrone-09637c processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/glittery-torrone-09637c/deploys/651e536e2b733600088dcb21 👷 Deploy Preview for bright-stardust-c8e42e processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/bright-stardust-c8e42e/deploys/651e536e0bcb0e000865e4a7 👷 Deploy Preview for dynamic-narwhal-75d05e processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/dynamic-narwhal-75d05e/deploys/651e536ea545ce0007d59a2c 👷 Deploy Preview for spiffy-stroopwafel-6468ab processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/spiffy-stroopwafel-6468ab/deploys/651e536e578c8c0008e9ffd4 👷 Deploy Preview for stately-tanuki-5ea9e5 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/stately-tanuki-5ea9e5/deploys/651e536e086c61000825e429 👷 Deploy Preview for frabjous-stardust-cd5f08 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/frabjous-stardust-cd5f08/deploys/651e536e578c8c0008e9ffd8 👷 Deploy Preview for guileless-fenglisu-c7d613 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/guileless-fenglisu-c7d613/deploys/651e536ea16f450008f3fd1a 👷 Deploy Preview for zippy-clafoutis-bd6c6d processing. 
Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/zippy-clafoutis-bd6c6d/deploys/651e536e2e48b90008341c3e 👷 Deploy Preview for regal-douhua-401584 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/regal-douhua-401584/deploys/651e536e5478ce000870571a 👷 Deploy Preview for famous-tartufo-09830b processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/famous-tartufo-09830b/deploys/651e536e2caa450008078de8 👷 Deploy Preview for hilarious-flan-63c055 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/hilarious-flan-63c055/deploys/651e536e086c61000825e42d 👷 Deploy Preview for cosmic-bavarois-f126ac processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/cosmic-bavarois-f126ac/deploys/651e536e232a1a00083724d4 👷 Deploy Preview for jocular-treacle-2c1d20 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/jocular-treacle-2c1d20/deploys/651e536e89f08d00092b4b8b 👷 Deploy Preview for courageous-fenglisu-e69b7d processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/courageous-fenglisu-e69b7d/deploys/651e536e3fca9d00096988a4 👷 Deploy Preview for famous-bunny-e85616 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/famous-bunny-e85616/deploys/651e536e29d5870007ed3747 👷 Deploy Preview for musical-nougat-b9c5c4 processing. 
Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/musical-nougat-b9c5c4/deploys/651e536eab48020008f16273 👷 Deploy Preview for mellifluous-chaja-5a2a83 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/mellifluous-chaja-5a2a83/deploys/651e536e3fca9d00096988a6 👷 Deploy Preview for sage-fairy-09b28f processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/sage-fairy-09b28f/deploys/651e536e244e7b000886a6da 👷 Deploy Preview for dazzling-fenglisu-3458be processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/dazzling-fenglisu-3458be/deploys/651e536e29d5870007ed3749 👷 Deploy Preview for rad-cassata-7cfc10 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/rad-cassata-7cfc10/deploys/651e536e741cbc0008c09bd8 👷 Deploy Preview for keen-bienenstitch-a28ffb processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/keen-bienenstitch-a28ffb/deploys/651e536e1d9b380008be3d45 👷 Deploy Preview for celadon-torrone-3240b7 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/celadon-torrone-3240b7/deploys/651e536e244e7b000886a6e1 👷 Deploy Preview for fanciful-beignet-9df5ff processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/fanciful-beignet-9df5ff/deploys/651e536e244e7b000886a6e3 👷 Deploy Preview for remarkable-fenglisu-883a94 processing. 
Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/remarkable-fenglisu-883a94/deploys/651e536e5478ce000870571f 👷 Deploy Preview for fastidious-churros-8830a0 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/fastidious-churros-8830a0/deploys/651e536ecb50d900085a61d0 👷 Deploy Preview for starlit-torte-ff08d0 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/starlit-torte-ff08d0/deploys/651e536e29d5870007ed374f 👷 Deploy Preview for astounding-lokum-6a67e3 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/astounding-lokum-6a67e3/deploys/651e536e510d90000804bfca 👷 Deploy Preview for effulgent-griffin-98d1a4 processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/effulgent-griffin-98d1a4/deploys/651e536e2e48b90008341c43 👷 Deploy Preview for singular-stroopwafel-39fa3c processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/singular-stroopwafel-39fa3c/deploys/651e536ed6ce620008cbab1f 👷 Deploy Preview for euphonious-cocada-1672cd processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/euphonious-cocada-1672cd/deploys/651e536e736e17000855274a 👷 Deploy Preview for tiny-pie-11ab9e processing. Name Link 🔨 Latest commit f238554b72ae788acd2ccf4e26831b5639e55421 🔍 Latest deploy log https://app.netlify.com/sites/tiny-pie-11ab9e/deploys/651e536e1d30f200081b5a20 👷 Deploy Preview for ornate-kringle-5414d9 processing. 
| Name | Link |
|------|------|
| 🔨 Latest commit | f238554b72ae788acd2ccf4e26831b5639e55421 |
| 🔍 Latest deploy log | https://app.netlify.com/sites/ornate-kringle-5414d9/deploys/651e536e29d5870007ed3754 |
gharchive/pull-request
2023-10-05T06:10:52
2025-04-01T06:39:11.787790
{ "authors": [ "jelly-rampage" ], "repo": "jelly-rampage/nextjs-blog-theme", "url": "https://github.com/jelly-rampage/nextjs-blog-theme/pull/2", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1625574266
`Argument of type 'Promise<typeof import("...@types/swagger-ui-react/index")>' is not assignable to parameter of type 'DynamicOptions<{ spec: any; }> | Loader<{ spec: any; }>'.`

Describe the bug

I am trying to show an OpenAPI document and followed the same example on GitHub. I am receiving this error. Do you have any idea how to fix it? Thank you.

```
Argument of type 'Promise<typeof import("c:/workspace/***/***/node_modules/@types/swagger-ui-react/index")>' is not assignable to parameter of type 'DynamicOptions<{ spec: any; }> | Loader<{ spec: any; }>'.
  Type 'Promise<typeof import("c:/workspace/***/***/node_modules/@types/swagger-ui-react/index")>' is not assignable to type 'LoaderComponent<{ spec: any; }>'.
    Type 'typeof import("c:/workspace/***/***/node_modules/@types/swagger-ui-react/index")' is not assignable to type 'ComponentType<{ spec: any; }> | ComponentModule<{ spec: any; }>'.
      Type 'typeof import("c:/workspace/***/***/node_modules/@types/swagger-ui-react/index")' is not assignable to type 'ComponentModule<{ spec: any; }>'.
        Types of property 'default' are incompatible.
          Type 'typeof SwaggerUI' is not assignable to type 'ComponentType<{ spec: any; }>'.
            Type 'typeof SwaggerUI' is not assignable to type 'ComponentClass<{ spec: any; }, any>'.
              Construct signature return types 'SwaggerUI' and 'Component<{ spec: any; }, any, any>' are incompatible.
                The types of 'props' are incompatible between these types.
                  Type 'Readonly<SwaggerUIProps>' is not assignable to type 'Readonly<{ spec: any; }>'.
                    Property 'spec' is optional in type 'Readonly<SwaggerUIProps>' but required in type 'Readonly<{ spec: any; }>'.ts(2345)
```

Reproduction

Followed the example on the GitHub README.md.

System Info

"next": "13.0.7",
"next-swagger-doc": "^0.3.6",
"swagger-ui-react": "^4.18.1",

System:
  OS: Windows 10 10.0.22621
  CPU: (16) x64 AMD Ryzen 9 5900HX with Radeon Graphics
  Memory: 10.91 GB / 31.35 GB
Binaries:
  Node: 16.13.0 - C:\Program Files\nodejs\node.EXE
  Yarn: 1.22.11 - ~\AppData\Roaming\npm\yarn.CMD
  npm: 8.1.0 - C:\Program Files\nodejs\npm.CMD
Browsers:
  Edge: Spartan (44.22621.1344.0), Chromium (110.0.1587.69)
  Internet Explorer: 11.0.22621.1

Used Package Manager

yarn

Validations

[X] Follow our Code of Conduct
[X] Read the Contributing Guide.
[X] Check that there isn't already an issue that reports the same bug to avoid creating a duplicate.
[X] Check that this is a concrete bug. For Q&A, please open a GitHub Discussion instead.
[X] The provided reproduction is a minimal reproducible of the bug.

I found the issue. I used:

```typescript
const SwaggerUI = dynamic<SwaggerUIProps>(import("swagger-ui-react"), {
  ssr: false,
});
```

instead of:

```typescript
const SwaggerUI = dynamic<{ spec: any }>(import("swagger-ui-react"), {
  ssr: false,
});
```

Thank you
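The root cause can be modeled without Next.js at all. Below is a dependency-free sketch — `dynamicLike`, `SwaggerUIPropsLike`, and `swaggerModule` are made-up stand-ins, not the real `next/dynamic` or `swagger-ui-react` types — showing why a component whose `spec` prop is optional cannot satisfy a wrapper typed with a required `spec`:

```typescript
// Stand-in for swagger-ui-react's props: `spec` is optional.
type SwaggerUIPropsLike = { spec?: unknown; url?: string };

// Stand-in for the dynamically imported component module.
const swaggerModule = {
  default: (props: SwaggerUIPropsLike): string =>
    props.spec ? "rendered with spec" : "rendered without spec",
};

// Stand-in for next/dynamic: the generic P fixes the wrapper's prop type,
// and the supplied module must expose a component accepting exactly P.
function dynamicLike<P>(mod: { default: (props: P) => string }) {
  return (props: P) => mod.default(props);
}

// `dynamicLike<{ spec: any }>(swaggerModule)` would fail to type-check,
// because `spec` is optional in SwaggerUIPropsLike but required in
// `{ spec: any }` — the same TS2345 mismatch as in the error above.
// Using the module's own props type, as in the fix, compiles:
const SwaggerUI = dynamicLike<SwaggerUIPropsLike>(swaggerModule);

console.log(SwaggerUI({ spec: { openapi: "3.0.0" } }));
```

The general takeaway is to parameterize `dynamic` with the imported component's own props type rather than a stricter hand-written one.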
gharchive/issue
2023-03-15T13:52:35
2025-04-01T06:39:11.798052
{ "authors": [ "ayberkcanturk" ], "repo": "jellydn/next-swagger-doc", "url": "https://github.com/jellydn/next-swagger-doc/issues/646", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
2096806187
⚠️ SpeakUpConference.com has degraded performance

In 1493e04, SpeakUpConference.com (https://www.speakupconference.com) experienced degraded performance:

HTTP code: 200
Response time: 9878 ms

Resolved: SpeakUpConference.com performance has improved in 84e96c2 after 11 minutes.
gharchive/issue
2024-01-23T19:13:24
2025-04-01T06:39:11.866220
{ "authors": [ "jemmorey" ], "repo": "jemmorey/upptime", "url": "https://github.com/jemmorey/upptime/issues/142", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1500288641
Provide a GH permissions template

See https://github.com/jenkins-infra/helpdesk/pull/3299

I removed the permissions level dropdown, given teams typically have admin permissions over the repository.

Looks fine, you might want to add an issue template config and change the number in the filename.
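The reviewer's suggestion refers to an issue template config file. A hypothetical sketch of what such a `.github/ISSUE_TEMPLATE/config.yml` might contain — the actual contents were not part of this PR, and the contact link below is an illustrative assumption:

```yaml
# Hypothetical .github/ISSUE_TEMPLATE/config.yml — not taken from the PR.
blank_issues_enabled: false
contact_links:
  - name: Jenkins infra helpdesk
    url: https://github.com/jenkins-infra/helpdesk/issues
    about: General infrastructure requests that are not permission changes.
```

Setting `blank_issues_enabled: false` steers contributors toward the permissions template instead of free-form issues.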
gharchive/pull-request
2022-12-16T14:07:55
2025-04-01T06:39:11.879474
{ "authors": [ "NotMyFault", "lemeurherve" ], "repo": "jenkins-infra/repository-permissions-updater", "url": "https://github.com/jenkins-infra/repository-permissions-updater/pull/3032", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
303655080
Grant @dwnusbaum permission to release cloudbees-folder

Description

Grant @dwnusbaum permission to release https://github.com/jenkinsci/cloudbees-folder-plugin.

CC @oleg-nenashev @jglick @recena for confirmation from existing maintainers.

Submitter checklist for changing permissions

Always
[x] Add link to plugin/component Git repository in description above

When adding new uploaders (this includes newly created permissions files)
[x] Make sure to @mention an existing maintainer to confirm the permissions request, if applicable
[x] Use the Jenkins community (LDAP) account name in the YAML file, not the GitHub account name
[x] All newly added users have logged in to Artifactory at least once

@dwnusbaum are you also fine with being a default assignee in JIRA? Currently there is no default assignee, and the tickets appear to be untriaged: https://issues.jenkins-ci.org/browse/JENKINS-49988?jql=project %3D JENKINS AND status in (Open%2C "In Progress"%2C Reopened) AND component %3D cloudbees-folder-plugin

@oleg-nenashev Sure you can make me the default.
gharchive/pull-request
2018-03-08T22:25:55
2025-04-01T06:39:11.884181
{ "authors": [ "dwnusbaum", "oleg-nenashev" ], "repo": "jenkins-infra/repository-permissions-updater", "url": "https://github.com/jenkins-infra/repository-permissions-updater/pull/624", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
368920745
Create plugin-huaweicloud-credentials

Description

Submitter checklist for changing permissions

Always
[ ] Add link to plugin/component Git repository in description above

For a newly hosted plugin only
[ ] Add link to resolved HOSTING issue in description above

For a new permissions file only
[ ] Make sure the file is created in permissions/ directory
[ ] artifactId (pom.xml) is used for name (permissions YAML file).
[ ] groupId / artifactId (pom.xml) are correctly represented in path (permissions YAML file)
[ ] Check that the file is named plugin-${artifactId}.yml for plugins

When adding new uploaders (this includes newly created permissions files)
[ ] Make sure to @mention an existing maintainer to confirm the permissions request, if applicable
[ ] Use the Jenkins community (LDAP) account name in the YAML file, not the GitHub account name
[ ] All newly added users have logged in to Artifactory at least once

Build failed; the context from the latest run is:

```
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallConstructor(CallSiteArray.java:60)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:235)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:247)
at io.jenkins.infra.repository_permissions_updater.ArtifactoryPermissionsUpdater$_generateApiPayloads_closure1.doCall(ArtifactoryPermissionsUpdater.groovy:136)
at sun.reflect.GeneratedMethodAccessor8.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:93)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:325)
at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:294)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1024)
at groovy.lang.Closure.call(Closure.java:414)
at groovy.lang.Closure.call(Closure.java:430)
at org.codehaus.groovy.runtime.ResourceGroovyMethods.eachFile(ResourceGroovyMethods.java:1070)
at org.codehaus.groovy.runtime.ResourceGroovyMethods.eachFile(ResourceGroovyMethods.java:1088)
at org.codehaus.groovy.runtime.dgm$936.invoke(Unknown Source)
at org.codehaus.groovy.runtime.callsite.PojoMetaMethodSite$PojoMetaMethodSiteNoUnwrapNoCoerce.invoke(PojoMetaMethodSite.java:274)
at org.codehaus.groovy.runtime.callsite.PojoMetaMethodSite.call(PojoMetaMethodSite.java:56)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:125)
at io.jenkins.infra.repository_permissions_updater.ArtifactoryPermissionsUpdater.generateApiPayloads(ArtifactoryPermissionsUpdater.groovy:134)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:93)
at org.codehaus.groovy.runtime.callsite.StaticMetaMethodSite$StaticMetaMethodSiteNoUnwrapNoCoerce.invoke(StaticMetaMethodSite.java:151)
at org.codehaus.groovy.runtime.callsite.StaticMetaMethodSite.callStatic(StaticMetaMethodSite.java:102)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallStatic(CallSiteArray.java:56)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callStatic(AbstractCallSite.java:194)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callStatic(AbstractCallSite.java:214)
at io.jenkins.infra.repository_permissions_updater.ArtifactoryPermissionsUpdater.main(ArtifactoryPermissionsUpdater.groovy:317)
[Pipeline] stage (Archive)
Using the ‘stage’ step without a block argument is deprecated
Entering stage Archive
Proceeding
[Pipeline] archiveArtifacts
Archiving artifacts
[Pipeline] archiveArtifacts
Archiving artifacts
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
GitHub has been notified of this commit’s build result
ERROR: script returned exit code 1
Finished: FAILURE
```

Powered by the Comment Logger

wrong file name
gharchive/pull-request
2018-10-11T01:55:54
2025-04-01T06:39:11.893673
{ "authors": [ "jenkinsadmin", "liucc52" ], "repo": "jenkins-infra/repository-permissions-updater", "url": "https://github.com/jenkins-infra/repository-permissions-updater/pull/887", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
289941720
make the use of the domain easier to override?

The exposecontroller settings and domain are encoded in a secret for thunder, which makes them hard to use locally or to change (e.g. to a different domain). I wonder if we can make it easier to change those?

As a workaround I hacked the secrets.yaml.dec locally, then added these new targets in the Makefile:

```makefile
local-install:
	helm install jenkins-x/$(CHART) --name $(RELEASE) -f ./myvalues.yaml -f ./secrets.yaml.dec --version $(CHART_VERSION) $(ARGS)

local-upgrade:
	helm upgrade $(RELEASE) jenkins-x/$(CHART) -f myvalues.yaml -f secrets.yaml.dec --version $(CHART_VERSION) $(ARGS)
```

Wonder if there's a cleaner way?

Not an issue anymore
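One hedged possibility, relying on helm's documented precedence (later `-f` files override earlier ones, and `--set` overrides all value files): layer a small local overrides file or an inline `--set` on top, instead of editing secrets.yaml.dec. The `expose.domain` key and file names below are placeholders, not the chart's actual values layout:

```shell
# Sketch only -- later -f files and --set flags win, so local overrides
# can be layered on top without touching the decrypted secrets file.
helm upgrade $(RELEASE) jenkins-x/$(CHART) \
  -f myvalues.yaml \
  -f secrets.yaml.dec \
  -f my-local-overrides.yaml \
  --set expose.domain=my.local.example \
  --version $(CHART_VERSION) $(ARGS)
```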
gharchive/issue
2018-01-19T11:14:35
2025-04-01T06:39:11.900686
{ "authors": [ "jstrachan", "rawlingsj" ], "repo": "jenkins-x/cloud-environments", "url": "https://github.com/jenkins-x/cloud-environments/issues/20", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
166518875
publish change list on all releases

Please publish a change list as part of each release. Currently the plugin points to the Jenkins wiki, which states that after version x the change lists moved to a GitHub page. When you click this 2nd link you reach a page that again is missing the current changes. Please fix this so that when you click the URL from the plugin manager you get to a page that DOES contain the change lists, allowing you to make an informed decision. Jenkins plugins do have the bad habit of breaking Jenkins when updated; please don't make this maintenance process even harder.

duplicate of https://github.com/jenkinsci/ghprb-plugin/issues/360

Hi - I volunteered to help out with this repo. I'll group this with 360 as @martenson pointed out and look to address this problem.

thanks @benpatterson !
gharchive/issue
2016-07-20T07:52:10
2025-04-01T06:39:11.965409
{ "authors": [ "benpatterson", "martenson", "ssbarnea" ], "repo": "jenkinsci/ghprb-plugin", "url": "https://github.com/jenkinsci/ghprb-plugin/issues/386", "license": "mit", "license_type": "permissive", "license_source": "bigquery" }
258098757
Add documentation example on how to pass build arguments into the Groovy script

Context: the Lockable Resources plugin includes an option to specify the needed resource by one of: exact resource name, matching a label, or evaluating a Groovy script for each configured resource that would return true if this resource is okay for this job. Then one of the "okay" resources (assuming the "Number of resources to request" setting is 1) is reserved while the build runs.

My use-case, now solved by code in PR #72, is that we have a test farm, where usually any of the systems can be used for a test (so label-matching is in place), except when developers want to run the job against some particular system - and protect it from being used by any other job during this time. The build job (or actually a Multiphase job calling several others) has build arguments, including specification of the environment to use for the test (a "CONTROLLER" in examples below).

For the test, I used a build of the plugin with the PR above integrated (made by `mvn package`), and in the job I selected "Meta Data / This build requires lockable resources", further selected "Groovy Expression" and entered the script below. The "Resources" and "Label" fields were left empty, the "Number of resources to request" was set to 1 (disregard the warning that `Given amount 1 is greater than amount of resources: 0.`), and "Reserved resources variable name" is `LOCKED_CONTROLLER`.

Note the `println` lines below end up in jenkins.log, and can be pretty noisy, so comment them away when your job definition works well :)

```groovy
// We need an available testbed resource marked with label "rc-validation-farm"
// If the user specified a particular testbed name in CONTROLLER var, require that one instead (regardless of labels)
/* println "Inspecting the resource to lock for requested CONTROLLER='" + CONTROLLER +
   "' (looking at resourceName='" + resourceName + "' resourceDescription='" + resourceDescription +
   "' resourceLabels='" + resourceLabels + "')" */
if ( CONTROLLER.startsWith("LOCK_LABEL:") ) {
    def LOCK_LABEL = (CONTROLLER =~ /^LOCK_LABEL:(.*?)$/ )[0][1];
    /* println "Looking for LOCK_LABEL='" + LOCK_LABEL + "' among '" + resourceLabels +
       "' for '" + resourceName + "' (" + resourceDescription + ")" */
    if (resourceLabels.contains( LOCK_LABEL )) {
        // println "ACCEPTED '" + resourceName + "'"
        return true;
    }
} else {
    // println "Looking for 'rc:" + CONTROLLER + "' in the name '" + resourceName + "'"
    if (resourceName == ("rc:"+CONTROLLER) ) {
        // println "ACCEPTED '" + resourceName + "'"
        return true;
    }
}
// println "Resource '" + resourceName + "' is not suitable for this job"
return false; // Tested resource is not appropriate for this build
```

The numerous corresponding lockable resource definitions in Manage Jenkins define a name (like `rc:controller1`) and labels (like `rc-validation-farm rc-model-3`), and sometimes comments about nuances of the controller model. The job build parameter CONTROLLER is a predefined Global Choice Parameter, which includes the names like `controller1` that our devs can pick, and a default value of `LOCK_LABEL:rc-validation-farm`. The script above was made generic enough to ease copy-pasting, so it reacts to build arguments starting with `LOCK_LABEL:` and picks the rest of the string as the label to look for in resources; otherwise it would look for the requested resource name (with the prefix `rc:`).

As you could guess, there is a second side to the medal: the default CONTROLLER build argument is not usable as a definitive host name, so that build argument has to be replaced with the actually picked value :) For this we use the EnvInject plugin, so in the job I further selected "Prepare an environment for the run" and entered an "Evaluated Groovy script" with:

```groovy
def map = [:]
/* println "Requested build arg CONTROLLER=='" + CONTROLLER + "'; the locked resource == '" + LOCKED_CONTROLLER + "'"; */
if ("LOCK_LABEL:rc-validation-farm".equals(CONTROLLER)) {
    def CTLNAME = ( (LOCKED_CONTROLLER =~ /^rc:(.*?)$/)[0][1] )
    println "Extracted CONTROLLER:='" + CTLNAME + "'"
    map << [ CONTROLLER : CTLNAME ]
}
return map
```

(the script actually evaluates optional overrides for a few other variables as well, hence this structure and not a simpler one). Note that `println` here ends up in the build job's log.

Finally note that for sub-jobs called from this one, you should also specify "Predefined parameters" including `CONTROLLER=${CONTROLLER}` so the one mapped above is in place for those jobs that would access it, and not the default token. The "Build on same node" and "Current build parameters" options are useful for such cases as well :)

UPDATE: The pull request has been merged for a while now. Over the past couple of years I've referred a number of IRC discussions to this issue. I believe it has thus become a good basis for a documentation chapter about using this trick, so I rephrased the issue name to have it only closed after adding published documentation rather than keeping it a needle of know-how lost in the haystack.
gharchive/issue
2017-09-15T16:39:30
2025-04-01T06:39:12.042025
{ "authors": [ "jimklimov" ], "repo": "jenkinsci/lockable-resources-plugin", "url": "https://github.com/jenkinsci/lockable-resources-plugin/issues/73", "license": "mit", "license_type": "permissive", "license_source": "bigquery" }
120151475
no ordering in config_hash

jenkins::sysconfig does not support any kind of ordering. If you need to change JENKINS_ARGS, this might lead to a broken sysconfig. Example: given this config_hash in YAML:

```yaml
jenkins::config_hash:
  HTTP_PORT:
    value: -1
  HTTPS_PORT:
    value: 443
  HTTPS_CERT:
    value: '/some/path'
  HTTPS_KEY:
    value: '/some/patjh'
  JENKINS_ARGS:
    value: '--webroot=/var/cache/$NAME/war --httpPort=$HTTP_PORT --ajp13Port=$AJP_PORT --httpsPort=$HTTPS_PORT --httpsCertificate=$HTTPS_CERT --httpsPrivateKey=$HTTPS_KEY'
```

you might end up with this sysconfig:

```
# port for HTTP connector (default 8080; disable with -1)
HTTP_PORT="-1"
# port for AJP connector (disabled by default)
AJP_PORT=-1
# servlet context, important if you want to use apache proxying
PREFIX=/$NAME
# arguments to pass to jenkins.
# --javahome=$JAVA_HOME
# --httpPort=$HTTP_PORT (default 8080; disable with -1)
# --httpsPort=$HTTP_PORT
# --ajp13Port=$AJP_PORT
# --argumentsRealm.passwd.$ADMIN_USER=[password]
# --argumentsRealm.roles.$ADMIN_USER=admin
# --webroot=~/.jenkins/war
# --prefix=$PREFIX
JENKINS_ARGS="--webroot=/var/cache/$NAME/war --httpPort=$HTTP_PORT --ajp13Port=$AJP_PORT --httpsPort=$HTTPS_PORT --httpsCertificate=$HTTPS_CERT --httpsPrivateKey=$HTTPS_KEY"
HTTPS_PORT="443"
HTTPS_CERT="/some/path"
HTTPS_KEY="/some/patjh"
```

Since the variables are set in the wrong order, variable expansion does not work...

Hashes are inherently unordered. I'm not sure what you are suggesting?

Clearly I didn't think this through ;) A hash is a bad idea for defaults, as these must be ordered - otherwise it makes config_hash quite useless.

@Kentzo There's no interpolation under systemd. The order of the EnvironmentFile directives is meaningless.

That's not true when env vars depend on one another. E.g. in /etc/default/jenkins

```
JENKINS_ARGS="--webroot=/var/cache/$NAME/war --httpPort=$HTTP_PORT --httpListenAddress=127.0.0.1"
```

which makes order meaningful.

See the comment in https://github.com/jenkinsci/puppet-jenkins/blob/master/templates/jenkins-slave-defaults.erb#L3-L4 and https://www.freedesktop.org/software/systemd/man/systemd.exec.html#Environment — EnvironmentFile declarations are parsed into a `char **` and set as environ(7). There is no interpolation.

I'm not sure how it works, but this is the state of /etc/default/jenkins after installing jenkins with this module on Ubuntu 16.04.

Do you mean it's buggy? Debian hasn't been switched over to systemd unit files yet.
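The interpolation point is easy to demonstrate: when an init script *sources* the defaults file (as the Debian/Ubuntu SysV script does), definition order decides whether `$HTTPS_PORT` is expanded or empty. A minimal, self-contained sketch (the /tmp file names are illustrative only):

```shell
# When a defaults file is sourced by a shell, variables are expanded at the
# moment each line executes -- so definition order matters.
cat > /tmp/ordered.sh <<'EOF'
HTTPS_PORT="443"
JENKINS_ARGS="--httpsPort=$HTTPS_PORT"
EOF

cat > /tmp/unordered.sh <<'EOF'
JENKINS_ARGS="--httpsPort=$HTTPS_PORT"
HTTPS_PORT="443"
EOF

. /tmp/ordered.sh
echo "ordered:   $JENKINS_ARGS"     # --httpsPort=443

unset JENKINS_ARGS HTTPS_PORT
. /tmp/unordered.sh
echo "unordered: $JENKINS_ARGS"     # --httpsPort=  (expanded before HTTPS_PORT was set)
```

Under systemd's EnvironmentFile handling, by contrast, neither file would interpolate anything — both would carry a literal `$HTTPS_PORT` — which is why the two comments above are talking past each other.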
gharchive/issue
2015-12-03T11:38:30
2025-04-01T06:39:12.058346
{ "authors": [ "Kentzo", "TheMeier", "jhoblitt" ], "repo": "jenkinsci/puppet-jenkins", "url": "https://github.com/jenkinsci/puppet-jenkins/issues/443", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
280204016
Feature: Custom messages for success and failure

Hi, I would like to add a different custom message when my build fails or succeeds. As far as I know, I can only set one custom message that will be shown in all cases. I made something that works the way I want in victor-accarini/slack-plugin, but I would like feedback from you guys, since I'm not a Java dev. Thanks!

PS: @gurumaia Are you still maintaining this plugin?

Possibly fixed in #169
gharchive/issue
2017-12-07T16:57:17
2025-04-01T06:39:12.062097
{ "authors": [ "nhathy", "victor-accarini" ], "repo": "jenkinsci/slack-plugin", "url": "https://github.com/jenkinsci/slack-plugin/issues/360", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
56363673
Fix for using with CloudBees Folder plugin

https://issues.jenkins-ci.org/browse/JENKINS-24396

Thank you for a pull request! Please check this document for how the Jenkins project handles pull requests

Is anyone maintaining this plugin any more??

I don't think so. Might make sense to take over the ownership and maintain the plugin on your own. And don't forget about thorough testing :)

Looks like @ndeloof was the last to merge anything ... any insight?

Heh, that was awesome and fast, thank you kindly. Are there still plans for maintenance and upkeep, or do you need a 2nd set of hands to help out? Hit me up on e-mail, we can discuss further there. In the meantime ... thanks again!! :)

I can't see active maintenance on this plugin, so feel free to request ownership on the jenkins-dev ML. Cheers.
gharchive/pull-request
2015-02-03T11:07:58
2025-04-01T06:39:12.066596
{ "authors": [ "Brantone", "ctapobep", "jenkinsadmin", "ndeloof" ], "repo": "jenkinsci/template-project-plugin", "url": "https://github.com/jenkinsci/template-project-plugin/pull/13", "license": "mit", "license_type": "permissive", "license_source": "bigquery" }
638325193
Convert to the Incubated project before the release?

Hi @timja, I would suggest making it an incubated project so that we have more freedom w.r.t. next steps (e.g. merging it into the Jenkins core). I would suggest the following steps:

- Add "Incubated project" to the description and docs, similar to https://plugins.jenkins.io/pipeline-as-yaml/
- Mark all APIs with @Restricted(Beta.class)

Sure
gharchive/issue
2020-06-14T10:30:14
2025-04-01T06:39:12.068591
{ "authors": [ "oleg-nenashev", "timja" ], "repo": "jenkinsci/theme-manager-plugin", "url": "https://github.com/jenkinsci/theme-manager-plugin/issues/6", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
733926069
WebUI enhancement: filter reset button

Is your feature request related to a problem? Please describe.
no

Describe the solution you'd like
...

Describe alternatives you've considered
...

Additional context
...

Even though the implementation would certainly be quite simple, I'm not sure that introducing an additional button next to every filter field would really be the way to go. In my view, the filters can be reset perfectly well in every view via the "Filter zurücksetzen" ("reset filters") control in the lower area. I don't see why a separate button would now be needed next to each filter. WebUI programming should be avoided anyway. I'll nevertheless leave the ticket open for the time being, in case someone feels inclined to implement this themselves and propose it for integration.
gharchive/issue
2020-11-01T09:58:09
2025-04-01T06:39:12.087023
{ "authors": [ "MacHannes", "jens-maus" ], "repo": "jens-maus/RaspberryMatic", "url": "https://github.com/jens-maus/RaspberryMatic/issues/973", "license": "apache-2.0", "license_type": "permissive", "license_source": "bigquery" }
1044453531
How to configure indentation for a case like this

Hello! I'm using version 3.3.0 with Android Studio 2020.3.1 patch 3. The Kotlin linter complains about this code:

Unexpected indentation (16) (should be 12)

What is the proper way to configure indentation for an example like this?

ktlint supports a few different options for configuring indents; kotlinter-gradle will honor all the values you might place in .editorconfig: https://github.com/pinterest/ktlint#editorconfig

It's hard to tell from your screenshot what the context of those string literals is. Is it in line with the Android style guide? Should it be a triple-quote """ literal?

Thanks for the great lib. Will keep investigating. Even https://ktlint-demo.herokuapp.com/ behaves the same with the same code snippet.
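For reference, a minimal .editorconfig sketch placed at the project root. The property names below are the ones documented for ktlint versions of that era; newer ktlint releases replaced `disabled_rules` with per-rule `ktlint_standard_<rule>` toggles, so check the linked ktlint README for the version kotlinter bundles:

```
root = true

[*.{kt,kts}]
indent_size = 4
# Last resort if the indent rule still disagrees with your layout:
disabled_rules = indent
```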
gharchive/issue
2021-11-04T08:04:26
2025-04-01T06:39:12.156524
{ "authors": [ "jeremymailen", "oradkovsky" ], "repo": "jeremymailen/kotlinter-gradle", "url": "https://github.com/jeremymailen/kotlinter-gradle/issues/225", "license": "apache-2.0", "license_type": "permissive", "license_source": "bigquery" }
17875727
Is there any way to check if the browser locales are in the application locales list?

I'm using express, so I'm getting the user's locales with `req.headers['accept-language']`.

Now I have a string of locales: `zh,zh-CN;q=0.8,zh-TW;q=0.6,en;q=0.4,en-US;q=0.2`

My accepted locales are:

```javascript
I18n.expressBind(app, {
  locales: ['en', 'zh-CN']
});
```

Is there any way I can find the client's most preferred locale that is also on my list?

Closed (the comment fixes the answer)
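Since the resolving answer isn't quoted here, a hedged sketch of the matching logic the question calls for. This is not i18n-node-2's API — `pickLocale` is a hypothetical helper, and real Accept-Language handling has more corner cases (wildcards, extended subtags) than this naive q-value sort covers:

```javascript
// Pick the client's most-preferred locale that the app also supports.
function pickLocale(acceptLanguage, supported) {
  // Parse "tag;q=0.8" pairs; a missing q-value means q=1.0 per the header spec.
  const ranked = acceptLanguage
    .split(',')
    .map(part => {
      const [tag, q] = part.trim().split(';q=');
      return { tag, q: q ? parseFloat(q) : 1.0 };
    })
    .sort((a, b) => b.q - a.q);

  for (const { tag } of ranked) {
    // Exact match first (case-insensitive)...
    const exact = supported.find(s => s.toLowerCase() === tag.toLowerCase());
    if (exact) return exact;
    // ...then fall back to matching the primary subtag ("zh" -> "zh-CN").
    const primary = tag.split('-')[0].toLowerCase();
    const partial = supported.find(s => s.toLowerCase().split('-')[0] === primary);
    if (partial) return partial;
  }
  return supported[0]; // app's default locale
}

const header = 'zh,zh-CN;q=0.8,zh-TW;q=0.6,en;q=0.4,en-US;q=0.2';
console.log(pickLocale(header, ['en', 'zh-CN'])); // 'zh-CN'
```

For the header above, `zh` ranks highest (implicit q=1.0) and resolves to the supported `zh-CN` via the primary-subtag fallback.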
gharchive/issue
2013-08-09T17:34:45
2025-04-01T06:39:12.158128
{ "authors": [ "TheSisb", "gjuchault" ], "repo": "jeresig/i18n-node-2", "url": "https://github.com/jeresig/i18n-node-2/issues/24", "license": "mit", "license_type": "permissive", "license_source": "bigquery" }
1393610349
Reset outgoing stream when inbound is reset

Added state to Stream so that the outbound side is reset only when the state is open.

Relates to #187

Hi @jerry-tao, what do you think about this fix? (This branch is based on your fork, jerry-tao/sctp:master.)

The layer that should react to the incoming OutgoingResetRequest is the Stream layer (not the Association layer). Also, if the stream has already sent its own OutgoingResetRequest, it shouldn't send another one; that means we need to introduce a state in the Stream class (s.state) and send an OutgoingResetRequest only when the state is "open", I believe. That change is what kept the vnet test case from failing.

As RFC 8831 implies, the teardown sequence of a stream starts with outgoing (closing), then incoming (closed). This branch also addresses that by not raising an error (s.readErr), because the stream will eventually be reset by the remote.

I am hoping to merge this into your branch, then we can finalize it in your pull request.

@enobufs Sorry for the delay, just came back from vacation.
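The state gating described above can be sketched as a tiny state machine. The names below are illustrative assumptions for this discussion, not pion/sctp's actual types or API:

```go
// Sketch: send our outgoing reset at most once, gated on stream state.
package main

import (
	"fmt"
	"sync"
)

type streamState int

const (
	stateOpen    streamState = iota
	stateClosing             // we sent our OutgoingResetRequest, awaiting the peer's
	stateClosed              // both directions have been reset
)

type stream struct {
	mu    sync.Mutex
	state streamState
}

// onIncomingReset handles the peer's OutgoingResetRequest for this stream.
// It reports whether we should answer with our own outgoing reset.
func (s *stream) onIncomingReset() bool {
	s.mu.Lock()
	defer s.mu.Unlock()
	switch s.state {
	case stateOpen:
		s.state = stateClosing // reply with our reset, exactly once
		return true
	case stateClosing:
		s.state = stateClosed // ours was already in flight; teardown completes
		return false
	default:
		return false
	}
}

func main() {
	s := &stream{}
	fmt.Println(s.onIncomingReset()) // true  -> reply with outgoing reset
	fmt.Println(s.onIncomingReset()) // false -> already sent, now closed
}
```

This mirrors the RFC 8831 teardown order mentioned above: open → closing (our reset sent) → closed (peer's reset observed), with no duplicate resets and no read error raised in between.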
gharchive/pull-request
2022-10-02T01:45:01
2025-04-01T06:39:12.168432
{ "authors": [ "enobufs", "jerry-tao" ], "repo": "jerry-tao/sctp", "url": "https://github.com/jerry-tao/sctp/pull/1", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
209074710
Remove shadowed declarations and undefined identifiers, and specify argument types where required.

Thanks, I've updated this patch based on your suggestions.
gharchive/pull-request
2017-02-21T08:36:04
2025-04-01T06:39:12.171475
{ "authors": [ "robertsipka" ], "repo": "jerryscript-project/jerryscript", "url": "https://github.com/jerryscript-project/jerryscript/pull/1601", "license": "apache-2.0", "license_type": "permissive", "license_source": "bigquery" }
777327753
Master fork

PLEASE REMOVE THIS TEMPLATE BEFORE SUBMITTING

Before submitting a PR, please, make sure that:

- Changes are in a separate branch, not in master.
- The branch contains only one commit on top of master (if not, squash them into one commit).
- The commit has a descriptive commit message with a concise title (first line).
- The commit message contains fixes #XXXX or closes #XXXX to auto-close the issue(s) that the PR fixes (if any).
- Tests for the changes have been added (for bug fixes / features).
- Documentation has been added / updated (if applicable).
- All new and existing tests passed locally (if not, fix them first and amend the commit).

IMPORTANT: Please review the CONTRIBUTING.md file for detailed contributing guidelines.

PLEASE REMOVE THIS TEMPLATE BEFORE SUBMITTING

@lygstate Please, explain what you are doing. You cannot open a PR by mistake. You cannot open PRs by mistake TWICE.

It's a mistake, and also on GitHub's side: I was trying to open a PR against my own jerryscript repo to trigger the CI, but the target defaulted to jerryscript, not to my own fork, and this is the result.

@lygstate Now that #4395 has landed, it should be easier to trigger the CI. Any branch you push to your fork will trigger GH Actions on your fork, even without a PR.

@akosthekiss thanks.
gharchive/pull-request
2021-01-01T18:07:28
2025-04-01T06:39:12.178024
{ "authors": [ "akosthekiss", "lygstate" ], "repo": "jerryscript-project/jerryscript", "url": "https://github.com/jerryscript-project/jerryscript/pull/4394", "license": "apache-2.0", "license_type": "permissive", "license_source": "bigquery" }