id stringlengths 4-10 | text stringlengths 4-2.14M | source stringclasses 2 values | created timestamp[s]date 2001-05-16 21:05:09 to 2025-01-01 03:38:30 | added stringdate 2025-04-01 04:05:38 to 2025-04-01 07:14:06 | metadata dict |
---|---|---|---|---|---|
1386421563 | Rohan Gaikwad Review Ticket #6
My Fastpages
Score: 2.7+/3 - based on crossover grade
Score: /3 - based on review
[x] Fibo Java
[x] JS data
[x] Project
Score 2.7/3 for good completion of individual work and team work
| gharchive/issue | 2022-09-26T16:57:43 | 2025-04-01T06:37:30.269169 | {
"authors": [
"RohanG326",
"nicm2"
],
"repo": "RohanG326/rohangfastpages",
"url": "https://github.com/RohanG326/rohangfastpages/issues/7",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
1423842156 | SMA100A, SMA100B - FileBrowser does not work
The issue is related to the stricter file system access policy; it will be resolved in 1.2.0
Fixed in 1.3.0
| gharchive/issue | 2022-10-26T11:07:00 | 2025-04-01T06:37:30.270411 | {
"authors": [
"Miloslav-RS"
],
"repo": "Rohde-Schwarz/RcIcPluginBugTracking",
"url": "https://github.com/Rohde-Schwarz/RcIcPluginBugTracking/issues/1",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1092126907 | Improvements to the [The Fantasy Trip] character sheet
This is a pretty big set of improvements to the sheet. These include:
Nicer looking Roll Templates for attacking and spells, including optional flavor text.
An Initiative button to add you to the Turn Order.
Tracking of unspent and spent experience points and IQ points, to aid in talent and attribute selection.
An array of attack buttons to allow single-click attacks at various to-hit amounts. You select your spell or weapon, then click the button you want. There are tooltips for common situations such as rear and flank attacks.
Tooltips on most captions and values.
A repeating section just for equipment, including current count and weight per item.
Description buttons to describe items/talents in the chat text.
Gear buttons to pop open repeating sections to edit detailed descriptions.
Weight tracking. Every item has a weight and a checkbox for whether it is currently carried. This is used to compute total weight and the Movement penalties, which are automatically applied.
Player data is preserved. I kept all the old names of attributes.
Layout is also nicer looking, using a grid layout and some formatting.
Roll20 Requests
Comments are very helpful for reviewing the code changes. Please answer the relevant questions below in your comment.
[ ] Does the pull request title have the sheet name(s)? Include each sheet name.
[ ] Is this a bug fix?
[ ] Does this add functional enhancements (new features or extending existing features)?
[ ] Does this add or change functional aesthetics (such as layout or color scheme)?
[ ] If changing or removing attributes, what steps have you taken, if any, to preserve player data?
[ ] If this is a new sheet, did you follow the Building Character Sheets standards?
If you do not know English, please leave a comment in your native language.
Character Sheet Info Roll20 Internal Use only.
Hello @nmbradley, thanks for the merge. Hey, I just checked in-game and I'm not seeing any of my changes when I select "The Fantasy Trip" as my character sheet. How do I see the changes (I mean, without cutting and pasting into custom)? Also, I wasn't expecting the pull to happen so quickly, and I actually added some more features today. Should I just make a new pull request?
Hi @jaemzfleming
I'll investigate why these updates are not showing up on the service and get back to you.
Yes, please create a new pull request for any further changes.
One further point: if you're not ready for a PR to be merged yet, you should be able to indicate that it's a draft.
Let me know if you have any problems doing this!
Thanks for looking into that. And I'll do the draft thing, that sounds smart.
Oh, @nmbradley it looks like the sheet.json had a trailing comma after the "instructions" field. This was failing validation. Maybe that's why it's not showing up? I've fixed it in my new pull request, which I've marked as draft. Maybe I'll un-mark it as draft then since it has that fix and do a new pull request next week for any other straggling features.
Hi @cjstoddard and @clevett, not sure if this is the right way to alert you, but I've made some improvements to The Fantasy Trip character sheet, they've been merged and you can grab them to see if you'd like. I'm new to GitHub, not sure exactly where/how discussions should take place.
I am glad to see others working on improvements to this character sheet. Honestly, what I did was pretty crude and my intention was just to get something onto Roll20, hoping other people with actual skills would improve on it.
Mission accomplished then! It provided a great launching point.
| gharchive/pull-request | 2022-01-02T23:40:11 | 2025-04-01T06:37:30.283254 | {
"authors": [
"cjstoddard",
"jaemzfleming",
"nmbradley",
"roll20deploy"
],
"repo": "Roll20/roll20-character-sheets",
"url": "https://github.com/Roll20/roll20-character-sheets/pull/9974",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2291253858 | Pass payload to the rescue handler
In the @next branch, passing the payload to the rescue handler would be helpful for updating database entities with a failure state.
Released as 3.0.0-0 👍🏻
There are no breaking changes, but 2.0.0 will be used for an AdonisJS 5 update.
| gharchive/issue | 2024-05-12T09:35:49 | 2025-04-01T06:37:30.287200 | {
"authors": [
"RomainLanz",
"jarle"
],
"repo": "RomainLanz/adonis-bull-queue",
"url": "https://github.com/RomainLanz/adonis-bull-queue/issues/37",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
441014881 | Include allcontributors.org to acknowledge the work of everyone
Reviewing translations is valuable work. I'd like to be able to acknowledge and enhance this type of contribution.
https://allcontributors.org/
https://github.com/all-contributors/all-contributors
Hi! AllContributors is now available as a bot :)
https://allcontributors.org/docs/en/bot/installation
That said, it appears to only add entries to a single file, so you might struggle with adding acknowledgements to multiple language files.
| gharchive/issue | 2019-05-07T03:57:31 | 2025-04-01T06:37:30.294469 | {
"authors": [
"JonTheNiceGuy",
"RomuloOliveira"
],
"repo": "RomuloOliveira/commit-messages-guide",
"url": "https://github.com/RomuloOliveira/commit-messages-guide/issues/39",
"license": "CC-BY-4.0",
"license_type": "permissive",
"license_source": "github-api"
} |
990151937 | Support Dictionary<string,string> sections
Hi Ronen,
I have a legacy project where one section is used as a Dictionary<string,string> of filters, where keys and values can be anything.
This solution solves this for me, but I am unsure if it would break anything else. I mean, the code will only execute if the instance passed is a Dictionary<string,string>.
I am open to suggestions to improve this, so it can be upstreamed and added to NuGet, and I don't have to keep maintaining my own fork.
Thanks for your work, saved me the hassle of doing the same on my own!
Best regards,
Marlon Beijer
Hello @MBeijer,
Thank you for the PR, I appreciate it!
It's a weird coincidence: just today I worked with SINI and wanted to add a sections getter, so your timing is good.
I decided to take a slightly different approach for your requirement, please see this new method:
https://github.com/RonenNess/sini/commit/7a2b0655657583c03da49e0cbaff6a7acc326059#diff-c661505f1c432d12a9a2a77ad3ce364d883cdaf110d6cf3c860845d49eded9b6R163
Which can be combined with this getter, if you want all keys from all sections:
https://github.com/RonenNess/sini/commit/7a2b0655657583c03da49e0cbaff6a7acc326059#diff-c661505f1c432d12a9a2a77ad3ce364d883cdaf110d6cf3c860845d49eded9b6R163
I think this way it's a cleaner API, but of course it's a matter of personal preference :)
The new version was uploaded to Nuget and will probably be online soon.
@RonenNess In my case it was needed from ToObject as I am casting to an object that I use for both legacy .ini and new .json style config files.
I can implement this in ToObject using your new methods and send you a new PR, using this approach instead. :)
I initially tried to use CustomParsers and add a parser to the default configuration, but that fails because var asStr = ini.GetStr(section, key, null); returns null (of course) because it's a section and not a property-line.
Or am I missing something that makes this work already in ToObject? From your changes I didn't see anything that would do this
Maybe I didn't understand your use case.
Are you parsing an object containing a Dictionary<string, string> or are you trying to convert a whole file / section to Dictionary<string, string>?
@RonenNess
I have a Settings class that has a public Dictionary<string,string> Filter {get; set;}
So I am loading the settings.ini with IniFile.ToObject<Settings>("settings.ini");
public class Settings
{
    public static async Task<Settings> CreateConfig(string config)
    {
        Settings settings = null;
        if (config.EndsWith(".ini"))
        {
            settings = IniFile.ToObject<Settings>(config, IniFile.ParseObjectFlags.AllowMissingFields | IniFile.ParseObjectFlags.AllowAdditionalKeys | IniFile.ParseObjectFlags.SnakecaseKeysAndSections);
            await Save(settings, config);
        }
        else if (config.EndsWith(".json"))
            settings = await JsonFile.ToObject<Settings>(config);
        if (settings != null)
            settings.ConfigFile = config;
        return settings;
    }

    private static async Task Save(Settings settings, string configFile)
    {
        string path = Path.GetDirectoryName(configFile);
        await File.WriteAllTextAsync($"{path}{Path.DirectorySeparatorChar}configuration.json", settings?.ToJson(true));
    }

    private string ConfigFile { get; set; }
    public GeneralModel General { get; set; }
    public PogDesignModel Pogdesign { get; set; }
    public RssModel Rss { get; set; }
    public Dictionary<string,string> Filter { get; set; }
}
settings.ini:
[pogdesign]
username =
password =
[general]
loglevel = Info
storage = /home/user/Downloads
[filter]
key1 = value1
key2 = value2
key3 = value3
[rss]
username =
password =
passkey =
cookie_uid =
cookie_pass =
cookie_validation =
The filter section is entirely dynamic, which is why I need Dictionary<string,string>
Ah, I understand now: the whole section 'filters' is translated to a member that is a Dictionary<string,string>. Makes sense to have that working.
It can now be better implemented with the new GetKeys() method. In the original PR what bothered me the most was the use of 'GetAllUnreadKeys', which can now be avoided.
I think I'll do it sometime tomorrow, that's a good requirement.
Thanks! :)
@RonenNess I tagged you in a comment on a commit I made, where I did a simple solution (but probably not a complete one): https://github.com/RonenNess/sini/commit/257a3b7fde322e7192ae6e3c9945073177fe14ba
Your solution is good; I just added a test and fixed AsDictionary() to update accessed keys, so it won't throw an exception on unused keys.
https://github.com/RonenNess/sini/commit/9ccf9758b2ed07f476bfa1463b26a954385ed455
NuGet package should update shortly :)
@RonenNess Nice! Thank you! :)
| gharchive/pull-request | 2021-09-07T16:23:34 | 2025-04-01T06:37:30.304787 | {
"authors": [
"MBeijer",
"RonenNess"
],
"repo": "RonenNess/sini",
"url": "https://github.com/RonenNess/sini/pull/1",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
566550626 | iOS Support
This should be easy to compile on iOS
Resolved by #10
| gharchive/issue | 2020-02-17T22:55:50 | 2025-04-01T06:37:30.424056 | {
"authors": [
"BenLeggiero"
],
"repo": "RougeWare/Swift-MultiplicativeArithmetic",
"url": "https://github.com/RougeWare/Swift-MultiplicativeArithmetic/issues/9",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
413261008 | [Family] Scunion eSports Family
Consult https://github.com/RoyaleAPI/cr-api-ux/wiki/Family-Config and fill in the following information.
name: Scunion eSports Family
key: scunion
color: black
emblem: Clover_02
info:
logo: https://drive.google.com/file/d/1JGSzp-eeErvvpIVuxv4GZgOwX9iqJSRX/view?usp=sharing
description: >
*scunion [skuhn-yuhn]: A term used in the late 1960s during the Vietnam War to signal the inflicting of distress, injury, or destruction // as in, "bring the scunion!" **Scunion eSports** is an eSports team competing in the International region of Clash Royale League. Ever since it began on September 11, 2017, our clans and members have been growing steadily. Our clan comprises members from many different places, different backgrounds, and different ages. We all share our love for the game, donate when possible, help in the Clan Wars when available, and help each other out. _Join us, and let's bring the scunion!_
social:
- facebook: https://www.facebook.com/scunion.esports.14
clans:
- name: Scunion eSports
tag: 82CVUPU2
- name: Scunion Elites
tag: 9Y2PUUYV9
I made a mistake, wrote the wrong facebook link, updated:
name: Scunion eSports Family
key: scunion
color: black
emblem: Clover_02
info:
logo: /static/img/brands/logo/scunion-esports.png
description: >
*scunion [skuhn-yuhn]: A term used in the late 1960s during the Vietnam War to signal the inflicting of distress, injury, or destruction // as in, "bring the scunion!" **Scunion eSports** is an eSports team competing in the International region of Clash Royale League. Ever since it began on September 11, 2017, our clans and members have been growing steadily. Our clan comprises members from many different places, different backgrounds, and different ages. We all share our love for the game, donate when possible, help in the Clan Wars when available, and help each other out. _Join us, and let's bring the scunion!_
social:
- facebook: https://www.facebook.com/groups/100478373784989/
clans:
- name: Scunion eSports
tag: 82CVUPU2
- name: Scunion Elites
tag: Y2PUUYV9
No worries! Thanks for your time and all you do! 👍 @smlbiobot
| gharchive/issue | 2019-01-29T20:19:32 | 2025-04-01T06:37:30.434360 | {
"authors": [
"doongmiin"
],
"repo": "RoyaleAPI/family-config",
"url": "https://github.com/RoyaleAPI/family-config/issues/2",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1779634205 | 🛑 Website (rubenfixit.com) is down
In 23f80a1, Website (rubenfixit.com) (https://rubenfixit.com) was down:
HTTP code: 503
Response time: 310 ms
Resolved: Website (rubenfixit.com) is back up in bd921a2.
| gharchive/issue | 2023-06-28T19:44:25 | 2025-04-01T06:37:30.439972 | {
"authors": [
"RubenFixit"
],
"repo": "RubenFixit/upptime",
"url": "https://github.com/RubenFixit/upptime/issues/209",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1893462121 | Softmax Classifier & partial training
Hi,
In the documentation it is stated that partial training can be used to reduce memory consumption.
I tried to train a Softmax classifier with several datasets using the partial() method.
But only the labels from the initial train() call are known. If new labels are present in the dataset given to the partial() method, they are not taken into account.
Can the Dataset object retain the set of all labels after the Labeled::fold() method?
Regards.
Yes, the first training set defines all the possible labels for the model. If you want to fold your dataset such that each fold has samples that correspond to all possible classes in the master dataset, then you can use the stratifiedFold() method.
$folds = $dataset->stratifiedFold(5);
https://docs.rubixml.com/2.0/datasets/labeled.html#stratification
| gharchive/issue | 2023-09-12T23:15:41 | 2025-04-01T06:37:30.443873 | {
"authors": [
"ElGigi",
"andrewdalpino"
],
"repo": "RubixML/ML",
"url": "https://github.com/RubixML/ML/issues/307",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2129326643 | Initial Interface
Initial Interface creation and METAR data request by ICAO Code
Interface
| gharchive/pull-request | 2024-02-12T02:33:33 | 2025-04-01T06:37:30.465510 | {
"authors": [
"Ruisth"
],
"repo": "Ruisth/METAR_Decoder",
"url": "https://github.com/Ruisth/METAR_Decoder/pull/1",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1142939685 | rustpython VM storing defined functions across interpreter calls.
Hi, I'm using the rustpython VM and I'm trying to define some functions in one call and then at a later time I'd like to call those functions an arbitrary number of times. The trouble I'm running into is saving the state across those calls to the interpreter, and I end up trying to call functions that are not defined.
use rustpython_vm::compile::Mode;
use rustpython_vm::Interpreter;
use rustpython_vm::import::import_file;

fn main() {
    Interpreter::default().enter(|vm| {
        let imp = import_file(vm, "mylib", "mylib".to_owned(), MAIN.to_owned()).unwrap();
        let scope = vm.new_scope_with_builtins();
        let code_obj = vm
            .compile(
                r#"import * from mylib
init()"#,
                Mode::Exec,
                "<embedded>".to_owned(),
            )
            .unwrap();
        vm.run_code_obj(code_obj, scope).unwrap();
    });
    println!("Hello, rust!");
}

const MAIN: &str = r#"print("PYTHON MODULE INIT")
def init():
    print("Python says init")
"#;
This will print PYTHON MODULE INIT as it should when MAIN is compiled. However it will panic when I try to import the library or if I try to run init without importing the module.
How do I save the state (or scope) of the compiled MAIN module and pass it to the interpreter in later calls?
I'm not exactly sure what you mean by passing it to the interpreter, but I'd assume you're later calling Interpreter::default().enter() multiple times? The solution for that is to just save the interpreter instead of recreating it every time: let interp = Interpreter::default(); for _ in _ { interp.enter(|vm| { .. }); }
Alternatively, you could just move your whole program into the enter() call.
Well, in this case you never have the from mylib import * bit, so there's no init() in scope. If you want to preserve that, you should save the scope as well, and then it'll all happen in the same scope.
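A minimal sketch of that suggestion, keeping both the Interpreter and the Scope alive between calls so earlier definitions stay visible. It reuses the API calls shown in the question; exact signatures (and whether Scope is cloneable) may differ between rustpython_vm versions, so treat this as an illustration rather than the definitive fix:

```rust
use rustpython_vm::compile::Mode;
use rustpython_vm::Interpreter;

fn main() {
    // Create the interpreter once and keep it around.
    let interp = Interpreter::default();

    // Create the scope once as well; it is what actually stores `init`.
    let scope = interp.enter(|vm| vm.new_scope_with_builtins());

    // First call: define a function in the shared scope.
    interp.enter(|vm| {
        let code = vm
            .compile(
                "def init():\n    print('Python says init')",
                Mode::Exec,
                "<embedded>".to_owned(),
            )
            .unwrap();
        vm.run_code_obj(code, scope.clone()).unwrap();
    });

    // Later call: `init` is still defined because the same scope is reused.
    interp.enter(|vm| {
        let code = vm
            .compile("init()", Mode::Exec, "<embedded>".to_owned())
            .unwrap();
        vm.run_code_obj(code, scope.clone()).unwrap();
    });
}
```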
Thank you, that clears up a lot of confusion, sorry for the inconvenience :)
| gharchive/issue | 2022-02-18T12:06:28 | 2025-04-01T06:37:30.615921 | {
"authors": [
"DurkelTheDonkey",
"coolreader18"
],
"repo": "RustPython/RustPython",
"url": "https://github.com/RustPython/RustPython/issues/3556",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1517035363 | Implement integer format validation
Integer Format validation is not implemented.
>>>>> f"{4096:,o}"
'1,0000' # Should raise `ValueError`
>>>>> f"{123456:_n}"
'123_456' # Should raise `ValueError`
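For reference, CPython rejects these grouping options for non-decimal integer presentation types (',' is only valid with 'd', and '_' is not valid with 'n'). Below is a rough, self-contained sketch of that rule, purely as an illustration; it is not the actual RustPython implementation and the helper name is invented:

```rust
// Hypothetical helper, not RustPython code: reject grouping separators that
// CPython disallows for a given integer presentation type.
fn validate_int_format(format_type: Option<char>, grouping: Option<char>) -> Result<(), String> {
    match (grouping, format_type) {
        // ',' is only allowed with 'd' (or no explicit type) for integers.
        (Some(','), Some(t)) if t != 'd' => Err(format!("Cannot specify ',' with '{}'.", t)),
        // '_' is allowed with 'd', 'b', 'o', 'x', 'X' but not with 'n'.
        (Some('_'), Some(t)) if !"dboxX".contains(t) => Err(format!("Cannot specify '_' with '{}'.", t)),
        _ => Ok(()),
    }
}

fn main() {
    // f"{4096:,o}" should fail.
    assert!(validate_int_format(Some('o'), Some(',')).is_err());
    // f"{123456:_n}" should fail.
    assert!(validate_int_format(Some('n'), Some('_')).is_err());
    // f"{123456:,d}" is fine.
    assert!(validate_int_format(Some('d'), Some(',')).is_ok());
}
```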
I've implemented format validation.
With this fix, the following tests now pass:
test_format_spec_errors in test_types.py
test_int__format__ in test_types.py
Thank you!
| gharchive/pull-request | 2023-01-03T07:46:02 | 2025-04-01T06:37:30.618404 | {
"authors": [
"youknowone",
"yt2b"
],
"repo": "RustPython/RustPython",
"url": "https://github.com/RustPython/RustPython/pull/4412",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
413655802 | [IDEA] Use "extractor" pattern for native functions
First off, this is far from being complete, but I wanted to share it early to see if this is a direction we actually want to go in before continuing.
The idea is to apply the "extractor pattern" used by Rust web frameworks such as actix to native/rust python functions, so that the function signature determines things such as type checking, conversions to and from python objects, etc.
For example, a function that adds two ints would currently look like this:
fn add(vm: &mut VirtualMachine, args: PyFuncArgs) -> PyResult {
    arg_check!(
        vm,
        args,
        required = [(zelf, Some(vm.ctx.int_type())), (other, Some(vm.ctx.int_type()))]
    );
    Ok(vm.ctx.new_int(get_value(zelf) + get_value(other)))
}
This isn't bad, but it's a lot for such a simple function, and for more complex functions the boilerplate overwhelms everything else.
The new way would look like this:
fn add(a: PyInt, b: PyInt) -> PyInt {
    a + b
}
Additionally, the old way still works for when you need it, because PyFuncArgs impls FromPyFuncArgs, PyResult impls IntoPyObject, etc.
Some other neat consequences:
A function could take a VarArgs<T> parameter that shifts the rest of the positional args into a Vec.
FromPyObject impls for primitives could raise overflow exceptions so they wouldn't have to be scattered around everywhere as they are now
For functions that take a lot of keyword args, one can imagine a derive(FromFuncArgs) proc-macro.
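To make the mechanism concrete, here is a generic, self-contained sketch of the extractor pattern described above. The types here (FuncArgs, FromArgs, wrap2) are invented stand-ins for illustration only and are not RustPython's actual PyFuncArgs/FromPyFuncArgs/IntoPyObject machinery:

```rust
// A stand-in for the raw argument bundle a native function receives.
struct FuncArgs {
    args: Vec<i64>, // in the real thing these would be Python object refs
}

// Each parameter type knows how to extract itself from the raw args.
trait FromArgs: Sized {
    fn from_args(args: &mut std::vec::IntoIter<i64>) -> Result<Self, String>;
}

impl FromArgs for i64 {
    fn from_args(args: &mut std::vec::IntoIter<i64>) -> Result<Self, String> {
        args.next().ok_or_else(|| "missing argument".to_owned())
    }
}

// Wrap a typed two-argument function into the uniform "native" signature.
fn wrap2<A, B, R>(f: fn(A, B) -> R) -> impl Fn(FuncArgs) -> Result<R, String>
where
    A: FromArgs,
    B: FromArgs,
{
    move |func_args: FuncArgs| {
        let mut it = func_args.args.into_iter();
        let a = A::from_args(&mut it)?;
        let b = B::from_args(&mut it)?;
        Ok(f(a, b))
    }
}

// The function author only writes the typed signature; extraction is implicit.
fn add(a: i64, b: i64) -> i64 {
    a + b
}

fn main() {
    let native = wrap2(add);
    assert_eq!(native(FuncArgs { args: vec![2, 3] }).unwrap(), 5);
}
```

In the real system the extraction step would also perform the Python type checks and conversions that the PR description mentions, rather than just popping plain integers.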
Codecov Report
Merging #525 into master will decrease coverage by 1.58%.
The diff coverage is 40%.
@@ Coverage Diff @@
## master #525 +/- ##
==========================================
- Coverage 50.08% 48.49% -1.59%
==========================================
Files 68 68
Lines 13924 14271 +347
Branches 3465 3590 +125
==========================================
- Hits 6974 6921 -53
- Misses 5096 5494 +398
- Partials 1854 1856 +2
Impacted Files | Coverage Δ | |
---|---|---|
vm/src/obj/objint.rs | 47% <0%> (-25.53%) | :arrow_down: |
vm/src/obj/objbool.rs | 40.74% <33.33%> (-22.42%) | :arrow_down: |
vm/src/obj/objrange.rs | 35.13% <40%> (-23.69%) | :arrow_down: |
vm/src/pyobject.rs | 69.68% <53.65%> (-15.1%) | :arrow_down: |
vm/src/obj/objstr.rs | 59.29% <0%> (ø) | :arrow_up: |
vm/src/stdlib/pystruct.rs | 24.5% <0%> (ø) | :arrow_up: |
... and 6 more | | |
Continue to review full report at Codecov.
Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 2d19486...c442dee. Read the comment docs.
This sounds good to me; I'll have a better look later on. I think it is okay if we have multiple options for doing the same thing. We now have the arg_check macro for type checking and manual checking; this would be a nice third option. I would like to add this option, but also leave the existing way available.
Could you work out your PyInt example in the code?
I would like to add this option, but also leave the existing way available.
Completely agree, the existing ways are fully supported, via an identity impl of PyNativeFuncFactory. There will certainly be cases where it's preferred to work with PyFuncArgs directly.
Could you work out your PyInt example in the code?
I decided to leave int_add alone because I haven't worked out the nicest way to model the NotImplemented constant in this system. Instead, I converted str_endswith, so that should provide a good example.
I think this is ready to merge now. The type checking needs more work, but that's fine for now since almost nothing is using this. (Also, I discovered some edge cases not handled by arg_check!, so I'll spend some time fixing these up for both in a future PR.)
| gharchive/pull-request | 2019-02-23T04:34:34 | 2025-04-01T06:37:30.636618 | {
"authors": [
"OddCoincidence",
"codecov-io",
"windelbouwman"
],
"repo": "RustPython/RustPython",
"url": "https://github.com/RustPython/RustPython/pull/525",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
193466516 | RxAnimatedDataSource crash on performBatchUpdates (line 0)
Hello,
I've got a small amount of crashes on Fabric that seem to be triggered sometimes when calling the performBatchUpdates method.
The error displayed is the following:
UITableView internal bug: unable to generate a new section map with old section count: 2 and new section count: 0
I am initialising the View Controller with 2 sections (0 cells until the data is loaded from the server) and never remove them. I only replace them with an updated version in my handler method, which returns a new set of 2 sections.
Any ideas are much appreciated! Thanks.
The issue was related to the table view's data source not being set (RxCocoa.swift data source not set fatal error). Once I fixed that, this crash disappeared.
Hello, I have the same problem and I can't fix it... This crash appears randomly the first time I load a collectionView. Any ideas? I think it's a problem with the sections.
| gharchive/issue | 2016-12-05T10:56:12 | 2025-04-01T06:37:30.645124 | {
"authors": [
"YanisSOTO",
"alexbredy"
],
"repo": "RxSwiftCommunity/RxDataSources",
"url": "https://github.com/RxSwiftCommunity/RxDataSources/issues/80",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
1159595621 | Bug where editing a client modal and the DOB fields are being transversed
This is getting called every time a DOB field gets focus (not a big deal because a client's DOB is rarely edited), but it still shouldn't be calling the API when focus moves about.
API Call: https://domain.com/v1/resident/search
Body:
{
"where": [
["FirstName", "=", "PersonsFirstName"],
["LastName", "=", "PersonsLastName"],
["DOB_YEAR", "=", 1976],
["DOB_MONTH", "=", 10],
["DOB_DAY", "=", "21"],
["Id", "<>", 1038]
],
"withTrashed": true
}
Also one of the fields is null and shouldn't be:
index.js:1 Warning: `value` prop on `input` should not be null. Consider using an empty string to clear the component or `undefined` for uncontrolled components.
at input
at http://localhost:3000/static/js/vendors~main.chunk.js:81498:23
at div
at http://localhost:3000/static/js/vendors~main.chunk.js:80149:23
at div
at http://localhost:3000/static/js/vendors~main.chunk.js:84260:23
at http://localhost:3000/static/js/vendors~main.chunk.js:81805:23
at form
at http://localhost:3000/static/js/vendors~main.chunk.js:81198:23
at div
at http://localhost:3000/static/js/vendors~main.chunk.js:85670:27
at div
at div
at http://localhost:3000/static/js/vendors~main.chunk.js:82735:23
at div
at Transition (http://localhost:3000/static/js/vendors~main.chunk.js:118425:30)
at http://localhost:3000/static/js/vendors~main.chunk.js:80989:24
at DialogTransition
at http://localhost:3000/static/js/vendors~main.chunk.js:116102:24
at http://localhost:3000/static/js/vendors~main.chunk.js:82431:23
at ClientEdit (http://localhost:3000/static/js/main.chunk.js:12019:32)
at ClientHeader (http://localhost:3000/static/js/main.chunk.js:528:87)
at Main (http://localhost:3000/static/js/main.chunk.js:958:81)
at App (http://localhost:3000/static/js/main.chunk.js:363:97)
| gharchive/issue | 2022-03-04T12:29:17 | 2025-04-01T06:37:30.647491 | {
"authors": [
"RyanNerd"
],
"repo": "RyanNerd/rxchart-web",
"url": "https://github.com/RyanNerd/rxchart-web/issues/328",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2254585104 | 🛑 Billing Site is down
In ebd670c, Billing Site (https://boundlesshosting.xyz) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Billing Site is back up in 50448f0 after 10 minutes.
| gharchive/issue | 2024-04-20T14:49:39 | 2025-04-01T06:37:30.650187 | {
"authors": [
"Ryanhindman6654"
],
"repo": "Ryanhindman6654/statsupagebh",
"url": "https://github.com/Ryanhindman6654/statsupagebh/issues/17",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2686889418 | Suggestion for new update
Hi, can you add spell editing? I want to create new summon spells or edit vanilla spells.
> Hi, can you add spell editing? I want to create new summon spells or edit vanilla spells.
Like Magicks? Right now that would require source-code modding, although a patcher tool might be possible in the future.
If you wish, a tool like https://github.com/dnSpy/dnSpy can be used to modify that kind of stuff in the meantime if you have knowledge in C#.
I don't know how to use dnSpy, but thanks for answering my question.
| gharchive/issue | 2024-11-24T02:20:03 | 2025-04-01T06:37:30.660561 | {
"authors": [
"RyleiC",
"canbeyy1"
],
"repo": "RyleiC/MagickaForgeV2",
"url": "https://github.com/RyleiC/MagickaForgeV2/issues/2",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2216299589 | [Bug] No cleared highlights if auto-focus is disabled
Steps to reproduce
Enable auto-paste.
Disable "focus editor after pasting" in the pdf++ settings.
Highlight some text in the PDF and auto-paste it to the annotation file.
Problem now: in the PDF viewer the text is now doubly highlighted, and the doubled highlights only go away with a left mouse click.
Expected behavior
No doubled highlights after auto-paste in the pdf viewer.
Actual behavior
Doubled highlights which don't disappear by themselves.
Screen recordings or screenshots (sandbox vault)
Doubled highlights:
Without doubled highlights:
Obsidian debug info
SYSTEM INFO:
Obsidian version: v1.5.11
Installer version: v1.5.3
Operating system: Windows 10 Pro 10.0.19042
Login status: not logged in
Insider build toggle: off
Live preview: on
Base theme: adapt to system
Community theme: none
Snippets enabled: 0
Restricted mode: off
Plugins installed: 1
Plugins enabled: 1
1: PDF++ v0.39.6
RECOMMENDATIONS:
Community plugins: for bugs, please first try updating all your plugins to latest. If still not fixed, please try to make the issue happen in the Sandbox Vault or disable community plugins.
PDF++ debug info
{
"settings": {
"displayTextFormats": [
{
"name": "Title & page",
"template": "{{file.basename}}, p.{{pageLabel}}"
},
{
"name": "Page only",
"template": "p.{{pageLabel}}"
},
{
"name": "Text",
"template": "{{text}}"
}
],
"defaultDisplayTextFormatIndex": 0,
"syncDisplayTextFormat": true,
"syncDefaultDisplayTextFormat": false,
"copyCommands": [
{
"name": "Quote",
"template": "> ({{linkWithDisplay}})\n> {{selection}}\n"
},
{
"name": "Link only",
"template": "{{linkWithDisplay}}"
},
{
"name": "Embed",
"template": "!{{link}}"
},
{
"name": "Callout",
"template": "> [!{{calloutType}}|{{color}}] {{linkWithDisplay}}\n> {{text}}\n"
},
{
"name": "Quote in callout",
"template": "> [!{{calloutType}}|{{color}}] {{linkWithDisplay}}\n> > {{text}}\n> \n> "
}
],
"useAnotherCopyTemplateWhenNoSelection": false,
"copyTemplateWhenNoSelection": "{{linkToPageWithDisplay}}",
"trimSelectionEmbed": false,
"embedMargin": 50,
"noSidebarInEmbed": true,
"noSpreadModeInEmbed": true,
"embedUnscrollable": false,
"singleTabForSinglePDF": true,
"highlightExistingTab": false,
"existingTabHighlightOpacity": 0.5,
"existingTabHighlightDuration": 0.75,
"paneTypeForFirstPDFLeaf": "left",
"openLinkNextToExistingPDFTab": true,
"openPDFWithDefaultApp": false,
"openPDFWithDefaultAppAndObsidian": true,
"focusObsidianAfterOpenPDFWithDefaultApp": true,
"syncWithDefaultApp": false,
"dontActivateAfterOpenPDF": true,
"dontActivateAfterOpenMD": true,
"highlightDuration": 0.75,
"noTextHighlightsInEmbed": false,
"noAnnotationHighlightsInEmbed": true,
"persistentTextHighlightsInEmbed": true,
"persistentAnnotationHighlightsInEmbed": false,
"highlightBacklinks": true,
"selectionBacklinkVisualizeStyle": "highlight",
"dblclickEmbedToOpenLink": true,
"highlightBacklinksPane": true,
"highlightOnHoverBacklinkPane": true,
"backlinkHoverColor": "",
"colors": {
"Yellow": "#ffd000",
"Red": "#ea5252",
"Note": "#086ddd",
"Important": "#bb61e5",
"gdfgdfg": "#d12323"
},
"defaultColor": "gdfgdfg",
"defaultColorPaletteItemIndex": 0,
"syncColorPaletteItem": true,
"syncDefaultColorPaletteItem": false,
"colorPaletteInToolbar": true,
"noColorButtonInColorPalette": false,
"colorPaletteInEmbedToolbar": false,
"showStatusInToolbar": true,
"highlightColorSpecifiedOnly": false,
"doubleClickHighlightToOpenBacklink": true,
"hoverHighlightAction": "preview",
"paneTypeForFirstMDLeaf": "right",
"singleMDLeafInSidebar": true,
"alwaysUseSidebar": true,
"ignoreExistingMarkdownTabIn": [],
"defaultColorPaletteActionIndex": 4,
"syncColorPaletteAction": true,
"syncDefaultColorPaletteAction": false,
"proxyMDProperty": "PDF",
"hoverPDFLinkToOpen": false,
"ignoreHeightParamInPopoverPreview": true,
"filterBacklinksByPageDefault": true,
"showBacklinkToPage": true,
"enableHoverPDFInternalLink": true,
"recordPDFInternalLinkHistory": true,
"alwaysRecordHistory": true,
"renderMarkdownInStickyNote": true,
"enablePDFEdit": false,
"author": "",
"writeHighlightToFileOpacity": 0.2,
"defaultWriteFileToggle": false,
"syncWriteFileToggle": true,
"syncDefaultWriteFileToggle": false,
"enableAnnotationDeletion": true,
"warnEveryAnnotationDelete": false,
"warnBacklinkedAnnotationDelete": true,
"enableAnnotationContentEdit": true,
"enableEditEncryptedPDF": false,
"pdfLinkColor": "#04a802",
"pdfLinkBorder": false,
"replaceContextMenu": true,
"showContextMenuOnMouseUpIf": "Mod",
"contextMenuConfig": [
{
"id": "action",
"visible": true
},
{
"id": "selection",
"visible": true
},
{
"id": "write-file",
"visible": true
},
{
"id": "annotation",
"visible": true
},
{
"id": "modify-annotation",
"visible": true
},
{
"id": "link",
"visible": true
},
{
"id": "text",
"visible": true
},
{
"id": "search",
"visible": true
},
{
"id": "page",
"visible": true
},
{
"id": "settings",
"visible": true
}
],
"selectionProductMenuConfig": [
"color",
"copy-format",
"display"
],
"writeFileProductMenuConfig": [
"color",
"copy-format",
"display"
],
"annotationProductMenuConfig": [
"copy-format",
"display"
],
"updateColorPaletteStateFromContextMenu": true,
"executeBuiltinCommandForOutline": true,
"executeBuiltinCommandForZoom": true,
"executeFontSizeAdjusterCommand": true,
"closeSidebarWithShowCommandIfExist": true,
"autoHidePDFSidebar": false,
"outlineDrag": true,
"outlineContextMenu": true,
"outlineLinkDisplayTextFormat": "{{file.basename}}, {{text}}",
"outlineLinkCopyFormat": "{{linkWithDisplay}}",
"recordHistoryOnOutlineClick": true,
"popoverPreviewOnOutlineHover": true,
"thumbnailDrag": true,
"thumbnailContextMenu": true,
"thumbnailLinkDisplayTextFormat": "{{file.basename}}, page {{pageLabel}}",
"thumbnailLinkCopyFormat": "{{linkWithDisplay}}",
"recordHistoryOnThumbnailClick": true,
"popoverPreviewOnThumbnailHover": true,
"annotationPopupDrag": true,
"useCallout": true,
"calloutType": "PDF",
"calloutIcon": "highlighter",
"highlightBacklinksInEmbed": false,
"highlightBacklinksInHoverPopover": false,
"highlightBacklinksInCanvas": true,
"clickPDFInternalLinkWithModifierKey": true,
"clickOutlineItemWithModifierKey": true,
"clickThumbnailWithModifierKey": true,
"focusEditorAfterAutoPaste": false,
"respectCursorPositionWhenAutoPaste": true,
"autoCopy": false,
"autoFocus": false,
"autoPaste": true,
"autoFocusTarget": "last-active-and-open-then-last-paste",
"autoPasteTarget": "last-active-and-open-then-last-paste",
"openAutoFocusTargetIfNotOpened": true,
"howToOpenAutoFocusTargetIfNotOpened": "right",
"closeHoverEditorWhenLostFocus": true,
"closeSidebarWhenLostFocus": true,
"openAutoFocusTargetInEditingView": true,
"executeCommandWhenTargetNotIdentified": true,
"commandToExecuteWhenTargetNotIdentified": "switcher:open",
"autoPasteTargetDialogTimeoutSec": 20,
"autoCopyToggleRibbonIcon": true,
"autoCopyIconName": "highlighter",
"autoFocusToggleRibbonIcon": true,
"autoFocusIconName": "zap",
"autoPasteToggleRibbonIcon": true,
"autoPasteIconName": "clipboard-paste",
"viewSyncFollowPageNumber": true,
"viewSyncPageDebounceInterval": 0.3,
"openAfterExtractPages": true,
"howToOpenExtractedPDF": "tab",
"warnEveryPageDelete": false,
"warnBacklinkedPageDelete": true,
"extractPageInPlace": false,
"askExtractPageInPlace": true,
"pageLabelUpdateWhenInsertPage": "keep",
"pageLabelUpdateWhenDeletePage": "keep",
"pageLabelUpdateWhenExtractPage": "keep",
"askPageLabelUpdateWhenInsertPage": true,
"askPageLabelUpdateWhenDeletePage": true,
"askPageLabelUpdateWhenExtractPage": true,
"copyOutlineAsListFormat": "{{linkWithDisplay}}",
"copyOutlineAsListDisplayTextFormat": "{{text}}",
"copyOutlineAsHeadingsFormat": "{{text}}\n\n{{linkWithDisplay}}",
"copyOutlineAsHeadingsDisplayTextFormat": "p.{{pageLabel}}",
"copyOutlineAsHeadingsMinLevel": 2,
"newFileNameFormat": "",
"newFileTemplatePath": "",
"newPDFLocation": "current",
"newPDFFolderPath": "",
"rectEmbedStaticImage": false,
"rectImageFormat": "file",
"rectImageExtension": "webp",
"zoomToFitRect": false,
"rectEmbedResolution": 100,
"includeColorWhenCopyingRectLink": true,
"backlinkIconSize": 50,
"showBacklinkIconForSelection": false,
"showBacklinkIconForAnnotation": false,
"showBacklinkIconForOffset": true,
"showBacklinkIconForRect": false,
"showBoundingRectForBacklinkedAnnot": false,
"hideReplyAnnotation": false,
"searchLinkHighlightAll": "true",
"searchLinkCaseSensitive": "true",
"searchLinkMatchDiacritics": "default",
"searchLinkEntireWord": "false",
"dontFitWidthWhenOpenPDFLink": true,
"preserveCurrentLeftOffsetWhenOpenPDFLink": false,
"defaultZoomValue": "page-width",
"scrollModeOnLoad": 0,
"spreadModeOnLoad": 0,
"hoverableDropdownMenuInToolbar": true,
"zoomLevelInputBoxInToolbar": true,
"popoverPreviewOnExternalLinkHover": true,
"actionOnCitationHover": "pdf-plus-bib-popover",
"anystylePath": "",
"enableBibInEmbed": false,
"enableBibInHoverPopover": false,
"enableBibInCanvas": true
},
"styleSettings": null,
"styleSheet": ".pdf-plus-backlink-highlight-layer .pdf-plus-backlink:not(.hovered-highlight)[data-highlight-color=\"yellow\"],\n.pdf-embed[data-highlight-color=\"yellow\"] .textLayer .mod-focused {\n --pdf-plus-color: #ffd000;\n --pdf-plus-backlink-icon-color: #ffd000;\n --pdf-plus-rect-color: #ffd000;\n}\n.pdf-plus-backlink-highlight-layer .pdf-plus-backlink:not(.hovered-highlight)[data-highlight-color=\"red\"],\n.pdf-embed[data-highlight-color=\"red\"] .textLayer .mod-focused {\n --pdf-plus-color: #ea5252;\n --pdf-plus-backlink-icon-color: #ea5252;\n --pdf-plus-rect-color: #ea5252;\n}\n.pdf-plus-backlink-highlight-layer .pdf-plus-backlink:not(.hovered-highlight)[data-highlight-color=\"note\"],\n.pdf-embed[data-highlight-color=\"note\"] .textLayer .mod-focused {\n --pdf-plus-color: #086ddd;\n --pdf-plus-backlink-icon-color: #086ddd;\n --pdf-plus-rect-color: #086ddd;\n}\n.pdf-plus-backlink-highlight-layer .pdf-plus-backlink:not(.hovered-highlight)[data-highlight-color=\"important\"],\n.pdf-embed[data-highlight-color=\"important\"] .textLayer .mod-focused {\n --pdf-plus-color: #bb61e5;\n --pdf-plus-backlink-icon-color: #bb61e5;\n --pdf-plus-rect-color: #bb61e5;\n}\n.pdf-plus-backlink-highlight-layer .pdf-plus-backlink:not(.hovered-highlight)[data-highlight-color=\"gdfgdfg\"],\n.pdf-embed[data-highlight-color=\"gdfgdfg\"] .textLayer .mod-focused {\n --pdf-plus-color: #d12323;\n --pdf-plus-backlink-icon-color: #d12323;\n --pdf-plus-rect-color: #d12323;\n}\n.pdf-plus-backlink-highlight-layer .pdf-plus-backlink:not(.hovered-highlight) {\n --pdf-plus-color: #d12323;\n --pdf-plus-backlink-icon-color: #d12323;\n --pdf-plus-rect-color: #d12323;\n}\n.pdf-plus-backlink-highlight-layer .pdf-plus-backlink.hovered-highlight {\n\t--pdf-plus-color: green;\n\t--pdf-plus-backlink-icon-color: green;\n --pdf-plus-rect-color: green;\n}\n.pdf-plus-color-palette-item[data-highlight-color=\"yellow\"] > .pdf-plus-color-palette-item-inner {\n background-color: #ffd000;\n}\n.pdf-plus-color-palette-item[data-highlight-color=\"red\"] > .pdf-plus-color-palette-item-inner {\n background-color: #ea5252;\n}\n.pdf-plus-color-palette-item[data-highlight-color=\"note\"] > .pdf-plus-color-palette-item-inner {\n background-color: #086ddd;\n}\n.pdf-plus-color-palette-item[data-highlight-color=\"important\"] > .pdf-plus-color-palette-item-inner {\n background-color: #bb61e5;\n}\n.pdf-plus-color-palette-item[data-highlight-color=\"gdfgdfg\"] > .pdf-plus-color-palette-item-inner {\n background-color: #d12323;\n}\n.pdf-plus-color-palette-item:not([data-highlight-color]) > .pdf-plus-color-palette-item-inner {\n background-color: transparent;\n}\n.workspace-leaf.pdf-plus-link-opened.is-highlighted::before {\n\topacity: 0.5;\n}\nbody {\n --pdf-plus-yellow-rgb: 255, 208, 0\n}\nbody {\n --pdf-plus-red-rgb: 234, 82, 82\n}\nbody {\n --pdf-plus-note-rgb: 8, 109, 221\n}\nbody {\n --pdf-plus-important-rgb: 187, 97, 229\n}\nbody {\n --pdf-plus-gdfgdfg-rgb: 209, 35, 35\n}\nbody {\n --pdf-plus-default-color-rgb: var(--pdf-plus-gdfgdfg-rgb)\n}\n.callout[data-callout=\"pdf\"][data-callout-metadata=\"yellow\"] {\n\t--callout-color: var(--pdf-plus-yellow-rgb);\n background-color: rgba(var(--callout-color), var(--pdf-plus-highlight-opacity, 0.2))\n}\n.callout[data-callout=\"pdf\"][data-callout-metadata=\"red\"] {\n\t--callout-color: var(--pdf-plus-red-rgb);\n background-color: rgba(var(--callout-color), var(--pdf-plus-highlight-opacity, 0.2))\n}\n.callout[data-callout=\"pdf\"][data-callout-metadata=\"note\"] {\n\t--callout-color: 
var(--pdf-plus-note-rgb);\n background-color: rgba(var(--callout-color), var(--pdf-plus-highlight-opacity, 0.2))\n}\n.callout[data-callout=\"pdf\"][data-callout-metadata=\"important\"] {\n\t--callout-color: var(--pdf-plus-important-rgb);\n background-color: rgba(var(--callout-color), var(--pdf-plus-highlight-opacity, 0.2))\n}\n.callout[data-callout=\"pdf\"][data-callout-metadata=\"gdfgdfg\"] {\n\t--callout-color: var(--pdf-plus-gdfgdfg-rgb);\n background-color: rgba(var(--callout-color), var(--pdf-plus-highlight-opacity, 0.2))\n}\n.callout[data-callout=\"pdf\"] {\n\t--callout-color: var(--pdf-plus-default-color-rgb);\n background-color: rgba(var(--callout-color), var(--pdf-plus-highlight-opacity, 0.2))\n}\n.callout[data-callout=\"pdf\"] {\n --callout-icon: lucide-highlighter;\n}"
}
Error messages
No response
Out of the two highlights, one is the backlink highlight rendered by PDF++, and the other is just a text selection. The latter disappears when clicking on the viewer because it's a normal text selection. Does it make sense or am I misunderstanding something?
Yes, it makes sense, thanks.
But there is another option in the pdf++ settings which clears the text selection after clicking the backlink.
Maybe something similar could be introduced here.
I don't know how, but maybe simulate another mouse click after auto-pasting, so that this text selection goes away automatically?
So it's not a bug and it's a feature request, right? I guess nothing unexpected is happening here.
Adding an option to clear text selection after auto-pasting (or pasting in general) is doable.
Yeah, I am sorry for falsely labeling this thread as a bug.
I would appreciate such an option as you described.
Because if the text selection is not cleared automatically, you can't read the highlighted text so easily afterwards.
Done, thanks for the idea! https://github.com/RyotaUshio/obsidian-pdf-plus/releases/tag/0.39.8
Excellent, thanks.
| gharchive/issue | 2024-03-30T07:08:52 | 2025-04-01T06:37:30.680068 | {
"authors": [
"N3C2L",
"RyotaUshio"
],
"repo": "RyotaUshio/obsidian-pdf-plus",
"url": "https://github.com/RyotaUshio/obsidian-pdf-plus/issues/134",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2520268806 | Sprite Overhaul
This took me like 95% of the project and is the sole reason I wasn't able to finish the other part 😄😄😄
(I hope this doesn't break the game again-)
| gharchive/pull-request | 2024-09-11T17:03:03 | 2025-04-01T06:37:30.713252 | {
"authors": [
"S-J-K-F",
"jfeatherston09"
],
"repo": "S-J-K-F/Platformer-Big-Boy",
"url": "https://github.com/S-J-K-F/Platformer-Big-Boy/pull/9",
"license": "Unlicense",
"license_type": "permissive",
"license_source": "github-api"
} |
1410092662 | ABAP BTP Branch pulled with error "No authorization to change authorization field &/DMO/CNTRY&." in Free Tier
Hello SAP ABAP Reference Scenario Team,
I'm the admin user of my SAP BTP Free Tier ABAP Environment. I've just tried to pull the updates from the https://github.com/SAP-samples/abap-platform-refscen-flight/tree/BTP-ABAP branch. But I get this error message:
/DMO/CNTRY | AUTH | E | No authorization to change authorization field &/DMO/CNTRY&.
For reference also this screenshot:
I've also activated the Authorization Trace for my user but I can't find any failed authorization checks there.
Best Regards
Gregor
Hello Gregor,
the error can be ignored. See: https://github.com/SAP-samples/abap-platform-refscen-flight/tree/BTP-ABAP#download point nr. 8.
Best Regards,
Volker
Too bad that the GitHub search doesn't find the README when searching for /DMO/CNTRY. But it now finds this issue.
| gharchive/issue | 2022-10-15T08:23:46 | 2025-04-01T06:37:30.734716 | {
"authors": [
"VolkerDrees",
"gregorwolf"
],
"repo": "SAP-samples/abap-platform-refscen-flight",
"url": "https://github.com/SAP-samples/abap-platform-refscen-flight/issues/9",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
421509252 | DOC: Commit messages don't need scope anymore
Update the commit guidelines to indicate that scope is no longer necessary in the commit messages.
See https://sap-cx.slack.com/archives/GAC8PULNN/p1552634597090800?thread_ts=1552583300.089600&cid=GAC8PULNN for more info.
Fixed in the wiki version of this file, in one of the many commits from GH-1725.
| gharchive/issue | 2019-03-15T13:05:08 | 2025-04-01T06:37:30.736216 | {
"authors": [
"gladius-mtl"
],
"repo": "SAP/cloud-commerce-spartacus-storefront",
"url": "https://github.com/SAP/cloud-commerce-spartacus-storefront/issues/1691",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
432755567 | Part 2/2: Move component styles to storefrontstyles lib
Based on the example set by Tobias on GH-1907 we'll now move the following components as well:
[x] Search Results - Category Listing / product-view component
[x] Store Finder / schedule component
[x] Store Finder / store-finder-grid component
[x] Store Finder / store-finder-header component
[x] Store Finder / store-finder-list-item component
[x] Store Finder / store-finder-map component
[x] Store Finder / store-finder-search component
[x] Store Finder / store-finder-list component
[x] Store Finder / store-finder-search-result component
[x] Store Finder / store-finder-store-description component
[x] Store Finder / store-finder-stores-count component
[x] pwa/components/add-to-home-screen-banner/add-to-home-screen-banner.component.scss
[ ] ui/layout/header/header.component.scss
[x] ui/components/form-components/item-counter/item-counter.component.scss
[x] ui/components/form-components/star-rating/star-rating.component.scss
[x] ui/components/spinner/spinner.component.scss
the header was already done.
| gharchive/issue | 2019-04-12T21:40:13 | 2025-04-01T06:37:30.740732 | {
"authors": [
"developpeurweb",
"mikelMHybris"
],
"repo": "SAP/cloud-commerce-spartacus-storefront",
"url": "https://github.com/SAP/cloud-commerce-spartacus-storefront/issues/2013",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
451368726 | es2015 build not working
Angular 8 allows for differential loading - production build outputs 2 bundles:
for browsers supporting only es5
for browsers supporting es2015
You can enable it by setting target: "es2015" in the tsconfig of the storefront app.
Unfortunately, when running the es2015 version, only the top loader and a blank page are displayed:
There seems to be a problem with ngrx + angular + es2015. Here is the PR of angular-cli that claims to fix it:
https://github.com/angular/angular-cli/pull/14585
The PR claims to fix all those issues:
https://github.com/ngrx/platform/issues/1888 (effects not dispatched with build --prod)
https://github.com/ng-packagr/ng-packagr/issues/1307 (ES2015 problem since Angular 8)
https://github.com/angular/angular-cli/issues/14613#issuecomment-498960867 (App that has lazy loading modules is not rendering any pages in prod mode after update to angular 8.)
| gharchive/issue | 2019-06-03T09:01:31 | 2025-04-01T06:37:30.744690 | {
"authors": [
"Platonn"
],
"repo": "SAP/cloud-commerce-spartacus-storefront",
"url": "https://github.com/SAP/cloud-commerce-spartacus-storefront/issues/2821",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
468223101 | [Master] Fix e2e regression tests
The following e2e regression tests are failing and need to be fixed:
billing-address
my-account/payment-methods
reset-password
register
https://jkmaster.test.c3po.b2c.ydev.hybris.com/job/spartacus-regression-tests/234/console
Ticket for billing address: https://github.com/SAP/cloud-commerce-spartacus-storefront/issues/3687
Billing Address: Bug fix ticket created for it #3687
Payment-Methods: I'm not able to fail this locally
Reset-Password: Requires dev17 to be updated
Register: I'm not able to fail this locally
Going to hold this ticket as I'll try to fail these tests. If I can't fail them then I'll bring it up in stand-up. A solution may be to add waits() as that is the likely reason for these tests failing on Jenkins.
Closing because neither Payment-Methods nor Register has failed on Jenkins.
Tests are still failing in the Jenkins job that we just ran. If different tests are failing, it is OK to change this ticket description and create follow-up tickets, but please do not close this one until the regression tests are stable.
New tests to fix:
regression/my-account/update-profile.e2e-spec.ts
regression/site-context/language/language-registration-page.e2e-spec.ts
regression/site-context/language/language-cart-page.e2e-spec.ts
mobile/register-mobile.e2e-spec.ts
All e2e tests failing on Jenkins have either been fixed or already have tickets created to fix them.
This ticket will remain open and I will continue to monitor Jenkins.
Re-ran Jenkins.
Added new tests that are failing that have not previously been mentioned.
mobile/product-search-store-flow-mobile.e2e-spec.ts - Fixed (https://github.com/SAP/cloud-commerce-spartacus-storefront/issues/3705)
New failure(s) on Jenkins:
language-my-account-pages.e2e-spec.ts
New failure(s) on Jenkins:
mobile/register-mobile.e2e-spec.ts
New failure(s) on Jenkins:
regression/my-account/payment-methods.e2e-spec.ts
regression/register.e2e-spec.ts
mobile/cart-mobile.e2e-spec.ts
New failure(s) on Jenkins:
New locale failure(s):
regression/my-account/order-history-orders-flow.e2e-spec.ts - Fixed (https://github.com/SAP/cloud-commerce-spartacus-storefront/issues/3807)
New failure(s) on Jenkins:
regression/my-account/payment-methods.e2e-spec.ts - Created (https://github.com/SAP/cloud-commerce-spartacus-storefront/issues/3970)
| gharchive/issue | 2019-07-15T16:30:19 | 2025-04-01T06:37:30.753813 | {
"authors": [
"RadhepS",
"hackergil"
],
"repo": "SAP/cloud-commerce-spartacus-storefront",
"url": "https://github.com/SAP/cloud-commerce-spartacus-storefront/issues/3674",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
2052024913 | Fix compatibility v4 / v5 with CAP
Currently CAP has to make the following code change in their productive code (non-test) for migrating to SDK v5:
- httpDestination = new DefaultDestination(destination.getProperties()).asHttp();
+ httpDestination = DefaultHttpDestination.fromMap(destination.getProperties()).build();
Implication:
There's a chance for users of SDK v5 + CAP v2 to experience a runtime exception about constructor visibility.
We could solve it by making the constructor public again.
Uses in current CAP here:
https://github.wdf.sap.corp/cds-java/cds-services/blob/main/cds-feature-remote-odata/src/main/java/com/sap/cds/services/impl/odata/RemoteODataClient.java#L88
https://github.wdf.sap.corp/cds-java/cds-services/blob/main/cds-feature-remote-hcql/src/main/java/com/sap/cds/services/impl/hcql/RemoteHcqlClient.java#L95
We're planning to solve it differently.
Closed in favor of this
https://github.wdf.sap.corp/cds-java/cds-services/pull/3723
| gharchive/pull-request | 2023-12-21T09:16:31 | 2025-04-01T06:37:30.757299 | {
"authors": [
"newtork"
],
"repo": "SAP/cloud-sdk-java",
"url": "https://github.com/SAP/cloud-sdk-java/pull/201",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
2063728766 | feat: move cds annotation converter module
Move cds annotation converter module to open-ux-tools repository
Obsolete: created a new PR: https://github.com/SAP/open-ux-tools/pull/1609
| gharchive/pull-request | 2024-01-03T10:34:10 | 2025-04-01T06:37:30.811852 | {
"authors": [
"nahmed22"
],
"repo": "SAP/open-ux-tools",
"url": "https://github.com/SAP/open-ux-tools/pull/1573",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
2200526325 | Allow adding tiles pointing to remotely available apps in the preview-middleware
Implements: #1763
I tested it on a FE project and it worked nicely. @tobiasqueck Is it also supposed to work on ADP projects?
Yes. I haven't tested it explicitly, but the logic is independent of whether it is an app project or an ADP.
| gharchive/pull-request | 2024-03-21T15:39:59 | 2025-04-01T06:37:30.813042 | {
"authors": [
"tobiasqueck"
],
"repo": "SAP/open-ux-tools",
"url": "https://github.com/SAP/open-ux-tools/pull/1764",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
216898733 | sap.m.Table: Hidden elements shift adjacent cells when "minScreenWidth" is applied to all Columns
OpenUI5 version: Since 1.44.5 (affects also current nightly 1.47)
Browser/version (+device/version): All
URL (minimal example if possible): https://embed.plnkr.co/OepBn5/
Steps to reproduce the problem:
Open the Plunker example above
Shrink the preview panel horizontally till columns are shown as pop-in
What is the expected result?
Until 1.44.4
What happens instead?
Since 1.44.5
Any other information? (attach screenshot if possible)
This effect goes away as soon as minScreenWidth of one of the columns is not applied (e.g. change one minScreenWidth value to "Phone" or remove it).
Hi @boghyon, if you configure all the columns to be shown as a popin, then we cannot have a physical table anymore. There must always be at least one visible real column which is not configured for popin.
Such a feature has been requested several times, but if we allow this, we know people will start using the table like a form. That's why this limitation will stay. Please see this to make it work:
https://plnkr.co/edit/Dy30RGa3cY00kW7vi6Ii?p=preview
Hey @aborjinik , thank you for the answer. I couldn't find any similar issue about this on GitHub. Sorry about that. Could we at least get a "warning" message placed in the API reference of sap.m.Column that it's not recommended to apply demandPopin to all Columns? Or that at least one column should have demandPopin disabled.
For other readers: since UI5 1.72, at least one column is always kept in the header:
Related commit: https://github.com/SAP/openui5/commit/307b09bc71d3cc2cd6f3fc3f83ffa8c824aa6ee1
| gharchive/issue | 2017-03-24T20:38:27 | 2025-04-01T06:37:30.820203 | {
"authors": [
"aborjinik",
"boghyon"
],
"repo": "SAP/openui5",
"url": "https://github.com/SAP/openui5/issues/1396",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
} |
502033903 | Automatic commit: Move cp-portal-cloud-foundry-getting-started from QA to Production
You can commit multiple tutorials all at once :-)
| gharchive/pull-request | 2019-10-03T12:25:34 | 2025-04-01T06:37:30.875670 | {
"authors": [
"LindsayBert",
"thecodester"
],
"repo": "SAPDocuments/Tutorials",
"url": "https://github.com/SAPDocuments/Tutorials/pull/4013",
"license": "CC-BY-4.0",
"license_type": "permissive",
"license_source": "github-api"
} |
67180413 | Make examples more visual to show the program works.
Either a graph for convergence, or an image, or similar. Make the examples more demonstrative.
An output VTK is created: 4ded8863c6eb06f13616e288ac7539c79ceaca09
| gharchive/issue | 2015-04-08T17:35:54 | 2025-04-01T06:37:30.898575 | {
"authors": [
"brigb123"
],
"repo": "SCIInstitute/SCI-Solver_FEM",
"url": "https://github.com/SCIInstitute/SCI-Solver_FEM/issues/11",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
278681125 | Error when using calculated fields
When I added a calculated field based on [intfield] + 10, I tested it by editing some rows and the field was calculated fine.
However, when I tried to create a new record, the green confirmation did not show, no error was shown on screen, and the link contained: /def/errormessage
Checked the admin logs and I had this:
AbsoluteURL:/Default.aspx
DefaultDataProvider:DotNetNuke.Data.SqlDataProvider, DotNetNuke
ExceptionGUID:b4bad84e-6b90-4255-84eb-e6694121bf63
AssemblyVersion:9.1.1
PortalId:0
UserId:1
TabId:219
RawUrl:/...
Referrer:https://...
UserAgent:Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/62.0.3202.94 Safari/537.36
ExceptionHash:QKZuiSMY28G3MbdBenUr2XM7FWU=
Message:The Controls collection cannot be modified because the control contains code blocks (i.e. <% ... %>).
StackTrace:
InnerMessage:The Controls collection cannot be modified because the control contains code blocks (i.e. <% ... %>).
InnerStackTrace:
at System.Web.UI.ControlCollection.Add(Control child) at DotNetNuke.Services.Exceptions.Exceptions.ProcessModuleLoadException(String FriendlyMessage, Control ctrl, Exception exc, Boolean DisplayErrorMessage)
Source:
FileName:
FileLineNumber:0
FileColumnNumber:0
Method:
Server Name: ...
ModuleId:864
ModuleDefId:134
FriendlyName:Form and List
ModuleControlSource:DesktopModules/UserDefinedTable/Default.ascx
AbsoluteURL:/Default.aspx
DefaultDataProvider:DotNetNuke.Data.SqlDataProvider, DotNetNuke
ExceptionGUID:96179f9a-4a6a-47ac-b035-1b2d1af5d045
AssemblyVersion:9.1.1
PortalId:0
UserId:1
TabId:219
RawUrl:/...
Referrer:https://...
UserAgent:Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/62.0.3202.94 Safari/537.36
ExceptionHash:EQDfGxtjEmUKwh4x8o9tEIVTw1U=
Message:Object reference not set to an instance of an object.
StackTrace:
InnerMessage:Object reference not set to an instance of an object.
InnerStackTrace:
at DotNetNuke.Modules.UserDefinedTable.DataTypes.EditString.get_Value() at DotNetNuke.Modules.UserDefinedTable.EditForm.cmdUpdate_Click(Object sender, EventArgs e)
Source:
FileName:
FileLineNumber:0
FileColumnNumber:0
Method:
Server Name: ...
Deleted the calculated field, and it worked fine again.
Any chance that the number was bigger than 999? Then it is the same issue as #6
Way bigger.
Not sure how it is related, but when using the int field only, it does not give any error. Only when I add a calculated field based on that int.
Should have been fixed in fnl.6.3.3
| gharchive/issue | 2017-12-02T12:40:51 | 2025-04-01T06:37:30.915470 | {
"authors": [
"SCullman",
"eXistenZe"
],
"repo": "SCullman/DNN.FormAndList",
"url": "https://github.com/SCullman/DNN.FormAndList/issues/7",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
642191997 | Add additional golden file tests (7)
In GitLab by @mweston on Apr 28, 2020, 18:46
The following experiments have been processed and should be put under golden file testing.
https://gitlab.sd2e.org/sd2program/cp-request/raw/master/input/structured_requests/Microbe-LiveDeadClassification.json
https://gitlab.sd2e.org/sd2program/cp-request/raw/master/input/structured_requests/NovelChassis-OR-Circuit-Cycle1-ObstacleCourse.json
https://gitlab.sd2e.org/sd2program/cp-request/raw/master/input/structured_requests/YeastSTATES-CRISPR-Short-Duration-Time-Series-20191213.json
https://gitlab.sd2e.org/sd2program/cp-request/raw/master/input/structured_requests/y4d_cen_pk_inducible_crispr_characterization.json
https://gitlab.sd2e.org/sd2program/cp-request/raw/master/input/structured_requests/y4d_crispr_dose_response.json
https://gitlab.sd2e.org/sd2program/cp-request/raw/master/input/structured_requests/NovelChassis-OR-Circuit-Cycle0-8hour.json
https://gitlab.sd2e.org/sd2program/cp-request/raw/master/input/structured_requests/NovelChassis-OR-Circuit-Cycle0-24hour.json
In GitLab by @mweston on Apr 28, 2020, 18:46
changed the description
In GitLab by @mweston on Apr 28, 2020, 18:46
changed the description
In GitLab by @mweston on May 4, 2020, 13:02
changed the description
In GitLab by @mweston on May 18, 2020, 13:56
@jakebeal @tramyn can this be added to 2.5.1? It should be straightforward as these documents have revision_ids. Given that golden file testing is manual/opt-in, I'd encourage supporting coverage against a larger set of files so that we don't miss anything.
In GitLab by @tramyn on May 18, 2020, 14:06
In GitLab by @tramyn on May 18, 2020, 14:06
@mweston There are no issues with this request. I can go ahead and add them now.
In GitLab by @mweston on May 18, 2020, 14:07
Great, thanks!
In GitLab by @tramyn on May 18, 2020, 14:17
created merge request !114 to address this issue
In GitLab by @tramyn on May 18, 2020, 14:17
mentioned in merge request !114
In GitLab by @tramyn on May 18, 2020, 15:40
closed via merge request !114
In GitLab by @tramyn on May 18, 2020, 15:40
mentioned in commit 21124d71583b1bba922c3262382ba7fd3dd07680
In GitLab by @tramyn on Jun 12, 2020, 18:58
mentioned in commit b657fd402c2007f4d495d5efc01a68b4586762ec
| gharchive/issue | 2020-06-19T20:01:19 | 2025-04-01T06:37:30.922117 | {
"authors": [
"mwes"
],
"repo": "SD2E/experimental-intent-parser",
"url": "https://github.com/SD2E/experimental-intent-parser/issues/194",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
} |
1333515722 | Xcode 14 - SwiftUI warning
If you try to run the sample app on Xcode 14 beta 5 you get the following warnings.
+1
At the same time our logs are spammed with
[SwiftUI] Publishing changes from within view updates is not allowed, this will cause undefined behavior.
Also seeing this - this is a genuine bug within the library because it is triggering a method from within the view body that has side effects, one of which is mutating some published state:
Specifically, WebImage does this:
public var body: some View {
// This solve the case when WebImage created with new URL, but `onAppear` not been called, for example, some transaction indeterminate state, SwiftUI :)
if imageManager.isFirstLoad {
imageManager.load()
}
...snip...
}
I have no context as to why this change was added, but it is fundamentally wrong. Side effects should not be performed from within a SwiftUI view's body - this body method can potentially be called many times (every time the view's state changes). In this case, calling .load() not only triggers a side effect, but it causes the @Published isLoading to be mutated in the middle of the view update cycle.
+1
Also having the same issue; the bug affects my NavigationStack and moves me back to the previous page. This needs a fix ASAP, or at least a workaround for now...
I noticed that the picture flickered a bit in iOS16 beta 5
🥲
I encountered a similar message in my environment.
+1
Any chance this gets fixed before iOS 16's release?
Hi.
The current repo's code needs a refactor to make it work with the latest SwiftUI, and to drop the usage of ObservedObject in favor of StateObject.
If someone has expert coding experience, a PR is welcome.
Currently I'm working on the LLVM/Swift toolchain this year, so I may not have enough time to maintain these changes. But I'll try to get back to this repo once I have some extra time.
+1
Following @lukeredpath's hint, I made some changes that seem to fix things on iOS 16.
https://github.com/SDWebImage/SDWebImageSwiftUI/compare/master...garrettrayj:ios16-undefined-behavior
I haven't tested AppKit and I bet backwards compatibility is messed up, but the changes appear mainly to be removing old workarounds that are no longer needed. Story of my life with SwiftUI, so I'm feeling confident enough to roll with 'em for the sake of getting iOS 16 updates out the door.
If you don't need backwards compatibility another option would be to see if the built in AsyncImage does what you need and have one less third party dependency in your codebase. That's my plan.
@garrettrayj - Great work - Will you make a PR? Then it can be reviewed.
I had the same error with iOS16 RC version.
This warning appears with the latest SwiftUI update but the issue impacts all SwiftUI versions. For instance, I have observed that screens that have a SDWebImage as first element are not correctly rendered in a navigation animation (i.e. when they swipe in from right to left).
Please have a try with v2.1.0
Seems to work perfectly now. Thank you!!
| gharchive/issue | 2022-08-09T16:39:29 | 2025-04-01T06:37:30.953705 | {
"authors": [
"BugMonkey",
"Jeyhey",
"RMehdid",
"Sri2611",
"Teglgaard",
"dreampiggy",
"frlefebvre",
"garrettrayj",
"guidev",
"lukeredpath",
"tichise",
"yoshirozay"
],
"repo": "SDWebImage/SDWebImageSwiftUI",
"url": "https://github.com/SDWebImage/SDWebImageSwiftUI/issues/222",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2265957303 | Replication of third-party data
Provided that a uniform exchange format has been found for #139, the persistence layer should be adapted so that the exchange data created in #139 can be stored.
The following data should be stored:
Activities
Steps (per day)
It should be possible for the user to retrieve this information; an endpoint already exists for activities, and one should be added for steps (for example /activities/steps or /steps). -The user may create their own activities-, steps on the other hand are read-only.
Update:
The following should be stored:
Number of steps, at most one hour old
We will leave out further requirements for now.
In addition, it should be possible to "forcibly" refresh the data, e.g. at the start and at the end of an activity.
Closed since the essential part has been implemented.
| gharchive/issue | 2024-04-26T14:52:58 | 2025-04-01T06:37:30.962914 | {
"authors": [
"benedictweis",
"henrybrink"
],
"repo": "SE-TINF22B2/G5-DuoGradus",
"url": "https://github.com/SE-TINF22B2/G5-DuoGradus/issues/140",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
1227046125 | unexpected side effect in "currentsteps" computed property
MailsOverview.vue contains the following code:
currentsteps() {
if (!this.mailStore.mails.has(this.student.id)) return []
const data = (this.mailStore.mails.get(this.student.id) ?? []).filter(mail => {
return this.statuses.includes(parseInt(mail.info))
})
if (!this.currentStep) {
this.currentStep = this.statuses.find(s => !data.map(d=>parseInt(d.info)).includes(s)) // <-- gives side effect error
}
return data
},
The indicated line gives the following error for me: Unexpected side effect in "currentsteps" computed property.
That's something @Wouter01 added.
| gharchive/issue | 2022-05-05T18:42:54 | 2025-04-01T06:37:30.983421 | {
"authors": [
"Zelzahn",
"lisadejonghe"
],
"repo": "SELab-2/OSOC-5",
"url": "https://github.com/SELab-2/OSOC-5/issues/353",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1672830846 | Names of enumerations
In § Case sensitivity and charset: why should the name of an enumeration start with a lowercase letter? In the UML metamodel an enumeration is a specialisation of the classifier Datatype. Datatype names start with a capital, so why not enumerations?
The logic of this recommendation is to have consistency between the naming style of the enumerations declared in a CV or AP, and other externally maintained, and often reused, vocabularies and authority tables (e.g. at-voc:access-right, at-voc:language).
Does this make sense @GeertThijs ?
We are aware that many CVs and APs currently don't follow this recommendation. This is a style guide, with recommendations. It is up to each group and project to decide whether or not they want to follow any given recommendation in the guide.
We have updated the recommendation, relaxing it on the packages, data types and enumerations.
| gharchive/issue | 2023-04-18T10:33:48 | 2025-04-01T06:37:30.986439 | {
"authors": [
"GeertThijs",
"costezki",
"csnyulas"
],
"repo": "SEMICeu/style-guide",
"url": "https://github.com/SEMICeu/style-guide/issues/61",
"license": "CC-BY-4.0",
"license_type": "permissive",
"license_source": "github-api"
} |
964267260 | Ernest Opoku-Kwarteng
exercise 1.csv
exercise 2.csv
exercise 3 - rankreciprocal.pdf
exercise 3 - ranksummethod.pdf
Well done, Ernest. You clearly grasped the material outlined in Module 4 and successfully applied it to these exercises. Thank you for your participation! Please be sure to fill out the post-Module survey before you begin the next Module.
| gharchive/issue | 2021-08-09T18:41:18 | 2025-04-01T06:37:30.994133 | {
"authors": [
"JuliePeeling",
"ernest19"
],
"repo": "SERVIR-WA/GALUP",
"url": "https://github.com/SERVIR-WA/GALUP/issues/172",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
} |
579172315 | API endpoint usage option from Web Platform
I would like to: Provide an API endpoint usage option from Web Platform
So that I can: Allow users to make a test API call from the Web Platform and visualise it
Done in endpoint-usage-option branch.
| gharchive/issue | 2020-03-11T10:49:41 | 2025-04-01T06:37:30.995695 | {
"authors": [
"andyAndyA"
],
"repo": "SERaaS/SERaaS-Web-Platform",
"url": "https://github.com/SERaaS/SERaaS-Web-Platform/issues/11",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
285200675 | Initial support for SFMLConfig.cmake, for review
This PR is for review and feedback; it is not supposed to be merged yet!
TODO list:
[x] Have config file generated on all platforms
[x] Support debug/release config
[ ] Have no absolute path in generated config files on Linux / Windows / macOS / iOS? / Android?
[ ] Export internal dependencies in case of static build (ie. remove the need for SFML_DEPENDENCIES)
[ ] Support for components ?
[ ] Exhaustive testing!
=============
I've started doing the changes to support config file generation. At the moment this is working at least for me on macOS with frameworks. I expect it to work on other platforms too because it reuses the definitions of current SFML targets. This implies a LOT of changes though so I would like your feedback before going on.
In particular, I have replaced all the listing of dependencies with calls to target_link_libraries() and target_include_directories(), for 3 reasons:
this avoids redoing what CMake already does (target_link_libraries() and target_include_directories() already handle lists)
this makes it easier to group the full setup (include+link) of each dependency
this will allow specifying which dependencies must be public in case of static linking (what SFML_DEPENDENCIES currently does in FindSFML.cmake), same eventually if compile definitions or include dirs must be exported (currently only the root include dir of SFML is exported)
In terms of usage right now (example from sfeMovie) of course without any FindSFML.cmake:
find_package (SFML 2.3 COMPONENTS graphics window system audio REQUIRED)
target_link_libraries(sfeMovie PRIVATE sfml-graphics sfml-window sfml-system sfml-audio)
I haven't yet exposed the fact that sfml-graphics depends on sfml-window and sfml-system. It could be done and would allow writing only
target_link_libraries(sfeMovie PRIVATE sfml-graphics sfml-audio)
Dunno yet if we want that though.
If you want to look at what the generated config looks like for now:
http://yalir.org/files/SFML/
It gets installed in /usr/local/lib/cmake/SFML by default.
References
https://cmake.org/cmake/help/latest/manual/cmake-packages.7.html#creating-packages
https://cmake.org/cmake/help/latest/command/find_package.html
https://cmake.org/cmake/help/latest/command/export.html
https://cmake.org/cmake/help/latest/module/CMakePackageConfigHelpers.html
https://github.com/SFML/SFML/compare/master...SrTobi:support-config-file-packages
Related issues
https://github.com/SFML/SFML/issues/758
https://github.com/SFML/SFML/issues/937
Amazing work, thanks :+1:
I know very little about modern CMake usage, but what I see looks ok. We should now test it under various scenarios (static/dynamic link, on all OSes).
this will allow specifying which dependencies must be public in case of static linking
What exactly does that mean? Will CMake now automatically export the private dependencies in case of static linking, or does it require additional work? In any case, we no longer need SFML_DEPENDENCIES, right?
I didn't expose yet the fact that sfml-graphics depends on sfml-window and sfml-system. Could be done
Should be done, since the dependency does exist.
Didn't dig too deep into the changes, but with the branch as-is, sfml-main won't build, since its include directories aren't set (and therefore won't find SFML/Config.hpp).
You're basically missing this:
# set the include directory
target_include_directories(sfml-main PUBLIC
$<BUILD_INTERFACE:${PROJECT_SOURCE_DIR}/include>
$<INSTALL_INTERFACE:include>)
from sfml-main's CMakeLists.txt.
Also are you sure about the location of the SFMLConfig.cmake and the other files? I thought they'd have to be placed directly in lib or am I confusing it with PkgConfig?
I know very little about modern CMake usage, but what I see looks ok. We should now test it under various scenarios (static/dynamic link, on all OSes).
The most important part is correctly defining which target properties have PRIVATE, PUBLIC or INTERFACE visibility. Are you familiar with it?
Basically PRIVATE properties are used only when building the target.
PUBLIC properties are used when building the target and by targets that depend on your target.
INTERFACE properties are used only by targets that depend on your target.
And in these properties you can define include directories, libraries to link or preprocessor definitions.
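As a rough sketch of how these look in practice (the target and dependency names below are just placeholders, not actual SFML targets):
# PRIVATE: needed to build mylib, but not propagated to its users.
target_link_libraries(mylib PRIVATE some_internal_helper)
# PUBLIC: needed both to build mylib and by anything that links against mylib.
target_include_directories(mylib PUBLIC ${PROJECT_SOURCE_DIR}/include)
# INTERFACE: not needed to build mylib itself, only by its users.
target_compile_definitions(mylib INTERFACE MYLIB_USER)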
What exactly does that mean? Will CMake now automatically export the private dependencies in case of static linking, or does it require additional work? In any case, we no longer need SFML_DEPENDENCIES, right?
It will, at least that's what I want. But it's not done yet with current changes. And yes SFML_DEPENDENCIES won't be needed anymore if I get it to work. Will update the PR when it's ready.
Should be done, since the dependency does exist.
Right, will update when it's done
Didn't dig too deep into the changes, but with the branch as-is, sfml-main won't build, since its include directories aren't set (and therefore won't find SFML/Config.hpp).
Indeed. Didn't do it yet as it's not needed on macOS, because I wanted feedbacks before going further. Will fix it.
Also are you sure about the location of the SFMLConfig.cmake and the other files? I thought they'd have to be placed directly in lib or am I confusing it with PkgConfig?
There are several possible paths, see https://cmake.org/cmake/help/latest/command/find_package.html
"Each entry is meant for installation trees following Windows (W), UNIX (U), or Apple (A) conventions:"
<prefix>/ (W)
<prefix>/(cmake|CMake)/ (W)
<prefix>/<name>*/ (W)
<prefix>/<name>*/(cmake|CMake)/ (W)
<prefix>/(lib/<arch>|lib|share)/cmake/<name>*/ (U)
<prefix>/(lib/<arch>|lib|share)/<name>*/ (U)
<prefix>/(lib/<arch>|lib|share)/<name>*/(cmake|CMake)/ (U)
<prefix>/<name>*/(lib/<arch>|lib|share)/cmake/<name>*/ (W/U)
<prefix>/<name>*/(lib/<arch>|lib|share)/<name>*/ (W/U)
<prefix>/<name>*/(lib/<arch>|lib|share)/<name>*/(cmake|CMake)/ (W/U)
<prefix>/<name>.framework/Resources/ (A)
<prefix>/<name>.framework/Resources/CMake/ (A)
<prefix>/<name>.framework/Versions/*/Resources/ (A)
<prefix>/<name>.framework/Versions/*/Resources/CMake/ (A)
<prefix>/<name>.app/Contents/Resources/ (A)
<prefix>/<name>.app/Contents/Resources/CMake/ (A)
Still need to figure out where it'll be located in the end. It's probably not portable on this part right now.
I initiated the CI builds, checkout the build logs (Config.hpp can't be found).
Which is exactly what I've stated above. ;)
Build is fixed for sfml-main, everything builds fine locally on Windows (VS2017 32bits).
The dependencies between SFML modules are now public so examples like VOIP only need to depend on sfml-network and sfml-audio, no need to specify sfml-system. The same applies to external projects that will use find_package(SFML) and depend on a SFML module.
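Concretely, a link line like the one for the VOIP example reduces to something along these lines (a sketch only; the actual example targets are set up through SFML's helper macros):
# sfml-system is pulled in transitively through the PUBLIC module dependencies
target_link_libraries(VOIP PRIVATE sfml-network sfml-audio)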
I ran the install step with the Debug/Release configuration and did a find_package(SFML) without any FindSFML.cmake and it did work on Windows, with the correct lib depending on selected configuration (Debug/Release).
There doesn't look to be any absolute path in the generated config file on Windows, which is nice for making releases. On the contrary for now the config file generated on macOS has absolute paths (/Library/Frameworks). To be fixed.
Still need to support static builds, most likely the hardest part because of the need to expose internal dependencies.
One important note
I changed a bit how the sfml-main library is created. Instead of custom add_library() and properties, it's created like any other SFML library : with suffix for debug / static, and with pdb next to the generated .lib file. The only customization I kept is that it always remains a static lib, whatever the value of BUILD_SHARED_LIBS.
Nice. I'd say absolute paths aren't that bad for now. Wouldn't they be the same on any machine anyway? No experience with Frameworks, but from my understanding they're supposed to sit in that specific directory?
SFMLBuildMaster: Build this please.
New revision is building fine for me on Windows. Do we even have to keep (and therefore maintain) FindSFML.cmake? Or should it be deprecated/removed altogether? From my understanding we wouldn't need both. Or is there some minimum version requirement higher than using a Find….cmake file?
SFMLBuildMaster: Build this please.
Nice. I'd say absolute paths aren't that bad for now. Wouldn't they be the same on any machine anyway? No experience with Frameworks, but from my understand they're supposed to sit in that specific directory?
It depends on what you want to do. Currently FindSFML.cmake allows finding SFML in standard paths or in custom paths if you set SFML_ROOT. I don't see any reason to restrict the benefits of CMake config files to standard installations only.
Consider for example where you want to make sure that the package you're creating for your program is self-contained. In that case you don't want to have SFML installed in standard paths. You can also want to keep standard paths clean, or don't have admin rights, etc.
New revision is building fine for me on Windows. Do we even have to keep (and therefore maintain) FindSFML.cmake ? Or should it be deprecated/removed altogether? From my understanding we wouldn't need both. Or is there some minimum version requirement higher than using a Find….cmake file?
I'm always for keeping a simple and unique way to do things. But I don't think I'm the one to choose this for SFML :)
As for the minimum requirement.. is SFML's CI using CMake 2.8? If it does then it's ok, because I didn't add any conditional check for some CMake version.
Also I realize that currently existing FindSFML.cmake won't find the new sfml-main lib because of changed suffix. Is it acceptable to introduce such break in SFML 2.5?
Also I realize that currently existing FindSFML.cmake won't find the new sfml-main lib because of changed suffix. Is it acceptable to introduce such break in SFML 2.5?
Do you mean the -s affix? Why not just overwrite it after defining the target? Or won't that work?
Do you mean the -s affix? Why not just overwrite it after defining the target? Or won't that work?
Yes the -s suffix, and yes overwriting does work. Committing, thanks.
@MarioLiebisch Do you know where sfml-graphics links against OpenGL ES libraries on Android? I can't find it on SFML/master and I broke link step of this module on Android: https://ci.sfml-dev.org/builders/android-armeabi-v7a-api13/builds/267/steps/compile/logs/stdio
I saw that you worked on Android support so asking you :)
It used to be here:
https://github.com/SFML/SFML/blob/master/src/SFML/Graphics/CMakeLists.txt#L126
but obviously got lost in the rewrite.
It's still there: https://github.com/Ceylo/SFML/blob/feature/CMakeTargetExport/src/SFML/Graphics/CMakeLists.txt#L124
And I don't think you pointed out the correct line, because in Config.cmake I see that Android uses OpenGL ES.
It seems like it was simply not set. But it previously worked because another signature of target_link_libraries() was used: https://cmake.org/cmake/help/latest/command/target_link_libraries.html?highlight=target_link_libraries#libraries-for-both-a-target-and-its-dependents
Which imported the linkage for OpenGL ES from the sfml-window target definition: https://github.com/SFML/SFML/blob/master/src/SFML/Window/CMakeLists.txt#L270
And it's no longer the case with the explicit PRIVATE linkage (it was implicitly PUBLIC). Hopefully easy to fix :)
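Roughly, the difference looks like this (a sketch from memory; the exact variable names in the CMake files may differ):
# Before: plain signature, the GLES libraries were propagated to dependents of sfml-window
target_link_libraries(sfml-window ${EGL_LIBRARY} ${GLES_LIBRARY})
# After: explicit PRIVATE linkage no longer propagates, so sfml-graphics must link OpenGL ES itself
target_link_libraries(sfml-window PRIVATE ${EGL_LIBRARY} ${GLES_LIBRARY})
target_link_libraries(sfml-graphics PRIVATE ${GLES_LIBRARY})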
But that's outside the SFML_OPENGL_ES branching starting in line 113.
SFMLBuildMaster: Build this please.
Remaining discussed points are done!
When building macOS frameworks, CMAKE_INSTALL_PREFIX should be set by user to /Library/Frameworks for SFML to be directly usable. I've added SFML_DEPENDENCIES_INSTALL_PREFIX and SFML_MISC_INSTALL_PREFIX to support what was discussed. SFML_DEPENDENCIES_INSTALL_PREFIX defaults to /Library/Frameworks on macOS and to CMAKE_INSTALL_PREFIX on other platforms. SFML_MISC_INSTALL_PREFIX defaults to CMAKE_INSTALL_PREFIX except on macOS where it defaults to /usr/local. This allows changing CMAKE_INSTALL_PREFIX to install frameworks in /Library/Frameworks without modifying where "misc" contents goes. So basically by default nothing changes except the install prefix for macOS frameworks.
When creating the SFML package for distribution, you should run the "install" target at least 4 times with the same install prefix. Twice for debug/release x twice for static/dynamic. This will generate the following files:
SFMLConfig.cmake
SFMLConfigDependencies.cmake
SFMLConfigVersion.cmake
SFML${type}Targets.cmake
SFML${type}Targets-${config}.cmake
Where ${type} is "Static" or "Shared", and ${config} is one of CMake build configurations (Release, Debug, MinSizeRel, RelWithDebInfo).
The files that are installed by more than one installation configuration are identical between the different configurations that can write them.
For frameworks it's similar, except that only the release dynamic config needs to be installed to get a complete SFMLConfig.
Either SFMLStaticTargets.cmake or SFMLSharedTargets.cmake is included by SFMLConfig.cmake depending on whether the user has set SFML_STATIC_LIBRARIES. SFML${type}Targets.cmake includes its debug/release child files depending on which ones exist. Note that if a user does a find_package(SFML) with an installation that only contains debug libs, for example, these debug libs are used even when they build their project in release. This can be an issue on Windows, but that's how CMake generates these files.
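On the consumer side the selection boils down to something like this (sketch; "my_app" is a placeholder target):
# Pick the static variant before calling find_package()
set(SFML_STATIC_LIBRARIES TRUE)
find_package(SFML COMPONENTS graphics REQUIRED)
target_link_libraries(my_app PRIVATE sfml-graphics)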
I did test with CMake 3.0.2 and it did work. I checked the installed files, there are slight differences but nothing that prevents using this version. Requirement can't be lower as there are many uses of interface libraries. Will commit the corresponding changes soon.
For dependencies I created interface libraries. This is done by sfml_find_package(). For example sfml_find_package(FLAC INCLUDE "FLAC_INCLUDE_DIR" LINK "FLAC_LIBRARY") creates a target named "FLAC", and when linking against that target, the target that needs it automatically gets the correct compile and link flags. And it looks just like if we'd just built the FLAC library in the same project. The important point is that the created target name in current example is "FLAC". That means that if a user also creates a FLAC target and uses SFMLConfig, there'll be a conflict. I tried to add a namespace to have something like sf::FLAC as target name but namespaces are not allowed for interface libraries. I also considered sfFLAC but it looks weird, requires some changes and in the end I think there's very low probability to have a conflict. So I just left "FLAC".
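The gist of what such a helper ends up doing is roughly the following (simplified sketch, not the actual implementation):
# Wrap an already-found dependency in an INTERFACE target so that linking
# against it pulls in its include directory and library transitively.
add_library(FLAC INTERFACE)
target_include_directories(FLAC INTERFACE "${FLAC_INCLUDE_DIR}")
target_link_libraries(FLAC INTERFACE "${FLAC_LIBRARY}")
# sfml-audio then only needs: target_link_libraries(sfml-audio PRIVATE FLAC)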
The variables previously defined by FindSFML.cmake are not defined by SFMLConfig.cmake except SFML_FOUND and SFML_DOC_DIR.
So I guess now is the time for final review & testing! It would be nice to test it as a user: delete the FindSFML.cmake you may have in your project, generate the SFMLConfig and use it, especially to check that no use case is missing.
SFMLBuildMaster: Build this please.
Awesome! Thanks for the detailed summary.
I won't be able to test this for the next 10 days or so, but maybe someone else with a mac can give some feedback. :)
Another thing, in order to include this with the next SFML RELEASE, we also need to update the tutorials. Could you have a look at them? (But maybe it would be wise just to wait for some additional approval of this PR?)
CMake-wise, we also have #1344. Would you like to have a look? You seem more proficient than I am at solving it. No pressure though :)
Thanks :)
As for the tutorials indeed I'd prefer feedbacks on usage first. Will check the current tutorials though to get an idea of what needs to be changed.
And for the other PR I'll check, but I prefer to finish this one first and focus. Also a lot of CMake code has changed so the changes for #1344 would most likely conflict.
Concerning last failed build, I've fixed the error on Windows. On macOS though it requires changes to the cmake command line used for CI, because of the introduced install prefixes for dependencies and misc files. I don't know how that can be done because I suppose that the CI config does not depend on the branch, right? I also don't know where the CI steps are defined in order to help on that matter. Basically CMAKE_INSTALL_FRAMEWORK_PREFIX is not used anymore, and both SFML_DEPENDENCIES_INSTALL_PREFIX and SFML_MISC_INSTALL_PREFIX should be set to what you want.
@mantognini For the tutorials you were thinking of FindSFML.cmake or something else? I didn't find any mention about it except in forums and changelogs.
@LaurentGomila @MarioLiebisch @eXpl0it3r @mantognini So what do we do? This PR has been ready for testing for 2 weeks now.
I made a topic on SFML's forum one week ago: https://en.sfml-dev.org/forums/index.php?topic=23676.0
Since that time some of you answered on the forum and fixes were done, but I'm not aware of anyone actually trying to use this for his project. So no feedback on real uses at the moment.
The more other PRs are merged in the meantime, the harder it is to make sure I fix all conflicts correctly, because codebases diverge. On that part, @MarioLiebisch could you check my latest merge commit? There were many conflicts related to your changes for Android. I think it's ok but… just in case.
Sorry, I wanted to give it a shot this week but, you know, life... I'll try to get around to test it in the middle of next week.
SFMLBuildMaster: Build this please.
That being said, anyone else is welcome to try this PR! :)
Looks like CMake doesn't have the permission to install the FreeType framework on the CI. Does this need to be adjusted on the CI or in the setup?
I hope to have a look at this soon. Went to a C++ user group meeting this week and learned some stuff on modern CMake. 🙂
For the install issue, this is what was discussed in https://github.com/SFML/SFML/pull/1335#issuecomment-362847863
There are some small changes needed on the CI side.
Thanks for checking the PR soon!
Can you open an issue on https://github.com/SFML/SFML-Buildbot with details so @binary1248 can fix it?
@eXpl0it3r No need to open an issue if I am just able to fix it like that. 😉
@binary1248 You should keep CMAKE_FRAMEWORK_INSTALL_PREFIX in the cmake configure step as long as this PR is not merged, otherwise you break the macOS build on other branches at the moment. For the current branch it'll be an unused cmake arg, but it's temporarily ok.
@Ceylo I like it better like this... more pressure to merge this faster. 😛
So… still blocked with testing for one more week. What do we do?
I can give it a try again on all platforms but that will only make sure it works in my use cases.
What do you think?
SFMLBuildMaster: Build this please.
Our CI install frameworks incorrectly ATM
[...]
-- Installing: /Users/SFML/Desktop/buildbot/osx-clang-el-capitan/install/./SFML.framework
[...]
-- Installing: /Users/SFML/Desktop/buildbot/osx-clang-el-capitan/install/./sfml-system.framework
[...]
They should go into [...]/install/Library/Frameworks.
@binary1248, could you append /Library/Frameworks to -DCMAKE_INSTALL_PREFIX current value? (According to https://github.com/SFML/SFML/pull/1335#issuecomment-362751934.)
@mantognini That doesn't sound right... CMAKE_INSTALL_PREFIX is used as a base directory for everything, not only frameworks. There are already a bunch of things installed into /Library/Frameworks which is specified via SFML_DEPENDENCIES_INSTALL_PREFIX, but I guess some things that should go in there as well aren't sent there by the install script.
It might look a bit weird, but it's only for when building/installing frameworks. See the above comments by @Ceylo. Other misc files are installed according to another variable.
@mantognini Should be fixed in https://ci.sfml-dev.org/#/builders/11/builds/31/steps/20/logs/stdio
Besides the Xcode template script issue, everything seems good on macOS. 👍
Thanks @binary1248, this looks good.
No, a bit more than that. The new script handles resources being installed elsewhere (according to the dependencies install prefix). Your change is consistent with the rest so it's good. I've simply changed a few other lines.
On Sun, 4 Mar 2018, 17:49 Ceylo, notifications@github.com wrote:
@Ceylo commented on this pull request.
In tools/xcode/templates/SFML/SFML App.xctemplate/TemplateInfo.plist.in
https://github.com/SFML/SFML/pull/1335#discussion_r172055425:
@@ -146,7 +146,7 @@ subject to the following restrictions:
If you're using static libraries (which is not recommended) you should remove this script from your project.
SETTINGS
-CMAKE_INSTALL_FRAMEWORK_PREFIX="@CMAKE_INSTALL_FRAMEWORK_PREFIX@"
+CMAKE_INSTALL_FRAMEWORK_PREFIX="@CMAKE_INSTALL_PREFIX@"
I suppose you mean rename the variable CMAKE_INSTALL_FRAMEWORK_PREFIX to something else?
Ah right, I had not seen the link to your gist. Committing your changes.
I had a look at the SFML tutorials for building SFML with CMake, and here are the changes I suggest in the table containing the CMake variable descriptions (bold is new content):
[update] CMAKE_INSTALL_PREFIX: This is the install path. By default, it is set to the installation path that is most typical on the operating system ("/usr/local" for Linux and macOS, "C:\Program Files" for Windows, etc.). When building frameworks on macOS, you may want to change the value to "/Library/Frameworks".
[remove] CMAKE_INSTALL_FRAMEWORK_PREFIX
[add] SFML_DEPENDENCIES_INSTALL_PREFIX: This is the path where SFML's dependencies like Freetype and OpenAL are installed. By default it is the same as CMAKE_INSTALL_PREFIX, except on macOS where it defaults to "/Library/Frameworks", because dependencies on macOS are provided as frameworks. As stated above for CMAKE_INSTALL_PREFIX, it is not mandatory to install SFML after building it, but it is definitely cleaner to do so.
[add] SFML_MISC_INSTALL_PREFIX: This is the path where SFML examples, documentation, license and readme files are installed. On Windows it defaults to CMAKE_INSTALL_PREFIX, and to CMAKE_INSTALL_PREFIX/share/SFML on FreeBSD, Linux and macOS.
Also, in https://www.sfml-dev.org/tutorials/2.4/compile-with-cmake.php I found many usages of the name "Mac OS X", which should be replaced with "macOS". "Mac OS X" hasn't been used as the name of Apple's desktop OS since 2016.
Tried it on Linux (Xubuntu 17.10).
CMake 3.10.2
Works fine for both static and dynamic linking, but I had to do this in my CMakeLists.txt:
cmake_policy(SET CMP0057 NEW)
Otherwise, the script failed:
if given arguments:
"window" "IN_LIST" "SFML_FIND_COMPONENTS"
Unknown arguments specified
Anyways, that's a very handy way of using SFML with CMake, so I'm looking forward to seeing it in SFML's master! It's very useful and makes everything neater.
Thanks a lot for testing and making me notice that it doesn't work with CMake < 3.3 (IN_LIST syntax). Will fix the installed config so that it works with 3.0.2, like the rest of SFML's CMake files.
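For reference, an IN_LIST-free version of such a check that works with older CMake could look like this (sketch):
# IN_LIST requires CMake >= 3.3 (policy CMP0057); list(FIND) works on 3.0.x too
list(FIND SFML_FIND_COMPONENTS "window" window_index)
if(NOT window_index EQUAL -1)
    # the "window" component was requested
endif()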
SFMLBuildMaster: Build this please.
Had a look at SFML tutorials for building SFML with CMake and here are the changes I suggest in the table containing CMake variables descriptions
Thanks for your suggestions, I'll work on a quick PR for the tutorial in order to make this PR mergeable for the next release. :)
PS: my review accounts only for Mac. @eliasdaler can you confirm it works nicely now on Linux too? I leave it to others to double check the behaviour for the other platforms.
It works perfectly well on Linux now. :)
I finally got around to test this on a project of mine. I tried it out on MacOS for dynamic linking and it worked like a charm 👍
I've started SFML/SFML-Website#107 with the changes mentioned above.
Note that now there are some conflicts here... :/
Built with VS 2017 on Windows 10. Built (shared and static libs) without issues, and ran examples without issues!
@dabbertorres Just to make sure, you tested with your own project built against SFML with the find_package(SFML) stuff, right?
@mantognini Can you relaunch build? I fixed conflicts.
@MarioLiebisch @JonnyPtn @mantognini Is it ok for you in terms of code review? How many reviewers do we need to approve the changes?
I haven't been involved in this enough to make a fully informed decision, but I would say merge merge merge!
Is it ok for Android and iOS? Has it been tested with both makefile-based build systems and multi-configuration IDEs?
Haven't had a chance to test Windows yet, but Android projects (so far) shouldn't use CMake anyway. SFML is installed in the NDK's default directory for third party libraries, so you only have to reference them, no lookup required.
On iOS it doesn't find the OpenAL headers, but I'm struggling to establish why. OPENAL_INCLUDE_DIR seems correct and is set to the right folder...
SFMLBuildMaster: Build this please.
Is it ok for you in terms of code review?
I think it is for me. I went through the thing a few times as you added more commits. I can't pretend to understand everything without having to spend more time on it, but it seems to work (with the exception of iOS) and nothing in the code really made me jump. If you guys could find out why it fails for iOS that'd be great; otherwise, as far as I'm concerned, it's ready to be merged.
How many reviewers do we need to approve the changes?
The more the better, as always, but I think we can move forward with this and adapt it if/when we get feedback (probably after official release).
Is it ok for Android and iOS?
Did FindSFML.cmake support Android and iOS? I supposed that it didn't according to
# detect the OS
if(${CMAKE_SYSTEM_NAME} MATCHES "Windows")
set(FIND_SFML_OS_WINDOWS 1)
elseif(${CMAKE_SYSTEM_NAME} MATCHES "Linux")
set(FIND_SFML_OS_LINUX 1)
elseif(${CMAKE_SYSTEM_NAME} MATCHES "FreeBSD")
set(FIND_SFML_OS_FREEBSD 1)
elseif(${CMAKE_SYSTEM_NAME} MATCHES "Darwin")
set(FIND_SFML_OS_MACOSX 1)
endif()
in the original FinSFML.cmake, so I didn't do anything specific on these platforms. And @MarioLiebisch's answer confirms this at least for Android.
Has it been tested with both makefile-based build systems and multi-configuration IDEs?
Yup. At least it does work nicely on macOS with Xcode generator and Unix Makefiles generator for the client project. I expect it to work the same on Windows because the CMake code isn't specific to any platform on this matter.
on iOS it doesn't find the openAL headers, but I'm struggling to establish why. OPENAL_INCLUDE_DIR seems correct and is set to the right folder...
Ok will check.
@JonnyPtn It's fixed. OpenAL was correctly found but OPENAL_INCLUDE_DIR wasn't used on iOS, because I previously relied on the "-framework OpenAL" flag.
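(Presumably something along these lines in the audio module's CMake files; this is a sketch, not the exact commit:)
# "-framework OpenAL" covers the link step on Apple platforms but not the header
# search path, so the include directory has to be added explicitly on iOS.
target_include_directories(sfml-audio PRIVATE "${OPENAL_INCLUDE_DIR}")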
@mantognini Can you relaunch the build?
I have tested it using CMake inside QtCreator.
Having some issues on iOS when trying to find SFML from an app, mainly with the dependencies.
I attempted a fix here which you might want to test. It compiles successfully even though it doesn't search for OpenGL (or OpenGL ES). The app does then crash immediately, but again I'm not sure if that's a problem with the config or with the way I'm using it, or just a completely different bug.
I've also tested macOS static and debug libs which both worked excellently.
SFMLBuildMaster: Build this please.
@JonnyPtn Correct me if I'm wrong, but the original FindSFML.cmake didn't support iOS either, so adding it now is out of scope I think.
Sure, I don’t want to hold this up for other platforms, I can do a PR for these later
@LaurentGomila @mantognini @eXpl0it3r Is there anything left to do apart from squashing commits and merge?
For me, it's ready! Feel free to squash commits in a way you find pertinent.
Let's merge it and release it in SFML 2.5. And then let's fix it in 2.5.1 😁
@mantognini @eXpl0it3r @LaurentGomila Up to you now!
Perfect!
SFMLBuildMaster: Build this please.
Just for notice: it looks ready to be merged.
Testing this on Windows with Visual Studio 2017 x64 project file generator, CMake 3.11.0-rc4 and Doxygen 1.8.10.
SFML_DEPENDENCIES_INSTALL_PREFIX isn't being aligned with the CMAKE_INSTALL_PREFIX.
It's nice that there's the option to install it to different places, but having to always adjust both prefixes is a bit cumbersome and unintuitive. My expectation would be that when I change CMAKE_INSTALL_PREFIX it will also change SFML_DEPENDENCIES_INSTALL_PREFIX but not the other way around.
The documentation build fails, even though everything went fine - it even installs the documentation, but for some reason it's reports it as failed.
1>CUSTOMBUILD : error : failed to run html help compiler on index.hhp
[...]
1>Running html help compiler...
1>lookup cache used 3614/65536 hits=22719 misses=3879
1>finished...
1>Done.
1>Building Custom Rule D:/Dev/SFML/doc/CMakeLists.txt
1>CMake does not need to re-run because D:/Dev/SFML/build/doc/CMakeFiles/generate.stamp is up-to-date.
1>Done building project "doc.vcxproj" -- FAILED.
*.pdb files (debug symbol files) are not being installed next to the libraries, but are installed to <install prefix>\lib\Debug. Additionally an empty Release directory is created (maybe because of the same reason). PDB files should really just be placed next to the libraries.
Right now I'm not sure exactly how the PDB files work. This implementation generates PDB files with the non-debug suffix (no -d) in debug mode and no PDB files in release mode. What I'm not sure about is:
Do we need PDB files for every release?
Do we need PDB files only for release binaries since the debug libs contain the debug symbols?
Are the release PDB binaries generated in debug mode?
What's up with the lib\cmake\SFML directory that gets installed? Is this the correct location?
You should rebuild and install for all the configurations that you want to use (debug/release, dynamic/static/frameworks) to the same target installation directory. Then remove FindSFML.cmake from your project to make sure it's not used, and clear your CMake cache.
So how does CMake find SFML at my custom location? Previously, I'd set SFML_ROOT and CMAKE_MODULE_PATH and it would find it, how do I tell CMake now to find SFML? Do I need to place the SFMLConfig files somewhere?
CMakeLists.txt
cmake_minimum_required(VERSION 3.1)
project(SFMLTest)
set(CMAKE_CXX_STANDARD 11)
set(SFML_STATIC_LIBRARIES TRUE)
find_package(SFML 2.4 COMPONENTS graphics window system)
add_executable(SFMLTest main.cpp)
target_link_libraries(SFMLTest ${SFML_LIBRARIES})
So how does CMake find SFML at my custom location
Define a SFML_DIR variable that points to the location of the SFMLConfig.cmake file. This is exactly what CMake says in the error message that results from find_package, by the way 😛
I see. Yeah, it showed that message for years and it never applied to SFML, so I didn't think to try it. But this brings me back to the question from above whether <install-prefix>\lib\cmake\SFML is the correct location for the SFMLConfig.cmake file.
Using SFMLConfig.cmake it can find SFML just fine, unfortunately it doesn't set the include directory. Is this intended? Do I still have to use include_directories() if the headers aren't installed in some compiler toolchain known header location?
But this brings me back to the question from above whether \lib\cmake\SFML is the correct location for the SFMLConfig.cmake file.
Looks ok to me, at least that's what I've seen in other libs (Qt).
Using SFMLConfig.cmake it can find SFML just fine, unfortunately it doesn't set the include directory
The output of find_package(SFML ...) are "sfml-xxx" imported targets, so that's what you should pass to the target_link_libraries call. I don't think ${SFML_LIBRARIES} exists anymore now.
Right, I actually wanted to change it and then forgot.
A bit cumbersome that you have to specify the "library" twice, but otherwise it seems to work now.
@eXpl0it3r What do you mean by specifying the library twice? Here's an example I was using to test this: https://github.com/JonnyPtn/SFML-DOOM/blob/master/CMakeLists.txt#L51
First you specify which module you want to find and then you link each module. So you repeat it twice. Sure it might have a different meaning and all, but it's also a bit annoying, plus a source for people getting it wrong and the many questions that follow it.
find_package(SFML 2.4 COMPONENTS graphics window system)
[...]
target_link_libraries(SFMLTest sfml-graphics sfml-window sfml-system)
So the problem is specifying which components to link as opposed to just linking ${SFML_LIBRARIES}? Personally I prefer explicitly stating the components, and it simplifies having multiple projects in a CMakeLists which require different SFML components
First you specify which module you want to find and then you link each module
Listing the modules in find_package is optional (or it should). Plus, you can only specify sfml-graphics since the others are dependencies and will be automatically linked.
Okay, you can just ask for the graphics module and it will pull in the dependencies, but you seem to have to specify the components, otherwise CMake errors with:
find_package(SFML) called with no component
So here's a minimal CMakeLists.txt that works for me with a custom SFML installation.
cmake_minimum_required(VERSION 3.1)
project(SFMLTest)
set(SFML_DIR "<sfml root prefix>/lib/cmake/SFML")
find_package(SFML 2.4 COMPONENTS graphics)
add_executable(SFMLTest main.cpp)
target_link_libraries(SFMLTest sfml-graphics)
My expectation would be that when I change CMAKE_INSTALL_PREFIX it will also change SFML_DEPENDENCIES_INSTALL_PREFIX but not the other way around.
CMake cache system actually prevents from doing that. Once a cache entry is set, setting it in CMake code won't change it, unless you force updating the cache (but then you discard user choice).
The documentation build fails, even though everything went fine - it even installs the documentation, but for some reason it's reports it as failed.
My guess is that it is related to the "error :" that appears in logs. VS considers this a target failure even if the commands actually exit with status 0. I don't think I touched that part though. Is the error also happening on master branch?
*.pdb files (debug symbol files) are not being installed next to the libraries, but are installed to \lib\Debug. Additionally an empty Release directory is created (maybe because of the same reason). PDB files should really just be placed next to the libraries.
Booh… will check.
Right now I'm not sure exactly how the PDB files work. This implementation generates PDB files with the none-debug prefix (no -d) in debug mode and no PDB files in release mode.
Although it is interesting to get the answer, for this PR I will stick with the current behavior on master branch. The purpose of this PR isn't to change how/when PDB are generated.
What's up with the lib\cmake\SFML directory that gets installed? Is this the correct location?
This is one of the standard locations given by find_package()'s documentation. So I'd expect it to be correct :)
So how does CMake find SFML at my custom location (e.g. D:\Dev\SFML\install)? Previously, I'd set SFML_ROOT and CMAKE_MODULE_PATH and it would find it, how do I tell CMake now to find SFML? Do I need to place the SFMLConfig files somewhere?
As @LaurentGomila answered you need to set SFML_DIR instead, and you don't need to set CMAKE_MODULE_PATH because there isn't any FindSFML.cmake to find.
Listing the modules in find_package is optional (or it should). Plus, you can only specify sfml-graphics since the others are dependencies and will be automatically linked.
Did FindSFML.cmake previously work without any component given?
I followed what is done at https://github.com/SFML/SFML/pull/1335/files#diff-e40774a4a91e51bf8e2a254a86083435L126 which means, unless I missed something, that if no component is given, nothing is searched for by the FindSFML.cmake script.
@eXpl0it3r
I can't reproduce the issue about Doxygen. Here it generates the doc without any error log, and thus target succeeds. I'm using CMake 3.10.0, VS 2017 x64 generator and Doxygen 1.8.14. Didn't test with Doxygen 1.8.10 but if latest version works fine I guess it's ok. Does any of the CI builders generate the documentation? Can you test on your side with Doxygen 1.8.14?
As for where the PDB files go (ie. not next to libraries, and always in PREFIX/lib/Debug even for release builds), I checked and this is the same behavior as current master branch. Thus not going to change that for current PR.
The only difference I noticed about PDB files is that with my PR you now also get a PDB for sfml-main library.
All in all, if @LaurentGomila confirms expected behavior with find_package() and components, I guess current PR is still ok for merge. I'll just do a rebase against master to take into account latest commits.
CMake cache system actually prevents from doing that. Once a cache entry is set, setting it in CMake code won't change it, unless you force updating the cache (but then you discard user choice).
Yes, we musn't (can't) make one entry depend on the other. But what about using some default path if SFML_DEPENDENCIES_INSTALL_PREFIX is not set? Something relative to CMAKE_INSTALL_PREFIX.
set(SFML_DIR "<sfml root prefix>/lib/cmake/SFML")
In case someone takes your minimal example as a reference: the SFML_DIR variable should always be defined as a user variable, not directly in the CMakeLists.txt file 😃
Did FindSFML.cmake previously work without any component given?
Not sure about the current behaviour. But that was just an idea anyway. Do you think it's worth investigating?
CMake cache system actually prevents from doing that. Once a cache entry is set, setting it in CMake code won't change it, unless you force updating the cache (but then you discard user choice).
I thought so. For me it's just a different approach I'll have to take and define the prefix manually before clicking the configuration button. Unfortunately this will be an issue many people will fail to understand and we'll have to keep answering the same question. IMHO on Windows it never makes sense to install the dependencies somewhere else, as such this is really inconvenient for Windows users.
Yes, we musn't (can't) make one entry depend on the other. But what about using some default path if SFML_DEPENDENCIES_INSTALL_PREFIX is not set? Something relative to CMAKE_INSTALL_PREFIX.
By default it will use CMAKE_INSTALL_PREFIX. The issue is, that CMAKE_INSTALL_PREFIX has to be defined before you hit the configuration button in the CMake UI, similar to how you'd specify it in the command line the first time you call it.
Would it be an option to not set the SFML_DEPENDENCIES_INSTALL_PREFIX as long as SFML_DEPENDENCIES_INSTALL_PREFIX wasn't set by the user and instead always use CMAKE_INSTALL_PREFIX? That way CMAKE_INSTALL_PREFIX will always be used, unless the user explicitly specifies SFML_DEPENDENCIES_INSTALL_PREFIX. And since it's needed on macOS you could set it there directly. Maybe that's too many if-elses?
I can't reproduce the issue about Doxygen.
Checked again on master branch, it exists as well with my Doxygen version. The reason I stuck with the version is because newer versions don't generate the nice HTML output we want, but that's not important for this PR. Works fine with the latest Doxygen, so let's forget about this.
As for where the PDB files go (ie. not next to libraries, and always in PREFIX/lib/Debug even for release builds), I checked and this is the same behavior as current master branch. Thus not going to change that for current PR.
Odd, I could swear the behavior was different in the past, but yet, I see the same thing on master branch, so let's align this with another PR.
Did FindSFML.cmake previously work without any component given?
Not sure about the current behaviour. But that was just an idea anyway. Do you think it's worth investigating?
I think it's okay the way it is right now. I just suspect again that people will get it wrong and we'll have to keep answering the same question.
My suggestion would be, something that I kind of wanted for a while, to have a dedicated tutorial not on how to build SFML, but on how to use SFML with CMake. That way we can hopefully answer a lot of questions and have a reference for people.
Also we need to make sure to update the existing CMake tutorial for these changes.
But what about using some default path if SFML_DEPENDENCIES_INSTALL_PREFIX is not set? Something relative to CMAKE_INSTALL_PREFIX.
This is a good point indeed. Dunno why I didn't do that just like for SFML_MISC_INSTALL_PREFIX. It's committed.
Would it be an option to not set the SFML_DEPENDENCIES_INSTALL_PREFIX as long as SFML_DEPENDENCIES_INSTALL_PREFIX wasn't set by the user and instead always use CMAKE_INSTALL_PREFIX?
So with the relative path it's just fixed, no need for all these ifs :)
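Roughly the idea is the following (sketch, not the exact committed code):
# A relative default is resolved against CMAKE_INSTALL_PREFIX by install(),
# so no per-platform if/else is needed; users can still override it with an absolute path.
set(SFML_DEPENDENCIES_INSTALL_PREFIX "." CACHE STRING
    "Install prefix for the external libraries shipped with SFML")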
In case someone takes your minimal example as a reference: the SFML_DIR variable should always be defined as a user variable, not directly in the CMakeLists.txt file 😃
I don't think this is always true. For example if you take a project that depends on SFML and wants to embed all their dependencies in their repo (let's say with Git LFS). Setting this variable in the main project allows having a repo ready to use just with a single clone. So it depends on whether the developer wants to embed its dependencies or let its users provide them.
Not sure about the current behaviour. But that was just an idea anyway. Do you think it's worth investigating?
Just to make sure, I tried using the current FindSFML.cmake with no component. The result is actually worse than what I thought... with no component, it always considers SFML to be found even when it's not :/
Found SFML .. in SFML_INCLUDE_DIR-NOTFOUND
CMake Error: The following variables are used in this project, but they are set to NOTFOUND.
Please set them or make sure they are set and tested correctly in the CMake files:
SFML_INCLUDE_DIR
used as include directory in directory C:/Users/Ceylo/Documents/sfeMovie
used as include directory in directory C:/Users/Ceylo/Documents/sfeMovie
used as include directory in directory C:/Users/Ceylo/Documents/sfeMovie
used as include directory in directory C:/Users/Ceylo/Documents/sfeMovie/examples
used as include directory in directory C:/Users/Ceylo/Documents/sfeMovie/examples
used as include directory in directory C:/Users/Ceylo/Documents/sfeMovie/examples/Demo
used as include directory in directory C:/Users/Ceylo/Documents/sfeMovie/examples/Demo
used as include directory in directory C:/Users/Ceylo/Documents/sfeMovie/examples/Demo
used as include directory in directory C:/Users/Ceylo/Documents/sfeMovie/examples/MinimalistDemo
used as include directory in directory C:/Users/Ceylo/Documents/sfeMovie/examples/MinimalistDemo
used as include directory in directory C:/Users/Ceylo/Documents/sfeMovie/examples/MinimalistDemo
So no regression on that matter. As for investigating support for the "no component given" case, I think this should be discussed and eventually changed later. I don't really have an opinion at the moment about whether this is a good idea.
My suggestion would be, something that I kind of wanted for a while, to have a dedicated tutorial not on how to build SFML, but on how to use SFML with CMake. That way we can hopefully answer a lot of questions and have a reference for people.
Currently I'd say that the best tutorial for this is the documentation in the SFMLConfig.cmake file. But you have to find that file first, which is not quite obvious 😄 .
Also we need to make sure to update the existing CMake tutorial for these changes.
Do you mean more than https://github.com/SFML/SFML/pull/1335#issuecomment-370251236 ?
Currently I'd say that the best tutorial for this is the documentation in SFMLConfig.cmake file. But you have to find that file first, which is not quite obvious
Could be more or less a 1:1 copy, I don't like spreading documentation across multiple places, especially since it's not an obvious place to look.
Do you mean more than #1335 (comment) ?
If that covers all, then that's good. 😊
SFMLBuildMaster: Build this please.
Could be more or less a 1:1 copy, I don't like spreading documentation across multiple places, especially since it's not an obvious place to look.
A 1:1 copy would look weird to me. What about giving a link to SFMLConfig.cmake from SFML tutorials page? This way you can also make sure that the tutorial is consistent with what is shipped.
Ideally there would be some sort of Doxygen that generates formatted docs from the comments in CMake files, but I'm not aware of such a tool.
By the way builds have finished now.
Merged in fc655f52b91d3e70ee43426dc872723418f2bbee
Thanks for the exhaustive work and the pushing forward, really appreciated! 🎉
Hooorrraaaaaay!!!!!!!
At last… :D
I would like to discuss the PR process with you, mainly the testing part: it took almost 2 months between the day the PR was ready for testing and the day it got merged. This is way too long, even in the open-source world, and honestly if it always takes this long I'm not going to do other PRs. So I'd like to talk and find solutions with you.
Where can we do that? The PR comments don't feel like it's the right place, and on the forum (except maybe in section "SFML development" where I can't open topics) it's also too wide: this is mainly toward SFML development team & contributors.
Yeah, this is a big step forward! I'm really happy for this top-quality contribution of yours! Thanks again.
I understand the feeling, and share your point of view. But the fact is that the team alone cannot handle everything, at least not quickly enough to keep the flow. The only solution I see is asking the community to test things out. Now, in the last few months, several people have joined the effort, and I thank them. Hopefully, this will attract more testers as well. In the meantime, if you have ideas on how to improve the overall process (doc, how-to's, issue handling,...) I feel it would best fit on the forum. I'm pretty sure a moderator will agree to move your thread to the right category. :)
I think that people will be more likely to test if developers either provide detailed instructions on how to perform testing or, where possible, provide a test program.
The easier it is for people to test, the more likely they are to do it.
I've opened https://en.sfml-dev.org/forums/index.php?topic=23841.0
| gharchive/pull-request | 2017-12-30T12:37:06 | 2025-04-01T06:37:31.105484 | {
"authors": [
"Ceylo",
"Foaly",
"JonnyPtn",
"LaurentGomila",
"MarioLiebisch",
"binary1248",
"dabbertorres",
"eXpl0it3r",
"eliasdaler",
"mantognini"
],
"repo": "SFML/SFML",
"url": "https://github.com/SFML/SFML/pull/1335",
"license": "Zlib",
"license_type": "permissive",
"license_source": "github-api"
} |
2047997449 | [Release 1.0] First round of work items
Since the work items haven't been written up anywhere yet, I'm opening this PR under the name "first round of work items" for now.
It would be good to compile and list the names of the work items (the issues addressed here).
@kms0524
https://github.com/SIDETEAM001/side-ios/pull/75
Once the PR above is merged into develop, I'll merge this PR as well~
| gharchive/pull-request | 2023-12-19T06:15:49 | 2025-04-01T06:37:31.123335 | {
"authors": [
"dumok-nim"
],
"repo": "SIDETEAM001/side-ios",
"url": "https://github.com/SIDETEAM001/side-ios/pull/90",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2055406529 | 🛑 Website is down
In dee9800, Website (ELTeam.ir) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Website is back up in 4bd3feb after 1 hour, 24 minutes.
| gharchive/issue | 2023-12-25T05:55:55 | 2025-04-01T06:37:31.129900 | {
"authors": [
"SIRMaxis"
],
"repo": "SIRMaxis/uptime-robot",
"url": "https://github.com/SIRMaxis/uptime-robot/issues/10124",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2084313275 | 🛑 DE Server is down
In 18d2f76, DE Server (de1.elteam.ir) was down:
HTTP code: 0
Response time: 0 ms
Resolved: DE Server is back up in d79702d after 30 minutes.
| gharchive/issue | 2024-01-16T15:46:20 | 2025-04-01T06:37:31.132254 | {
"authors": [
"SIRMaxis"
],
"repo": "SIRMaxis/uptime-robot",
"url": "https://github.com/SIRMaxis/uptime-robot/issues/11936",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2107931369 | 🛑 DNS1.electro is down
In a2688ef, DNS1.electro (ns1.elteam.ir) was down:
HTTP code: 0
Response time: 0 ms
Resolved: DNS1.electro is back up in 47936ca after 39 minutes.
| gharchive/issue | 2024-01-30T13:57:52 | 2025-04-01T06:37:31.134522 | {
"authors": [
"SIRMaxis"
],
"repo": "SIRMaxis/uptime-robot",
"url": "https://github.com/SIRMaxis/uptime-robot/issues/13067",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2171299288 | 🛑 Website is down
In fec77b5, Website (ELTeam.ir) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Website is back up in 1a161d2 after 8 minutes.
| gharchive/issue | 2024-03-06T11:36:40 | 2025-04-01T06:37:31.136837 | {
"authors": [
"SIRMaxis"
],
"repo": "SIRMaxis/uptime-robot",
"url": "https://github.com/SIRMaxis/uptime-robot/issues/16128",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2215312338 | 🛑 DNS1.electro is down
In 33319dc, DNS1.electro (ns1.elteam.ir) was down:
HTTP code: 0
Response time: 0 ms
Resolved: DNS1.electro is back up in 2feb1d9 after 14 minutes.
| gharchive/issue | 2024-03-29T13:08:01 | 2025-04-01T06:37:31.139310 | {
"authors": [
"SIRMaxis"
],
"repo": "SIRMaxis/uptime-robot",
"url": "https://github.com/SIRMaxis/uptime-robot/issues/18027",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1919931926 | 🛑 Website is down
In 8df3574, Website (ELTeam.ir) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Website is back up in 87340bb after 2 hours, 25 minutes.
| gharchive/issue | 2023-09-29T21:31:47 | 2025-04-01T06:37:31.141576 | {
"authors": [
"SIRMaxis"
],
"repo": "SIRMaxis/uptime-robot",
"url": "https://github.com/SIRMaxis/uptime-robot/issues/2156",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1932247221 | 🛑 Website is down
In b83e540, Website (ELTeam.ir) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Website is back up in 4523501 after 18 minutes.
| gharchive/issue | 2023-10-09T03:39:21 | 2025-04-01T06:37:31.144326 | {
"authors": [
"SIRMaxis"
],
"repo": "SIRMaxis/uptime-robot",
"url": "https://github.com/SIRMaxis/uptime-robot/issues/2995",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1698582416 | 🛑 DNS1.electro is down
In 4a1a918, DNS1.electro (ns1.elteam.ir) was down:
HTTP code: 0
Response time: 0 ms
Resolved: DNS1.electro is back up in 152f177.
| gharchive/issue | 2023-05-06T10:58:14 | 2025-04-01T06:37:31.146596 | {
"authors": [
"SIRMaxis"
],
"repo": "SIRMaxis/uptime-robot",
"url": "https://github.com/SIRMaxis/uptime-robot/issues/404",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1983062839 | 🛑 DNS2.electro is down
In dcb9410, DNS2.electro (ns2.elteam.ir) was down:
HTTP code: 0
Response time: 0 ms
Resolved: DNS2.electro is back up in f54656f after 25 minutes.
| gharchive/issue | 2023-11-08T08:51:18 | 2025-04-01T06:37:31.148926 | {
"authors": [
"SIRMaxis"
],
"repo": "SIRMaxis/uptime-robot",
"url": "https://github.com/SIRMaxis/uptime-robot/issues/5934",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2038863370 | 🛑 DE Server is down
In 5079f93, DE Server (de1.elteam.ir) was down:
HTTP code: 0
Response time: 0 ms
Resolved: DE Server is back up in 989a322 after 14 minutes.
| gharchive/issue | 2023-12-13T02:58:46 | 2025-04-01T06:37:31.151414 | {
"authors": [
"SIRMaxis"
],
"repo": "SIRMaxis/uptime-robot",
"url": "https://github.com/SIRMaxis/uptime-robot/issues/9105",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2045711643 | 🛑 Website is down
In 632ac36, Website (ELTeam.ir) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Website is back up in f671a8e after 20 minutes.
| gharchive/issue | 2023-12-18T04:25:17 | 2025-04-01T06:37:31.153686 | {
"authors": [
"SIRMaxis"
],
"repo": "SIRMaxis/uptime-robot",
"url": "https://github.com/SIRMaxis/uptime-robot/issues/9467",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2048292088 | 🛑 DNS1.electro is down
In aeb34f7, DNS1.electro (ns1.elteam.ir) was down:
HTTP code: 0
Response time: 0 ms
Resolved: DNS1.electro is back up in d14ce2e after 8 minutes.
| gharchive/issue | 2023-12-19T09:46:37 | 2025-04-01T06:37:31.156022 | {
"authors": [
"SIRMaxis"
],
"repo": "SIRMaxis/uptime-robot",
"url": "https://github.com/SIRMaxis/uptime-robot/issues/9572",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
908070999 | Added participants page for 2021.
Added participants (mostly auto-generated from solver csv file).
Preview is here: http://smtcomp.jochen-hoenicke.de/2021/participants.html
| gharchive/pull-request | 2021-06-01T08:59:31 | 2025-04-01T06:37:31.188554 | {
"authors": [
"jhoenicke"
],
"repo": "SMT-COMP/smt-comp.github.io",
"url": "https://github.com/SMT-COMP/smt-comp.github.io/pull/17",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
332418256 | Add an option to change the username
MS has a route that allows you to change your username without the need to ping someone with database access. The URL is https://metasmoke.erwaysoftware.com/users/username.
Redunda should have such a feature too, if a user wants to change their name.
Or that. Just a way to update it would be nice.
On Thu, 14 June 2018 at 17:24, Jed Fox <notifications@github.com> wrote:
Alternatively, it could be updated from the user's SE display name upon login (or on a schedule).
It should now trigger an async username update (from SE) on every login. Pretty cheap call, no reason not to send it often.
| gharchive/issue | 2018-06-14T14:05:20 | 2025-04-01T06:37:31.200729 | {
"authors": [
"Filnor",
"Undo1"
],
"repo": "SOBotics/Redunda",
"url": "https://github.com/SOBotics/Redunda/issues/50",
"license": "cc0-1.0",
"license_type": "permissive",
"license_source": "bigquery"
} |
1800749997 | Disable comScore analytics while playing video in the background
As a Mediapulse analyst I don't want videos to be tracked when played in the background.
Context
Mediapulse requires videos not to be tracked when < 50% of the video surface is visible. We can identify 3 matching scenarios:
The video is visible but only < 50% of its surface is actually presented on screen (e.g. covered by other views or played in a mini player bar).
The video is played in PiP, whose overlay can be put away by the user.
The video is played in the background (lock screen or app switching).
Due to the nature of the problem we cannot automatically cover several of these use cases:
< 50% area coverage is difficult to address automatically (we cannot really reliably determine how much of a view is visible).
We cannot know when the PiP overlay is visible or not.
We should therefore:
Deal with the cases that can be addressed automatically (app state transitions).
Document how app developers should manually address the other cases, using the available isTracked flag.
Remark
The isTracked flag is greedy and disables all trackers. If finer-grained control is required we should discuss https://github.com/SRGSSR/pillarbox-documentation/issues/38 as well.
Acceptance criteria
Background video playback is not tracked where automatically feasible.
Limitations and manual implementation requirements are documented.
Tasks
[ ] Disable comScore analytics while the application is in background.
[ ] Document required manual implementation inside an app (disabled for < 50% visibility).
Duplicate of #419. Visibility documentation added to #355. Closed.
| gharchive/issue | 2023-07-12T11:15:16 | 2025-04-01T06:37:31.234234 | {
"authors": [
"defagos"
],
"repo": "SRGSSR/pillarbox-apple",
"url": "https://github.com/SRGSSR/pillarbox-apple/issues/447",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1895256890 | Refactor Communication
[ ] change ambiguous verb/noun names/functions
[ ] confirm battery voltage is sent back to ROS
[ ] confirm torques are sent back to ROS
[ ] IMU values
| gharchive/issue | 2023-09-13T20:58:16 | 2025-04-01T06:37:31.256600 | {
"authors": [
"guyfleeman",
"joe-spall"
],
"repo": "SSL-A-Team/firmware",
"url": "https://github.com/SSL-A-Team/firmware/issues/40",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
193632273 | Returning object of a component client from another component action fails
I am trying to return an object of a component client from an action defined in another component. Passing the same object to an action as an argument seems to work fine, but returning it produces an error. What is the correct way of doing this?
Minimal code for what I want :
#include <hpx/hpx_init.hpp>
#include <hpx/include/actions.hpp>
#include <hpx/include/components.hpp>
#include <utility>
struct foo_server : public hpx::components::component_base<foo_server>
{
};
using foo_server_type = hpx::components::component<foo_server>;
HPX_REGISTER_COMPONENT(foo_server_type, foo_server);
struct foo : public hpx::components::client_base<foo, foo_server>
{
public:
using base_type = hpx::components::client_base<foo, foo_server>;
foo() = default;
foo(hpx::id_type id) : base_type(std::move(id))
{
}
};
struct bar_server : public hpx::components::component_base<bar_server>
{
foo get_foo() {
//return hpx::new_<foo_server>(hpx::find_here());
return foo(hpx::find_here());
}
HPX_DEFINE_COMPONENT_DIRECT_ACTION(bar_server, get_foo, get_foo_action);
};
using bar_server_type = hpx::components::component<bar_server>;
HPX_REGISTER_COMPONENT(bar_server_type, bar_server);
HPX_REGISTER_ACTION(bar_server::get_foo_action);
struct bar : public hpx::components::client_base<bar, bar_server>
{
public:
using base_type = hpx::components::client_base<bar, bar_server>;
bar() = default;
bar(hpx::id_type id) : base_type(std::move(id))
{
}
hpx::future<foo> get_foo() {
bar_server::get_foo_action act;
// This async statement gives compilation error
return hpx::async(act, hpx::find_here());
}
};
int hpx_main(int, char*[])
{
bar b(hpx::find_here());
auto res = b.get_foo().get();
return hpx::finalize();
}
int main(int argc, char* argv[])
{
return hpx::init(argc, argv);
}
Compilation fails with error :
test.cpp: In member function ‘hpx::lcos::future<foo> bar::get_foo()’:
test.cpp:54:26: error: could not convert ‘hpx::async(F&&, Ts&& ...) [with F = bar_server::get_foo_action&; Ts = {hpx::naming::id_type}; decltype (hpx::detail::async_dispatch<type
name hpx::util::decay<Action>::type>::call(forward<F>(f), (forward<Ts>)(hpx::async::ts)...)) = hpx::lcos::future<hpx::naming::id_type>; typename hpx::util::decay<Action>::type =
bar_server::get_foo_action](hpx::find_here(hpx::error_code&)())’ from ‘hpx::lcos::future<hpx::naming::id_type>’ to ‘hpx::lcos::future<foo>’
return hpx::async(act, hpx::find_here());
just for a note, clang says:
ex.cc:54:15: error: no viable conversion from returned value of type 'future<typename traits::promise_local_result<typename hpx::traits::extract_action<get_foo_action>::remote_result_type>::type>' to
function return type 'future<foo>'
return hpx::async(act, hpx::find_here());
.......
/usr/local/include/hpx/runtime/components/client_base.hpp:78:24: error: no matching conversion for functional-style cast from 'future<hpx::naming::id_type>' to 'foo'
return Derived(future<id_type>(shared_state));
The problem is a result of several issues:
Direct actions currently don't support returning futures (clients are just special futures, and they are treated as such by HPX). This needs to be fixed at some point.
hpx::async for actions unfortunately behaves differently compared to hpx::async for local functions. The former automatically unwraps futures, the latter does not automatically unwrap them (the same holds for clients). This is an unfortunate inconsistency which we might want to fix at some point in the future as well.
The client implementations in the given code example miss a special implicit (converting) constructor from future<id_type>.
Given these restrictions, the following code does what the OP needed:
#include <hpx/hpx_init.hpp>
#include <hpx/include/actions.hpp>
#include <hpx/include/components.hpp>
#include <utility>
///////////////////////////////////////////////////////////////////////////////
struct foo_server : public hpx::components::component_base<foo_server>
{
};
using foo_server_type = hpx::components::component<foo_server>;
HPX_REGISTER_COMPONENT(foo_server_type, foo_server);
struct foo : public hpx::components::client_base<foo, foo_server>
{
using base_type = hpx::components::client_base<foo, foo_server>;
foo() = default;
// note: this constructor now creates a new component instance
foo(hpx::id_type id) : base_type(hpx::new_<foo_server>(id)) {}
// additional converting constructor
foo(hpx::future<hpx::id_type> && id) : base_type(std::move(id)) {}
};
///////////////////////////////////////////////////////////////////////////////
struct bar_server : public hpx::components::component_base<bar_server>
{
foo get_foo()
{
return foo(hpx::find_here());
}
HPX_DEFINE_COMPONENT_ACTION(bar_server, get_foo, get_foo_action);
};
using bar_server_type = hpx::components::component<bar_server>;
HPX_REGISTER_COMPONENT(bar_server_type, bar_server);
HPX_REGISTER_ACTION(bar_server::get_foo_action);
struct bar : public hpx::components::client_base<bar, bar_server>
{
using base_type = hpx::components::client_base<bar, bar_server>;
bar() = default;
// note: this constructor now creates a new component instance
bar(hpx::id_type id) : base_type(hpx::new_<bar_server>(id)) {}
// additional converting constructor
bar(hpx::future<hpx::id_type> && id) : base_type(std::move(id)) {}
// note the client function returns a foo, not a future<foo>,
// remember clients are 'futures' too, just special ones
foo get_foo()
{
bar_server::get_foo_action act;
return hpx::async(act, hpx::find_here());
}
};
///////////////////////////////////////////////////////////////////////////////
int hpx_main(int, char*[])
{
bar b(hpx::find_here());
auto res = b.get_foo().get();
return hpx::finalize();
}
int main(int argc, char* argv[])
{
return hpx::init(argc, argv);
}
This can be closed as the OP has resolved his problems
| gharchive/issue | 2016-12-05T22:46:03 | 2025-04-01T06:37:31.287948 | {
"authors": [
"AntonBikineev",
"hkaiser"
],
"repo": "STEllAR-GROUP/hpx",
"url": "https://github.com/STEllAR-GROUP/hpx/issues/2420",
"license": "BSL-1.0",
"license_type": "permissive",
"license_source": "github-api"
} |
2747172021 | Fix outdated documentation and missing flags
Fixes #1868
Proposed Changes
Remove/add changed params
Add missing \cond \endcond attributes
Checklist
Not all points below apply to all pull requests.
[ ] I have added a new feature and have added tests to go along with it.
[ ] I have fixed a bug and have added a regression test.
[ ] I have added a test using random numbers; I have made sure it uses a seed, and that random numbers generated are valid inputs for the tests.
Can one of the admins verify this patch?
| gharchive/pull-request | 2024-12-18T08:43:36 | 2025-04-01T06:37:31.291362 | {
"authors": [
"StellarBot",
"shachar-a"
],
"repo": "STEllAR-GROUP/hpx",
"url": "https://github.com/STEllAR-GROUP/hpx/pull/6593",
"license": "BSL-1.0",
"license_type": "permissive",
"license_source": "github-api"
} |
2390692559 | 🛑 SPCast / Loadbalancer is down
In 753352f, SPCast / Loadbalancer (https://loadbalancer.sp.radio.fm/) was down:
HTTP code: 0
Response time: 0 ms
Resolved: SPCast / Loadbalancer is back up in 2966295 after 25 minutes.
| gharchive/issue | 2024-07-04T11:55:05 | 2025-04-01T06:37:31.431105 | {
"authors": [
"scysys"
],
"repo": "STREAMPANEL/status.streampanel.net",
"url": "https://github.com/STREAMPANEL/status.streampanel.net/issues/2780",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2090961967 | 🛑 SPCast / Loadbalancer is down
In c9f6acc, SPCast / Loadbalancer (https://loadbalancer.sp.radio.fm/) was down:
HTTP code: 0
Response time: 0 ms
Resolved: SPCast / Loadbalancer is back up in cf910f7 after 10 minutes.
| gharchive/issue | 2024-01-19T17:17:18 | 2025-04-01T06:37:31.433659 | {
"authors": [
"scysys"
],
"repo": "STREAMPANEL/status.streampanel.net",
"url": "https://github.com/STREAMPANEL/status.streampanel.net/issues/286",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
146840195 | set testr with update alternatives
In an upgrade from 5 to 6, somehow testr points to /usr/bin/testr-2.6,
which does not exist. There is /usr/bin/testr-2.7 instead, so we
need to set the right path with update-alternatives.
17:10:42 + ./run_tempest.sh -N -t -s
17:10:42 + tee tempest.log
17:10:42 ./run_tempest.sh: line 90: testr: command not found
17:10:42 ./run_tempest.sh: line 107: testr: command not found
17:10:42 ./run_tempest.sh: line 107: subunit-2to1: command not found
Not sure if that is the best solution; maybe we should run it only under certain conditions like "in an upgrade" or something. Does anyone have a better solution here, or can tell me the conditions under which we have to run this code?
hm. wasn't there a patch from @vuntz for that problem already?
hm. wasn't there a patch from @vuntz for that problem already?
i don't know. btw we have to do the same for subunit-2to1
This fix should go somewhere else. Fixing it only for mkcloud runs seems odd. It is a quick workaround for sure, but it should be commented that this is a temporary workaround (with the corresponding bug number) that should be reverted once it's fixed.
If we kept it, this PR would hide the real issue from our CI runs.
This fix should go somewhere else.
I guess that would be python-testrepository
or python-os-testr in D:C:6 in this case?
I just discovered https://github.com/crowbar/crowbar-core/pull/389 which could fix the issue
In that case let's close this PR
| gharchive/pull-request | 2016-04-08T07:23:41 | 2025-04-01T06:37:31.456337 | {
"authors": [
"MaximilianMeister",
"jdsn",
"toabctl"
],
"repo": "SUSE-Cloud/automation",
"url": "https://github.com/SUSE-Cloud/automation/pull/944",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
} |
313225129 | Portus PORTUS_BACKGROUND=true and database empty
Hi All:
I enabled the background process with env PORTUS_BACKGROUND=true.
The problem is that when the database is empty and NOT READY (because Portus does not create the database during bootstrap yet), I get this error:
[Database] Not ready yet. Waiting...
/srv/Portus/lib/portus/db.rb:40:in `wait_until': Timeout reached for 'ready' status
[Database] Timeout reached, exiting with error. Check the logs...
The database is not ready because Portus launches the background process before the database is created on bootstrap.
How could I solve this?
Regards
You need to run two containers, one without the worker setting and one with it set to true. If you just run the worker you won't actually have a Portus server running to init the DB.
This issue sadly slipped through...
You need to run two containers, one without the worker setting and one with it set to true. If you just run the worker you won't actually have a Portus server running to init the DB.
Exactly :+1:
I'll close this issue now, but feel free to leave more comments if you have further doubts on this issue.
What flag should I use to docker run Portus in server mode so that it will bootstrap the database?
@anandr781 no flag, actually. See these examples
| gharchive/issue | 2018-04-11T08:38:19 | 2025-04-01T06:37:31.471170 | {
"authors": [
"anandr781",
"felixPG",
"mssola",
"seanhoughton"
],
"repo": "SUSE/Portus",
"url": "https://github.com/SUSE/Portus/issues/1777",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
449180963 | "Invalid filter syntax" on LDAP team sync
Description
Adding new users to teams through LDAP fails with the message [ldap] Connection error: Invalid filter syntax..
Steps to reproduce
Add a team "LW-LI" (matching the LDAP group CN=LW-LI,OU=Universal,OU=Group,OU=NW,OU=DE,OU=Production,DC=my-company,DC=com)
Add a new user "debk0l" through first login of LDAP account
Seeing that the user can log in and works
Watching the background task logs, waiting for team addition
Expected behavior: The user should be added to the team
Actual behavior: [ldap] Connection error: Invalid filter syntax.
Logs don't show any valuable information:
[ldap] Looking up an LDAP group membership for 'debk0l'
User Load (0.4ms) SELECT `users`.* FROM `users` WHERE `users`.`username` = 'debk0l' LIMIT 1
[ldap] Connection error: Invalid filter syntax.
Assumption
I assume it's failing here when accessing the distinguished name. In our LDAP, there is no field dn, just distinguishedName. This results in search.groups_from building a filter of the form (&(cn=*)(member=)), which is of course invalid.
I think, like with the uid, the dn attribute needs to be configurable.
I currently can't build the image myself to try it out, since our IT is using a man-in-the-middle proxy (ZScaler) with a self-signed certificate that parts of your build chain do not trust.
Deployment information
Deployment method:
Docker compose, pretty similar to the example.
Configuration:
Running docker-image opensuse/portus:head from today.
ldap:
  enabled: true
  hostname: "mos1d00001.my-company.com"
  port: 636
  timeout: 5
  encryption:
    method: "simple_tls"
  base: "OU=Production,DC=my-company,DC=com"
  group_base: "OU=Universal,OU=Group,OU=NW,OU=DE,OU=Production,DC=my-company,DC=com"
  filter: "(&(objectCategory=person)(memberOf=CN=LW-LI,OU=Universal,OU=Group,OU=NW,OU=DE,OU=Production,DC=my-company,DC=com))"
  uid: "sAMAccountName"
  authentication:
    enabled: true
    bind_dn: "cn=ldap-user,ou=service,ou=user,ou=MB,ou=DE,ou=production,DC=my-company,DC=com"
  group_sync:
    enabled: true
    default_role: "contributor"
  guess_email:
    enabled: true
    attr: "userPrincipalName"
Portus version: 2.5.0-dev@a1b9f2ebfeb84680a9dcd5629195e4c52815735c
LDAP samples (relevant excerpt)
ldaps://mos1d00001.my-company.com:636/CN=Kraemer%5C,%20Benjamin,OU=LW-LI,OU=JLS,OU=Department,OU=People,OU=User,OU=MB,OU=DE,OU=Production,DC=my-company,DC=com
objectClass: person
cn: Kraemer, Benjamin
distinguishedName: CN=Kraemer, Benjamin,OU=LW-LI,OU=JLS,OU=Department,OU=People,OU=User,OU=MB,OU=DE,OU=Production,DC=my-company,DC=com
memberOf: CN=LW-LI,OU=Universal,OU=Group,OU=NW,OU=DE,OU=Production,DC=my-company,DC=com
sAMAccountName: dejhbk0l
userPrincipalName: Benjamin.Kraemer@my-company.com
ldaps://mos1d00001.my-company.com:636/LW-LI,OU=Universal,OU=Group,OU=NW,OU=DE,OU=Production,DC=my-company,DC=com
objectClass: group
member: CN=Kraemer, Benjamin,OU=LW-LI,OU=JLS,OU=Department,OU=People,OU=User,OU=MB,OU=DE,OU=Production,DC=my-company,DC=com
Still waiting for any attention by the team.
Still a problem
bump
/unstale
| gharchive/issue | 2019-05-28T10:07:51 | 2025-04-01T06:37:31.483412 | {
"authors": [
"Falco20019",
"SuperSandro2000"
],
"repo": "SUSE/Portus",
"url": "https://github.com/SUSE/Portus/issues/2203",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
1583797958 | Write up the minutes of the 14/02 office hours
Office hours with Prof. Cardin - 14/02/2023
Preliminary information
Duration: 8:50 - 9:15
Participants:
Alessandro Massarenti
Mattia Casarotto
Luca Pierobon
Michele Bonavigo
Samuel Peron
Meeting
Initial discussion
Initial discussion about the email exchange that took place with professors Cardin and Vardanega.
Discussion about the direction of the PoC: the purpose of the PoC is to force a selection of the technology stack, studying it and verifying that it integrates well. The PoC must therefore demonstrate that we have chosen and studied the technologies in depth so that they integrate.
Q&A
Q: What does the professor think of Eclipse Mosquitto?
Ask whether we can focus on Mosquitto as a concept and then switch to other technologies in the future: HiveMQ was being considered.
A: The technology cannot be changed after the PoC; that would be a sign of very poor design. Every change must be justified.
Q: Difference between the ideal Requirements Analysis document and the current one.
A: The fundamental part is the requirements; they must be atomic, quantitative, and verifiable. Specify the constraint requirements and the functional requirements.
Q: What is the best way (with which protocol) to make the clients communicate with the server so that the user-interface page updates automatically when the server state changes?
That is, update the various interfaces in real time while the user is connected, whenever the coordinator decides to update the state of a street lamp. [XMTP, SOAP, long polling(?), etc.]
A: A WebSocket is needed; according to him, that is standard practice.
Other alternatives: Webhooks, Server-Sent Events (SSE).
Q: Various problems with the UCs (use cases)?
In particular, are there alternative ways to represent the Management System (p. 11)? Perhaps using a placeholder actor to exploit generalization.
A: Postponed to a future office hours session/email.
Post-meeting discussion
Since client-to-server communication is already in place, SSE seems the most convenient choice.
The protocols must also be chosen with the encryption problem in mind.
This is how SSE could be used in a draft architecture sketched with @casYy:
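Purely as an illustrative sketch (the backend stack, the /lamp-events endpoint, and the helper names below are assumptions for illustration, not decisions recorded in the meeting), a server-sent-events flow for pushing lamp-state changes could look roughly like this in a Node/TypeScript backend:

import { createServer, ServerResponse } from "node:http";

// Connected SSE clients (hypothetical in-memory registry).
const clients = new Set<ServerResponse>();

const server = createServer((req, res) => {
  if (req.url === "/lamp-events") {
    // Standard SSE headers: keep the connection open and stream events.
    res.writeHead(200, {
      "Content-Type": "text/event-stream",
      "Cache-Control": "no-cache",
      Connection: "keep-alive",
    });
    clients.add(res);
    req.on("close", () => clients.delete(res));
  } else {
    res.statusCode = 404;
    res.end();
  }
});

// Called by the coordinator whenever it changes a lamp's state (hypothetical hook).
export function broadcastLampUpdate(lampId: string, state: string): void {
  const payload = `data: ${JSON.stringify({ lampId, state })}\n\n`;
  for (const client of clients) {
    client.write(payload);
  }
}

server.listen(3000);

// Browser side (sketch): the UI subscribes once and re-renders on each event.
// const source = new EventSource("/lamp-events");
// source.onmessage = (e) => updateLampUi(JSON.parse(e.data));

On the browser side, EventSource reconnects automatically, which is one reason SSE is attractive when updates only flow from server to client.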
| gharchive/issue | 2023-02-14T09:22:29 | 2025-04-01T06:37:31.528257 | {
"authors": [
"alessandro-massarenti",
"pierobonluca01"
],
"repo": "SWEasabi/verbali",
"url": "https://github.com/SWEasabi/verbali/issues/31",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1799875090 | Cannot handle OLD MiSFiTMAME DAT entries.
Ex. MiSFiTMAME (v0.55) dat.
game (
name diggermn
description "Digger Man"
year 1994
manufacturer "Face ?"
romof neogeo
rom ( name dig_p1.rom size 524288 crc eda433d7 region cpu1 offs 0 )
rom ( name dig_s1.rom size 65536 crc 75a88c1f region gfx1 offs 0 )
rom ( name ng-sfix.rom merge ng-sfix.rom size 131072 crc 354029fc region gfx1 offs 20000 )
rom ( name neo-geo.rom merge neo-geo.rom size 131072 crc 9036d879 region user1 flags 0x5 offs 0 )
rom ( name ng-sm1.rom merge ng-sm1.rom size 131072 crc 97cf998b region cpu2 offs 0 )
rom ( name dig_m1.rom size 65536 crc 833cdf1b region cpu2 offs 0 )
rom ( name dig_v1.rom size 524288 crc ee15bda4 region sound1 flags soundonly offs 0 )
rom ( name dig_c1.rom size 524288 crc 3db0a4ed region gfx2 offs 0 )
rom ( name dig_c2.rom size 524288 crc 3e632161 region gfx2 offs 1 )
chip ( type cpu name 68000 clock 12000000 )
chip ( type cpu flags audio name Z80 clock 6000000 )
chip ( type audio name YM2610 clock 8000000 )
video ( screen raster orientation horizontal x 304 y 224 aspectx 4 aspecty 3 freq 60.000000 )
sound ( channels 2 )
input ( players 2 control joy8way buttons 4 coins 2 )
dipswitch ( name "Test Switch" entry "Off" entry "On" default "Off" )
dipswitch ( name "Coin Chutes?" entry "1?" entry "2?" default "2?" )
dipswitch ( name "Autofire (in some games)" entry "Off" entry "On" default "Off" )
dipswitch ( name "COMM Setting" entry "Off" entry "1" entry "2" entry "3" entry "4" default "Off" )
dipswitch ( name "Free Play" entry "Off" entry "On" default "Off" )
dipswitch ( name "Freeze" entry "Off" entry "On" default "Off" )
dipswitch ( name "Territory" entry "Japan" entry "USA" entry "Europe" default "Europe" )
driver ( status good color good sound good palettesize 4096 blit plain )
)
Translated by SabreTools
<machine name="diggermn" romof="neogeo">
<description>Digger Man</description>
<year>1994</year>
<manufacturer>Face ?</manufacturer>
<rom name="dig_c1.rom" size="524288" crc="3db0a4ed" />
<rom name="dig_c2.rom" size="524288" crc="3e632161" />
<rom name="dig_m1.rom" size="65536" crc="833cdf1b" />
<rom name="dig_p1.rom" size="524288" crc="eda433d7" />
<rom name="dig_s1.rom" size="65536" crc="75a88c1f" />
<rom name="dig_v1.rom" size="524288" crc="ee15bda4" />
<rom name="neo-geo.rom" size="131072" crc="9036d879" />
<rom name="ng-sfix.rom" size="131072" crc="354029fc" />
<rom name="ng-sm1.rom" size="131072" crc="97cf998b" />
<rom name="" size="" />
<rom name="Rom_1" size="" />
<chip name="68000" type="" />
<chip name="YM2610" type="" />
<chip name="Z80" type="" />
<dipswitch name="Autofire (in some games)" tag="" mask="" />
<dipswitch name="Coin Chutes?" tag="" mask="" />
<dipswitch name="COMM Setting" tag="" mask="" />
<dipswitch name="Free Play" tag="" mask="" />
<dipswitch name="Freeze" tag="" mask="" />
<dipswitch name="Territory" tag="" mask="" />
<dipswitch name="Test Switch" tag="" mask="" />
<driver status="" emulation="" savestate="" />
<sound channels="" />
</machine>
<rom name="" size="" />
<rom name="Rom_1" size="" />
Where did these two entries come from?
"video" and "input" -> vanished.
"sound channels" -> lost its value.
Should be supported now.
| gharchive/issue | 2023-07-11T22:56:26 | 2025-04-01T06:37:31.536520 | {
"authors": [
"Lutepatious",
"mnadareski"
],
"repo": "SabreTools/SabreTools",
"url": "https://github.com/SabreTools/SabreTools/issues/93",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
836377156 | ERROR: [TAG] Failed to resolve variable '${junit.version}'
Launching lib/main.dart on Android SDK built for x86 in debug mode...
ERROR: [TAG] Failed to resolve variable '${junit.version}'
ERROR: [TAG] Failed to resolve variable '${animal.sniffer.version}'
@AhmadShakerASH Merge with latest change
| gharchive/issue | 2021-03-19T21:39:37 | 2025-04-01T06:37:31.540611 | {
"authors": [
"AhmadShakerASH",
"Sadmansamee"
],
"repo": "SadaqaWorks/Quran-Flutter",
"url": "https://github.com/SadaqaWorks/Quran-Flutter/issues/19",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1791810113 | docs: Better documentation for developers
Closes #375.
Summary of Changes
Improved developer documentation about tests
Added guidelines about copying objects (rather than modifying them in-place)
Added code-style guidelines
Added code review guidelines
The code style section got somewhat small. Turns out we don't really have a common code style except for "do what the linter says".
Thought about also adding an FAQ for developers, but I came up with only about 3 questions, which didn't seem enough, so I dropped the idea.
Don't have any other ideas, so let me mark this as ready for review.
:tada: This PR is included in version 0.15.0 :tada:
The release is available on:
v0.15.0
GitHub release
Your semantic-release bot :package::rocket:
| gharchive/pull-request | 2023-07-06T15:45:28 | 2025-04-01T06:37:31.544365 | {
"authors": [
"lars-reimann",
"zzril"
],
"repo": "Safe-DS/Stdlib",
"url": "https://github.com/Safe-DS/Stdlib/pull/427",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1188157841 | Refactor the validation functions in DCA to incorporate changes from schematic
Incorporate backend updates from https://github.com/Sage-Bionetworks/schematic/pull/614
#315
[x] Ensure the updated validateModelManifest() works as expected
[x] Refactor the related validation function in DCA to address the changes from schematic: e.g. validateModelManifest() now returns two things: "error" and "warning". I think we just need the first element (error) of the output list in DCA for now
#322
[x] Test cross-manifest rules on HTAN data: https://github.com/ncihtan/data-models/pull/28
[ ] update UI error messages for cross-manifest errors
related to https://github.com/Sage-Bionetworks/schematic/issues/683:
One thing to note for DCA:
Each row of the UI error table represents one invalid value with its error, so "matchExactlyOne" might not fit the error table, since the entire column/attribute is invalid. Thus, we might treat "matchExactlyOne" errors as a separate error type; I could just highlight the invalid source attribute and report its errors.
Editing:
I have run some tests on the cross-manifest testing (MatchAtLeast) using the latest schematic:
testing manifest
testing data model
there is only one cross-manifest rule:
"matchAtLeastOne scRNA-seqLevel1.HTANParentBiospecimenID Biospecimen.HTANBiospecimenID"
Here is my benchmarking result for running time (by running validation 5 times with the testing files above, using hyperfine):
Using the schematic before 700 is merged:
hyperfine --warmup 1 --runs 5 --ignore-failure 'schematic model -c schematic_config.yml validate -mp ~/Downloads/htan/test-cross-link.csv -dt ScRNA-seqLevel1'
Using the latest schematic:
hyperfine --warmup 1 --runs 5 --ignore-failure 'schematic model -c schematic_config.yml validate -mp ~/Downloads/htan/test-cross-link.csv -dt ScRNA-seqLevel1'
This is my downloading speed:
Testing download speed................................................................................
Download: 8.13 Mbit/s
Testing upload speed......................................................................................................
Upload: 11.04 Mbit/s
@rrchai this is exactly the kind of benchmarking we need more of! My take aways from it:
we'll need project scoping - and repeat the same benchmark with it
very likely, we'll still need filtering based on component annotation - that would require adding the component annotation to HTAN dataset manifests retrospectively
@GiaJordan when you're back, what are your thoughts on project scoping readiness for testing?
@ychae what do you think about who'd be best positioned to work on HTAN's retroactive component annotations of manifests?
@rrchai
In the development of this I noticed that when there were multiple manifests for a dataset, it would just take the first one and this wouldn't always be desired, so I added logic there for selection. It shouldn't affect datasets with only one manifest though.
The manifests currently on synapse will also need to have component annotations added either manually or by re-uploading for the expedited gathering to work, otherwise it will revert back to the old behavior.
@milen-sage
Project scoping is ready for testing now; I'm planning on merging #701 on Monday
@ychae what do you think about who'd be best positioned to work on HTAN's retroactive component annotations of manifests?
@milen-sage I think the data liaisons might be able to help with this task -- ideally each liaison would add the component annotation for their centers but I'll need to float this idea by Ashley to see how keen the liaisons would be to take this on.
@GiaJordan @milen-sage By validating only on HTAN Center A using the latest schematic (merged 701), the speed is amazing now.
validation results
error: Manifest syn27331282 does not contain the value HTA111111_121324 from row 2 of the attribute HTANParentBiospecimenID in the source manifest.
error: Manifest syn27331282 does not contain the value sdsadasd from row 3 of the attribute HTANParentBiospecimenID in the source manifest.
error: Manifest syn27331282 does not contain the value HTA3_8001_1002 from row 4 of the attribute HTANParentBiospecimenID in the source manifest.
Since there is no CLI flag for project_scope, I replaced the default value of project_scope with Center A's synId for testing.
hyperfine --warmup 1 --runs 5 'schematic model -c schematic_config.yml validate -mp ~/Downloads/htan/test-cross-link.csv -dt ScRNA-seqLevel1'
Time (mean ± σ): 18.844 s ± 5.422 s [User: 4.838 s, System: 0.672 s]
Range (min … max): 11.972 s … 25.155 s 5 runs
@rrchai I'm not sure why there's different behavior between using the flag and changing the default value. Does the master_fileview for both contain the CenterA project?
The warning for the unmet access restrictions is expected, though I suppose displaying a message that a censored version of the manifest will be downloaded instead would be helpful.
Very happy to hear about the times!
@GiaJordan ah, you are right. I forgot to use the HTAN's fileview. I will re-test it.
I forgot to use the HTAN's fileview. I will re-test it.
This might explain the speed up too. HTAN's fileview is larger.
Test HTAN fileview
validation results
error: Manifest syn27116926 does not contain the value HTA111111_121324 from row 2 of the attribute HTANParentBiospecimenID in the source manifest.
error: Manifest syn27116926 does not contain the value sdsadasd from row 3 of the attribute HTANParentBiospecimenID in the source manifest.
error: Manifest syn27116926 does not contain the value HTA3_8001_1002 from row 4 of the attribute HTANParentBiospecimenID in the source manifest.
error: Manifest syn30560343 does not contain the value HTA111111_121324 from row 2 of the attribute HTANParentBiospecimenID in the source manifest.
error: Manifest syn30560343 does not contain the value sdsadasd from row 3 of the attribute HTANParentBiospecimenID in the source manifest.
error: Manifest syn30560343 does not contain the value HTA3_8001_1002 from row 4 of the attribute HTANParentBiospecimenID in the source manifest.
hyperfine --warmup 1 --runs 5 'schematic model -c schematic_config.yml validate -mp ~/Downloads/htan/test-cross-link.csv -dt ScRNA-seqLevel1 -ps syn20977135'
Time (mean ± σ): 120.861 s ± 66.810 s [User: 16.036 s, System: 1.864 s]
Range (min … max): 54.659 s … 215.867 s 5 runs
Validation is definitely faster using project scope. I think ~2 min on average is okay for me to use and test one or two cross-manifest validations. I notice that most of the time on my end is spent querying the fileview table?
Downloading [####################]100.00% 74.9MB/74.9MB (538.7kB/s) SYNAPSE_TABLE_QUERY_95623552.csv.synapse_download_95623552 Done...
Since the DCA has already stored the synapseStorage object at the beginning, @milen-sage I wonder if it is possible to use the synapseStorage object directly as input to avoid downloading it again? I can still proceed to implement this in the DCA without this enhancement.
@GiaJordan I tested the develop-crossM-union-rule branch:
using "matchAtLeastOne Biospecimen.HTANBiospecimenID set":
It looks like it is validating individual values?
warning: Value(s) ['HTA111111_121324'] from row(s) [2] of the attribute htanparentbiospecimenid in the source manifest are missing. Manifest(s) ['syn27116926', 'syn30560343'] are missing the value(s).
Your manifest has been validated successfully. There are no errors in your manifest, and it can be submitted without any modifications.
using "matchAtLeastOne Biospecimen.HTANBiospecimenID value":
warning: Value(s) ['HTA111111_121324'] from row(s) [2] of the attribute htanparentbiospecimenid in the source manifest are missing. Manifest(s) ['syn27116926', 'syn30560343'] are missing the value(s).
Your manifest has been validated successfully. There are no errors in your manifest, and it can be submitted without any modifications.
FYI: currently, the DCA will only extract the error list (4 elements) from the validation results and reformat it into a table:
Row: 1st element
Column: 2nd element
Value: 4th element
Error: 3rd element (anything starting with "not")
Warnings could also be used as long as they have the same format as errors. I found that almost all errors from schematic contain "not". To simplify the UI error message, only the characters after "not" (including "not") are used. So the error message (3rd element of the error output list) should preferably contain "not".
@rrchai yes, let's give reusing the fileview object a try! If this avoids the download time, it would be worth it for sure.
In this case it sounds like there are two items:
integration of the cross-manifest rule in the DCA and the dashboard
fileview object reuse in the dashboard
If the implementation of 2 doesn't affect 1 too much, you can start with 1 and then move on to 2.
On the other hand, if 2 will affect significantly how 1 is implemented, proceeding with 2 and testing it first would make more sense.
My intuition is that 2 shouldn't affect too much 1. What do you think?
@rrchai Thanks for the feedback. It will validate on the set or value level, but in either case it'll display which values are causing the invalidity. And while they've been switched to warnings, the format is still the same.
In response to the query being the limiting step currently, I added the project scope to the table query if one is provided. That should help to speed things up more. This change is still the cross manifest branch, so you can pull the latest version to try.
cc @milen-sage
okay, my internet issue has been fixed and the downloading speed is back to normal:
Testing download speed................................................................................
Download: 170.74 Mbit/s
Testing upload speed......................................................................................................
Upload: 9.92 Mbit/s
I retested the latest develop branch and downloading fileview table is not as slow as before with ^ speed 😄 :
Using the latest develop branch:
hyperfine --warmup 1 --runs 10 'schematic model -c schematic_config.yml validate -mp ~/Downloads/htan/test-cross-link.csv -dt ScRNA-seqLevel1 -ps syn20977135'
Time (mean ± σ): 42.877 s ± 5.390 s [User: 15.350 s, System: 1.220 s]
Range (min … max): 37.302 s … 51.442 s 10 runs
Using develop-crossM-union-rule branch:
hyperfine --warmup 1 --runs 10 'schematic model -c schematic_config.yml validate -mp ~/Downloads/htan/test-cross-link.csv -dt ScRNA-seqLevel1 -ps syn20977135'
Time (mean ± σ): 23.530 s ± 2.253 s [User: 5.753 s, System: 0.740 s]
Range (min … max): 20.351 s … 26.741 s 10 runs
@GiaJordan Thank you for the improvement. It looks like the new querying reduces the time significantly, from ~40 s to ~20 s 🚀 !
My intuition is that 2 shouldn't affect too much 1. What do you think?
Based on the benchmarking results, I think downloading the fileview object will affect the overall speed of validation. Since @GiaJordan has already improved the query process and ~20 s for one cross-manifest validation is good to me, I can start to implement the cross-manifest rules in the app now.
In short, from testing with one cross-manifest rule on my connection: the faster your download speed is and the fewer projects/data there are to query/download, the faster the cross-manifest validation will be.
@rrchai Thanks for the benchmarking and updates!
Further in the future, we'll likely experiment with converting the synapse storage object to a sentinel object that can be reused within one validation run. This would expedite the validation further in cases where multiple attributes use the cross-manifest validation rule.
Looks like we just need to review a PR and this will be ready to close?
@milen-sage can we then merge #751 as well?
| gharchive/issue | 2022-03-31T14:18:09 | 2025-04-01T06:37:31.579272 | {
"authors": [
"GiaJordan",
"milen-sage",
"rrchai",
"ychae"
],
"repo": "Sage-Bionetworks/data_curator",
"url": "https://github.com/Sage-Bionetworks/data_curator/issues/309",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
247325968 | How can we add like a Pod?
Hello,
Can we add SRPopView using pod?
thanks in advance.
Vishal
Yes, I am working on a Swift version as well, with some enhancements; I will publish it as a pod in a few days.
Pod version is available
Ok thanks
I am adding to the trunk pod so it might take a few hours to take effect, so watch out for the update here
sure.
| gharchive/issue | 2017-08-02T09:25:17 | 2025-04-01T06:37:31.607491 | {
"authors": [
"SahebRoy92",
"vishaldeshai"
],
"repo": "SahebRoy92/SRPopView",
"url": "https://github.com/SahebRoy92/SRPopView/issues/2",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
1506617838 | cleared
issue cleared
cleared
cleared
| gharchive/issue | 2022-12-21T16:51:23 | 2025-04-01T06:37:31.611042 | {
"authors": [
"SaikiranVoladri"
],
"repo": "SaikiranVoladri/Java-Placement-course",
"url": "https://github.com/SaikiranVoladri/Java-Placement-course/issues/4",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2231847952 | Set persistent workers for validation dataloader
Following Lightning suggestion ⚡
Codecov Report
All modified and coverable lines are covered by tests :white_check_mark:
Project coverage is 31.07%. Comparing base (d0a8066) to head (d62facf).
Additional details and impacted files
@@ Coverage Diff @@
## main #151 +/- ##
=======================================
Coverage 31.07% 31.07%
=======================================
Files 19 19
Lines 1223 1223
=======================================
Hits 380 380
Misses 843 843
:umbrella: View full report in Codecov by Sentry.
:loudspeaker: Have feedback on the report? Share it here.
| gharchive/pull-request | 2024-04-08T18:32:16 | 2025-04-01T06:37:31.618686 | {
"authors": [
"codecov-commenter",
"sfmig"
],
"repo": "SainsburyWellcomeCentre/crabs-exploration",
"url": "https://github.com/SainsburyWellcomeCentre/crabs-exploration/pull/151",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
} |
1770718090 | Update to 1.20x
Update to 1.20x
I will soon, just finalizing changes for 1.19.4 before thinking of dividing my work
| gharchive/issue | 2023-06-23T03:28:41 | 2025-04-01T06:37:31.619721 | {
"authors": [
"SaishoVibes",
"gitdrug"
],
"repo": "SaishoVibes/Back-in-Classic-Fabric",
"url": "https://github.com/SaishoVibes/Back-in-Classic-Fabric/issues/2",
"license": "CC0-1.0",
"license_type": "permissive",
"license_source": "github-api"
} |
1905836482 | ACCT_ViewOverride_CTRL Security Fix
Critical Changes
Changes
Issues Closed
Community Ideas Delivered
Features Intended for Future Release
Features for Elevate Customers
New Metadata
Deleted Metadata
W-13641227
| gharchive/pull-request | 2023-09-20T23:14:32 | 2025-04-01T06:37:31.643860 | {
"authors": [
"npsp-reedestockton"
],
"repo": "SalesforceFoundation/NPSP",
"url": "https://github.com/SalesforceFoundation/NPSP/pull/7193",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
} |
1012797152 | basic algorithms
#1
Hactoberfest-accepted
hacktoberest-accepted
| gharchive/pull-request | 2021-10-01T02:03:00 | 2025-04-01T06:37:31.650504 | {
"authors": [
"Sam071100",
"vaibhavgarg237"
],
"repo": "Sam071100/CodeForces-Solutions",
"url": "https://github.com/Sam071100/CodeForces-Solutions/pull/3",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
610359262 | Type issue while onHover and onClick triggers.
Hello, I am facing an issue where the gl geocoder gets disabled after adding onHover and onClick handlers to the MapGL instance.
Let me know how I can resolve this.
Hi @I1mran, is it possible for you to provide a Code Sandbox (https://codesandbox.io/) that demonstrates this issue?
I don't have code right now for sharing.
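For reference, a minimal reproduction might start from something like the sketch below. This is only a hedged illustration: it assumes the v5-era react-map-gl API (onClick/onHover props) and the Geocoder component's documented mapRef / mapboxApiAccessToken / onViewportChange / position props, and the token and viewport values are placeholders.

import React, { useRef, useState, useCallback } from "react";
import ReactMapGL from "react-map-gl";
import Geocoder from "react-map-gl-geocoder";

const MAPBOX_TOKEN = "<your-mapbox-token>"; // placeholder, not a real token

export function GeocoderRepro() {
  const mapRef = useRef<any>(null);
  const [viewport, setViewport] = useState({
    latitude: 37.78,
    longitude: -122.41,
    zoom: 8,
  });

  // Both the map and the geocoder funnel viewport updates through here.
  const handleViewportChange = useCallback((next: any) => setViewport(next), []);

  return (
    <ReactMapGL
      ref={mapRef}
      {...viewport}
      width="100%"
      height="400px"
      mapboxApiAccessToken={MAPBOX_TOKEN}
      onViewportChange={handleViewportChange}
      // The handlers that reportedly interfere with the geocoder:
      onClick={(e: any) => console.log("click", e.lngLat)}
      onHover={(e: any) => console.log("hover", e.lngLat)}
    >
      <Geocoder
        mapRef={mapRef}
        mapboxApiAccessToken={MAPBOX_TOKEN}
        onViewportChange={handleViewportChange}
        position="top-left"
      />
    </ReactMapGL>
  );
}

Dropping this into a Code Sandbox and toggling the onClick/onHover props would make it easier to confirm whether the handlers are really what disables the geocoder.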
| gharchive/issue | 2020-04-30T20:19:01 | 2025-04-01T06:37:31.654190 | {
"authors": [
"I1mran",
"SamSamskies"
],
"repo": "SamSamskies/react-map-gl-geocoder",
"url": "https://github.com/SamSamskies/react-map-gl-geocoder/issues/63",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1216083483 | Open to new maintainers?
Hi @SamVerschueren! Would you be open to new maintainers? It seems like this dependency will forever be a part of our dep tree, so it'd be great to see its dependencies updated (keeps getting flagged by vulnerability scanners, even though we only use it in local, non-production contexts).
I'm totally down for new maintainers as I don't have much time myself anymore and have different priorities.
Great! I'll volunteer. How do you want to move forward?
| gharchive/issue | 2022-04-26T14:55:45 | 2025-04-01T06:37:31.658599 | {
"authors": [
"SamVerschueren",
"benasher44"
],
"repo": "SamVerschueren/listr-update-renderer",
"url": "https://github.com/SamVerschueren/listr-update-renderer/issues/31",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
146687236 | add csrf protection for resources available to everyone
Reservations made through sulten were previously vulnerable to CSRF attacks.
This manually implements the default behaviour in Rails 4.
https://github.com/rails/rails/blob/master/actionpack/lib/action_controller/metal/request_forgery_protection.rb#L194
This manually implements the behaviour in rails 4
| gharchive/pull-request | 2016-04-07T17:12:08 | 2025-04-01T06:37:31.664753 | {
"authors": [
"FluenesHerre"
],
"repo": "Samfundet/Samfundet",
"url": "https://github.com/Samfundet/Samfundet/pull/58",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
1182696960 | Implementation of Installation of Compiling Env & Execution Env
Let's work on compiling env & execution env.
In this issue, only "local installation of debian package" will be handled.
[ ] designing UI for compiling env
[ ] designing UI for execution env
[ ] implementation of compiling env
[ ] implementation of execution env
[ ] implementation of auto detect (finding if any existing local env already exists)
find out the following
[ ] are they working
[ ] implementation of uninstall
[ ] implementation of install
[ ] implementation of install over existing one
Out of scope
remote installation
docker installation
supporting various backends
make it for existing backend first
consider various backends for easier refactoring later
but find clues for various backend support
/cc @jyoungyun
Compiling env / execution env are very important concepts, I think, and I'm interested in them. Could you share your concrete ideas for implementing them?
Related issue: #306
This issue was not done.
I think it's better to close this and reschedule after discussing with @jyoungyun.
| gharchive/issue | 2022-03-27T23:17:23 | 2025-04-01T06:37:31.670953 | {
"authors": [
"YongseopKim",
"hyunsik-yoon",
"jyoungyun"
],
"repo": "Samsung/ONE-vscode",
"url": "https://github.com/Samsung/ONE-vscode/issues/414",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
1203945042 | Let's define and schdule for ONE-context-explorer and ONE-context-explorer-detail views
[ ] Define this task: ~4/14(Thr)
[ ] Divide/Separate this task and schedule: ~4/15(Fri)
Maybe it's like the below things?
UI Tasks: task0(M1), task1(M2), ...
Internal Impl Tasks for UI: task0(M1), task1(M2), ...
[ ] Make issues and assign on our ONE-project
/cc @Samsung/one-vscode
(Suggestion)
About the naming for those two views we discussed,
how about 'explorer view' and 'detail view' ?
Things to concern
If we are going to regard all tflite, onnx, ... model files as a 'base model' and the npu model file as the 'target model', our extension will not be very extensible for other model compiler usages.
For example, for a model compiler that basically wants to do onnx-to-tflite jobs, it would not work properly.
Therefore, I suggest the concept of a 'base model'. Let's have the user select a 'base model' extension, let the 'explorer' handle only those, and let the 'detail view' collect the other files.
We may start by blocking options other than 'tflite' to make our job easier.
First of all, we need to discuss whether we support converting onnx to tflite in our ONE-vscode. How about talking about this subject next Monday?
First of all, we need to discuss whether we support converting onnx to tflite in our ONE-vscode.
My suggestion was based on extensibility rather than actual support. I heard that our extension is kind of meant to support other compilers.
I agree. I am closing this on behalf of other members.
| gharchive/issue | 2022-04-14T01:46:10 | 2025-04-01T06:37:31.675789 | {
"authors": [
"YongseopKim",
"dayo09",
"jyoungyun"
],
"repo": "Samsung/ONE-vscode",
"url": "https://github.com/Samsung/ONE-vscode/issues/476",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
1904697846 | [Execute] Introduce DeviceViewNode by device type
This commit introduces DeviceViewNode by device type.
ONE-vscode-DCO-1.0-Signed-off-by: Jiyoung Yun jy910.yun@samsung.com
This PR works normally only when #1660 and #1661 PRs are merged together.
| gharchive/pull-request | 2023-09-20T10:35:19 | 2025-04-01T06:37:31.677253 | {
"authors": [
"jyoungyun"
],
"repo": "Samsung/ONE-vscode",
"url": "https://github.com/Samsung/ONE-vscode/pull/1662",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
1132889199 | On Deleting a task confetti should not pop up
You must have noticed the new confetti feature on our website. The problem is that it appears even when a user deletes a task (by clicking the cross button). To resolve this, the API call made when the user deletes a task and the API call made when the user completes a task should be placed in separate functions.
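A minimal sketch of the split, with hypothetical helper names just to illustrate the idea:
// Confetti only when a task is completed
async function handleCompleteTask(taskId) {
  await completeTaskApi(taskId); // hypothetical API helper
  launchConfetti();              // hypothetical confetti trigger
}

// No confetti when a task is deleted via the cross button
async function handleDeleteTask(taskId) {
  await deleteTaskApi(taskId);   // hypothetical API helper
}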
@SaraswatGit I want to work on this issue under JWoC.
@Aniket-508 I have assigned the issue to you. Best wishes! For any problem, reach out to me on Discord.
| gharchive/issue | 2022-02-11T17:26:27 | 2025-04-01T06:37:31.732284 | {
"authors": [
"Aniket-508",
"SaraswatGit"
],
"repo": "SaraswatGit/PlanZap",
"url": "https://github.com/SaraswatGit/PlanZap/issues/44",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
504760771 | Added gyolo
When you type gyolo, it will pull a random commit message from the internet and commit the changes with that message.
@mrsupiri If you are interested, you can add something like Semantic or Conventional commit style.
| gharchive/pull-request | 2019-10-09T16:29:24 | 2025-04-01T06:37:31.733489 | {
"authors": [
"SarathSantoshDamaraju",
"mrsupiri"
],
"repo": "SarathSantoshDamaraju/lazyGit",
"url": "https://github.com/SarathSantoshDamaraju/lazyGit/pull/20",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1021601405 | MergeSort Without Taking Extra Space
Enter your question -
Merge Without Extra Space
Enter link to the question(if question belongs to any online platform) -
Merge Without Extra Space
Tags for the question(eg - Array, Basic, Stack, etc.) -
Array,Sorting
@RAshid602,
Kindly add your solution to "GeeksForGeeks" folder.
Deadline - 10/10/2021
Can I work on this?
@sRahul-00,
It's great to see your interest in solving the question, but the deadline isn't over yet. You may refer to CONTRIBUTING.md for further details and guidelines on how to contribute to this repo.
| gharchive/issue | 2021-10-09T04:52:52 | 2025-04-01T06:37:31.736766 | {
"authors": [
"RAshid602",
"SarthakKeshari",
"sRahul-00"
],
"repo": "SarthakKeshari/Java-Questions-and-Solutions",
"url": "https://github.com/SarthakKeshari/Java-Questions-and-Solutions/issues/364",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1012357489 | Need to standardize grammar, labels, language style
[ ] replace all straight/dumb quotation marks "" with fancy/smart quotes “” when needed as actual quotations. Terms and categories should be indicated using bold tags
[ ] use of HET vs Health Equity Tracker
[ ] need to define what is in the term itself, vs. what is a logical combination of multiple defined groups. &, and, or. When to use commas? When to use Oxford commas? Josh noted CDC uses /
hold on this until larger website redesign and copy updates with Mahia
| gharchive/issue | 2021-09-30T16:02:37 | 2025-04-01T06:37:31.760441 | {
"authors": [
"benhammondmusic",
"jgonzalezmsm"
],
"repo": "SatcherInstitute/health-equity-tracker",
"url": "https://github.com/SatcherInstitute/health-equity-tracker/issues/1184",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1750311753 | chore(chats): reactions position
STR:
go to chat
send message
right click
suggestion - when we add a reaction, it goes below the message, but the reaction list is above the message; shouldn't both be either on top or below the message?
macOS, m1
I think we should eventually use the emoji picker for the reactions or at least have options be in the context menu when you right click rather than be an additional action. Maybe we have a list of common reaction emoji and an extra button to view them all?
| gharchive/issue | 2023-06-09T18:15:37 | 2025-04-01T06:37:31.767395 | {
"authors": [
"XileHorizon",
"stavares843"
],
"repo": "Satellite-im/Uplink",
"url": "https://github.com/Satellite-im/Uplink/issues/874",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2386474972 | update(Friend): Create toast message for friend request and delete modals
What this PR does 📖
1. Create toast messages for friend request actions and delete modals
2. After sending a friend request, whether it succeeds or not, the DID is cleared from the input.
3. Fix a null problem on current dev with an if check
4. Update validation rules for the username input, and add rules for the status message
Which issue(s) this PR fixes 🔨
Resolve #181
Special notes for reviewers 🗒️
Additional comments 🎤
Tested the following
On Create Account screen:
Pin input empty - passed
Pin input less than 4 characters - passed
Pin input with spaces - passed
Pin input with non-alphanumeric chars - passed
Pin input with more than 32 chars - passed
Status input with more than 128 chars - passed
Noticed that create user button is always enabled even though there is an error input value (we might need a new ticket for this)
On Friend Requests
The modal is gone and a toast notification is now displayed after sending a friend request (success and to yourself) - Passed
Friend DID Input field is cleared out after sending friend request - Passed
Null Problem - Passed (app compiled correctly on branch)
Settings Profile
Same scenarios above passed
Noticed that Save/Cancel buttons are still displayed when there is a wrong input value. If user clicks on Save nothing happens
One more weird scenario found:
On settings profile, enter a new username - good value
Enter now a status exceeding 128 chars
Click on Save button
Profile updated toast notification will show
| gharchive/pull-request | 2024-07-02T14:50:23 | 2025-04-01T06:37:31.774860 | {
"authors": [
"lgmarchi",
"luisecm"
],
"repo": "Satellite-im/UplinkWeb",
"url": "https://github.com/Satellite-im/UplinkWeb/pull/187",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
394617296 | Range: when resetting the form value, the values of begin and end are not reset.
When in single selection mode, if we reset the form value driving the selected date, then the value of selectedValue is correctly reset in the SatCalendarBody component.
But in range mode, this is not the case: the values of begin and end in SatCalendarBody remain unchanged.
I think this is a bug.
How do you reset value?
I tried with both ngModel and formControl (respectively resetting the form by setting the model to null and with form.reset()) and I have the same problem.
But indeed, I made more tests and the problem happens only when I reset the form immediately after the dateChange event.
Wrapping the reset in a setTimeout(() => {...}, 0) solves the problem.
Moreover, this only happens when the closeAfterSelection option is false.
So perhaps it could be solved with a markForCheck somewhere but I don't know where.
Here is a sample to reproduce the bug:
import { Component } from '@angular/core';
import {FormBuilder, FormGroup} from "@angular/forms";
@Component({
selector: 'app-root',
template: `
<form [formGroup]="form">
<mat-form-field>
<input matInput
placeholder="Choose a date"
[satDatepicker]="picker"
(dateChange)="dateChange($event)"
formControlName="date">
<sat-datepicker #picker [rangeMode]="true"
[closeAfterSelection]="false">
</sat-datepicker>
<sat-datepicker-toggle matSuffix [for]="picker"></sat-datepicker-toggle>
</mat-form-field>
</form>
`,
styleUrls: ['./app.component.scss']
})
export class AppComponent {
form: FormGroup;
constructor(fb: FormBuilder) {
this.form = fb.group({
date: [{ begin: new Date(2018, 7, 5), end: new Date(2018, 7, 25) }]
});
}
dateChange(event) {
// Do some stuff
this.form.reset();
}
}
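A minimal sketch of the workaround I mentioned (deferring the reset to the next tick with setTimeout), in case it helps; this is just the timing trick, not a proper fix in the library:
dateChange(event) {
  // Do some stuff
  // Deferring the reset avoids the stale begin/end values in SatCalendarBody
  // when closeAfterSelection is false.
  setTimeout(() => this.form.reset(), 0);
}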
Hope this helps: to reset, I have tried passing/assigning the [beginDate] and [endDate] values as null on sat-calendar.
Hope this helps: to reset, I have tried passing/assigning the [beginDate] and [endDate] values as null on sat-calendar.
This doesn't work!
I tried it this way, but it doesn't work either:
@ViewChild(SatCalendar, {static: true}) calendar: SatCalendar<Date> this.calendar.beginDateSelected = false; this.calendar.beginDate = null; this.calendar.maxDate = null; this.calendar.selected = null; this.calendar.startAt = null; ;
So it's a BUG in the library and must be fixed, or a new solution provided.
Hope this helps: to reset, I have tried passing/assigning the [beginDate] and [endDate] values as null on sat-calendar.
This doesn't work!
I tried it this way, but it doesn't work either:
@ViewChild(SatCalendar, {static: true}) calendar: SatCalendar<Date> this.calendar.beginDateSelected = false; this.calendar.beginDate = null; this.calendar.maxDate = null; this.calendar.selected = null; this.calendar.startAt = null; ;
So it's a BUG in the library and must be fixed, or a new solution provided.
You have to pass the values (beginDate and endDate) like below in the sat-calendar element:
<sat-calendar
[beginDate]="calendar.beginDate"
[endDate] = "calendar.maxDate">
My setup is:
The user clicks one day and the whole week is selected (from Monday to Sunday). When they click another day, the selection resets and they re-click a day to select that week.
The bug: when the user (after clicking a day and selecting the week) re-clicks the same day, or another day in the selected week, Saturn fires (beginDateSelectedChange) and this behaves badly.
Example StackBlitz
So I would like to reset the range of days, or be able to set beginDate = null.
Hope this helps: to reset, I have tried passing/assigning the [beginDate] and [endDate] values as null on sat-calendar.
This doesn't work!
I tried it this way, but it doesn't work either:
@ViewChild(SatCalendar, {static: true}) calendar: SatCalendar<Date> this.calendar.beginDateSelected = false; this.calendar.beginDate = null; this.calendar.maxDate = null; this.calendar.selected = null; this.calendar.startAt = null; ;
So it's a BUG in the library and must be fixed, or a new solution provided.
You have to pass the values (beginDate and endDate) like below in the sat-calendar element:
<sat-calendar
[beginDate]="calendar.beginDate"
[endDate] = "calendar.maxDate">
My setup is:
The user clicks one day and the whole week is selected (from Monday to Sunday). When they click another day, the selection resets and they re-click a day to select that week.
The bug: when the user (after clicking a day and selecting the week) re-clicks the same day, or another day in the selected week, Saturn fires (beginDateSelectedChange) and this behaves badly.
Example StackBlitz
So I would like to reset the range of days, or be able to set beginDate = null.
Hope this helps: to reset, I have tried passing/assigning the [beginDate] and [endDate] values as null on sat-calendar.
This doesn't work!
I tried it this way, but it doesn't work either:
@ViewChild(SatCalendar, {static: true}) calendar: SatCalendar<Date> this.calendar.beginDateSelected = false; this.calendar.beginDate = null; this.calendar.maxDate = null; this.calendar.selected = null; this.calendar.startAt = null; ;
So it's a BUG in the library and must be fixed, or a new solution provided.
You have to pass the values (beginDate and endDate) like below in the sat-calendar element:
<sat-calendar
[beginDate]="calendar.beginDate"
[endDate] = "calendar.maxDate">
My setup is:
The user clicks one day and the whole week is selected (from Monday to Sunday). When they click another day, the selection resets and they re-click a day to select that week.
The bug: when the user (after clicking a day and selecting the week) re-clicks the same day, or another day in the selected week, Saturn fires (beginDateSelectedChange) and this behaves badly.
Example StackBlitz
So I would like to reset the range of days, or be able to set beginDate = null.
NOTE: I don't know why, but StackBlitz doesn't work with momentjs, so you must run the example locally to see it working.
Any solution boys??
Hope this helps: to reset, I have tried passing/assigning the [beginDate] and [endDate] values as null on sat-calendar.
This doesn't work!
I tried it this way, but it doesn't work either:
@ViewChild(SatCalendar, {static: true}) calendar: SatCalendar<Date> this.calendar.beginDateSelected = false; this.calendar.beginDate = null; this.calendar.maxDate = null; this.calendar.selected = null; this.calendar.startAt = null; ;
So it's a BUG in the library and must be fixed, or a new solution provided.
You have to pass the values (beginDate and endDate) like below in the sat-calendar element:
<sat-calendar
[beginDate]="calendar.beginDate"
[endDate] = "calendar.maxDate">
My setup is:
The user clicks one day and the whole week is selected (from Monday to Sunday). When they click another day, the selection resets and they re-click a day to select that week.
The bug: when the user (after clicking a day and selecting the week) re-clicks the same day, or another day in the selected week, Saturn fires (beginDateSelectedChange) and this behaves badly.
Example StackBlitz
So I would like to reset the range of days, or be able to set beginDate = null.
NOTE: I don't know why, but StackBlitz doesn't work with momentjs, so you must run the example locally to see it working.
TEMPORARY SOLUTION:
I added the following SCSS/CSS rules to the root stylesheet (style.scss):
.mat-calendar-body-selected {
pointer-events: none;
cursor: default;
text-decoration: none;
color: black;
}
.mat-calendar-body-begin-range:not(.mat-calendar-body-end-range) {
pointer-events: none;
cursor: default;
text-decoration: none;
color: black;
}
.mat-calendar-body-end-range:not(.mat-calendar-body-begin-range) {
pointer-events: none;
cursor: default;
text-decoration: none;
color: black;
}
.mat-calendar-cell-semi-selected {
pointer-events: none;
cursor: default;
text-decoration: none;
color: black;
}
| gharchive/issue | 2018-12-28T11:36:23 | 2025-04-01T06:37:31.821877 | {
"authors": [
"SaturnTeam",
"Zacknero",
"agiratech-kumaresanj",
"sancaruso"
],
"repo": "SaturnTeam/saturn-datepicker",
"url": "https://github.com/SaturnTeam/saturn-datepicker/issues/58",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1719086655 | Font size formatting is lost when merging Word documents
The main document consists mostly of paragraphs of text, and the sub-document contains a chart (radar chart). After merging, the font size of the body paragraphs in the main document is lost; the font size of the headings is normal, and the font color of the body text is normal.
A reproducible unit test is needed.
Auto closed.
| gharchive/issue | 2023-05-22T07:37:51 | 2025-04-01T06:37:31.829253 | {
"authors": [
"Sayi",
"kpcmcn"
],
"repo": "Sayi/poi-tl",
"url": "https://github.com/Sayi/poi-tl/issues/985",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
} |
2753972414 | fix some bugs
Fix a memory leak when the simulator executes custom gates
add_noise can turn off checkgate so that noise can be added to custom gates
Optimize the probability and counts calculations
Could we add a test case for the simulator executing custom gates? This feature was rarely used before, so the memory leak was never noticed.
Could we add a test case for the simulator executing custom gates? This feature was rarely used before, so the memory leak was never noticed.
Can this actually be caught by a test? Wouldn't it require a specific tool?
| gharchive/pull-request | 2024-12-21T12:26:13 | 2025-04-01T06:37:31.830861 | {
"authors": [
"Zhaoyilunnn",
"lss0208"
],
"repo": "ScQ-Cloud/pyquafu",
"url": "https://github.com/ScQ-Cloud/pyquafu/pull/203",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
2368900336 | export not exporting all highlights
I'm suspicious that the export_highlights function is not giving me all my highlights.
I ran it recently and only received 82 total notes in Apple Notes. I have way more than that...
I think it has something to do with the generator or the API wrapper I'm using.
def export_highlights(
updated_after: str = None, book_ids: str = None, token: str = None
) -> Generator[dict, None, None]:
"""Exports the highlights of books based on modification date and/or specific book IDs.
This function iterates over pages of highlights fetched from the client service,
filtering by update time and book IDs if provided, and yields each highlight.
Parameters:
updated_after (str, optional): The ISO 8601 date string to filter highlights
that were updated after a certain date. Defaults to None.
book_ids (str, optional): A comma-separated string of book IDs to filter
highlights by specific books. Defaults to None.
token (str): A Readwise API token. Default to None.
Yields:
dict: A dictionary representing a single book's highlight.
"""
client = get_client(token)
params = {}
if updated_after:
params["updatedAfter"] = updated_after
if book_ids:
params["ids"] = book_ids
for data in client.get_pagination_limit_20("/export/", params=params):
for book in data["results"]:
yield book
The issue was in the API wrapper (pyreadwise) not requesting the next page in the pagination. Based on the API documentation, the /export/ endpoint uses the parameter pageCursor while the other endpoints like /highlights/ use page.
pageCursor – (Optional) A string returned by a previous request to this endpoint. Use it to get the next page of books/highlights if there are too many for one request.
page – specify the pagination counter.
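For reference, here is a rough sketch of cursor-based paging against the /export/ endpoint using requests directly; the nextPageCursor field name follows the Readwise API docs, so treat it as an assumption and adjust if the response differs:
import requests

def export_all(token: str, updated_after: str = None):
    """Yield every book from /export/, following nextPageCursor until exhausted."""
    params = {}
    if updated_after:
        params["updatedAfter"] = updated_after
    next_cursor = None
    while True:
        if next_cursor:
            params["pageCursor"] = next_cursor
        resp = requests.get(
            "https://readwise.io/api/v2/export/",
            headers={"Authorization": f"Token {token}"},
            params=params,
        )
        resp.raise_for_status()
        data = resp.json()
        # Yield each book, then follow the cursor to the next page (if any)
        yield from data["results"]
        next_cursor = data.get("nextPageCursor")
        if not next_cursor:
            break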
I am deciding whether to create a pull request in the original API wrapper repo or write my own. I'm leaning toward making a pull request.
| gharchive/issue | 2024-06-23T21:39:59 | 2025-04-01T06:37:31.837646 | {
"authors": [
"Scarvy"
],
"repo": "Scarvy/readwise-to-apple-notes",
"url": "https://github.com/Scarvy/readwise-to-apple-notes/issues/4",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
54488838 | Multiple indices on same column are ignored when loading schema on Postgres
When adding a table index with the same column of a foreign key index created by schema_plus (i.e. two indices on the same column with different names), loading the schema will ignore the first index declared in the create_table block for the table.
[This is on OS X 10.9, Postgres 9.3.5 and Ruby 2.1.5]
I have a test Rails application you can use to reproduce the problem. Clone the repo on the index branch, create the database, and try loading and then dumping the schema. (I used rake db:drop db:create db:schema:load db:schema:dump.)
The following diff comes up:
diff --git a/db/schema.rb b/db/schema.rb
index c2bbaf7..2cdbfff 100644
--- a/db/schema.rb
+++ b/db/schema.rb
@@ -28,7 +28,6 @@ ActiveRecord::Schema.define(version: 20150115174027) do
t.integer "airplane_id"
t.integer "pilot_id"
t.index ["airplane_id"], :name => "fk__airplanes_pilots_airplane_id"
- t.index ["pilot_id"], :name => "fk__airplanes_pilots_pilot_id"
t.index ["pilot_id"], :name => "index_airplanes_pilots_on_pilot_id", :unique => true
t.foreign_key ["airplane_id"], "airplanes", ["id"], :on_update => :no_action, :on_delete => :no_action, :name => "fk_airplanes_pilots_airplane_id"
t.foreign_key ["pilot_id"], "pilots", ["id"], :on_update => :no_action, :on_delete => :no_action, :name => "fk_airplanes_pilots_pilot_id"
If you manually change the order of the two indices on pilot_id, and load & dump the schema, then the diff becomes:
diff --git a/db/schema.rb b/db/schema.rb
index c2bbaf7..749bd9d 100644
--- a/db/schema.rb
+++ b/db/schema.rb
@@ -29,7 +29,6 @@ ActiveRecord::Schema.define(version: 20150115174027) do
t.integer "pilot_id"
t.index ["airplane_id"], :name => "fk__airplanes_pilots_airplane_id"
t.index ["pilot_id"], :name => "fk__airplanes_pilots_pilot_id"
- t.index ["pilot_id"], :name => "index_airplanes_pilots_on_pilot_id", :unique => true
t.foreign_key ["airplane_id"], "airplanes", ["id"], :on_update => :no_action, :on_delete => :no_action, :name => "fk_airplanes_pilots_airplane_id"
t.foreign_key ["pilot_id"], "pilots", ["id"], :on_update => :no_action, :on_delete => :no_action, :name => "fk_airplanes_pilots_pilot_id"
end
:+1:, I have this issue too.
The expected schema_plus behavior for me is that the schema does not have a diff after loading and dumping.
@thmzlt thanks for (yet another!) detailed bug report.
This is arguably a bug in rails: t.index only handles one index per column, see https://github.com/rails/rails/blob/4-1-stable/activerecord/lib/active_record/connection_adapters/abstract/schema_definitions.rb#L238. It's tickled by schema_plus though, because rails' schema dumper defines indexes using add_index outside the create_table block, whereas schema_plus tidies the dump by defining them using t.index.
That said, it's probably bad form to have two indexes on the same field anyway; the database will be doing extra unneeded work to maintain them both. No?
The reason you're getting two is that in your migration
create_table :airplanes_pilots do |t|
t.references :airplane
t.references :pilot
end
add_index :airplanes_pilots, [:pilot_id], unique: true
schema_plus (by default) auto-creates an index for the foreign key constraint; and the add_index statement creates a second one. In order to get just one index and have it be unique, you can do:
create_table :airplanes_pilots do |t|
t.references :airplane
t.references :pilot, index: { unique: true } # or just index: :unique
end
I could fix the bug by having schema_plus dump the extra indexes for a column using add_index outside the block. But since this seems like a bug only when doing something that probably shouldn't be done, and there's another way to achieve the result you want, I'm thinking I'll let this slide.
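For illustration only, the dump for the extra index could then look roughly like this (index names taken from the diff above; just a sketch of the idea, not actual schema_plus output):
create_table "airplanes_pilots" do |t|
  t.integer "pilot_id"
  t.index ["pilot_id"], :name => "fk__airplanes_pilots_pilot_id"
end
add_index "airplanes_pilots", ["pilot_id"], :name => "index_airplanes_pilots_on_pilot_id", :unique => true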
FWIW this is fixed in the schema_plus 2.0 prerelease, i.e. in the new schema_plus_indexes gem.
Thanks @ronen!
| gharchive/issue | 2015-01-15T18:56:35 | 2025-04-01T06:37:31.851690 | {
"authors": [
"fj",
"ronen",
"thmzlt"
],
"repo": "SchemaPlus/schema_plus",
"url": "https://github.com/SchemaPlus/schema_plus/issues/196",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
285273097 | Update to enable DynamoDB
Added a more resilient response reader and changed DynamoDB to force HTTPS
Thanks for your contribution - after some testing I will tag a new release probably by end of this week.
| gharchive/pull-request | 2017-12-31T19:29:19 | 2025-04-01T06:37:31.855914 | {
"authors": [
"Schm1tz1",
"brainstain"
],
"repo": "Schm1tz1/aws-sdk-arduino-esp8266",
"url": "https://github.com/Schm1tz1/aws-sdk-arduino-esp8266/pull/2",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
2337485095 | ⚠️ Jenkins Plugins Artifactory has degraded performance
In b8ee78e, Jenkins Plugins Artifactory (https://get.jenkins.io/plugins/artifactory/) experienced degraded performance:
HTTP code: 200
Response time: 1230 ms
Resolved: Jenkins Plugins Artifactory performance has improved in 37b41fe after 35 minutes.
| gharchive/issue | 2024-06-06T06:57:23 | 2025-04-01T06:37:31.859306 | {
"authors": [
"eifelmicha"
],
"repo": "Schmitzis/Uptime",
"url": "https://github.com/Schmitzis/Uptime/issues/1769",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
161129845 | Add a subscribe page
And make it conditional on the @labs activation?
Here's the documentation: https://themes.ghost.org/docs/subscribers
Subscription form in Casper: https://github.com/TryGhost/Casper/blob/master/post.hbs
https://github.com/TryGhost/Casper/blob/master/post.hbs#L75
| gharchive/issue | 2016-06-20T07:09:06 | 2025-04-01T06:37:31.861186 | {
"authors": [
"postblue"
],
"repo": "Schoewilliam/Stitch-Blue",
"url": "https://github.com/Schoewilliam/Stitch-Blue/issues/30",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
2308920100 | Cannot select prediction file to import (amfbrowser)
Hello,
I've just finished installing amf and amfbrowser and was able to generate predictions for an image using amf.
But when I run amfbrowser and try to select the prediction file, I cannot click on the OK button. I've tried clicking, selecting with Enter, and just about every other key on my keyboard.
My lab mate thinks it is because I am trying to run this on Windows 11. Do you know if there is any way to run this without downgrading to Windows 10? Alternatively I could try installing on a system with Linux.
What is your recommendation?
Thanks,
Robbie
University of Ottawa
Dear Robbie,
Thank you for your interest in AMFinder.
Unfortunately, I have never heard of such an issue, so I can only guess it is a compatibility problem between the graphical system and the OS. The best is to install AMFinder on a Linux machine (it could be a virtual machine). I am happy to help with the installation if needed.
Best regards,
Edouard
| gharchive/issue | 2024-05-21T18:59:18 | 2025-04-01T06:37:31.865418 | {
"authors": [
"EEvangelisti",
"rob-ferg"
],
"repo": "SchornacklabSLCU/amfinder",
"url": "https://github.com/SchornacklabSLCU/amfinder/issues/34",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2194557498 | 55 Write mobility data
Changes and Information
Please briefly list the changes (main added features, changed items, or corrected bugs) made:
Save data about the commuters during mobility
Any Indices (or Groups of indices) can be chosen as input. Additionally, the total number of commuters is saved
Additionally, it is possible to save only selected edges
If need be, add additional information and what the reviewer should look out for in particular:
This functionality should be used carefully, because if we save too many compartments from the edges, a large amount of storage is necessary.
Merge Request - Guideline Checklist
Please check our git workflow. Use the draft feature if the Pull Request is not yet ready to review.
Checks by code author
[x] Every addressed issue is linked (use the "Closes #ISSUE" keyword below)
[x] New code adheres to coding guidelines
[x] No large data files have been added (files should in sum not exceed 100 KB, avoid PDFs, Word docs, etc.)
[x] Tests are added for new functionality and a local test run was successful (with and without OpenMP)
[x] Appropriate documentation for new functionality has been added (Doxygen in the code and Markdown files if necessary)
[x] Proper attention to licenses, especially no new third-party software with conflicting license has been added
[ ] (For ABM development) Checked benchmark results and ran and posted a local test above from before and after development to ensure performance is monitored.
Checks by code reviewer(s)
[ ] Corresponding issue(s) is/are linked and addressed
[ ] Code is clean of development artifacts (no deactivated or commented code lines, no debugging printouts, etc.)
[ ] Appropriate unit tests have been added, CI passes, code coverage and performance is acceptable (did not decrease)
[ ] No large data files added in the whole history of commits(files should in sum not exceed 100 KB, avoid PDFs, Word docs, etc.)
[ ] On merge, add 2-5 lines with the changes (main added features, changed items, or corrected bugs) to the merge-commit-message. This can be taken from the briefly-list-the-changes above (best case) or the separate commit messages (worst case).
Closes #55
Using the setup in our simulation with ~30.000 edges and 50 simulation days, the size of the Edges.h file is 135mb.
There is an inconsistency in our project between the use of the terms migrated and mobility. I would leave them here for now and edit this PR after merging so that we only use mobility.
| gharchive/pull-request | 2024-03-19T10:17:23 | 2025-04-01T06:37:31.886453 | {
"authors": [
"HenrZu"
],
"repo": "SciCompMod/memilio",
"url": "https://github.com/SciCompMod/memilio/pull/971",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
176839718 | MuTect1
@MaxUlysse and @Sebastian-D please have a look - it is practically a duplication of MuTect2. In the long term we have to refactor these into one, especially the collate part.
Looks good, I see the influences ;).
I am doing something similar with HaplotypeCaller; however, the collateFiles function is bothering me, as I need to change it for normals and I am not proficient at Groovy yet.
| gharchive/pull-request | 2016-09-14T07:56:28 | 2025-04-01T06:37:31.891833 | {
"authors": [
"Sebastian-D",
"szilvajuhos"
],
"repo": "SciLifeLab/CAW",
"url": "https://github.com/SciLifeLab/CAW/pull/66",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
1319283470 | imt.savefig.py only saving images at 72 dpi
imt.image_file_postproc() in imt.savefig() is changing or resaving the dpi of figures to 72, when the original dpi could be much higher (e.g. 300).
Code snippet to recreate the issue:
import ImageMetaTag as imt
pathname = '/home/h01/ewarren/cylc-run/u-al685_TCbogus_highres/share/data/Plots/tcver/u-ck824_v_u-co412/FS_Central_Pressure_Bias.png'
img_tags = {'aggperiod': 'None',
'datatimes': 'QG',
'dates': 'None',
'fcrs': 'None',
'period': 'None',
'difference': 'None',
'grid': 'None',
'scales': 'None',
'stations': 'None',
'levels': 'None',
'params': 'None',
'truth': 'None',
'fields': 'Central Pressure',
'probbins': 'None',
'areas': 'Global',
'expid': 'u-ck824 Vs u-co412',
'plot type': 'Forecast',
'stats': 'Bias',
'thresh': 'None',
'tiletype': 'None',
'storm name': 'None'}
imt.image_file_postproc(pathname, img_tags=img_tags)
Example image (with a dpi of 100 before being passed through imt.image_file_postproc()):
branch: 137-imtsavefigpy-only-saving-images-at-72-dpi
| gharchive/issue | 2022-07-27T09:33:29 | 2025-04-01T06:37:31.913504 | {
"authors": [
"elliottwarren",
"melissaebrooks"
],
"repo": "SciTools-incubator/image-meta-tag",
"url": "https://github.com/SciTools-incubator/image-meta-tag/issues/137",
"license": "bsd-3-clause",
"license_type": "permissive",
"license_source": "bigquery"
} |
524336194 | ASV - Document Airspeed Velocity
ASV configuration and a basic benchmark are now part of Iris (#3526). Having completed some basic experiments with ASV against Iris master branch I have some recommended minimum detail that will be needed in our developer documentation testing page:
[x] A new top level section called Performance Benchmarking
[x] The fact we use ASV
[x] The benchmark directory and prefix syntax for tests
[x] The configuration file location and its implications (e.g. the fact it aligns with the requirements for Iris test, ASV will be using conda to manage the environments it creates, etc...)
[x] A standard run configuration that will be used for official benchmarking - currently asv run v2.0.0..master --skip-existing-commits. OR a Nox session that encodes the same thing; this would probably be preferable.
Closed by #4621
| gharchive/issue | 2019-11-18T12:27:33 | 2025-04-01T06:37:31.929598 | {
"authors": [
"trexfeathers"
],
"repo": "SciTools/iris",
"url": "https://github.com/SciTools/iris/issues/3543",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
} |