id (string, 4–10 chars) | text (string, 4 – 2.14M chars) | source (2 classes) | created (timestamp[s], 2001-05-16 21:05:09 to 2025-01-01 03:38:30) | added (timestamp, 2025-04-01 04:05:38 to 2025-04-01 07:14:06) | metadata (dict)
---|---|---|---|---|---
2647187922
|
feat(ai): better completion/suggestions of AI engines
Description
The whole completion / snippets / AI story is very tricky:
- multiple snippet engines
- native snippets on > 0.11 set their own keymaps, but not on 0.10
- multiple completion engines, like nvim-cmp and blink.cmp
- multiple AI completion engines that have different APIs
- the user's preference for showing AI suggestions as completions or not
- none of the AI completion engines currently set undo points, which is bad
Solution:
[x] added LazyVim.cmp.actions, where snippet engines and ai engines can register their action.
[x] an action returns true if it succeeded, or false|nil otherwise
[x] in a completion engine, we then try running multiple actions and use the fallback if needed
[x] so <tab> runs {"snippet_forward", "ai_accept", "fallback"}
[x] added vim.g.ai_cmp. When true we try to integrate the AI source in the completion engine.
[x] when false, <tab> should be used to insert the AI suggestion
[x] when false, the completion engine's ghost text is disabled
[x] luasnip support for blink (only works with blink main)
[x] create undo points when accepting AI suggestions
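The action-chain mechanism described in the checklist can be sketched as follows. LazyVim itself implements this in Lua; the JavaScript below is only an illustration of the dispatch logic, and the stub action bodies are hypothetical:

```javascript
// Illustration of the "try actions in order, else fallback" dispatch the PR
// describes: each registered action returns true on success, false/undefined
// otherwise, and the first success wins.
const actions = {
  snippet_forward: () => false, // stub: pretend there is no active snippet to jump in
  ai_accept: () => true,        // stub: pretend an AI suggestion is visible
};

function runActions(names, fallback) {
  for (const name of names) {
    const action = actions[name];
    if (action && action() === true) return name; // action handled the key
  }
  fallback(); // nothing succeeded, e.g. insert a literal <tab>
  return "fallback";
}
```

So a `<tab>` press maps to something like `runActions(["snippet_forward", "ai_accept"], insertTab)`.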
Test Matrix
| completion | snippets | ai | ai_cmp | tested? |
| --- | --- | --- | --- | --- |
| nvim-cmp | native | copilot | true | ✅ |
| nvim-cmp | native | copilot | false | ✅ |
| nvim-cmp | native | codeium | true | ✅ |
| nvim-cmp | native | codeium | false | ✅ |
| nvim-cmp | luasnip | copilot | true | ✅ |
| nvim-cmp | luasnip | copilot | false | ✅ |
| nvim-cmp | luasnip | codeium | true | ✅ |
| nvim-cmp | luasnip | codeium | false | ✅ |
| blink.cmp | native | copilot | true | ✅ |
| blink.cmp | native | copilot | false | ✅ |
| blink.cmp | native | codeium | true | ✅ |
| blink.cmp | native | codeium | false | ✅ |
| blink.cmp | luasnip | copilot | true | ✅ |
| blink.cmp | luasnip | copilot | false | ✅ |
| blink.cmp | luasnip | codeium | true | ✅ |
| blink.cmp | luasnip | codeium | false | ✅ |
Related Issue(s)
[ ] Closes #4702
Screenshots
Checklist
[ ] I've read the CONTRIBUTING guidelines.
The PR should be ready.
It's been a real pain handling all the edge cases...
I tested the codeium part of this PR along with blink.cmp and from a first glance everything seems to be working as expected.
Hi @folke
I don't get auto-suggestions anymore after updating LazyVim to the latest version.
Could this be the reason? I didn't configure anything; I was using an existing setup that uses nvim-cmp.
Now I don't get Copilot suggestions 🤔
Please let me know a way to fix this. Thank you.
|
gharchive/pull-request
| 2024-11-10T12:37:59 |
2025-04-01T04:55:17.275773
|
{
"authors": [
"dpetka2001",
"folke",
"nadunindunil"
],
"repo": "LazyVim/LazyVim",
"url": "https://github.com/LazyVim/LazyVim/pull/4752",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
297399735
|
Curl use for scripting
It could be useful to create a curl cheat-sheet for scripting: the major ways to use curl.
Yes, that's a good idea. Do you have time to start it?
I think I can. Give me access and I'll initiate the project; then we'll see how it flows.
I added you as a collaborator. You should have access to the repository. :-)
Do you want me to work on my branch and then merge it?
Yes, you can do this :-)
Hi @silent-mobius, how is your progress with curl cheat sheet?
@yurnov feel free to make one if you have the motivation :)
Here is PR#117
Awesome :)
|
gharchive/issue
| 2018-02-15T10:43:39 |
2025-04-01T04:55:17.280311
|
{
"authors": [
"LeCoupa",
"silent-mobius",
"yurnov"
],
"repo": "LeCoupa/awesome-cheatsheets",
"url": "https://github.com/LeCoupa/awesome-cheatsheets/issues/26",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1788163242
|
Add Japanese translated README
I created a Japanese-translated README.
Good to go
Awesome. @eltociear, thank you!
|
gharchive/pull-request
| 2023-07-04T15:47:48 |
2025-04-01T04:55:17.281621
|
{
"authors": [
"LeKovr",
"SmolFlop",
"eltociear"
],
"repo": "LeKovr/webtail",
"url": "https://github.com/LeKovr/webtail/pull/17",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
24792063
|
Saving and restoring drawn objects
Leaflet.draw is awesome, but I need to save and restore the drawn objects.
I can get JSON for the layer by a somewhat cumbersome iteration in draw:created.
But I don't see an obvious way to go from JSON to objects on the map.
So two questions:
Is there a built-in way to get a JSON representation of the objects?
Is there a way to go from JSON to objects on the map?
Thanks.
If you could post an example of that it would be appreciated. I am looking at doing the same thing..
Came to the same conclusion but not sure exactly how to do it.
@AndriusGeo I am running into a similar issue on a project I'm working on. Perhaps you may be able to point me in the right direction. Currently, I'm using the toGeoJSON() function to save my drawn objects. However, GeoJSON doesn't seem to accommodate circles. What did you end up doing for those? Also, how did saving the styles end up going for you? Did you end up going with eachLayer() as you alluded to? Thanks!
Having this same problem; does anyone have an example?
Looking around I found this stackoverflow and this example.
Kinda sucks that circle properties are not saved off by default, so extra work needs to be done to save off shape properties besides just doing toGeoJSON.
I am back to working on this again. I can get the style attributes out, but I have yet to get that and the geoJSON together where multiple items are drawn with different styles.
var ditemsOut = '';
var ditems = drawnItems.getLayers();
var iIndex = 0;
console.log(ditems);
//for (itemsIndex = 0; itemsIndex < ditems.length; ++itemsIndex) {
// ditemsOut = ditemsOut + ditems[itemsIndex].options.clickable + '\n';
//}
for (iIndex = 0; iIndex < ditems.length; ++iIndex) {
if (ditems[iIndex]._leaflet_id) { console.log('Item ID: ' + ditems[iIndex]._leaflet_id) };
if ('clickable' in ditems[iIndex].options) { console.log('Clickable: ' + ditems[iIndex].options.clickable) };
if ('color' in ditems[iIndex].options) { console.log('Color: ' + ditems[iIndex].options.color) };
if ('fill' in ditems[iIndex].options) { console.log('Fill: ' + ditems[iIndex].options.fill) };
if ('opacity' in ditems[iIndex].options) { console.log('Opacity: ' + ditems[iIndex].options.opacity) };
if ('stroke' in ditems[iIndex].options) { console.log('Stroke: ' + ditems[iIndex].options.stroke) };
if ('weight' in ditems[iIndex].options) { console.log('Weight: ' + ditems[iIndex].options.weight) };
if ('fillColor' in ditems[iIndex].options) { console.log('FillColor: ' + ditems[iIndex].options.fillColor) };
if ('fillOpacity' in ditems[iIndex].options) { console.log('FillOpacity: ' + ditems[iIndex].options.fillOpacity) };
if ('icon' in ditems[iIndex].options) {
if ('options' in ditems[iIndex].options.icon) {
if ('iconSize' in ditems[iIndex].options.icon.options) { console.log('IconSize0: ' + ditems[iIndex].options.icon.options.iconSize[0]) };
if ('iconSize' in ditems[iIndex].options.icon.options) { console.log('IconSize1: ' + ditems[iIndex].options.icon.options.iconSize[1]) };
if ('iconUrl' in ditems[iIndex].options.icon.options) { console.log('IconUrl: ' + ditems[iIndex].options.icon.options.iconUrl) };
};
};
if ('_icon' in ditems[iIndex].options) { console.log('_Icon: ' + ditems[iIndex].options._icon) };
if ('_shadow' in ditems[iIndex].options) { console.log('_Shadow: ' + ditems[iIndex].options._shadow) };
};
And there it is. This will get all the information you require. Just decide whether to save the style components in a separate field or as a style attribute of the GeoJSON.
From this I will just build a generator for the GeoJSON using these options.
if (ditems[iIndex] instanceof L.Polygon) { console.log('Type: Polygon') }
else if (ditems[iIndex] instanceof L.Polyline) { console.log('Type: LineString') }
else if (ditems[iIndex] instanceof L.Marker) { console.log('Type: Point') };
And a function to get the items as geoJSON with the style included
Just pass the layer to the function
function drawnItemsToJSON(ilayer) {
var dOut = '';
var dOut1 = '';
var dOut2 = '';
var ditems = ilayer.getLayers();
dOut = '{"type":"FeatureCollection","features":[';
for (iIndex = 0; iIndex < ditems.length; ++iIndex) {
if (ditems[iIndex] instanceof L.Point || ditems[iIndex] instanceof L.Marker) {
dOut1 = dOut1 + ',{"type":"Feature","properties":{';
if ('icon' in ditems[iIndex].options) {
if ('options' in ditems[iIndex].options.icon) {
dOut1 = dOut1 + '"markerOptions":{';
dOut2 = '';
if ('iconSize' in ditems[iIndex].options.icon.options) { dOut2 = dOut2 + ',"iconSize":[' + ditems[iIndex].options.icon.options.iconSize[0] + ',' + ditems[iIndex].options.icon.options.iconSize[1] + ']' };
if ('iconUrl' in ditems[iIndex].options.icon.options) { dOut2 = dOut2 + ',"iconUrl":"' + ditems[iIndex].options.icon.options.iconUrl + '"' };
dOut1 = dOut1 + dOut2.substring(1) + '}';
};
};
dOut1 = dOut1 + '},"geometry":{"type":"Point","coordinates":['
+ ditems[iIndex]._latlng.lng
+ ',' + ditems[iIndex]._latlng.lat
+ ']},"style":{';
dOut2 = '';
if ('stroke' in ditems[iIndex].options) { if (ditems[iIndex].options.stroke !== null) { dOut2 = dOut2 + ',"stroke":' + ditems[iIndex].options.stroke } };
if ('color' in ditems[iIndex].options) { if (ditems[iIndex].options.color !== null) { dOut2 = dOut2 + ',"color":"' + ditems[iIndex].options.color + '"' } };
if ('weight' in ditems[iIndex].options) { if (ditems[iIndex].options.weight !== null) { dOut2 = dOut2 + ',"weight":' + ditems[iIndex].options.weight } };
if ('opacity' in ditems[iIndex].options) { if (ditems[iIndex].options.opacity !== null) { dOut2 = dOut2 + ',"opacity":' + ditems[iIndex].options.opacity } };
if ('fill' in ditems[iIndex].options) { if (ditems[iIndex].options.fill !== null) { dOut2 = dOut2 + ',"fill":' + ditems[iIndex].options.fill } };
if ('fillColor' in ditems[iIndex].options) { if (ditems[iIndex].options.fillColor !== null) { dOut2 = dOut2 + ',"fillColor":"' + ditems[iIndex].options.fillColor + '"' } };
if ('fillOpacity' in ditems[iIndex].options) { if (ditems[iIndex].options.fillOpacity !== null) { dOut2 = dOut2 + ',"fillOpacity":' + ditems[iIndex].options.fillOpacity } };
if ('fillRule' in ditems[iIndex].options) { if (ditems[iIndex].options.fillRule !== null) { dOut2 = dOut2 + ',"fillRule":"' + ditems[iIndex].options.fillRule + '"' } };
if ('dashArray' in ditems[iIndex].options) { if (ditems[iIndex].options.dashArray !== null) { dOut2 = dOut2 + ',"dashArray":"' + ditems[iIndex].options.dashArray + '"' } };
if ('lineCap' in ditems[iIndex].options) { if (ditems[iIndex].options.lineCap !== null) { dOut2 = dOut2 + ',"lineCap":"' + ditems[iIndex].options.lineCap + '"' } };
if ('lineJoin' in ditems[iIndex].options) { if (ditems[iIndex].options.lineJoin !== null) { dOut2 = dOut2 + ',"lineJoin":"' + ditems[iIndex].options.lineJoin + '"' } };
if ('clickable' in ditems[iIndex].options) { if (ditems[iIndex].options.clickable !== null) { dOut2 = dOut2 + ',"clickable":' + ditems[iIndex].options.clickable } };
if ('pointerEvents' in ditems[iIndex].options) { if (ditems[iIndex].options.pointerEvents !== null) { dOut2 = dOut2 + ',"pointerEvents":"' + ditems[iIndex].options.pointerEvents + '"' } };
if ('className' in ditems[iIndex].options) { if (ditems[iIndex].options.className !== null) { dOut2 = dOut2 + ',"className":"' + ditems[iIndex].options.className + '"' } };
if (dOut2.length > 1) {
dOut1 = dOut1 + dOut2.substring(1) + '}';
};
dOut2 = '';
dOut1 = dOut1 + '}';
} else if (ditems[iIndex] instanceof L.Polygon) {
dOut1 = dOut1 + ',{"type":"Feature","properties":{},"geometry":{"type":"Polygon","coordinates":[['
dOut2 = '';
for (ll = 0; ll < ditems[iIndex]._latlngs[0].length; ll++) {
dOut2 = dOut2 + ',[' + ditems[iIndex]._latlngs[0][ll].lng + ',' + ditems[iIndex]._latlngs[0][ll].lat + ']';
};
dOut2 = dOut2 + ',[' + ditems[iIndex]._latlngs[0][0].lng + ',' + ditems[iIndex]._latlngs[0][0].lat + ']';
dOut1 = dOut1 + dOut2.substring(1) + ']]},"style":{';
dOut2 = '';
if ('stroke' in ditems[iIndex].options) { if (ditems[iIndex].options.stroke !== null) { dOut2 = dOut2 + ',"stroke":' + ditems[iIndex].options.stroke } };
if ('color' in ditems[iIndex].options) { if (ditems[iIndex].options.color !== null) { dOut2 = dOut2 + ',"color":"' + ditems[iIndex].options.color + '"' } };
if ('weight' in ditems[iIndex].options) { if (ditems[iIndex].options.weight !== null) { dOut2 = dOut2 + ',"weight":' + ditems[iIndex].options.weight } };
if ('opacity' in ditems[iIndex].options) { if (ditems[iIndex].options.opacity !== null) { dOut2 = dOut2 + ',"opacity":' + ditems[iIndex].options.opacity } };
if ('fill' in ditems[iIndex].options) { if (ditems[iIndex].options.fill !== null) { dOut2 = dOut2 + ',"fill":' + ditems[iIndex].options.fill } };
if ('fillColor' in ditems[iIndex].options) { if (ditems[iIndex].options.fillColor !== null) { dOut2 = dOut2 + ',"fillColor":"' + ditems[iIndex].options.fillColor + '"' } };
if ('fillOpacity' in ditems[iIndex].options) { if (ditems[iIndex].options.fillOpacity !== null) { dOut2 = dOut2 + ',"fillOpacity":' + ditems[iIndex].options.fillOpacity } };
if ('fillRule' in ditems[iIndex].options) { if (ditems[iIndex].options.fillRule !== null) { dOut2 = dOut2 + ',"fillRule":"' + ditems[iIndex].options.fillRule + '"' } };
if ('dashArray' in ditems[iIndex].options) { if (ditems[iIndex].options.dashArray !== null) { dOut2 = dOut2 + ',"dashArray":"' + ditems[iIndex].options.dashArray + '"' } };
if ('lineCap' in ditems[iIndex].options) { if (ditems[iIndex].options.lineCap !== null) { dOut2 = dOut2 + ',"lineCap":"' + ditems[iIndex].options.lineCap + '"' } };
if ('lineJoin' in ditems[iIndex].options) { if (ditems[iIndex].options.lineJoin !== null) { dOut2 = dOut2 + ',"lineJoin":"' + ditems[iIndex].options.lineJoin + '"' } };
if ('clickable' in ditems[iIndex].options) { if (ditems[iIndex].options.clickable !== null) { dOut2 = dOut2 + ',"clickable":' + ditems[iIndex].options.clickable } };
if ('pointerEvents' in ditems[iIndex].options) { if (ditems[iIndex].options.pointerEvents !== null) { dOut2 = dOut2 + ',"pointerEvents":"' + ditems[iIndex].options.pointerEvents + '"' } };
if ('className' in ditems[iIndex].options) { if (ditems[iIndex].options.className !== null) { dOut2 = dOut2 + ',"className":"' + ditems[iIndex].options.className + '"' } };
if ('icon' in ditems[iIndex].options) {
if ('options' in ditems[iIndex].options.icon) {
if ('iconSize' in ditems[iIndex].options.icon.options) { dOut2 = dOut2 + ',"iconSize":[' + ditems[iIndex].options.icon.options.iconSize[0] + ',' + ditems[iIndex].options.icon.options.iconSize[1] + ']' };
if ('iconUrl' in ditems[iIndex].options.icon.options) { dOut2 = dOut2 + ',"iconUrl":"' + ditems[iIndex].options.icon.options.iconUrl + '"' };
};
};
if (dOut2.length > 1) {
dOut1 = dOut1 + dOut2.substring(1) + '}';
};
dOut2 = '';
dOut1 = dOut1 + '}';
} else if (ditems[iIndex] instanceof L.Polyline) {
dOut1 = dOut1 + ',{"type":"Feature","properties":{},"geometry":{"type":"LineString","coordinates":['
dOut2 = '';
for (ll = 0; ll < ditems[iIndex]._latlngs[0].length; ll++) {
dOut2 = dOut2 + ',[' + ditems[iIndex]._latlngs[0][ll].lng + ',' + ditems[iIndex]._latlngs[0][ll].lat + ']';
};
dOut1 = dOut1 + dOut2.substring(1) + ']},"style":{';
dOut2 = '';
if ('stroke' in ditems[iIndex].options) { if (ditems[iIndex].options.stroke !== null) { dOut2 = dOut2 + ',"stroke":' + ditems[iIndex].options.stroke } };
if ('color' in ditems[iIndex].options) { if (ditems[iIndex].options.color !== null) { dOut2 = dOut2 + ',"color":"' + ditems[iIndex].options.color + '"' } };
if ('weight' in ditems[iIndex].options) { if (ditems[iIndex].options.weight !== null) { dOut2 = dOut2 + ',"weight":' + ditems[iIndex].options.weight } };
if ('opacity' in ditems[iIndex].options) { if (ditems[iIndex].options.opacity !== null) { dOut2 = dOut2 + ',"opacity":' + ditems[iIndex].options.opacity } };
if ('fill' in ditems[iIndex].options) { if (ditems[iIndex].options.fill !== null) { dOut2 = dOut2 + ',"fill":' + ditems[iIndex].options.fill } };
if ('fillColor' in ditems[iIndex].options) { if (ditems[iIndex].options.fillColor !== null) { dOut2 = dOut2 + ',"fillColor":"' + ditems[iIndex].options.fillColor + '"' } };
if ('fillOpacity' in ditems[iIndex].options) { if (ditems[iIndex].options.fillOpacity !== null) { dOut2 = dOut2 + ',"fillOpacity":' + ditems[iIndex].options.fillOpacity } };
if ('fillRule' in ditems[iIndex].options) { if (ditems[iIndex].options.fillRule !== null) { dOut2 = dOut2 + ',"fillRule":"' + ditems[iIndex].options.fillRule + '"' } };
if ('dashArray' in ditems[iIndex].options) { if (ditems[iIndex].options.dashArray !== null) { dOut2 = dOut2 + ',"dashArray":"' + ditems[iIndex].options.dashArray + '"' } };
if ('lineCap' in ditems[iIndex].options) { if (ditems[iIndex].options.lineCap !== null) { dOut2 = dOut2 + ',"lineCap":"' + ditems[iIndex].options.lineCap + '"' } };
if ('lineJoin' in ditems[iIndex].options) { if (ditems[iIndex].options.lineJoin !== null) { dOut2 = dOut2 + ',"lineJoin":"' + ditems[iIndex].options.lineJoin + '"' } };
if ('clickable' in ditems[iIndex].options) { if (ditems[iIndex].options.clickable !== null) { dOut2 = dOut2 + ',"clickable":' + ditems[iIndex].options.clickable } };
if ('pointerEvents' in ditems[iIndex].options) { if (ditems[iIndex].options.pointerEvents !== null) { dOut2 = dOut2 + ',"pointerEvents":"' + ditems[iIndex].options.pointerEvents + '"' } };
if ('className' in ditems[iIndex].options) { if (ditems[iIndex].options.className !== null) { dOut2 = dOut2 + ',"className":"' + ditems[iIndex].options.className + '"' } };
if ('icon' in ditems[iIndex].options) {
if ('options' in ditems[iIndex].options.icon) {
if ('iconSize' in ditems[iIndex].options.icon.options) { dOut2 = dOut2 + ',"iconSize":[' + ditems[iIndex].options.icon.options.iconSize[0] + ',' + ditems[iIndex].options.icon.options.iconSize[1] + ']' };
if ('iconUrl' in ditems[iIndex].options.icon.options) { dOut2 = dOut2 + ',"iconUrl":"' + ditems[iIndex].options.icon.options.iconUrl + '"' };
};
};
if (dOut2.length > 1) {
dOut1 = dOut1 + dOut2.substring(1) + '}';
};
dOut2 = '';
dOut1 = dOut1 + '}';
};
};
//console.log(dOut + dOut1.substring(1) + ']}');
return dOut + dOut1.substring(1) + ']}';
};
I formulated this function to get the layer options and convert them into GeoJSON layer properties, but I still have an error: I can't edit the polygons when I put them back on drawnItems (L.featureGroup). What seems to be the problem?
function layerToJSON(layer){
var j = layer.toGeoJSON();
var feature = "";
j.properties = layer.options;
feature += JSON.stringify(j)
return JSON.parse(feature);
}
function drawnItemsToJSON(ilayer) {
var json = '{"type": "FeatureCollection","features": [';
var features = "";
ilayer.eachLayer(function(layer) {
features += JSON.stringify(layerToJSON(layer)) + ",";
});
return JSON.parse(json + features.slice(0,-1) + ']}');
};
If you have specified your editable layers when you declare the edit component, you can only edit the objects added to that editable layer
This is correct... Here's a method I wrote to move a polygon from one layer to another that may help (hopefully)? There are calls to methods outside of this one, but those should not change how the method is read.
loadPolygon: function(polygon, returnLayer) {
if(!returnLayer) {
returnLayer = false;
}
var layer = null;
try {
var geoJson = $.parseJSON(polygon);
if(geoJson.type == "Polygon") {
layer = L.geoJson(geoJson, {
color: '#FF6633',
dashArray: '5',
clickable: false,
pointerEvents: 'none',
analysisTool: true
});
} else {
return false;
}
} catch (e) {
polygonPoints = _.map(polygon.rings[0], function(value) {
latitude = value[1];
longitude = value[0];
return [latitude, longitude];
});
layer = L.polygon(polygonPoints, {
color: '#FF6633',
dashArray: '5',
clickable: false,
pointerEvents: 'none',
analysisTool: true
});
}
if(layer && returnLayer == false) {
this.$el.find('input[value="drawn"]').click();
this.$el.find('input[value="polygon"]').click();
if (this.currentDrawer != null) {
this.currentDrawer.disable();
this.currentDrawer = null;
}
this.setUpCancel();
this.enableMap();
this.applyPolygonSelection();
this.polygonSelection.addLayer(layer);
this.updateMapParams();
this.centerOnGeometry();
} else if(returnLayer) {
return layer;
}
}
I'm guessing here that your ilayer is the editable layer? You should be able to get GeoJSON directly off the layer by using toGeoJSON, the method inherited from L.FeatureGroup.
That should reduce the complexity of your code a bit. Then it gets a bit fuzzy where you need to start unwrapping the response to get to a feature collection, or, in particular, where to add the properties to the features. But I don't think you need to string-parse the GeoJSON, since it's already an object and does not need to be stringified to edit the properties.
In the event that the response is not a feature collection and you need to create a new object, try not to stringify if you don't have to ;) Bit fuzzy at the moment on that process.
Hi ddproxy,
I already tried the toGeoJSON() function, but it can't get the options as geojson property, that's why I created that function. By the way thanks for the suggestion about the stringify thing, I already optimized the function:
function layerToJSON(layer){
var j = layer.toGeoJSON();
j.properties = layer.options;
return j;
}
function drawnItemsToJSON(ilayer) {
var json = new Object();
json.type = "FeatureCollection";
json.features = [];
ilayer.eachLayer(function(xlayer) {
xlayer.eachLayer(function(layer) {
json.features.push(layerToJSON(layer));
})
});
return json;
};
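The object-building approach above can be exercised without Leaflet at all. Here is a hedged, dependency-free sketch of the same pattern; the stub object stands in for a real Leaflet layer, and `layersToFeatureCollection` is a hypothetical name:

```javascript
// Dependency-free sketch of the pattern above: copy a layer's options into
// its GeoJSON feature's properties, then collect features into a plain
// FeatureCollection object instead of concatenating JSON strings.
function layerToFeature(layer) {
  const feature = layer.toGeoJSON(); // a GeoJSON Feature object
  feature.properties = { ...layer.options };
  return feature;
}

function layersToFeatureCollection(layers) {
  return { type: "FeatureCollection", features: layers.map(layerToFeature) };
}

// Example with a stub "layer" (a real one would come from an L.FeatureGroup):
const stubLayer = {
  options: { color: "#ff6633", weight: 2 },
  toGeoJSON() {
    return {
      type: "Feature",
      properties: {},
      geometry: { type: "Point", coordinates: [-123.12, 49.31] },
    };
  },
};
```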
But it can't help me with my main problem, which is editing the layers once the GeoJSON is loaded in drawnItems (L.featureGroup).
This is my function for getting the previous drawnItems stored in SQL Server:
function getDrawnItems(){
$.getJSON(url,function(e){
var zonesDrawn = L.geoJson(e,{
style: function(f) {
return f.properties;
}
}).addTo(drawnItems);
drawnItems.eachLayer(function(layers) {
layers.eachLayer(function(l){
bindPopup(l);
})
})
});
}
Can you copy up your declaration for the drawnItems plugin - or set up a jsfiddle
Hi iBrian71,
I created my code simulation in a Fiddle; please refer to this link. That is exactly my main problem. If you click the edit and delete functions of Leaflet.Draw, it won't do anything.
https://jsfiddle.net/fx2nnrvu/9/#&togetherjs=FJC9jzLh2v
The problem does not lie with the geojson layer.
If I remove the geojson layer from the fiddle, it still errors as soon as you click the edit button - after drawing an item
VM1391 Edit.SimpleShape.js:107 Uncaught TypeError: L.Marker.Touch is not a constructor
at e._createMarker (VM1391 Edit.SimpleShape.js:107)
at e._createMoveMarker (VM1393 Edit.Rectangle.js:12)
at e._initMarkers (VM1391 Edit.SimpleShape.js:91)
at e.addHooks (VM1391 Edit.SimpleShape.js:49)
at enable (VM1321 leaflet.js:8)
at e._enableLayerEdit (VM1375 EditToolbar.Edit.js:225)
at eachLayer (VM1321 leaflet.js:7)
at e.addHooks (VM1375 EditToolbar.Edit.js:71)
at enable (VM1321 leaflet.js:8)
at enable (VM1375 EditToolbar.Edit.js:43)
I dropped your code into a page with my sources, with the same result. The editing does, however, work if your code is not loaded.
I have narrowed it down to your getDrawnItems function: I have it working in mine but cannot get it to run in your fiddle.
My working code follows (style application omitted):
var jsonDrawn = '{"type":"FeatureCollection","features":[{"type":"Feature","properties":{"pane":"overlayPane","nonBubblingEvents":[],"fill":"true","smoothFactor":1,"noClip":false,"stroke":"true","color":"#3388ff","weight":"3","opacity":"1","lineCap":"round","lineJoin":"round","dashArray":null,"dashOffset":null,"fillColor":"#000000","fillOpacity":"0.2","fillRule":"evenodd","interactive":true,"name":"zone1"},"geometry":{"type":"Polygon","coordinates":[[[-123.14017295837404,49.308784358032355],[-123.14017295837404,49.31594672729814],[-123.1245517730713,49.31594672729814],[-123.1245517730713,49.308784358032355],[-123.14017295837404,49.308784358032355]]]}},{"type":"Feature","properties":{"pane":"overlayPane","nonBubblingEvents":[],"fill":true,"smoothFactor":1,"noClip":false,"stroke":true,"color":"#3388ff","weight":3,"opacity":1,"lineCap":"round","lineJoin":"round","dashArray":null,"dashOffset":null,"fillColor":null,"fillOpacity":0.2,"fillRule":"evenodd","interactive":true},"geometry":{"type":"Polygon","coordinates":[[[-123.1589698791504,49.31270140775414],[-123.1589698791504,49.320086996978475],[-123.15150260925294,49.320086996978475],[-123.15150260925294,49.31270140775414],[-123.1589698791504,49.31270140775414]]]}},{"type":"Feature","properties":{"pane":"overlayPane","nonBubblingEvents":[],"fill":true,"smoothFactor":1,"noClip":false,"stroke":true,"color":"#3388ff","weight":3,"opacity":1,"lineCap":"round","lineJoin":"round","dashArray":null,"dashOffset":null,"fillColor":null,"fillOpacity":0.2,"fillRule":"evenodd","interactive":true},"geometry":{"type":"Polygon","coordinates":[[[-123.10644149780275,49.30044560084641],[-123.10644149780275,49.30598627468646],[-123.0886745452881,49.30598627468646],[-123.0886745452881,49.30044560084641],[-123.10644149780275,49.30044560084641]]]}}]}';
var osmUrl = 'http://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png',
osmAttrib = '© <a href="http://openstreetmap.org/copyright">OpenStreetMap</a> contributors',
osm = L.tileLayer(osmUrl, {
maxZoom: 18,
attribution: osmAttrib
}),
map = new L.Map('map', {
center: new L.LatLng(49.3112433333333, -123.088858333333),
zoom: 13
}),
drawnItems = L.featureGroup().addTo(map);
L.control.layers({
'osm': osm.addTo(map),
"google": L.tileLayer('http://www.google.cn/maps/vt?lyrs=s@189&gl=cn&x={x}&y={y}&z={z}', {
attribution: 'google'
})
}, {
'drawlayer': drawnItems
}, {
position: 'topleft',
collapsed: false
}).addTo(map);
function getDrawnItems() {
//var json = JSON.parse(jsonDrawn);
//var zonesDrawn = L.geoJson(json, {
// style: function (f) {
// return f.properties;
// }
//}).addTo(drawnItems);
var json = new L.GeoJSON(JSON.parse(jsonDrawn), {
pointToLayer: function (feature, latlng) {
switch (feature.geometry.type) {
case 'Polygon':
//var ii = new L.Polygon(latlng)
//ii.addTo(drawnItems);
return L.polygon(latlng);
case 'LineString':
return L.polyline(latlng);
case 'Point':
return L.marker(latlng);
default:
return;
}
},
onEachFeature: function (feature, layer) {
layer.addTo(drawnItems);
}
});
//drawnItems.addLayer(json);
};
getDrawnItems();
map.addControl(new L.Control.Draw({
edit: {
featureGroup: drawnItems,
poly: {
allowIntersection: false
}
},
draw: {
polygon: {
allowIntersection: false,
showArea: true
}
}
}));
map.on(L.Draw.Event.CREATED, function (event) {
var layer = event.layer;
drawnItems.addLayer(layer);
});
I think the main issue was that you were adding a FeatureGroup to the edit layer instead of individual polygons.
I didn't think of that; I thought GeoJSON automatically added the features as their respective geometric types. You're a genius, iBrian71, thanks a lot! Problem solved!
@nmccready I have the same problem around repopulating circle shapes. Did you find a way to get around this issue?
Circle shapes are not explicitly supported by GeoJSON, so to maintain compatibility across systems, when a circle is created I convert it to a polygon. Therefore repopulating works fine. I could dig up a sample of the code if you need it.
@iBrian71 That would be fantastic, thank you! I will try converting to polygon in the meantime.
Looks like you might be able to use a circle 👍
See this reference:
https://gist.github.com/virtualandy/1233401
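GeoJSON has no native circle type, so one workable approach is the one described above: approximate the circle as a polygon and stash the radius in the feature's properties so the circle can be rebuilt later. A hedged sketch (not code from this thread; pure math, no Leaflet dependency, using a simple approximation that is fine for small radii):

```javascript
// Approximate a circle (center lat/lng in degrees, radius in meters) as a
// closed GeoJSON Polygon ring. The original radius is kept in properties so
// an L.Circle could be reconstructed when the data is reloaded.
function circleToGeoJSONPolygon(lat, lng, radiusMeters, segments = 64) {
  const R = 6378137; // WGS84 equatorial radius, meters
  const toDeg = 180 / Math.PI;
  const ring = [];
  for (let i = 0; i < segments; i++) {
    const theta = (2 * Math.PI * i) / segments;
    const dLat = (radiusMeters * Math.cos(theta)) / R; // radians
    const dLng = (radiusMeters * Math.sin(theta)) /
                 (R * Math.cos((lat * Math.PI) / 180)); // radians
    ring.push([lng + dLng * toDeg, lat + dLat * toDeg]); // GeoJSON order: [lng, lat]
  }
  ring.push([...ring[0]]); // close the ring, as GeoJSON polygons require
  return {
    type: "Feature",
    properties: { radius: radiusMeters }, // keep radius so a circle can be rebuilt
    geometry: { type: "Polygon", coordinates: [ring] },
  };
}
```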
@Zverik what is the reason for var json = drawnItems.toGeoJSON(); when trying to repopulate shapes to the map? Do you then add the json var as a map layer, this.map.addLayer(json)? Because I tried repopulating L.geoJson() shapes to the map, but the map doesn't display the shapes. But I believe the shapes are present in this.drawnItems, because the Edit and Delete toolbar options are enabled.
Actually, I figured out why my shapes weren't populating. I had the coordinates reversed.
|
gharchive/issue
| 2013-12-26T16:07:44 |
2025-04-01T04:55:17.312393
|
{
"authors": [
"andinieves151720",
"ddproxy",
"grg9999",
"iBrian71",
"jbcoder",
"mcastre",
"nmccready"
],
"repo": "Leaflet/Leaflet.draw",
"url": "https://github.com/Leaflet/Leaflet.draw/issues/253",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
131620414
|
Run tests in SlimerJS also.
This PR changes what npm test does. Instead of using just PhantomJS (headless webkit-based browser), it will try to use SlimerJS (almost-headless gecko-based browser).
SlimerJS is available as a NPM package, so running npm install should install all the SlimerJS stuff (including karma connectors and whatnot).
The main advantage is being able to run tests involving CSS animations (and even more complex stuff like WebGL) in a quasi-headless, automated way.
The main disadvantage is that SlimerJS might not be readily available or usable on Windows, or when SSHing into some machine, due to the need for a graphical environment :-/
I would like some MacOSX user to try this, and see if a simple npm install then npm test runs the tests in both (quasi-)headless browsers.
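As a rough illustration of the npm-test change described above, the browser list in the karma config is the knob in question. This is a hypothetical karma.conf.js sketch, not the actual Leaflet configuration, and the file paths are placeholders:

```javascript
// Hypothetical karma.conf.js sketch: run the suite in both PhantomJS
// (headless WebKit) and SlimerJS (quasi-headless Gecko). Paths are
// placeholders, not Leaflet's real layout.
module.exports = function (config) {
  config.set({
    frameworks: ['mocha'],
    files: ['dist/leaflet-src.js', 'spec/**/*.spec.js'],
    // SlimerJS needs a graphical environment, which is why it can be
    // problematic on Windows or over plain SSH.
    browsers: ['PhantomJS', 'SlimerJS'],
    singleRun: true
  });
};
```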
I can't install karma-slimerjs-launcher for some reason, it says:
Error extracting /Users/mourner/.npm/slimerjs/0.9.5/package.tgz archive: ENOENT: no such file or directory, open '/Users/mourner/.npm/slimerjs/0.9.5/package.tgz'
Although the Slimer version installed is 0.9.6.
OK got it — the solution was to install with NPM 2 instead of NPM 3, lol. :) Tests run just fine locally. Should we check if they're fine on Windows too?
I cannot check on windows (my Win VMs only have IE & Edge, no node tools), and I'm afraid that my workaround for https://github.com/karma-runner/karma-slimerjs-launcher/issues/1 might not kill the dangling slimerjs process afterwards.
As long as the tests don't create more problem in Win, I'll be happy.
Trying to test on Windows, I ran into the same npm error as above. Bit dumb that we can't install with npm 3 now :(
Will try get npm 2 going
Here is the console output.
The slimerjs window appears and disappears eventually, maybe disable it on windows, or rewrite the lines that use the command line bits.
@danzel What's the value of os.platform() in your nodejs environment (win32? linux because of cygwin?). It should be easy to add an extra check for those.
console.log(require('os').platform())
win32
@danzel https://github.com/Leaflet/Leaflet/commit/e8e6dae652107510146e5fc5e61ebb9342cba8ba should help, but I'm not sure if the change is too naïve.
That stopped that error, but the slimerjs tests still don't work.
The window appears, we run 0 tests and eventually it times out and continues (and the window closes)
We could disable slimer on windows for now? Or maybe I could take a look later in the week if we're lucky.
Should we revisit this or close?
#5845 #5831 will run tests in Firefox and Chrome in Travis CI. So it looks like supporting SlimerJS doesn't make a lot of sense now.
Feel free to reopen it if I'm missing something.
|
gharchive/pull-request
| 2016-02-05T11:36:17 |
2025-04-01T04:55:17.322364
|
{
"authors": [
"IvanSanchez",
"cherniavskii",
"danzel",
"mourner"
],
"repo": "Leaflet/Leaflet",
"url": "https://github.com/Leaflet/Leaflet/pull/4197",
"license": "BSD-2-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
1535614295
|
Use standardized ECMAScript classes for Class
Refactors the Class API so that a standardized ECMAScript class is used. This PR purposefully doesn't touch any of the related API surface as to keep review and discussion for these features for the future.
Recommended reading:
Object.setPrototypeOf() (MDN)
Inheritance and the prototype chain - Building longer inheritance chains (MDN)
Looks pretty good for me. If we have no hurry I would like to test it with some "plugins" in the next days.
For me as a reminder to test: instanceof, typeof
Looks pretty good for me. If we have no hurry I would like to test it with some "plugins" in the next days.
We're in no rush, I wanted to get this out there to get some early feedback and explore our options. Feel free to test it against some plugins if you like, but things might still be prone to change.
This is going to be somewhat tricky to get right. Ultimately I'd like to get rid of all the .extend() calls, or at least move all the functionality to the constructor of the base class Class.
@IvanSanchez this is what I have so far to hopefully facilitate this:
import * as Util from './Util';
// @class Class
// @aka L.Class
// @section
// @uninheritable
// Thanks to John Resig and Dean Edwards for inspiration!
export class Class {
// @function extend(props: Object): Function
// [Extends the current class](#class-inheritance) given the properties to be included.
// Returns a Javascript function that is a class constructor (to be called with `new`).
static extend(props) {
const NewClass = class extends this {};
const parentProto = this.prototype;
const proto = NewClass.prototype;
// mix static properties into the class
if (props.statics) {
Util.extend(NewClass, props.statics);
}
// mix includes into the prototype
if (props.includes) {
Util.extend.apply(null, [proto].concat(props.includes));
}
// mix given properties into the prototype
Util.extend(proto, props);
delete proto.statics;
delete proto.includes;
// merge options
if (proto.options) {
proto.options = parentProto.options ? Object.create(parentProto.options) : {};
Util.extend(proto.options, props.options);
}
proto._initHooks = [];
return NewClass;
}
// @function include(properties: Object): this
// [Includes a mixin](#class-includes) into the current class.
static include(props) {
const parentOptions = this.prototype.options;
Util.extend(this.prototype, props);
if (props.options) {
this.prototype.options = parentOptions;
this.mergeOptions(props.options);
}
return this;
}
// @function mergeOptions(options: Object): this
// [Merges `options`](#class-options) into the defaults of the class.
static mergeOptions(options) {
Util.extend(this.prototype.options, options);
return this;
}
// @function addInitHook(fn: Function): this
// Adds a [constructor hook](#class-constructor-hooks) to the class.
static addInitHook(fn, ...args) { // (Function) || (String, args...)
const init = typeof fn === 'function' ? fn : function () {
this[fn].apply(this, args);
};
this.prototype._initHooks = this.prototype._initHooks || [];
this.prototype._initHooks.push(init);
return this;
}
_initHooksCalled = false;
constructor(...args) {
Util.setOptions(this);
// call the constructor
if (this.initialize) {
this.initialize.apply(this, args);
}
// call all constructor hooks
this.callInitHooks();
}
callInitHooks() {
if (this._initHooksCalled) { return; }
if (super.callInitHooks) {
super.callInitHooks.call(this);
}
this._initHooksCalled = true;
for (let i = 0, len = this.prototype._initHooks.length; i < len; i++) {
this.prototype._initHooks[i].call(this);
}
}
}
Class.prototype._initHooks = [];
This is still failing the tests, not sure why (all very much WIP). But it might allow us to at least do X extends Class internally. WDYT?
Still some failing tests but already "better":
prototype is not available in this context and super can't be called.
callInitHooks(ctx) {
if (this._initHooksCalled) { return; }
const prototype = Object.getPrototypeOf(this);
if(prototype && prototype.callInitHooks){
// for some reason the context of the initial callInitHooks call has the same _initHooks but is a different object.
// I think this causes a duplicated execution of these hooks
prototype.callInitHooks(ctx || this);
}
this._initHooksCalled = true;
for (let i = 0, len = this._initHooks.length; i < len; i++) {
this._initHooks[i].call(this);
}
}
With adding proto._initHooksCalled = false; to extend the Class tests are successful but Canvas not anymore.
The 'problem' is that this will always refer to the child class, and never the parent class, which makes it impossible to get the prototype of the 'current' class.
As per MDN:
When an inherited function is executed, the value of this points to the inheriting object, not to the prototype object where the function is an own property.
So I think any approach that goes against the stream is here is kinda dead on arrival. I was thinking what we could do instead is place the callInitHooks() in parent class Class, and directly traverse the prototype itself in reverse. This would replicate the existing functionality, but remove the need for super calls.
This seems to work, WDYT @Falke-Design @IvanSanchez?
callInitHooks() {
if (this._initHooksCalled) { return; }
// collect all prototypes in chain
const protos = [];
let proto = Object.getPrototypeOf(this);
while (proto !== null) {
protos.push(proto);
proto = Object.getPrototypeOf(proto);
}
// reverse so the parent prototype is first
protos.reverse();
// call init hooks on each prototype
protos.forEach((proto) => {
const initHooks = proto._initHooks ?? [];
initHooks.forEach((hook) => { hook.call(this); });
});
this._initHooksCalled = true;
}
Remember https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/super#super-calling_static_methods , folks.
@IvanSanchez I plan to move this from NewClass into the Class prototype, so there would be no need for a super() call anymore.
Added the refactored method, however moving stuff to Class still breaks the tests :thinking:
however moving stuff to Class still breaks the tests 🤔
How does the code look like then?
Ok, I managed to move the hook initialization to the base class Class. Also NewClass now properly uses the extends keyword to extend Class.
@Falke-Design @IvanSanchez @mourner I think we're in a pretty good place right now. Can I ask you guys to take another look? It's now possible to extend classes with initialization hooks and all using standard ECMAScript syntax:
class Parent extends L.Class {
initialize() {
console.log("Parent initialize called.");
}
}
class Child extends Parent {
initialize() {
super.initialize();
console.log("Child initialize called.");
}
}
Parent.addInitHook(function () {
console.log("Parent init hook called.");
});
Child.addInitHook(function () {
console.log("Child init hook called.");
});
const child = new Child();
I am trying to document various differences and caveats in the description, feel free to supplement it with your findings. I am getting a good night sleep 😴
Just thinking out loud, but we could also provide class decorators for example:
import { Class, includes } from 'leaflet';
import MyMixin from './mixins';
@includes(MyMixin)
class MyClass extends Class { }
Or with options:
@options({
myOption1: 'foo',
myOption2: 'bar'
})
class MyClass extends Class { }
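In today's JavaScript (decorator syntax is still a TC39 proposal), an includes decorator could be sketched as a plain higher-order function — names here are illustrative, not an actual Leaflet API:

```javascript
// Hypothetical sketch: a mixin "decorator" as a plain function that
// copies mixin members onto the target class's prototype.
function includes(...mixins) {
  return function (Target) {
    Object.assign(Target.prototype, ...mixins);
    return Target;
  };
}

const MyMixin = { greet() { return 'hello'; } };

class Base {}
const MyClass = includes(MyMixin)(Base);

console.log(new MyClass().greet()); // 'hello'
```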
I'm converting this PR to a draft. I think it's a good place to work on and discuss this feature, but I see more benefit in chunking the work for this PR out into several smaller pieces of work that are easier to review.
Closing this, since all the changes seem to have been merged via five individual PRs.
Yeah, there are still a couple things left. But I feel this PR does not serve a useful purpose at this point.
|
gharchive/pull-request
| 2023-01-17T00:02:28 |
2025-04-01T04:55:17.337277
|
{
"authors": [
"Falke-Design",
"IvanSanchez",
"jonkoops"
],
"repo": "Leaflet/Leaflet",
"url": "https://github.com/Leaflet/Leaflet/pull/8806",
"license": "BSD-2-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
1513180886
|
Debug toggle functionality (issue #100)
Added toggles for each debug info setting.
Refactored some things to make more sense or to avoid wasting work.
This is just the plumbing for most of the functionality. Only the FPS info can be toggled (either by dev_mode as a whole, or by the fps specific one).
Should there be documentation anywhere about the current keybinds for these toggles? It seems right now that information isn't readily available for developers before it makes its way into an info box somewhere.
Also wondering which TODOs are now obsolete with my changes.
Also another question. Right now, the keybinding functionality is in hive_mind.rs which is in emergence_lib, while the location for the keybinding documentation is in the debug_tools crate as per #173.
Would it be a good idea to move this functionality completely into debug_tools now, sometime, or never? It makes sense to me for this to live in debug_tools/lib.rs rather than mingling with hive_mind controls, but I just wanted some other input on this.
@bencecile Right, that makes a lot of sense, I didn't realize this was in hive_mind.rs --- it shouldn't be there for sure
One CI check for documentation currently fails, you can check the error yourself with cargo clippy
I would like another look at this before it gets merged in with the 2 related issues being closed (#100 and #173)
Sounds good, let us know when you're confident in this.
|
gharchive/pull-request
| 2022-12-28T21:27:46 |
2025-04-01T04:55:17.342865
|
{
"authors": [
"TimJentzsch",
"alice-i-cecile",
"bencecile",
"bzm3r"
],
"repo": "Leafwing-Studios/Emergence",
"url": "https://github.com/Leafwing-Studios/Emergence/pull/171",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2449244442
|
Update Hooks
resources/app/client/core/hooks.js
No changes @JPMeehan
|
gharchive/issue
| 2024-08-05T18:45:54 |
2025-04-01T04:55:17.343821
|
{
"authors": [
"JPMeehan",
"dovrosenberg"
],
"repo": "League-of-Foundry-Developers/foundry-vtt-types",
"url": "https://github.com/League-of-Foundry-Developers/foundry-vtt-types/issues/2752",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
985155929
|
Adding support to use json file to use only specific plugins
TL;DR
This merge request adds a new feature: use a JSON file to load only specific plugins, via --plugins-conf or -p.
Use case
If you are scanning for a specific port, you don't necessarily need to load all the plugins. Therefore, we added a new option to set a specific set of plugins.
The default value is plugins.json, located in the current directory. The file format is simple; only the plugin names are needed:
{
"plugins": [
"CouchDbOpenPlugin",
"ElasticSearchExplorePlugin",
"ElasticSearchOpenPlugin",
"MongoSchemaPlugin",
"MongoOpenPlugin",
"SSHOpenPlugin",
"DotDsStoreOpenPlugin",
"NucleiPlugin",
"MysqlOpenPlugin",
"MysqlExplorePlugin",
"RedisOpenPlugin",
"KafkaOpenPlugin",
"ApacheStatusHttpPlugin",
"ConfigJsonHttp",
"DotEnvConfigPlugin",
"GitConfigPlugin",
"IdxConfigPlugin",
"LaravelTelescopeHttpPlugin",
"PhpInfoHttpPlugin",
"FirebaseHttpPlugin",
"WpUserEnumHttp"
]
}
Hi, not forgetting about this !
This is interesting, I will probably extend it with plugin configuration so we can finally pass settings to them (the options part of https://github.com/LeakIX/l9format/blob/master/l9plugin.go#L98 )
Probably with include/exclude and protocols sections, with plugins as a list.
|
gharchive/pull-request
| 2021-09-01T13:29:10 |
2025-04-01T04:55:17.350053
|
{
"authors": [
"gboddin",
"xmco"
],
"repo": "LeakIX/l9explore",
"url": "https://github.com/LeakIX/l9explore/pull/8",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
691355632
|
Consolidate type-level and value-level constructors
Instead of two separate SchemaType types (one in Schema.Internal, one in Schema.Show), we'll have one SchemaType' parametrized type. Same for SchemaKey (one in Schema.Internal, one in Schema.Key).
With this consolidation, we can refactor quasiquoters like schema, where instead of going directly from parsed value -> TypeQ, we'll go parsed value -> SchemaV -> TypeQ, which will give us more power in inspecting things.
This does result in a slight decrease in performance due to the intermediate step, and also because we're now reifying included schemas rather than just passing them along. But we get the ability to implement features that would be difficult to do before. For example, we can't currently unwrap into an included schema, but now we can, treating an included schema as more of a shortcut for copy/paste and less of a special case.
Codecov Report
Merging #44 into master will decrease coverage by 4.95%.
The diff coverage is 78.92%.
@@ Coverage Diff @@
## master #44 +/- ##
==========================================
- Coverage 87.83% 82.87% -4.96%
==========================================
Files 12 12
Lines 411 473 +62
Branches 27 37 +10
==========================================
+ Hits 361 392 +31
- Misses 40 67 +27
- Partials 10 14 +4
Impacted Files | Coverage Δ
src/Data/Aeson/Schema/TH/Utils.hs | 65.62% <48.64%> (-8.34%) :arrow_down:
src/Data/Aeson/Schema/TH/Get.hs | 85.10% <60.00%> (-7.00%) :arrow_down:
src/Data/Aeson/Schema/TH/Schema.hs | 79.48% <80.18%> (-11.43%) :arrow_down:
src/Data/Aeson/Schema/TH/Parse.hs | 94.28% <88.57%> (-0.16%) :arrow_down:
src/Data/Aeson/Schema/Type.hs | 92.30% <92.30%> (ø)
src/Data/Aeson/Schema/Key.hs | 93.75% <93.75%> (-6.25%) :arrow_down:
src/Data/Aeson/Schema/Internal.hs | 100.00% <100.00%> (+1.38%) :arrow_up:
src/Data/Aeson/Schema/TH/Getter.hs | 100.00% <100.00%> (ø)
src/Data/Aeson/Schema/TH/Unwrap.hs | 70.00% <100.00%> (ø)
... and 3 more
Continue to review full report at Codecov.
Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update aaafa3b...3811b1d. Read the comment docs.
|
gharchive/pull-request
| 2020-09-02T19:21:48 |
2025-04-01T04:55:17.370170
|
{
"authors": [
"brandon-leapyear",
"codecov-commenter"
],
"repo": "LeapYear/aeson-schemas",
"url": "https://github.com/LeapYear/aeson-schemas/pull/44",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
1066180450
|
Add support for TEUR
TRC20 token definition for TEUR (contract TZJmk51TP3YEETw7phRP581A7H3VzLDFsa) added
Replaced by PR #30
|
gharchive/pull-request
| 2021-11-29T15:31:51 |
2025-04-01T04:55:17.418975
|
{
"authors": [
"Alteway",
"lpascal-ledger"
],
"repo": "LedgerHQ/app-tron",
"url": "https://github.com/LedgerHQ/app-tron/pull/27",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1374954082
|
Avalanche P-Chain desktop integration
📝 Description
This PR adds Avalanche (P-Chain) syncing and staking to Ledger Live Desktop
❓ Context
Impacted projects: ``
Linked resource(s): ``
✅ Checklist
[x] Test coverage
[x] Atomic delivery
[x] No breaking changes
📸 Demo
Staking demo
🚀 Expectations to reach
Please make sure you follow these Important Steps.
Pull Requests must pass the CI and be internally validated in order to be merged.
An updated version of this PR can be found here:
https://github.com/LedgerHQ/ledger-live/pull/2506
|
gharchive/pull-request
| 2022-09-15T19:00:18 |
2025-04-01T04:55:17.431758
|
{
"authors": [
"henrily-ledger",
"trentkrogers"
],
"repo": "LedgerHQ/ledger-live",
"url": "https://github.com/LedgerHQ/ledger-live/pull/1292",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2195328050
|
[LIVE-11852] Support/evm sync height
📝 Description
Synchronization of EVM-like coins is done by retrieving all operations for a block range.
This PR aims to optimize the scanned interval:
before [latest_operation_height, top_block]
after [latest_sync_height, top_block]
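With toy numbers (illustrative only), the saving is easy to see: the last on-chain operation may be far behind the last successful sync, so re-scanning from it repeats work that earlier syncs already did.

```javascript
// Toy values: last operation at block 1000, last completed sync at 9000,
// current chain tip at 10000.
const latestOperationHeight = 1000;
const latestSyncHeight = 9000;
const topBlock = 10000;

const blocksScannedBefore = topBlock - latestOperationHeight; // 9000 blocks
const blocksScannedAfter = topBlock - latestSyncHeight;       // 1000 blocks
```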
✅ Checklist
Pull Requests must pass the CI and be code reviewed. Set as Draft if the PR is not ready.
[x] npx changeset was attached.
[ ] Covered by automatic tests.
[ ] Impact of the changes:
...
🧐 Checklist for the PR Reviewers
[ ] The code aligns with the requirements described in the linked JIRA or GitHub issue.
[ ] The PR description clearly documents the changes made and explains any technical trade-offs or design decisions.
[ ] There are no undocumented trade-offs, technical debt, or maintainability issues.
[ ] The PR has been tested thoroughly, and any potential edge cases have been considered and handled.
[ ] Any new dependencies have been justified and documented.
[ ] Performance considerations have been taken into account. (changes have been profiled or benchmarked if necessary)
/generate-screenshots
|
gharchive/pull-request
| 2024-03-19T15:43:04 |
2025-04-01T04:55:17.437936
|
{
"authors": [
"lambertkevin",
"vbergeron-ledger"
],
"repo": "LedgerHQ/ledger-live",
"url": "https://github.com/LedgerHQ/ledger-live/pull/6482",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2364083882
|
[QAA-118][Playwright][Speculos] Create staging build for speculos tests
✅ Checklist
[ ] npx changeset was attached.
[ ] Covered by automatic tests.
[ ] Impact of the changes:
...
📝 Description
Create staging build for speculos tests
❓ Context
JIRA or GitHub link:
🧐 Checklist for the PR Reviewers
The code aligns with the requirements described in the linked JIRA or GitHub issue.
The PR description clearly documents the changes made and explains any technical trade-offs or design decisions.
There are no undocumented trade-offs, technical debt, or maintainability issues.
The PR has been tested thoroughly, and any potential edge cases have been considered and handled.
Any new dependencies have been justified and documented.
Performance considerations have been taken into account. (changes have been profiled or benchmarked if necessary)
Speculos Tests
Mocked Tests
|
gharchive/pull-request
| 2024-06-20T10:22:46 |
2025-04-01T04:55:17.444135
|
{
"authors": [
"abdurrahman-ledger"
],
"repo": "LedgerHQ/ledger-live",
"url": "https://github.com/LedgerHQ/ledger-live/pull/7151",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1677997691
|
🛑 MobileCompanyGlobal1 is down
In b34a9bf, MobileCompanyGlobal1 (https://mglobal1.wisereport.co.kr/Home/ServerCheck) was down:
HTTP code: 0
Response time: 0 ms
Resolved: MobileCompanyGlobal1 is back up in 61a7adf.
|
gharchive/issue
| 2023-04-21T07:25:37 |
2025-04-01T04:55:17.446620
|
{
"authors": [
"LeeYoungJin"
],
"repo": "LeeYoungJin/fg_upptime",
"url": "https://github.com/LeeYoungJin/fg_upptime/issues/1290",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1717960306
|
🛑 CompanyWise2 is down
In 7e3067c, CompanyWise2 (https://comp2.wisereport.co.kr/servercheck.aspx) was down:
HTTP code: 0
Response time: 0 ms
Resolved: CompanyWise2 is back up in 81ec540.
|
gharchive/issue
| 2023-05-20T00:30:46 |
2025-04-01T04:55:17.449004
|
{
"authors": [
"LeeYoungJin"
],
"repo": "LeeYoungJin/fg_upptime",
"url": "https://github.com/LeeYoungJin/fg_upptime/issues/3046",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1806006966
|
🛑 ETFGlobal2 is down
In 7fa5444, ETFGlobal2 (https://globaletf2.wisereport.co.kr/Home/ServerCheck) was down:
HTTP code: 0
Response time: 0 ms
Resolved: ETFGlobal2 is back up in 131154d.
|
gharchive/issue
| 2023-07-15T10:20:47 |
2025-04-01T04:55:17.451708
|
{
"authors": [
"LeeYoungJin"
],
"repo": "LeeYoungJin/fg_upptime",
"url": "https://github.com/LeeYoungJin/fg_upptime/issues/5821",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1832031064
|
🛑 MobileETFGlobal3 is down
In 2dfd6e2, MobileETFGlobal3 (https://mglobaletf3.wisereport.co.kr/Home/ServerCheck) was down:
HTTP code: 0
Response time: 0 ms
Resolved: MobileETFGlobal3 is back up in 84a9200.
|
gharchive/issue
| 2023-08-01T21:05:53 |
2025-04-01T04:55:17.454245
|
{
"authors": [
"LeeYoungJin"
],
"repo": "LeeYoungJin/fg_upptime",
"url": "https://github.com/LeeYoungJin/fg_upptime/issues/6125",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1876369615
|
🛑 Retamin3 is down
In bd44cb3, Retamin3 (https://app3.wisereport.co.kr/Home/ServerCheck) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Retamin3 is back up in 7aa8c23 after 10 minutes.
|
gharchive/issue
| 2023-08-31T22:00:20 |
2025-04-01T04:55:17.456564
|
{
"authors": [
"LeeYoungJin"
],
"repo": "LeeYoungJin/fg_upptime",
"url": "https://github.com/LeeYoungJin/fg_upptime/issues/6711",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1955907273
|
🛑 MobileETFGlobal3 is down
In a203857, MobileETFGlobal3 (https://mglobaletf3.wisereport.co.kr/Home/ServerCheck) was down:
HTTP code: 0
Response time: 0 ms
Resolved: MobileETFGlobal3 is back up in cbdc29d after 10 minutes.
|
gharchive/issue
| 2023-10-22T13:41:23 |
2025-04-01T04:55:17.458884
|
{
"authors": [
"LeeYoungJin"
],
"repo": "LeeYoungJin/fg_upptime",
"url": "https://github.com/LeeYoungJin/fg_upptime/issues/7239",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
927087288
|
feat(auto play carousel): added auto play carousel feature
Description
Added optional autoPlay mode to Carousel component. This adds a timer with pause and play button to give control to the user. AutoplayDelay is a parameter that can be defined to set the timer.
Storybook link: (once netlify has deployed link provide a link to the component)
Design link: https://legalandgeneral.invisionapp.com/overview/Canopy-Carousel-ckou4o2im00m9013db6ww6lfv/screens?share=&sortBy=1&sortOrder=1&viewLayout=2
Screenshot: https://www.awesomescreenshot.com/video/4219164?key=a2993ad195e64c47115859e396c26b28
Checklist:
[x] The commit messages follow the convention for this project
[x] I have provided an adequate amount of test coverage
[x] I have added the functionality to the test app
[x] I have provided a story in storybook to document the changes
[x] I have provided documentation in the notes section of the story
[x] I have added any new public feature modules to public-api.ts
Hi @elenagarrone and @owensgit You both commented on the play Icon not being aligned to the centre. It actually is a centred canopy icon but looks like an optical illusion to me due to the shape of the svg fill.
We could add a left margin to the icon to shift it so it looks more centred, but adding css to fix the position of the icon feels like hacky bad practice to me.
We do have an Icon called play-spot which looks like the play is more centred but we don't have the pause equivalent. I think a better option than adding margin to the icon would be to leave it as it is for now but get a pause-spot designed and added and then switch to that version as a future enhancement.
What do you guys think?
Hi @elenagarrone and @owensgit You both commented on the play Icon not being aligned to the centre. It actually is a centred canopy icon but looks like an optical illusion to me due to the shape of the svg fill.
We could add a left margin to the icon to shift it so it looks more centred, but adding css to fix the position of the icon feels like hacky bad practice to me.
We do have an Icon called play-spot which looks like the play is more centred but we don't have the pause equivalent. I think a better option than adding margin to the icon would be to leave it as it is for now but get a pause-spot designed and added and then switch to that version as a future enhancement.
What do you guys think?
Thanks for looking into this. I think that's a good idea, IMO we can live with a slight misalignment for now. Alternatively, we could see the CSS hack as a temporary solution until we have proper icons, but I wouldn't mind if we didn't. What do you think @elenagarrone?
Would you be okay to add a new issue to work on the pause-spot icon designed and incorporated so we don't forget?
We do have an Icon called play-spot which looks like the play is more centred but we don't have the pause equivalent. I think a better option than adding margin to the icon would be to leave it as it is for now but get a pause-spot designed and added and then switch to that version as a future enhancement.
What do you guys think?
Thanks for looking into this. I think that's a good idea, IMO we can live with a slight misalignment for now. Alternatively, we could see the CSS hack as a temporary solution until we have proper icons. What do you think @elenagarrone?
@dannyoz Would you be okay to add a new issue to work on the pause-spot icon designed and incorporated so we don't forget?
I can do that sure 👍
|
gharchive/pull-request
| 2021-06-22T10:36:21 |
2025-04-01T04:55:17.523068
|
{
"authors": [
"Nupur04",
"dannyoz",
"owensgit"
],
"repo": "Legal-and-General/canopy",
"url": "https://github.com/Legal-and-General/canopy/pull/424",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2324865262
|
feat: created pull request templates
Hello @josemateuss, I've created the files right now, you can add a simple template of what you think could be perfect and I will refactor if needed ;)
closes #52
@LelouchFR sorry for my dummy question, but how can I send my commits to this branch? Cause my fork is pointing to main.
@LelouchFR sorry for my dummy question, but how can I send my commits to this branch? Cause my fork is pointing to main.
Normally, you just have to fetch the upstream and checkout at the ghpr-template branch, like this:
git remote add upstream https://github.com/lelouchfr/skill-icons.git
git fetch upstream
git checkout -b ghpr-template upstream/ghpr-template
after that, you just have to commit then push and open a pr to this branch (ghpr-template)
@josemateuss no problem, we've all been there don't worry :D
@LelouchFR I was testing this approach in a personal test repository and I couldn't choose a PR template when opening a PR, even with my templates in the main branch. Did you validate this .github/PULL_REQUEST_TEMPLATES approach?
I just could receive a template when I put a file pull_request_template.md inside .github directory. I believe this approach works only for issues. What do you think?
@josemateuss Normally it should work, you've done a typo in your directory name, there should be no S in the PULL_REQUEST_TEMPLATE (idk if it was intentional) but it should be working, normally you can already try when doing a pull request to the ghpr-template branch, it should be asking you if you want to use a template I guess.
@josemateuss I've got another idea to also include in the PR which could help everyone to make better icons: a CONTRIBUTING.md (since we are talking about files in the .github directory).
anyway, do you think you could do the pr templates, or should I do them alone ?
Sorry @LelouchFR I got caught up in things these days, but I'll make it happen
Sorry @LelouchFR I got caught up in things these days, but I'll make it happen
Yes sure, No problem with that ;) I understand it
Man, I did that: https://github.com/LelouchFR/skill-icons/pull/55#issuecomment-2139642659, but I realized that push is being send to your branch, not mine, do you know how can I make this work?
@josemateuss oh yes I'm dumb, I've put in the commands to set the upstream to mine instead of yours (bruh)
normally this should fix it (I think):
git remote add upstream https://github.com/josemateuss/skill-icons.git
Sorry, I did that but GH opened a new PR :/ #68
Oh, my bad, it's correct, #68 is targeting to your branch, and #55 is targeting main.
Sorry, I did that but GH opened a new PR :/ #68
this is perfect ;)
|
gharchive/pull-request
| 2024-05-30T06:59:35 |
2025-04-01T04:55:17.565474
|
{
"authors": [
"LelouchFR",
"josemateuss"
],
"repo": "LelouchFR/skill-icons",
"url": "https://github.com/LelouchFR/skill-icons/pull/55",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2197416415
|
Latest version is not compatible with the quest version "V63":
All of the following criteria must be met
[ ] Full Latest.log file included. If no file exists then leave this unchecked and state so.
All of the following are optional to answer
[X] Tried reinstalling the game.
[X] Tried reinstalling LemonLoader.
[X] Tried restarting device.
Describe the issue.
I recently factory reset my headset because I have been getting bad performance in many games. After the factory reset my Quest updated to version 63, and when I try to use LemonLoader an "error" of sorts comes up whenever I try granting it permissions: "Can't use this folder. To protect your privacy, choose another folder". I would appreciate it if this could be fixed so I can start having fun modding games again. Thanks for reading!
Did you attach your log file?
[ ] Yes, I attached my log file to the text box above.
[X] No, I could not find a log file at /storage/emulated/0/Android/data/<package name>/files/melonloader/etc/Latest.log
yeah, they need to make it ask for manage all files THEN data and it should work
or just release it to applab
yeah, they need to make it ask for manage all files THEN data and it should work
It does; it's a problem with the Android version it's using. It's nothing to do with LemonLoader, but I made this issue to make sure they know about it and see if they can fix it. Also, switching it to App Lab would still have the same problem.
Wait for Meta to fix it, or wait for me to maybe possibly do #81
|
gharchive/issue
| 2024-03-20T12:29:25 |
2025-04-01T04:55:17.601289
|
{
"authors": [
"Evelyn-dumbass",
"TrevTV",
"maxhax123"
],
"repo": "LemonLoader/MelonLoader",
"url": "https://github.com/LemonLoader/MelonLoader/issues/79",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
785823524
|
Bottle o' Lighting doesn't work as it should.
There are no visuals, only the thunder. Entities aren't set on fire too.
Great, GitHub being stupid and duplicating an issue when the Submit button is clicked two times.
|
gharchive/issue
| 2021-01-14T09:21:20 |
2025-04-01T04:55:17.602567
|
{
"authors": [
"RDKRACZ"
],
"repo": "Lemonszz/Biome-Makeover",
"url": "https://github.com/Lemonszz/Biome-Makeover/issues/25",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
568263151
|
Incorrect output for percentage numbers...
I tested some percentage numbers such as:
3.01
50.05
1.049
It ignores the first zero after the decimal point.
The output becomes, for example: "tres virgula um porcento", "cinquenta virgula 5 percentual", "um virgula quarenta de nove porcento".
Good test, thanks for the feedback.
The correct output would be:

| Number | In words |
|---|---|
| 3.01 | três vírgula um centésimo por cento |
| 50.05 | cinquenta vírgula cinco centésimos por cento |
| 1.049 | um vírgula quarenta e nove milésimos por cento |

Do you agree?
I'll work on this fix as soon as possible and release it in version 1.0.8.
Fix applied in version 1.0.8.
Closing the issue, thanks.
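The leading-zero bug can be sketched in Python (assumed logic for illustration, not the library's actual code): parsing the fractional part as a plain integer collapses "01" to 1 with no record of its length, while keeping the digit count preserves the centésimo/milésimo denominator.

```python
# Assumed logic for illustration, not the library's actual code: if the
# fractional part of "3.01" is parsed as a plain int, "01" collapses to 1
# and the output reads "três vírgula um" instead of
# "três vírgula um centésimo". Keeping the digit count (len of the
# fractional string) preserves the denominator.
DENOMINATORS = {1: "décimo(s)", 2: "centésimo(s)", 3: "milésimo(s)"}

def fractional_unit(number: str) -> tuple:
    frac = number.split(".")[1]          # e.g. "01" for "3.01"
    return int(frac), DENOMINATORS[len(frac)]

print(fractional_unit("3.01"))   # (1, 'centésimo(s)')
print(fractional_unit("1.049"))  # (49, 'milésimo(s)')
```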
|
gharchive/issue
| 2020-02-20T12:22:58 |
2025-04-01T04:55:17.624003
|
{
"authors": [
"LenonBordini",
"wesleybrunoqd"
],
"repo": "LenonBordini/numero-por-extenso",
"url": "https://github.com/LenonBordini/numero-por-extenso/issues/4",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
377181010
|
New socialmedia
Proposal:
Proposal written by Monkeyyy11
Is it a proposal for the website or for the bot?: bot
What is the title of your proposal?: New socialmedia
Explain your proposal more accurately (It's best to give as much information as possible, so that we can implement the proposal better): Add new social media:
Facebook
Github
Crowdin
Pinterest
askfm
Why should we add this feature?: The users have more possibilities to promote their socialmedia accounts
ReportID: 164
➤ LenoxBot commented:
Approve:
good proposal
Denarioyt#3352
➤ LenoxBot commented:
Approve:
👍
Wandi#9576
➤ LenoxBot commented:
Approve:
good idea
Dadi#7808
|
gharchive/issue
| 2018-11-04T18:35:20 |
2025-04-01T04:55:17.628916
|
{
"authors": [
"LenoxBot-GitHub"
],
"repo": "LenoxBot/LenoxBot",
"url": "https://github.com/LenoxBot/LenoxBot/issues/229",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
307831579
|
fix: fixes #50 added validation for profiles while container create a…
…nd update. Will fail with 400 error and list of profiles that are not present in our db.
The commit message is wrong, but the content should be correct.
|
gharchive/pull-request
| 2018-03-22T22:04:54 |
2025-04-01T04:55:17.680622
|
{
"authors": [
"ChibangLW"
],
"repo": "LexicForLXD/Backend",
"url": "https://github.com/LexicForLXD/Backend/pull/83",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
248369676
|
Crashes upon entering Misty World. [1.12]
It's not an instant crash; it crashes after the world is kind of "loaded" or while it's loading.
Crash: https://gist.github.com/Fundryi/2e170e4cdd4bc099191e960b3bccc82e
Oh! I have run into this error before, but I could not figure out the reason.
If possible, send me a save of the world with this error somehow. This will help me solve it.
Please check the latest version MistWorld_1.12.1_alpha_a_05.jar
It seems to me, I was able to fix the bug.
|
gharchive/issue
| 2017-08-07T10:29:14 |
2025-04-01T04:55:17.685633
|
{
"authors": [
"Fundryi",
"Liahim85"
],
"repo": "Liahim85/Misty-World",
"url": "https://github.com/Liahim85/Misty-World/issues/7",
"license": "Artistic-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
959781778
|
[DOCUMENTATION] Create semester 2 roadmap
A roadmap showing the hoped-for milestones and sprints, and when they will occur, needs to be created to give structure to semester 2 and keep us all to a specific timeframe.
The roadmap document may be found here: https://docs.google.com/document/d/1CtCLn9-wIxLlVCPzwvzBTf2yHyYb3vTXD8KfzLblisM/edit?usp=sharing
Additional context is provided in the following discord message: https://discord.com/channels/818314997389328464/818335235010986025/872497604464877668
This will be a living document going forward and will be given its own wiki page, once this is done, the issue will be closed.
Relevant Wiki page:
https://github.com/Liam-Harrison/Celeritas/wiki/Semester-2-Development-Schedule
Further edits will be made here, closing issue.
|
gharchive/issue
| 2021-08-04T02:00:39 |
2025-04-01T04:55:17.688503
|
{
"authors": [
"LiamCMoore",
"ebsmarch"
],
"repo": "Liam-Harrison/Celeritas",
"url": "https://github.com/Liam-Harrison/Celeritas/issues/168",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1187639652
|
Move GitHub actions to Linux
According to "About billing for GitHub Actions", a macOS minute costs 10 units. Given that a single build + tests takes 6 min, this is 60 units for a single check. This gives us 33 runs per month.
With PyTests (tests written in Python) + PyTests in release mode we will easily push above 10 min (100 units per run -> 20 runs per month).
On linux single minute costs 1 unit -> 333 runs (or 200 with PyTests).
Additional benefits:
no Xcode on linux -> no tests for Ariel, but also no errors when Xcode version changes (this already happened in the past)
I sometimes forget to check linux builds, this will automate the whole thingy
Running on Linux may take less time, but you don't need to worry about billing as long as this repository is public. The billing policy applies only to private repositories.
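The run counts quoted above can be checked with a quick arithmetic sketch (the 2,000-unit monthly budget is an assumption that matches the numbers in the issue):

```python
# Worked arithmetic for the billing comparison above. The 2,000-unit
# monthly budget is an assumption for illustration; it reproduces the
# 33/20/333/200 run counts quoted in the issue.
INCLUDED_UNITS = 2000          # assumed monthly unit budget
MACOS_MULTIPLIER = 10          # a macOS minute costs 10 units
LINUX_MULTIPLIER = 1           # a Linux minute costs 1 unit

runs_macos = INCLUDED_UNITS // (6 * MACOS_MULTIPLIER)           # 6-min check
runs_macos_pytests = INCLUDED_UNITS // (10 * MACOS_MULTIPLIER)  # 10-min check
runs_linux = INCLUDED_UNITS // (6 * LINUX_MULTIPLIER)
runs_linux_pytests = INCLUDED_UNITS // (10 * LINUX_MULTIPLIER)

print(runs_macos, runs_macos_pytests, runs_linux, runs_linux_pytests)
# 33 20 333 200
```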
|
gharchive/issue
| 2022-03-31T07:30:48 |
2025-04-01T04:55:17.693821
|
{
"authors": [
"LiarPrincess",
"youknowone"
],
"repo": "LiarPrincess/Violet",
"url": "https://github.com/LiarPrincess/Violet/issues/6",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2246257951
|
Substrate Proxy Pallet
New tooling has made working with the Proxy Pallet and Multisig together easier. I propose that we add the proxy pallet so that we can then add support for Frequency to these tools.
Notes
Primary Use Case: Administrative, Multisig Token management, etc...
Would not have anything to do with Capacity
Minimal code size
Standard configuration
Example Tooling: https://polkadotmultisig.com/
References
https://paritytech.github.io/polkadot-sdk/master/pallet_proxy/index.html
https://github.com/paritytech/polkadot-sdk/blob/master/substrate/frame/proxy/README.md
Notes from Community Call 2024-04-18
General agreement that it is ok
Need to watch out for any impacts on Capacity that are not currently foreseen
Checking with the Signet team to see if there are any requirements around ProxyType support: https://paritytech.github.io/polkadot-sdk/master/pallet_proxy/pallet/trait.Config.html#associatedtype.ProxyType
|
gharchive/issue
| 2024-04-16T14:50:27 |
2025-04-01T04:55:17.727651
|
{
"authors": [
"wilwade"
],
"repo": "LibertyDSNP/frequency",
"url": "https://github.com/LibertyDSNP/frequency/issues/1937",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
500933594
|
Should SQL move to top of table?
Now that SQL has moved to stable should it move up in the Extended Curriculum table? https://librarycarpentry.org/lessons/
Seems like a good idea 👍
Thanks @libcce and @ccronje
Would either of you be interested in putting in a PR?
@maneesha thanks to @ccronje this issue is done. The change is already live. We can close this.
|
gharchive/issue
| 2019-10-01T14:25:18 |
2025-04-01T04:55:17.731349
|
{
"authors": [
"ccronje",
"libcce",
"maneesha"
],
"repo": "LibraryCarpentry/librarycarpentry.github.io",
"url": "https://github.com/LibraryCarpentry/librarycarpentry.github.io/issues/71",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
179591797
|
Registration Step
If the client posts a Patient resource to /register, we should
Parse out the provider
Figure out which connection to use (for now from database.php) by getting the connection out of the provider table
Store the patient in the EHR database using the appropriate connection by calling setConnection() on the repository or the models themselves.
Store a reference to the connection and ehr_pid in the users table (this needs to be added)
Build a JSON response
I added connection field to provider table.
I parsed out Provider ID and got connection, see storeInterface() method on FHIRPatientAdapter
TODO
add ehr_pid and connection fields to users table
Make a RegisterController that uses the FHIRPatientAdapter to store the POSTed Patient Resource into the EHR database corresponding to profider, and add route /register
Enhance routing so that we can pass the connection name to the API server like this: https://gponline-fhir.vu2vu.com/fhir/mysql/Patient/47, where mysql is the connection key. Then set that connection key in the FHIR***Adapter so that the repository that talks to the EHR will use the correct connection. This is because we don't want to pass the provider with every request. A user will not be allowed to change providers in this version.
JSON request and JSON response look like this (notice in the response, that the connection name is in the path):
Sample request to /register
{
"resourceType": "Patient",
"name": [
{
"use": "usual",
"family": [
"Everywoman3"
],
"given": [
"Judy Simple"
]
}
],
"telecom": [
{
"system": "phone",
"value": "555-555-2003",
"use": "primary"
},
{
"system": "email",
"value": "",
"use": "primary"
}
],
"gender": "female",
"birthDate": "1955-01-06",
"address": [
{
"use": "home",
"line": [
"45 pine st",
"Mt eden"
],
"city": "Auckland"
}
],
"extension": [
{
"url": "https://gponline-fhir.vu2vu.com/gponline-patient-extensions",
"extension": [
{
"url": "#groupId",
"valueString": ""
},
{
"url": "#status",
"valueString": "pending"
},
{
"url": "#providerId",
"valueString": "34"
},
{
"url": "#pharmacyId",
"valueString": "5"
},
{
"url": "#stripeToken",
"valueString": "some-value"
}
]
}
]
}
Sample response from /register
{
"resourceType": "Bundle",
"type": "transaction-response",
"total": 1,
"entry": [
{
"fullUrl": "https://gponline-fhir.vu2vu.com/fhir/mysql/Patient/47",
"resource": {
"id": 47,
"resourceType": "Patient",
"identifier": [
{
"use": "usual",
"value": 47
}
],
"name": [
{
"use": "usual",
"family": ["Everywoman3"],
"given": ["Judy Simple"]
}
],
"telecom": [
{
"system": "phone",
"value": "555-555-2003",
"use": "primary"
},
{
"system": "email",
"value": "",
"use": "primary"
}
],
"gender": "female",
"birthDate": "1955-01-06",
"address" : [
{
"use" : "home",
"line" : [
"45 pine st",
"Mt eden"
],
"city" : "Auckland"
}
],
"extension" : [
{
"url" : "https://gponline-fhir.vu2vu.com/extensions/gponline-patient-data",
"extension" : [
{
"url" : "#groupId",
"valueString" : "1001"
},
{
"url" : "#status",
"valueString" : "pending"
},
{
"url" : "#providerId",
"valueString" : "34"
},
{
"url" : "#pharmacyId",
"valueString" : "5"
},
{
"url" : "#stripeToken",
"valueString" : "some-value"
}
]
}
]
},
"response": {
"status": "201 Created",
"location": "Patient/47"
}
}
]
}
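The proposed routing enhancement — pulling the connection key out of the path so the repository can pick the right EHR connection without the client resending the provider — could be sketched like this (hypothetical helper, not the project's actual Laravel route code):

```python
# Hypothetical sketch of the proposed routing: extract the connection key
# from a path like /fhir/mysql/Patient/47. Names are illustrative only;
# the real implementation lives in the Laravel route/adapter layer.
def parse_fhir_path(path: str) -> dict:
    parts = path.strip("/").split("/")
    if len(parts) != 4 or parts[0] != "fhir":
        raise ValueError(f"unexpected path: {path}")
    _, connection, resource, resource_id = parts
    return {"connection": connection, "resource": resource, "id": resource_id}

print(parse_fhir_path("/fhir/mysql/Patient/47"))
# {'connection': 'mysql', 'resource': 'Patient', 'id': '47'}
```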
Add RegisterController and amend PatientAdapter for Registration Step
https://github.com/LibreEHR/fhir/pull/49
https://github.com/LibreEHR/core/pull/19
|
gharchive/issue
| 2016-09-27T19:36:57 |
2025-04-01T04:55:17.763947
|
{
"authors": [
"Leo24",
"kchapple"
],
"repo": "LibreEHR/fhir",
"url": "https://github.com/LibreEHR/fhir/issues/46",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
957499198
|
Internal Server error on create caption
Internal Server Error: /api/photosedit/generateim2txt
Traceback (most recent call last):
File "/usr/local/lib/python3.8/site-packages/django/core/handlers/exception.py", line 47, in inner
response = get_response(request)
File "/usr/local/lib/python3.8/site-packages/django/core/handlers/base.py", line 181, in _get_response
response = wrapped_callback(request, *callback_args, **callback_kwargs)
File "/usr/local/lib/python3.8/site-packages/django/views/decorators/csrf.py", line 54, in wrapped_view
return view_func(*args, **kwargs)
File "/usr/local/lib/python3.8/site-packages/django/views/generic/base.py", line 70, in view
return self.dispatch(request, *args, **kwargs)
File "/usr/local/lib/python3.8/site-packages/rest_framework/views.py", line 509, in dispatch
response = self.handle_exception(exc)
File "/usr/local/lib/python3.8/site-packages/rest_framework/views.py", line 469, in handle_exception
self.raise_uncaught_exception(exc)
File "/usr/local/lib/python3.8/site-packages/rest_framework/views.py", line 480, in raise_uncaught_exception
raise exc
File "/usr/local/lib/python3.8/site-packages/rest_framework/views.py", line 506, in dispatch
response = handler(request, *args, **kwargs)
File "/code/api/views/views.py", line 727, in post
image_hash = data['image_hash']
KeyError: 'image_hash'
When clicking on "Generate caption"
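A hedged sketch of the kind of guard that would turn this 500 into a clean 400 (the handler and field names are stand-ins, not the actual librephotos view code):

```python
# Sketch only: validate the request body instead of letting a missing key
# raise a KeyError (which surfaces as a 500). The dict-based handler is a
# hypothetical stand-in for the real DRF view.
def post(request_data: dict) -> dict:
    image_hash = request_data.get("image_hash")
    if image_hash is None:
        return {"status": 400, "error": "image_hash is required"}
    return {"status": 200, "image_hash": image_hash}

print(post({}))                        # 400 response instead of a KeyError
print(post({"image_hash": "abc123"}))  # 200 response
```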
Should be fixed now!
|
gharchive/issue
| 2021-08-01T14:00:28 |
2025-04-01T04:55:17.840598
|
{
"authors": [
"derneuere",
"nowheretobefound"
],
"repo": "LibrePhotos/librephotos",
"url": "https://github.com/LibrePhotos/librephotos/issues/306",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1478838386
|
Examples
Examples
client.py is missing from the torchaudio directory
Because it was already added in the master/main
|
gharchive/pull-request
| 2022-12-06T11:08:28 |
2025-04-01T04:55:17.859320
|
{
"authors": [
"hhsecond"
],
"repo": "Lightning-AI/LAI-Triton-Server-Component",
"url": "https://github.com/Lightning-AI/LAI-Triton-Server-Component/pull/2",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1340445982
|
Cherry pick to app release branch (app v0.5.6)
What does this PR do?
cherry pick #14002 and #14106 into current app release.
Does your PR introduce any breaking changes? If yes, please list them.
No
cc @borda
@rlizzo is it intentional to bring up just 1/2 of this feature? #14002
Seems to be an issue with the e2e tests, let's fix it first.
@manskx, any progress on the failing Win tests?
@manskx, any progress on the failing Win tests?
@Borda not yet.
@manskx, any progress on the failing Win tests?
@Borda not yet. You mean the e2e tests right ?
Anything needed to be safe with this release =)
|
gharchive/pull-request
| 2022-08-16T14:29:34 |
2025-04-01T04:55:17.862495
|
{
"authors": [
"Borda",
"manskx",
"rlizzo"
],
"repo": "Lightning-AI/lightning",
"url": "https://github.com/Lightning-AI/lightning/pull/14232",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2180453376
|
Add help strings for configs and CLI
In one of the Studios I added help strings for configs because it was requested. Users might explore configs before going to run --help in the CLI, so it might make sense to add them.
Furthermore, I also think the order in which the arguments are presented in the config is important. Relevant ones should be at the top, less frequently used ones at the bottom. We don't have such an ordering at the moment afaik.
Sharing these here in case we want to reuse some of these descriptions.
Pretraining
# The name of the model to pretrain
# Choose from names in litgpt/config.py
model_name: tiny-llama-1.1b
# Where to save checkpoints and logs
# If run in a MMT job, look for it in /teamspace/jobs/<job-name>/share
out_dir: out/pretrain/tiny-llama
# Path to a checkpoint dir to resume from in case training got interrupted
resume: false
# The name of the logger to send metrics to. Choose from 'tensorboard', 'csv', 'wandb'
logger_name: tensorboard
# Dataset arguments
data:
class_path: TinyLlama
init_args:
data_path: /teamspace/s3_connections/tinyllama-template
train:
# The length of the input sequences to train on, also known as "context size"
max_seq_length: 2048
# After how many optimization steps to save a checkpoint
save_interval: 1000
# After how many optimization steps to log metrics
log_interval: 1
# The batch size across all GPUs in a machine
global_batch_size: 512
# The batch size to use for gradient accumulation
# Maximize this value based on the available GPU VRAM
micro_batch_size: 1
# How many epochs to train for. Mutually exclusive with max_tokens and max_steps
epochs: null
# How many tokens to train for (total across all GPUs). Mutually exclusive with epochs and max_steps
max_tokens: 3000000000000
# How many optimization steps to train for. Mutually exclusive with epochs and max_tokens
max_steps: null
# For how many optimization steps to warm up the learning rate
lr_warmup_steps: 2000
# The max learning rate after linear warmup
learning_rate: 4e-4
# The minimum learning rate after cosine decay
min_lr: 4.0e-05
# How much weight decay to use in AdamW
weight_decay: 1e-1
# Beta parameters for AdamW
beta1: 0.9
beta2: 0.95
# Clip gradients to this norm
max_norm: 1.0
# Whether to tie embeddings (depends on the model)
tie_embeddings: null
eval:
# After how many optimization steps to run validation
interval: 1000
# How many tokens to generate during validation
max_new_tokens: null
# How many batches to run during validation
max_iters: 100
# Path to the tokenizer dir that was used for preprocessing the dataset
tokenizer_dir: tokenizer/Llama-2-7b-hf
# How many devices/GPUs to use
devices: auto
# The random seed to initialize the weights of the model
seed: 42
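The batch-size fields above usually relate through standard gradient-accumulation arithmetic (assumed here for illustration, not copied from the litgpt source):

```python
# Standard gradient-accumulation arithmetic, assumed for illustration:
# how many micro-batches each GPU accumulates before an optimizer step.
def accumulation_steps(global_batch_size: int, micro_batch_size: int, devices: int) -> int:
    assert global_batch_size % (micro_batch_size * devices) == 0
    return global_batch_size // (micro_batch_size * devices)

# With the pretraining config values above, on an assumed 8-GPU machine:
print(accumulation_steps(512, 1, 8))   # 64 accumulation steps per update
```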
Finetuning
# The path to the base model checkpoint dir to load for finetuning
checkpoint_dir: checkpoints/TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
# Where to save checkpoints and logs
out_dir: out/
# The precision to use for finetuning. Possible choices: bf16-true, bf16-mixed, 32-true
precision: bf16-true
# If set, quantize the model with this algorithm.
# Possible choices: bnb.nf4, bnb.nf4-dq, bnb.fp4, bnb.fp4-dq, bnb.int8-training
quantize: null
# How many devices/GPUs to use
devices: 1
# The LoRA hyperparameters
lora_r: 32
lora_alpha: 16
lora_dropout: 0.05
lora_query: true
lora_key: false
lora_value: true
lora_projection: false
lora_mlp: false
lora_head: false
# The name of the logger to send metrics to. Choose from 'tensorboard', 'csv', 'wandb'
logger_name: tensorboard
# Dataset arguments
data:
class_path: litgpt.data.Alpaca2k
init_args:
# Whether to include the prompt part in the optimization
mask_prompt: false
# The prompt style to use. See litgpt/prompts.py for possible choices.
prompt_style: alpaca
# The seed to use for creating the train/val splits and shuffling the data.
seed: 42
# The number of workers to use per GPU for dataloading
num_workers: 4
# Where to download the data
download_dir: data/alpaca2k
train:
# The length of the input sequences to train on, also known as "context size"
# This depends on how long the sequences in your finetuning dataset are, and
# whether you want to truncate to save memory
max_seq_length: 512
# After how many optimization steps to save a checkpoint
save_interval: 800
# After how many optimization steps to log metrics
log_interval: 1
# The batch size across all GPUs in a machine
global_batch_size: 8
# The batch size to use for gradient accumulation
# Maximize this value based on the available GPU VRAM
micro_batch_size: 8
# How many epochs to train for. Mutually exclusive with max_tokens and max_steps
epochs: 4
# How many tokens to train for (total across all GPUs). Mutually exclusive with epochs and max_steps
max_tokens: null
# How many optimization steps to train for. Mutually exclusive with epochs and max_tokens
max_steps: null
# For how many optimization steps to warm up the learning rate
lr_warmup_steps: 10
# The max learning rate after linear warmup
learning_rate: 0.0002
# The minimum learning rate after cosine decay
min_lr: 6.0e-05
# How much weight decay to use in AdamW
weight_decay: 0.0
# Beta parameters for AdamW
beta1: 0.9
beta2: 0.95
# Clip gradients to this norm
max_norm: null
# Whether to tie embeddings (depends on the model)
tie_embeddings: null
eval:
# After how many optimization steps to run validation
interval: 100
# How many tokens to generate during validation
max_new_tokens: 100
# How many batches to run during validation
max_iters: 100
# The random seed to initialize the weights of the model
seed: 1337
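The warmup and decay fields above describe the usual linear-warmup-plus-cosine-decay schedule; a sketch of that shape (assumed to match the intent, not copied from litgpt):

```python
import math

# Sketch of the schedule the lr_warmup_steps / learning_rate / min_lr
# fields describe: linear warmup to the max LR, then cosine decay to
# min_lr. Assumed shape for illustration, not the litgpt source.
def get_lr(step, warmup_steps=10, max_steps=1000, lr=2e-4, min_lr=6e-5):
    if step < warmup_steps:
        return lr * (step + 1) / warmup_steps        # linear warmup
    progress = (step - warmup_steps) / (max_steps - warmup_steps)
    return min_lr + 0.5 * (lr - min_lr) * (1 + math.cos(math.pi * progress))

print(get_lr(9))      # end of warmup -> max LR (2e-4)
print(get_lr(1000))   # fully decayed -> min LR (6e-5)
```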
Great. We need to add these to all the docstrings in the scripts so that they appear in the CLI
This is all addressed in #1092 correct?
|
gharchive/issue
| 2024-03-11T23:47:05 |
2025-04-01T04:55:17.866271
|
{
"authors": [
"awaelchli",
"carmocca",
"rasbt"
],
"repo": "Lightning-AI/litgpt",
"url": "https://github.com/Lightning-AI/litgpt/issues/1087",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
513330024
|
Help
Hello, i cannot get this to work with my home assistant setup
copied the three files to the custom_components/linkplay folder, but the configuration won't reboot??
I cannot help you until I have full information about the situation. To do this, I need the logs that your Home Assistant writes, and configs. Fields for filling all the necessary information were in the template for this issue. Why did you ignore it?
|
gharchive/issue
| 2019-10-28T14:12:27 |
2025-04-01T04:55:17.920847
|
{
"authors": [
"Limych",
"ukscarface"
],
"repo": "Limych/media_player.linkplay",
"url": "https://github.com/Limych/media_player.linkplay/issues/8",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1149951349
|
🛑 SearXNG is down
In 036643e, SearXNG (https://searxng.linerly.repl.co) was down:
HTTP code: 0
Response time: 0 ms
Resolved: SearXNG is back up in bfa400e.
|
gharchive/issue
| 2022-02-25T01:47:38 |
2025-04-01T04:55:17.942917
|
{
"authors": [
"Linerly"
],
"repo": "Linerly/status",
"url": "https://github.com/Linerly/status/issues/294",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1478547119
|
🛑 LinerlyBot (Discord) is down
In 914364c, LinerlyBot (Discord) (https://linerlybot-discord.linerly.tk) was down:
HTTP code: 0
Response time: 0 ms
Resolved: LinerlyBot (Discord) is back up in ca2a114.
|
gharchive/issue
| 2022-12-06T08:32:31 |
2025-04-01T04:55:17.945533
|
{
"authors": [
"Linerly"
],
"repo": "Linerly/status",
"url": "https://github.com/Linerly/status/issues/3146",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1524553744
|
🛑 MortyProxy is down
In c746fe6, MortyProxy (https://proxy.linerly.tk) was down:
HTTP code: 0
Response time: 0 ms
Resolved: MortyProxy is back up in 27972ee.
|
gharchive/issue
| 2023-01-08T15:22:24 |
2025-04-01T04:55:17.947745
|
{
"authors": [
"Linerly"
],
"repo": "Linerly/status",
"url": "https://github.com/Linerly/status/issues/3459",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1555511173
|
🛑 Immich is down
In 8bfe08b, Immich (https://immich.linerly.tk) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Immich is back up in 610e846.
|
gharchive/issue
| 2023-01-24T18:59:08 |
2025-04-01T04:55:17.949973
|
{
"authors": [
"Linerly"
],
"repo": "Linerly/status",
"url": "https://github.com/Linerly/status/issues/3897",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1735613192
|
🛑 MortyProxy is down
In bb0c645, MortyProxy (https://morty.linerly.repl.co) was down:
HTTP code: 0
Response time: 0 ms
Resolved: MortyProxy is back up in 2d0bb30.
|
gharchive/issue
| 2023-06-01T06:56:30 |
2025-04-01T04:55:17.952224
|
{
"authors": [
"Linerly"
],
"repo": "Linerly/status",
"url": "https://github.com/Linerly/status/issues/4375",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2724883429
|
Feature Request: Bring back Technical Details
On the UE Marketplace, the "Technical Details" section was part of the product page, while with Fab, it is two extra clicks away.
An option for the extension to read the content of the "Technical Details" tab in the "See Details" popup and inject that text below the Description (above "Included formats") would be a nice addition.
With a little digging, I think it's possible
Hello,
I've added your feature, but we'll have to wait for the Chrome Web Store to validate it.
|
gharchive/issue
| 2024-12-08T01:06:59 |
2025-04-01T04:55:17.953976
|
{
"authors": [
"HoaxXx",
"Linj1k"
],
"repo": "Linj1k/fab_extended",
"url": "https://github.com/Linj1k/fab_extended/issues/3",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
892512961
|
Fixed removeStrings() divider
This doesn't seem right at all, because 168 / (168 / 8) is 8 while 168 / 8 is the correct amount of 21. Does hard coding this value work?
Hang on...
It alternates with different values. Removes 8-Core, but not Processor and vice versa.
So the implementation is flawed. Time to look into it.
Using ffStrbufRemoveStrings(&namePretty, 2, " Processor", " 8-Core"); fails - '8-core' is removed.
Using ffStrbufRemoveStrings(&namePretty, 1, " Processor"); works.
All these functions end up in strbufRemoveTest(). Is there a condition missing or something in there?
Manipulating strbuf->length -= k; with for example strbuf->length -= k + 1; on line 316, will change the output of the "Processor" string, so it's definitely passed both checks in the for loop.
Does the order in which the string elements are encountered matter?
Rearranging the array to this:
const char* removeStrings[] = {
"(R)", "(r)", "(TM)", "(tm)",
" Dual-Core", " Quad-Core", " Six-Core", " Eight-Core", " Ten-Core",
" 2-Core", " 4-Core", " 6-Core", " 8-Core", " 10-Core", " 12-Core", " 14-Core", " 16-Core",
" CPU", " FPU", " APU", " Processor"
};
fixes it completely for me. This array order mimics the order in the original string - AMD Ryzen 7 3800X 8-Core Processor.
Both 8-Core and Processor are removed. Not a proper fix, but it should help track it down.
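The intended behavior of ffStrbufRemoveStrings can be sketched in Python (the actual bug was in the C buffer-length bookkeeping, which this sketch sidesteps; it only shows the sequential-removal semantics the array order feeds into):

```python
# Sketch of what ffStrbufRemoveStrings is meant to do: delete each listed
# substring from the name, one after another. The C bug lived in the
# strbuf length bookkeeping, not in this high-level logic.
def remove_strings(name: str, substrings: list) -> str:
    for sub in substrings:
        name = name.replace(sub, "")
    return name

raw = "AMD Ryzen 7 3800X 8-Core Processor"
# With the rearranged array order that mimics the original string:
print(remove_strings(raw, [" 8-Core", " Processor"]))  # AMD Ryzen 7 3800X
```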
I opened an issue for this #47.
|
gharchive/pull-request
| 2021-05-15T18:44:27 |
2025-04-01T04:55:17.959534
|
{
"authors": [
"DarNCelsius",
"LinusDierheimer"
],
"repo": "LinusDierheimer/fastfetch",
"url": "https://github.com/LinusDierheimer/fastfetch/pull/46",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
850885201
|
Speed up fetchHomeConfig git process
close #52
/approve
|
gharchive/pull-request
| 2021-04-06T01:58:30 |
2025-04-01T04:55:17.980604
|
{
"authors": [
"JohnNiang",
"LinuxSuRen"
],
"repo": "LinuxSuRen/http-downloader",
"url": "https://github.com/LinuxSuRen/http-downloader/pull/53",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1190605998
|
🛑 CARI is down
In 0fb8eb0, CARI (https://cari.flitswallet.app/api) was down:
HTTP code: 0
Response time: 0 ms
Resolved: CARI is back up in 5781f71.
|
gharchive/issue
| 2022-04-02T11:34:21 |
2025-04-01T04:55:18.020154
|
{
"authors": [
"Liquid369"
],
"repo": "Liquid369/FlitsUptime",
"url": "https://github.com/Liquid369/FlitsUptime/issues/1213",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1190608585
|
🛑 SMNC is down
In da5a89a, SMNC (https://smnc.flitswallet.app/api) was down:
HTTP code: 0
Response time: 0 ms
Resolved: SMNC is back up in a8ebd76.
|
gharchive/issue
| 2022-04-02T11:45:43 |
2025-04-01T04:55:18.022481
|
{
"authors": [
"Liquid369"
],
"repo": "Liquid369/FlitsUptime",
"url": "https://github.com/Liquid369/FlitsUptime/issues/1234",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1309726278
|
🛑 FLS is down
In 0f6b975, FLS (https://fls.flitswallet.app/api) was down:
HTTP code: 0
Response time: 0 ms
Resolved: FLS is back up in df2de31.
|
gharchive/issue
| 2022-07-19T15:52:32 |
2025-04-01T04:55:18.025117
|
{
"authors": [
"Liquid369"
],
"repo": "Liquid369/FlitsUptime",
"url": "https://github.com/Liquid369/FlitsUptime/issues/1726",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1092003561
|
🛑 APR is down
In f401459, APR (https://apr.flitswallet.app/api) was down:
HTTP code: 0
Response time: 0 ms
Resolved: APR is back up in a7129ab.
|
gharchive/issue
| 2022-01-02T11:46:47 |
2025-04-01T04:55:18.027639
|
{
"authors": [
"Liquid369"
],
"repo": "Liquid369/FlitsUptime",
"url": "https://github.com/Liquid369/FlitsUptime/issues/370",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1070213395
|
🛑 MARC is down
In 31b2ab4, MARC (https://marc.flitswallet.app/api) was down:
HTTP code: 502
Response time: 930 ms
Resolved: MARC is back up in 1d6bab0.
|
gharchive/issue
| 2021-12-03T04:42:23 |
2025-04-01T04:55:18.029934
|
{
"authors": [
"Liquid369"
],
"repo": "Liquid369/FlitsUptime",
"url": "https://github.com/Liquid369/FlitsUptime/issues/81",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
217819119
|
Class not found exception when using LiquidCoreAndroid with proguard
Hi there,
Using the LiquidCore lib while ProGuard is activated will fail.
To solve it, one just needs to add a simple ProGuard keep rule:
-keep class org.liquidplayer.javascript.** { *; }
Maybe this could be part of the library itself.
Sample stacktrace for anyone running into the same issue:
03-29 11:13:15.861 6806-6828/packagename.of.the.app A/art: art/runtime/runtime.cc:427] JNI DETECTED ERROR IN APPLICATION: JNI GetMethodID called with pending exception java.lang.ClassNotFoundException: Didn't find class "org.liquidplayer.javascript.JSValue$JNIReturnObject" on path: DexPathList[[zip file "/data/app/packagename.of.the.app-2/base.apk"],nativeLibraryDirectories=[/data/app/packagename.of.the.app-2/lib/x86, /data/app/packagename.of.the.app-2/base.apk!/lib/x86, /system/lib, /vendor/lib]] 03-29 11:13:15.861 6806-6828/packagename.of.the.app A/art: art/runtime/runtime.cc:427] at java.lang.Class dalvik.system.BaseDexClassLoader.findClass(java.lang.String) (BaseDexClassLoader.java:56) 03-29 11:13:15.861 6806-6828/packagename.of.the.app A/art: art/runtime/runtime.cc:427] at java.lang.Class java.lang.ClassLoader.loadClass(java.lang.String, boolean) (ClassLoader.java:380) 03-29 11:13:15.861 6806-6828/packagename.of.the.app A/art: art/runtime/runtime.cc:427] at java.lang.Class java.lang.ClassLoader.loadClass(java.lang.String) (ClassLoader.java:312) 03-29 11:13:15.861 6806-6828/packagename.of.the.app A/art: art/runtime/runtime.cc:427] at org.liquidplayer.javascript.JSValue$b org.liquidplayer.javascript.JSObject.setProperty(long, long, java.lang.String, long, int) (JSObject.java:-2) 03-29 11:13:15.861 6806-6828/packagename.of.the.app A/art: art/runtime/runtime.cc:427] at void org.liquidplayer.javascript.JSObject$5.run() (JSObject.java:337) 03-29 11:13:15.861 6806-6828/packagename.of.the.app A/art: art/runtime/runtime.cc:427] at void org.liquidplayer.javascript.JSContext.a(java.lang.Runnable) (JSContext.java:66) 03-29 11:13:15.861 6806-6828/packagename.of.the.app A/art: art/runtime/runtime.cc:427] at void org.liquidplayer.javascript.JSObject.a(java.lang.String, java.lang.Object, int) (JSObject.java:345) 03-29 11:13:15.861 6806-6828/packagename.of.the.app A/art: art/runtime/runtime.cc:427] at void org.liquidplayer.javascript.JSObject.a(java.lang.String, 
java.lang.Object) (JSObject.java:360) 03-29 11:13:15.861 6806-6828/packagename.of.the.app A/art: art/runtime/runtime.cc:427] at void packagename.of.the.app.wrapper.a.c.<init>(android.content.Context) (RiveScriptWrapper.java:51) 03-29 11:13:15.861 6806-6828/packagename.of.the.app A/art: art/runtime/runtime.cc:427] at void packagename.of.the.app.wrapper.d.<init>(android.content.Context) (Wrapper.java:20) 03-29 11:13:15.861 6806-6828/packagename.of.the.app A/art: art/runtime/runtime.cc:427] at java.lang.Integer packagename.of.the.app.wrapper.d.b.d$a.a(java.net.URL[]) (SampleClass.java:384) 03-29 11:13:15.861 6806-6828/packagename.of.the.app A/art: art/runtime/runtime.cc:427] at java.lang.Object packagename.of.the.app..b.d$a.doInBackground(java.lang.Object[]) (AFragment.java:375) 03-29 11:13:15.861 6806-6828/packagename.of.the.app A/art: art/runtime/runtime.cc:427] at java.lang.Object android.os.AsyncTask$2.call() (AsyncTask.java:305) 03-29 11:13:15.861 6806-6828/packagename.of.the.app A/art: art/runtime/runtime.cc:427] at void java.util.concurrent.FutureTask.run() (FutureTask.java:237) 03-29 11:13:15.861 6806-6828/packagename.of.the.app A/art: art/runtime/runtime.cc:427] at void android.os.AsyncTask$SerialExecutor$1.run() (AsyncTask.java:243) 03-29 11:13:15.861 6806-6828/packagename.of.the.app A/art: art/runtime/runtime.cc:427] at void java.util.concurrent.ThreadPoolExecutor.runWorker(java.util.concurrent.ThreadPoolExecutor$Worker) (ThreadPoolExecutor.java:1133) 03-29 11:13:15.861 6806-6828/packagename.of.the.app A/art: art/runtime/runtime.cc:427] at void java.util.concurrent.ThreadPoolExecutor$Worker.run() (ThreadPoolExecutor.java:607) 03-29 11:13:15.861 6806-6828/packagename.of.the.app A/art: art/runtime/runtime.cc:427] at void java.lang.Thread.run() (Thread.java:761) 03-29 11:13:15.861 6806-6828/packagename.of.the.app A/art: art/runtime/runtime.cc:427]
Thanks for reporting. I will add a proguard rules file to the library.
Should be fixed in 70e9a7f0556886463ffe2d35ec4d6afb60149c02
This is fixed in Release 0.2.2
|
gharchive/issue
| 2017-03-29T09:38:34 |
2025-04-01T04:55:18.033683
|
{
"authors": [
"ericwlange",
"martinspaeth"
],
"repo": "LiquidPlayer/LiquidCore",
"url": "https://github.com/LiquidPlayer/LiquidCore/issues/20",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
596624380
|
Fix spacing between headings and paragraphs
Describe the bug
The spacing between paragraphs and headings is sometimes pretty big.
Probably a CSS-related issue; check whether the CSS can be improved.
Fixed in the new UI
|
gharchive/issue
| 2020-04-08T14:30:08 |
2025-04-01T04:55:18.042669
|
{
"authors": [
"Tschakki"
],
"repo": "LiskHQ/lisk-docs",
"url": "https://github.com/LiskHQ/lisk-docs/issues/524",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
975676185
|
[QUESTION] Context for French translation
Hello,
In order to update the fr JSON file,
can you tell me where (screen?) and when (context?) the following messages are displayed?
Thank you so much
"proceed-to-dashboard": "Proceed to Dashboard",
"proceed-guest-button": "Proceed as Guest"
Ah yes, good question - this is for guest mode.
When guest mode is enabled, and the user navigates to the login page, there is a button that lets them skip login and have read-only guest access to the dashboard, like this:
The second bit of text probably isn't needed; it's for when the user is already logged in and they manually navigate to the login page, which looks like this:
Thx for your reply :)
#170
|
gharchive/issue
| 2021-08-20T14:44:50 |
2025-04-01T04:55:18.050511
|
{
"authors": [
"EVOTk",
"Lissy93"
],
"repo": "Lissy93/dashy",
"url": "https://github.com/Lissy93/dashy/issues/169",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1076948017
|
[FEEDBACK] Remove Preview and Production Environments
Hey,
Maybe consider removing the Preview and Production Environments. They both seem to be inactive and outdated, and it will clean up this repository a bit.
Yes I have issues with making repositories as clean as possible. I contribute to every repo I see that can be "cleaned". 😅
Agreed, it's annoying. But not sure if GH allows you to remove environment deploys. If it does, I've not seen how.
Maybe it's possible to do via the GH API, so I will look into that...
Figured this out using the following script, based on this answer on StackOverflow
env=Preview
token=xxxx
repo=dashy
user=lissy93
for id in $(curl -u "$user:$token" "https://api.github.com/repos/$user/$repo/deployments?environment=$env" | jq ".[].id"); do
  curl -X POST -u "$user:$token" -d '{"state":"inactive"}' -H 'accept: application/vnd.github.ant-man-preview+json' "https://api.github.com/repos/$user/$repo/deployments/$id/statuses"
  curl -X DELETE -u "$user:$token" "https://api.github.com/repos/$user/$repo/deployments/$id"
done
I'm leaving the GitHub pages env there, since I'm using that for deploying the dashy.to docs website. But the other stale environment has now been removed.
|
gharchive/issue
| 2021-12-10T15:12:09 |
2025-04-01T04:55:18.053876
|
{
"authors": [
"Lissy93",
"walkxcode"
],
"repo": "Lissy93/dashy",
"url": "https://github.com/Lissy93/dashy/issues/365",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
945256186
|
🛑 Sentiment Sweep is down
In 88b629e, Sentiment Sweep (https://sentiment-sweep.com) was down:
HTTP code: 403
Response time: 124 ms
Resolved: Sentiment Sweep is back up in a50d934.
|
gharchive/issue
| 2021-07-15T10:54:53 |
2025-04-01T04:55:18.056335
|
{
"authors": [
"Lissy93"
],
"repo": "Lissy93/uptime",
"url": "https://github.com/Lissy93/uptime/issues/4",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
160868015
|
Hello
Thank you very much for the port.
I found roughly two bugs in this template:
1. Comments need to be submitted twice before they show up.
2. Around line 380 of the style.css file, a comma is missing between the code tag and the pre tag.
Uh... one more thing: it seems you can't reply to someone else's reply?
Hope this gets polished; it's a beautiful theme. Thanks for the port.
@daimarushi Yes, that spot is indeed missing one, but that was an oversight by the original author; I'm just a "porter". I've already added the comma on my side.
As for comments needing to be submitted twice before they show up, I don't quite understand: do you mean the first submission isn't written to the database, or that the comment only shows after a refresh? Why do comments show up immediately when I test here? Could you describe it in more detail, thanks?
As for the last issue, this theme doesn't support nested comments. You can check the original author's blog; I'm not the author, so there's nothing I can do, sorry.
|
gharchive/issue
| 2016-06-17T11:28:49 |
2025-04-01T04:55:18.129250
|
{
"authors": [
"LjxPrime",
"daimarushi"
],
"repo": "LjxPrime/Bedford-Typecho",
"url": "https://github.com/LjxPrime/Bedford-Typecho/issues/1",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
641069736
|
YAML Incompatible with Kubernetes 1.16 and above
This is currently using the removed extensions/v1beta1 Deployments API. It would be good if the YAML used apps/v1.
https://github.com/LogMeIn/k8s-aws-operator/blob/14e856c6a438e57763632112ca484bc03ac40b1f/deploy/deployment.yaml#L1
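For reference, a minimal sketch of the migrated manifest (names and image are illustrative, not the repo's actual values); note that apps/v1 additionally requires an explicit selector:

```yaml
apiVersion: apps/v1        # was: extensions/v1beta1
kind: Deployment
metadata:
  name: k8s-aws-operator   # illustrative name
spec:
  replicas: 1
  selector:                # required in apps/v1
    matchLabels:
      app: k8s-aws-operator
  template:
    metadata:
      labels:
        app: k8s-aws-operator
    spec:
      containers:
        - name: operator
          image: example/k8s-aws-operator:latest  # illustrative image
```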
Thanks!
fixed in v0.0.3
|
gharchive/issue
| 2020-06-18T09:50:23 |
2025-04-01T04:55:18.150286
|
{
"authors": [
"cablespaghetti",
"seb-daehne"
],
"repo": "LogMeIn/k8s-aws-operator",
"url": "https://github.com/LogMeIn/k8s-aws-operator/issues/3",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
137693409
|
Added nanomsg classes - DO NOT MERGE
new PR to just add a new queuenado pattern using nanomsg.
please review for design to ensure there is nothing out of the ordinary. you will not be able to build this unless you have nanomsg installed.
please check for typos/code style errors, but you only need to point them out once per file, as I have an editor that I can use to set the style for the entire file.
TODO:
integrate with research where we want to test performance
In the packaging spec file you need to add a dependency to the nanomsg rpm.
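A hedged sketch of what that could look like in the RPM .spec file (the exact package names are assumptions and may differ on your distro):

```
# in the package .spec file
Requires: nanomsg
BuildRequires: nanomsg-devel
```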
@john-gress et al., it's annoying, but since Yannick's user is gone and his account is closed, this is an orphaned pull request which cannot be pulled down by another user. It is very closely related to this old GitHub bug which is still unresolved: https://github.com/isaacs/github/issues/168
The solution I'm going for here is that I have created a copy of our master, master-nanomsg, and I have edited this pull request to be directed to that branch instead. I will merge it there, and it is now available for us to pull down and work with.
(another approach could be to manually copy all the changes in this PR and do a new PR)
|
gharchive/pull-request
| 2016-03-01T21:34:29 |
2025-04-01T04:55:18.153088
|
{
"authors": [
"KjellKod",
"ghost",
"john-gress"
],
"repo": "LogRhythm/QueueNado",
"url": "https://github.com/LogRhythm/QueueNado/pull/40",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
2100307495
|
🛑 LogSentinel SIEM API is down
In 4e76d54, LogSentinel SIEM API (https://api.logsentinel.com) was down:
HTTP code: 503
Response time: 643 ms
Resolved: LogSentinel SIEM API is back up in 24a2c73 after 15 days, 4 hours, 24 minutes.
|
gharchive/issue
| 2024-01-25T12:37:29 |
2025-04-01T04:55:18.155707
|
{
"authors": [
"Glamdring"
],
"repo": "LogSentinel/status",
"url": "https://github.com/LogSentinel/status/issues/1228",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1290995603
|
🛑 LogSentinel SIEM API is down
In c58e8bd, LogSentinel SIEM API (https://api.logsentinel.com) was down:
HTTP code: 502
Response time: 496 ms
Resolved: LogSentinel SIEM API is back up in 61651e7.
|
gharchive/issue
| 2022-07-01T06:59:22 |
2025-04-01T04:55:18.158032
|
{
"authors": [
"Glamdring"
],
"repo": "LogSentinel/status",
"url": "https://github.com/LogSentinel/status/issues/171",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1321428759
|
🛑 LogSentinel SIEM dashboard is down
In b7c6472, LogSentinel SIEM dashboard (https://siem.logsentinel.com) was down:
HTTP code: 502
Response time: 446 ms
Resolved: LogSentinel SIEM dashboard is back up in c78f3ed.
|
gharchive/issue
| 2022-07-28T20:17:47 |
2025-04-01T04:55:18.160554
|
{
"authors": [
"Glamdring"
],
"repo": "LogSentinel/status",
"url": "https://github.com/LogSentinel/status/issues/188",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1323184293
|
🛑 LogSentinel SIEM dashboard is down
In 7d17599, LogSentinel SIEM dashboard (https://siem.logsentinel.com) was down:
HTTP code: 502
Response time: 457 ms
Resolved: LogSentinel SIEM dashboard is back up in f55f5cb.
|
gharchive/issue
| 2022-07-30T15:45:48 |
2025-04-01T04:55:18.162868
|
{
"authors": [
"Glamdring"
],
"repo": "LogSentinel/status",
"url": "https://github.com/LogSentinel/status/issues/293",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
410141379
|
Does it support email subjects?
Hi!
I'm wondering if this supports email subjects. I need to output a different vowel in a word depending on the sex of the contact. In the body it was somewhat easy to solve with the dynamic content feature, but it's not supported for subjects in Mautic. Can I include TWIG conditionals in the email subject if I use your plugin?
Thanks!
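For reference, the kind of conditional in question, as the plugin evaluates it in email bodies (a sketch; the field alias 'gender' and its values are assumptions about your Mautic setup):

```twig
{% if lead.getFieldValue('gender') == 'female' %}Chère{% else %}Cher{% endif %} {{ lead.getFieldValue('firstname') }}
```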
Not at the moment unfortunately.
On Thu, Feb 14, 2019, 7:59 AM Marcelo Serpa <notifications@github.com wrote:
Hi!
I'm wondering if this supports email subjects. I need to output a
different vowel in a word depending on the sex of the contact. In the body
it was somewhat easy to solve with the dynamic content feature, but it's
not supported for subjects in Mautic. Can I include TWIG conditionals in
the email subject if I use your plugin?
Thanks!
—
You are receiving this because you are subscribed to this thread.
Reply to this email directly, view it on GitHub
https://github.com/Logicify/mautic-advanced-templates-bundle/issues/2,
or mute the thread
https://github.com/notifications/unsubscribe-auth/AFVaD5LraerX103Q8bSfm-8ztpxD25Uyks5vNPs-gaJpZM4a646W
.
Thanks for the reply! Do you know of any workarounds for this scenario?
Nevermind, I managed to extend your plugin to process subjects. I don't have time to push a PR now though but will do so over the next few days. Would you be willing to review/accept it?
Sure, it will be great.
On Fri, Feb 15, 2019, 1:12 AM Marcelo Serpa <notifications@github.com wrote:
Nevermind, I managed to extend your plugin to process subjects. I don't
have time to push a PR now though but will do so over the next few days.
Would you be willing to review/accept it?
—
You are receiving this because you commented.
Reply to this email directly, view it on GitHub
https://github.com/Logicify/mautic-advanced-templates-bundle/issues/2#issuecomment-463840583,
or mute the thread
https://github.com/notifications/unsubscribe-auth/AFVaD-qOBQfVSiuM9Tv0PSMsJGy4thOsks5vNe1AgaJpZM4a646W
.
@fullofcaffeine any progress on the PR?
This PR could fixed it
https://github.com/Logicify/mautic-advanced-templates-bundle/pull/10
Sorry, I totally forgot about this issue. Are people still interested in this changeset?
Sorry, I totally forgot about this issue. Are people still interested in this changeset?
Yes we do :)
I would really like to test it.
PR which adds support for subjects has been accepted, please use release 1.1. Kudos to @kuzmany !
Could anyone confirm it works now?
Seems Like it works to me.
|
gharchive/issue
| 2019-02-14T05:59:25 |
2025-04-01T04:55:18.175329
|
{
"authors": [
"PietkaSmig",
"bobsburgers",
"corvis",
"fullofcaffeine",
"kfrankie",
"kuzmany",
"virgilwashere"
],
"repo": "Logicify/mautic-advanced-templates-bundle",
"url": "https://github.com/Logicify/mautic-advanced-templates-bundle/issues/2",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1120969215
|
Add Support for Custom Domains
Checklist
[X] You created a branch from develop branch.
[X] Your PR is raised on the develop branch and not on main.
[X] You have read the Contribution Guidelines before creating this PR.
Description of the Change
Add support in the library for the optional "customDomain" value to be passed in to allow users to specify that the library should use a customer vanity domain for IDX logins rather than the .hub.loginradius.com variant.
This is required for third party cookies to work with customer domains.
Alternate Designs
N/A
Risk Impact
Risk should be very low as the new parameter is optional so all existing use cases can ignore it.
Verification Process
We built a version of the library with these changes included. First verifying that omission of the new parameter results in identical behavior as before (sending users to .hub.loginradius.com).
We also verified that including the new customDomain parameter resulted in users being sent to our custom IDX domain for sign in.
Release Notes
Add support for customDomain to be specified in library
Hi @mohammed786,
Sorry to bother but I wanted to follow up on this as we are continuing to have to deploy a customized version of this package to our sites.
Is there anything I've missed that is needed to start the review process?
Thanks!
|
gharchive/pull-request
| 2022-02-01T17:04:03 |
2025-04-01T04:55:18.179930
|
{
"authors": [
"drewf7"
],
"repo": "LoginRadius/loginradius-react",
"url": "https://github.com/LoginRadius/loginradius-react/pull/15",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1912562203
|
Start markdown
I have created H1 header unable to edit #'s
$ git init
Initialized empty Git repository in /Users/skills/Projects/recipe-repository/.git/
[ ] Turn on GitHub Pages
[ ] Outline my portfolio
[ ] Introduce myself to the world
|
gharchive/pull-request
| 2023-09-26T02:21:55 |
2025-04-01T04:55:18.185096
|
{
"authors": [
"LokeshAyiti"
],
"repo": "LokeshAyiti/skills-communicate-using-markdown",
"url": "https://github.com/LokeshAyiti/skills-communicate-using-markdown/pull/2",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2493772243
|
OSOE-861: Add Git tags for Azure deployments and swaps
OSOE-861
This doesn't need to be merged.
|
gharchive/pull-request
| 2024-08-29T08:22:12 |
2025-04-01T04:55:18.189271
|
{
"authors": [
"AydinE",
"Piedone"
],
"repo": "Lombiq/PowerShell-Analyzers",
"url": "https://github.com/Lombiq/PowerShell-Analyzers/pull/46",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
2525898215
|
OFFI-101: Frontend server support and interactive mode improvements
OFFI-101
Instead, I think a sample should be created in a separate OSOCE issue.
|
gharchive/pull-request
| 2024-09-14T00:41:09 |
2025-04-01T04:55:18.190656
|
{
"authors": [
"sarahelsaig"
],
"repo": "Lombiq/UI-Testing-Toolbox",
"url": "https://github.com/Lombiq/UI-Testing-Toolbox/pull/407",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
1474259851
|
🛑 jpvir is down
In 7087def, jpvir ($JPVIR) was down:
HTTP code: 0
Response time: 0 ms
Resolved: jpvir is back up in 2d78591.
|
gharchive/issue
| 2022-12-03T22:33:14 |
2025-04-01T04:55:18.233599
|
{
"authors": [
"LonelyJupiter"
],
"repo": "LonelyJupiter/UPPTIME",
"url": "https://github.com/LonelyJupiter/UPPTIME/issues/2045",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1478819316
|
🛑 jpvir is down
In d5c74cb, jpvir ($JPVIR) was down:
HTTP code: 0
Response time: 0 ms
Resolved: jpvir is back up in 1a7f56f.
|
gharchive/issue
| 2022-12-06T10:59:13 |
2025-04-01T04:55:18.235807
|
{
"authors": [
"LonelyJupiter"
],
"repo": "LonelyJupiter/UPPTIME",
"url": "https://github.com/LonelyJupiter/UPPTIME/issues/2123",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1507998285
|
🛑 jpvir is down
In 069a3c1, jpvir ($JPVIR) was down:
HTTP code: 0
Response time: 0 ms
Resolved: jpvir is back up in 3147688.
|
gharchive/issue
| 2022-12-22T14:34:40 |
2025-04-01T04:55:18.237802
|
{
"authors": [
"LonelyJupiter"
],
"repo": "LonelyJupiter/UPPTIME",
"url": "https://github.com/LonelyJupiter/UPPTIME/issues/2602",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1800116736
|
🛑 Link Shortener is down
In 11e98c0, Link Shortener (https://lr-link.vercel.app) was down:
HTTP code: 504
Response time: 10734 ms
Resolved: Link Shortener is back up in 5a5679b.
|
gharchive/issue
| 2023-07-12T03:53:12 |
2025-04-01T04:55:18.266416
|
{
"authors": [
"LordRonz"
],
"repo": "LordRonz/status",
"url": "https://github.com/LordRonz/status/issues/3502",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1350006962
|
Fix music#library.getArtists() and add MusicShelf.bottom_button
Kind of fixes #143 by returning items with item_type of 'library_artist' instead of 'artist'. Items of this type have an endpoint that leads to a page listing of the artist's songs (equivalent to clicking an artist in Library -> Artists). I think, at this stage, it would have to be up to the user to call the endpoint to obtain further content, including the artist's channel_id. I am however open to ideas on how this part can be better presented to the user.
PR also adds bottom_button to MusicShelf. If you follow the endpoint of a library artist, you will see one such button with the 'See All By Artist' text.
I think, at this stage, it would have to be up to the user to call the endpoint to obtain further content, including the artist's channel_id.
That's right, we shouldn't overcomplicate things so let's just leave that to the user.
PR also adds bottom_button to MusicShelf. If you follow the endpoint of a library artist, you will see one such button with the 'See All By Artist' text.
Great! Looks good but there seems to be a small conflict in that particular file.
Should be resolved now. Thanks.
|
gharchive/pull-request
| 2022-08-24T20:50:48 |
2025-04-01T04:55:18.345667
|
{
"authors": [
"LuanRT",
"patrickkfkan"
],
"repo": "LuanRT/YouTube.js",
"url": "https://github.com/LuanRT/YouTube.js/pull/152",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2703570570
|
Error occurred while enabling LuckPerms v5.4.146 (Is it up to date?)
Description
docker image startup failing when adding in LuckPerms (I am new to LuckPerms)
java.lang.NoClassDefFoundError: me/lucko/luckperms/lib/bytebuddy/dynamic/scaffold/subclass/ConstructorStrategy
Paper 1.21.3
Reproduction Steps
When removing LuckPerms from docker-compose--everything starts up without errors. If adding it back in --the failure occurs
Expected Behaviour
Startup with LuckPerms package
Server Details
Paper
LuckPerms Version
v5.4.146
Logs and Configs
https://gist.github.com/pmaroun/58fb1cbe25f3b5ee936269846c534888.js
Extra Details
No response
Can you try deleting your plugins/LuckPerms/libs folder? See the wiki here for more information. Your logs also seem to be wrong.
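A minimal sketch of the suggested fix (these are the standard LuckPerms paths; stop the server before running this):

```shell
# Simulate the plugin folder layout, then delete the cached libraries;
# LuckPerms re-downloads them on the next server start.
mkdir -p plugins/LuckPerms/libs
rm -rf plugins/LuckPerms/libs
test ! -d plugins/LuckPerms/libs && echo "libs cleared"
```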
|
gharchive/issue
| 2024-11-29T01:00:44 |
2025-04-01T04:55:18.355643
|
{
"authors": [
"pmaroun",
"powercasgamer"
],
"repo": "LuckPerms/LuckPerms",
"url": "https://github.com/LuckPerms/LuckPerms/issues/4003",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
993960110
|
🛑 cleanData is down
In 97816fe, cleanData ($CLEAN_DATA_URL) was down:
HTTP code: 0
Response time: 0 ms
Resolved: cleanData is back up in 228e3a1.
|
gharchive/issue
| 2021-09-11T22:09:52 |
2025-04-01T04:55:18.362550
|
{
"authors": [
"LuckyHookin"
],
"repo": "LuckyHookin/hookin_fun_upptime",
"url": "https://github.com/LuckyHookin/hookin_fun_upptime/issues/310",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
520037013
|
openapi_3 to 2: Set required for formData params
When converting schema properties to parameters in formData, if the property is required, make the parameter required, since these have the same meaning.
Thanks for considering,
Kevin
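As an illustrative sketch of the mapping described above (field names are assumptions): a required property in an OpenAPI 3 form-encoded request body becomes a required formData parameter in Swagger 2:

```yaml
# OpenAPI 3 input (sketch)
requestBody:
  content:
    application/x-www-form-urlencoded:
      schema:
        type: object
        required: [name]
        properties:
          name:
            type: string

# Equivalent Swagger 2 output (sketch)
parameters:
  - name: name
    in: formData
    type: string
    required: true   # carried over from the schema's required list
```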
Thanks again @rbren!
|
gharchive/pull-request
| 2019-11-08T14:14:05 |
2025-04-01T04:55:18.363661
|
{
"authors": [
"kevinoid"
],
"repo": "LucyBot-Inc/api-spec-converter",
"url": "https://github.com/LucyBot-Inc/api-spec-converter/pull/237",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
787484882
|
killquest plugin
i had a friend on my minecraft server complain about creeper griefing
I'm the type of player who has really sweaty palms while playing . i know how to play block game and i try my best.
the friend is more on the casual side. likes building a little bit more than anything else.
sooo.... after a little bit of thought i came up with a concept.
some areas (something like 1000 blocks x 1000 blocks each) are able to track mob deaths. if enough monsters are killed, then the area cancels any hostile mob event.
so my line of design goes like this: you want creepers to not grief your shit, you have to grind for it. in this case the grind is 64 zombie kills, 64 skeleton kills, 64 creeper kills, 64 spider kills. a stack of each. the reward would be: an irreversible change in hostile spawning and one use of /lore or some shit.
/lore would be a command that allows adding a little bit of subtext to an item held in the main hand.
can't trigger the spawnraidevent. feels stupid using the api
spider kills don't register or don't show up on /killquest
i think it's done. clearing it also stops witches and endermen from spawning, so the only mob to farm from spawners remains the teeny tiny mineshaft spider.
could be considered a bug but idc atm
|
gharchive/issue
| 2021-01-16T15:06:26 |
2025-04-01T04:55:18.381417
|
{
"authors": [
"LukasOfLockless",
"likethedrama"
],
"repo": "LukasOfLockless/pidgeon-server-info",
"url": "https://github.com/LukasOfLockless/pidgeon-server-info/issues/6",
"license": "Unlicense",
"license_type": "permissive",
"license_source": "github-api"
}
|
1073710134
|
🛑 Greater Bank is down
In e72af00, Greater Bank (https://public.cdr-api.greater.com.au/cds-au/v1/banking/products) was down:
HTTP code: 403
Response time: 949 ms
Resolved: Greater Bank is back up in 768dad5.
|
gharchive/issue
| 2021-12-07T19:58:12 |
2025-04-01T04:55:18.390159
|
{
"authors": [
"LukePrior"
],
"repo": "LukePrior/OpenBankingUptime",
"url": "https://github.com/LukePrior/OpenBankingUptime/issues/1860",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1106292511
|
🛑 Greater Bank is down
In 34bed5f, Greater Bank (https://public.cdr-api.greater.com.au/cds-au/v1/banking/products) was down:
HTTP code: 403
Response time: 786 ms
Resolved: Greater Bank is back up in 4025bb2.
|
gharchive/issue
| 2022-01-17T22:02:58 |
2025-04-01T04:55:18.392575
|
{
"authors": [
"LukePrior"
],
"repo": "LukePrior/OpenBankingUptime",
"url": "https://github.com/LukePrior/OpenBankingUptime/issues/2999",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2664135508
|
🛑 Official(泠泫凝的异次元空间-主页) is down
In dfe83a9, Official(泠泫凝的异次元空间-主页) (https://lxnchan.cn) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Official(泠泫凝的异次元空间-主页) is back up in 659a2a1 after 8 minutes.
|
gharchive/issue
| 2024-11-16T10:33:53 |
2025-04-01T04:55:18.446369
|
{
"authors": [
"LxnChan"
],
"repo": "LxnChan/status",
"url": "https://github.com/LxnChan/status/issues/10945",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2756416673
|
🛑 EnderChest(末影箱-云盘) is down
In 0700225, EnderChest(末影箱-云盘) (https://enderchest.anavi.cn) was down:
HTTP code: 0
Response time: 0 ms
Resolved: EnderChest(末影箱-云盘) is back up in 4f5ca1b after 5 minutes.
|
gharchive/issue
| 2024-12-23T16:54:37 |
2025-04-01T04:55:18.448785
|
{
"authors": [
"LxnChan"
],
"repo": "LxnChan/status",
"url": "https://github.com/LxnChan/status/issues/12783",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2763062415
|
🛑 EnderChest(末影箱-云盘) is down
In 0f48296, EnderChest(末影箱-云盘) (https://enderchest.anavi.cn) was down:
HTTP code: 0
Response time: 0 ms
Resolved: EnderChest(末影箱-云盘) is back up in 1c0626b after 19 minutes.
|
gharchive/issue
| 2024-12-30T11:24:18 |
2025-04-01T04:55:18.451181
|
{
"authors": [
"LxnChan"
],
"repo": "LxnChan/status",
"url": "https://github.com/LxnChan/status/issues/13211",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1969524461
|
🛑 EnderChest(末影箱-云盘) is down
In 65f8f76, EnderChest(末影箱-云盘) (https://enderchest.anavi.cn) was down:
HTTP code: 0
Response time: 0 ms
Resolved: EnderChest(末影箱-云盘) is back up in e99f3e5 after 5 minutes.
|
gharchive/issue
| 2023-10-31T02:35:50 |
2025-04-01T04:55:18.453861
|
{
"authors": [
"LxnChan"
],
"repo": "LxnChan/status",
"url": "https://github.com/LxnChan/status/issues/2011",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2474772727
|
🛑 Official(泠泫凝的异次元空间-主页) is down
In fa9cc85, Official(泠泫凝的异次元空间-主页) (https://lxnchan.cn) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Official(泠泫凝的异次元空间-主页) is back up in b2ad142 after 45 minutes.
|
gharchive/issue
| 2024-08-20T05:50:36 |
2025-04-01T04:55:18.456223
|
{
"authors": [
"LxnChan"
],
"repo": "LxnChan/status",
"url": "https://github.com/LxnChan/status/issues/5962",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2482623612
|
🛑 Official(泠泫凝的异次元空间-主页) is down
In 52ac59c, Official(泠泫凝的异次元空间-主页) (https://lxnchan.cn) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Official(泠泫凝的异次元空间-主页) is back up in 9b4296b after 15 minutes.
|
gharchive/issue
| 2024-08-23T08:35:01 |
2025-04-01T04:55:18.458759
|
{
"authors": [
"LxnChan"
],
"repo": "LxnChan/status",
"url": "https://github.com/LxnChan/status/issues/6155",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2485023520
|
🛑 EnderChest(末影箱-云盘) is down
In c583a25, EnderChest(末影箱-云盘) (https://enderchest.anavi.cn) was down:
HTTP code: 0
Response time: 0 ms
Resolved: EnderChest(末影箱-云盘) is back up in 6f73957 after 1 hour, 1 minute.
|
gharchive/issue
| 2024-08-25T03:30:02 |
2025-04-01T04:55:18.461146
|
{
"authors": [
"LxnChan"
],
"repo": "LxnChan/status",
"url": "https://github.com/LxnChan/status/issues/6270",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2554160614
|
🛑 Official(泠泫凝的异次元空间-主页) is down
In e3ad3f2, Official(泠泫凝的异次元空间-主页) (https://lxnchan.cn) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Official(泠泫凝的异次元空间-主页) is back up in 5a19ce8 after 5 minutes.
|
gharchive/issue
| 2024-09-28T10:49:17 |
2025-04-01T04:55:18.463786
|
{
"authors": [
"LxnChan"
],
"repo": "LxnChan/status",
"url": "https://github.com/LxnChan/status/issues/8259",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2555179282
|
🛑 Official(泠泫凝的异次元空间-主页) is down
In 931a2bd, Official(泠泫凝的异次元空间-主页) (https://lxnchan.cn) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Official(泠泫凝的异次元空间-主页) is back up in db4bfa7 after 21 minutes.
|
gharchive/issue
| 2024-09-29T21:51:19 |
2025-04-01T04:55:18.466201
|
{
"authors": [
"LxnChan"
],
"repo": "LxnChan/status",
"url": "https://github.com/LxnChan/status/issues/8354",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1468756146
|
🛑 mongo-hetzner is down
In e08e91e, mongo-hetzner ($URI_HETZNER) was down:
HTTP code: 0
Response time: 0 ms
Resolved: mongo-hetzner is back up in 1d80d12.
|
gharchive/issue
| 2022-11-29T21:58:26 |
2025-04-01T04:55:18.474165
|
{
"authors": [
"LyoSU"
],
"repo": "LyoSU/upptimly",
"url": "https://github.com/LyoSU/upptimly/issues/125",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1079394387
|
🛑 Edge Compute Cluster - US West is down
In b3880b4, Edge Compute Cluster - US West (https://uswest1-vega.lyrid.io/version) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Edge Compute Cluster - US West is back up in 3143b32.
|
gharchive/issue
| 2021-12-14T07:14:18 |
2025-04-01T04:55:18.476589
|
{
"authors": [
"soemarko"
],
"repo": "LyridInc/statuspage",
"url": "https://github.com/LyridInc/statuspage/issues/295",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2267107162
|
🛑 sspanel is down
In c702feb, sspanel (https://ss.168167.xyz) was down:
HTTP code: 500
Response time: 999 ms
Resolved: sspanel is back up in 9049682 after 1 hour, 56 minutes.
|
gharchive/issue
| 2024-04-27T17:21:16 |
2025-04-01T04:55:18.503411
|
{
"authors": [
"M1saka10010"
],
"repo": "M1saka10010/uptime",
"url": "https://github.com/M1saka10010/uptime/issues/138",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2329542885
|
Unresponsive newsletter banner
Describe the bug
The newsletter banner on the home screen is not responsive on smaller devices
To Reproduce
Go to home page
Screenshots
Hey @MAVRICK-1 I would like to work on this issue under SSOC
@iSubhamMani duplicate issue #11
|
gharchive/issue
| 2024-06-02T08:50:51 |
2025-04-01T04:55:18.517785
|
{
"authors": [
"MAVRICK-1",
"iSubhamMani"
],
"repo": "MAVRICK-1/Nest-Ondc",
"url": "https://github.com/MAVRICK-1/Nest-Ondc/issues/21",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
624532450
|
How do other networks use SCConv convolution block?
If I want to use it in other networks, such as VGG, AlexNet, and so on, do I just need to replace the standard convolution blocks with this SCConv module?
You should include the K1 path as well: https://github.com/MCG-NKU/SCNet/blob/c0b5bd6aa919c00afb5815b2810e645e6a4a5976/scnet.py#L78 .
Also, the BN layers should be removed accordingly.
The BN layers? Removed?
Does that refer to the BN layers contained in K1, K2, K3, K4?
Do I just need to remove them?
Architectures like VGGs do not have BN layers.
For those models, you should remove all the BN layers.
|
gharchive/issue
| 2020-05-26T00:30:40 |
2025-04-01T04:55:18.521321
|
{
"authors": [
"HiDay1",
"backseason"
],
"repo": "MCG-NKU/SCNet",
"url": "https://github.com/MCG-NKU/SCNet/issues/3",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
159833753
|
Bug in generator
[Server] WARN This is a bug in the generator plugin, NOT a bug in Multiverse!
[Server] WARN Failed to set the generator for world 'test' to 'TerrainControl':
java.lang.NullPointerException: The validated object is null
Could you post the rest of the error? While these lines are the most useful for you, they don't contain any information for us to resolve the issue.
There is no more error, just that.
There must be other lines nearby containing stuff like at com.khorn.terraincontrol, but maybe your admin panel doesn't display them.
What I would try:
create a world with another name - the name 'test' sometimes causes problems, as that name is also used internally by Multiverse for several tests.
make sure you have the correct TerrainControl version for your Minecraft version -- see the table over here.
Ok, thanks
|
gharchive/issue
| 2016-06-12T15:47:16 |
2025-04-01T04:55:18.524488
|
{
"authors": [
"Culvanen",
"rutgerkok"
],
"repo": "MCTCP/TerrainControl",
"url": "https://github.com/MCTCP/TerrainControl/issues/433",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
2716360425
|
Allow specifying custom task repo - all-in-one PR
Allow users to specify the task repo rather than always using TASK_REPO_URL
Watch out:
.env changes
Testing:
try running a task from another repo
Same as this stack:
https://github.com/METR/vivaria/pull/737 - Add repoName to TaskSource
https://github.com/METR/vivaria/pull/738 - Add taskRepoName to task_environments_t
https://github.com/METR/vivaria/pull/739 - Update the frontend taskRepoUrl function to use the DB taskRepoName
https://github.com/METR/vivaria/pull/740 - Fetch tasks from repos other than TASK_REPO_URL
https://github.com/METR/vivaria/pull/741 - Allow specifying custom task repo
https://github.com/METR/vivaria/pull/742 - Add more params to CopyRunCommandButton
But @sjawhar requested an all-in-one PR
Also, getOrCreateLockFile() is a misleadingly named function. The lock file does not need to exist; its parent directory does.
When I tried it, it seemed that the lockfile did need to exist. And when you were encountering errors before, you said it was because wellKnownDir didn't exist, but Vivaria has always put lockfiles in wellKnownDir, so I would be surprised if that was the case. I'm pretty sure the errors you were encountering were because the lockfile didn't exist.
Please do not merge without fixing the / in the image name
When I tried it it seemed that the lockfile did need to exist.
$ rm -f this_file_does_not_exist.lock && git status && flock this_file_does_not_exist.lock git status
On branch report-run-error-rate
Your branch is up to date with 'origin/report-run-error-rate'.
nothing to commit, working tree clean
On branch report-run-error-rate
Your branch is up to date with 'origin/report-run-error-rate'.
Untracked files:
(use "git add <file>..." to include in what will be committed)
this_file_does_not_exist.lock
nothing added to commit but untracked files present (use "git add" to track)
Flock will create the lock file; that is not the issue.
Please do not merge without fixing the / in the image name
I did fix this already in 062ae40
|
gharchive/pull-request
| 2024-12-04T01:46:11 |
2025-04-01T04:55:18.539771
|
{
"authors": [
"oxytocinlove",
"sjawhar"
],
"repo": "METR/vivaria",
"url": "https://github.com/METR/vivaria/pull/753",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
940803314
|
Grid spacing becomes inconsistent with high slope values.
The italic grid gets closer together the higher the slope value.
This has been fixed in the newest working branch commit. Closing this here.
|
gharchive/issue
| 2021-07-09T14:12:36 |
2025-04-01T04:55:18.547826
|
{
"authors": [
"MatthewBlanchard"
],
"repo": "MFEK/glif",
"url": "https://github.com/MFEK/glif/issues/132",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1942554155
|
[Question] NVIDIA GeForce RTX 4080 GPU is not yet supported in the Docker
:question: Question
Hi, I built and ran the Docker container, and everything looks right except a warning stating Detected NVIDIA NVIDIA GeForce RTX 4080 GPU, which is not yet supported in this version of the container
ERROR: No supported GPU(s) detected to run this container
Is there any solution to this problem ?
You could try to bump the base image in the Dockerfile to a newer PyTorch version (make sure to keep the PyTorch version below 2.0, since 2.0 is not yet supported in nnDetection).
|
gharchive/issue
| 2023-10-13T19:52:44 |
2025-04-01T04:55:18.556624
|
{
"authors": [
"Yasmin-Kassim",
"mibaumgartner"
],
"repo": "MIC-DKFZ/nnDetection",
"url": "https://github.com/MIC-DKFZ/nnDetection/issues/204",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
857662002
|
[QUESTION] test on two ply but the result is not good
clouds.zip
Thanks for your code and paper. It is very interesting.
I installed TEASER++ on Windows (with some modifications). I got the two examples (*.cc) running.
I tested on two point clouds with teaser_cpp_fpfh, but the result was not good (the point clouds are quite simple and symmetric).
I think the performance depends on keypoint detection and descriptor extraction.
teaser_cpp_ply takes a lot of time because it takes all points in the cloud as keypoints.
Can you please test your algorithms on these clouds ?
Thanks.
Hi @david-ngo-sixense , you need to first generate correspondences and then feed the correspondences to TEASER++. I recommend using the Python interface together with Open3D so that you can test different feature detectors and select the best combination.
Hi @jingnanshi , thanks for your reply.
I installed your TEASER++ (C++ version, on a Windows machine) and used it successfully for a C++ project.
Yes, I want to use the python interface, but I can't install the python wrapper.
(Pytorch) C:\dev\TEASER-plusplus\build\python>pip install .
Processing c:\dev\teaser-plusplus\build\python
Building wheels for collected packages: teaserpp-python
Building wheel for teaserpp-python (setup.py) ... done
Created wheel for teaserpp-python: filename=teaserpp_python-1.0.0-py3-none-any.whl size=18144 sha256=631c0432c1ba85f575c7cace87144edbc03ab4154435b71693b54234290699cf
Stored in directory: C:\Users\ttngo\AppData\Local\Temp\pip-ephem-wheel-cache-f485dfpb\wheels\d7\cc\2e\d79e3ca0e0949697fc37003fde9b332b17db6e371bdd357de5
Successfully built teaserpp-python
Installing collected packages: teaserpp-python
Successfully installed teaserpp-python-1.0.0
(Pytorch) C:\dev\TEASER-plusplus\build\python>python
Python 3.8.8 (default, Feb 24 2021, 15:54:32) [MSC v.1928 64 bit (AMD64)] :: Anaconda, Inc. on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import teaserpp_python
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\dev\TEASER-plusplus\build\python\teaserpp_python\__init__.py", line 1, in <module>
    from .teaserpp_python import *
ModuleNotFoundError: No module named 'teaserpp_python.teaserpp_python'
Do you know why?
Make sure when you compile the C++ project you turn on BUILD_PYTHON_BINDINGS
Hi, I don't have much experience with C++. How would I go about turning on BUILD_PYTHON_BINDINGS when compiling the project? I just followed the installation instructions at https://teaser.readthedocs.io/en/master/installation.html, and they did not mention anything about the flag you mention.
|
gharchive/issue
| 2021-04-14T08:34:05 |
2025-04-01T04:55:18.571668
|
{
"authors": [
"david-ngo-sixense",
"jingnanshi",
"manavkulshrestha"
],
"repo": "MIT-SPARK/TEASER-plusplus",
"url": "https://github.com/MIT-SPARK/TEASER-plusplus/issues/92",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
567700590
|
This is a C project not a C++ project
Hello
You wrote a nice library.
But it shows that you are still a C++ beginner.
It is COMPLETELY WRONG to require the user of your library to call 'delete' to destroy a JSONValue. Anybody who forgets this will have a MEMORY LEAK.
In a correctly programmed C++ library there must NEVER be the need to ever use 'delete' for a class. C++ classes have a destructor which should do all the work of freeing memory.
The correct way would be to use a MEMBER function Parse() instead of a static function Parse().
JSONValue myValue;
myValue.Parse(stringData);
Then the destructor of myValue does the clean up of the memory alone.
@Elmue your tone is soooo aggressive, you should at least propose a fix in a pull request ;p
Better now ?
|
gharchive/issue
| 2020-02-19T17:00:41 |
2025-04-01T04:55:18.576904
|
{
"authors": [
"Elmue",
"lilpit"
],
"repo": "MJPA/SimpleJSON",
"url": "https://github.com/MJPA/SimpleJSON/issues/40",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
1402635639
|
Update bug reporter url
Summary
Changes the url for the bug-reporter api.
3 commits, one line changed. Some sort of record(?).
Why is this change needed?
Because the bug-reporter server code was moved to the organisation repository, the server needed to be redeployed under a new domain.
Does it depend on any other changes/PRs?
IMPORTANT PLS READ!
This depends on the mlvet repo being moved to the organisation account. The bug-report feature will not work until that happens, and will instead tell you to "try again later". This is because the server code already points to the future location of the mlvet repo, not its current location.
OS
[ ] Linux
[ ] MacOS
[x] Windows
MLVET has been moved, does it work now?
Yep, it works
|
gharchive/pull-request
| 2022-10-10T05:58:29 |
2025-04-01T04:55:18.590518
|
{
"authors": [
"LiamTodd"
],
"repo": "MLVETDevelopers/mlvet",
"url": "https://github.com/MLVETDevelopers/mlvet/pull/358",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|