id | text | source | created | added | metadata
---|---|---|---|---|---|
2618963075
|
[FEAT] Search Aggregations
Search results need aggregations on :+1:
type:
status
We'll have filters on the search page similar to this:
Results for Grants and Experts will show up when they're selected, of course. When Experts or All Results are selected, the Experts Open To section will show up, and allow filtering of experts for the current search term to those with specific availabilities.
The availability filters shouldn't affect the grants that are returned for the search, so the number of experts will be the only thing that changes. Using the screenshot above as reference, if they have an availability filter selected so that only 9 experts are returned, when All Results is selected, the 9 experts + the 39 grants should show up in the search results.
@UcDust I have added another endpoint to the search:
http http://localhost/api/search/types q==hart Authorization:"Bearer $jwt"
results in:
{
"aggregations": {
"type": {
"expert": 1,
"grant": 7,
"grant_research": 7
}
},
"total": 8
}
http http://localhost/api/search/types Authorization:"Bearer $jwt"
results in:
{
"aggregations": {
"type": {
"expert": 4,
"grant": 91,
"grant_research": 90,
"grant_service": 83
}
},
"total": 95
}
So the question for you is should we run this on every search query and update the aggregations, or do you want to manage in the client? You can comment and assign back to me.
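For reference, the two calls above can also be made from a script; here is a minimal sketch using Python's requests library (the endpoint and header mirror the HTTPie commands, and the token handling is left to the caller):

```python
import requests

jwt = "..."  # bearer token, obtained however the client normally does

# Mirrors: http http://localhost/api/search/types q==hart Authorization:"Bearer $jwt"
resp = requests.get(
    "http://localhost/api/search/types",
    params={"q": "hart"},                       # drop params for the unfiltered totals
    headers={"Authorization": f"Bearer {jwt}"},
)
resp.raise_for_status()
type_counts = resp.json()["aggregations"]["type"]
print(type_counts)  # e.g. {"expert": 1, "grant": 7, "grant_research": 7}
```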
@UcDust, that wasn't clear :). I still don't understand how you're expecting this to work, so I have two proposals:
Two Aggregations
We could return two aggregations, one on the query only, and one on the complete query. This would allow you to limit your experts to those wanting media, but still open grants, and see how many are active, etc.
Something like:
http http://localhost/api/search type==grant status==active Authorization:"Bearer $jwt" | jq .aggregations
{
"query_only": {
"availability": {
"ark:/87287/d7mh2m/keyword/c-ucd-avail/collaborative%20projects": 2,
"ark:/87287/d7mh2m/keyword/c-ucd-avail/community%20partnerships": 1,
"ark:/87287/d7mh2m/keyword/c-ucd-avail/industry%20projects": 1,
"ark:/87287/d7mh2m/keyword/c-ucd-avail/media%20enquiries": 1
},
"type": {
"grant": 91,
"grant_research": 90,
"grant_service": 83,
"expert": 4
},
"status": {
"completed": 86,
"active": 5
}
},
"complete": {
"availability": {},
"type": {
"grant": 5,
"grant_research": 5,
"grant_service": 4
},
"status": {
"active": 5
}
}
}
@UcDust I've updated it using the first example; let me know if you agree.
|
gharchive/issue
| 2024-10-28T16:48:42 |
2025-04-01T04:36:09.961575
|
{
"authors": [
"UcDust",
"qjhart"
],
"repo": "ucd-library/aggie-experts",
"url": "https://github.com/ucd-library/aggie-experts/issues/643",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1334518247
|
Cannot display images with gview
At some point I noticed that gview can no longer display images.
tview also fails to load files.
Doing a full rebuild of the kernel resolved it.
The FileDescriptor::IsTerminal method introduced in https://github.com/uchan-nos/mikanos/commit/778963bde30bfb220320060d50e1ee54bad9aa31 is probably the cause.
My guess is that every .cpp file depending on classes that inherit from FileDescriptor needed to be recompiled.
|
gharchive/issue
| 2022-08-10T12:01:07 |
2025-04-01T04:36:09.966178
|
{
"authors": [
"uchan-nos"
],
"repo": "uchan-nos/mikanos",
"url": "https://github.com/uchan-nos/mikanos/issues/55",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
153590968
|
run ansible-playbook with docker in Travis CI
This pull request adds a test that runs ansible-playbook with the docker connection on Travis CI.
Thank you for your submission, we really appreciate it. Like many open source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution.
This pull request includes some tweaks:
Update the Docker version: with the Docker shipped in trusty, docker exec could only use the container ID, not the container name.
Set remote_user: Ansible 2.0.2 tries to run docker exec with -u self._play_context.remote_user; Ansible 2.1.0 will include a fix for this (ansible/ansible#15043), and Ansible 2.0.1 works without this workaround.
|
gharchive/pull-request
| 2016-05-07T11:56:21 |
2025-04-01T04:36:09.990505
|
{
"authors": [
"CLAassistant",
"uchida"
],
"repo": "uchida/ansible-pbuilder-role",
"url": "https://github.com/uchida/ansible-pbuilder-role/pull/1",
"license": "cc0-1.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
1014540688
|
fixed prettier/eslint conflicts
ESLint wants single quotes for each string, but Prettier's default config gives each string double quotes. By forcing Prettier to use single quotes for strings in the .prettierrc file, we fix this issue for devs who have Prettier installed globally.
@matthewcn56 this looks good to me, thanks for making the changes!!
Do you wanna maybe add in changes for #34 in this PR as well? Here's the editor's example if you wanted to yoink it https://github.com/uclaacm/TeachLAFrontend/blob/master/.prettierrc.json
|
gharchive/pull-request
| 2021-10-03T21:07:00 |
2025-04-01T04:36:09.993254
|
{
"authors": [
"matthewcn56",
"reginawang3495"
],
"repo": "uclaacm/teach-la-ts-react-starter",
"url": "https://github.com/uclaacm/teach-la-ts-react-starter/pull/67",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2279132276
|
reproducing experiment for wiredtiger
Hi,
In the WiredTiger folder's README, it's mentioned that there is a run/run.sh. However, I couldn't find it in the current repository.
Can you point me to where that is?
Thanks!
Thank you for your interest in Midas! I have pushed the missing scripts to the repo. Please find it at apps/wiredtiger/run.
Thank you!
Thanks! Closing this issue.
|
gharchive/issue
| 2024-05-04T17:47:31 |
2025-04-01T04:36:09.995197
|
{
"authors": [
"ivanium",
"pohaoc"
],
"repo": "uclasystem/midas",
"url": "https://github.com/uclasystem/midas/issues/1",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
212937564
|
wrong comment
Not that important but the last line of "ground_truth_package.h" should be:
#endif /* GROUND_TRUTH_PACKAGE_H_ */
instead of
#endif /* MEASUREMENT_PACKAGE_H_ */
This has been fixed! Thanks :)
|
gharchive/issue
| 2017-03-09T05:15:42 |
2025-04-01T04:36:10.010255
|
{
"authors": [
"cameronwp",
"philbort"
],
"repo": "udacity/CarND-Extended-Kalman-Filter-Project",
"url": "https://github.com/udacity/CarND-Extended-Kalman-Filter-Project/issues/6",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
358656143
|
TypeError: Cannot read property 'toLowerCase' of undefined
Sometimes this error is returned.
It happens when the IP is undefined.
|
gharchive/issue
| 2018-09-10T14:44:12 |
2025-04-01T04:36:10.017700
|
{
"authors": [
"nexus-soft"
],
"repo": "udger/udger-nodejs",
"url": "https://github.com/udger/udger-nodejs/issues/36",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2722616410
|
reorganize source code and sync with z80pack/picosim/libs
Move simulator source code into new srcsim directory and libraries source code into new libs directory.
Rewrite lcd libraries CMakeLists.txt files in the style of pico-ds3231.
Fix roms/Makefile to properly build z80asm.
It now looks like z80pack/picosim and won't shock users only looking to get this running on their device with a bunch of source code in the top-level directory.
Looks much better, thanks.
|
gharchive/pull-request
| 2024-12-06T10:15:13 |
2025-04-01T04:36:10.022887
|
{
"authors": [
"sneakywumpus",
"udo-munk"
],
"repo": "udo-munk/RP2040-GEEK-80",
"url": "https://github.com/udo-munk/RP2040-GEEK-80/pull/62",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
327400530
|
Why so many rerenders?
I've noticed that the entire app is rendered twice on the server and twice more on the client. This seems to be a waste of resources. If the content is all there and rendered on the server, why is a client-side render happening?
To replicate, add some logging to MainApp.js and observe that server and client logs repeat 4 times.
This is because of react-async-bootstrapper: it walks the tree and renders to resolve all lazy-loaded components, then does a final render.
This can be solved with react suspense in the future
The client should not render twice however, so if you can share your snippet that would help.
Thanks
I don't have a snippet; I tested with the blank code that's in the repo. I literally just added a console.log('bla') in the MainApp function and saw it logged twice on the server and twice in the client. It is also logged twice for the service worker but I assumed that's correct.
|
gharchive/issue
| 2018-05-29T16:39:10 |
2025-04-01T04:36:10.073711
|
{
"authors": [
"birkir",
"jdwillemse"
],
"repo": "ueno-llc/starter-kit-universally",
"url": "https://github.com/ueno-llc/starter-kit-universally/issues/106",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
279760274
|
Fix fonts file
The default font.css file is not working and doesn't import the fonts it contains. Replacing it with a .scss file makes it work.
Ok, I found where the problem comes from. We use the babel css-modules-transform plugin https://github.com/ueno-llc/starter-kit-universally/blob/development/config/values.js#L399 that rewrites class names with hashed names.
Somehow, it just breaks css file imports.
|
gharchive/pull-request
| 2017-12-06T13:52:11 |
2025-04-01T04:36:10.075431
|
{
"authors": [
"JeremDsgn"
],
"repo": "ueno-llc/starter-kit-universally",
"url": "https://github.com/ueno-llc/starter-kit-universally/pull/76",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
1718051225
|
Add open source license?
Can an open source license be added to this repository so that concerns e.g. about liability can be addressed?
Apache License 2.0
|
gharchive/issue
| 2023-05-20T04:57:20 |
2025-04-01T04:36:10.079296
|
{
"authors": [
"kazarmy",
"vinciusb"
],
"repo": "ufmg-smite/proof-visualizer",
"url": "https://github.com/ufmg-smite/proof-visualizer/issues/67",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1754252265
|
Scrolling behaviour changed after bootstrap update
Bug
Scrolling behaviour changed after bootstrap update.
Scrolling behaviour changed: vertical scrollbar is now present for small amounts of folders and files as well. Used to only appear for high amounts of folders / files.
Old version:
New version:
@mietcls this does not seem to be the case anymore on test and production. Closing this?
|
gharchive/issue
| 2023-06-13T07:44:56 |
2025-04-01T04:36:10.106953
|
{
"authors": [
"mietcls",
"nicolasfranck"
],
"repo": "ugent-library/deliver",
"url": "https://github.com/ugent-library/deliver/issues/83",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2330860710
|
Working on vectors
Please make dld (and related functions) work on vectors (Positionals).
This works:
dld('Pepa' 'Pepper')
# 3
This does not, but it would be nice if it did:
dld('Pepa'.comb, 'Pepper'.comb)
# This type cannot unbox to a native string: P6opaque, Seq
Motivation
Using dld as a distance function in nearest neighbor finding algorithms.
Like those in "Math::Nearest".
Having common behavior with other distance functions.
Like hamming-distance in "Math::DistanceFunctions".
For the most part I'd be ok with accepting a PR that adds a new multi to accommodate this.
I am not immediately sure how to handle cases where an element in the list contains multiple letters though, i.e. dld(['AA','b','c'], ['A','c','b']). It would be simple enough to add a check that each item in the list is only a single character, but I'm not sure if the performance penalty would be acceptable. What do you think?
In dld(['AA','b','c'], ['A','c','b']) the strings should be treated as characters. I.e. how many edits it will take to turn the first vector into the second.
I'm considering submitting a PR -- it should have tests for dld over vectors.
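For reference, here is a sketch in Python (not Raku, and not necessarily the module's exact algorithm) of a restricted Damerau-Levenshtein distance over arbitrary sequences, which treats each element -- 'AA' included -- as a single symbol:

```python
def dld(a, b):
    """Restricted Damerau-Levenshtein (optimal string alignment) over sequences.

    Works on plain strings as well as on lists of strings, so
    dld(['AA', 'b', 'c'], ['A', 'c', 'b']) compares 'AA' and 'A' as whole items.
    """
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(
                d[i - 1][j] + 1,         # deletion
                d[i][j - 1] + 1,         # insertion
                d[i - 1][j - 1] + cost,  # substitution
            )
            if i > 1 and j > 1 and a[i - 1] == b[j - 2] and a[i - 2] == b[j - 1]:
                d[i][j] = min(d[i][j], d[i - 2][j - 2] + 1)  # transposition
    return d[m][n]

print(dld("Pepa", "Pepper"))                   # 3, matching the scalar example above
print(dld(["AA", "b", "c"], ["A", "c", "b"]))  # 2: substitute 'AA' -> 'A', transpose 'b' and 'c'
```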
|
gharchive/issue
| 2024-06-03T11:10:31 |
2025-04-01T04:36:10.111338
|
{
"authors": [
"antononcube",
"ugexe"
],
"repo": "ugexe/Raku-Text--Levenshtein--Damerau",
"url": "https://github.com/ugexe/Raku-Text--Levenshtein--Damerau/issues/13",
"license": "Artistic-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
737810830
|
Typescript issue
It seems that this library is missing a type definition file and there is none added so far in DefinitelyTyped repo, will that be added in the future? I'm pretty new to typescript myself, so I'm not really sure I can add one.
copy this into a file
declare module 'react-native-rating-element' {
import { ViewStyle } from 'react-native'
interface RatingProps {
rated?: number
totalCount?: number
type?: string
selectedIconImage?: JSX.Element | JSX.Element[]
emptyIconImage?: JSX.Element | JSX.Element[]
readonly?: boolean
direction?: string
onIconTap?: (value: number) => void
ratingColor?: string
ratingBackgroundColor?: string
icon?: string
marginBetweenRatingIcon: 1
style?: ViewStyle
}
export const Rating: (props: RatingProps) => any
}
There's an error in the type interface above. marginBetweenRatingIcon should be:
marginBetweenRatingIcon?: number
|
gharchive/issue
| 2020-11-06T14:40:32 |
2025-04-01T04:36:10.121587
|
{
"authors": [
"markrickert",
"panigrah",
"tonym95"
],
"repo": "ui-ninja/react-native-rating-element",
"url": "https://github.com/ui-ninja/react-native-rating-element/issues/24",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1505288443
|
error on simpleList or genericList
Current Behavior
I get an error when using SimpleList or GenericList
Error: Element type is invalid: expected a string (for built-in components) or a class/function (for composite components) but got: undefined. You likely forgot to export your component from the file it's defined in, or you might have mixed up default and named imports.
Check the render method of `ListButton`.
and the code is
import React, { useCallback, useState } from 'react'
// Import Schema UI Provider and Render engine
import { isInvalid } from '@ui-schema/ui-schema/ValidityReporter'
import { createOrderedMap } from '@ui-schema/ui-schema/Utils/createMap'
import { UIStoreProvider, createStore } from '@ui-schema/ui-schema/UIStore'
import { storeUpdater } from '@ui-schema/ui-schema/storeUpdater'
import { UIMetaProvider, useUIMeta } from '@ui-schema/ui-schema/UIMeta'
import { injectPluginStack } from '@ui-schema/ui-schema/applyPluginStack'
import { relTranslator } from '@ui-schema/ui-schema/Translate/relT'
import { widgets } from '@ui-schema/ds-material/widgetsBinding'
import { GridContainer } from '@ui-schema/ds-material/GridContainer'
import { Stepper, Step } from '@ui-schema/ds-material/Widgets/Stepper'
import { SimpleList } from '@ui-schema/ds-material/Widgets/SimpleList'
import { GenericList } from '@ui-schema/ds-material/Widgets/GenericList'
import { ListButton } from '@ui-schema/ds-material/Component'
const schemaBase = {
type: 'object',
properties: {
labels: {
type: 'array',
widget: 'SimpleList',
items: {
type: 'string'
},
view: {
sizeXs: 12,
sizeMd: 12
// btnSize: 'small'
}
}
}
}
const GridStack = injectPluginStack(GridContainer)
export const DemoForm = () => {
// optional state for display errors/validity
const [showValidity, setShowValidity] = useState(false)
const [store, setStore] = useState(() => createStore(createOrderedMap({})))
const [schema] = useState(() => createOrderedMap(schemaBase))
const onChange = useCallback(
actions => {
setStore(storeUpdater(actions))
},
[setStore]
)
const { widgets } = useUIMeta()
const customWidgets = {
...widgets,
custom: {
...widgets.custom,
Stepper,
Step,
SimpleList,
GenericList,
ListButton
}
}
return (
<>
<UIStoreProvider store={store} onChange={onChange} showValidity={showValidity}>
<GridStack widgets={{ ...customWidgets }} isRoot schema={schema} />
</UIStoreProvider>
<button
onClick={() =>
isInvalid(store.getValidity()) ? setShowValidity(true) : console.log('doingSomeAction:', store.valuesToJS())
}
>
send!
</button>
</>
)
}
export default function App() {
return (
<UIMetaProvider widgets={widgets} t={relTranslator}>
<DemoForm />
</UIMetaProvider>
)
}
Your Environment

Tech | Version
---|---
UI-Schema | v0.4.4
Immutable | v4.1.10
DS-Material | 0.4.1
Material-UI | 5.10.16
nextjs | 13
This seems to come from the build system, what type/target do you use in nextjs?
Best would be if you can create a small reproducible repository where I can check your setup. I don't have much experience with the NextJS build system.
Also you somehow have ListButton in widgets.custom - that isn't a widget but some component inside the list widgets.
|
gharchive/issue
| 2022-12-20T21:14:31 |
2025-04-01T04:36:10.126830
|
{
"authors": [
"elbakerino",
"marefati110"
],
"repo": "ui-schema/ui-schema",
"url": "https://github.com/ui-schema/ui-schema/issues/216",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1943903293
|
Failed to detect namespace or namespace is empty for project
Hi colleagues,
Following the recent devtoberfest recording (thanks for that!), I used yeoman to generate a library. Right after the generation, I tried to start the library, but got the following error: Failed to detect namespace or namespace is empty for project my.library. Could you please take a look?
Output of ui5 serve --verbose:
Error Message:
Failed to detect namespace or namespace is empty for project my.library. Check verbose log for details.
Stack Trace:
Error: Failed to detect namespace or namespace is empty for project my.library. Check verbose log for details.
at Library._getNamespace (file:///E:/Dev/my.library/node_modules/@ui5/cli/node_modules/@ui5/project/lib/specifications/types/Library.js:333:10)
at async Library._parseConfiguration (file:///E:/Dev/my.library/node_modules/@ui5/cli/node_modules/@ui5/project/lib/specifications/types/Library.js:169:21)
at async Library.init (file:///E:/Dev/my.library/node_modules/@ui5/cli/node_modules/@ui5/project/lib/specifications/Project.js:38:3)
at async file:///E:/Dev/my.library/node_modules/@ui5/cli/node_modules/@ui5/project/lib/graph/Module.js:165:17
at async Promise.all (index 0)
at async Module._getSpecifications (file:///E:/Dev/my.library/node_modules/@ui5/cli/node_modules/@ui5/project/lib/graph/Module.js:160:17)
at async projectGraphBuilder (file:///E:/Dev/my.library/node_modules/@ui5/cli/node_modules/@ui5/project/lib/graph/projectGraphBuilder.js:118:61)
at async graphFromPackageDependencies (file:///E:/Dev/my.library/node_modules/@ui5/cli/node_modules/@ui5/project/lib/graph/graph.js:73:23)
at async serve.handler (file:///E:/Dev/my.library/node_modules/@ui5/cli/lib/cli/commands/serve.js:100:11)
Hello,
I'm not sure if it was intentional, but my ".library" file was named "_.library." After changing that, the project built successfully right after generation.
Hi @whydrae , very sorry to notice this issue that late. You are running on a Windows machine, right? I have no problems on Mac - maybe this is related to the OS.
Can you share the complete log creating the library:
> ~ % yo easy-ui5 library
_-----_
| | ╭──────────────────────────╮
|--(o)--| │ Welcome to the easy-ui5 │
`---------´ │ 3.6.3 generator! │
( _´U`_ ) ╰──────────────────────────╯
/___A___\ /
| ~ |
__'.___.'__
´ ` |° ´ Y `
? What is the namespace of your library? my.library
? Which framework do you want to use? OpenUI5
? Which framework version do you want to use? 1.120.0
? Would you like to omit the namespace in the src and test folder? Yes
? Who is the author of the application? xxx
? Would you like to create a new directory for the application? Yes
? Would you like to initialize a local github repository for the application? No
force my.library/.yo-rc.json
create my.library/ui5.yaml
create my.library/ui5-dist.yaml
create my.library/package.json
create my.library/karma.conf.js
create my.library/karma-ci.conf.js
create my.library/karma-ci-cov.conf.js
create my.library/.gitignore
create my.library/.eslintrc.json
create my.library/.editorconfig
create my.library/README.md
create my.library/LICENSE
create my.library/test/Example.js
create my.library/test/Example.html
create my.library/test/qunit/testsuite.qunit.js
create my.library/test/qunit/testsuite.qunit.html
create my.library/test/qunit/Example.qunit.js
create my.library/src/messagebundle.properties
create my.library/src/library.js
create my.library/src/.library
create my.library/src/ExampleRenderer.js
create my.library/src/Example.js
create my.library/src/themes/base/library.source.less
create my.library/src/themes/base/Example.less
create my.library/src/themes/sap_horizon/library.source.less
create my.library/src/themes/sap_horizon_dark/library.source.less
create my.library/src/themes/sap_horizon_hcw/library.source.less
create my.library/src/themes/sap_horizon_hcb/library.source.less
create my.library/src/themes/sap_fiori_3/library.source.less
create my.library/src/themes/sap_fiori_3_dark/library.source.less
create my.library/src/themes/sap_fiori_3_hcw/library.source.less
create my.library/src/themes/sap_fiori_3_hcb/library.source.less
added 1254 packages, and audited 1255 packages in 43s
147 packages are looking for funding
run `npm fund` for details
found 0 vulnerabilities
force my.library/.yo-rc.json
> ~ % cd my.library
> ~/my.library % ui5 serve
INFO: Using local @ui5/cli installation
info graph:helpers:ui5Framework Using OpenUI5 version: 1.120.0
info server:custom-middleware:ui5-middleware-livereload Livereload server started!
Server started
URL: http://localhost:8080
In my case, I see that a .library file has been created - as @JarkkoRissanen mentioned this may be the issue to not retrieve the namespace...
Ok, it is an OS issue - found it:
https://github.com/ui5-community/generator-ui5-library/blob/main/generators/app/index.js#L172
The last replace uses the linux file separators... This won't work on Windows.
Thanks for catching this issue and reporting it.
I will fix it now
@petermuessig @whydrae I just had the same error on Mac with the newest version of this repo.
Funnily it only occurred in zsh, not in the JS debug terminal, hence I figured it was related to the respective environment - i.e. the @ui5/cli version installed globally. Indeed the breaking one had globally 3.7.3 and the working one 3.8.0. After updating the globally installed @ui5/cli to 3.8, it also worked in zsh.
In essence: when this still happens, make sure to use the newest @ui5/cli.
|
gharchive/issue
| 2023-10-15T14:07:24 |
2025-04-01T04:36:10.138130
|
{
"authors": [
"JarkkoRissanen",
"akudev",
"petermuessig",
"whydrae"
],
"repo": "ui5-community/generator-ui5-library",
"url": "https://github.com/ui5-community/generator-ui5-library/issues/22",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1799406153
|
add rake utilities
[ ] tiff / archive processing
[ ] renaming
adding to https://github.com/uidaholib/rake-utilities-for-digital instead
|
gharchive/issue
| 2023-07-11T17:23:26 |
2025-04-01T04:36:10.139911
|
{
"authors": [
"evanwill"
],
"repo": "uidaholib/digital-collections-template-cb-csv",
"url": "https://github.com/uidaholib/digital-collections-template-cb-csv/issues/1",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
237225742
|
toml.dump doesn't seem to work
I can read toml settings, and I can get it into a custom type, TomlSettings.
However, when I try to write it back with toml.dump(self, "Settings.toml") or toml.dump(self.__dict__, "Settings.toml")*, it just fails with this error:
$ python sbr-viewer.py
Traceback (most recent call last):
File "sbr-viewer.py", line 198, in close
SETTINGS.write()
File "C:\Users\Joey\WebstormProjects\sbr-viewer-brw\settings.py", line 25, in write
toml.dump(self.__dict__, SETTINGS_FILENAME)
File "C:\Users\Joey\AppData\Local\Programs\Python\Python35-32\lib\site-packages\toml.py", line 588, in dump
if not f.write:
AttributeError: 'str' object has no attribute 'write'
*self here is an instance of TomlSettings i.e. the code lives in a method of TomlSettings.
Any idea what's going on here, and how this can be fixed?
I finally found out that it really wants a file object, not a file path, which wasn't obvious to me.
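In other words, the fix is to open the file yourself and hand the file object to toml.dump; a minimal sketch based on the snippet above:

```python
import toml

SETTINGS_FILENAME = "Settings.toml"

class TomlSettings:
    def write(self):
        # toml.dump expects an open file object, not a path string
        with open(SETTINGS_FILENAME, "w") as f:
            toml.dump(self.__dict__, f)
```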
|
gharchive/issue
| 2017-06-20T14:17:03 |
2025-04-01T04:36:10.196828
|
{
"authors": [
"JoeyAcc"
],
"repo": "uiri/toml",
"url": "https://github.com/uiri/toml/issues/108",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
1332255469
|
fold out for single repositories
The descriptions of single repositories can be folded out, which saves space.
@engeir What do you think about this design?
I think it looks good! Either way is fine with me, but yeah I agree it's tidier now and I think I prefer it over the previous.
The only thing I noticed is that the menu GitHub provides out of the box will not work unless the given heading is already expanded (clicking won't expand the drop-down where the heading is). I think that by letting the heading be the title of the drop-down, all links would work.
|
gharchive/pull-request
| 2022-08-08T18:47:49 |
2025-04-01T04:36:10.198281
|
{
"authors": [
"engeir",
"gregordecristoforo"
],
"repo": "uit-cosmo/user-guide",
"url": "https://github.com/uit-cosmo/user-guide/pull/16",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1971470253
|
Is there a repl implementation on jvm platform?
So that it is possible to use uiua on jvm. ;)
Are you asking me to reimplement the interpreter in Java?
Are you asking me to reimplement the interpreter in Java?
No, I mean that if there is a such plan?
No
|
gharchive/issue
| 2023-11-01T00:15:10 |
2025-04-01T04:36:10.199684
|
{
"authors": [
"freeze-dolphin",
"kaikalii"
],
"repo": "uiua-lang/uiua",
"url": "https://github.com/uiua-lang/uiua/issues/216",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1743110506
|
Ujcms v6.0.2 has a sensitive file reading problem
[Vulnerability description]
Ujcms v6.0.2 has a sensitive file reading problem. When the project is deployed with Tomcat, the back-end zip download of the html directory can be abused: modifying the dir parameter causes the source code and configuration files to be downloaded.
[Vulnerability Type]
Sensitive file reading(Information Disclosure)
[Vendor of Product]
https://gitee.com/ujcms/ujcms
https://github.com/ujcms/ujcms
https://www.ujcms.com/
[Affected Product Code Base]
v6.0.2
[Vulnerability proof]
Condition: the project is deployed with Tomcat.
The dir parameter is allowed to be set to "WEB-INF/", and the names parameter is allowed to be set to "classes", so that the source code and web configuration files can be downloaded directly. (There is no html directory by default; you can create it directly through the function.)
[Code Details]
com.ujcms.cms.core.web.backendapi.AbstractWebFileController#downloadZip
The code checks the two parameters "dir" and "names" separately
com.ujcms.cms.core.web.backendapi.AbstractWebFileController#checkId(java.lang.String)
Checks for directory traversal, but places no restrictions on which directories are accessible.
com.ujcms.cms.core.web.backendapi.AbstractWebFileController#checkName(java.lang.String)
Checks the file name, covering only:
(1) the file name is empty
(2) the file name contains illegal characters
Accessible directories are not restricted.
Fixed in version 7.0.0
|
gharchive/issue
| 2023-06-06T05:27:42 |
2025-04-01T04:36:10.214770
|
{
"authors": [
"keecth",
"ujcms"
],
"repo": "ujcms/ujcms",
"url": "https://github.com/ujcms/ujcms/issues/6",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1206182131
|
Tests fail in GitHub Actions
Even though all tests pass in my local environment, the tests for "Magic Eden" happen to fail for some reason.
Disabled these tests temporarily.
https://github.com/ukaznil/nft-market/commit/fddb619a6b33e0febe219305d99d16f959a34fb0
Also, https://github.com/ukaznil/nft-market/commit/48a4511b37b292f4374b3e486e3147430c492626
|
gharchive/issue
| 2022-04-16T18:01:40 |
2025-04-01T04:36:10.218009
|
{
"authors": [
"ukaznil"
],
"repo": "ukaznil/nft-market",
"url": "https://github.com/ukaznil/nft-market/issues/2",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
842568808
|
Player shroud
Use trig functions to hide everything outside of a circle a certain distance from the player.
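A minimal sketch of that check (hypothetical names; plain Euclidean distance does the job, with the trig hidden inside hypot):

```python
import math

def is_visible(player_x, player_y, tile_x, tile_y, shroud_radius):
    """True if a tile lies within the circle around the player; hide it otherwise."""
    return math.hypot(tile_x - player_x, tile_y - player_y) <= shroud_radius
```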
Ooo like the fog of war effect? I do think certain things should be hidden till you fall on them mind
|
gharchive/issue
| 2021-03-27T17:57:30 |
2025-04-01T04:36:10.224209
|
{
"authors": [
"supachris28",
"ukmadlz"
],
"repo": "ukmadlz/twitch-mud",
"url": "https://github.com/ukmadlz/twitch-mud/issues/7",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1939513393
|
basic descriptions page
TP-??? Your PR title here
Why
What
Codecov Report
All modified lines are covered by tests :white_check_mark:
Comparison is base (2430c9c) 92.79% compared to head (7e8f37c) 92.80%.
Additional details and impacted files
@@ Coverage Diff @@
## master #1069 +/- ##
=======================================
Coverage 92.79% 92.80%
=======================================
Files 460 460
Lines 34258 34264 +6
Branches 2616 2616
=======================================
+ Hits 31791 31797 +6
Misses 1969 1969
Partials 498 498
Files | Coverage Δ
---|---
commodities/urls.py | 100.00% <ø> (ø)
commodities/views.py | 99.09% <100.00%> (+0.02%) :arrow_up:
common/models/trackedmodel.py | 94.32% <ø> (ø)
:umbrella: View full report in Codecov by Sentry.
:loudspeaker: Have feedback on the report? Share it here.
|
gharchive/pull-request
| 2023-10-12T08:47:25 |
2025-04-01T04:36:10.231739
|
{
"authors": [
"CPrich905",
"codecov-commenter"
],
"repo": "uktrade/tamato",
"url": "https://github.com/uktrade/tamato/pull/1069",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2717376715
|
Mark all chats as read
Is there a way to mark all chats as read? On Element, things are already marked as read, and when I use iamb, those chats are unread.
You might be looking for :unreads clear
|
gharchive/issue
| 2024-12-04T11:09:27 |
2025-04-01T04:36:10.384032
|
{
"authors": [
"Ahwxorg",
"thibaultamartin"
],
"repo": "ulyssa/iamb",
"url": "https://github.com/ulyssa/iamb/issues/377",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2290837785
|
Fix openSUSE link and installation command in README
The package is now hosted in the main openSUSE Tumbleweed repositories. This corrects the README to use the openSUSE repo instead of a user-maintained one.
Thank you for updating the instructions! I'll update the website too to reflect it now being in the main repos. :tada:
|
gharchive/pull-request
| 2024-05-11T10:57:05 |
2025-04-01T04:36:10.385017
|
{
"authors": [
"photosheep",
"ulyssa"
],
"repo": "ulyssa/iamb",
"url": "https://github.com/ulyssa/iamb/pull/283",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
208428998
|
DevTools Video Course: What do you want to see?
Please 'thumbs up' the relevant comment below:
Performance (Basic) - Network perf, how to use your build system to output optimised resources etc.
Performance (Advanced) - Rendering perf, using the Timeline Panel to diagnose jank. Using Chrome Tracing Tools.
DevTools as an IDE - What you need to know in order to keep your development environment within the browser.
Building a DevTools extension: Enhance your dev workflow by building extensions to automate.
Accessibility - Carry out Accessibility audits on your own website and learn how to fix issues.
Chrome DevTools Source Code - Learn to navigate your way around the DevTools itself (it's all a web application). Become a contributor by landing your own bug fix or feature.
Service Workers - How they work and how to add them to your own site.
Performance Audits of other popular websites - improving the front-end perf of other existing websites.
Security - Explore and discover security exploits of other websites with the aid of DevTools and a few other tools.
Workflows - Modern workflows to adopt in your development and debugging journeys.
Animations - Learn how to create, debug and modify animations.
React.js - How to develop and debug a modern React.js webapp.
Vue.js - How to develop and debug a modern Vue.js webapp.
Angular JS - How to develop and debug a modern Angular 2 webapp.
Node.js - How to develop, debug and solve problems with Node.js and DevTools.
Mobile - Advanced remote debugging techniques and everything you need to know about responsiveness and how DevTools can help with this.
DevTools Experiments - Everything you need to know about the available experiments (e.g. like the Terminal one) and how to stay up to date with them and learn when new ones arrive.
JavaScript ES6 & ES7 - How to use brand new features of JavaScript effectively and discovering hidden debugging options for bleeding edge features.
DevTools for Designers - How a designer, or someone with little programming knowledge can effectively use DevTools to improve websites, discover potential performance bottlenecks, and then feed this back to a developer.
DevTools for DevOps - How someone responsible for the infrastructure side of websites can fully utilise hidden DevTools features (along with a few Command Line Tools) to diagnose and solve network related performance issues. Also how to expose server side debugging and logging directly in DevTools.
Debugging Third Party Services and APIs - Understand bugs caused by third party code like ad trackers, plugins etc. Also work with external APIs efficiently and effectively.
Paint Profiling - Deeper insights into how a browser paints your page, how to read and explore draw calls executed by the browser renderer.
Visualising Performance Metrics with continuous integration - Record and study performance metrics of your website into a dashboard which updates with each code check-in.
Using HTTP2 - Learn how to implement HTTP2 with your Node.js website, also understand the performance benefits of doing so.
Would be nice to include TLS concerns when talking about service workers. How to, LetsEncrypt, that stuff.
If it's not too far off topic, I'd love to see a discussion of the various levels of caching (database, web server, in-browser,) how to use them effectively, and how to take them into account when optimizing your application for speed.
How about a comparison of AMP vs other mobile-first initiatives?
Regarding using devtool as IDE, I wondering how to properly customize the CodeMirror that we can have features such as code folding, showing indent guides...
Is it me or can no one else add reactions? Maybe it's just the mobile site?
@wayou Emmet support thumbs up
@umaar When this video course will ready and where we can watch it?
@AzharArshad According to the course site home page, on 1 May of this year! But if you buy the course now, you will get a discount; after that date the price will be normal. By the way, when you buy the course you will have access to it forever!!! 😀
Course Update: Thanks everyone for your excellent suggestions. Pre-launch is now available 🎊 ModernDevTools.com
|
gharchive/issue
| 2017-02-17T12:33:22 |
2025-04-01T04:36:10.398051
|
{
"authors": [
"Alexisvt",
"AzharArshad",
"iandouglas",
"ingshtrom",
"jake-bladt",
"mugukamil",
"njnygaard",
"umaar",
"wayou"
],
"repo": "umaar/dev-tips",
"url": "https://github.com/umaar/dev-tips/issues/27",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2431308358
|
hostname responds 400 for ipv6
Describe the Bug
Database
PostgreSQL
Relevant log output
payload.hostname must match the following: "/^(([a-zA-Z0-9]|[a-zA-Z0-9][a-zA-Z0-9-]*[a-zA-Z0-9])\.)*([A-Za-z0-9]|[A-Za-z0-9][A-Za-z0-9-]*[A-Za-z0-9])$/"
Which Umami version are you using? (if relevant)
Chrome
Which browser are you using? (if relevant)
Chrome
How are you deploying your application? (if relevant)
Docker
Users use a pure v6 network environment
Need to resolve this issue
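To illustrate why a pure-IPv6 setup hits this: the quoted hostname pattern only allows letters, digits, hyphens and dots, so anything containing ':' fails validation. A quick standalone check (a sketch, not Umami code):

```python
import re

# The hostname pattern quoted in the error above (surrounding slashes removed).
HOSTNAME_RE = re.compile(
    r"^(([a-zA-Z0-9]|[a-zA-Z0-9][a-zA-Z0-9-]*[a-zA-Z0-9])\.)*"
    r"([A-Za-z0-9]|[A-Za-z0-9][A-Za-z0-9-]*[A-Za-z0-9])$"
)

print(bool(HOSTNAME_RE.match("example.com")))  # True: plain hostnames pass
print(bool(HOSTNAME_RE.match("2001:db8::1")))  # False: ':' is rejected, so IPv6 hosts get a 400
```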
|
gharchive/issue
| 2024-07-26T02:51:24 |
2025-04-01T04:36:10.401150
|
{
"authors": [
"QYG2297248353"
],
"repo": "umami-software/umami",
"url": "https://github.com/umami-software/umami/issues/2862",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
107222075
|
Getting the error: "Height must have an exact value or MATCH_PARENT" but I have all children as match_parent
Edited: The error is in the Height not in the Width
Hello! Thanks for this awesome lib. It works awesome in the demo code but I'm struggling when trying to include on my project. I'm getting the following error:
Height must have an exact value or MATCH_PARENT
It's descriptive enough to see what the problem might be. But I've already been trying for a few hours and nothing is fixed yet... I've even tried using the lib as a module to debug it. That didn't help; if I comment out the throw statement throw new IllegalStateException("Height must have an exact value or MATCH_PARENT"); it works fine, but it doesn't seem right to do so.
This is my relevant layout code. Worth to say that in the same file there are more elements surrounding this section (toolbar, another layouts like loading and error messages)
What am I doing wrong?
<com.sothree.slidinguppanel.SlidingUpPanelLayout xmlns:sothree="http://schemas.android.com/apk/res-auto"
android:id="@+id/sliding_layout"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:gravity="bottom"
sothree:umanoDragView="@+id/product_detail_preview"
sothree:umanoOverlay="true"
sothree:umanoPanelHeight="110dp"
sothree:umanoParalaxOffset="100dp"
sothree:umanoShadowHeight="4dp">
<RelativeLayout
android:id="@+id/map_content_container"
android:layout_width="match_parent"
android:layout_height="match_parent">
<LinearLayout
android:id="@+id/info_panel"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:layout_alignParentTop="true"
android:background="@color/white"
android:elevation="2dp"
android:orientation="horizontal"
android:weightSum="4">
<LinearLayout
android:layout_width="match_parent"
android:layout_height="match_parent"
android:layout_gravity="center_vertical"
android:layout_weight="2"
android:paddingLeft="16dp">
<TextView
android:id="@+id/product_detail_count_number"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:ellipsize="end"
android:gravity="center_vertical"
android:maxLines="2"
android:text="19000 Artikel ab 30000€ in 2200 durchsuchten Geschäften"
android:textColor="@color/black_trans50"
android:textSize="11sp"
android:textStyle="bold" />
</LinearLayout>
<LinearLayout
android:layout_width="match_parent"
android:layout_height="match_parent"
android:layout_gravity="center_vertical"
android:layout_weight="2"
android:gravity="center_vertical">
<TextView
android:id="@+id/show_available_only"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:background="?attr/selectableItemBackground"
android:clickable="true"
android:ellipsize="end"
android:focusable="true"
android:gravity="center_vertical"
android:maxLines="2"
android:paddingLeft="16dp"
android:text="Nur abholbereit Artikel ansehen"
android:textColor="@color/primary_700"
android:textSize="14sp" />
</LinearLayout>
</LinearLayout>
<fragment xmlns:tools="http://schemas.android.com/tools"
android:id="@+id/map"
android:name="com.google.android.gms.maps.SupportMapFragment"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:layout_below="@+id/info_panel"
android:layout_weight="1"
tools:context="de.locafox.mobile.presentation.ui.activities.detail.ProductDetailActivity" />
</RelativeLayout>
<LinearLayout
android:id="@+id/product_detail_preview"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:background="@color/white"
android:elevation="16dp"
android:orientation="horizontal"
android:paddingBottom="10dp"
android:paddingTop="10dp"
android:translationZ="16dp">
<ImageView
android:id="@+id/product_detail_image"
android:layout_width="wrap_content"
android:layout_height="match_parent"
android:layout_alignParentLeft="true"
android:paddingLeft="16dp"
android:paddingRight="8dp"
android:src="@drawable/demo" />
<LinearLayout
android:id="@+id/product_detail_title_container"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:background="@drawable/border_left"
android:orientation="vertical"
android:paddingTop="24dp">
<TextView
android:id="@+id/product_detail_title"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:ellipsize="end"
android:maxLines="2"
android:paddingBottom="10dp"
android:paddingLeft="8dp"
android:paddingRight="16dp"
android:text="Samsung WB 352 + 8GB MicroSD"
android:textSize="16sp" />
<TextView
android:id="@+id/product_detail_brand"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:paddingLeft="8dp"
android:paddingRight="16dp"
android:text="Samsung"
android:textAllCaps="true"
android:textSize="12sp"
android:textStyle="bold" />
</LinearLayout>
</LinearLayout>
</com.sothree.slidinguppanel.SlidingUpPanelLayout>
Thanks! :)
I have the same problem. Did you find any solution?
@Shashank066 yes. My problem was with the "AppBarLayout".
My layout was as follows:
CoordinatorLayout
AppBarLayout (w: match_parent, h: wrap_content)
Toolbar
/AppBarLayout
LinearLayout
more stuff (not relevant)
SlidingUpPanelLayout
main content (w: match_parent, h: match_parent)
sliding content (w: match_parent, h: 110dp)
/SlidingUpPanelLayout
more stuff (not relevant)
/LinearLayout
/CoordinatorLayout
By removing AppBarLayout the problem was fixed. Lucky for us we don't need it in this screen. We've also replaced the CoordinatorLayout, but it was working fine with it.
I hope I will have some time during this week to document and test this properly.
Thank you, that fixed it.
What can I do if I cannot get rid of AppBar Layout?
|
gharchive/issue
| 2015-09-18T15:16:49 |
2025-04-01T04:36:10.411556
|
{
"authors": [
"DoM4",
"Shashank066",
"albertogarrido"
],
"repo": "umano/AndroidSlidingUpPanel",
"url": "https://github.com/umano/AndroidSlidingUpPanel/issues/541",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
95028561
|
Path fixes
I'm not sure these shell files are even used, to be honest. But I was seeing some weird behavior when Ansible first installs Redis and thought that maybe this would fix it.
The behavior I was seeing was that redis would start, but it would not use our config file. This would put it in a state where Redis was running, but could not be shut down. The only way to fix the problem was to KILL the process and then restart it with the correct config file.
Since these functions have the wrong path for the config file, they are certainly suspect. After making this change I rebuilt a server from scratch & Redis seemed to be behaving.
At some point I think we should write our own Ansible role for Redis. This one has a bunch of behavior that I don't think we need. A version tailored for OIT VMs would be easier to maintain.
|
gharchive/pull-request
| 2015-07-14T20:07:45 |
2025-04-01T04:36:10.489978
|
{
"authors": [
"IanWhitney"
],
"repo": "umn-asr/ansible-redis",
"url": "https://github.com/umn-asr/ansible-redis/pull/1",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
2427686937
|
ignore pnpm-lock.yaml
After running pnpm i, this extra file appears; I'd like it to be ignored.
You can just choose to ignore it yourself. In general, it is better for this file to exist.
What I find inconvenient is that if I want to pull your latest updates regularly without having commits of my own, running the project locally will always produce a diff in the lock file.
|
gharchive/issue
| 2024-07-24T14:03:31 |
2025-04-01T04:36:10.510576
|
{
"authors": [
"LeiYangGH",
"pany-ang"
],
"repo": "un-pany/v3-admin-vite",
"url": "https://github.com/un-pany/v3-admin-vite/issues/202",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2116533750
|
187 - Optimize pytorch weight loading on startup + dependencies update
Description
Fixes PyTorch auto-downloading models on node start even if they are not used (which causes a significant delay until the node is ready to process).
Dependencies updated to fix building issues (b5 update) and have newest versions of PyTorch/Ultralytics and other libraries.
Fixes #187
Type of change
Bug fix (non-breaking change which fixes an issue)
Checklist:
[x] My code follows the style guidelines of this project
[x] I have performed a self-review of my own code
[x] I have commented my code, particularly in hard-to-understand areas
[x] My changes generate no new warnings
Delay reduced to ~0.5s on my machine. Won't be at university until Thursday due to work - would appreciate a quick test of this fix on the lab PCs to see if it works :) Thanks
Here is a short video of the delay; it is about 1-2 seconds when looking at the ROS time.
delay.webm
@samuelkuehnel Thanks for testing. I saw that with the recent additions in the VisionNode by the LIDAR/distance features by Leon, the overall calculation time increased. It's ~50ms (20 FPS) on my PC, looks like it's more like 200ms on the lab PCs. I can try to check if I can do anything to fix this in the VisionNode without breaking the distance/LIDAR stuff, but I doubt it.
Added further measures to ensure up-to-date classification data:
Traffic light detection is handled asynchronously now (-> independent of distance calculation)
Switch to the second-best classification model (YOLO RTDETR-L instead of RTDETR-X), which is ~54% faster with only slightly worse accuracy according to Ultralytics
Overall, the timings of the VisionNode changed like this (tested on a 4070 Ti):
Before:
Classification: ~50ms
Traffic light detection & distance calc: ~22ms
Total: ~72ms -> 13.9 FPS
Now:
Classification: ~40ms
Traffic light detection: asynchronously now, doesn't wait for distance calc
Distance calc: ~15ms
Total: ~60ms -> 16.7 FPS
Using the NAS models was not possible because of (hours of) dependency hell with Ultralytics + Super-Gradients... also, the RT-DETR model converted to a much faster format like TensorRT is not loadable with Ultralytics.
Dependencies changed, please run b5 update! :)
5 days ago, a new/improved version of the standard YOLO model has been released: https://github.com/WongKinYiu/yolov9
This may be a candidate to try if the above changes are not sufficient performance wise.
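As a generic illustration of the start-up fix described here (defer loading or auto-downloading weights until the model is first needed), a sketch that assumes the Ultralytics RT-DETR-L model mentioned above; it is not the project's actual code:

```python
from functools import lru_cache

@lru_cache(maxsize=1)
def get_model():
    """Create the detector on first use instead of at node start-up."""
    # Importing/constructing here means the (possibly auto-downloaded) weights
    # are only fetched when a frame actually needs to be classified.
    from ultralytics import RTDETR  # assumption: Ultralytics RT-DETR-L, as discussed above
    return RTDETR("rtdetr-l.pt")

def classify(frame):
    # First call pays the loading cost; later calls reuse the cached model.
    return get_model()(frame)
```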
|
gharchive/pull-request
| 2024-02-03T15:30:01 |
2025-04-01T04:36:10.520675
|
{
"authors": [
"MaxJa4",
"samuelkuehnel"
],
"repo": "una-auxme/paf23",
"url": "https://github.com/una-auxme/paf23/pull/193",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2654004609
|
Task (Backend): Updating requirements for the backend
What was done?
The requirements.txt file was added, containing the requirements needed to run the backend prototype.
The file lets the requirements be installed in a simple way, using the following command in the terminal:
pip install -r requirements.txt
Great work!
|
gharchive/pull-request
| 2024-11-13T03:19:00 |
2025-04-01T04:36:10.522609
|
{
"authors": [
"314dro",
"TiagoBalieiro"
],
"repo": "unb-mds/2024-2-AcheiUnB",
"url": "https://github.com/unb-mds/2024-2-AcheiUnB/pull/34",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2208483414
|
Squad 3 - Proposed topics
Development of a Feed of Procurement Waivers in Brasília: Create an online feed to analyze and highlight procurement waivers in the official gazettes of Brasília, aiming to increase transparency and oversight of government spending. The waiver gazettes will be collected from the "Querido Diário" website. In addition, the group will later develop a bot that will make periodic posts on X (formerly Twitter), publicizing new procurement-waiver gazettes.
Creation of a University Groups Platform: Improve and expand the course-group system for university courses at the University of Brasília, providing a more efficient platform for organization and interaction among students. Develop a platform from scratch to facilitate organization and interaction among University of Brasília (UnB) students through course groups, aiming to promote collaboration and the sharing of educational resources.
Automated Document Generation Platform: A platform will be developed that allows users to create and fill in customized documents automatically. Based on a document template and a data spreadsheet provided by the user, the site will automatically create and fill in various document templates according to the spreadsheet data. Initially the application will contain simple templates, with the possibility of expanding to more advanced functionality.
Chosen project: Development of the Feed of Procurement Waivers in Brasília
|
gharchive/issue
| 2024-03-26T14:51:47 |
2025-04-01T04:36:10.525745
|
{
"authors": [
"MariaCHelena",
"RochaCarla"
],
"repo": "unb-mds/Qualifying-Software-Engineers-Undergraduates-in-DevOps",
"url": "https://github.com/unb-mds/Qualifying-Software-Engineers-Undergraduates-in-DevOps/issues/6",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2312437039
|
tokenizer version in requirements.txt
When I run pip install -e .[all], I encounter the error
ERROR: Cannot install Crawl4AI, None, crawl4ai[all]==0.2.1 and transformers==4.40.2 because these package versions have conflicting dependencies.
The conflict is caused by:
crawl4ai 0.2.1 depends on tokenizers==0.13.2
litellm 1.37.11 depends on tokenizers
crawl4ai[all] 0.2.1 depends on tokenizers==0.13.2
crawl4ai[all] 0.2.1 depends on tokenizers==0.13.2
transformers 4.40.2 depends on tokenizers<0.20 and >=0.19
Installation succeeds when I remove the tokenizers version in requirements.txt.
Please modify requirements.txt and try again, and let me know whether it works or not. If not, I'll modify setup.py to fix it. Also please share your OS, CPU, Python version and platform details.
Remove the version specification for tokenizers from requirements.txt and handle it in setup.py based on the context.
aiohttp==3.9.5
aiosqlite==0.20.0
bs4==0.0.2
fastapi==0.111.0
html2text
httpx==0.27.0
litellm==1.37.11
nltk==3.8.1
pydantic==2.7.1
python-dotenv==1.0.1
requests==2.31.0
rich==13.7.1
scikit-learn==1.4.2
selenium==4.20.0
uvicorn==0.29.0
transformers==4.40.2
chromedriver-autoinstaller==0.6.4
torch
onnxruntime
I just removed the version specification of tokenizers and the problem is solved.
aiohttp==3.9.5
aiosqlite==0.20.0
bs4==0.0.2
fastapi==0.111.0
html2text
httpx==0.27.0
litellm==1.37.11
nltk==3.8.1
pydantic==2.7.1
python-dotenv==1.0.1
requests==2.31.0
rich==13.7.1
scikit-learn==1.4.2
selenium==4.20.0
uvicorn==0.29.0
transformers==4.40.2
chromedriver-autoinstaller==0.6.4
torch
onnxruntime
tokenizers
|
gharchive/issue
| 2024-05-23T09:36:55 |
2025-04-01T04:36:10.542143
|
{
"authors": [
"YitaoLiu1996",
"unclecode"
],
"repo": "unclecode/crawl4ai",
"url": "https://github.com/unclecode/crawl4ai/issues/8",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2439731007
|
Optimize the IP-range merging program and update the merged IP-range results
As titled.
Remove IP addresses that do not belong to China
Add one Zhenjiang Unicom IP range
@LeterTW After merging I temporarily removed your script main_2.py. When you have time, could you update the script to add sorting? Your sorting logic is more reasonable.
@LeterTW After merging I temporarily removed your script main_2.py. When you have time, could you update the script to add sorting? Your sorting logic is more reasonable.
I've updated the sorting script; it's still very rough. If you're interested, feel free to open a PR to improve it, e.g. adding format validation ^_^
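For what it's worth, a rough sketch of such a sort/merge pass using Python's standard ipaddress module (not main_2.py itself; the input file name is hypothetical and IPv4-only input is assumed):

```python
import ipaddress

def merge_and_sort(lines):
    """Collapse overlapping or adjacent IPv4 ranges and return them sorted."""
    networks = [
        ipaddress.ip_network(line.strip(), strict=False)  # strict=False tolerates host bits
        for line in lines
        if line.strip() and not line.startswith("#")
    ]
    return sorted(ipaddress.collapse_addresses(networks))

with open("ip-ranges.txt") as f:   # hypothetical input file
    for net in merge_and_sort(f):
        print(net)
```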
|
gharchive/pull-request
| 2024-07-31T10:43:31 |
2025-04-01T04:36:10.544065
|
{
"authors": [
"LeterTW",
"unclemcz"
],
"repo": "unclemcz/ban-pcdn-ip",
"url": "https://github.com/unclemcz/ban-pcdn-ip/pull/5",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
240026074
|
Problem Android
I've tried the example code from the post, but on Android it shows no sign of life.
It will not start.
Hey @Angelk90, I added a basic example here:
https://github.com/underscopeio/react-native-floating-hearts/tree/master/examples/basic
Try to run it and if it doesn't work please submit an issue following the issue template I just added to the project.
Thanks.
@ignacioola, @stevelacy: There are two problems:
It does not work on Android; nothing happens. I looked at the example too.
prop-types must be imported with import PropTypes from 'prop-types';
|
gharchive/issue
| 2017-07-02T18:23:55 |
2025-04-01T04:36:10.549681
|
{
"authors": [
"Angelk90",
"ignacioola"
],
"repo": "underscopeio/react-native-floating-hearts",
"url": "https://github.com/underscopeio/react-native-floating-hearts/issues/5",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
744235236
|
Highlights are lost when Occurrence number is changed back from 0 to 1
Highlights are lost when occurrence number is changed back from 0 to 1 or changed by removing the Occurrence Number and adding back.
Open tN and go to a note that has a highlight.
Change the Occurrence number from 1 to 0. Highlight is removed as expected.
Change the Occurrence number back to 1. Note that OriginalQuote cannot be highlighted.
I'm no longer seeing this @elsylambert. I'm adding a re-test tag.
|
gharchive/issue
| 2020-11-16T22:21:02 |
2025-04-01T04:36:10.575191
|
{
"authors": [
"birchamp",
"elsylambert"
],
"repo": "unfoldingWord/tc-create-app",
"url": "https://github.com/unfoldingWord/tc-create-app/issues/521",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1833919278
|
upgrades to webpack@5 and enforces nodejs >= 18
Describe what your pull request addresses
See #121
Upgrades to use webpack@5 and enforces the use of nodejs 18
@unfoldingWord/devs does anyone know how the CI for netlify is setup? I need to change its dev process to use Node 18 instead of Node 16
|
gharchive/pull-request
| 2023-08-02T21:20:57 |
2025-04-01T04:36:10.578981
|
{
"authors": [
"theNerd247"
],
"repo": "unfoldingWord/translation-helps-rcl",
"url": "https://github.com/unfoldingWord/translation-helps-rcl/pull/123",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1955557017
|
🛑 Unforest is down
In 871c594, Unforest (https://www.unforest.net) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Unforest is back up in c897aee after 13 minutes.
|
gharchive/issue
| 2023-10-21T16:15:56 |
2025-04-01T04:36:10.581302
|
{
"authors": [
"unforest"
],
"repo": "unforest/uptime",
"url": "https://github.com/unforest/uptime/issues/1669",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1971629846
|
🛑 Piped is down
In e4a4d16, Piped (https://piped.unforest.net) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Piped is back up in 067b94a after 6 minutes.
|
gharchive/issue
| 2023-11-01T03:28:55 |
2025-04-01T04:36:10.583566
|
{
"authors": [
"unforest"
],
"repo": "unforest/uptime",
"url": "https://github.com/unforest/uptime/issues/3936",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1944403302
|
🛑 Piped is down
In 7a35ba2, Piped (https://piped.unforest.net) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Piped is back up in bddda1f after 43 minutes.
|
gharchive/issue
| 2023-10-16T05:36:34 |
2025-04-01T04:36:10.586067
|
{
"authors": [
"unforest"
],
"repo": "unforest/uptime",
"url": "https://github.com/unforest/uptime/issues/471",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1989293727
|
🛑 Nitter is down
In f3df567, Nitter (https://nitter.unforest.net) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Nitter is back up in 0ffc64d after 57 minutes.
|
gharchive/issue
| 2023-11-12T07:54:58 |
2025-04-01T04:36:10.588350
|
{
"authors": [
"unforest"
],
"repo": "unforest/uptime",
"url": "https://github.com/unforest/uptime/issues/6295",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1992363420
|
🛑 Piped API is down
In b6f75c0, Piped API (https://pipedapi.unforest.net) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Piped API is back up in a7ea879 after 5 minutes.
|
gharchive/issue
| 2023-11-14T09:39:18 |
2025-04-01T04:36:10.590854
|
{
"authors": [
"unforest"
],
"repo": "unforest/uptime",
"url": "https://github.com/unforest/uptime/issues/6732",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2389740546
|
Domain editing to qjz9zk is done even for displaying help URLs!
OS/Platform
Debian, Ubuntu, and derivatives
Installed
https://ungoogled-software.github.io/ungoogled-chromium-binaries/
Version
Ubuntu Lunar (112.0.5615.165)
Have you tested that this is not an upstream issue or an issue with your configuration?
[ ] I have tried reproducing this issue in Chrome and it could not be reproduced there
[ ] I have tried reproducing this issue in vanilla Chromium and it could not be reproduced there
[ ] I have tried reproducing this issue in ungoogled-chromium with a new and empty profile using --user-data-dir command line argument and it could not be reproduced there
Description
Starting it up produces an error message about "No usable sandbox"
How to Reproduce?
Just start chromium in a terminal window and see the report.
Actual behaviour
This:
[parent]: chromium
[32549:32549:0704/021456.512354:FATAL:zygote_host_impl_linux.cc(127)] No usable sandbox! Update your kernel or see https://chromium.9oo91esource.qjz9zk/chromium/src/+/main/docs/linux/suid_sandbox_development.md for more information on developing with the SUID sandbox. If you want to live dangerously and need an immediate workaround, you can try using --no-sandbox.
Trace/breakpoint trap (core dumped)
Expected behaviour
The problem isn't the "No usable sandbox".
It is the fact that the message refers to:
https://chromium.9oo91esource.qjz9zk/chromium/src/+/main/docs/linux/suid_sandbox_development.md
which is unusable, as the printed URL has suffered from the qjz9zk domain substitution.
Since this is just being displayed (not used) it should display the correct value.
Relevant log output
No response
Additional context
No response
This is expected, the domain substitution tooling cannot determine how URLs are used
I hope your question(-s) has(-ve) been answered, otherwise please let us know.
Closing this issue for now.
|
gharchive/issue
| 2024-07-04T01:22:14 |
2025-04-01T04:36:10.602679
|
{
"authors": [
"PF4Public",
"networkException",
"original-birdman"
],
"repo": "ungoogled-software/ungoogled-chromium",
"url": "https://github.com/ungoogled-software/ungoogled-chromium/issues/2937",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
327873626
|
noColor and fixedAmbient added
This adds two new options to the illumination prior.
The prior generates varied illumination with diverse colour. Those variations might be too wild for some applications.
A new switch "no-color" removes all colour variation and returns the average over all colour channels per illumination component.
The first component of the Spherical Harmonics changes the ambient part of the illumination. This variation might also be unnecessary for some applications. Therefore a new switch "fixAmbient" allows setting this value explicitly to the value "fixAmbientValue".
Both switches keep the empirical distribution of the Basel Illumination Prior but limit its flexibility.
Yes it breaks the config file.
Every change so far did this.
Since example config files are part of the repository there is always a config file that works with the current version available.
The releases also contain the corresponding config files - so I would argue that this should work for everybody.
What about adding something like this to the readme?:
"Config files are not compatible between versions. When new features are added this usually leads to additional fields in the respective config file. However, the example config files are part of the repository and we keep them running when updating - those can be used as starting point or as clue to adapt your config file."
I really think it does not make sense to try to list those changes - it would also be very hard to read/implement them. The error messages about missing values in the used config file are much more supportive when changing between versions.
I think this information does not help much. But maybe it is really the best thing not to document it. Otherwise, we would describe how to go to the immediate next version, but when you skip a version you already have to combine two updates. A diff in the git history can show you a more appropriate way how to update your configs.
So as indicated I am fine with merging without a change to the README.
I think we run into the general problem that the config files get filled with information that is actually irrelevant for the audience that this project was initially made for (Deep Learning for facial image analysis) and is more and more filled with things that are useful but actually might not be used by many people in the future. What about using two config files? One general file that offers access to the most important parameters and one advanced file that allows for a more detailed configuration?
You are right - it is getting filled. So far I think it should still be possible to read through all of it in reasonable time. My concern with what you are proposing is the following:
Indeed most people will only use a subset of those configurations. I however think that that subset might not always be the same - so there will be no commonly agreed set of important parameters.
It should be reasonable to spend some time reading through all the configuration possibilities when you want to use it - and I think we try to keep the example configuration such that it is a nice working example with only the most important parameters - the too-fancy stuff is off...
Implementing your idea might make it more complicated rather than simpler (settings: random, controlled, randomAdvanced, controlledAdvanced, with all their different configurations or hidden switches).
I think a smart ordering within the config file could help that people see the most important options first. It might be already like this. But, they might be inspired, what else they can try out when reading all the options. Maybe there is even something they want but haven't yet thought about it. As in the provided config files the options have a default value they do not need to know every option when starting.
I agree that the ordering should not be too bad. I checked the ordering and from my perspective it already makes sense. I however still think that one should go through and check all the options available - and you are right, that also shows everybody what they can actually do.
In contrast to this config file, a fancy GUI would of course be cool - but most people might run it on servers - so perhaps a GUI to write the config file? I would say that the config file is at the moment to small to really profit from a GUI, but it could be an option if we realize that it grows and grows in future.
However we handle this now or in future - I think this discussion is now slightly unrelated to the original PR :)
|
gharchive/pull-request
| 2018-05-30T19:47:49 |
2025-04-01T04:36:10.610099
|
{
"authors": [
"AdamAkaAK",
"Andreas-Forster",
"BernhardEgger"
],
"repo": "unibas-gravis/parametric-face-image-generator",
"url": "https://github.com/unibas-gravis/parametric-face-image-generator/pull/15",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2625616063
|
www.panoptis.network
https://www.panoptis.network/
https://github.com/unicornonea/tron-dapp-react/issues/4
|
gharchive/issue
| 2024-10-30T23:52:05 |
2025-04-01T04:36:10.719071
|
{
"authors": [
"JPaikkari7"
],
"repo": "unicornonea/tron-dapp-react",
"url": "https://github.com/unicornonea/tron-dapp-react/issues/4",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
595712822
|
Allow reading password from file
It's common to store passwords in files rather than environment variables. For example Docker Secrets are mounted into containers as a file. Files are often preferred over environment variables as lots of applications will dump environment variables for debug and secrets can sometimes leak that way.
As an example Grafana supports this: https://grafana.com/docs/grafana/latest/installation/configure-docker/
The application supports a config file, up.conf. It can be formatted in xml, json, yaml or toml (default). Your passwords can go in there, or in environment variables.
Thanks for the response. I understand that, however Docker orchestration tools (Kubernetes and Docker Swarm) prefer to inject secrets directly into a file where the contents of the file is the value of the secret.
In the case of Docker Swarm that is the only option for handling secrets. If I put it as an environment variable in Docker Swarm it gets exposed in a lot of ways it shouldn't.
You've done a great job of describing the solution you'd like to have, and it may be something that gets implemented. I'm a firm believer that every problem has more than one solution.
It seems that you can publish a Docker Secret using a snippet of the unifi-poller config file with any secrets you need to hide, like this:
[unifi.defaults]
pass = "mySecretPassword"
Then map it into /etc/unifi-poller/up.conf. You can then use environment variables to fill in the rest of the config data. Does that solve your use case?
Did putting the secrets in the config file work?
Hello @regner,
Will the solution described here work for your use case?
https://github.com/unifi-poller/inputunifi/pull/1
Thanks!
I've confirmed this approach is compatible with Docker Swarm and Kubernetes secrets distribution process. This will now work for the UniFi controller password and the InfluxDB password. Instead of setting a password, you can set a file path prefixed with file://. Prefixing a password with file:// makes poller treat it as a password file path instead of an actual password. This feature will be released with v2.0.1 soon.
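For illustration, a minimal up.conf snippet using this prefix might look like the following (the secret path is an assumption based on Docker Swarm mounting secrets under /run/secrets; per the comment above, the same file:// form also works for the InfluxDB password):
[unifi.defaults]
  # poller treats the file:// value as a path and reads the real password from that file
  pass = "file:///run/secrets/unifi_password"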
|
gharchive/issue
| 2020-04-07T09:06:46 |
2025-04-01T04:36:10.724378
|
{
"authors": [
"davidnewhall",
"regner"
],
"repo": "unifi-poller/unifi-poller",
"url": "https://github.com/unifi-poller/unifi-poller/issues/206",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
758094997
|
Unifi Network Controller 6 support?
Hi,
Not an issue as such, but I am interested if Network Controller 6 will work with this great codebase?
I have seen some anecdotal evidence on the forums that this does indeed work but note that this is not listed as a tested environment.
Thanks,
Graham
Sounds shiny and new. I've never used it. Please let me know if it works for you. :)
OK, I will report back once I upgrade, thanks @davidnewhall
Standing by for your report!
Because there hasn't been a reply yet:
I can confirm this works on my end with a UDM-PRO running controller version 6
Thank you for confirming!
|
gharchive/issue
| 2020-12-07T02:04:52 |
2025-04-01T04:36:10.727399
|
{
"authors": [
"Ornias1993",
"davidnewhall",
"ggear"
],
"repo": "unifi-poller/unifi-poller",
"url": "https://github.com/unifi-poller/unifi-poller/issues/284",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
123965609
|
html5
Reading images with the File API
See HTML5: The Missing Manual, 2nd edition, page 332.
Hide the file input (it is too ugly) and add an onclick handler to the dashed-border area that calls the file input's click method; this opens the file-selection dialog. Listen for the file input's change event beforehand and use the HTML5 File API to process the selected file.
<div data-role="content">
<div class="avatar">
<div id="avatarImage"></div>
</div>
<input type="file" id="inputfile" class="inputfile"/>
</div>
read image
$('#inputfile').on('change', function () {
var files = this.files;
var file = files[0];
var fileReader = new FileReader();
fileReader.onload = function (e) {
$('#avatarImage').css('background-image', 'url("' + e.target.result + '")');
};
fileReader.readAsDataURL(file);
});
$('#avatarImage').on('click', function () {
$('#inputfile').click();
})
css
#avatarImage {
width: 100px;
height: 100px;
border: dashed #eeeeee 2px;
border-radius: 50%;
background-size: 100%;
background-repeat: no-repeat;
color: gray;
text-align: center;
}
#inputfile {
display: none;
}
Web Components
https://css-tricks.com/modular-future-web-components/
|
gharchive/issue
| 2015-12-27T07:07:14 |
2025-04-01T04:36:10.739313
|
{
"authors": [
"uniquejava"
],
"repo": "uniquejava/blog",
"url": "https://github.com/uniquejava/blog/issues/11",
"license": "cc0-1.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
2665142324
|
Remove references to needing Rosetta on Apple Silicon
Now that UCM version 0.5.28 has native builds for Apple Silicon. Remove references to needing Rosetta from the homepage.
I believe this has already been taken care of by someone else.
|
gharchive/pull-request
| 2024-11-17T01:37:03 |
2025-04-01T04:36:10.744629
|
{
"authors": [
"jturner"
],
"repo": "unisonweb/website",
"url": "https://github.com/unisonweb/website/pull/81",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
801413961
|
Rework /averageCleaningTime, /averageChangeovertime, CalculateAverageStateTime() to be compatible with new datamodel
These functions are currently not working as the old datamodel is still hardcoded. Furthermore, change CalculateAverageStateTime() to include sub-states as well (e.g. 100000 is ChangeoverState and this function should include 100001 as well)
Is this still relevant @JeremyTheocharis ?
|
gharchive/issue
| 2021-02-04T15:56:14 |
2025-04-01T04:36:10.762730
|
{
"authors": [
"JeremyTheocharis",
"Scarjit"
],
"repo": "united-manufacturing-hub/united-manufacturing-hub",
"url": "https://github.com/united-manufacturing-hub/united-manufacturing-hub/issues/93",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1235387204
|
0.9.4-prerelease2
This pr tracks the progress to 0.9.4-prerelease2
Testing is blocked, due to my being unable to find the k3os usb stick :(
Checks:
[X] Installs without errors
[X] All default services start
[ ] Enabling barcode reader works
[X] Enabling datainput works
[X] Enabling barcodereader works
[X] Enabling cameraconnect works
@JeremyTheocharis requires manual merge, kafka dev private keys kill gitguardian
|
gharchive/pull-request
| 2022-05-13T15:23:03 |
2025-04-01T04:36:10.765262
|
{
"authors": [
"Scarjit"
],
"repo": "united-manufacturing-hub/united-manufacturing-hub",
"url": "https://github.com/united-manufacturing-hub/united-manufacturing-hub/pull/1096",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1527936315
|
YustTextField showSelected not consistent
The property 'showSelected' in the class YustTextField should also affect icons if set to false. If set to true, it currently affects only the properties 'label' and 'prefixIcon'.
We do not need the property showSelected.
|
gharchive/issue
| 2023-01-10T20:03:52 |
2025-04-01T04:36:10.775055
|
{
"authors": [
"janneskoehler",
"philthefox"
],
"repo": "univelop/yust_ui",
"url": "https://github.com/univelop/yust_ui/issues/57",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
2651436515
|
🛑 happysooner.com is down
In f076a32, happysooner.com (https://happysooner.com) was down:
HTTP code: 0
Response time: 0 ms
Resolved: happysooner.com is back up in 64a8b13 after 11 minutes.
|
gharchive/issue
| 2024-11-12T08:36:11 |
2025-04-01T04:36:10.847554
|
{
"authors": [
"unliar"
],
"repo": "unliar/happy-upptime",
"url": "https://github.com/unliar/happy-upptime/issues/209",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2544879363
|
runtime does not work correctly with arbitrary selectors
UnoCSS version
0.62.4
Describe the bug
@unocss/runtime does not work properly with "arbitrary" selectors ([&_th]:sticky). If you call extractAll() by hand it will work as expected, but the mutation observer alone will not generate correct styles. Note that the page load calls extractAll() first so the bug might be hard to notice if you are just refreshing the page. But if you have anything async on the page, you might hit this.
I have also found the cause, it's because extractAll() calls decodeHtml() underneath, but the observer does not. The el.outerHTML returns escaped attributes, so obviously it cannot work.
No need to rush with the fix, I found a workaround so I don't need a fix right now. But given that it took me a while to figure out and I might not be alone, I am submitting this for others.
Related lines:
https://github.com/unocss/unocss/blob/35297359bf61917bda499db86e3728a7ebd05d6c/packages/runtime/src/index.ts#L254
https://github.com/unocss/unocss/blob/35297359bf61917bda499db86e3728a7ebd05d6c/packages/runtime/src/index.ts#L289
Reproduction
See above.
System Info
No response
Validations
[X] Read the Contributing Guidelines.
[X] Check that there isn't already an issue that reports the same bug to avoid creating a duplicate.
[X] Check that this is a concrete bug. For Q&A open a GitHub Discussion or join our Discord Chat Server.
[X] The provided reproduction is a minimal reproducible example of the bug.
Found another possibly related symptom https://github.com/unocss/unocss/issues/4337
|
gharchive/issue
| 2024-09-24T09:36:07 |
2025-04-01T04:36:10.865975
|
{
"authors": [
"ackvf",
"cztomsik"
],
"repo": "unocss/unocss",
"url": "https://github.com/unocss/unocss/issues/4158",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1246774765
|
[Android] Extra setup for Uno.Material ToggleSwitch dark mode colors is not working
Current behavior
When completing step 2 of the Uno.Material extra setup for the ToggleSwitch style, the thumb tint color of the control stays the same for dark and light mode. Here's the documentation : https://github.com/unoplatform/uno/blob/master/doc/articles/features/uno-material-controls-extra-setup.md
Expected behavior
The MaterialPrimaryColor and MaterialSurfaceVariantColor defined in Resources/values-night/colors should appear as the thumb tint color of the ToggleSwitch when the app is in dark mode.
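For reference, a minimal sketch of that resource file (only the resource names come from the setup doc; the color values here are placeholders):
<!-- Resources/values-night/colors.xml -->
<resources>
    <color name="MaterialPrimaryColor">#BB86FC</color>
    <color name="MaterialSurfaceVariantColor">#373737</color>
</resources>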
How to reproduce it (as minimally and precisely as possible)
Launch the repro on Android :
ToggleSwitch.zip
Skip the onboarding pages.
Login with any email and password.
On the bottom bar, select the profile/settings section.
On the profile page, turn on the switch and notice the thumb tint color.
Click on the button "Toggle between themes" to switch to dark mode and notice that the thumb color tint is the same as in light mode. It should be the colors defined in Resources/values-night/colors.xml.
Environment
Nuget Package:
Package Version(s):
Uno.Material : 1.3.3
Uno.UI : 4.2.6
Affected platform(s):
[ ] iOS
[x] Android
[ ] WebAssembly
[ ] UWP
[ ] MacOS
Anything else we need to know?
This extra setup should no longer be needed. The MaterialToggleSwitchStyle from Uno.Material v2.x should be working as desired.
If you're still experiencing issue with the latest styles, feel free to open an issue on the Themes repo, thanks!
|
gharchive/issue
| 2022-05-24T15:58:06 |
2025-04-01T04:36:10.873953
|
{
"authors": [
"arianeleonard",
"kazo0"
],
"repo": "unoplatform/Uno.Themes",
"url": "https://github.com/unoplatform/Uno.Themes/issues/784",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2710152384
|
Unit Tests for Farm
I tried to write a unit test for Farm, but failed. No idea how to use Farm JS api to build with the plugin. 🤔
/cc @ErKeLost Could you please help me?
Okay, I'll finish this job
hi @sxzz
The recent focus is on the development of farm 2.0. The 2.0 API will have breaking changes, so I want to add this test when we start maintaining the nightly version of 2.0. It is expected to be released by the end of the year.
Nice, good to see it!
|
gharchive/issue
| 2024-12-02T00:58:13 |
2025-04-01T04:36:11.086921
|
{
"authors": [
"ErKeLost",
"sxzz"
],
"repo": "unplugin/unplugin-isolated-decl",
"url": "https://github.com/unplugin/unplugin-isolated-decl/issues/41",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2714470069
|
feat: oxc parser
Description
Use oxc-parser to replace @babel/parser.
Linked Issues
#48
Additional context
Waiting for the upstream issue to be resolved:
https://github.com/oxc-project/oxc/issues/7508
|
gharchive/pull-request
| 2024-12-03T09:33:38 |
2025-04-01T04:36:11.089215
|
{
"authors": [
"yuyinws"
],
"repo": "unplugin/unplugin-turbo-console",
"url": "https://github.com/unplugin/unplugin-turbo-console/pull/52",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1622283057
|
[Action]: GitHub Comment on an issue
Contact Details
No response
Action Name
GitHub Comment on an issue
Action Inputs
owner
repo
issue_number
comment
Action Outputs
"comment added" on success
Comments
I have an early version working
Code of Conduct
[X] I agree to follow this project's Code of Conduct
this was completed in PR 366
|
gharchive/issue
| 2023-03-13T21:21:09 |
2025-04-01T04:36:11.093587
|
{
"authors": [
"dougsillars"
],
"repo": "unskript/Awesome-CloudOps-Automation",
"url": "https://github.com/unskript/Awesome-CloudOps-Automation/issues/346",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2638948080
|
Fix: cast logits to float32 in cross_entropy_forward to prevent errors
#1251 mentioned that there are different data types between branches. Therefore we need to upcast to float32.
I saw other logits calculations also use tl.float32, so this should be correct.
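As a rough PyTorch-level illustration of why the upcast matters (this is only a sketch of the dtype issue, not the Triton kernel changed in this PR):
import torch
import torch.nn.functional as F

# bf16 logits, as produced under mixed-precision training
logits = torch.randn(4, 32000, dtype=torch.bfloat16)
labels = torch.randint(0, 32000, (4,))

# upcasting to float32 before the cross-entropy math keeps both code paths in the same dtype
loss = F.cross_entropy(logits.float(), labels)
print(loss.dtype, loss.item())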
The default continued-pretraining run gives this amount of VRAM, so I think this shouldn't increase usage either .-.
tested on colab too
I am facing this issue in this notebook:
https://colab.research.google.com/drive/1lN6hPQveB_mHSnTOYifygFcrO8C1bxq4?usp=sharing#scrollTo=yqxqAZ7KJ4oL
|
gharchive/pull-request
| 2024-11-06T18:51:55 |
2025-04-01T04:36:11.096696
|
{
"authors": [
"Erland366",
"itshahmir"
],
"repo": "unslothai/unsloth",
"url": "https://github.com/unslothai/unsloth/pull/1254",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
55807771
|
Use Linux' O_TMPFILE for tmpfiles
https://github.com/untitaker/rust-atomicwrites/issues/2
Unsure how to call linkat in Python.
https://docs.python.org/3/library/os.html#os.O_TMPFILE
Something like this seems to work since Python 3.4:
import os
fd = os.open('.', os.O_TMPFILE | os.O_RDWR)
path = '/proc/self/fd/{0}'.format(fd)
os.link(path, 'spam', src_dir_fd=0, follow_symlinks=True)
@jwilk That doesn't seem particularly performant or elegant though. It seems that there is a package on PyPI that provides bindings for this call, however, that would be quite a microdependency.
AFAICT NamedTemporaryFile already sets this flag
I think you meant (unnamed) TemporaryFile, which indeed tries to use O_TMPFILE since Python 3.5:
https://bugs.python.org/issue21515
|
gharchive/issue
| 2015-01-28T20:04:16 |
2025-04-01T04:36:11.116327
|
{
"authors": [
"AndCycle",
"earonesty",
"jwilk",
"untitaker"
],
"repo": "untitaker/python-atomicwrites",
"url": "https://github.com/untitaker/python-atomicwrites/issues/2",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
1056210466
|
UMCS-319 - Fehler in Java SDK beim Parsen des Datums
Changed date formatting so dates are parsed and formatted with UTC as the base and the correct time shift.
Tests were added.
|
gharchive/pull-request
| 2021-11-17T14:40:07 |
2025-04-01T04:36:11.128007
|
{
"authors": [
"DerAusHeidelberg"
],
"repo": "unzerdev/java-sdk",
"url": "https://github.com/unzerdev/java-sdk/pull/44",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
407272504
|
Update bignumber.js@8.0.2
This will enable projects that depend on validator.js-asserts to update their versions of bignumber.js.
Note: this PR introduces a breaking change because of how the bignumber.js interface changed between versions. Releasing it should require a major version bump.
cc/ @ricardobcl
This is pretty much what I first did on https://github.com/uphold/validator.js-asserts/pull/174.
After talking with @franciscocardoso, we thought it could be useful to support toggling the validation of significant digits in this assert.
The use case is that you could validate that a value was a big number, even if it had more than 15 significant digits, since you can create a big number out of it by converting it first to a string.
With DEBUG=true, that number will fail the validation.
You can see the difference in these tests: https://github.com/uphold/validator.js-asserts/blob/383e48b0a90538a982c2ab1d9aa0d7e7e57d34dc/test/asserts/big-number-assert_test.js#L83-L107
So if we want to support this use case, I can finish up https://github.com/uphold/validator.js-asserts/pull/174. If not, this PR seems fine (minus the dist files in the commit).
WDYT @franciscocardoso @rplopes ?
My fault for not checking the existing list of PRs 🙂.
If that extra work is done when passing the new { validateSignificantDigits: false } to the assert, then that's a new interface that old calls won't have and therefore won't serve as an option for avoiding the new behaviour on upgrade.
So IMO we should either make it the default behaviour, thus avoiding the breaking change, or else we wouldn't need to add it at this point, only when a need for it arrived.
Personally, I'd prefer the latter. Not only because it means less work now, but also because it introduces a breaking change, which is necessary here (bignumber.js is not a dependency in this project, it's an optional peer dependency, so it will resolve to the version the upstream project is using).
To summarize:
Before bignumber.js@7:
> new BigNumber(1.011111111111111111111111111111111111111111)
{ BigNumber Error: new BigNumber() number type has more than 15 significant digits: 1.011111111111111
at raise [...]
After bignumber.js@7 with BigNumber.DEBUG = false (default):
> new BigNumber(1.011111111111111111111111111111111111111111)
1.011111111111111
After bignumber.js@7 with BigNumber.DEBUG = true:
> new BigNumber(1.011111111111111111111111111111111111111111)
Error: [BigNumber Error] Number primitive has more than 15 significant digits: 1.011111111111111
at new BigNumber [...]
So by setting BigNumber.DEBUG = true, which is the case of this PR and the default behaviour of https://github.com/uphold/validator.js-asserts/pull/174, we maintain the previous behaviour.
With https://github.com/uphold/validator.js-asserts/pull/174, by passing { validateSignificantDigits: false } it disables the "15 significant digit" test, while maintaining the rest of the validations (making sure it's a number).
I think the only question here is either:
Do this in separate PRs (merge this one and make a new one only for the { validateSignificantDigits: false });
Finish https://github.com/uphold/validator.js-asserts/pull/174 which makes both changes at the same time, since they are related and changing the same code.
I'm leaning to option 2.
I misunderstood the old behaviour then.
About those 2 options: Ideally, I'd prefer smaller PRs that propose to solve one problem only. So that would be one for updating the lib with no behaviour change, and another for adding that new feature of allowing more significant digits. But if #174 is already doing both things and is close enough to being mergeable, I'm ok with going with that one.
While I agree with smaller PRs, in this case https://github.com/uphold/validator.js-asserts/pull/174 was pretty close to done, so it was just a matter of saving time :)
Deprecated by #174
|
gharchive/pull-request
| 2019-02-06T14:52:49 |
2025-04-01T04:36:11.155333
|
{
"authors": [
"franciscocardoso",
"ricardobcl",
"rplopes"
],
"repo": "uphold/validator.js-asserts",
"url": "https://github.com/uphold/validator.js-asserts/pull/176",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1956828102
|
[Bug]: Plugin "obsidian-reminder-plugin" is not passing Component in renderMarkdown
Describe the bug
Getting the following error in console
app.js:1 Error: Plugin "obsidian-reminder-plugin" is not passing Component in renderMarkdown. This is needed to avoid memory leaks when embedded contents register global event handlers.
at t.render (app.js:1:1231191)
at t.renderMarkdown (app.js:1:1230972)
at eval (plugin:obsidian-reminder-plugin:15229:39)
at flush (plugin:obsidian-reminder-plugin:14593:9)
at init (plugin:obsidian-reminder-plugin:14741:5)
at new Reminder2 (plugin:obsidian-reminder-plugin:16375:5)
at NotificationModal.onOpen (plugin:obsidian-reminder-plugin:16458:5)
at e.open (app.js:1:1309979)
at ReminderModal.showBuiltinReminder (plugin:obsidian-reminder-plugin:16432:113)
at ReminderModal.show (plugin:obsidian-reminder-plugin:16398:12)
Expected Behavior
No response
Steps to reproduce
Not sure
Operating system
macOS
got this same error today.
|
gharchive/issue
| 2023-10-23T10:17:19 |
2025-04-01T04:36:11.159265
|
{
"authors": [
"offbrands-studio",
"pravin-d"
],
"repo": "uphy/obsidian-reminder",
"url": "https://github.com/uphy/obsidian-reminder/issues/165",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1067063169
|
Locale
Question
Hi there, I added a new locale translation for Icelandic on uploadcare/uploadcare-widget - is it possible to merge it with this library?
Hi @disaerna, you can simply run npm update uploadcare-widget in your project. As a result, the uploadcare-widget package will be updated to the latest version, and you'll be able to use the Icelandic locale.
Thanks for the answer 👍 I updated my package and all packages are up to date.
But after updating, my translations are still not added to the Locale type in */types/index.d.ts.
Am I missing something?
? 🤔
Great 💯 Thank you
|
gharchive/issue
| 2021-11-30T10:30:03 |
2025-04-01T04:36:11.164598
|
{
"authors": [
"disaerna",
"optlsnd"
],
"repo": "uploadcare/react-widget",
"url": "https://github.com/uploadcare/react-widget/issues/287",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
194477816
|
what are emmie-tokens?
in the RC spec there is an argument to be passed to the container like so
- -path-to-tokens=/etc/emmie-tokens/tokens.txt
what is the intended contents of this file? I see further down your demo places them in a k8s volume named tokens and mounts at /etc/emmie-tokens
Tokens are a cheap way to secure Emmie. Since typically I'm deploying Emmie to the public internet, I wanted a way to secure the API. This is done with some simple tokens over TLS. You can choose to not use them and if the token file is blank, tokens won't be enforced.
Okay cool. I'm using NodePort for now. Good to know your security model.
|
gharchive/issue
| 2016-12-08T23:58:32 |
2025-04-01T04:36:11.170834
|
{
"authors": [
"lazzarello",
"stevesloka"
],
"repo": "upmc-enterprises/emmie",
"url": "https://github.com/upmc-enterprises/emmie/issues/18",
"license": "bsd-3-clause",
"license_type": "permissive",
"license_source": "bigquery"
}
|
56518011
|
The gem I install has differences from the one in github although version is the same
First of all thanks for this great gem!
I don't know if it's just me doing something wrong but the gem I install using bundle install is different from what I find here in github. I'm using upmin-admin 0.1.01
I noticed this because I needed the delete feature for models and couldn't find it. I then noticed that it seems to be implemented in the files shown here in github but not in the version I installed (which is also 0.1.01). Examples of differences:
In "upmin-admin-ruby/app/views/upmin/partials/models/_model.html.haml",
This part is missing from the gem files I have locally (meaning I have no delete button):
= link_to(model.path, method: :delete, class: "btn btn-sm btn-danger delete pull-right", title: "Delete #{model.title}.", data: {confirm: "Are you sure?"}) do
%span.glyphicon.glyphicon-trash.white
I also do not have a DELETE route in "upmin-admin-ruby/config/routes.rb"
It's probably something stupid on my part.. I'm kind of new here :) Any ideas?
Thanks in advance!
My bad! Just noticed that this feature is planned for release only in version 0.1.1.
Thanks
The published gem is behind master on Github, but you can use :git in your gemfile to pull from the repository. You might want to target a specific reference so it doesn't change out from under you.
http://bundler.io/git.html
I followed your advice and it's working fine now. Many thanks ;)
|
gharchive/issue
| 2015-02-04T12:00:49 |
2025-04-01T04:36:11.175135
|
{
"authors": [
"mbrookes",
"pedrovasconcelos"
],
"repo": "upmin/upmin-admin-ruby",
"url": "https://github.com/upmin/upmin-admin-ruby/issues/162",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
2102414000
|
Issue with 1password SSH commit signing
After setting up SSH signing with 1password, trying to integrate or request a review on a patch results in an error:
Seems like the issue comes from the fact that the 1password setup agent is adding the public key directly to user.signingkey, not the path to the key. What's more, I think we need to use the 1password binary to sign the commit, which they set on gpg.ssh.program.
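For context, a sketch of the git config that the 1Password setup typically writes (the values are illustrative and the signing program path is the macOS default, so treat both as assumptions):
[user]
  # the public key itself is stored here, not a path to a key file
  signingkey = ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAexamplekey user@example.com
[gpg]
  format = ssh
[gpg "ssh"]
  program = "/Applications/1Password.app/Contents/MacOS/op-ssh-sign"
[commit]
  gpgsign = true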
We should also fix the unwrap in cherry_picking.rs to bubble up the issue instead of causing a panic if anything goes wrong with the signing process. We should probably print out a warning message and proceed without signing instead.
Happy to take this on if you verify this issue @drewdeponte
@alondahari the proper thing for us to do here is to switch to external execution of whatever the configured command is. The same way that git proper does. So we may need to reference git proper.
This will allow us to remove the ssh signing dependencies which is good but will also give us the flexibility to behave the same way that Git proper does.
Hey! Hope you're doing well. I was having this same issue after switching to 1Password. Is there any ETA on the next release that includes the fix?
@edgarjs if you build from source what is in the main branch currently has the fix. I was waiting for people to do more testing. But, that doesn't seem to be happening.
So, I will try and cut a release tonight or tomorrow and we can just go from there.
@drewdeponte I tried that, but I get this error now (the blurred part is an array of numbers):
@edgarjs @alondahari this issue has been actually resolved now in the latest release, v7.1.1.
|
gharchive/issue
| 2024-01-26T15:31:15 |
2025-04-01T04:36:11.186740
|
{
"authors": [
"alondahari",
"drewdeponte",
"edgarjs"
],
"repo": "uptech/git-ps-rs",
"url": "https://github.com/uptech/git-ps-rs/issues/290",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1209532238
|
How to force delete records?
Hello! So, I have used the model:
type User struct {
bun.BaseModel `bun:"table:users,alias:u"`
ID uuid.UUID `json:"id" bun:",pk,type:uuid,default:gen_random_uuid()"`
Email string `json:"email" bun:",unique,notnull"`
FirstName string `json:"first_name" bun:",notnull"`
LastName string `json:"last_name" bun:",notnull"`
Password *string `json:"-" bun:",nullzero"`
Type string `json:"type,omitempty" bun:"default:'user'"`
Flags int `json:"flags,omitempty" bun:",default:0"`
CreatedAt time.Time `json:"created_at" bun:",notnull,default:current_timestamp"`
UpdatedAt *time.Time `json:"updated_at,omitempty" bun:",nullzero"`
DeletedAt *time.Time `json:"deleted_at,omitempty" bun:",soft_delete,nullzero"`
}
I created a new user and want to force delete it:
q = db.NewDelete().Model((*User)(nil))
q = req.SearchDelete(q) // just add some where, like id = ? or id in (?) etc.
_, err = q.ForceDelete().Exec(ctx)
if err != nil {
return err
}
But after that I just see that the record wasn't force deleted; only deleted_at was updated to the current timestamp. When the record is already soft deleted, ForceDelete really deletes it.
Can I force delete it without soft deleting it before? Thanks!
Thanks for the report - this should be fixed in v1.1.4
|
gharchive/issue
| 2022-04-20T11:08:08 |
2025-04-01T04:36:11.189315
|
{
"authors": [
"vmihailenco",
"weijinnx"
],
"repo": "uptrace/bun",
"url": "https://github.com/uptrace/bun/issues/514",
"license": "BSD-2-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
2715495118
|
feat: Add query comment into the query for improved debugging
Closes #998
A common challenge when using ORMs is the difficulty of correlating an executed query in the DBMS with the lines of code that triggered it. To solve this issue, it would be helpful to add a comment in the header of the generated query. In this implementation, if a context with a comment is provided, we’ll include that as a comment in the query.
This feature enhances debugging by including query comments in headers, allowing developers to trace the query's origin. By matching the query comment in the header, it's easier to identify the specific location in the code where the query was executed.
Using this feature, users can now add an SQL comment in the query header.
For example
db.NewUpdate().
Model(&Model{}).
Comment("test").
Set("name = ?", "new-name").
Where("id = ?", 1)
This will create a query like
/* test */ UPDATE "models" AS "model" SET name = 'new-name' WHERE (id = 1)
This was implemented in go-pg #2011
Looks like it is similar to #1002.
Let's move comment string to baseQuery with accompanying method to add/set a comment that is called from Select/InsertQuery.
Also, do we want to support a single comment or multiple comments? E.g. what should be the result of calling q.Comment("comment1").Comment("comment2"). I'd say that it is more natural to allow multiple comments, but if we don't know the answer yet, let's forbid the 2nd Comment call by setting an error on the query.
@vmihailenco I can add multi-comment support.
The current implementation adds comments with space separation:
/* com */ select 1;
But I wonder whether they should be separated by a newline instead:
/* com1 */
/* com2 */
select 1;
I suspect there might be a case where we have a sharded db and we need to add a kind of global instance name in the comment, for example when establishing a connection or via some hook. In this particular case it might be:
/* shard1 */
/* query commment */
select 1;
What do you think?
It is not advisable to implement support for multiple comments now. When writing SQL using an ORM, the syntax is relatively unordered; however, comments should maintain a specific sequence.
Introducing support for multiple comments now could hinder our ability to implement ordered comments in the future, even if we are currently confident that such an implementation will not take place.
For example:
/* maybe we need to write something */
SELECT /* but seem redundant */ 1;
/* say goodbye to db */
|
gharchive/pull-request
| 2024-12-03T16:37:15 |
2025-04-01T04:36:11.195277
|
{
"authors": [
"Aoang",
"vmihailenco",
"wwoytenko"
],
"repo": "uptrace/bun",
"url": "https://github.com/uptrace/bun/pull/1082",
"license": "BSD-2-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
1385019359
|
Add ZeroLogger => Uptrace Logging
Hello,
Is it possible to add an easy hook to push zero logger logs to uptrace?
https://github.com/rs/zerolog
Hi,
No, last time I checked Zerolog did not propagate context.Context which is required to pass trace id. You can use Zerolog with Vector though, see https://github.com/uptrace/uptrace-go/tree/master/example/zerolog-vector.
But I would still recommend https://github.com/uptrace/opentelemetry-go-extra/tree/main/otelzap
|
gharchive/issue
| 2022-09-25T13:12:36 |
2025-04-01T04:36:11.198147
|
{
"authors": [
"AchoArnold",
"vmihailenco"
],
"repo": "uptrace/opentelemetry-go-extra",
"url": "https://github.com/uptrace/opentelemetry-go-extra/issues/78",
"license": "BSD-2-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
180399246
|
proposal: tape-join gate
I'd like to add a gate for joining tapes to the standard library. It's a fairly common operation, easily available in the other languages.
Here's an example that does what is says on the tin:
=join |= {a/tape b/(list tape)}
^- tape
?~ b ~
%+ weld i.b
?~ t.b ~
(weld a $(b t.b))
(join " - " ~["foo" "bar" "baz"]) produces "foo - bar - baz".
Is it worth using a wet gate here? It seems like joining a collection-of-collections is generically useful.
=join |* {a/(list) b/(list (list))}
=. a `_?>(?=(^ b) i.b)`a
|- ^+ a
?~ b ~
%+ weld i.b
?~ t.b ~
(weld a $(b t.b))
(Callers then have to supply the cast: (join " - " (limo ~["foo" "bar" "baz"])))
This is good (though maybe just for tapes, not sure about the generic), but...
What would be even more cool is a ++by-style two letter door for tapes. The string handling code is really old and ad hoc... a much more complex proposal, I understand!
I don't know of many generic, tape-specific gates -- are you suggesting that new ones should be added? Any recommendations for how ++join would work in such a door?
Also, would this be separate from the oft-proposed list engine (++of)? I started working on that for my own amusement, just folding Section 2bB into a door.
Closing this one as it doesn't seem to have gone anywhere.
I've written several versions of this in several places, will probably just PR a generic stdlib implementation at some point. The issue certainly doesn't need to be open (we'd drown in issues for stdlib proposals ...)
|
gharchive/issue
| 2016-09-30T20:49:33 |
2025-04-01T04:36:11.227720
|
{
"authors": [
"cgyarvin",
"joemfb",
"jtobin"
],
"repo": "urbit/arvo",
"url": "https://github.com/urbit/arvo/issues/273",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
2402725999
|
Error: GLiNER training data
I am facing an error with the training arguments for the dataset, in this block: {
training_args = TrainingArguments(
output_dir="models",
learning_rate=5e-6,
weight_decay=0.01,
others_lr=1e-5,
others_weight_decay=0.01,
lr_scheduler_type="linear", #cosine
warmup_ratio=0.1,
per_device_train_batch_size=8,
per_device_eval_batch_size=8,
num_train_epochs=3,
evaluation_strategy="epoch",
save_steps = 1000,
save_total_limit=10,
dataloader_num_workers = 8,
use_cpu = True,
report_to="none",
)
trainer = Trainer(
model=model,
args=training_args,
train_dataset=train_dataset,
eval_dataset=test_dataset,
tokenizer=model.data_processor.transformer_tokenizer,
data_collator=data_collator,
)
trainer.train()
}
Iam getting an error: BackendCompilerFailed: backend='inductor' raised:
CppCompileError: C++ compile error
Output:
/tmp/torchinductor_ubuntu/tw/ctwif4gnjxuyqq47csmx5l4wpoyqgb6kbcqb556l2sakhglmvmzd.cpp: In function ‘void kernel(float*, const long int*, const float*, const float*, float*, long int, long int)’:
/tmp/torchinductor_ubuntu/tw/ctwif4gnjxuyqq47csmx5l4wpoyqgb6kbcqb556l2sakhglmvmzd.cpp:32:46: error: no match for ‘operator==’ (operand types are ‘at::vec::CPU_CAPABILITY::VectorizedN<long int, 2>’ and ‘int’)
32 | auto tmp5 = tmp4 == 0;
| ~~~~ ^~ ~
| | |
| | int
| at::vec::CPU_CAPABILITY::VectorizedN<long int, 2>
Hi, can retry without compiling the model ?
Same problem, trying to perform fine tuning:
29%|██▉ | 7/24 [06:40<16:11, 57.15s/it]
0%| | 0/24 [00:00<?, ?it/s]
{
"name": "KeyError",
"message": "Caught KeyError in DataLoader worker process 0.
Original Traceback (most recent call last):
File "/Users/gianpaolotobia/Library/Python/3.9/lib/python/site-packages/torch/utils/data/_utils/worker.py", line 308, in _worker_loop
data = fetcher.fetch(index) # type: ignore[possibly-undefined]
File "/Users/gianpaolotobia/Library/Python/3.9/lib/python/site-packages/torch/utils/data/_utils/fetch.py", line 54, in fetch
return self.collate_fn(data)
File "/Users/gianpaolotobia/Library/Python/3.9/lib/python/site-packages/transformers/trainer_utils.py", line 809, in call
return self.data_collator(features)
File "/Users/gianpaolotobia/Library/Python/3.9/lib/python/site-packages/gliner/data_processing/collator.py", line 20, in call
raw_batch = self.data_processor.collate_raw_batch(input_x)
File "/Users/gianpaolotobia/Library/Python/3.9/lib/python/site-packages/gliner/data_processing/processor.py", line 171, in collate_raw_batch
class_to_ids, id_to_classes = self.batch_generate_class_mappings(batch_list, negatives)
File "/Users/gianpaolotobia/Library/Python/3.9/lib/python/site-packages/gliner/data_processing/processor.py", line 147, in batch_generate_class_mappings
negatives = self.get_negatives(batch_list, 100)
File "/Users/gianpaolotobia/Library/Python/3.9/lib/python/site-packages/gliner/data_processing/processor.py", line 74, in get_negatives
types = set([el[-1] for el in b['ner']])
File "/Users/gianpaolotobia/Library/Python/3.9/lib/python/site-packages/transformers/tokenization_utils_base.py", line 254, in getitem
return self.data[item]
KeyError: 'ner'
",
"stack": "---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
Cell In[13], line 28
1 training_args = TrainingArguments(
2 output_dir="models",
3 learning_rate=5e-6,
(...)
17 report_to="none",
18 )
20 trainer = Trainer(
21 model=model,
22 args=training_args,
(...)
26 data_collator=data_collator,
27 )
---> 28 trainer.train()
File ~/Library/Python/3.9/lib/python/site-packages/transformers/trainer.py:1885, in Trainer.train(self, resume_from_checkpoint, trial, ignore_keys_for_eval, **kwargs)
1883 hf_hub_utils.enable_progress_bars()
1884 else:
-> 1885 return inner_training_loop(
1886 args=args,
1887 resume_from_checkpoint=resume_from_checkpoint,
1888 trial=trial,
1889 ignore_keys_for_eval=ignore_keys_for_eval,
1890 )
File ~/Library/Python/3.9/lib/python/site-packages/transformers/trainer.py:2178, in Trainer._inner_training_loop(self, batch_size, args, resume_from_checkpoint, trial, ignore_keys_for_eval)
2175 rng_to_sync = True
2177 step = -1
-> 2178 for step, inputs in enumerate(epoch_iterator):
2179 total_batched_samples += 1
2181 if self.args.include_num_input_tokens_seen:
File ~/Library/Python/3.9/lib/python/site-packages/accelerate/data_loader.py:454, in DataLoaderShard.iter(self)
452 # We iterate one batch ahead to check when we are at the end
453 try:
--> 454 current_batch = next(dataloader_iter)
455 except StopIteration:
456 yield
File ~/Library/Python/3.9/lib/python/site-packages/torch/utils/data/dataloader.py:631, in _BaseDataLoaderIter.next(self)
628 if self._sampler_iter is None:
629 # TODO(https://github.com/pytorch/pytorch/issues/76750)
630 self._reset() # type: ignore[call-arg]
--> 631 data = self._next_data()
632 self._num_yielded += 1
633 if self._dataset_kind == _DatasetKind.Iterable and \
634 self._IterableDataset_len_called is not None and \
635 self._num_yielded > self._IterableDataset_len_called:
File ~/Library/Python/3.9/lib/python/site-packages/torch/utils/data/dataloader.py:1346, in _MultiProcessingDataLoaderIter._next_data(self)
1344 else:
1345 del self._task_info[idx]
-> 1346 return self._process_data(data)
File ~/Library/Python/3.9/lib/python/site-packages/torch/utils/data/dataloader.py:1372, in _MultiProcessingDataLoaderIter._process_data(self, data)
1370 self._try_put_index()
1371 if isinstance(data, ExceptionWrapper):
-> 1372 data.reraise()
1373 return data
File ~/Library/Python/3.9/lib/python/site-packages/torch/_utils.py:705, in ExceptionWrapper.reraise(self)
701 except TypeError:
702 # If the exception takes multiple arguments, don't try to
703 # instantiate since we don't know how to
704 raise RuntimeError(msg) from None
--> 705 raise exception
KeyError: Caught KeyError in DataLoader worker process 0.
Original Traceback (most recent call last):
File "/Users/gianpaolotobia/Library/Python/3.9/lib/python/site-packages/torch/utils/data/_utils/worker.py", line 308, in _worker_loop
data = fetcher.fetch(index) # type: ignore[possibly-undefined]
File "/Users/gianpaolotobia/Library/Python/3.9/lib/python/site-packages/torch/utils/data/_utils/fetch.py", line 54, in fetch
return self.collate_fn(data)
File "/Users/gianpaolotobia/Library/Python/3.9/lib/python/site-packages/transformers/trainer_utils.py", line 809, in call
return self.data_collator(features)
File "/Users/gianpaolotobia/Library/Python/3.9/lib/python/site-packages/gliner/data_processing/collator.py", line 20, in call
raw_batch = self.data_processor.collate_raw_batch(input_x)
File "/Users/gianpaolotobia/Library/Python/3.9/lib/python/site-packages/gliner/data_processing/processor.py", line 171, in collate_raw_batch
class_to_ids, id_to_classes = self.batch_generate_class_mappings(batch_list, negatives)
File "/Users/gianpaolotobia/Library/Python/3.9/lib/python/site-packages/gliner/data_processing/processor.py", line 147, in batch_generate_class_mappings
negatives = self.get_negatives(batch_list, 100)
File "/Users/gianpaolotobia/Library/Python/3.9/lib/python/site-packages/gliner/data_processing/processor.py", line 74, in get_negatives
types = set([el[-1] for el in b['ner']])
File "/Users/gianpaolotobia/Library/Python/3.9/lib/python/site-packages/transformers/tokenization_utils_base.py", line 254, in getitem
return self.data[item]
KeyError: 'ner'
"
}
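A possible cause, offered as an assumption: the KeyError: 'ner' suggests the datasets passed to the Trainer were already run through the tokenizer (producing BatchEncoding objects) instead of being left as raw GLiNER examples. A minimal sketch of the record shape the GLiNER collator expects (field names follow the GLiNER training examples; the values here are illustrative):
# each dataset item should be a plain dict, not a tokenizer output
train_dataset = [
    {
        "tokenized_text": ["John", "lives", "in", "Paris"],
        # spans given as [start_token_index, end_token_index, label]
        "ner": [[0, 0, "person"], [3, 3, "location"]],
    },
]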
|
gharchive/issue
| 2024-07-11T09:35:01 |
2025-04-01T04:36:11.273736
|
{
"authors": [
"gptob",
"ravi0027",
"urchade"
],
"repo": "urchade/GLiNER",
"url": "https://github.com/urchade/GLiNER/issues/152",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2302833432
|
Polish public holidays
Our Polish site would like to use the Urlaubsverwaltung (vacation management). Is there a way to add the Polish public holidays?
Public holidays in Poland (Wikipedia)
Hello @surfaceHH,
the Urlaubsverwaltung gets its public holidays from https://github.com/focus-shift/jollyday/. You can check the holidays against https://github.com/focus-shift/jollyday/blob/main/jollyday-core/src/main/resources/holidays/Holidays_pl.xml. The next step would then be to "enable" them in the UV.
Hello @derTobsch,
unfortunately I only received the confirmation of the holidays now. How does enabling them in the Urlaubsverwaltung work?
This will be released in 5.6.0 and can then be selected via the public-holiday regulation setting.
|
gharchive/issue
| 2024-05-17T13:58:15 |
2025-04-01T04:36:11.294068
|
{
"authors": [
"derTobsch",
"surfaceHH"
],
"repo": "urlaubsverwaltung/urlaubsverwaltung",
"url": "https://github.com/urlaubsverwaltung/urlaubsverwaltung/issues/4668",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1672882114
|
Missing example in next-urql documentation
Describe the bug
Hello,
While researching an SSR issue with urql, I came across next-urql.
I noticed that the link to the mentioned example is broken. There is also no example in the mentioned folder structure.
https://github.com/urql-graphql/urql/tree/main/packages/next-urql#integration-with-_appjs
Check out our example for using next-urql with _app.js here.
Note: the version mentioned below is the version of next-urql copied from package.json.
Having explored the repository a bit, I assume this is the correct link nowadays?
https://github.com/urql-graphql/urql/tree/main/examples/with-next
Thanks in advance
Best torben
Reproduction
https://github.com/urql-graphql/urql/tree/main/packages/next-urql#integration-with-_appjs
Urql version
5.0.0
Validations
[X] I can confirm that this is a bug report, and not a feature request, RFC, question, or discussion, for which GitHub Discussions should be used
[X] Read the docs.
[X] Follow our Code of Conduct
Hm, that readme should be out-of-date, so we should likely just delete it as it was never meant to be committed to the monorepo and is a leftover from when the repo was migrated into it.
we could use some info on how to use the lib with the Next 13 app dir 👍
|
gharchive/issue
| 2023-04-18T11:07:26 |
2025-04-01T04:36:11.303888
|
{
"authors": [
"kitten",
"masterkain",
"torben3d"
],
"repo": "urql-graphql/urql",
"url": "https://github.com/urql-graphql/urql/issues/3154",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2565259341
|
kickoff infrastructure, CI, & packaging
kickoff infrastructure, CI, & packaging
resolves #3
resolves #8
ignore Windows set up failure for now
waiting for momepy=0.8.1 to show up on conda-forge
v0.8.1 just cut from the feedstock side. Let's re-run CI once it shows up on conda-forge.
|
gharchive/pull-request
| 2024-10-04T01:39:25 |
2025-04-01T04:36:11.331310
|
{
"authors": [
"jGaboardi"
],
"repo": "uscuni/sgeop",
"url": "https://github.com/uscuni/sgeop/pull/17",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
938001540
|
Perform web application scan and send results to Geoplatform
From Geoplatform:
Here are some suggested tools to use. As long as the tool can detect vulnerabilities, flaws and improper configurations, it’s really up to you if you would rather use something else.
Web Application Scans:
OWASP ZAP (https://www.zaproxy.org/)
Pentest-Tools Website Vulnerability Scanner (https://pentest-tools.com/website-vulnerability-scanning/website-scanner)
Snyk (https://snyk.io/website-scanner/)
Source Code Analysis Tools:
CodeQL (https://github.com/github/codeql). Steps to configure scans: https://docs.github.com/en/code-security/secure-coding/automatically-scanning-your-code-for-vulnerabilities-and-errors/setting-up-code-scanning-for-a-repository
SonarQube (https://www.sonarqube.org/)
Fluid Attack's Scanner (https://docs.fluidattacks.com/machine/scanner/plans/foss)
Closing as duplicate of #261
|
gharchive/issue
| 2021-07-06T15:11:44 |
2025-04-01T04:36:11.340452
|
{
"authors": [
"switzersc-usds"
],
"repo": "usds/justice40-tool",
"url": "https://github.com/usds/justice40-tool/issues/298",
"license": "CC0-1.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1907697293
|
#199 add CLI feature to use command line parameter '--env-var' secret=…
#199 add CLI feature to use command line parameter '--env-var secret=xzy123'
@mirkogolze I have one comment on the PR. Everything else looks good.
@mirkogolze The feature/env-secrets branch might take time to get completed and merged.
Can you raise a new PR to the main branch with the fixes per my comment?
Thanks.
|
gharchive/pull-request
| 2023-09-21T20:19:33 |
2025-04-01T04:36:11.343861
|
{
"authors": [
"helloanoop",
"mirkogolze"
],
"repo": "usebruno/bruno",
"url": "https://github.com/usebruno/bruno/pull/202",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2484288095
|
feat: Feature/cli support multiple reporters
Description
Adding the ability to pass multiple reporters using --format. You can specify the output files using --output.reporter
Sample I ran on my machine:
node ..\..\bruno-cli\bin\bru.js run --env Prod --output junit.xml --format junit --format html --format json --output.html html.html --output.json results.json
Console output:
All 3 files were written:
Contribution Checklist:
[x] The pull request only addresses one issue or adds one feature.
[x] The pull request does not introduce any breaking changes
[x] I have added screenshots or gifs to help explain the change if applicable.
[x] I have read the contribution guidelines.
[x] Create an issue and link to the pull request.
Related to #2045
Thanks for taking the time to work on this @iamdavidfrancis !
I think we need to support a simpler cli args for multiple reporters.
I penned a proposal here: #2918
@helloanoop I've pushed an update to use the schema mentioned in your discussion. Let me know if you have any thoughts.
what needs to be done to get this merged?
@lohxt1 Can you review this please ?
Merged!
This will be available in upcoming v1.29.0 release
|
gharchive/pull-request
| 2024-08-24T06:30:57 |
2025-04-01T04:36:11.349857
|
{
"authors": [
"berlingoqc",
"helloanoop",
"iamdavidfrancis"
],
"repo": "usebruno/bruno",
"url": "https://github.com/usebruno/bruno/pull/2911",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
359891919
|
Alert from ElasticSearch entries?
We would like to send some alerts from specific entries in ElasticSearch. Should we use https://github.com/Yelp/elastalert, or could we forward specific entries to Prometheus and Alertmanager?
Current plan:
We switched to OpenDistro with Lagoon 1.1.0, which includes an alerting feature that we would like to use. Unfortunately it is not integrated with the OpenDistro Security feature, which means each alert can be seen by everybody that has access to Kibana. There is a ticket within OpenDistro that tracks this: https://github.com/opendistro-for-elasticsearch/alerting/issues/6
I am not familiar with what an "alert" would mean. Would it be possible to add the ability to notify on slack when a cronjob returns a non-zero exit code with this feature?
As elasticsearch is no longer bundled into Lagoon, closing the issue here. Integrating Alerting with OpenSearch/ODFE would be a platform-specific implementation
|
gharchive/issue
| 2018-09-13T13:18:09 |
2025-04-01T04:36:11.355413
|
{
"authors": [
"Schnitzel",
"rodrigoaguilera",
"tobybellwood"
],
"repo": "uselagoon/lagoon",
"url": "https://github.com/uselagoon/lagoon/issues/618",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1600815491
|
Fails to start the server
Describe the bug
Starting the server with the docker command fails. This is a fresh install.
Steps to reproduce
Run the following command with an empty directory (fresh install)
docker run -d --name memos -p 5230:5230 -v ~/.memos/:/var/opt/memos neosmemo/memos:latest
Screenshots or additional context
---
Server profile
dsn: /var/opt/memos/memos_prod.db
port: 5230
mode: prod
version: 0.11.0
---
failed to create server, error: failed to find migration history, err: no such table: migration_history
cannot open db
github.com/usememos/memos/server.NewServer
/backend-build/server/server.go:41
github.com/usememos/memos/cmd.glob..func1
/backend-build/cmd/server.go:40
github.com/spf13/cobra.(*Command).execute
/go/pkg/mod/github.com/spf13/cobra@v1.6.1/command.go:920
github.com/spf13/cobra.(*Command).ExecuteC
/go/pkg/mod/github.com/spf13/cobra@v1.6.1/command.go:1044
github.com/spf13/cobra.(*Command).Execute
/go/pkg/mod/github.com/spf13/cobra@v1.6.1/command.go:968
github.com/usememos/memos/cmd.Execute
/backend-build/cmd/server.go:75
main.main
/backend-build/main.go:10
runtime.main
/usr/local/go/src/runtime/proc.go:250
runtime.goexit
/usr/local/go/src/runtime/asm_amd64.s:1594
Are you sure that is a "fresh install"?
This command starts the memos with the DB file in ~/.memos.
My bad, the files were from an older test.
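For anyone hitting the same error, a minimal sketch of forcing a truly fresh start (assuming nothing in ~/.memos needs to be kept):
# remove leftover data from earlier tests, then start a clean container
rm -rf ~/.memos
docker rm -f memos
docker run -d --name memos -p 5230:5230 -v ~/.memos/:/var/opt/memos neosmemo/memos:latest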
|
gharchive/issue
| 2023-02-27T09:34:57 |
2025-04-01T04:36:11.357923
|
{
"authors": [
"Zeng1998",
"isgj"
],
"repo": "usememos/memos",
"url": "https://github.com/usememos/memos/issues/1177",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1661570487
|
Add CI
Add CI to build the image and push it to:
[ ] Docker Hub
[ ] Google Artifact Registry
Will need Workload Identity Federation (WIF) configured.
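A rough sketch of the push steps the CI workflow would need to script; the registry region, project ID, and repository name below are placeholders, and the GitHub Actions wiring for Workload Identity Federation is not shown:
docker build -t userbradley/gcs-web-server:latest .
docker push userbradley/gcs-web-server:latest
gcloud auth configure-docker europe-west2-docker.pkg.dev
docker tag userbradley/gcs-web-server:latest europe-west2-docker.pkg.dev/PROJECT_ID/REPO/gcs-web-server:latest
docker push europe-west2-docker.pkg.dev/PROJECT_ID/REPO/gcs-web-server:latest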
Closed by https://github.com/userbradley/gcs-web-server/pull/3
|
gharchive/issue
| 2023-04-11T00:04:42 |
2025-04-01T04:36:11.359543
|
{
"authors": [
"userbradley"
],
"repo": "userbradley/gcs-web-server",
"url": "https://github.com/userbradley/gcs-web-server/issues/2",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1773312328
|
control numbering
Hi great extension, quite neat.
It would be nice to have control over numbering. For instance flat numbering (all items numbered continuously throughout the document), the ability to specify a start counter, or letter and Roman numeral counters.
Groups and listing are really cool.
yes, this could be nice, indeed, and is on my wish list. Quarto will get a new crossreferencing mechanism soonish, and this may determine how to implement the numbering features you suggest. Therefore I postponed it for myself, but if anyone would like to contribute here, I will happily merge PRs :-)
I just needed something that numbers by section when I wrote the extension, so flat numbering could be something I implement in the near future; numbering by section is a bit ridiculous for short documents.
|
gharchive/issue
| 2023-06-25T14:16:46 |
2025-04-01T04:36:11.468803
|
{
"authors": [
"ute",
"xtimbeau"
],
"repo": "ute/custom-numbered-blocks",
"url": "https://github.com/ute/custom-numbered-blocks/issues/4",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
842508484
|
Minor memory leak without sqlite3_shutdown
Hello, I was seeing some memory leaks on app exit and traced it to sqlite3mc_amalgamation.c(23761). Specifically, three blocks of size 8, 48, and 208 bytes. One of these is wsdAutoext.aExt, which is freed in sqlite3_reset_auto_extension, but only if I call sqlite3_shutdown. Not sure if I'm just doing something wrong, but the call to shutdown seems to be necessary for me (building the default sqlite3mc_amalgamation.c on VS2019).
Not that it's a significant amount of memory or anything, but the "Detected memory leaks!" warning might make someone think twice about using the library. Anyway, it might be nice to mention this somewhere in the docs for others who might run into the same leaks.
If you want to be on the safe side, you'll have to call sqlite3_shutdown. This is the only way to guarantee that all resources are freed.
Interestingly, I use VS2019 myself, but I didn't experience memory leaks, even without calling sqlite3_shutdown. For example, the shell application coming with SQLite does not call sqlite3_shutdown. Nevertheless VS2019 does not report memory leaks, if I use this shell together with SQLite3 Multiple Ciphers.
How do you compile sqlite3mc_amalgamation.c? Could you please show the full VS2019 command line which is used? TIA.
Thanks for the confirmation! The official sqlite3 documentation is a little vague on when sqlite3_shutdown is necessary. But I will assume it is in this case.
Here is a minimal sample app that illustrates the leaking.
// main.c
// --------------------------------------------------------------
// Illustration of memory leak with sqlite3mc on app exit
// when not calling sqlite3_shutdown().
//
// Compile with MSVC from the commandline with
// cl /MTd main.c
//
// Run with the debugger to see the memory leak output
// devenv main.exe
//
// --------------------------------------------------------------
// Enable memory leak detection
// Ref: https://docs.microsoft.com/en-us/visualstudio/debugger/finding-memory-leaks-using-the-crt-library?view=vs-2019
#define _CRTDBG_MAP_ALLOC
#include <stdlib.h>
#include <crtdbg.h>
// Simple unity build for demonstration purposes
#include "sqlite3mc_amalgamation.c"
#include "sqlite3mc_amalgamation.h"
int main(int argc, char *argv[])
{
// Enable memory leak reporting (will show up in MSVC "Output" window)
_CrtSetDbgFlag ( _CRTDBG_ALLOC_MEM_DF | _CRTDBG_LEAK_CHECK_DF );
sqlite3* db;
sqlite3_open(":memory:", &db);
sqlite3_close(db);
// Memory is leaked at app exit unless shutdown is explicitly called
// sqlite3_shutdown();
return 0;
}
The official sqlite3 documentation is a little vague on when sqlite3_shutdown is necessary.
Yes, indeed. In fact, it depends on how the SQLite library was built, especially whether the standard SQLite amalgamation is used with default settings or additional features were added and/or activated (as is the case for SQLite3 Multiple Ciphers). In the default case sqlite3_shutdown does not need to free any memory, and therefore it is not necessary to call the function at all in that case.
But I will assume it is in this case.
Yes. SQLite3 Multiple Ciphers makes use of the function sqlite3_auto_extension to register extensions for automatic loading for each new database connection. And therefore it is necessary to call sqlite3_reset_auto_extension (which is done implicitly by sqlite3_shutdown) to clean up memory before exiting the application - this is true for the standard SQLite3 code, too. This could be avoided only at the price of patching the SQLite source code even more.
Additionally, SQLite3 Multiple Ciphers registers at least one dynamically allocated VFS instance, which is freed only if sqlite3_shutdown is called or if sqlite3mc_vfs_shutdown (or sqlite3mc_shutdown) is called explicitly. At first I had implemented only a single static instance of the new VFS shim, but that was too restrictive, in case a developer wanted to combine encryption with a non-default VFS.
So, the conclusion is that the simplest way to avoid memory leaks is to call sqlite3_shutdown before exiting the application.
Thanks for bringing the issue to my attention. I will add a note to the documentation.
Since calling sqlite3_shutdown avoids memory leaks and since this is now documented, too, I close the issue.
|
gharchive/issue
| 2021-03-27T13:51:07 |
2025-04-01T04:36:11.476505
|
{
"authors": [
"gooderist",
"utelle"
],
"repo": "utelle/SQLite3MultipleCiphers",
"url": "https://github.com/utelle/SQLite3MultipleCiphers/issues/29",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
326040948
|
Feature/#725 move create course into dialog
Description:
Moved the create Course Screen into a nice Dialog.
Closes #725
Improvements
Responsive for mobile.
Pull Request Test Coverage Report for Build 2429
0 of 0 changed or added relevant lines in 0 files are covered.
No unchanged relevant lines lost coverage.
Overall coverage remained the same at 65.624%
Totals (change from base Build 2424): 0.0%
Covered lines: 1149
Relevant lines: 1676
💛 - Coveralls
|
gharchive/pull-request
| 2018-05-24T09:37:38 |
2025-04-01T04:36:11.482622
|
{
"authors": [
"HPunktOchs",
"coveralls"
],
"repo": "utetrapp/geli",
"url": "https://github.com/utetrapp/geli/pull/752",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
238567637
|
Implement grid data structures
Grid2 class
Grid3 class
Completed Grid2
Completed Grid3
|
gharchive/issue
| 2017-06-26T14:48:09 |
2025-04-01T04:36:11.485075
|
{
"authors": [
"utilForever"
],
"repo": "utilForever/CubbyFlow",
"url": "https://github.com/utilForever/CubbyFlow/issues/66",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
367930012
|
Font size
Hi, does anybody know how to change the font size for all the labels when using Sketch Measure, and also the label size so it corresponds with the reduced font size?
Thank you.
@micklynch76: You can edit the common.js file with any text editor.
The file is located at ~/Library/Application Support/com.bohemiancoding.sketch3/Plugins/Sketch Measure.sketchplugin/Contents/Sketch/library/.
Go to line 678 and change text.setFontSize(12); to any number you'd like to.
Save the file and restart Sketch for the changes to take effect.
Thank you for the reply - I tried this but it stopped Sketch Measure from working?
I had to save the common.js file as common.txt and then rename the file so maybe that was the problem?
It might be that the extension didn't really change back.
You can always drag and drop the common.js file to e.g. the "TextEdit" app and it will keep the file extension.
Just delete your current plugin and overwrite with the latest here.
And do it all over again :)
Good luck!
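If editing by hand keeps breaking the file extension, the same change can be made from Terminal; this is only a sketch that assumes the default plugin location, so back up common.js first and adjust the size to taste:
cd ~/Library/Application\ Support/com.bohemiancoding.sketch3/Plugins/Sketch\ Measure.sketchplugin/Contents/Sketch/library/
sed -i '' 's/text.setFontSize(12)/text.setFontSize(10)/' common.js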
|
gharchive/issue
| 2018-10-08T19:52:45 |
2025-04-01T04:36:11.534603
|
{
"authors": [
"ellunium",
"micklynch76"
],
"repo": "utom/sketch-measure",
"url": "https://github.com/utom/sketch-measure/issues/480",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
726857230
|
Add support for user type in signal arguments
Hi!
I tried to address the following comment in Invocation.kt:
// This will fail with:
// class godot.Spatial cannot be cast to class godot.tests.Invocation.
...
// fun hookTwoParam(str: String, inv: Invocation) {
I don't know anything about this project or how things are supposed to work. So this PR might be total garbage. But it was a nice way of trying to get to know the code-base. There is probably a much better solution that I didn't think of. If you have time, please point me in the right direction. If not, feel free to send any other issues my way if there are some small things I can help out with.
Running this code produces the expected output of:
Hook was called with parameters: My Awesome param !, godot.Spatial@647e447
Hello Invocation!
Hook was called with parameters (invocation): My Awesome param (invocation)!, godot.tests.Invocation@41fbdac4
This PR depends on the following change to godot-kotlin-entry-generator: https://github.com/utopia-rise/godot-kotlin-entry-generator/compare/master...totalorder:argument-full-fqdn?expand=1
@chippmann @piiertho Please take a look if you have time.
@totalorder First of all: thx a lot for your time investment!
Regarding the entry generator changes: Feel free to open a PR in that repo.
I already did that change in this branch for the Property and the Function generators but forgot the Signal generators.
So it doesn't hurt if you do a PR there; we merge that beforehand and update this PR so it points at the right commit from the entry generator. If we merged it now, it would not function correctly, I suppose.
Regarding the rest of the PR: I'll look at that later today.
Closing this until decision on "track instances of user types indexed by their rawPtr" is made.
|
gharchive/pull-request
| 2020-10-21T20:55:48 |
2025-04-01T04:36:11.542209
|
{
"authors": [
"chippmann",
"totalorder"
],
"repo": "utopia-rise/godot-jvm",
"url": "https://github.com/utopia-rise/godot-jvm/pull/23",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2330108505
|
feat!: update node support matrix (only support node 16-20)
This will be a breaking change. My intention is to release uuid@10 with RFC9562 support, so this is a good opportunity to update the CI matrix.
Github settings for the main branch also still need to be updated.
|
gharchive/pull-request
| 2024-06-03T03:53:57 |
2025-04-01T04:36:11.545955
|
{
"authors": [
"broofa",
"ctavan"
],
"repo": "uuidjs/uuid",
"url": "https://github.com/uuidjs/uuid/pull/750",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1640001104
|
whoami has 0 coverage on Windows
Maybe the test isn't executed on Windows?
https://app.codecov.io/gh/uutils/coreutils/blob/main/src/uu/whoami/src/platform/windows.rs
It seems that CI skips the only common test on Windows.
https://github.com/uutils/coreutils/blob/e920d8d1bdc9c6e59f577b3d415fd993cc609578/tests/by-util/test_whoami.rs#L42-L52
Can I fix it by adding a dummy command test in Windows?
#[test]
#[cfg(windows)]
fn test_normal_windows() {
new_ucmd!().succeeds().no_stderr();
}
yeah, let's try that :)
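A hedged sketch of verifying it locally on a Windows machine; the exact feature flags may differ depending on how the workspace is set up:
cargo test --features whoami --no-default-features test_normal_windows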
|
gharchive/issue
| 2023-03-24T20:37:46 |
2025-04-01T04:36:11.548043
|
{
"authors": [
"garydev10",
"sylvestre"
],
"repo": "uutils/coreutils",
"url": "https://github.com/uutils/coreutils/issues/4614",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2451485980
|
Antibody structure prediction
Hello, I am new to the field and have a potentially basic question. I would like to know if a group of antibodies bind to a particular DNA sequence. Is RoseTTAFold2NA able to model antibody structure if I provide sequences for the heavy and light chains? Thank you.
I think it is possible to test it out. You have a Heavy+Light Chain antibody structure, and you want to test out whether it will bind to the DNA sequence. I might try it out if you are curious about how RF2NA will perform for your particular case.
|
gharchive/issue
| 2024-08-06T18:44:01 |
2025-04-01T04:36:11.549181
|
{
"authors": [
"anar-rzayev",
"vella12-osu"
],
"repo": "uw-ipd/RoseTTAFold2NA",
"url": "https://github.com/uw-ipd/RoseTTAFold2NA/issues/117",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1222804245
|
Cleaning up the dirty region on window resizes
When resizing a window we need to remove all rectangles from the dirtyRegion.
done
|
gharchive/issue
| 2022-05-02T12:05:12 |
2025-04-01T04:36:11.564915
|
{
"authors": [
"uwerat"
],
"repo": "uwerat/vnc-eglfs",
"url": "https://github.com/uwerat/vnc-eglfs/issues/5",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
218674750
|
The previous month button does not work since 1.7.0-RC1
Expected behaviour
Using a custom template for left arrow, you should be able to click back a month.
Actual behaviour
It moves forward only
Datepicker version used
1.7.0-RC1
1.6.4 works
related to https://github.com/uxsolutions/bootstrap-datepicker/commit/3e56fdb35872bcb16dfb1dd0d1f32ab03fdfdc64
Example code
Sample of the configuration I am using - the only important bit here is the templates. I am switching the left and right arrow icons.
this.datepicker.datepicker({
orientation: "left auto",
language: this.i18n.getLocale(),
autoclose: true,
clearBtn: false,
todayBtn: "linked",
todayHighlight: true,
templates: {
leftArrow: '<i class="sw-icon-previous"></i>',
rightArrow: '<i class="sw-icon-next"></i>'
},
container: $(this.element).parent(),
startDate: this.startDate,
endData: this.endDate
});
In 1.6.4 the code would try to find the .closest('prev') element, whereas now the code in navArrowsClick assumes the target of the event is either the 'prev' or 'next' element. However, the target is the icon listed in the template. Looking at the HTML, we see
<th class="prev"><i class="sw-icon-previous"></i>...</th>
So clicking back always results in a value of +1
I am facing this problem as well with 1.7.0-RC2
Correct. It was introduced in 1.7.0-RC1 where this area was rewritten, but it still exists in the latest release.
@vsn4ik it looks like it's related to your changes from #2116. Can you please take a look? Thanks!
Datepicker version used
1.6.4
Issue
(only reproducible when using minified file)
Selecting the Left Arrow date picker moves forward only
When there is an endDate set the leftArrow performs a .next function, rather than .prev.
(ref. bootstrap-datepicker.js line 1184)
Again: this only happens when using the minified file.
var MIN_DATE = new Date("1/1/2008");
var MAX_DATE = new Date("12/15/2016");
var options = {
clearBtn: true,
orientation: "bottom auto",
defaultViewDate: MAX_DATE,
startDate: MIN_DATE,
endDate: MAX_DATE,
templates: {
leftArrow: '<i class="fa fa-chevron-circle-left"></i>',
rightArrow: '<i class="fa fa-chevron-circle-right"></i>'
},
};
$('#datepicker2').datepicker(options);
@jadrake75 Thanks!
Please, check your problem on https://codepen.io/anon/pen/jGzowm?editors=1010.
@vsn4ik yes that seems to have addressed it - sorry for taking so long to confirm
|
gharchive/issue
| 2017-04-01T06:40:16 |
2025-04-01T04:36:11.592666
|
{
"authors": [
"acrobat",
"jadrake75",
"jetsquared",
"rgins16",
"vsn4ik"
],
"repo": "uxsolutions/bootstrap-datepicker",
"url": "https://github.com/uxsolutions/bootstrap-datepicker/issues/2159",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
788957543
|
Can't disable past dates
Expected behaviour
Disabling all dates prior to today so the client can't choose them
Actual behaviour
I have tried different methods I found on the internet, including startDate/minDate and other solutions, but none of them worked for me; the person is still able to choose any date prior to the actual date (e.g. 31st December 2019).
Datepicker version used
I don't know how to check this; I picked up someone's project from a website and I'm building on it.
Example code
HTML input
<input id="startDate" placeholder="Pick Date" type="date" name="pickup_date" required/>
JS datepicker code
// Home Page One Date Picker JS
var date = new Date();
var today = new Date(date.getFullYear(), date.getMonth(), date.getDate());
$('#startDate').datepicker({
uiLibrary: 'bootstrap4',
iconsLibrary: 'fontawesome',
minDate: today,
});
I've tried different solutions on StackOverflow but none of them worked, a lot of people pointed to this simple solution (https://www.itsolutionstuff.com/post/how-to-disable-previouse-date-bootstrap-datepicker) but it didn't work either, what am I missing here?
the JS is appearing de-formatted for some reason but adding a paragraph made it appear organized and into different lines instead of all appearing in one line of code
Hi,
Are you sure you're using this library? By the look of your snippet, it does not seem so; none of the options you pass are supported ones.
What do you mean? I don't know, I came across this question (https://github.com/uxsolutions/bootstrap-datepicker/issues/329) and sought out for help because you were behind this feature on bootstrap xD I don't know what to do, how can I check where this is from?
Can someone help me?
By your docs (https://bootstrap-datepicker.readthedocs.io/en/stable/) it has the same commands as I'm trying to execute on JavaScript, I don't understand
|
gharchive/issue
| 2021-01-19T11:50:23 |
2025-04-01T04:36:11.600415
|
{
"authors": [
"AnthonyIsBlacking",
"Azaret"
],
"repo": "uxsolutions/bootstrap-datepicker",
"url": "https://github.com/uxsolutions/bootstrap-datepicker/issues/2605",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
2143253999
|
HTTP client abstraction
We would like to eventually introduce a mechanism that does not depend heavily on okhttp.
Since the okhttp dependency was removed in #12, let's consider this resolved for now.
|
gharchive/issue
| 2024-02-19T22:51:37 |
2025-04-01T04:36:11.616449
|
{
"authors": [
"ayato-p"
],
"repo": "uzabase/playtest2",
"url": "https://github.com/uzabase/playtest2/issues/7",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
583129106
|
Add missing letters attr.xml
Add missing letters for format
@uzairiqbal91 Please merge this to make your library work again
Finally the merge. This took a while though, thanks @uzairiqbal91
No problem, thank you for the support, and keep doing work for the community.
|
gharchive/pull-request
| 2020-03-17T16:12:58 |
2025-04-01T04:36:11.619966
|
{
"authors": [
"Maltazard",
"egesamichael",
"hendrawd",
"uzairiqbal91"
],
"repo": "uzairiqbal91/CircularTimerView",
"url": "https://github.com/uzairiqbal91/CircularTimerView/pull/3",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2406868602
|
🛑 Mobile Search LS is down
In 34eb751, Mobile Search LS ($MOBILESEARCHLS) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Mobile Search LS is back up in ec00619 after 26 minutes.
|
gharchive/issue
| 2024-07-13T11:53:15 |
2025-04-01T04:36:11.622443
|
{
"authors": [
"uzumaki258"
],
"repo": "uzumaki258/status",
"url": "https://github.com/uzumaki258/status/issues/1581",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1861724008
|
🛑 Third Party is down
In 2b42b5f, Third Party ($THIRD_PARTY_URL) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Third Party is back up in b827492 after 130 days, 5 hours, 47 minutes.
|
gharchive/issue
| 2023-08-22T15:37:04 |
2025-04-01T04:36:11.624646
|
{
"authors": [
"uzumaki258"
],
"repo": "uzumaki258/status",
"url": "https://github.com/uzumaki258/status/issues/389",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1899468009
|
🛑 Third Party is down
In 1fd7be9, Third Party ($THIRD_PARTY_URL) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Third Party is back up in a3f9cee after 12 minutes.
|
gharchive/issue
| 2023-09-16T15:09:02 |
2025-04-01T04:36:11.626758
|
{
"authors": [
"uzumaki258"
],
"repo": "uzumaki258/status",
"url": "https://github.com/uzumaki258/status/issues/861",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.