| id (string, 4–10 chars) | text (string, 4–2.14M chars) | source (string, 2 classes) | created (timestamp[s], 2001-05-16 21:05:09 – 2025-01-01 03:38:30) | added (timestamp, 2025-04-01 04:05:38 – 2025-04-01 07:14:06) | metadata (dict) |
|---|---|---|---|---|---|
2309251343
|
Restoring and using int4 arrays on XLA CPU is broken after #20610.
Description
Casting int4 arrays to int8 after restoring them from a checkpoint was broken in #20610 (confirmed via CL sweep) with the following error:
deserialized_arr = deserialized_arr.astype(jax.numpy.int8)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "jax/_src/numpy/array_methods.py", line 68, in _astype
return lax_numpy.astype(arr, dtype, copy=copy, device=device)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "jax/_src/numpy/lax_numpy.py", line 2857, in astype
lax.convert_element_type(x_arr, dtype),
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "jax/_src/lax/lax.py", line 517, in convert_element_type
return _convert_element_type(operand, new_dtype, weak_type=False)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "jax/_src/lax/lax.py", line 559, in _convert_element_type
return convert_element_type_p.bind(operand, new_dtype=new_dtype,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "jax/_src/core.py", line 408, in bind
return self.bind_with_trace(find_top_trace(args), args, params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "jax/_src/core.py", line 412, in bind_with_trace
out = trace.process_primitive(self, map(trace.full_raise, args), params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "jax/_src/core.py", line 901, in process_primitive
return primitive.impl(*tracers, **params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "jax/_src/dispatch.py", line 86, in apply_primitive
outs = fun(*args)
^^^^^^^^^^
File "jax/_src/traceback_util.py", line 179, in reraise_with_filtered_traceback
return fun(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^
File "jax/_src/pjit.py", line 305, in cache_miss
outs, out_flat, out_tree, args_flat, jaxpr, attrs_tracked = _python_pjit_helper(
^^^^^^^^^^^^^^^^^^^^
File "jax/_src/pjit.py", line 182, in _python_pjit_helper
out_flat = pjit_p.bind(*args_flat, **params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "jax/_src/core.py", line 2811, in bind
return self.bind_with_trace(top_trace, args, params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "jax/_src/core.py", line 412, in bind_with_trace
out = trace.process_primitive(self, map(trace.full_raise, args), params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "jax/_src/core.py", line 901, in process_primitive
return primitive.impl(*tracers, **params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "jax/_src/pjit.py", line 1525, in _pjit_call_impl
return xc._xla.pjit(
^^^^^^^^^^^^^
File "jax/_src/pjit.py", line 1508, in call_impl_cache_miss
out_flat, compiled = _pjit_call_impl_python(
^^^^^^^^^^^^^^^^^^^^^^^
File "jax/_src/pjit.py", line 1438, in _pjit_call_impl_python
inline=inline, lowering_parameters=mlir.LoweringParameters()).compile()
^^^^^^^^^
File "jax/_src/interpreters/pxla.py", line 2407, in compile
executable = UnloadedMeshExecutable.from_hlo(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "jax/_src/interpreters/pxla.py", line 2932, in from_hlo
in_layouts, out_layouts = _get_layouts_from_executable(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "jax/_src/interpreters/pxla.py", line 2633, in _get_layouts_from_executable
raise AssertionError(
AssertionError: Unexpected XLA layout override: (XLA) DeviceLocalLayout({0:E(4)}) != DeviceLocalLayout({0}) (User input layout)
This can be tested locally using the following code:
import jax
from jax.experimental.array_serialization import serialization
import numpy as np
dtype = jax.numpy.int4
shape = (8, 2)
arr = jax.numpy.arange(np.prod(shape)).reshape(shape).astype(dtype)
# Run serialization.
sharding = jax.sharding.GSPMDSharding.get_replicated(jax.devices())
tspecs = jax.tree_util.tree_map(
serialization.get_tensorstore_spec, ['/tmp/test_ckpt']
)
manager = serialization.GlobalAsyncCheckpointManager()
manager.serialize(
[arr],
tspecs,
on_commit_callback=lambda: None,
)
manager.wait_until_finished()
# Run deserialization.
(deserialized_arr,) = serialization.run_deserialization(
shardings=[sharding],
tensorstore_specs=tspecs,
global_shapes=[shape],
dtypes=[dtype],
)
# Test usage.
deserialized_arr = deserialized_arr.astype(jax.numpy.int8)
assert (deserialized_arr + deserialized_arr == 2 * deserialized_arr).all()
and tested in the CI by adding the following to the end of test_checkpointing_with_int4 in jax/experimental/array_serialization/serialization_test.py:
# Cast to int8 to test deserialized execution.
m2 = m2.astype(jax.numpy.int8)
self.assertArraysEqual(m2 + m2, 2 * m2)
This also fails when casting is added to Orbax's single_host_test.py.
System info (python version, jaxlib version, accelerator, etc.)
Google internal HEAD:
jax: 0.4.29
jaxlib: 0.4.29
numpy: 1.26.3
python: 3.11.8 (stable, redacted, redacted) [Clang (fc57f88f007497a4ead0ec8607ac66e1847b02d6)]
jax.devices (1 total, 1 local): [CpuDevice(id=0)]
process_count: 1
platform: uname_result(system='Linux', node='ill.redact.this.too', release='', version='', machine='x86_64')
I ran this function internally and it doesn't seem to error:
def test_astype(self):
x = jnp.arange(4, dtype=jnp.int4)
x.astype(jnp.int8)
Does this error for you?
Note that you need to run on TPU and not CPU. Do you see the same error on TPU?
@yashk2810 Isn't the issue as stated by the reporter that int4 specifically regressed on CPU?
Uh oh, I skipped that part lol.
https://github.com/google/jax/pull/21372 should fix it
Thank you! My team uses XLA CPU when exporting to MLIR for mobile deployment, so this is quite helpful!
|
gharchive/issue
| 2024-05-21T22:57:38 |
2025-04-01T04:34:24.427247
|
{
"authors": [
"hawkinsp",
"phoenix-meadowlark",
"yashk2810"
],
"repo": "google/jax",
"url": "https://github.com/google/jax/issues/21339",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2342939736
|
Fix Typos
This PR fixes some doc typos and a typo in an error message.
Thanks!
|
gharchive/pull-request
| 2024-06-10T06:10:42 |
2025-04-01T04:34:24.428624
|
{
"authors": [
"rajasekharporeddy",
"superbobry"
],
"repo": "google/jax",
"url": "https://github.com/google/jax/pull/21759",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1111397059
|
Implement jax2tf scatter_* ops with no XLA
Linking to #9269
This is a work in progress, it seems to work for the simplest examples. Will update with tests, etc.
I'm doing this to convert some JAX code to ONNX. I find that my current implementations using
tensor_scatter_nd_add
tensor_scatter_nd_max
tensor_scatter_nd_min
don't get converted by tf2onnx, while tensor_scatter_nd_update works fine and my devolved implementation of scatter_mul in JAX converts correctly. So I'm going to reimplement add, max, and min similarly to how mul is currently implemented.
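The reimplementation idea can be sketched outside TensorFlow: when the scattered indices are unique, scatter_add reduces to a gather followed by a single scatter_update. A minimal NumPy sketch (the helper name and shapes are illustrative, not jax2tf's actual code):

```python
import numpy as np

def scatter_add_via_update(operand, indices, updates):
    """Sketch: emulate scatter_add using only gather + a single scatter_update.

    Assumes each index appears at most once (no duplicate-index
    accumulation), which is the easy case a no-XLA lowering can target.
    """
    operand = operand.copy()
    gathered = operand[tuple(indices.T)]            # gather current values
    operand[tuple(indices.T)] = gathered + updates  # one scatter_update
    return operand

x = np.zeros(5)
idx = np.array([[1], [3]])
out = scatter_add_via_update(x, idx, np.array([10.0, 20.0]))
# out -> [0., 10., 0., 20., 0.]; x is left untouched
```

With duplicate indices this sketch drops contributions, which is why real scatter_add needs an accumulating primitive or a pre-aggregation step.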
Reimplementation looks prettier in the onnx representation
Pic below is scatter_add.
I have found other routes for the task than no_xla, closing task. Thank you for your suggestions and comments.
@oliverdutton for reference, what other route did you find?
Implementing the XLA op on the backend directly. This meant I didn't need the conversion to ONNX, which is what pushed me to look at the no-XLA scatter.
|
gharchive/pull-request
| 2022-01-22T09:49:16 |
2025-04-01T04:34:24.431893
|
{
"authors": [
"marcvanzee",
"oliverdutton"
],
"repo": "google/jax",
"url": "https://github.com/google/jax/pull/9289",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
335626551
|
Added exports edge [phab:D1532]
Created by craigbarber at 2017-04-17 20:21:34:
Added exports edge to Kythe schema documentation.
craigbarber wrote at 2017-04-17 20:27:08:
Changed exported to exports for more clarity
craigbarber wrote at 2017-04-17 20:30:28:
added exports edge to java EdgeKind enum
jrtom commented inline (context lost during migration) at 2017-04-17 21:23:08:
add line breaks
s/deps/dependencies
jrtom commented inline (context lost during migration) at 2017-04-17 21:23:08:
Please add a brief example.
craigbarber wrote at 2017-04-17 21:58:49:
incorporated CL feedback
craigbarber commented inline (context lost during migration) at 2017-04-17 21:59:11:
jekyll handles the formatting automatically, no line breaks are needed.
jrtom commented inline (context lost during migration) at 2017-04-17 22:29:25:
I assume that it does; this is more of a style/consistency issue, as the rest of the file uses hard line breaks inside paragraphs.
jrtom wrote at 2017-04-17 22:29:25:
OK modulo the formatting issue :)
jrtom accepted this revision at 2017-04-17 22:29:25
craigbarber wrote at 2017-04-17 22:30:48:
fixed spelling error
craigbarber wrote at 2017-04-17 22:36:00:
added line breaks for consistency, and moved description text above example.
craigbarber wrote at 2017-04-17 22:40:51:
adjusted layout of exports section to make it more consistent
craigbarber commented inline (context lost during migration) at 2017-04-17 22:41:49:
Added line breaks, also corrected the other edge "depends" which I authored previously, to ensure consistent line breaks.
craigbarber closed via commit ece076b8021a6397dae0eff7b3151d36bd883519 at 2017-04-17 22:42:27
|
gharchive/issue
| 2018-06-26T01:40:06 |
2025-04-01T04:34:24.451533
|
{
"authors": [
"kythe"
],
"repo": "google/kythe",
"url": "https://github.com/google/kythe/issues/1949",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
335655872
|
Removed unused docker image from java extractors. [phab:D2274]
Created by craigbarber at 2018-05-01 18:09:01:
fixed formatting on BUILD file
schroederc accepted this revision at 2018-05-01 18:12:10
craigbarber closed via commit 883a2b1ef75da9d0ae428d70d9b13e0fc0959eaf at 2018-05-01 18:15:10
|
gharchive/issue
| 2018-06-26T04:34:26 |
2025-04-01T04:34:24.453601
|
{
"authors": [
"kythe"
],
"repo": "google/kythe",
"url": "https://github.com/google/kythe/issues/2738",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
329649968
|
Represent type params by their erasure in JVM names.
Previously we hard-coded j.l.Object in the source indexer, which did not match what the binary indexer emits.
Tested that this works for both simple and multiple-bound generics (in the latter case, the first bound is the erasure, by definition).
All javatests pass. Ran google-java-format on the changes.
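The erasure rule described above can be sketched as a tiny helper (hypothetical, for illustration only, not code from this PR):

```python
def erasure(bounds):
    """Return the JVM erasure name for a type parameter with the given bounds.

    Per the JLS: an unbounded parameter <T> erases to java.lang.Object,
    and a bounded parameter <T extends A & B> erases to its first bound A.
    """
    return bounds[0] if bounds else "java.lang.Object"

assert erasure([]) == "java.lang.Object"                     # <T>
assert erasure(["java.lang.Number"]) == "java.lang.Number"   # <T extends Number>
assert erasure(["java.io.Serializable",
                "java.lang.Cloneable"]) == "java.io.Serializable"
```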
|
gharchive/pull-request
| 2018-06-05T22:39:32 |
2025-04-01T04:34:24.454745
|
{
"authors": [
"benjyw"
],
"repo": "google/kythe",
"url": "https://github.com/google/kythe/pull/120",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
575230913
|
Encryption of LevelDB
I need to encrypt my LevelDB files and decrypt them in memory when I need them.
How can I load LevelDB from memory?
If that's not possible, can you add this feature, or make another package for it?
I don't want to decrypt the files and write them back to disk just to load them.
Hi @berrycof. leveldb doesn't support encryption and we don't plan on adding it as the goal is to keep the API simple and avoid feature creep. However there are two ways for you to encrypt data in leveldb today. The first, if I understand your idea, is to encrypt the data before you write to the db. This will likely result in poor performance if the keys are encrypted. Alternatively you can implement your own leveldb::Env, or override the provided one. This has the advantage of encrypting all data being written to disk - not just your key/value pairs. Good luck.
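The first suggestion, encrypting values before they go through the db API, can be sketched as follows. This is an illustrative Python sketch, not real LevelDB usage: a dict stands in for the db handle, and XOR with a fixed key stands in for a real cipher (use a vetted AEAD such as AES-GCM in practice):

```python
class EncryptedStore:
    """Sketch of 'encrypt the data before you write to the db'.

    A plain dict stands in for the LevelDB handle, and XOR with a fixed
    key stands in for a real cipher -- NOT real cryptography.
    Note the caveat above: encrypting *keys* too would break ordered
    iteration and likely hurt performance.
    """
    def __init__(self, key: bytes):
        self._key = key
        self._db = {}  # stand-in for a real leveldb handle

    def _xor(self, data: bytes) -> bytes:
        return bytes(b ^ self._key[i % len(self._key)]
                     for i, b in enumerate(data))

    def put(self, k: bytes, v: bytes) -> None:
        self._db[k] = self._xor(v)   # only the value is encrypted

    def get(self, k: bytes) -> bytes:
        return self._xor(self._db[k])

store = EncryptedStore(key=b"not-a-real-key")
store.put(b"test-0", b"f" * 1000)
assert store.get(b"test-0") == b"f" * 1000
assert store._db[b"test-0"] != b"f" * 1000  # stored bytes are not plaintext
```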
Hello @cmumford, I'm new to C++ and LevelDB. Can you clarify your idea of encrypting the data before writing to the db, please:
"Alternatively you can implement your own leveldb::Env, or override the provided one"
Thank you.
I just meant that just before the Env did the actual write (for example, by calling WriteUnbuffered) it would encrypt the data. However, in practice, this would be very difficult. An alternative is to get a library that does the file encryption for you, and just use it. File-based encryption is difficult to implement, and adding encryption properly is extremely difficult to get right. I strongly recommend going with an existing solution instead of implementing your own. For example, can you use the OS features to encrypt the entire hard drive?
Thank you for the explanation @cmumford. I will try your approach: encrypt data before WriteUnbuffered in the env_posix.cc file and decrypt data in db_impl#Get.
After putting a log in WriteUnbuffered, I can see that the method is called not only by me but also by other places in the system (for example, writing MANIFEST data). How can I tell which data is mine?
You can't do it here if you only want your data encrypted. In that case, encrypt your keys/values before using the leveldb API.
But as you said: "I just meant that just before the Env did the actual write (like by calling WriteUnbuffered) it would encrypt the data." That is the process of writing data to a file, so it does not have the same meaning as encrypting a file.
I log out the data (using __android_log_print) in the WriteUnbuffered method, but I don't understand what happens to the data. Can you explain or suggest how I can encrypt data in this format:
WriteUnbuffered size = 1029 ,data = �z
��
and the Java code used to put it into the db:
char[] data = new char[1000];
Arrays.fill(data, 'f');
String str = new String(data);
db.put("test-0", str.getBytes());
Thank you.
|
gharchive/issue
| 2020-03-04T08:48:22 |
2025-04-01T04:34:24.466117
|
{
"authors": [
"berrycof",
"cmumford",
"suntzu93",
"thanhlv93"
],
"repo": "google/leveldb",
"url": "https://github.com/google/leveldb/issues/782",
"license": "bsd-3-clause",
"license_type": "permissive",
"license_source": "bigquery"
}
|
114892482
|
Drawer button doesn't have ripple effect
Hi,
The drawer button doesn't have ripple effect when it's added to header.
I can't find that part in the MD specs but the ripple effect is present on all Google's Android apps.
You are right. I felt like we had an issue for this before but I can’t find it.
@sgomes Do we want to address this in 1.1 or the rewrite (and therefore v2)?
I think we should handle this in 2.0. Handling this in 1.1 means introducing more coupling between ripple and layout, whereas in 2.0 we should make Ripple a complete standalone component and will already be doing a lot of layout work. So adding this in means more logic to undo later. Best to just wait and do it all in one swoop.
+1, this sounds like 2.0 work.
Added to #1735. Closing this. Thanks for issuing, @jfily :)
|
gharchive/issue
| 2015-11-03T19:35:10 |
2025-04-01T04:34:24.473493
|
{
"authors": [
"Garbee",
"jfily",
"sgomes",
"surma"
],
"repo": "google/material-design-lite",
"url": "https://github.com/google/material-design-lite/issues/1842",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
122289088
|
Fix duplicate checkFocus method.
Fixes recent merge that went awry with closure.
To fix the build I'm just merging now. I can't break it any more, can I?
|
gharchive/pull-request
| 2015-12-15T14:54:04 |
2025-04-01T04:34:24.474354
|
{
"authors": [
"Garbee"
],
"repo": "google/material-design-lite",
"url": "https://github.com/google/material-design-lite/pull/2016",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1232775798
|
scaled 3d landmarks Objectron
Solution: Objectron
Language: Javascript
I'm trying to render a cube on top of the predicted landmarks of the shoe using Three.js and the Objectron JavaScript API.
The model predicts 2d landmarks, and the 3d_landmarks are obtained using the EPnP algorithm.
If I 'unproject' the 2d landmarks into world space, I obtain world coordinates that I expect to match the predicted 3d_landmarks (according to the coordinates documentation).
The problem is that the 3d coordinates never reach the borders of the canvas; they seem to be scaled such that they never reach the correspondingly larger values of the 2d unprojected landmarks.
Judging from this issue, it may be a scale problem, but I would like to understand how the focal length and the principal point influence the solution.
Maybe it's because I didn't set the focal length and the principal point, but I don't understand whether those are related to the hardware device I use, to some projection matrix, or to the camera object's parameters.
I would like to better understand the 3d_landmarks with respect to the camera.
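For reference, here is how the focal length and principal point enter the picture under a standard pinhole camera model; this is a generic sketch with hypothetical intrinsics, not Objectron's actual defaults:

```python
import numpy as np

def unproject(u, v, depth, fx, fy, cx, cy):
    """Back-project a pixel (u, v) at a given depth into camera space.

    Pinhole model: u = fx * X/Z + cx and v = fy * Y/Z + cy, so scaling
    the focal length directly rescales the recovered X/Y -- wrong
    (fx, fy) values make unprojected 2d landmarks disagree in scale
    with the predicted 3d landmarks.
    """
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

# Hypothetical intrinsics for a 640x480 image.
fx = fy = 500.0
cx, cy = 320.0, 240.0
p = unproject(420.0, 340.0, 2.0, fx, fy, cx, cy)  # -> [0.4, 0.4, 2.0]
```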
Can somebody help with this?
Did you find a way to make this work? I have been trying to achieve the same result.
Hello @UX3D-mazzini
We are ending support for these MediaPipe Legacy Solutions, but upgrading the others. However, the libraries, documentation, and source code for all the MediaPipe Legacy Solutions will continue to be available in our GitHub repository and through library distribution services, such as Maven and NPM.
You can continue to use those legacy solutions in your applications if you choose. Though, we would request you to check new MediaPipe solutions which can help you more easily build and customize ML solutions for your applications. These new solutions will provide a superset of capabilities available in the legacy solutions.
|
gharchive/issue
| 2022-05-11T14:47:28 |
2025-04-01T04:34:24.479313
|
{
"authors": [
"UX3D-mazzini",
"ayushgdev",
"kaugray"
],
"repo": "google/mediapipe",
"url": "https://github.com/google/mediapipe/issues/3341",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
70051555
|
"terekomu" → テレ込む
When I type "terekomu" and convert, the IME does not propose テレコム (a very common word) but テレ込む, which makes no sense.
I had noticed IBus-Anthy had the same problem, which is why I switched to Mozc, but it seems to have the same problem. Can anyone confirm?
Original issue reported on code.google.com by nicolas.raoul@gmail.com on 9 May 2011 at 1:14
Added evaluation test cases in this issue.
https://github.com/google/mozc/blob/master/src/data/test/quality_regression_test/oss.tsv
https://github.com/google/mozc/blob/master/src/data/dictionary_oss/evaluation.tsv
|
gharchive/issue
| 2015-04-22T07:52:11 |
2025-04-01T04:34:24.484778
|
{
"authors": [
"GoogleCodeExporter",
"hiroyuki-komatsu"
],
"repo": "google/mozc",
"url": "https://github.com/google/mozc/issues/82",
"license": "bsd-3-clause",
"license_type": "permissive",
"license_source": "bigquery"
}
|
249330238
|
hello? I cannot JNI build mannually in ECLIPSE
After entering the /jni directory and typing "ndk-build" in a CMD terminal, it says:
Android NDK: Your APP_BUILD_SCRIPT points to an unknown file:path/to/jni/Android.mk
Could you give me some advice on how to test this project in Eclipse?
In Eclipse I can test a bundle of stuff in one window, and Eclipse doesn't take as much time to automatically analyse the code before I actually want to analyse it.
My machine is underpowered and I don't want to launch the heavyweight Android Studio just to test your code.
I dragged the Android.mk from branch one, but the terminal returned ERROR 1:
path/to/jni/dx7note.cc:174: error: undefined reference to 'PitchEnv::set(int const*, int const*)'
path/to/jni/dx7note.cc:184: error: undefined reference to 'PitchEnv::getsample()'
path/to/jni/dx7note.cc:203: error: undefined reference to 'exp2tab'
path/to/jni/dx7note.cc:208: error: undefined reference to 'PitchEnv::keydown(bool)'
path/to/jni/sawtooth.cc:207: error: undefined reference to 'exp2tab'
path/to/jni/synth_unit.cc:51: error: undefined reference to 'Exp2::init()'
path/to/jni/synth_unit.cc:52: error: undefined reference to 'Tanh::init()'
path/to/jni/synth_unit.cc:54: error: undefined reference to 'Lfo::init(double)'
path/to/jni/synth_unit.cc:55: error: undefined reference to 'PitchEnv::init(double)'
path/to/jni/synth_unit.cc:116: error: undefined reference to 'Lfo::reset(char const*)'
path/to/jni/synth_unit.cc:149: error: undefined reference to 'Lfo::keydown()'
path/to/jni/synth_unit.cc:257: error: undefined reference to 'Lfo::getsample()'
path/to/jni/synth_unit.cc:258: error: undefined reference to 'Lfo::getdelay()'
HELP!
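One plausible cause of these undefined references is that the translation units defining PitchEnv, Lfo, Exp2, and Tanh are not listed in the Android.mk's LOCAL_SRC_FILES. A hypothetical fragment (module and file names are guesses, not verified against the repository):

```makefile
LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)
LOCAL_MODULE := synth
# Hypothetical source list -- the undefined symbols (PitchEnv, Lfo,
# Exp2, Tanh) suggest files like these are missing from the link:
LOCAL_SRC_FILES := dx7note.cc sawtooth.cc synth_unit.cc \
                   pitchenv.cc lfo.cc exp2.cc
include $(BUILD_SHARED_LIBRARY)
```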
For now I will study OpenSynth instead.
|
gharchive/issue
| 2017-08-10T12:29:53 |
2025-04-01T04:34:24.490815
|
{
"authors": [
"KnIfER"
],
"repo": "google/music-synthesizer-for-android",
"url": "https://github.com/google/music-synthesizer-for-android/issues/19",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
1329416583
|
Rendering scripts
Hi,
Do you have a Python script to render NeRF's original blend files?
Best,
This should be what you want. LMK if you need more help.
|
gharchive/issue
| 2022-08-05T04:02:45 |
2025-04-01T04:34:24.492225
|
{
"authors": [
"derrick-xwp",
"xiumingzhang"
],
"repo": "google/nerfactor",
"url": "https://github.com/google/nerfactor/issues/22",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
2413991911
|
feat(status): add modal status message option that can be closed back to a normal status message
I tried to be unopinionated with the styling, keeping it minimal/using original status message styling. I am not blocking any input while the modal is up. Multiple modals at the same time get arranged in a list from oldest to newest. Perhaps I should reverse the order.
Here is an example of it in action (if you aren't logged in, incognito tab is good for testing this)
https://spelunker.cave-explorer.org/#!{"dimensions":{"x":[8e-9%2C"m"]%2C"y":[8e-9%2C"m"]%2C"z":[4e-8%2C"m"]}%2C"position":[122597.5%2C95833.5%2C21354.5]%2C"crossSectionScale":1%2C"projectionScale":262144%2C"layers":[{"type":"new"%2C"source":"graphene://middleauth+https://minnie.microns-daf.com/segmentation/table/minnie65_public"%2C"tab":"source"%2C"name":"minnie65_public"}]%2C"selectedLayer":{"visible":true%2C"layer":"minnie65_public"}%2C"layout":"4panel"}
|
gharchive/pull-request
| 2024-07-17T15:58:54 |
2025-04-01T04:34:24.494852
|
{
"authors": [
"chrisj"
],
"repo": "google/neuroglancer",
"url": "https://github.com/google/neuroglancer/pull/618",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
250475580
|
Switch from request to phin!
request is way too large and has way too many dependencies! Use phin instead!
@googlebot I signed it!
Moved to #14
|
gharchive/pull-request
| 2017-08-16T00:31:51 |
2025-04-01T04:34:24.496441
|
{
"authors": [
"bdsomer"
],
"repo": "google/node-gtoken",
"url": "https://github.com/google/node-gtoken/pull/13",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
338979969
|
INVALID_REFERER for ekey without referer
I can't make a request with an ekey work without an HTTP referrer.
It always returns INVALID_REFERER:
{"error_message":"Needs Google API key to locate addresses","status":"NO_API_KEY","keyerror":"INVALID_REFERER"}
The wiki says the referrer is only needed for extra security.
Has it become mandatory now?
PS: I also wonder whether Plus Codes can be requested directly via the Google Geocoding API?
I'm currently developing an Android app, and the Geocoding response never has a Plus Codes component.
@drinckes to explain ekey/referrer behavior
Plus codes in the Geocoding API is being worked on right now! Doug, do we have an ETA we can share?
Looks like there's a bug handling these requests. I'll look into it now.
Hi @mu10 sorry about that, there was a bug in the processing of encrypted keys. I've fixed it and pushed an update to the live site, your requests should work now. Thanks for letting us know about it, let us know if it recurs.
As @zongweil mentioned, we're working on including plus codes in the normal Google APIs. It's already in the Places API, but we're still working on the Geocoding API. Once done we'll announce to the mailing list.
|
gharchive/issue
| 2018-07-06T15:25:32 |
2025-04-01T04:34:24.512280
|
{
"authors": [
"drinckes",
"mu10",
"zongweil"
],
"repo": "google/open-location-code",
"url": "https://github.com/google/open-location-code/issues/178",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
2235392710
|
Unsatisfied linear constraint with no free enforcement literal: enforcement_literal [occurs sometimes]
What version of OR-Tools and what language are you using?
CP-SAT solver v9.9.3963 Release version
Language: Minizinc
Which solver are you using (e.g. CP-SAT, Routing Solver, GLOP, BOP, Gurobi)
CP-SAT
What operating system (Linux, Windows, ...) and version?
Windows
What did you do?
The original minizinc mzn/dzn combination was compiled to fzn and executed against fzn-cp-sat.exe. I typically run the model in parallel with 24 CPUs.
What did you expect to see?
Most of the time the solver completes successfully [see attached result_successful_solve.txt].
What did you see instead?
About once every five runs, the executable terminates with an error:
F0410 10:29:09.107739 44632 cp_model_postsolve.cc:105] Unsatisfied linear constraint with no free enforcement literal: enforcement_literal: 2323 linear { vars: 2305 coeffs: 1 domain: 0 domain: 0 }
Make sure you include information that can help us debug (full error message, model Proto).
I've included the fzn file.
Anything else we should know about your project / environment
I'm currently running fzn-cp-sat in a Windows 10 environment.
ortools_error.zip
I can reproduce.
It would be super nice if you could reduce the size of the model.
Dear Laurent,
I've attempted to reduce the model's size, but the likelihood of the error occurring now seems lower. However, I could still reproduce the issue with the attached model:
fzn-cp-sat.exe error_2024_04_10_reduced.fzn --time_limit=120 --threads=24 --cp_model_stats=true --display_all_solutions=true --statistics=true --cp_trace_search=true --v=99999 1>result_002.txt
F0411 09:35:05.807726 14216 cp_model_postsolve.cc:106] Unsatisfied linear constraint with no free enforcement literal: enforcement_literal: -1191 linear { vars: 1184 coeffs: 1 domain: 0 domain: 0 }
*** Check failure stack trace: ***
@ 00007FF66DC8BFE9 (unknown)
@ 00007FF66D9B58ED (unknown)
@ 00007FF66D9B4620 (unknown)
@ 00007FF66D9B5392 (unknown)
@ 00007FF66D93392B (unknown)
@ 00007FF66D7A26A2 (unknown)
@ 00007FF66DB3AB24 (unknown)
@ 00007FF66DB3A55F (unknown)
@ 00007FFEEC381BB2 (unknown)
@ 00007FFEECF87614 (unknown)
@ 00007FFEEEA026B1 (unknown)
I've included two additional txt files: result_001.txt shows where it succeeded, and result_002.txt where it failed.
I have not been able to reduce the size of the model further as yet.
Sincere regards.
ortools_error_2.zip
Thanks.
|
gharchive/issue
| 2024-04-10T11:42:41 |
2025-04-01T04:34:24.521949
|
{
"authors": [
"forceoffire",
"lperron"
],
"repo": "google/or-tools",
"url": "https://github.com/google/or-tools/issues/4176",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
888773298
|
Save function names in presets
Get rid of hashes in preset files.
I did not look at the code, but I assume this will break compatibility with old preset files? If that is the case, please make sure the corresponding bug is included in the sprint hotlist and has a release note
It won't.
|
gharchive/pull-request
| 2021-05-11T21:43:03 |
2025-04-01T04:34:24.523699
|
{
"authors": [
"antonrohr",
"dimitry-"
],
"repo": "google/orbit",
"url": "https://github.com/google/orbit/pull/2311",
"license": "bsd-2-clause",
"license_type": "permissive",
"license_source": "bigquery"
}
|
2625217569
|
fix(semantic): support parsing versions without a numeric component
While I'm pretty sure these are technically invalid, they're easy to support, and without this, Alpine is the only comparator that panics when parsing an empty string, which I think is a little sad.
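The lenient behavior described here can be sketched in Python (the project itself is Go): treat a missing numeric component, including the empty string, as an empty list of numbers rather than an error:

```python
import re

def parse_version(s: str):
    """Lenient parse: split a version into numeric components and a suffix.

    Instead of panicking on inputs with no numeric component (e.g. ""
    or "r3"), treat the numeric part as empty and keep the rest as a
    suffix, so comparison code can still order such versions.
    """
    m = re.match(r"(\d*(?:\.\d+)*)(.*)", s)
    numbers = [int(n) for n in m.group(1).split(".") if n != ""]
    return numbers, m.group(2)

assert parse_version("1.2.3") == ([1, 2, 3], "")
assert parse_version("") == ([], "")      # no panic on empty input
assert parse_version("r3") == ([], "r3")  # no numeric component
```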
Codecov Report
All modified and coverable lines are covered by tests :white_check_mark:
Project coverage is 68.28%. Comparing base (8af6458) to head (fedd017).
Additional details and impacted files
@@ Coverage Diff @@
## main #1365 +/- ##
==========================================
- Coverage 68.41% 68.28% -0.14%
==========================================
Files 183 183
Lines 17620 17623 +3
==========================================
- Hits 12055 12034 -21
- Misses 4900 4922 +22
- Partials 665 667 +2
:umbrella: View full report in Codecov by Sentry.
|
gharchive/pull-request
| 2024-10-30T20:02:18 |
2025-04-01T04:34:24.528413
|
{
"authors": [
"G-Rath",
"codecov-commenter"
],
"repo": "google/osv-scanner",
"url": "https://github.com/google/osv-scanner/pull/1365",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
505848148
|
Issue while playing Widevine protected content
Hello,
We are trying to use Shaka Player Embedded integrated with Widevine DRM. Compiling CDM into Shaka Player Embedded worked smoothly according to the CDM and Shaka Player Embedded documentation.
Content that is not encrypted works fine, as expected.
When switching to DRM-protected content, the player just freezes when trying to display encrypted content. Tests were done using Widevine's sample content and content prepared with Shaka Packager. The content/proxy below has problems only on iOS; everything works fine with ExoPlayer and in the browser.
content: https://storage.googleapis.com/wvmedia/cenc/h264/tears/tears_sd.mpd
proxy: https://proxy.staging.widevine.com/proxy
Same results on iPhone 5s and iPhone X.
According to the proxy logs, the license was issued correctly. The Widevine team suggested I submit this issue; below is the code used to start the player, followed by the logs. Any thoughts on this problem?
How the player framework was used:
#import "ViewController.h"
#import <ShakaPlayerEmbedded/ShakaPlayerEmbedded.h>
@implementation ViewController
- (void)viewWillAppear:(BOOL)animated {
[super viewWillAppear:animated];
// Make a Shaka Player view.
ShakaPlayerView *player = [[ShakaPlayerView alloc] init];
player.frame = self.view.bounds;
[self.view addSubview:player];
[player setClient:NULL];
// Load and play an asset.
// all ok - clear
//[player load:@"https://storage.googleapis.com/shaka-demo-assets/angel-one/dash.mpd" withBlock:^(ShakaPlayerError *error){if (error) {NSLog(@"###error");} else {NSLog(@"###ok");}}];
// not working - encrypted
[player load:@"https://storage.googleapis.com/wvmedia/cenc/h264/tears/tears_sd.mpd" withBlock:^(ShakaPlayerError *error){if (error) {NSLog(@"###error");} else {NSLog(@"###ok");}}];
[player configure:@"drm.servers.com\\.widevine\\.alpha" withString:@"https://proxy.staging.widevine.com/proxy"];
[player play];
}
@end
Application log for iPhone 5s is below:
[Info]: "Starting attach..."
[Info]: "Starting load of https://storage.googleapis.com/wvmedia/cenc/h264/tears/tears_sd.mpd..."
[Log]: "Found variant with audio and video content, so filtering out audio-only content in all periods."
[Info]: "Created MediaKeys object for key system" "com.widevine.alpha"
[Log]: "codecs" "avc1-mp4a" "avg bandwidth" 1409174
[Log]: "onChooseStreams_" {startTime:0, textStreams:[...], variants:[...]}
[Log]: "Choosing new streams after period changed"
[Log]: "init: completed initial Stream setup"
[Log]: "Ignoring duplicate init data."
[Log]: "Ignoring duplicate init data."
[Log]: "Ignoring duplicate init data."
[Log]: "Ignoring duplicate init data."
[Warn]: "No preferred audio language set. We will choose an arbitrary language initially"
2019-10-09 18:14:59.461739+0300 testShaka[8314:7569067] ###ok
[Log]: "(audio:1)" "looking up segment:" "presentationTime=0" "currentPeriod.startTime=0"
[Log]: "(video:2)" "looking up segment:" "presentationTime=0" "currentPeriod.startTime=0"
[INFO:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/cdm_engine.cpp(120):OpenSession] CdmEngine::OpenSession
[DEBUG:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/crypto_session.cpp(644):Open] CryptoSession::Open: requested_security_level: Default
[INFO:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/usage_table_header.cpp(38):Init] UsageTableHeader::Init: security level: 3
[INFO:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/usage_table_header.cpp(70):Init] UsageTableHeader::Init: number of usage entries: 0
[INFO:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/device_files.cpp(155):ExtractDeviceInfo] ExtractDeviceInfo Entry
[INFO:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/cdm_engine.cpp(156):OpenSession] CdmEngine::OpenSession: ksidA149B344
[INFO:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/cdm_engine.cpp(529):QueryStatus] CdmEngine::QueryStatus
[INFO:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/cdm_engine.cpp(242):GenerateKeyRequest] CdmEngine::GenerateKeyRequest: ksidA149B344
[DEBUG:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/license.cpp(336):PrepareKeyRequest] PrepareKeyRequest: nonce=2502408683
[INFO:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/cdm/src/cdm.cpp(691):generateRequest] A license request has been generated.
[INFO:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/cdm_engine.cpp(120):OpenSession] CdmEngine::OpenSession
[DEBUG:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/crypto_session.cpp(644):Open] CryptoSession::Open: requested_security_level: Default
[INFO:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/device_files.cpp(155):ExtractDeviceInfo] ExtractDeviceInfo Entry
[INFO:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/cdm_engine.cpp(156):OpenSession] CdmEngine::OpenSession: ksidDF5BE7F3
[INFO:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/cdm_engine.cpp(529):QueryStatus] CdmEngine::QueryStatus
[INFO:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/cdm_engine.cpp(242):GenerateKeyRequest] CdmEngine::GenerateKeyRequest: ksidDF5BE7F3
[DEBUG:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/license.cpp(336):PrepareKeyRequest] PrepareKeyRequest: nonce=2818731362
[INFO:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/cdm/src/cdm.cpp(691):generateRequest] A license request has been generated.
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x10401fc00] Protocol name not provided, cannot determine if input is local or a network protocol, buffers and access patterns cannot be configured optimally without knowing the protocol
WARNING: Logging before InitGoogleLogging() is written to STDERR
I1009 18:15:00.359156 1869606912 media_processor.cc:456] No hardware-accelerators available, using decoder: aac_at
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x10402d800] Protocol name not provided, cannot determine if input is local or a network protocol, buffers and access patterns cannot be configured optimally without knowing the protocol
[h264 @ 0x103060c00] Reinit context to 320x144, pix_fmt: yuv420p
I1009 18:15:00.475390 1876488192 media_processor.cc:462] Using decoder: h264, with hardware accelerator: videotoolbox
[Log]: "(video:2)" "startup complete"
[Log]: "(all) setting up Period 0"
[Log]: "(all) Stream 2 is being or has been set up"
[Log]: "(all) Stream 1 is being or has been set up"
[Log]: "(all) Period 0 is being or has been set up"
[h264 @ 0x1040a7800] Reinit context to 320x144, pix_fmt: videotoolbox_vld
[INFO:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/cdm_engine.cpp(331):AddKey] CdmEngine::AddKey: ksidDF5BE7F3
[INFO:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/cdm_engine.cpp(759):IsReleaseSession] CdmEngine::IsReleaseSession: ksidDF5BE7F3
[INFO:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/cdm_engine.cpp(331):AddKey] CdmEngine::AddKey: ksidA149B344
[INFO:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/cdm_engine.cpp(759):IsReleaseSession] CdmEngine::IsReleaseSession: ksidA149B344
[Log]: "canSwitch_"
[Log]: "Calling switch_(), bandwidth=2987 kbps"
[Log]: "switch_"
[Log]: "switch: switching to Stream (video:3)"
[Log]: "switch: Stream (audio:1) already active"
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x10401fc00] Failed to seek for auxiliary info, will only parse senc atoms for encryption info
[Log]: "Choosing new streams after key status changed"
[Log]: "(video:3)" "looking up segment:" "presentationTime=12" "currentPeriod.startTime=0"
I1009 18:15:03.058017 11381696 video_controller.cc:163] Dropped 1 frames.
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x10402d800] Found duplicated MOOV Atom. Stopping.
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x10402d800] Failed to seek for auxiliary info, will only parse senc atoms for encryption info
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x10402d800] Protocol name not provided, cannot determine if input is local or a network protocol, buffers and access patterns cannot be configured optimally without knowing the protocol
[h264 @ 0x103060c00] Reinit context to 864x384, pix_fmt: yuv420p
[h264 @ 0x103060c00] left block unavailable for requested intra4x4 mode -1
[h264 @ 0x103060c00] error while decoding MB 0 0, bytestream 43889
[h264 @ 0x103060c00] concealing 1296 DC, 1296 AC, 1296 MV errors in I frame
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/cdm_engine.cpp(1699):Decrypt] CdmEngine::Decrypt: session not found: Empty session ID
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/cdm/src/cdm.cpp(1193):decrypt] Key not available.
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/cdm_engine.cpp(1699):Decrypt] CdmEngine::Decrypt: session not found: Empty session ID
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/cdm/src/cdm.cpp(1193):decrypt] Key not available.
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/cdm_engine.cpp(1699):Decrypt] CdmEngine::Decrypt: session not found: Empty session ID
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/cdm/src/cdm.cpp(1193):decrypt] Key not available.
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/cdm_engine.cpp(1699):Decrypt] CdmEngine::Decrypt: session not found: Empty session ID
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/cdm/src/cdm.cpp(1193):decrypt] Key not available.
[Log]: "Calling switch_(), bandwidth=12272 kbps"
[Log]: "switch_"
[Log]: "switch: Stream (video:3) already active"
[Log]: "switch: Stream (audio:1) already active"
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x10401fc00] Failed to seek for auxiliary info, will only parse senc atoms for encryption info
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/cdm_engine.cpp(1699):Decrypt] CdmEngine::Decrypt: session not found: Empty session ID
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/cdm/src/cdm.cpp(1193):decrypt] Key not available.
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/cdm_engine.cpp(1699):Decrypt] CdmEngine::Decrypt: session not found: Empty session ID
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/cdm/src/cdm.cpp(1193):decrypt] Key not available.
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/cdm_engine.cpp(1699):Decrypt] CdmEngine::Decrypt: session not found: Empty session ID
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/cdm/src/cdm.cpp(1193):decrypt] Key not available.
[Log]: "Stall detected at 10.018999999999949 for 1.005000114440918 seconds. Seeking forward 0.1 seconds."
I1009 18:15:11.669751 1876488192 media_processor.cc:462] Using decoder: h264, with hardware accelerator: videotoolbox
[Log]: "(all): seeked: buffered seek: presentationTime=10.118999999999948"
I1009 18:15:11.682724 1869606912 media_processor.cc:456] No hardware-accelerators available, using decoder: aac_at
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/cdm_engine.cpp(1699):Decrypt] CdmEngine::Decrypt: session not found: Empty session ID
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/cdm/src/cdm.cpp(1193):decrypt] Key not available.
[h264 @ 0x1040b3800] Reinit context to 320x144, pix_fmt: videotoolbox_vld
I1009 18:15:11.784584 1869606912 media_processor.cc:456] No hardware-accelerators available, using decoder: aac_at
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/cdm_engine.cpp(1699):Decrypt] CdmEngine::Decrypt: session not found: Empty session ID
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/cdm/src/cdm.cpp(1193):decrypt] Key not available.
I1009 18:15:11.890714 1869606912 media_processor.cc:456] No hardware-accelerators available, using decoder: aac_at
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/cdm_engine.cpp(1699):Decrypt] CdmEngine::Decrypt: session not found: Empty session ID
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/cdm/src/cdm.cpp(1193):decrypt] Key not available.
I1009 18:15:11.993680 1869606912 media_processor.cc:456] No hardware-accelerators available, using decoder: aac_at
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/cdm_engine.cpp(1699):Decrypt] CdmEngine::Decrypt: session not found: Empty session ID
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/cdm/src/cdm.cpp(1193):decrypt] Key not available.
I1009 18:15:12.099826 1869606912 media_processor.cc:456] No hardware-accelerators available, using decoder: aac_at
[Log]: "Stall detected at 10.118999999999948 for 1.0209999084472656 seconds. Seeking forward 0.1 seconds."
[Log]: "(all): seeked: buffered seek: presentationTime=10.218999999999948"
I1009 18:15:12.962056 1876488192 media_processor.cc:462] Using decoder: h264, with hardware accelerator: videotoolbox
[h264 @ 0x103090e00] Reinit context to 320x144, pix_fmt: videotoolbox_vld
I1009 18:15:13.041841 1869606912 media_processor.cc:456] No hardware-accelerators available, using decoder: aac_at
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/cdm_engine.cpp(1699):Decrypt] CdmEngine::Decrypt: session not found: Empty session ID
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/cdm/src/cdm.cpp(1193):decrypt] Key not available.
I1009 18:15:13.147902 1869606912 media_processor.cc:456] No hardware-accelerators available, using decoder: aac_at
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/cdm_engine.cpp(1699):Decrypt] CdmEngine::Decrypt: session not found: Empty session ID
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/cdm/src/cdm.cpp(1193):decrypt] Key not available.
I1009 18:15:13.252373 1869606912 media_processor.cc:456] No hardware-accelerators available, using decoder: aac_at
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/cdm_engine.cpp(1699):Decrypt] CdmEngine::Decrypt: session not found: Empty session ID
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/cdm/src/cdm.cpp(1193):decrypt] Key not available.
I1009 18:15:13.359730 1869606912 media_processor.cc:456] No hardware-accelerators available, using decoder: aac_at
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/cdm_engine.cpp(1699):Decrypt] CdmEngine::Decrypt: session not found: Empty session ID
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/cdm/src/cdm.cpp(1193):decrypt] Key not available.
[....]
[Log]: "Stall detected at 11.618999999999943 for 1.0049998760223389 seconds. Seeking forward 0.1 seconds."
[Log]: "(all): seeked: buffered seek: presentationTime=11.718999999999943"
I1009 18:15:32.081878 1876488192 media_processor.cc:462] Using decoder: h264, with hardware accelerator: videotoolbox
I1009 18:15:32.090919 1869606912 media_processor.cc:456] No hardware-accelerators available, using decoder: aac_at
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/cdm_engine.cpp(1699):Decrypt] CdmEngine::Decrypt: session not found: Empty session ID
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/cdm/src/cdm.cpp(1193):decrypt] Key not available.
[h264 @ 0x10307f600] Reinit context to 320x144, pix_fmt: videotoolbox_vld
I1009 18:15:32.195667 1869606912 media_processor.cc:456] No hardware-accelerators available, using decoder: aac_at
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/cdm_engine.cpp(1699):Decrypt] CdmEngine::Decrypt: session not found: Empty session ID
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/cdm/src/cdm.cpp(1193):decrypt] Key not available.
I1009 18:15:32.302528 1869606912 media_processor.cc:456] No hardware-accelerators available, using decoder: aac_at
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/cdm_engine.cpp(1699):Decrypt] CdmEngine::Decrypt: session not found: Empty session ID
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/cdm/src/cdm.cpp(1193):decrypt] Key not available.
I1009 18:15:32.409524 1869606912 media_processor.cc:456] No hardware-accelerators available, using decoder: aac_at
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/cdm_engine.cpp(1699):Decrypt] CdmEngine::Decrypt: session not found: Empty session ID
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/cdm/src/cdm.cpp(1193):decrypt] Key not available.
I1009 18:15:32.490700 1876488192 media_processor.cc:462] Using decoder: h264, with hardware accelerator: videotoolbox
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/cdm_engine.cpp(1699):Decrypt] CdmEngine::Decrypt: session not found: Empty session ID
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/cdm/src/cdm.cpp(1193):decrypt] Key not available.
I1009 18:15:32.512061 1869606912 media_processor.cc:456] No hardware-accelerators available, using decoder: aac_at
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/cdm_engine.cpp(1699):Decrypt] CdmEngine::Decrypt: session not found: Empty session ID
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/cdm/src/cdm.cpp(1193):decrypt] Key not available.
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/cdm_engine.cpp(1699):Decrypt] CdmEngine::Decrypt: session not found: Empty session ID
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/cdm/src/cdm.cpp(1193):decrypt] Key not available.
I1009 18:15:32.617478 1869606912 media_processor.cc:456] No hardware-accelerators available, using decoder: aac_at
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/cdm_engine.cpp(1699):Decrypt] CdmEngine::Decrypt: session not found: Empty session ID
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/cdm/src/cdm.cpp(1193):decrypt] Key not available.
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/cdm_engine.cpp(1699):Decrypt] CdmEngine::Decrypt: session not found: Empty session ID
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/cdm/src/cdm.cpp(1193):decrypt] Key not available.
I1009 18:15:32.720180 1869606912 media_processor.cc:456] No hardware-accelerators available, using decoder: aac_at
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/cdm_engine.cpp(1699):Decrypt] CdmEngine::Decrypt: session not found: Empty session ID
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/cdm/src/cdm.cpp(1193):decrypt] Key not available.
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/core/src/cdm_engine.cpp(1699):Decrypt] CdmEngine::Decrypt: session not found: Empty session ID
[ERROR:/var/lib/jenkins/workspace/CDM_Arxan_Create_Release/oemcrypto-arxan/third_party/cdm/cdm/src/cdm.cpp(1193):decrypt] Key not available.
[....]
It works fine for me on both an armv7 and arm64 device. What version of the CDM and Shaka Player Embedded are you using? What are the flags you passed to ./configure?
Steps I performed:
Built the Embedded framework manually
../configure --ios --cpu arm64 --eme-impl ~/prebuilt_cdm/shaka_plugin/dev_cdm.json
Copied framework to sample_xcode_project
Edited sample_xcode_project/build.sh to do nothing
Edited sample_xcode_project/sample_xcode_project/assets.plist to include your new asset
Launched sample project and selected from list
If you define a client, do you get any error logs?
@interface ViewController () <ShakaPlayerClient>
@end
@implementation ViewController
- (void)onPlayerError:(ShakaPlayerError *)error {
  NSLog(@"Async error: %@", [error message]);
}
...
@end
[player setClient:self];
To help debug, you could edit this part of the decrypt operation to log the requested key ID and the existing key IDs (using cdm->GetKeyStatuses). You could also log here for when we get the license back. The Widevine CDM synchronously loads the session, so after it returns we should have the keys.
CDM version used is 15.2.1. Shaka Player Embedded downloaded from GitHub.
The Embedded framework was built by running configure --ios --eme-impl ~/shaka/prebuilt_cdm/shaka_plugin/dev_cdm.json
No changes to assets.plist or build.sh.
Just passed the asset as seen in the code.
Will try the way you suggested, adding more logs, and get back to you with an answer.
Retrying what you suggested and doing everything mentioned above did not work.
Reinstalling macOS (back to Mojave) and Xcode, then recompiling everything, made things work flawlessly. The versions of Shaka Player Embedded and the CDM used were the latest from the repositories.
Sorry for the trouble, and thanks for your involvement.
|
gharchive/issue
| 2019-10-11T13:18:29 |
2025-04-01T04:34:24.600385
|
{
"authors": [
"TheModMaker",
"onefarad"
],
"repo": "google/shaka-player-embedded",
"url": "https://github.com/google/shaka-player-embedded/issues/58",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
90607967
|
TTML support
In the latest release of DashJS (1.4.0), they added support for TTML.
It would be perfect if support for TTML could be added to Shaka Player.
Test stream: http://dash.edgesuite.net/dash264/TestCases/4b/qualcomm/1/ED_OnDemand_5SecSeg_Subtitles.mpd
<AdaptationSet mimeType="application/ttml+xml" lang="en">
<Role schemeIdUri="urn:mpeg:dash:role" value="subtitle" />
<Representation id="7" bandwidth="268" >
<BaseURL>English_track.xml</BaseURL>
</Representation>
</AdaptationSet>
<AdaptationSet mimeType="application/ttml+xml" lang="ge">
<Role schemeIdUri="urn:mpeg:dash:role" value="subtitle" />
<Representation id="8" bandwidth="268" >
<BaseURL>German_track.xml</BaseURL>
</Representation>
</AdaptationSet>
Thanks for the request. At the moment, we rely on the browser's subtitle support. On Chrome or Firefox, Shaka Player will do WebVTT.
For context, Shaka Player has made a point of not doing any DOM manipulation or CSS so far. Both of those things would have to change in order for us to render any subtitle format not directly supported by the browser. It's doable, but would require some architectural changes first.
We can't fit this into our schedule right now, but we'll keep it in mind.
Related to this and a possible alternative for some people, how difficult would it be to add an option to sidecar load WebVTT subtitles?
Please see #133 for a separate discussion on sidecar loading (outside of the MPD) WebVTT subtitles.
Hi, my team is watching this issue. I have a couple of questions that I hope you can answer to help me plan:
Are you implementing TTML Language 1 or TTML Language 2 support? I don't see references to TTML2, so I assume you are following the TTML Language 1 specification. Is that right?
Does this work include support for chunked TTML for Live streams?
Hi Doug,
We are implementing partial TTML 1 support. We are using the TTML 1 spec, but we are only implementing what can be represented in the browser's native cues interface. That means we can't implement extents or origins based on pixel sizes, animation, etc.
Shaka v2 betas already support chunked/segmented subtitles, but only in the specific formats we parse. Currently, that's the textual forms of WebVTT and TTML. We do not yet support text streams embedded in MP4 (#277 and #278).
One important note about segmented text is that just like audio and video, segments must exist to cover the entire period. If there are no cues to display during a given segment's time, the segment would be a valid WebVTT or TTML file, but would contain no cues.
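For instance (this is an illustrative fragment, not from the thread), a segment covering a cue-free interval can be a minimal but valid WebVTT file consisting of just the signature line:

```
WEBVTT
```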
|
gharchive/issue
| 2015-06-24T08:27:31 |
2025-04-01T04:34:24.606970
|
{
"authors": [
"dougdoe",
"joeyparrish",
"sanbornhnewyyz",
"sirkro",
"tdrews"
],
"repo": "google/shaka-player",
"url": "https://github.com/google/shaka-player/issues/111",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
442148440
|
[BUG] In android chrome incognito mode, Widevine DRM content is not playing and giving an error 6001
Have you read the FAQ and checked for duplicate open issues?
Yes
What version of Shaka Player are you using?
2.5.0
Can you reproduce the issue with our latest release version?
yes
Can you reproduce the issue with the latest code from master?
yes
Are you using the demo app or your own custom app?
custom
If custom app, can you reproduce the issue using our demo app?
yes
What browser and OS are you using?
android 7.0; Lenovo Chrome 74.0.3729.136
What are the manifest and license server URIs?
The URL has a short-lived token; however, our website is public: www.airtelxstream.in/movie/in-the-heart-of-the-sea/HOOQ_MOVIE_68234
What did you do?
open chrome in incognito mode on android phone
play widevine DRM content
it will give error 6001
The above error happens only in Android incognito mode; it works fine in a normal window
What did you expect to happen?
DRM content should play in android chrome incognito mode
What actually happened?
In android chrome incognito mode, Widevine DRM content is not playing and giving an error 6001
Please use an India VPN to access the website.
Dear Shaka Team,
Please help.
Chrome doesn't allow using Widevine in an incognito tab on Android. This was a deliberate decision since offline licenses won't work properly in incognito mode. On Desktop you can request Widevine support so long as you don't use persistent state. I would link to some Chrome bugs I found, but they are all restricted.
This is a Chrome issue and there is nothing we can do about it.
Your app should still be able to function in incognito mode if it doesn't use the offline storage features.
Thanks @TheModMaker for explaining; I understand that there is no way to play Widevine on Chrome on Android in incognito mode.
@joeyparrish, I am not using any offline storage features of Shaka Player, and I show an error screen for this case.
I have been using incognito mode in android chrome after the last update. It is very helpful for any one, but I have faced a lot of problems like confirm form resubmission, turn off confirm form resubmission, etc. Then I stopped browsing it. I have solved that issue after browsing [url=https://browsertechnicalsupportnumbers.com/blog/disable-confirm-form-resubmission-popup-chrome/]confirm form resubmission[/url], if you feel the same. Then you can also browse here. I am sure that you will get a valid solution.
|
gharchive/issue
| 2019-05-09T09:49:37 |
2025-04-01T04:34:24.617366
|
{
"authors": [
"JohnStokes02",
"TheModMaker",
"joeyparrish",
"vksbansal"
],
"repo": "google/shaka-player",
"url": "https://github.com/google/shaka-player/issues/1928",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
444880065
|
Content-Type not parsed in the correct way
Have you read the FAQ and checked for duplicate open issues? Yes
What version of Shaka Player are you using? 2.5.0
Can you reproduce the issue with our latest release version? Yes
Can you reproduce the issue with the latest code from master? Yes
Are you using the demo app or your own custom app? Custom
If custom app, can you reproduce the issue using our demo app? Yes
What browser and OS are you using? Google Chrome 74, Windows 10
For embedded devices (smart TVs, etc.), what model and firmware version are you using?
What are the manifest and license server URIs?
What did you do?
Load DASH stream
What did you expect to happen?
I expected Shaka Player to correctly parse
Content-Type: application/dash+xml;charset=utf-8;profiles="urn:mpeg:dash:profile:isoff-live:2011,urn:hbbtv:dash:profile:isoff-live:2012", set the MIME type to application/dash+xml, and play the stream.
What actually happened?
Shaka throws error 4000, UNABLE_TO_GUESS_MANIFEST_TYPE.
I think this is caused by the stream's response returning a Content-Type header that Shaka Player doesn't parse correctly.
I did manage to fix this by going to ../lib/media/manifest_parser.js in the Shaka Player directory. In the method shaka.media.ManifestParser.getFactory_, I added mimeType = 'application/dash+xml'; at the beginning of that function.
And that works fine, but it isn't really a good solution for me because it involves messing with the source code.
Any help would be appreciated. Thanks.
As a workaround, you can pass the MIME type to load: player.load(url, null, 'application/dash+xml'). But we should actually parse the Content-Type header instead of just using it in a string comparison.
Hi
We are having the same problem.
We are getting this url
https://csm-e-cepreurxaws101j8-5rarw7tm1jwg.bln1.yospace.com/179380410/1.mpd;jsessionid=895FAD293C4D1F0E52B527DA3AAA778D.csm-e-cepreurxaws101j8-5rarw7tm1jwg.bln1.yospace.com;jsessionid=895FAD293C4D1F0E52B527DA3AAA778D.csm-e-cepreurxaws101j8-5rarw7tm1jwg.bln1.yospace.com?yo.ac=true&ppid=88db9bcf94224e7fb10066553ab1fa61&externalId=tv2_mdash_avc1_nodrm
from Yospace. Shaka Player is unable to detect the content type from the extension.
It tries to resolve the content type by requesting the manifest file
https://github.com/google/shaka-player/blob/8c8552ae7cfbabede8ee6c5d3cf00f38622c2c67/lib/media/manifest_parser.js#L219
The server that hosts the manifest file returns this content type "application/dash+xml;charset=utf-8"
It compares the result here
https://github.com/google/shaka-player/blob/8c8552ae7cfbabede8ee6c5d3cf00f38622c2c67/lib/media/manifest_parser.js#L81
And because "application/dash+xml;charset=utf-8" is not in the list it throws 4000 error UNABLE_TO_GUESS_MANIFEST_TYPE.
I see 3 solutions:
1. Add "application/dash+xml;charset=utf-8" to the list
2. Split on ";" and take only the first part
3. Call load with the right content type, as @TheModMaker said
The problem with solution number 3 is that we are seeing the problem when using shaka player with the CAF SDK. The CAF SDK wrapper does not provide the load function with the content type, even when set in the CAF SDK.
This should be easy enough to fix by splitting the detected mime type on semicolon and using the first part only. Feel free to create a PR for this.
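A sketch of that fix (the helper name is hypothetical, not Shaka's actual code): strip any MIME parameters such as charset or profiles before comparing against the supported-type list.

```javascript
// Hypothetical helper: reduce a Content-Type header value to its base
// MIME type by dropping everything after the first ';' and normalizing
// whitespace and case.
function baseMimeType(contentType) {
  return contentType.split(';')[0].trim().toLowerCase();
}

// 'application/dash+xml;charset=utf-8' and plain 'application/dash+xml'
// now compare equal.
console.log(baseMimeType('application/dash+xml;charset=utf-8'));
```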
|
gharchive/issue
| 2019-05-16T10:35:25 |
2025-04-01T04:34:24.627835
|
{
"authors": [
"TheModMaker",
"joeyparrish",
"mimse",
"petar11199"
],
"repo": "google/shaka-player",
"url": "https://github.com/google/shaka-player/issues/1946",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
954662317
|
Mac OS Safari HLS playback does not work (3016)
Have you read the Tutorials?
Yes
Have you read the FAQ and checked for duplicate open issues?
Yes
What version of Shaka Player are you using?
3.2.0
Please ask your question
I have searched the issue list and couldn't find anyone with the same problem - I found issues related to iOS, FairPlay, or wrong content. But I am testing on a Mac (Big Sur) in Safari 14.1.2 with HLS content that I found on the Shaka Player demo page, and it works there. The HLS content also works when it's just the video with src, without Shaka Player.
Therefore I assume the content is OK and should be playable via src through Shaka Player. I used the skeleton demo app from the documentation and the latest Shaka Player from the CDN.
For some reason, the playback fails with 3016, as per the screenshot below:
Can you please check the code below - should it work? Am I missing something? I want to play the content via the src option (right now I'm testing clear HLS, but the ultimate goal is to have FairPlay support). I am sorry if it's a duplicate or a stupid question; I just couldn't find the solution anywhere :(
Thank you a lot!
Jakub
<!DOCTYPE html>
<html>
  <head>
    <script src="https://cdnjs.cloudflare.com/ajax/libs/shaka-player/3.2.0/shaka-player.compiled.debug.min.js"></script>
    <script>
      const manifestUri =
          'https://storage.googleapis.com/shaka-demo-assets/bbb-dark-truths-hls/hls.m3u8';

      function initApp() {
        shaka.polyfill.installAll();
        if (shaka.Player.isBrowserSupported()) {
          initPlayer();
        } else {
          console.error('Browser not supported!');
        }
      }

      async function initPlayer() {
        const video = document.getElementById('video');
        const player = new shaka.Player(video);
        window.player = player;
        player.addEventListener('error', onErrorEvent);
        try {
          await player.load(manifestUri);
          console.log('The video has now been loaded!');
        } catch (e) {
          onError(e);
        }
      }

      function onErrorEvent(event) {
        onError(event.detail);
      }

      function onError(error) {
        console.error('Error code', error.code, 'object', error);
      }

      document.addEventListener('DOMContentLoaded', initApp);
    </script>
  </head>
  <body>
    <video id="video"
           width="640"
           src="https://storage.googleapis.com/shaka-demo-assets/bbb-dark-truths-hls/hls.m3u8"
           controls></video>
  </body>
</html>
Hi Jakub,
I could reproduce this error when I opened your example in a local html file. But before the error you mentioned, I got the following message:
Not allowed to load local resource: blob:null/7350b1f9-9c47-4998-b80c-fafeeac8a586
So I tried to run this same html file but hosted from a local server (http-server), and that fixed the error to me.
Hi @jakubvojacek ,
You should either set the video element's src attribute, or call player.load(), but not both. If you're using Shaka Player, player.load() will suffice. If you can set src directly without player.load(), then you have no need for Shaka Player.
Does this help?
Hello @guilleccc
thank you, I totally overlooked that you answered (I subscribed to all Shaka issues and must have missed this one, as I didn't hope for an answer anymore)
Anyway, I tested the solution of putting the file behind a webserver and it worked splendidly.
Thanks 👍
|
gharchive/issue
| 2021-07-28T09:31:15 |
2025-04-01T04:34:24.635374
|
{
"authors": [
"guilleccc",
"jakubvojacek",
"joeyparrish"
],
"repo": "google/shaka-player",
"url": "https://github.com/google/shaka-player/issues/3554",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1917542229
|
fix: human time must not hard set the timezone
Hello,
The utils.get_human_time() function formats a UNIX timestamp as a string to be stored in BigQuery.
But it replaces the timezone without converting the UTC time first.
Using datetime.fromtimestamp() does the job well.
Tests have been updated accordingly.
Regards,
Sébastien
Hello @lvaylet,
I don't understand why some checks failed as it works well and I'm pretty sure the method signature is correct.
Any help will be really appreciated.
Regards,
Sébastien
Hello @sdenef-adeo, I will take a look later today.
@sdenef-adeo The pytype stage of make lint completes successfully on my machine after removing timestamp=:
dt_tz = datetime.fromtimestamp(timestamp, tz=to_zone)
But then the safety stage fails because of CVEs in Flask, GitPython, Requests and a few other packages.
Hello @lvaylet,
👌
AFAIU those CVEs are not caused by the code provided here. Is there any other PR I need to merge into this one to fix those CVEs?
@sdenef-adeo I just created #352 and will start investigating soon.
@sdenef-adeo Done. Can you merge https://github.com/google/slo-generator/commit/fb35d870f80a86d7dcc7aae05725e5cf713b11a8 into your PR?
merged
Then I guess you also have to remove timestamp= from your code, as highlighted above.
Thanks @sdenef-adeo. Now that all checks complete successfully, let me reproduce the (bad) behavior and check this PR fixes it.
I just confirmed that the current code does not return the expected values.
For example:
python
Python 3.9.15 (main, Nov 5 2022, 10:20:53)
[GCC 10.2.1 20210110] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from slo_generator.utils import get_human_time
>>> get_human_time(1702660513.987654, timezone='Europe/Paris')
'2023-12-15T17:15:13.987654+01:00'
>>> get_human_time(1565092435, timezone='Europe/Paris')
'2019-08-06T11:53:55.000000+02:00'
According to Epoch & Unix Timestamp Conversion Tools, the first call should return '2023-12-15T18:15:13.987654+01:00'. The second call should return '2019-08-06T13:53:55.000000+02:00'.
Your version returns the right values.
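The fix boils down to letting fromtimestamp do the UTC-to-local conversion instead of bolting a timezone onto a naive datetime afterwards. A hedged sketch of the corrected behavior (using the stdlib zoneinfo; the project's actual imports and formatting may differ):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

def get_human_time(timestamp, timezone="Etc/UTC"):
    # fromtimestamp with tz= converts the UNIX timestamp into the target
    # zone, applying the correct UTC offset for that date (DST included),
    # instead of merely replacing the tzinfo on an unconverted datetime.
    return datetime.fromtimestamp(timestamp, tz=ZoneInfo(timezone)).isoformat()

get_human_time(1565092435, timezone="Europe/Paris")  # '2019-08-06T13:53:55+02:00'
```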
Can you rebase your branch as it is now out-of-date? Then I will merge this PR.
for sure @lvaylet
It should be OK now.
Not quite. GitHub still reports 'This branch is out-of-date with the base branch'. I will move forward anyway, as these changes do not overlap with anything in recent PRs.
Unfortunately, GitHub is preventing me from moving forward.
Your master branch seems to lag behind by 2 commits:
Can you update your fork, then rebase your adeo:fix/timestamp_human_tz branch too?
I understand. It's probably because the "master" branch of my fork is out-of-date. I will update it.
All good. Thanks for your patience @sdenef-adeo!
|
gharchive/pull-request
| 2023-09-28T13:20:06 |
2025-04-01T04:34:24.685694
|
{
"authors": [
"lvaylet",
"sdenef-adeo"
],
"repo": "google/slo-generator",
"url": "https://github.com/google/slo-generator/pull/350",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
2038391615
|
SoundChecker - Test THD and SNR - App crashed when selecting output device as "USB audio -..."
Connect Pixel 7Pro with Vantec Audio Adapter (connect loop back cables on the device)
Launch SoundChecker
Select "Test THD and SNR"
Select Input device as USB Audio
Select Output device as "USB Audio - ICUSBAUDIO7D USB device card=1;device=0"
EXPECTED
Test can be run
ACTUAL
Sound-Checker app crashed
This is being tracked internally with:
b/316181327 | P2 | SoundChecker - Test THD and SNR - App crashed when selecting output device as "USB-Audio .."
|
gharchive/issue
| 2023-12-12T19:05:51 |
2025-04-01T04:34:24.688630
|
{
"authors": [
"philburk",
"xiaotongtzhao"
],
"repo": "google/sound-checker",
"url": "https://github.com/google/sound-checker/issues/15",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2727568239
|
Clarification needed regarding the use of the Walrus Operator
Hello,
I noticed that the current version of the Python style guide does not mention the walrus operator (:=), which was introduced in Python 3.8. I believe it would be beneficial to include a section discussing this operator, as it can impact code readability and maintainability. Are there any emerging standards or agreed good and bad practices for the walrus operator that could be added to the Python style guide? If you don't have anything in place, maybe you could consider adopting the PEP 572 style guide recommendations, as well as the examples provided by Tim Peters in Appendix A.
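For reference, PEP 572's own guidance is roughly of this shape (the examples below are illustrative, not taken from any style guide):

```python
import re

# Encouraged: name a value you immediately test, avoiding a duplicate call.
text = "order id: 12345"
if (m := re.search(r"\d+", text)) is not None:
    order_id = int(m.group())

# Encouraged: loop over a stream without a priming read before the loop.
reader = iter([b"ab", b"cd", b""])
chunks = []
while chunk := next(reader):
    chunks.append(chunk)

# Discouraged (per PEP 572): using := purely for compactness, e.g.
# y = (x := 3) + 1, is harder to read than two plain statements.
```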
Thank you for considering this suggestion!
Best regards,
Diogo Rosário
@jeremyhylton @nathanielmanistaatgoogle @shanel-at-google
|
gharchive/issue
| 2024-12-09T16:07:57 |
2025-04-01T04:34:24.691435
|
{
"authors": [
"diogofrosario",
"vapier"
],
"repo": "google/styleguide",
"url": "https://github.com/google/styleguide/issues/868",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
333943398
|
syz-executor:using a lot of memory of device, OOM(out of memory) occurs when device is running
When I use syzkaller to test, I found that OOM (out of memory) occurs frequently while the device is running. This leads to processes being killed, and the device panics later. I found that the syz-executor process is using too much memory at this time - is this abnormal? How should I deal with this problem?
Thanks
Hi @q00355042,
What is frequently? What's the amount of memory on device? What's the value of "procs" in your config?
Resolving #589 should help. There is no other way to restrict memory consumption on linux.
dup #589
|
gharchive/issue
| 2018-06-20T06:43:02 |
2025-04-01T04:34:24.693649
|
{
"authors": [
"dvyukov",
"q00355042"
],
"repo": "google/syzkaller",
"url": "https://github.com/google/syzkaller/issues/638",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
657493516
|
all: initialize vhci, provide syz_emit_vhci and vhci flags
Before sending a pull request, please review Contribution Guidelines:
https://github.com/google/syzkaller/blob/master/docs/contributing.md
Apologies for the noise, I did not expect my author info rebase to request review from all of you.
|
gharchive/pull-request
| 2020-07-15T16:38:37 |
2025-04-01T04:34:24.695233
|
{
"authors": [
"TheOfficialFloW"
],
"repo": "google/syzkaller",
"url": "https://github.com/google/syzkaller/pull/1945",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
2110654280
|
Issue #356: Show first and last events instead of only first ones when displaying an EventSet
Adjusted displays_util.py to display up to 7 events; once more than 7 timestamps exist, an ellipsis row is inserted after the first three rows, followed by the last three rows.
Formatting check is failing - ensure black is correctly configured (it should run on save by default if no user VSCode setting is overriding the workspace's ones)
Please merge the main branch into your PR's branch so that tests are run on it (fixed in https://github.com/google/temporian/pull/363)
Hi @jtaylor205 thanks for the contribution!
Testing this locally I found that setting tp.config.display_max_events = 1 shows all events
Hi @javiber , I have fixed that problem. Below is what it now shows. I will upload my changes now
Congrats on your first open-source contribution @jtaylor205! Good job 💪🏼
|
gharchive/pull-request
| 2024-01-31T17:51:50 |
2025-04-01T04:34:24.698958
|
{
"authors": [
"ianspektor",
"javiber",
"jtaylor205"
],
"repo": "google/temporian",
"url": "https://github.com/google/temporian/pull/362",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
901711783
|
Why normalize_scale is set to 4.0?
I found in https://github.com/google/tirg/blob/c58daa066d8af1a5b3de1a0ef6d112519ddda611/img_text_composition_models.py#L42 that normalize_scale is set to 4 at the beginning. It is interesting because performance drops a lot if I set the scale to another number like 1.0.
I wonder why it is important and why 4 is an ideal number.
Thanks!
Great question - this number affects the scale of the logits (the input to the logistic function https://en.wikipedia.org/wiki/Logistic_function). If you look at the logistic function, the logits must be big enough to cover the whole 0-1 output range (but if they're too big, they can land in the flat-gradient zone).
I just found experimentally that 4.0 works. You can make it a learnable weight, though optimizing that could be more difficult. Finally, I've seen similar implementations where people did make it work with 1.0; I'm not sure how.
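The effect of the scale on the logistic output range is easy to check numerically (a quick sketch, not TIRG code): similarities of L2-normalized embeddings lie in [-1, 1], so the scale decides how much of the 0-1 range the sigmoid can actually reach.

```python
import math

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

# Dot products of unit-norm vectors are bounded by [-1, 1], so the
# reachable sigmoid outputs span [sigmoid(-scale), sigmoid(scale)].
for scale in (1.0, 4.0):
    lo, hi = sigmoid(-scale), sigmoid(scale)
    print(f"scale={scale}: outputs span [{lo:.3f}, {hi:.3f}]")
# scale=1.0 covers only ~[0.269, 0.731]; scale=4.0 covers ~[0.018, 0.982].
```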
Thank you for your reply. I wanted to replace it with torch.nn.functional.normalize at first, but the result shows it is not as simple as I thought.
|
gharchive/issue
| 2021-05-26T03:50:06 |
2025-04-01T04:34:24.701941
|
{
"authors": [
"invisprints",
"lugiavn"
],
"repo": "google/tirg",
"url": "https://github.com/google/tirg/issues/13",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
748399746
|
Prerelease fixes [WIP]
Several fixes - WIP
Pins datastore and google-cloud-logging
Only start Prometheus if configured to
Update how we can libcloudforensics to match current release
Add authentication message if datastore client initialization fails for auth reasons
Only load google_cloud if configured
Add block device checks to preprocessor
Fix up some local_path in evidence to always point to the last processed path.
Add required states for binary extractor Job
Move dependency check output to debug logs
Adds some more debug log lines
Fix some linting
Dropping this in favor of https://github.com/google/turbinia/pull/650
|
gharchive/pull-request
| 2020-11-23T01:09:22 |
2025-04-01T04:34:24.708852
|
{
"authors": [
"aarontp"
],
"repo": "google/turbinia",
"url": "https://github.com/google/turbinia/pull/641",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
78386474
|
"Native thread-sleep not available." when gulp images
Terminal warning:
"Native thread-sleep not available.
This will result in much slower performance.
You should re-install spawn-sync if possible."
And the images won't be copied to the app directory.
Tried to re-install spawn-sync, but didn't fix the problem.
I ran into the same problem and found how to remove the warning.
npm list
shows you need "try-thread-sleep" to use spawn-sync.
So, add the line below to your package.json and run "npm install" again.
"try-thread-sleep": "^1.0.0",
Solved after installing try-thread-sleep
@TatsuyaHishima and @rizqinizamil thanks for your inputs. I ran npm list and it clearly indicated the error
npm ERR! missing: try-thread-sleep@^1.0.0, required by spawn-sync@1.0.13 npm ERR! missing: try-thread-sleep@^1.0.0, required by spawn-sync@1.0.11 npm ERR! not ok code 0
I installed try-thread-sleep and the error went away.
For anyone that runs into this in future, the resolution is:
Upgrade to node@0.12 or node 4
Install try-thread-sleep as a dependency of your app directly.
It should no longer be an issue in master.
Worked for me too. tnx
|
gharchive/issue
| 2015-05-20T05:55:07 |
2025-04-01T04:34:24.712965
|
{
"authors": [
"TatsuyaHishima",
"addyosmani",
"angusjune",
"doomsbuster",
"rizqinizamil",
"zhilevan"
],
"repo": "google/web-starter-kit",
"url": "https://github.com/google/web-starter-kit/issues/691",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
981635607
|
Dashboard: Template Details - Related Templates
We have enough templates with metadata to make this grid show templates that are related to the one being viewed in some capacity. Currently they're just chosen at random.
I think we should match template metadata and show up to 6 templates that match.
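One way the matching could work (a sketch with hypothetical field names, not the actual dashboard code): score every candidate by how many metadata tags it shares with the viewed template, and keep the top 6 that share anything at all.

```python
def related_templates(current, candidates, limit=6):
    """Rank candidate templates by shared metadata tags (hypothetical schema)."""
    current_tags = set(current["metadata"])

    def overlap(template):
        return len(current_tags & set(template["metadata"]))

    # Keep only candidates that share at least one tag, best matches first.
    related = [t for t in candidates if t is not current and overlap(t) > 0]
    related.sort(key=overlap, reverse=True)
    return related[:limit]
```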
Acceptance Criteria
Display templates that are actually related (via metadata) in the template details view
Tests, please!
Verified in QA
|
gharchive/issue
| 2021-08-27T22:24:01 |
2025-04-01T04:34:24.714628
|
{
"authors": [
"BrittanyIRL",
"csossi"
],
"repo": "google/web-stories-wp",
"url": "https://github.com/google/web-stories-wp/issues/8827",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
953649666
|
CDN: Optimize templates images
Context
I was optimizing the images for the text sets and checked other images too; it turns out we can save a bit on the templates.
Summary
Relevant Technical Choices
To-do
User-facing changes
Testing Instructions
QA
[ ] This is a non-user-facing change and requires no QA
This PR can be tested by following these steps:
UAT
[ ] UAT should use the same steps as above.
This PR can be tested by following these steps:
Reviews
Does this PR have a security-related impact?
Does this PR change what data or activity we track or use?
Does this PR have a legal-related impact?
Checklist
[ ] This PR addresses an existing issue and I have linked this PR to it in ZenHub
[ ] I have tested this code to the best of my abilities
[ ] I have verified accessibility to the best of my abilities (docs)
[ ] I have verified i18n and l10n (translation, right-to-left layout) to the best of my abilities
[ ] This PR contains automated tests (unit, integration, and/or e2e) to verify the code works as intended (docs)
[ ] I have added documentation where necessary
[ ] I have added a matching Type: XYZ label to the PR
Fixes #
@merapi FYI, images are compressed automatically during deployment:
https://github.com/google/web-stories-wp/blob/20aaa7331d7260d58d19e29a7aeb746b3cca62ee/.github/workflows/continuous-integration-firebase.yml#L54-L59
But of course can't hurt to have smaller files on your computer :-)
Weird that this GHA (with pngQuality: Number, integer 1-100, default 80, stored in a string) is worse than lossless ImageOptim by 11-72%.
|
gharchive/pull-request
| 2021-07-27T08:45:24 |
2025-04-01T04:34:24.723202
|
{
"authors": [
"merapi",
"swissspidy"
],
"repo": "google/web-stories-wp",
"url": "https://github.com/google/web-stories-wp/pull/8480",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1895377569
|
Use formal modeling to test or prove the correctness of core algorithms
While zerocopy is conceptually complex, the amount of code that executes in order to perform a particular operation is often very small. This makes it a perfect target for formal modeling techniques. We should use tools like Kani on our most important core algorithms in order to prove their correctness.
A list of tests which are good candidates:
test_round_down_to_next_multiple_of_alignment from #393
test_validate_cast_and_convert_metadata
Various tests in byteorder
I left a comment about our need for alternative targets here: https://github.com/model-checking/kani/issues/2402#issuecomment-1726613332
|
gharchive/issue
| 2023-09-13T23:06:31 |
2025-04-01T04:34:24.725936
|
{
"authors": [
"joshlf",
"jswrenn"
],
"repo": "google/zerocopy",
"url": "https://github.com/google/zerocopy/issues/378",
"license": "bsd-2-clause",
"license_type": "permissive",
"license_source": "bigquery"
}
|
341589589
|
Update protobuf to 3.6.0.
Probably fixes #469.
Should we try auto-release on this one as well?
|
gharchive/pull-request
| 2018-07-16T16:29:39 |
2025-04-01T04:34:24.752788
|
{
"authors": [
"alexander-fenster",
"lukesneeringer"
],
"repo": "googleapis/artman",
"url": "https://github.com/googleapis/artman/pull/470",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
679574060
|
Synthesis failed for RealTimeBidding
Hello! Autosynth couldn't regenerate RealTimeBidding. :broken_heart:
Here's the output from running synth.py:
2020-08-15 06:18:49,622 autosynth [INFO] > logs will be written to: /tmpfs/src/logs/elixir-google-api
2020-08-15 06:18:50,100 autosynth [DEBUG] > Running: git config --global core.excludesfile /home/kbuilder/.autosynth-gitignore
2020-08-15 06:18:50,103 autosynth [DEBUG] > Running: git config user.name yoshi-automation
2020-08-15 06:18:50,106 autosynth [DEBUG] > Running: git config user.email yoshi-automation@google.com
2020-08-15 06:18:50,108 autosynth [DEBUG] > Running: git config push.default simple
2020-08-15 06:18:50,111 autosynth [DEBUG] > Running: git branch -f autosynth-realtimebidding
2020-08-15 06:18:50,114 autosynth [DEBUG] > Running: git checkout autosynth-realtimebidding
Switched to branch 'autosynth-realtimebidding'
2020-08-15 06:18:50,337 autosynth [INFO] > Running synthtool
2020-08-15 06:18:50,338 autosynth [INFO] > ['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'clients/real_time_bidding/synth.metadata', 'synth.py', '--']
2020-08-15 06:18:50,338 autosynth [DEBUG] > log_file_path: /tmpfs/src/logs/elixir-google-api/RealTimeBidding/sponge_log.log
2020-08-15 06:18:50,340 autosynth [DEBUG] > Running: /tmpfs/src/github/synthtool/env/bin/python3 -m synthtool --metadata clients/real_time_bidding/synth.metadata synth.py -- RealTimeBidding
2020-08-15 06:18:50,557 synthtool [DEBUG] > Executing /home/kbuilder/.cache/synthtool/elixir-google-api/synth.py.
On branch autosynth-realtimebidding
nothing to commit, working tree clean
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 102, in <module>
main()
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 829, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 782, in main
rv = self.invoke(ctx)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 610, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 93, in main
with synthtool.metadata.MetadataTrackerAndWriter(metadata):
File "/tmpfs/src/github/synthtool/synthtool/metadata.py", line 237, in __enter__
self.observer.start()
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/watchdog/observers/api.py", line 260, in start
emitter.start()
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/watchdog/utils/__init__.py", line 110, in start
self.on_thread_start()
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/watchdog/observers/inotify.py", line 121, in on_thread_start
self._inotify = InotifyBuffer(path, self.watch.is_recursive)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/watchdog/observers/inotify_buffer.py", line 35, in __init__
self._inotify = Inotify(path, recursive)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/watchdog/observers/inotify_c.py", line 203, in __init__
self._add_watch(path, event_mask)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/watchdog/observers/inotify_c.py", line 412, in _add_watch
Inotify._raise_error()
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/watchdog/observers/inotify_c.py", line 432, in _raise_error
raise OSError(err, os.strerror(err))
FileNotFoundError: [Errno 2] No such file or directory
2020-08-15 06:18:50,648 autosynth [ERROR] > Synthesis failed
2020-08-15 06:18:50,648 autosynth [DEBUG] > Running: git clean -fdx
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 690, in <module>
main()
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 539, in main
return _inner_main(temp_dir)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 630, in _inner_main
).synthesize(synth_log_path / "sponge_log.log")
File "/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 120, in synthesize
synth_proc.check_returncode() # Raise an exception.
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 389, in check_returncode
self.stderr)
subprocess.CalledProcessError: Command '['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'clients/real_time_bidding/synth.metadata', 'synth.py', '--', 'RealTimeBidding']' returned non-zero exit status 1.
Google internal developers can see the full log here.
Autosynth is still having trouble generating RealTimeBidding. :sob:
Google internal developers can see the full log here.
Autosynth is still having trouble generating RealTimeBidding. :sob:
Google internal developers can see the full log here.
Autosynth passed, closing! :green_heart:
|
gharchive/issue
| 2020-08-15T13:18:52 |
2025-04-01T04:34:24.761646
|
{
"authors": [
"yoshi-automation"
],
"repo": "googleapis/elixir-google-api",
"url": "https://github.com/googleapis/elixir-google-api/issues/6195",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
704612911
|
[javanaut][operationExpr][1/2]feat: Add Assignment operation expr with multiply and assignment operator
enables *= operator.
^= will come with next PR [2/2]
type-checking tables in go/javanaut-assignment-operation-expr
Addressed the comments in the design doc with changes:
Update the convenience function name using your suggestion.
Rename lhsExpr to variableExpr and rhsExpr to valueExpr. Add check to make sure variable expr is not decl with unit tests.
Add comments.
|
gharchive/pull-request
| 2020-09-18T19:43:07 |
2025-04-01T04:34:24.765302
|
{
"authors": [
"summer-ji-eng"
],
"repo": "googleapis/gapic-generator-java",
"url": "https://github.com/googleapis/gapic-generator-java/pull/320",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
345908701
|
[Ruby] AVAILABLE_VERSIONS is not finding any results
Apparently as a result of https://github.com/googleapis/gapic-generator/pull/2163 now the AVAILABLE_VERSIONS determination for a single-service library is returning no results.
Specifically, it's failing in the case where postVersionDirPath is empty. The original intent for that case appears to be that the directory name itself gets .rb appended to it (e.g. /path/to/v1 -> /path/to/v1.rb). But the change yields /path/to/v1/.rb which is incorrect.
The resulting change (e.g. https://github.com/GoogleCloudPlatform/google-cloud-ruby/pull/2209/files) then fails tests.
I think we need to have different logic for empty vs non-empty postVersionDirPath.
@jbolinger @landrito ping
Ah woops. I'll have a fix in by early next week.
|
gharchive/issue
| 2018-07-30T19:48:44 |
2025-04-01T04:34:24.773413
|
{
"authors": [
"dazuma",
"landrito"
],
"repo": "googleapis/gapic-generator",
"url": "https://github.com/googleapis/gapic-generator/issues/2174",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
627531299
|
Is RefreshingManagedChannel being used, or just an experiment?
I happened on RefreshingManagedChannel added in #805, and it concerned me because re-creating the channel is heavy-weight and this may prevent future work. I'm most concerned that to my knowledge grpc team wasn't notified, since this can bite us.
Is this being used by someone, or just an experiment?
Looks like it is, see #804. Please reach out to the original reporters / authors if further discussion is needed.
@miraleung, did you paste the wrong issue/PR? I saw the issue posted and saw the PR #805. But that just exposes an API. It is enabled if you provide a ChannelPrimer, which is @InternalApi("For internal use by google-cloud-java clients only"). The question is if there is a user of it.
Based on a search of GitHub, it appears it is only used by CBT, but since the implementation is using the grpc-experimental API it requires explicit opt-in via a BetaApi.
|
gharchive/issue
| 2020-05-29T20:51:20 |
2025-04-01T04:34:24.776396
|
{
"authors": [
"ejona86",
"miraleung"
],
"repo": "googleapis/gax-java",
"url": "https://github.com/googleapis/gax-java/issues/1084",
"license": "bsd-3-clause",
"license_type": "permissive",
"license_source": "bigquery"
}
|
1257244865
|
refactor(storage): Dual-Region Sample for Updated API
[Internal]
For details and coordination: go/gcs-dpe-dual-region-milestones-api-refactor
Implementation Example: https://github.com/googleapis/nodejs-storage/pull/1977
This is a dup of #9135, or at best it is asking for the examples for the same feature. Closing.
|
gharchive/issue
| 2022-06-01T23:06:28 |
2025-04-01T04:34:24.786604
|
{
"authors": [
"coryan",
"danielbankhead"
],
"repo": "googleapis/google-cloud-cpp",
"url": "https://github.com/googleapis/google-cloud-cpp/issues/9136",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
738840727
|
Release Google.Cloud.RecommendationEngine.V1Beta1 version 1.0.0-beta01
Changes in this release:
First beta release.
Created release for Google.Cloud.RecommendationEngine.V1Beta1-1.0.0-beta01
|
gharchive/pull-request
| 2020-11-09T09:09:55 |
2025-04-01T04:34:24.788219
|
{
"authors": [
"jskeet",
"yoshi-automation"
],
"repo": "googleapis/google-cloud-dotnet",
"url": "https://github.com/googleapis/google-cloud-dotnet/pull/5534",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1192721195
|
Release Google.Cloud.Iam.V1 version 2.4.0
Changes in this release:
New features
AuditConfig for IAM v1 (commit bc4cfd8)
Created release for Google.Cloud.Iam.V1-2.4.0
|
gharchive/pull-request
| 2022-04-05T07:10:09 |
2025-04-01T04:34:24.790101
|
{
"authors": [
"jskeet",
"yoshi-automation"
],
"repo": "googleapis/google-cloud-dotnet",
"url": "https://github.com/googleapis/google-cloud-dotnet/pull/8313",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1264897454
|
Release Google.Cloud.LifeSciences.V2Beta version 2.0.0-beta01
Changes in this release:
This is the first version of this package to depend on GAX v4.
There are some breaking changes, both in GAX v4 and in the generated code. The changes that aren't specific to any given API are described in the Google Cloud documentation. We don't anticipate any changes to most customer code, but please file a GitHub issue if you run into problems.
The most important change in this release is the use of the Grpc.Net.Client package for gRPC communication, instead of Grpc.Core. When using .NET Core 3.1 or .NET 5.0+ this should lead to a smaller installation footprint and greater compatibility (e.g. with Apple M1 chips). Any significant change in a core component comes with the risk of incompatibility, however - so again, please let us know if you encounter any issues.
Created release for Google.Cloud.LifeSciences.V2Beta-2.0.0-beta01
|
gharchive/pull-request
| 2022-06-08T15:02:29 |
2025-04-01T04:34:24.793152
|
{
"authors": [
"jskeet",
"yoshi-automation"
],
"repo": "googleapis/google-cloud-dotnet",
"url": "https://github.com/googleapis/google-cloud-dotnet/pull/8674",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
323261844
|
Import BOM
@garrettjonesgoogle
👋 @elharo just checking in on the status of some older PRs :) Is this current?
|
gharchive/pull-request
| 2018-05-15T15:10:37 |
2025-04-01T04:34:24.794039
|
{
"authors": [
"JustinBeckwith",
"elharo"
],
"repo": "googleapis/google-cloud-java",
"url": "https://github.com/googleapis/google-cloud-java/pull/3269",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
625165970
|
chore: release 0.133.0
:robot: I have created a release *beep* *boop*
0.133.0
google/cloud-asset 1.1.0
Features
introduce support for org/access context policies (#2849) (5714dd7)
google/cloud-automl 1.1.1
Documentation
update documentation formatting (#2931) (5dca3ed)
Miscellaneous Chores
sample context aware commits for a broader base of services (#2932) (35c6057)
google/cloud-bigquerydatatransfer 1.0.2
Bug Fixes
update retry config, reorder methods (#3039) (fef66e8)
google/cloud-bigquery-storage 0.2.0
⚠ BREAKING CHANGES
update createReadSession, readRows, splitReadStream (#3056)
Features
update createReadSession, readRows, splitReadStream (#3056) (8b1244c)
Miscellaneous Chores
sample context aware commits for a broader base of services (#2932) (35c6057)
google/cloud-bigtable 1.4.0
Features
update timeouts (#3052) (503b9c8)
Miscellaneous Chores
update metadata (#2968) (6359010)
google/cloud-container 1.0.2
Bug Fixes
update retry configuration, reorder methods (#3040) (d91787b)
google/cloud-core 1.37.0
Features
async upload for storage buckets (#2878) (61ec33e)
google/cloud-data-catalog 0.1.1
Miscellaneous Chores
sample context aware commits for a broader base of services (#2932) (35c6057)
Documentation
update documentation formatting (#3000) (69e8e8c)
google/cloud-datastore 1.11.3
Documentation
update value documentation (#2977) (5760f40)
google/cloud-debugger 1.1.0
Features
update timeout config (#3041) (1f04f2e)
Bug Fixes
correctly serialize evaluated expressions (#2974) (601ee77)
google/cloud-dialogflow 0.15.0
⚠ BREAKING CHANGES
add agent environment, change AgentsClient::exportAgent() (#3057)
Features
add agent environment, change AgentsClient::exportAgent() (#3057) (e568324)
Miscellaneous Chores
sample context aware commits for a broader base of services (#2932) (35c6057)
google/cloud-error-reporting 0.16.3
Miscellaneous Chores
reorder methods (#3042) (d97d927)
google/cloud-firestore 1.13.0
Features
allow array-of-arrays value for "in" query filter (#2927) (e35c2fa)
google/cloud-iot 1.1.0
Features
update retry and timeout config (#3043) (029556b)
google/cloud-memcache 0.1.1
Miscellaneous Chores
sample context aware commits for a broader base of services (#2932) (35c6057)
google/cloud-oslogin 1.0.1
Miscellaneous Chores
reorder methods (#3044) (fce91ee)
google/cloud-pubsub 1.24.1
Documentation
update received message documentation (#3028) (df19d02)
google/cloud-recaptcha-enterprise 0.1.1
Miscellaneous Chores
add synth metadata file (#2924) (f55cf03)
google/cloud-recommendations-ai 0.1.1
Miscellaneous Chores
reorder methods and update samples (#2976) (bd57c5b)
google/cloud-recommender 1.0.1
Miscellaneous Chores
sample context aware commits for a broader base of services (#2932) (35c6057)
google/cloud-scheduler 1.3.1
Miscellaneous Chores
reorder methods (#3045) (8f1082d)
google/cloud-service-directory 0.1.2
Miscellaneous Chores
sample context aware commits for a broader base of services (#2932) (35c6057)
google/cloud-spanner 1.29.0
Features
update timeout configuration #3029 (b7b24d9)
spanner: Option to use GAPIC backoff strategy (#2062) (#2701) (2067078)
Miscellaneous Chores
update timeouts (#3053) (fc7c09b)
google/cloud-speech 1.1.1
Bug Fixes
update retry code for recognize (#3046) (53ef1ae)
google/cloud-storage 1.21.0
Features
async upload for storage buckets (#2878) (61ec33e)
Bug Fixes
add bucket to v4 post policy conditions (#2929) (4ac9846)
fix check for user provided MD5 hash (#2970) (60db51f)
google/cloud-talent 0.12.1
Miscellaneous Chores
sample context aware commits for a broader base of services (#2932) (35c6057)
google/cloud-tasks 1.6.2
Bug Fixes
update retry codes (#3047) (4293079)
google/cloud-text-to-speech 1.0.2
Miscellaneous Chores
reorder methods (#3048) (2a7c230)
sample context aware commits for a broader base of services (#2932) (35c6057)
google/cloud-trace 1.1.0
Features
update retry and timeout config (#3049) (6509ccc)
google/cloud-translate 1.7.3
Miscellaneous Chores
reorder methods (#3050) (393c0f6)
google/cloud-vision 1.0.1
Miscellaneous Chores
sample context aware commits for a broader base of services (#2932) (35c6057)
google/cloud-web-security-scanner 0.4.3
Miscellaneous Chores
reorder methods (#3051) (175f98d)
This PR was generated with Release Please.
Release is at https://github.com/googleapis/google-cloud-php/releases/tag/v0.133.0
|
gharchive/pull-request
| 2020-05-26T20:09:01 |
2025-04-01T04:34:24.835293
|
{
"authors": [
"dwsupplee",
"yoshi-automation"
],
"repo": "googleapis/google-cloud-php",
"url": "https://github.com/googleapis/google-cloud-php/pull/3064",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
651950351
|
test: clean up system-test setup against emulator
Use Docker image instead of manually downloading the emulator binary and running it.
The timeout in the integration tests against the emulator is caused by an eternal aborted-retry loop in ITReadWriteAutocommitSpannerTest. What happens is the following:
Transaction 1 executes UPDATE TEST SET NAME='test18' WHERE ID=1000 and then calls Rollback.
Transaction 2 executes DELETE FROM TEST WHERE ID=1000. This statement returns an Aborted error. No Rollback or Commit is called, as the transaction has already been aborted.
Transaction 2 is retried and executes DELETE FROM TEST WHERE ID=1000 in a new transaction. This statement returns an Aborted error. No Rollback or Commit is called, as the transaction has already been aborted.
... this repeats forever ...
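The loop above can be sketched generically: a retry policy that re-runs a transaction on every Aborted error, with no cap on attempts, can never terminate when the emulator aborts every attempt. Below is a minimal Python illustration — not the java-spanner client code; the `Aborted` class and `run_with_retries` helper are invented for this sketch:

```python
class Aborted(Exception):
    """Stand-in for Spanner's ABORTED error."""


def run_with_retries(work, max_attempts=None):
    """Re-run `work` each time it raises Aborted.

    With max_attempts=None this mirrors the eternal loop described above:
    if every attempt aborts, the loop never exits. A finite cap surfaces
    the problem as an exception instead of a hang.
    """
    attempt = 0
    while True:
        attempt += 1
        try:
            return work()
        except Aborted:
            if max_attempts is not None and attempt >= max_attempts:
                raise
            # normally: back off, then begin a fresh transaction


def always_aborts():
    # Emulates "DELETE FROM TEST WHERE ID=1000" returning Aborted every time.
    raise Aborted()


try:
    run_with_retries(always_aborts, max_attempts=3)
except Aborted:
    print("gave up after 3 attempts")
```

With `max_attempts=None` (the behavior described in the report), the same call would spin indefinitely.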
|
gharchive/pull-request
| 2020-07-07T02:49:11 |
2025-04-01T04:34:24.863249
|
{
"authors": [
"olavloite",
"skuruppu"
],
"repo": "googleapis/java-spanner",
"url": "https://github.com/googleapis/java-spanner/pull/319",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
451601502
|
Adding "Command"
Is there a plan/timeline to add Command to the library? We use the compute library to create a new VM, then we'd like to run a command on it from our application without having to call gcloud from an exec.
@sofisl Could you please provide a link to the code that allows you to run a command on a Windows Instance via nodejs-compute? We're using 3.3.0 and I don't see the method in the API that allows us to do this.
|
gharchive/issue
| 2019-06-03T17:28:36 |
2025-04-01T04:34:24.864649
|
{
"authors": [
"stoked10"
],
"repo": "googleapis/nodejs-compute",
"url": "https://github.com/googleapis/nodejs-compute/issues/321",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
543016025
|
fix: overhauled quickstart
Redid dataproc quickstart to match quickstarts in other languages
@bradmiro this is looking good to me :+1: but I think you might need to run npm run fix.
Interesting that these weren't picked up locally. Should be fixed, thanks!
|
gharchive/pull-request
| 2019-12-27T23:10:15 |
2025-04-01T04:34:24.865838
|
{
"authors": [
"bcoe",
"bradmiro"
],
"repo": "googleapis/nodejs-dataproc",
"url": "https://github.com/googleapis/nodejs-dataproc/pull/280",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
460697547
|
chore: release 1.0.2
:robot: I have created a release *beep* *boop*
1.0.2 (2019-06-26)
Bug Fixes
docs: link to reference docs section on googleapis.dev (#132) (be231be)
This PR was generated with Release Please.
Release is at https://github.com/googleapis/nodejs-paginator/releases/tag/v1.0.2
The release build has started, the log can be viewed here. :sunflower:
:egg: You hatched a release! The release build finished successfully! :purple_heart:
|
gharchive/pull-request
| 2019-06-25T23:58:24 |
2025-04-01T04:34:24.869464
|
{
"authors": [
"yoshi-automation"
],
"repo": "googleapis/nodejs-paginator",
"url": "https://github.com/googleapis/nodejs-paginator/pull/133",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
1106880422
|
chore(samples): Snippet Generating System
This is a rather big PR, but please don't be intimidated by its size. Most of it comes from just rearranging the code samples.
SGS - Snippet Generating System
Why did it happen?
After working on code samples for the Python GAPIC library for couple months now, I have learned two things:
Most of the snippets will be really similar to each other. After all, most of them are about creating a Virtual Machine one way or another with just some minor adjustments.
We should try to keep the 1 file - 1 sample ratio for ease of including those snippets in various places.
Turns out, achieving the 1:1 ratio would require a lot of code being replicated in multiple places with only small adjustments. As a result, keeping it all up to date would soon become a big chore. Changing one thing may require changing many different files.
So how does this PR help?
Since I noticed that most of the samples share a lot of common code, I decided to write a script that will assemble the code samples from pieces - this way, the code duplication in all the samples will be under control and updating things should be much simpler. Now, with the SGS we have a clear set of ingredients and recipes that can be used to create new code snippets with minimal need for new code.
How to review?
Since this PR contains both the SGS script and the restructured snippets, here's what you should pay attention to:
The SGS README in samples/README.md - This file should explain the main idea of how SGS works.
The sgs.py script itself - This is the script that handles the assembly of the final snippets.
You can have a look at some ingredients, recipes and snippets files to see how it all comes together. A good example, showing how the pieces are assembled can be samples/snippets/instances/create_start_instance/create_from_public_image.py which is assembled from recipe: samples/recipes/instances/create_start_instance/create_from_public_image.py using ingredients found in:
samples/ingredients/images/get_image_from_family.py
samples/ingredients/disks/from_image.py
samples/ingredients/instances/create_instance.py
samples/ingredients/instances/create_start_instance/create_from_public_image.py
As requested by @parthea I'm breaking this PR into two separate PRs that will be easier to review. First one, with just the SGS script can be found here: https://github.com/googleapis/python-compute/pull/199
|
gharchive/pull-request
| 2022-01-18T12:59:06 |
2025-04-01T04:34:24.882648
|
{
"authors": [
"m-strzelczyk"
],
"repo": "googleapis/python-compute",
"url": "https://github.com/googleapis/python-compute/pull/196",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
246858529
|
Incorrect googleapis-common-protos dependency in Ruby gemspec
What
The Ruby gemspec currently specifies a dependency on google-common-protos (which does not exist). In fact, the dependency should be on googleapis-common-protos.
cc: @landrito
This is resolved in googleapis here https://github.com/googleapis/googleapis/blob/master/gapic/packaging/dependencies.yaml#L84
#1442 should be able to resolve this for the test cases in toolkit.
Thanks, Ernest. This must be a problem with my local environment.
|
gharchive/issue
| 2017-07-31T19:27:53 |
2025-04-01T04:34:24.983866
|
{
"authors": [
"geigerj",
"landrito"
],
"repo": "googleapis/toolkit",
"url": "https://github.com/googleapis/toolkit/issues/1440",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
263467961
|
Fix crashing bug in localization.
Now that we deserialize trajectory nodes, we need to make sure
that the global matcher sampler is added for the map trajectory.
Oops. I actually handled this in #552. That's the curse of splitting everything in many PRs.
|
gharchive/pull-request
| 2017-10-06T14:36:14 |
2025-04-01T04:34:24.984912
|
{
"authors": [
"ojura",
"wohe"
],
"repo": "googlecartographer/cartographer",
"url": "https://github.com/googlecartographer/cartographer/pull/575",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
256561696
|
how to tune the 3d cartographer for my vel-16 data
hello, I am tuning cartographer 3d for my vel-16 data, but the result is very confusing: the /scan_matched_points2 laser frame jumps around a lot. Can you help me?
The configuration file, launch file, urdf file and png file are here
the bag file is here
I assume this a duplicate of #509. Closing this issue as duplicate.
|
gharchive/issue
| 2017-09-11T02:00:49 |
2025-04-01T04:34:24.986571
|
{
"authors": [
"SirVer",
"sunmk2006"
],
"repo": "googlecartographer/cartographer_ros",
"url": "https://github.com/googlecartographer/cartographer_ros/issues/501",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
659513122
|
Log.java:12: error: Cannot find getter for field.
\hiltfirst\app\build\tmp\kapt3\stubs\debug\com\example\android\hilt\data\Log.java:12: error: Cannot find getter for field.
private long id = 0L;
^
warning: The following options were not recognized by any processor: '[dagger.fastInit, dagger.hilt.android.internal.disableAndroidSuperclassValidation, kapt.kotlin.generated]'
[WARN] Incremental annotation processing requested, but support is disabled because the following processors are not incremental: androidx.room.RoomProcessor (DYNAMIC).
Task :app:kaptDebugKotlin FAILED
In which step of the codelab does this happen?
|
gharchive/issue
| 2020-07-17T18:34:00 |
2025-04-01T04:34:24.988583
|
{
"authors": [
"hasan5151",
"manuelvicnt"
],
"repo": "googlecodelabs/android-hilt",
"url": "https://github.com/googlecodelabs/android-hilt/issues/3",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
327784368
|
std::_bind is not compatible with std::function
if (error == AAUDIO_ERROR_DISCONNECTED) {
    std::function<void(void)> restartFunction =
        std::bind(&AudioEngine::restart,
                  static_cast<AudioEngine *>(userData));
    new std::thread(restartFunction);
}
Went over the whole file multiple times, and I can't find any difference from what is in your tutorial, but the block of text in the middle always shows an error
Thanks very much for filing this issue - and sorry that you ran into it. I believe something changed in the NDK which means that you now need to explicitly link against libandroid which can be done within CMakeLists.txt by adding android to the list of target_link_libraries
I thought that was the problem but having now removed libandroid from the list of libraries the app still compiles. Please could you post the compilation error you're receiving. I'll try cloning from scratch to see if I can reproduce.
Thanks for getting back to me!
It’s “Class ‘std::_bind<void(AudioEngine::*)(), AudioEngine*>’ is not compatible with class ‘std::function<void()>’”
|
gharchive/issue
| 2018-05-30T15:30:37 |
2025-04-01T04:34:24.991945
|
{
"authors": [
"SmartyNance",
"dturner"
],
"repo": "googlecodelabs/android-wavemaker",
"url": "https://github.com/googlecodelabs/android-wavemaker/issues/7",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
823601357
|
Codelabs:Google Maps Platform—JavaScript
Hi,
There should be a minimum average time to complete each step, which would be useful for tracking whether a developer is actually reading and understanding the content. If that minimum time hasn't elapsed, the step's color should not change to blue; this would help prevent badges being earned without real engagement and losing their value.
I think the same logic could apply to other Codelabs tutorials.
Thanks for flagging - fixed!
|
gharchive/issue
| 2021-03-06T09:03:05 |
2025-04-01T04:34:24.993489
|
{
"authors": [
"amuramoto",
"mspace-git"
],
"repo": "googlecodelabs/maps-platform-101-js",
"url": "https://github.com/googlecodelabs/maps-platform-101-js/issues/6",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
312219019
|
Family Check: GlyphOrder is the same across the family
Per https://github.com/TypeNetwork/Amstelvar/issues/59 we should check and enforce this
How should this check be run? Compare the glyph order of all fonts? (caveat: italics often have to be checked separately). Encoding file?
Hm. For the glyph order, I suppose the test can differentiate between styles by partitioning the fonts into upright and italics depending on the filenames. Would be nice to have some kind of automatism that runs tests on each style subset so you don't have to break up the input fonts yourself.
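The partitioning idea above can be sketched as follows. This is a hypothetical illustration, not fontbakery's actual check; in a real check the glyph-order lists would come from the fonts themselves (e.g. fontTools' `TTFont.getGlyphOrder()`), and splitting on the substring "Italic" in the filename is the naive heuristic described above:

```python
def partition_styles(fonts):
    """Split fonts into upright and italic subsets by filename.

    fonts: dict mapping filename -> glyph-order list.
    """
    uprights, italics = {}, {}
    for name, order in fonts.items():
        (italics if "Italic" in name else uprights)[name] = order
    return uprights, italics


def glyph_order_mismatches(subset):
    """Return filenames whose glyph order differs from the first font's."""
    if not subset:
        return []
    names = list(subset)
    reference = subset[names[0]]
    return [n for n in names[1:] if subset[n] != reference]


# Toy family: uprights agree, the italic (checked separately) differs.
fonts = {
    "Family-Regular.ttf": [".notdef", "A", "B"],
    "Family-Bold.ttf": [".notdef", "A", "B"],
    "Family-Italic.ttf": [".notdef", "B", "A"],
}
uprights, italics = partition_styles(fonts)
assert glyph_order_mismatches(uprights) == []  # uprights are consistent
```

Running the same mismatch check per subset is the "automatism" suggested above: the caller never has to break up the input fonts by hand.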
|
gharchive/issue
| 2018-04-07T15:56:48 |
2025-04-01T04:34:25.039746
|
{
"authors": [
"davelab6",
"madig"
],
"repo": "googlefonts/fontbakery",
"url": "https://github.com/googlefonts/fontbakery/issues/1779",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
380702204
|
fontbakery --version may return incorrect version number
fontbakery --version will return the version number at the point of installing the package. It won't update if the user has git pulled from the repo.
from the convo in https://github.com/googlefonts/fontbakery/issues/2194
are you sure about that?
I noticed that Lasse did not run fontbakery --version right after fetching and rebasing. At that point we could have seen a different output, but that possibility was not verified yet since Lasse jumped straight to a new pip install command in that example.
for the record, we're both talking about this specific comment (https://github.com/googlefonts/fontbakery/issues/2194#issuecomment-438347326) by @graphicore, here.
you mean that you remember that when you ran those commands the fetching and rebasing were actually unnecessary?
Just did this
$ fontbakery --version
0.0.16.dev2539+g29aea5ac
$ git reset --hard 7a5a29b624de31a521d5c522f662448a1cdaa8ae # pre 0.6
$ fontbakery --version
0.0.16.dev2539+g29aea5ac
If I was to go back to a previous commit, I'd expect the --version flag to reflect this.
Well, I mean Current branch master is up to date. is what you get when there was nothing to do:
(venv) commander@ghoul:~/Projekte/fontdev/googlefonts/fontbakery$ git checkout master
Switched to branch 'master'
Your branch is up to date with 'origin/master'.
(venv) commander@ghoul:~/Projekte/fontdev/googlefonts/fontbakery$ git branch test_branch HEAD~10
(venv) commander@ghoul:~/Projekte/fontdev/googlefonts/fontbakery$ git checkout test_branch
Switched to branch 'test_branch'
(venv) commander@ghoul:~/Projekte/fontdev/googlefonts/fontbakery$ git rebase master
First, rewinding head to replay your work on top of it...
Fast-forwarded test_branch to master.
(venv) commander@ghoul:~/Projekte/fontdev/googlefonts/fontbakery$ git rebase master
Current branch test_branch is up to date.
That’s correct. It is by design like that. Setuptools_scm generates the version string from git only upon running the setup.py script (by writing that _version.py file which is .gitignored).
If you want the version string to be dynamically generated every time one imports fontbakery, you could use versioneer.py
https://github.com/warner/python-versioneer
Personally I prefer setuptools_scm, and don’t mind re-doing pip install -e . after each git pull.
|
gharchive/issue
| 2018-11-14T13:39:14 |
2025-04-01T04:34:25.045482
|
{
"authors": [
"anthrotype",
"felipesanches",
"graphicore",
"m4rc1e"
],
"repo": "googlefonts/fontbakery",
"url": "https://github.com/googlefonts/fontbakery/issues/2198",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
568771569
|
Updates typography values
My goal here is to update the base typography styles, building with or leaving as much of the existing system in place as possible!
Removes use of and support for <h4> and <h5> entirely—they were only used once, more as type sizes rather than content hierarchy, and this helps reduce the surface area of the system a bit
Swaps use of line height values, so the line height increases as font size gets smaller. I didn’t look into the underlying system, I only swapped the values around
Increases line height for paragraphs in general
Tweaks h1 and h2 font size and weights slightly to match design reference
I’m going to merge this into a develop branch, which I am going to keep pushing to gh-pages for the moment.
Not a perfect diff, but hopefully helps a bit:
Less is changing here, as I did a lot of the vertical spacing changes already using the --body-text-spacer variable:
@jpamental If this seems like I’m using the system well enough, I’ll keep using it from here without requesting a review, but wanted to get your feedback. Thanks!
@kennethormandy I disagree pretty strongly with a good bit of this. I worked extensively with Irin to get to a place where we both were OK with the typography, and I don't want to lose any more of the hierarchy. The spacing changes around the headings (the h2 in particular) look good though.
As far as line height goes, it was intentional that the line height should be smaller with narrower screens, as it should match up with the shorter line length. Opening it up a bit more on larger screens is fine (it currently maxes out at 1.4; going to 1.5 is fine but shouldn't be more than that)—but it definitely should not be larger on smaller screens: it will reduce reading speed and overall color.
I realize you weren't there for a lot of these conversations, but the typography on the site was my responsibility, and as far as it was left between Irin and I last week, the typography was settled and shouldn't be getting changed at this point.
It would be good to be more on the same page; if you want to find time to chat later today that might be a good idea.
It's also worth pointing out that the site will likely be growing, and it's probably worth keeping the h4 and h5 styles as they will probably end up getting used at some point
Here’s what was cherry picked in:
Kept the vertical spacing changes using the spacing multiplier so they all scale afcc00a
Changed the line height to a higher value with a smaller range, from a 1.4 minimum to a 1.5 maximum
Closing the other font size changes, kept the h4 and h5 styling, but they are not currently in use in the content
|
gharchive/pull-request
| 2020-02-21T06:59:27 |
2025-04-01T04:34:25.063386
|
{
"authors": [
"jpamental",
"kennethormandy"
],
"repo": "googlefonts/variable-fonts",
"url": "https://github.com/googlefonts/variable-fonts/pull/118",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
98849167
|
Update VariantSimilarity pipeline to optionally use streaming.
Tested via:
utils-java/refactor branch: mvn install
dataflow-java/refactor-redo branch: mvn -Dit.test=VariantSimilarityITCase verify
LGTM. Feel free to merge after adding the comment (or removing the option in case it's not used?)
Removed that option - thanks @Careyjmac
|
gharchive/pull-request
| 2015-08-03T22:20:56 |
2025-04-01T04:34:25.090642
|
{
"authors": [
"deflaux",
"jean-philippe-martin"
],
"repo": "googlegenomics/dataflow-java",
"url": "https://github.com/googlegenomics/dataflow-java/pull/121",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
204404031
|
An observation: GlyphData.xml ignored for 11 cases/sources
In the past, a custom GlyphData.xml was delivered along with Glyphs source files. The following deliveries have had GlyphData.xml included before, which we ignored (and did not even store in noto-source):
NotoSansArabic
NotoSansArabicUI
NotoSansCherokee
NotoSansGeorgian
NotoSansHebrew
NotoSansMyanmar
NotoSansMyanmarUI
NotoSerifArmenian
NotoSerifGeorgian
NotoSerifHebrew
NotoSerifMyanmar
To simplify the workflow and reduce the chance of things going wrong, could you make the Glyphs sources self-contained? According to https://github.com/googlei18n/glyphsLib/issues/12, “Glyphs can store production names in the .glyphs file now”. Not sure how to do this, but it’s surely explained somewhere on the forums.
We could also compare the content of those custom files against the XML files in https://github.com/schriftgestalt/GlyphsInfo/, and resolve any differences. @marekjez86, could you find somebody locally who could write a little script? Shouldn’t take more than a few hours.
@brawer : I have placed the list for the record. For a while, MTI has been producing only standalone sources and they'll investigate what might be done here. I just didn't want to lose the track of the names of the sources that we need to deal with.
Has anyone done this comparison? If not: @marekjez86, can you send me those GlyphData.xml files?
note that there are some complex scripts also with GlyphData.xml files
I haven't attached these here
The files above are all the exact same GlyphData.xml. And they aren't used at all in these particular Glyphs files. There's no reason for these GlyphData.xml to be there because nothing related to these scripts is in the file. The only one with related info is Arabic, but it almost all matches what's in the Glyphs source file. I wrote a script to check these things and also make the Glyphs file self contained so the GlyphData.xml can be removed. The only thing I see from the above is that NotoSansArabicUI has one discrepancy:
Glyph production name for fehDotbelow-ar is uni06A2. GlyphData.xml expects uni06A3.
Glyph unicode for fehDotbelow-ar is 06A2. GlyphData.xml expects 06A3.
NotoSansArabic matches the GlyphData.xml. Also in this case it matches the default GlyphData.xml. I also checked Myanmar and Thai and I see nothing used in a custom GlyphData.xml.
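The kind of check described here can be sketched roughly as follows; the XML fragment and attribute names are simplifying assumptions for illustration, not the real GlyphData.xml schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical minimal GlyphData.xml fragment (the real schema has more fields).
GLYPH_DATA = """
<glyphData>
  <glyph name="fehDotbelow-ar" production="uni06A3" unicode="06A3"/>
</glyphData>
"""

def expected_glyph_info(xml_text):
    """Map glyph name -> (production name, unicode) from a GlyphData.xml blob."""
    root = ET.fromstring(xml_text)
    return {g.get('name'): (g.get('production'), g.get('unicode'))
            for g in root.iter('glyph')}

# What a Glyphs source declares (values taken from the discrepancy report above).
source = {'fehDotbelow-ar': ('uni06A2', '06A2')}

expected = expected_glyph_info(GLYPH_DATA)
# Collect every glyph whose declared info disagrees with GlyphData.xml.
discrepancies = {name: (got, expected[name])
                 for name, got in source.items()
                 if name in expected and got != expected[name]}
```

This surfaces exactly the NotoSansArabicUI mismatch quoted above.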
|
gharchive/issue
| 2017-01-31T19:44:21 |
2025-04-01T04:34:25.099146
|
{
"authors": [
"brawer",
"marekjez86",
"punchcutter"
],
"repo": "googlei18n/fontmake",
"url": "https://github.com/googlei18n/fontmake/issues/245",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
50888076
|
German mobile phone numbers not recognized
Imported from Google Code issue #359 created by nicolas.schwartz@blablacar.com on 2013-10-15T08:06:45.000Z:
What steps will reproduce the problem?
use the demo java tool to parse a Number
enter the phone numbers and DE as a country, submit
each time isValidNumber() returns false BUT those numbers are real (I was told we had the people on the phone.)
version of the demo is r603
Here are the phone numbers (I replaced the last digit with a ? for the privacy of our members)
0176341285? => O2
017670221410? => O2
0151261416? => ???
0157399893052? => ???
Fixed in release 7.3.1
|
gharchive/issue
| 2014-12-03T20:24:57 |
2025-04-01T04:34:25.102521
|
{
"authors": [
"mihaelacr-google",
"padmaksha"
],
"repo": "googlei18n/libphonenumber",
"url": "https://github.com/googlei18n/libphonenumber/issues/359",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
273151881
|
[hb_input] get unicode from ordinal past 0x10000 on narrow Python build
When trying to run hbinput on the Roboto sources, I get the following traceback
Traceback (most recent call last):
File "_hbtest.py", line 8, in <module>
print seq.all_inputs()
File "/Users/marc/Documents/nototools/nototools/hb_input.py", line 60, in all_inputs
cur_input = self.input_from_name(name, pad=is_zero_width)
File "/Users/marc/Documents/nototools/nototools/hb_input.py", line 97, in input_from_name
text = unichr(self.reverse_cmap[name])
ValueError: unichr() arg not in range(0x10000) (narrow Python build)
If you're running a narrow Python build, the function unichr will only return Unicode chars for ordinals up to 0x10000. Roboto has two glyphs which are outside this range, uni1F16A and uni1F16B.
We can address this issue by converting the ordinal into a Unicode escape string; it's then stored as a UTF-16 surrogate pair on a narrow build, which can then be re-encoded back to UTF-8.
This is not my solution but it works nicely. Found it here, https://stackoverflow.com/questions/7105874/valueerror-unichr-arg-not-in-range0x10000-narrow-python-build/7107319
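The workaround can be sketched like this (a hedged illustration: Python 3 has no narrow builds and chr covers the full range, so the fallback branch only mimics the Python 2 narrow-build behaviour described here):

```python
def narrow_unichr(codepoint):
    """Return the character for codepoint, tolerating narrow builds."""
    try:
        # Wide builds (and Python 3's chr) cover the full Unicode range.
        return chr(codepoint)
    except ValueError:
        # Narrow-build fallback described above: build a \UXXXXXXXX escape
        # string and decode it; the result is stored internally as a UTF-16
        # surrogate pair and can be re-encoded to UTF-8 as usual.
        return ('\\U%08x' % codepoint).encode('ascii').decode('unicode-escape')

# The two Roboto glyphs mentioned above lie beyond the 0x10000 limit.
roboto_chars = [narrow_unichr(0x1F16A), narrow_unichr(0x1F16B)]
```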
Why reinvent the wheel? Fonttools is already a requirement of nototools and includes a unichr that does exactly that, in the fontTools.misc.py23 module
@anthrotype thanks! I'll investigate and use that instead.
@anthrotype thanks! does exactly what I wanted.
More than happy to squash and clean up this commit message if approved.
|
gharchive/pull-request
| 2017-11-11T15:12:55 |
2025-04-01T04:34:25.106549
|
{
"authors": [
"anthrotype",
"m4rc1e"
],
"repo": "googlei18n/nototools",
"url": "https://github.com/googlei18n/nototools/pull/465",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
678326288
|
Volunteer-Isolate Matching
UI mock for matching volunteers with isolates. This shows what the pages might look like on both sides, i.e. the volunteers might have multiple requests and so they are shown in a table.
Maybe add some screenshots of the UI just so it's easier to review, since it is a UI-heavy PR.
|
gharchive/pull-request
| 2020-08-13T10:21:39 |
2025-04-01T04:34:25.114303
|
{
"authors": [
"kevlartest",
"thomasogara"
],
"repo": "googleinterns/step226-2020",
"url": "https://github.com/googleinterns/step226-2020/pull/2",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
518902294
|
Travis emulator executions occasionally fail
This is a follow-on to https://github.com/googlemaps/android-maps-utils/issues/371, which was fixed in PR https://github.com/googlemaps/android-maps-utils/pull/573 - after this PR the emulator is running again on Travis and executing the instrumented unit tests.
However, we're still seeing occasional failures on Travis which appear to be related to adb not being able to connect to the emulator.
Here's an example failed execution on Travis:
https://travis-ci.org/CUTR-at-USF/android-maps-utils/builds/607898077
Here's the relevant log:
> Task :library:packageDebugAndroidTest UP-TO-DATE
[PropertyFetcher]: ShellCommandUnresponsiveException getting properties for device emulator-5554: null
10:35:39 V/ddms: execute: running getprop
10:35:41 V/ddms: execute: returning
10:35:41 W/PropertyFetcher: ShellCommandUnresponsiveException getting properties for device emulator-5554: null
> Task :library:connectedDebugAndroidTest
Skipping device 'test(AVD)' for 'library:': Unknown API Level
> : No compatible devices connected.[TestRunner] FAILED
Found 1 connected device(s), 0 of which were compatible.
10:35:41 I/XmlResultReporter: XML test result file generated at /home/travis/build/CUTR-at-USF/android-maps-utils/library/build/outputs/androidTest-results/connected/TEST-TestRunner-library-.xml. Total tests 1, failure 1,
> Task :library:connectedDebugAndroidTest FAILED
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':library:connectedDebugAndroidTest'.
> There were failing tests. See the report at: file:///home/travis/build/CUTR-at-USF/android-maps-utils/library/build/reports/androidTests/connected/index.html
* Try:
Run with --info or --debug option to get more log output. Run with --scan to get full insights.
* Exception is:
org.gradle.api.tasks.TaskExecutionException: Execution failed for task ':library:connectedDebugAndroidTest'.
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$3.accept(ExecuteActionsTaskExecuter.java:166)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$3.accept(ExecuteActionsTaskExecuter.java:163)
at org.gradle.internal.Try$Failure.ifSuccessfulOrElse(Try.java:191)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:156)
at org.gradle.api.internal.tasks.execution.ValidatingTaskExecuter.execute(ValidatingTaskExecuter.java:62)
at org.gradle.api.internal.tasks.execution.SkipEmptySourceFilesTaskExecuter.execute(SkipEmptySourceFilesTaskExecuter.java:108)
at org.gradle.api.internal.tasks.execution.ResolveBeforeExecutionOutputsTaskExecuter.execute(ResolveBeforeExecutionOutputsTaskExecuter.java:67)
at org.gradle.api.internal.tasks.execution.ResolveAfterPreviousExecutionStateTaskExecuter.execute(ResolveAfterPreviousExecutionStateTaskExecuter.java:46)
at org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:94)
at org.gradle.api.internal.tasks.execution.FinalizePropertiesTaskExecuter.execute(FinalizePropertiesTaskExecuter.java:46)
at org.gradle.api.internal.tasks.execution.ResolveTaskExecutionModeExecuter.execute(ResolveTaskExecutionModeExecuter.java:95)
at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:57)
at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:56)
at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:36)
at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.executeTask(EventFiringTaskExecuter.java:77)
at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:55)
at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:52)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:416)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:406)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$1.execute(DefaultBuildOperationExecutor.java:165)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:250)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:158)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.call(DefaultBuildOperationExecutor.java:102)
at org.gradle.internal.operations.DelegatingBuildOperationExecutor.call(DelegatingBuildOperationExecutor.java:36)
at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter.execute(EventFiringTaskExecuter.java:52)
at org.gradle.execution.plan.LocalTaskNodeExecutor.execute(LocalTaskNodeExecutor.java:43)
at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:355)
at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:343)
at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:336)
at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:322)
at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker$1.execute(DefaultPlanExecutor.java:134)
at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker$1.execute(DefaultPlanExecutor.java:129)
at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.execute(DefaultPlanExecutor.java:202)
at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.executeNextNode(DefaultPlanExecutor.java:193)
at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.run(DefaultPlanExecutor.java:129)
at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
Caused by: org.gradle.api.GradleException: There were failing tests. See the report at: file:///home/travis/build/CUTR-at-USF/android-maps-utils/library/build/reports/androidTests/connected/index.html
at com.android.build.gradle.internal.tasks.DeviceProviderInstrumentTestTask.doTaskAction(DeviceProviderInstrumentTestTask.java:226)
at com.android.build.gradle.internal.tasks.NonIncrementalTask$taskAction$$inlined$recordTaskAction$1.invoke(AndroidVariantTask.kt:51)
at com.android.build.gradle.internal.tasks.NonIncrementalTask$taskAction$$inlined$recordTaskAction$1.invoke(AndroidVariantTask.kt:31)
at com.android.build.gradle.internal.tasks.Blocks.recordSpan(Blocks.java:91)
at com.android.build.gradle.internal.tasks.NonIncrementalTask.taskAction(NonIncrementalTask.kt:34)
at org.gradle.internal.reflect.JavaMethod.invoke(JavaMethod.java:103)
at org.gradle.api.internal.project.taskfactory.StandardTaskAction.doExecute(StandardTaskAction.java:49)
at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:42)
at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:28)
at org.gradle.api.internal.AbstractTask$TaskActionWrapper.execute(AbstractTask.java:717)
at org.gradle.api.internal.AbstractTask$TaskActionWrapper.execute(AbstractTask.java:684)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$5.run(ExecuteActionsTaskExecuter.java:476)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:402)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:394)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$1.execute(DefaultBuildOperationExecutor.java:165)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:250)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:158)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:92)
at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeAction(ExecuteActionsTaskExecuter.java:461)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:444)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.access$200(ExecuteActionsTaskExecuter.java:93)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$TaskExecution.execute(ExecuteActionsTaskExecuter.java:237)
at org.gradle.internal.execution.steps.ExecuteStep.lambda$execute$1(ExecuteStep.java:33)
at org.gradle.internal.execution.steps.ExecuteStep.execute(ExecuteStep.java:33)
at org.gradle.internal.execution.steps.ExecuteStep.execute(ExecuteStep.java:26)
at org.gradle.internal.execution.steps.CleanupOutputsStep.execute(CleanupOutputsStep.java:58)
at org.gradle.internal.execution.steps.CleanupOutputsStep.execute(CleanupOutputsStep.java:35)
at org.gradle.internal.execution.steps.ResolveInputChangesStep.execute(ResolveInputChangesStep.java:48)
at org.gradle.internal.execution.steps.ResolveInputChangesStep.execute(ResolveInputChangesStep.java:33)
at org.gradle.internal.execution.steps.CancelExecutionStep.execute(CancelExecutionStep.java:39)
at org.gradle.internal.execution.steps.TimeoutStep.executeWithoutTimeout(TimeoutStep.java:73)
at org.gradle.internal.execution.steps.TimeoutStep.execute(TimeoutStep.java:54)
at org.gradle.internal.execution.steps.CatchExceptionStep.execute(CatchExceptionStep.java:35)
at org.gradle.internal.execution.steps.CreateOutputsStep.execute(CreateOutputsStep.java:51)
at org.gradle.internal.execution.steps.SnapshotOutputsStep.execute(SnapshotOutputsStep.java:45)
at org.gradle.internal.execution.steps.SnapshotOutputsStep.execute(SnapshotOutputsStep.java:31)
at org.gradle.internal.execution.steps.CacheStep.executeWithoutCache(CacheStep.java:208)
at org.gradle.internal.execution.steps.CacheStep.execute(CacheStep.java:70)
at org.gradle.internal.execution.steps.CacheStep.execute(CacheStep.java:45)
at org.gradle.internal.execution.steps.BroadcastChangingOutputsStep.execute(BroadcastChangingOutputsStep.java:49)
at org.gradle.internal.execution.steps.StoreSnapshotsStep.execute(StoreSnapshotsStep.java:43)
at org.gradle.internal.execution.steps.StoreSnapshotsStep.execute(StoreSnapshotsStep.java:32)
at org.gradle.internal.execution.steps.RecordOutputsStep.execute(RecordOutputsStep.java:38)
at org.gradle.internal.execution.steps.RecordOutputsStep.execute(RecordOutputsStep.java:24)
at org.gradle.internal.execution.steps.SkipUpToDateStep.executeBecause(SkipUpToDateStep.java:96)
at org.gradle.internal.execution.steps.SkipUpToDateStep.lambda$execute$0(SkipUpToDateStep.java:89)
at org.gradle.internal.execution.steps.SkipUpToDateStep.execute(SkipUpToDateStep.java:54)
at org.gradle.internal.execution.steps.SkipUpToDateStep.execute(SkipUpToDateStep.java:38)
at org.gradle.internal.execution.steps.ResolveChangesStep.execute(ResolveChangesStep.java:76)
at org.gradle.internal.execution.steps.ResolveChangesStep.execute(ResolveChangesStep.java:37)
at org.gradle.internal.execution.steps.legacy.MarkSnapshottingInputsFinishedStep.execute(MarkSnapshottingInputsFinishedStep.java:36)
at org.gradle.internal.execution.steps.legacy.MarkSnapshottingInputsFinishedStep.execute(MarkSnapshottingInputsFinishedStep.java:26)
at org.gradle.internal.execution.steps.ResolveCachingStateStep.execute(ResolveCachingStateStep.java:90)
at org.gradle.internal.execution.steps.ResolveCachingStateStep.execute(ResolveCachingStateStep.java:48)
at org.gradle.internal.execution.steps.CaptureStateBeforeExecutionStep.execute(CaptureStateBeforeExecutionStep.java:69)
at org.gradle.internal.execution.steps.CaptureStateBeforeExecutionStep.execute(CaptureStateBeforeExecutionStep.java:47)
at org.gradle.internal.execution.impl.DefaultWorkExecutor.execute(DefaultWorkExecutor.java:33)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:140)
... 34 more
* Get more help at https://help.gradle.org
BUILD FAILED in 49s
Environment details
master branch as of https://github.com/googlemaps/android-maps-utils/commit/a7ff535e7d81e137e34ac1b0d5e3b93b78100178 (after 0.6.2 release)
Steps to reproduce
Make changes and open a PR to see a Travis build
Reopening this, as I'm still seeing occasional execution failures. I restarted this master branch merge build twice and it finally passed on the third execution:
https://travis-ci.org/googlemaps/android-maps-utils/builds/612104823
|
gharchive/issue
| 2019-11-06T22:44:13 |
2025-04-01T04:34:25.121305
|
{
"authors": [
"barbeau"
],
"repo": "googlemaps/android-maps-utils",
"url": "https://github.com/googlemaps/android-maps-utils/issues/575",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1067079763
|
Mocks for google.maps.Data
Is your feature request related to a problem? Please describe.
I would like to have this library support the google.maps.Data, e.g. for adding GeoJSON data to a map. Data class reference.
Describe the solution you'd like
Create mock implementation for google.maps.Data and provide it as a property for google.maps.Map as well as a standalone class from which Data instances can be constructed.
Additional context
I'm currently mocking the base Data class manually in some project. If you are ok with this feature request, I can submit a PR with mocks for the base class. In a next step, we could also provide mocks for other classes on google.maps.Data, e.g. google.maps.Data.Feature etc.
A pull request would be great!
Done, let me know if this is ok for you. I didn't find any style guidelines so I tried to adhere to what was already there.
|
gharchive/issue
| 2021-11-30T10:46:50 |
2025-04-01T04:34:25.125634
|
{
"authors": [
"eegli",
"jpoehnelt"
],
"repo": "googlemaps/js-jest-mocks",
"url": "https://github.com/googlemaps/js-jest-mocks/issues/141",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
456346505
|
Add the steps to generate the winafl.dll file in the documentation
I think that the following line should be included in the documentation:
git submodule update --init --recursive
the command was extracted from issue: https://github.com/googleprojectzero/winafl/issues/184
Done.
|
gharchive/issue
| 2019-06-14T16:59:20 |
2025-04-01T04:34:25.127198
|
{
"authors": [
"ecc2016",
"ifratric"
],
"repo": "googleprojectzero/winafl",
"url": "https://github.com/googleprojectzero/winafl/issues/188",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
294911664
|
using no parameters in device action
Hello,
I am attempting to write some custom device actions using the google-assistant-sdk python library. For some custom traits the user provides parameters via query patterns, and for others the query pattern contains no parameters. I am having issues when I am not using any parameters. In my action.json file I have the following for a trait that requires no parameters.
"intent": {
"name": "addLight",
"trigger": {
"queryPatterns": [
"Add light"
]
}
},
"directResponseFulfillment": {
"ttsPattern": "adding light"
},
"requiredTraits": ["light"]
}
I am getting an error when I add the trait in the sample hotword.py:
if event.type == EventType.ON_DEVICE_ACTION:
for command, params in process_device_actions(event, device_id):
if command == "addLight":
print('Attempting to add light')
send_osc( "/lights/add" );
The error I get is :
line 74, in process_device_actions
if e['params']:
KeyError: 'params'
Basically it is saying that there are no params. And that is correct, because I do not want the user to provide any parameters. Is there any way to assign null to the parameters from the device action json file? If someone can give an example of the python side of the api regarding how to add traits without any parameters, it would be really helpful. Please let me know if you need further explanation. Thanks and regards.
Right now custom device actions are not generally available. With regards to the bug itself, you should be able to modify hotword.py to not include params. We will fix this bug in the future.
if 'params' in e:
yield e['command'], e['params']
else:
yield e['command'], None
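For illustration only (the event dicts here are hypothetical, not the SDK's actual shapes), the same guard can use dict.get, which returns None when "params" is absent:

```python
def yield_device_actions(inputs):
    """Yield (command, params) pairs, tolerating actions with no params."""
    for e in inputs:
        # dict.get avoids the KeyError when the query pattern has no params.
        yield e['command'], e.get('params')

actions = list(yield_device_actions([
    {'command': 'addLight'},                               # no parameters supplied
    {'command': 'setBrightness', 'params': {'level': 5}},  # parameters provided
]))
```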
Thank you! That change in the if statement took care of the error. Regards.
|
gharchive/issue
| 2018-02-06T21:03:28 |
2025-04-01T04:34:25.134251
|
{
"authors": [
"Fleker",
"sandblasted7"
],
"repo": "googlesamples/assistant-sdk-python",
"url": "https://github.com/googlesamples/assistant-sdk-python/issues/178",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
128405708
|
Can't get token id for user
I am trying to retrieve an ID token for a user using the sample app in google-services/android/signin/app, but it always ends up with an access token, not a JWT token.
Can anyone help me figure out what I am doing wrong?
What makes you think you have an access token and not a JWT token? And if you do have an access token, where are you getting it from?
we are making login/register using cross-auth . Using google sdk we get ID TOKEN to send it to the backend which uses it to do the login/register process. it is working fine using google login sdk on ios which generates a code like this 4/4x0Hh1acrl7OPyzAYtuziNCX5dmsj5ewclRIuWN0qxg which the backend takes and complete the process. but on android i got an access token like this eyJhbGciOiJSUzI1NiIsImtpZCI6ImIyY2ZiZDIxOGNmYzIyNGYyODQ2Y2ViNzE0MDNkMDVmMTc1ZDJiM2MifQ.eyJpc3MiOiJodHRwczovL2FjY291bnRzLmdvb2dsZS5jb20iLCJhdWQiOiI1MjI3MzQ0NDI4MC1zZTlnZ3ZnYjU4MWs2aDYzNDZlZWliOXNlcWYwOWFkMS5hcHBzLmdvb2dsZXVzZXJjb250ZW50LmNvbSIsInN1YiI6IjExNjE4NTUxNjA2Nzc0MDk4MDU5OCIsImF6cCI6IjUyMjczNDQ0MjgwLWVmbmxqYjhmdmFuN3MwcnNzZGFvMjNma2I3aWZmZmRlLmFwcHMuZ29vZ2xldXNlcmNvbnRlbnQuY29tIiwiaWF0IjoxNDUzNzU1MTM1LCJleHAiOjE0NTM3NTg3MzUsIm5hbWUiOiJNb2hhbWVkIFpha2FyaWEgRWwtWm9naGJpIiwicGljdHVyZSI6Imh0dHBzOi8vbGg2Lmdvb2dsZXVzZXJjb250ZW50LmNvbS8tWDBSZG12cXNNR0EvQUFBQUFBQUFBQUkvQUFBQUFBQUFYNGMvUmNYWXpPbFo4VlUvczk2LWMvcGhvdG8uanBnIiwiZ2l2ZW5fbmFtZSI6Ik1vaGFtZWQiLCJmYW1pbHlfbmFtZSI6Ilpha2FyaWEgRWwtWm9naGJpIiwibG9jYWxlIjoiZW4ifQ.TA_mzwqdGIqZwm7pbVRhN764-CCIYRsYcY537X3hrfO3dHMuhq_GPceJlXBNeD-5PK8bL6sG-V4onZD0k4Mss-GW7OPtfSqY7uqdVC_yfIMpEDMV_rPvHsw02ecdYmWXDMm7ccnTn5SEGSjCuXotTBTnDivONTYrwn15vW9GcaOoA_ihywbwgUpnxNPFmHZ-No596VsfUz34bQBIuQbamf93lGAtHulWEusytHASDSN3XHnifYnEkJ8vbiAIdGL13WnRmf7EfDrBOcEBmv4PjFIp3K4dq6j8Rcsij66e-Wm4noiDlHqgO3DDsIMeO2dwSe5errRR1c2FkHB5YLgHBQ which the backend don't accept !
I solved that; this short code is called a serverAuthCode :)
|
gharchive/issue
| 2016-01-24T15:49:25 |
2025-04-01T04:34:25.137356
|
{
"authors": [
"mzelzoghbi",
"samtstern"
],
"repo": "googlesamples/google-services",
"url": "https://github.com/googlesamples/google-services/issues/149",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
2172866350
|
🛑 goormIDE is down
In f2ec9c7, goormIDE (https://ide.goorm.io) was down:
HTTP code: 403
Response time: 652 ms
Resolved: goormIDE is back up in 4fcb66c after 36 minutes.
|
gharchive/issue
| 2024-03-07T03:26:08 |
2025-04-01T04:34:25.161684
|
{
"authors": [
"rlatjdwn4926"
],
"repo": "goorm-dev/goorm-status",
"url": "https://github.com/goorm-dev/goorm-status/issues/3730",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2185908746
|
🛑 goormEDU is down
In 919c91a, goormEDU (https://edu.goorm.io) was down:
HTTP code: 403
Response time: 736 ms
Resolved: goormEDU is back up in 60cef7b after 5 minutes.
|
gharchive/issue
| 2024-03-14T09:47:24 |
2025-04-01T04:34:25.163953
|
{
"authors": [
"rlatjdwn4926"
],
"repo": "goorm-dev/goorm-status",
"url": "https://github.com/goorm-dev/goorm-status/issues/4006",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
679837929
|
No way to natively disable linking to a folder
Most of the time, when the link type is "File on this website", the intention is for content authors to link specifically to files, and not to folders - but either through misunderstanding or mis-clicking, content authors can currently link to folders.
TreeDropdownField allows passing in a 'DisableFunction' to set certain nodes as disabled (i.e. not selectable). I propose setting a function that will disable all folders.
This should of course be configurable (be it globally via .yml config, or per link field through some sort of linkfield configuration) in case linking to folders is intended for some application (and to avoid breaking backwards compatibility).
I am happy to submit a pull request to resolve this - it's what I'm working on right now in an extension for an application I'm working on. Just let me know your preferred configuration out of the above (or other) options.
Feel free to make a PR. I could be wrong but think it makes sense to disable it by default as I never intended to link to a folder in the first place. Aka this is a bug.
I agree that it would ideally be disabled by default - the only problem with that is that it's technically a breaking change since (regardless of original intentions) someone may be using the Link model to intentionally link folders. In theory it would be a major version release.
That said I'll submit a PR later today. Unless you indicate otherwise I'll make it configurable globally via .yml, and disabled by default.
I have made a PR (#15) to address this.
PR Merged. Closing this issue.
|
gharchive/issue
| 2020-08-16T22:21:00 |
2025-04-01T04:34:25.275032
|
{
"authors": [
"GuySartorelli",
"gorriecoe"
],
"repo": "gorriecoe/silverstripe-link",
"url": "https://github.com/gorriecoe/silverstripe-link/issues/14",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
2272001362
|
Can't go get the package...
> go get github.com/yohamta/gosbd
go: github.com/yohamta/gosbd@upgrade (v0.1.0) requires github.com/yohamta/gosbd@v0.1.0: parsing go.mod:
module declares its path as: github.com/gosbd/gosbd
but was required as: github.com/yohamta/gosbd
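For context, this is Go's standard module-path mismatch: the repository moved, and go get must use the path the module's go.mod declares. The declaring line (taken from the error message above) is:

```
module github.com/gosbd/gosbd
```

Fetching it as `go get github.com/gosbd/gosbd` should therefore succeed.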
https://github.com/gosbd/gosbd/pull/2
|
gharchive/issue
| 2024-04-30T16:36:27 |
2025-04-01T04:34:25.276280
|
{
"authors": [
"tamis-laan"
],
"repo": "gosbd/gosbd",
"url": "https://github.com/gosbd/gosbd/issues/1",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1309081690
|
Support for docx to pdf error
I've deployed the dockerized Gotenberg to our clusters. When I run a request from Postman, it returns a 500 error, and the log is:
(Gotenberg ASCII-art banner)
A Docker-powered stateless API for PDF files.
Version: 7.5.3
-------------------------------------------------------
[SYSTEM] modules: api chromium gc libreoffice logging pdfcpu pdfengines pdftk prometheus qpdf uno uno-pdfengine webhook
[SYSTEM] pdfengines: pdfcpu pdftk qpdf uno-pdfengine
[SYSTEM] uno: long-running LibreOffice listener ready to start
[SYSTEM] gc: application started
[SYSTEM] prometheus: collecting metrics
[SYSTEM] api: server listening on port 3000
{"level":"error","ts":1658214348.5521502,"logger":"api","msg":"convert to PDF: unoconv PDF: unix process error: wait for unix process: exit status 6","trace":"ea1cb927-1bff-4af9-b358-94461e7e5c49","remote_ip":"10.51.136.65","host":"10.51.136.65","uri":"/forms/libreoffice/convert?serviceNamespace=hkcrm&serviceName=gotenberg&serviceVersion=1.0.0&","method":"POST","path":"/forms/libreoffice/convert","referer":"","user_agent":"PostmanRuntime/7.29.0","status":500,"latency":2598753594,"latency_human":"2.598753594s","bytes_in":31625,"bytes_out":21}
How do I solve this unix process error: wait for unix process: exit status 6 error?
Downgrading to version 7.4.2 solved the problem.
|
gharchive/issue
| 2022-07-19T07:10:10 |
2025-04-01T04:34:25.288741
|
{
"authors": [
"adsian"
],
"repo": "gotenberg/gotenberg",
"url": "https://github.com/gotenberg/gotenberg/issues/476",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
439976349
|
Mouse wheel does not work if mouse button is pressed
The mouse wheel does not work while a mouse button is held pressed.
There is no reaction in the event listener or in the ScrolledWindow.
Is this a gotk3 bug, or should I change something in my code?
Small example code:
package main
import (
"fmt"
"os"
"strconv"
"github.com/gotk3/gotk3/gdk"
"github.com/gotk3/gotk3/gtk"
)
func main() {
gtk.Init(nil)
win, _ := gtk.WindowNew(gtk.WINDOW_TOPLEVEL)
win.Connect("destroy", func() {
fmt.Println("bye")
os.Exit(0)
})
win.SetDefaultSize(300, 200)
win.SetTitle("scroll test")
scroll_n := 0
win.Connect("scroll-event", func(_ *gtk.Window, event *gdk.Event) {
scroll_n++
fmt.Println("scroll - " + strconv.Itoa(scroll_n))
})
big_content, _ := gtk.BoxNew(gtk.ORIENTATION_VERTICAL, 0)
label0, _ := gtk.LabelNew("try mouse wheel scroll with left mouse button pressed")
big_content.Add(label0)
for j := 0; j < 99; j++ {
label, _ := gtk.LabelNew("text_" + strconv.Itoa(j+1))
big_content.Add(label)
}
scroll, _ := gtk.ScrolledWindowNew(nil, nil)
scroll.SetVExpand(true)
scroll.SetHExpand(true)
scroll.Add(big_content)
win.Add(scroll)
win.ShowAll()
for {
gtk.MainIteration()
}
}
I am unsure whether this is desired behaviour or not. Try writing the same application in C or Vala in order to test it out. Also, why are you doing this:
for {
gtk.MainIteration()
}
Did anyone recommend you to do so?
is it not equivalent to the following: gtk.Main() ?
My full code looks like this:
go start_async_loading(chan_var)
for {
select {
case loaded_data := <-chan_var:
gtk_label_object.SetText(loaded_data)
default:
//CONTINUE!!
}
gtk.MainIteration()
}
ok. This solved my problem =)
scroll.SetEvents(int(gdk.ALL_EVENTS_MASK))
Have you managed doing this via trial and error? Or does the documentation recommend you to do so? Because this seems a bit odd to me. Maybe it doesn't even make sense to count the scrolling event while a mouse-button is down.
Yes, via trial and error.
I am creating a file manager (look at my repos).
Such programs usually allow selecting files with the mouse, so I needed scrolling to work.
|
gharchive/issue
| 2019-05-03T10:04:21 |
2025-04-01T04:34:25.298563
|
{
"authors": [
"Bios-Marcel",
"SilentGopherLnx"
],
"repo": "gotk3/gotk3",
"url": "https://github.com/gotk3/gotk3/issues/370",
"license": "isc",
"license_type": "permissive",
"license_source": "bigquery"
}
|
1733307319
|
feat: publish missing metrics
changelog metrics of jobs (add/update/delete)
upstream refresh metrics
dag upload/removal metric
enrich resource upload metrics
enrich scheduler events metrics
add job start metric
enrich/add long-running features duration metrics (job replace-all, validation, refresh, resource upload-all)
add replay request metric
@arinda-arif let's discuss whether we really need to have the metrics captured at the job_name level, as they will be very sparse.
@sravankorumilli I have added some commits to fix wrong metric/label name. please take a look.
|
gharchive/pull-request
| 2023-05-31T03:01:13 |
2025-04-01T04:34:25.301775
|
{
"authors": [
"arinda-arif",
"sravankorumilli"
],
"repo": "goto/optimus",
"url": "https://github.com/goto/optimus/pull/67",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1422742139
|
Feature/support sharp yuv
Since I needed the sharp YUV property support, I added this to the codebase.
Also I used the gotson crossbuild docker to build the MAC library, the original failed.
I don't maintain this library anymore, you may need to fork your own.
|
gharchive/pull-request
| 2022-10-25T16:20:23 |
2025-04-01T04:34:25.308287
|
{
"authors": [
"gotson",
"joostoudeman"
],
"repo": "gotson/webp-imageio",
"url": "https://github.com/gotson/webp-imageio/pull/2",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2151551279
|
LIME-1033 - Adding autocomplete to form fields
What changed
Added autocomplete to form fields for accessibility
This PR has been replaced by the following - https://github.com/govuk-one-login/ipv-cri-uk-passport-front-v1/pull/87
|
gharchive/pull-request
| 2024-02-23T17:57:51 |
2025-04-01T04:34:25.330093
|
{
"authors": [
"ElliotMurphyGDS"
],
"repo": "govuk-one-login/ipv-cri-uk-passport-front-v1",
"url": "https://github.com/govuk-one-login/ipv-cri-uk-passport-front-v1/pull/85",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
614320979
|
Repo name
Thank you for the nice tmux configuration!
It's rather unusual to have a repository name start with a dot, i.e. hiding it from a regular ls by default. Is it possible to rename it? For example to tmux-config?
Hello @friederbluemle 👋
Well, I like that it's hidden by default when I use it. But you can name it the way you want when cloning anyway:
$ git clone https://github.com/gpakosz/.tmux.git tmux-config
|
gharchive/issue
| 2020-05-07T20:23:39 |
2025-04-01T04:34:25.386983
|
{
"authors": [
"friederbluemle",
"gpakosz"
],
"repo": "gpakosz/.tmux",
"url": "https://github.com/gpakosz/.tmux/issues/348",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2471773218
|
🛑 PiP tracks is down
In b549981, PiP tracks (https://www.portamiinpista.it/Tracks/) was down:
HTTP code: 500
Response time: 174 ms
Resolved: PiP tracks is back up in 35259a7 after 8 minutes.
|
gharchive/issue
| 2024-08-18T03:47:10 |
2025-04-01T04:34:25.389397
|
{
"authors": [
"gpcaretti"
],
"repo": "gpcaretti/upptime",
"url": "https://github.com/gpcaretti/upptime/issues/49",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
118212865
|
TCmalloc GetStats is not showing correct data
Hi,
I wrote a small program that mallocs blocks of 10220 bytes, and I expected them to show up in the 10240-byte class of the central cache. But GetStats is not showing them at all. Please see the output below.
Program
const unsigned int size_to_alloc = 10220;
const unsigned int total_size = 1024000;
const unsigned int tcmalloc_stats_buf_len_ = 32768;
char tcmalloc_stats_buf[tcmalloc_stats_buf_len_];
void *temp[total_size];
unsigned int i=0;
for (i=0; i < total_size/4; i++) {
temp[i] = malloc(size_to_alloc);
}
MallocExtension::instance()->GetStats(tcmalloc_stats_buf,
tcmalloc_stats_buf_len_);
printf("%s ",tcmalloc_stats_buf);
fflush(stdout);
GetStats output
.------------------------------------------------
MALLOC: 2621617128 ( 2500.2 MiB) Bytes in use by application
MALLOC: + 999424 ( 1.0 MiB) Bytes in page heap freelist
MALLOC: + 174946832 ( 166.8 MiB) Bytes in central cache freelist
MALLOC: + 0 ( 0.0 MiB) Bytes in transfer cache freelist
MALLOC: + 37384 ( 0.0 MiB) Bytes in thread cache freelists
MALLOC: + 8614040 ( 8.2 MiB) Bytes in malloc metadata
MALLOC: ------------
MALLOC: = 2806214808 ( 2676.2 MiB) Actual memory used (physical + swap)
MALLOC: + 0 ( 0.0 MiB) Bytes released to OS (aka unmapped)
MALLOC: ------------
MALLOC: = 2806214808 ( 2676.2 MiB) Virtual address space used
MALLOC:
MALLOC: 85375 Spans in use
MALLOC: 2 Thread heaps in use
MALLOC: 8192 Tcmalloc page size
Call ReleaseFreeMemory() to release freelist memory to the OS (via
madvise()).
Bytes released to the OS take up virtual address space but no physical
memory.
Size class breakdown
class 1 [ 8 bytes ] : 1013 objs; 0.0 MiB; 0.0 cum MiB
class 2 [ 16 bytes ] : 504 objs; 0.0 MiB; 0.0 cum MiB
class 3 [ 32 bytes ] : 52 objs; 0.0 MiB; 0.0 cum MiB
class 4 [ 48 bytes ] : 155 objs; 0.0 MiB; 0.0 cum MiB
class 5 [ 64 bytes ] : 84 objs; 0.0 MiB; 0.0 cum MiB
class 6 [ 80 bytes ] : 61 objs; 0.0 MiB; 0.0 cum MiB
class 7 [ 96 bytes ] : 81 objs; 0.0 MiB; 0.0 cum MiB
class 8 [ 112 bytes ] : 73 objs; 0.0 MiB; 0.0 cum MiB
class 9 [ 128 bytes ] : 62 objs; 0.0 MiB; 0.1 cum MiB
class 10 [ 144 bytes ] : 55 objs; 0.0 MiB; 0.1 cum MiB
class 12 [ 176 bytes ] : 41 objs; 0.0 MiB; 0.1 cum MiB
class 13 [ 192 bytes ] : 30 objs; 0.0 MiB; 0.1 cum MiB
class 14 [ 208 bytes ] : 38 objs; 0.0 MiB; 0.1 cum MiB
class 15 [ 224 bytes ] : 35 objs; 0.0 MiB; 0.1 cum MiB
class 17 [ 256 bytes ] : 31 objs; 0.0 MiB; 0.1 cum MiB
class 19 [ 320 bytes ] : 24 objs; 0.0 MiB; 0.1 cum MiB
class 20 [ 352 bytes ] : 22 objs; 0.0 MiB; 0.1 cum MiB
class 21 [ 384 bytes ] : 18 objs; 0.0 MiB; 0.1 cum MiB
class 22 [ 416 bytes ] : 19 objs; 0.0 MiB; 0.1 cum MiB
class 24 [ 480 bytes ] : 16 objs; 0.0 MiB; 0.1 cum MiB
class 25 [ 512 bytes ] : 10 objs; 0.0 MiB; 0.1 cum MiB
class 26 [ 576 bytes ] : 14 objs; 0.0 MiB; 0.1 cum MiB
class 30 [ 896 bytes ] : 9 objs; 0.0 MiB; 0.2 cum MiB
class 32 [ 1152 bytes ] : 3 objs; 0.0 MiB; 0.2 cum MiB
class 36 [ 1792 bytes ] : 8 objs; 0.0 MiB; 0.2 cum MiB
class 48 [ 8192 bytes ] : 2 objs; 0.0 MiB; 0.2 cum MiB
class 50 [ 10240 bytes ] : 2 objs; 0.0 MiB; 0.2 cum MiB
PageHeap: 2 sizes; 1.0 MiB free; 0.0 MiB unmapped
2 pages * 1 spans ~ 0.0 MiB; 0.0 MiB cum; unmapped:
0.0 MiB; 0.0 MiB cum
120 pages * 1 spans ~ 0.9 MiB; 1.0 MiB cum; unmapped:
0.0 MiB; 0.0 MiB cum
255 large * 0 spans ~ 0.0 MiB; 1.0 MiB cum; unmapped:
0.0 MiB; 0.0 MiB cum
I changed some calculations in GetStats and below is what I came up with:
size 10240 bytes: 2796224512 = 85334 (objects) * 4 * 8192 (page size)
Is this an issue, or am I doing something wrong?
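As a sanity check on the arithmetic in that line (a sketch, assuming the 8192-byte page size reported by GetStats and that the 10240-byte class uses 4-page spans; 10220-byte requests round up to the 10240 class):

```python
PAGE_SIZE = 8192       # "Tcmalloc page size" from the GetStats output
PAGES_PER_SPAN = 4     # assumed span size for the 10240-byte class
COUNT = 85334          # the count used in the calculation above

# Total bytes held by that many 4-page spans:
total_bytes = COUNT * PAGES_PER_SPAN * PAGE_SIZE
print(total_bytes)     # 2796224512, matching the figure above

# A 4-page (32768-byte) span holds floor(32768 / 10240) objects:
objects_per_span = (PAGES_PER_SPAN * PAGE_SIZE) // 10240
print(objects_per_span)  # 3
```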
I'm not sure what you're expecting to see here. Please note that GetStats reports sizes of free lists. It currently does not report allocated objects.
OK. The malloc_extension.h description says "Get human readable description of the current state of the malloc data structure". It doesn't mention that this is for free lists; in my view everyone will assume it covers the in-use structures (as the description says "current state").
Also, do you think there should be a public API for the current memory usage of the central cache?
A better description could be useful indeed.
And I agree that we can add a lot more useful information to stats that we expose.
uploaded patch for new api as well as fixing description of GetStats, please have a look.
On Mon, Nov 23, 2015 at 10:25 AM, gshirishfree notifications@github.com
wrote:
uploaded patch for new api as well as fixing description of GetStats,
please have a look.
Yes I saw that. I'll need a bit more time to think about this. My gut
feeling is that std::vector in API is wrong. But I'm not yet sure if it is
actually wrong/suboptimal or it's just my gut feeling.
If you can split "fixing comment" patch out, I'll be able to merge it
sooner. And in general it's always best to keep close to
one-patch-per-improvement. Easier to review, discuss and revert (if needed).
—
Reply to this email directly or view it on GitHub
https://github.com/gperftools/gperftools/issues/744#issuecomment-159019207
.
Just like gshirishfree's patch above does for the central cache, I am trying to get the memory in use per size class in the per-thread cache. I understand this may not have wider application, but for my immediate purpose I need it. Since there are no spans in the per-thread cache list, I am not sure how to get that information without adding a new counter (which I don't want to do, as the class is CACHE_ALIGNED).
Can I get that information from the existing data members of the ThreadCache and FreeList classes?
This may not be the right place to ask this question, but I wanted to use the pre-built context of the discussion above.
I think it should be possible to enumerate all spans of a given size class and get the total count and size of the objects there, and then subtract the free items in the spans and the transfer cache.
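A minimal sketch of that accounting for the 10240-byte class, using numbers from the GetStats output earlier in this thread (treating 85334 as the span count and 4 pages per span are assumptions of this sketch; "class 50 ... 2 objs" supplies the free-object count):

```python
# Sketch of the suggested accounting: total objects across all spans of a
# size class, minus the free objects sitting in span freelists and the
# transfer cache, gives an in-use estimate for that class.
def in_use_bytes(span_count, pages_per_span, page_size, object_size,
                 free_objects):
    # Each span holds floor(span_bytes / object_size) objects.
    objects_per_span = (pages_per_span * page_size) // object_size
    return (span_count * objects_per_span - free_objects) * object_size

# Numbers from the output above; pages_per_span = 4 is an assumption.
print(in_use_bytes(span_count=85334, pages_per_span=4, page_size=8192,
                   object_size=10240, free_objects=2))
# -> 2621440000, close to the reported 2621617128 bytes in use
```

With these inputs the estimate lands within a fraction of a percent of the "Bytes in use by application" figure, which suggests the span-enumeration approach would be workable.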
|
gharchive/issue
| 2015-11-21T18:22:05 |
2025-04-01T04:34:25.405174
|
{
"authors": [
"alk",
"bh0kaal",
"gshirishfree"
],
"repo": "gperftools/gperftools",
"url": "https://github.com/gperftools/gperftools/issues/744",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
614310794
|
Implement ErrorLink
ErrorLink implementation
An example HttpAuthLink using both ErrorLink and TransformLink
@smkhalsa, this is a bit different than what you suggested in #57
The difference arises from a need to recover from errors silently if possible.
|
gharchive/pull-request
| 2020-05-07T20:05:03 |
2025-04-01T04:34:25.436782
|
{
"authors": [
"klavs"
],
"repo": "gql-dart/gql",
"url": "https://github.com/gql-dart/gql/pull/103",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
444118383
|
HostAccess confusingly affects ProxyObject impl
It seems that ProxyObject implementations are somewhat beholden to HostAccess restrictions. Example test in Groovy/TestNG:
@Test
void "HostAccess and ProxyObject"() {
def context = Context.create("js")
def someProxy = new SomeProxy()
context.getBindings("js").putMember("someProxy", someProxy)
context.eval("js", "Object.keys(someProxy)") // FAILS, see below
}
static class SomeProxy implements ProxyObject {
private static final Set<String> KEYS = ImmutableSet.of("one", "two")
@Override
Object getMemberKeys() {
return KEYS.toArray()
}
@Override
boolean hasMember(String key) {
return KEYS.contains(key)
}
@Override
Object getMember(String key) { return null }
@Override
void putMember(String key, Value value) {}
}
Fails with:
getMemberKeys() returned invalid value [one, two] but must return an array of member key Strings.
at com.oracle.truffle.polyglot.PolyglotProxy.getMembers(PolyglotProxy.java:393)
at com.oracle.truffle.polyglot.PolyglotProxyGen$InteropLibraryExports$Cached.getMembers(PolyglotProxyGen.java:203)
at com.oracle.truffle.api.interop.InteropLibrary.getMembers(InteropLibrary.java:520)
at com.oracle.truffle.js.builtins.ObjectFunctionBuiltins$ObjectKeysNode.getKeysList(ObjectFunctionBuiltins.java:655)
at com.oracle.truffle.js.builtins.ObjectFunctionBuiltins$ObjectKeysNode.keys(ObjectFunctionBuiltins.java:649)
at com.oracle.truffle.js.builtins.ObjectFunctionBuiltinsFactory$ObjectKeysNodeGen.executeAndSpecialize(ObjectFunctionBuiltinsFactory.java:1318)
at com.oracle.truffle.js.builtins.ObjectFunctionBuiltinsFactory$ObjectKeysNodeGen.execute(ObjectFunctionBuiltinsFactory.java:1222)
at com.oracle.truffle.js.nodes.function.FunctionRootNode.executeInRealm(FunctionRootNode.java:147)
at com.oracle.truffle.js.runtime.JavaScriptRealmBoundaryRootNode.execute(JavaScriptRealmBoundaryRootNode.java:93)
at <js> :program(Unnamed:1:0-21)
at org.graalvm.polyglot.Context$eval$0.call(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:47)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:116)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:136)
at com.evergage.proxy.APIProxyTest.HostAccess and ProxyObject(APIProxyTest.groovy:42)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.testng.internal.MethodInvocationHelper.invokeMethod(MethodInvocationHelper.java:108)
at org.testng.internal.Invoker.invokeMethod(Invoker.java:661)
at org.testng.internal.Invoker.invokeTestMethod(Invoker.java:869)
at org.testng.internal.Invoker.invokeTestMethods(Invoker.java:1193)
at org.testng.internal.TestMethodWorker.invokeTestMethods(TestMethodWorker.java:126)
at org.testng.internal.TestMethodWorker.run(TestMethodWorker.java:109)
at org.testng.TestRunner.privateRun(TestRunner.java:744)
at org.testng.TestRunner.run(TestRunner.java:602)
at org.testng.SuiteRunner.runTest(SuiteRunner.java:380)
at org.testng.SuiteRunner.runSequentially(SuiteRunner.java:375)
at org.testng.SuiteRunner.privateRun(SuiteRunner.java:340)
at org.testng.SuiteRunner.run(SuiteRunner.java:289)
at org.testng.SuiteRunnerWorker.runSuite(SuiteRunnerWorker.java:52)
at org.testng.SuiteRunnerWorker.run(SuiteRunnerWorker.java:86)
at org.testng.TestNG.runSuitesSequentially(TestNG.java:1301)
at org.testng.TestNG.runSuitesLocally(TestNG.java:1226)
at org.testng.TestNG.runSuites(TestNG.java:1144)
at org.testng.TestNG.run(TestNG.java:1115)
at org.testng.IDEARemoteTestNG.run(IDEARemoteTestNG.java:73)
at org.testng.RemoteTestNGStarter.main(RemoteTestNGStarter.java:123)
Caused by host exception: java.lang.IllegalStateException: getMemberKeys() returned invalid value [one, two] but must return an array of member key Strings.
As you can see, that error message was confusing since it was seemingly showing the correct array of member key strings :)
It took me a bit to figure out that it was a HostAccess problem, remedied by either:
def context = Context.newBuilder("js")
.allowHostAccess(true) // deprecated
.build()
or
def hostAccess = HostAccess.newBuilder()
.allowArrayAccess(true)
.build()
def context = Context.newBuilder("js")
.allowHostAccess(hostAccess)
.build()
Note that the @Export annotation was not required, though.
Suggestions:
Improve error message when !allowArrayAccess / !allowListAccess but dealing with array-like object?
ProxyObject impl should either be fully immune from HostAccess or fully beholden to it. Make clear in ProxyObject doc.
I think I'm seeing a slightly different variant of this too: a proxy (ProxyArray in my case) can't @HostAccess.Export any methods/properties. If it implements ProxyArray, an error is thrown if you try:
TypeError: invokeMember on foreign object failed due to: Message not supported.
But if you don't implement ProxyArray, the export and method call work as expected. Even with allowAllAccess = true it still fails, though.
Yep, just tried that and it works as described. I came up with this rough interface to make HostAccess.Export work as expected; very little testing so far, but here it is:
import java.util.HashMap;
import java.util.Map;
import java.util.stream.Stream;
import org.graalvm.polyglot.HostAccess;
import org.graalvm.polyglot.Value;
import org.graalvm.polyglot.proxy.ProxyExecutable;
import org.graalvm.polyglot.proxy.ProxyObject;
//shouldn't need this but works around the problem where you can't add custom methods to Proxy* objects
//as per https://github.com/graalvm/graaljs/issues/163
public interface HostAccessDrivenProxyObject extends ProxyObject {
Map<Class<?>, Map<String, Object>> accessibleObjectsForClass = new HashMap<>();
default Map<String, Object> _getAccessibleObjects() {
return accessibleObjectsForClass.computeIfAbsent(this.getClass(), (key) -> new HashMap<>());
}
default void _load() {
Map<String, Object> accessibleObjects = _getAccessibleObjects();
if (accessibleObjects.size() == 0) {
Object self = this;
Stream.of(this.getClass().getDeclaredMethods()).filter(m -> m.getAnnotation(HostAccess.Export.class) != null).forEach(m -> {
ProxyExecutable proxyExecutable = arguments -> {
try {
if (arguments.length == 0) {
return m.invoke(self);
} else {
Object[] typedArguments = new Object[arguments.length];
for (int x = 0; x < arguments.length; x++) {
typedArguments[x] = arguments[x].as(m.getParameterTypes()[x]);
}
return m.invoke(self, typedArguments);
}
} catch (Exception e) {
e.printStackTrace();
return null;
}
};
accessibleObjects.put(m.getName(), proxyExecutable);
});
Stream.of(this.getClass().getDeclaredFields()).filter(f -> f.getAnnotation(HostAccess.Export.class) != null).forEach(f -> {
try {
accessibleObjects.put(f.getName(), f.get(self));
} catch (Exception e) {
e.printStackTrace();
}
});
}
}
@Override
default Object getMember(String key) {
_load();
return _getAccessibleObjects().get(key);
}
@Override
default Object getMemberKeys() {
_load();
return _getAccessibleObjects().keySet();
}
@Override
default boolean hasMember(String key) {
_load();
return _getAccessibleObjects().containsKey(key);
}
@Override
default void putMember(String key, Value value) {
//noop
}
}
The idea is that you make your desired class implement that interface, and it effectively makes the class behave as it should, honouring both the ProxyArray etc. interfaces AND the @HostAccess annotations.
... implements ProxyArray, HostAccessDrivenProxyObject
|
gharchive/issue
| 2019-05-14T20:49:30 |
2025-04-01T04:34:25.455394
|
{
"authors": [
"hashtag-smashtag",
"nhoughto"
],
"repo": "graalvm/graaljs",
"url": "https://github.com/graalvm/graaljs/issues/163",
"license": "UPL-1.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
805952805
|
Ability to build without JCenter
JCenter is going to be shut down "soon" (although 1st May has been "clarified" to 1st Feb 2022 :-D ). When I tried to remove jcenter() the build on CI server started to fail:
> Task :dokkaJavadoc FAILED
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':dokkaJavadoc'.
> Could not resolve all files for configuration ':dokkaJavadocPlugin'.
> Could not find org.jetbrains.dokka:javadoc-plugin:1.4.20.
Searched in the following locations:
- https://repo.maven.apache.org/maven2/org/jetbrains/dokka/javadoc-plugin/1.4.20/javadoc-plugin-1.4.20.pom
If the artifact you are trying to retrieve can be found in the repository but without metadata in 'Maven POM' format, you need to adjust the 'metadataSources { ... }' of the repository declaration.
Required by:
project :
* Try:
Run with --info or --debug option to get more log output. Run with --scan to get full insights.
* Exception is:
org.gradle.api.tasks.TaskExecutionException: Execution failed for task ':dokkaJavadoc'.
at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:38)
A similar version is available in Maven Central, so it is probably possible to override the transitive dependencies; however, it would be best to fix it upstream - https://github.com/Kotlin/dokka/issues/1725
Apart from dokka itself, there are more dokka-related transitive dependencies missing:
* What went wrong:
Execution failed for task ':dokkaJavadoc'.
> Could not resolve all files for configuration ':dokkaJavadocPlugin'.
> Could not find com.soywiz.korlibs.korte:korte-jvm:1.10.3.
Searched in the following locations:
- https://repo.maven.apache.org/maven2/com/soywiz/korlibs/korte/korte-jvm/1.10.3/korte-jvm-1.10.3.pom
If the artifact you are trying to retrieve can be found in the repository but without metadata in 'Maven POM' format, you need to adjust the 'metadataSources { ... }' of the repository declaration.
Required by:
project : > org.jetbrains.dokka:javadoc-plugin:1.4.20
> Could not find org.jetbrains.kotlinx:kotlinx-html-jvm:0.7.2.
Searched in the following locations:
- https://repo.maven.apache.org/maven2/org/jetbrains/kotlinx/kotlinx-html-jvm/0.7.2/kotlinx-html-jvm-0.7.2.pom
If the artifact you are trying to retrieve can be found in the repository but without metadata in 'Maven POM' format, you need to adjust the 'metadataSources { ... }' of the repository declaration.
Required by:
project : > org.jetbrains.dokka:javadoc-plugin:1.4.20
project : > org.jetbrains.dokka:javadoc-plugin:1.4.20 > org.jetbrains.dokka:dokka-base:1.4.20
And later:
* What went wrong:
Execution failed for task ':dokkaJavadoc'.
> Could not resolve all files for configuration ':dokkaJavadocRuntime'.
> Could not find org.jetbrains:markdown:0.1.45.
Searched in the following locations:
- https://repo.maven.apache.org/maven2/org/jetbrains/markdown/0.1.45/markdown-0.1.45.pom
If the artifact you are trying to retrieve can be found in the repository but without metadata in 'Maven POM' format, you need to adjust the 'metadataSources { ... }' of the repository declaration.
Required by:
project : > org.jetbrains.dokka:dokka-core:1.4.20
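Until this is fixed upstream, one possible workaround is Gradle's repository content filtering (available since Gradle 6.2), keeping JCenter only for the artifacts not yet published to Maven Central. This is a sketch, with the module list taken from the errors above:

```groovy
repositories {
    mavenCentral()
    // Resolve only the JCenter-exclusive Dokka dependencies from JCenter,
    // so everything else must come from Maven Central.
    exclusiveContent {
        forRepository { jcenter() }
        filter {
            includeModule("com.soywiz.korlibs.korte", "korte-jvm")
            includeModule("org.jetbrains.kotlinx", "kotlinx-html-jvm")
            includeModule("org.jetbrains", "markdown")
        }
    }
}
```

This keeps the eventual JCenter removal down to deleting one block, rather than auditing the whole dependency graph again.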
I think this has been resolved with the fix for #57.
Good catch, thanks.
|
gharchive/issue
| 2021-02-10T23:16:37 |
2025-04-01T04:34:25.478634
|
{
"authors": [
"nhajratw",
"szpak"
],
"repo": "gradle-nexus/publish-plugin",
"url": "https://github.com/gradle-nexus/publish-plugin/issues/56",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1922374573
|
Set artifact retention period to 1 day
Fixes #903
I also found a missing await statement which was added in this PR.
Functionality provided by #953
|
gharchive/pull-request
| 2023-10-02T18:00:20 |
2025-04-01T04:34:25.479977
|
{
"authors": [
"DuncanCasteleyn"
],
"repo": "gradle/gradle-build-action",
"url": "https://github.com/gradle/gradle-build-action/pull/912",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2011915315
|
Add a sample app for the Tests API usage
Summary
This PR adds a sample app to illustrate how the Tests API can be used to address one of the common use cases: creating a report/GitHub issues for recently unstable test classes.
Review notes
A big chunk of changes is transforming the repo into a multi-module project to support the creation of multiple isolated apps.
Start by reviewing README.md that describes the sample app at a high level and provides an overview of the app parameters.
TestsApiSampleMain is the entry point for the app. It contains the high-level steps to fetch the tests data of interest and delegates the reporting of unstable test containers to an instance of UnstableTestContainersReporter
StandardOutputReporter will print a small summary to the console.
GitHubCliReporter will create a GitHub issue per unstable container in the repository of user's choice. The issue will look something like this.
I'm fine with either. Let's see what the @gradle/ge-build-insights team thinks who have been maintaining this repo so far.
Closing it as the final version will diverge significantly, and it is easier to re-create the branch than to resolve conflicts caused by the rename
|
gharchive/pull-request
| 2023-11-27T10:01:26 |
2025-04-01T04:34:25.484407
|
{
"authors": [
"marcphilipp",
"pshevche"
],
"repo": "gradle/gradle-enterprise-api-samples",
"url": "https://github.com/gradle/gradle-enterprise-api-samples/pull/88",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
298849541
|
Support native languages in build init plugin
@pioterj commented on Tue Mar 28 2017
Add Build Init Plugin types for native languages supported by Gradle.
Expected Behavior
Build init plugin can generate projects using native languages that are supported out of the box:
C
C++
Objective-C
Objective-C++
Assembly
Windows resources
Current Behavior
Build init plugin can only generate projects using JVM languages (Java, Groovy and Scala).
Context
Adding build init types for native languages would make it easier to get started with Gradle for native developers.
I wouldn't reuse --package. For now, I would just generate a default one based on the name of the project or hard-code it.
|
gharchive/issue
| 2018-02-21T05:47:28 |
2025-04-01T04:34:25.488432
|
{
"authors": [
"adammurdoch",
"big-guy"
],
"repo": "gradle/gradle-native",
"url": "https://github.com/gradle/gradle-native/issues/498",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
647784198
|
Gradle 6.5 build fails on Java 15-ea with Execution failed for task ':jar'. java.util.ConcurrentModificationException (no error message)
I have set up a Travis CI continuous integration server for my open source project picocli.
This successfully builds the project for OpenJDK 8, 9, 10, 11, 12, 13, and 14, but fails for Java 15-ea.
The build log says "Execution failed for task ':jar'.
> java.util.ConcurrentModificationException (no error message)"
It is unclear to me why this only happens with Java 15-ea and not with earlier versions of Java.
Expected Behavior
Jar task should not fail with ConcurrentModificationException.
Current Behavior
> Task :jar FAILED
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':jar'.
> java.util.ConcurrentModificationException (no error message)
See https://travis-ci.org/github/remkop/picocli/jobs/703036632 for details.
Context
I am trying to build the picocli project with many different versions of Java, especially Early Access versions, to detect problems that may require changes to the library in order to work on recent versions of Java.
Gradle version: https://github.com/remkop/picocli/blob/master/gradle/wrapper/gradle-wrapper.properties
Build config: https://github.com/remkop/picocli/blob/master/build.gradle
Travis config: https://github.com/remkop/picocli/blob/master/.travis.yml
Steps to Reproduce
Looking at the Travis CI log:
Download Java 15-ea from https://download.java.net/java/early_access/jdk15/29/GPL/openjdk-15-ea+29_linux-x64_bin.tar.gz
Clone picocli project from github:
git clone --depth=50 --branch=master https://github.com/remkop/picocli.git remkop/picocli
Run ./gradlew assemble
Your Environment
$ java -Xmx32m -version
openjdk version "15-ea" 2020-09-15
OpenJDK Runtime Environment (build 15-ea+29-1445)
OpenJDK 64-Bit Server VM (build 15-ea+29-1445, mixed mode, sharing)
$ javac -J-Xmx32m -version
javac 15-ea
$ ./gradlew assemble
Downloading https://services.gradle.org/distributions/gradle-6.5-all.zip
...
Build system information
Build language: java
Build dist: xenial
Build id: 703036620
Job id: 703036632
Runtime kernel version: 4.15.0-1077-gcp
travis-build version: 8e9c8035
Build image provisioning date and time
Wed Jun 24 13:36:52 UTC 2020
Operating System Details
Distributor ID: Ubuntu
Description: Ubuntu 16.04.6 LTS
Release: 16.04
Codename: xenial
Systemd Version
systemd 229
Cookbooks Version
3f92a99 https://github.com/travis-ci/travis-cookbooks/tree/3f92a99
git version
git version 2.27.0
bash version
GNU bash, version 4.3.48(1)-release (x86_64-pc-linux-gnu)
gcc version
gcc (Ubuntu 5.4.0-6ubuntu1~16.04.12) 5.4.0 20160609
docker version
Client:
Version: 18.06.0-ce
API version: 1.38
Go version: go1.10.3
Git commit: 0ffa825
Built: Wed Jul 18 19:11:02 2018
OS/Arch: linux/amd64
Experimental: false
Server:
Engine:
Version: 18.06.0-ce
API version: 1.38 (minimum version 1.12)
Go version: go1.10.3
Git commit: 0ffa825
Built: Wed Jul 18 19:09:05 2018
OS/Arch: linux/amd64
Experimental: false
clang version
clang version 7.0.0 (tags/RELEASE_700/final)
jq version
jq-1.5
bats version
Bats 0.4.0
shellcheck version
0.7.0
shfmt version
v2.6.3
ccache version
3.2.4
cmake version
cmake version 3.12.4
heroku version
heroku/7.42.1 linux-x64 node-v12.16.2
imagemagick version
Version: ImageMagick 6.8.9-9 Q16 x86_64 2019-11-12 http://www.imagemagick.org
md5deep version
4.4
mercurial version
version 4.8
mysql version
mysql Ver 14.14 Distrib 5.7.30, for Linux (x86_64) using EditLine wrapper
openssl version
OpenSSL 1.0.2g 1 Mar 2016
packer version
1.3.3
postgresql client version
psql (PostgreSQL) 10.13 (Ubuntu 10.13-1.pgdg16.04+1)
ragel version
Ragel State Machine Compiler version 6.8 Feb 2013
sudo version
1.8.16
gzip version
gzip 1.6
zip version
Zip 3.0
vim version
VIM - Vi IMproved 7.4 (2013 Aug 10, compiled Mar 18 2020 14:06:17)
iptables version
iptables v1.6.0
curl version
curl 7.47.0 (x86_64-pc-linux-gnu) libcurl/7.47.0 GnuTLS/3.4.10 zlib/1.2.8 libidn/1.32 librtmp/2.3
wget version
GNU Wget 1.17.1 built on linux-gnu.
rsync version
rsync version 3.1.1 protocol version 31
gimme version
v1.5.4
nvm version
0.35.3
perlbrew version
/home/travis/perl5/perlbrew/bin/perlbrew - App::perlbrew/0.88
phpenv version
rbenv 1.1.2-30-gc879cb0
rvm version
rvm 1.29.10 (latest) by Michal Papis, Piotr Kuczynski, Wayne E. Seguin [https://rvm.io]
default ruby version
ruby 2.5.3p105 (2018-10-18 revision 65156) [x86_64-linux]
CouchDB version
couchdb 1.6.1
ElasticSearch version
5.5.0
Installed Firefox version
firefox 63.0.1
MongoDB version
MongoDB 4.0.19
PhantomJS version
2.1.1
Pre-installed PostgreSQL versions
9.4.26
9.5.22
9.6.18
Redis version
redis-server 6.0.5
Pre-installed Go versions
1.11.1
ant version
Apache Ant(TM) version 1.9.6 compiled on July 20 2018
mvn version
Apache Maven 3.6.3 (cecedd343002696d0abb50b32b541b8a6ba2883f)
gradle version
Gradle 5.1.1!
lein version
Leiningen 2.9.3 on Java 11.0.2 OpenJDK 64-Bit Server VM
Pre-installed Node.js versions
v10.21.0
v11.0.0
v12.18.1
v4.9.1
v6.17.1
v8.12.0
v8.17.0
v8.9
phpenv versions
system
5.6
5.6.40
7.1
7.1.27
7.2
* 7.2.15 (set by /home/travis/.phpenv/version)
hhvm
hhvm-stable
composer --version
Composer version 1.8.4 2019-02-11 10:52:10
Pre-installed Ruby versions
ruby-2.3.8
ruby-2.4.5
ruby-2.5.3
Hello, did anyone have a chance to look at this?
The problem is also happening with Java 15 final release candidate...
Link to failing build log: https://travis-ci.org/github/remkop/picocli/jobs/716766646
$ java -Xmx32m -version
openjdk version "15" 2020-09-15
OpenJDK Runtime Environment (build 15+35-1559)
OpenJDK 64-Bit Server VM (build 15+35-1559, mixed mode, sharing)
$ javac -J-Xmx32m -version
javac 15
...
> Task :processResources NO-SOURCE
> Task :classes
> Task :jar FAILED
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':jar'.
> java.util.ConcurrentModificationException (no error message)
@remkop could you try with 6.6 and a nightly?
@big-guy Thanks for responding. Will do.
Update: I still see the failure in Travis CI with Gradle 6.6:
$ java -Xmx32m -version
openjdk version "15" 2020-09-15
OpenJDK Runtime Environment (build 15+35-1559)
OpenJDK 64-Bit Server VM (build 15+35-1559, mixed mode, sharing)
$ javac -J-Xmx32m -version
javac 15
55.61s$ ./gradlew assemble
Downloading https://services.gradle.org/distributions/gradle-6.6-all.zip
...
> Task :jar FAILED
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':jar'.
> java.util.ConcurrentModificationException (no error message)
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 47s
2 actionable tasks: 2 executed
The command "eval ./gradlew assemble " failed. Retrying, 2 of 3.
...
BUILD FAILED in 1s
2 actionable tasks: 1 executed, 1 up-to-date
The command "eval ./gradlew assemble " failed 3 times.
The command "./gradlew assemble" failed and exited with 1 during .
Not sure if this is relevant, but I just noticed that my .travis.yml file says:
script:
- ./gradlew check --info --stacktrace
... but the failing command is ./gradlew assemble... (also, I don't see any stack trace details in the Travis build log).
The lack of details in the log makes it hard to debug.
I will try customizing the installation step next to get more details about the failure.
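As background (not an analysis of the bnd code itself): a java.util.ConcurrentModificationException like the one in the :jar failure is thrown when a fail-fast collection, such as the TreeMap-backed resource map a jar builder typically maintains, is structurally modified while it is being iterated. A minimal, self-contained sketch (the class and map names are purely illustrative):

```java
import java.util.ConcurrentModificationException;
import java.util.Map;
import java.util.TreeMap;

public class CmeDemo {
    public static void main(String[] args) {
        // A sorted map of jar entries, as a stand-in for the real resource map.
        Map<String, String> resources = new TreeMap<>();
        resources.put("a.class", "A");
        resources.put("b.class", "B");

        boolean threw = false;
        try {
            // Structurally modifying the map while iterating over its key set
            // invalidates the fail-fast iterator and triggers the exception.
            for (String key : resources.keySet()) {
                resources.put(key + "-copy", "copy");
            }
        } catch (ConcurrentModificationException e) {
            threw = true;
        }
        System.out.println("CME thrown: " + threw);
    }
}
```

Running this prints `CME thrown: true` on any recent JDK, which matches the failure mode reported above, even though the exact trigger inside aQute.bnd.osgi.Jar.putResource may differ.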
By skipping the Travis installation step, the build log now shows the stack trace of the failure:
> Task :jar FAILED
Caching disabled for task ':jar' because:
Build cache is disabled
Task ':jar' is not up-to-date because:
No history is available.
Writing module descriptor into /home/travis/build/remkop/picocli/build/classes/java/main/META-INF/versions/9/module-info.class
file or directory '/home/travis/build/remkop/picocli/build/resources/main', not found
:jar (Thread[Daemon worker,5,main]) completed. Took 0.644 secs.
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':jar'.
> java.util.ConcurrentModificationException (no error message)
* Try:
Run with --debug option to get more log output. Run with --scan to get full insights.
* Exception is:
org.gradle.api.tasks.TaskExecutionException: Execution failed for task ':jar'.
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.lambda$executeIfValid$1(ExecuteActionsTaskExecuter.java:208)
at org.gradle.internal.Try$Failure.ifSuccessfulOrElse(Try.java:263)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeIfValid(ExecuteActionsTaskExecuter.java:206)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:187)
at org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:114)
at org.gradle.api.internal.tasks.execution.FinalizePropertiesTaskExecuter.execute(FinalizePropertiesTaskExecuter.java:46)
at org.gradle.api.internal.tasks.execution.ResolveTaskExecutionModeExecuter.execute(ResolveTaskExecutionModeExecuter.java:62)
at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:57)
at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:56)
at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:36)
at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.executeTask(EventFiringTaskExecuter.java:77)
at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:55)
at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:52)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:409)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:399)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$1.execute(DefaultBuildOperationExecutor.java:157)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:242)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:150)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.call(DefaultBuildOperationExecutor.java:94)
at org.gradle.internal.operations.DelegatingBuildOperationExecutor.call(DelegatingBuildOperationExecutor.java:36)
at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter.execute(EventFiringTaskExecuter.java:52)
at org.gradle.execution.plan.LocalTaskNodeExecutor.execute(LocalTaskNodeExecutor.java:41)
at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:372)
at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:359)
at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:352)
at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:338)
at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.lambda$run$0(DefaultPlanExecutor.java:127)
at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.execute(DefaultPlanExecutor.java:191)
at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.executeNextNode(DefaultPlanExecutor.java:182)
at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.run(DefaultPlanExecutor.java:124)
at org.gradle.execution.plan.DefaultPlanExecutor.process(DefaultPlanExecutor.java:72)
at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph.executeWithServices(DefaultTaskExecutionGraph.java:184)
at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph.execute(DefaultTaskExecutionGraph.java:177)
at org.gradle.execution.SelectedTaskExecutionAction.execute(SelectedTaskExecutionAction.java:39)
at org.gradle.execution.DefaultBuildWorkExecutor.execute(DefaultBuildWorkExecutor.java:40)
at org.gradle.execution.DefaultBuildWorkExecutor.access$000(DefaultBuildWorkExecutor.java:24)
at org.gradle.execution.DefaultBuildWorkExecutor$1.proceed(DefaultBuildWorkExecutor.java:48)
at org.gradle.execution.DryRunBuildExecutionAction.execute(DryRunBuildExecutionAction.java:49)
at org.gradle.execution.DefaultBuildWorkExecutor.execute(DefaultBuildWorkExecutor.java:40)
at org.gradle.execution.DefaultBuildWorkExecutor.execute(DefaultBuildWorkExecutor.java:33)
at org.gradle.execution.IncludedBuildLifecycleBuildWorkExecutor.execute(IncludedBuildLifecycleBuildWorkExecutor.java:36)
at org.gradle.execution.DeprecateUndefinedBuildWorkExecutor.execute(DeprecateUndefinedBuildWorkExecutor.java:42)
at org.gradle.execution.BuildOperationFiringBuildWorkerExecutor$ExecuteTasks.run(BuildOperationFiringBuildWorkerExecutor.java:57)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:395)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:387)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$1.execute(DefaultBuildOperationExecutor.java:157)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:242)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:150)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:84)
at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
at org.gradle.execution.BuildOperationFiringBuildWorkerExecutor.execute(BuildOperationFiringBuildWorkerExecutor.java:42)
at org.gradle.initialization.DefaultGradleLauncher.runWork(DefaultGradleLauncher.java:272)
at org.gradle.initialization.DefaultGradleLauncher.doClassicBuildStages(DefaultGradleLauncher.java:179)
at org.gradle.initialization.DefaultGradleLauncher.doBuildStages(DefaultGradleLauncher.java:155)
at org.gradle.initialization.DefaultGradleLauncher.executeTasks(DefaultGradleLauncher.java:131)
at org.gradle.internal.invocation.GradleBuildController$1.create(GradleBuildController.java:72)
at org.gradle.internal.invocation.GradleBuildController$1.create(GradleBuildController.java:67)
at org.gradle.internal.work.DefaultWorkerLeaseService.withLocks(DefaultWorkerLeaseService.java:180)
at org.gradle.internal.work.StopShieldingWorkerLeaseService.withLocks(StopShieldingWorkerLeaseService.java:40)
at org.gradle.internal.invocation.GradleBuildController.doBuild(GradleBuildController.java:67)
at org.gradle.internal.invocation.GradleBuildController.run(GradleBuildController.java:56)
at org.gradle.tooling.internal.provider.ExecuteBuildActionRunner.run(ExecuteBuildActionRunner.java:31)
at org.gradle.launcher.exec.ChainingBuildActionRunner.run(ChainingBuildActionRunner.java:35)
at org.gradle.launcher.exec.BuildOutcomeReportingBuildActionRunner.run(BuildOutcomeReportingBuildActionRunner.java:63)
at org.gradle.tooling.internal.provider.ValidatingBuildActionRunner.run(ValidatingBuildActionRunner.java:32)
at org.gradle.tooling.internal.provider.FileSystemWatchingBuildActionRunner.run(FileSystemWatchingBuildActionRunner.java:52)
at org.gradle.launcher.exec.BuildCompletionNotifyingBuildActionRunner.run(BuildCompletionNotifyingBuildActionRunner.java:41)
at org.gradle.launcher.exec.RunAsBuildOperationBuildActionRunner$3.call(RunAsBuildOperationBuildActionRunner.java:49)
at org.gradle.launcher.exec.RunAsBuildOperationBuildActionRunner$3.call(RunAsBuildOperationBuildActionRunner.java:44)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:409)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:399)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$1.execute(DefaultBuildOperationExecutor.java:157)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:242)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:150)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.call(DefaultBuildOperationExecutor.java:94)
at org.gradle.internal.operations.DelegatingBuildOperationExecutor.call(DelegatingBuildOperationExecutor.java:36)
at org.gradle.launcher.exec.RunAsBuildOperationBuildActionRunner.run(RunAsBuildOperationBuildActionRunner.java:44)
at org.gradle.launcher.exec.InProcessBuildActionExecuter$1.transform(InProcessBuildActionExecuter.java:50)
at org.gradle.launcher.exec.InProcessBuildActionExecuter$1.transform(InProcessBuildActionExecuter.java:47)
at org.gradle.composite.internal.DefaultRootBuildState.run(DefaultRootBuildState.java:87)
at org.gradle.launcher.exec.InProcessBuildActionExecuter.execute(InProcessBuildActionExecuter.java:47)
at org.gradle.launcher.exec.InProcessBuildActionExecuter.execute(InProcessBuildActionExecuter.java:31)
at org.gradle.launcher.exec.BuildTreeScopeBuildActionExecuter.execute(BuildTreeScopeBuildActionExecuter.java:44)
at org.gradle.launcher.exec.BuildTreeScopeBuildActionExecuter.execute(BuildTreeScopeBuildActionExecuter.java:29)
at org.gradle.tooling.internal.provider.ContinuousBuildActionExecuter.execute(ContinuousBuildActionExecuter.java:87)
at org.gradle.tooling.internal.provider.ContinuousBuildActionExecuter.execute(ContinuousBuildActionExecuter.java:55)
at org.gradle.tooling.internal.provider.SubscribableBuildActionExecuter.execute(SubscribableBuildActionExecuter.java:60)
at org.gradle.tooling.internal.provider.SubscribableBuildActionExecuter.execute(SubscribableBuildActionExecuter.java:38)
at org.gradle.tooling.internal.provider.SessionScopeBuildActionExecuter.execute(SessionScopeBuildActionExecuter.java:68)
at org.gradle.tooling.internal.provider.SessionScopeBuildActionExecuter.execute(SessionScopeBuildActionExecuter.java:38)
at org.gradle.tooling.internal.provider.GradleThreadBuildActionExecuter.execute(GradleThreadBuildActionExecuter.java:37)
at org.gradle.tooling.internal.provider.GradleThreadBuildActionExecuter.execute(GradleThreadBuildActionExecuter.java:26)
at org.gradle.tooling.internal.provider.StartParamsValidatingActionExecuter.execute(StartParamsValidatingActionExecuter.java:60)
at org.gradle.tooling.internal.provider.StartParamsValidatingActionExecuter.execute(StartParamsValidatingActionExecuter.java:32)
at org.gradle.tooling.internal.provider.SessionFailureReportingActionExecuter.execute(SessionFailureReportingActionExecuter.java:56)
at org.gradle.tooling.internal.provider.SessionFailureReportingActionExecuter.execute(SessionFailureReportingActionExecuter.java:42)
at org.gradle.tooling.internal.provider.SetupLoggingActionExecuter.execute(SetupLoggingActionExecuter.java:48)
at org.gradle.tooling.internal.provider.SetupLoggingActionExecuter.execute(SetupLoggingActionExecuter.java:32)
at org.gradle.launcher.daemon.server.exec.ExecuteBuild.doBuild(ExecuteBuild.java:68)
at org.gradle.launcher.daemon.server.exec.BuildCommandOnly.execute(BuildCommandOnly.java:37)
at org.gradle.launcher.daemon.server.api.DaemonCommandExecution.proceed(DaemonCommandExecution.java:104)
at org.gradle.launcher.daemon.server.exec.WatchForDisconnection.execute(WatchForDisconnection.java:39)
at org.gradle.launcher.daemon.server.api.DaemonCommandExecution.proceed(DaemonCommandExecution.java:104)
at org.gradle.launcher.daemon.server.exec.ResetDeprecationLogger.execute(ResetDeprecationLogger.java:29)
at org.gradle.launcher.daemon.server.api.DaemonCommandExecution.proceed(DaemonCommandExecution.java:104)
at org.gradle.launcher.daemon.server.exec.RequestStopIfSingleUsedDaemon.execute(RequestStopIfSingleUsedDaemon.java:35)
at org.gradle.launcher.daemon.server.api.DaemonCommandExecution.proceed(DaemonCommandExecution.java:104)
at org.gradle.launcher.daemon.server.exec.ForwardClientInput$2.create(ForwardClientInput.java:78)
at org.gradle.launcher.daemon.server.exec.ForwardClientInput$2.create(ForwardClientInput.java:75)
at org.gradle.util.Swapper.swap(Swapper.java:38)
at org.gradle.launcher.daemon.server.exec.ForwardClientInput.execute(ForwardClientInput.java:75)
at org.gradle.launcher.daemon.server.api.DaemonCommandExecution.proceed(DaemonCommandExecution.java:104)
at org.gradle.launcher.daemon.server.exec.LogAndCheckHealth.execute(LogAndCheckHealth.java:55)
at org.gradle.launcher.daemon.server.api.DaemonCommandExecution.proceed(DaemonCommandExecution.java:104)
at org.gradle.launcher.daemon.server.exec.LogToClient.doBuild(LogToClient.java:63)
at org.gradle.launcher.daemon.server.exec.BuildCommandOnly.execute(BuildCommandOnly.java:37)
at org.gradle.launcher.daemon.server.api.DaemonCommandExecution.proceed(DaemonCommandExecution.java:104)
at org.gradle.launcher.daemon.server.exec.EstablishBuildEnvironment.doBuild(EstablishBuildEnvironment.java:82)
at org.gradle.launcher.daemon.server.exec.BuildCommandOnly.execute(BuildCommandOnly.java:37)
at org.gradle.launcher.daemon.server.api.DaemonCommandExecution.proceed(DaemonCommandExecution.java:104)
at org.gradle.launcher.daemon.server.exec.StartBuildOrRespondWithBusy$1.run(StartBuildOrRespondWithBusy.java:52)
at org.gradle.launcher.daemon.server.DaemonStateCoordinator$1.run(DaemonStateCoordinator.java:297)
at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
Caused by: java.util.ConcurrentModificationException
at aQute.bnd.osgi.Jar.putResource(Jar.java:288)
at aQute.bnd.osgi.Jar.buildFromZip(Jar.java:216)
at aQute.bnd.osgi.Jar.<init>(Jar.java:122)
at aQute.bnd.osgi.Jar.<init>(Jar.java:172)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:64)
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at aQute.bnd.gradle.BundleTaskConvention$_buildBundle_closure5$_closure6.doCall(BundleTaskConvention.groovy:239)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:64)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:64)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at aQute.bnd.gradle.BundleTaskConvention$_buildBundle_closure5.doCall(BundleTaskConvention.groovy:199)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:64)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at org.gradle.util.ClosureBackedAction.execute(ClosureBackedAction.java:71)
at org.gradle.util.ConfigureUtil.configureTarget(ConfigureUtil.java:154)
at org.gradle.util.ConfigureUtil.configureSelf(ConfigureUtil.java:130)
at org.gradle.api.internal.AbstractTask.configure(AbstractTask.java:599)
at org.gradle.api.DefaultTask.configure(DefaultTask.java:307)
at org.gradle.api.DefaultTask.configure(DefaultTask.java:44)
at org.gradle.util.Configurable$configure.call(Unknown Source)
at aQute.bnd.gradle.BundleTaskConvention.buildBundle(BundleTaskConvention.groovy:194)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:64)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at org.gradle.internal.metaobject.BeanDynamicObject$MetaClassAdapter.invokeMethod(BeanDynamicObject.java:484)
at org.gradle.internal.metaobject.BeanDynamicObject.tryInvokeMethod(BeanDynamicObject.java:196)
at org.gradle.internal.extensibility.DefaultConvention$ExtensionsDynamicObject.tryInvokeMethod(DefaultConvention.java:308)
at org.gradle.internal.metaobject.CompositeDynamicObject.tryInvokeMethod(CompositeDynamicObject.java:98)
at org.gradle.internal.extensibility.MixInClosurePropertiesAsMethodsDynamicObject.tryInvokeMethod(MixInClosurePropertiesAsMethodsDynamicObject.java:34)
at org.gradle.internal.metaobject.AbstractDynamicObject.invokeMethod(AbstractDynamicObject.java:163)
at org.gradle.api.tasks.bundling.Jar_Decorated.invokeMethod(Unknown Source)
at aQute.bnd.gradle.BndBuilderPlugin$_apply_closure1$_closure2$_closure6.doCall(BndBuilderPlugin.groovy:59)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:64)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at org.gradle.api.internal.AbstractTask$ClosureTaskAction.doExecute(AbstractTask.java:680)
at org.gradle.api.internal.AbstractTask$ClosureTaskAction.lambda$execute$0(AbstractTask.java:667)
at org.gradle.configuration.internal.DefaultUserCodeApplicationContext$CurrentApplication.reapply(DefaultUserCodeApplicationContext.java:86)
at org.gradle.api.internal.AbstractTask$ClosureTaskAction.execute(AbstractTask.java:667)
at org.gradle.api.internal.AbstractTask$ClosureTaskAction.execute(AbstractTask.java:642)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$3.run(ExecuteActionsTaskExecuter.java:570)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:395)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:387)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$1.execute(DefaultBuildOperationExecutor.java:157)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:242)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:150)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:84)
at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeAction(ExecuteActionsTaskExecuter.java:555)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:538)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.access$300(ExecuteActionsTaskExecuter.java:109)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$TaskExecution.executeWithPreviousOutputFiles(ExecuteActionsTaskExecuter.java:279)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$TaskExecution.execute(ExecuteActionsTaskExecuter.java:268)
at org.gradle.internal.execution.steps.ExecuteStep.lambda$execute$1(ExecuteStep.java:33)
at org.gradle.internal.execution.steps.ExecuteStep.execute(ExecuteStep.java:33)
at org.gradle.internal.execution.steps.ExecuteStep.execute(ExecuteStep.java:26)
at org.gradle.internal.execution.steps.CleanupOutputsStep.execute(CleanupOutputsStep.java:67)
at org.gradle.internal.execution.steps.CleanupOutputsStep.execute(CleanupOutputsStep.java:36)
at org.gradle.internal.execution.steps.ResolveInputChangesStep.execute(ResolveInputChangesStep.java:49)
at org.gradle.internal.execution.steps.ResolveInputChangesStep.execute(ResolveInputChangesStep.java:34)
at org.gradle.internal.execution.steps.CancelExecutionStep.execute(CancelExecutionStep.java:43)
at org.gradle.internal.execution.steps.TimeoutStep.executeWithoutTimeout(TimeoutStep.java:73)
at org.gradle.internal.execution.steps.TimeoutStep.execute(TimeoutStep.java:54)
at org.gradle.internal.execution.steps.CatchExceptionStep.execute(CatchExceptionStep.java:34)
at org.gradle.internal.execution.steps.CreateOutputsStep.execute(CreateOutputsStep.java:44)
at org.gradle.internal.execution.steps.SnapshotOutputsStep.execute(SnapshotOutputsStep.java:54)
at org.gradle.internal.execution.steps.SnapshotOutputsStep.execute(SnapshotOutputsStep.java:38)
at org.gradle.internal.execution.steps.BroadcastChangingOutputsStep.execute(BroadcastChangingOutputsStep.java:49)
at org.gradle.internal.execution.steps.CacheStep.executeWithoutCache(CacheStep.java:159)
at org.gradle.internal.execution.steps.CacheStep.execute(CacheStep.java:72)
at org.gradle.internal.execution.steps.CacheStep.execute(CacheStep.java:43)
at org.gradle.internal.execution.steps.StoreExecutionStateStep.execute(StoreExecutionStateStep.java:44)
at org.gradle.internal.execution.steps.StoreExecutionStateStep.execute(StoreExecutionStateStep.java:33)
at org.gradle.internal.execution.steps.RecordOutputsStep.execute(RecordOutputsStep.java:38)
at org.gradle.internal.execution.steps.RecordOutputsStep.execute(RecordOutputsStep.java:24)
at org.gradle.internal.execution.steps.SkipUpToDateStep.executeBecause(SkipUpToDateStep.java:92)
at org.gradle.internal.execution.steps.SkipUpToDateStep.lambda$execute$0(SkipUpToDateStep.java:85)
at org.gradle.internal.execution.steps.SkipUpToDateStep.execute(SkipUpToDateStep.java:55)
at org.gradle.internal.execution.steps.SkipUpToDateStep.execute(SkipUpToDateStep.java:39)
at org.gradle.internal.execution.steps.ResolveChangesStep.execute(ResolveChangesStep.java:76)
at org.gradle.internal.execution.steps.ResolveChangesStep.execute(ResolveChangesStep.java:37)
at org.gradle.internal.execution.steps.legacy.MarkSnapshottingInputsFinishedStep.execute(MarkSnapshottingInputsFinishedStep.java:36)
at org.gradle.internal.execution.steps.legacy.MarkSnapshottingInputsFinishedStep.execute(MarkSnapshottingInputsFinishedStep.java:26)
at org.gradle.internal.execution.steps.ResolveCachingStateStep.execute(ResolveCachingStateStep.java:94)
at org.gradle.internal.execution.steps.ResolveCachingStateStep.execute(ResolveCachingStateStep.java:49)
at org.gradle.internal.execution.steps.CaptureStateBeforeExecutionStep.execute(CaptureStateBeforeExecutionStep.java:79)
at org.gradle.internal.execution.steps.CaptureStateBeforeExecutionStep.execute(CaptureStateBeforeExecutionStep.java:53)
at org.gradle.internal.execution.steps.ValidateStep.execute(ValidateStep.java:74)
at org.gradle.internal.execution.steps.SkipEmptyWorkStep.lambda$execute$2(SkipEmptyWorkStep.java:78)
at org.gradle.internal.execution.steps.SkipEmptyWorkStep.execute(SkipEmptyWorkStep.java:78)
at org.gradle.internal.execution.steps.SkipEmptyWorkStep.execute(SkipEmptyWorkStep.java:34)
at org.gradle.internal.execution.steps.legacy.MarkSnapshottingInputsStartedStep.execute(MarkSnapshottingInputsStartedStep.java:39)
at org.gradle.internal.execution.steps.LoadExecutionStateStep.execute(LoadExecutionStateStep.java:40)
at org.gradle.internal.execution.steps.LoadExecutionStateStep.execute(LoadExecutionStateStep.java:28)
at org.gradle.internal.execution.impl.DefaultWorkExecutor.execute(DefaultWorkExecutor.java:33)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeIfValid(ExecuteActionsTaskExecuter.java:195)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:187)
at org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:114)
at org.gradle.api.internal.tasks.execution.FinalizePropertiesTaskExecuter.execute(FinalizePropertiesTaskExecuter.java:46)
at org.gradle.api.internal.tasks.execution.ResolveTaskExecutionModeExecuter.execute(ResolveTaskExecutionModeExecuter.java:62)
at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:57)
at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:56)
at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:36)
at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.executeTask(EventFiringTaskExecuter.java:77)
at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:55)
at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:52)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:409)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:399)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$1.execute(DefaultBuildOperationExecutor.java:157)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:242)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:150)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.call(DefaultBuildOperationExecutor.java:94)
at org.gradle.internal.operations.DelegatingBuildOperationExecutor.call(DelegatingBuildOperationExecutor.java:36)
at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter.execute(EventFiringTaskExecuter.java:52)
at org.gradle.execution.plan.LocalTaskNodeExecutor.execute(LocalTaskNodeExecutor.java:41)
at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:372)
at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:359)
at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:352)
at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:338)
at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.lambda$run$0(DefaultPlanExecutor.java:127)
at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.execute(DefaultPlanExecutor.java:191)
at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.executeNextNode(DefaultPlanExecutor.java:182)
at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.run(DefaultPlanExecutor.java:124)
at org.gradle.execution.plan.DefaultPlanExecutor.process(DefaultPlanExecutor.java:72)
at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph.executeWithServices(DefaultTaskExecutionGraph.java:184)
at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph.execute(DefaultTaskExecutionGraph.java:177)
at org.gradle.execution.SelectedTaskExecutionAction.execute(SelectedTaskExecutionAction.java:39)
at org.gradle.execution.DefaultBuildWorkExecutor.execute(DefaultBuildWorkExecutor.java:40)
at org.gradle.execution.DefaultBuildWorkExecutor.access$000(DefaultBuildWorkExecutor.java:24)
at org.gradle.execution.DefaultBuildWorkExecutor$1.proceed(DefaultBuildWorkExecutor.java:48)
at org.gradle.execution.DryRunBuildExecutionAction.execute(DryRunBuildExecutionAction.java:49)
at org.gradle.execution.DefaultBuildWorkExecutor.execute(DefaultBuildWorkExecutor.java:40)
at org.gradle.execution.DefaultBuildWorkExecutor.execute(DefaultBuildWorkExecutor.java:33)
at org.gradle.execution.IncludedBuildLifecycleBuildWorkExecutor.execute(IncludedBuildLifecycleBuildWorkExecutor.java:36)
at org.gradle.execution.DeprecateUndefinedBuildWorkExecutor.execute(DeprecateUndefinedBuildWorkExecutor.java:42)
at org.gradle.execution.BuildOperationFiringBuildWorkerExecutor$ExecuteTasks.run(BuildOperationFiringBuildWorkerExecutor.java:57)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:395)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:387)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$1.execute(DefaultBuildOperationExecutor.java:157)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:242)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:150)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:84)
at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
at org.gradle.execution.BuildOperationFiringBuildWorkerExecutor.execute(BuildOperationFiringBuildWorkerExecutor.java:42)
at org.gradle.initialization.DefaultGradleLauncher.runWork(DefaultGradleLauncher.java:272)
at org.gradle.initialization.DefaultGradleLauncher.doClassicBuildStages(DefaultGradleLauncher.java:179)
at org.gradle.initialization.DefaultGradleLauncher.doBuildStages(DefaultGradleLauncher.java:155)
at org.gradle.initialization.DefaultGradleLauncher.executeTasks(DefaultGradleLauncher.java:131)
at org.gradle.internal.invocation.GradleBuildController$1.create(GradleBuildController.java:72)
at org.gradle.internal.invocation.GradleBuildController$1.create(GradleBuildController.java:67)
at org.gradle.internal.work.DefaultWorkerLeaseService.withLocks(DefaultWorkerLeaseService.java:180)
at org.gradle.internal.work.StopShieldingWorkerLeaseService.withLocks(StopShieldingWorkerLeaseService.java:40)
at org.gradle.internal.invocation.GradleBuildController.doBuild(GradleBuildController.java:67)
at org.gradle.internal.invocation.GradleBuildController.run(GradleBuildController.java:56)
at org.gradle.tooling.internal.provider.ExecuteBuildActionRunner.run(ExecuteBuildActionRunner.java:31)
at org.gradle.launcher.exec.ChainingBuildActionRunner.run(ChainingBuildActionRunner.java:35)
at org.gradle.launcher.exec.BuildOutcomeReportingBuildActionRunner.run(BuildOutcomeReportingBuildActionRunner.java:63)
at org.gradle.tooling.internal.provider.ValidatingBuildActionRunner.run(ValidatingBuildActionRunner.java:32)
at org.gradle.tooling.internal.provider.FileSystemWatchingBuildActionRunner.run(FileSystemWatchingBuildActionRunner.java:52)
at org.gradle.launcher.exec.BuildCompletionNotifyingBuildActionRunner.run(BuildCompletionNotifyingBuildActionRunner.java:41)
at org.gradle.launcher.exec.RunAsBuildOperationBuildActionRunner$3.call(RunAsBuildOperationBuildActionRunner.java:49)
at org.gradle.launcher.exec.RunAsBuildOperationBuildActionRunner$3.call(RunAsBuildOperationBuildActionRunner.java:44)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:409)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:399)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$1.execute(DefaultBuildOperationExecutor.java:157)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:242)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:150)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.call(DefaultBuildOperationExecutor.java:94)
at org.gradle.internal.operations.DelegatingBuildOperationExecutor.call(DelegatingBuildOperationExecutor.java:36)
at org.gradle.launcher.exec.RunAsBuildOperationBuildActionRunner.run(RunAsBuildOperationBuildActionRunner.java:44)
at org.gradle.launcher.exec.InProcessBuildActionExecuter$1.transform(InProcessBuildActionExecuter.java:50)
at org.gradle.launcher.exec.InProcessBuildActionExecuter$1.transform(InProcessBuildActionExecuter.java:47)
at org.gradle.composite.internal.DefaultRootBuildState.run(DefaultRootBuildState.java:87)
at org.gradle.launcher.exec.InProcessBuildActionExecuter.execute(InProcessBuildActionExecuter.java:47)
at org.gradle.launcher.exec.InProcessBuildActionExecuter.execute(InProcessBuildActionExecuter.java:31)
at org.gradle.launcher.exec.BuildTreeScopeBuildActionExecuter.execute(BuildTreeScopeBuildActionExecuter.java:44)
at org.gradle.launcher.exec.BuildTreeScopeBuildActionExecuter.execute(BuildTreeScopeBuildActionExecuter.java:29)
at org.gradle.tooling.internal.provider.ContinuousBuildActionExecuter.execute(ContinuousBuildActionExecuter.java:87)
at org.gradle.tooling.internal.provider.ContinuousBuildActionExecuter.execute(ContinuousBuildActionExecuter.java:55)
at org.gradle.tooling.internal.provider.SubscribableBuildActionExecuter.execute(SubscribableBuildActionExecuter.java:60)
at org.gradle.tooling.internal.provider.SubscribableBuildActionExecuter.execute(SubscribableBuildActionExecuter.java:38)
at org.gradle.tooling.internal.provider.SessionScopeBuildActionExecuter.execute(SessionScopeBuildActionExecuter.java:68)
at org.gradle.tooling.internal.provider.SessionScopeBuildActionExecuter.execute(SessionScopeBuildActionExecuter.java:38)
at org.gradle.tooling.internal.provider.GradleThreadBuildActionExecuter.execute(GradleThreadBuildActionExecuter.java:37)
at org.gradle.tooling.internal.provider.GradleThreadBuildActionExecuter.execute(GradleThreadBuildActionExecuter.java:26)
at org.gradle.tooling.internal.provider.StartParamsValidatingActionExecuter.execute(StartParamsValidatingActionExecuter.java:60)
at org.gradle.tooling.internal.provider.StartParamsValidatingActionExecuter.execute(StartParamsValidatingActionExecuter.java:32)
at org.gradle.tooling.internal.provider.SessionFailureReportingActionExecuter.execute(SessionFailureReportingActionExecuter.java:56)
at org.gradle.tooling.internal.provider.SessionFailureReportingActionExecuter.execute(SessionFailureReportingActionExecuter.java:42)
at org.gradle.tooling.internal.provider.SetupLoggingActionExecuter.execute(SetupLoggingActionExecuter.java:48)
at org.gradle.tooling.internal.provider.SetupLoggingActionExecuter.execute(SetupLoggingActionExecuter.java:32)
at org.gradle.launcher.daemon.server.exec.ExecuteBuild.doBuild(ExecuteBuild.java:68)
at org.gradle.launcher.daemon.server.exec.BuildCommandOnly.execute(BuildCommandOnly.java:37)
at org.gradle.launcher.daemon.server.api.DaemonCommandExecution.proceed(DaemonCommandExecution.java:104)
at org.gradle.launcher.daemon.server.exec.WatchForDisconnection.execute(WatchForDisconnection.java:39)
at org.gradle.launcher.daemon.server.api.DaemonCommandExecution.proceed(DaemonCommandExecution.java:104)
at org.gradle.launcher.daemon.server.exec.ResetDeprecationLogger.execute(ResetDeprecationLogger.java:29)
at org.gradle.launcher.daemon.server.api.DaemonCommandExecution.proceed(DaemonCommandExecution.java:104)
at org.gradle.launcher.daemon.server.exec.RequestStopIfSingleUsedDaemon.execute(RequestStopIfSingleUsedDaemon.java:35)
at org.gradle.launcher.daemon.server.api.DaemonCommandExecution.proceed(DaemonCommandExecution.java:104)
at org.gradle.launcher.daemon.server.exec.ForwardClientInput$2.create(ForwardClientInput.java:78)
at org.gradle.launcher.daemon.server.exec.ForwardClientInput$2.create(ForwardClientInput.java:75)
at org.gradle.util.Swapper.swap(Swapper.java:38)
at org.gradle.launcher.daemon.server.exec.ForwardClientInput.execute(ForwardClientInput.java:75)
at org.gradle.launcher.daemon.server.api.DaemonCommandExecution.proceed(DaemonCommandExecution.java:104)
at org.gradle.launcher.daemon.server.exec.LogAndCheckHealth.execute(LogAndCheckHealth.java:55)
at org.gradle.launcher.daemon.server.api.DaemonCommandExecution.proceed(DaemonCommandExecution.java:104)
at org.gradle.launcher.daemon.server.exec.LogToClient.doBuild(LogToClient.java:63)
at org.gradle.launcher.daemon.server.exec.BuildCommandOnly.execute(BuildCommandOnly.java:37)
at org.gradle.launcher.daemon.server.api.DaemonCommandExecution.proceed(DaemonCommandExecution.java:104)
at org.gradle.launcher.daemon.server.exec.EstablishBuildEnvironment.doBuild(EstablishBuildEnvironment.java:82)
at org.gradle.launcher.daemon.server.exec.BuildCommandOnly.execute(BuildCommandOnly.java:37)
at org.gradle.launcher.daemon.server.api.DaemonCommandExecution.proceed(DaemonCommandExecution.java:104)
at org.gradle.launcher.daemon.server.exec.StartBuildOrRespondWithBusy$1.run(StartBuildOrRespondWithBusy.java:52)
at org.gradle.launcher.daemon.server.DaemonStateCoordinator$1.run(DaemonStateCoordinator.java:297)
at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
My build.gradle has biz.aQute.bnd:biz.aQute.bnd.gradle:4.2.0.
I will try the latest version (5.1.2) next.
After updating the biz.aQute.bnd Gradle plugin to version 5.1.2, the build succeeds without error.
https://travis-ci.org/github/remkop/picocli/jobs/717123867
Thanks, closing this ticket.
|
gharchive/issue
| 2020-06-30T01:41:14 |
2025-04-01T04:34:25.513279
|
{
"authors": [
"big-guy",
"remkop"
],
"repo": "gradle/gradle",
"url": "https://github.com/gradle/gradle/issues/13628",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
900513373
|
Cryptic IllegalArgumentException
Hi,
One of my builds contains a transitive dependency on uk.ac.ebi.pride.architectural.pride-logging. This artifact should be available from this repository: https://www.ebi.ac.uk/Tools/maven/repos/content/groups/ebi-repo/uk/ac/ebi/pride/architectural/pride-logging/1.0.0/
Currently, the repository is broken - it contains only the .pom file but not the corresponding .jar file.
Therefore, it is not surprising that the gradle build fails. However, it fails with a rather cryptic IllegalArgumentException:
> Task :compileJava FAILED
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':compileJava'.
> java.lang.IllegalArgumentException: /Users/plusik/.gradle/caches/modules-2/files-2.1/uk.ac.ebi.pride.architectural/pride-logging/1.0.0/63df375258e700783e6e134f690bbb0d22491a94/pride-logging-1.0.0.pom
Expected Behavior
More meaningful error message from gradle.
Current Behavior
Not very meaningful exception.
Your Environment
OS: macOS 10.15.7
Gradle version: 7.0.2
Build scan URL: https://gradle.com/s/m6rolizm3zitk
@tomas-pluskal Can you please share the snippet how you declare the dependency in your project? It seems we're trying to parse the pom file as a JPMS module (which is clearly wrong). Do you mind trying to come up with a reproducer? Thanks!
This is the complete source (contains only the module declaration)
gradle_exception_test.zip
Thank you for creating a reproducer so fast. I'll take a look soon.
I was able to reproduce the problem using your sample project.
The org.openjfx.javafxplugin plugin applies org.javamodularity.moduleplugin, which seems to be the source of the trouble. It appears that the plugin modifies the compileJava task configuration and fails to omit poms from the module path.
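Until the plugin is fixed, the shape of a workaround would be filtering non-jar entries off the compile classpath — a hypothetical sketch in the Kotlin DSL, not the plugin's actual code:

```kotlin
// Hypothetical workaround: drop .pom files so they never reach the module path
tasks.named<JavaCompile>("compileJava") {
    classpath = classpath.filter { it.extension != "pom" }
}
```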
Is org.javamodularity.moduleplugin part of gradle, or is that an external plugin?
I've reported the issue on the plugin: https://github.com/java9-modularity/gradle-modules-plugin/issues/190
Thanks!
|
gharchive/issue
| 2021-05-25T08:53:27 |
2025-04-01T04:34:25.523319
|
{
"authors": [
"bmuskalla",
"donat",
"tomas-pluskal"
],
"repo": "gradle/gradle",
"url": "https://github.com/gradle/gradle/issues/17264",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1559346373
|
Order of tasks in JVMTestSuite plugin is not kept when shouldRunAfter is specified
I use the JVMTestSuite plugin to define integration tests. Those tests must run after the normal unit tests have been executed, not the other way around.
So, I defined this:
testing {
suites {
val jUnitVersion = "5.9.2"
val test by getting(JvmTestSuite::class) {
useJUnitJupiter(jUnitVersion)
}
val integrationTest by registering(JvmTestSuite::class)
integrationTest {
dependencies {
implementation(project)
}
targets {
all {
testTask.configure {
shouldRunAfter(test)
}
}
}
}
}
}
Expected Behavior
When I do a ./gradlew :build, I expect FIRST the unit test and after that the integration tests task to be executed.
Current Behavior
The integrationTest task is always executed first...
Context
I see this in Gradle 7.5.1 and 7.6, the "shouldRunAfter" seems to be ignored. "mustRunAfter" is not correct, because that forces integration tests to be executed even if you just want test...
Steps to Reproduce
Your Environment
Build scan URL:
https://scans.gradle.com/s/v77ms75oxijm6/timeline
Thank you for your interest in Gradle!
This is the intended behavior and there’s no plan to change it. This issue will be closed.
shouldRunAfter allows Gradle to run them in a different order
See https://docs.gradle.org/current/userguide/more_about_tasks.html#sec:ordering_tasks
If your use case is stricter then mustRunAfter would be better.
mustRunAfter doesn't mandate that the first task is always executed. finalizedBy is about that.
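Put concretely, the two stricter options mentioned look like this in the Kotlin DSL (a sketch using the task names from the build above):

```kotlin
// mustRunAfter: if both tasks are scheduled, test always runs first,
// but running :integrationTest alone does NOT pull in :test.
tasks.named("integrationTest") {
    mustRunAfter(tasks.named("test"))
}

// finalizedBy: running :test always triggers :integrationTest afterwards.
tasks.named("test") {
    finalizedBy(tasks.named("integrationTest"))
}
```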
|
gharchive/issue
| 2023-01-27T08:14:35 |
2025-04-01T04:34:25.528852
|
{
"authors": [
"Zordid",
"eskatos"
],
"repo": "gradle/gradle",
"url": "https://github.com/gradle/gradle/issues/23678",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1706274737
|
Gradle fails catastrophically on multi-release jars with unsupported Java versions
Expected Behavior
When Gradle encounters a multi-release jar (https://openjdk.org/jeps/238) with a Java version under META-INF that is not recognized by that version of Gradle, Gradle should just ignore the unrecognized version and continue working.
Current Behavior
After upgrading jackson-core to 2.15.0, Gradle 7.4.2 failed with these errors:
Failed to create Jar file /home/jenkins/workspace/ourProject/.gradle/caches/jars-9/bc33e077f18aab4d8905d3690c9e67e4/jackson-core-2.15.0.jar.
16:31:39 Caused by: java.lang.IllegalArgumentException: Unsupported class file major version 63
Because the failure was complete, we couldn't use Gradle to try to debug the problem. For example, ./gradlew dependencies failed with the same "Failed to create Jar file" error.
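The major version in the message maps directly to a Java release — 63 corresponds to Java 19, which jackson-core 2.15.0 ships classes for under META-INF/versions. A quick way to see where the number comes from is to parse the class-file header yourself; a minimal sketch:

```python
import struct

def class_file_major_version(data: bytes) -> int:
    """Parse the major version from raw class-file bytes (JVM spec section 4.1)."""
    magic, minor, major = struct.unpack(">IHH", data[:8])
    if magic != 0xCAFEBABE:
        raise ValueError("not a class file")
    return major

# A class compiled for Java 19 starts with magic 0xCAFEBABE,
# minor version 0, major version 63 -- the value the ASM bundled
# with Gradle 7.4.2 does not recognize.
header = struct.pack(">IHH", 0xCAFEBABE, 0, 63)
print(class_file_major_version(header))  # 63
```

A multi-release-aware reader would simply skip META-INF/versions entries it cannot parse instead of failing the whole build.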
Context (optional)
No response
Steps to Reproduce
Create a project with a dependency on jackson-core 2.15.0 and Gradle 7.4.2, then try to run ./gradlew dependencies.
Gradle version
7.4.2
Build scan URL (optional)
No response
Your Environment (optional)
No response
Sorry that you're having trouble with Gradle!
We appreciate the effort that went into filing this issue, but we must ask for more information.
As stated in our issue template, a minimal reproducible example is a must for us to be able to track down and fix your problem efficiently. Our available resources are severely limited, and we must be sure we are looking at the exact problem you are facing.
If we have a reproducer, we may be able also to suggest workarounds or ways to avoid the problem.
The ideal way to provide a reproducer is to leverage our reproducer template. You can also use Gradle Project Replicator to reproduce the structure of your project.
This issue will be closed after 7 days unless you provide more information.
Please provide details about the environment you experience this problem in, especially the Java version you use to run Gradle (and possibly any toolchain versions you use). We suspect your problem to be not completely generic, hence asking for a concrete reproducer.
|
gharchive/issue
| 2023-05-11T17:47:38 |
2025-04-01T04:34:25.535348
|
{
"authors": [
"jbartok",
"jimshowalter"
],
"repo": "gradle/gradle",
"url": "https://github.com/gradle/gradle/issues/25043",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2040617184
|
Sync task with Provider as source reports NO-SOURCE when using configuration cache
Current Behavior
On Gradle 8.4 when configuring the source of a Sync task with a Provider<String>, the task reports NO-SOURCE when run with --configuration-cache and no files are copied. Without the configuration cache enabled, the task runs as expected.
Expected Behavior
Configuring the source of a Sync task with a Provider<String> (as documented) should work with the configuration cache enabled.
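For reference, a minimal configuration of the kind described (names hypothetical) looks like:

```kotlin
// Hedged sketch of the reported setup: a Provider<String> as the Sync source
val sourceDir: Provider<String> = providers.gradleProperty("sourceDir")

tasks.register<Sync>("copyFiles") {
    from(sourceDir)                              // documented to be supported
    into(layout.buildDirectory.dir("copied"))
}
```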
Context (optional)
No response
Steps to Reproduce
I distilled the problem down to this simple project:
gradle-cfg-cache-test.zip
Reproduce with
./gradlew --configuration-cache some-project:copyFiles
I upgraded that project to Gradle 8.5 (latest version as of 12/13) and reproduced there as well.
Gradle version
8.4
Build scan URL (optional)
No response
Your Environment (optional)
No response
Thank you for providing a valid report.
The issue is in the backlog of the relevant team, but this area of Gradle is currently not a focus one, so it might take a while before a fix is made.
|
gharchive/issue
| 2023-12-13T23:24:31 |
2025-04-01T04:34:25.539832
|
{
"authors": [
"BrennanEbeling",
"ov7a"
],
"repo": "gradle/gradle",
"url": "https://github.com/gradle/gradle/issues/27409",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2346428138
|
Formatting should be the same with JDK8 compiler reports as with other compilers
Current Behavior
After fixing https://github.com/gradle/gradle/issues/29113 with https://github.com/gradle/gradle/pull/29141, we still haven't completely fixed the case where a compilation diagnostic is reported by a JDK8 compiler that has a Processor added to it.
Our mitigation will fall back on using the BasicDiagnosticFormatter that comes with the diagnostic. This formatter has some drawbacks compared to the standard RichDiagnosticFormatter, with the additional features of RichDiagnosticFormatter detailed in the JavaDoc.
Expected Behavior
The JDK8 compiler output should be consistent with any configuration of the compiler (i.e. when annotation processors are applied or not)
Context (optional)
https://github.com/gradle/gradle/issues/29113
Steps to Reproduce
JavaCompileProblemsIntegrationTest#compiler warnings causes failure in problem mapping under JDK8
Gradle version
8.9
Build scan URL (optional)
No response
Your Environment (optional)
No response
Upon further investigation into OpenJDK's codebase, I've come to the conclusion that this equivalence between JDK8 and JDK8+ probably is not possible.
It seems that in JDK8, a new Context object will be created per processing round, and the previous one reset ^1
This is an obvious blocker, as our concept was to hold onto a Context and use it to have access to formatting utilities.
|
gharchive/issue
| 2024-06-11T13:22:37 |
2025-04-01T04:34:25.545544
|
{
"authors": [
"hegyibalint"
],
"repo": "gradle/gradle",
"url": "https://github.com/gradle/gradle/issues/29501",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
359991265
|
compileTestScala fails with UncheckedIOException: java.io.FileNotFoundException
Our builds are failing to compile scala after the upgrade to 4.10.1.
We do not see this issue in 4.10
[09:18:24]org.gradle.api.tasks.TaskExecutionException: Execution failed for task ':projectA:scala:compileTestScala'.
[09:18:24] at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:110)
[09:18:24]
[09:18:24] at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:77)
[09:18:24] at org.gradle.api.internal.tasks.execution.OutputDirectoryCreatingTaskExecuter.execute(OutputDirectoryCreatingTaskExecuter.java:51)
[09:18:24] at org.gradle.api.internal.tasks.execution.SkipCachedTaskExecuter.execute(SkipCachedTaskExecuter.java:105)
[09:18:24] at org.gradle.api.internal.tasks.execution.SkipUpToDateTaskExecuter.execute(SkipUpToDateTaskExecuter.java:59)
[09:18:24] at org.gradle.api.internal.tasks.execution.ResolveTaskOutputCachingStateExecuter.execute(ResolveTaskOutputCachingStateExecuter.java:54)
[09:18:24] at org.gradle.api.internal.tasks.execution.ResolveBuildCacheKeyExecuter.execute(ResolveBuildCacheKeyExecuter.java:79)
[09:18:24] at org.gradle.api.internal.tasks.execution.ValidatingTaskExecuter.execute(ValidatingTaskExecuter.java:59)
[09:18:24] at org.gradle.api.internal.tasks.execution.SkipEmptySourceFilesTaskExecuter.execute(SkipEmptySourceFilesTaskExecuter.java:101)
[09:18:24] at org.gradle.api.internal.tasks.execution.FinalizeInputFilePropertiesTaskExecuter.execute(FinalizeInputFilePropertiesTaskExecuter.java:44)
[09:18:24] at org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:91)
[09:18:24] at org.gradle.api.internal.tasks.execution.ResolveTaskArtifactStateTaskExecuter.execute(ResolveTaskArtifactStateTaskExecuter.java:62)
[09:18:24] at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:59)
[09:18:24] at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:54)
[09:18:24] at org.gradle.api.internal.tasks.execution.ExecuteAtMostOnceTaskExecuter.execute(ExecuteAtMostOnceTaskExecuter.java:43)
[09:18:24] at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:34)
[09:18:24] at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.run(EventFiringTaskExecuter.java:51)
[09:18:24] at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:300)
[09:18:24] at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:292)
[09:18:24] at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:174)
[09:18:24] at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:90)
[09:18:24] at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
[09:18:24] at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter.execute(EventFiringTaskExecuter.java:46)
[09:18:24] at org.gradle.execution.taskgraph.LocalTaskInfoExecutor.execute(LocalTaskInfoExecutor.java:42)
[09:18:24] at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareWorkItemExecutor.execute(DefaultTaskExecutionGraph.java:277)
[09:18:24] at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareWorkItemExecutor.execute(DefaultTaskExecutionGraph.java:262)
[09:18:24] at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$ExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:135)
[09:18:24] at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$ExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:130)
[09:18:24] at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$ExecutorWorker.execute(DefaultTaskPlanExecutor.java:200)
[09:18:24] at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$ExecutorWorker.executeWithWork(DefaultTaskPlanExecutor.java:191)
[09:18:24] at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$ExecutorWorker.run(DefaultTaskPlanExecutor.java:130)
[09:18:24] at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
[09:18:24] at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
[09:18:24] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[09:18:24] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[09:18:24] at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
[09:18:24] at java.lang.Thread.run(Thread.java:748)
[09:18:24]Caused by: org.gradle.api.UncheckedIOException: java.io.FileNotFoundException: ....build/classes/java/main (Is a directory)
[09:18:24] at org.gradle.language.scala.tasks.AbstractScalaCompile.resolveAnalysisMappingsForOtherProjects(AbstractScalaCompile.java:167)
[09:18:24] at org.gradle.language.scala.tasks.AbstractScalaCompile.configureIncrementalCompilation(AbstractScalaCompile.java:137)
[09:18:24] at org.gradle.language.scala.tasks.AbstractScalaCompile.compile(AbstractScalaCompile.java:101)
[09:18:24] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[09:18:24] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[09:18:24] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[09:18:24] at java.lang.reflect.Method.invoke(Method.java:498)
[09:18:24] at org.gradle.internal.reflect.JavaMethod.invoke(JavaMethod.java:73)
[09:18:24] at org.gradle.api.internal.project.taskfactory.StandardTaskAction.doExecute(StandardTaskAction.java:46)
[09:18:24] at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:39)
[09:18:24] at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:26)
[09:18:24] at org.gradle.api.internal.AbstractTask$TaskActionWrapper.execute(AbstractTask.java:801)
[09:18:24] at org.gradle.api.internal.AbstractTask$TaskActionWrapper.execute(AbstractTask.java:768)
[09:18:24] at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$1.run(ExecuteActionsTaskExecuter.java:131)
[09:18:24] at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:300)
[09:18:24] at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:292)
[09:18:24] at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:174)
[09:18:24] at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:90)
[09:18:24] at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
[09:18:24] at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeAction(ExecuteActionsTaskExecuter.java:120)
[09:18:24] at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:99)
[09:18:24] ... 36 more
[09:18:24]Caused by: java.io.FileNotFoundException: /.../build/classes/java/main (Is a directory)
[09:18:24] at java.io.FileInputStream.open0(Native Method)
[09:18:24] at java.io.FileInputStream.open(FileInputStream.java:195)
[09:18:24] at java.io.FileInputStream.<init>(FileInputStream.java:138)
[09:18:24] at com.google.common.io.Files$FileByteSource.openStream(Files.java:126)
[09:18:24] at com.google.common.io.Files$FileByteSource.openStream(Files.java:116)
[09:18:24] at com.google.common.io.ByteSource$AsCharSource.openStream(ByteSource.java:435)
[09:18:24] at com.google.common.io.CharSource.getInput(CharSource.java:94)
[09:18:24] at com.google.common.io.CharSource.getInput(CharSource.java:65)
[09:18:24] at com.google.common.io.CharStreams.readLines(CharStreams.java:344)
[09:18:24] at com.google.common.io.Files.readLines(Files.java:741)
[09:18:24] at com.google.common.io.Files.readLines(Files.java:712)
[09:18:24] at org.gradle.language.scala.tasks.AbstractScalaCompile.resolveAnalysisMappingsForOtherProjects(AbstractScalaCompile.java:163)
@pgcortina thanks! We're looking into issues with Scala and 4.10.1.
As a workaround, can you try:
compileTestScala {
analysisFiles.setFrom(files())
}
@big-guy thanks! that seems to work.
@pgcortina could you share how you're declaring the dependency between your projects? Do you have something like:
dependencies {
compile project(":someOtherProject")
}
I suspect this is the same or some variant of https://github.com/gradle/gradle/issues/6735, I'm going to close this as a dupe.
|
gharchive/issue
| 2018-09-13T17:26:49 |
2025-04-01T04:34:25.550700
|
{
"authors": [
"big-guy",
"pgcortina"
],
"repo": "gradle/gradle",
"url": "https://github.com/gradle/gradle/issues/6748",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
723343102
|
Move plugin development guides to user manual
This turns the four guides about plugin development (https://gradle.org/guides/?q=Plugin Development) into user manual sections.
All snippets are updated to follow idiomatic Gradle recommendations. This is already a big improvement, as none of them were using the lazy configuration APIs until now.
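For context, the lazy-configuration style the snippets now follow looks roughly like this (hypothetical task, not taken from the guides):

```kotlin
// Lazy configuration: Property values are only resolved at execution time
abstract class GreetingTask : DefaultTask() {
    @get:Input
    abstract val message: Property<String>

    @TaskAction
    fun greet() = logger.lifecycle(message.get())
}

tasks.register<GreetingTask>("greet") {
    message.set(providers.gradleProperty("greeting").orElse("Hello"))
}
```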
Text and structure are mostly kept, but some outdated information has been removed.
@bot-gradle test this
OK, I've already triggered ReadyForMerge build for you.
@bot-gradle test this
OK, I've already cancelled the old build and triggered a new ReadyForMerge build for you.
|
gharchive/pull-request
| 2020-10-16T15:58:31 |
2025-04-01T04:34:25.554484
|
{
"authors": [
"bot-gradle",
"jjohannes"
],
"repo": "gradle/gradle",
"url": "https://github.com/gradle/gradle/pull/14895",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1561946841
|
Emit a bunch of deprecation warnings for Gradle 8.1
Closes https://github.com/gradle/gradle/issues/21902
Closes https://github.com/gradle/gradle/issues/21882
Closes https://github.com/gradle/gradle/issues/21900
Closes https://github.com/gradle/gradle/issues/21914
This PR also opportunistically cleans up some smoke tests by removing @UnsupportedWithConfigurationCache annotation when it is not applicable anymore.
@bot-gradle test and merge
I've triggered a build for you.
Pre-tested commit build failed.
@bot-gradle test and merge
Your PR is queued. See the queue page for details.
I've triggered a build for you.
Pre-tested commit build failed.
@bot-gradle test and merge
Your PR is queued. See the queue page for details.
I've triggered a build for you.
|
gharchive/pull-request
| 2023-01-30T08:13:03 |
2025-04-01T04:34:25.561134
|
{
"authors": [
"bot-gradle",
"eskatos"
],
"repo": "gradle/gradle",
"url": "https://github.com/gradle/gradle/pull/23705",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2598169673
|
Speed up FileBackedBlockStore
Reviewing cheatsheet
Before merging the PR, comments starting with
❌ ❓must be fixed
🤔 💅 should be fixed
💭 may be fixed
🎉 celebrate happy things
@bot-gradle help
Currently, the following commands are supported:
@bot-gradle test <BuildTrigger1> <BuildTrigger2> ... <BuildTriggerN> [without PTS] [on linux]
Add without PTS to run the build with full test coverage.
Add on linux/windows/macos to run the build on specific OS (only support stage trigger).
A trigger is a special build for this PR on TeamCity, common triggers are:
SanityCheck/CompileAll/QuickFeedbackLinux/QuickFeedback/PullRequestFeedback/ReadyForNightly/ReadyForRelease
Shortcuts: SC/CA/QFL/QF/PRF/RFN/RFR
Specific builds:
BD: BuildDistributions/BuildDocs so that you can preview the generated docs/distribution.
PT: PerformanceTest, all performance tests for Ready For Nightly stage.
APT: AllPerformanceTest, all performance tests, including slow performance tests.
ACT: AllCrossVersionTests, all cross version tests in ReadyForNightly and ReadyForRelease.
AST: AllSmokeTestsPullRequestFeedback
AFT: AllFunctionalTestsPullRequestFeedback
ASB: AllSpecificBuildsPullRequestFeedback
ACC: AllConfigCacheTestsPullRequestFeedback
ACTN: AllCrossVersionTestsReadyForNightly
AFTN: AllFunctionalTestsReadyForNightly
ACTR: AllCrossVersionTestsReadyForRelease, alias of ACT
AFTR: AllFunctionalTestsReadyForRelease
@bot-gradle squash
Squash the current pull request into a single commit. The squash message will come from the pull request body, with:
All Signed-off-by: lines kept.
All authors being Co-Authored-By:
@bot-gradle merge
Enqueue this PR into GitHub merge queue for merging.
GitHub will create a merge commit from your PR branch HEAD and the target branch
A GitHub Merge Queue Check Pass build will be triggered on the merge commit
When the build passes, the target branch will be fast-forwarded to this merge commit (i.e. merge the PR)
@bot-gradle squash and merge
Squash the current pull request into a single commit as described in squash command above,
then enqueue this PR into GitHub merge queue as described in merge command above.
@bot-gradle cherrypick to <branch>
Cherrypicks the current PR to another branch.
A new PR will be created and a build will triggered automatically if there is no conflict.
@bot-gradle cancel
cancel a running build in GitHub merge queue and remove the PR from the queue
@bot-gradle clean
clear the conversation history
@bot-gradle help
display this message
To run a command, simply submit a comment. For detailed instructions see here.
@bot-gradle test APT
The following builds have passed:
All Performance Tests (Trigger)
Build Scan
|
gharchive/pull-request
| 2024-10-18T18:44:17 |
2025-04-01T04:34:25.575984
|
{
"authors": [
"big-guy",
"bot-gradle"
],
"repo": "gradle/gradle",
"url": "https://github.com/gradle/gradle/pull/30964",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
466691984
|
Better document for DeprecationLogger
This closes https://github.com/gradle/gradle-private/issues/2432
Previously, DeprecationLogger was not well documented; developers had to run a build to
see what the final message looked like. This commit adds clear documentation for each method.
Thanks @wolfs !
|
gharchive/pull-request
| 2019-07-11T06:38:35 |
2025-04-01T04:34:25.577713
|
{
"authors": [
"blindpirate"
],
"repo": "gradle/gradle",
"url": "https://github.com/gradle/gradle/pull/9926",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|