id (string, 4 to 10 chars) | text (string, 4 to 2.14M chars) | source (2 classes) | created (timestamp[s], 2001-05-16 21:05:09 to 2025-01-01 03:38:30) | added (string date, 2025-04-01 04:05:38 to 2025-04-01 07:14:06) | metadata (dict)
---|---|---|---|---|---
1785384176 | [Bug] where or operation error structure
Bug report
Structure ... where: { or: { xxx_contains_nocase: "xxx", xxx_contains_nocase: "xxx"}} ... the query does not work
... or: { xxx_contains_nocase: "xxx", xxx_contains_nocase: "xxx"} ... the query works
Relevant log output
Invalid value provided for argument `where`: Object({\"or\": Object({\"name_contains_nocase\": String(\"xx\"), \"symbol_contains_nocase\": String(\"xx\")})})
IPFS hash
No response
Subgraph name or link to explorer
https://api.thegraph.com/subgraphs/name/uniswap/uniswap-v2/graphql?query=query+MyQuery+{
++tokens(where%3A+{or%3A+{name_contains_nocase%3A+"xx"%2C+symbol_contains_nocase%3A+"xx"}})+{
++++name
++++symbol
++}
}
Some information to help us out
[ ] Tick this box if this bug is caused by a regression found in the latest release.
[ ] Tick this box if this bug is specific to the hosted service.
[X] I have searched the issue tracker to make sure this issue is not a duplicate.
OS information
Windows
hi @huazhuangnan I think you need the following query:
query MyQuery {
tokens(
where: {or: [{name_contains_nocase: "xx"}, {symbol_contains_nocase: "xx"}]}
) {
name
symbol
}
}
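For completeness, here is a rough sketch of sending that corrected query from a script. The endpoint is the one from the issue (without the /graphql playground suffix); the rest assumes a fetch-capable runtime and is illustrative, not part of the original thread.

```typescript
// Sketch: POST the corrected query (note: `or` takes a LIST of filter objects).
const endpoint = 'https://api.thegraph.com/subgraphs/name/uniswap/uniswap-v2';

const query = `
  query MyQuery {
    tokens(
      where: {or: [{name_contains_nocase: "xx"}, {symbol_contains_nocase: "xx"}]}
    ) {
      name
      symbol
    }
  }
`;

async function run(): Promise<void> {
  const res = await fetch(endpoint, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ query }),
  });
  const { data, errors } = await res.json();
  console.log(data ?? errors);
}

run().catch(console.error);
```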
| gharchive/issue | 2023-07-03T05:47:39 | 2025-04-01T06:38:52.531787 | {
"authors": [
"azf20",
"huazhuangnan"
],
"repo": "graphprotocol/graph-node",
"url": "https://github.com/graphprotocol/graph-node/issues/4732",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
1433295299 | Not populated metrics for indexer service, version 0.20.4
Some metrics are missing from indexer service (not populated) such as indexer_service_queries_ok or indexer_service_queries_total
Version:
graphval@node023:~/indexer/packages/indexer-cli/bin$ graph-indexer-service --version
0.20.4
However, I'm not sure if MIP sends queries exactly to my indexer.
The issue was resolved once queries started arriving at the indexer service.
| gharchive/issue | 2022-11-02T14:56:51 | 2025-04-01T06:38:52.533873 | {
"authors": [
"oleksandrmarkelov"
],
"repo": "graphprotocol/indexer",
"url": "https://github.com/graphprotocol/indexer/issues/532",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1678977005 | Add a protocolNetwork column to Rules and Actions table
Depends on #650.
Create a migration with these steps:
Alter the tables: indexingRules and actions:
add protocol_network column, which defaults to null
Resolve every row protocol_network (There can be no nulls left behind!)
This step can be difficult because we need to consider edge cases, such as when an indexer has allocations in the same subgraph for both layers.
derive protocol_network for every rule and action (only need to do for status = queued or approved)
gather rules, active allocations (across protocol networks), and actions: matching rules to active allocations to identify the network
Alter the tables again to add a NOT NULL
Add the protocolNetwork field to Rules and Actions ORM models.
Also, add it to non-ORM types, like SubgraphDeployment.
At this point, the compiler will surface every usage of the new field, we might make some wild discoveries.
Update the indexer-cli sub-commands to require a protocol network identifier.
indexingRules
add the option to specify the protocol network (optional). If not specified, the value will resolve to the default rule
example: indexer rules set <DeploymentId> protocolNetwork homestead allocationAmount 10000
actions
update actions queue commands to require protocol network
example: indexer actions queue allocate QmYN4ofRb5CUg1WdpLhhNTVCuiiAt29hBKGjTnnxYh9zYt homestead 1000
allocations
update create command
indexer allocations create <deployment-id> <protocol-network> <amount> <index-node>
NOTE: validate that allocation ids are unique per-protocol network; if not, we may need to require protocol-network for allocations close and allocations reallocate and allocations get
Update get command
support filtering by protocol_network
disputes
Update get command
Support filtering by protocol_network
cost
OPEN QUESTION: do we continue to use just deployment_id as the primary key, or do we also update to include the network?
To state that another way: do we need cost models for each deployment to be different per network?
If we do: we’ll need to update the set, get, and delete commands also to require protocol_network arg
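As an illustration of the three-phase table change outlined in the migration steps above, here is a rough sketch assuming a Sequelize-style migration (the queryInterface API, table names, and column type are assumptions for illustration, not taken from the repository):

```typescript
import { QueryInterface, DataTypes } from 'sequelize';

// Hypothetical migration sketch: add protocol_network as nullable, backfill,
// then tighten to NOT NULL once no nulls remain.
export async function up(queryInterface: QueryInterface): Promise<void> {
  const tables = ['IndexingRules', 'Actions']; // assumed table names

  // 1. Add the column, defaulting to null, so existing rows keep loading.
  for (const table of tables) {
    await queryInterface.addColumn(table, 'protocol_network', {
      type: DataTypes.STRING,
      allowNull: true,
    });
  }

  // 2. Backfill protocol_network for every row here, e.g. by matching rules
  //    and queued/approved actions against active allocations (elided).

  // 3. Only after the backfill leaves no nulls behind, add the NOT NULL.
  for (const table of tables) {
    await queryInterface.changeColumn(table, 'protocol_network', {
      type: DataTypes.STRING,
      allowNull: false,
    });
  }
}
```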
Fixed by #668
| gharchive/issue | 2023-04-21T19:14:24 | 2025-04-01T06:38:52.544250 | {
"authors": [
"tilacog"
],
"repo": "graphprotocol/indexer",
"url": "https://github.com/graphprotocol/indexer/issues/651",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1231400326 | Use lodash-es
Instead of using individual lodash packages, using lodash-es would be a better "ES module" player and would work with browser playground environments like Google's playground-elements. Also, being an ES module package would allow any build tools/bundlers to tree-shake as needed.
Using ramda instead since it supports both es and cjs modules. Tested with playground-elements and ramda is good to go. The original motivation behind this issue was to move toward a more compatible module that could support the "es-only" world without bundling, etc, and ramda does the trick.
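To illustrate the motivation (the function and object here are made-up examples, not from the project): with an ES-module package such as ramda, a bundler can statically see exactly which functions are imported and tree-shake the rest.

```typescript
// Named ES-module import: a bundler only needs to keep `pick`.
import { pick } from 'ramda';

const user = { id: 1, name: 'Ada', passwordHash: 'secret' };

// R.pick takes the list of keys first, then the object.
const publicUser = pick(['id', 'name'], user);
console.log(publicUser); // => { id: 1, name: 'Ada' }
```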
| gharchive/issue | 2022-05-10T16:18:33 | 2025-04-01T06:38:52.558925 | {
"authors": [
"chadian"
],
"repo": "graphql-mocks/graphql-mocks",
"url": "https://github.com/graphql-mocks/graphql-mocks/issues/169",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
32948914 | migrate README docs to Sphinx/RTD
Following on from #1313, I'd like to continue migrating to Sphinx for documentation, turning next to the installation and configuration instructions in the README. The purpose is to give ourselves a more powerful documentation system than GitHub markdown files and wiki pages. Sphinx enables us to practice literate programming, pulling documentation from docstrings in our source code. I'm using Sphinx successfully on these other libraries: algorithm.py, dependency_injection.py,environment.py, filesystem_tree.py, and postgres.py. We're also moving to Sphinx for the Aspen docs.
There's a sphinx-autobuild package that looks promising in terms of streamlining the doc workflow (auto-rebuild and live-reload). There's some potential hiccups, though, around both Mac OS and VirtualBox (i.e., Vagrant). See https://github.com/GaretJax/sphinx-autobuild/issues/6. So we'll have to figure out the best workflow.
Want to back this issue? Post a bounty on it! We accept bounties via Bountysource.
Closing in light of our decision to shut down Gratipay.
Thank you all for a great run, and I'm sorry it didn't work out! 😞 💃
| gharchive/issue | 2014-05-07T01:45:12 | 2025-04-01T06:38:52.592948 | {
"authors": [
"whit537"
],
"repo": "gratipay/gratipay.com",
"url": "https://github.com/gratipay/gratipay.com/issues/2356",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
1367850536 | chore: cci update
fix circleci config
:tada: This PR is included in version 1.2.0 :tada:
The release is available on:
1.2.0
GitHub release
Your semantic-release bot :package::rocket:
:tada: This PR is included in version 1.2.0 :tada:
The release is available on:
1.2.0
GitHub release
Your semantic-release bot :package::rocket:
| gharchive/pull-request | 2022-09-09T13:43:37 | 2025-04-01T06:38:52.611575 | {
"authors": [
"graviteeio",
"jhaeyaert"
],
"repo": "gravitee-io/gravitee-policy-json-xml",
"url": "https://github.com/gravitee-io/gravitee-policy-json-xml/pull/25",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
2166223 | rustc: add manifest so we can install with cargo
Depends on #1149
Is this still part of the cargo plan? Do you still want it merged?
Yes, this is dead now.
| gharchive/issue | 2011-11-07T19:18:29 | 2025-04-01T06:38:52.660351 | {
"authors": [
"brson",
"elly"
],
"repo": "graydon/rust",
"url": "https://github.com/graydon/rust/issues/1150",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
} |
332286898 | Spaces in keys break zabbix-sender
Here is an example. I'm simulating the parsing of the input from pm2-zabbix --monitor, which produces a key-value stream which is parsed by zabbix-sender and sent. I'm fetching just one row from this stream, and sending to zabbix:
echo "xxxxx-be-preprod01 pm2.processes[xxxxx Frontend-0,cpu] 90" | zabbix_sender -vv --config /etc/zabbix/zabbix_agentd.conf -s hostname --input-file -
zabbix_sender [25699]: DEBUG: answer [{"response":"success","info":"processed: 0; failed: 1; total: 1; seconds spent: 0.000028"}]
info from server: "processed: 0; failed: 1; total: 1; seconds spent: 0.000028"
So, if there are spaces in the pm2 process name, the server fails to process the monitor part. The "discover" part is working, though, because if we enclose the key in double quotes, it goes through:
echo "xxxxxx-be-preprod01 "pm2.processes[xxxxx Frontend-0,cpu]" 90" | zabbix_sender -vv --config /etc/zabbix/zabbix_agentd.conf -s hostname --input-file -
zabbix_sender [25131]: DEBUG: answer [{"response":"success","info":"processed: 1; failed: 0; total: 1; seconds spent: 0.000083"}]
info from server: "processed: 1; failed: 0; total: 1; seconds spent: 0.000083"
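In other words, whatever emits those "<host> <key> <value>" lines needs to quote keys that contain whitespace. A hypothetical sketch of that quoting (this is not the actual fix from PR #31):

```typescript
// Hypothetical helper: wrap the item key in double quotes whenever it
// contains whitespace, so zabbix_sender splits the line correctly.
function formatSenderLine(host: string, key: string, value: string | number): string {
  const safeKey = /\s/.test(key) ? `"${key}"` : key;
  return `${host} ${safeKey} ${value}`;
}

// Produces: xxxxx-be-preprod01 "pm2.processes[xxxxx Frontend-0,cpu]" 90
console.log(formatSenderLine('xxxxx-be-preprod01', 'pm2.processes[xxxxx Frontend-0,cpu]', 90));
```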
This is going to be addressed in PR #31.
| gharchive/issue | 2018-06-14T07:44:42 | 2025-04-01T06:38:52.720614 | {
"authors": [
"joppino",
"rkaw92"
],
"repo": "greatcare/pm2-zabbix",
"url": "https://github.com/greatcare/pm2-zabbix/issues/27",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
Can't download FreeBrowser
HI.
I can't download FreeBrowser; the download keeps failing. Both https://raw.githubusercontent.com/greatfire/z/master/FreeBrowser-1.1.apk and https://raw.githubusercontent.com/greatfire/z/master/FreeBrowser-1.0.apk fail to download (my browser is already circumventing the firewall). What's going on?
Could you upload FreeBrowser-1.1.apk to a file-hosting service and post the download link so people can get it there?
I couldn't find anywhere else to report this, so I'm submitting it here. Please excuse me.
@luckypoem, this user has already followed up via email; requesting that this issue be closed.
File-hosting download link:
FreeBrowser-1.1.apk
https://mega.co.nz/#!vs0QAToA!4ZqFmlvaevtWj4NDoFZzy_cqCTtnBYU-sWHcOelFk8Q
| gharchive/issue | 2015-03-10T05:06:42 | 2025-04-01T06:38:52.723042 | {
"authors": [
"ghfdbk",
"luckypoem",
"percyalpha"
],
"repo": "greatfire/website-mirror-by-proxy",
"url": "https://github.com/greatfire/website-mirror-by-proxy/issues/21",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
126481333 | Improve test_project
See bak2 archive (requirements.txt, etc.)
Solved in commits:
commit 72be744a4c301f66c9c6c73ee6691b6ea60c491e
Author: Max Arnold arnold.maxim@gmail.com
Date: Mon Dec 28 13:13:45 2015 +0700
Drop LazyProgressBar, the patch is accepted upstream as of version 3.5.2
commit 489dc0eb6c01436798778af347ce92c0bc371011
Author: Michael Gulev greenday2.gd@gmail.com
Date: Fri Dec 25 18:39:52 2015 +0700
travis: remove obsolete vendor directory from pep8 ignore list.
test_project test.py: Remove obsolete in django 1.9 django.utils.unitest and use unittest.
commit c53eec572c5ef13dfb5298381703ea47aef2cf6e
Author: Max Arnold arnold.maxim@gmail.com
Date: Thu Dec 10 20:56:55 2015 +0700
fix docs
commit 1632113183b98f1122c23032300044deec271dd7
Author: Max Arnold arnold.maxim@gmail.com
Date: Thu Dec 10 20:56:32 2015 +0700
write proper docs/requirements.txt
commit 94394148b78b1fe5b94d197e5dc8ea21db17eef6
Author: Max Arnold arnold.maxim@gmail.com
Date: Thu Dec 10 20:56:00 2015 +0700
update test_project to look like it was created using django 1.9
commit cbd2b0cad537216a8cae96ec21e9abdbfd74e808
Author: Max Arnold arnold.maxim@gmail.com
Date: Thu Dec 10 20:20:43 2015 +0700
prettify test_project requirements.txt
commit 7b40098719832e41bc90da3804b17595ad0d7a70
Author: Michael Gulev greenday2.gd@gmail.com
Date: Thu Dec 10 11:25:46 2015 +0700
Change order of dependecy and up django version to 1.9 in requirements.txt
commit 3db5243edab4a2e42ea00c95217b2109cc10e383
Author: Michael Gulev greenday2.gd@gmail.com
Date: Thu Dec 10 11:23:30 2015 +0700
Remove pl, abbr translation and add ru translation. Make ready contrib to django 1.9
commit 7903dd202ee32d265f1932665a40a8a44cf94c0f
Author: Michael Gulev greenday2.gd@gmail.com
Date: Tue Dec 1 22:35:01 2015 +0700
Replace vendor/progressbar with progressbar2
commit 47837033cb9301e5e248871394dc0ffc33db21d8
Author: Michael Gulev greenday2.gd@gmail.com
Date: Sun Nov 22 02:23:00 2015 +0700
Add loggers to settings.py
commit 698ceb50d3c32ca30e95555e57a1b69254d6fe4f
Author: Michael Gulev greenday2.gd@gmail.com
Date: Sun Nov 22 02:21:44 2015 +0700
Remove South migrations.
commit a7032c1a6dd036d66a6fa0c6199d9e11eaf87f3b
Author: Michael Gulev greenday2.gd@gmail.com
Date: Sun Nov 22 01:18:50 2015 +0700
Upgrade requirements in test_project
commit 16e47de2cf8c3711dbc38ec6026e4c012b4e60f1
Author: Michael Gulev greenday2.gd@gmail.com
Date: Mon Nov 9 03:50:25 2015 +0700
New test_project on Django 1.9
commit d2aee99926739ee0e1605dfdcb88b4c0b29bc3b6
Author: Michael Gulev greenday2.gd@gmail.com
Date: Mon Nov 9 01:40:58 2015 +0700
Remove unnecessary files in test project. Add requirements.txt
Closing,
Closed in
https://github.com/greenday2/django-cities-light/commit/d2aee99926739ee0e1605dfdcb88b4c0b29bc3b6
Well, there was no need to paste the whole wall of text :)
| gharchive/issue | 2016-01-13T18:22:28 | 2025-04-01T06:38:52.757851 | {
"authors": [
"greenday2",
"max-arnold"
],
"repo": "greenday2/django-cities-light",
"url": "https://github.com/greenday2/django-cities-light/issues/8",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
1249670837 | question: documentation regarding API keys is a bit sparse
I added an API key from the auth portal settings page at /settings, but now what?
I passed it as a bearer token a la Authorization: Bearer API_KEY, but that does not seem to work.
The only doc page that mentions API keys is this one: https://authp.github.io/docs/authorize/basic_api_key_auth
I have added with api key auth portal myportal realm local to my policy.
Is there something obvious I'm missing here? Or am I just not understanding what purpose API keys are supposed to serve?
My goal is to make a never expiring API key that I can use to give external services access to my services behind the authorize directive.
Thanks in advance, this project is great @greenpau.
@jariz , please share your config.
it is not “authorization Bearer”. Rather, pass X-API-Token header with the value of the key you’d created
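For example, a minimal client-side sketch of that (the URL and environment variable are placeholders, and this only illustrates sending the header; it is not code from the project):

```typescript
// Send the key created in the auth portal via the X-API-Token header,
// not as an Authorization: Bearer token.
const apiKey = process.env.MY_API_KEY ?? ''; // hypothetical env var

async function callProtectedService(): Promise<string> {
  const res = await fetch('https://example.com/protected/resource', {
    headers: { 'X-API-Token': apiKey },
  });
  if (!res.ok) {
    throw new Error(`Request failed with status ${res.status}`);
  }
  return res.text();
}

callProtectedService().then(console.log).catch(console.error);
```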
@jariz , let’s keep it open. Will address it next week
@jariz , please help promote this project … if you like of course 😃
| gharchive/issue | 2022-05-26T14:46:51 | 2025-04-01T06:38:52.790641 | {
"authors": [
"greenpau",
"jariz"
],
"repo": "greenpau/caddy-security",
"url": "https://github.com/greenpau/caddy-security/issues/116",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
1024044 | ActiveAdmin::Comment doesn't pick up ActiveRecord::Base.table_name_prefix
(Rails 3.0.7, ActiveAdmin 0.2.2)
The table_name property of ActiveAdmin::Comment is set before the configuration in application.rb is read and before Activerecord::Base is able to deal out pre- and suffixes for table names. Because of the load order of gems and application configuration this patch does not work:
#comments/comment.rb
class Comment < ActiveRecord::Base
#self.table_name = "active_admin_comments"
self.table_name = ActiveRecord::Migrator.proper_table_name("active_admin_comments")
# ... more code ...
end
A simple workaround is:
#config/initializers/some_initializer.rb ,
ActiveAdmin::Comment.table_name = ActiveRecord::Migrator.proper_table_name(ActiveAdmin::Comment.table_name)
Is it possible to load ActiveAdmin::Comment more lazily? Then the first solution would work.
+1
Just as a side note, is it possible to completely disable the comments when using rails g active_admin:install ?
Because some of the solution here works fine if you don't have code already, if you're adding activeadmin after you created your rails project you will be face with namespace collision (Comment)
| gharchive/issue | 2011-06-08T13:59:18 | 2025-04-01T06:38:52.842389 | {
"authors": [
"davidmathei",
"pothibo"
],
"repo": "gregbell/active_admin",
"url": "https://github.com/gregbell/active_admin/issues/174",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
Refresh method hangs with WPF application but not with .NET Core.
Describe the bug
To Reproduce
Hi! Playing around with your library and cannot figure out why Refresh method hangs when called from WPF app and not from .NET Core app.
I have 3 projects in a soluction:
TrelloHelper.Library: A .NET Standard Library (2.0), referencing Manatee.Trello 3.2.1 and dependencies (Json and WebApi)
TrelloHelper.ConsoleApp: A .NET Core 2.0 Console, referencing TrelloHelper.Library
TrelloHelper.WinApp: A .NET Framework 4.6.1 WPF Application referencing TrelloHelper.Library
At the TrelloHelper.Library I have a simple method to connect to Trello, get a board and enumerate it's lists. Code bellow:
public void TestTrelloConnection(string output)
{
string s;
var stb = new StringBuilder();
var factory = new TrelloFactory();
var board = factory.Board(boardDEV);
board.Lists.Refresh();
s = $"Board: {board}";
stb.AppendLine(s);
Console.WriteLine(s);
foreach (var list in board.Lists)
{
s = $" {list}";
stb.AppendLine(s);
Console.WriteLine(s);
}
stb.AppendLine(s);
Console.WriteLine();
s="Connection OK!";
stb.AppendLine(s);
Console.WriteLine(s);
output = s;
}
At the TrelloHelper.ConsoleApp, I have the following code, that works perfectily:
static void Main(string[] args)
{
string s = "" ;
TrelloHelperLib.TrelloHelper tr;
tr = new TrelloHelperLib.TrelloHelper();
Console.WriteLine("Connecting to trello...");
tr.TestTrelloConnection(s);
Console.WriteLine(s);
Console.ReadLine();
}
At TrelloHelper.WinApp, I have a button that calls the following method. But it does not work, hanging at the line "board.Lists.Refresh();" inside the method TestTrelloConnection :
private void btTestarConexao_Click(object sender, RoutedEventArgs e)
{
string s = "";
tr.TestTrelloConnection(s);
MessageBox.Show(s);
}
I don't know if it is a bug or if I'm doing something wrong using async/await in WPF Application ...
Expected behavior
I expected that the behavior would be the same at the .net core console and the .net wpf app.
Desktop (please complete the following information):
OS: Windows 10 Pro
.Net Target 4.6.1, Core 2.0 e Standard 2.0
Version: Manatee.Trello 3.2.1, ManateeJson 2.3.0 and Manatee WebApi 2.0.3 (all using .net standard 2.0)
Visual Studio Community 2017 15.7.3
Additional Information.
Your project is awesome. Thanks!
Thanks for reporting this. I can see that one problem is the async/await usage, but surprised that the console app works.
I'm on my phone right now, though. I'll update later today after I can do some testing.
@rtrianiguerra does this help? Are you still experiencing issues?
Yes, @gregsdennis! You put me on the right track! I went back to the C# book to review tasks/await/async, struggled a little, but got it working!
Thanks a lot!
Now I'll begin to make something useful to publish on GitHub!
These are the revised methods:
Library
public static async Task<string> TestTrelloConnection()
{
string s;
var stb = new StringBuilder();
var factory = new TrelloFactory();
var board = factory.Board(boardDEV);
await board.Lists.Refresh();
s = $"Board: {board}";
stb.AppendLine(s);
foreach (var list in board.Lists)
{
s = $" {list}";
stb.AppendLine(s);
}
stb.AppendLine(s);
s = "Connection OK!";
stb.AppendLine(s);
return stb.ToString();
}
ConsoleApp:
class Program
{
static void Main(string[] args)
{
try
{
Test().Wait();
}
catch (Exception ex)
{
Console.WriteLine(ex.ToString());
}
Console.ReadLine();
}
static private async Task Test()
{
Console.WriteLine("Connecting Trello...");
var s = await TrelloHelperLib.TrelloHelper.TestTrelloConnection();
Console.WriteLine(s);
}
}
WPFApp :
private async void btTestarConexao_Click(object sender, RoutedEventArgs e)
{
var s = await Task.Run(() =>
TrelloHelperLib.TrelloHelper.TestTrelloConnection());
MessageBox.Show(s);
}
That's a lot better. Just remember that async void should only ever be used for event handlers. Outside of that, you should always return a task.
| gharchive/issue | 2018-07-21T15:11:15 | 2025-04-01T06:38:52.868525 | {
"authors": [
"gregsdennis",
"rtrianiguerra"
],
"repo": "gregsdennis/Manatee.Trello",
"url": "https://github.com/gregsdennis/Manatee.Trello/issues/243",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
499393635 | Day 7 - Wrong variable name in List Pagination paragraph
Close to the end of the paragraph there is this text.
We added page in the URL path and defined default value, in case when page is not defined in the URL (ex: /category/design). Variable $path is added in arguments of the method. It will be injected automatically by name in path. Also we need parameter max_jobs_on_category and getParameter methods to access it. That’s why this controller extends now Symfony\Bundle\FrameworkBundle\Controller\Controller but not Symfony\Bundle\FrameworkBundle\Controller\AbstractController.
Shouldn't it be the $page variable?
@gracoes yes, you are right. Fixed in PR #40. Thank you for reporting :+1:
| gharchive/issue | 2019-09-27T11:32:57 | 2025-04-01T06:38:52.870826 | {
"authors": [
"gracoes",
"gregurco"
],
"repo": "gregurco/jobeet-tutorial",
"url": "https://github.com/gregurco/jobeet-tutorial/issues/36",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
446636362 | Another Strange Problem
I'm not sure why, but the following line:
const defaultEntityMap = require('./default-entity-map');
doesn't work when compiling with Angular/Webpack. The implicit .json extension isn't recognized, and I'm not sure why, since it is using node.
Would it be possible to add the .json to your code base, by any chance? I'm trying to avoid having a separate copy of the code just to allow this to work.
Thanks very much for this module.
Hrm, using this project with WebPack has been working for me. Looking at my config, I have .json in my resolve.extensions:
extensions: ['.js', '.json', '.ts', '.tsx'],
Have you tried adding that?
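For context, that setting lives under resolve in the webpack configuration. A minimal sketch (the entry path is a placeholder):

```typescript
// webpack.config.ts (sketch): with '.json' in resolve.extensions,
// require('./default-entity-map') can resolve the implicit .json extension.
import type { Configuration } from 'webpack';

const config: Configuration = {
  entry: './src/index.ts', // placeholder
  resolve: {
    extensions: ['.js', '.json', '.ts', '.tsx'],
  },
};

export default config;
```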
| gharchive/issue | 2019-05-21T13:59:14 | 2025-04-01T06:38:52.872821 | {
"authors": [
"greim",
"queejie"
],
"repo": "greim/html-tokenizer",
"url": "https://github.com/greim/html-tokenizer/issues/4",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
23421702 | Published to Rubygems.
I'm very impressed with this implementation and looking forward to using it my own project. I notice that you have contributed some great code clean up to the originally published gem.
Will these changes be contributed back to a new version of the published 'secretsharing' gem or will they remain on your fork? I'm just trying to figure out the best way to get your latest code into my project.
Thanks!
Andy
FYI, version 1.0.0 of the gem has been released to rubygems. Please let me know if you see any issues.
https://rubygems.org/gems/secretsharing
:-)
| gharchive/issue | 2013-11-27T23:22:24 | 2025-04-01T06:38:52.875155 | {
"authors": [
"a-bash",
"grempe"
],
"repo": "grempe/secretsharing",
"url": "https://github.com/grempe/secretsharing/issues/1",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
} |
1463379223 | Bug fix where isnan template specifier causes compilation errors
Bug fix where isnan template specifier causes compilation errors, fixed it by removing the template specifier.
Thanks for spotting! I didn't get an error compiling on clang v12.0 on MacOS.
| gharchive/pull-request | 2022-11-24T13:49:59 | 2025-04-01T06:38:52.876314 | {
"authors": [
"grepthat",
"victorjarlow"
],
"repo": "grepthat/libOpenDRIVE",
"url": "https://github.com/grepthat/libOpenDRIVE/pull/51",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
336207469 | Give the ability to bid on EOS premium names (Name Action)
It would be fantastic to let your userbase bid on premium names. It would add a lot of value to EOS to give MORE users the opportunity to bid on names. I am sure many who would not otherwise vote will do so if they find it easy from an already installed wallet.
Tentatively slating this for the 0.6.x milestone, couple versions out still. There's been an incredible demand for account creation and account permissions, which will be the two builds between now and then.
| gharchive/issue | 2018-06-27T12:27:04 | 2025-04-01T06:38:52.893731 | {
"authors": [
"aaroncox",
"liondani"
],
"repo": "greymass/eos-voter",
"url": "https://github.com/greymass/eos-voter/issues/162",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1860174322 | Timestamp issue on ERCOT SPP data pull?
When I run the example "Retrieving data in local time", I'm still seeing the timestamp in UTC... It also looks like it isn't lining up with the ERCOT data that I am finding on the website.
Or are the data 4 hours off?
it looks like the time zone conversion isn't happening correctly and the date is staying in utc.
what version of the grid status client library are you using?
>>> import gridstatusio
>>> import pandas
>>> gridstatusio.__version__
'0.4.0'
>>> pandas.__version__
'2.0.1'
Here is what I have up and running in this environment

Joshua Rhodes
Here is what I have:
thanks! i was able to reproduce and fix the error when I downgraded to that version of pandas.
just released version 0.5.0 of gridstatusio. if you upgrade to that, everything should work. let me know if you see any other problems
Hi Max
I seem to be getting a similar timestamp conversion error with the “ercot_fuel_mix” data after updating to the latest packages? I see solar production that appears to be indexed via UTC time even when asking for local?
Thanks!
Joshua Rhodes


Hi Joshua - im not able to reproduce. can you share the code you are using the versions of pandas and gridstatus that you are using?
Right, but it looks like the solar doesn't start producing until late in the afternoon, well after the sun has come up, and keeps on producing well into the night after the sun has gone down?
| gharchive/issue | 2023-08-21T21:24:06 | 2025-04-01T06:38:52.923066 | {
"authors": [
"joshdr83",
"kmax12"
],
"repo": "gridstatus/gridstatusio",
"url": "https://github.com/gridstatus/gridstatusio/issues/16",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
} |
1917586249 | Bun support
Hi.. Could you add bun.sh support? https://bun.sh
Hi.. Could you add bun.sh support? https://bun.sh
Hi! Sure! Will do it this weekend
| gharchive/issue | 2023-09-28T13:43:34 | 2025-04-01T06:38:52.961362 | {
"authors": [
"Kleywalker",
"grigorii-zander"
],
"repo": "grigorii-zander/zsh-npm-scripts-autocomplete",
"url": "https://github.com/grigorii-zander/zsh-npm-scripts-autocomplete/issues/7",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
171728611 | Misspelling in AccordionPanel headings of the word "Third"
What does this PR do?
Where should the reviewer start?
What testing has been done on this PR?
How should this be manually tested?
Any background context you want to provide?
What are the relevant issues?
Screenshots (if appropriate)
thanks
| gharchive/pull-request | 2016-08-17T18:33:08 | 2025-04-01T06:38:52.979184 | {
"authors": [
"alansouzati",
"tracybarmore"
],
"repo": "grommet/grommet-docs",
"url": "https://github.com/grommet/grommet-docs/pull/83",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
} |
2571030758 | Makes source no longer required
This should, in theory, make v0.18 backwards compatible.
I need to check in on the test runner's permissions before I can merge this in, but the branch is here to unblock internal work
Test runners are updated, should be good to go
The fact that source is optional is very unintuitive, I think (and it's hard to determine why by reading the code). Is it possible to add a tag or annotation or something that explains it's for compatibility between version x and version y, and that in fact it will be returned in version y?
| gharchive/pull-request | 2024-10-07T17:24:49 | 2025-04-01T06:38:53.035195 | {
"authors": [
"brandon-groundlight",
"tyler-romero"
],
"repo": "groundlight/python-sdk",
"url": "https://github.com/groundlight/python-sdk/pull/260",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
47385796 | Selenium Grid Extras not restarting after N tests
Hi Dima,
I'm running the latest Grid Extras (v1.7.1) in a Windows VM, and in the configuration I set the VM to restart after 10 tests.
Despite this, the VM is not restarting. There is no tracking information in the logs, and I accumulate 100+ sessions without a restart.
It does not seem to be browser related, because it happens with IE/Chrome/Firefox.
@Ahskaniz Is this fixed for you in Selenium-Grid-Extras 1.10.0 ?
@smccarthy I just stopped using this feature for one handmade.
@Ahskaniz Ok thanks for the quick response!
Closing as we think this is working correctly. If anyone finds that this is still an issue, please open a new issue.
| gharchive/issue | 2014-10-31T11:47:33 | 2025-04-01T06:38:53.039373 | {
"authors": [
"Ahskaniz",
"smccarthy"
],
"repo": "groupon/Selenium-Grid-Extras",
"url": "https://github.com/groupon/Selenium-Grid-Extras/issues/66",
"license": "bsd-3-clause",
"license_type": "permissive",
"license_source": "bigquery"
} |
605825858 | Indian Flag Issue
Hi, Thank you for providing .svg flags for Vue. Though, I have an issue.
The flag you have deployed for India in vueflags/assets/flags/in.svg is invalid. It is not an Indian flag. I request that you please change the flag to the one I have shared below, sourced from Wikipedia, or from any other open source you prefer. Furthermore, the Indian flag has the 'Ashok Chakra' in the centre, not a star. I thank you in advance for the same.
Your flag:
Link:
https://github.com/growthbunker/vueflags/blob/master/assets/flags/in.svg
Svg:
<?xml version="1.0" encoding="iso-8859-1"?> <svg xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" version="1.1" id="Capa_1" x="0px" y="0px" viewBox="0 0 512 512" style="enable-background:new 0 0 512 512;" xml:space="preserve"> <rect y="85.337" style="fill:#F0F0F0;" width="512" height="341.326"/> <rect y="85.337" style="fill:#FF9811;" width="512" height="113.775"/> <rect y="312.888" style="fill:#6DA544;" width="512" height="113.775"/> <circle style="fill:#0052B4;" cx="256" cy="256" r="43.896"/> <circle style="fill:#F0F0F0;" cx="256" cy="256" r="27.434"/> <polygon style="fill:#0052B4;" points="256,222.146 264.464,241.341 285.319,239.073 272.927,256 285.319,272.927 264.464,270.659 256,289.854 247.536,270.659 226.681,272.927 239.073,256 226.681,239.073 247.536,241.341 "/> </svg>
Real Indian Flag:
Link:
https://en.wikipedia.org/wiki/File:Flag_of_India.svg#/media/File:Flag_of_India.svg
Svg:
<?xml version="1.0" encoding="UTF-8"?> <svg width="1350" height="900" viewBox="0 0 225 150" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink"> <rect width="225" height="150" fill="#f93"/> <rect y="50" width="225" height="100" fill="#fff"/> <rect y="100" width="225" height="50" fill="#128807"/> <g transform="translate(112.5,75)"> <circle r="20" fill="#008"/> <circle r="17.5" fill="#fff"/> <circle r="3.5" fill="#008"/> <g id="e"> <g id="f"> <g id="g"> <g id="h"> <circle transform="rotate(7.5) translate(17.5)" r=".875" fill="#008"/> <path d="m0 17.5 0.6-10.5-0.6-5-0.6 5 0.6 10.5z" fill="#008"/> </g> <use transform="rotate(15)" xlink:href="#h"/> </g> <use transform="rotate(30)" xlink:href="#g"/> </g> <use transform="rotate(60)" xlink:href="#f"/> </g> <use transform="rotate(120)" xlink:href="#e"/> <use transform="rotate(240)" xlink:href="#e"/> </g> </svg>
Thank you @Ruchirr, I have updated it with your flag. 👍
Update your package to 0.0.9.
Hi @LeCoupa
Thanks for the quick update. However, the SVG rendering does not match the other flags. This is how it looks (figure 1). Before, with the wrong flag, it was displaying just fine (figure 2).
Figure 1
Figure 2
Please use the following SVG code
`
`
Hope this helps your project @LeCoupa
Thank you so much @Ruchirr
| gharchive/issue | 2020-04-23T19:59:05 | 2025-04-01T06:38:53.055146 | {
"authors": [
"LeCoupa",
"Ruchirr"
],
"repo": "growthbunker/vueflags",
"url": "https://github.com/growthbunker/vueflags/issues/7",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2132925042 | this dependency seems to be incompatible with new version 3.x of spring boot.
The context
upgrade my app to spring boot 3.X
The bug
"grpc-spring-boot-starter" seems to not have a version compatible with spring boot 3.x as of now. And since their auto-configuration is still present in "spring.factories" file, some of the beans for these dependencies are not being autowired after migration to spring boot 3, resulting in application start up failures.
Could you please post the error message along with the version you are using?
I'm having a similar problem. I'm following the Getting Started Guide, but Spring Boot 3 seems to ignore @GrpcService. There is probably some kind of auto-detection issue, as described in the 3rd task of Implementing the Service.
And my pom.xml, downgrading spring-boot-starter-parent seemed to work for me.
i.e. from:
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>3.2.2</version>
<relativePath/> <!-- lookup parent from repository -->
</parent>
to:
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>2.5.7</version>
<relativePath/> <!-- lookup parent from repository -->
</parent>
You never said which version of grpc-spring you are using. Is it possibly not the latest?
You never said which version of grpc-spring you are using. Is it possibly not the latest?
You were right, I was using 2.15.0 and 3.0.0 is released and Beans are autowired there. What a simple mistake, thank you!
| gharchive/issue | 2024-02-13T18:33:19 | 2025-04-01T06:38:53.060439 | {
"authors": [
"ST-DDT",
"codeGuru775",
"dsyer",
"ocebenzer"
],
"repo": "grpc-ecosystem/grpc-spring",
"url": "https://github.com/grpc-ecosystem/grpc-spring/issues/1053",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
65779867 | Metadata value; null-terminated vs. sized
There's a lack of consensus on the treatment of null-bytes in metadata values as far as I can tell. There should be documentation here clarifying how the metadata should be treated.
Note that GRPC appears to Do The Right Thing™ and respect the length for the metadata value when copying here (thus allowing null-bytes in the value). That such is not written anywhere as a guarantee isn't particularly comfortable.
And then there's what to do about metadata key suffixes. The purpose of this should be documented.
The HTTP2 spec forbids null as a value in header values which is why we require that arbitrary binary sequences (including ones with nulls) are encoded as base-64 on the wire and use the '-bin' suffix on the header name.
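For example, with the Node gRPC library the convention looks roughly like this: plain strings for ASCII headers, and a Buffer under a key ending in -bin for binary values, which the library base64-encodes on the wire (a sketch; the header names below are made up):

```typescript
import { Metadata } from '@grpc/grpc-js';

const metadata = new Metadata();

// ASCII header values are plain strings (no null bytes allowed by HTTP/2).
metadata.set('request-id', 'abc-123');

// Arbitrary binary values (nulls included) go under a '-bin' key as a Buffer;
// the library handles the base64 encoding on the wire.
metadata.set('trace-context-bin', Buffer.from([0x00, 0x01, 0x02]));

// `metadata` is then passed as the metadata argument of a call.
```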
Feel free to suggest clarification to
https://github.com/grpc/grpc-common/blob/master/PROTOCOL-HTTP2.md
I don't think people should need to jump across repositories to get an idea of the semantics of our code.
#554 should address this.
| gharchive/issue | 2015-04-01T21:10:58 | 2025-04-01T06:38:53.098725 | {
"authors": [
"ctiller",
"louiscryan",
"soltanmm"
],
"repo": "grpc/grpc",
"url": "https://github.com/grpc/grpc/issues/1167",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
1035185171 | Issue with gRPC channel class missing request method
What version of gRPC and what language are you using?
What operating system (Linux, Windows,...) and version?
OSx Big Sur 11.5.2
What runtime / compiler are you using (e.g. python version or version of gcc)
Python 3.9
What did you do?
Generated a Python client using protoc-gen-grpclib_python, attempted to use the generated client:
import sys
import asyncio
sys.path.insert(0, '../clients/python')
import grpc
from proto.hot_storage_pb2 import SetRequest, GetRequest
from proto.hot_storage_grpc import HotStorageServiceStub
from google.protobuf import struct_pb2 as struct
async def main():
with grpc.insecure_channel('0.0.0.0:9000') as channel:
stub = HotStorageServiceStub(channel=channel)
await stub.Set(SetRequest(key='key', values=struct.Struct(fields={'value': struct.Value(string_value='value')})))
result = await stub.Get(GetRequest(key='key'))
print(result)
if __name__ == '__main__':
loop = asyncio.get_event_loop()
loop.run_until_complete(main())
What did you expect to see?
I expected the code snippet to call my gRPC server, and return a result.
What did you see instead?
Instead, I got the following error:
Traceback (most recent call last):
File "/Users/ewanvalentine/work/hot-storage-rollout/playground/test.py", line 24, in <module>
loop.run_until_complete(main())
File "/usr/local/Cellar/python@3.9/3.9.7/Frameworks/Python.framework/Versions/3.9/lib/python3.9/asyncio/base_events.py", line 642, in run_until_complete
return future.result()
File "/Users/ewanvalentine/work/hot-storage-rollout/playground/test.py", line 17, in main
await stub.Set(SetRequest(key='key', values=struct.Struct(fields={'value': struct.Value(string_value='value')})))
File "/Users/ewanvalentine/work/hot-storage-rollout/playground/venv/lib/python3.9/site-packages/grpclib/client.py", line 881, in __call__
async with self.open(timeout=timeout, metadata=metadata) as stream:
File "/Users/ewanvalentine/work/hot-storage-rollout/playground/venv/lib/python3.9/site-packages/grpclib/client.py", line 853, in open
return self.channel.request(self.name, self._cardinality,
AttributeError: 'Channel' object has no attribute 'request'
Anything else we should know about your project / environment?
$ pip freeze . generates:
asyncio==3.4.3
cachetools==4.2.4
certifi==2021.10.8
charset-normalizer==2.0.7
coverage==6.0.2
Cython==0.29.24
google-api-core==2.1.1
google-api-python-client==2.27.0
google-auth==2.3.0
google-auth-httplib2==0.1.0
googleapis-common-protos==1.53.0
greenlet==1.1.1
grpcio==1.41.0
grpcio-tools==1.41.0
grpclib==0.4.2
h2==4.1.0
hpack==4.0.0
httplib2==0.20.1
hyperframe==6.0.1
idna==3.3
msgpack==1.0.2
multidict==5.2.0
mypy-extensions==0.4.3
protobuf==3.12.2
pyasn1==0.4.8
pyasn1-modules==0.2.8
pynvim==0.4.3
pyparsing==2.4.7
python-engineio==3.12.1
requests==2.26.0
rsa==4.7.2
ruamel.yaml==0.16.13
ruamel.yaml.clib==0.2.6
six==1.16.0
typed-argument-parser==1.7.1
typing-extensions==3.10.0.2
typing-inspect==0.7.1
uritemplate==4.1.1
urllib3==1.26.7
File "/Users/ewanvalentine/work/hot-storage-rollout/playground/venv/lib/python3.9/site-packages/grpclib/client.py", line 853, in open
return self.channel.request(self.name, self._cardinality,
AttributeError: 'Channel' object has no attribute 'request'
The error is in grpclib, which is a separate project: https://github.com/vmagamedov/grpclib.
Oh! My apologies @drfloob, silly me!
How did you solve it, I got the same problem
I am also having same issue.. How is that fixed?
Anyone have any contributions to the above issue?
| gharchive/issue | 2021-10-25T14:08:13 | 2025-04-01T06:38:53.104848 | {
"authors": [
"EwanValentine",
"KiranHighNote",
"atulsriv",
"drfloob",
"nflux-pyang"
],
"repo": "grpc/grpc",
"url": "https://github.com/grpc/grpc/issues/27817",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
106143453 | grpc cpp 0_11_0 create stub error with msvc 2013 with preprocessor definition _USE_32BIT_TIME_T
I built my gRPC 0_11_0 with the default settings under /vsprojects and got all the C++ libs. When I used it in my project with a very simple RPC server and client, I found that NewStub was created without any error messages, but when I then called an RPC method, the program crashed. I tested again without the preprocessor definition _USE_32BIT_TIME_T in all /vsprojects, and then everything was OK.
My environment is MSVC 2013 on Windows 10.
#4315
| gharchive/issue | 2015-09-12T09:32:35 | 2025-04-01T06:38:53.106949 | {
"authors": [
"Zerqkboo",
"jtattermusch"
],
"repo": "grpc/grpc",
"url": "https://github.com/grpc/grpc/issues/3336",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
95936594 | changes to allow VS solution/project generation and grpc.mak generation for c++ tests
this isn't fully polished but it gets OK coverage (about 10 c++ tests are not building because of posix includes, and a couple other random ones)
todo
switch back to using generate_projects.py (parallel version)
clean up c++ props and split into gtest/gflags props
Decide if building .proto files in-place with a manual script is the best solution, or if there's another way
add documentation for grpc.mak building and overall buildgen layout
get versions of zlib/openssl with pdb files (debug symbols) to get rid of most compile warnings
Can one of the admins verify this patch?
This is ok to test.
not building windows tests are below. I am putting "platform" = posix into build.json for these for now
failed test | reason
---|---
async_streaming_ping_pong_test | posix (sys/time.h, sys/signal.h)
async_unary_ping_pong_test | posix (sys/time.h, sys/signal.h)
client_crash_test | unresolved external in grpc++_test_util (don't know why)
server_crash_test | unresolved external in grpc++_test_util (don't know why)
interop_client | posix (unistd.h)
interop_server | posix (unistd.h)
interop_test | posix (unistd.h)
qps_interarrival_test | posix (sys/time.h, sys/signal.h)
qps_openloop_test | posix (SIGPIPE, sys/time.h, sys/signal.h)
qps_test | posix (SIGPIPE, sys/time.h, sys/signal.h)
sync_streaming_ping_pong_test | posix (SIGPIPE, sys/time.h, sys/signal.h)
sync_unary_ping_pong_test | posix (SIGPIPE, sys/time.h, sys/signal.h)
C test failures/timeouts, over 3 runs:
failure count | fail/timeout | name
---|---|---
3 | FAILED | secure_endpoint_test
3 | FAILED | initial_settings_frame_bad_client_test
3 | TIMEOUT | chttp2_fullstack_compression_early_server_shutdown_finishes_inflight_calls_test
3 | TIMEOUT | chttp2_fullstack_disappearing_server_unsecure_test
3 | TIMEOUT | chttp2_simple_ssl_fullstack_request_with_flags_test
3 | TIMEOUT | chttp2_fullstack_compression_early_server_shutdown_finishes_inflight_calls_unsecure_test
2 | TIMEOUT | chttp2_fullstack_compression_cancel_after_invoke_unsecure_test
3 | TIMEOUT | chttp2_fullstack_graceful_server_shutdown_unsecure_test
3 | TIMEOUT | chttp2_fullstack_compression_disappearing_server_test
3 | TIMEOUT | chttp2_fullstack_early_server_shutdown_finishes_inflight_calls_test
3 | TIMEOUT | chttp2_fullstack_compression_graceful_server_shutdown_unsecure_test
2 | TIMEOUT | chttp2_fullstack_cancel_after_invoke_test
1 | TIMEOUT | chttp2_fullstack_cancel_after_invoke_unsecure_test
3 | TIMEOUT | chttp2_fullstack_compression_graceful_server_shutdown_test
3 | TIMEOUT | chttp2_fullstack_early_server_shutdown_finishes_inflight_calls_unsecure_test
3 | TIMEOUT | chttp2_fullstack_disappearing_server_test
3 | TIMEOUT | chttp2_fullstack_compression_disappearing_server_unsecure_test
3 | TIMEOUT | chttp2_fullstack_compression_cancel_after_invoke_test
3 | TIMEOUT | chttp2_simple_ssl_with_oauth2_fullstack_request_with_flags_test
3 | TIMEOUT | chttp2_fullstack_graceful_server_shutdown_test
C++ timeouts (no regular failures). I think these are to do with opening a windows socket, see #2294
thread_stress_test
mock_test
cli_call_test
async_end2end_test
end2end_test
Alright, thank you for your hard work on this :)
| gharchive/pull-request | 2015-07-19T18:08:16 | 2025-04-01T06:38:53.125741 | {
"authors": [
"grpc-jenkins",
"larsonmpdx",
"nicolasnoble"
],
"repo": "grpc/grpc",
"url": "https://github.com/grpc/grpc/pull/2506",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
109563873 | Build and run per-language containers for interop tests
-- introduce per-language docker images for interop tests
-- run_interop_tests.py now works in several steps: it first build all the language specific images, then spins up all the servers, runs all the interop tests (each in a separate container) and finally it kills the servers and cleans up the docker images it has build (it leaves the "base" images as those will rarely change).
-- added support java client/server (the scripts looks for java sources in ../grpc-java relatively to grpc repo root)
-- added support for --alow_flakes flag (fixes #3581)
CC @ejona86
| gharchive/pull-request | 2015-10-02T20:13:42 | 2025-04-01T06:38:53.128200 | {
"authors": [
"jtattermusch"
],
"repo": "grpc/grpc",
"url": "https://github.com/grpc/grpc/pull/3605",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
110066628 | Node package cleanup
This change removes several files that have become irrelevant, either because they now live in other repositories (like the contents of src/node/cli and src/node/bin) or because they were completely unused and untested (like src/node/examples/stock*). In addition, src/node/examples is now confusing in relation to the root examples directory, so it is split into more clear directories: src/node/performance for performance tests, and src/node/test/math, since the math service stuff is only used in tests.
It also removes some directories from package.json that are not essential to using the library.
LGTM, will merge once this is updated
It's updated
| gharchive/pull-request | 2015-10-06T18:07:26 | 2025-04-01T06:38:53.130193 | {
"authors": [
"murgatroid99",
"tbetbetbe"
],
"repo": "grpc/grpc",
"url": "https://github.com/grpc/grpc/pull/3672",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
208511807 | Improve Node and libuv testing and test coverage
This makes a number of related changes:
The use of the libuv iomgr implementation can now be set at compile time when building Node (instead of hardcoded into the file). That setting currently defaults to false; we should set it back to true when we are more confident in that implementation.
The Node tests now run on Node 7 by default, instead of Node 4. Portability tests have been added to run the tests on Node 4 and Node 6, and with the libuv iomgr instead of the default iomgr.
When the debug config is specified, the Node tests actually use the debug build.
Some core tests have been modified or added to run with the libuv iomgr. A portability test has been added to run the core tests with libuv.
Note: The core test lb_policies_test currently fails with libuv. This test is being disabled in parallel in #9765
This also seems to fix #9668. That test is passing in this PR, at least. The test that was failing in #9726 also passes here.
TSAN failure: #9124. The interop failure is an infrastructure failure. The Mac failure appears to be a fluke: I have been unable to reproduce it in thousands of runs.
| gharchive/pull-request | 2017-02-17T17:51:54 | 2025-04-01T06:38:53.132994 | {
"authors": [
"murgatroid99"
],
"repo": "grpc/grpc",
"url": "https://github.com/grpc/grpc/pull/9766",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
151857913 | Feature Request: Fail when number of warnings found exceeds certain threshhold
When trying to use the CSSLint plugin on an already existing project, one issue that often comes up is that certain violations are never going to be fixed; therefore, one cannot start from zero violations.
In such a situation, the option to fail the build only when the number of violations exceeds a certain count (watermark) would be extremely useful.
I doubt this will ever be implemented.
You can make a PR and discuss the implementation there.
| gharchive/issue | 2016-04-29T12:47:22 | 2025-04-01T06:38:53.159135 | {
"authors": [
"XhmikosR",
"serhiy-yevtushenko"
],
"repo": "gruntjs/grunt-contrib-csslint",
"url": "https://github.com/gruntjs/grunt-contrib-csslint/issues/72",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
59515633 | Watch Error: FSEventStreamFlushSync(): failed assertion
I am getting the following error after Running "watch" task restarts
2015-03-02 11:19 grunt[7732] (FSEvents.framework) FSEventStreamFlushSync(): failed assertion '(SInt64)last_id > 0LL'
Happens with these versions:
$ node 0.11.6 - 0.12.0
$ npm 2.5.1
Related Grunt Config:
...
watch: {
sass: {
files: [
'<%= configs.sassPaths.answ_sass + configs.minimatch %>',
'<%= configs.sassPaths.sass_lib + configs.minimatch %>'
],
options: {
nospawn: true
}
}
}
...
Get exactly the same message when watch restarts. Happens on several projects with different configurations and on two different Macs.
Tested with these npm versions:
$ node v0.12.0
$ npm 2.3.0, 2.5.0, 2.7.0, 2.7.1
This happens for me too
+1 Ditto
Same here. Can't seem to find a solution...
+1 Yeah, same here. Watch task seems to actually work, but nonetheless the message is a bit annoying. I am on Yosemite 10.10.2
Same here, whats the problem?
Same problem. Upgraded to node.js v0.12.1, and it is still there.
Same problem.
Does anyone write an bugreport to the nodejs guys?
Same here
$ node v0.12.2
$ npm 2.7.4
OSX 10.10.2 (14C1514)
grunt[22532] (FSEvents.framework) FSEventStreamFlushSync(): failed assertion '(SInt64)last_id > 0LL'
If anyone wants to try this in io.js, should be fixed there.
Still happening for me as well
node v0.12.2
npm 2.7.5
grunt-contrib-watch 0.6.1
OS X 10.10.3
This seems to happen more when it has to run an uglify or concat task. But both are at the latest version.
Same here on
node v0.12.0
npm 2.5.1
grunt-contrib-watch 0.6.1
OS X 10.9.5
Happens only if I set the option spawn: false. The tasks don't run correctly, it's not just the message for me.
Same problem - Seems to be functioning fine, but always gives this error.
Node: v0.12.2,
NPM: 2.7.5,
grunt: ^0.4.5,
grunt-contrib-concat: ^0.5.1,
grunt-contrib-imagemin: ^0.9.4,
grunt-contrib-uglify: ^0.9.1,
grunt-contrib-watch: ^0.6.1,
grunt-sass: ^0.18.1
OS X 10.10.2
(FSEvents.framework) FSEventStreamFlushSync(): failed assertion '(SInt64)last_id > 0LL'
Can confirm.
same here
Node v0.12.2
grunt-cli v0.1.13
grunt v0.4.5
grunt-contrib-watch 0.6.1
OS X 10.10.3
I'm also experiencing this issue
Node v0.12.2
Npm v2.7.5
Packages:
├── grunt@0.4.5
├── grunt-contrib-connect@0.10.1
├── grunt-contrib-copy@0.8.0
├── grunt-contrib-less@1.0.1
├── grunt-contrib-uglify@0.9.1
├── grunt-contrib-watch@0.6.1
├── grunt-exec@0.4.6
└── grunt-express-server@0.5.1
Same here.
node: v0.12.0
Removing spawn from Watch > Scripts > Options:{ } stops the error. What is spawn anyway again?
Might this potentially be why?
https://github.com/bdkjones/fseventsbug/wiki/realpath()-And-FSEvents
@digitalcraftco solution worked with v0.12.7, removing all spawns from watch: {} task (even if they're false)
@juanbrujo Thanks for the follow up. Indeed this has fixed the issue for newer versions.
Interesting. Is 0.11.6 the earliest node version this manifests in? We've had the above linked io.js issue for this for a while but haven't really been able to debug it.
@digitalcraftco It defines whether a new task should be spawned or not. See https://github.com/gruntjs/grunt-contrib-watch#optionsspawn
Removing spawn from options does avoid but not fix the issue.
+1 getting this as well still.
Interesting to note, as soon as you remove the spawn = false property the compile time jumps up dramatically, in my case from 0.3sec to 4.3sec !!
I see similar messages in a C app using libuv and uv_fs_event* I wrote as well. Linked to https://github.com/nodejs/node/issues/854 which mentions this issue.
Absolutely nothing changed in my dependencies for the last month at least and I randomly just started seeing this
@maruf89 Did you change OS X versions?
No, I'm still on Yosemite 10.10.3 - The only thing that changed is I got a hard drive cable replaced but no software/npm deps changed
Never had this problem on Yosemite, upgraded to El Capitan and now it happens...
+1 having same issue
@maruf89, same by me, i had only changed the hard drive cable.
This is still an issue with Node.js 4.4.7 and 6.3.1. I'm on OS X 10.11.6. I'm NOT using chokidar, fsevents, gaze, or any other filesystem watching library. I'm only using the built-in fs.watch().
+1
What are the side-effects to this error? It started occurring in my project after I added imagemin
What are the side-effects to this error?
Unknown, probably nothing impactful? I think the OS just retries or something.
Strange that multiple people would see it after a harddrive cable, maybe it's an odd apple hardware issue?
Am sure it has nothing to do with hardware. As Fishrock123 said, no apparent(!) sideeffects, but still annoying. So a fix definetly would be nice, since also a lot of systems seem to be affected.
I'm having this issue with node v7.3.0 on OSX 10.11.6 (El Capitan) and grunt v0.4.5, grunt-cli v1.2.0
I can't make sense of why this worked, but I was having the exact same issue, and adding more specific selectors for the files I was watching resolved the issue. Here's a sample of my old Gruntfile:
watch: {
css: {
files: '**/*.scss',
tasks: ['sass']
},
js: {
files: '**/*.js',
tasks: ['rollup']
}
}
and here's my new Gruntfile
watch: {
css: {
files: './assets/**/*.scss',
tasks: ['sass']
},
js: {
files: './assets/**/*.js',
tasks: ['rollup']
}
}
What's more--my old grunt process was using a huge chunk of my CPU, and with my more specific file selection, I use only 2-3%.
| gharchive/issue | 2015-03-02T17:45:54 | 2025-04-01T06:38:53.176907 | {
"authors": [
"AlexChesters",
"Fishrock123",
"JWGmeligMeyling",
"Pedder",
"SudoCat",
"TheBox193",
"ThomasHoadley",
"alphanull",
"awenro",
"cb1kenobi",
"crimann",
"cwklausing",
"digitalcraftco",
"ekkis",
"flekschas",
"greenchapter",
"infinityplusone",
"jlujan",
"juanbrujo",
"lancepadgett",
"maruf89",
"mikehdt",
"of6",
"ogmios2",
"ourcore",
"paulhayes",
"tannerlinsley",
"tbremer",
"tconroy",
"teejayhh",
"tuffz",
"vemec",
"vladikoff",
"yumyo"
],
"repo": "gruntjs/grunt-contrib-watch",
"url": "https://github.com/gruntjs/grunt-contrib-watch/issues/415",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
245614890 | Fix template code for commander
If people uncomments the example it should actually compile :)
Thanks a lot!
| gharchive/pull-request | 2017-07-26T06:02:56 | 2025-04-01T06:38:53.210415 | {
"authors": [
"grych",
"vic"
],
"repo": "grych/drab",
"url": "https://github.com/grych/drab/pull/29",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
993927043 | 🛑 volkstanz.st is down
In 21862af, volkstanz.st (https://volkstanz.st) was down:
HTTP code: 0
Response time: 0 ms
Resolved: volkstanz.st is back up in 94dc3d7.
| gharchive/issue | 2021-09-11T19:59:41 | 2025-04-01T06:38:53.213436 | {
"authors": [
"grzchr15"
],
"repo": "grzchr15/uptime",
"url": "https://github.com/grzchr15/uptime/issues/2014",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1051631664 | 🛑 bretterhofer.at is down
In fc8ae49, bretterhofer.at (https://bretterhofer.at) was down:
HTTP code: 0
Response time: 0 ms
Resolved: bretterhofer.at is back up in f31004b.
| gharchive/issue | 2021-11-12T06:35:52 | 2025-04-01T06:38:53.216651 | {
"authors": [
"grzchr15"
],
"repo": "grzchr15/uptime",
"url": "https://github.com/grzchr15/uptime/issues/3503",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1087978799 | 🛑 kinderundjugendtanz.at is down
In da1ca1c, kinderundjugendtanz.at (https://kinderundjugendtanz.at) was down:
HTTP code: 0
Response time: 0 ms
Resolved: kinderundjugendtanz.at is back up in 2986215.
| gharchive/issue | 2021-12-23T20:35:03 | 2025-04-01T06:38:53.219752 | {
"authors": [
"grzchr15"
],
"repo": "grzchr15/uptime",
"url": "https://github.com/grzchr15/uptime/issues/3901",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1155897535 | 🛑 Wikipedia is down
In d312e65, Wikipedia (https://en.wikipedia.org) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Wikipedia is back up in 05df3bf.
| gharchive/issue | 2022-03-01T22:41:48 | 2025-04-01T06:38:53.221977 | {
"authors": [
"grzchr15"
],
"repo": "grzchr15/uptime",
"url": "https://github.com/grzchr15/uptime/issues/4371",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1899032740 | 🛑 volkstanz.st is down
In 8666018, volkstanz.st (https://volkstanz.st) was down:
HTTP code: 0
Response time: 0 ms
Resolved: volkstanz.st is back up in 7481ad0 after 7 minutes.
| gharchive/issue | 2023-09-15T20:21:15 | 2025-04-01T06:38:53.224932 | {
"authors": [
"grzchr15"
],
"repo": "grzchr15/uptime",
"url": "https://github.com/grzchr15/uptime/issues/7973",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
204361567 | Support for threaded messages
When triggered inside a message thread, the plugin still posts to the main channel. Without having looked at the code in much detail, it would be great if the plugin could use the thread_ts/ts attributes, as discussed in [1].
Thanks,
Johan
[1] https://api.slack.com/docs/message-threading
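For illustration (this is not code from the plugin, just a sketch of the API mechanism): replying inside a thread with the Slack Web API means passing the parent message's ts as thread_ts to chat.postMessage. A minimal Python example, assuming a bot token and channel ID:

    import requests

    def post_threaded_reply(token, channel, parent_ts, text):
        # thread_ts must be the ts of the thread's parent message,
        # otherwise the reply lands in the main channel.
        resp = requests.post(
            "https://slack.com/api/chat.postMessage",
            headers={"Authorization": f"Bearer {token}"},
            json={"channel": channel, "text": text, "thread_ts": parent_ts},
        )
        return resp.json()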
Thanks for the patch!
| gharchive/issue | 2017-01-31T17:11:34 | 2025-04-01T06:38:53.242160 | {
"authors": [
"gsingers",
"johanlindquist"
],
"repo": "gsingers/slack-jira-plugin",
"url": "https://github.com/gsingers/slack-jira-plugin/issues/31",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
1835282784 | Character count
Include a character count UI component which displays the length of the user's current message. If the CHARACTER_LIMIT defence mechanism is active, then this component should also inform the user when the message length is too long.
@PuneetLoona
| gharchive/issue | 2023-08-03T15:26:40 | 2025-04-01T06:38:53.243363 | {
"authors": [
"gsproston-scottlogic"
],
"repo": "gsproston-scottlogic/prompt-injection",
"url": "https://github.com/gsproston-scottlogic/prompt-injection/issues/93",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
362874606 | Wrong variable referenced in Generating Google Slides from images Tutorial
Lines 61, 65, and 66 of imageSlides reference the presentation variable. At that point in the tutorial, the deck variable is still being used to reference the presentation, so the script throws an error if you're following the tutorial step by step.
Good catch, thanks!
| gharchive/issue | 2018-09-22T17:19:24 | 2025-04-01T06:38:53.244970 | {
"authors": [
"erickoledadevrel",
"kacrouse"
],
"repo": "gsuitedevs/apps-script-samples",
"url": "https://github.com/gsuitedevs/apps-script-samples/issues/73",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
} |
1803491409 | Error using topquadrant shacl
Hi Gabe, I'm getting an error running test cases using topquadrant shacl:
docker.errors.APIError: 500 Server Error for http+docker://localhost/v1.41/images/create?tag=1.4.2&fromImage=ghcr.io%2Ftopquadrant%2Fshacl: Internal Server Error ("Head "https://ghcr.io/v2/topquadrant/shacl/manifests/1.4.2": unauthorized")
Did you encounter this error?
I don't think I've run into this, but it might be an issue with docker configuration that we need to either document in the README or fix in a script. Have you run the build-topbraid-shacl.sh script?
That worked! I thought I had done it but it appears not. It may be a good idea to add to the readme.
Just added a note to the README!
| gharchive/issue | 2023-07-13T17:44:59 | 2025-04-01T06:38:53.246832 | {
"authors": [
"gtfierro",
"lazlop"
],
"repo": "gtfierro/shacl-issues",
"url": "https://github.com/gtfierro/shacl-issues/issues/2",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
} |
203704369 | Disable draw_value on Scale elements disables snapping to value
I'm not positive if this is a gtk bug or a gtk-rs bug, but if I create a Scale element with a couple of set values that the selector should snap to, it works fine. But then if I call set_draw_value(false), the scale no longer snaps to the appropriate values. I've included sample code:
extern crate gtk;
use gtk::prelude::*;
fn main() {
gtk::init().expect("");
let window = gtk::Window::new(gtk::WindowType::Toplevel);
window.set_default_size(400, 300);
let scale = gtk::Scale::new_with_range(gtk::Orientation::Horizontal, 1.0, 2.0, 1.0);
scale.set_draw_value(false); // Comment out this line to see "proper" behavior
scale.add_mark(1.0, gtk::PositionType::Bottom, Some("1"));
scale.add_mark(2.0, gtk::PositionType::Bottom, Some("2"));
let vbox = gtk::Box::new(gtk::Orientation::Vertical, 0);
vbox.pack_start(&scale, false, false, 0);
window.add(&vbox);
window.show_all();
gtk::main();
}
with Cargo.toml:
[dependencies.gtk]
version = "0.1"
It seems snapping is unrelated to marks and is controlled by https://developer.gnome.org/gtk3/stable/GtkScale.html#gtk-scale-set-digits (scale.set_digits(1); in your case), and it works only when the current value is shown.
This can be seen if you set max to 5.0 and add 3.5 (or 4.0) marks.
If you need to disable smooth scrolling while not showing the current value, you can use this:
scale.connect_format_value(|_scale, _value| {
    String::new()
});
Thanks, yes I see what you're saying. So it seems like this is a GTK+ bug, either in their docs for not noting this limitation or in their code. I can't imagine why the behavior changes when changing appearance, however. I'll file a bug upstream and close this one once I do.
Moving this discussion to GNOME's bug tracker: https://bugzilla.gnome.org/show_bug.cgi?id=777858
| gharchive/issue | 2017-01-27T18:20:22 | 2025-04-01T06:38:53.259711 | {
"authors": [
"EPashkin",
"Susurrus"
],
"repo": "gtk-rs/gtk",
"url": "https://github.com/gtk-rs/gtk/issues/432",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2437939168 | show currently logged in username in slideout action menu
https://github.com/filebrowser/filebrowser/issues/2692
Added in https://github.com/gtsteffaniak/filebrowser/pull/157
0.2.7 included
| gharchive/issue | 2024-07-30T14:09:15 | 2025-04-01T06:38:53.281252 | {
"authors": [
"gtsteffaniak"
],
"repo": "gtsteffaniak/filebrowser",
"url": "https://github.com/gtsteffaniak/filebrowser/issues/144",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
59465210 | SMS integration
let user enter phone number to link username with phone number?
Philipp has something where you could send an SMS with your username and your number was automatically registered. Afterwards it was possible to send messages.
I've used Nexmo in the past and it works well:
https://www.nexmo.com/
https://www.twilio.com/
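For reference, sending a text through Nexmo's REST API is a single HTTP call; a rough Python sketch (endpoint and parameter names are as I remember them from the Nexmo docs, so treat them as assumptions):

    import requests

    def send_sms(api_key, api_secret, sender, recipient, text):
        # Nexmo's legacy SMS endpoint; returns a JSON delivery report.
        resp = requests.post(
            "https://rest.nexmo.com/sms/json",
            data={
                "api_key": api_key,
                "api_secret": api_secret,
                "from": sender,
                "to": recipient,
                "text": text,
            },
        )
        return resp.json()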
twilio didn't give me a DE number, trying nexmo now
stuck with http://stackoverflow.com/q/29447706/1245190 now
Yay! First test worked!
Our number is +491771789420
works quite well now
todo: match phone number with user
continue with #9 and #11
| gharchive/issue | 2015-03-02T10:47:48 | 2025-04-01T06:38:53.290904 | {
"authors": [
"UserStefan",
"guaka",
"simison"
],
"repo": "guaka/hitchticker",
"url": "https://github.com/guaka/hitchticker/issues/2",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
552884319 | Bug Fix GmSSL client cannot communicate with TaSSL server; https://gi…
Bug Fix : GmSSL client cannot communicate with TaSSL server.
Issue: https://github.com/guanzhi/GmSSL/issues/913
Are you still working on it?
| gharchive/pull-request | 2020-01-21T13:41:45 | 2025-04-01T06:38:53.292471 | {
"authors": [
"NaveenShivanna86",
"davidkhala"
],
"repo": "guanzhi/GmSSL",
"url": "https://github.com/guanzhi/GmSSL/pull/915",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
87534008 | Update livereload.js
Fix for Chrome bug: https://github.com/livereload/livereload-extensions/issues/26#issuecomment-54250594
I can merge this if it helps others - please ask people to add +1s here to confirm.
I've made it so you can change those values in the Guardfile: https://github.com/guard/guard-livereload/pull/147
If there are any issues there, please open a new PR.
| gharchive/pull-request | 2015-06-11T23:11:12 | 2025-04-01T06:38:53.294379 | {
"authors": [
"e2",
"jamilabreu"
],
"repo": "guard/guard-livereload",
"url": "https://github.com/guard/guard-livereload/pull/136",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
175498610 | Setting up development environment with uWSGI, Nginx, and alerta sources
Hi Nick,
below you can find instructions on how to run alerta from source using uWSGI and Nginx, so you can add them to the docs.
Clone the repository
Prepare installation directory - /opt/alerta for API, /opt/alerta-web for Angular UI
Setup virtualenv: mkdir /opt/alerta/venv && virtualenv /opt/alerta/venv
install uwsgi - pip install uwsgi
install alerta - ./venv/bin/python setup.py develop - note develop instead of install
uwsgi setup file:
[uwsgi]
base = /opt/alerta
master = true
app = alerta.app
module = %(app)
home = /opt/alerta/venv
pythonpath = %(base)
http = 127.0.0.1:5005
chmod-socket=664
callable = app
logto = /opt/alerta/alerta/log/%n.log
nginx setup file:
upstream alerta_api {
    server 127.0.0.1:5005;
}

server {
    listen 8000;
    server_name localhost;
    charset utf-8;
    client_max_body_size 75M;

    location / { try_files $uri @alerta; }

    location @alerta {
        index index.html;
        root /opt/alerta-web/app;
    }

    location /api/ {
        include uwsgi_params;
        proxy_pass http://alerta_api/;
        proxy_set_header Host $host:$server_port;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
Ubuntu start script /etc/init/alerta.conf
description "alerta uWSGI start script"
start on runlevel [2345]
stop on runlevel [06]
respawn
env UWSGI=/opt/alerta/venv/bin/uwsgi
env LOGFILE=/opt/alerta/alerta/log/emperor.log
exec $UWSGI --master --emperor /opt/alerta/alerta/run/vassals --die-on-term --uid alerta --gid alerta --logto $LOGFILE
Activate start script: ln -s /etc/init/alerta.conf /etc/init.d/alerta && chmod +x /etc/init.d/alerta
Create missing directories:
mkdir -p /opt/alerta/alerta/run/vassals - for uWSGI Emperor vassals
Activate uwsgi script: ln -s <uwsgi_file_location> /opt/alerta/alerta/run/vassals
Start services - service nginx start && service alerta start
Navigate to http://hostname:8000/ - angular ui, http://hostname:8000/api - API
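Optionally, as a quick sanity check of the proxy setup (this assumes the standard Alerta API exposes GET /alerts behind the nginx /api/ location; adjust the hostname to yours):

    import requests

    # Should return JSON from the Alerta API via the nginx /api/ proxy.
    r = requests.get("http://hostname:8000/api/alerts")
    print(r.status_code)
    print(r.json())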
Regards
Jacek
ugh... editor messed up the numbers
Awesome. Thanks!
Used this as the basis for the first step-by-step tutorial on the docs website. See http://docs.alerta.io/en/latest/gettingstarted/tutorial-1-deploy-alerta.html
| gharchive/issue | 2016-09-07T13:11:15 | 2025-04-01T06:38:53.305917 | {
"authors": [
"JacksonHill",
"satterly"
],
"repo": "guardian/alerta",
"url": "https://github.com/guardian/alerta/issues/256",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
} |
2123312301 | fix: use later aws sdk version in configTools
What is the purpose of this change?
Makes the configTools project use the same version of aws-sdk as the main app.
What is the value of this change and how do we measure success?
This is an attempt to mitigate a high-severity vulnerability discovered by Dependabot, caused by configTools' reliance on aws-scala, which has been deprecated and which in turn relies on an old version of aws-java-sdk-s3. Once this is merged, if successful, we should see the issue disappear.
I have tested this locally and the dependency tree for configTools shows that the vulnerable version has been replaced by a safe version:
Before:
After:
Closing for now to see if Dependabot raises a PR
Re-opening as Dependabot didn't raise a PR to fix this.
| gharchive/pull-request | 2024-02-07T15:38:12 | 2025-04-01T06:38:53.341007 | {
"authors": [
"tjsilver"
],
"repo": "guardian/janus-app",
"url": "https://github.com/guardian/janus-app/pull/407",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
224234028 | Implement 'Between' RangeKeyCondition
Similar to BeginsWith it would be great to have a Between RangeKeyCondition.
https://github.com/guardian/scanamo/blob/9cb47068bc3f69709c5bdb1a792dd9a6d5562e01/src/main/scala/com/gu/scanamo/query/DynamoKeyCondition.scala#L26
http://docs.aws.amazon.com/amazondynamodb/latest/APIReference/API_Condition.html
https://github.com/guardian/scanamo/pull/106
| gharchive/issue | 2017-04-25T18:38:59 | 2025-04-01T06:38:53.352070 | {
"authors": [
"todor-kolev"
],
"repo": "guardian/scanamo",
"url": "https://github.com/guardian/scanamo/issues/104",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
268391523 | Restructure project to be installable as a python module
Hi,
I've created a pull request that restructures the project to be installable as a python module using setuptools. This allows pyall to be installed using "python setup.py install" and be importable from any project (use develop instead of install for development). You can also use setuptools to upload it to pypi if you want.
Feel free to reject it if it does not fit your workflow
Regards,
oysstu
This looks like a useful way forward, are the owners open for this change?
Hi Joakim,
Good to prompt me on this.
I think we should put pyall into pypi so it can be installed with pip.
I moved the sources around and made some test scripts to demonstrate and clean up. That's all done.
I also registered for pypi as a producer.
I think we can get this done in the next week or 2 if you're ok with the idea?
@pktrigg wonderful! There is already a different project using pyall as package name, so you may need to re-name this repo - not sure what you would like to call it, but I would suggest pyemall - em being Kongsberg designator for multibeams. However if you plan to/do support other types of .all-data, then I guess pyemall would restrict the project. What do you think?
Here is the other pyall:
It seems @oysstu fixed the packaging in this pull request, but it is old now. I am not sure how to best proceed, maybe accept the pull request then modify to fit the latest version, or ask if he can help package the latest version, making a pull request for you?
I am not experienced with packaging of modules, but I bet @oysstu would be happy to bring his thoughts and suggestions to the table if we ask him. As far as I can be of any help here, please let me know, and I will try to read up on it the coming weeks.
Some additional commits got automatically added to this PR because I committed them to my fork. Feel free to close this PR and copy anything you need. The code needs to be in a subdirectory with an __init__.py file, with setup.py in the root directory. I think adding a pyproject.toml with something like this is a good idea:
[build-system]
requires = [
"setuptools>=42",
]
build-backend = "setuptools.build_meta"
That ensures that modern packaging tools work
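For completeness, a minimal setup.py to pair with that layout might look like the following (the package name, version, and description are placeholders, not this project's final metadata):

    from setuptools import setup, find_packages

    setup(
        name="pyall",
        version="0.1.0",
        description="Parser for Kongsberg .all multibeam files",
        packages=find_packages(),
        python_requires=">=3.6",
    )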
| gharchive/pull-request | 2017-10-25T13:10:39 | 2025-04-01T06:38:53.378544 | {
"authors": [
"joakimsk",
"oysstu",
"pktrigg"
],
"repo": "guardiangeomatics/pyall",
"url": "https://github.com/guardiangeomatics/pyall/pull/1",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
987461155 | 🛑 Cenite.com is down
In f7e05b8, Cenite.com (https://www.cenite.com) was down:
HTTP code: 523
Response time: 2939 ms
Resolved: Cenite.com is back up in 07f1dce.
| gharchive/issue | 2021-09-03T06:57:49 | 2025-04-01T06:38:53.392452 | {
"authors": [
"gudata"
],
"repo": "gudata/uptime",
"url": "https://github.com/gudata/uptime/issues/20",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1500151776 | Study media transport monitoring and control
RTCP.
Depends on https://github.com/gugon/smu20222/issues/27.
Done!
| gharchive/issue | 2022-12-16T12:46:38 | 2025-04-01T06:38:53.401272 | {
"authors": [
"gugon"
],
"repo": "gugon/smu20222",
"url": "https://github.com/gugon/smu20222/issues/28",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
616070980 | Cannot populate docx template in node.js
Hello, I cannot create a docx from a template. I don't receive any errors, but the output file is corrupted (16 bytes). I've tried on both OSX and Linux.
This is my simple template (template.docx):
+++=name+++
+++=surname+++
And this is my node.js code:
const createReport = require ('docx-templates').default;
const fs = require('fs');
const template = fs.readFileSync('./template.docx');
const buffer = createReport({
template,
data: {
name:"foo",
surname: "foo2"
},
});
fs.writeFileSync('./report.docx', buffer);
Is this my fault?
Thank you in advance
Note that the createReport function returns a promise. In your example you are writing a promise to a file. It seems this is my fault: the example in the README is missing an await statement! I'm sorry for that. Thanks for the report!
See the fix here: https://github.com/guigrpa/docx-templates/commit/336a66b15a63437880823ced34709686366855bc
Be sure to reopen this issue if it doesn't solve the problem!
Thank you. It was the problem.
| gharchive/issue | 2020-05-11T17:57:19 | 2025-04-01T06:38:53.413477 | {
"authors": [
"glatorre",
"jjhbw"
],
"repo": "guigrpa/docx-templates",
"url": "https://github.com/guigrpa/docx-templates/issues/122",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
2139256704 | Additional force calculation
Additional force calculation.
ok
| gharchive/pull-request | 2024-02-16T19:08:55 | 2025-04-01T06:38:53.414309 | {
"authors": [
"guilhermelpc",
"guilhermelpc-cbc"
],
"repo": "guilhermelpc/desenvolvimento_dobras",
"url": "https://github.com/guilhermelpc/desenvolvimento_dobras/pull/2",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1019697148 | chore(Blog): update “2021-10-06-liste-de-mariage”
Automatically generated by Netlify CMS
🔮 Deploy Preview for lucileetguillaume canceled.
🔨 Explore the source changes: 191c04c45f37c55f2037d6eae31bd39bb96edbc9
🔍 Inspect the deploy log: https://app.netlify.com/sites/lucileetguillaume/deploys/615e9dd10da1850008d2eb37
| gharchive/pull-request | 2021-10-07T07:12:16 | 2025-04-01T06:38:53.423754 | {
"authors": [
"guillim"
],
"repo": "guillim/nuxt-starter-netlify-cms",
"url": "https://github.com/guillim/nuxt-starter-netlify-cms/pull/29",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
153972686 | ptargs iat-mode wont change even I set iat-mode=1 in json file
i set
{
...
"ptargs": "cert=AAAAAAAAAAAAAAAAAAAAAAAAAAAAA+AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA;iat-mode=1",
...
}
but when I run:
python3 ptproxy.py -s server.json
===== Server information =====
"server": "[::]:5899",
"ptname": "obfs4",
"ptargs": "cert=XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX;iat-mode=0",
==============================
2016-05-10 10:15:16 PT started successfully.
(I hide the cert string for security)
p.s.
I installed obfs4 as:
go get git.torproject.org/pluggable-transports/obfs4.git/obfs4proxy
and when I run it I found that its version is 0.0.7; could that be the problem?
"ptargs" on server mode is ignored. This "Server information" is intended to be filled in the client config.
@gumblex So does it mean that the client could specify the cert arbitrarily? But when I use a different cert, it doesn't succeed.
Also, could you give a more specific description of how to use SOCKS5? I replaced local with socks5 on the server; however, the client doesn't accept the socks5 content for local. Is there a solution?
Thanks
@targetnull No. The client must specify the same cert as printed on the server side.
The client is not aware of what inner protocol is being used. local is the listening port. You should set your applications (e.g. browsers) to connect through the SOCKS5 server at the local address.
Thanks for your reply. I'm wondering how the server generates the cert; is there a private key embedded in your script?
The obfs4 private key is stored in obfs4_state.json. If it doesn't exist, obfs4 will generate a random new key.
Thanks, I see.
| gharchive/issue | 2016-05-10T10:23:39 | 2025-04-01T06:38:53.467896 | {
"authors": [
"gumblex",
"saltydizz",
"targetnull"
],
"repo": "gumblex/ptproxy",
"url": "https://github.com/gumblex/ptproxy/issues/9",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
1687231378 | [Enhancement] Relative Line
instead of needing to invoke a position twice for a relative Line movement, there should be a dedicated method:
with BuildLine() as l:
l1 = Line((0,25),(250/2-15,25))
l2 = Line(l1@1,l1@1+(0,30/2))
l2 = RelLine(l1@1,(0,30/2)) #automatically invokes the second parameter as relative to the first
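For illustration, a user-side helper along these lines can be written today; this is only a sketch of the proposed behaviour (RelLine is hypothetical, not part of build123d's API):

    from build123d import BuildLine, Line, Vector

    def RelLine(start, offset):
        # Interpret `offset` as a displacement from `start` instead of an
        # absolute coordinate (proposed behaviour, not existing API).
        start = Vector(start)
        return Line(start, start + Vector(offset))

    with BuildLine() as l:
        l1 = Line((0, 25), (250 / 2 - 15, 25))
        l2 = RelLine(l1 @ 1, (0, 30 / 2))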
A related but separate proposal (originally in discord): BaseLineObjects could use the @1 of the immediately-preceding BaseLineObject for their start_point (or equivalent) instead of requiring the user to explicitly state this.
related: #191
@fischman Following up on your comment above, I personally do not like the proposal to automatically use @1 as there are often times when I am creating a series of segments in the clockwise direction, and then switch to counterclockwise because of what is known about the design. Here is a simple example showing a situation in which the angle/length of a line segment is not specified:
@jdegenstein any reason not to use @1 as default, but let user specify an override if needed?
I would prefer to just keep it simple and let the user provide the Vector representing the starting point which is the way most current BaseLineObject-derived classes currently work. End users can always write custom helpers that override this behavior if needed.
| gharchive/issue | 2023-04-27T17:05:17 | 2025-04-01T06:38:53.471779 | {
"authors": [
"fischman",
"jdegenstein",
"snoyer"
],
"repo": "gumyr/build123d",
"url": "https://github.com/gumyr/build123d/issues/228",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
2077608898 | merykitty gave in to Unsafe
Check List:
[x] Tests pass (./test.sh <username> shows no differences between expected and actual outputs)
[x] All formatting changes by the build are committed
[ ] Your launch script is named calculate_average_<username>.sh (make sure to match casing of your GH user name) and is executable
[x] Output matches that of calculate_average_baseline.sh
Execution time of merykittyunsafe:
Performance counter stats for 'sh calculate_average_merykittyunsafe.sh':
13497.35 msec task-clock:u # 10.750 CPUs utilized
0 context-switches:u # 0.000 /sec
0 cpu-migrations:u # 0.000 /sec
227100 page-faults:u # 16.826 K/sec
60988772290 cycles:u # 4.519 GHz
289560740 stalled-cycles-frontend:u # 0.47% frontend cycles idle
114022556 stalled-cycles-backend:u # 0.19% backend cycles idle
114074745602 instructions:u # 1.87 insn per cycle
# 0.00 stalled cycles per insn
12762459789 branches:u # 945.553 M/sec
53292216 branch-misses:u # 0.42% of all branches
1.255587212 seconds time elapsed
12.568676000 seconds user
0.870567000 seconds sys
Execution time of merykitty:
Performance counter stats for 'sh calculate_average_merykitty.sh':
15482.62 msec task-clock:u # 11.502 CPUs utilized
0 context-switches:u # 0.000 /sec
0 cpu-migrations:u # 0.000 /sec
222547 page-faults:u # 14.374 K/sec
71001812851 cycles:u # 4.586 GHz
362493890 stalled-cycles-frontend:u # 0.51% frontend cycles idle
130886807 stalled-cycles-backend:u # 0.18% backend cycles idle
143777012786 instructions:u # 2.02 insn per cycle
# 0.00 stalled cycles per insn
23320869203 branches:u # 1.506 G/sec
45042342 branch-misses:u # 0.19% of all branches
1.346125882 seconds time elapsed
14.487117000 seconds user
0.916624000 seconds sys
Not sure if this is allowed, I have consciously avoided using Unsafe, and this is my attempt to truly push the challenge to its limit.
The code consistently outpaces the solutions of Thomas and Roy in C2, but I don't think it will do so after taking AOT vs JIT into consideration, and after staring at the assembly for 3 hours I have not come up with any ideas.
@merykitty How much slower does it perform with aot?
@artsiomkorzun It is about 8 times slower with Graal. I do not have a Graal built with hsdis at hand, but looking at the code comments:
0x00007fadf73244c8: ; ImmutableOopMap {rax=Oop r10=Oop r11=Oop [16]=Oop [24]=Oop }
;*iinc {reexecute=1 rethrow=0 return_oop=0}
; - (reexecute) jdk.incubator.vector.ByteVector::ldLongOp@35 (line 368)
; - jdk.incubator.vector.ByteVector$ByteSpecies::ldLongOp@8 (line 4262)
; - jdk.incubator.vector.ByteVector::lambda$fromMemorySegment0Template$105@8 (line 3811)
; - jdk.incubator.vector.ByteVector$$Lambda/0x00007fad9c05e900::load@10
; - jdk.internal.vm.vector.VectorSupport::load@32 (line 428)
; - jdk.internal.misc.ScopedMemoryAccess::loadFromMemorySegmentScopedInternal@28 (line 361)
; - jdk.internal.misc.ScopedMemoryAccess::loadFromMemorySegment@31 (line 338)
; - jdk.incubator.vector.ByteVector::fromMemorySegment0Template@33 (line 3807)
; - jdk.incubator.vector.Byte256Vector::fromMemorySegment0@3 (line 938)
; - jdk.incubator.vector.ByteVector::fromMemorySegment@31 (line 3297)
; - dev.morling.onebrc.CalculateAverage_merykittyunsafe::iterate@37 (line 256)
I assume that jdk.internal.vm.vector.VectorSupport::load is not intrinsified properly, some more other vector intrinsics seem to express the same behaviour.
@merykitty Yes, we are aware of this issue with the Vector API. Unfortunately it still changes quite often when in incubator phase and it is difficult for us to keep up with the intrinsifications, which for this kind of low-level compiler API are essential.
It doesn't seem to be a clear improvement on the eval machine for some reason. Here are the results from 10 runs:
Benchmark 1: timeout -v 300 ./calculate_average_merykitty.sh 2>&1
Time (mean ± σ): 3.268 s ± 0.127 s [User: 21.568 s, System: 0.741 s]
Range (min … max): 2.953 s … 3.430 s 10 runs
Summary
merykitty: trimmed mean 3.286730414775, raw times 3.1829055004,2.9527902024,3.3237153764,3.3429483424,3.2571899574,3.4296059994,3.2995724264,3.2947934074000003,3.3001821494000003,3.2925361584000004
Leaderboard
| # | Result (m:s.ms) | Implementation | JDK | Submitter | Notes |
|---|-----------------|--------------------|-----|---------------|-----------|
| | 00:03.286 | [link](https://github.com/gunnarmorling/1brc/blob/main/src/main/java/dev/morling/onebrc/CalculateAverage_merykitty.java)| 21.0.1-open | [Quan Anh Mai](https://github.com/merykitty) | |
@gunnarmorling It is a separate entry merykittyunsafe instead of merykitty. I don't really want to add Unsafe to my original submission so can this go as a separate entry? Thanks.
@gunnarmorling It is a separate entry merykittyunsafe instead of merykitty.
Ah, sorry, had missed that. Oh, boy 🤯 :
Benchmark 1: timeout -v 300 ./calculate_average_merykittyunsafe.sh 2>&1
Time (mean ± σ): 2.573 s ± 0.024 s [User: 15.982 s, System: 0.749 s]
Range (min … max): 2.521 s … 2.605 s 10 runs
Summary
merykittyunsafe: trimmed mean 2.575439155045, raw times 2.57165876642,2.57405251042,2.55458195942,2.5576553414200003,2.52133204842,2.5802883164200003,2.59440042542,2.58838683042,2.60452826242,2.58248909042
Leaderboard
| # | Result (m:s.ms) | Implementation | JDK | Submitter | Notes |
|---|-----------------|--------------------|-----|---------------|-----------|
| | 00:02.575 | [link](https://github.com/gunnarmorling/1brc/blob/main/src/main/java/dev/morling/onebrc/CalculateAverage_merykittyunsafe.java)| 21.0.1-open | [merykittyunsafe](https://github.com/merykittyunsafe) | |
I don't really want to add Unsafe to my original submission so can this go as a separate entry? Thanks.
Yes, we can do that. While there should be only one entry per participant by default, I am happy to make an exception for this case.
@gunnarmorling Wow that is really impressive, I guess the test machine is much more capable at utilising the reduction in instruction count than mine. Thanks a lot for your help.
@merykitty Really cool vectorized solution!
On my machine utilizing all CPU cores the solution becomes memory-bound, but I think given that the evaluation is done only on 8 cores out of the 64, the reduction in instructions is giving the speed-up.
Strangely enough, both merykitty and merykittyunsafe are very slow on the old Hetzner CCX33. 45 and 40 seconds, respectively (on the official dataset). On the same instance, royvanrijn is at 4.7 seconds. I know the new instance doesn't have AVX-512 support, either, so I wonder what should explain this.
@mtopolnik that's why, to some extent, I personally prefer SWAR when applicable, because it works the same across archs, although it adds some register pressure. Peak performance is obviously not comparable...
@mtopolnik Are you running with C2 or with Graal? Because the latter has not caught up with the development of the Vector API yet.
Yes, @mtopolnik the numbers there seem to indicate you were running with Graal JIT with the missing intrinsifications for the incubator Vector API.
Didn't btw know about "-Djdk.incubator.vector.VECTOR_ACCESS_OOB_CHECK=0" yet. Is this planned to stay and be the new unsafe ;-) ?
@mtopolnik Are you running with C2 or with Graal? Because the latter has not caught up with the development of the Vector API yet.
Oops, I have a script that calls prepare_author.sh and then calculate_average_author.sh. But now I see there's no prepare_merykitty*.sh.
| gharchive/pull-request | 2024-01-11T21:20:54 | 2025-04-01T06:38:53.484753 | {
"authors": [
"artsiomkorzun",
"franz1981",
"gunnarmorling",
"merykitty",
"mtopolnik",
"thomaswue"
],
"repo": "gunnarmorling/1brc",
"url": "https://github.com/gunnarmorling/1brc/pull/331",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
499993449 | Default response on logic adapter
Hey, I'm just using the code from the docs for the default response:
def get_default_response(self, input_statement):
    from random import choice

    if self.default_responses:
        response = choice(self.default_responses)
    else:
        try:
            response = self.chatbot.storage.get_random()
        except StorageAdapter.EmptyDatabaseException:
            response = input_statement

    self.chatbot.logger.info(
        'No known response to the input was found. Selecting a random response.'
    )

    return response
But could I define one specific response instead of selecting a random one?
If you are looking for a specific response, please use the specific response adapter. For more information, please go through this link: https://chatterbot.readthedocs.io/en/stable/logic/index.html#specific-response-adapter
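For reference, the docs linked above configure that adapter roughly like this (the input/output strings below are just placeholders):

    from chatterbot import ChatBot

    bot = ChatBot(
        'Example Bot',
        logic_adapters=[
            {
                'import_path': 'chatterbot.logic.BestMatch'
            },
            {
                # Returns output_text whenever the input matches input_text.
                'import_path': 'chatterbot.logic.SpecificResponseAdapter',
                'input_text': 'Help me!',
                'output_text': 'Ok, here is a link: http://chatterbot.rtfd.org'
            }
        ]
    )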
| gharchive/issue | 2019-09-30T00:16:16 | 2025-04-01T06:38:53.489925 | {
"authors": [
"JDZC",
"vkosuri"
],
"repo": "gunthercox/ChatterBot",
"url": "https://github.com/gunthercox/ChatterBot/issues/1825",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
} |
653517416 | Releases and tags
Good afternoon, professor. I would like to learn how to use releases and tags on GitHub, and I saw that there is no material about this yet. Will there be new lessons?
I am also interested in how to use releases and tags. I searched to see whether there was an existing issue and found this one. Master @gustavoguanabara, could you point us to some material of yours or from a third party on this topic? Thank you very much in advance. Regards.
| gharchive/issue | 2020-07-08T18:46:34 | 2025-04-01T06:38:53.543455 | {
"authors": [
"fabio7siqueira",
"fm1randa"
],
"repo": "gustavoguanabara/git-github",
"url": "https://github.com/gustavoguanabara/git-github/issues/283",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2039764244 | Where are the slides for Lesson 04?
I looked for the slides for lesson 04 and could not find them. Could someone update this for me?
I'm new around here too!
Hi @Petersonsales, lesson 4 of Git and Github has no slides! It was a hands-on class that you can watch by [clicking here](https://www.youtube.com/live/xEKo29OWILE?si=9qQnBaMyDS3U55BA).
.
| gharchive/issue | 2023-12-13T13:53:50 | 2025-04-01T06:38:53.545398 | {
"authors": [
"BrenoReisSilva",
"Petersonsales",
"WallissonAlv",
"ceo1rock"
],
"repo": "gustavoguanabara/git-github",
"url": "https://github.com/gustavoguanabara/git-github/issues/2838",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1567895299 | Hope Kuaishou live streams can be added; the author is really skilled
As the title says.
I want to watch Douyin.
| gharchive/issue | 2023-02-02T11:48:20 | 2025-04-01T06:38:53.566209 | {
"authors": [
"A3148475824",
"Hades-ming"
],
"repo": "guyijie1211/JustLive-Android",
"url": "https://github.com/guyijie1211/JustLive-Android/issues/49",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
381756873 | https falling back to http
Q
A
Bug?
no
New Feature?
no
Version
Specific version or SHA of a commit
Actual Behavior
What is the actual behavior?
The debug option prints http
Expected Behavior
Https due to the fact that the url specified is https
Steps to Reproduce
this is returned when set to debug
TCP_NODELAY set
* Connected to www.cedulaprofesional.sep.gob.mx (201.175.41.233) port 80 (#0)
> POST /cedula/buscaCedulaJson.action HTTP/1.1
Host: www.cedulaprofesional.sep.gob.mx
User-Agent: GuzzleHttp/6.2.1 curl/7.54.0 PHP/7.1.21
Content-Type: application/x-www-form-urlencoded
Content-Length: 170
this is the code causing it
$cliente = new Client([
    'form_params' => [
        "json" => $message2
    ],
    'allow_redirects' => false
]);
$request = new Request('POST', 'https://www.cedulaprofesional.sep.gob.mx/cedula/buscaCedulaJson.action');
$response = $cliente->send($request, ["debug" => true]);
I am not able to reproduce your issue. Running on an Ubuntu machine with GuzzleHttp/6.3.3. Here's the minimalistic example code:
<?php
require '../vendor/autoload.php';
use GuzzleHttp\Client;
use GuzzleHttp\Psr7\Request;
$cliente = new Client([
'form_params' => [
"json" => []
],
'allow_redirects' => false
]);
$request = new Request('POST', 'https://www.cedulaprofesional.sep.gob.mx/cedula/buscaCedulaJson.action');
$response = $cliente->send($request, ["debug" => true]);
The result is:
* Hostname www.cedulaprofesional.sep.gob.mx was found in DNS cache
*   Trying 201.175.41.233...
* Connected to www.cedulaprofesional.sep.gob.mx (201.175.41.233) port 443 (#0)
* ALPN, offering http/1.1
* Cipher selection: ALL:!EXPORT:!EXPORT40:!EXPORT56:!aNULL:!LOW:!RC4:@STRENGTH
* successfully set certificate verify locations:
*   CAfile: /etc/ssl/certs/ca-certificates.crt
    CApath: /etc/ssl/certs
* SSL connection using TLSv1.2 / DHE-RSA-AES256-GCM-SHA384
* ALPN, server did not agree to a protocol
* Server certificate:
*  subject: C=MX; ST=Distrito Federal; L=Ciudad De Mexico; O=Secretaría de Educación Pública; CN=sep.gob.mx
*  start date: May 16 14:41:56 2017 GMT
*  expire date: May 17 15:11:52 2019 GMT
*  subjectAltName: www.cedulaprofesional.sep.gob.mx matched
*  issuer: C=CA; O=AffirmTrust; OU=See www.affirmtrust.com/repository; CN=AffirmTrust Certificate Authority - OV1
*  SSL certificate verify ok.
> POST /cedula/buscaCedulaJson.action HTTP/1.1
Host: www.cedulaprofesional.sep.gob.mx
Content-Length: 0
User-Agent: GuzzleHttp/6.3.3 curl/7.47.0 PHP/7.1.25-1+ubuntu16.04.1+deb.sury.org+1
Content-Type: application/x-www-form-urlencoded

* HTTP 1.0, assume close after body
< HTTP/1.0 200 OK
< Server: Desconocido
< X-Powered-By: Servlet/3.0 JSP/2.2 (Desconocido Java/Oracle Corporation/1.7)
< Content-Language: es-MX
< Content-Length: 0
< Date: Thu, 03 Jan 2019 15:37:50 GMT
< X-Cache: MISS from SEP
< X-Cache-Lookup: MISS from SEP:80
< Via: 1.0 SEP (squid)
* HTTP/1.0 connection set to keep alive!
< Connection: keep-alive
<
* Connection #0 to host www.cedulaprofesional.sep.gob.mx left intact
As you can see, the request is done through HTTPS (port 443). Please confirm that you have correct root certificates installed on your system/server.
| gharchive/issue | 2018-11-16T20:56:40 | 2025-04-01T06:38:53.575308 | {
"authors": [
"Dzhuneyt",
"zardilior"
],
"repo": "guzzle/guzzle",
"url": "https://github.com/guzzle/guzzle/issues/2205",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2677358718 | TB /cam fog off feature no longer works.
After updating to v6.25, I have tried using /cam fog off to eliminate the fog in-game. As soon as I hit enter, the command doesn't work. Could be a bug?
It is still not working for me. I have tried in several areas. I don't even know why it won't turn off even if I hotkey it.
Enter gloom and show a screenshot
Enter gloom and show a screenshot
Sure thing. Of course.
I entered Gloom and used Hotkey /cam fog off and /camera fog off to turn off the fog, which had no effect.
Enter gloom and show a screenshot
Okay, as it turns out, I got this message on Discord. It reads as follows: GWCA / TB++
Cam fog off and cam fog on work, but only the other way around: cam fog off turns the fog on and cam fog on turns the fog off.
@3vcloud
| gharchive/issue | 2024-11-20T22:24:49 | 2025-04-01T06:38:53.617438 | {
"authors": [
"3vcloud",
"BearsMan"
],
"repo": "gwdevhub/GWToolboxpp",
"url": "https://github.com/gwdevhub/GWToolboxpp/issues/1277",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2204307240 | 🛑 Undertale社区维基 - UTCWIKI is down
In 9fe18da, Undertale社区维基 - UTCWIKI (https://utcwiki.com) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Undertale社区维基 - UTCWIKI is back up in 0becc4f after 57 minutes.
| gharchive/issue | 2024-03-24T11:36:19 | 2025-04-01T06:38:53.651122 | {
"authors": [
"gzombiejun"
],
"repo": "gzombiejun/upptime",
"url": "https://github.com/gzombiejun/upptime/issues/169",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1662223787 | content/en/_index.md not mentioned in tutorial
Dear Sir,
I started yesterday with Hugo and its tutorial. Today, I tried your doks theme, again following the tutorial. In
https://getdoks.org/docs/tutorial/set-configuration/
we have
Open ./config/_default/params.toml
Change these settings:
## Homepage
title = "Doks"
titleSeparator = "-"
titleAddition = "Modern Documentation Theme"
description = "Doks is a Hugo theme for building secure, fast, and SEO-ready documentation websites, which you can easily update and customize."
I had the feeling that changing titleAddition or description had no effect.
After finding the string literals with grep, I finally modified content/en/_index.md and the running local webserver immediately updated the main page in Firefox.
Did I do something wrong, or may the tutorial be outdated? Maybe I did something wrong; I will repeat all the steps later today from scratch and tell you if I indeed made a mistake.
Well, from the comment
Set meta data for Search Engine Optimization (SEO) and Social Media.
I get the feeling that it is not a real issue. The content may be just metadata and never gets displayed. But in that case, it would be easier for beginners if the meta content were not identical to the shown text.
Next Problem:
JSON-LD
Change these settings:
...
schemaAuthor = "Henk Verlinde"
Well, should I really change the entries? I think the schemaAuthor is Henk Verlinde and not me?
Thanks for using Doks!
Did I do something wrong, or may the tutorial be outdated?
No, you did not! You found a bug w/ Chromium browsers (will be fixed in the soon to be released Doks 1.0). You'll just need to add --noHTTPCache to the start script in package.json, like so:
"start": "exec-bin node_modules/.bin/hugo/hugo server --gc --bind=0.0.0.0 --disableFastRender --baseURL=http://localhost --noHTTPCache",
Thank you very much for your fast reply.
[..] it would be easier for beginners if the meta content were not identical to the shown text.
Got it, I will look into auto-fill possibilities when using the CLI for setting up a new Doks project.
Well, should I really change the entries? I think the schemaAuthor is Henk Verlinde and not me?
No, the meta data should be completely yours. It's based on the Schema.org approach by Yoast SEO — for more background, see Schema - Background information
| gharchive/issue | 2023-04-11T10:45:59 | 2025-04-01T06:38:53.661848 | {
"authors": [
"StefanSalewski",
"h-enk"
],
"repo": "h-enk/doks",
"url": "https://github.com/h-enk/doks/issues/1036",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
209817996 | PDB component integration into VF
[x] litemol viewer of selected pdb file
[x] pdb link component and other one entry components
[x] prepare dataset panel, which allows to add entries from PDB using autocomplete component and refined by other component/viewers
[x] widget to customize entity-id attribute for pdb-topology-viewer (N), data.firstelementofstructure[0].entity_id, filtering polypeptide molecules - issue #32
[x] show/hide all components - encapsulate in dataset - change icon (T)
[x] show/hide some of the components, within dataset (T)
[x] distinguish between pdb and uniprot entry - show PDB UniProt Viewer (T)
[x] litemol viewer in dataset (T)
[x] howto effectively rebootstrap already existing pdb component - detach and attach -
[x] pdb prints component and other multiple entries components
Tomas:
[x] autocomplete component in Firefox
[x] autocomplete component, works in chrome, test in chromium
[ ] store submitted dataset as artifact of virtual folder - represent it as separated folder with metadata
[ ] replacing the component in more elegant way (within entity-id, pdb-ids)
[ ] jsonrender to show raw data from pdb summary and other APIs
[x] autocomplete, fix when key press enter renders button click event
Nurul:
[ ] accordion to hide all details
[ ] make component to choose assembly-id, analogy of entity-id
before submitting D5.5 report:
[x] address https://portal.west-life.eu/virtualfolder/ shows error message pops up: "Sorry, error when connecting backend web service at /metadataservice/files error:{"ResponseStatus": "ErrorCode":"UnauthorizedAccessException","Message":"Attempted to perform an unauthorized operation.","Errors":[]}} status:UnauthorizedAccessException"
[x] There is a link at the bottom of the page " Development documentation at internal-wiki.west-life.eu/w/index.php?title=WP6". This is not appropriate in a page that has been delivered.
[x] The report says that there is a demo at https://portal.west-life.eu/virtualfolder/test/index-dataset.html. When I visit this I see:[an error occurred while processing the directive]
[x] on that page I also see " PDB or related item to add:". There is no help text to suggest what I should enter. There should be a tooltip listing the possible responses, and a placeholder within the input element.
[ ] If I enter something unexpected, e.g. the gene name CFTR, then I get the unhelpful response "No hints."
[ ] I guessed that a PDB accession code is expected, and entered one. This gave a useful page but with some errors - failed PDB-REDO etc., failed pdb components should not be rendered, relates to issue #39
[x] Then when I click “Publish dataset” I get “Sorry. Dataset not submitted at undefined error:404 status:Not Found” - remove
| gharchive/issue | 2017-02-23T16:46:48 | 2025-04-01T06:38:53.674007 | {
"authors": [
"TomasKulhanek"
],
"repo": "h2020-westlife-eu/west-life-wp6",
"url": "https://github.com/h2020-westlife-eu/west-life-wp6/issues/26",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
188046350 | TF Java bindings build errors
Hi...Trying to install DeepWater for TensorFlow on Mac OSX by following the instructions here:
https://github.com/h2oai/deepwater/tree/master/tensorflow
Everything runs fine up until this step:
mvn -T 20 install --projects .,tensorflow
Following the script's recommendation, I re-ran with debugging & error flags:
mvn -e -X -T 20 install --projects .,tensorflow
The last command generated the error messages below. I looked into submitting an issue at the Javacpp-presets github page but it was not clear to me how to open a new issue there.
Thanks.
Javacpp-presets debugging messages:
Apache Maven 3.3.9 (bb52d8502b132ec0a5a3f4c09453c07478323dc5; 2015-11-10T10:41:47-06:00)
Maven home: /usr/local/Cellar/maven/3.3.9/libexec
Java version: 1.8.0_102, vendor: Oracle Corporation
Java home: /Library/Java/JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "mac os x", version: "10.11.6", arch: "x86_64", family: "mac"
[DEBUG] Created new class realm maven.api
[DEBUG] Importing foreign packages into class realm maven.api
[DEBUG] Imported: javax.enterprise.inject.* < plexus.core
[DEBUG] Imported: javax.enterprise.util.* < plexus.core
[DEBUG] Imported: javax.inject.* < plexus.core
[DEBUG] Imported: org.apache.maven.* < plexus.core
[DEBUG] Imported: org.apache.maven.artifact < plexus.core
[DEBUG] Imported: org.apache.maven.classrealm < plexus.core
[DEBUG] Imported: org.apache.maven.cli < plexus.core
[DEBUG] Imported: org.apache.maven.configuration < plexus.core
[DEBUG] Imported: org.apache.maven.exception < plexus.core
[DEBUG] Imported: org.apache.maven.execution < plexus.core
[DEBUG] Imported: org.apache.maven.execution.scope < plexus.core
[DEBUG] Imported: org.apache.maven.lifecycle < plexus.core
[DEBUG] Imported: org.apache.maven.model < plexus.core
[DEBUG] Imported: org.apache.maven.monitor < plexus.core
[DEBUG] Imported: org.apache.maven.plugin < plexus.core
[DEBUG] Imported: org.apache.maven.profiles < plexus.core
[DEBUG] Imported: org.apache.maven.project < plexus.core
[DEBUG] Imported: org.apache.maven.reporting < plexus.core
[DEBUG] Imported: org.apache.maven.repository < plexus.core
[DEBUG] Imported: org.apache.maven.rtinfo < plexus.core
[DEBUG] Imported: org.apache.maven.settings < plexus.core
[DEBUG] Imported: org.apache.maven.toolchain < plexus.core
[DEBUG] Imported: org.apache.maven.usability < plexus.core
[DEBUG] Imported: org.apache.maven.wagon.* < plexus.core
[DEBUG] Imported: org.apache.maven.wagon.authentication < plexus.core
[DEBUG] Imported: org.apache.maven.wagon.authorization < plexus.core
[DEBUG] Imported: org.apache.maven.wagon.events < plexus.core
[DEBUG] Imported: org.apache.maven.wagon.observers < plexus.core
[DEBUG] Imported: org.apache.maven.wagon.proxy < plexus.core
[DEBUG] Imported: org.apache.maven.wagon.repository < plexus.core
[DEBUG] Imported: org.apache.maven.wagon.resource < plexus.core
[DEBUG] Imported: org.codehaus.classworlds < plexus.core
[DEBUG] Imported: org.codehaus.plexus.* < plexus.core
[DEBUG] Imported: org.codehaus.plexus.classworlds < plexus.core
[DEBUG] Imported: org.codehaus.plexus.component < plexus.core
[DEBUG] Imported: org.codehaus.plexus.configuration < plexus.core
[DEBUG] Imported: org.codehaus.plexus.container < plexus.core
[DEBUG] Imported: org.codehaus.plexus.context < plexus.core
[DEBUG] Imported: org.codehaus.plexus.lifecycle < plexus.core
[DEBUG] Imported: org.codehaus.plexus.logging < plexus.core
[DEBUG] Imported: org.codehaus.plexus.personality < plexus.core
[DEBUG] Imported: org.codehaus.plexus.util.xml.Xpp3Dom < plexus.core
[DEBUG] Imported: org.codehaus.plexus.util.xml.pull.XmlPullParser < plexus.core
[DEBUG] Imported: org.codehaus.plexus.util.xml.pull.XmlPullParserException < plexus.core
[DEBUG] Imported: org.codehaus.plexus.util.xml.pull.XmlSerializer < plexus.core
[DEBUG] Imported: org.eclipse.aether.* < plexus.core
[DEBUG] Imported: org.eclipse.aether.artifact < plexus.core
[DEBUG] Imported: org.eclipse.aether.collection < plexus.core
[DEBUG] Imported: org.eclipse.aether.deployment < plexus.core
[DEBUG] Imported: org.eclipse.aether.graph < plexus.core
[DEBUG] Imported: org.eclipse.aether.impl < plexus.core
[DEBUG] Imported: org.eclipse.aether.installation < plexus.core
[DEBUG] Imported: org.eclipse.aether.internal.impl < plexus.core
[DEBUG] Imported: org.eclipse.aether.metadata < plexus.core
[DEBUG] Imported: org.eclipse.aether.repository < plexus.core
[DEBUG] Imported: org.eclipse.aether.resolution < plexus.core
[DEBUG] Imported: org.eclipse.aether.spi < plexus.core
[DEBUG] Imported: org.eclipse.aether.transfer < plexus.core
[DEBUG] Imported: org.eclipse.aether.version < plexus.core
[DEBUG] Imported: org.slf4j.* < plexus.core
[DEBUG] Imported: org.slf4j.helpers.* < plexus.core
[DEBUG] Imported: org.slf4j.spi.* < plexus.core
[DEBUG] Populating class realm maven.api
[INFO] Error stacktraces are turned on.
[DEBUG] Reading global settings from /usr/local/Cellar/maven/3.3.9/libexec/conf/settings.xml
[DEBUG] Reading user settings from /Users/mohamed.badawy/.m2/settings.xml
[DEBUG] Reading global toolchains from /usr/local/Cellar/maven/3.3.9/libexec/conf/toolchains.xml
[DEBUG] Reading user toolchains from /Users/mohamed.badawy/.m2/toolchains.xml
[DEBUG] Using local repository at /Users/mohamed.badawy/.m2/repository
[DEBUG] Using manager EnhancedLocalRepositoryManager with priority 10.0 for /Users/mohamed.badawy/.m2/repository
[INFO] Scanning for projects...
[DEBUG] Dependency collection stats: {ConflictMarker.analyzeTime=1, ConflictMarker.markTime=1, ConflictMarker.nodeCount=84, ConflictIdSorter.graphTime=0, ConflictIdSorter.topsortTime=0, ConflictIdSorter.conflictIdCount=45, ConflictIdSorter.conflictIdCycleCount=0, ConflictResolver.totalTime=5, ConflictResolver.conflictItemCount=71, DefaultDependencyCollector.collectTime=277, DefaultDependencyCollector.transformTime=11}
[DEBUG] org.sonatype.plugins:nexus-staging-maven-plugin:jar:1.6.6:
[DEBUG] org.sonatype.nexus.maven:nexus-common:jar:1.6.6:compile
[DEBUG] org.sonatype.plexus:plexus-sec-dispatcher:jar:1.4:compile
[DEBUG] org.sonatype.plexus:plexus-cipher:jar:1.7:compile
[DEBUG] com.google.guava:guava:jar:14.0.1:compile
[DEBUG] org.sonatype.nexus:nexus-client-core:jar:2.9.1-02:compile
[DEBUG] org.sonatype.nexus.plugins:nexus-restlet1x-model:jar:2.9.1-02:compile
[DEBUG] org.apache.maven:maven-model:jar:3.0.4:compile
[DEBUG] org.slf4j:slf4j-api:jar:1.7.7:compile
[DEBUG] com.google.code.findbugs:jsr305:jar:2.0.1:compile
[DEBUG] com.intellij:annotations:jar:9.0.4:compile
[DEBUG] commons-io:commons-io:jar:2.4:compile
[DEBUG] com.thoughtworks.xstream:xstream:jar:1.4.7:compile
[DEBUG] xmlpull:xmlpull:jar:1.1.3.1:compile
[DEBUG] xpp3:xpp3_min:jar:1.1.4c:compile
[DEBUG] joda-time:joda-time:jar:2.2:compile
[DEBUG] commons-lang:commons-lang:jar:2.6:compile
[DEBUG] commons-beanutils:commons-beanutils-core:jar:1.8.3:compile
[DEBUG] org.sonatype.sisu.siesta:siesta-client:jar:1.7:compile
[DEBUG] org.sonatype.sisu.siesta:siesta-common:jar:1.7:compile
[DEBUG] javax.ws.rs:jsr311-api:jar:1.1.1:compile
[DEBUG] com.sun.jersey:jersey-core:jar:1.17.1:compile
[DEBUG] javax.validation:validation-api:jar:1.1.0.Final:compile
[DEBUG] com.sun.jersey:jersey-client:jar:1.17.1:compile
[DEBUG] com.sun.jersey.contribs:jersey-apache-client4:jar:1.17.1:compile
[DEBUG] org.sonatype.sisu.siesta:siesta-jackson:jar:1.7:compile
[DEBUG] com.fasterxml.jackson.core:jackson-annotations:jar:2.3.1:compile
[DEBUG] com.fasterxml.jackson.core:jackson-core:jar:2.3.1:compile
[DEBUG] com.fasterxml.jackson.core:jackson-databind:jar:2.3.1:compile
[DEBUG] com.fasterxml.jackson.jaxrs:jackson-jaxrs-json-provider:jar:2.3.1:compile
[DEBUG] com.fasterxml.jackson.jaxrs:jackson-jaxrs-base:jar:2.3.1:compile
[DEBUG] com.fasterxml.jackson.module:jackson-module-jaxb-annotations:jar:2.3.1:compile
[DEBUG] org.apache.httpcomponents:httpclient:jar:4.3.5:compile
[DEBUG] commons-codec:commons-codec:jar:1.6:compile
[DEBUG] org.apache.httpcomponents:httpcore:jar:4.3.2:compile
[DEBUG] org.slf4j:jcl-over-slf4j:jar:1.7.7:compile
[DEBUG] javax.inject:javax.inject:jar:1:compile
[DEBUG] org.sonatype.spice.zapper:spice-zapper:jar:1.3:compile
[DEBUG] org.fusesource.hawtbuf:hawtbuf-proto:jar:1.9:compile
[DEBUG] org.fusesource.hawtbuf:hawtbuf:jar:1.9:compile
[DEBUG] org.codehaus.plexus:plexus-utils:jar:3.0.8:compile
[DEBUG] org.codehaus.plexus:plexus-interpolation:jar:1.15:compile
[DEBUG] org.sonatype.aether:aether-api:jar:1.13.1:compile
[DEBUG] ch.qos.logback:logback-core:jar:1.1.2:runtime
[DEBUG] ch.qos.logback:logback-classic:jar:1.1.2:runtime
[DEBUG] Created new class realm extension>org.sonatype.plugins:nexus-staging-maven-plugin:1.6.6
[DEBUG] Importing foreign packages into class realm extension>org.sonatype.plugins:nexus-staging-maven-plugin:1.6.6
[DEBUG] Imported: < maven.api
[DEBUG] Populating class realm extension>org.sonatype.plugins:nexus-staging-maven-plugin:1.6.6
[DEBUG] Included: org.sonatype.plugins:nexus-staging-maven-plugin:jar:1.6.6
[DEBUG] Included: org.sonatype.nexus.maven:nexus-common:jar:1.6.6
[DEBUG] Included: org.sonatype.plexus:plexus-sec-dispatcher:jar:1.4
[DEBUG] Included: org.sonatype.plexus:plexus-cipher:jar:1.7
[DEBUG] Included: com.google.guava:guava:jar:14.0.1
[DEBUG] Included: org.sonatype.nexus:nexus-client-core:jar:2.9.1-02
[DEBUG] Included: org.sonatype.nexus.plugins:nexus-restlet1x-model:jar:2.9.1-02
[DEBUG] Included: com.google.code.findbugs:jsr305:jar:2.0.1
[DEBUG] Included: com.intellij:annotations:jar:9.0.4
[DEBUG] Included: commons-io:commons-io:jar:2.4
[DEBUG] Included: com.thoughtworks.xstream:xstream:jar:1.4.7
[DEBUG] Included: xmlpull:xmlpull:jar:1.1.3.1
[DEBUG] Included: xpp3:xpp3_min:jar:1.1.4c
[DEBUG] Included: joda-time:joda-time:jar:2.2
[DEBUG] Included: commons-lang:commons-lang:jar:2.6
[DEBUG] Included: commons-beanutils:commons-beanutils-core:jar:1.8.3
[DEBUG] Included: org.sonatype.sisu.siesta:siesta-client:jar:1.7
[DEBUG] Included: org.sonatype.sisu.siesta:siesta-common:jar:1.7
[DEBUG] Included: javax.ws.rs:jsr311-api:jar:1.1.1
[DEBUG] Included: com.sun.jersey:jersey-core:jar:1.17.1
[DEBUG] Included: javax.validation:validation-api:jar:1.1.0.Final
[DEBUG] Included: com.sun.jersey:jersey-client:jar:1.17.1
[DEBUG] Included: com.sun.jersey.contribs:jersey-apache-client4:jar:1.17.1
[DEBUG] Included: org.sonatype.sisu.siesta:siesta-jackson:jar:1.7
[DEBUG] Included: com.fasterxml.jackson.core:jackson-annotations:jar:2.3.1
[DEBUG] Included: com.fasterxml.jackson.core:jackson-core:jar:2.3.1
[DEBUG] Included: com.fasterxml.jackson.core:jackson-databind:jar:2.3.1
[DEBUG] Included: com.fasterxml.jackson.jaxrs:jackson-jaxrs-json-provider:jar:2.3.1
[DEBUG] Included: com.fasterxml.jackson.jaxrs:jackson-jaxrs-base:jar:2.3.1
[DEBUG] Included: com.fasterxml.jackson.module:jackson-module-jaxb-annotations:jar:2.3.1
[DEBUG] Included: org.apache.httpcomponents:httpclient:jar:4.3.5
[DEBUG] Included: commons-codec:commons-codec:jar:1.6
[DEBUG] Included: org.apache.httpcomponents:httpcore:jar:4.3.2
[DEBUG] Included: org.slf4j:jcl-over-slf4j:jar:1.7.7
[DEBUG] Included: org.sonatype.spice.zapper:spice-zapper:jar:1.3
[DEBUG] Included: org.fusesource.hawtbuf:hawtbuf-proto:jar:1.9
[DEBUG] Included: org.fusesource.hawtbuf:hawtbuf:jar:1.9
[DEBUG] Included: org.codehaus.plexus:plexus-utils:jar:3.0.8
[DEBUG] Included: org.codehaus.plexus:plexus-interpolation:jar:1.15
[DEBUG] Included: ch.qos.logback:logback-core:jar:1.1.2
[DEBUG] Included: ch.qos.logback:logback-classic:jar:1.1.2
[DEBUG] Extension realms for project org.bytedeco:javacpp-presets:pom:1.2.5-SNAPSHOT: [ClassRealm[extension>org.sonatype.plugins:nexus-staging-maven-plugin:1.6.6, parent: sun.misc.Launcher$AppClassLoader@55f96302]]
[DEBUG] Created new class realm project>org.bytedeco:javacpp-presets:1.2.5-SNAPSHOT
[DEBUG] Populating class realm project>org.bytedeco:javacpp-presets:1.2.5-SNAPSHOT
[DEBUG] Looking up lifecyle mappings for packaging pom from ClassRealm[project>org.bytedeco:javacpp-presets:1.2.5-SNAPSHOT, parent: ClassRealm[maven.api, parent: null]]
[DEBUG] Extension realms for project org.bytedeco.javacpp-presets:opencv:jar:3.1.0-1.2.5-SNAPSHOT: [ClassRealm[extension>org.sonatype.plugins:nexus-staging-maven-plugin:1.6.6, parent: sun.misc.Launcher$AppClassLoader@55f96302]]
[DEBUG] Looking up lifecyle mappings for packaging jar from ClassRealm[project>org.bytedeco:javacpp-presets:1.2.5-SNAPSHOT, parent: ClassRealm[maven.api, parent: null]]
[DEBUG] Extension realms for project org.bytedeco.javacpp-presets:ffmpeg:jar:3.1.4-1.2.5-SNAPSHOT: [ClassRealm[extension>org.sonatype.plugins:nexus-staging-maven-plugin:1.6.6, parent: sun.misc.Launcher$AppClassLoader@55f96302]]
[DEBUG] Looking up lifecyle mappings for packaging jar from ClassRealm[project>org.bytedeco:javacpp-presets:1.2.5-SNAPSHOT, parent: ClassRealm[maven.api, parent: null]]
[DEBUG] Extension realms for project org.bytedeco.javacpp-presets:flycapture:jar:2.9.3.43-1.2.5-SNAPSHOT: [ClassRealm[extension>org.sonatype.plugins:nexus-staging-maven-plugin:1.6.6, parent: sun.misc.Launcher$AppClassLoader@55f96302]]
[DEBUG] Looking up lifecyle mappings for packaging jar from ClassRealm[project>org.bytedeco:javacpp-presets:1.2.5-SNAPSHOT, parent: ClassRealm[maven.api, parent: null]]
[DEBUG] Extension realms for project org.bytedeco.javacpp-presets:libdc1394:jar:2.2.4-1.2.5-SNAPSHOT: [ClassRealm[extension>org.sonatype.plugins:nexus-staging-maven-plugin:1.6.6, parent: sun.misc.Launcher$AppClassLoader@55f96302]]
[DEBUG] Looking up lifecyle mappings for packaging jar from ClassRealm[project>org.bytedeco:javacpp-presets:1.2.5-SNAPSHOT, parent: ClassRealm[maven.api, parent: null]]
[DEBUG] Extension realms for project org.bytedeco.javacpp-presets:libfreenect:jar:0.5.3-1.2.5-SNAPSHOT: [ClassRealm[extension>org.sonatype.plugins:nexus-staging-maven-plugin:1.6.6, parent: sun.misc.Launcher$AppClassLoader@55f96302]]
[DEBUG] Looking up lifecyle mappings for packaging jar from ClassRealm[project>org.bytedeco:javacpp-presets:1.2.5-SNAPSHOT, parent: ClassRealm[maven.api, parent: null]]
[DEBUG] Extension realms for project org.bytedeco.javacpp-presets:videoinput:jar:0.200-1.2.5-SNAPSHOT: [ClassRealm[extension>org.sonatype.plugins:nexus-staging-maven-plugin:1.6.6, parent: sun.misc.Launcher$AppClassLoader@55f96302]]
[DEBUG] Looking up lifecyle mappings for packaging jar from ClassRealm[project>org.bytedeco:javacpp-presets:1.2.5-SNAPSHOT, parent: ClassRealm[maven.api, parent: null]]
[DEBUG] Extension realms for project org.bytedeco.javacpp-presets:artoolkitplus:jar:2.3.1-1.2.5-SNAPSHOT: [ClassRealm[extension>org.sonatype.plugins:nexus-staging-maven-plugin:1.6.6, parent: sun.misc.Launcher$AppClassLoader@55f96302]]
[DEBUG] Looking up lifecyle mappings for packaging jar from ClassRealm[project>org.bytedeco:javacpp-presets:1.2.5-SNAPSHOT, parent: ClassRealm[maven.api, parent: null]]
[DEBUG] Extension realms for project org.bytedeco.javacpp-presets:chilitags:jar:master-1.2.5-SNAPSHOT: [ClassRealm[extension>org.sonatype.plugins:nexus-staging-maven-plugin:1.6.6, parent: sun.misc.Launcher$AppClassLoader@55f96302]]
[DEBUG] Looking up lifecyle mappings for packaging jar from ClassRealm[project>org.bytedeco:javacpp-presets:1.2.5-SNAPSHOT, parent: ClassRealm[maven.api, parent: null]]
[DEBUG] Extension realms for project org.bytedeco.javacpp-presets:flandmark:jar:1.07-1.2.5-SNAPSHOT: [ClassRealm[extension>org.sonatype.plugins:nexus-staging-maven-plugin:1.6.6, parent: sun.misc.Launcher$AppClassLoader@55f96302]]
[DEBUG] Looking up lifecyle mappings for packaging jar from ClassRealm[project>org.bytedeco:javacpp-presets:1.2.5-SNAPSHOT, parent: ClassRealm[maven.api, parent: null]]
[DEBUG] Extension realms for project org.bytedeco.javacpp-presets:hdf5:jar:1.10.0-patch1-1.2.5-SNAPSHOT: [ClassRealm[extension>org.sonatype.plugins:nexus-staging-maven-plugin:1.6.6, parent: sun.misc.Launcher$AppClassLoader@55f96302]]
[DEBUG] Looking up lifecyle mappings for packaging jar from ClassRealm[project>org.bytedeco:javacpp-presets:1.2.5-SNAPSHOT, parent: ClassRealm[maven.api, parent: null]]
[DEBUG] Extension realms for project org.bytedeco.javacpp-presets:openblas:jar:0.2.19-1.2.5-SNAPSHOT: [ClassRealm[extension>org.sonatype.plugins:nexus-staging-maven-plugin:1.6.6, parent: sun.misc.Launcher$AppClassLoader@55f96302]]
[DEBUG] Looking up lifecyle mappings for packaging jar from ClassRealm[project>org.bytedeco:javacpp-presets:1.2.5-SNAPSHOT, parent: ClassRealm[maven.api, parent: null]]
[DEBUG] Extension realms for project org.bytedeco.javacpp-presets:fftw:jar:3.3.5-1.2.5-SNAPSHOT: [ClassRealm[extension>org.sonatype.plugins:nexus-staging-maven-plugin:1.6.6, parent: sun.misc.Launcher$AppClassLoader@55f96302]]
[DEBUG] Looking up lifecyle mappings for packaging jar from ClassRealm[project>org.bytedeco:javacpp-presets:1.2.5-SNAPSHOT, parent: ClassRealm[maven.api, parent: null]]
[DEBUG] Extension realms for project org.bytedeco.javacpp-presets:gsl:jar:2.2.1-1.2.5-SNAPSHOT: [ClassRealm[extension>org.sonatype.plugins:nexus-staging-maven-plugin:1.6.6, parent: sun.misc.Launcher$AppClassLoader@55f96302]]
[DEBUG] Looking up lifecyle mappings for packaging jar from ClassRealm[project>org.bytedeco:javacpp-presets:1.2.5-SNAPSHOT, parent: ClassRealm[maven.api, parent: null]]
[DEBUG] Extension realms for project org.bytedeco.javacpp-presets:llvm:jar:3.9.0-1.2.5-SNAPSHOT: [ClassRealm[extension>org.sonatype.plugins:nexus-staging-maven-plugin:1.6.6, parent: sun.misc.Launcher$AppClassLoader@55f96302]]
[DEBUG] Looking up lifecyle mappings for packaging jar from ClassRealm[project>org.bytedeco:javacpp-presets:1.2.5-SNAPSHOT, parent: ClassRealm[maven.api, parent: null]]
[DEBUG] Extension realms for project org.bytedeco.javacpp-presets:leptonica:jar:1.73-1.2.5-SNAPSHOT: [ClassRealm[extension>org.sonatype.plugins:nexus-staging-maven-plugin:1.6.6, parent: sun.misc.Launcher$AppClassLoader@55f96302]]
[DEBUG] Looking up lifecyle mappings for packaging jar from ClassRealm[project>org.bytedeco:javacpp-presets:1.2.5-SNAPSHOT, parent: ClassRealm[maven.api, parent: null]]
[DEBUG] Extension realms for project org.bytedeco.javacpp-presets:tesseract:jar:3.04.01-1.2.5-SNAPSHOT: [ClassRealm[extension>org.sonatype.plugins:nexus-staging-maven-plugin:1.6.6, parent: sun.misc.Launcher$AppClassLoader@55f96302]]
[DEBUG] Looking up lifecyle mappings for packaging jar from ClassRealm[project>org.bytedeco:javacpp-presets:1.2.5-SNAPSHOT, parent: ClassRealm[maven.api, parent: null]]
[DEBUG] Extension realms for project org.bytedeco.javacpp-presets:caffe:jar:master-1.2.5-SNAPSHOT: [ClassRealm[extension>org.sonatype.plugins:nexus-staging-maven-plugin:1.6.6, parent: sun.misc.Launcher$AppClassLoader@55f96302]]
[DEBUG] Looking up lifecyle mappings for packaging jar from ClassRealm[project>org.bytedeco:javacpp-presets:1.2.5-SNAPSHOT, parent: ClassRealm[maven.api, parent: null]]
[DEBUG] Extension realms for project org.bytedeco.javacpp-presets:cuda:jar:8.0-1.2.5-SNAPSHOT: [ClassRealm[extension>org.sonatype.plugins:nexus-staging-maven-plugin:1.6.6, parent: sun.misc.Launcher$AppClassLoader@55f96302]]
[DEBUG] Looking up lifecyle mappings for packaging jar from ClassRealm[project>org.bytedeco:javacpp-presets:1.2.5-SNAPSHOT, parent: ClassRealm[maven.api, parent: null]]
[DEBUG] Extension realms for project org.bytedeco.javacpp-presets:mxnet:jar:master-1.2.5-SNAPSHOT: [ClassRealm[extension>org.sonatype.plugins:nexus-staging-maven-plugin:1.6.6, parent: sun.misc.Launcher$AppClassLoader@55f96302]]
[DEBUG] Looking up lifecyle mappings for packaging jar from ClassRealm[project>org.bytedeco:javacpp-presets:1.2.5-SNAPSHOT, parent: ClassRealm[maven.api, parent: null]]
[DEBUG] Extension realms for project org.bytedeco.javacpp-presets:tensorflow:jar:0.10.0-1.2.5-SNAPSHOT: [ClassRealm[extension>org.sonatype.plugins:nexus-staging-maven-plugin:1.6.6, parent: sun.misc.Launcher$AppClassLoader@55f96302]]
[DEBUG] Looking up lifecyle mappings for packaging jar from ClassRealm[project>org.bytedeco:javacpp-presets:1.2.5-SNAPSHOT, parent: ClassRealm[maven.api, parent: null]]
[WARNING]
[WARNING] Some problems were encountered while building the effective model for org.bytedeco.javacpp-presets:opencv:jar:3.1.0-1.2.5-SNAPSHOT
[WARNING] 'version' contains an expression but should be a constant. @ org.bytedeco.javacpp-presets:opencv:3.1.0-${project.parent.version}, /Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/opencv/pom.xml, line 14, column 12
[WARNING]
[WARNING] Some problems were encountered while building the effective model for org.bytedeco.javacpp-presets:ffmpeg:jar:3.1.4-1.2.5-SNAPSHOT
[WARNING] 'version' contains an expression but should be a constant. @ org.bytedeco.javacpp-presets:ffmpeg:3.1.4-${project.parent.version}, /Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/ffmpeg/pom.xml, line 14, column 12
[WARNING]
[WARNING] Some problems were encountered while building the effective model for org.bytedeco.javacpp-presets:flycapture:jar:2.9.3.43-1.2.5-SNAPSHOT
[WARNING] 'version' contains an expression but should be a constant. @ org.bytedeco.javacpp-presets:flycapture:2.9.3.43-${project.parent.version}, /Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/flycapture/pom.xml, line 14, column 12
[WARNING]
[WARNING] Some problems were encountered while building the effective model for org.bytedeco.javacpp-presets:libdc1394:jar:2.2.4-1.2.5-SNAPSHOT
[WARNING] 'version' contains an expression but should be a constant. @ org.bytedeco.javacpp-presets:libdc1394:2.2.4-${project.parent.version}, /Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/libdc1394/pom.xml, line 14, column 12
[WARNING]
[WARNING] Some problems were encountered while building the effective model for org.bytedeco.javacpp-presets:libfreenect:jar:0.5.3-1.2.5-SNAPSHOT
[WARNING] 'version' contains an expression but should be a constant. @ org.bytedeco.javacpp-presets:libfreenect:0.5.3-${project.parent.version}, /Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/libfreenect/pom.xml, line 14, column 12
[WARNING]
[WARNING] Some problems were encountered while building the effective model for org.bytedeco.javacpp-presets:videoinput:jar:0.200-1.2.5-SNAPSHOT
[WARNING] 'version' contains an expression but should be a constant. @ org.bytedeco.javacpp-presets:videoinput:0.200-${project.parent.version}, /Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/videoinput/pom.xml, line 14, column 12
[WARNING]
[WARNING] Some problems were encountered while building the effective model for org.bytedeco.javacpp-presets:artoolkitplus:jar:2.3.1-1.2.5-SNAPSHOT
[WARNING] 'version' contains an expression but should be a constant. @ org.bytedeco.javacpp-presets:artoolkitplus:2.3.1-${project.parent.version}, /Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/artoolkitplus/pom.xml, line 14, column 12
[WARNING]
[WARNING] Some problems were encountered while building the effective model for org.bytedeco.javacpp-presets:chilitags:jar:master-1.2.5-SNAPSHOT
[WARNING] 'version' contains an expression but should be a constant. @ org.bytedeco.javacpp-presets:chilitags:master-${project.parent.version}, /Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/chilitags/pom.xml, line 14, column 12
[WARNING]
[WARNING] Some problems were encountered while building the effective model for org.bytedeco.javacpp-presets:flandmark:jar:1.07-1.2.5-SNAPSHOT
[WARNING] 'version' contains an expression but should be a constant. @ org.bytedeco.javacpp-presets:flandmark:1.07-${project.parent.version}, /Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/flandmark/pom.xml, line 14, column 12
[WARNING]
[WARNING] Some problems were encountered while building the effective model for org.bytedeco.javacpp-presets:hdf5:jar:1.10.0-patch1-1.2.5-SNAPSHOT
[WARNING] 'version' contains an expression but should be a constant. @ org.bytedeco.javacpp-presets:hdf5:1.10.0-patch1-${project.parent.version}, /Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/hdf5/pom.xml, line 14, column 12
[WARNING]
[WARNING] Some problems were encountered while building the effective model for org.bytedeco.javacpp-presets:openblas:jar:0.2.19-1.2.5-SNAPSHOT
[WARNING] 'version' contains an expression but should be a constant. @ org.bytedeco.javacpp-presets:openblas:0.2.19-${project.parent.version}, /Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/openblas/pom.xml, line 14, column 12
[WARNING]
[WARNING] Some problems were encountered while building the effective model for org.bytedeco.javacpp-presets:fftw:jar:3.3.5-1.2.5-SNAPSHOT
[WARNING] 'version' contains an expression but should be a constant. @ org.bytedeco.javacpp-presets:fftw:3.3.5-${project.parent.version}, /Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/fftw/pom.xml, line 14, column 12
[WARNING]
[WARNING] Some problems were encountered while building the effective model for org.bytedeco.javacpp-presets:gsl:jar:2.2.1-1.2.5-SNAPSHOT
[WARNING] 'version' contains an expression but should be a constant. @ org.bytedeco.javacpp-presets:gsl:2.2.1-${project.parent.version}, /Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/gsl/pom.xml, line 14, column 12
[WARNING]
[WARNING] Some problems were encountered while building the effective model for org.bytedeco.javacpp-presets:llvm:jar:3.9.0-1.2.5-SNAPSHOT
[WARNING] 'version' contains an expression but should be a constant. @ org.bytedeco.javacpp-presets:llvm:3.9.0-${project.parent.version}, /Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/llvm/pom.xml, line 14, column 12
[WARNING]
[WARNING] Some problems were encountered while building the effective model for org.bytedeco.javacpp-presets:leptonica:jar:1.73-1.2.5-SNAPSHOT
[WARNING] 'version' contains an expression but should be a constant. @ org.bytedeco.javacpp-presets:leptonica:1.73-${project.parent.version}, /Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/leptonica/pom.xml, line 14, column 12
[WARNING]
[WARNING] Some problems were encountered while building the effective model for org.bytedeco.javacpp-presets:tesseract:jar:3.04.01-1.2.5-SNAPSHOT
[WARNING] 'version' contains an expression but should be a constant. @ org.bytedeco.javacpp-presets:tesseract:3.04.01-${project.parent.version}, /Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/tesseract/pom.xml, line 14, column 12
[WARNING]
[WARNING] Some problems were encountered while building the effective model for org.bytedeco.javacpp-presets:caffe:jar:master-1.2.5-SNAPSHOT
[WARNING] 'version' contains an expression but should be a constant. @ org.bytedeco.javacpp-presets:caffe:master-${project.parent.version}, /Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/caffe/pom.xml, line 14, column 12
[WARNING]
[WARNING] Some problems were encountered while building the effective model for org.bytedeco.javacpp-presets:cuda:jar:8.0-1.2.5-SNAPSHOT
[WARNING] 'version' contains an expression but should be a constant. @ org.bytedeco.javacpp-presets:cuda:8.0-${project.parent.version}, /Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/cuda/pom.xml, line 14, column 12
[WARNING]
[WARNING] Some problems were encountered while building the effective model for org.bytedeco.javacpp-presets:mxnet:jar:master-1.2.5-SNAPSHOT
[WARNING] 'version' contains an expression but should be a constant. @ org.bytedeco.javacpp-presets:mxnet:master-${project.parent.version}, /Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/mxnet/pom.xml, line 14, column 12
[WARNING]
[WARNING] Some problems were encountered while building the effective model for org.bytedeco.javacpp-presets:tensorflow:jar:0.10.0-1.2.5-SNAPSHOT
[WARNING] 'version' contains an expression but should be a constant. @ org.bytedeco.javacpp-presets:tensorflow:${tensorflow.version}-${project.parent.version}, /Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/tensorflow/pom.xml, line 14, column 12
[WARNING]
[WARNING] It is highly recommended to fix these problems because they threaten the stability of your build.
[WARNING]
[WARNING] For this reason, future Maven versions might no longer support building such malformed projects.
[WARNING]
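These warnings come from the child modules declaring their version with a property expression instead of a literal: opencv, for example, is reported as 3.1.0-${project.parent.version} at line 14 of its pom.xml. A minimal sketch of the pattern that triggers the message, assuming the standard POM layout (not copied from the actual file):

    <parent>
      <groupId>org.bytedeco</groupId>
      <artifactId>javacpp-presets</artifactId>
      <version>1.2.5-SNAPSHOT</version>
    </parent>
    <groupId>org.bytedeco.javacpp-presets</groupId>
    <artifactId>opencv</artifactId>
    <!-- expression-based version: resolves to 3.1.0-1.2.5-SNAPSHOT in this build, but Maven
         warns that 'version' should be a constant and may stop accepting it in future versions -->
    <version>3.1.0-${project.parent.version}</version>

The build still proceeds; replacing the expression with the literal 3.1.0-1.2.5-SNAPSHOT (or resolving it at release time) would silence the warning, at the cost of having to bump the version in every module instead of only in the parent.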
[INFO] Inspecting build with total of 2 modules...
[INFO] Not installing Nexus Staging features:
[INFO] * Preexisting staging related goal bindings found in 2 modules.
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO]
[INFO] JavaCPP Presets
[INFO] JavaCPP Presets for TensorFlow
[DEBUG] === REACTOR BUILD PLAN ================================================
[DEBUG] Project: org.bytedeco:javacpp-presets:pom:1.2.5-SNAPSHOT
[DEBUG] Tasks: [install]
[DEBUG] Style: Regular
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Project: org.bytedeco.javacpp-presets:tensorflow:jar:0.10.0-1.2.5-SNAPSHOT
[DEBUG] Tasks: [install]
[DEBUG] Style: Regular
[DEBUG] =======================================================================
[INFO]
[INFO] Using the MultiThreadedBuilder implementation with a thread count of 20
[DEBUG] Scheduling: MavenProject: org.bytedeco:javacpp-presets:1.2.5-SNAPSHOT @ /Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/pom.xml
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building JavaCPP Presets 1.2.5-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[DEBUG] Lifecycle default -> [validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy]
[DEBUG] Lifecycle clean -> [pre-clean, clean, post-clean]
[DEBUG] Lifecycle site -> [pre-site, site, post-site, site-deploy]
[DEBUG] Lifecycle default -> [validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy]
[DEBUG] Lifecycle clean -> [pre-clean, clean, post-clean]
[DEBUG] Lifecycle site -> [pre-site, site, post-site, site-deploy]
[DEBUG] Lifecycle default -> [validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy]
[DEBUG] Lifecycle clean -> [pre-clean, clean, post-clean]
[DEBUG] Lifecycle site -> [pre-site, site, post-site, site-deploy]
[DEBUG] === PROJECT BUILD PLAN ================================================
[DEBUG] Project: org.bytedeco:javacpp-presets:1.2.5-SNAPSHOT
[DEBUG] Dependencies (collect): []
[DEBUG] Dependencies (resolve): [compile]
[DEBUG] Repositories (dependencies): [central (https://repo.maven.apache.org/maven2, default, releases)]
[DEBUG] Repositories (plugins) : [central (https://repo.maven.apache.org/maven2, default, releases)]
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal: org.apache.maven.plugins:maven-javadoc-plugin:2.10.3:jar (attach-javadocs)
[DEBUG] Style: Regular
[DEBUG] Configuration:
${additionalJOption}
-Xdoclint:none
${aggregate}
${maven.javadoc.applyJavadocSecurityFix}
${attach}
${author}
${bootclasspath}
${bootclasspathArtifacts}
${bottom}
${breakiterator}
${charset}
${maven.javadoc.classifier}
${debug}
${destDir}
${detectJavaApiLink}
${detectLinks}
${detectOfflineLinks}
${docencoding}
${docfilessubdirs}
${doclet}
${docletArtifact}
${docletArtifacts}
${docletPath}
${doctitle}
${encoding}
${excludePackageNames}
${excludedocfilessubdir}
${extdirs}
${maven.javadoc.failOnError}
${project.build.finalName}
${footer}
${groups}
${header}
${helpfile}
${project.build.directory}
${javaApiLinks}
${javadocExecutable}
${javadocVersion}
${keywords}
http://bytedeco.org/javacpp/apidocs/
${links}
${linksource}
${localRepository}
${locale}
${maxmemory}
${minmemory}
${nocomment}
${nodeprecated}
${nodeprecatedlist}
${nohelp}
${noindex}
${nonavbar}
${nooverview}
${noqualifier}
${nosince}
${notimestamp}
${notree}
${offlineLinks}
${old}
${destDir}
${overview}
${packagesheader}
${proxyHost}
${proxyPort}
${quiet}
${reactorProjects}
${project.remoteArtifactRepositories}
${resourcesArtifacts}
${serialwarn}
${show}
${maven.javadoc.skip}
${source}
${sourcepath}
${sourcetab}
${splitindex}
${stylesheet}
${stylesheetfile}
${subpackages}
${taglet}
${tagletArtifact}
${tagletArtifacts}
${tagletpath}
${taglets}
${tags}
${top}
${use}
${useStandardDocletOptions}
${validateLinks}
${verbose}
${version}
${windowtitle}
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal: org.apache.maven.plugins:maven-install-plugin:2.5.2:install (default-install)
[DEBUG] Style: Regular
[DEBUG] Configuration:
true
${installAtEnd}
${localRepository}
${maven.install.skip}
${updateReleaseInfo}
[DEBUG] =======================================================================
[DEBUG] Dependency collection stats: {ConflictMarker.analyzeTime=0, ConflictMarker.markTime=0, ConflictMarker.nodeCount=1, ConflictIdSorter.graphTime=0, ConflictIdSorter.topsortTime=0, ConflictIdSorter.conflictIdCount=0, ConflictIdSorter.conflictIdCycleCount=0, ConflictResolver.totalTime=0, ConflictResolver.conflictItemCount=0, DefaultDependencyCollector.collectTime=0, DefaultDependencyCollector.transformTime=0}
[DEBUG] org.bytedeco:javacpp-presets:pom:1.2.5-SNAPSHOT
[INFO]
[INFO] --- maven-javadoc-plugin:2.10.3:jar (attach-javadocs) @ javacpp-presets ---
[DEBUG] Dependency collection stats: {ConflictMarker.analyzeTime=1, ConflictMarker.markTime=1, ConflictMarker.nodeCount=301, ConflictIdSorter.graphTime=0, ConflictIdSorter.topsortTime=0, ConflictIdSorter.conflictIdCount=75, ConflictIdSorter.conflictIdCycleCount=0, ConflictResolver.totalTime=4, ConflictResolver.conflictItemCount=183, DefaultDependencyCollector.collectTime=261, DefaultDependencyCollector.transformTime=6}
[DEBUG] org.apache.maven.plugins:maven-javadoc-plugin:jar:2.10.3:
[DEBUG] org.apache.maven:maven-core:jar:2.2.1:compile
[DEBUG] org.apache.maven:maven-plugin-parameter-documenter:jar:2.2.1:compile
[DEBUG] org.slf4j:slf4j-jdk14:jar:1.5.6:runtime
[DEBUG] org.slf4j:slf4j-api:jar:1.5.6:runtime
[DEBUG] org.slf4j:jcl-over-slf4j:jar:1.5.6:runtime
[DEBUG] org.apache.maven:maven-profile:jar:2.2.1:compile
[DEBUG] org.apache.maven:maven-repository-metadata:jar:2.2.1:compile
[DEBUG] org.apache.maven:maven-error-diagnostics:jar:2.2.1:compile
[DEBUG] commons-cli:commons-cli:jar:1.2:compile
[DEBUG] org.apache.maven:maven-plugin-descriptor:jar:2.2.1:compile
[DEBUG] org.codehaus.plexus:plexus-interactivity-api:jar:1.0-alpha-4:compile
[DEBUG] org.apache.maven:maven-monitor:jar:2.2.1:compile
[DEBUG] classworlds:classworlds:jar:1.1:compile
[DEBUG] org.sonatype.plexus:plexus-sec-dispatcher:jar:1.3:compile
[DEBUG] org.sonatype.plexus:plexus-cipher:jar:1.4:compile
[DEBUG] org.apache.maven:maven-project:jar:2.2.1:compile
[DEBUG] org.apache.maven:maven-plugin-registry:jar:2.2.1:compile
[DEBUG] org.codehaus.plexus:plexus-interpolation:jar:1.11:compile
[DEBUG] org.apache.maven:maven-model:jar:2.2.1:compile
[DEBUG] org.apache.maven:maven-settings:jar:2.2.1:compile
[DEBUG] org.apache.maven:maven-plugin-api:jar:2.2.1:compile
[DEBUG] org.apache.maven:maven-artifact:jar:2.2.1:compile
[DEBUG] org.apache.maven:maven-artifact-manager:jar:2.2.1:compile
[DEBUG] backport-util-concurrent:backport-util-concurrent:jar:3.1:compile
[DEBUG] org.apache.maven:maven-toolchain:jar:2.2.1:compile
[DEBUG] org.apache.maven.reporting:maven-reporting-api:jar:3.0:compile
[DEBUG] org.apache.maven:maven-archiver:jar:2.5:compile
[DEBUG] org.apache.maven.shared:maven-invoker:jar:2.0.9:compile
[DEBUG] org.apache.maven.shared:maven-common-artifact-filters:jar:1.3:compile
[DEBUG] org.apache.maven.doxia:doxia-sink-api:jar:1.4:compile
[DEBUG] org.apache.maven.doxia:doxia-logging-api:jar:1.4:compile
[DEBUG] org.apache.maven.doxia:doxia-site-renderer:jar:1.4:compile
[DEBUG] org.apache.maven.doxia:doxia-core:jar:1.4:compile
[DEBUG] xerces:xercesImpl:jar:2.9.1:compile
[DEBUG] xml-apis:xml-apis:jar:1.3.04:compile
[DEBUG] org.apache.maven.doxia:doxia-decoration-model:jar:1.4:compile
[DEBUG] org.apache.maven.doxia:doxia-module-xhtml:jar:1.4:compile
[DEBUG] org.apache.maven.doxia:doxia-module-fml:jar:1.4:compile
[DEBUG] org.codehaus.plexus:plexus-i18n:jar:1.0-beta-7:compile
[DEBUG] org.codehaus.plexus:plexus-velocity:jar:1.1.7:compile
[DEBUG] org.apache.velocity:velocity:jar:1.5:compile
[DEBUG] oro:oro:jar:2.0.8:compile
[DEBUG] org.apache.velocity:velocity-tools:jar:2.0:compile
[DEBUG] commons-beanutils:commons-beanutils:jar:1.7.0:compile
[DEBUG] commons-digester:commons-digester:jar:1.8:compile
[DEBUG] commons-chain:commons-chain:jar:1.1:compile
[DEBUG] commons-validator:commons-validator:jar:1.3.1:compile
[DEBUG] dom4j:dom4j:jar:1.1:compile
[DEBUG] sslext:sslext:jar:1.2-0:compile
[DEBUG] org.apache.struts:struts-core:jar:1.3.8:compile
[DEBUG] antlr:antlr:jar:2.7.2:compile
[DEBUG] org.apache.struts:struts-taglib:jar:1.3.8:compile
[DEBUG] org.apache.struts:struts-tiles:jar:1.3.8:compile
[DEBUG] commons-collections:commons-collections:jar:3.2.1:compile
[DEBUG] org.codehaus.plexus:plexus-component-annotations:jar:1.5.5:compile
[DEBUG] org.apache.maven.wagon:wagon-provider-api:jar:1.0-beta-6:compile
[DEBUG] commons-lang:commons-lang:jar:2.4:compile
[DEBUG] commons-io:commons-io:jar:2.2:compile
[DEBUG] org.apache.httpcomponents:httpclient:jar:4.2.3:compile
[DEBUG] org.apache.httpcomponents:httpcore:jar:4.2.2:compile
[DEBUG] commons-codec:commons-codec:jar:1.6:compile
[DEBUG] commons-logging:commons-logging:jar:1.1.1:compile
[DEBUG] log4j:log4j:jar:1.2.14:compile
[DEBUG] com.thoughtworks.qdox:qdox:jar:1.12.1:compile
[DEBUG] org.codehaus.plexus:plexus-container-default:jar:1.0-alpha-9:compile
[DEBUG] junit:junit:jar:3.8.1:compile
[DEBUG] org.codehaus.plexus:plexus-archiver:jar:2.9:compile
[DEBUG] org.codehaus.plexus:plexus-io:jar:2.4:compile
[DEBUG] org.apache.commons:commons-compress:jar:1.9:compile
[DEBUG] org.codehaus.plexus:plexus-utils:jar:3.0.20:compile
[DEBUG] Created new class realm plugin>org.apache.maven.plugins:maven-javadoc-plugin:2.10.3
[DEBUG] Importing foreign packages into class realm plugin>org.apache.maven.plugins:maven-javadoc-plugin:2.10.3
[DEBUG] Imported: < project>org.bytedeco:javacpp-presets:1.2.5-SNAPSHOT
[DEBUG] Populating class realm plugin>org.apache.maven.plugins:maven-javadoc-plugin:2.10.3
[DEBUG] Included: org.apache.maven.plugins:maven-javadoc-plugin:jar:2.10.3
[DEBUG] Included: org.slf4j:slf4j-jdk14:jar:1.5.6
[DEBUG] Included: org.slf4j:jcl-over-slf4j:jar:1.5.6
[DEBUG] Included: commons-cli:commons-cli:jar:1.2
[DEBUG] Included: org.codehaus.plexus:plexus-interactivity-api:jar:1.0-alpha-4
[DEBUG] Included: org.sonatype.plexus:plexus-sec-dispatcher:jar:1.3
[DEBUG] Included: org.sonatype.plexus:plexus-cipher:jar:1.4
[DEBUG] Included: org.codehaus.plexus:plexus-interpolation:jar:1.11
[DEBUG] Included: backport-util-concurrent:backport-util-concurrent:jar:3.1
[DEBUG] Included: org.apache.maven.reporting:maven-reporting-api:jar:3.0
[DEBUG] Included: org.apache.maven:maven-archiver:jar:2.5
[DEBUG] Included: org.apache.maven.shared:maven-invoker:jar:2.0.9
[DEBUG] Included: org.apache.maven.shared:maven-common-artifact-filters:jar:1.3
[DEBUG] Included: org.apache.maven.doxia:doxia-sink-api:jar:1.4
[DEBUG] Included: org.apache.maven.doxia:doxia-logging-api:jar:1.4
[DEBUG] Included: org.apache.maven.doxia:doxia-site-renderer:jar:1.4
[DEBUG] Included: org.apache.maven.doxia:doxia-core:jar:1.4
[DEBUG] Included: xerces:xercesImpl:jar:2.9.1
[DEBUG] Included: xml-apis:xml-apis:jar:1.3.04
[DEBUG] Included: org.apache.maven.doxia:doxia-decoration-model:jar:1.4
[DEBUG] Included: org.apache.maven.doxia:doxia-module-xhtml:jar:1.4
[DEBUG] Included: org.apache.maven.doxia:doxia-module-fml:jar:1.4
[DEBUG] Included: org.codehaus.plexus:plexus-i18n:jar:1.0-beta-7
[DEBUG] Included: org.codehaus.plexus:plexus-velocity:jar:1.1.7
[DEBUG] Included: org.apache.velocity:velocity:jar:1.5
[DEBUG] Included: oro:oro:jar:2.0.8
[DEBUG] Included: org.apache.velocity:velocity-tools:jar:2.0
[DEBUG] Included: commons-beanutils:commons-beanutils:jar:1.7.0
[DEBUG] Included: commons-digester:commons-digester:jar:1.8
[DEBUG] Included: commons-chain:commons-chain:jar:1.1
[DEBUG] Included: commons-validator:commons-validator:jar:1.3.1
[DEBUG] Included: dom4j:dom4j:jar:1.1
[DEBUG] Included: sslext:sslext:jar:1.2-0
[DEBUG] Included: org.apache.struts:struts-core:jar:1.3.8
[DEBUG] Included: antlr:antlr:jar:2.7.2
[DEBUG] Included: org.apache.struts:struts-taglib:jar:1.3.8
[DEBUG] Included: org.apache.struts:struts-tiles:jar:1.3.8
[DEBUG] Included: commons-collections:commons-collections:jar:3.2.1
[DEBUG] Included: org.codehaus.plexus:plexus-component-annotations:jar:1.5.5
[DEBUG] Included: commons-lang:commons-lang:jar:2.4
[DEBUG] Included: commons-io:commons-io:jar:2.2
[DEBUG] Included: org.apache.httpcomponents:httpclient:jar:4.2.3
[DEBUG] Included: org.apache.httpcomponents:httpcore:jar:4.2.2
[DEBUG] Included: commons-codec:commons-codec:jar:1.6
[DEBUG] Included: commons-logging:commons-logging:jar:1.1.1
[DEBUG] Included: log4j:log4j:jar:1.2.14
[DEBUG] Included: com.thoughtworks.qdox:qdox:jar:1.12.1
[DEBUG] Included: junit:junit:jar:3.8.1
[DEBUG] Included: org.codehaus.plexus:plexus-archiver:jar:2.9
[DEBUG] Included: org.codehaus.plexus:plexus-io:jar:2.4
[DEBUG] Included: org.apache.commons:commons-compress:jar:1.9
[DEBUG] Included: org.codehaus.plexus:plexus-utils:jar:3.0.20
[DEBUG] Configuring mojo org.apache.maven.plugins:maven-javadoc-plugin:2.10.3:jar from plugin realm ClassRealm[plugin>org.apache.maven.plugins:maven-javadoc-plugin:2.10.3, parent: sun.misc.Launcher$AppClassLoader@55f96302]
[DEBUG] Configuring mojo 'org.apache.maven.plugins:maven-javadoc-plugin:2.10.3:jar' with basic configurator -->
[DEBUG] (f) additionalparam = -Xdoclint:none
[DEBUG] (f) aggregate = false
[DEBUG] (f) applyJavadocSecurityFix = true
[DEBUG] (f) attach = true
[DEBUG] (f) author = true
[DEBUG] (f) bootclasspathArtifacts = []
[DEBUG] (f) bottom = Copyright © {inceptionYear}–{currentYear} {organizationName}. All rights reserved.
[DEBUG] (f) breakiterator = false
[DEBUG] (f) classifier = javadoc
[DEBUG] (f) debug = false
[DEBUG] (f) defaultManifestFile = /Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/target/classes/META-INF/MANIFEST.MF
[DEBUG] (f) detectJavaApiLink = true
[DEBUG] (f) detectLinks = false
[DEBUG] (f) detectOfflineLinks = true
[DEBUG] (f) docfilessubdirs = false
[DEBUG] (f) docletArtifact = groupId = 'null'
artifactId = 'null'
version = 'null'
[DEBUG] (f) docletArtifacts = []
[DEBUG] (f) doctitle = JavaCPP Presets 1.2.5-SNAPSHOT API
[DEBUG] (f) encoding = UTF-8
[DEBUG] (f) failOnError = true
[DEBUG] (f) finalName = javacpp-presets
[DEBUG] (f) groups = []
[DEBUG] (f) includeDependencySources = false
[DEBUG] (f) includeTransitiveDependencySources = false
[DEBUG] (f) isOffline = false
[DEBUG] (f) jarOutputDirectory = /Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/target
[DEBUG] (f) javaApiLinks = {}
[DEBUG] (f) javadocDirectory = /Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/src/main/javadoc
[DEBUG] (f) javadocOptionsDir = /Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/target/javadoc-bundle-options
[DEBUG] (f) keywords = false
[DEBUG] (f) links = [http://bytedeco.org/javacpp/apidocs/]
[DEBUG] (f) linksource = false
[DEBUG] (f) localRepository = id: local
url: file:///Users/mohamed.badawy/.m2/repository/
layout: default
snapshots: [enabled => true, update => always]
releases: [enabled => true, update => always]
[DEBUG] (f) nocomment = false
[DEBUG] (f) nodeprecated = false
[DEBUG] (f) nodeprecatedlist = false
[DEBUG] (f) nohelp = false
[DEBUG] (f) noindex = false
[DEBUG] (f) nonavbar = false
[DEBUG] (f) nooverview = false
[DEBUG] (f) nosince = false
[DEBUG] (f) notimestamp = false
[DEBUG] (f) notree = false
[DEBUG] (f) offlineLinks = []
[DEBUG] (f) old = false
[DEBUG] (f) outputDirectory = /Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/target/apidocs
[DEBUG] (f) overview = /Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/src/main/javadoc/overview.html
[DEBUG] (f) project = MavenProject: org.bytedeco:javacpp-presets:1.2.5-SNAPSHOT @ /Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/pom.xml
[DEBUG] (f) quiet = false
[DEBUG] (f) reactorProjects = [MavenProject: org.bytedeco:javacpp-presets:1.2.5-SNAPSHOT @ /Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/pom.xml, MavenProject: org.bytedeco.javacpp-presets:tensorflow:0.10.0-1.2.5-SNAPSHOT @ /Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/tensorflow/pom.xml]
[DEBUG] (f) remoteRepositories = [ id: central
url: https://repo.maven.apache.org/maven2
layout: default
snapshots: [enabled => false, update => daily]
releases: [enabled => true, update => daily]
]
[DEBUG] (f) resourcesArtifacts = []
[DEBUG] (f) serialwarn = false
[DEBUG] (f) session = org.apache.maven.execution.MavenSession@3c3cbc0e
[DEBUG] (f) settings = org.apache.maven.execution.SettingsAdapter@4d87d62c
[DEBUG] (f) show = protected
[DEBUG] (f) skip = false
[DEBUG] (f) sourceDependencyCacheDir = /Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/target/distro-javadoc-sources
[DEBUG] (f) splitindex = false
[DEBUG] (f) stylesheet = java
[DEBUG] (f) tagletArtifact = groupId = 'null'
artifactId = 'null'
version = 'null'
[DEBUG] (f) tagletArtifacts = []
[DEBUG] (f) taglets = []
[DEBUG] (f) tags = []
[DEBUG] (f) use = true
[DEBUG] (f) useDefaultManifestFile = false
[DEBUG] (f) useStandardDocletOptions = true
[DEBUG] (f) validateLinks = false
[DEBUG] (f) verbose = false
[DEBUG] (f) version = true
[DEBUG] (f) windowtitle = JavaCPP Presets 1.2.5-SNAPSHOT API
[DEBUG] -- end configuration --
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO]
[INFO] --- maven-install-plugin:2.5.2:install (default-install) @ javacpp-presets ---
[DEBUG] Dependency collection stats: {ConflictMarker.analyzeTime=0, ConflictMarker.markTime=0, ConflictMarker.nodeCount=40, ConflictIdSorter.graphTime=0, ConflictIdSorter.topsortTime=0, ConflictIdSorter.conflictIdCount=19, ConflictIdSorter.conflictIdCycleCount=0, ConflictResolver.totalTime=1, ConflictResolver.conflictItemCount=40, DefaultDependencyCollector.collectTime=8, DefaultDependencyCollector.transformTime=1}
[DEBUG] org.apache.maven.plugins:maven-install-plugin:jar:2.5.2:
[DEBUG] org.apache.maven:maven-plugin-api:jar:2.2.1:compile
[DEBUG] org.apache.maven:maven-project:jar:2.2.1:compile
[DEBUG] org.apache.maven:maven-settings:jar:2.2.1:compile
[DEBUG] org.apache.maven:maven-profile:jar:2.2.1:compile
[DEBUG] org.apache.maven:maven-plugin-registry:jar:2.2.1:compile
[DEBUG] org.codehaus.plexus:plexus-interpolation:jar:1.11:compile
[DEBUG] org.codehaus.plexus:plexus-container-default:jar:1.0-alpha-9-stable-1:compile
[DEBUG] junit:junit:jar:3.8.1:compile
[DEBUG] classworlds:classworlds:jar:1.1-alpha-2:compile
[DEBUG] org.apache.maven:maven-model:jar:2.2.1:compile
[DEBUG] org.apache.maven:maven-artifact-manager:jar:2.2.1:compile
[DEBUG] org.apache.maven:maven-repository-metadata:jar:2.2.1:compile
[DEBUG] backport-util-concurrent:backport-util-concurrent:jar:3.1:compile
[DEBUG] org.apache.maven:maven-artifact:jar:2.2.1:compile
[DEBUG] commons-codec:commons-codec:jar:1.6:compile
[DEBUG] org.apache.maven.shared:maven-shared-utils:jar:0.4:compile
[DEBUG] com.google.code.findbugs:jsr305:jar:2.0.1:compile
[DEBUG] org.codehaus.plexus:plexus-utils:jar:3.0.15:compile
[DEBUG] Created new class realm plugin>org.apache.maven.plugins:maven-install-plugin:2.5.2
[DEBUG] Importing foreign packages into class realm plugin>org.apache.maven.plugins:maven-install-plugin:2.5.2
[DEBUG] Imported: < project>org.bytedeco:javacpp-presets:1.2.5-SNAPSHOT
[DEBUG] Populating class realm plugin>org.apache.maven.plugins:maven-install-plugin:2.5.2
[DEBUG] Included: org.apache.maven.plugins:maven-install-plugin:jar:2.5.2
[DEBUG] Included: org.codehaus.plexus:plexus-interpolation:jar:1.11
[DEBUG] Included: junit:junit:jar:3.8.1
[DEBUG] Included: backport-util-concurrent:backport-util-concurrent:jar:3.1
[DEBUG] Included: commons-codec:commons-codec:jar:1.6
[DEBUG] Included: org.apache.maven.shared:maven-shared-utils:jar:0.4
[DEBUG] Included: com.google.code.findbugs:jsr305:jar:2.0.1
[DEBUG] Included: org.codehaus.plexus:plexus-utils:jar:3.0.15
[DEBUG] Configuring mojo org.apache.maven.plugins:maven-install-plugin:2.5.2:install from plugin realm ClassRealm[plugin>org.apache.maven.plugins:maven-install-plugin:2.5.2, parent: sun.misc.Launcher$AppClassLoader@55f96302]
[DEBUG] Configuring mojo 'org.apache.maven.plugins:maven-install-plugin:2.5.2:install' with basic configurator -->
[DEBUG] (f) artifact = org.bytedeco:javacpp-presets:pom:1.2.5-SNAPSHOT
[DEBUG] (f) attachedArtifacts = []
[DEBUG] (f) createChecksum = true
[DEBUG] (f) installAtEnd = false
[DEBUG] (f) localRepository = id: local
url: file:///Users/mohamed.badawy/.m2/repository/
layout: default
snapshots: [enabled => true, update => always]
releases: [enabled => true, update => always]
[DEBUG] (f) packaging = pom
[DEBUG] (f) pomFile = /Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/pom.xml
[DEBUG] (f) project = MavenProject: org.bytedeco:javacpp-presets:1.2.5-SNAPSHOT @ /Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/pom.xml
[DEBUG] (f) reactorProjects = [MavenProject: org.bytedeco:javacpp-presets:1.2.5-SNAPSHOT @ /Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/pom.xml, MavenProject: org.bytedeco.javacpp-presets:tensorflow:0.10.0-1.2.5-SNAPSHOT @ /Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/tensorflow/pom.xml]
[DEBUG] (s) skip = false
[DEBUG] (f) updateReleaseInfo = false
[DEBUG] -- end configuration --
[INFO] Installing /Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/pom.xml to /Users/mohamed.badawy/.m2/repository/org/bytedeco/javacpp-presets/1.2.5-SNAPSHOT/javacpp-presets-1.2.5-SNAPSHOT.pom
[DEBUG] Writing tracking file /Users/mohamed.badawy/.m2/repository/org/bytedeco/javacpp-presets/1.2.5-SNAPSHOT/_remote.repositories
[DEBUG] Installing org.bytedeco:javacpp-presets:1.2.5-SNAPSHOT/maven-metadata.xml to /Users/mohamed.badawy/.m2/repository/org/bytedeco/javacpp-presets/1.2.5-SNAPSHOT/maven-metadata-local.xml
[DEBUG] Installing org.bytedeco:javacpp-presets/maven-metadata.xml to /Users/mohamed.badawy/.m2/repository/org/bytedeco/javacpp-presets/maven-metadata-local.xml
[DEBUG] Calculating checksums for /Users/mohamed.badawy/.m2/repository/org/bytedeco/javacpp-presets/1.2.5-SNAPSHOT/javacpp-presets-1.2.5-SNAPSHOT.pom
[DEBUG] Installing checksum to /Users/mohamed.badawy/.m2/repository/org/bytedeco/javacpp-presets/1.2.5-SNAPSHOT/javacpp-presets-1.2.5-SNAPSHOT.pom.md5
[DEBUG] Installing checksum to /Users/mohamed.badawy/.m2/repository/org/bytedeco/javacpp-presets/1.2.5-SNAPSHOT/javacpp-presets-1.2.5-SNAPSHOT.pom.sha1
[DEBUG] Calculating checksums for /Users/mohamed.badawy/.m2/repository/org/bytedeco/javacpp-presets/1.2.5-SNAPSHOT/maven-metadata-local.xml
[DEBUG] Installing checksum to /Users/mohamed.badawy/.m2/repository/org/bytedeco/javacpp-presets/1.2.5-SNAPSHOT/maven-metadata-local.xml.md5
[DEBUG] Installing checksum to /Users/mohamed.badawy/.m2/repository/org/bytedeco/javacpp-presets/1.2.5-SNAPSHOT/maven-metadata-local.xml.sha1
[DEBUG] Calculating checksums for /Users/mohamed.badawy/.m2/repository/org/bytedeco/javacpp-presets/maven-metadata-local.xml
[DEBUG] Installing checksum to /Users/mohamed.badawy/.m2/repository/org/bytedeco/javacpp-presets/maven-metadata-local.xml.md5
[DEBUG] Installing checksum to /Users/mohamed.badawy/.m2/repository/org/bytedeco/javacpp-presets/maven-metadata-local.xml.sha1
[DEBUG] Scheduling: org.bytedeco.javacpp-presets:tensorflow:jar:0.10.0-1.2.5-SNAPSHOT -> [install]
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building JavaCPP Presets for TensorFlow 0.10.0-1.2.5-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[DEBUG] Lifecycle default -> [validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy]
[DEBUG] Lifecycle clean -> [pre-clean, clean, post-clean]
[DEBUG] Lifecycle site -> [pre-site, site, post-site, site-deploy]
[DEBUG] Lifecycle default -> [validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy]
[DEBUG] Lifecycle clean -> [pre-clean, clean, post-clean]
[DEBUG] Lifecycle site -> [pre-site, site, post-site, site-deploy]
[DEBUG] Lifecycle default -> [validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy]
[DEBUG] Lifecycle clean -> [pre-clean, clean, post-clean]
[DEBUG] Lifecycle site -> [pre-site, site, post-site, site-deploy]
[DEBUG] Lifecycle default -> [validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy]
[DEBUG] Lifecycle clean -> [pre-clean, clean, post-clean]
[DEBUG] Lifecycle site -> [pre-site, site, post-site, site-deploy]
[DEBUG] Lifecycle default -> [validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy]
[DEBUG] Lifecycle clean -> [pre-clean, clean, post-clean]
[DEBUG] Lifecycle site -> [pre-site, site, post-site, site-deploy]
[DEBUG] Lifecycle default -> [validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy]
[DEBUG] Lifecycle clean -> [pre-clean, clean, post-clean]
[DEBUG] Lifecycle site -> [pre-site, site, post-site, site-deploy]
[DEBUG] Lifecycle default -> [validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy]
[DEBUG] Lifecycle clean -> [pre-clean, clean, post-clean]
[DEBUG] Lifecycle site -> [pre-site, site, post-site, site-deploy]
[DEBUG] Lifecycle default -> [validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy]
[DEBUG] Lifecycle clean -> [pre-clean, clean, post-clean]
[DEBUG] Lifecycle site -> [pre-site, site, post-site, site-deploy]
[DEBUG] Lifecycle default -> [validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy]
[DEBUG] Lifecycle clean -> [pre-clean, clean, post-clean]
[DEBUG] Lifecycle site -> [pre-site, site, post-site, site-deploy]
[DEBUG] Lifecycle default -> [validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy]
[DEBUG] Lifecycle clean -> [pre-clean, clean, post-clean]
[DEBUG] Lifecycle site -> [pre-site, site, post-site, site-deploy]
[DEBUG] Lifecycle default -> [validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy]
[DEBUG] Lifecycle clean -> [pre-clean, clean, post-clean]
[DEBUG] Lifecycle site -> [pre-site, site, post-site, site-deploy]
[DEBUG] === PROJECT BUILD PLAN ================================================
[DEBUG] Project: org.bytedeco.javacpp-presets:tensorflow:0.10.0-1.2.5-SNAPSHOT
[DEBUG] Dependencies (collect): []
[DEBUG] Dependencies (resolve): [compile, runtime, test]
[DEBUG] Repositories (dependencies): [central (https://repo.maven.apache.org/maven2, default, releases)]
[DEBUG] Repositories (plugins) : [central (https://repo.maven.apache.org/maven2, default, releases)]
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal: org.codehaus.mojo:exec-maven-plugin:1.4.0:exec (javacpp.cppbuild.install)
[DEBUG] Style: Regular
[DEBUG] Configuration:
/Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/tensorflow/../cppbuild.sh
install
tensorflow
-platform
macosx-x86_64
${exec.classpathScope}
${exec.args}
bash
${exec.longClasspath}
${exec.outputFile}
false
${sourceRoot}
${testSourceRoot}
/Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/tensorflow/..
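This exec goal drives the native part of the build: with the configuration shown it effectively runs cppbuild.sh install tensorflow -platform macosx-x86_64 via bash from the presets root directory before the Java sources are compiled. A rough sketch of how such an execution is typically declared with exec-maven-plugin, assuming the plugin's standard configuration elements (the real pom likely uses a property such as ${javacpp.platform} where this log shows the already-evaluated value macosx-x86_64, and the lifecycle phase binding is not visible in this log, so it is omitted):

    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>exec-maven-plugin</artifactId>
      <version>1.4.0</version>
      <executions>
        <execution>
          <id>javacpp.cppbuild.install</id>
          <goals>
            <goal>exec</goal>
          </goals>
          <configuration>
            <executable>bash</executable>
            <!-- run from the presets root so cppbuild.sh can locate the per-module subdirectories -->
            <workingDirectory>${basedir}/..</workingDirectory>
            <arguments>
              <argument>${basedir}/../cppbuild.sh</argument>
              <argument>install</argument>
              <argument>tensorflow</argument>
              <argument>-platform</argument>
              <argument>macosx-x86_64</argument>
            </arguments>
          </configuration>
        </execution>
      </executions>
    </plugin>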
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal: org.apache.maven.plugins:maven-resources-plugin:2.7:resources (javacpp.parser)
[DEBUG] Style: Regular
[DEBUG] Configuration:
${encoding}
${maven.resources.escapeString}
${maven.resources.escapeWindowsPaths}
${maven.resources.includeEmptyDirs}
${maven.resources.overwrite}
${maven.resources.supportMultiLineFiltering}
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal: org.apache.maven.plugins:maven-compiler-plugin:3.3:compile (javacpp.parser)
[DEBUG] Style: Regular
[DEBUG] Configuration:
${maven.compiler.compilerId}
${maven.compiler.compilerReuseStrategy}
${maven.compiler.compilerVersion}
${maven.compiler.debug}
${maven.compiler.debuglevel}
${encoding}
${maven.compiler.executable}
${maven.compiler.failOnError}
${maven.compiler.forceJavacCompilerUse}
${maven.compiler.fork}
org/bytedeco/javacpp/presets/*.java
${maven.compiler.maxmem}
${maven.compiler.meminitial}
${maven.compiler.optimize}
${maven.compiler.showDeprecation}
${maven.compiler.showWarnings}
false
${maven.compiler.skipMultiThreadWarning}
1.7
${lastModGranularityMs}
1.7
${maven.compiler.useIncrementalCompilation}
${maven.compiler.verbose}
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal: org.bytedeco:javacpp:1.2.5-SNAPSHOT:build (javacpp.parser)
[DEBUG] Style: Regular
[DEBUG] Configuration:
org.bytedeco.javacpp.presets.*
${javacpp.classOrPackageNames}
/Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/tensorflow/target/classes
${javacpp.classPaths}
${javacpp.compile}
${javacpp.compilerOptions}
${javacpp.copyLibs}
${javacpp.deleteJniFiles}
${javacpp.environmentVariables}
${javacpp.header}
/Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/tensorflow/cppbuild/macosx-x86_64/include/
/Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/tensorflow/cppbuild/macosx-x86_64/tensorflow-0.10.0/
/Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/tensorflow/cppbuild/macosx-x86_64/tensorflow-0.10.0/bazel-genfiles/
/Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/tensorflow/cppbuild/macosx-x86_64/tensorflow-0.10.0/bazel-tensorflow-0.10.0/
/Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/tensorflow/cppbuild/macosx-x86_64/tensorflow-0.10.0/bazel-tensorflow-0.10.0/external/protobuf/src/
/Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/tensorflow/cppbuild/macosx-x86_64/tensorflow-0.10.0/bazel-tensorflow-0.10.0/external/eigen_archive/
/Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/tensorflow/cppbuild/macosx-x86_64/tensorflow-0.10.0/bazel-tensorflow-0.10.0/external/eigen_archive/eigen-eigen-d02e6a705c30/
${javacpp.includePaths}
${javacpp.jarPrefix}
/Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/tensorflow/cppbuild/macosx-x86_64/lib/
/Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/tensorflow/cppbuild/macosx-x86_64/tensorflow-0.10.0/bazel-bin/tensorflow/
${javacpp.linkPaths}
/Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/tensorflow/src/main/java
${javacpp.outputName}
/Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/tensorflow/cppbuild/macosx-x86_64/bin/
${javacpp.preloadPaths}
macosx-x86_64
${javacpp.propertyFile}
platform.root
${javacpp.platform.root}
platform.compiler
${javacpp.platform.compiler}
${javacpp.propertyKeysAndValues}
false
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal: org.apache.maven.plugins:maven-dependency-plugin:2.10:copy-dependencies (copy-dependencies)
[DEBUG] Style: Regular
[DEBUG] Configuration:
${mdep.addParentPoms}
${classifier}
${mdep.copyPom}
${excludeArtifactIds}
${excludeClassifiers}
${excludeGroupIds}
${excludeScope}
${excludeTransitive}
${excludeTypes}
${mdep.failOnMissingClassifierArtifact}
${includeArtifactIds}
${includeClassifiers}
${includeGroupIds}
${includeScope}
${includeTypes}
${markersDirectory}
${outputAbsoluteArtifactFilename}
${project.parent.build.directory}
${overWriteIfNewer}
${overWriteReleases}
${overWriteSnapshots}
${mdep.prependGroupId}
${silent}
${mdep.skip}
${mdep.stripClassifier}
true
${type}
${mdep.useBaseVersion}
${mdep.useRepositoryLayout}
${mdep.useSubDirectoryPerArtifact}
${mdep.useSubDirectoryPerScope}
${mdep.useSubDirectoryPerType}
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal: org.apache.maven.plugins:maven-resources-plugin:2.7:resources (default-resources)
[DEBUG] Style: Regular
[DEBUG] Configuration:
${encoding}
${maven.resources.escapeString}
${maven.resources.escapeWindowsPaths}
${maven.resources.includeEmptyDirs}
${maven.resources.overwrite}
${maven.resources.supportMultiLineFiltering}
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal: org.apache.maven.plugins:maven-compiler-plugin:3.3:compile (default-compile)
[DEBUG] Style: Regular
[DEBUG] Configuration:
${maven.compiler.compilerId}
${maven.compiler.compilerReuseStrategy}
${maven.compiler.compilerVersion}
${maven.compiler.debug}
${maven.compiler.debuglevel}
${encoding}
${maven.compiler.executable}
${maven.compiler.failOnError}
${maven.compiler.forceJavacCompilerUse}
${maven.compiler.fork}
org/bytedeco/javacpp/*.java
${maven.compiler.maxmem}
${maven.compiler.meminitial}
${maven.compiler.optimize}
${maven.compiler.showDeprecation}
${maven.compiler.showWarnings}
${maven.main.skip}
${maven.compiler.skipMultiThreadWarning}
1.7
${lastModGranularityMs}
1.7
${maven.compiler.useIncrementalCompilation}
${maven.compiler.verbose}
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal: org.bytedeco:javacpp:1.2.5-SNAPSHOT:build (javacpp.compiler)
[DEBUG] Style: Regular
[DEBUG] Configuration:
org.bytedeco.javacpp.*
${javacpp.classOrPackageNames}
/Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/tensorflow/target/classes
${javacpp.classPaths}
${javacpp.compile}
${javacpp.compilerOptions}
true
${javacpp.deleteJniFiles}
${javacpp.environmentVariables}
${javacpp.header}
/Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/tensorflow/cppbuild/macosx-x86_64/include/
/Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/tensorflow/cppbuild/macosx-x86_64/tensorflow-0.10.0/
/Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/tensorflow/cppbuild/macosx-x86_64/tensorflow-0.10.0/bazel-genfiles/
/Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/tensorflow/cppbuild/macosx-x86_64/tensorflow-0.10.0/bazel-tensorflow-0.10.0/
/Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/tensorflow/cppbuild/macosx-x86_64/tensorflow-0.10.0/bazel-tensorflow-0.10.0/external/protobuf/src/
/Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/tensorflow/cppbuild/macosx-x86_64/tensorflow-0.10.0/bazel-tensorflow-0.10.0/external/eigen_archive/
/Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/tensorflow/cppbuild/macosx-x86_64/tensorflow-0.10.0/bazel-tensorflow-0.10.0/external/eigen_archive/eigen-eigen-d02e6a705c30/
${javacpp.includePaths}
${javacpp.jarPrefix}
/Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/tensorflow/cppbuild/macosx-x86_64/lib/
/Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/tensorflow/cppbuild/macosx-x86_64/tensorflow-0.10.0/bazel-bin/tensorflow/
${javacpp.linkPaths}
${javacpp.outputDirectory}
${javacpp.outputName}
/Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/tensorflow/cppbuild/macosx-x86_64/bin/
${javacpp.preloadPaths}
macosx-x86_64
${javacpp.propertyFile}
platform.root
${javacpp.platform.root}
platform.compiler
${javacpp.platform.compiler}
${javacpp.propertyKeysAndValues}
false
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal: org.apache.maven.plugins:maven-resources-plugin:2.7:testResources (default-testResources)
[DEBUG] Style: Regular
[DEBUG] Configuration:
${encoding}
${maven.resources.escapeString}
${maven.resources.escapeWindowsPaths}
${maven.resources.includeEmptyDirs}
${maven.resources.overwrite}
${maven.test.skip}
${maven.resources.supportMultiLineFiltering}
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal: org.apache.maven.plugins:maven-compiler-plugin:3.3:testCompile (default-testCompile)
[DEBUG] Style: Regular
[DEBUG] Configuration:
${maven.compiler.compilerId}
${maven.compiler.compilerReuseStrategy}
${maven.compiler.compilerVersion}
${maven.compiler.debug}
${maven.compiler.debuglevel}
${encoding}
${maven.compiler.executable}
${maven.compiler.failOnError}
${maven.compiler.forceJavacCompilerUse}
${maven.compiler.fork}
${maven.compiler.maxmem}
${maven.compiler.meminitial}
${maven.compiler.optimize}
${maven.compiler.showDeprecation}
${maven.compiler.showWarnings}
${maven.test.skip}
${maven.compiler.skipMultiThreadWarning}
1.7
${lastModGranularityMs}
1.7
${maven.compiler.testSource}
${maven.compiler.testTarget}
${maven.compiler.useIncrementalCompilation}
${maven.compiler.verbose}
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal: org.apache.maven.plugins:maven-surefire-plugin:2.12.4:test (default-test)
[DEBUG] Style: Regular
[DEBUG] Configuration:
${argLine}
${childDelegation}
${maven.surefire.debug}
${disableXmlReport}
${enableAssertions}
${excludedGroups}
${surefire.failIfNoSpecifiedTests}
${failIfNoTests}
${forkMode}
${surefire.timeout}
${groups}
${junitArtifactName}
${jvm}
${objectFactory}
${parallel}
${perCoreThreadCount}
${plugin.artifactMap}
${surefire.printSummary}
${project.artifactMap}
${maven.test.redirectTestOutputToFile}
${surefire.reportFormat}
${surefire.reportNameSuffix}
${maven.test.skip}
${maven.test.skip.exec}
${skipTests}
${test}
${maven.test.failure.ignore}
${testNGArtifactName}
${threadCount}
${trimStackTrace}
${surefire.useFile}
${surefire.useManifestOnlyJar}
${surefire.useSystemClassLoader}
${useUnlimitedThreads}
${basedir}
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal: org.apache.maven.plugins:maven-jar-plugin:2.6:jar (default-jar)
[DEBUG] Style: Regular
[DEBUG] Configuration:
true
. javacpp.jar tensorflow-linux-arm.jar tensorflow-linux-x86.jar tensorflow-linux-x86_64.jar tensorflow-macosx-x86.jar tensorflow-macosx-x86_64.jar tensorflow-windows-x86.jar tensorflow-windows-x86_64.jar
org/bytedeco/javacpp/presets/
JavaCPP Presets for TensorFlow
Bytedeco
0.10.0-1.2.5-SNAPSHOT
JavaCPP Presets for TensorFlow
Bytedeco
0.10.0-1.2.5-SNAPSHOT
${maven.jar.classifier}
org/bytedeco/javacpp/*.h
org/bytedeco/javacpp/linux-*/
org/bytedeco/javacpp/macosx-*/
org/bytedeco/javacpp/windows-*/
org/bytedeco/javacpp/macosx-x86_64/
${jar.finalName}
${jar.forceCreation}
org/bytedeco/javacpp/*
org/bytedeco/javacpp/helper/*
org/bytedeco/javacpp/presets/*
${jar.skipIfEmpty}
${jar.useDefaultManifestFile}
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal: org.apache.maven.plugins:maven-jar-plugin:2.6:jar (macosx-x86_64)
[DEBUG] Style: Regular
[DEBUG] Configuration:
macosx-x86_64
**/*.exp
**/*.lib
**/*.obj
${jar.finalName}
${jar.forceCreation}
${javacpp.platform.library.path}/
org/bytedeco/javacpp/macosx-x86_64/
true
${jar.useDefaultManifestFile}
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal: org.apache.maven.plugins:maven-dependency-plugin:2.10:copy (default)
[DEBUG] Style: Regular
[DEBUG] Configuration:
${artifact}
org.bytedeco.javacpp-presets
tensorflow
0.10.0-1.2.5-SNAPSHOT
jar
true
${project.parent.build.directory}
org.bytedeco.javacpp-presets
tensorflow
0.10.0-1.2.5-SNAPSHOT
macosx-x86_64
jar
true
${project.parent.build.directory}
${outputAbsoluteArtifactFilename}
${outputDirectory}
${mdep.overIfNewer}
${mdep.overWriteReleases}
${mdep.overWriteSnapshots}
${mdep.prependGroupId}
${silent}
${mdep.skip}
${mdep.stripClassifier}
true
${mdep.useBaseVersion}
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal: org.apache.maven.plugins:maven-source-plugin:2.4:jar-no-fork (attach-source)
[DEBUG] Style: Regular
[DEBUG] Configuration:
${attach}
${maven.source.classifier}
${source.excludeResources}
${source.forceCreation}
${source.includePom}
${source.skip}
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal: org.apache.maven.plugins:maven-javadoc-plugin:2.10.3:jar (attach-javadocs)
[DEBUG] Style: Regular
[DEBUG] Configuration:
${additionalJOption}
-Xdoclint:none
${aggregate}
${maven.javadoc.applyJavadocSecurityFix}
${attach}
${author}
${bootclasspath}
${bootclasspathArtifacts}
${bottom}
${breakiterator}
${charset}
${maven.javadoc.classifier}
${debug}
${destDir}
${detectJavaApiLink}
${detectLinks}
${detectOfflineLinks}
${docencoding}
${docfilessubdirs}
${doclet}
${docletArtifact}
${docletArtifacts}
${docletPath}
${doctitle}
${encoding}
${excludePackageNames}
${excludedocfilessubdir}
${extdirs}
${maven.javadoc.failOnError}
${project.build.finalName}
${footer}
${groups}
${header}
${helpfile}
${project.build.directory}
${javaApiLinks}
${javadocExecutable}
${javadocVersion}
${keywords}
http://bytedeco.org/javacpp/apidocs/${links}
${linksource}
${localRepository}
${locale}
${maxmemory}
${minmemory}
${nocomment}
${nodeprecated}
${nodeprecatedlist}
${nohelp}
${noindex}
${nonavbar}
${nooverview}
${noqualifier}
${nosince}
${notimestamp}
${notree}
${offlineLinks}
${old}
${destDir}
${overview}
${packagesheader}
${proxyHost}
${proxyPort}
${quiet}
${reactorProjects}
${project.remoteArtifactRepositories}
${resourcesArtifacts}
${serialwarn}
${show}
${maven.javadoc.skip}
${source}
${sourcepath}
${sourcetab}
${splitindex}
${stylesheet}
${stylesheetfile}
${subpackages}
${taglet}
${tagletArtifact}
${tagletArtifacts}
${tagletpath}
${taglets}
${tags}
${top}
${use}
${useStandardDocletOptions}
${validateLinks}
${verbose}
${version}
${windowtitle}
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal: org.apache.maven.plugins:maven-install-plugin:2.5.2:install (default-install)
[DEBUG] Style: Regular
[DEBUG] Configuration:
true
${installAtEnd}
${localRepository}
${maven.install.skip}
${updateReleaseInfo}
[DEBUG] =======================================================================
[WARNING] *****************************************************************
[WARNING] * Your build is requesting parallel execution, but project *
[WARNING] * contains the following plugin(s) that have goals not marked *
[WARNING] * as @threadSafe to support parallel building. *
[WARNING] * While this /may/ work fine, please look for plugin updates *
[WARNING] * and/or request plugins be made thread-safe. *
[WARNING] * If reporting an issue, report it against the plugin in *
[WARNING] * question, not against maven-core *
[WARNING] *****************************************************************
[WARNING] The following goals are not marked @threadSafe in JavaCPP Presets for TensorFlow:
[WARNING] org.bytedeco:javacpp:1.2.5-SNAPSHOT:build
[WARNING] *****************************************************************
[DEBUG] Dependency collection stats: {ConflictMarker.analyzeTime=0, ConflictMarker.markTime=0, ConflictMarker.nodeCount=2, ConflictIdSorter.graphTime=0, ConflictIdSorter.topsortTime=0, ConflictIdSorter.conflictIdCount=1, ConflictIdSorter.conflictIdCycleCount=0, ConflictResolver.totalTime=0, ConflictResolver.conflictItemCount=1, DefaultDependencyCollector.collectTime=2, DefaultDependencyCollector.transformTime=0}
[DEBUG] org.bytedeco.javacpp-presets:tensorflow:jar:0.10.0-1.2.5-SNAPSHOT
[DEBUG] org.bytedeco:javacpp:jar:1.2.5-SNAPSHOT:compile
[INFO]
[INFO] --- exec-maven-plugin:1.4.0:exec (javacpp.cppbuild.install) @ tensorflow ---
[DEBUG] Dependency collection stats: {ConflictMarker.analyzeTime=1, ConflictMarker.markTime=0, ConflictMarker.nodeCount=136, ConflictIdSorter.graphTime=0, ConflictIdSorter.topsortTime=0, ConflictIdSorter.conflictIdCount=35, ConflictIdSorter.conflictIdCycleCount=0, ConflictResolver.totalTime=0, ConflictResolver.conflictItemCount=78, DefaultDependencyCollector.collectTime=11, DefaultDependencyCollector.transformTime=1}
[DEBUG] org.codehaus.mojo:exec-maven-plugin:jar:1.4.0:
[DEBUG] org.apache.maven:maven-toolchain:jar:1.0:compile
[DEBUG] org.apache.maven:maven-project:jar:2.2.1:compile
[DEBUG] org.apache.maven:maven-settings:jar:2.2.1:compile
[DEBUG] org.apache.maven:maven-profile:jar:2.2.1:compile
[DEBUG] org.apache.maven:maven-plugin-registry:jar:2.2.1:compile
[DEBUG] org.codehaus.plexus:plexus-interpolation:jar:1.11:compile
[DEBUG] org.codehaus.plexus:plexus-container-default:jar:1.0-alpha-9-stable-1:compile
[DEBUG] junit:junit:jar:4.11:test
[DEBUG] org.hamcrest:hamcrest-core:jar:1.3:test
[DEBUG] org.apache.maven:maven-model:jar:2.2.1:compile
[DEBUG] org.apache.maven:maven-artifact:jar:2.2.1:compile
[DEBUG] org.apache.maven:maven-artifact-manager:jar:2.2.1:compile
[DEBUG] org.apache.maven:maven-repository-metadata:jar:2.2.1:compile
[DEBUG] backport-util-concurrent:backport-util-concurrent:jar:3.1:compile
[DEBUG] org.apache.maven:maven-core:jar:2.2.1:compile
[DEBUG] org.apache.maven:maven-plugin-parameter-documenter:jar:2.2.1:compile
[DEBUG] org.slf4j:slf4j-jdk14:jar:1.5.6:runtime
[DEBUG] org.slf4j:slf4j-api:jar:1.5.6:runtime
[DEBUG] org.slf4j:jcl-over-slf4j:jar:1.5.6:runtime
[DEBUG] org.apache.maven.reporting:maven-reporting-api:jar:2.2.1:compile
[DEBUG] org.apache.maven.doxia:doxia-sink-api:jar:1.1:compile
[DEBUG] org.apache.maven.doxia:doxia-logging-api:jar:1.1:compile
[DEBUG] org.apache.maven:maven-error-diagnostics:jar:2.2.1:compile
[DEBUG] commons-cli:commons-cli:jar:1.2:compile
[DEBUG] org.apache.maven:maven-plugin-descriptor:jar:2.2.1:compile
[DEBUG] org.codehaus.plexus:plexus-interactivity-api:jar:1.0-alpha-4:compile
[DEBUG] org.apache.maven:maven-monitor:jar:2.2.1:compile
[DEBUG] classworlds:classworlds:jar:1.1:compile
[DEBUG] org.sonatype.plexus:plexus-sec-dispatcher:jar:1.3:compile
[DEBUG] org.sonatype.plexus:plexus-cipher:jar:1.4:compile
[DEBUG] org.apache.maven:maven-plugin-api:jar:2.2.1:compile
[DEBUG] org.codehaus.plexus:plexus-utils:jar:3.0.20:compile
[DEBUG] org.apache.commons:commons-exec:jar:1.3:compile
[DEBUG] Created new class realm plugin>org.codehaus.mojo:exec-maven-plugin:1.4.0
[DEBUG] Importing foreign packages into class realm plugin>org.codehaus.mojo:exec-maven-plugin:1.4.0
[DEBUG] Imported: < project>org.bytedeco:javacpp-presets:1.2.5-SNAPSHOT
[DEBUG] Populating class realm plugin>org.codehaus.mojo:exec-maven-plugin:1.4.0
[DEBUG] Included: org.codehaus.mojo:exec-maven-plugin:jar:1.4.0
[DEBUG] Included: org.codehaus.plexus:plexus-interpolation:jar:1.11
[DEBUG] Included: backport-util-concurrent:backport-util-concurrent:jar:3.1
[DEBUG] Included: org.slf4j:slf4j-jdk14:jar:1.5.6
[DEBUG] Included: org.slf4j:jcl-over-slf4j:jar:1.5.6
[DEBUG] Included: org.apache.maven.reporting:maven-reporting-api:jar:2.2.1
[DEBUG] Included: org.apache.maven.doxia:doxia-sink-api:jar:1.1
[DEBUG] Included: org.apache.maven.doxia:doxia-logging-api:jar:1.1
[DEBUG] Included: commons-cli:commons-cli:jar:1.2
[DEBUG] Included: org.codehaus.plexus:plexus-interactivity-api:jar:1.0-alpha-4
[DEBUG] Included: org.sonatype.plexus:plexus-sec-dispatcher:jar:1.3
[DEBUG] Included: org.sonatype.plexus:plexus-cipher:jar:1.4
[DEBUG] Included: org.codehaus.plexus:plexus-utils:jar:3.0.20
[DEBUG] Included: org.apache.commons:commons-exec:jar:1.3
[DEBUG] Configuring mojo org.codehaus.mojo:exec-maven-plugin:1.4.0:exec from plugin realm ClassRealm[plugin>org.codehaus.mojo:exec-maven-plugin:1.4.0, parent: sun.misc.Launcher$AppClassLoader@55f96302]
[DEBUG] Configuring mojo 'org.codehaus.mojo:exec-maven-plugin:1.4.0:exec' with basic configurator -->
[DEBUG] (f) arguments = [/Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/tensorflow/../cppbuild.sh, install, tensorflow, -platform, macosx-x86_64]
[DEBUG] (f) basedir = /Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/tensorflow
[DEBUG] (f) classpathScope = runtime
[DEBUG] (f) executable = bash
[DEBUG] (f) longClasspath = false
[DEBUG] (f) project = MavenProject: org.bytedeco.javacpp-presets:tensorflow:0.10.0-1.2.5-SNAPSHOT @ /Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/tensorflow/pom.xml
[DEBUG] (f) session = org.apache.maven.execution.MavenSession@190b981f
[DEBUG] (f) skip = false
[DEBUG] (f) workingDirectory = /Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/tensorflow/..
[DEBUG] -- end configuration --
[DEBUG] env: Apple_PubSub_Socket_Render=/private/tmp/com.apple.launchd.BWXbwKeNwU/Render
[DEBUG] env: HOME=/Users/mohamed.badawy
[DEBUG] env: JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home
[DEBUG] env: JAVA_MAIN_CLASS_21604=org.codehaus.plexus.classworlds.launcher.Launcher
[DEBUG] env: LANG=en_US.UTF-8
[DEBUG] env: LOGNAME=mohamed.badawy
[DEBUG] env: MAVEN_CMD_LINE_ARGS= -e -X -T 20 install --projects .,tensorflow
[DEBUG] env: MAVEN_PROJECTBASEDIR=/Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets
[DEBUG] env: OLDPWD=/Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets
[DEBUG] env: PATH=/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin
[DEBUG] env: PWD=/Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets
[DEBUG] env: SHELL=/bin/bash
[DEBUG] env: SHLVL=1
[DEBUG] env: SSH_AUTH_SOCK=/private/tmp/com.apple.launchd.IieZTG5cNr/Listeners
[DEBUG] env: TERM=xterm-256color
[DEBUG] env: TERM_PROGRAM=Apple_Terminal
[DEBUG] env: TERM_PROGRAM_VERSION=361.1
[DEBUG] env: TERM_SESSION_ID=7C984810-AA3F-45FF-9A51-56E87F789084
[DEBUG] env: TMPDIR=/var/folders/jq/6cq37j_91fq47pzzlh7fjfj46wskwc/T/
[DEBUG] env: USER=mohamed.badawy
[DEBUG] env: XPC_FLAGS=0x0
[DEBUG] env: XPC_SERVICE_NAME=0
[DEBUG] env: __CF_USER_TEXT_ENCODING=0xDCCCB8C:0x0:0x0
[DEBUG] Executing command line: [bash, /Users/mohamed.badawy/Bin/deepwater/thirdparty/javacpp-presets/tensorflow/../cppbuild.sh, install, tensorflow, -platform, macosx-x86_64]
Detected platform "macosx-x86_64"
Building for platform "macosx-x86_64"
Installing "tensorflow"
~/Bin/deepwater/thirdparty/javacpp-presets/tensorflow/cppbuild ~/Bin/deepwater/thirdparty/javacpp-presets
Decompressing archives
Can't find swig. Ensure swig is in $PATH or set $SWIG_PATH.
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] JavaCPP Presets .................................... SUCCESS [ 0.647 s]
[INFO] JavaCPP Presets for TensorFlow ..................... FAILURE [ 1.475 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 2.829 s (Wall Clock)
[INFO] Finished at: 2016-11-08T10:18:00-06:00
[INFO] Final Memory: 21M/392M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.4.0:exec (javacpp.cppbuild.install) on project tensorflow: Command execution failed. Process exited with an error: 1 (Exit value: 1) -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.4.0:exec (javacpp.cppbuild.install) on project tensorflow: Command execution failed.
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:212)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116)
at org.apache.maven.lifecycle.internal.builder.multithreaded.MultiThreadedBuilder$1.call(MultiThreadedBuilder.java:185)
at org.apache.maven.lifecycle.internal.builder.multithreaded.MultiThreadedBuilder$1.call(MultiThreadedBuilder.java:181)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.maven.plugin.MojoExecutionException: Command execution failed.
at org.codehaus.mojo.exec.ExecMojo.execute(ExecMojo.java:276)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:134)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:207)
... 11 more
Caused by: org.apache.commons.exec.ExecuteException: Process exited with an error: 1 (Exit value: 1)
at org.apache.commons.exec.DefaultExecutor.executeInternal(DefaultExecutor.java:404)
at org.apache.commons.exec.DefaultExecutor.execute(DefaultExecutor.java:166)
at org.codehaus.mojo.exec.ExecMojo.executeCommandLine(ExecMojo.java:660)
at org.codehaus.mojo.exec.ExecMojo.execute(ExecMojo.java:265)
... 13 more
[ERROR]
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn -rf :tensorflow
It is buried deep in the logs, but the real error is: Can't find swig. Ensure swig is in $PATH or set $SWIG_PATH.
You need to install swig. Please read the TensorFlow build dependencies and try again.
Will close for now. Please reopen if something else comes up
Thanks.
On Wed, Nov 16, 2016 at 5:46 PM, Fabrizio Milo notifications@github.com
wrote:
Closed #16 https://github.com/h2oai/deepwater/issues/16.
| gharchive/issue | 2016-11-08T16:57:14 | 2025-04-01T06:38:53.947390 | {
"authors": [
"Mistobaan",
"dab3-2014"
],
"repo": "h2oai/deepwater",
"url": "https://github.com/h2oai/deepwater/issues/16",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
1944019130 | start h2ogpt with multiple "Visible Models"
How do I start h2ogpt with multiple "Visible Models" via generate.py on startup, similar to the example at https://gpt.h2o.ai?
This is what we use for gpt.h2o.ai. "vis" is the list of models to choose from, and we export that string to bash; "$MODEL_LOCK" is a bash variable holding the list of dicts of models.
export vis="['h2oai/h2ogpt-4096-llama2-70b-chat','h2oai/h2ogpt-4096-llama2-13b-chat','HuggingFaceH4/zephyr-7b-alpha','gpt-3.5-turbo-0613']"
python generate.py --save_dir=saveall_gpt --model_lock="$MODEL_LOCK" --model_lock_columns=3 --auth_filename=all_auth.json --gradio_size=small --height=400 --score_model=None --max_max_new_tokens=2048 --max_new_tokens=1024 --visible_models="$vis" &>> logs.all.gradio_chat.txt &
For details on model lock see: https://github.com/h2oai/h2ogpt/blob/main/docs/README_InferenceServers.md#locking-models-for-easy-start-up-or-in-app-comparison
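For illustration only, a minimal Python sketch of what such a list of dicts might look like before it is exported to bash. The key names (base_model, inference_server) follow the linked README but should be double-checked, and the server URLs are hypothetical placeholders:
import json
# Hypothetical model_lock value: one dict per model; verify key names against the docs.
model_lock = [
    {"base_model": "h2oai/h2ogpt-4096-llama2-70b-chat",
     "inference_server": "http://localhost:6112"},  # assumed local inference server
    {"base_model": "h2oai/h2ogpt-4096-llama2-13b-chat",
     "inference_server": "http://localhost:6113"},  # assumed local inference server
]
# Emit a line that can be pasted into the shell before running generate.py.
print(f"export MODEL_LOCK='{json.dumps(model_lock)}'")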
| gharchive/issue | 2023-10-15T19:20:13 | 2025-04-01T06:38:53.958035 | {
"authors": [
"eslam-gomaa",
"pseudotensor"
],
"repo": "h2oai/h2ogpt",
"url": "https://github.com/h2oai/h2ogpt/issues/964",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
133542005 | unmaintained?
Have you stopped maintaining doozer?
There have been no commits in 2 years, and it's basically broken at this point due to dependency paths having changed (protobuf in particular).
Yup. Use etcd instead.
| gharchive/issue | 2016-02-14T14:04:31 | 2025-04-01T06:38:54.057904 | {
"authors": [
"dgryski",
"james-lawrence"
],
"repo": "ha/doozer",
"url": "https://github.com/ha/doozer/issues/42",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
827810632 | restart api
Great work on the API! I just have trouble when it disconnects... is there a way to make it automatically reconnect, instead of me having to type src/yap/./yap api?
This repository is no longer maintained. Please use https://github.com/onlplab/yap
| gharchive/issue | 2021-03-10T14:18:18 | 2025-04-01T06:38:54.127663 | {
"authors": [
"habeanf",
"yishairasowsky"
],
"repo": "habeanf/yap",
"url": "https://github.com/habeanf/yap/issues/8",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
} |
57186708 | Fix api specs
In your API tests you can examine the response:
puts response.body
I modified one of the tests:
post "http://api.vcap.me:3000/v1/applications?access_token=#{@token.token}", application: FactoryGirl.attributes_for(:application), :format => :json
puts response.body
And it returned:
{"error":"reimbursement_needed is missing, profile_id is missing, hackathon_id is missing"}
This is because you're posting these fields under application, so the server gets params[:application][:hackathon_id], not just params[:hackathon_id]. You can decide to go either way, but obviously it needs to be consistent.
Closes #25.
Btw, check out https://github.com/yujinakayama/transpec and make all the RSpec expectations use a consistent syntax. Add config.raise_errors_for_deprecations! to RSpec spec_helper.rb then.
Coverage remained the same at 85.32% when pulling 54875a3f23204bca9d6d21b775ab7bbe02415ffb on dblock:fix-api-specs into aa2369694149a5fbe4f52bd0456250559ca7e5fa on hackcentral:master.
Awesome, thanks @dblock for your help with this!
Merged.
| gharchive/pull-request | 2015-02-10T15:09:49 | 2025-04-01T06:38:54.186318 | {
"authors": [
"coveralls",
"dblock",
"maclover7"
],
"repo": "hackcentral/hackcentral",
"url": "https://github.com/hackcentral/hackcentral/pull/26",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
2279835974 | Remove Reese Armstrong
I am no longer participating in the Hack Club webring
many are saying this
Your pull request has been merged
| gharchive/pull-request | 2024-05-06T01:00:11 | 2025-04-01T06:38:54.204439 | {
"authors": [
"Shrey-Mehra",
"reesericci"
],
"repo": "hackclub/webring",
"url": "https://github.com/hackclub/webring/pull/163",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
208658179 | Implement a custom 500 page
In particular, this special-cases the NoSuchEmployee error, which is caused by a user not being included in a database dump. I want to have a nicer experience for people who log in but for some reason weren't included in my LDAP dump. This tells the user they can email a support email (i.e. me) to have them added to the database. The general case 500 page doesn't have the support email address. Screenshots below.
Builds are broken due to #4. Local make test passes, although I will probably want to add a test or two ensuring that the error pages have the right text.
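For illustration, a hedged Flask-style sketch of the idea; the real app's framework, handler names, and templates differ, and NoSuchEmployee plus the support address are placeholders here:
from flask import Flask, render_template

app = Flask(__name__)
SUPPORT_EMAIL = "support@example.com"  # placeholder address

class NoSuchEmployee(Exception):
    """Raised when a logged-in user is missing from the employee dump."""

@app.errorhandler(NoSuchEmployee)
def missing_employee(_err):
    # Special case: tell the user how to get added to the database.
    return render_template("500_missing_employee.html", support_email=SUPPORT_EMAIL), 500

@app.errorhandler(500)
def generic_error(_err):
    # General case: no support address shown.
    return render_template("500.html"), 500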
Looks good. I'll work on travis later
@matthewbentley no worries I just fixed it - I'm reverting my makefile changes. I'll add a wiki note on how to make it work locally
| gharchive/pull-request | 2017-02-18T17:58:14 | 2025-04-01T06:38:54.324026 | {
"authors": [
"brenns10",
"matthewbentley"
],
"repo": "hacsoc/love",
"url": "https://github.com/hacsoc/love/pull/3",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
246531181 | Problem with png images
I've tried to use your library to change the brightness of an image. For JPEG images it works fine. But for a PNG image it returns { data: undefined, type: 'png', width: 285, height: 177 }
I don't understand why.
Thanks
@komarevtsevdn
Hello, Thank you for report. I will check it.
@komarevtsevdn
I fixed this Issue, you update node-image-filter version to 0.1.0
$ npm install node-image-filter@0.1.0
I improved save image logic. So you change your code like this.
result.data.pipe(fs.createWriteStream(`result.${result.type}`)); // save local
Thank you.
| gharchive/issue | 2017-07-29T16:33:18 | 2025-04-01T06:38:54.346442 | {
"authors": [
"haegul",
"komarevtsevdn"
],
"repo": "haegul/node-image-filter",
"url": "https://github.com/haegul/node-image-filter/issues/9",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
1301127988 | portfolio deployment
My portfolio was deployed earlier, and the link is here:
https://haftamudesta.github.io/desta451616-hotmail.com.github.io/portfolio.html
Project Approved :trophy: :tada: 🟢
Hi @haftamudesta
Your project is complete! You have done it exceptionally well. It's time to merge it :100:
Congratulations! 🎉
To Highlight :+1:
:heavy_check_mark: No linters error.
:heavy_check_mark: Detailed PR title and description.
:heavy_check_mark: Good commit messages.
Cheers and Happy coding!👏👏👏
Feel free to leave any questions or comments in the PR thread if something is not 100% clear.
Please, remember to tag me in your question so I can receive the notification. You can also connect with me on slack
As described in the Code reviews limits policy, you have a limited number of reviews per project (check the exact number in your Dashboard). If you think that the code review was not fair, you can request a second opinion using this form.
| gharchive/issue | 2022-07-11T19:28:38 | 2025-04-01T06:38:54.361928 | {
"authors": [
"haftamudesta",
"youmari"
],
"repo": "haftamudesta/desta451616-hotmail.com.github.io",
"url": "https://github.com/haftamudesta/desta451616-hotmail.com.github.io/issues/3",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
722618998 | Add descriptions to request form's fields
Some of the form's fields are too complicated to understand just by the title, so we should add some description.
I think the cleanest way would be making it a hover-tooltip popping out of a help icon.
Hover actions should work as taps on touch devices.
can you assign this to me please ?
done
| gharchive/issue | 2020-10-15T19:36:02 | 2025-04-01T06:38:54.363887 | {
"authors": [
"Polarts",
"cHidoriPunk"
],
"repo": "haifa-dev/official-website",
"url": "https://github.com/haifa-dev/official-website/issues/43",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
} |
1222885080 | cmd/ebitenmobile: support current NDK
Hi,
I've installed a fresh Android Studio with NDK 24.0.8215888. When trying to use ebitenmobile, I get:
$ export ANDROID_HOME=$HOME/Android/Sdk
$ export ANDROID_NDK_HOME=$HOME/Android/Sdk/ndk/24.0.8215888
$ go run github.com/hajimehoshi/ebiten/v2/cmd/ebitenmobile \
bind -target android -javapkg io.github.divverent.aaaaxy.android \
-androidapi 21 \
-o aaaaxy.aar \
github.com/divVerent/aaaaxy/internal/mobile
2022/05/02 09:20:33 gomobile [init] failed: gomobile: No compiler for 386 was found in the NDK (tried /home/rpolzer/Android/Sdk/ndk/24.0.8215888/toolchains/llvm/prebuilt/linux-x86_64/bin/i686-linux-android16-clang). Make sure your NDK version is >= r19c. Use `sdkmanager --update` to update it.
I do not know why it tries to search for SDK 16 binaries (and for x86 even, which is an odd platform for Android); this NDK comes with versions 21 to 32 for aarch64 and x86_64, as well as 19 to 32 for armv7a and i686.
I would assume that passing -androidapi sets which NDK binaries it'll use, but to no avail.
In general, the ebitenmobile instructions could be a LOT clearer - many things in there I just do not understand and will have to figure out by trial and error, not being an Android developer myself.
Just in case, I did run the suggested command:
/home/rpolzer/Android/Sdk/cmdline-tools/latest/bin/sdkmanager --update
Warning: IO exception while downloading manifest
[=== ] 10% Computing updates...
No updates available
[=======================================] 100% Computing updates...
https://github.com/golang/go/issues/35030#issuecomment-1026887111 might be the same issue - however, before I am going to muck around in an Android SDK package-managed directory, I'd like to have confirmation that this is really necessary and that gomobile did not fix this elementary issue for three years. Is gomobile unmaintained?
gomobile is maintained. Does gomobile work with the latest NDK? (I think it should)
Going by Ebiten docs fails:
[rpolzer@brlogenshfegle aaaaxy (git)-[mobile]-]$ ~/go/bin/gomobile build github.com/divVerent/aaaaxy/internal/mobile
/home/rpolzer/go/bin/gomobile: No compiler for 386 was found in the NDK (tried /home/rpolzer/Android/Sdk/ndk/24.0.8215888/toolchains/llvm/prebuilt/linux-x86_64/bin/i686-linux-android16-clang). Make sure your NDK version is >= r19c. Use `sdkmanager --update` to update it.
However, this looks more promising:
[rpolzer@brlogenshfegle aaaaxy (git)-[mobile]-]$ ~/go/bin/gomobile build -androidapi 21 github.com/divVerent/aaaaxy/internal/mobile
/home/rpolzer/go/bin/gomobile: go build github.com/divVerent/aaaaxy/internal/mobile failed: exit status 2
# github.com/divVerent/aaaaxy/internal/mobile
internal/mobile/mobile.go:18:2: imported and not used: "github.com/hajimehoshi/ebiten/v2" as ebiten
internal/mobile/mobile.go:25:2: undefined: flag
This is good news, as it means the rest might be my fault. But it is worrying that the -androidapi flag doesn't work with ebitenmobile bind.
After fixing these, gomobile build succeeds, but gomobile install fails because it doesn't find mobile.apk.
Looks like the Ebiten docs are really rather incomplete - would be nice if it actually explained the steps to turn a desktop game into an Android one.
Looks like the Ebiten docs are really rather incomplete - would be nice if it actually explained the steps to turn a desktop game into an Android one.
I agree we need more documentation about ebitenmobile, but I want to know what was the issue. Doesn't ebitenmobile support the latest NDK, or is this no longer an issue?
Hm... this is weird. The gomobile example doesn't actually use init() the same way as the example in the Ebiten docs. It just has a main().
Using the desktop main() file does work with gomobile, apart from LOTS of warnings about the app being blocked by Play Protect, being for an older version etc. - I assume all this can be fixed when using ebitenmobile properly.
So, status with gomobile is that ONLY build works. In particular, bind tells me to use init, and init says:
[rpolzer@brlogenshfegle aaaaxy (git)-[mobile]-]$ ~/go/bin/gomobile init
/home/rpolzer/go/bin/gomobile: No compiler for arm was found in the NDK (tried /home/rpolzer/Android/Sdk/ndk/24.0.8215888/toolchains/llvm/prebuilt/linux-x86_64/bin/armv7a-linux-androideabi16-clang). Make sure your NDK version is >= r19c. Use `sdkmanager --update` to update it.
[rpolzer@brlogenshfegle aaaaxy (git)-[mobile]-]$ ~/go/bin/gomobile init -androidapi=21
flag provided but not defined: -androidapi
usage: /home/rpolzer/go/bin/gomobile init [-openal dir]
If a OpenAL source directory is specified with -openal, init will
build an Android version of OpenAL for use with gomobile build
and gomobile install.
It doesn't seem to support using any non-default Android NDK. What a mess.
I think you are confusing gomobile-build and gomobile-bind. gomobile-build creates an executable, while gomobile-bind creates a library (.aar). ebitenmobile has only the 'bind' feature.
Yeah - I "assumed" (but may be wrong) that ebitenmobile bind uses gomobile bind.
In any case, ebitenmobile bind does not seem to use the -androidapi option, which is the issue here.
You can specify -androidapi: https://github.com/hajimehoshi/ebiten/blob/main/cmd/ebitenmobile/main.go#L103
As said, I can specify it, but it's ignored:
go run github.com/hajimehoshi/ebiten/v2/cmd/ebitenmobile build -target android -javapkg io.github.divverent.aaaaxy.android -androidapi 21 -o aaaaxy.aar github.com/divVerent/aaaaxy/internal/mobile
2022/05/02 09:52:15 gomobile [init] failed: gomobile: No compiler for arm was found in the NDK (tried /home/rpolzer/Android/Sdk/ndk/24.0.8215888/toolchains/llvm/prebuilt/linux-x86_64/bin/armv7a-linux-androideabi16-clang). Make sure your NDK version is >= r19c. Use `sdkmanager --update` to update it.
Hmm? I found this value was not ignored on my machine (macOS):
$ go run github.com/hajimehoshi/ebiten/v2/cmd/ebitenmobile bind -target android -javapkg com.hajimehoshi.goinovation -o ./mobile/android/inovation/inovation.aar -androidapi=999 ./mobile
gomobile: No compiler for arm was found in the NDK (tried /Users/hajimehoshi/Library/Android/sdk/ndk-bundle/toolchains/llvm/prebuilt/darwin-x86_64/bin/armv7a-linux-androideabi999-clang). Make sure your NDK version is >= r19c. Use `sdkmanager --update` to update it.
exit status 1
That same command fails too for me - surprisingly even BEFORE noticing I don't actually have the goinovation app source checked out:
[rpolzer@brlogenshfegle aaaaxy (git)-[mobile]-]$ go run github.com/hajimehoshi/ebiten/v2/cmd/ebitenmobile bind -target android -javapkg com.hajimehoshi.goinovation -o ./mobile/android/inovation/inovation.aar -androidapi=999 ./mobile
2022/05/02 09:58:15 gomobile [init] failed: gomobile: No compiler for 386 was found in the NDK (tried /home/rpolzer/Android/Sdk/ndk/24.0.8215888/toolchains/llvm/prebuilt/linux-x86_64/bin/i686-linux-android16-clang). Make sure your NDK version is >= r19c. Use `sdkmanager --update` to update it.
exit status 1
exit status 1
Note how the path still contains 16.
ebitenmobile build
ebitenmobile build doesn't exist. Please try ebitenmobile bind.
$ ls /Users/hajimehoshi/Library/Android/sdk/ndk-bundle/toolchains/llvm/prebuilt/darwin-x86_64/bin/
aarch64-linux-android-as
aarch64-linux-android21-clang
aarch64-linux-android21-clang++
aarch64-linux-android22-clang
aarch64-linux-android22-clang++
aarch64-linux-android23-clang
aarch64-linux-android23-clang++
aarch64-linux-android24-clang
aarch64-linux-android24-clang++
aarch64-linux-android26-clang
aarch64-linux-android26-clang++
aarch64-linux-android27-clang
aarch64-linux-android27-clang++
aarch64-linux-android28-clang
aarch64-linux-android28-clang++
aarch64-linux-android29-clang
aarch64-linux-android29-clang++
aarch64-linux-android30-clang
aarch64-linux-android30-clang++
aarch64-linux-android31-clang
aarch64-linux-android31-clang++
arm-linux-androideabi-as
armv7a-linux-androideabi16-clang
armv7a-linux-androideabi16-clang++
armv7a-linux-androideabi17-clang
armv7a-linux-androideabi17-clang++
armv7a-linux-androideabi18-clang
armv7a-linux-androideabi18-clang++
armv7a-linux-androideabi19-clang
armv7a-linux-androideabi19-clang++
armv7a-linux-androideabi21-clang
armv7a-linux-androideabi21-clang++
armv7a-linux-androideabi22-clang
armv7a-linux-androideabi22-clang++
armv7a-linux-androideabi23-clang
armv7a-linux-androideabi23-clang++
armv7a-linux-androideabi24-clang
armv7a-linux-androideabi24-clang++
armv7a-linux-androideabi26-clang
armv7a-linux-androideabi26-clang++
armv7a-linux-androideabi27-clang
armv7a-linux-androideabi27-clang++
armv7a-linux-androideabi28-clang
armv7a-linux-androideabi28-clang++
armv7a-linux-androideabi29-clang
armv7a-linux-androideabi29-clang++
armv7a-linux-androideabi30-clang
armv7a-linux-androideabi30-clang++
armv7a-linux-androideabi31-clang
armv7a-linux-androideabi31-clang++
clang
clang++
clang-12
clang-check
clang-cl
clang-format
clang-tidy
clangd
dsymutil
git-clang-format
i686-linux-android-as
i686-linux-android16-clang
i686-linux-android16-clang++
i686-linux-android17-clang
i686-linux-android17-clang++
i686-linux-android18-clang
i686-linux-android18-clang++
i686-linux-android19-clang
i686-linux-android19-clang++
i686-linux-android21-clang
i686-linux-android21-clang++
i686-linux-android22-clang
i686-linux-android22-clang++
i686-linux-android23-clang
i686-linux-android23-clang++
i686-linux-android24-clang
i686-linux-android24-clang++
i686-linux-android26-clang
i686-linux-android26-clang++
i686-linux-android27-clang
i686-linux-android27-clang++
i686-linux-android28-clang
i686-linux-android28-clang++
i686-linux-android29-clang
i686-linux-android29-clang++
i686-linux-android30-clang
i686-linux-android30-clang++
i686-linux-android31-clang
i686-linux-android31-clang++
ld
ld.lld
ld64.lld
lld
lld-link
lldb
lldb-argdumper
lldb.sh
llvm-addr2line
llvm-ar
llvm-as
llvm-cfi-verify
llvm-config
llvm-cov
llvm-cxxfilt
llvm-dis
llvm-dwarfdump
llvm-dwp
llvm-lib
llvm-link
llvm-lipo
llvm-modextract
llvm-nm
llvm-objcopy
llvm-objdump
llvm-profdata
llvm-ranlib
llvm-rc
llvm-readelf
llvm-readobj
llvm-size
llvm-strings
llvm-strip
llvm-symbolizer
pbcopy
sancov
sanstats
scan-build
scan-view
x86_64-linux-android-as
x86_64-linux-android21-clang
x86_64-linux-android21-clang++
x86_64-linux-android22-clang
x86_64-linux-android22-clang++
x86_64-linux-android23-clang
x86_64-linux-android23-clang++
x86_64-linux-android24-clang
x86_64-linux-android24-clang++
x86_64-linux-android26-clang
x86_64-linux-android26-clang++
x86_64-linux-android27-clang
x86_64-linux-android27-clang++
x86_64-linux-android28-clang
x86_64-linux-android28-clang++
x86_64-linux-android29-clang
x86_64-linux-android29-clang++
x86_64-linux-android30-clang
x86_64-linux-android30-clang++
x86_64-linux-android31-clang
x86_64-linux-android31-clang++
yasm
Hm... so you do have version 16 for two platforms; I don't. We have different NDK versions and it seems like this incompatibility is new.
Can we as a stopgap document the proper NDK version that works with Ebiten, including instructions how to install that one?
I'll try to investigate the mechanism which NDK is chosen.
But, could you try ebitenmobile bind instead of ebitenmobile build? You tried build at
https://github.com/hajimehoshi/ebiten/issues/2085#issuecomment-1114908765
Sorry, I don't see the word "build" in https://github.com/hajimehoshi/ebiten/issues/2085#issuecomment-1114917037 - I thought I already reran with "bind"?
I have the following v1 ones:
[rpolzer@brlogenshfegle aaaaxy (git)-[mobile]-]$ ls /home/rpolzer/Android/Sdk/ndk/24.0.8215888/toolchains/llvm/prebuilt/linux-x86_64/bin/*21*
/home/rpolzer/Android/Sdk/ndk/24.0.8215888/toolchains/llvm/prebuilt/linux-x86_64/bin/aarch64-linux-android21-clang
/home/rpolzer/Android/Sdk/ndk/24.0.8215888/toolchains/llvm/prebuilt/linux-x86_64/bin/aarch64-linux-android21-clang++
/home/rpolzer/Android/Sdk/ndk/24.0.8215888/toolchains/llvm/prebuilt/linux-x86_64/bin/armv7a-linux-androideabi21-clang
/home/rpolzer/Android/Sdk/ndk/24.0.8215888/toolchains/llvm/prebuilt/linux-x86_64/bin/armv7a-linux-androideabi21-clang++
/home/rpolzer/Android/Sdk/ndk/24.0.8215888/toolchains/llvm/prebuilt/linux-x86_64/bin/i686-linux-android21-clang
/home/rpolzer/Android/Sdk/ndk/24.0.8215888/toolchains/llvm/prebuilt/linux-x86_64/bin/i686-linux-android21-clang++
/home/rpolzer/Android/Sdk/ndk/24.0.8215888/toolchains/llvm/prebuilt/linux-x86_64/bin/x86_64-linux-android21-clang
/home/rpolzer/Android/Sdk/ndk/24.0.8215888/toolchains/llvm/prebuilt/linux-x86_64/bin/x86_64-linux-android21-clang++
So yes, it should work...
I'm not 100% sure but this might be an issue in gomobile rather than ebitenmobile, as ebitenmobile is just a (thin?) wrapper of gomobile.
Yes, I do suspect this to be a gomobile issue of some sort - however "gomobile build" does work.
My suspicion is that some code already runs before the -apiversion flag is parsed.
On gomobile i found:
https://github.com/golang/go/issues/52470
This issue also looks daunting: https://github.com/golang/go/issues/38439 - if this is true, then gomobile apps are no longer allowed on the Play Store due to only supporting the old v1 signature scheme.
https://github.com/golang/go/issues/52470
Thanks. I reviewed the proposal. I'll update Ebiten as soon as possible after this is merged.
I am not 100% convinced the proposed fix will be sufficient here, as I still don't know why it doesn't use the -androidapi flag and don't see anything in there fixing that part. I'm going to see if I can try out the change though.
Seems like the proposed fix does help, although one still needs to pass -androidapi (but at least it's finally being respected):
[rpolzer@carbonic go-inovation (git)-[main]-]$ EBITENMOBILE_GOMOBILE=$HOME/src/gomobile ANDROID_NDK_HOME=$HOME/Android/Sdk/ndk/24.0.8215888 ~/src/ebiten/cmd/ebitenmobile/ebitenmobile bind -target android -javapkg com.hajimehoshi.goinovation -o ./mobile/android/inovation/inovation.aar ./mobile
gomobile: ANDROID_NDK_HOME specifies /home/rpolzer/Android/Sdk/ndk/24.0.8215888, which is unusable: unsupported API version 16 (not in 19..32)
[rpolzer@carbonic go-inovation (git)-[main]-]$ EBITENMOBILE_GOMOBILE=$HOME/src/gomobile ANDROID_NDK_HOME=$HOME/Android/Sdk/ndk/24.0.8215888 ~/src/ebiten/cmd/ebitenmobile/ebitenmobile bind -target android -javapkg com.hajimehoshi.goinovation -o ./mobile/android/inovation/inovation.aar -androidapi 21 ./mobile
[rpolzer@carbonic go-inovation (git)-[main]-]$ ls -la ./mobile/android/inovation/inovation.aar
-rw-r----- 1 rpolzer primarygroup 27998497 May 3 05:31 ./mobile/android/inovation/inovation.aar
Can we change the default of -androidapi to match the current NDK, or at least document this need in Ebiten's gomobile docs?
19 does seem to work as well, even though some platforms only start at 21. But IIRC this is because gomobile has hardcoded min versions per platform.
Can we change the default of -androidapi to match the current NDK
Which gomobile or ebitenmobile are you suggesting to fix?
As I do not know the design of the two well enough - not sure. I'd hope for gomobile to fix this (i.e. it should "work out of the box" with whatever the current NDK is, especially given Android Studio does not offer previous NDK versions to install), but of course gomobile also has a desire to support previous NDKs - so just hardcoding the new minimum versions won't be quite right. Not sure if they have some compatibility promise that also prevents them from just bumping the flag default of -androidapi.
What Ebiten however can do is to just start using the -androidapi option in the documentation's example command lines. In many cases you'll want to set that explicitly anyway (e.g. the Google Play docs say that to get on the Play Store, you soon need to target at least version 31: https://developer.android.com/google/play/requirements/target-sdk).
| gharchive/issue | 2022-05-02T13:23:39 | 2025-04-01T06:38:54.418647 | {
"authors": [
"divVerent",
"hajimehoshi"
],
"repo": "hajimehoshi/ebiten",
"url": "https://github.com/hajimehoshi/ebiten/issues/2085",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
} |
1771746262 | Fix antialias region size
What issue is this addressing?
Closes #2679
What type of issue is this addressing?
bug
What this PR does | solves
Fixes coordinate transformation math on dstRegion.
Previous code had x1 and y1 shifted by exactly (i.region.X, i.region.Y), which allowed it to exceed the width and height of the destination image. iOS MTLDebugRenderCommandEncoder detects this issue and crashes.
No known test case for OpenGL or similar, as the rectangle is always strictly larger than or equal to the correct rectangle.
It is POSSIBLE, but not reproduced, that this can in OpenGL cause DrawTriangles to draw on a different image than the destination image if they share an atlas. However, as this depends a lot on how the atlas is built, I am not sure if it can actually be triggered reliably, or even at all.
LGTM.
If we find it hard to add a test, it's OK, let's merge this without tests. What do you think?
I tried finding a test case for this by detecting overwritten pixels on other textures; however, it appears that the texture antialiasing uses doesn't end up on the atlas of the NewImage textures. I don't know why.
If this can't be done, then yeah, let's merge this without test cases. At the very least we have test cases to ensure this isn't a regression.
A destination image is often a different texture from the source, and I understand it pretty hard to make a reliable test. OK let's merge this without a test.
Found the reason: https://github.com/hajimehoshi/ebiten/blob/main/internal/ui/image.go#L91
So the backing offscreen is only ever unmanaged or volatile. And only regular images ever end up on an atlas.
So there's never any texture sharing, and in OpenGL, writing out of bounds of a texture is harmless - so yeah, no test case possible, except if we can produce the Metal crash by forcing the same debug backend. All good then - thanks!
The other fact that makes this "untestable" is that region.X and region.Y are elsewhere clamped to be at least zero. Thus, the wrong region is only ever bigger, never smaller, than the correct region, making the bug 100% harmless on OpenGL backends, except maybe a bit more wasteful regarding dirtying of texture space or similar.
| gharchive/pull-request | 2023-06-23T16:40:20 | 2025-04-01T06:38:54.424852 | {
"authors": [
"divVerent",
"hajimehoshi"
],
"repo": "hajimehoshi/ebiten",
"url": "https://github.com/hajimehoshi/ebiten/pull/2681",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
} |
278766375 | Code of Conduct file.
A Code of Conduct file should be created.
Useful links:
Opensource.guide
Github Help
I'll first need to learn what that is :D
I left some useful links :)
| gharchive/issue | 2017-12-03T11:44:24 | 2025-04-01T06:38:54.427139 | {
"authors": [
"BTaskaya",
"hakancelik96"
],
"repo": "hakancelik96/coogger",
"url": "https://github.com/hakancelik96/coogger/issues/15",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
315233273 | Inlining expected files to pass the 0-test, import feature and hk-maple
As per https://github.com/hakaru-dev/hakaru/pull/154
The issues with failing the 0-test were solved by inlining everything. Most of the tests I've written related to the exponential distribution had the same issues, and I solved them the same way.
Ideally, we would have used an import feature to call distributions directly from stdlib to write these test case files. I'd expect the exact same issues to arise in this case, making import useless for writing expected files. Isn't this inability to automatically in-line a problem then?
Not an issue for our project, but something I wanted to bring up in case it hasn't been considered.
Writing expected test files is inherently a manual thing. You're testing that Simplify does all sorts of operations correctly, including inlining. So you need to write the expected 'by hand'. You can, of course, use some machine assistance in deriving things.
| gharchive/issue | 2018-04-17T20:59:30 | 2025-04-01T06:38:54.429387 | {
"authors": [
"JacquesCarette",
"maymoo99"
],
"repo": "hakaru-CS4ZP6/hakaru",
"url": "https://github.com/hakaru-CS4ZP6/hakaru/issues/49",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
} |
187200837 | config/markdown.php not created
I have installed it in Laravel 5.3 but it's not working; even config/markdown.php was not created.
Is there any solution?
Have you pulled in the latest version? (0.3.0)
I've noticed that in the documentation I'm still referring to 0.1.* but during that version Laravel 5.3 wasn't out yet. I will update the documentation shortly.
| gharchive/issue | 2016-11-03T21:54:28 | 2025-04-01T06:38:54.436380 | {
"authors": [
"haleksandre",
"sidor555"
],
"repo": "haleks/laravel-markdown",
"url": "https://github.com/haleks/laravel-markdown/issues/8",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
1933474905 | Website is down
I went to check the documentation and realized the website is down. It seems to be down for everyone 😞
Status on Is It Down Right Now
It's working now 😅 I'll close the issue.
| gharchive/issue | 2023-10-09T17:04:24 | 2025-04-01T06:38:54.439917 | {
"authors": [
"toridoriv"
],
"repo": "halfmoonui/halfmoon",
"url": "https://github.com/halfmoonui/halfmoon/issues/142",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1329279619 | ⚠️ ClamAV has degraded performance
In 2418bae, ClamAV (https://spamassassin.apache.org/) experienced degraded performance:
HTTP code: 200
Response time: 88 ms
Resolved: ClamAV performance has improved in 04cc147.
| gharchive/issue | 2022-08-04T23:38:25 | 2025-04-01T06:38:54.468233 | {
"authors": [
"hamboneZA"
],
"repo": "hamboneZA/caffeine",
"url": "https://github.com/hamboneZA/caffeine/issues/2108",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1565130694 | ⚠️ ClamAV has degraded performance
In 5a7d21c, ClamAV (https://spamassassin.apache.org/) experienced degraded performance:
HTTP code: 200
Response time: 88 ms
Resolved: ClamAV performance has improved in f84d4f1.
| gharchive/issue | 2023-01-31T23:08:01 | 2025-04-01T06:38:54.470692 | {
"authors": [
"hamboneZA"
],
"repo": "hamboneZA/caffeine",
"url": "https://github.com/hamboneZA/caffeine/issues/5297",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1571690881 | ⚠️ ClamAV has degraded performance
In 9c51126, ClamAV (https://spamassassin.apache.org/) experienced degraded performance:
HTTP code: 200
Response time: 179 ms
Resolved: ClamAV performance has improved in f3f99d9.
| gharchive/issue | 2023-02-06T00:33:52 | 2025-04-01T06:38:54.473110 | {
"authors": [
"hamboneZA"
],
"repo": "hamboneZA/caffeine",
"url": "https://github.com/hamboneZA/caffeine/issues/5412",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1609731382 | ⚠️ ClamAV has degraded performance
In 906858c, ClamAV (https://spamassassin.apache.org/) experienced degraded performance:
HTTP code: 200
Response time: 68 ms
Resolved: ClamAV performance has improved in 64537ab.
| gharchive/issue | 2023-03-04T11:20:23 | 2025-04-01T06:38:54.475658 | {
"authors": [
"hamboneZA"
],
"repo": "hamboneZA/caffeine",
"url": "https://github.com/hamboneZA/caffeine/issues/5943",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1728834052 | ⚠️ ClamAV has degraded performance
In 617146e, ClamAV (https://spamassassin.apache.org/) experienced degraded performance:
HTTP code: 200
Response time: 64 ms
Resolved: ClamAV performance has improved in 8999f2e.
| gharchive/issue | 2023-05-27T16:53:40 | 2025-04-01T06:38:54.478022 | {
"authors": [
"hamboneZA"
],
"repo": "hamboneZA/caffeine",
"url": "https://github.com/hamboneZA/caffeine/issues/7943",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1236126713 | ⚠️ ClamAV has degraded performance
In 4d07316, ClamAV (https://spamassassin.apache.org/) experienced degraded performance:
HTTP code: 200
Response time: 68 ms
Resolved: ClamAV performance has improved in 123ec36.
| gharchive/issue | 2022-05-14T21:39:56 | 2025-04-01T06:38:54.480375 | {
"authors": [
"hamboneZA"
],
"repo": "hamboneZA/caffeine",
"url": "https://github.com/hamboneZA/caffeine/issues/875",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1842407529 | ⚠️ ClamAV has degraded performance
In fd872d4, ClamAV (https://spamassassin.apache.org/) experienced degraded performance:
HTTP code: 200
Response time: 480 ms
Resolved: ClamAV performance has improved in 2b4f5d6.
| gharchive/issue | 2023-08-09T03:46:43 | 2025-04-01T06:38:54.482694 | {
"authors": [
"hamboneZA"
],
"repo": "hamboneZA/caffeine",
"url": "https://github.com/hamboneZA/caffeine/issues/9809",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
509482780 | Could you create an example that has authentication and authorization?
Could you create an example that has authentication and authorization?
Not using Auth0.
Yup, it’s coming soon! I’m going to use Netlify Identity.
looking forward to it.
| gharchive/issue | 2019-10-19T17:19:51 | 2025-04-01T06:38:54.487780 | {
"authors": [
"cannikin",
"zwl1619"
],
"repo": "hammerframework/example-blog",
"url": "https://github.com/hammerframework/example-blog/issues/2",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
482333247 | Updated to not error out on undefined window and document
Allows hammer to be used alongside web workers. See rollup.config.js for the full changes - the rest is just a rebuild of files.
This has been applied in proposed 2.1.0: https://github.com/squadette/hammer.js/issues/1
Thank you,
| gharchive/pull-request | 2019-08-19T14:06:31 | 2025-04-01T06:38:54.489125 | {
"authors": [
"Niko-Kk",
"squadette"
],
"repo": "hammerjs/hammer.js",
"url": "https://github.com/hammerjs/hammer.js/pull/1228",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
178693837 | Adding vaxrank HTML/PDF output, with some associated changes
refactored report generation to create a dictionary of values which then gets fed into ASCII/HTML/whatever template
added an HTML template and an ASCII template that's very close to previous output (only difference should be whitespace)
added report outputs to .gitignore
updated some requirements files
Fixes #3 - also see PDF for what this looks like
vaccine-peptides-report.pdf
If you rebase this on master, does Travis still fail? It's related to the recent biotypes changes @iskandr just made.
Just a random thought in case it turns out to be related to the tempfile
issue: if you are using tempfile.NamedTemporaryFile and haven't closed the
file handle after writing to it, you'll want to call .flush() on the handle
before passing its name to another tool. Otherwise the output may not have
made it into the file yet due to buffering.
On Fri, Sep 23, 2016 at 11:51 AM, Julia Kodysh notifications@github.com
wrote:
@julia326 commented on this pull request.
In vaxrank/report.py https://github.com/hammerlab/vaxrank/pull/18:
variants,
bam_path,
html_report_path,
pdf_report_path=None):
template = JINJA_ENVIRONMENT.get_template('templates/template.html')
template_data = compute_template_data(
ranked_variants_with_vaccine_peptides,
mhc_alleles,
variants,
bam_path)
with open(html_report_path, "w") as f:
f.write(template.render(template_data))
+def make_pdf_report(
template_data,
pdf_report_path):
path = "%s.html" % uuid.uuid4()
I tried to make it work with tempfile first, but pdfkit didn't seem to
like reading from a temp HTML file. I'll play around with it some more,
probably a weird permissions thing
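A minimal Python sketch of the flush() advice above; pdfkit stands in for "another tool" reading the file by name, its from_file call should be checked against its docs, and this temp-file approach may not work on Windows, where an open temp file cannot be reopened by another process:
import tempfile
import pdfkit  # the tool discussed in this PR; verify its API before relying on this sketch

def render_pdf(html_text, pdf_path):
    with tempfile.NamedTemporaryFile(mode="w", suffix=".html") as f:
        f.write(html_text)
        f.flush()  # make sure buffered output reaches disk before another process reads it
        pdfkit.from_file(f.name, pdf_path)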
@timodonnell that was indeed an issue, thank you!
LGTM! Merge away 👏 First PR = major need for the pipeline.
Coverage decreased (-6.8%) to 45.594% when pulling 453dbfadb9a98565323c394717b3c5ff5865aa3d on vaxrank-html into e2a1d9458f68997b3ad17d755a7b04ddab422622 on master.
| gharchive/pull-request | 2016-09-22T19:02:30 | 2025-04-01T06:38:54.499878 | {
"authors": [
"coveralls",
"ihodes",
"julia326",
"timodonnell"
],
"repo": "hammerlab/vaxrank",
"url": "https://github.com/hammerlab/vaxrank/pull/18",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
} |
2604160202 | I tried deploying the latest repo on Cloudflare twice and both attempts failed
For some reason, after updating to the latest code on my own branch and redeploying on CF, it still fails! The attachment is my log; I hope you can take a look. Thanks.
nezha-dash.5796d95e-8fe3-4794-8b87-76132d0ebb23.log
You need to use the cloudflare branch in order to deploy on CF Pages.
https://buycoffee.top/blog/tech/nezha-cloudflare You can follow this tutorial; all the configuration options needed for a CF deployment are covered in the article.
Done, thanks!
| gharchive/issue | 2024-10-22T03:54:36 | 2025-04-01T06:38:54.502161 | {
"authors": [
"cigar",
"hamster1963"
],
"repo": "hamster1963/nezha-dash",
"url": "https://github.com/hamster1963/nezha-dash/issues/85",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
1064930248 | Thank you very much for your work. I am very interested in your work. I hope you can help explain the meaning of this sentence in the code. The code is in the script 'test_runner.py' at line 56, as follows: pred_rgb[label == 255] = np.array((0, 0, 0))
Thank you very much for your work. I am very interested in your work. I hope you can help explain the meaning of this sentence in the code. The code is in the script 'test_runner.py' at line 56, as follows:
pred_rgb[label == 255] = np.array((0, 0, 0))
Thank you for your attention. pred_rgb[label == 255] = np.array((0, 0, 0)) is used to visualize the unlabeled pixels; note that unlabeled pixels are not involved in the calculation of the final result.
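A small NumPy sketch of that line in context; the shapes are assumptions (pred_rgb is an HxWx3 colorized prediction, label is the HxW ground-truth map, and 255 marks unlabeled pixels):
import numpy as np

def mask_unlabeled(pred_rgb, label, ignore_index=255):
    # Paint unlabeled pixels black for visualization only; these pixels are
    # already excluded from the metric computation, so only the saved image changes.
    out = pred_rgb.copy()
    out[label == ignore_index] = np.array((0, 0, 0), dtype=out.dtype)
    return out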
Thank you very much for taking some time out of your busy schedule to answer my questions
| gharchive/issue | 2021-11-27T07:41:33 | 2025-04-01T06:38:54.530956 | {
"authors": [
"1725917163",
"hanchaoleng"
],
"repo": "hanchaoleng/ShapeConv",
"url": "https://github.com/hanchaoleng/ShapeConv/issues/3",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
659023671 | Issue with HanLP.java accessing resource files via absolute paths
Hello, I've recently run into some problems while using HanLP. I'm using the Java 1.7.7 version. When loading HanLP's resource files, I found that the source code accesses them via absolute paths rather than via an InputStream, which causes cross-jar problems when I depend on it. Of course, I can place the data resource files in the root directory you suggest (or elsewhere under Tomcat) and access them that way. I'd like to ask: has HanLP.java been updated to load resources via an InputStream? I couldn't find a related question on the forum, so I'm asking here!
[Auto-reply] Hello, and thanks for the feedback.
Because this issue was not filled out as required, it has been temporarily marked as invalid. Please revise it carefully according to the issue template, then wait patiently for it to be handled next weekend.
If you are reporting a bug or a feature request, please revise it according to the template and reopen this issue.
Otherwise, for consultation-type questions, please search first. 99.9% of questions have already been answered repeatedly; humans are essentially repeating machines. If you are confident you have raised a new question, you are welcome to write up the full context and post it on the forum rather than on GitHub.
If you do not revise it, I'm afraid no one will be able to reply to you, and we will also miss an opportunity to improve or to have a friendly discussion. Maintaining an open-source project is not easy; I can only handle friendly questions on weekends. Sorry for any inconvenience this causes.
| gharchive/issue | 2020-07-17T08:39:52 | 2025-04-01T06:38:54.570421 | {
"authors": [
"hankcs",
"zxxvscs"
],
"repo": "hankcs/HanLP",
"url": "https://github.com/hankcs/HanLP/issues/1510",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
} |
938664972 | Is this traditional-Chinese conversion appropriate?
糊=煳
[Auto-reply] Hello, and thanks for the feedback.
Because this issue was not filled out as required, it has been automatically closed and will be automatically deleted within one hour.
If you are reporting a bug or a feature request, please revise it carefully according to the issue template, then wait patiently for it to be handled next weekend.
Otherwise, consultation-type questions are not accepted on GitHub at all. Please search first; 99.9% of questions have already been answered repeatedly; humans are essentially repeating machines. If you are confident you have raised a new question, you are welcome to write up the full context and post it on the forum rather than on GitHub.
Maintaining an open-source project is not easy; HanLP has users in the hundreds of thousands, while I am the only main maintainer. I have my own full-time research work and can only handle friendly questions on weekends. Sorry for any inconvenience this causes.
| gharchive/issue | 2021-07-07T09:05:38 | 2025-04-01T06:38:54.572807 | {
"authors": [
"datalee",
"hankcs"
],
"repo": "hankcs/HanLP",
"url": "https://github.com/hankcs/HanLP/issues/1662",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
} |