| id | text | source | created | added | metadata |
| --- | --- | --- | --- | --- | --- |
| stringlengths 4–10 | stringlengths 4–2.14M | stringclasses 2 values | timestamp[s]date 2001-05-16 21:05:09 – 2025-01-01 03:38:30 | stringdate 2025-04-01 04:05:38 – 2025-04-01 07:14:06 | dict |
499806791
|
Struggling with chaining network requests
Hi there, I've just started using PromiseKit, and am struggling with chaining requests together instead of embedding them...
I'm using PromiseKit 6.11.0 and PMKFoundation 3.3.3 (using SwiftPM in Xcode 11) targeting iOS 13 in a sample project.
I'm using the JSONPlaceholder API to try and get things working. I've created a networking engine that has 2 methods:
import Foundation
import PromiseKit
class JSONPlaceholderNetworkEngine: NetworkEngine {
func getPosts() -> Promise<GetPostsRequest.Response> {
return execute(GetPostsRequest())
}
func getPost(id: Int) -> Promise<GetPostRequest.Response> {
return execute(GetPostRequest(id: id))
}
}
The base class execute method looks like this:
func execute<Request: NetworkRequest>(_ request: Request) -> Promise<Request.Response> {
guard let urlRequest = request.urlRequest(schemeOverride: scheme, hostOverride: host, additionalHeaders: defaultHeaders) else {
return Promise(error: NetworkError.badURLRequest)
}
return session.dataTask(.promise, with: urlRequest).compactMap { (data, response) -> Request.Response? in
guard self.validate(response) else { throw NetworkError.invalidResponse }
return try request.jsonDecoder.decode(Request.Response.self, from: data)
}
}
NetworkRequests have an associated type, Response, which is Decodable.
What I'm trying to accomplish is, in the ViewController viewDidLoad method, load the posts using getPosts(), then once they are loaded, pick a random one and load its details using getPost(id: randomPost.id), like this:
jsonNetworkEngine.getPosts().then { posts in
let randomPost = posts.randomElement()!
print("Got \(posts.count) posts! Choosing post \(randomPost.id) to load details:")
self.jsonNetworkEngine.getPost(id: randomPost.id)
}.then { post in
print(post)
}.catch { error in
print("Error: \(error)")
}
But it errors at the first line, jsonNetworkEngine.getPosts().then { posts in, highlighting the { and saying "Unable to infer complex closure return type; add explicit type to disambiguate".
I can make it work if I write the code like this:
jsonNetworkEngine.getPosts().done { posts in
let randomPost = posts.randomElement()!
print("Got \(posts.count) posts! Choosing post \(randomPost.id) to load details:")
self.jsonNetworkEngine.getPost(id: randomPost.id).done { post in
print(post)
}.catch { error in
print("Error: \(error)")
}
}.catch { error in
print("Error: \(error)")
}
But that seems to defeat the whole point of avoiding "callback hell" if all the subsequent calls need to be nested?
Please see our Troubleshooting Guide.
Ah, it was the return type! I did read the troubleshooting guide and was trying to specify the type, but that still wasn't working. Somehow missed the return part though. Thanks for the incredibly fast reply!
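For readers landing here later, the resolved chain presumably looks like the sketch below (based on the resolution: the then closure gets an explicit return type and a return, and the second step becomes done since it produces no further promise — this is not code from the thread itself):

```swift
jsonNetworkEngine.getPosts().then { posts -> Promise<GetPostRequest.Response> in
    let randomPost = posts.randomElement()!
    print("Got \(posts.count) posts! Choosing post \(randomPost.id) to load details:")
    // Returning the promise (with the closure's return type spelled out)
    // is what lets the compiler infer the chain.
    return self.jsonNetworkEngine.getPost(id: randomPost.id)
}.done { post in
    print(post)
}.catch { error in
    print("Error: \(error)")
}
```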
|
gharchive/issue
| 2019-09-28T18:40:29 |
2025-04-01T06:39:42.286483
|
{
"authors": [
"danmartyn",
"mxcl"
],
"repo": "mxcl/PromiseKit",
"url": "https://github.com/mxcl/PromiseKit/issues/1098",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
139516525
|
Fix Bolts framework reference in PMKiOSCategoryTests
I was unable to run the PMKiOSCategoryTests. It seemed that the Bolts.framework was missing in the Copy Files build phase.
I believe I ended up addressing this myself recently when I ran into the same issue without remembering that this PR existed. So my apologies.
Closed by https://github.com/mxcl/PromiseKit/commit/3b079f6dd6f0c67d85c6ef02f9ed42804dfe4171
|
gharchive/pull-request
| 2016-03-09T09:00:31 |
2025-04-01T06:39:42.288238
|
{
"authors": [
"lammertw",
"nathanhosselton"
],
"repo": "mxcl/PromiseKit",
"url": "https://github.com/mxcl/PromiseKit/pull/373",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
365567511
|
updated code/ dir with new python file
Fixes or introduces:
Proposed Changes
Updated readme file
Added a new py file inside code/.
Thanks for contributing (and putting it in alphabetical order)!
|
gharchive/pull-request
| 2018-10-01T17:38:41 |
2025-04-01T06:39:42.296386
|
{
"authors": [
"Akshay-N-Shaju",
"lukeoliff"
],
"repo": "my-first-pr/hacktoberfest-2018",
"url": "https://github.com/my-first-pr/hacktoberfest-2018/pull/33",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
368000720
|
Update README.md
:warning: This is a Pull Request Template :warning:
Check off all the things you've done by placing an *x* within the square brackets.
Proposed Changes
[ ] I've forked the repository.
[ ] I've created a branch and made my changes in it.
[ ] I've read the CODE OF CONDUCT and abide by it.
[ ] I've read the CONTRIBUTING.md
[ ] I understand opening a PULL REQUEST doesn't mean it will be merged for sure.
Nice work, thank you! 🙌
|
gharchive/pull-request
| 2018-10-09T00:46:16 |
2025-04-01T06:39:42.299426
|
{
"authors": [
"joshcanhelp",
"sirocanabarro"
],
"repo": "my-first-pr/hacktoberfest-2018",
"url": "https://github.com/my-first-pr/hacktoberfest-2018/pull/349",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1844064464
|
🛑 Website Load Balancers is down
In 547df2f, Website Load Balancers ($WEB_SERVER) was down:
HTTP code: 522
Response time: 15983 ms
Resolved: Website Load Balancers is back up in 910b215.
|
gharchive/issue
| 2023-08-09T21:43:56 |
2025-04-01T06:39:42.306829
|
{
"authors": [
"myavsuper"
],
"repo": "myavsuper/upptime",
"url": "https://github.com/myavsuper/upptime/issues/314",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
369387362
|
missing property in xml file
-- bug
in file mybatis-3/src/test/java/org/apache/ibatis/binding/BoundAuthorMapper.xml line 42
<result column="favourite_section" javaType="org.apache.ibatis.domain.blog.Section"/> should add property field like below
<result column="favourite_section" property="favouriteSection" javaType="org.apache.ibatis.domain.blog.Section"/>
-- test
in file mybatis-3/src/test/java/org/apache/ibatis/binding/BindingTest.java line 374 ,test_method :shouldExecuteBoundSelectBlogUsingConstructorWithResultMapAndProperties add test code assertNotNull("author favouriteSection should not be null", blog.getAuthor().getFavouriteSection());
to test it
Thank you. :)
|
gharchive/issue
| 2018-10-12T03:39:49 |
2025-04-01T06:39:42.311793
|
{
"authors": [
"harawata",
"huanqingdong"
],
"repo": "mybatis/mybatis-3",
"url": "https://github.com/mybatis/mybatis-3/issues/1369",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
35133400
|
resultMap bug
mapper :
<resultMap id="user" type="User">
<id property="id" column="id"/>
<result property="name" column="name"/>
<association property="superior" resultMap="user" columnPrefix="superior_" />
</resultMap>
<select id="find" parameterType="long" resultMap="user">
select 1 as id, 'a' as name, 2 as superior_id , 'b' as superior_name from dual
</select>
java :
User user = dao.find(1L);
System.out.println("user.id : " + user.getId());
System.out.println("user.name : " + user.getName());
System.out.println("user.superior.id : " + user.getSuperior().getId());
System.out.println("user.superior.name : " + user.getSuperior().getName());
result :
[DEBUG] ==> Preparing: select 1 as id, 'a' as name, 2 as superior_id , 'b' as superior_name from dual [org.ufox.demo.dao.UserDao.find]
[DEBUG] ==> Parameters: [org.ufox.demo.dao.UserDao.find]
[DEBUG] <== Total: 1 [org.ufox.demo.dao.UserDao.find]
user.id : 1
user.name : a
user.superior.id : 1
user.superior.name : a
A test case contributed by another user in the forum seems to be the test case for this issue, so I'll reopen.
|
gharchive/issue
| 2014-06-06T10:03:11 |
2025-04-01T06:39:42.315816
|
{
"authors": [
"harawata",
"ooknight"
],
"repo": "mybatis/mybatis-3",
"url": "https://github.com/mybatis/mybatis-3/issues/215",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
180511877
|
Can we change StatementHandler.query(Statement, ResultHandler) into StatementHandler.query(Statement)?
The parameter ResultHandler is not used in the implementing classes. As a test, I removed it in my local project and everything still works.
Modifying an existing public interface could break backward compatibility, so we avoid it unless there is a very good reason.
|
gharchive/issue
| 2016-10-02T13:17:06 |
2025-04-01T06:39:42.317292
|
{
"authors": [
"harawata",
"luoxn28"
],
"repo": "mybatis/mybatis-3",
"url": "https://github.com/mybatis/mybatis-3/issues/797",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
2739771659
|
[★𝐍𝐎𝐖★] 𝟓 𝐆𝐢𝐫𝐥𝐬 𝟓 𝐑𝐨𝐜𝐤𝐞𝐭 Video 𝙻𝚎𝚊𝚔𝚎𝚍 Viral 𝘖𝘳𝘪𝘨𝘪𝘯𝘢𝘭 Clips LINK On Social Media X Twitter
𝟓 𝐆𝐢𝐫𝐥𝐬 𝟓 𝐑𝐨𝐜𝐤𝐞𝐭, a young and talented digital creator, recently gained widespread attention on social media platforms with her viral video. The video quickly became a trending topic across various platforms, sparking a significant amount of discussion among viewers. As a rising star in the digital world, 𝟓 𝐆𝐢𝐫𝐥𝐬 𝟓 𝐑𝐨𝐜𝐤𝐞𝐭's creativity and content have captivated audiences, contributing to her growing popularity on platforms like X and Twitter.
This viral moment has sparked conversations about the impact of digital content on social media trends. 𝟓 𝐆𝐢𝐫𝐥𝐬 𝟓 𝐑𝐨𝐜𝐤𝐞𝐭's video, while generating attention, highlights the growing influence of young content creators who are reshaping the landscape of online media. With her engaging presence, 𝟓 𝐆𝐢𝐫𝐥𝐬 𝟓 𝐑𝐨𝐜𝐤𝐞𝐭 continues to inspire others to explore the possibilities of digital creation and interaction on social platforms.
|
gharchive/issue
| 2024-12-14T10:49:36 |
2025-04-01T06:39:42.408260
|
{
"authors": [
"ahidiqba",
"amirashazfas",
"shazfaalif",
"zxroscoe"
],
"repo": "myhhub/stock",
"url": "https://github.com/myhhub/stock/issues/203",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1523359007
|
Bug on Export
Plugin version
latest
WooCommerce version
latest
WordPress version
latest
PHP version
8.1 (not sure)
What went wrong?
"0 - data.shipments[0].recipient.cc - Unknown error: {"fields":["data.shipments[0].recipient.cc"],"human":["data.shipments[0].recipient.cc"]}. Please contact MyParcel."
On export
Reproduction steps
Navigate to ...
Click on ...
See ...
Relevant log output
2023-01-07T00:05:50+00:00 DEBUG *** Creating shipments started ***
2023-01-07T00:05:50+00:00 DEBUG Shipment data for order 170.
2023-01-07T00:05:50+00:00 DEBUG export_order: "0 - data.shipments[0].recipient.cc - Unknown error: {\"fields\":[\"data.shipments[0].recipient.cc\"],\"human\":[\"data.shipments[0].recipient.cc\"]}. Please contact MyParcel."
2023-01-07T00:07:45+00:00 DEBUG *** Creating shipments started ***
2023-01-07T00:07:45+00:00 DEBUG Shipment data for order 170.
2023-01-07T00:07:45+00:00 DEBUG export_order: "0 - data.shipments[0].recipient.cc - Unknown error: {\"fields\":[\"data.shipments[0].recipient.cc\"],\"human\":[\"data.shipments[0].recipient.cc\"]}. Please contact MyParcel."
2023-01-07T00:23:07+00:00 DEBUG *** Creating shipments started ***
2023-01-07T00:23:07+00:00 DEBUG Shipment data for order 170.
2023-01-07T00:23:08+00:00 DEBUG export_order: "0 - data.shipments[0].recipient.cc - Unknown error: {\"fields\":[\"data.shipments[0].recipient.cc\"],\"human\":[\"data.shipments[0].recipient.cc\"]}. Please contact MyParcel."
2023-01-07T00:26:32+00:00 DEBUG *** Creating shipments started ***
2023-01-07T00:26:32+00:00 DEBUG Shipment data for order 170.
2023-01-07T00:26:32+00:00 DEBUG export_order: "0 - data.shipments[0].recipient.cc - Unknown error: {\"fields\":[\"data.shipments[0].recipient.cc\"],\"human\":[\"data.shipments[0].recipient.cc\"]}. Please contact MyParcel."
2023-01-07T00:27:26+00:00 DEBUG *** Creating shipments started ***
2023-01-07T00:27:26+00:00 DEBUG Shipment data for order 186.
2023-01-07T00:27:26+00:00 DEBUG export_order: "0 - data.shipments[0].recipient.cc - Unknown error: {\"fields\":[\"data.shipments[0].recipient.cc\"],\"human\":[\"data.shipments[0].recipient.cc\"]}. Please contact MyParcel."
2023-01-07T00:27:52+00:00 DEBUG *** Creating shipments started ***
2023-01-07T00:27:52+00:00 DEBUG Shipment data for order 186.
2023-01-07T00:27:52+00:00 DEBUG export_order: "0 - data.shipments[0].recipient.cc - Unknown error: {\"fields\":[\"data.shipments[0].recipient.cc\"],\"human\":[\"data.shipments[0].recipient.cc\"]}. Please contact MyParcel."
Additional context
none
Hi @EV-Builder, this should not occur anymore in the v5.0.0 beta versions of the plugin.
Please read this issue for more information and how to report bugs in the new version.
|
gharchive/issue
| 2023-01-07T00:45:10 |
2025-04-01T06:39:42.461635
|
{
"authors": [
"EV-Builder",
"EdieLemoine"
],
"repo": "myparcelnl/woocommerce",
"url": "https://github.com/myparcelnl/woocommerce/issues/943",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1375593228
|
🛑 UHCNO Enterprise is down
In 73e4aee, UHCNO Enterprise (https://uhcno.ent.sirsi.net/client/en_US/uhcno/search/results?te=) was down:
HTTP code: 503
Response time: 311 ms
Resolved: UHCNO Enterprise is back up in 23d7716.
|
gharchive/issue
| 2022-09-16T08:14:04 |
2025-04-01T06:39:42.464891
|
{
"authors": [
"myqua"
],
"repo": "myqua/LOUIS-upptime",
"url": "https://github.com/myqua/LOUIS-upptime/issues/25",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
330371841
|
Unable to get local issuer certificate
When I try to yarn add -D npm-run-all I get an error:
[1/4] 🔍 Resolving packages...
error: An unexpected error occurred: "https://registry.yarnpkg.com/npm-run-all: unable to get local issuer certificate".
info: If you think this is a bug, please open a bug report with the information provided in "/Users/z00221y/Sites/exp-deploy-vue/yarn-error.log".
info: Visit https://yarnpkg.com/en/docs/cli/add for documentation about this command.
I also tried installing via the repo's remote .git URL, but to no avail:
[1/4] 🔍 Resolving packages...
error: Couldn't find package "cross-spawn@^6.0.4" required by "https://github.com/mysticatea/npm-run-all.git" on the "npm" registry.
info: Visit https://yarnpkg.com/en/docs/cli/add for documentation about this command.
error: Couldn't find package "ps-tree@^1.1.0" required by "https://github.com/mysticatea/npm-run-all.git" on the "npm" registry.
error: Couldn't find package "memorystream@^0.3.1" required by "https://github.com/mysticatea/npm-run-all.git" on the "npm" registry.
Here are some pertinent details re: my local environment:
macOS: v10.12.6
Node: v8.11.2
NPM: v6.1.0
Yarn: v1.6.0
Thank you for this issue.
However, I couldn't reproduce it.
Please open an issue on yarn's repo.
|
gharchive/issue
| 2018-06-07T17:44:54 |
2025-04-01T06:39:42.515126
|
{
"authors": [
"mysticatea",
"rafegoldberg"
],
"repo": "mysticatea/npm-run-all",
"url": "https://github.com/mysticatea/npm-run-all/issues/135",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
183503794
|
Fix proxy bugs
This should fix two issues related to using the proxy options.
Hostname is preferred over host
Adds the accompanying host header to the correct url
@mzabriskie Found a few bugs while prototyping with your library
Coverage remained the same at 94.393% when pulling ce1ecdae7a035c144b3726e976c3d1a98caa7bd0 on Jarlotee:patch-1 into 3f8b128da4ab11e34f0b880381f9395b2ab0e22f on mzabriskie:master.
Thanks for the PR!
|
gharchive/pull-request
| 2016-10-17T19:23:11 |
2025-04-01T06:39:42.528239
|
{
"authors": [
"Jarlotee",
"coveralls",
"mzabriskie"
],
"repo": "mzabriskie/axios",
"url": "https://github.com/mzabriskie/axios/pull/491",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
2135749652
|
🛑 mziyut.com is down
In af4f16e, mziyut.com (https://mziyut.com) was down:
HTTP code: 500
Response time: 76 ms
Resolved: mziyut.com is back up in c77ebfe after 11 minutes.
|
gharchive/issue
| 2024-02-15T06:30:07 |
2025-04-01T06:39:42.531329
|
{
"authors": [
"mziyut"
],
"repo": "mziyut/upptime",
"url": "https://github.com/mziyut/upptime/issues/440",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2279065257
|
feat: Update from default-net to rebranded netdev
Description
Upgrades from default-net v0.20 to netdev v0.25, which is simply a rebrand of the original default-net
Allows depending on system-configuration version 0.6 instead of 0.5.1 downstream. This may help iOS compilation.
Breaking Changes
Would need to check if/how much the default-net types were exposed in the API.
Notes & open questions
Not sure yet if it fixes our iOS problems.
Change checklist
[x] Self-review.
[x] Documentation updates if relevant. (no mention of default_net in docs)
[ ] Tests if relevant.
[ ] All breaking changes documented.
Changed items in the public API
===============================
-pub iroh_net::net::interfaces::IpNet::V4(netdev::ip::Ipv4Net)
+pub iroh_net::net::interfaces::IpNet::V4(default_net::ip::Ipv4Net)
-pub iroh_net::net::interfaces::IpNet::V6(netdev::ip::Ipv6Net)
+pub iroh_net::net::interfaces::IpNet::V6(default_net::ip::Ipv6Net)
interfaces should probably not even be exposed as part of the public api, but in any case, here is what changed
This fixes our iOS build together with the instructions from here: https://iroh.computer/docs/examples/ios-starter#add-the-system-configuration-framework
It seems like it's been resolved and was released in version 0.16.
iroh-net already depends on version 0.20 at the moment, so should the macOS logic be adjusted, or is the original issue not actually fixed?
yeah, I think this just never got rechecked after that fix was done
I'd be happy to contribute a change that removes the logic that's possibly subsumed by the upstream fix, but don't have a macOS device to test on :confused:
|
gharchive/pull-request
| 2024-05-04T15:09:30 |
2025-04-01T06:39:42.544264
|
{
"authors": [
"dignifiedquire",
"divagant-martian",
"matheus23"
],
"repo": "n0-computer/iroh",
"url": "https://github.com/n0-computer/iroh/pull/2264",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
340307697
|
Users list initial feature
So this is my initial feature providing mutually exclusive options for usernames (you can use the command line or a text file).
Suggested:
migration to python3
multiprocessing instead of threading with pool of threads
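Mutually exclusive username sources of the kind described above are typically expressed with argparse's add_mutually_exclusive_group — a minimal sketch, where the option names --username/--userlist are illustrative and not necessarily WPForce's actual flags:

```python
import argparse

# Minimal sketch: exactly one of the two username sources must be given
# (option names are hypothetical, chosen for illustration).
parser = argparse.ArgumentParser(description="username source demo")
group = parser.add_mutually_exclusive_group(required=True)
group.add_argument("--username", help="a single username given on the command line")
group.add_argument("--userlist", help="path to a text file, one username per line")

args = parser.parse_args(["--username", "admin"])
print(args.username)
```

Passing both options at once makes argparse exit with a usage error, which is exactly the exclusivity the PR describes.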
Thanks!
|
gharchive/pull-request
| 2018-07-11T16:02:43 |
2025-04-01T06:39:42.545978
|
{
"authors": [
"n00py",
"s3gm3nt4ti0nf4ult"
],
"repo": "n00py/WPForce",
"url": "https://github.com/n00py/WPForce/pull/13",
"license": "bsd-2-clause",
"license_type": "permissive",
"license_source": "bigquery"
}
|
275246097
|
Redesign protobuf structure
What / Changes
Redefined the Protobuf schema
Why / Reason for the change
To make requests idempotent
How (Optional) / Overview
[ ] Make the VM state idempotent
|
gharchive/pull-request
| 2017-11-20T06:04:50 |
2025-04-01T06:39:42.547320
|
{
"authors": [
"h-otter",
"kyontan"
],
"repo": "n0stack/n0core",
"url": "https://github.com/n0stack/n0core/pull/45",
"license": "bsd-2-clause",
"license_type": "permissive",
"license_source": "bigquery"
}
|
352402159
|
greedy solver performs poorly
The problem
The interplay between entitymatch.calculate_filter_similarity and entitymatch.greedy_solver is not very good.
The output of calculate_filter_similarity is a list, sorted by similarity score in descending order on a per row basis.
However, the greedy solver does only one pass over the similarity matrix, assigning mappings on a first-come-first-serve basis. Thus, it can happen that it maps A to B although there might be another similarity of B to C with a higher similarity score.
This mismatch leads to poor matching results.
A more sensible approach is to sort the similarity matrix purely by similarity score, irrespective of the row or column.
Comparison of matching results with properly sorted similarity matrix on a febrl generated dataset
example code:
sparse_matrix = calculate_filter_similarity(filters_a, filters_b, k, thresh)
sparse_matrix = sorted(sparse_matrix, key=lambda tup: tup[1], reverse=True)
mapping = greedy_solver(sparse_matrix)
Results without the second line (as in calculate_mapping_greedy):
precision: 0.457
recall: 0.957
vs results with sorted similarity matrix:
precision: 0.692
recall: 0.99986
Proposal
It makes sense to be able to parameterize the calculate_filter_similarity function to control the structure of the similarity matrix.
I don't know if we need the current structure of the similarity matrix anywhere, but anyway, as we only have one sensible solver, the greedy solver, it would make sense to change the default to a more greedy solver pleasing kind of type.
Or, we sort the matrix again in the solver. Which means we essentially sort the matrix twice.
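The proposed behaviour — sort candidate pairs globally by score, then assign greedily so a strong B–C pair is never pre-empted by a weaker A–B pair — can be sketched as below. This is a simplified stand-in following the (row, score, col) tuple shape implied by the tup[1] sort key above, not anonlink's actual implementation:

```python
def greedy_solve(candidates):
    """Greedily match rows to columns from globally score-sorted pairs.

    candidates: iterable of (row, score, col) tuples, in any order.
    Each row and each column is assigned at most once.
    """
    mapping = {}
    used_cols = set()
    # Global sort by similarity, descending, irrespective of row or column.
    for row, score, col in sorted(candidates, key=lambda t: t[1], reverse=True):
        if row not in mapping and col not in used_cols:
            mapping[row] = col
            used_cols.add(col)
    return mapping

print(greedy_solve([("A", 0.9, "B"), ("C", 0.95, "B"), ("A", 0.5, "D")]))
```

With the input above, C takes B first (0.95 beats 0.9), so A falls through to D — the behaviour a single row-ordered pass would miss.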
Related to #135. Fixed in the new API in #136.
|
gharchive/issue
| 2018-08-21T06:29:21 |
2025-04-01T06:39:42.551617
|
{
"authors": [
"nbgl",
"wilko77"
],
"repo": "n1analytics/anonlink",
"url": "https://github.com/n1analytics/anonlink/issues/153",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
52464483
|
back button
Hi!
I know it has been addressed before, but I can't seem to get my page to go back and I'm not that good with casperjs yet...
My script is like this:
casper.wait(15000, function() {
if(this.exists(x("(//tr[@class='pager'])[1]//a[preceding-sibling::span[not(span)]][1]"))) {
loop();
} else {
this.capture("billede1.png");
this.then(function() {
this.back();
console.log("attempting to go back");
});
casper.wait(15000, function() {
this.capture("billede2.png");
});
}
});
and it works very well apart from the fact that it doesn't go back :( the two screenshots are identical..
does anyone have an idea what to do?
thanks in advance!
I appreciate your help :)
CasperJS is built on top of PhantomJS, so I believe you are able to utilize this: http://phantomjs.org/api/webpage/method/go-back.html
As part of a cleanup effort:
Looks to be a stale help request. Assuming you've moved on to better and cooler things. Please feel free to re-open if this is still an active issue for you and we'll try our best to help out.
|
gharchive/issue
| 2014-12-19T09:45:07 |
2025-04-01T06:39:42.554579
|
{
"authors": [
"BIGjuevos",
"MathiasLund",
"noma4i"
],
"repo": "n1k0/casperjs",
"url": "https://github.com/n1k0/casperjs/issues/1111",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
145040476
|
move to archived versions of phantomjs since bitbucket throttles
Let's try pulling in phantomjs from somewhere else, and see what happens. This requires some minor tweaks to the travis yml to accept gz files as well.
@istr @mickaelandrieu Builds are now pulling from a github release. Also updated travis config to support both bz2 and gz compress tar files for future use.
Sounds good to me :) thank you !
@istr any objections to merging this? I think this should go in before anything else so we can stabilize the builds.
Ok, thank you. Merging this now (sorry, but I always try to work from past to current).
|
gharchive/pull-request
| 2016-03-31T23:00:35 |
2025-04-01T06:39:42.556481
|
{
"authors": [
"BIGjuevos",
"istr",
"mickaelandrieu"
],
"repo": "n1k0/casperjs",
"url": "https://github.com/n1k0/casperjs/pull/1506",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
2034675346
|
Flight Import Not Calculating Distances
Describe the bug
When importing a CSV with our routes, the system will process all routes without issue, but fills in the distance with 0nm. I have been told that v7 should autocalculate if the csv field for distance is blank; this is not currently happening. I have included a line of our csv file as it stands.
"EJV,814,,814,1,KHHR,KMCE,,,,,24000,,61,C,,,,,,,,1,B350,,,,"
Which version are you using? The bug report template kindly asks you to provide this info :)
If you are on beta5 update to latest dev and try again please.
Should be closed, as the OP is not providing info and we cannot re-create the issue in the latest dev.
|
gharchive/issue
| 2023-12-11T01:50:46 |
2025-04-01T06:39:42.604259
|
{
"authors": [
"FatihKoz",
"rbadger12"
],
"repo": "nabeelio/phpvms",
"url": "https://github.com/nabeelio/phpvms/issues/1715",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
417775289
|
Toastr not loading css from cdn
After updating to the latest version, toastr is only loading the js file from the CDN, and not the css file.
Can you provide more details? I can't reproduce.
I had an issue where the styling disappeared, so I took a look at the network traffic and saw that no css file was loaded from the cdn.
But I resolved my issue by creating my own style sheet.
|
gharchive/issue
| 2019-03-06T12:17:55 |
2025-04-01T06:39:42.605788
|
{
"authors": [
"StrongGesmbH",
"nabinked"
],
"repo": "nabinked/NToastNotify",
"url": "https://github.com/nabinked/NToastNotify/issues/58",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1707727581
|
The .NET Core SDK cannot retrieve configuration values on macOS 13.3.1 with an M1 chip
The .NET Core SDK cannot retrieve configuration values on macOS 13.3.1 with an M1 chip, but the same code retrieves values fine on Windows.
ref: #211
The .NET Core SDK does not support the M1 chip because the SDK's gRPC layer uses Grpc.Core, and Grpc.Core does not support macOS arm64.
Grpc.Core does not support macOS arm64 because it depends on native dynamic libraries, and the Grpc.Core package ships no dynamic library for macOS arm64. (Windows supports x86 and x64, Linux supports x64 and arm64, macOS supports only x64.)
Possible solutions:
Contact the Grpc.Core developers and ask them to publish a new version supporting macOS arm64; however, Grpc.Core is no longer updated, and its maintainers recommend grpc-dotnet.
Replace Grpc.Core with grpc-dotnet. I tested the grpc-dotnet examples locally; the published output does not depend on native dynamic libraries, so grpc-dotnet does not have this problem.
I see #258 also discussed switching to grpc-dotnet. Since Grpc.Core is no longer updated, switching to grpc-dotnet seems like the more suitable option.
That Grpc.Core does not support macOS arm64 can be seen directly from the application's publish output.
Using the samples/App3 example in this repository, run dotnet publish in the samples/App3 directory; the publish directory contains native runtime libraries for 5 targets, and osx-arm64 is not among them.
The files in the publish folder are as follows:
$ tree .
.
├── App3
├── App3.deps.json
├── App3.dll
├── App3.pdb
├── App3.runtimeconfig.json
├── App3.xml
├── Google.Protobuf.dll
├── Grpc.Core.Api.dll
├── Grpc.Core.dll
├── Microsoft.Extensions.DependencyModel.dll
├── Nacos.AspNetCore.dll
├── Nacos.AspNetCore.pdb
├── Nacos.AspNetCore.xml
├── Nacos.dll
├── Nacos.pdb
├── Nacos.xml
├── Newtonsoft.Json.dll
├── appsettings.Development.json
├── appsettings.json
├── runtimes
│ ├── linux-arm64
│ │ └── native
│ │ └── libgrpc_csharp_ext.arm64.so
│ ├── linux-x64
│ │ └── native
│ │ └── libgrpc_csharp_ext.x64.so
│ ├── osx-x64
│ │ └── native
│ │ └── libgrpc_csharp_ext.x64.dylib
│ ├── win-x64
│ │ └── native
│ │ └── grpc_csharp_ext.x64.dll
│ └── win-x86
│ └── native
│ └── grpc_csharp_ext.x86.dll
└── web.config
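The migration discussed above would roughly amount to a package swap in the project file. A hedged sketch (Grpc.Net.Client is the real grpc-dotnet client package; the version numbers here are placeholders, not the ones the SDK actually adopted):

```xml
<!-- Illustrative only: replace the native-dependent Grpc.Core package
     with the fully managed grpc-dotnet client, removing the need for
     per-platform libgrpc_csharp_ext binaries. -->
<ItemGroup>
  <!-- <PackageReference Include="Grpc.Core" Version="2.46.6" /> -->
  <PackageReference Include="Grpc.Net.Client" Version="2.52.0" />
  <PackageReference Include="Google.Protobuf" Version="3.22.0" />
</ItemGroup>
```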
For the mac M1–M3 issue, please use the latest version, 1.3.9.
|
gharchive/issue
| 2023-05-12T14:22:58 |
2025-04-01T06:39:42.618018
|
{
"authors": [
"ZhaoBaoLin159753",
"catcherwong",
"heqingpan"
],
"repo": "nacos-group/nacos-sdk-csharp",
"url": "https://github.com/nacos-group/nacos-sdk-csharp/issues/251",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1029822012
|
Relax the literal requirement of the create_plot macro so that it can be used with stringify
This matches what concat! uses https://doc.rust-lang.org/stable/core/macro.concat.html
Also, concat! will produce a friendly customized error if a literal is not used:
Compared to:
or with multiple macro layers:
I'm sorry, but I won't be able to take a careful look into this PR for a couple more weeks (currently undergoing a job change)
Actually, never mind. I'll merge this now and release a version with this in place and we can release a breaking change later down the line if this turns out to have been a mistake.
@nagisa I'm happy to wait
tracy-client v0.12.6 is up.
|
gharchive/pull-request
| 2021-10-19T03:37:25 |
2025-04-01T06:39:42.638652
|
{
"authors": [
"Imberflur",
"nagisa"
],
"repo": "nagisa/rust_tracy_client",
"url": "https://github.com/nagisa/rust_tracy_client/pull/23",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1278286615
|
🛑 eeclassNCU is down
In 977a7b4, eeclassNCU (https://ncueeclass.ncu.edu.tw/dashboard) was down:
HTTP code: 0
Response time: 0 ms
Resolved: eeclassNCU is back up in ce3b1be.
|
gharchive/issue
| 2022-06-21T11:10:48 |
2025-04-01T06:39:42.642255
|
{
"authors": [
"naian0809"
],
"repo": "naian0809/upptime",
"url": "https://github.com/naian0809/upptime/issues/9",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
931452603
|
Transparent background for option label
Hi there! Thanks for the beautiful view!
I'd like to make the label background transparent (to show only text), I tried to set "label_backgroundColor" to "android: transparent" (#0000), but it didn't work.
I also tried to disable the label, but when I set the "label_text" empty, there is a still black rectangle.
So, summing up, my ask is to add the ability to use transparent background for labels.
Hey @DenShlk,
Thanks for bringing this issue to our attention. Please try the latest version v1.1.1 now available on Maven Central (or clone the latest commits from the master branch on this repo). Please let me know if the latest update fixes your problem.
Hi! It fixed my issue, thanks for the quick update!
|
gharchive/issue
| 2021-06-28T11:08:26 |
2025-04-01T06:39:42.657248
|
{
"authors": [
"DenShlk",
"kabumere"
],
"repo": "nambicompany/expandable-fab",
"url": "https://github.com/nambicompany/expandable-fab/issues/21",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2546445100
|
[FEATURE] Documentation extraction from package
Create a script that extracts package readme from https://github.com/nandlabs/golly. This needs to happen for every release.
We also need to maintain the golly docs link for old releases.
@neo7337 any suggestions ?
Each package would contain a README.md file, we can traverse and parse the documentation based on that and we will also know which documentation belongs to which package. Won't this work?🤔 @nandagopalan
Well, for some packages you may want to have the README.md for technical information but don't want it documented, right?
I guess for that purpose we can create a marker file.
|
gharchive/issue
| 2024-09-24T21:42:28 |
2025-04-01T06:39:42.683783
|
{
"authors": [
"nandagopalan",
"neo7337"
],
"repo": "nandlabs/golly-docs",
"url": "https://github.com/nandlabs/golly-docs/issues/13",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
897534832
|
(docs): Replace usage of <View> with <MotiView>
Since MotiView is now importable directly, replaced all instances of <View> with <MotiView>
Awesome thanks!
|
gharchive/pull-request
| 2021-05-20T23:04:32 |
2025-04-01T06:39:42.685323
|
{
"authors": [
"archcorsair",
"nandorojo"
],
"repo": "nandorojo/moti",
"url": "https://github.com/nandorojo/moti/pull/71",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1556250914
|
🛑 valerio-personeni.com is down
In ead05be, valerio-personeni.com (https://valerio-personeni.com) was down:
HTTP code: 0
Response time: 0 ms
Resolved: valerio-personeni.com is back up in 64dabff.
|
gharchive/issue
| 2023-01-25T08:35:43 |
2025-04-01T06:39:42.688449
|
{
"authors": [
"Ishydo"
],
"repo": "nanhosting/monitoring",
"url": "https://github.com/nanhosting/monitoring/issues/784",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1465264590
|
Write dart sass importer in idiomatic ruby
This PR rewrites the dart sass custom importer in idiomatic ruby code.
Thanks! This refactoring totally makes sense.
|
gharchive/pull-request
| 2022-11-26T21:59:05 |
2025-04-01T06:39:42.691318
|
{
"authors": [
"denisdefreyne",
"ntkme"
],
"repo": "nanoc/nanoc",
"url": "https://github.com/nanoc/nanoc/pull/1629",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
372229646
|
Fix MSVC linker error using rai_bootstrap_weights
see #1316
No problem! Glad to help, I'm mainly a Windows dev myself so it was definitely self-beneficial haha
|
gharchive/pull-request
| 2018-10-20T16:44:33 |
2025-04-01T06:39:42.696499
|
{
"authors": [
"CathalT"
],
"repo": "nanocurrency/raiblocks",
"url": "https://github.com/nanocurrency/raiblocks/pull/1317",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
2313429270
|
Update MultiQC version
A recent MultiQC update broke the pipeline. I've switched back to a past version and it's running fine, but it would be nice to stay current.
Still need to update the pipeline to successfully use the new version.
Given that we're not using any of the fancier features of MultiQC and are essentially using it as a FastQC data aggregator, and might be switching to Falco anyway, I'm no longer sure it makes sense to use MultiQC here at all. I expect we'll revisit this after we've made the changes discussed in #74 and #78.
|
gharchive/issue
| 2024-05-23T16:57:17 |
2025-04-01T06:39:42.735622
|
{
"authors": [
"willbradshaw"
],
"repo": "naobservatory/mgs-workflow",
"url": "https://github.com/naobservatory/mgs-workflow/issues/18",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
976563711
|
A misprint in open method
You have a misprint in the open method. At line 126 you wrote "huawei_telent" instead of "huawei_telnet". It causes an error when you try to use a telnet connection instead of ssh.
Thanks, this is a typing error. I will fix it.
|
gharchive/issue
| 2021-08-23T02:34:58 |
2025-04-01T06:39:42.742884
|
{
"authors": [
"kovalev94",
"tkspuk"
],
"repo": "napalm-automation-community/napalm-huawei-vrp",
"url": "https://github.com/napalm-automation-community/napalm-huawei-vrp/issues/12",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
211639257
|
Added commit confirm functionality
Related to napalm-automation/napalm-base#213
Hey @mirceaulinic,
Yes, I have tested it. commit_config() without additional argument confirmed=0 doesn't change current behaviour.
BTW I will create PR in a few minutes with brief description of this method in documentation.
Looking at this, I'd like to test the following scenario:
connect using the config_lock optional arg set as False
commit confirmed x minutes
is the config DB still locked?
confirm the commit
is the config DB unlocked?
@mirceaulinic I have tested proposed scenario and config DB was unlocked during commit confirmed period. I have added small fix and now it works as it should, so config DB is locked when device waits for confirmation.
Thanks @kderynski
|
gharchive/pull-request
| 2017-03-03T09:39:57 |
2025-04-01T06:39:42.746128
|
{
"authors": [
"kderynski",
"mirceaulinic"
],
"repo": "napalm-automation/napalm-junos",
"url": "https://github.com/napalm-automation/napalm-junos/pull/127",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
1105781272
|
switch pd.append to pd.concat
Description
This is a proposed fix for #3962. Pandas is deprecating DataFrame.append in favor of pd.concat. Currently, this causes a FutureWarning to be emitted when items are appended to the _FeatureTable.
Type of change
[x] Bug-fix (non-breaking change which fixes an issue)
References
Closes #3962
How has this been tested?
[x] the test suite for my feature covers cases x, y, and z
Final checklist:
[ ] My PR is the minimum possible work for the desired functionality
[ ] I have commented my code, particularly in hard-to-understand areas
[ ] I have made corresponding changes to the documentation
[ ] I have added tests that prove my fix is effective or that my feature works
[ ] If I included new strings, I have used trans. to make them localizable.
For more information see our translations guide.
merging now to get this in for the release!
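For reference, the replacement pattern looks like this (a minimal standalone sketch with made-up data, not napari's actual _FeatureTable code):

```python
import pandas as pd

df = pd.DataFrame({"a": [1, 2]})
row = pd.DataFrame({"a": [3]})

# Deprecated (emits FutureWarning): df = df.append(row, ignore_index=True)
# Replacement:
df = pd.concat([df, row], ignore_index=True)
```

`ignore_index=True` keeps the behavior of `append`, renumbering the combined index from 0.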
|
gharchive/pull-request
| 2022-01-17T12:13:55 |
2025-04-01T06:39:42.753644
|
{
"authors": [
"alisterburt",
"kevinyamauchi"
],
"repo": "napari/napari",
"url": "https://github.com/napari/napari/pull/3963",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
2375874161
|
UdpListener: Use 'poll_send_to' instead of 'poll_send' to allow multiple connections
Currently, listen servers are only able to accept connections from one peer address. This is because currently when the UdpListener accepts a connection, it calls UdpSocket::connect which restricts that socket sending/recv'ing to that peer address indefinitely.
This isn't necessary as tokio's UdpSocket offers poll_send_to, meaning the listener socket doesn't need to become exclusive and can use the stored peer_addr to accept and manage multiple connections.
This was tested using the "echo-udp" example with multiple instances of netcat.
As you mentioned, connect is not necessary here.
Thanks for the PR.
I noticed that it caused an issue for the client; let me investigate more.
Just fixed on #11
Interesting that I haven't encountered this for some reason, I've been testing a lot since the PR. Glad you caught it however!
|
gharchive/pull-request
| 2024-06-26T17:09:10 |
2025-04-01T06:39:42.790371
|
{
"authors": [
"SajjadPourali",
"SirMangler"
],
"repo": "narrowlink/udp-stream",
"url": "https://github.com/narrowlink/udp-stream/pull/9",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1986067655
|
DDraw7.SetDisplayMode
DDraw7.SetDisplayMode
Action not supported.
OK
I have no idea what this is, but as the readme says, no support is provided for D3D9On12 here.
|
gharchive/issue
| 2023-11-09T17:11:34 |
2025-04-01T06:39:42.792451
|
{
"authors": [
"Patraskon",
"narzoul"
],
"repo": "narzoul/ForceD3D9On12",
"url": "https://github.com/narzoul/ForceD3D9On12/issues/3",
"license": "0BSD",
"license_type": "permissive",
"license_source": "github-api"
}
|
476578837
|
Add Kitspace compatible electronics BOMs?
Hey, I noticed in the readme it says:
The Bill of Materials folder contains (currently just one) Bill of materials file for a specific vendor. We are searching for better ways to help with the ordering process
This is exactly what my open source project kitspace.org is for! It lets you buy across the 5 currently supported distributors with a single click.
To mirror your PCB designs there we need to add a few files: a manifest and the electronics BOMs (can be csv, xlsx or ods) separated into PCB files. When you update the repo on GitHub in the future it will automatically sync your changes. We do have a BOM export script for KiCad as well.
I would be glad to work with you on this and put this project up. Are you interested?
@kasbah we're very interested in anything that helps folks get their parts easier, faster, or less expensively!!!
Would you be willing to take our BOM and build out a project on kitspace to show us what that would look like? It'd be great to get an updated BOM with easier carts / ordering for folks.
Do note that we build our master parts list / BOM dynamically from our build instruction files. There may need to be a little thought going forwards about maintaining accuracy and consistency, but even as a first pass having something like a kitspace project (even for a static BOM as it exists today) would definitely add value over what we already have available.
I'm going to close out this issue here since it's not an issue against our repository per se, but I'd love to see what you can come up with in kitspace for this project. We can continue to chat on this issue or on the OSR forum (https://www.tapatalk.com/groups/jpl_opensource_rover/).
(And FYI if we get a kitspace project / BOM to an acceptable place, we will definitely link to it from our main repository!)
Awesome! @kevinb456, would you be up for generating the BOMs and sending the appropriate pull-requests or should I do it?
I think as a first step we'll add two .csv to the Bill of Materials Files folder one for Arduino_uno_sheild (sic) and one for Control Board. We should be able to use our KiCad scripts and then reconcile that with your Digikey BOM. One issue is that you haven't been using the actual PCB schematic references in any BOMs? I am sure we can figure it out though.
Maybe we can set up a script for you to re-generate these in the future if any of the electronics design change and also incorporate that into your build.
Interested by your approach in general. In my other job we develop an open source microscope (openflexure.org) and we are developing a tool called git-building to generate BOMs from markdown descriptions. It seems like a very similar approach except you are using .tex. I like that your "master parts list" is a spreadsheet (we have been using yaml for a similar purpose but I like this better).
|
gharchive/issue
| 2019-08-04T18:06:24 |
2025-04-01T06:39:42.801591
|
{
"authors": [
"kasbah",
"vssystemluba"
],
"repo": "nasa-jpl/open-source-rover",
"url": "https://github.com/nasa-jpl/open-source-rover/issues/144",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
515743947
|
Integration Candidate 20191030
Describe the contribution
Fixes #361, fixes #373, fixes #374, fixes #381
Testing performed
Steps taken to test the contribution:
Checked out bundle with OSAL and cFE ic-20191030 branches
make ENABLE_UNIT_TESTS=TRUE SIMULATION=native prep
make
make install
make test
Built without warnings, all tests passed except osal_timer_UT (nominal result on linux)
executed cfe, successful startup with no warnings
Expected behavior changes
Resolved potential lockup bug
Resolved anomalous messages produced during app delete
System(s) tested on:
cFS dev server
OS: Ubuntu 16.04
Versions: bundle with OSAL and cFE ic-20191030 branches
Additional context
None
Contributor Info
Jacob Hageman - NASA/GSFC
CCB 20191106 - approved for merge to master
|
gharchive/pull-request
| 2019-10-31T21:02:37 |
2025-04-01T06:39:42.810980
|
{
"authors": [
"skliper"
],
"repo": "nasa/cFE",
"url": "https://github.com/nasa/cFE/pull/388",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
414415373
|
Pretrained weights trained on landmark dataset
I'm sharing the pretrained weights trained on the landmark dataset.
Please download it from the following url:
https://drive.google.com/open?id=1dbdaDyVeIb53iGh4Uk5kA4in9-uURoLM
Hi, thank you for sharing the trained model. Is this keypoint model trained on the full version and the finetuned model trained on the clean version?
Hello, when I use your 'pretrained_model/ldmk/pca/pca.h5' to extract dimension-reduced DeLF, it shows error:
...
loaded weights from module "pool" ...
load model from "pretrained_model/ldmk/model/keypoint/ckpt/fix.pth.tar"
load PCA parameters...
Traceback (most recent call last):
File "extract/extractor.py", line 432, in <module>
extractor = FeatureExtractor(extractor_config)
File "extract/extractor.py", line 112, in __init__
self.pca_mean = h5file['.']['pca_mean'].value
AttributeError: 'Dataset' object has no attribute 'value'
How can I fix it? Can you give me some advice?
@nashory I hit the same AttributeError with 'pretrained_model/ldmk/pca/pca.h5' (same traceback as above). How can I fix it?
I use np.array(x) to replace x.value, and now it works.
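The fix works because h5py 3.0 removed the deprecated `Dataset.value` attribute; indexing the dataset (or wrapping it in `np.array`) reads the data on every version. A minimal sketch (the file and key name here mirror the traceback but are created as a stand-in):

```python
import os
import tempfile

import h5py
import numpy as np

# Hypothetical stand-in for the real pca.h5 (key name taken from the traceback).
path = os.path.join(tempfile.mkdtemp(), "pca_demo.h5")
with h5py.File(path, "w") as f:
    f.create_dataset("pca_mean", data=np.zeros(4))

with h5py.File(path, "r") as f:
    # Old (h5py < 3.0): pca_mean = f["pca_mean"].value  -> AttributeError on h5py >= 3.0
    pca_mean = np.array(f["pca_mean"])  # f["pca_mean"][()] is equivalent
```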
|
gharchive/issue
| 2019-02-26T03:37:26 |
2025-04-01T06:39:42.885559
|
{
"authors": [
"IvyGongoogle",
"nashory",
"rita-qingyu"
],
"repo": "nashory/DeLF-pytorch",
"url": "https://github.com/nashory/DeLF-pytorch/issues/2",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1198688334
|
Audio doesn't play in Opera browser
I've always used this site in Opera, but after a while it didn't work anymore
I think this may be related to #19 ?
|
gharchive/issue
| 2022-04-09T15:45:33 |
2025-04-01T06:39:42.888229
|
{
"authors": [
"HevandroMP",
"nasso"
],
"repo": "nasso/urmusic",
"url": "https://github.com/nasso/urmusic/issues/20",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
931375405
|
Can't get the demo to work
Which platform(s) does your issue occur on?
Android 10, Real device
Please, tell us how to recreate the issue in as much detail as possible.
I downloaded the code from this repo and ran the demo (plain js) app. I got the following errors:
ERROR in /Users/vipulk/Documents/MaterialDemo/tsconfig.json
[tsl] ERROR
TS6053: File '../tsconfig' not found.
ERROR in ./app.ts
Module build failed (from ../node_modules/ts-loader/index.js):
Error: error while parsing tsconfig.json
at Object.loader (/Users/vipulk/Documents/MaterialDemo/node_modules/ts-loader/dist/index.js:19:18)
ERROR in ../tsconfig.json
TS6053: File '../tsconfig' not found.
and then after Gradle build
Error executing Static Binding Generator: Couldn't find '/Users/vipulk/Documents/MaterialDemo/platforms/android/build-tools/sbg-bindings.txt' bindings input file. Most probably there's an error in the JS Parser execution. You can run JS Parser with verbose logging by executing "node '/Users/vipulk/Documents/MaterialDemo/platforms/android/build-tools/jsparser/js_parser.js' enableErrorLogging".
Upon executing the suggested command, nothing happens and retrying produces same error.
@Whip the demo code depends on the whole repo structure. Seeing your logs, you only extracted the demo from the repo. It won't work.
My bad. Okay I've included the entire repo. Navigated to demo folder
cd demo
ns run android
I get this:
npm WARN deprecated urix@0.1.0: Please see https://github.com/lydell/urix#deprecated
npm WARN deprecated har-validator@5.1.5: this library is no longer supported
npm WARN deprecated browserslist@1.7.7: Browserslist 2 could fail on reading Browserslist >3.0 config used in other tools.
npm WARN deprecated resolve-url@0.2.1: https://github.com/lydell/resolve-url#deprecated
npm WARN deprecated chokidar@2.1.8: Chokidar 2 will break on node v14+. Upgrade to chokidar 3 with 15x less dependencies.
npm WARN deprecated fsevents@1.2.13: fsevents 1 will break on node v14+ and could be using insecure binaries. Upgrade to fsevents 2.
npm WARN deprecated extract-text-webpack-plugin@3.0.2: Deprecated. Please use https://github.com/webpack-contrib/mini-css-extract-plugin
npm WARN deprecated uuid@3.4.0: Please upgrade to version 7 or higher. Older versions may use Math.random() in certain circumstances, which is known to be problematic. See https://v8.dev/blog/math-random for details.
npm WARN deprecated request@2.88.2: request has been deprecated, see https://github.com/request/request/issues/3142
npm WARN deprecated core-js@2.6.12: core-js@<3.3 is no longer maintained and not recommended for usage due to the number of issues. Because of the V8 engine whims, feature detection in old core-js versions could cause a slowdown up to 100x even if nothing is polyfilled. Please, upgrade your dependencies to the actual version of core-js.
npm ERR! code 1
npm ERR! path /Users/vipulk/Documents/Material-Components/packages/core
npm ERR! command failed
npm ERR! command sh -c node postinstall.js
npm ERR! internal/modules/cjs/loader.js:883
npm ERR! throw err;
npm ERR! ^
npm ERR!
npm ERR! Error: Cannot find module '/Users/vipulk/Documents/Material-Components/packages/core/postinstall.js'
npm ERR! at Function.Module._resolveFilename (internal/modules/cjs/loader.js:880:15)
npm ERR! at Function.Module._load (internal/modules/cjs/loader.js:725:27)
npm ERR! at Function.executeUserEntryPoint [as runMain] (internal/modules/run_main.js:72:12)
npm ERR! at internal/main/run_main_module.js:17:47 {
npm ERR! code: 'MODULE_NOT_FOUND',
npm ERR! requireStack: []
npm ERR! }
npm ERR! A complete log of this run can be found in:
npm ERR! /Users/vipulk/.npm/_logs/2021-06-28T10_07_56_038Z-debug.log
Unable to install dependencies. Make sure your package.json is valid and all dependencies are correct. Error is: Command npm failed with exit code 1
I've attached the log file mentioned as well.
2021-06-28T10_07_56_038Z-debug.log
Sorry if I'm still being a noob.
@Whip you need to build the project first. The demo is made to show you code. However, it is also meant for us to develop with, so it relies on a locally built project. See the last question of the FAQ: https://github.com/nativescript-community/ui-material-components#faq
It still didn't work. The error I'm getting is
Cannot find module '@nativescript-community/ui-material-core/scripts/before-prepare.js'
Require stack:
- /Users/vipulk/Documents/Material-Components/demo/hooks/before-prepare/nativescript-community-ui-material-core.js
- /usr/local/lib/node_modules/nativescript/lib/common/services/hooks-service.js
- /usr/local/lib/node_modules/nativescript/lib/common/yok.js
- /usr/local/lib/node_modules/nativescript/lib/bootstrap.js
- /usr/local/lib/node_modules/nativescript/lib/nativescript-cli.js
- /usr/local/lib/node_modules/nativescript/bin/tns
This happens on the last step npm run demo.android. I had setup a fresh copy of the repo for running the commands you linked.
@Whip ok, I know what's going on! Will fix it and push right away.
@Whip I pushed a fix and updated the readme. Instead of running npm run tsc you need to run npm run build.all
Worked. Thanks for all your support
|
gharchive/issue
| 2021-06-28T09:37:46 |
2025-04-01T06:39:43.058647
|
{
"authors": [
"Whip",
"farfromrefug"
],
"repo": "nativescript-community/ui-material-components",
"url": "https://github.com/nativescript-community/ui-material-components/issues/311",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2498020974
|
Add Figure 5 code
Fig. 5. Bright and dark spots of global range-size rarity-weighted Red List Index values for sharks, rays, and chimaeras in 2020 at four biogeographic scales
Definition of Done
[x] Rmarkdown added with code for generating RLI values for each biogeographic scales (hex, EEZ, LME, FAO)
[ ] Adding final figure outputs
New task:
[ ] Organize figure generation code (currently a mess), preferably as a package for replicability
|
gharchive/issue
| 2024-08-30T19:15:43 |
2025-04-01T06:39:43.061032
|
{
"authors": [
"JayMatsushiba"
],
"repo": "natpac/ecol_erosion_extinction_risk_sharks",
"url": "https://github.com/natpac/ecol_erosion_extinction_risk_sharks/issues/1",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
55955447
|
SitePrism::NoUrlMatcherForPage following documentation example
The documentation shows the following example:
class Account < SitePrism::Page
set_url "/accounts/{id}{?query*}"
end
It goes on to say that the following test would pass:
@account_page = Account.new
@account_page.load(id: 22, query: { token: "ca2786616a4285bc" })
expect(@account_page.current_url).to end_with "/accounts/22?token=ca2786616a4285bc"
expect(@account_page).to be_displayed
I have a very similar setup I'm trying to test, and I am getting the following error:
Failure/Error: expect(index_page).to be_displayed
SitePrism::NoUrlMatcherForPage:
SitePrism::NoUrlMatcherForPage
# /Users/gconzett/.gem/ruby/2.0.0/gems/site_prism-2.6/lib/site_prism/page.rb:14:in `displayed?'
The page object is as follows:
require_relative '../sections/widget_section.rb'
require_relative '../sections/widget_fields_section.rb'
class WidgetIndexPage < SitePrism::Page
set_url '/widgets'
section :fields, WidgetFieldsSection, 'form'
sections :widgets, WidgetSection, 'tbody tr'
element :create_widget_button, '#create-widget'
end
And the spec itself has this simple assertion which fails:
feature 'Creating a widget' do
scenario 'with valid input' do
index_page = WidgetIndexPage.new
index_page.load
expect(index_page).to be_displayed
end
end
Unfortunately this leads to the above-mentioned error. Do I have to define some kind of regex matcher? The code leads me to believe that it should just use the string for the URL. Any info would be appreciated. Thanks!
+1 I'm having the same issue
+1 I have this too, running 2.2.0.
The page I am visiting is...
http://my.website.url/product
and the page object has...
set_url "#{my.website.url}/product"
Ah, this is happening because the README.md as shown on github reflects a couple of pull requests that have been merged in; but I haven't pushed an updated version of the site_prism gem :/ I'll be doing that over the next few days :)
Ah, when is the new gem going to be released? I just updated to version 2.6 from 2.2 and it didn't fix the problem. Should we take the latest commit from here and use that?
Well there you go. 2.7 fixed this issue for me. Thx Nat! This is a great gem!
|
gharchive/issue
| 2015-01-29T20:55:41 |
2025-04-01T06:39:43.066258
|
{
"authors": [
"JohnSmall",
"adamalbrecht",
"conzett",
"natritmeyer",
"sherbhachu",
"vmpj"
],
"repo": "natritmeyer/site_prism",
"url": "https://github.com/natritmeyer/site_prism/issues/101",
"license": "bsd-3-clause",
"license_type": "permissive",
"license_source": "bigquery"
}
|
141503562
|
case of a node failure
Hi
So let's say in a cluster of 3 nodes, we push a message but a node fails before the message is delivered. Is there a chance that the message would be lost, or do the other 2 nodes already have it and will deliver it (once or twice)?
Or in other words, when a message is pushed to nats.io, is the message acknowledged when it's present in all 3 nodes, or just on the one node it's pushed to before being redistributed to the rest of the nodes?
There is no acknowledgment at all in NATS. This is a "fire-and-forget" mechanism, in that when a message is sent, there is no guarantee that it is going to be delivered. At the application level, one can use Request (with timeout) to ensure that the receiver gets the message, and resend if the reply is not received within the timeout.
At the server level, it's just a send. So if your application sends to server A, and A sends to B and C (assuming there is an interest on both), there is no guarantee that B and/or C receives it, and therefore no guarantee that subscribers attached to it receive it either. Again, the sender can get around that using request/reply if guarantee delivery is desired.
Note that we have a persistence layer in the works.
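The application-level pattern described above (request with a timeout, resend if no reply arrives) can be sketched generically, independent of any NATS client API; the names here are illustrative:

```python
def request_with_retry(send_request, retries=3):
    """Resend a request over a fire-and-forget transport until a reply
    arrives or the retry budget is exhausted. `send_request` is any
    callable that returns the reply, or None when its timeout expires."""
    for _ in range(retries):
        reply = send_request()
        if reply is not None:
            return reply
    raise TimeoutError(f"no reply after {retries} attempts")


# Simulate a transport that drops the first two requests:
attempts = {"count": 0}

def flaky_send():
    attempts["count"] += 1
    return b"ack" if attempts["count"] >= 3 else None

reply = request_with_retry(flaky_send)
```

Note that this gives at-least-once delivery: the receiver may see the same request more than once, so handlers should be idempotent.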
thanks
|
gharchive/issue
| 2016-03-17T07:36:24 |
2025-04-01T06:39:43.070136
|
{
"authors": [
"deviantmk",
"kozlovic"
],
"repo": "nats-io/gnatsd",
"url": "https://github.com/nats-io/gnatsd/issues/225",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
1251339422
|
Token authentication bug
The logic used to choose token or username does not work properly; the connection reports: nats: Authorization Violation. When the CLI attempts nats.Connect() it is calling this function under the hood to provide natsOpts: https://github.com/nats-io/jsm.go/blob/5a299917dacd0d9daafc9609afae4eed6107c816/natscontext/context.go#L294
The switch logic in this function will never set a token for the created natsConnection, since the CLI treats user/token the same. I was able to verify this is the case by building the cli from source and removing the c.User (https://github.com/nats-io/jsm.go/blob/5a299917dacd0d9daafc9609afae4eed6107c816/natscontext/context.go#L298-L299). With this removed, the token is properly set in the NATS connection and I can properly interface with my NATS server expecting a token.
Had a few bugs related to this, I think I will add a specific flag for tokens.
Looking again now I am not so sure this is the problem, we dont just always pass User from the CLI, it only sets the user if both a user and password is set, else it pass a token:
https://github.com/nats-io/natscli/blob/0f49ced81752e6eae21b3ee078ba4851114be158/cli/util.go#L749-L753
Given that, the context will do the right thing since it will only have the token option set.
I guess what you can't do, though, is have a context with a user+password set and then override that with a token. Is that what you were trying to do?
Closing this as working with the slight caveat, let me know if you have further issues
|
gharchive/issue
| 2022-05-27T22:46:46 |
2025-04-01T06:39:43.115657
|
{
"authors": [
"boreddude13",
"ripienaar"
],
"repo": "nats-io/natscli",
"url": "https://github.com/nats-io/natscli/issues/476",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1838689044
|
Add mpd-mpris.desktop
As also discussed in https://github.com/natsukagami/mpd-mpris/pull/37 .
Thanks!
|
gharchive/pull-request
| 2023-08-07T05:05:32 |
2025-04-01T06:39:43.122815
|
{
"authors": [
"doronbehar",
"natsukagami"
],
"repo": "natsukagami/mpd-mpris",
"url": "https://github.com/natsukagami/mpd-mpris/pull/42",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
294653484
|
Plot vector distribution
Plotting the major variables of CapsNet in histograms and distribution plots.
This will be used later for estimating the accuracy needed for fixed point number representation.
Sorry, I meant to PR into my own fork
|
gharchive/pull-request
| 2018-02-06T07:08:49 |
2025-04-01T06:39:43.128559
|
{
"authors": [
"minghz"
],
"repo": "naturomics/CapsNet-Tensorflow",
"url": "https://github.com/naturomics/CapsNet-Tensorflow/pull/56",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
264349847
|
Naudio WPF Demo 8 band Equalizer Frequency ranges
Hi Mark, et al,
I'm working with the Naudio WPF Demo 8 band Equalizer. It works nicely. But I can't figure out the audible frequency bands (0-22KHz range) for each of the band sliders.
The demo code looks like this...
bands = new EqualizerBand[]
{
new EqualizerBand {Bandwidth = 0.8f, Frequency = 100, Gain = 0},
new EqualizerBand {Bandwidth = 0.8f, Frequency = 200, Gain = 0},
new EqualizerBand {Bandwidth = 0.8f, Frequency = 400, Gain = 0},
new EqualizerBand {Bandwidth = 0.8f, Frequency = 800, Gain = 0},
new EqualizerBand {Bandwidth = 0.8f, Frequency = 1200, Gain = 0},
new EqualizerBand {Bandwidth = 0.8f, Frequency = 2400, Gain = 0},
new EqualizerBand {Bandwidth = 0.8f, Frequency = 4800, Gain = 0},
new EqualizerBand {Bandwidth = 0.8f, Frequency = 9600, Gain = 0},
};
So assuming standard audio spans from 0 Hz to 22050 Hz, how does the Frequency param in the EqualizerBand constructor map to the 0-22 kHz range?
The frequency parameter is in Hz, and you can give it any value (obviously staying below Nyquist frequency - half the sample rate). I kept the maximum band fairly low in this example to avoid issues with lower sample rates, but no reason why you can't go higher.
Mark,
Thanks for the quick reply.
Ok, so for audio with a 44.1K sample rate I could set Frequency as high as 22050?
Example
new EqualizerBand {Bandwidth = 0.8f, Frequency = 22050, Gain = 0},
What I might do is get the sample rate in the ISampleProvider constructor and calc the Frequency for each band accordingly.
I found this link that has a good explanation of how to choose the Equalizer bands frequencies.
https://www.presonus.com/learn/technical-articles/equalizer-terms-and-tips
With this info, I'm closing this issue. If anyone has anything to add, please do comment below.
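The idea of deriving band frequencies from the sample rate can be sketched like this (a hypothetical doubling/octave scheme capped below Nyquist; note the demo's hand-picked bands above are not pure octaves past 800 Hz):

```python
def octave_bands(sample_rate, count=8, start=100.0):
    """Doubling (octave-spaced) EQ band centre frequencies in Hz,
    dropping any band at or above the Nyquist frequency (sample_rate / 2)."""
    nyquist = sample_rate / 2.0
    return [start * 2 ** i for i in range(count) if start * 2 ** i < nyquist]


bands_44k = octave_bands(44100)  # all 8 bands fit below 22050 Hz
bands_8k = octave_bands(8000)    # only bands below 4000 Hz survive
```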
|
gharchive/issue
| 2017-10-10T19:35:18 |
2025-04-01T06:39:43.132245
|
{
"authors": [
"SmokinLeather",
"markheath"
],
"repo": "naudio/NAudio",
"url": "https://github.com/naudio/NAudio/issues/247",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1693098305
|
Change REST API to use HyperlinkedModelSerializer
Closes: #DNE
What's Changed
Change the base REST API serializer inheritance from ModelSerializer to HyperlinkedModelSerializer. This wasn't quite a drop-in replacement because, for whatever reason, DRF's HyperlinkedModelSerializer, HyperlinkedRelatedField, and HyperlinkedIdentityField classes are not aware of URL namespacing, and so they default to trying to automatically reverse URLs like <modelname>-detail instead of what we need, <applabel>-api:<modelname>-detail. I've overridden what I believe to be all of the places where this was being used incorrectly.
This lets us remove the explicit url field from most of our serializers as a bonus.
TODO
[x] Explanation of Change(s)
[ ] Added change log fragment(s) (for more information see the documentation)
[ ] Attached Screenshots, Payload Example
[ ] Unit, Integration Tests
[ ] Documentation Updates (when adding/changing features)
[ ] Example Plugin Updates (when adding/changing features)
[ ] Outline Remaining Work, Constraints from Design
Can you confirm/deny that the effect of this is that if as a plugin developer, I use BaseModelSerializer, it will automatically generate a field called "url" and it will correctly link.
Can you confirm/deny that the effect of this is that if as a plugin developer, I use BaseModelSerializer, it will automatically generate a field called "url" and it will correctly link.
That should be correct, yes.
To be clear that is not the primary intent of this PR, but it is a bonus. I'll add more details tomorrow.
|
gharchive/pull-request
| 2023-05-02T21:03:57 |
2025-04-01T06:39:43.145906
|
{
"authors": [
"glennmatthews",
"itdependsnetworks"
],
"repo": "nautobot/nautobot",
"url": "https://github.com/nautobot/nautobot/pull/3679",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2553413752
|
🛑 BarefootEdu is down
In 5f62b66, BarefootEdu (https://barefootedu.com) was down:
HTTP code: 0
Response time: 0 ms
Resolved: BarefootEdu is back up in ec5fb18 after 11 minutes.
|
gharchive/issue
| 2024-09-27T17:29:34 |
2025-04-01T06:39:43.148539
|
{
"authors": [
"arunjose1995"
],
"repo": "navadhiti/upptime",
"url": "https://github.com/navadhiti/upptime/issues/1157",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2495641808
|
🛑 BarefootEdu is down
In a9e9862, BarefootEdu (https://barefootedu.com) was down:
HTTP code: 0
Response time: 0 ms
Resolved: BarefootEdu is back up in 37a92ba after 18 minutes.
|
gharchive/issue
| 2024-08-29T21:50:39 |
2025-04-01T06:39:43.150932
|
{
"authors": [
"arunjose1995"
],
"repo": "navadhiti/upptime",
"url": "https://github.com/navadhiti/upptime/issues/309",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2495718210
|
🛑 BarefootEdu is down
In 95e7752, BarefootEdu (https://barefootedu.com) was down:
HTTP code: 0
Response time: 0 ms
Resolved: BarefootEdu is back up in a20dd83 after 5 minutes.
|
gharchive/issue
| 2024-08-29T22:51:20 |
2025-04-01T06:39:43.153443
|
{
"authors": [
"arunjose1995"
],
"repo": "navadhiti/upptime",
"url": "https://github.com/navadhiti/upptime/issues/311",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1228645662
|
🛑 Harmony Bot Website is down
In 0bf087a, Harmony Bot Website ($HARMONY_WEB) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Harmony Bot Website is back up in 393a34d.
|
gharchive/issue
| 2022-05-07T14:43:51 |
2025-04-01T06:39:43.155814
|
{
"authors": [
"navaneethkm004"
],
"repo": "navaneethkm004/uptime",
"url": "https://github.com/navaneethkm004/uptime/issues/10120",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1264236494
|
🛑 Harmony Bot Website is down
In f3aaced, Harmony Bot Website ($HARMONY_WEB) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Harmony Bot Website is back up in 435e42f.
|
gharchive/issue
| 2022-06-08T05:59:03 |
2025-04-01T06:39:43.157937
|
{
"authors": [
"navaneethkm004"
],
"repo": "navaneethkm004/uptime",
"url": "https://github.com/navaneethkm004/uptime/issues/10664",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
807283902
|
🛑 Harmony Bot Website is down
In fed632d, Harmony Bot Website ($HARMONY_WEB) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Harmony Bot Website is back up in 17b0b4c.
|
gharchive/issue
| 2021-02-12T14:33:30 |
2025-04-01T06:39:43.160124
|
{
"authors": [
"navaneethkm004"
],
"repo": "navaneethkm004/uptime",
"url": "https://github.com/navaneethkm004/uptime/issues/120",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
868768479
|
🛑 Harmony Bot Website is down
In e458001, Harmony Bot Website ($HARMONY_WEB) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Harmony Bot Website is back up in 4799f8a.
|
gharchive/issue
| 2021-04-27T11:51:45 |
2025-04-01T06:39:43.162312
|
{
"authors": [
"navaneethkm004"
],
"repo": "navaneethkm004/uptime",
"url": "https://github.com/navaneethkm004/uptime/issues/1867",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
878064873
|
🛑 Harmony Backup Bot is down
In b18b4d4, Harmony Backup Bot ($HARMONY_BACKUP) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Harmony Backup Bot is back up in 32cf140.
|
gharchive/issue
| 2021-05-06T21:32:49 |
2025-04-01T06:39:43.164510
|
{
"authors": [
"navaneethkm004"
],
"repo": "navaneethkm004/uptime",
"url": "https://github.com/navaneethkm004/uptime/issues/2223",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
890244836
|
🛑 Harmony Bot Website is down
In 9dae02f, Harmony Bot Website ($HARMONY_WEB) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Harmony Bot Website is back up in f2d65cd.
|
gharchive/issue
| 2021-05-12T15:58:40 |
2025-04-01T06:39:43.166839
|
{
"authors": [
"navaneethkm004"
],
"repo": "navaneethkm004/uptime",
"url": "https://github.com/navaneethkm004/uptime/issues/2406",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
812523431
|
🛑 Harmony Bot Website is down
In 3868280, Harmony Bot Website ($HARMONY_WEB) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Harmony Bot Website is back up in 618333f.
|
gharchive/issue
| 2021-02-20T05:45:23 |
2025-04-01T06:39:43.169046
|
{
"authors": [
"navaneethkm004"
],
"repo": "navaneethkm004/uptime",
"url": "https://github.com/navaneethkm004/uptime/issues/444",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1078011231
|
🛑 Harmony Bot Website is down
In 5695b0d, Harmony Bot Website ($HARMONY_WEB) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Harmony Bot Website is back up in 3f36cde.
|
gharchive/issue
| 2021-12-13T02:21:40 |
2025-04-01T06:39:43.171177
|
{
"authors": [
"navaneethkm004"
],
"repo": "navaneethkm004/uptime",
"url": "https://github.com/navaneethkm004/uptime/issues/7085",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
326476388
|
refactor(SpriteImage): Use image tag & transform property
Issue
#210
Details
Use image tag & transform property instead of background property
Coverage decreased (-0.2%) to 92.852% when pulling 3a302da0fe8153bfa4e8e3902a2a166fccb4af85 on jongmoon:SpriteImage#210 into 6adcc218507258844050137c9145bdfe5f336abe on naver:master.
LGTM
|
gharchive/pull-request
| 2018-05-25T10:56:36 |
2025-04-01T06:39:43.180604
|
{
"authors": [
"coveralls",
"jongmoon",
"younkue"
],
"repo": "naver/egjs-view360",
"url": "https://github.com/naver/egjs-view360/pull/211",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
310460035
|
Pinpoint- configuration support
Description
----> I've downloaded pinpoint-collector-1.7.2-SNAPSHOT, pinpoint-web-1.7.2-SNAPSHOT, pinpoint-agent-1.7.2-SNAPSHOT.tar, hbase-1.2.6-bin.tar
----> Without any modification I started HBase, then started the collector and web in Tomcat version 9.0.6
----> I extracted the agent and placed it in the same folder where all the files are located.
----> I wrote one small Java program that waits for half an hour
----> Executed it with the below command [java test -javaagent:/root/Documents/pinpointagent/pinpointagent.jar -DpinPoint.applicationaName=testapp -DpinPoint.agentId=count]
Note: the program executed successfully.
I've also run the HBase table creation and I can see the 15 tables in the HBase UI
Environment
ROOT user
Linux Lin 4.13.0-kali1-amd64
Additional Info
Now I'm stuck and don't know how to move further; could anyone help with this? Thanks in advance
Hello, @gsmba6
correct me if I'm wrong.
You have completed setting up HBASE.
successfully executed PINPOINT-AGENT, PINPOINT-COLLECTOR, PINPOINT-WEB
small JAVA program running without any problems.
I believe there isn't anything else to take care of besides monitoring your application (in this case the small Java program) through PINPOINT-WEB (for the installation of WEB, take a look here)
Depending on the configuration you set for PINPOINT-WEB in Tomcat, you can access PINPOINT-WEB, and the first screen you see will be something like this.
@RoySRose thanks for the quick turnaround.
My issue is that the Pinpoint web UI launches, but I can't see the application name or ID in it. It's blank, and my application dropdown is disabled.
@gsmba6
I don't think I can be a much help without any logs or further information.
Since you have downloaded all
pinpoint-collector-1.7.2-SNAPSHOT
pinpoint-web-1.7.2-SNAPSHOT, pinpoint-agent-1.7.2-SNAPSHOT.tar, hbase-1.2.6-bin.tar
Only thing that I can think of https://github.com/naver/pinpoint/issues/2273
@RoySRose - Issue resolved.
I had appended the Tomcat Catalina option in the .sh file,
but my Eclipse uses its own configuration; once I added the Java agent details to the VM arguments, the issue was resolved and my agent appeared in the web UI.
I can monitor the transactions.
|
gharchive/issue
| 2018-04-02T11:15:34 |
2025-04-01T06:39:43.188912
|
{
"authors": [
"RoySRose",
"gsmba6"
],
"repo": "naver/pinpoint",
"url": "https://github.com/naver/pinpoint/issues/3965",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
484360147
|
Unable to collect spring beans information
Prerequisites
Please check the FAQ, and search existing issues for similar questions before creating a new issue. YOU MAY DELETE THIS PREREQUISITES SECTION.
[x] I have checked the FAQ, and issues and found no answer.
Environment:
pinpoint:1.8.4
spring-boot:2.1.7
JDK:8
pinpoint.conf
...
###########################################################
# application type #
###########################################################
profiler.applicationservertype=SPRING_BOOT
...
###########################################################
# TOMCAT #
###########################################################
profiler.tomcat.enable=true
# Classes for detecting application server type. Comma separated list of fully qualified class names. Wildcard not supported.
profiler.tomcat.bootstrap.main=org.apache.catalina.startup.Bootstrap
# Check pre-conditions when registering class file transformers mainly due to JBoss plugin transforming the same class.
# Setting this to true currently adds transformers only if the application was launched via org.apache.catalina.startup.Bootstrap,
# or SpringBoot's launchers.
# Set this to false to bypass this check entirely (such as when launching standalone applications running embedded Tomcat).
profiler.tomcat.conditional.transform=false
...
###########################################################
# SPRING BOOT #
###########################################################
profiler.springboot.enable=true
# Classes for detecting application server type. Comma separated list of fully qualified class names. Wildcard not supported.
profiler.springboot.bootstrap.main=org.springframework.boot.loader.JarLauncher, org.springframework.boot.loader.WarLauncher, org.springframework.boot.loader.PropertiesLauncher
...
###########################################################
# spring-beans
###########################################################
# Profile spring-beans
profiler.spring.beans=true
# filters
# filter
# filter OR filters
# filter
# value
# value AND filter
# value
# token
# token OR token
# token
# profiler.spring.beans.n.scope= [component-scan | post-processor] default is component-scan.
# profiler.spring.beans.n.base-packages= [package name, ...]
# profiler.spring.beans.n.name.pattern= [regex pattern, regex:regex pattern, antstyle:antstyle pattern, ...]
# profiler.spring.beans.n.class.pattern= [regex pattern, regex:regex pattern, antstyle:antstyle pattern, ...]
# profiler.spring.beans.n.annotation= [annotation name, ...]
#
# Scope:
# component-scan: <context:component-scan ... /> or @ComponentScan
# post-processor: BeanPostProcessor - Slow!!!
#
# ANT Style pattern rules:
# ? - matches on character
# * - matches zero or more characters
# ** - matches zero or more 'directories' in a path
# Examples
# profiler.spring.beans.1.scope=post-processor
# profiler.spring.beans.1.base-packages=com.foo, com.bar
# profiler.spring.beans.1.name.pattern=.*Foo, regex:.*Bar, antstyle:*Controller
# profiler.spring.beans.1.class.pattern=
# profiler.spring.beans.1.annotation=org.springframework.stereotype.Controller,org.springframework.stereotype.Service,org.springframework.stereotype.Repository
#
# profiler.spring.beans.2.scope=post-processor
# profiler.spring.beans.2.base-packages=com.foo
# profiler.spring.beans.2.name.pattern=
# profiler.spring.beans.2.class.pattern=antstyle:com.foo.repository.*Repository, antstyle:com.foo.Service.Main*
# profiler.spring.beans.2.annotation=
profiler.spring.beans.scope=post-processor
profiler.spring.beans.base-packages=com.nsl
profiler.spring.beans.name.pattern=
profiler.spring.beans.class.pattern=
profiler.spring.beans.annotation=org.springframework.stereotype.Controller,org.springframework.stereotype.Service,org.springframework.stereotype.Repository
profiler.spring.beans.mark.error=true
...
Logs
2019-08-23 14:25:47 [INFO ](com.navercorp.pinpoint.bootstrap.config.DefaultProfilerConfig) profiler.springboot.enable=true
2019-08-23 14:25:47 [INFO ](com.navercorp.pinpoint.bootstrap.config.DefaultProfilerConfig) profiler.spring.beans=true
2019-08-23 14:25:47 [INFO ](com.navercorp.pinpoint.bootstrap.config.DefaultProfilerConfig) profiler.spring.beans.name.pattern=
2019-08-23 14:25:47 [INFO ](com.navercorp.pinpoint.bootstrap.config.DefaultProfilerConfig) profiler.spring.beans.class.pattern=
2019-08-23 14:25:47 [INFO ](com.navercorp.pinpoint.bootstrap.config.DefaultProfilerConfig) profiler.spring.beans.annotation=org.springframework.stereotype.Controller,org.springframework.stereotype.Service,org.springframework.stereotype.Repository
2019-08-23 14:25:47 [INFO ](com.navercorp.pinpoint.bootstrap.config.DefaultProfilerConfig) profiler.spring.beans.[0-9]+(.scope|.base-packages|.name.pattern|.class.pattern|.annotation)={}
2019-08-23 14:25:47 [INFO ](com.navercorp.pinpoint.bootstrap.config.DefaultProfilerConfig) profiler.spring.beans.mark.error=true
I can find my application on the web ui and the requests can be caught, but I can't see the Spring beans information like Controller from the details.
Look forward to your reply. thanks.
@zhangjiankun1997
Please modify your settings as follows:
profiler.spring.beans.1.scope=post-processor
profiler.spring.beans.1.base-packages=com.nsl
profiler.spring.beans.1.name.pattern=
profiler.spring.beans.1.class.pattern=
profiler.spring.beans.1.annotation=org.springframework.stereotype.Controller,org.springframework.stereotype.Service,org.springframework.stereotype.Repository
Thanks! It worked!
|
gharchive/issue
| 2019-08-23T06:52:22 |
2025-04-01T06:39:43.195526
|
{
"authors": [
"jaehong-kim",
"zhangjiankun1997"
],
"repo": "naver/pinpoint",
"url": "https://github.com/naver/pinpoint/issues/5915",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
490170626
|
where can i get the code of test-pinpoint-demo?
Prerequisites
Please check the FAQ, and search existing issues for similar questions before creating a new issue. YOU MAY DELETE THIS PREREQUISITES SECTION.
I have checked the FAQ, and issues and found no answer.
where can i get the code of test-pinpoint-demo?
thank you.
Hello, @jicai
What do you mean by test-pinpoint-demo?
@RoySRose
http://125.209.240.10:10123/#/main
Are the APIGateway, Shopping-Api, Shoppping-Order projects open source?
@jicai
Oh, the demo source code isn't currently open source.
@Xylus
any plans?
@jicai Can you access https://github.com/Xylus/pinpoint-demo-apps?
It's set to public so you'll be able to clone and play around with it.
@Xylus
I got it, Thank you.
|
gharchive/issue
| 2019-09-06T07:17:29 |
2025-04-01T06:39:43.200801
|
{
"authors": [
"RoySRose",
"Xylus",
"jicai"
],
"repo": "naver/pinpoint",
"url": "https://github.com/naver/pinpoint/issues/5972",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
53334749
|
Gravatar HTTPS issue
In an HTTPS environment (Yobi behind an nginx reverse proxy, using the x-forwarded-proto header),
the Gravatar URLs load over http, so the browser shows a warning in the address bar.
I think it would be fine to always connect to Gravatar over https.
Ah, good suggestion. We'll discuss incorporating it. Thank you.
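The suggested fix (always linking Gravatar over HTTPS) can be sketched like this. This is a minimal illustration, not Yobi's actual code; the helper name is made up, and only the public Gravatar URL scheme (an MD5 hash of the trimmed, lowercased email) is assumed:

```python
import hashlib


def gravatar_url(email: str, size: int = 80) -> str:
    """Build an avatar URL that always uses the HTTPS endpoint,
    so pages served over TLS show no mixed-content warning."""
    digest = hashlib.md5(email.strip().lower().encode("utf-8")).hexdigest()
    return f"https://www.gravatar.com/avatar/{digest}?s={size}"
```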
|
gharchive/issue
| 2015-01-04T12:17:11 |
2025-04-01T06:39:43.202012
|
{
"authors": [
"ajkj",
"keesun"
],
"repo": "naver/yobi",
"url": "https://github.com/naver/yobi/issues/831",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
54528204
|
Setting Language in the project info
There's an option to add a Language to the project info...
I'm not sure what it's for, so I'm asking~
It's just a label so that other people can see which language is used in this project.
|
gharchive/issue
| 2015-01-16T01:12:07 |
2025-04-01T06:39:43.203329
|
{
"authors": [
"mangochicken",
"npcode"
],
"repo": "naver/yobi",
"url": "https://github.com/naver/yobi/issues/844",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
1401257726
|
Deploy of 20221007161110-4ae8e76
4ae8e76525a7db8ef18101068e87f61a1088322a
/promote dev-gcp
/promote prod-gcp
|
gharchive/issue
| 2022-10-07T14:18:08 |
2025-04-01T06:39:43.221805
|
{
"authors": [
"espenwaaga",
"grutkowska"
],
"repo": "navikt/foreldrepengesoknad",
"url": "https://github.com/navikt/foreldrepengesoknad/issues/3730",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1939839761
|
IS-1655: Delete btsys code++
We now call btsys directly from padm2 (https://github.com/navikt/padm2/pull/210), so we can remove all the code related to this. At the same time we can remove the AzureAdClient code and a good deal of other unused code.
If we can piggyback on what the sykmelding team has done with syfohelsenettproxy, we can call nhn directly from fastlegerest, and then there is nothing left of isproxy.
|
gharchive/pull-request
| 2023-10-12T12:03:58 |
2025-04-01T06:39:43.224902
|
{
"authors": [
"andersrognstad",
"geir-waagboe"
],
"repo": "navikt/isproxy",
"url": "https://github.com/navikt/isproxy/pull/123",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1815931995
|
🛑 ESTAT is down
In 0004e32, ESTAT (https://ec.europa.eu/eurostat/api/dissemination/sdmx/2.1/dataflow/ESTAT/all/latest) was down:
HTTP code: 0
Response time: 0 ms
Resolved: ESTAT is back up in 5b9f0b5.
|
gharchive/issue
| 2023-07-21T14:44:00 |
2025-04-01T06:39:43.260068
|
{
"authors": [
"charphi"
],
"repo": "nbbrd/sdmx-upptime",
"url": "https://github.com/nbbrd/sdmx-upptime/issues/11876",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1817687811
|
🛑 OECD is down
In 05135c2, OECD (https://stats.oecd.org/restsdmx/sdmx.ashx/GetDataStructure/ALL) was down:
HTTP code: 0
Response time: 0 ms
Resolved: OECD is back up in 4c74dd4.
|
gharchive/issue
| 2023-07-24T05:53:25 |
2025-04-01T06:39:43.262492
|
{
"authors": [
"charphi"
],
"repo": "nbbrd/sdmx-upptime",
"url": "https://github.com/nbbrd/sdmx-upptime/issues/12030",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1833433028
|
🛑 ESTAT is down
In 576a54f, ESTAT (https://ec.europa.eu/eurostat/api/dissemination/sdmx/2.1/dataflow/ESTAT/all/latest) was down:
HTTP code: 0
Response time: 0 ms
Resolved: ESTAT is back up in dcc61fc.
|
gharchive/issue
| 2023-08-02T15:40:05 |
2025-04-01T06:39:43.264937
|
{
"authors": [
"charphi"
],
"repo": "nbbrd/sdmx-upptime",
"url": "https://github.com/nbbrd/sdmx-upptime/issues/12651",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1847743827
|
🛑 ISTAT is down
In d7da63b, ISTAT (https://esploradati.istat.it/SDMXWS/rest/dataflow/all/all/latest) was down:
HTTP code: 0
Response time: 0 ms
Resolved: ISTAT is back up in dbe205e.
|
gharchive/issue
| 2023-08-12T04:36:36 |
2025-04-01T06:39:43.267312
|
{
"authors": [
"charphi"
],
"repo": "nbbrd/sdmx-upptime",
"url": "https://github.com/nbbrd/sdmx-upptime/issues/13263",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1182635421
|
🛑 OECD is down
In f0ccb78, OECD (https://stats.oecd.org/restsdmx/sdmx.ashx/GetDataStructure/ALL) was down:
HTTP code: 0
Response time: 0 ms
Resolved: OECD is back up in 712c590.
|
gharchive/issue
| 2022-03-27T19:21:26 |
2025-04-01T06:39:43.269649
|
{
"authors": [
"charphi"
],
"repo": "nbbrd/sdmx-upptime",
"url": "https://github.com/nbbrd/sdmx-upptime/issues/1366",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1936234490
|
🛑 ISTAT is down
In 78778a1, ISTAT (https://esploradati.istat.it/SDMXWS/rest/dataflow/all/all/latest) was down:
HTTP code: 0
Response time: 0 ms
Resolved: ISTAT is back up in 899921b after 11 minutes.
|
gharchive/issue
| 2023-10-10T20:57:15 |
2025-04-01T06:39:43.272372
|
{
"authors": [
"charphi"
],
"repo": "nbbrd/sdmx-upptime",
"url": "https://github.com/nbbrd/sdmx-upptime/issues/17116",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2056853192
|
🛑 NBB is down
In d66283a, NBB (https://stat.nbb.be/restsdmx/sdmx.ashx/GetDataStructure/ALL) was down:
HTTP code: 429
Response time: 693 ms
Resolved: NBB is back up in 68fe234 after 12 minutes.
|
gharchive/issue
| 2023-12-27T02:54:54 |
2025-04-01T06:39:43.274753
|
{
"authors": [
"charphi"
],
"repo": "nbbrd/sdmx-upptime",
"url": "https://github.com/nbbrd/sdmx-upptime/issues/21855",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2075723458
|
🛑 ESTAT is down
In d25b634, ESTAT (https://ec.europa.eu/eurostat/api/dissemination/sdmx/2.1/dataflow/ESTAT/all/latest) was down:
HTTP code: 0
Response time: 0 ms
Resolved: ESTAT is back up in 06d8325 after 7 minutes.
|
gharchive/issue
| 2024-01-11T04:31:39 |
2025-04-01T06:39:43.277240
|
{
"authors": [
"charphi"
],
"repo": "nbbrd/sdmx-upptime",
"url": "https://github.com/nbbrd/sdmx-upptime/issues/22799",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2124709971
|
🛑 IMF is down
In f24edc0, IMF (http://dataservices.imf.org/REST/SDMX_XML.svc/Dataflow) was down:
HTTP code: 0
Response time: 0 ms
Resolved: IMF is back up in 5fe71f1 after 5 minutes.
|
gharchive/issue
| 2024-02-08T09:30:02 |
2025-04-01T06:39:43.279638
|
{
"authors": [
"charphi"
],
"repo": "nbbrd/sdmx-upptime",
"url": "https://github.com/nbbrd/sdmx-upptime/issues/24597",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2140658698
|
🛑 ISTAT is down
In e728991, ISTAT (https://esploradati.istat.it/SDMXWS/rest/dataflow/all/all/latest) was down:
HTTP code: 0
Response time: 0 ms
Resolved: ISTAT is back up in fb901bb after 6 minutes.
|
gharchive/issue
| 2024-02-17T23:42:58 |
2025-04-01T06:39:43.282056
|
{
"authors": [
"charphi"
],
"repo": "nbbrd/sdmx-upptime",
"url": "https://github.com/nbbrd/sdmx-upptime/issues/25227",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2177248122
|
🛑 ISTAT is down
In 8bad49f, ISTAT (https://esploradati.istat.it/SDMXWS/rest/dataflow/all/all/latest) was down:
HTTP code: 0
Response time: 0 ms
Resolved: ISTAT is back up in 166b927 after 6 minutes.
|
gharchive/issue
| 2024-03-09T14:34:48 |
2025-04-01T06:39:43.284719
|
{
"authors": [
"charphi"
],
"repo": "nbbrd/sdmx-upptime",
"url": "https://github.com/nbbrd/sdmx-upptime/issues/26591",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1066883716
|
🛑 IMF is down
In ba06545, IMF (http://dataservices.imf.org/REST/SDMX_XML.svc/Dataflow) was down:
HTTP code: 0
Response time: 0 ms
Resolved: IMF is back up in 5632a43.
|
gharchive/issue
| 2021-11-30T07:23:51 |
2025-04-01T06:39:43.287107
|
{
"authors": [
"charphi"
],
"repo": "nbbrd/sdmx-upptime",
"url": "https://github.com/nbbrd/sdmx-upptime/issues/306",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2378991037
|
🛑 ESTAT is down
In 775214d, ESTAT (https://ec.europa.eu/eurostat/api/dissemination/sdmx/2.1/dataflow/ESTAT/all/latest) was down:
HTTP code: 0
Response time: 0 ms
Resolved: ESTAT is back up in 5df5755 after 14 minutes.
|
gharchive/issue
| 2024-06-27T19:59:19 |
2025-04-01T06:39:43.289430
|
{
"authors": [
"charphi"
],
"repo": "nbbrd/sdmx-upptime",
"url": "https://github.com/nbbrd/sdmx-upptime/issues/33652",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1407577429
|
🛑 OECD is down
In 48307a5, OECD (https://stats.oecd.org/restsdmx/sdmx.ashx/GetDataStructure/ALL) was down:
HTTP code: 0
Response time: 0 ms
Resolved: OECD is back up in c5cf8e6.
|
gharchive/issue
| 2022-10-13T10:46:12 |
2025-04-01T06:39:43.291862
|
{
"authors": [
"charphi"
],
"repo": "nbbrd/sdmx-upptime",
"url": "https://github.com/nbbrd/sdmx-upptime/issues/3436",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2422052619
|
🛑 ISTAT is down
In 72c3222, ISTAT (https://esploradati.istat.it/SDMXWS/rest/dataflow/all/all/latest) was down:
HTTP code: 0
Response time: 0 ms
Resolved: ISTAT is back up in c5f1b12 after 11 minutes.
|
gharchive/issue
| 2024-07-22T06:28:44 |
2025-04-01T06:39:43.294236
|
{
"authors": [
"charphi"
],
"repo": "nbbrd/sdmx-upptime",
"url": "https://github.com/nbbrd/sdmx-upptime/issues/35089",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2480595169
|
🛑 ESTAT is down
In 1eda01d, ESTAT (https://ec.europa.eu/eurostat/api/dissemination/sdmx/2.1/dataflow/ESTAT/all/latest) was down:
HTTP code: 0
Response time: 0 ms
Resolved: ESTAT is back up in 0a8ae97 after 38 minutes.
|
gharchive/issue
| 2024-08-22T11:53:17 |
2025-04-01T06:39:43.296877
|
{
"authors": [
"charphi"
],
"repo": "nbbrd/sdmx-upptime",
"url": "https://github.com/nbbrd/sdmx-upptime/issues/36864",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1507976840
|
🛑 OECD is down
In a6bfb3f, OECD (https://stats.oecd.org/restsdmx/sdmx.ashx/GetDataStructure/ALL) was down:
HTTP code: 0
Response time: 0 ms
Resolved: OECD is back up in 5451e9e.
|
gharchive/issue
| 2022-12-22T14:20:48 |
2025-04-01T06:39:43.299402
|
{
"authors": [
"charphi"
],
"repo": "nbbrd/sdmx-upptime",
"url": "https://github.com/nbbrd/sdmx-upptime/issues/4657",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1608692034
|
🛑 ESTAT is down
In 6fafeec, ESTAT (https://ec.europa.eu/eurostat/api/dissemination/sdmx/2.1/dataflow/ESTAT/all/latest) was down:
HTTP code: 0
Response time: 0 ms
Resolved: ESTAT is back up in 3e60f11.
|
gharchive/issue
| 2023-03-03T13:56:19 |
2025-04-01T06:39:43.301762
|
{
"authors": [
"charphi"
],
"repo": "nbbrd/sdmx-upptime",
"url": "https://github.com/nbbrd/sdmx-upptime/issues/6326",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1661680541
|
🛑 ESTAT is down
In 6ad11f0, ESTAT (https://ec.europa.eu/eurostat/api/dissemination/sdmx/2.1/dataflow/ESTAT/all/latest) was down:
HTTP code: 0
Response time: 0 ms
Resolved: ESTAT is back up in 8c9fc24.
|
gharchive/issue
| 2023-04-11T02:42:17 |
2025-04-01T06:39:43.304175
|
{
"authors": [
"charphi"
],
"repo": "nbbrd/sdmx-upptime",
"url": "https://github.com/nbbrd/sdmx-upptime/issues/7838",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1709697324
|
🛑 NBB is down
In 4e7460c, NBB (https://stat.nbb.be/restsdmx/sdmx.ashx/GetDataStructure/ALL) was down:
HTTP code: 429
Response time: 531 ms
Resolved: NBB is back up in 25f07f0.
|
gharchive/issue
| 2023-05-15T09:32:45 |
2025-04-01T06:39:43.306539
|
{
"authors": [
"charphi"
],
"repo": "nbbrd/sdmx-upptime",
"url": "https://github.com/nbbrd/sdmx-upptime/issues/9202",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1728669754
|
🛑 ESTAT is down
In 638793b, ESTAT (https://ec.europa.eu/eurostat/api/dissemination/sdmx/2.1/dataflow/ESTAT/all/latest) was down:
HTTP code: 0
Response time: 0 ms
Resolved: ESTAT is back up in d2d7ba0.
|
gharchive/issue
| 2023-05-27T11:42:11 |
2025-04-01T06:39:43.309157
|
{
"authors": [
"charphi"
],
"repo": "nbbrd/sdmx-upptime",
"url": "https://github.com/nbbrd/sdmx-upptime/issues/9713",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1730142176
|
🛑 IMF is down
In 5548253, IMF (http://dataservices.imf.org/REST/SDMX_XML.svc/Dataflow) was down:
HTTP code: 0
Response time: 0 ms
Resolved: IMF is back up in 175d90c.
|
gharchive/issue
| 2023-05-29T06:12:00 |
2025-04-01T06:39:43.311557
|
{
"authors": [
"charphi"
],
"repo": "nbbrd/sdmx-upptime",
"url": "https://github.com/nbbrd/sdmx-upptime/issues/9783",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1990947221
|
Update Posts “calcul-de-ratio-de-surface-et-détection-des-bâtiments-dit-de-bureaux-avec-python-et-pandas”
Automatically generated by Netlify CMS
👷 Deploy Preview for shiny-babka-998c7e processing.
Name
Link
🔨 Latest commit
8ccf22f1eefb049045e02ae29330b286e0f87aa9
🔍 Latest deploy log
https://app.netlify.com/sites/shiny-babka-998c7e/deploys/655246b7df2a1d00089f051b
|
gharchive/pull-request
| 2023-11-13T15:54:29 |
2025-04-01T06:39:43.314448
|
{
"authors": [
"nbirckel"
],
"repo": "nbirckel/mainsite-v2",
"url": "https://github.com/nbirckel/mainsite-v2/pull/81",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
776914425
|
Capturing time fields
Thanks for a brilliant library!
I'm trying to generate a form that allows the user to capture a "from-date/time" and a "to-date/time".
What is the best way to do this with NCForm?
If I define the fields with a widget of type "datetime" - they get shown a nice date-picker (I don't mind the date part), but I can't find a way to allow them to capture the time.
I tried setting the widget to "datetime-local" - but then VueJs thinks that there is no component called ncform-datetime-local?
If there is a workaround for this - I would gladly contribute an extra page in the documentation to describe how to do it, so that others can find this easily.
Try this:
{
"type": "object",
"properties": {
"name": {
"type": "string",
"default": "dx: {{+new Date()}}",
"ui": {
"widget": "date-picker"
}
},
"email": {
"type": "string"
},
"age": {
"type": "integer"
},
"adult": {
"type": "boolean"
}
}
}
You can paste the code to Playground
Thanks for the help Daniel. If I paste it into playground, I get the date-picker and this time the date is initialized from the dx-expression, but the "time-part" is still missing.
Is there a way to capture the hours and minutes ?
{
"type": "object",
"properties": {
"name": {
"type": "string",
"default": "dx: {{+new Date()}}",
"ui": {
"widget": "date-picker",
"widgetConfig": {
"type": "datetime"
}
}
},
"email": {
"type": "string"
},
"age": {
"type": "integer"
},
"adult": {
"type": "boolean"
}
}
}
If you don't need seconds, try this:
{
"type": "object",
"properties": {
"name": {
"type": "string",
"default": "dx: {{+new Date()}}",
"ui": {
"widget": "date-picker",
"widgetConfig": {
"type": "datetime",
"format": "yyyy-MM-dd HH:mm"
}
}
},
"email": {
"type": "string"
},
"age": {
"type": "integer"
},
"adult": {
"type": "boolean"
}
}
}
Thanks Daniel. That worked absolutely perfectly. Really cool!!!
I have one last question, but I'll create another issue for that one, since it's a little different.
|
gharchive/issue
| 2020-12-31T10:42:30 |
2025-04-01T06:39:43.336301
|
{
"authors": [
"NedDerick",
"daniel-dx"
],
"repo": "ncform/ncform",
"url": "https://github.com/ncform/ncform/issues/222",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
273571323
|
Discussion: Cache responses
Would caching results be positively received if added? If so, I'd like to work on a PR for this.
My idea for implementation of the cache would be similar to that of dataloader. There would be a cache option added to the options object that would accept any object with an API similar to Map.
According to the Google Maps ToS, the cache cannot persist for more than 30 days. Caching is acceptable if used "for the purpose of improving the performance...due to network latency". Because of the 30 day limitation—and because many would probably be interested in caching for fewer days—I'm thinking the options would also have a cacheTimeout option. cacheTimeout's value would be a Number representing the number of milliseconds from the time of caching that the entry would be considered stale and a network request would happen.
Thanks in advance!
Thanks for opening the discussion on this subject.
I am not sure about implementing the cache inside the library, but I think it could be nice to have an FAQ, or to explain in the readme how you can implement caching for node-geocoder.
Let me know your opinions on that.
I hadn't thought about that. Adding an example to the readme would certainly be simpler. I can make a PR if you're interested. Maybe a small example under the "More" section?
Yes, looks like a good idea, maybe 👍
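A minimal sketch of the kind of readme example discussed above: a generic wrapper that takes any async geocoding function, a `Map`-like `cache`, and a `cacheTimeout` in milliseconds, as proposed in the opening post. The `withCache` and `fakeGeocode` names are illustrative only — node-geocoder itself has no such option, and the stub stands in for a real geocoder instance.

```javascript
// Wrap any async geocode(query) function with a Map-like cache.
// cacheTimeout: milliseconds before a cached entry is considered stale
// (keep it at or under 30 days for Google Maps results, per the ToS note above).
function withCache(geocodeFn, { cache = new Map(), cacheTimeout = 24 * 60 * 60 * 1000 } = {}) {
  return async function cachedGeocode(query) {
    const entry = cache.get(query);
    if (entry && Date.now() - entry.storedAt < cacheTimeout) {
      return entry.value; // fresh entry: skip the network request
    }
    const value = await geocodeFn(query); // missing or stale: go to the network
    cache.set(query, { value, storedAt: Date.now() });
    return value;
  };
}

// Stub standing in for a real geocoder; records each "network" call.
const networkQueries = [];
async function fakeGeocode(query) {
  networkQueries.push(query);
  return [{ query, latitude: 0, longitude: 0 }];
}

const geocode = withCache(fakeGeocode, { cacheTimeout: 60 * 1000 });
```

With a real node-geocoder instance you would pass something like `(q) => geocoder.geocode(q)` as `geocodeFn`; the wrapper never touches the library itself.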
|
gharchive/issue
| 2017-11-13T20:44:09 |
2025-04-01T06:39:43.341447
|
{
"authors": [
"blakek",
"nchaulet"
],
"repo": "nchaulet/node-geocoder",
"url": "https://github.com/nchaulet/node-geocoder/issues/240",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
586880388
|
HERE: API updates - appId and appCode not supported anymore
Due to recent updates, appId and appCode are no longer supported (for new HERE projects; old projects still work for now, but will be deprecated).
We need to make a standard fetch with the URL and API key, for example:
https://geocode.search.hereapi.com/v1/geocode?q=<YOUR_SEARCH>&apiKey=<YOUR_API_KEY>
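For anyone migrating ahead of a library fix, a minimal sketch of building the new-style request URL — a single `apiKey` query parameter instead of the old app_id/app_code pair. The helper name `buildHereGeocodeUrl` is illustrative, not part of node-geocoder's API.

```javascript
// Build a new-style HERE geocoding URL: apiKey replaces app_id/app_code.
function buildHereGeocodeUrl(query, apiKey) {
  const url = new URL('https://geocode.search.hereapi.com/v1/geocode');
  url.searchParams.set('q', query);
  url.searchParams.set('apiKey', apiKey);
  return url.toString();
}

// e.g. fetch(buildHereGeocodeUrl('Invalidenstr 117, Berlin', process.env.HERE_API_KEY))
```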
Thanks for reporting the issue, going to fix this soon
Duplicate of #286
Resolved by #302
|
gharchive/issue
| 2020-03-24T11:12:23 |
2025-04-01T06:39:43.343566
|
{
"authors": [
"jvolker",
"nchaulet",
"pa-lem"
],
"repo": "nchaulet/node-geocoder",
"url": "https://github.com/nchaulet/node-geocoder/issues/297",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
418504184
|
output sha512 hashes with release archives
Resolves #91
Codecov Report
Merging #92 into master will not change coverage.
The diff coverage is n/a.
@@ Coverage Diff @@
## master #92 +/- ##
=======================================
Coverage 79.08% 79.08%
=======================================
Files 11 11
Lines 483 483
=======================================
Hits 382 382
Misses 101 101
Continue to review full report at Codecov.
Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 17aee12...5d4c58c. Read the comment docs.
|
gharchive/pull-request
| 2019-03-07T20:43:42 |
2025-04-01T06:39:43.355479
|
{
"authors": [
"codecov-io",
"quaggoth"
],
"repo": "ncr-devops-platform/nagios-foundation",
"url": "https://github.com/ncr-devops-platform/nagios-foundation/pull/92",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
67964766
|
Problem on Mac OS 10.6.8 with new drive
First time user of stressdisk, so not sure where the issue is with this:
Mac Pro (2010), Mac OS 10.6.8
New WD Black 4TB in external eSATA dock. Dock has a fan and drive is room temp to the touch.
I started stressdisk yesterday with the command:
./stressdisk -duration=24h0m0s -logfile="stressdisk.log" run /Volumes/NewDisk/
I was writing fine, about 175MB/s:
2015/04/11 14:19:54 Writing file "/Volumes/NewDisk/TST_0024" size 1000000000
2015/04/11 14:20:00 Writing file "/Volumes/NewDisk/TST_0025" size 1000000000
2015/04/11 14:20:05 Writing file "/Volumes/NewDisk/TST_0026" size 1000000000
2015/04/11 14:20:11 Writing file "/Volumes/NewDisk/TST_0027" size 1000000000
2015/04/11 14:20:16 Writing file "/Volumes/NewDisk/TST_0028" size 1000000000
As it tried to fill the disk, it slowed way, way down:
2015/04/12 03:13:40 Writing file "/Volumes/NewDisk/TST_3989" size 1000000000
2015/04/12 03:13:56 Writing file "/Volumes/NewDisk/TST_3990" size 1000000000
2015/04/12 08:48:03 Writing file "/Volumes/NewDisk/TST_3991" size 1000000000
2015/04/12 10:52:18 Writing file "/Volumes/NewDisk/TST_3992" size 1000000000
2015/04/12 12:03:40 Writing file "/Volumes/NewDisk/TST_3993" size 1000000000
2015/04/12 13:14:52 Writing file "/Volumes/NewDisk/TST_3994" size 1000000000
2015/04/12 14:26:27 Writing file "/Volumes/NewDisk/TST_3995" size 1000000000
We're past the 24 hours (started 4/11 14:17:42, currently 14:50:45) and it's now reading at about 25MB/s, which I assume is the read random portion of the test. Since it finished writing after the timer should have expired, will it ever stop? Also, are any error messages flushed out of whatever stream they are written to, so that if there are no errors visible I can assume that there have been none?
I'll let it continue to run for now. If I recall, if I kill it now and restart it, will it detect the disk as full and just read the existing files?
As the disk becomes full it will most likely have to write fragmented files, so it isn't entirely unexpected that it slows down. This might be caused by something else - maybe seeking back to the catalogue - I'm not sure.
The -duration flag only refers to the read portion of the test, so it should stop 24 hours after it finished writing the test files.
Any errors are summarised in the stats which are printed every minute and they will be printed at the end of the run too.
And yes, if you stop the process, it will detect the disks as full and carry on.
How did the test go?
I'm going to close this I haven't heard from you in a while. If you have more info then please re-open.
Thanks
Nick
|
gharchive/issue
| 2015-04-12T21:56:40 |
2025-04-01T06:39:43.370301
|
{
"authors": [
"ncw",
"sfenwick"
],
"repo": "ncw/stressdisk",
"url": "https://github.com/ncw/stressdisk/issues/4",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
251202676
|
v2.1.1
Switch to async/await syntax
Codecov Report
Merging #63 into master will increase coverage by 0.35%.
The diff coverage is 100%.
@@ Coverage Diff @@
## master #63 +/- ##
=========================================
+ Coverage 97.54% 97.9% +0.35%
=========================================
Files 23 23
Lines 489 525 +36
=========================================
+ Hits 477 514 +37
+ Misses 12 11 -1
Impacted Files                    Coverage Δ
src/main.js                       100% <100%> (ø)         :arrow_up:
src/parsers/index.js              96.82% <100%> (+2.08%)  :arrow_up:
src/utils/standalizeArticle.js    100% <0%> (ø)           :arrow_up:
Continue to review full report at Codecov.
Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 88ab41c...94474a9. Read the comment docs.
|
gharchive/pull-request
| 2017-08-18T10:20:29 |
2025-04-01T06:39:43.380582
|
{
"authors": [
"codecov-io",
"ndaidong"
],
"repo": "ndaidong/article-parser",
"url": "https://github.com/ndaidong/article-parser/pull/63",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
2356094
|
Global variables are mixed up
See this gist: https://gist.github.com/1395995 the two strings are swapped
Oooh, fun bug :D
|
gharchive/issue
| 2011-11-26T17:20:32 |
2025-04-01T06:39:43.385701
|
{
"authors": [
"duckinator",
"fredreichbier"
],
"repo": "nddrylliog/rock",
"url": "https://github.com/nddrylliog/rock/issues/337",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1174644281
|
Add support for language specifier GHC2021
CmdLine.getExtensions now considers the data constructor GHC2021 (in addition to Haskell98 and Haskell2010).
Thanks!
|
gharchive/pull-request
| 2022-03-20T18:40:14 |
2025-04-01T06:39:43.389357
|
{
"authors": [
"ndmitchell",
"shayne-fletcher"
],
"repo": "ndmitchell/hlint",
"url": "https://github.com/ndmitchell/hlint/pull/1362",
"license": "bsd-3-clause",
"license_type": "permissive",
"license_source": "bigquery"
}
|