id | text | source | created | added | metadata
---|---|---|---|---|---|
682249486
|
Enable publishing of upm packages in release pipelines
This change reads the registry path from the pipeline run and publishes the artifacts. This completes the UPM publishing work!
/azp run
|
gharchive/pull-request
| 2020-08-19T23:37:20 |
2025-04-01T06:39:34.377721
|
{
"authors": [
"davidkline-ms"
],
"repo": "microsoft/MixedRealityToolkit-Unity",
"url": "https://github.com/microsoft/MixedRealityToolkit-Unity/pull/8342",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
665674454
|
Remove TimedScopeGen since we define them manually
We've switched to creating TimedScopeDefinitions manually, so TimedScopeGen can be deprecated now.
Will this break any project that uses the timedscopeGen?
No projects are using this one; the current project uses TimedScopeGen from the Omex repo.
An alternative to deleting this one would be switching all projects to this one and delete the one in Omex. @vlad-ion how complicated do you think it will be?
If we are moving away from using a generator anyway, then removing it is fine. Updating the projects in the Omex repo is not worth it. If no projects we port out of that repo will use a generator, then we don't need this one; and once everything is out of the Omex repo, we won't need that one anymore either.
|
gharchive/pull-request
| 2020-07-25T23:00:23 |
2025-04-01T06:39:34.379757
|
{
"authors": [
"AndreyTretyak",
"gijskoning",
"vlad-ion"
],
"repo": "microsoft/Omex",
"url": "https://github.com/microsoft/Omex/pull/231",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
378092213
|
PTVS needs to give a reasonable error for unrecognized python versions
https://github.com/Microsoft/PTVS/issues/4842 describes a cryptic error message when PTVS didn't recognize python because the python version was too new.
Please make sure that the current version of PTVS will give a descriptive error when encountering future versions of python (e.g. Python 3.26.1, released in 2046).
@greazer , was this closed because it was fixed, or for another reason?
|
gharchive/issue
| 2018-11-07T00:30:21 |
2025-04-01T06:39:34.412026
|
{
"authors": [
"cowlinator"
],
"repo": "microsoft/PTVS",
"url": "https://github.com/microsoft/PTVS/issues/4847",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1414896585
|
PackageId:CPython39.Exe.x64;PackageAction:Repair;ReturnCode:1638;
This issue has been moved from a ticket on Developer Community.
Describe your issue here
===DO NOT EDIT BELOW THIS LINE===
PackageId:CPython39.Exe.x64;PackageAction:Repair;ReturnCode:1638;
Original Comments
Feedback Bot on 9/25/2022, 07:44 PM:
(private comment, text removed)
Original Solutions
(no solutions)
Hi there, we could use more detailed logs to see what's really going on. Can you please collect more logs by running http://aka.ms/vscollect.exe and uploading %temp%\vslogs.zip? Thank you!
Going to close this out. If the issue still persists, please reopen with the information requested. Thanks.😊
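The ReturnCode values in this ticket (1638 here, 1603 in the next one) are standard Windows Installer exit codes. As a hedged illustration, not an official PTVS diagnostic tool, a small lookup table can decode them; the mapping below covers only a few well-known winerror values:

```python
# Decode common Windows Installer (MSI) exit codes like the one in this report.
MSI_EXIT_CODES = {
    0: "ERROR_SUCCESS: action completed successfully",
    1603: "ERROR_INSTALL_FAILURE: fatal error during installation",
    1638: "ERROR_PRODUCT_VERSION: another version of this product is already installed",
    3010: "ERROR_SUCCESS_REBOOT_REQUIRED: a restart is required to complete the install",
}

def describe_msi_exit(code: int) -> str:
    """Return a human-readable description for an MSI exit code."""
    return MSI_EXIT_CODES.get(code, f"unrecognized exit code {code}")

print(describe_msi_exit(1638))
```

A 1638 on a Repair action, as here, typically means a different CPython 3.9 installation was already present.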
|
gharchive/issue
| 2022-10-19T12:36:20 |
2025-04-01T06:39:34.415430
|
{
"authors": [
"StellaHuang95",
"vsfeedback"
],
"repo": "microsoft/PTVS",
"url": "https://github.com/microsoft/PTVS/issues/7193",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1564776764
|
PackageId:CPython39.Exe.x64;PackageAction:Install;ReturnCode:1603;
This issue has been moved from a ticket on Developer Community.
Describe your issue here
===DO NOT EDIT BELOW THIS LINE===
PackageId:CPython39.Exe.x64; PackageAction:Install; ReturnCode:1603;
Original Comments
Feedback Bot on 1/19/2023, 07:19 PM:
(private comment, text removed)
Original Solutions
(no solutions)
Hi there, please see https://learn.microsoft.com/en-us/visualstudio/python/installing-python-support-in-visual-studio?view=vs-2022#troubleshooting.
If that doesn't solve your issue, we could use more detailed logs to see what's really going on. Can you please collect more logs by running http://aka.ms/vscollect.exe and uploading %temp%\vslogs.zip and reopen the ticket if it persists? Thank you!
|
gharchive/issue
| 2023-01-31T18:02:35 |
2025-04-01T06:39:34.419927
|
{
"authors": [
"StellaHuang95",
"vsfeedback"
],
"repo": "microsoft/PTVS",
"url": "https://github.com/microsoft/PTVS/issues/7359",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2350985591
|
error
[Window Title]
devenv.exe
[Main Instruction]
An unexpected error occurred
[Content]
Please press Ctrl+C to copy the contents of this dialog and report this error to our issue tracker.
[V] Show details [Close]
[Expanded Information]
Build: 17.0.24064.1
System.ObjectDisposedException: Cannot access a disposed object.
at Microsoft.VisualStudioTools.Project.ProjectNode.GetMsBuildProperty(String propertyName, Boolean resetCache)
at Microsoft.VisualStudioTools.Project.ProjectNode.GetProjectProperty(String propertyName, Boolean resetCache)
at Microsoft.PythonTools.Project.CondaEnvCreateProjectInfoBar.<CheckAsync>d__4.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at Microsoft.PythonTools.Infrastructure.VSTaskExtensions.<HandleAllExceptions>d__5.MoveNext()
Thanks for reporting the issue. Is your code in a public repo or could you provide code samples that help us repro this issue? Does this repro constantly for you? Could you also share more detailed repro steps? Thanks!
Thanks for the report. Unfortunately, this is not enough information for us to investigate this issue.
If this continues to be a problem, would you please reopen the issue and add detailed repro steps as outlined above?
|
gharchive/issue
| 2024-06-13T12:11:51 |
2025-04-01T06:39:34.423142
|
{
"authors": [
"StellaHuang95",
"aainkatronic",
"cwebster-99"
],
"repo": "microsoft/PTVS",
"url": "https://github.com/microsoft/PTVS/issues/7928",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
611690572
|
Getting the below issue when running - Install-Module -Name PartnerCenter -AllowClobber -Scope CurrentUser
Steps to reproduce
I ran PowerShell ISE as administrator and ran the script below.
Install-Module -Name PartnerCenter -AllowClobber -Scope CurrentUser
I got the following error when running the script:
WARNING: Unable to resolve package source 'https://www.powershellgallery.com/api/v2'.
PackageManagement\Install-Package : No match was found for the specified search criteria and module name 'PartnerCenter'. Try
Get-PSRepository to see all available registered module repositories.
At C:\Program Files\WindowsPowerShell\Modules\PowerShellGet\1.0.0.1\PSModule.psm1:1809 char:21
... $null = PackageManagement\Install-Package @PSBoundParameters
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
CategoryInfo : ObjectNotFound: (Microsoft.Power....InstallPackage:InstallPackage) [Install-Package], Exception
FullyQualifiedErrorId : NoMatchFoundForCriteria,Microsoft.PowerShell.PackageManagement.Cmdlets.InstallPackage
What steps can reproduce the defect?
Please share the setup, commandline for vstest.console, sample project, target
framework etc.
Expected behavior
It is supposed to install.
Share the expected output
Actual behavior
What is the behavior observed?
Diagnostic logs
Please share test platform diagnostics logs.
The logs may contain test assembly paths, kindly review and mask those before sharing.
Environment
Please share additional details about your environment.
Version
@balarjrao the appropriate changes to correct this issue have been made. Later today version 3.0.9 will be released which includes this hotfix.
|
gharchive/issue
| 2020-05-04T08:23:16 |
2025-04-01T06:39:34.429581
|
{
"authors": [
"balarjrao",
"isaiahwilliams"
],
"repo": "microsoft/Partner-Center-PowerShell",
"url": "https://github.com/microsoft/Partner-Center-PowerShell/issues/300",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
979415412
|
Add Census Data Notebook
This pull request adds the Census Data Notebook.
Currently, it includes everything except for Census Blocks; this notebook will be updated in the next few days to include Census Blocks with processed population data.
[x] Merge adjacent cells (e.g. titles and markdown text)
[x] Confirm that CRS is set correctly in the parquet metadata
[x] Consolidate contextily imports
[x] Consolidate read_parquets
[x] Remove .DS_STORE files (and ensure they aren't uploaded)
[x] use .parquet extension instead of .parq
[x] Optimize dtypes
[x] STATEFP -> category
[x] COUNTYFP -> category (?)
[x] BLKGRPCE -> int (int16, int32, int64?)
[x] NAME -> int
[x] NAMELSAD -> category (?)
[x] ALAND -> int
[x] AWATER -> int
[x] all object -> string
[x] PARTFLG -> category
[x] Census block specific
[x] Write a single partitioned dataset for all the states
[x] Drop SUFFIX
[x] INTPTLAT, INTPTLON -> int
[x] STUSAB -> category?
[x] GEOID as index?
FYI @nodell111, https://github.com/microsoft/PlanetaryComputerExamples/blob/main/CONTRIBUTING.md#linting walks through setting up pre-commit to automatically run the linting checks before pushing.
I updated the original post with some TODOs that are hopefully complete.
FYI, I've done some work on the STAC side to handle getting tokens. You should be able to use this to access the data from the private container.
import requests

r = requests.get("https://planetarycomputer.microsoft.com/api/sas/v1/token/ai4edataeuwest/us-census")
storage_options = {
"account_name": "ai4edataeuwest",
"credential": r.json()["token"],
}
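As a hedged sketch of how the retrieved token might be used: a SAS token is appended to a blob URL as its query string. The container and blob path below are hypothetical placeholders, not actual dataset locations:

```python
# Build a SAS-signed Azure Blob URL from an account, container, path, and token.
# Illustrative only; real code would typically pass the token via storage_options
# to a filesystem library instead of building URLs by hand.
def signed_blob_url(account: str, container: str, path: str, token: str) -> str:
    return f"https://{account}.blob.core.windows.net/{container}/{path}?{token}"

url = signed_blob_url("ai4edataeuwest", "us-census", "example.parquet", "sv=2020&sig=abc")
```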
It looks like the dtype change broke some plot examples. For example, the "Congressional Districts: 116th Congress (CD116)" plot is broken now:
ax = ddf[ddf.GEOID == "2402"].compute().plot(figsize=(10, 10), alpha=0.5, edgecolor="k")
should be (no quotes around 2402)
ax = ddf[ddf.GEOID == 2402].compute().plot(figsize=(10, 10), alpha=0.5, edgecolor="k")
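The dtype pitfall behind the broken plot can be shown with a minimal pure-Python sketch: once the column holds integers, comparing against a string literal never matches, so the selection is silently empty:

```python
# After the dtype change, GEOID values are integers.
geoids = [2402, 2403, 2404]

# Filtering with a string literal selects nothing...
empty = [g for g in geoids if g == "2402"]
# ...while filtering with an int matches as expected.
matched = [g for g in geoids if g == 2402]

print(empty)    # -> []
print(matched)  # -> [2402]
```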
Thanks all, merging!
|
gharchive/pull-request
| 2021-08-25T16:38:12 |
2025-04-01T06:39:34.438733
|
{
"authors": [
"TomAugspurger",
"nodell111"
],
"repo": "microsoft/PlanetaryComputerExamples",
"url": "https://github.com/microsoft/PlanetaryComputerExamples/pull/71",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
684750601
|
Cleanup contact2 (that was created via account2)
The sample did not cleanup all of the data it added.
@millarde-equinor Thanks for catching that.
|
gharchive/pull-request
| 2020-08-24T15:11:23 |
2025-04-01T06:39:34.439806
|
{
"authors": [
"millarde-equinor",
"phecke"
],
"repo": "microsoft/PowerApps-Samples",
"url": "https://github.com/microsoft/PowerApps-Samples/pull/152",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
589749314
|
The HTTP request was forbidden with client authentication scheme 'Anonymous'
Hello,
the CdsServiceClient throws an exception when trying to use the ClientSecret authentication and you have not registered the application in the dynamics instance yet. The stack trace is the following:
Microsoft.Powerplatform.Cds.Client.Utils.CdsConnectionException
HResult=0x80131500
Message=Failed to connect to Common Data Service
Source=Microsoft.Powerplatform.Cds.Client
StackTrace:
at Microsoft.Powerplatform.Cds.Client.CdsServiceClient.CreateCdsServiceConnection(Object externalOrgServiceProxy, AuthenticationType requestedAuthType, String hostName, String port, String orgName, NetworkCredential credential, String userId, SecureString password, String domain, String Geo, String claimsHomeRealm, Boolean useSsl, Boolean useUniqueInstance, OrganizationDetail orgDetail, UserIdentifier user, String clientId, Uri redirectUri, PromptBehavior promptBehavior, String tokenCachePath, OrganizationWebProxyClient externalOrgWebProxyClient, String certificateThumbPrint, StoreName certificateStoreName, X509Certificate2 certificate, Uri instanceUrl, Boolean isCloned, Boolean useDefaultCreds)
at Microsoft.Powerplatform.Cds.Client.CdsServiceClient..ctor(Uri instanceUrl, String clientId, String clientSecret, Boolean useUniqueInstance, String tokenCachePath)
at CDS.Core_Test.Program.Main(String[] args) in C:\Demo\CDS.Core Test\CDS.Core Test\Program.cs:line 17
This exception was originally thrown at this call stack:
Microsoft.Powerplatform.Cds.Client.CdsConnectionService.InitCdsService()
Microsoft.Powerplatform.Cds.Client.CdsConnectionService.GetCachedCDSService(out Microsoft.Powerplatform.Cds.Client.CdsConnectionService)
Microsoft.Powerplatform.Cds.Client.CdsConnectionService.IntilizeService(out Microsoft.Powerplatform.Cds.Client.CdsConnectionService)
Microsoft.Powerplatform.Cds.Client.CdsConnectionService.DoLogin(out Microsoft.Powerplatform.Cds.Client.CdsConnectionService)
Microsoft.Powerplatform.Cds.Client.CdsServiceClient.CreateCdsServiceConnection(object, Microsoft.Powerplatform.Cds.Client.AuthenticationType, string, string, string, System.Net.NetworkCredential, string, System.Security.SecureString, string, string, string, bool, bool, Microsoft.Xrm.Sdk.Discovery.OrganizationDetail, Microsoft.IdentityModel.Clients.ActiveDirectory.UserIdentifier, string, System.Uri, Microsoft.IdentityModel.Clients.ActiveDirectory.PromptBehavior, string, Microsoft.Xrm.Sdk.WebServiceClient.OrganizationWebProxyClient, string, System.Security.Cryptography.X509Certificates.StoreName, System.Security.Cryptography.X509Certificates.X509Certificate2, System.Uri, bool, bool)
Inner Exception 1:
MessageSecurityException: The HTTP request was forbidden with client authentication scheme 'Anonymous'.
You can use the following code to replicate the problem:
using Microsoft.Extensions.Configuration;
using Microsoft.Powerplatform.Cds.Client;
using System;
using System.IO;
namespace CDS.Core_Test
{
class Program
{
static void Main(string[] args)
{
IConfiguration config = new ConfigurationBuilder().SetBasePath(Directory.GetCurrentDirectory())
.AddJsonFile("appsettings.json", true, true)
.Build();
var cdsClientConfig = config.GetSection("CdsClient");
var cdsClient = new CdsServiceClient(new Uri(cdsClientConfig["Uri"]), cdsClientConfig["ClientId"], cdsClientConfig["ClientSecret"], true, cdsClientConfig["TokenCache"]);
if (cdsClient.IsReady)
{
Console.WriteLine("Hello, connected to CDS using .NET Core");
}
else
{
// Never reaches this line of code because the constructor throws an exception
Console.WriteLine("Failed to connect to CDS using .NET Core :(");
}
Console.ReadKey();
}
}
}
I replicated the same situation using CrmServiceClient from Microsoft.Xrm.Tooling.Connector, and it does not throw an exception; instead, it handles the exception and stores it in the CrmServiceClient.LastCrmException property. You can use the following code to replicate it using .NET Framework:
using Microsoft.Xrm.Tooling.Connector;
using System;
using System.Configuration;
namespace CrmServiceClient_Test
{
class Program
{
static void Main(string[] args)
{
var appSettings = ConfigurationManager.AppSettings;
var crmClient = new CrmServiceClient(new Uri(appSettings["Uri"]), appSettings["ClientId"], appSettings["ClientSecret"], true, appSettings["TokenCache"]);
if (crmClient.IsReady)
{
Console.WriteLine("Hello, connected to Dynamics using .NET Framework");
}
else
{
Console.WriteLine("Failed to connect to Dynamics using .NET Framework :(");
//this works without an exception
}
Console.ReadKey();
}
}
}
Actually, CrmServiceClient.LastCrmException does not have an InnerException and contains no trace information, and the most meaningful information I got from this (by configuring Visual Studio to break on thrown exceptions) is the following:
Microsoft.IdentityModel.Clients.ActiveDirectory.Internal.HttpRequestWrapperException
HResult=0x80131500
Message=
Source=Microsoft.IdentityModel.Clients.ActiveDirectory
StackTrace:
at Microsoft.IdentityModel.Clients.ActiveDirectory.Internal.Http.HttpClientWrapper.<GetResponseAsync>d__31.MoveNext()
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
Inner Exception 1:
HttpRequestException: Response status code does not indicate success: 401 (Unauthorized).
Inner Exception 2:
AdalException: : Unknown error
System.ServiceModel.Security.MessageSecurityException
HResult=0x80131501
Message=The HTTP request was forbidden with client authentication scheme 'Anonymous'.
Source=System.ServiceModel
StackTrace:
at System.ServiceModel.Channels.HttpChannelUtilities.ValidateAuthentication(HttpWebRequest request, HttpWebResponse response, WebException responseException, HttpChannelFactory`1 factory)
This exception was originally thrown at this call stack:
System.Net.HttpWebRequest.GetResponse()
System.ServiceModel.Channels.HttpChannelFactory<TChannel>.HttpRequestChannel.HttpChannelRequest.WaitForReply(System.TimeSpan)
Inner Exception 1:
WebException: The remote server returned an error: (403) Forbidden.
Thanks for the report. This is actually the behavior of the API endpoint in CDS coming through: since the identity you're trying to access with does not exist, it does not know what to do with it and throws a 403. We are looking at ways to make that message better from the API itself, more similar to the "user does not exist in this instance" error we used to throw when trying to access an instance where your user identity was blocked by security or policy.
One thing I did not specify clearly is that the CdsServiceClient constructor throws exceptions and instead the CrmServiceClient constructor catches the exception and puts it in the CrmServiceClient.LastCrmError property.
Yes, this is intentional. CrmServiceClient follows a C++ pattern for exception management from when it was originally designed many years ago. We wanted to move over to the more common "throw breaking exception on connect fail" but could not without breaking a lot of existing clients. Adding an option to the constructor or a static setting didn't make sense either, so we lived with it.
Also, re the exception stack trimming: we did a fix for that recently that will now include the full stack through all levels; if you update to the current NuGets you will get that behavior.
With CdsServiceClient we moved to using exceptions on connect. It's noted in the NuGet release notes. We have not yet decided whether we are going to do away with the LastException pattern completely, but for now it will always throw exceptions on connect, which is the more accepted pattern today.
Feedback is welcome on that.
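The two error-handling styles described above can be sketched in miniature. This is a hedged illustration with hypothetical class names, not the actual SDK implementation: LegacyClient mirrors the CrmServiceClient pattern (store the failure in a last-error property), while ModernClient mirrors CdsServiceClient (raise on connect failure):

```python
class LegacyClient:
    """Stores connection failures instead of raising (CrmServiceClient-style)."""
    def __init__(self):
        self.last_error = None
        self.is_ready = False

    def connect(self, ok: bool) -> None:
        if ok:
            self.is_ready = True
        else:
            # Failure is recorded; the caller must check is_ready / last_error.
            self.last_error = ConnectionError("Failed to connect")


class ModernClient:
    """Raises on connection failure (CdsServiceClient-style)."""
    def connect(self, ok: bool) -> None:
        if not ok:
            raise ConnectionError("Failed to connect")
```

The raise-on-failure style makes errors impossible to ignore, at the cost of breaking callers that relied on silent failure, which is the compatibility trade-off described above.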
I am so excited that there will finally be a client library we can use from .NET Core applications when interacting with Dynamics CRM. Thank you!!
Is there a resolution to this issue? I am getting the same error message when using a Client ID and Client Secret to connect to Dynamics 365 CRM Online. Thanks again.
Hello @twiga2013,
you have to register your application user in dynamics, the issue has been opened because the error message is misleading.
I have written a LinkedIn Article on how to build an MVP for this.
Best regards,
Betim.
Hi Betim,
I have created the App Registration under the same Azure Active Directory that Dynamics CRM is using, but when I go to add an Application User, all the fields are locked, even though I am logged in to Dynamics as System Administrator.
@twiga2013 , Change the user view from "user" to "Application User"
I did switch to the Application User form; everything is locked. I tried it from a completely different tenant and the Application User form fields are not locked there.
I will need to contact Microsoft support to find out why the application user form fields are all locked.
I am going to close this for now; the API team has a repro of this working and is working through providing a better error experience.
Thanks, all, for your feedback here.
|
gharchive/issue
| 2020-03-29T09:01:03 |
2025-04-01T06:39:34.450338
|
{
"authors": [
"BetimBeja",
"MattB-msft",
"twiga2013"
],
"repo": "microsoft/PowerPlatform-CdsServiceClient",
"url": "https://github.com/microsoft/PowerPlatform-CdsServiceClient/issues/5",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1106669261
|
[SERVICE TEAM SUPPORT REQUEST] LogicApp custom connector test
Description of Feature or Work Requested
How can I check my LogicApp custom connector before publishing?
I am seeing support only for Power Platform (Power Apps) but I want to check it in LogicApp env as this is how I initially created my custom connector.
Using paconn utility.
Also, a LogicApp custom connector only has one (JSON) file available for download,
And it doesn't seem in the format of PowerPlatformConnectors.
Which has 2 json files.
Finally I've created the artifacts manually and created a PR.
(I've verified things are working in azure LogicApps env)
Full guide:
https://docs.microsoft.com/en-us/connectors/custom-connectors/certification-submission#step-3d-validate-your-custom-connector-files
Any support/lead on how to validate my connector will be highly appreciated.
Target Date
20/01/2022
Hi @DavidMeu thank you for your feedback. Please add this feature request into the Power Automate forum. We will discuss this internally in the meantime.
|
gharchive/issue
| 2022-01-18T09:28:50 |
2025-04-01T06:39:34.455012
|
{
"authors": [
"DavidMeu",
"natalie-pienkowska"
],
"repo": "microsoft/PowerPlatformConnectors",
"url": "https://github.com/microsoft/PowerPlatformConnectors/issues/1371",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
836786588
|
System Crash: ColorPickerUI.exe
ℹ Computer information
PowerToys version: 0.33.1.0
PowerToy Utility: ColorPickerUI.exe
Running PowerToys as Admin: Yes
Windows build number: 20H2 19042.868
📝 Provide detailed reproduction steps (if any)
PowerToys set to start with Windows
Not using PowerToys, but its running in system tray
Using the PC as usual: VS Code, Edge (40+ tabs in multiple windows), etc.
✔️ Expected result
Windows should continue to work as usual, no lockups.
❌ Actual result
Windows just locks up: no mouse movement, no keyboard response, all movement on screen stops. I had to hard-reset to get it back to a normal state.
Event Viewer shows that it was ColorPickerUI.exe that faulted in KERNELBASE.dll.
📷 Screenshots
@chall3ng3r
Right-click on PowerToys icon in the tray menu and select Report Bug.
Drag and drop the report into a Github comment.
Here's the report which I made right after system start after crash.
PowerToysReport_2021-03-20-02-26-33.zip
@chall3ng3r
Can you upload %appdata%\ColorPicker folder as well?
Here you go,
ColorPicker.zip
Object reference not set to an instance of an object.
Inner exception:
Stack trace:
at ColorPicker.Keyboard.KeyboardMonitor.Hook_KeyboardPressed(Object sender, GlobalKeyboardHookEventArgs e)
at ColorPicker.Keyboard.GlobalKeyboardHook.LowLevelKeyboardProc(Int32 nCode, IntPtr wParam, IntPtr lParam)
@enricogior
It looks like #9573 and should be resolved at 0.35 release.
|
gharchive/issue
| 2021-03-20T11:36:37 |
2025-04-01T06:39:34.462330
|
{
"authors": [
"chall3ng3r",
"enricogior",
"mykhailopylyp"
],
"repo": "microsoft/PowerToys",
"url": "https://github.com/microsoft/PowerToys/issues/10344",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1031536659
|
FancyZones reverts to defaults when docking and undocking Surface Pro 7 using Surface Dock 2
Microsoft PowerToys version
0.47.1
Running as admin
[X] Yes
Area(s) with issue?
FancyZones
Steps to reproduce
Surface Pro docked with 2 external monitors. Dock is Surface Dock 2. Video connectors are 2x USB-C to DisplayPort; Surface Pro device has screen turned off. On startup (while docked) FancyZones loads custom zones as expected.
Undocking the Surface Pro --> FancyZones reverts to default zones on device (custom zones discarded). Toggling FancyZones returns custom zones setup.
OR
Surface Pro device undocked. Upon startup, FancyZones custom zones load correctly.
Docking as described above. Surface Pro's screen is turned off in this configuration (keyboard cover closed). --> FancyZones reverts to default zones for dual external screens. Toggling FancyZones returns custom zones.
✔️ Expected Behavior
Custom zones should be maintained.
❌ Actual Behavior
Zones revert to defaults. Toggling FancyZones is required to return to custom zones.
Other Software
No response
does this issue still happen with v0.73.0? /needinfo
Hello,
Thanks for following up with this issue.
I can no longer reproduce the issue with v0.73.0. I believe it hasn't been an issue for me for a while. Thank you to those who helped fix it, and apologies for not updating everyone here.
Andrew
|
gharchive/issue
| 2021-10-20T15:18:05 |
2025-04-01T06:39:34.467420
|
{
"authors": [
"AndrewGarib",
"TheJoeFin"
],
"repo": "microsoft/PowerToys",
"url": "https://github.com/microsoft/PowerToys/issues/13931",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1086547102
|
Always On Top
Description of the new feature / enhancement
The ability to show an application on top of others.
Scenario when this would be used?
Working away on my study in a Word Document and I wanted to watch a movie at the same time in VLC Player.
I would like to Pin VLC Player to always be placed on top of other applications.
Supporting information
No response
Issue is a duplicate
Thank you for your issue! We've linked your report against another issue #13
Thanks for helping us to make PowerToys a better piece of software.
Duplicate of #13
This is in progress too :D
Hope we can ship this soon.
This is a necessity!! Can't stress the importance of this feature enough in saving lots of unnecessary clicks especially now that Taskbar drag and drop file support is gone.
goal is Dec release
|
gharchive/issue
| 2021-12-22T07:48:58 |
2025-04-01T06:39:34.470621
|
{
"authors": [
"Aaron-Junker",
"AgrMayank",
"HelixCreations",
"crutkas",
"franky920920"
],
"repo": "microsoft/PowerToys",
"url": "https://github.com/microsoft/PowerToys/issues/15110",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
579469486
|
powertoys sets
the function sets in powertoys as a function
Sorry, but we're a bit uncertain what this means. Happy to reopen the issue once we have more detail.
|
gharchive/issue
| 2020-03-11T18:40:49 |
2025-04-01T06:39:34.471545
|
{
"authors": [
"banaantjes",
"crutkas"
],
"repo": "microsoft/PowerToys",
"url": "https://github.com/microsoft/PowerToys/issues/1541",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1098176207
|
PowerToys Run lags when searching for an application
Microsoft PowerToys version
0.53.1
Running as admin
[X] Yes
Area(s) with issue?
PowerToys Run
Steps to reproduce
Activate the shortcut for PowerToys Run and enter an application's name.
✔️ Expected Behavior
The text to appear on the textbox immediately
❌ Actual Behavior
There's a lag between pressing a key on the keyboard and the character appearing in the text box; it's more significant when typing fast.
Other Software
No response
Looks like a duplicate of https://github.com/microsoft/PowerToys/issues/15364.
Makes sense to you @wildman9x ?
Could you please try the debug build provided there to see if it fixes for you?
Yeah it fixed the lagging issue for me, thanks man
|
gharchive/issue
| 2022-01-10T17:25:34 |
2025-04-01T06:39:34.475767
|
{
"authors": [
"jaimecbernardo",
"wildman9x"
],
"repo": "microsoft/PowerToys",
"url": "https://github.com/microsoft/PowerToys/issues/15415",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1327147889
|
Unable to turn off "Always run as administrator"
Microsoft PowerToys version
0.61.1
Running as admin
[X] Yes
Area(s) with issue?
General
Steps to reproduce
Turning off "Always Run as Administrator" doesn't work. Requires admin login at every startup.
✔️ Expected Behavior
Unchecking "Always Run as administrator" would save the setting. Restart of PowerToys would not require an admin login.
❌ Actual Behavior
PowerToys asks for admin login during next restart. "Always run as administrator" is checked on again.
Other Software
No response
/bugreport
Not OP, but I have the same problem. In my case, the likely culprit is that PowerToys is started via UAC with a different account than the one the session is actually opened with. I had to go to that user's desktop folder to get the diagnostic zip.
PowerToysReport_2022-08-16-13-55-19.zip
Very likely the cause. Company policy now requires me to use a new administrator account separate from my main user account, which I used to install initially. Uninstall and reinstall seems to keep everything linked to the original user account.
PowerToysReport_2022-08-25-16-18-58.zip
Here is the Bug Report as requested.
Don't know if this helps anyone, but if you navigate to 'C:\Users%username%\AppData\Local\Microsoft\PowerToys' and open the settings.json in a text editor there is a "run_elevated" option. I changed that to 'false' and relaunched the app and it fixed my issue.
Ah right, it reads the config from the launching user's config but saves it to the configuration of the elevated user. Makes sense.
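The settings.json workaround described in this thread can be sketched as follows. This is a hedged illustration operating on a throwaway copy; the real file lives under %LocalAppData%\Microsoft\PowerToys, and the top-level "run_elevated" key follows the comment above rather than any official schema:

```python
import json
import tempfile
from pathlib import Path

def disable_run_elevated(settings_path: Path) -> None:
    """Flip the 'run_elevated' flag to false in a PowerToys-style settings.json."""
    settings = json.loads(settings_path.read_text(encoding="utf-8"))
    settings["run_elevated"] = False
    settings_path.write_text(json.dumps(settings, indent=2), encoding="utf-8")

# Demonstrate on a throwaway settings file rather than the real one.
tmp = Path(tempfile.mkdtemp()) / "settings.json"
tmp.write_text(json.dumps({"run_elevated": True}), encoding="utf-8")
disable_run_elevated(tmp)
```

Relaunch PowerToys after editing so the change is picked up, as the commenter notes.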
|
gharchive/issue
| 2022-08-03T12:30:32 |
2025-04-01T06:39:34.485530
|
{
"authors": [
"Dreistul-dev",
"bendem",
"jaimecbernardo",
"neddntd"
],
"repo": "microsoft/PowerToys",
"url": "https://github.com/microsoft/PowerToys/issues/19745",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1435717833
|
Customizing Windows Explorer with background colors according to folder
Description of the new feature / enhancement
This feature would make it easy to change the background color of Windows Explorer for different folders.
Additionally, it would change all folder icons in a given folder.
Scenario when this would be used?
This would be useful when you work, for instance, with different drives/servers and need to easily identify where you are. I work with two networks that have the same folder structure, making it a bit hard to know which network you are in.
Supporting information
No response
/dup #6157
|
gharchive/issue
| 2022-11-04T08:22:17 |
2025-04-01T06:39:34.487837
|
{
"authors": [
"Yay-Ou",
"crutkas"
],
"repo": "microsoft/PowerToys",
"url": "https://github.com/microsoft/PowerToys/issues/21718",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1609828619
|
Launch a PowerToys feature from PowerToys Run
Description of the new feature / enhancement
I would like to be able to launch a PowerToys feature from the PowerToys Run, such as Colorpicker or Awake
Scenario when this would be used?
Since there are a lot of shortcuts for each PowerToys features, if you don't use one very often you forget the associated shortcut and lose time to open PowerToys, search for the feature in the menu and launch it from there or find the shortcut
Supporting information
No response
Thanks for your message. This is a duplicate of another issue. Additional comments may be added there. /dup #17351
|
gharchive/issue
| 2023-03-04T15:24:57 |
2025-04-01T06:39:34.490152
|
{
"authors": [
"Jay-o-Way",
"amauryblin"
],
"repo": "microsoft/PowerToys",
"url": "https://github.com/microsoft/PowerToys/issues/24553",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1609865675
|
Convert text from lowercase to uppercase and vice versa
Description of the new feature / enhancement
Convert text between lowercase and uppercase by selecting the incorrect text fragment and pressing a keyboard shortcut to fix your text immediately.
e.g. hELLO wORLD -> CTRL + ALT + Space bar -> Hello World
Scenario when this would be used?
Whenever I need it.
Supporting information
No response
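The behavior asked for above (hELLO wORLD -> Hello World) is a per-character case swap; a minimal sketch of what such a shortcut could run, with swapCase as a hypothetical helper name:

```typescript
// Hypothetical helper: swap the case of every character in the selection.
function swapCase(text: string): string {
  return [...text]
    .map(ch => ch === ch.toLowerCase() ? ch.toUpperCase() : ch.toLowerCase())
    .join("");
}

console.log(swapCase("hELLO wORLD")); // "Hello World"
```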
Duplicate of 907
/dup #907
|
gharchive/issue
| 2023-03-04T17:22:22 |
2025-04-01T06:39:34.492436
|
{
"authors": [
"An-NahL-Am",
"crutkas",
"technobulb"
],
"repo": "microsoft/PowerToys",
"url": "https://github.com/microsoft/PowerToys/issues/24556",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1891214394
|
PowerRename cannot rename ß to ss
Microsoft PowerToys version
0.73.0
Installation method
WinGet
Running as admin
No
Area(s) with issue?
PowerRename
Steps to reproduce
Create some files that use the ß character example:
Daliah Lavi - Weißt du, was du für mich bist?.mp3
In the PowerRename utility, enter the character to search ß and the character to replace with ss.
✔️ Expected Behavior
The renaming window should show the result as:
Daliah Lavi - Weisst du, was du für mich bist?.mp3
With the option to apply the rename.
❌ Actual Behavior
No renaming result is shown and the action to apply does not become available.
Using some other combination of characters to replace with seems to work.
Other Software
No response
I can reproduce this
Renaming preview:
ß to ss (not working):
ß to s (working):
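A likely ingredient of this bug (an assumption about PowerRename's internals, which are C++ and not shown here): Unicode case-folds "ß" to "SS", so case-insensitive search/replace logic can conflate the one-character "ß" with the two-character "ss". The folding itself is easy to check from TypeScript:

```typescript
// Unicode special casing: "ß" uppercases to the two-character string "SS".
console.log("ß".toUpperCase());          // "SS"
console.log("ß".length, "ss".length);    // 1 2
console.log("ß".toUpperCase() === "SS"); // true
```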
|
gharchive/issue
| 2023-09-11T20:27:41 |
2025-04-01T06:39:34.497504
|
{
"authors": [
"Aaron-Junker",
"lkraider"
],
"repo": "microsoft/PowerToys",
"url": "https://github.com/microsoft/PowerToys/issues/28509",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2041752903
|
Screen ruler utility is broken in v0.76.2
Microsoft PowerToys version
0.76.2
Installation method
Microsoft Store
Running as admin
Yes
Area(s) with issue?
Screen ruler
Steps to reproduce
Open Screen Ruler by selecting it in the PowerToys menu or with the shortcut.
Try to take a measurement.
The ruler utility crashes.
✔️ Expected Behavior
The utility does not crash and can perform measurements as usual.
❌ Actual Behavior
The ruler utility crashes.
Other Software
Using with 2 monitors connected to one laptop with Windows 10 Enterprise 21H2.
/bugreport
Hi! Sure, I attached the file. Thanks!
PowerToysReport_2023-12-15-09-51-36.zip
It seems the problem fixes itself, maybe something was wrong with my laptop. Thank you!
|
gharchive/issue
| 2023-12-14T13:55:05 |
2025-04-01T06:39:34.501928
|
{
"authors": [
"pablohs1986",
"stefansjfw"
],
"repo": "microsoft/PowerToys",
"url": "https://github.com/microsoft/PowerToys/issues/30446",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
757352734
|
PreviewPane Porting .NET Core 3.1
Porting of the following projects to .NET Core 3.1
MarkdownPreviewHandler
PreviewHandlerCommon
SvgPreviewHandler
SVGThumbnailProvider
PreviewPaneUnitTests
UnitTests-PreviewHandlerCommon
UnitTests-SvgPreviewHandler
UnitTests-SvgThumbnailProvider
Related to #776
Please assign this to me 😃
Fixed with 0.29, released today. https://github.com/microsoft/PowerToys/releases/tag/v0.29.0
|
gharchive/issue
| 2020-12-04T19:36:42 |
2025-04-01T06:39:34.504904
|
{
"authors": [
"crutkas",
"davidegiacometti"
],
"repo": "microsoft/PowerToys",
"url": "https://github.com/microsoft/PowerToys/issues/8405",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
794823019
|
Build an ISO image
@mayurjansari please follow the new feature request template. We request both the what and the 'why'. scenarios are critical. This is too vague as is.
namely:
What is the expected behavior of the proposed feature? What is the scenario this would be used?
Nowadays an ISO is a very good way to keep files safe as they are; it will probably not change any file inside the ISO. I use an ISO file instead of an archive file because an archive needs to be extracted before you can use it, while an ISO just mounts and you can run your setup directly, and file transfer is fast. Some software like Ventoy can boot directly from an ISO file.
Right-click on a folder -> it shows "Build an ISO" -> a dialog box appears.
In the dialog box:
File name of the ISO -> defaults to the folder name
ISO label -> defaults to the folder name
Icon, if possible
Some ISO formats also support compression
can we go more into "keeping a file safe"? Is the big thing here making sure a file isn't tampered with / is read only?
No, not tamper-proofing or read-only. It is mainly useful for keeping files as the provider gave them. In my case, antivirus is a headache when I use ConfuserEx on a program.
i'm not totally sure i understanding your use-case here still. Are you obfusicating a program you didn't create?
Yes, all programs I build are protected with ConfuserEx to stop reverse engineering. It does not stop it completely, but it makes the code difficult to read and understand.
@mayurjansari your statement sounds like it is about programs you built, my question was about all programs. An example scenario would be would you run confuserEx against the signed PowerToy DLLs and then put them in the ISO?
I'm still trying to understand the use-case(s) for how this tool would be used. Right now, i bet you'd still hit the same issue as an OS would view this as a USB/DVD loading and i'm betting the anti-virus would still run against the files there
It is applied only to programs created by me.
Antivirus scans the files, but inside an ISO it cannot delete them. Some good antivirus programs do not detect ConfuserEx-protected files as infected.
This is a pretty unique use case. Honestly, something is wrong if anti-virus is flagging the file. I'd also imagine the anti-virus would quarantine the ISO at that point.
Most leading antivirus programs quarantine the file; some of them also delete it. I checked most antivirus programs: they quarantine the file, not the ISO. They show a notification when the ISO file is opened, but do not quarantine the file.
|
gharchive/issue
| 2021-01-27T06:47:05 |
2025-04-01T06:39:34.514603
|
{
"authors": [
"crutkas",
"mayurjansari"
],
"repo": "microsoft/PowerToys",
"url": "https://github.com/microsoft/PowerToys/issues/9310",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
610346532
|
[Keyboard Manager] Fix Remap buttons stop working
Summary of the Pull Request
I got it to stop crashing by bypassing XamlBridge Window focus handling.
PR Checklist
[x] Applies to #2321
[x] CLA signed. If not, go over here and sign the CLA
Detailed Description of the Pull Request / Additional comments
Xaml Bridge seems to not handle focus events properly, so this fix just bypasses it by handling the WM_SETFOCUS event directly.
This also solves another not documented crash that involved minimizing the window, then closing by pressing the X button without interacting with the window content itself.
Validation Steps Performed
Click on the Remap a key button with no key remappings.
Close the window by pressing the X button.
It shouldn't crash.
Repeat for Redefine a shortcut.
Another crashing scenario.
Click on the Remap a key button.
Minimize the Window.
Click on the taskbar to show the window again.
Close the window by pressing the X button.
It shouldn't crash.
Also could you find any explanation for why it happens only when there are no key remaps?
And also if you minimize it and right click close it.
Good catch, this causes a crash.
Since I can't repro the error, can you verify that if you open it, click away to another window and then close it (while it's not in focus), will it still not crash?
This doesn't cause a crash, on the other hand.
This is to confirm that the crash happens because the window requires focus atleast once before closing, rather than the window needing focus whenever it is closed.
Maybe just calling SetFocus before closing the window does the trick? I'll try that out.
I actually was able to repro the minimize thing. I think its unrelated to the focus problem, cause I got it as well. It doesn't actually crash the runner in my case, but the buttons stop working. I think there is probably something going wrong with the button mutex in that case. Sak is making a separate issue for it. I can look into that, since I think it might be unrelated to the PR. I'm not sure if it would also crash the runner in your case, but it isnt happening now because of some other issue.
It looks like in case of minimize the code after the message loop where EditKeyboardNativeHandle is set to null does not take place, and since that remains non-null, when the button is pressed after that since it is not null, the runner thinks the window is still open, so it tries to bring that to the foreground even though it doesn't exist. I'm not entirely sure why it doesn't close though, maybe WM_DESTROY does not get executed on closing from the right-click menu after minimize.
Actually, it looks like it applies even if you don't minimize and you just right-click at the bottom and exit it. Seems unrelated to this issue though, so I'll see if I can come up with a fix and then you can verify that it doesn't have the focus interaction in addition to it. This PR #2566 fixes the issue. Maybe you can check if it works in your case as well since the focus could also be another problem to add to it
#2566 does seem to fix the issue, however as you said it now causes a crash on insiders builds since the window doesn't have focus when its closed. I'll try SetFocus when receiving a WM_SYSCOMMAND message to close the window.
|
gharchive/pull-request
| 2020-04-30T19:56:07 |
2025-04-01T06:39:34.524216
|
{
"authors": [
"arjunbalgovind",
"traies"
],
"repo": "microsoft/PowerToys",
"url": "https://github.com/microsoft/PowerToys/pull/2562",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
608671376
|
[Draft] New pre-built image for Katas to improve startup
This PR adds a new Dockerfile to create a fully baked Katas image that removes online Nuget sources to improve Package loading times (and therefore the first cell execution of all katas).
After @tcNickolas comments, I think it's best to abandon this PR and only have the #335 PR in a special MyBinder branch as @cgranade did for Samples.
Are there any fragments of this change that can be applied to the master Dockerfile, to improve the experience people have running from it? I agree that option 2 you described is better (thank you for writing it up in such level of detail!), but it will involve updating all our links to Binder to use that special branch. I can imagine some missed or non-updatable links (such as in tweets), so it would be nice to have a reasonable experience when running Binder from master. It seems that disabling the online NuGet sources after prebuilding them will still yield the speedup on %package commands?
Yes, we can disable the online Nuget sources on the current Dockerfile.
Can you check/approve PR #336? After that I'll do another small PR to combine all the RUN commands into one (slightly reduce image size by creating just one layer instead of a dozen) and disable the Nuget online sources.
Regarding updating links, we should stick to using aka.ms for that so we can update it any time. Do we have one for Katas already?
Created new PR #337 that transfer optimizations from this PR without doing a breaking change in MyBinder behavior that this PR would do
|
gharchive/pull-request
| 2020-04-28T23:09:36 |
2025-04-01T06:39:34.528137
|
{
"authors": [
"tcNickolas",
"vxfield"
],
"repo": "microsoft/QuantumKatas",
"url": "https://github.com/microsoft/QuantumKatas/pull/333",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1186231789
|
How to operate the Plaintext
Hello SEAL team!
I found methods to evaluate between two ciphertexts, or between a plaintext and a ciphertext, in the Evaluator class. But how can I get the sum or product of two plaintexts of polynomial rings?
Can I get the multiplicative depth through the encryption parameters?
SEAL does not offer APIs for plaintext-plaintext operations. You can sum/multiply messages first then encode them into a new plaintext.
Check this example
|
gharchive/issue
| 2022-03-30T10:22:48 |
2025-04-01T06:39:34.532206
|
{
"authors": [
"WeiDaiWD",
"jun1015"
],
"repo": "microsoft/SEAL",
"url": "https://github.com/microsoft/SEAL/issues/473",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
542272370
|
AttributeError: 'lxml.etree._Element' object has no attribute 'fldCharType'
When I open my .docx file, which was saved from a .doc file using python-docx, this error comes out:
C:\Users\Luke\Anaconda3\lib\site-packages\simplify_docx\iterators\generic.py:194: UnexpectedElementWarning: Skipping unexpected tag: {http://schemas.openxmlformats.org/wordprocessingml/2006/main}background
  UnexpectedElementWarning)
C:\Users\Luke\Anaconda3\lib\site-packages\simplify_docx\iterators\generic.py:194: UnexpectedElementWarning: Skipping unexpected tag: {http://schemas.openxmlformats.org/wordprocessingml/2006/main}pict
  UnexpectedElementWarning)
Traceback (most recent call last):
  File "E:/_master/硕士论文/data/data_preprocess/temp.py", line 27, in <module>
    db_json = simplify(db)
  File "C:\Users\Luke\Anaconda3\lib\site-packages\simplify_docx\__init__.py", line 33, in simplify
    out = document(doc.element).to_json(doc, _options)
  File "C:\Users\Luke\Anaconda3\lib\site-packages\simplify_docx\elements\base.py", line 106, in to_json
    "VALUE": [ elt.to_json(doc, options) for elt in self],
  File "C:\Users\Luke\Anaconda3\lib\site-packages\simplify_docx\elements\base.py", line 106, in <listcomp>
    "VALUE": [ elt.to_json(doc, options) for elt in self],
  File "C:\Users\Luke\Anaconda3\lib\site-packages\simplify_docx\elements\body.py", line 25, in to_json
    JSON = elt.to_json(doc, options, iter_me)
  File "C:\Users\Luke\Anaconda3\lib\site-packages\simplify_docx\elements\paragraph.py", line 142, in to_json
    out: Dict[str, Any] = super(paragraph, self).to_json(doc, options, super_iter)
  File "C:\Users\Luke\Anaconda3\lib\site-packages\simplify_docx\elements\paragraph.py", line 27, in to_json
    for elt in run_iterator:
  File "C:\Users\Luke\Anaconda3\lib\site-packages\simplify_docx\elements\base.py", line 62, in __iter__
    self.__iter_name__ if self.__iter_name__ else self.__type__):
  File "C:\Users\Luke\Anaconda3\lib\site-packages\simplify_docx\iterators\generic.py", line 167, in xml_iter
    for elt in xml_iter(current, handlers.TAGS_TO_NEST[current.tag], _msg):
  File "C:\Users\Luke\Anaconda3\lib\site-packages\simplify_docx\iterators\generic.py", line 156, in xml_iter
    yield handlers.TAGS_TO_YIELD[current.tag](current)
  File "C:\Users\Luke\Anaconda3\lib\site-packages\simplify_docx\elements\form.py", line 106, in __init__
    super(fldChar, self).__init__(x)
  File "C:\Users\Luke\Anaconda3\lib\site-packages\simplify_docx\elements\base.py", line 36, in __init__
    self.props[prop] = getattr(x, prop)
AttributeError: 'lxml.etree._Element' object has no attribute 'fldCharType'
what should I do to solve it?
Thank you for your question! This error indicates that the .docx file was not valid. Specifically, the fldCharType attribute is required on fldChar elements, by the open office specification, and the error indicates that this required attribute is missing.
Since you saved this file with python-docx (from the original .doc file), you might want to raise an issue with python-docx.
I had the same error with some tables in the docx. Is there a required docx format for simplify_docx? In other words, which attributes in docx files can simplify_docx not deal with?
Thanks.
Excuse me, has this issue been resolved?
according to the README
This project relies on the python-docx package which can be installed via
pip install python-docx. However, as of this writing, if you wish to
scrape documents which contain (A) form fields such as drop down lists,
checkboxes and text inputs or (B) nested documents (subdocs, altChunks,
etc.), you'll need to clone this fork of the python-docx package
You need a modified version of python-docx to support parsing forms, otherwise python-docx will not parse fldChar and cause this issue.
I created a new fork moving over the changes made in this (stale) python-docx fork to the latest python-docx repo and this error was resolved.
You can install the new fork:
pip install git+https://github.com/dalmia/python-docx.git
Hope this helps!
|
gharchive/issue
| 2019-12-25T06:51:09 |
2025-04-01T06:39:34.557013
|
{
"authors": [
"Ledenel",
"LukeALee",
"Orangeices",
"dalmia",
"jdthorpe",
"yellowishee"
],
"repo": "microsoft/Simplify-Docx",
"url": "https://github.com/microsoft/Simplify-Docx/issues/3",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2660619838
|
Building nft chinese learning with awards
I'm finding ITs who can work with me in nft Chinese learning program. Dm me if you feel excited.
Spam or suspicious activity. Closing.
|
gharchive/issue
| 2024-11-15T03:30:49 |
2025-04-01T06:39:34.558327
|
{
"authors": [
"Maxi986",
"paulosalem"
],
"repo": "microsoft/TinyTroupe",
"url": "https://github.com/microsoft/TinyTroupe/issues/31",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
464800920
|
Resolve large font size in handbook documentation
In reference to #1061
The following address is showing a large font for import resolutions:
https://www.typescriptlang.org/docs/handbook/module-resolution.html#path-mapping
Instead of using an ordered list within a bullet point, bring the subject line to the parent level and then use an ordered list.
Screenshot below for reference:
As we can see, the font is much larger than the rest of the documentation.
We got a PR fixing this I believe 👍
|
gharchive/issue
| 2019-07-06T00:28:43 |
2025-04-01T06:39:34.560471
|
{
"authors": [
"justinpage",
"orta"
],
"repo": "microsoft/TypeScript-Handbook",
"url": "https://github.com/microsoft/TypeScript-Handbook/issues/1062",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
521897036
|
No "Code ran. Check the console" if code throws an exception
http://www.typescriptlang.org/play/index.html?target=1&ts=3.8.0-dev.20191105&ssl=7&ssc=1&pln=8&pc=1#code/JYOwLgpgTgZghgYwgAgPICMBWyDeBYAKGWOTgH4AuXQk25dS5AZzClAHMaSBfQ3gwgBsIYZAHssVDNgC81IiThV8CuvSoAiOOgQauxfv0IIxIJmOEA6QWPYAKCZkvlL6AISWmAB0HAwdjXQNAEpgoA
interface Obj {
a?: {
b?: string
}
}
let obj: Obj = {
a: {
b: "abc"
}
}
console.log(obj.a?.b!.split("b"))
This is fixed in v3
|
gharchive/issue
| 2019-11-13T01:20:33 |
2025-04-01T06:39:34.561949
|
{
"authors": [
"DanielRosenwasser",
"orta"
],
"repo": "microsoft/TypeScript-Website",
"url": "https://github.com/microsoft/TypeScript-Website/issues/126",
"license": "CC-BY-4.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
994381352
|
added a warning against object literal shape-matching
added a warning against object literal shape-matching and provided a code example and error code.
Thanks for the PR. This is a bit different (it's called excess property checking) and I think it's OK for it to be explained later in the handbook 👍🏻
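For reference, a small sketch of what "excess property checking" means: a fresh object literal is checked against the target type more strictly than an already-typed value.

```typescript
interface Point { x: number; y: number; }

// A direct object literal with an extra property is rejected:
// const bad: Point = { x: 1, y: 2, z: 3 }; // error: 'z' does not exist in type 'Point'

// Going through an intermediate variable skips the excess property check,
// because plain structural assignability applies instead:
const widened = { x: 1, y: 2, z: 3 };
const p: Point = widened; // OK

console.log(p.x + p.y); // 3
```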
|
gharchive/pull-request
| 2021-09-13T02:28:17 |
2025-04-01T06:39:34.563055
|
{
"authors": [
"eugland",
"orta"
],
"repo": "microsoft/TypeScript-Website",
"url": "https://github.com/microsoft/TypeScript-Website/pull/2023",
"license": "CC-BY-4.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
330889918
|
Extended Numeric Literals
Search Terms
extended-numeric-literals
extended numeric literals
Suggestion
Support tc39/proposal-extended-numeric-literals (Stage 1)
Use Cases
For representing lengths in pixels, inches, degrees, radians, seconds, and other units without explicitly calling conversion methods at runtime (if this is possible).
const time = 100s // transforms to `100,000` ms
const angle = 180° // transforms to `3.1415926536` radians
Examples
function deg(value: number): number {
return value * (Math.PI / 180);
}
function °(value: number): number {
return deg(value);
}
function _f32(value: number): number {
return Math.fround(value);
}
function _u32(value: number): number {
return value >>> 0;
}
function px(value: number): string {
return `${ value }px`;
}
var angle = 30deg; // transforms to `0.5235987756`
const angle2 = 180°; // transforms to `pi` (`3.1415926536`)
let count = 1_000_000_u32;
var float32 = 1.0_f32;
canvas.style.width = 512px; // transforms to "512px" string
Also related to #15096
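Until the proposal lands, the same units have to be spelled as ordinary function calls; a runnable sketch of the current workaround:

```typescript
// Plain functions stand in for the proposed literal suffixes.
function deg(value: number): number {
  return value * (Math.PI / 180);
}

function px(value: number): string {
  return `${value}px`;
}

console.log(deg(180)); // ≈ 3.1415926536 (instead of the proposed 180°)
console.log(px(512));  // "512px" (instead of the proposed 512px literal)
```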
I think that would be a great feature, but also that it could cause confusion for newbies who would read code from teammates
I don't think this will reach stage-3 in the near future (3-5 years)
~and it is ugliness~
This is already used with the arrival of BigInt, and it has successfully taken hold in modern languages like Rust, Ruby, and Swift.
@MaxGraey So I think that this feature is related to Javascript, not Typescript
If this lands in JavaScript it should be supported by TypeScript as well, shouldn't it? String templating has been a pretty useful feature. Extended numeric literals are just like templating, but for numbers.
@MaxGraey Yes, it is
Closing since there's nothing for us to do but wait for TC39
|
gharchive/issue
| 2018-06-09T14:07:00 |
2025-04-01T06:39:34.568406
|
{
"authors": [
"Kingwl",
"MaxGraey",
"RyanCavanaugh",
"Shinigami92"
],
"repo": "microsoft/TypeScript",
"url": "https://github.com/microsoft/TypeScript/issues/24828",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
374123150
|
Unable to discriminate union type because of "Property does not exist"
TypeScript Version: typescript@3.2.0-dev.20181025
Search Terms: union narrowing discriminate missing property field
Code
class Foo
{
private foo: string;
}
class Bar
{
private bar: number;
}
function fn(baz: { foo: Foo; bar: string; } | { foo: Bar; })
{
if (typeof baz.bar != 'undefined')
{
testFoo(baz.foo);
}
}
function testFoo(foo: Foo)
{
}
Expected behavior:
No Error. I can call testFoo(baz.foo) since the union type is properly narrowed.
Actual behavior:
Property 'bar' does not exist on type 'Baz'.
Argument of type 'Foo | Bar' is not assignable to parameter of type 'Foo'.
Type 'Bar' is not assignable to type 'Foo'.
Property 'foo' is missing in type 'Bar'.
Playground Link: https://www.typescriptlang.org/play/index.html#src=… (the code sample above)
Related Issues: I've looked around but found none.
Although, in general, I do understand this should not work, I think it would help greatly if we had a way to discriminate on properties that are only in some of the members of a union.
One of the solutions could be to type properties that are missing in some of the union members as prop?: void, or prop?: never, or prop?: undefined, or whatever makes sense in the type system.
This should work
function fn(baz: { foo: Foo; bar: string; } | { foo: Bar; })
{
if ("bar" in baz)
{
testFoo(baz.foo);
}
}
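A self-contained, runnable version of that suggestion (the getFoo method and the return values are added here purely for demonstration):

```typescript
class Foo {
  private foo = "foo";
  getFoo(): string { return this.foo; }
}

class Bar {
  private bar = 42;
}

function fn(baz: { foo: Foo; bar: string } | { foo: Bar }): string {
  if ("bar" in baz) {
    // Narrowed to { foo: Foo; bar: string }, so baz.foo is a Foo.
    return baz.foo.getFoo();
  }
  return "no bar";
}

console.log(fn({ foo: new Foo(), bar: "x" })); // "foo"
console.log(fn({ foo: new Bar() }));           // "no bar"
```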
With this PR #27695 you can also do this:
function fn(baz: { foo: Foo; bar: string; } | { foo: Bar; bar: undefined })
{
if (baz.bar !== 'undefined')
{
testFoo(baz.foo);
}
}
@jack-williams Your suggestion works. Thank you!
I'll keep this issue open as I think it should be documented and/or we could have better syntax for it... but I might be wrong. I'd like a member of TS team to have a look and provide a definitive decision on that if you don't mind...
BTW, there's a good reason this:
function fn(baz: { foo: Foo; bar: string; } | { foo: Bar; })
{
if (typeof baz.bar != 'undefined')
{
testFoo(baz.foo);
}
}
doesn't work - consider this call:
fn({bar: 12, foo: new Bar()});
that's a valid call (assignable to the second union member), and is why even if you were to check that bar wasn't undefined in the input, you might still not have a Foo for foo. The in operator is intentionally a little unsound here, but we figured that its usage was infrequent enough that if you use it you really wanted it to act like this.
Indeed. I haven't thought of this case.
I guess it would impossible to implement it in a completely sound way.
Thanks for your answer!
Hi, I strolled by this thread randomly and I am immediately baffled as to why fn({bar:12, foo: new Bar()}) should be a valid call?
Let me play the role of a verbose compiler:
value {bar: 12, foo: new Bar()} cannot be assigned to {foo: Foo, bar: string}, because bar is 12, 12 is not string.
value {bar: 12, foo: new Bar()} cannot be assigned to {foo: Bar} either, because it contains an extra bit bar: 12, and TS2322 says: "object literal may only specify known properties". field bar is unknown to type {foo: Bar}.
So I will throw an error, I can either reject it because 12 is not a string, or I can reject it with TS2322.
And I tested it, indeed fn({bar:12, foo: new Bar()}) is not a valid call, as tsc complains that number 12 is not a string. (tested for strictNullCheck = true and false)
But then here comes the weird part: fn({bar:"lol", foo: new Bar()}) is a valid call! This surprises me, as it still doesn't fit either type, for the same reasons the verbose compiler rejected it earlier. Let me role play again:
value {bar:"lol", foo: new Bar()} cannot be assigned to {foo: Foo, bar: string}, because foo is new Bar(), which is not a Foo.
value {bar:"lol", foo: new Bar()} cannot be assigned to {foo: Bar} either, because it contains an extra bit bar: "lol", and TS2322, blah blah.
I should raise an error saying either new Bar() cannot be assigned to Foo, or TS2322.
But since reality doesn't agree with me, it must be I'm understanding union types wrong, or maybe TS has a bug. Do you know what's happening here?
value {bar:"lol", foo: new Bar()} cannot be assigned to {foo: Bar} either, because it contains an extra field bar: 12, and there is a TS rule against that.
This is the wrong bit -- extra fields are allowed when there is some matching property in the target across any union member.
This is the wrong bit -- extra fields are allowed when there is some matching property in the target across any union member.
Yeah, about that.
That was my initial feeling so I did some more testing which ended up contradicting this.
declare function id_or_name(input: {id:number; name:"joe"}|{name:string})
id_or_name({id:3, name:"alex"}) // Argument of type '{ id: number; name: string; }' is not assignable to parameter of type '{ id: number; name: "joe"; } | { name: string; }'. Object literal may only specify known properties, and 'id' does not exist in type '{ name: string; }'.ts(2345)
Somehow this time it doesn't work again. I notice though that if instead of having name:<primitive type/literal type> in the parameter, you have name: <ObjectType>, then it behaves like you describe again.
function id_or_name(input: {id:number; name:Date}|{name:string}): void {return}
id_or_name({id:3, name:"alex"}) // no problem
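One way to reconcile the Foo/Bar observations above is object-literal "freshness": a fresh literal argument undergoes excess-property checking, while the same value routed through a widened variable does not. A self-contained sketch (the class bodies here are made up):

```typescript
class Foo { foo = true }
class Bar {}

function fn(baz: { foo: Foo; bar: string } | { foo: Bar }): boolean {
  return "bar" in baz;
}

// A fresh object literal is rejected, matching the compile errors quoted above:
// fn({ bar: 12, foo: new Bar() }); // error

// Routed through a variable, the value loses freshness; no excess-property
// check runs, and it is assignable to the second union member { foo: Bar }.
const arg = { bar: 12, foo: new Bar() };
console.log(fn(arg)); // true, even though arg.foo is not a Foo
```

This is also why narrowing on `bar` stays a little unsound: a value matching the second member can still carry a `bar` property at runtime.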
|
gharchive/issue
| 2018-10-25T20:39:44 |
2025-04-01T06:39:34.592473
|
{
"authors": [
"RyanCavanaugh",
"jack-williams",
"lemoinem",
"nuts-n-bits",
"weswigham"
],
"repo": "microsoft/TypeScript",
"url": "https://github.com/microsoft/TypeScript/issues/28138",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
405003204
|
Mixin with abstract class constraint forces abstract member implementation
TypeScript Version: 3.4.0-dev.20190130
Search Terms: abstract mixin
When using a mixin function, if the constraint to the constructor returns an abstract class (T extends new (...a: any[])=> AbBase) we are forced to implement the abstract members in the mixin class. This seems strange since if the passed in class satisfies the constructor signature constraint it will not be an abstract class, as abstract classes do not have a callable constructor.
Code
abstract class AbBase {
abstract m(): void ;
}
function mixin<T extends new (...a: any[])=> AbBase>(c: T) {
let a = new c(); // We can instantiate
return class extends c { // but when deriving the class we need to redefine abstract members
}
}
mixin(AbBase) //Argument of type 'typeof AbBase' is not assignable to parameter of type 'new (...a: any[]) => AbBase' as expected, no chance c would be abstract
Expected behavior:
The mixin function should compile without error
Actual behavior:
The compiler raises an error on the class defined in the mixin function : Non-abstract class expression does not implement inherited abstract member 'm' from class 'AbBase'.
Playground Link: Link
Related Issues: None
Motivating SO question
@rbuckton FYI, I found this #26617 related issue. This issue seems more general. If we have a constructor signature returning an abstract class the compiler will complain we have to implement all members even though we can never assign an abstract class to such a constructor signature
abstract class Base {
abstract m(): void
}
let ctor: new () => Base;
ctor = Base /// error we can't assign an abstract class to this
ctor = class extends Base { // we can only assign a non-abstract class
m() { }
}
// Non-abstract class 'Derived' does not implement inherited abstract member 'm' from class 'Base'
// Why is this an error at all ? The constructor returns an instance of an abstract class but can never be an abstract class
class Derived extends ctor {
}
IMO the true issue is there is no way to represent an abstract class constructor. Ideally I would like to be able to tell the compiler that I have an abstract class constructor and it should enforce implementation of abstract members (something like new abstract () => Base). If my constructor signature is just new () => Base then this is not the constructor of an abstract class and I should not have to implement the methods
let ctor: new abstract () => Base; // new syntax
ctor = Base /// this is allowed now
class Derived extends ctor { // errors expected here we don't implement base members
}
This can make mixins a PITA!
It would be really useful: yesterday I wanted to make a mixin, and constrain it to classes of a certain base type (which happens to be marked as abstract). It doesn't make sense for the mixin to implement all of the abstract members. The mixin just wants to enforce a constraint, and to even possibly call abstract methods, under the assumption that the user of the mixin (and of the abstract base class) will be implementing the members.
Here's a playground example.
Is there some workaround, some way to remove the abstracted (and abstracted in some type functions to make it clean), so that in the mixin the constrained base class does not appear to be abstract?
I've tried some tricks like
type NonAbstractBase = Pick<Base, keyof Base>
and then using NonAbstractBase for the constraint type, but that has issues where it converts method types into property types, etc, and then that introduces other errors.
Another workaround is to just not use the abstract keyword, and do stuff like throw new Error('Method not implemented') in the base class, delegating the check to runtime. But doing this makes it easy to make mixins.
It would be super sweet to be able to do this in TypeScript, just like in plain JS.
(Relating to abstract support for mixins, there's also https://github.com/microsoft/TypeScript/issues/32122)
@dragomirtitian I think the solution to this is actually simple. The abstract keyword simply needs to be usable in expression positions, not only at the top level.
We should be able to stick the abstract keyword in front of the class expression that a mixin returns, as in this example: playground.
Secondly, regarding your point, I don't think we necessarily need a way to specify an abstract new() => ... constructor. I think the type checker should allow something like new(...args: any[]) => SomeAbstractClass as a valid type (because it represents a totally valid JS object), but it should forbid the use of new at call sites with that type.
So, this could be the case:
abstract class SomeAbstractClass {/* ... */}
declare const Ctor: new (...args: any[]) => SomeAbstractClass // NO ERROR
new Ctor // ERROR, can't call `new` on abstract class.
If the type checker did all of my above comment, then:
it makes sense: we shouldn't be able to instantiate an abstract class. However, the reference is a totally valid constructor. In JS, we can call it. In TS, it can simply prevent calling it.
abstract class {} syntax is already perfectly valid existing syntax. It would make perfect sense to be able to return an abstract class from a class factory function.
I believe this would be the most intuitive way to solve the issue without introducing new syntax.
@trusktr I don't necessarily agree that calling new on new(...args: any[]) => SomeAbstractClass should issue an error. The constructor signature does not mean that the constructor returns that exact type, but rather any derived type. This is very useful where you want to have a collection of constructors all returning a common base type.
interface X { m(): void; }
class A implements X { m() { console.log("A");} }
class B implements X { m() { console.log("B");} }
class C implements X { m() { console.log("C");} }
let o : Array<new () => X> = [A, B, C];
let r = Math.round(Math.random() * 2);
new o[r]().m();
A constructor signature can return any type, be it an interface, a union whatever else, and no assumptions are made about the constructor itself based on the return type. It is only in the special case of inheritance where the fact that a constructor returns an abstract class that the assumption is made that this means the constructor itself is of an abstract class.
The constructor signature does not mean that the constructor returns that exact type
Ah, that's true. (I was confused at first why I couldn't use new(...args: any[]) => any for the constraint of a mixin function, but then I realized that it needs to be new(...args: any[]) => {} to specify the minimum)
I was trying to constrain a mixin Base class to something like C extends typeof AbstractClass, but that doesn't work. Why doesn't that work? Isn't typeof AbstractClass a constructor? It seems like it would be intuitive to write it that way.
It is only in the special case of inheritance where the fact that a constructor returns an abstract class that the assumption is made that this means the constructor itself is of an abstract class.
This is important. I figured out how to get rid of the issue with the abstract class being returned by just not returning it in the same statement (the same applies to decorators):
abstract class SomeBaseClass {
abstract base(): boolean
}
function CoolMixin<
C extends new(...args: any[]) => SomeBaseClass,
T extends InstanceType<C>
>(Base: C) {
abstract class Cool extends Base {
cool() {
console.log("cool");
}
};
return Cool
}
class Bar extends CoolMixin(SomeBaseClass) { // ERROR
bar() {
console.log("bar");
}
}
playground
Seems like that's just a bug.
But notice the error, that an abstract class isn't assignable to the new (...) => ... type.
If we avoid modifying syntax, can we just treat the abstract class as a subset of the new()=>{} type? I mean, after all, the reference is, literally, a constructor in JavaScript, and TypeScript wishes to depict JavaScript.
What I mean is, in my previous example, at a conceptual level, how is SomeBaseClass (despite being abstract) not a new(...args: any[]) => SomeBaseClass? It's otherwise valid JavaScript.
Hmmmm, maybe abstract interface is needed instead?:
abstract interface Type {
nonAbstractMethod(): boolean
abstract method(): number
}
abstract interface TypeCtor {
new (): Type
}
or something
And, to tie it all together, typeof AbstractClass would effectively return something like that abstract interface TypeCtor interface, and it could be used in the constraint of a mixin.
An idea: why not have abstractedness be automatically inherited by a subclass if the subclass does not implement all the abstract features, and then throw a type error like "Can not instantiate abstract class" at the sites where the constructor is called?
Yes, this would make the type error once-removed from the place where the issue is, but maybe abstractedness inheritance would actually be a feature.
Example:
abstract class Foo {
abstract foo(): number
}
// this class is abstract, because it didn't implement foo():
class Bar extends Foo {
bar = 123
}
const b = new Bar() // Error, can not instantiate Bar because it is an abstract class
Or, maybe, just simply allow abstract class {} in expression sites.
It seems simple to allow abstract classes to work in expressions. I haven't made changes to TypeScript source before. Would this be easy? It seems like a useful addition.
@trusktr Actually, if you add //@ts-ignore to the abstract methods of an interface, you'll find that TypeScript already supports abstract interface members.
interface Mixin
{
//@ts-ignore
abstract abstractMethod(): void;
}
interface MixinStatic<T extends Class>
{
new(): InstanceType<T> & Mixin;
}
TypeScript Playground
@JasonHK I would not recommend using @ts-ignore on anything as part of a recommended workflow. If you have to suppress an error to get the behavior you want, it is by definition not supported, even if it accidentally works, as you are essentially relying on undocumented behavior.
@JasonHK Huh! Neat hack! I'll have to take that for a spin. I'm more interested in my consumers getting the desired features than in me avoiding mistakes. At least it isn't me fully reverting all the way back to plain JavaScript.
Is there an update on this? I am also failing to apply a mixin to an abstract base class.
I have a few mixins in my Angular project. Here is one I am currently using. Don't know if this helps, but here goes:
import { Injector, OnInit } from '@angular/core';
import { UserService } from '../../authorization/services/user.service';
import { User } from '../../authorization/entities/user';
export abstract class AbstractBaseService implements OnInit {
private _injector: Injector = null;
protected get injector(): Injector {
return this._injector;
}
constructor(injector: Injector) {
this._injector = injector;
}
public ngOnInit() {
//You may do a few things here before forwarding to
//subclass via initializeAll
this.initializeAll();
}
/**
* initializeAll is post ngOnInit, allowing ngOnInit to do brief initialization
* within the AbstractBaseService while allowing the subclass to do whatever it needs.
**/
protected abstract initializeAll(): void;
}
// Mixins setup....................................................................
type Constructor<T = {}> = new (...properties: any[]) => T;
type Mixin<R extends (...properties: any[]) => any> = InstanceType<ReturnType<R>>;
export const WithUserInjection = <T extends Constructor<AbstractBaseService>>(base: T) => {
class UserInjectedService extends base {
private _userService: UserService = null;
protected get userService(): UserService {
return this._userService;
}
protected get user(): User {
return this._userService.user;
}
constructor(...properties: any[]) {
super(properties[0] as Injector);
this._userService = this.injector.get(UserService);
}
protected initializeAll(): void {
throw new Error("Method not implemented.");
}
}
return UserInjectedService;
}
export type WithUserInjection = Mixin<typeof WithUserInjection>;
This is supported as of TypeScript 4.2! See the release notes on abstract constructor signatures.
abstract class AbBase {
abstract m(): void;
}
function mixin<T extends abstract new (...args: any[]) => any>(c: T) {
abstract class Mixin extends c {
mixinMethod() {
return 42;
}
};
return Mixin;
}
class Example extends mixin(AbBase) {
m() {
console.log(this.mixinMethod());
}
}
Try it on the Playground.
When I'm looking at this, I'm thinking that this should be possible.
abstract class FooAbstract {
abstract bar(): {};
}
let autoInitiatedFoos: FooAbstract[] = new Array<FooAbstract>();
function autoInitiate<T extends new (...args: any[])=>FooAbstract>(cons...
Playground Link
Because something like this isn't possible with the abstract new approach.
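For that use case, a concrete (non-abstract) constructor constraint does allow instantiation inside the helper while still rejecting the abstract class itself at the call site. A sketch (FooImpl and the helper body are made up):

```typescript
abstract class FooAbstract {
  abstract bar(): object;
}

class FooImpl extends FooAbstract {
  bar() { return {}; }
}

// T is constrained to *concrete* constructors, so `new Ctor()` is legal here.
function autoInitiate<T extends new (...args: any[]) => FooAbstract>(Ctor: T): InstanceType<T> {
  return new Ctor() as InstanceType<T>;
}

const foo = autoInitiate(FooImpl);
console.log(foo instanceof FooImpl); // true
// autoInitiate(FooAbstract); // error: an abstract constructor is not assignable here
```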
|
gharchive/issue
| 2019-01-30T22:58:14 |
2025-04-01T06:39:34.625791
|
{
"authors": [
"IslandWalker",
"JasonHK",
"Maikeio",
"MattiasBuelens",
"dragomirtitian",
"joelrich",
"trusktr"
],
"repo": "microsoft/TypeScript",
"url": "https://github.com/microsoft/TypeScript/issues/29653",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
602626993
|
Conditionally filter types from tuples at transpilation time.
Search Terms
tuple, filter, metaprogram
Suggestion
I'd like a better way to filter types from a tuple type (or indeed object type). Currently I can easily make a type not optional (with -?) or make one optional (?) but I can't conditionally remove a type element from a tuple to produce another tuple. Well actually I can, with this crazy metaprogram that uses up so much CPU that it isn't usable in the real world:
type NumberMap = {
0: 1
1: 2
2: 3
3: 4
4: 5
5: 6
6: 7
7: 8
8: 9
9: 10
10: 11
11: 12
// up to twelve supported
12: 12
}
// prepending is relatively easy
type Prepend<H, T extends any[]> = ((h: H, ...t: T) => void) extends (
...l: infer L
) => void
? L
: never
// appending is possible but expensive and hard, must
// build lists in reverse and reverse the result when done
type Reverse<L extends any[], R extends any[] = []> = {
0: R
1: ((...l: L) => void) extends (h: infer H, ...t: infer T) => void
? Reverse<T, Prepend<H, R>>
: never
}[L extends [any, ...any[]] ? 1 : 0]
type Equals<I extends number, L extends number> = I extends L ? 1 : 0
type FilterBoolean<T, R extends any[]> = T extends boolean ? R : Prepend<T, R>
type FilterBooleansNext<
I extends keyof NumberMap,
T extends any[],
R extends any[]
> = T extends []
? R
: {
0: FilterBooleansNext<NumberMap[I], T, FilterBoolean<T[I], R>>
1: R
}[Equals<I, T['length']>]
// append is hard/expensive, so keep prepending and reverse the result
export type FilterBooleans<T extends any[]> = Reverse<
FilterBooleansNext<0, T, []>
>
Use Cases
I am writing a parser library where you can build parsers by composing operators (each operator being a higher order function). When composing a parser using the sequence operator, the return type of the sequence should be a tuple of the return types of each operator the sequence is composed from. However the return types of "predicate" operators should be excluded from the tuple return type since the predicate operators don't parse any data, they only match or fail to match.
Examples
// need something better than `excluded` here I guess, using it as a placeholder.
type FilterBooleans<T> = { [K in keyof T]: T[K] extends boolean ? excluded : T[K] };
type TupleWithoutBooleans = FilterBooleans<[string, boolean, number]>;
type RecordWithoutBooleans = FilterBooleans<{ s: string, b: boolean, n: number }>;
Checklist
My suggestion meets these guidelines:
[x] This wouldn't be a breaking change in existing TypeScript/JavaScript code
[x] This wouldn't change the runtime behavior of existing JavaScript code
[x] This could be implemented without emitting different JS based on the types of the expressions
[x] This isn't a runtime feature (e.g. library functionality, non-ECMAScript syntax with JavaScript output, etc.)
[x] This feature would agree with the rest of TypeScript's Design Goals.
What does "remove a type from an object" mean?
I'm guessing you want this
type Step1<T> = { [K in keyof T]: T[K] extends boolean ? never : K }[keyof T];
type FilterBooleans<T> = { [K in Step1<T>]: T[K] }
// { s: string, n: number }
type RecordWithoutBooleans = FilterBooleans<{ s: string, b: boolean, n: number }>;
The corresponding thing for tuples is kind of conceptually weird since it implies some kind of splicing operation?
@RyanCavanaugh Thanks for the suggestion, I see that works for non-tuples. Cool! But if I run:
FilterBooleans<[string, boolean, Date, boolean]>;
I get the type:
type TupleWithoutBooleans {
4: undefined;
0: string;
2: Date;
}
I need it for my parser generator though, so I can build a parser from component functions.
const parseRule = parseSequence(parseRuleName(), notPredicate(parseConstant("<")))
The notPredicate return type shouldn't be in the tuple type returned by the parseRule function.
@RyanCavanaugh This is labelled "Waiting for feedback" but I'm not sure what feedback you'd like, could you let me know?
BTW in typescript 4.1 I can do this:
type NoBoolean<K, T> = T extends boolean ? never : K
type FilterBooleans<T> = {
[K in keyof T as NoBoolean<K, T[K]>]: T[K]
}
type Filtered = FilterBooleans<[string, boolean, number, boolean]>
Here type Filtered looks like the tuple I want, only it's a record, with a key 0 of string, a key 2 of number, a length of the constant 4 and all the functions from the array prototype. It would be great if there was a tuple equivalent of what you can already now do with records. Maybe something like:
type FilterBooleans<T> = [
[K in keyof T as NoBoolean<K, T[K]>]: T[K]
];
(i.e. the same as above but with the { and } changed to [ and ]). Without this, my parser combinator framework needs a lot of manual help from the user.
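With variadic tuple types (TypeScript 4.0+), a recursive conditional type can already produce a real tuple. A sketch of that alternative, together with a runtime counterpart (the assertion cast is needed because `filter` cannot prove the element-level relationship):

```typescript
type FilterBooleans<T extends readonly unknown[]> =
  T extends readonly [infer H, ...infer R]
    ? [H] extends [boolean]
      ? FilterBooleans<R>
      : [H, ...FilterBooleans<R>]
    : [];

// A true tuple type: [string, number]
type Filtered = FilterBooleans<[string, boolean, number, boolean]>;

function filterBooleans<T extends readonly unknown[]>(arr: readonly [...T]): FilterBooleans<T> {
  return arr.filter((x) => typeof x !== "boolean") as unknown as FilterBooleans<T>;
}

const result = filterBooleans(["a", true, 42, false] as const);
console.log(result); // [ 'a', 42 ]
```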
Closing in favour of https://github.com/microsoft/TypeScript/issues/42122
This is labelled "Waiting for feedback" but I'm not sure what feedback you'd like, could you let me know?
@insidewhy: The Awaiting More Feedback label has this description:
This means we'd like to hear from more people who would be helped by this feature
|
gharchive/issue
| 2020-04-19T05:30:22 |
2025-04-01T06:39:34.638255
|
{
"authors": [
"MartinJohns",
"RyanCavanaugh",
"insidewhy"
],
"repo": "microsoft/TypeScript",
"url": "https://github.com/microsoft/TypeScript/issues/38044",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
609431538
|
Exclude<string, "foo"> is not working
TypeScript Version: 3.8.3 or earlier
Search Terms: Exclude, Conditional Types
Code
type Foo = Exclude<string, "bar">;
// this should cause an error.
const baz: Foo = "bar";
Expected behavior:
"bar" is not assignable to baz.
Actual behavior:
"bar" is assignable to baz without any errors.
Playground Link: https://www.typescriptlang.org/play/#code/FAFwngDgpgBAYgewTAvDAogDwMYBsCuAJlADwDOIATgJYB2A5gDQwBEARgIaUsB8A3MGDYEtCjE4AvAFzwkqVp24ChIsglxQAdLgT0AFJICUAoA
Not the most elegant, but here is a work-around.
https://www.typescriptlang.org/play/?ssl=4&ssc=1&pln=5&pc=1#code/C4TwDgpgBAQghgOwRAJlAvFARAIzgJywG4AoUSKAMQHtqMoBRADwGMAbAVxQgB4BnYPgCWCAOYAaWImQoAfKRItqCAVDwAvKAC4qtergLEoAemNU4Qtn110h1uFAHCxJEgDMOCFsCHKo1AGseABUoCCZgCAQUaycRUVkACgA3HVCAMihE0PDI6Ot4JFQoAH5sGABBADkqhgARLB0AbwBfAEo2qCaSKCh8CGAOfAQoZNIW1yUVYDU4dQB9Nws2ekDEg0JO03NLPkVlVQ15wNWA9YRqebxNkzMAYQALCBYAviA
This is a limitation of Exclude. Basically, it works when you "filter" unions, but string is not a union.
What you are looking for is negated types: #29317 and this comment about your use case https://github.com/microsoft/TypeScript/pull/29317#issuecomment-452987604
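To make the workaround above concrete without negated types, one common pattern is a branded string plus a runtime guard. A sketch (all names here are made up):

```typescript
declare const notBar: unique symbol;
// String values that have been checked to not equal "bar".
// The brand is type-only; nothing is emitted for it at runtime.
type StringNotBar = string & { [notBar]?: true };

function makeStringNotBar(s: string): StringNotBar {
  if (s === "bar") throw new Error('"bar" is not allowed');
  return s as StringNotBar;
}

const ok = makeStringNotBar("foo");
console.log(ok); // "foo"
// makeStringNotBar("bar"); // throws at runtime
```

The check the type system cannot express is moved into the constructor function, and the brand records that it has happened.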
Thank you for comments!
I guess it means that, in the type system of typescript, string type is not a 'representative' or 'set' of any valid strings.
I thought string means the set of any of it including 'foo', 'bar' or any others of string.
Yeah, unfortunately we're not wired up to "cut holes" out of infinitely large domains of types, and it appears that adding that assumption to the type system would make it incredibly painful to use.
Regarding
https://www.typescriptlang.org/play/?ssl=4&ssc=1&pln=5&pc=1#code/C4TwDgpgBAQghgOwRAJlAvFARAIzgJywG4AoUSKAMQHtqMoBRADwGMAbAVxQgB4BnYPgCWCAOYAaWImQoAfKRItqCAVDwAvKAC4qtergLEoAemNU4Qtn110h1uFAHCxJEgDMOCFsCHKo1AGseABUoCCZgCAQUaycRUVkACgA3HVCAMihE0PDI6Ot4JFQoAH5sGABBADkqhgARLB0AbwBfAEo2qCaSKCh8CGAOfAQoZNIW1yUVYDU4dQB9Nws2ekDEg0JO03NLPkVlVQ15wNWA9YRqebxNkzMAYQALCBYAviA
you might be able to get away with using never in place of BANNED
|
gharchive/issue
| 2020-04-29T23:24:21 |
2025-04-01T06:39:34.646616
|
{
"authors": [
"DanielRosenwasser",
"IllusionMH",
"frodo821",
"stormpat"
],
"repo": "microsoft/TypeScript",
"url": "https://github.com/microsoft/TypeScript/issues/38254",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
616371926
|
[bug] [project-references] "tsc --build" fails to report non-existent module errors
TypeScript Version: 3.8.3
Search Terms:
"typescript build project references no error on non-existent module"
"typescript build project references no error on incorrect import"
"typescript build project references no error when module is not found"
Code
import "./file-that-does-not-exist" // No error!
Expected behavior:
There should be an error.
Actual behavior:
The next screenshot in VS Code show that if I try to go to the definition of the module, it can't find it (because it doesn't exist, and Webpack also says the same), yet the TypeScript build (tsc --build) does not report any error.
Is this related to imports that don't create any identifiers?
Playground Link:
I don't have a reproduction, only a private repo at the moment.
Related Issues:
Possibly related: https://github.com/microsoft/TypeScript/issues/38495
I'll just close this and re-open if/when I have a simple reproduction.
|
gharchive/issue
| 2020-05-12T05:31:14 |
2025-04-01T06:39:34.651703
|
{
"authors": [
"trusktr"
],
"repo": "microsoft/TypeScript",
"url": "https://github.com/microsoft/TypeScript/issues/38494",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
687560576
|
Move to new file does not carry over global 'use strict'
Given a javascript source file that sets strict mode ('use strict'),
When using the Move to a new file refactoring to extract a function declaration,
Then the resulting function ends up in a non-strict file.
It is possible for this to change the program's behavior.
VSCode Version: 1.48.1
OS Version: Windows 10
Steps to Reproduce:
Given main.js:
'use strict'
const fs = require('fs')
function foo() { return this }
console.log(foo())
Running node main produces the output:
undefined
Extract foo using the Move to a new file refactoring.
This modifies main.js:
'use strict'
const fs = require('fs')
const { foo } = require("./foo")
console.log(foo())
and generates foo.js:
function foo() { return this; }
exports.foo = foo;
Now running node main produces the output:
Object [global] {
global: [Circular],
....
Manually adding 'use strict' to foo.js restores the original behavior.
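A self-contained sketch of why the move changes behavior: strict-mode code leaves `this` undefined on an unqualified call, while sloppy-mode code (forced here via the Function constructor, whose bodies are always sloppy) falls back to globalThis:

```typescript
"use strict";

function strictThis(this: unknown) {
  return this;
}

// Function-constructor bodies are always sloppy mode, regardless of
// the surrounding file's directives.
const sloppyThis = new Function(
  "return function f() { return this; };"
)() as (this: unknown) => unknown;

console.log(strictThis() === undefined); // true
console.log(sloppyThis() === globalThis); // true
```

Moving a function from a 'use strict' file to a file without the prologue is exactly the strict-to-sloppy transition shown here.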
Do we need to suggest "Move to a new file" for directives?
[|"use strict";|] ?
function foo() {
[|"use strict";|] ?
}
cc @DanielRosenwasser @RyanCavanaugh
No, the intent is that when declarations are moved to a new file, every prologue should be moved over as well.
Actually, I'm going to close this as a duplicate of #30478 because I already filed this issue.
Ouch! I can do a PR to resolve the issue with the prologue directives, if, of course, this is acceptable.
Absolutely, that would be great!
|
gharchive/issue
| 2020-08-27T21:05:39 |
2025-04-01T06:39:34.658435
|
{
"authors": [
"DanielRosenwasser",
"a-tarasyuk",
"spazmodius"
],
"repo": "microsoft/TypeScript",
"url": "https://github.com/microsoft/TypeScript/issues/40292",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1248122759
|
Error using indexed access types with mapped types
Bug Report
🔎 Search Terms
indexed access types, mapped types
🕗 Version & Regression Information
This changed between versions 4.6.4 and 4.7.0
⏯ Playground Link
v4.8.0-dev.20220525 ❌
v4.7.0-beta ❌
v4.6.4 ✅
v4.5.5 ❌
💻 Code
type Types = {
first: { a1: true };
second: { a2: true };
third: { a3: true };
}
class Test {
entries: { [T in keyof Types]?: Types[T][] }
constructor() {
this.entries = {};
}
addEntry<T extends keyof Types>(name: T, entry: Types[T]) {
if (!this.entries[name]) {
this.entries[name] = [];
}
this.entries[name]?.push(entry); // error
}
}
🙁 Actual behavior
Got the following error:
Argument of type '{ a1: true; } | { a2: true; } | { a3: true; }' is not assignable to parameter of type '{ a1: true; } & { a2: true; } & { a3: true; }'.
Type '{ a1: true; }' is not assignable to type '{ a1: true; } & { a2: true; } & { a3: true; }'.
Property 'a2' is missing in type '{ a1: true; }' but required in type '{ a2: true; }'
🙂 Expected behavior
No error. Or maybe I missed something, and this was intentional.
Seems like it was fixed in https://github.com/microsoft/TypeScript/pull/47109 and broken in v4.7.
Looks like something's going on when the mapped type is optional. This happened between 4.7.0-dev.20220302 and 4.7.0-dev.20220329. It would definitely be nice if this didn't stay broken in 4.7.x for long.
Yeah, this is a bug. We're being a bit too conservative in the isMappedTypeGenericIndexedAccess function. It disallows any ? modifiers when really it should only disallow -?. Easy fix, I'll have a PR up soon.
|
gharchive/issue
| 2022-05-25T13:44:41 |
2025-04-01T06:39:34.665414
|
{
"authors": [
"ahejlsberg",
"irudoy",
"jcalz"
],
"repo": "microsoft/TypeScript",
"url": "https://github.com/microsoft/TypeScript/issues/49242",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1283253483
|
Automatically deduce the parameter type in the callback function
Bug Report
Automatically deduce the parameter type in the callback function
🔎 Search Terms
callback, object deconstruction, type deduction
🕗 Version & Regression Information
Playground version V4.7.4
⏯ Playground Link
Typescriptlang Playground
code
type CallBack = (_: number) => number
function sqk({x}:{x:number}): Promise<number>;
function sqk({x,callback}:{x:number,callback:CallBack}): void;
function sqk({x,callback}:{x:number,callback?:CallBack}): Promise<number>|void {
if(callback){
callback (x * x);
}
return Promise.resolve(x)
}
// ❌ here our callback will receive a number, but the `x` can't be deduced
const cb = sqk({x:5, callback:function(x) {
console.log(x);
return x;
}});
// ✅
const promise = sqk({x:5}).then(x=>{console.log(x)})
🙁 Actual behavior
The callback will receive a number, but the type of x can't be deduced.
🙂 Expected behavior
A possible solution is to remove the callback from the object as the second parameter,as follows:
type CallBack = (_: number) => number
function sqk(x:number): Promise<number>;
function sqk(x:number,callback:CallBack): void;
function sqk(x:number,callback?:CallBack): Promise<number>|void {
if(callback){
callback (x * x);
}
return Promise.resolve(x*x)
}
// ✅
const cb = sqk(5,(x)=> {
console.log(x);
return x;
});
// ✅
const promise = sqk(5).then(x=>{console.log(x)})
But this doesn't answer my question: why doesn't the first solution work? Is it a bug?
Overload resolution always picks the first overload that matches, so the second overload won't ever get picked. You need to reverse the order of them:
function sqk({x,callback}:{x:number,callback:CallBack}): void;
function sqk({x}:{x:number}): Promise<number>;
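Putting it together, a minimal sketch of the reordered overloads (adapted from the snippets above; the early `return` in the body is an adjustment so the `void` overload's implementation doesn't also produce a promise):

```typescript
type CallBack = (_: number) => number;

// The more specific overload (with `callback`) must come first,
// since overload resolution picks the first match.
function sqk(opts: { x: number; callback: CallBack }): void;
function sqk(opts: { x: number }): Promise<number>;
function sqk({ x, callback }: { x: number; callback?: CallBack }): Promise<number> | void {
  if (callback) {
    // Callback path: invoke and return nothing, matching the `void` overload.
    callback(x * x);
    return;
  }
  return Promise.resolve(x * x);
}

// `x` in the callback is now contextually typed as `number`.
sqk({ x: 5, callback: (x) => { console.log(x); return x; } });
```

With this ordering, `sqk({ x: 5 })` without a callback still resolves to the `Promise<number>` overload.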
Thanks for your help pointing out the key points. I think I need to read the docs more carefully; this question can be closed.
|
gharchive/issue
| 2022-06-24T04:41:46 |
2025-04-01T06:39:34.670607
|
{
"authors": [
"RyanCavanaugh",
"Sylvenas"
],
"repo": "microsoft/TypeScript",
"url": "https://github.com/microsoft/TypeScript/issues/49665",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1390569877
|
[Proposal] Typescript Overriding Type
Typescript Overriding the Type
🔍 Search Terms
typescript utility type override
✅ Viability Checklist
My suggestion meets these guidelines:
[x] This wouldn't be a breaking change in existing TypeScript/JavaScript code
[x] This wouldn't change the runtime behavior of existing JavaScript code
[x] This could be implemented without emitting different JS based on the types of the expressions
[x] This isn't a runtime feature (e.g. library functionality, non-ECMAScript syntax with JavaScript output, new syntax sugar for JS, etc.)
[x] This feature would agree with the rest of TypeScript's Design Goals.
⭐ Suggestion
Sometimes we need to override some property of an interface with another type. :wink:
📃 Motivating Example
playground
💻 Use Cases
We can have this type alias as a type tool:
type Override<
T,
K extends Partial<{ [P in keyof T]: any }> | string,
> = K extends string
? Omit<T, K> & { [P in keyof T]: T[P] | unknown }
: Omit<T, keyof K> & K;
And we have an interface like the one below:
interface IUser {
username: string;
phone: string;
// any else properties
}
Then we need the same interface, but with another type for the username property:
interface IEmployee extends IUser {
username: [department: string, code: number]; // -> error
tasks: Array<any>;
}
But what if we use the Override type?
interface IEmployee extends Override<IUser, 'username'> {
username: [department: string, code: number]; // -> work
tasks: Array<any>;
}
Or this syntax:
interface IEmplyee extends Override<IUser, { username: [department: string, code: number]; }> {
tasks: Array<any>;
}
Or say we have an ICustomer interface with the same properties as IUser, but the username property should have number as its type:
type ICustomer = Override<IUser, { username: number }>;
Also, we could use a syntax like the one below:
interface ICustomer Override<IUser, { username: number }> {};
Work with Type Assignment :
// use-case
const employees: IEmployee = {
username: ['any-string', 1234],
tasks: [],
phone: '09123456789'
};
// use-case
const customers: ICustomer = {
username: 1234,
phone: '09123456789'
};
Or any other example for this Override utility type. Thanks.
Exactly, this is just a concept with a weak example (take it easy :wink:)
You can already do extends Omit<T, 'prop'> so this type alias seems unnecessarily complicated (it has to do more than the bare minimum to even be useful as there's already a thing built-in that does its main job).
You can already do extends Omit<T, 'prop'> so this type alias seems unnecessarily complicated (it has to do more than the bare minimum to even be useful as there's already a thing built-in that does its main job).
Yes, exactly, but `Omit` doesn't accept an interface as the generic type for overriding. `Omit` only accepts the keys of an interface, but this `Override` type takes both keys and an interface. :wink:
This will be declined, since it's only for developer convenience, and not needed by the compiler to emit .d.ts files.
Hi @jcalz :wave:
Exactly :
Any fool can write code that a computer can understand. Good programmers write code that humans can understand
@DanielRosenwasser
or intersecting with a new property
Intersecting with a new property doesn't work the way interface/extends does though... For example
// this type came from a library, I need to replace foo: any
type A = {
foo: any
}
type B = A & { foo: number }
type T = B['foo'] // I need this to be number, but it's still any
interface/extends does get the desired behavior, but...if something is easier to do with interfaces, I wish TS would engineer a way to do it just as easily with type literals. Type literals are just way more convenient because they can be used inline. For example, let's say I have
type Props = {
b: A
}
And I want to constrain the type of foo.b to be number. Do I want to break out a separate interface declaration just to do that? No, that would be a hassle. I would just:
type Props = {
b: Omit<A, 'foo'> & { foo: number },
}
As such when there's more than one property to override, the Omit pattern becomes a pain because of the duplicated property names, and I find myself thinking I should declare an Override type that does what OP proposes. But then, I think, this should really just be a builtin.
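For what it's worth, the two-generic form OP proposes can be written today as a one-line user-land alias (the name and shape here are illustrative, not a built-in):

```typescript
// Hypothetical user-land Override: drop the overlapping keys, then merge.
type Override<T, U extends object> = Omit<T, keyof U> & U;

type A = { foo: any; bar: string };
type B = Override<A, { foo: number }>;

// B["foo"] is number (not any), because `foo` was omitted before intersecting.
const b: B = { foo: 1, bar: "ok" };
```

This avoids the duplicated property names of the plain `Omit` pattern when several properties are overridden at once.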
What if it were a reusable keyword? :thinking:
// basic
interface One {
property: number;
}
// replace
interface Two extends One {
override property: string ;
}
// union mode
interface Three extends One {
extends property: string;
}
// cases
assertTypeEqual<One, { property: number }>;
assertTypeEqual<Two, { property: string }>;
assertTypeEqual<Three, { property: number | string }>;
|
gharchive/issue
| 2022-09-29T09:40:10 |
2025-04-01T06:39:34.685752
|
{
"authors": [
"fatcerberus",
"jedwards1211",
"mikoloism"
],
"repo": "microsoft/TypeScript",
"url": "https://github.com/microsoft/TypeScript/issues/50989",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1567667333
|
TypeScript does not catch type that does not match complex union
Bug Report
🔎 Search Terms
Complex union, should not compile
🕗 Version & Regression Information
This is the behaviour in every version I tried, and I reviewed the FAQ for entries about unions.
⏯ Playground Link
Playground link with relevant code
💻 Code
type ContrivedType =
| {
foo: "BAR" | "BAZ";
}
| {
foo: "QUX";
qux: "qux";
}
| {
foo: "BAR" | "BAZ";
baz: "baz";
};
// This should not compile, but it does!
const contrivedExample: ContrivedType = {
foo: "BAR",
qux: "qux",
};
🙁 Actual behavior
It compiles.
🙂 Expected behavior
It should not compile.
Duplicate of #51873. From a type system perspective it's perfectly fine, because your object matches the first type of the union. Objects are not sealed, they can have additional properties. People often don't realize it because of the "excess property checks", which was called more of a linter feature.
Thanks for the swift reply @MartinJohns!
your object matches the first type of the union.
I am not sure if I fully understand what the TypeScript compiler is doing in this case. For example, if instead of this union I just try the first case, it catches the error just fine:
// This does not compile, as expected
const hmmm: { foo: "BAR" | "BAZ" } = {
foo: "BAR",
qux: "qux",
};
Similarly, it also catches the error if I remove the third object in the union, like so:
type ContrivedType =
| {
foo: "BAR" | "BAZ";
}
| {
foo: "QUX";
qux: "qux";
};
// Now this fails to compile, as expected
const contrivedExample: ContrivedType = {
foo: "BAR",
qux: "qux"
};
I fear I might be misunderstanding something, but this feels inconsistent. :thinking: I'll take some time to read the duplicate issue, maybe it'll help me understand better what's going on.
For example, if instead of this union I just try the first case, it catches the error just fine:
This is the "excess property check" that I mentioned. This only applies to object literals. This example compiles just fine:
const hmmm = { foo: "BAR", qux: "qux" } as const;
const demo: { foo: "BAR" | "BAZ" } = hmmm;
Similarly, it also catches the error if I remove the third object in the union, like so:
Excess property checks do not work for unions with an overlapping discriminator property. See the issue I linked earlier: #51873. In your first example the discriminator foo with the type "BAR" overlaps; by removing the third union case you no longer have the overlap.
Thanks again for the explaination @MartinJohns! Sorry I took to so long to respond, it took me some time to get up to speed with all the concepts like discriminated unions. You're absolutely right of course: this is a duplicate, so I will close this issue.
|
gharchive/issue
| 2023-02-02T09:34:51 |
2025-04-01T06:39:34.693417
|
{
"authors": [
"MartinJohns",
"jlek"
],
"repo": "microsoft/TypeScript",
"url": "https://github.com/microsoft/TypeScript/issues/52564",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1612411001
|
Don't slice whitespace in jsdoc parser unless needed
It's only needed when there's an indentation relative to the margin, which is rare and usually doesn't last for the entire comment text. The common case is to skip the whitespace.
Works on https://github.com/microsoft/TypeScript/issues/52959, based on a suggestion from @DanielRosenwasser in #53081
@typescript-bot perf test this
@typescript-bot perf test this
|
gharchive/pull-request
| 2023-03-06T23:39:18 |
2025-04-01T06:39:34.695669
|
{
"authors": [
"sandersn"
],
"repo": "microsoft/TypeScript",
"url": "https://github.com/microsoft/TypeScript/pull/53121",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
474300209
|
[Release] Milestone M155
Mac
1176 Platform native notifications PR_2
1193 Mac ProjFS: Only hydrate/expand when deleting due to rename
1202 Platform native notifications PR_3
1215 Remove Placeholder after calling PreDelete
1250 Mac: Switch on the UTF8 Tests
1257 Mac: Bash Functional Test Hang
1285 Installer reliability
1318 Mac build scripts: consume git installer pkg directly
1319 Don't change version number of dev builds
1322 Script to generate GVFS.Installer.Mac nuspec
1340 PrjFSKextLogDaemon: Don't log every time a dropped kext message is detected
Cross-Platform
1267 Add logging to clone/mount paths
1287 use repository relative paths in DiffTreeResult.TargetPath
1301 simplify Git functional tests of case-sensitive file paths
1320 Making max pipe length platform-dependant
1375 Remove all TODO(POSIX) and TODO(Mac) comments
Git
1283 Incremental commit graph
1300 Update Git to include status deserialize fix
1342 Update Git to include tracing updates
1382 Update Git to include more tracing updates
Upgrade
1224 ProductUpgrader: platform specific upgrade directories
1225 Upgrader - platform specific directory creation
1228 Upgrader: copy entire application directory
1230 Upgrader: add command install action
1234 Upgrader: implement cross platform functionality
1268 NuGetUpgrader: fix issue with upgrader using interop services
1270 VFSForGit Installer non-relocatable
1284 Enable mac upgrade MVP
1297 Upgrader: NuGet Upgrader should use GitHub endpoint for notifications
1312 Use the new signing certificate
1381 Upgrade: Fix test flakiness around upgrade reminders
FastFetch
1291 use correct checkout thread maximum in FastFetch
Polish
1241 Update heartbeats for folder placeholders and file hydration
1257 Mac: Bash Functional Test Hang
1263 Log Folder Placeholders Removed
1281 PrefetchStep: don't send post-fetch request if no new packs
1282 Remove requirement to run --no-renames
1317 Setup.iss: Update GitHub URL
1328 Move Enlistment Directory For Functional tests to ~/GVFS.FT
1380 GitProcess: Catch InvalidOperationException when setting priority class
Backports from M153
1259 Update Git to v2.22.0
1276 POSIX: Switch to Process.Start for launching GVFS.Mount
1278 Fix performance regression in LibGit2RepoInvoker
/azp run GitHub VFSForGit Large Repo Build
/azp run GitHub VFSForGit Large Repo Perf Tests
Large Repo Perf run succeeded with expected results. All difference was within error bounds.
|
gharchive/pull-request
| 2019-07-29T23:24:42 |
2025-04-01T06:39:34.707674
|
{
"authors": [
"derrickstolee",
"jrbriggs"
],
"repo": "microsoft/VFSForGit",
"url": "https://github.com/microsoft/VFSForGit/pull/1385",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
592120936
|
Fix commit-graph expiration
Wow, this was really not working as expected.
See microsoft/git#255 for how broken the --expire-time argument was.
Fix this by using the fixed argument and passing a datetime instead of an offset by seconds. This will provide a longer window for old commit-graph files, but apparently we've been leaving turd files around for a long time without anyone noticing.
/azp run PR - Windows - Build and Unit Test
/azp run PR - Windows - Functional Tests
/azp run PR - Windows - Functional Tests (Sparse Mode)
/azp run PR - Windows - Extra
|
gharchive/pull-request
| 2020-04-01T18:33:57 |
2025-04-01T06:39:34.710048
|
{
"authors": [
"derrickstolee"
],
"repo": "microsoft/VFSForGit",
"url": "https://github.com/microsoft/VFSForGit/pull/1647",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2344356999
|
Feature/Request for Git Events (Branch changed, Git Pull, etc)
Hello, are there any plans to supporting git events in the API?
Example: subscribe to whenever a git branch has changed, git pull, etc.
This would enable several neat extensions such as: Auto stash/restore when pulling a repo with pending git changes, auto apply DB migrations, etc
Thanks for your suggestion! I've created a suggestion ticket to track this so folks can vote on it: https://developercommunity.visualstudio.com/t/VisualStudioExtensibility---FeatureReq/10680306. This isn't on our short-term roadmap, but if we get substantial votes, we can reconsider.
|
gharchive/issue
| 2024-06-10T16:26:20 |
2025-04-01T06:39:34.711981
|
{
"authors": [
"luislhg",
"tinaschrepfer"
],
"repo": "microsoft/VSExtensibility",
"url": "https://github.com/microsoft/VSExtensibility/issues/391",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1725797503
|
WslRegisterDistribution failed with error: 0xc03a0014
Windows Version
Microsoft Windows [Versión 10.0.22621.1702]
WSL Version
1.2.5.0
Are you using WSL 1 or WSL 2?
[x] WSL 2
[ ] WSL 1
Kernel Version
No response
Distro Version
No response
Other Software
No response
Repro Steps
wsl --install ubuntu
Expected Behavior
Install and run ubuntu in WSL
Actual Behavior
Install but return a error for provider.
Installing, this may take a few minutes...
WslRegisterDistribution failed with error: 0xc03a0014
Error: 0xc03a0014 No se encontró un proveedor de compatibilidad con disco virtual para el archivo especificado. (In English: a virtual disk support provider for the specified file was not found.)
I also ran (Get-WindowsOptionalFeature -online | Where-Object { $_.FeatureName -match "hyper" -or $_.FeatureName -match "linux"}) and got:
FeatureName : HypervisorPlatform
State : Enabled
FeatureName : Microsoft-Windows-Subsystem-Linux
State : Enabled
FeatureName : Microsoft-Hyper-V-All
State : Enabled
FeatureName : Microsoft-Hyper-V
State : Enabled
FeatureName : Microsoft-Hyper-V-Tools-All
State : Enabled
FeatureName : Microsoft-Hyper-V-Management-PowerShell
State : Enabled
FeatureName : Microsoft-Hyper-V-Hypervisor
State : Enabled
FeatureName : Microsoft-Hyper-V-Services
State : Enabled
FeatureName : Microsoft-Hyper-V-Management-Clients
State : Enabled
Note: i used WSL in the past.. for not reason this error happen i try reinstall WSL but the error still show.
Diagnostic Logs
No response
same
same
Same here:
This Ubuntu issue seems to occur because a previously working WSL has failed to mount the required virtual machine.
The 'WSL fails to start' error is described in this unresolved issue:
9178: https://github.com/microsoft/WSL/issues/9178
MY SETUP:
Windows 10 Enterprise 10.0.19045
on ASRock B450 Pro4 motherboard
with Hyper-V and 'Virtual Machine Platform' turned on
using: WSL 1.2.5.0_x64 Kernel 5.15.90.1
with "ubuntu-18.04" or "ubuntu 20.04"
PROBLEM: in PowerShell as Admin, I get:
PS C:\WINDOWS\system32> wsl
Failed to attach disk 'C:\Program Files\WindowsApps\MicrosoftCorporationII.WindowsSubsystemForLinux_1.2.5.0_x64__8wekyb3d8bbwe\system.vhd' to WSL2: The request is not supported.
Error code: Wsl/Service/CreateInstance/CreateVm/MountVhd/0x80070032
PS C:\WINDOWS\system32>
<<< also >>>
PS C:\WINDOWS\system32> bash
Failed to attach disk 'C:\Program Files\WindowsApps\MicrosoftCorporationII.WindowsSubsystemForLinux_1.2.5.0_x64__8wekyb3d8bbwe\system.vhd' to WSL2: The request is not supported.
Error code: Bash/Service/CreateInstance/CreateVm/MountVhd/0x80070032
PS C:\WINDOWS\system32>
<<< running download for Ubuntu instance yields >>>
Installing, this may take a few minutes...
WslRegisterDistribution failed with error: 0xc03a0014
Error: 0xc03a0014 A virtual disk support provider for the specified file was not found.
Press any key to continue...
https://github.com/microsoft/WSL/issues/10555#issuecomment-1937980344
|
gharchive/issue
| 2023-05-25T13:07:25 |
2025-04-01T06:39:34.722488
|
{
"authors": [
"Doc94",
"bjoseph-aya",
"joe-mit",
"karimMogh",
"neokofg"
],
"repo": "microsoft/WSL",
"url": "https://github.com/microsoft/WSL/issues/10139",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
644443799
|
PyTorch Can't transfer anything to GPU
Environment
Windows build number: Version 2004 (Build 20150.1000)
Your Distribution version: Ubuntu 20.04
Whether the issue is on WSL 2 and/or WSL 1: WSL2
Linux Kernel: 4.19.121-microsoft-standard
Python: 3.5.4
PyTorch: 1.2
Geforce Driver: 455.41
Steps to reproduce
import torch
device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
tensor = torch.zeros(1, 1, 10)
tensor.to(device)
torch.cuda.is_available() returns true.
Expected behavior
PyTorch should transfer the tensor to the GPU.
Actual behavior
It crashes with
CUDA error: unknown error
I find it's OK;
I use Ubuntu 20.04
pip install torch==1.5.1 torchvision==0.6.1
I did not install CUDA, just torch 1.5.1 and torchvision
(test) lab@pc_yqh:~$ python
Python 3.6.10 |Anaconda, Inc.| (default, May 8 2020, 02:54:21)
[GCC 7.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import torch
>>> device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
>>> tensor = torch.zeros(1, 1, 10)
When I update my torch and pytorch version with
pip3.5 install torch==1.5.1+cu101 torchvision==0.6.1+cu101 -f https://download.pytorch.org/whl/torch_stable.html
the code runs without an error but doesn't work either.
import torch
device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
tensor = torch.rand(1, 1, 10)
tensor.to(device)
tensor_two = tensor + tensor
runs and torch.cuda.is_available() still returns true. But if I inspect tensor_two at the end the device attribute of it is device(type="cpu").
And I want to work with PyTorch Version 1.2, not 1.5.1.
>>> import torch
>>> device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
>>> device
device(type='cuda', index=0)
>>> tensor = torch.rand(1, 1, 10)
>>> tensor
tensor([[[0.0096, 0.1139, 0.7168, 0.8277, 0.4454, 0.5753, 0.7148, 0.4055,
0.8267, 0.7882]]])
>>> tensor.to(device)
tensor([[[0.0096, 0.1139, 0.7168, 0.8277, 0.4454, 0.5753, 0.7148, 0.4055,
0.8267, 0.7882]]], device='cuda:0')
>>> tensor
tensor([[[0.0096, 0.1139, 0.7168, 0.8277, 0.4454, 0.5753, 0.7148, 0.4055,
0.8267, 0.7882]]])
>>> tensor = tensor.to(device)
>>> tensor
tensor([[[0.0096, 0.1139, 0.7168, 0.8277, 0.4454, 0.5753, 0.7148, 0.4055,
0.8267, 0.7882]]], device='cuda:0')
>>> tensor_two = tensor + tensor
>>> tensor_two
tensor([[[0.0192, 0.2279, 1.4336, 1.6553, 0.8908, 1.1505, 1.4295, 0.8111,
1.6534, 1.5763]]], device='cuda:0')
any news on this issue?
This is now working with
PyTorch: 1.5.1 (CUDA 10.1)
Python: 3.5.9
instead of my previous setup
PyTorch: 1.2 (CUDA 10)
Python: 3.5.4
I would still consider it as a problem though since I would like to work with PyTorch 1.2 since it is the version that we use in our workplace.
Maybe this helps you @keesschollaart81
|
gharchive/issue
| 2020-06-24T08:59:15 |
2025-04-01T06:39:34.730809
|
{
"authors": [
"Erik-Sovereign",
"devsnets",
"sailyung"
],
"repo": "microsoft/WSL",
"url": "https://github.com/microsoft/WSL/issues/5477",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1535629916
|
Error code: Wsl/Service/E_UNEXPECTED
Version
Microsoft Windows [Versión 10.0.19045.2486]
WSL Version
[X] WSL 2
[ ] WSL 1
Kernel Version
5.15.79.1
Distro Version
Ubuntu 22.04
Other Software
No response
Repro Steps
steps taken until I got the error, in a fresh OS installation
Activate systemd
sudo nano /etc/wsl.conf
[boot] systemd=true
wsl.exe --shutdown
Install webmin
sudo nano /etc/apt/sources.list → deb https://download.webmin.com/download/repository sarge contrib
wget https://download.webmin.com/jcameron-key.asc
cat jcameron-key.asc | gpg --dearmor >/etc/apt/trusted.gpg.d/jcameron-key.gpg
sudo apt-get install apt-transport-https
sudo apt-get update
sudo apt-get install webmin
install webinoly (lemp)
wget -qO weby qrok.es/wy && sudo bash weby -clean
sudo stack -php-ver=8.1
sudo stack -mysql-ver=10.6
sudo stack -lemp
So far so good, all services work properly.
Expected Behavior
When closing the console window where the linux operating system is open, the WSL is expected to stop all services in the background and turn off the virtual machine.
Actual Behavior
When I start the application again I get the following error:
Fatal Error
Error code: Wsl/Service/E_UNEXPECTED
press any key to continue
and then the window closes.
Diagnostic Logs
No response
/logs
this just happened to me after shutting down my ubuntu distro and then rebooting (after installing a few packages from the ubuntu-desktop bundle)
it somehow automagically revived itself?
installed the rest of the bundle (when i say bundle i mean the packages listed among "the following NEW packages will be installed" thing). working amazing so far
WslLogs-2023-04-14_07-19-36.zip
For me WSL runs great initially. I notice this issue if I come back to work after a big gap. Once I hit the error, wsl --shutdown is useful.
Linux home-pc 5.15.90.1-microsoft-standard-WSL2 #1 SMP Fri Jan 27 02:56:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux
Distributor ID: Ubuntu
Description: Ubuntu 22.10
Release: 22.10
Codename: kinetic
I am running the following: snaps, microk8s and I use VSCode plugin for WSL.
same error
same error:
Error code: Wsl/Service/E_UNEXPECTED
please.
@Taco0220, If you need to restore the WSL "machine" (to recover files, etc.), see if this approach can help you: https://stackoverflow.com/a/76354545/1493870
It's not the solution to this issue. Only a way to get back the files inside the WSL.
I've tried a lot of things, but nothing worked.
This just happened to me
Get it frequently lately. Much confusion. I feel like it started once I:
updated to latest
enabled options="metadata"
enabled dbus
# Check if DBus is running, if not, launch it
if ! pgrep -x "dbus-launch" > /dev/null; then
eval `dbus-launch --sh-syntax`
# echo "DBus session started."
fi
|
gharchive/issue
| 2023-01-17T00:32:37 |
2025-04-01T06:39:34.743037
|
{
"authors": [
"901238746",
"Morrigan-Ship",
"Taco0220",
"benhillis",
"c4artisan",
"fliespl",
"nehe009",
"vm75"
],
"repo": "microsoft/WSL",
"url": "https://github.com/microsoft/WSL/issues/9490",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
514394515
|
WinAppDriver hangs when findElement
Now I am trying to operate a Windows .NET Forms application with WinAppDriver.
I can create a session with a remote connection, and I can get some properties such as winHandle and winTitle.
DesiredCapabilities appCap = new DesiredCapabilities();
appCap.setCapability("platformName", "Windows");
appCap.setCapability("deviceName", "WindowsPC");
appCap.setCapability("appTopLevelWindow", {targetWinHandle});
WindowsDriver driver = new WindowsDriver(new URL("http://{my ip}:4723/wd/hub"), appCap);
But when I try to findElementByXPath, the session hangs with no response.
I had to shut down the session on the cmd console to restart.
driver.findElementByXPath("//*");
# WinAppDriver cmd console
==========================================
POST /wd/hub/session/xxxxxxxxxxxxxxxxxxxxxx/element HTTP/1.1
Accept-Encoding: gzip
Connection: Keep-Alive
Content-Length: 50
Content-Type: application/json; charset=utf-8
Host: x.x.x.x:4723
User-Agent: selenium/3.141.59 (java windows)
{
"using": "xpath",
"value": "\u002f\u002f*"
}
... no response ...
I want to know why it hangs, but it may be difficult to solve because the information I've shown is limited.
So I wish WinAppDriver would output some exception or error log in such a case.
Environment
master pc
Windows10 Pro
AdoptOpenJDK 8.x
Appium java_client 7.2.0
Selenium 3.141.59
remote pc
Windows10 IoT OS
WinAppDriver.exe 1.2
Target Application: Windows .Net Form Application
@NathanZook , @licanhua
Sorry, it was a bad example.
No matter what the XPath is (e.g. if I try another XPath such as //Window[@Name="{my_app window_name}"]), WinAppDriver hangs with no response.
And I've tried Windows Calculator sample, and it worked fine.
It's hard to tell if it's an app or WinAppDriver problem. WinAppDriver connects to the app through the accessibility interface, so WinAppDriver will have no response if the app itself has no response.
You can check the CPU usage to see who is busy.
Use Visual Studio or another debugger. Stack traces and thread information may help you understand what the process is doing.
Try other functions (not XPath) which may dump all the elements, for example getting the page source.
Use inspect.exe, Narrator, or appium-desktop to inspect the application when the problem happens.
|
gharchive/issue
| 2019-10-30T04:36:01 |
2025-04-01T06:39:34.750230
|
{
"authors": [
"licanhua",
"shift-morikawa"
],
"repo": "microsoft/WinAppDriver",
"url": "https://github.com/microsoft/WinAppDriver/issues/938",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
628874255
|
Discussion: Simplify deployment for packaged app that not in store
Discussion: Simplify deployment for packaged app that not in store
Currently, to deploy a UWP app that you don't want to publish to the Microsoft Store for whatever reason, you need to deliver the certificate to the end-user; the end-user needs to enable developer mode, install the certificate, and then install the actual app, or use the PowerShell script. It's a bit more complicated for the end-user, and it delivers multiple files to the end-user.
I think it would be better to have just one install package file and just "click -> install -> run" without dealing with the certificate and other things. Just like Android's APK file, it does contain a signature, but the end-user will never notice that.
Yes please. That pain when client says "just send me the .exe" and you have to explain why is it more complicated than it was 10 years ago...
Do you have more information on the certificate you are using? Anything whose authenticity we can verify does not require extra steps. Adding a certificate is usually only required for enterprises when they are using their own root, but they usually deploy in mass to their devices. If you are using a self-signed certificate, that offers no authenticity and is basically equivalent to not signing code.
If the issue is the type of signing certificate we plan on launching a preview of the Azure Trust Service this summer. The goal is to reduce the friction in signing flows for developers. The first part of this presentation covers the plans for code signing. MSIX Build
starting in the May 2020 update (2004) the sideloading setting has been removed and apps can be installed from any source similar to other installer techs
Great to hear that:)
Just to clarify: it's not a problem with the certificate, it's the installation process that makes the end-user a little bit confused. They usually ask why they can't just open a single installation file and click install and have everything good to go, just like what they are used to.
For example, currently, when you create an app package for a UWP app with sideloading, you need a certificate, after build finish, there will be a folder that contains the app with files like *.msixbundle, Install.ps1, *.cer and some subfolder. You will need to deliver the whole folder to the end-user and tell them to run Install.ps1 to install the app. Or you will need to tell the user how to install *.cer file first like this
This is much more complex than the "traditional" way, like the installation process of VS Code: you just download a single file, open it, install it, and everything is good to go.
you are using a self signed certificate that offers no authenticity and is basically equivalent to not signing code. If the issue is the type of signing certificate we plan on launching a preview of the Azure Trust Service this summer. The goal is to reduce the friction in signing flows for developers.
@jvintzel it was supposed to be launched last summer, right? Open source developers are in need of this very service. Can we expect an answer, please, since it's already been over a year?
it's a bit more complicate for the end-user
It's not a "bit" more complicated. This is an absolute showstopper for many deployment scenarios and invalidates an entire family of new MS tech that would otherwise be appropriate if it didn't essentially force MSIX.
Can you elaborate on why you think this is a showstopper? I don't see any difference here from traditional software rollout.
It seems a lot of friction here revolves around the distribution and installation of self signed packages. This is not normal, and not something you should be doing beyond development/test.
This is not normal, and not something you should be doing beyond development/test.
Making a regular MSI without any of the ceremony is normal, works out of the box, and doesn't require buying (management is happy) or juggling certificates (devs are happy).
I understand the reasoning behind all this signature business, but it doesn't work in practice. It's a showstopper because the friction is too much and makes ignoring msix the straightforward choice. Don't use it and everything works again out of the box for everyone. Why bother with this new thing with no immediate and clear benefits but plenty of drawbacks?
I understand the reasoning behind all this signature business, but it doesn't work in practice.
do tell what works then ? making everything free ? you aware that "FrEe SoFtWaRe FoUnDaTiOn Cult " also runs on sucking donations/money right ?
Advice : I understand the frustration but making everything free is not the way the world works. Nothing is free is in this world. Instead You could've asked to make certificates prices affordable for indie devs.
Don't use it and everything works again out of the box for everyone.
as you seem to figure it out, keep using MSI then.
Why bother with this new thing
why bothering here in the first place ? to vent ?
that is why now some people must suffer where they didn't use to?
|
gharchive/issue
| 2020-06-02T03:36:11 |
2025-04-01T06:39:34.764844
|
{
"authors": [
"Tlaster",
"ghost000000000",
"ihavefoxdie",
"jvintzel",
"lcsondes",
"mcosmin22",
"riverar",
"tesar-tech"
],
"repo": "microsoft/WindowsAppSDK",
"url": "https://github.com/microsoft/WindowsAppSDK/issues/57",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2159410605
|
Remove unneeded references to VisualStudio.CoreUtility
The build is unable to find this, but there was never impact of not finding it since it was not used.
Removing it.
/azp run
|
gharchive/pull-request
| 2024-02-28T16:57:08 |
2025-04-01T06:39:34.766324
|
{
"authors": [
"bpulliam"
],
"repo": "microsoft/WindowsAppSDK",
"url": "https://github.com/microsoft/WindowsAppSDK/pull/4236",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2063220121
|
[Core] Sanitize filename before using it as docker image tag. Fix #1069
Why are these changes needed?
Sanitize filename before using it as docker image tag.
In addition, removing the skips for Mac OS X platforms -- they all passed.
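A minimal sketch of such a sanitization step could look like the following. It is a hypothetical illustration, not autogen's actual implementation; it assumes docker's documented tag rules (letters, digits, '_', '.', '-'; no leading '.' or '-'; at most 128 characters), and the function name is made up:

```python
import re

def sanitize_docker_tag(filename: str) -> str:
    """Coerce an arbitrary (e.g. LLM-generated) filename into a valid docker tag."""
    # Replace every character outside docker's allowed tag alphabet.
    tag = re.sub(r"[^A-Za-z0-9_.-]", "_", filename)
    # A tag must not start with a period or a dash.
    tag = tag.lstrip(".-") or "default"
    # Tags are limited to 128 characters.
    return tag[:128]
```

For example, `sanitize_docker_tag("my file!.py")` yields `"my_file_.py"`.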
Related issue number
Fix #1069
Checks
[ ] I've included any doc changes needed for https://microsoft.github.io/autogen/. See https://microsoft.github.io/autogen/docs/Contribute#documentation to build and test documentation locally.
[x] I've added tests (if relevant) corresponding to the changes introduced in this PR.
[x] I've made sure all auto checks have passed.
Codecov Report
Attention: 6 lines in your changes are missing coverage. Please review.
Comparison is base (b26e659) 30.81% compared to head (accbe2b) 18.67%.
Report is 2 commits behind head on main.
Files | Patch % | Lines
--- | --- | ---
autogen/code_utils.py | 14.28% | 6 Missing :warning:
Additional details and impacted files
@@ Coverage Diff @@
## main #1127 +/- ##
===========================================
- Coverage 30.81% 18.67% -12.15%
===========================================
Files 30 30
Lines 4037 4043 +6
Branches 915 966 +51
===========================================
- Hits 1244 755 -489
- Misses 2714 3204 +490
- Partials 79 84 +5
Flag | Coverage Δ
--- | ---
unittests | 18.67% <14.28%> (-12.10%) :arrow_down:
Flags with carried forward coverage won't be shown. Click here to find out more.
:umbrella: View full report in Codecov by Sentry.
:loudspeaker: Have feedback on the report? Share it here.
Yes. I will use this library for sanitizing filenames generated by the LLM. Although for the purpose of generating a docker tag name, the library doesn't support it -- post-processing of the filename is still required to make it a valid docker tag name.
This library is useful for sanitizing filenames generated by the LLM. Although for the purpose of generating a docker tag name, the library doesn't support it -- post-processing of the filename is still required to make it a valid docker tag name. So for the purpose of fixing #1069 we still need the custom function.
Perhaps we can have a separate PR for sanitizing filename generated by LLM?
Ah, fair enough! It's just that all that custom sanitization code made me uncomfortable. I've made too many errors in this area in the past, and prefer pre-built libraries for this kind of thing.
I agree random UUID tag is probably a good idea. Although there is a readability component to it (https://docs.docker.com/engine/reference/commandline/tag/), and the only semantically relevant information about the container in the function is the code and the filename which are both generated by the LLM.
I agree random UUID tag is probably a good idea. Although there is a readability component to it (https://docs.docker.com/engine/reference/commandline/tag/), and the only semantically relevant information about the container in the function is the code and the filename which are both generated by the LLM.
Is there a risk to allowing the llm to name the containers? Will they reuse the same names ever? Is it ok if they do? What is the tradeoff with the other two extremes: Using a fixed tag name that is shared by all calls (e.g., "python:autogen") vs a UUID to guarantee uniqueness?
Probably a combination of two? e.g., python:_<llm_generated_description>
Is there a risk to allowing the llm to name the containers? Will they reuse the same names ever? Is it ok if they do?
I think the risk of allowing llm to name the container is at the level of potentially reusing another container that is currently being used by other purpose. It would be okay if it is using a container created earlier for the same or previous iteration of the code.
To address this we would need to have a concept of a coding session with a container created for the purpose of this session only. Currently the filename is used to identify the session. If the caller does not provide a filename then a hash generated from the code itself is used as a filename. So currently exactly same code may means container reuse.
What is the tradeoff with the other two extremes: Using a fixed tag name that is shared by all calls (e.g., "python:autogen") vs a UUID to guarantee uniqueness?
We probably don't want to use the same container for all because each coding session may have different libraries installed and there will be issue with dependency management. We probably want to have uniquely identifiable sessions so iterations of the generated code gets to share the same environment.
My feeling is that since this PR does not change the expected behavior of the current code, we can design the new code execution logic in a separate PR.
@BeibinLi do you want to review this?
|
gharchive/pull-request
| 2024-01-03T04:08:18 |
2025-04-01T06:39:34.787717
|
{
"authors": [
"afourney",
"codecov-commenter",
"ekzhu"
],
"repo": "microsoft/autogen",
"url": "https://github.com/microsoft/autogen/pull/1127",
"license": "CC-BY-4.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
399365287
|
List
Any chance to provide a List sample taking advantage of azure-devops-ui?
I can only find a Pivot which implements office-ui-fabric-react.
You can find samples under the List and Table components on the components site:
https://developer.microsoft.com/en-us/azure-devops/components/list
https://developer.microsoft.com/en-us/azure-devops/components/table
|
gharchive/issue
| 2019-01-15T14:18:23 |
2025-04-01T06:39:34.790398
|
{
"authors": [
"chris-to-pher",
"nkirchem"
],
"repo": "microsoft/azure-devops-extension-sample",
"url": "https://github.com/microsoft/azure-devops-extension-sample/issues/13",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1164907777
|
Ability to specify staging directory in configuration
Plugin name and version
azure-functions-maven-plugin:1.16.0
Plugin configuration in your pom.xml
<plugin>
<groupId>com.microsoft.azure</groupId>
<artifactId>azure-functions-maven-plugin</artifactId>
<version>1.16.0</version>
<configuration>
<appName>${functionAppName}</appName>
<resourceGroup>${resourceGroup}</resourceGroup>
<appServicePlanName>${appServicePlanName}</appServicePlanName>
<runtime>
<os>linux</os>
<javaVersion>8</javaVersion>
</runtime>
<auth>
<type>service_principal</type>
<serverId>azure-auth</serverId>
</auth>
</configuration>
<executions>
<execution>
<id>package-functions</id>
<goals>
<goal>package</goal>
</goals>
</execution>
</executions>
</plugin>
Expected behavior
Sometimes the names of the function app and the staging directory are different, and it would be good to have the ability to specify the staging directory manually in the plugin configuration
Actual behavior
Currently it's not possible to specify the staging directory in the azure-functions-maven-plugin configuration
@jshaptic Thanks a lot for your feedback, I agree that the staging directory should be customizable, I've added this item to the backlog.
Besides, currently the maven toolkit uses <appName> in the plugin configuration as the name of the staging folder. Could you please share a project where the app name differs from the staging folder name, if possible?
I don't think I will be able to find an open-source project with such needs, but I can share some details about my current project. We have a multi-environment setup, where each environment has its own function instance and each one has a different name, e.g. function-dev, function-stage, function-prod etc.
So, it would be really good to have the ability to build only once, but deploy many times to different environments - in this case the staging folder will have an environment-agnostic name, whereas the app name will have an environment prefix.
I see this has been open for a year now. Any idea when this issue will be worked on? We have the exact same need as the OP.
|
gharchive/issue
| 2022-03-10T08:26:50 |
2025-04-01T06:39:34.794998
|
{
"authors": [
"Flanker32",
"jshaptic",
"marcia-schulman"
],
"repo": "microsoft/azure-maven-plugins",
"url": "https://github.com/microsoft/azure-maven-plugins/issues/1968",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
618358280
|
create environment agent with tags, in unattended mode
When registering an Environment agent on a VM interactively it prompts me to add tags. Is there a way to add tags even when running --unattended, without going through the UI?
.\config.cmd --environment --environmentname "01 - Development" --agent $env:COMPUTERNAME --runasservice --work '_work' --url '<redacted>' --projectname '<Redacted>' --auth PAT --token <redacted> --unattended
solved it, you can use --addvirtualmachineresourcetags --virtualmachineresourcetags "<tag>"
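Putting it together with the command from the question, a full unattended registration including tags might look like this (a sketch only; the organization URL, project name, PAT, and tag values are placeholders):

```powershell
.\config.cmd --environment --environmentname "01 - Development" `
  --agent $env:COMPUTERNAME --runasservice --work '_work' `
  --url 'https://dev.azure.com/yourorg' --projectname 'YourProject' `
  --auth PAT --token YOUR_PAT --unattended `
  --addvirtualmachineresourcetags --virtualmachineresourcetags "web,db"
```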
|
gharchive/issue
| 2020-05-14T16:14:44 |
2025-04-01T06:39:34.796623
|
{
"authors": [
"JWroe"
],
"repo": "microsoft/azure-pipelines-agent",
"url": "https://github.com/microsoft/azure-pipelines-agent/issues/2973",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1100055040
|
Hosted Agent- Azure Container Instances -gtVersion 2.172.0
Having issue with software on Hosted Agent?
I configure the hosted container instance agent and get an error
"##[error]No agent found in pool aci-agent which satisfies the specified demands: Agent.Version -gtVersion 2.172.0"
Agent Platform:
Azure DevOps Services
Hi @cheahengteong there seems to be a demand in your pipeline that the agent version should be greater than 2.172.0.
You would probably need to update your agent, or at least make sure that auto-update is enabled for your agent pool.
Hi @cheahengteong do you have any updates?
@anatolybolshakov yes
@cheahengteong is this issue still actual for you? Have the steps above helped you to resolve it?
You can refer to the https://medium.com/@cloudlabs01/running-azure-self-hosted-agent-in-azure-container-instance-aci-ad1fa338d769
@ceteongvanness thanks! As I see it, those are instructions to set up a self-hosted agent with an azure container instance, although the error here seems to be related to a demand required by some pipeline (agent version should be greater than 2.172.0). It seems like the agent was either configured with a version less than 2.172.0, or there is some other possible reason.
@cheahengteong I'm closing this at the moment due to inactivity - please let us know if you have any questions.
For Azure DevOps related questions - you can also open a ticket on https://developercommunity.visualstudio.com/search?space=21 to get support
|
gharchive/issue
| 2022-01-12T08:55:20 |
2025-04-01T06:39:34.801981
|
{
"authors": [
"anatolybolshakov",
"ceteongvanness",
"cheahengteong"
],
"repo": "microsoft/azure-pipelines-agent",
"url": "https://github.com/microsoft/azure-pipelines-agent/issues/3694",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
714688143
|
Added parameter to setResourcePath to add possibility to disable thrown warnings
Added a parameter to the setResourcePath method to add the possibility to disable thrown warnings
Related issue:
#274
Can you add the same pr targeting the releases/3.x branch?
Can you add the same pr targeting the releases/3.x branch? https://github.com/microsoft/azure-pipelines-task-lib/tree/releases/3.x
I opened it here. Could you please take a look. Thank you!
Thanks!
Published!
|
gharchive/pull-request
| 2020-10-05T09:44:56 |
2025-04-01T06:39:34.805113
|
{
"authors": [
"damccorm",
"egor-bryzgalov"
],
"repo": "microsoft/azure-pipelines-task-lib",
"url": "https://github.com/microsoft/azure-pipelines-task-lib/pull/668",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
736622823
|
DownloadPipelineArtifact task patterns not working
Required Information
Entering this information will route you directly to the right team and expedite traction.
Question, Bug, or Feature?
Type: Bug
Enter Task Name: Download Pipeline Artifacts task
list here (V# not needed):
https://github.com/microsoft/azure-pipelines-tasks/tree/master/Tasks/DownloadPipelineArtifact (V2)
Environment
Server - Azure Pipelines or TFS on-premises?
Server - Azure Pipelines, acc:schwarzit, project:kaufland.bads-ai, repo:KAP.airflow
Agent - Hosted or Private:
Private, RHEL 7.7 (Maipo), vsts-agent-linux-x64-2.174.1
Issue Description
Expected the matching pattern to match the file correctly; instead, the Minimatch pattern matches 0 files.
Start downloading artifact - airflow.cfg_sometext-scratched_chd
Minimatch patterns: [*_chd]
Filtered 0 files from the Minimatch filters supplied.
Downloaded 0.0 MB out of 0.0 MB (100%).
Patterns tested:
*chd*
*chd
*_chd
All return 0 matches.
Possible root cause is the '.' in the filename
https://github.com/isaacs/minimatch/issues/82
To be tested
|
gharchive/issue
| 2020-11-05T05:23:15 |
2025-04-01T06:39:34.810510
|
{
"authors": [
"lubomir-angelov"
],
"repo": "microsoft/azure-pipelines-tasks",
"url": "https://github.com/microsoft/azure-pipelines-tasks/issues/13840",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1834528090
|
Deployment Error by CustomDeployment from Template to Azure
I am unfortunately getting the following error when using the one-click deployment to azure:
{"code":"DeploymentFailed","target":"/subscriptions/XXX/resourceGroups/XXXX/providers/Microsoft.Resources/deployments/resources-5stymuvr2bba2","message":"At least one resource deployment operation failed. Please list deployment operations for details. Please see https://aka.ms/arm-deployment-operations for usage details.","details":[{"code":"BadRequest","target":"/subscriptions/XXX/resourceGroups/XXX/providers/Microsoft.Resources/deployments/resources-5stymuvr2bba2","message":"{\r\n "code": "BadRequest",\r\n "message": "The character 'F' at index 0 is not allowed in the DatabaseAccount name\r\nActivityId: 6a63308f-d2c3-49ea-a9df-fd1c369b7938, Microsoft.Azure.Documents.Common/2.14.0"\r\n}"}]}
the error is related to the cosmosdb but I can't see what caused it:
The character 'F' at index 0 is not allowed in the DatabaseAccount name
What deployment name are you using?
solved, it was caused by a capital letter
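A small pre-flight check could catch this before deployment. The following is a hypothetical sketch (the helper is not part of this repository); it encodes Cosmos DB's account-name constraints (3-44 characters, lowercase letters, digits, and hyphens), which the uppercase 'F' in the error above violates:

```python
import re

# Cosmos DB account names: 3-44 chars, lowercase letters, digits, hyphens,
# and must start and end with a letter or digit.
_COSMOS_NAME = re.compile(r"^[a-z0-9][a-z0-9-]{1,42}[a-z0-9]$")

def is_valid_cosmos_account_name(name: str) -> bool:
    """Return True if the name satisfies Cosmos DB account-name rules."""
    return bool(_COSMOS_NAME.match(name))
```

For example, `is_valid_cosmos_account_name("Foo-chat-db")` returns False because of the capital letter.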
|
gharchive/issue
| 2023-08-03T08:16:11 |
2025-04-01T06:39:34.817112
|
{
"authors": [
"RobertHoegner",
"Xtrah"
],
"repo": "microsoft/azurechatgpt",
"url": "https://github.com/microsoft/azurechatgpt/issues/34",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
742204922
|
There is a vulnerability in node-fetch 2.5.7,upgrade recommended
https://github.com/microsoft/azuredatastudio/blob/536628603e6df2a343f3bdf05df99e6aa0dfb706/extensions/microsoft-authentication/yarn.lock#L13-L14
CVE-2020-15168
Recommended upgrade version:2.6.1
This is a typing file for a dev dependency. I don't think the CVE applies in this case. We'll update it in a future build though.
|
gharchive/issue
| 2020-11-13T06:47:39 |
2025-04-01T06:39:34.818873
|
{
"authors": [
"QiAnXinCodeSafe",
"kburtram"
],
"repo": "microsoft/azuredatastudio",
"url": "https://github.com/microsoft/azuredatastudio/issues/13395",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
981055112
|
Formatting not kept on copy+paste
Issue Type: Bug
Format of the field is not kept when copying and pasting into other applications, e.g. Visual Studio, Excel, Sublime Text.
How field actually looks (hovering over field in results pane):
How it looks after pasting into IDE:
Azure Data Studio version: azuredatastudio 1.32.0 (4a45ba7cf20dd4129f1a08e5e776dfb33e3d1d1e, 2021-08-16T18:08:28.086Z)
OS version: Windows_NT x64 10.0.19043
System Info
Item | Value
--- | ---
CPUs | Intel(R) Core(TM) i7-10750H CPU @ 2.60GHz (12 x 2592)
GPU Status | 2d_canvas: enabled; gpu_compositing: enabled; multiple_raster_threads: enabled_on; oop_rasterization: enabled; opengl: enabled_on; rasterization: enabled; skia_renderer: enabled_on; video_decode: enabled; vulkan: disabled_off; webgl: enabled; webgl2: enabled
Load (avg) | undefined
Memory (System) | 15.86GB (6.23GB free)
Process Argv |
Screen Reader | no
VM | 0%
Extensions: none
@fsmythe I just tried this with long html content and unchecking the configuration worked for me. Have you tried reopening the query editor after making the configuration change?
The issue has been fixed.
Hi, sorry to bring up an old thread/issue but it still persists. See more detailed screenshots below. The setting was unticked, then Azure Data Studio was completely restarted. No effect.
Hover over field in azure data studio, note the carriage return symbols:
After copy and pasting into visual studio code:
Showing the setting is unticked (tried ticked too, no difference)
Same field copy and pasted from SQL Management Studio 19 (no issues):
Let me know if you need anything else.
The issue has been fixed.
Issue not fixed for me, see above, should I open a new issue or can you re-open this one for me? Thanks
|
gharchive/issue
| 2021-08-27T09:27:59 |
2025-04-01T06:39:34.829067
|
{
"authors": [
"alanrenmsft",
"fsmythe"
],
"repo": "microsoft/azuredatastudio",
"url": "https://github.com/microsoft/azuredatastudio/issues/16915",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
854887294
|
Declarative table component fixes
Fixes https://github.com/microsoft/azuredatastudio/issues/14048
We now :
Don't add duplicate items to container components. If an item is attempted to be added that already exists then we just skip it (and log a warning since that should not be a normal case)
Clear out the declarative table component when setting new data values
please fix the unit test failure
Pull Request Test Coverage Report for Build 742392656
33 of 92 (35.87%) changed or added relevant lines in 1 file are covered.
9 unchanged lines in 1 file lost coverage.
Overall coverage increased (+0.006%) to 43.554%
Changes Missing Coverage | Covered Lines | Changed/Added Lines | %
--- | --- | --- | ---
src/sql/workbench/api/common/extHostModelView.ts | 33 | 92 | 35.87%

Files with Coverage Reduction | New Missed Lines | %
--- | --- | ---
src/sql/workbench/api/common/extHostModelView.ts | 9 | 35.28%
Totals
Change from base Build 734125201: 0.006%
Covered Lines: 26187
Relevant Lines: 54951
💛 - Coveralls
|
gharchive/pull-request
| 2021-04-09T22:18:45 |
2025-04-01T06:39:34.838513
|
{
"authors": [
"Charles-Gagnon",
"alanrenmsft",
"coveralls"
],
"repo": "microsoft/azuredatastudio",
"url": "https://github.com/microsoft/azuredatastudio/pull/15085",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
965495736
|
Synapse query editor dropdown fix
This is a substitute for https://github.com/microsoft/azuredatastudio/pull/16485
It fixes https://github.com/microsoft/azuredatastudio/issues/16501
Seems the issue with fetching serverInfo from STS was fixed with Aasim's recent fix. I was able to remove the instance name based off of that.
@abist Some of the tests are still failing. Can you please fix them?
@aasimkhan30 That's the reason it's not merged yet. These tests don't seem to fail locally but only on ADO. You can see my previous commits.
Pull Request Test Coverage Report for Build 1124927796
6 of 11 (54.55%) changed or added relevant lines in 2 files are covered.
458 unchanged lines in 11 files lost coverage.
Overall coverage decreased (-0.3%) to 41.307%
Changes Missing Coverage | Covered Lines | Changed/Added Lines | %
--- | --- | --- | ---
src/sql/workbench/contrib/query/browser/queryActions.ts | 4 | 9 | 44.44%
Files with Coverage Reduction | New Missed Lines | %
--- | --- | ---
extensions/data-workspace/src/common/workspaceTreeDataProvider.ts | 3 | 77.22%
src/sql/workbench/browser/modal/modal.ts | 14 | 60.71%
src/sql/workbench/contrib/query/browser/queryActions.ts | 16 | 48.17%
extensions/sql-database-projects/src/tools/netcoreTool.ts | 21 | 43.79%
extensions/data-workspace/src/services/workspaceService.ts | 31 | 64.64%
src/sql/workbench/contrib/notebook/browser/notebookViews/notebookViewsActions.ts | 42 | 30.6%
src/sql/workbench/contrib/notebook/browser/notebookViews/insertCellsModal.ts | 43 | 22.43%
src/sql/workbench/contrib/notebook/browser/notebookViews/notebookViews.component.ts | 54 | 6.98%
extensions/schema-compare/src/schemaCompareMainWindow.ts | 60 | 75.74%
src/sql/workbench/services/connection/browser/connectionManagementService.ts | 78 | 75.02%
Totals
Change from base Build 1117984469: -0.3%
Covered Lines: 26184
Relevant Lines: 57500
💛 - Coveralls
|
gharchive/pull-request
| 2021-08-10T22:41:58 |
2025-04-01T06:39:34.854498
|
{
"authors": [
"aasimkhan30",
"abist",
"coveralls"
],
"repo": "microsoft/azuredatastudio",
"url": "https://github.com/microsoft/azuredatastudio/pull/16684",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
973092456
|
Remove extra click in python smoke test
This click is not needed since the New Python Installation option is selected by default.
I don't think this specific step was breaking anything, it just timed out while waiting for this element. At first I thought about increasing the timeout but since the step is not necessary anyways, it's better to remove it.
I'll be adding screenshots in a separate PR.
Pull Request Test Coverage Report for Build 1140965257
0 of 0 changed or added relevant lines in 0 files are covered.
No unchanged relevant lines lost coverage.
Overall coverage decreased (-0.002%) to 41.357%
Totals
Change from base Build 1140867446: -0.002%
Covered Lines: 26439
Relevant Lines: 57961
💛 - Coveralls
|
gharchive/pull-request
| 2021-08-17T21:51:13 |
2025-04-01T06:39:34.860264
|
{
"authors": [
"coveralls",
"lucyzhang929"
],
"repo": "microsoft/azuredatastudio",
"url": "https://github.com/microsoft/azuredatastudio/pull/16806",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1294897934
|
[SQL Migration] Implement URI handler
In this PR, we implement a URI handler for the sql-migration extension, which supports one path (/start) that allows the user to launch the migration wizard directly without having to manually navigate to the "Migrate to Azure SQL" button in the UI. Additionally, with optional parameters, the user can provide an arbitrary list of databases (or specify that they'd like all databases) which will be pre-selected on step 1 of the migration wizard.
Sample URIs:
azuredatastudio://Microsoft.sql-migration/start
Launches the migration wizard, as if the user were to manually click the "Migrate to Azure SQL" button in the extension UI
azuredatastudio://Microsoft.sql-migration/start?databases=AdventureWorks,AdventureWorks2
Launches the migration wizard, skipping the page which asks the user whether or not they want to start a new session, and goes directly to step 1 of the wizard with the databases AdventureWorks and AdventureWorks2 automatically selected.
azuredatastudio://Microsoft.sql-migration/start?databases=__all
Launches the migration wizard, skipping directly to step 1 of the wizard, with all databases automatically selected.
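The database pre-selection for these sample URIs can be sketched as follows. This is a hypothetical Python illustration of the parsing step only (the actual handler lives in the extension's TypeScript, and the function name here is made up):

```python
from urllib.parse import urlparse, parse_qs

ALL_DATABASES = "__all"  # sentinel for "select all databases", as above

def parse_start_uri(uri: str):
    """Extract the database pre-selection from a /start URI."""
    parsed = urlparse(uri)
    if parsed.path.rstrip("/") != "/start":
        return None  # only the /start path is supported
    query = parse_qs(parsed.query)
    raw = query.get("databases", [""])[0]
    if raw == ALL_DATABASES:
        return {"select_all": True, "databases": []}
    return {"select_all": False,
            "databases": [d for d in raw.split(",") if d]}
```

For example, `parse_start_uri('azuredatastudio://Microsoft.sql-migration/start?databases=AdventureWorks,AdventureWorks2')` returns `{'select_all': False, 'databases': ['AdventureWorks', 'AdventureWorks2']}`.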
WIP:
support adding a new connection via URI
Pull Request Test Coverage Report for Build 2619478079
0 of 0 changed or added relevant lines in 0 files are covered.
No unchanged relevant lines lost coverage.
Overall coverage remained the same at 42.338%
Totals
Change from base Build 2618420290: 0.0%
Covered Lines: 28337
Relevant Lines: 62439
💛 - Coveralls
@raymondroc are you still working on this PR? If not, please close and reopen again once work resumes. Thanks!
Closing older PR without recent activity. Please reopen if working on this area again.
|
gharchive/pull-request
| 2022-07-05T23:33:45 |
2025-04-01T06:39:34.868574
|
{
"authors": [
"coveralls",
"kburtram",
"raymondtruong"
],
"repo": "microsoft/azuredatastudio",
"url": "https://github.com/microsoft/azuredatastudio/pull/19922",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2015537208
|
Bump electron version to 25.9.6
This PR bumps the electron version to 25.9.6 and updates the ADS-sqlite version to 1.18.0 for our test suite.
*Need to follow up with the new distro hash
Pull Request Test Coverage Report for Build 7026187014
0 of 0 changed or added relevant lines in 0 files are covered.
No unchanged relevant lines lost coverage.
Overall coverage remained the same at 41.565%
Totals
Change from base Build 7024675834: 0.0%
Covered Lines: 30790
Relevant Lines: 69376
💛 - Coveralls
|
gharchive/pull-request
| 2023-11-29T00:03:05 |
2025-04-01T06:39:34.873849
|
{
"authors": [
"coveralls",
"lewis-sanchez"
],
"repo": "microsoft/azuredatastudio",
"url": "https://github.com/microsoft/azuredatastudio/pull/25066",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2738296187
|
Patch "gh" to fix CVE-2024-54132
Merge Checklist
All boxes should be checked before merging the PR (just tick any boxes which don't apply to this PR)
[x] The toolchain has been rebuilt successfully (or no changes were made to it)
[x] The toolchain/worker package manifests are up-to-date
[x] Any updated packages successfully build (or no packages were changed)
[x] Packages depending on static components modified in this PR (Golang, *-static subpackages, etc.) have had their Release tag incremented.
[x] Package tests (%check section) have been verified with RUN_CHECK=y for existing SPEC files, or added to new SPEC files
[x] All package sources are available
[x] cgmanifest files are up-to-date and sorted (./cgmanifest.json, ./toolkit/scripts/toolchain/cgmanifest.json, .github/workflows/cgmanifest.json)
[x] LICENSE-MAP files are up-to-date (./LICENSES-AND-NOTICES/SPECS/data/licenses.json, ./LICENSES-AND-NOTICES/SPECS/LICENSES-MAP.md, ./LICENSES-AND-NOTICES/SPECS/LICENSE-EXCEPTIONS.PHOTON)
[x] All source files have up-to-date hashes in the *.signatures.json files
[x] sudo make go-tidy-all and sudo make go-test-coverage pass
[x] Documentation has been updated to match any changes to the build system
[ ] Ready to merge
Summary
Patch gh to fix CVE-2024-54132
Change Log
CVE-2024-54132
Does this affect the toolchain?
NO
Links to CVEs
https://nvd.nist.gov/vuln/detail/CVE-2024-54132
Test Methodology
Pipeline build id: Buddy Build 693749
Auto cherry-pick results:
3.0-dev :x: -> https://github.com/microsoft/azurelinux/pull/11465
Auto cherry-pick pipeline run -> https://dev.azure.com/mariner-org/mariner/_build/results?buildId=693970&view=results
|
gharchive/pull-request
| 2024-12-13T12:33:36 |
2025-04-01T06:39:34.882935
|
{
"authors": [
"CBL-Mariner-Bot",
"cyberbandya007"
],
"repo": "microsoft/azurelinux",
"url": "https://github.com/microsoft/azurelinux/pull/11448",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
704430461
|
Avoid recreating resources when not needed
Subsequent calls to terraform plan or terraform apply against an already provisioned cluster recreate the ACR pull role assignment for the Managed Identity.
As per the discussion over at https://github.com/hashicorp/terraform/issues/22005 this fix will mitigate that behaviour.
The added query section is merely there to create an implicit dependency between the cluster resource and the msi_object_id resource which will move the read from the apply-phase to the plan phase where it will evaluate to a non-changed value.
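As a hypothetical illustration (the resource names and attributes below are not taken from the bedrock templates), the implicit dependency can be created by interpolating an attribute of the cluster resource into the data source's arguments:

```terraform
data "azurerm_user_assigned_identity" "msi_object_id" {
  # Interpolating cluster attributes here creates the implicit dependency,
  # moving this read from the apply phase to the plan phase.
  name                = "${azurerm_kubernetes_cluster.cluster.name}-agentpool"
  resource_group_name = azurerm_kubernetes_cluster.cluster.node_resource_group
}
```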
/AzurePipelines run
We removed the msi_object_id in this file in main branch. Closing.
|
gharchive/pull-request
| 2020-09-18T14:35:00 |
2025-04-01T06:39:34.885386
|
{
"authors": [
"andrebriggs",
"mickeahlinder"
],
"repo": "microsoft/bedrock",
"url": "https://github.com/microsoft/bedrock/pull/1443",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
977326499
|
Update dependency Microsoft.Extensions.Configuration to 5.0.0
Fixes #5751
Description
This updates Microsoft.Extensions.Configuration from version 2.1.0 to 5.0.0 for TargetFramework netcoreapp3.1.
For netcoreapp2.1, version 2.1.0 must be kept because later versions are not compatible.
Specific Changes
Testing
Pull Request Test Coverage Report for Build 265143
0 of 0 changed or added relevant lines in 0 files are covered.
3 unchanged lines in 2 files lost coverage.
Overall coverage decreased (-0.01%) to 75.769%
Files with Coverage Reduction | New Missed Lines | %
--- | --- | ---
/libraries/AdaptiveExpressions/BuiltinFunctions/GetPreviousViableTime.cs | 1 | 90.91%
/libraries/AdaptiveExpressions/BuiltinFunctions/GetNextViableTime.cs | 2 | 90.91%
Totals
Change from base Build 264914: -0.01%
Covered Lines: 23136
Relevant Lines: 30535
💛 - Coveralls
|
gharchive/pull-request
| 2021-08-23T18:55:50 |
2025-04-01T06:39:34.894068
|
{
"authors": [
"BruceHaley",
"coveralls"
],
"repo": "microsoft/botbuilder-dotnet",
"url": "https://github.com/microsoft/botbuilder-dotnet/pull/5848",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2567274318
|
feat: [#4684] Consolidate and update browser-echo-bot dependencies
Addresses #4684
#minor
Description
This PR updates the browser-echo-bot dependencies to their latest versions. Additionally, it consolidates the project with the rest of the botbuilder libraries using workspaces, so the latest changes are taken into consideration when testing them from a bot perspective.
Specific Changes
Update all browser-echo-bot dependencies to their latest version.
Added crypto-browserify dependency to address an issue with latest botbuilder version in the browser.
Removed resolutions inside the browser-echo-bot, since they are no longer needed.
Removed the section style from the browser-echo-bot, since it was causing the webchat to display incorrectly.
Removed browser-echo-bot yarn.lock, to consolidate the dependencies into the root's yarn.lock.
Updated browser-echo-bot CI pipeline to work with the new structure.
Testing
The following image shows the functional pipeline and the bot working correctly.
Pull Request Test Coverage Report for Build 11186399843
Details
0 of 0 changed or added relevant lines in 0 files are covered.
No unchanged relevant lines lost coverage.
Overall coverage decreased (-0.4%) to 84.997%
Totals
Change from base Build 11164635173: -0.4%
Covered Lines: 20351
Relevant Lines: 22910
💛 - Coveralls
|
gharchive/pull-request
| 2024-10-04T20:17:06 |
2025-04-01T06:39:34.901737
|
{
"authors": [
"coveralls",
"sw-joelmut"
],
"repo": "microsoft/botbuilder-js",
"url": "https://github.com/microsoft/botbuilder-js/pull/4764",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1601120987
|
CalculatorApp.exe is not executable...
Describe the bug
I try to build the Calculator for a computer without entry to the Microsoft Store. After I compile the project with Visual Studio inside my project folder the CalculatorApp.exe appears. Unfortunately this exe is not executable. Every time I try to start the exe nothing happens. When I simultaneously open the Task Manager I can see that the exe start and cancel after a few seconds. Inside the Event Viewer two error messages appear(see first two screenshots). When I debug the project in Visual Studio the calculator starts normally.
When I create a Setup Launcher for the project the application it cant installed the application probably, without the change of the Microsoft.NET dependency from .NET Framework 4.7.2 to Any, although the requested version is installed(see screenshot 3 and 4). While the build process of the Setup Launcher several warning appears(see screenshot 4 and 5). Likewise the warning before the requested dependencies are available on the system. While my search I discovered that maybe some file path are not resolved correctly(see screenshot 7). All in all i can not find the reason why the creation of a working exe is not working. Maybe some compatibility issues...
Can someone maybe describe the correct steps how to create a working exe of the Calculator?
So why does the calculator run when I debug the project in Visual Studio, but not when I try to open the exe directly?
Steps To Reproduce
Steps to reproduce the behavior:
Download a version of the project which is compatible with Win 10.
Follow the instructions of the "Getting Started".
Build Solution with Visual Studio.
Open exe in the project folder.
See error
Expected behavior
The Calculator can be installed on the computer normally.
Screenshots
1.
Device and Application Information
OS Build: 10.0.20348.0
Architecture: x64
Application Version:
Region:de-De
Dev Version Installed:
Additional context
I work on a 2022 Server environment, but I can reproduce the error on a Win 10 Pro and a Win 11 Pro environment.
Requested Assignment
If possible, I would like to fix this.
Hi, could you assign me to it so that I can try and fix the problem for you
Hi, could you assign me to this task.
|
gharchive/issue
| 2023-02-27T12:47:49 |
2025-04-01T06:39:34.920695
|
{
"authors": [
"TRJ1",
"valleeiii"
],
"repo": "microsoft/calculator",
"url": "https://github.com/microsoft/calculator/issues/1980",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2030523379
|
500 Internal Server error RN clarity sdk
Followed all documented steps.
Is anybody else facing the same issue?
@vivek0046 Thanks for reporting this, we had a few backend issues at that timeframe, but things should have stabilized soon after. Could you please confirm that the problem has been resolved on your end?
@vivek0046 Thanks for reporting this, we had a few backend issues at that timeframe, but things should have stabilized soon after. Could you please confirm that the problem has been resolved on your end?
Hi,thanks for update
We will try again.
|
gharchive/issue
| 2023-12-07T11:13:26 |
2025-04-01T06:39:34.923374
|
{
"authors": [
"ibradwan",
"vivek0046"
],
"repo": "microsoft/clarity-apps",
"url": "https://github.com/microsoft/clarity-apps/issues/25",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1149175708
|
[BUG] npx init fails on Win10/Git Bash
Context:
Playwright Version: [what Playwright version do you use?] latest via npx init playwright
Operating System: [e.g. Windows, Linux or Mac] Win 10 64-bit
Node.js version: [e.g. 12.22, 14.6] 7.24.1
Browser: [e.g. All, Chromium, Firefox, WebKit] via npx init playwright
Extra: [any specific details about your environment] git bash 4.4.23
System:
OS: Windows 10 10.0.19043
Memory: 8.11 GB / 15.94 GB
Binaries:
Node: 16.10.0 - L:\Program Files\nodejs\node.EXE
Yarn: 1.22.15 - L:\_npm\yarn.CMD
npm: 7.24.1 - L:\_npm\npm.CMD
Languages:
Bash: 4.4.23 - L:\Program Files\Git\usr\bin\bash.EXE
Code Snippet
Help us help you! Put down a short code snippet that illustrates your bug and
that we can run and debug locally. For example:
npm init playwright
Describe the bug
leege@studio MINGW64 /l/src/nta/nta-test (master)
$ npm init playwright
Getting started with writing end-to-end tests with Playwright:
Initializing project in '.'
√ Do you want to use TypeScript or JavaScript? · TypeScript
√ Where to put your end-to-end tests? · .
√ Add a GitHub Actions workflow? (Y/n) · true
Initializing NPM project (npm init -y)…
Wrote to L:\src\nta\nta-test\package.json:
{
"name": "nta-test",
"version": "1.0.0",
"description": "",
"main": "index.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1"
},
"keywords": [],
"author": "",
"license": "ISC"
}
Installing Playwright Test (npm install --save-dev @playwright/test)…
added 208 packages, and audited 209 packages in 16s
20 packages are looking for funding
run `npm fund` for details
found 0 vulnerabilities
Downloading browsers (npx playwright install)…
'playwright' is not recognized as an internal or external command,
operable program or batch file.
Error: Command failed: npx playwright install
at checkExecSyncError (node:child_process:826:11)
at execSync (node:child_process:900:15)
at executeCommands (C:\Users\leege\AppData\Local\npm-cache\_npx\d40a10098bccdd29\node_modules\create-playwright\lib\index.js:1:85854)
at Generator.run (C:\Users\leege\AppData\Local\npm-cache\_npx\d40a10098bccdd29\node_modules\create-playwright\lib\index.js:1:80307)
at processTicksAndRejections (node:internal/process/task_queues:96:5)
at async C:\Users\leege\AppData\Local\npm-cache\_npx\d40a10098bccdd29\node_modules\create-playwright\lib\index.js:1:79571 {
status: 1,
signal: null,
output: [ null, null, null ],
pid: 14608,
stdout: null,
stderr: null
}
npm ERR! code 1
npm ERR! path L:\src\nta\nta-test
npm ERR! command failed
npm ERR! command C:\WINDOWS\system32\cmd.exe /d /s /c create-playwright
npm ERR! A complete log of this run can be found in:
npm ERR! C:\Users\leege\AppData\Local\npm-cache\_logs\2022-02-24T08_58_52_040Z-debug.log
I tried to reproduce but it was working for me with the exact same npm/node version and inside git bash with mingw.
You should not have to adjust something manually inside the path and npx playwright should just work.
Could you try executing npx playwright --help inside your repo?
'playwright' is not recognized as an internal or external command,
operable program or batch file.
Very weird! Seems like something is off with your NPM/Node.js installation. Could you try upgrading to NPM 8?
npm i -g npm@8
Yeah, it's me -- I don't know why yet, but I'm seeing the same with Stencil's entry in .bin....
|
gharchive/issue
| 2022-02-24T09:02:55 |
2025-04-01T06:39:34.936653
|
{
"authors": [
"leegee",
"mxschmitt"
],
"repo": "microsoft/create-playwright",
"url": "https://github.com/microsoft/create-playwright/issues/5",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2218901218
|
Dev Box Hibernate support
Suggested new feature or improvement
With Dev Box shipping out hibernate support for more VMs, Dev Home & Azure Extension might need to make changes to support it better.
Scenario
Hibernating of Dev Boxes is seamlessly supported.
Additional details
No response
Moving to P3 as this needs discussion.
|
gharchive/issue
| 2024-04-01T19:20:35 |
2025-04-01T06:39:34.938795
|
{
"authors": [
"huzaifa-d"
],
"repo": "microsoft/devhome",
"url": "https://github.com/microsoft/devhome/issues/2508",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2242079563
|
Hopefully you can add a star to GitHub
Suggested new feature or improvement
Hopefully you can add a star to GitHub
Scenario
Hopefully you can add a star to GitHub
Additional details
No response
Hi @RBHakureiReimu. There's not enough information here to understand what you're asking for, and therefore nothing we can do. If you want to add more details about your feature suggestion, I might be able to re-open this issue.
|
gharchive/issue
| 2024-04-14T10:07:37 |
2025-04-01T06:39:34.940686
|
{
"authors": [
"RBHakureiReimu",
"krschau"
],
"repo": "microsoft/devhome",
"url": "https://github.com/microsoft/devhome/issues/2631",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2328262244
|
Terminate the target process, restart it, PI shows (terminated) multiple times
Dev Home version
0.503
Windows build number
10.0.22631.3672
Other software
No response
Steps to reproduce the bug
Run PI, and associate with some app.
Go to Expanded mode
Terminate the target process.
See we append (terminated).
Restart the target app, and associate PI with it (via process list or hotkey)
Terminate the target process again.
See we append (terminated) (terminated)
Expected result
No response
Actual result
(terminated) is repeated.
Included System Information
No response
Included Extensions Information
No response
No repro
|
gharchive/issue
| 2024-05-31T17:10:09 |
2025-04-01T06:39:34.944854
|
{
"authors": [
"andreww-msft",
"zadesai"
],
"repo": "microsoft/devhome",
"url": "https://github.com/microsoft/devhome/issues/3027",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1451766008
|
SMART on FHIR - 3.3.05 OAuth server sends code parameter : No code parameter received
While testing with the Inferno (g)(10) tool, section "3 EHR Practitioner App", I'm getting this error:
Steps from 3.3.01 to 3.3.04 are ok (3.3.06 also ok)
3.3.05 OAuth server sends code parameter
No code parameter received
Code is a required querystring parameter on the redirect.
https://inferno.healthit.gov/suites/custom/smart/redirect?error=invalid_client&error_description=AADSTS650053%3A+The+application+'fhirproxy-smart-client'+asked+for+scope+'launch'+that+doesn't+exist+on+the+resource+'740cac0e-xxx-450e-afb9-14ef9433c55e'.+Contact+the+app+vendor.
Trace+ID%3A+632ae9f0-xxx-44e8-8d97-7409c55b2d00
Correlation+ID%3A+acfdfb9b-1741-xxx-b7ab-7f4ed42872f5
Timestamp%3A+2022-11-16+14%3A29%3A59Z&state=0addbd03-5c46-xxx-91a7-e91096cffa50
Input: ehr_client_secret taE8Q~xxxoFPTUwDltBpOVlhE3WoMsiq7VbbV
Output: ehr_code ???
To reproduce the error
To reproduce
Run test # 3 on https://inferno.healthit.gov/suites/test_sessions/3294991d-8299-4ea0-86ad-b8d5b1e87af6
Provide your Fhir Proxy Url, client and secret
Use this link to complete test https://inferno.healthit.gov/suites/custom/smart/launch?launch=123&iss=https://sfp-proxyxxx.azurewebsites.net/fhir
Please see the ONC documentation for fhir-proxy here:
https://github.com/microsoft/fhir-proxy/blob/v2.0/docs/ConfigureProxyONCg10.md
There have been code changes and more explicit instructions on passing oncg10 test suite. Please update your code and follow https://github.com/microsoft/fhir-proxy/blob/v2.0/docs/ConfigureProxyONCg10.md
|
gharchive/issue
| 2022-11-16T15:02:20 |
2025-04-01T06:39:34.956924
|
{
"authors": [
"rodriguezrm",
"sordahl-ga"
],
"repo": "microsoft/fhir-proxy",
"url": "https://github.com/microsoft/fhir-proxy/issues/74",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2498832447
|
(#Watch Video#) Jaden Newman L𝚎aked-Video on Twitter
(#Watch Video#) Jaden Newman L𝚎aked-Video on Twitter
Jaden Newman Leaked Video on Twitter. Jaden Newman Leaked Video New collections of Jaden Newman Leaked Video now being a creator on Fanfix uploading adult contents. Social media star Jaden Newman Leaked Video been posting short Videoos and naughty pics on Tiktok platform for a while now.
Jaden Newman Leaked Video New collections of Jaden Newman Leaked Video now being a creator on Fanfix uploading adult contents. Social media star Jaden Newman Leaked Video been posting short Videoos and naughty pics on Tiktok platform for a while now.
The purported leak has stirred a maelanage of reactions, from disbelief to morbid curiosity, marking yet another chapter in the saga of celebrity scandals that have dotted the landscape of pop culture. However, this isn't the first rodeo for Jaden Newman Leaked Videowhen it comes to rumors of a sex tape. In March 2024, similar whispers emerged, only to be debunked as baseless. The rapper, known for hits like In Ha Mood vehemently denied the rumors, critiquing the eagerness of some to believe in such falsehoods.
On Monday March 18, formerly known as Twitter, became abuzz with speculation that a leaked sex tape featuring Jaden Newman Leaked Videowas readily available online. However, instead of the, there are dozens of pages promising to release the Videoo in exchange for interaction with their post.
The recurrent theme of leaked tapes and the subsequent fallout serves as a reminder of the fragility of reputation in the digital era. As the lines between private and public life continue to blur, celebrities like Jaden Newman Leaked Videofind themselves at the mercy of internet chatter, where a rumor can ignite a firestorm of speculation and judgment.
In the ever evolving landscape of celebrity culture, the Ishowspeedscandal underscores the relentless pursuit of sensationalism, a pursuit that often comes at the expense of truth and dignity. As we navigate the complexities of the digital age, the line between entertainment and exploitation remains perilously thin.
As the situation unfolds, the truth remains shrouded in mystery, leaving the public to ponder the authenticity of the rumors. In a world where fame and infamy are two sides of the same coin, the saga of Ishowspeedis a testament to the power of social media to shape narratives and challenge the boundaries of privacy and consent.
TAG :
Jaden Newman Leaked Video viral mp4
Jaden Newman Leaked Video di lejja
Jaden Newman Leaked Video viral di lejja
Jaden Newman Leaked Video
Jaden Newman Leaked Video viral
viral Jaden Newman Leaked Video di lejja
Jaden Newman Leaked Video viral
Videoo viral Jaden Newman Leaked Video di lejja
Wow
|
gharchive/issue
| 2024-08-31T12:53:47 |
2025-04-01T06:39:34.984345
|
{
"authors": [
"Elsenle",
"Fahimajahan30",
"Tarikulalom82"
],
"repo": "microsoft/fhir-server",
"url": "https://github.com/microsoft/fhir-server/issues/4540",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2262332970
|
79 fix
https://microsofthealth.visualstudio.com/Health/_workitems/edit/119320/
/azp run
/azp run
|
gharchive/pull-request
| 2024-04-24T23:11:51 |
2025-04-01T06:39:34.986011
|
{
"authors": [
"SergeyGaluzo",
"rajithaalurims"
],
"repo": "microsoft/fhir-server",
"url": "https://github.com/microsoft/fhir-server/pull/3825",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1909816819
|
fix: Dropdown appears below grid title in FluentSelect
🐛 Bug Report
Select dropdown appears below the sticky grid title.
💻 Repro or Code Sample
🤔 Expected Behavior
Appears above the sticky grid title.
😯 Current Behavior
Appears below the sticky grid title.
💁 Possible Solution
🔦 Context
🌍 Your Environment
Windows
Chrome
.NET 8 and Fluent Blazor 3.1
I'm not seeing this when combining a FluentSelect with a FluentDataGrid sample:
You can see the sample at https://preview.fluentui-blazor.net/Lab/Overview.
Can you share some source (privately) perhaps?
Your test example has overridden the z-index of the select listbox to a huge value:
My listbox has a z-index of 1.
Ah, true, but we only do that for a FluentSelect that has a value for the Height parameter.
From FluentSelect.razor:
if (!String.IsNullOrEmpty(Height))
{
<style>
@($"#{Id}::part(listbox) {{ max-height: {Height}; z-index: {ZIndex.SelectPopup} }}")
@($"#{Id}::part(selected-value) {{ white-space: nowrap; overflow: hidden; text-overflow: ellipsis; }}")
@($"fluent-anchored-region[anchor='{Id}'] div[role='listbox'] {{ height: {Height}; }}")
</style>
}
Adding a height fixed it. The dropdown should probably have a max height anyway.
Closing this. Made a note on the min-height/default behavior
I pushed this PR #774 to fix this issue.
|
gharchive/issue
| 2023-09-23T10:20:20 |
2025-04-01T06:39:34.993036
|
{
"authors": [
"JamesNK",
"dvoituron",
"vnbaaij"
],
"repo": "microsoft/fluentui-blazor",
"url": "https://github.com/microsoft/fluentui-blazor/issues/771",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2447519286
|
[Issue]: Does cache still work if extracting one-file graphRAG from a multiple files graphRAG?
Hi, here is the scenario I am currently confronting:
I built a graphRAG index from two distinct .txt files. Later, I wanted to see if I could build a graphRAG index from just one of them.
After I modify the settings file to ensure that only one file gets ingested, I run the following command
python -m graphrag.index --root .
I was expecting this not to cost too much if the indexing stage can leverage the cache; however, it still makes complete calls to OpenAI to build the graph.
So, can someone tell me if I did something wrong, or whether this scenario is not supported yet?
Many thanks.
Here is my settings.yml file
encoding_model: cl100k_base
skip_workflows: []
llm:
api_key: ${GRAPHRAG_API_KEY}
type: openai_chat # or azure_openai_chat
model: ${GRAPHRAG_LLM_MODEL}
model_supports_json: true # recommended if this is available for your model.
# max_tokens: 4000
# request_timeout: 180.0
api_base: ${GRAPHRAG_API_BASE}
# api_version: 2024-02-15-preview
# organization: <organization_id>
# deployment_name: <azure_model_deployment_name>
# tokens_per_minute: 150_000 # set a leaky bucket throttle
# requests_per_minute: 10_000 # set a leaky bucket throttle
# max_retries: 10
# max_retry_wait: 10.0
# sleep_on_rate_limit_recommendation: true # whether to sleep when azure suggests wait-times
# concurrent_requests: 25 # the number of parallel inflight requests that may be made
parallelization:
stagger: 0.3
# num_threads: 50 # the number of threads to use for parallel processing
async_mode: threaded # or asyncio
embeddings:
## parallelization: override the global parallelization settings for embeddings
async_mode: threaded # or asyncio
llm:
api_key: ${GRAPHRAG_API_KEY}
type: openai_embedding # or azure_openai_embedding
model: ${GRAPHRAG_EMBEDDING_MODEL}
api_base: ${API_BASE}
# api_version: 2024-02-15-preview
# organization: <organization_id>
# deployment_name: <azure_model_deployment_name>
# tokens_per_minute: 150_000 # set a leaky bucket throttle
# requests_per_minute: 10_000 # set a leaky bucket throttle
# max_retries: 10
# max_retry_wait: 10.0
# sleep_on_rate_limit_recommendation: true # whether to sleep when azure suggests wait-times
# concurrent_requests: 25 # the number of parallel inflight requests that may be made
batch_size: 16 # the number of documents to send in a single request
# batch_max_tokens: 8191 # the maximum number of tokens to send in a single request
# target: required # or optional
chunks:
size: 300
overlap: 100
group_by_columns: [id] # by default, we don't allow chunks to cross documents
input:
type: file # or blob
file_type: text # or csv
base_dir: "input"
file_encoding: utf-8
file_pattern: ".*0731\\.txt$"
# file_pattern: ".*\\.txt$"
cache:
type: file # or blob
base_dir: "cache"
# connection_string: <azure_blob_storage_connection_string>
# container_name: <azure_blob_storage_container_name>
storage:
type: file # or blob
base_dir: "output/${timestamp}/artifacts"
# connection_string: <azure_blob_storage_connection_string>
# container_name: <azure_blob_storage_container_name>
reporting:
type: file # or console, blob
base_dir: "output/${timestamp}/reports"
# connection_string: <azure_blob_storage_connection_string>
# container_name: <azure_blob_storage_container_name>
entity_extraction:
## llm: override the global llm settings for this task
## parallelization: override the global parallelization settings for this task
## async_mode: override the global async_mode settings for this task
prompt: "prompts/entity_extraction.txt"
entity_types: [product,group,job,feature,case,solution]
max_gleanings: 0
summarize_descriptions:
## llm: override the global llm settings for this task
## parallelization: override the global parallelization settings for this task
## async_mode: override the global async_mode settings for this task
prompt: "prompts/summarize_descriptions.txt"
max_length: 500
claim_extraction:
## llm: override the global llm settings for this task
## parallelization: override the global parallelization settings for this task
## async_mode: override the global async_mode settings for this task
enabled: true
prompt: "prompts/claim_extraction.txt"
description: "Any claims or facts that could be relevant to information discovery."
max_gleanings: 0
community_report:
## llm: override the global llm settings for this task
## parallelization: override the global parallelization settings for this task
## async_mode: override the global async_mode settings for this task
prompt: "prompts/community_report.txt"
max_length: 2000
max_input_length: 8000
cluster_graph:
max_cluster_size: 10
embed_graph:
enabled: false # if true, will generate node2vec embeddings for nodes
# num_walks: 10
# walk_length: 40
# window_size: 2
# iterations: 3
# random_seed: 597832
umap:
enabled: false # if true, will generate UMAP embeddings for nodes
snapshots:
graphml: True
raw_entities: false
top_level_nodes: false
local_search:
# text_unit_prop: 0.5
# community_prop: 0.1
# conversation_history_max_turns: 5
# top_k_mapped_entities: 10
# top_k_relationships: 10
# max_tokens: 12000
global_search:
# max_tokens: 12000
# data_max_tokens: 12000
# map_max_tokens: 1000
# reduce_max_tokens: 2000
# concurrency: 32
If your settings have not been changed, and the original file is still in the folder, then it should use the cache in several places. For example, the text units (chunks) should be identical, so graph extraction should use the cache for those. However, and new entities and relationships extracted from the second file will trigger re-compute of the communities, and therefore all of the community summarization, which can be much of your overall expense. We're tracking more efficient incremental indexing with #741.
Hi, @natoverse ,
I have changed the file_pattern field in the input setting to deal with the specific file. Does this matter?
Below is how I changed it:
input:
type: file # or blob
file_type: text # or csv
base_dir: "input"
file_encoding: utf-8
file_pattern: ".*\.txt$"
file_pattern: ".*0731\.txt$"
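The effect of narrowing `file_pattern` can be checked with a quick regex test. The file names below are purely illustrative, not taken from the issue:

```python
import re

# The default pattern ingests every .txt file in the input folder.
default_pattern = r".*\.txt$"
# The narrowed pattern only ingests files whose name ends in "0731.txt".
narrowed_pattern = r".*0731\.txt$"

files = ["notes_0731.txt", "notes_0801.txt", "summary.csv"]

default_matches = [f for f in files if re.match(default_pattern, f)]
narrowed_matches = [f for f in files if re.match(narrowed_pattern, f)]
```

With the narrowed pattern only the `0731` file would be ingested, which matches the intent described above.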
I don't think it should matter - the key to getting an accurate cache is that we hash all of the LLM params and prompt so that identical API calls are avoided. This is done per step, so individual parameter changes should only affect the steps that rely on them.
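A minimal sketch of that kind of cache keying, illustrative only and not GraphRAG's actual implementation: hash the prompt together with a deterministically serialized parameter set, so logically identical calls map to the same key.

```python
import hashlib
import json

def cache_key(prompt: str, llm_params: dict) -> str:
    # Serialize with sorted keys so that parameter order does not
    # change the hash; identical calls always yield the same key.
    payload = json.dumps({"prompt": prompt, "params": llm_params}, sort_keys=True)
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

k1 = cache_key("Extract entities from: ...", {"model": "gpt-4", "temperature": 0})
k2 = cache_key("Extract entities from: ...", {"temperature": 0, "model": "gpt-4"})
k3 = cache_key("Extract entities from: ...", {"model": "gpt-4", "temperature": 1})
```

Changing any parameter (here, the temperature) produces a different key and therefore a cache miss, which is why only steps with unchanged prompts and settings benefit from the cache.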
Thank you @natoverse for your graphRAG and your answer,
I still have one question that is related to this topic:
I generated the graphRAG index using two files at first, but later decided to build it using only one of them. I am wondering whether the system needs to regenerate the entity summaries, because the description lists may change as a result of the reduction in input documents. The same applies to the summaries of relationships and claims.
The entity/relationship extraction step is separate from the summarization. When extracting, each entity and relationship is given a description by the LLM. This will get the benefit of the cache. Before creating the community reports, the descriptions for each entity are combined into a single "canonical" description. This is also done by the LLMs, and if you have new instances of the entities, it should not use the cache.
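The two-stage flow described above (collect per-instance descriptions, then summarize them into one canonical description) can be sketched roughly as follows; the entity names are made up and the trivial `summarize` stands in for the real LLM call:

```python
from collections import defaultdict

# Each extraction pass may yield its own description for the same entity.
extracted = [
    ("Contoso", "A software vendor."),
    ("Contoso", "Company that sells cloud tooling."),
    ("Fabrikam", "A hardware manufacturer."),
]

# Stage 1: group all descriptions by entity.
descriptions = defaultdict(list)
for entity, desc in extracted:
    descriptions[entity].append(desc)

def summarize(descs):
    # Placeholder for the LLM call that produces one canonical description.
    return " ".join(descs)

# Stage 2: produce one canonical description per entity.
canonical = {entity: summarize(d) for entity, d in descriptions.items()}
```

Because stage 2 depends on the full set of collected descriptions, removing an input file changes its inputs, which is why those summarization calls cannot come from the cache.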
Many thanks
|
gharchive/issue
| 2024-08-05T03:31:43 |
2025-04-01T06:39:35.225061
|
{
"authors": [
"Edwin-poying",
"natoverse"
],
"repo": "microsoft/graphrag",
"url": "https://github.com/microsoft/graphrag/issues/819",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1474388495
|
typescript makeStyles doesn't accept local string css variables | Type 'string' is not assignable to type 'GriffelStyle'.
Hello, Griffel team. The limitations page details that a workaround for runtime styles is to use local CSS variables.
Unfortunately TypeScript is complaining about the makeStyles types: Type 'string' is not assignable to type 'GriffelStyle'.
I wonder if this is a bug or am I doing something incorrectly? 🙇♂️
const usePixelStyle = makeStyles({
width: "var(--pkr-pixel-width)",
height: "var(--pkr-pixel-height)",
})
This is my mistake, I forgot to add a "className"
const usePixelStyle = makeStyles({
root: {
width: "var(--pkr-pixel-width)",
height: "var(--pkr-pixel-height)",
}
})
That's valid; I forgot to add "root" (or any other "className"). Closing it, but keeping it here in case someone else makes the same mistake.
@pukingrainbows can you please check #287? Is it clearer now?
|
gharchive/issue
| 2022-12-04T03:14:43 |
2025-04-01T06:39:35.228511
|
{
"authors": [
"layershifter",
"pukingrainbows"
],
"repo": "microsoft/griffel",
"url": "https://github.com/microsoft/griffel/issues/286",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1749869430
|
Guidance on combining samples.
Hi there 👋🏾!
Is there any guidance on how to combine samples, say proactive notifications and tabs? (Those are the ones we want to combine for our submission).
I'm currently doing it manually, but as I'm about to run it, I'm almost certain I'll be all over the place trying to get them to play nice.
Hi @irwinwilliams, looking forward to your project submission 😊
There is a sample code that you can take a look as a reference, it shows how bot and a tab work together in a single project: https://github.com/OfficeDev/TeamsFx-Samples/tree/dev/hello-world-bot-with-tab
There is also guidelines about configuring a tab in a bot: https://github.com/OfficeDev/TeamsFx/wiki/How-to-configure-Tab-capability-within-your-Teams-app
Or configuring a bot in a tab: https://github.com/OfficeDev/TeamsFx/wiki/How-to-configure-Bot-capability-within-your-Teams-app
I hope these guidelines help you. Let me know if you have any other questions. Looking forward to seeing what you've built! 🥳
Nice! Thanks, @aycabas!
|
gharchive/issue
| 2023-06-09T13:08:27 |
2025-04-01T06:39:35.232008
|
{
"authors": [
"aycabas",
"irwinwilliams"
],
"repo": "microsoft/hack-together-teams",
"url": "https://github.com/microsoft/hack-together-teams/issues/47",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1749151732
|
[DP-900] Challenge completion verification
GitHub ID: https://github.com/cumedang
Microsoft Learn profile link: https://learn.microsoft.com/ko-kr/users/21973339/
Cloud Skills Challenge topic/code: Azure Data Fundamentals Challenge / DP-900
/ok
|
gharchive/issue
| 2023-06-09T05:29:06 |
2025-04-01T06:39:35.234569
|
{
"authors": [
"cumedang",
"swookjy"
],
"repo": "microsoft/hackers-ground",
"url": "https://github.com/microsoft/hackers-ground/issues/71",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
644235390
|
Add "Code Structure" documentation
What changes were proposed in this pull request?
Why are the changes needed?
Does this PR introduce any user-facing change?
How was this patch tested?
Can you fill up the PR description as well when you get a chance?
Can you fill out the description please?
Can you fill up the PR description as well when you get a chance?
@rapoth Done
|
gharchive/pull-request
| 2020-06-24T00:39:10 |
2025-04-01T06:39:35.237617
|
{
"authors": [
"imback82",
"pirz",
"rapoth"
],
"repo": "microsoft/hyperspace",
"url": "https://github.com/microsoft/hyperspace/pull/33",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1619231314
|
Generate a file of error detecting during the kiota generation
Log all warning and up to a log file
Name the file kiota.log
We overwrite on every generation
@sebastienlevert I believe that was meant for the 1.2 milestone, correct?
Also, may I suggest we call the file .kiota.log (dot in front) to standardize with .npm.log?
Absolutely 1.2. Agree on the naming! Editing the original description.
|
gharchive/issue
| 2023-03-10T16:16:51 |
2025-04-01T06:39:35.247930
|
{
"authors": [
"baywet",
"sebastienlevert"
],
"repo": "microsoft/kiota",
"url": "https://github.com/microsoft/kiota/issues/2389",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1789674805
|
Do not create request builder classes in parallel
This causes quite severe corruption of the CodeModel. Many threads concurrently start creating the same models and start adding properties. The result is undefined for larger models and will be different on every run.
This probably fixes (or at least improves the situation for) #2442.
I ran into this trying to build an SDK with the fix for the renaming of properties, which sometimes worked and sometimes didn't work.
I really do not see any way to guarantee consistency when the CodeModel is built in a concurrent way. At the moment, multiple threads start creating the same CodeClasses and start building on those partly generated classes further on. Getting this to be thread safe will require a lot of explicit locking around building types, which will very likely remove any benefit from running this in parallel. For our API (about 10% the size of the Graph API), I didn't notice any change in performance. Generation is still done in just a few seconds. Generating the Go code does not take any longer than compiling it. IMHO the benefits of parallelism at this point do not outweigh the caveats of all the concurrency issues.
My simple benchmark on my gigantic machine :grin: (yes, it is and my colleagues think I'm mad carrying such a heavy 'laptop'), gave about the following results:
With parallelism: Total generation for Go takes a little over 30 seconds, of which about 11-12 is spent in step 7.
Without parallelism: Total generation takes about 60 seconds, of which 40 is spent in step 7.
So the speedup on my machine is roughly 30 seconds or 50%. But, the instability also counts. I had several runs with parallelism that crashed with an error. Given the unpredictability, I would blame the concurrency. Also, none of the runs with parallelism produced compiling code. The runs without parallelism did produce working code.
Compiling the Go code takes about 30 seconds. IMHO this must be added to the total build time. This reduces the speed gain to about 33%.
These are the problems I've spotted in the creation of properties:
KiotaBuilder.CreatePropertiesForModelClass can be run by multiple threads at the same time for the same type. This results in CodeClass.AddProperty starting to resolve conflicts for identical properties.
CodeClass.AddProperty searches through the type hierarchy for existing properties, but has no guarantee that these types are actually built entirely. This often results in it not picking up properties, because the super type did exist, but its properties hadn't been created yet.
However, during my runs I've also seen problems building inheritance chains, and I've seen reports of unused types being pruned even though I'm sure they are used in the API.
Building a complex, highly interconnected data structure concurrently really is asking for trouble if you ask me. I'd much rather have a tool that's highly reliable but a bit slower than a tool that's fast but spits out incorrect results every now and then. These concurrency issues will continue to haunt you and they will be very hard to find and nearly impossible to reproduce.
What's more, the concurrency causes code paths to be executed in unpredictable ways. This can cause Kiota to make different decisions depending on which happens first. Even if all concurrency issues get fixed, this can still cause Kiota to generate different code on subsequent runs on the same input. IMHO this should never happen in a tool like Kiota.
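A get-or-create pattern guarded by a lock is the usual way to ensure that several threads never build the same model twice; the sketch below is a hypothetical illustration in Python, not Kiota's actual code:

```python
import threading
from concurrent.futures import ThreadPoolExecutor

classes: dict = {}
lock = threading.Lock()

def get_or_create(name: str) -> dict:
    # Only one thread may create a given class; later callers
    # reuse the single shared instance instead of racing to build it.
    with lock:
        if name not in classes:
            classes[name] = {"name": name, "properties": []}
        return classes[name]

def worker(_: int) -> int:
    cls = get_or_create("User")
    # ... further building happens on the one shared instance ...
    return id(cls)

with ThreadPoolExecutor(max_workers=8) as pool:
    ids = set(pool.map(worker, range(100)))
```

All 100 workers end up holding the same object, so the model stays consistent; the trade-off, as argued above, is that the serialization removes much of the benefit of running the construction in parallel.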
I've spent this morning further troubleshooting inconsistent results and most can be directly related to parallelism. For example, the Parallel.ForEach calls in TrimInheritedModels, GetDerivationIndex and GetInheritanceIndex result in missing types (quite frequent) and even the following stacktrace every now and then:
System.AggregateException: One or more errors occurred. (Value cannot be null. (Parameter 'key'))
---> System.ArgumentNullException: Value cannot be null. (Parameter 'key')
at System.ThrowHelper.ThrowArgumentNullException(String name)
at System.Collections.Concurrent.ConcurrentDictionary`2.TryGetValue(TKey key, TValue& value)
at Kiota.Builder.KiotaBuilder.<>c__DisplayClass91_0.<GetDerivedDefinitions>b__0(CodeClass x) in /home/papegaaij/vscode/kiota/src/Kiota.Builder/KiotaBuilder.cs:line 1750
at System.Linq.Enumerable.SelectManySingleSelectorIterator`2.ToArray()
at Kiota.Builder.KiotaBuilder.GetDerivedDefinitions(ConcurrentDictionary`2 models, CodeClass[] modelsInUse) in /home/papegaaij/vscode/kiota/src/Kiota.Builder/KiotaBuilder.cs:line 1750
at Kiota.Builder.KiotaBuilder.<>c__DisplayClass91_0.<GetDerivedDefinitions>b__1(CodeClass x) in /home/papegaaij/vscode/kiota/src/Kiota.Builder/KiotaBuilder.cs:line 1751
at System.Linq.Enumerable.SelectManySingleSelectorIterator`2.MoveNext()
at System.Linq.Enumerable.UnionIterator`1.MoveNext()
at System.Linq.Enumerable.OfTypeIterator[TResult](IEnumerable source)+MoveNext()
at System.Linq.Enumerable.SelectManySingleSelectorIterator`2.MoveNext()
at System.Collections.Generic.HashSet`1.UnionWith(IEnumerable`1 other)
at System.Collections.Generic.HashSet`1..ctor(IEnumerable`1 collection, IEqualityComparer`1 comparer)
at System.Linq.Enumerable.DistinctIterator`1.ToArray()
at Kiota.Builder.KiotaBuilder.GetRelatedDefinitions(CodeElement currentElement, ConcurrentDictionary`2 derivedIndex, ConcurrentDictionary`2 inheritanceIndex, ConcurrentDictionary`2 visited, Boolean includeDerivedTypes) in /home/papegaaij/vscode/kiota/src/Kiota.Builder/KiotaBuilder.cs:line 1757
at Kiota.Builder.KiotaBuilder.<>c__DisplayClass86_0.<TrimInheritedModels>b__1(CodeClass x) in /home/papegaaij/vscode/kiota/src/Kiota.Builder/KiotaBuilder.cs:line 1695
at System.Linq.Parallel.SelectManyQueryOperator`3.SelectManyQueryOperatorEnumerator`1.MoveNext(TOutput& currentElement, Pair`2& currentKey)
at System.Linq.Parallel.HashRepartitionEnumerator`3.EnumerateAndRedistributeElements()
at System.Linq.Parallel.HashRepartitionEnumerator`3.MoveNext(Pair`2& currentElement, Int32& currentKey)
at System.Linq.Parallel.UnionQueryOperator`1.UnionQueryOperatorEnumerator`2.MoveNext(TInputOutput& currentElement, Int32& currentKey)
at System.Linq.Parallel.PipelineSpoolingTask`2.SpoolingWork()
at System.Linq.Parallel.SpoolingTaskBase.Work()
at System.Linq.Parallel.QueryTask.BaseWork(Object unused)
at System.Threading.ExecutionContext.RunFromThreadPoolDispatchLoop(Thread threadPoolThread, ExecutionContext executionContext, ContextCallback callback, Object state)
--- End of stack trace from previous location ---
at System.Threading.ExecutionContext.RunFromThreadPoolDispatchLoop(Thread threadPoolThread, ExecutionContext executionContext, ContextCallback callback, Object state)
at System.Threading.Tasks.Task.ExecuteWithThreadLocal(Task& currentTaskSlot, Thread threadPoolThread)
--- End of inner exception stack trace ---
at System.Linq.Parallel.QueryTaskGroupState.QueryEnd(Boolean userInitiatedDispose)
at System.Linq.Parallel.AsynchronousChannelMergeEnumerator`1.MoveNextSlowPath()
at System.Linq.Parallel.QueryOpeningEnumerator`1.MoveNext()
at System.Collections.Generic.HashSet`1.UnionWith(IEnumerable`1 other)
at System.Collections.Generic.HashSet`1..ctor(IEnumerable`1 collection, IEqualityComparer`1 comparer)
at System.Linq.Enumerable.ToHashSet[TSource](IEnumerable`1 source, IEqualityComparer`1 comparer)
at Kiota.Builder.KiotaBuilder.TrimInheritedModels() in /home/papegaaij/vscode/kiota/src/Kiota.Builder/KiotaBuilder.cs:line 1695
at Kiota.Builder.KiotaBuilder.CreateSourceModel(OpenApiUrlTreeNode root) in /home/papegaaij/vscode/kiota/src/Kiota.Builder/KiotaBuilder.cs:line 588
at Kiota.Builder.KiotaBuilder.GenerateClientAsync(CancellationToken cancellationToken) in /home/papegaaij/vscode/kiota/src/Kiota.Builder/KiotaBuilder.cs:line 207
at Kiota.Builder.KiotaBuilder.GenerateClientAsync(CancellationToken cancellationToken) in /home/papegaaij/vscode/kiota/src/Kiota.Builder/KiotaBuilder.cs:line 228
at kiota.Handlers.KiotaGenerationCommandHandler.InvokeAsync(InvocationContext context) in /home/papegaaij/vscode/kiota/src/kiota/Handlers/KiotaGenerationCommandHandler.cs:line 124
Other points of parallelism cause errors like duplicate properties and descriptions being different. For us, parallelism is only a hindrance. We really do not want those inconsistencies in the output. Actually, we don't want Kiota to run things in parallel at all, because we will mainly be running it on our CI/CD environment where runtime is less of an issue but high CPU usage is. Also, for our case the actual speedup isn't even measurable.
I decided to take a different approach: allow the number of threads to be specified via an environment variable. This gives the user the freedom to choose stability or speed. It also allows for easy testing of some of the bugs regarding unstable results.
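The environment-variable approach can be sketched generically like this (Python is used purely for illustration — kiota itself is C#, and the variable name and default below are placeholders, not the ones added in this branch):

```python
import os
from concurrent.futures import ThreadPoolExecutor

def worker_count(env_var="GENERATOR_MAX_WORKERS", default=5):
    # Read the desired parallelism from the environment; fall back to a
    # conservative default on missing or malformed values, and clamp to
    # at least 1 (1 meaning fully sequential, deterministic generation).
    try:
        n = int(os.environ.get(env_var, default))
    except ValueError:
        n = default
    return max(1, n)

def run_all(tasks):
    # tasks: list of zero-argument callables.
    n = worker_count()
    if n == 1:
        return [t() for t in tasks]  # sequential: stable ordering, no races
    with ThreadPoolExecutor(max_workers=n) as pool:
        return list(pool.map(lambda t: t(), tasks))
```

This keeps the fast path available for local builds while letting CI/CD environments pin the generator to a single worker for reproducible output.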
Thank you for taking the time to collect the metrics and for sharing your experience.
Our vision with kiota is that it's fast enough to be added as part of a local build tool chain (e.g. gradle task) without impacting the experience negatively. Part of that is achieved through the lock file and skipping unnecessary generations, part of that is achieved through ensuring we use resources efficiently.
I can tell your machine is powerful without knowing the specs; on my machine the parallel version for beta takes about 2.5 minutes, and it goes to 10+ minutes without parallelism. And my machine is far from substandard.
For properties getting added twice: this is something I solved, at least partially in this PR if defects are remaining, let's iron them out.
Again, removing parallelization will reduce the likeliness of race conditions, but not eliminate them. That's because the OpenAPI.net parsing library doesn't guarantee any kind of order in the object model when parsing a document. The better thing to do is to implement the generation in such a way it doesn't assume order when it's reading some information.
Yes, kiota right now has some race conditions going on (especially for TypeScript) which I'm trying to fix before our upcoming release. But Microsoft Graph (v1) has 2000+ models, with multiple levels of inheritance, interconnected properties all over the place, used in over 5000 endpoints. And we use the main line (not the public release) to generate our full SDKs every week.
As you can see in the latest Go pull request the vast majority of changes come from comments (which are scrapped from our public docs). Other than that we get a couple of changes from metadata changes, but we're not getting radical differences from one week to another that'd indicate a major race condition in the code.
Now, I'm not saying the current race conditions cannot have a larger impact on different descriptions, but rather than overhauling the parallel processing and seriously degrading performance, let's instead go after them.
I hope that lengthy explanation helps providing context.
The biggest problems that still remain with the parallelism are with the property conflict resolution, which will really require some form of locking on the CodeClasses. A thread must only start building properties on a CodeClass when properties on all super classes are completely built. I think this can be implemented in AddModelDeclarationIfDoesntExist and AddModelDeclaration. These methods must both not return incomplete existing classes, but block instead. I do fear a potential for deadlocks though.
Another major issue is in the generation of the inheritance tree (see my previous comments). This algorithm either misses out on types or crashes.
Can this change please be merged? That would at least allow me to continue my work on my own product without having to work on a branch of Kiota. Our OpenAPI spec is smaller than the Graph spec, but a lot more interconnected. This results in totally incorrect results all the time, making Kiota completely unusable. The changes in this PR should not affect the parallelism by much. You can tweak the default of 5, maybe -1 for unlimited? The change to drop AsParallel() in TrimInheritedModels didn't seem to impact performance on my machine, but maybe you know of a way to have it follow the configured number of threads?
BTW. I'm also working on more changes to get the output more stable. I've already identified a spot where a type name was incorrectly capitalized, which could result in changes in comments, like in models/group.go in the PR you linked. At the moment I'm tracking down a changing type description. My goal is to have Kiota generate fully stable code and I'm quite close.
Also for context this comment was written at about the same time of your prior one, and I had not seen it when I posted mine :) (GitHub still doesn't do systematic refresh...)
Thanks for wrapping this up. I had to go out running yesterday.
|
gharchive/pull-request
| 2023-07-05T14:21:05 |
2025-04-01T06:39:35.263721
|
{
"authors": [
"baywet",
"papegaaij"
],
"repo": "microsoft/kiota",
"url": "https://github.com/microsoft/kiota/pull/2853",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1219440123
|
Regex match IB NICs
Recent changes to the ip show regex made IB devices no longer show up in node.nics.
@squirrelsc What was the goal of the last change for this regex expression? Was it important to match only ethernet devices?
Example Output:
6: ib0: <BROADCAST,MULTICAST,UP,LOWER_UP> mtu 2044 qdisc mq state UP group default qlen 256
link/infiniband 00:00:09:27:fe:80:00:00:00:00:00:00:00:15:5d:ff:fd:33:ff:7f brd 00:ff:ff:ff:ff:12:40:1b:80:76:00:00:00:00:00:00:ff:ff:ff:ff
inet 172.16.1.118/16 brd 172.16.255.255 scope global ib0
valid_lft forever preferred_lft forever
inet6 fe80::215:5dff:fd33:ff7f/64 scope link
valid_lft forever preferred_lft forever
It fixed a case where the '\n' is not optional, so the regex failed on some formats. But the original one wasn't meant to include infiniband.
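For reference, a pattern that matches the header lines of both ethernet and infiniband devices could look roughly like this (Python for illustration; this is not lisa's actual regex):

```python
import re

# Match interface header lines from `ip addr show` regardless of the
# link type, capturing the interface name (eth0, ib0, ...).
nic_line = re.compile(r"^\d+: (?P<name>[\w.@-]+): <.*?> mtu \d+", re.MULTILINE)

sample = (
    "2: eth0: <BROADCAST,MULTICAST,UP,LOWER_UP> mtu 1500 qdisc mq state UP\n"
    "6: ib0: <BROADCAST,MULTICAST,UP,LOWER_UP> mtu 2044 qdisc mq state UP group default qlen 256\n"
)
names = [m.group("name") for m in nic_line.finditer(sample)]
```

Keying off the numbered header line rather than the `link/ether` vs `link/infiniband` detail line keeps IB devices in node.nics.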
|
gharchive/pull-request
| 2022-04-28T23:23:56 |
2025-04-01T06:39:35.270870
|
{
"authors": [
"kamalca",
"squirrelsc"
],
"repo": "microsoft/lisa",
"url": "https://github.com/microsoft/lisa/pull/1927",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1469440158
|
Demo time - How to run / try it yourself
Is demo from "Demo time - How to run / try it yourself" still relevant? I can't find those tools in code. As I understand I have to have the EventHub (with capture) and Azure blob storage to test the solution.
Hi mawasak, you're right, we'll provide a longer video walkthrough to demonstrate the overall setup. In which scenario are you planning to use the accelerator? Managed app, or SaaS?
@chgeuer - A deep dive video is becoming a common ask...
|
gharchive/issue
| 2022-11-30T10:59:09 |
2025-04-01T06:39:35.272272
|
{
"authors": [
"chgeuer",
"code4clouds",
"mawasak"
],
"repo": "microsoft/metered-billing-accelerator",
"url": "https://github.com/microsoft/metered-billing-accelerator/issues/136",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
707539636
|
MediaTransportControls progress is not updated when changing PlaybackSession.Position while playback is paused
Describe the bug
When the media playback is paused, the progress bar and the time information in the MediaTransportControls seek bar are not updated. The video content changes but not the other element of the MTC.
Steps to reproduce the bug
Repro app: MediaPlayerPositionChanged.zip
Steps to reproduce the behavior:
Create a basic app with a media player element:
<MediaPlayerElement
x:Name="player"
Grid.Row="1"
AreTransportControlsEnabled="True"
AutoPlay="True" />
Add a button to change the position of the media
private void OnAddThirtySeconds(object sender, RoutedEventArgs e)
{
var newPosition = player.MediaPlayer.PlaybackSession.Position + TimeSpan.FromSeconds(30);
if (newPosition > player.MediaPlayer.PlaybackSession.NaturalDuration)
{
newPosition = TimeSpan.Zero;
}
player.MediaPlayer.PlaybackSession.Position = newPosition;
}
Launch the application, open a media and pause it.
Change the playback position by calling OnAddThirtySeconds()
Expected behavior
The video frame displayed is the one at the new position, the seek bar and the elapsed duration text block have the new position value.
Actual behavior
The video image is the expected one but the seek bar and the elapsed duration are still displaying the time when the video was paused (1second in my screenshot):
As soon as I press play, the seek bar and the elapsed duration are updated with their expected value:
Additional information
This is working fine while playback is running.
@Austin-Lamb and @codendone I forget which team owns MPE, I think its reach?
I'm finding myself in the very same problem. Is there any update on this or a workaround you've found @vgromfeld ?
I know that an OS fix is coming but I don't know in which build it will be delivered.
Closing stale external bug. If you are still seeing this issue, please open a new issue in the Feedback Hub so that it may gather related logs/diagnostics.
|
gharchive/issue
| 2020-09-23T17:12:53 |
2025-04-01T06:39:35.279402
|
{
"authors": [
"BladeXR",
"StephenLPeters",
"bpulliam",
"vgromfeld"
],
"repo": "microsoft/microsoft-ui-xaml",
"url": "https://github.com/microsoft/microsoft-ui-xaml/issues/3316",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
853348782
|
ColorPicker's selection pointer and tooltip updated inversely (mirrored) in ring shaped color spectrum
Describe the bug
While using ColorSpectrumComponents value as either ValueHue or SaturationHue in ring shaped color spectrum , location of selection pointer and tooltip are mirrored to the location of mouse pointer
Sample:
ColorPicker_Issues.zip
Steps to reproduce the bug
Steps to reproduce the behavior:
Run the Sample
Select color from first and second ColorPicker's color spectrum.
Note: Selection pointer and tooltip are updated inversely (mirrored) in both ColorPickers.
Expected behavior
Selection pointer and tooltip should update properly to where the color region is selected (based on the current mouse pointer location).
Screenshots
Location of Selection pointer and tooltip are mirrored to the location of mouse pointer in the highlighted section.
Version Info
NuGet package version:
[WinUI 3 - Project Reunion 0.5 Preview: 0.5.0-prerelease]
Windows app type | Saw the problem?
UWP |
Win32 | Yes
Windows 10 version | Saw the problem?
Insider Build (xxxxx) |
October 2020 Update (19042) | Yes
May 2020 Update (19041) |
November 2019 Update (18363) |
May 2019 Update (18362) |
October 2018 Update (17763) |
April 2018 Update (17134) |
Fall Creators Update (16299) |
Creators Update (15063) |
Device form factor | Saw the problem?
Desktop | Yes
Xbox |
Surface Hub |
IoT |
Additional context
likely present in winui2 as well. I think we are likely just doing math wrong when these combinations of setting are used...
I'll take a look and see if we can fix this in WinUI 2.
|
gharchive/issue
| 2021-04-08T11:02:15 |
2025-04-01T06:39:35.292449
|
{
"authors": [
"StephenLPeters",
"chingucoding",
"prabakaran-sangameswaran"
],
"repo": "microsoft/microsoft-ui-xaml",
"url": "https://github.com/microsoft/microsoft-ui-xaml/issues/4765",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1525417809
|
ColorPicker eating a lot of CPU and preventing App from closing when outside of the debugger
Describe the bug
When the application is compiled and executed outside of the debugger showing a color picker will prevent that application from closing normally and increase CPU usage drastically once you try to close it. For my machine it will rise from 0% usage to hover around 20%.
Steps to reproduce the bug
Compile the included project.
Open the executable outside of the debugger.
Change the color.
Close the application by clicking the X in the menu bar.
Observe a rise in cpu usage and a task that is not closing in the Task-Manager.
Expected behavior
Simply having a ColorPicker in the application should not prevent it from closing nor increase the cpu load drastically when trying to close it.
Screenshots
Application screenshot:
Usual CPU load:
CPU load when trying to close the application:
NuGet package version
WinUI 3 - Windows App SDK 1.2.2: 1.2.221209.1
Windows version
Windows 10 (21H2): Build 19044
Additional context
Here is some example code which is the packaged sample project from the Visual Studio integration that I changed to a basic unpackaged and self contained application only showing a ColorPicker.
ColorPickerExample.zip
The outlined line is the only code change:
The outlined lines are the 2 options added to the csproj:
This closing issue may be related to the crash at the exit caused by the colorpicker under Windows 10 as reported in https://github.com/microsoft/microsoft-ui-xaml/issues/7239. See also details in https://github.com/microsoft/microsoft-ui-xaml/issues/7239#issuecomment-1320957451.
#7239 was resolved with 1.2.3, is this also fixed?
I updated the WindowsAppSDK Nuget package to 1.2.230118.102 and this seems to be fixed.
|
gharchive/issue
| 2023-01-09T11:38:21 |
2025-04-01T06:39:35.300490
|
{
"authors": [
"danielgruethling",
"gabbybilka",
"jschwizer99"
],
"repo": "microsoft/microsoft-ui-xaml",
"url": "https://github.com/microsoft/microsoft-ui-xaml/issues/8079",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1239982204
|
How to redirect the output of mimalloc to a file?
Hi, may I output these mimalloc's infomation to a file?
mimalloc: option 'show_errors': 1
mimalloc: option 'show_stats': 1
mimalloc: option 'eager_commit': 1
mimalloc: option 'eager_region_commit': 0
mimalloc: option 'reset_decommits': 0
mimalloc: option 'large_os_pages': 0
mimalloc: option 'reserve_huge_os_pages': 6
...
Looks like I can use mi_register_output:
mi_output_fun(*output_handler) = [](const char* msg, void* arg)
{
// output to a file named "mimalloc.log"
FILE* fp;
if (0 == fopen_s(&fp, "mimalloc.log", "a"))
{
fprintf(fp, "%s\n", msg);
fclose(fp);
}
};
mi_register_output(output_handler, nullptr);
But it will open/close the file every time when mimalloc flushes the buffer, which may harm the performance. Could you please provide a demo on how to use it?
Would this work?
mi_output_fun(*output_handler) = [](const char* msg, void* arg)
{
// output to a file named "mimalloc.log"
static FILE* fp;
static int initialized;
if (!initialized)
{
initialized = 1;
if (0 != fopen_s(&fp, "mimalloc.log", "a"))
fp = NULL;
}
if (fp)
{
fprintf(fp, "%s\n", msg);
fflush(fp);
}
};
mi_register_output(output_handler, nullptr);
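Putting it together, one option is to open the file lazily and keep it open, so each message only costs a write plus a flush. A rough C sketch — the handler signature mirrors mimalloc's mi_output_fun (void(const char* msg, void* arg)), and registering it via mi_register_output is assumed as in the snippets above:

```c
#include <stdio.h>

/* File handle opened on first use and kept open for the life of the
 * process; mimalloc calls the handler from its own code paths, so we
 * avoid the per-message open/close cost of the first snippet. */
static FILE* log_fp = NULL;

void mimalloc_log_handler(const char* msg, void* arg)
{
    (void)arg;                       /* unused registration argument */
    if (log_fp == NULL) {
        log_fp = fopen("mimalloc.log", "a");
        if (log_fp == NULL)
            return;                  /* open failed: drop the message */
    }
    fputs(msg, log_fp);
    fflush(log_fp);                  /* keep the log current without reopening */
}
```

With mimalloc linked in, you would register it with mi_register_output(mimalloc_log_handler, NULL), and optionally fclose(log_fp) from an atexit handler. fopen is used instead of fopen_s here for portability beyond MSVC.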
|
gharchive/issue
| 2022-05-18T13:37:46 |
2025-04-01T06:39:35.302796
|
{
"authors": [
"dscho",
"hbsun2113"
],
"repo": "microsoft/mimalloc",
"url": "https://github.com/microsoft/mimalloc/issues/585",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
386976823
|
Add an option to enable CORS for the editor css file
When the editor.main.min.css is loaded from a CDN, a crossorigin="anonymous" attribute needs to be added to the link node in order to allow CORS. Otherwise the css file is considered untrusted by the browser.
Is it possible to add the attribute to the link node, or to add a boolean option to add it?
Thanks,
Violet.
We closed this issue because we don't plan to address it in the foreseeable future. If you disagree and feel that this issue is crucial: we are happy to listen and to reconsider.
If you wonder what we are up to, please see our roadmap and issue reporting guidelines.
Thanks for your understanding, and happy coding!
|
gharchive/issue
| 2018-11-13T09:59:58 |
2025-04-01T06:39:35.306320
|
{
"authors": [
"hediet",
"sigalvo"
],
"repo": "microsoft/monaco-editor",
"url": "https://github.com/microsoft/monaco-editor/issues/1217",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
328753953
|
wordBasedSuggestions: false does not turn off word-based completion
monaco-editor version: 0.13.1
Browser: Version 66.0.3359.139
OS: Mac OS X 10.12.6
Though the original issue #334 is resolved, setting wordBasedSuggestions: false still does not turn off word-based completion.
The issue can be reproduced in the playground:
When I hit Ctrl+Space after // I expect to not see word-based completion items such as Hello.
The bug is fixed, we just need to ship a new version of the editor.
Hi, I can still repro this problem in 0.17
suppressing the word based suggestions works in typescript but not in javascript.
Any ideas for a workaround?
|
gharchive/issue
| 2018-06-02T15:42:41 |
2025-04-01T06:39:35.309623
|
{
"authors": [
"FintechOS",
"alexandrudima",
"aromanovich"
],
"repo": "microsoft/monaco-editor",
"url": "https://github.com/microsoft/monaco-editor/issues/907",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
341686139
|
Support other SQL Collations
It would be very useful if images were published for each major SQL collation.
In the meantime, I'm having to support my own image for Latin1_General_CI_AS and SQL_Latin1_General_CP1_CI_AS: https://hub.docker.com/r/christianacca/mssql-server-windows-express/tags/
You can configure the collation for the mssql-server-linux image for now. Once we complete the improvement for Windows containers they will have most or all of the same configuration settings as the Linux container image.
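For the Linux image, the collation is configurable at container creation time via the MSSQL_COLLATION environment variable; a sketch (the password and image tag below are placeholders, not recommendations):

```shell
# Run the Linux image with a non-default collation.
docker run -d \
  -e "ACCEPT_EULA=Y" \
  -e "MSSQL_SA_PASSWORD=YourStrong!Passw0rd" \
  -e "MSSQL_COLLATION=Latin1_General_CI_AS" \
  -p 1433:1433 \
  mcr.microsoft.com/mssql/server:2019-latest
```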
Hello, are there any updates on when this will actually happen? Has it happened?
Please sign up for SQL Server on Windows Containers EAP to receive the latest updates in this space.
|
gharchive/issue
| 2018-07-16T21:30:35 |
2025-04-01T06:39:35.312126
|
{
"authors": [
"RaphHaddad",
"amitmsft",
"christianacca",
"twright-msft"
],
"repo": "microsoft/mssql-docker",
"url": "https://github.com/microsoft/mssql-docker/issues/338",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1266574521
|
Calling Get-BCArtifactUrl for NextMinor returns the NextMajor artifact
Describe the issue
When running the Get-BCArtifactUrl for the nextminor artifact, it returns the nextmajor version instead.
Scripts used to create container and cause the issue
Get-BCArtifactUrl -country w1 -select NextMinor -sasToken $sasToken
Additional context
It happens all the time since 9th June 2022 3.20pm UK time.
It used to return the 20.2 artifact when the current version was 20.1
It looks like the 20.3 didn't start publishing - I have reported this to the team, who should look into this tomorrow first thing.
I see that 20.3 is now available. I'll close this ticket.
|
gharchive/issue
| 2022-06-09T19:39:05 |
2025-04-01T06:39:35.314744
|
{
"authors": [
"amea20",
"freddydk"
],
"repo": "microsoft/navcontainerhelper",
"url": "https://github.com/microsoft/navcontainerhelper/issues/2527",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
546804084
|
What does MutableScope mean?
What does mutables.MutableScope mean? When should we use this class? I see that it is only used in the ENAS example, but the enter_mutable_scope method is not overridden. Could you explain it? Thx!
Thanks @marsggbo for asking. The detailed document for the design of NNI NAS interface and how to write a NAS algorithm on NNI is still ongoing, and will be complete in release v1.4
For your question, we introduce MutableScope in the interface because some NAS algorithms require structural/hierarchical information, for example, which LayerChoices and InputChoice are within a cell. So a MutableScope is like a tree, it tells which LayerChoice, InputChoice, MutableScope are under it.
For ENAS controller, it adds one more time step in every boundary of cells, whose hidden state represents the corresponding cell. Some NAS algorithms do not care about the structural information, then they can simply ignore them.
For your last question, @ultmaster will give you detailed answer.
As for this extra step, I don't think it's mentioned in the paper; you can refer to the ENAS code for details.
Writing your own mutator is experimental and not written in docs yet, but FYI, to implement a new mutator, there are two ways:
Inherit BaseMutator and implement a bunch of callbacks, like on_forward_layer_choice, enter_mutable_scope, each will be triggered at some place during the forward place. This is the way #1863 adopted. As the control flow is implemented in a callback manner, this is for advanced users only. For your question, enter_mutable_scope and exit_mutable_scope is triggered on the enter and exit event of mutable scope. Implement them if your mutator can leverage that knowledge.
Inherit Mutator and implement only sample_search and sample_final, which correspond to sampling an architecture for search, and reporting an architecture for export, respectively. In this case you need to implement none of the callback functions, and the forward method is automatically based on your selected architecture. As the information of mutable scope is already embedded in the tree-structured search space, you can leverage that information when traversing this tree.
TLDR: in ENAS example, the mutable scope is used to construct the structured search space, and overriding callback functions is not needed.
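To make the second approach concrete, here is a self-contained toy sketch of the sample_search / sample_final shape. It deliberately does not import nni, and the class name, search-space keys, and candidate values are made up for illustration — this is not the real NNI Mutator API:

```python
import random

class ToyMutator:
    """Toy stand-in for a mutator: sample_search draws an architecture
    for the next trial, sample_final reports the one to export."""

    def __init__(self, search_space):
        # search_space maps a mutable's key to its list of candidates,
        # e.g. the choices of a LayerChoice or InputChoice.
        self.search_space = search_space
        self._last = None

    def sample_search(self):
        # Random search: pick one candidate per mutable.
        self._last = {key: random.choice(options)
                      for key, options in self.search_space.items()}
        return self._last

    def sample_final(self):
        # Here we simply export the last sampled architecture; a real
        # algorithm would return its best or converged choice.
        return self._last

space = {"conv_op": ["conv3x3", "conv5x5"], "skip": [True, False]}
mutator = ToyMutator(space)
arch = mutator.sample_search()
```

In the real framework the forward pass then follows the sampled choices automatically, so no callback overrides are needed.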
|
gharchive/issue
| 2020-01-08T11:28:52 |
2025-04-01T06:39:35.320697
|
{
"authors": [
"QuanluZhang",
"marsggbo",
"ultmaster"
],
"repo": "microsoft/nni",
"url": "https://github.com/microsoft/nni/issues/1935",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|