
Microsoft Azure Collective

Questions

Browse questions with relevant Microsoft Azure tags

10,629 questions

0 votes
1 answer
37 views

How to get the metadata (file shares list) of a storage account and configure the Copy Data activity for copying between storage accounts in ADF?

Following my previous Q&A: how do I get the file shares list in the Get Metadata activity and copy all the file shares and the data inside them as-is to the destination storage account? Linked service for source ...
Jashua Criss
Answer

The issue is because you have the same dataset for both the Get Metadata activity and the source of the Copy activity. You can change the dataset of the Get Metadata activity: click +New in the settings of the Get Metadata ...

Aswin • 6,661
0 votes
1 answer
27 views

Listing the tables in a storage account using an ADF Web activity fails with an authorization error

I have added the source storage account SAS token as a parameter as shown below. Then I added a Web activity to list the tables of the storage account, which has only 500 tables, but I'm getting ...
Jashua Criss
Answer Accepted

Make sure you have provided the correct SAS. While generating the SAS, grant the following permissions as shown below. Provide the Web activity URL as mentioned below: @concat('https://<storageAccountName>.table....
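For reference, a minimal Python sketch of the URL that the truncated @concat() expression assembles. The account name and SAS token below are placeholder values, not taken from the original post.

```python
# Build the Table service list URL, appending the SAS query string.
# Placeholder account name and SAS token -- substitute your own values.
def table_list_url(account_name: str, sas_token: str) -> str:
    return f"https://{account_name}.table.core.windows.net/Tables?{sas_token.lstrip('?')}"

url = table_list_url("mystorageacct", "?sv=2022-11-02&ss=t&sig=placeholder")
print(url)
```

The same string is what the ADF Web activity would call with a GET request.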

Bhavani • 4,002
1 vote
2 answers
50 views

Unable to InferSchema with Databricks SQL

When I attempt to create a table with Databricks SQL I get the error: AnalysisException: Unable to infer schema for CSV. It must be specified manually. %sql CREATE TABLE IF NOT EXISTS newtabletable ...
Patterson • 2,599
Answer

AnalysisException: Unable to infer schema for CSV. It must be specified manually. According to the MS documentation, Databricks recommends the read_files table-valued function for SQL users to read CSV ...

Bhavani • 4,002
0 votes
1 answer
35 views

No value provided for parameter `container_name` in the Get Metadata1 activity

I have added the linked service and a dataset pointing to the blob storage account, but I don't know what value to give the container_name parameter to copy all the containers. So, when I define the ...
Jashua Criss
Answer Accepted

It seems like you are using the same source dataset that you have given for the Copy activity in the Get Metadata activity as well. To list the container names using the Get Metadata activity, you should ...

Rakesh Govindula
0 votes
1 answer
47 views

How to grant all users of a CLI application with `DeviceCodeCredential` access to resources granted to the application?

I'm writing a CLI application for our developers to use for administrative tasks. The intention is for the CLI to pull in secrets etc. from an Azure Key Vault using the Key Vault SDK. Given that it's ...
Felix ZY • 786
Answer

Note that the DeviceCodeCredential flow involves user interaction and works with the signed-in user's roles. If you want to use the application identity instead, switch to the client credentials flow. Initially, ...

Sridevi • 18.3k
0 votes
1 answer
35 views

I have a file named "epackage - Sell In and Sell Out - Master - NS CBEC_20240711164411.xlsx" - ADF Copy activity dynamic expression

I have a file named "epackage - Sell In and Sell Out - Master - NS CBEC_20240711164411.xlsx". While loading it into storage through a Copy activity, I need a dynamic expression in the sink to reflect ...
Myxtacy
Answer

You can try the below expression to achieve your requirement. @{replace(concat(substring(variables('filename'), 0, lastIndexOf(variables('filename'), ' ')),'_',last(split(concat(substring(variables('...
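The full ADF expression is truncated above, so the following Python sketch mirrors only its visible part: keep everything before the last space, then join it to the last space-separated token with an underscore. This is an approximation of the expression's intent, not a verified reproduction.

```python
def rename(filename: str) -> str:
    # substring(filename, 0, lastIndexOf(filename, ' ')) -> everything before last space
    head = filename[:filename.rfind(" ")]
    # last(split(filename, ' ')) -> the trailing token (timestamped .xlsx suffix)
    tail = filename.split(" ")[-1]
    return f"{head}_{tail}"

print(rename("epackage - Sell In and Sell Out - Master - NS CBEC_20240711164411.xlsx"))
```

The outer replace(...) in the original expression is cut off, so any further substitution it performs is not reflected here.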

Rakesh Govindula
0 votes
1 answer
55 views

How to copy all the containers between two different storage accounts without explicitly listing them in an array

https://stackoverflow.com/a/78351368/22054564 The solution to my previous question changed the names of the containers in the target storage account, but now I do not want to change them. I have ...
Jashua Criss
Answer Accepted

To achieve your desired requirement, you can follow the procedure below. Create a Binary dataset with the source linked service, leaving the file path empty as shown below. Add a Get Metadata activity to the ...
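The step that follows the truncation is typically iterating the Get Metadata activity's childItems output in a ForEach and feeding each name to the Copy activity. A small offline Python sketch of that shape (the sample childItems payload is hypothetical):

```python
# Hypothetical Get Metadata output: childItems is an array of {name, type}
# objects; in ADF a ForEach would iterate it and pass item().name to Copy.
get_metadata_output = {
    "childItems": [
        {"name": "container1", "type": "Folder"},
        {"name": "container2", "type": "Folder"},
    ]
}

# Equivalent of @activity('Get Metadata1').output.childItems with item().name
names = [item["name"] for item in get_metadata_output["childItems"]]
print(names)
```

Each name would parameterize the source and sink dataset container, so no container list is hard-coded.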

Bhavani • 4,002
1 vote
1 answer
46 views

How can I define the title for each document I import into Azure OpenAI?

I imported some text files into Azure OpenAI: After the import, I see a "title" field used for search: which I can't edit via UI as it's greyed out: How can I define the title for each ...
Franck Dernoncourt
Answer

The Azure OpenAI On Your Data API doesn't support that kind of modification to AI Search; it only gives you the results for the search query, with citations based on the AI Search data. To modify the fields ...

JayashankarGS
0 votes
1 answer
32 views

Getting net::ERR_NAME_NOT_RESOLVED error on the Flutter Azure authentication redirect URL

I am setting Azure authentication with flutter. The Login function is below Future<void> loginWithAzure(context) async { config = Config( tenant: 'edfdf2d2-ef52-4a80-a647-...
paseo Market
Answer

AndroidManifest.xml: <activity android:name="com.yourcompany.yourapp.MainActivity"> <intent-filter android:label="flutter_defaultIntentFilters"> <action ...

Suresh Chikkam
1 vote
1 answer
47 views

Replicating Azure SQL DB and blob storage in multi region setup

I'm working on a web app using Azure with a multi-region architecture to ensure high availability and low latency for all users around the globe. My problems are data consistency, replication lag, ...
kiruthikpurpose
Answer Accepted

Have you tried Azure SQL Data Sync for replicating Azure SQL Database between regions? That could be an option. If you find out the latency with SQL Data Sync and Geo-replication is too high, then you ...

Alberto Morillo
0 votes
1 answer
31 views

How to bind trigger parameters to appsettings.json under an isolated Azure Function

When running under the in-process function model, you could resolve trigger parameters from appsettings.json - assuming it's been configured as a custom configuration source as defined here. E.g. ...
Samuel • 38
Answer

Set the schedule expression as an app setting in local.settings.json and use this property in the function code by wrapping it with % signs, i.e., %ScheduleTriggerTime%. I created a .NET 8.0 isolated Azure ...

Pravallika KV
0 votes
1 answer
36 views

Foreign key constraint lost in PostgreSQL table in Azure when data is added using R Script

I have a table in a PostgreSQL database hosted on Azure. In this table, I have created foreign key constraints on multiple columns. I have a script in R which tries to push the contents of a ...
Moohoo • 9
Answer

To ensure that foreign key constraints are not lost when updating the data in your PostgreSQL table, you should avoid using the overwrite = TRUE option in dbWriteTable as it drops and recreates the ...

Bhavani • 4,002
0 votes
2 answers
54 views

Rename SharePoint site with Graph API or REST

I am looking for the ability to programmatically change the name of a SharePoint site with either the Graph API or SharePoint REST (or related PowerShell modules). I have tried "Rename-PnPTenantSite&...
ryankennedy712
Answer

What's the difference between "Name" and "Site Name"? Whenever you create a team site in SharePoint, a Microsoft 365 group will also be created automatically with the same name as the site. ...

Sridevi • 18.3k
0 votes
1 answer
33 views

Unable to create external data source using Azure SQL Database Emulator

I am trying to use the Azure SQL Database Emulator described here: https://learn.microsoft.com/en-us/azure/azure-sql/database/local-dev-experience-sql-database-emulator?view=azuresql I am trying to ...
aorphan • 181
Answer

The error you are getting is because the External Data Source type RDBMS is only supported in Azure SQL, not in a local SQL Server. While publishing or building the project, if you select the Publish to new ...

Pratik Lad • 7,007
0 votes
1 answer
44 views

What does adding/using Application Insights from Visual Studio do?

I have added Application Insights to my app by following the Blazor server instructions here. As listed, I added the NuGet package Microsoft.ApplicationInsights.AspNetCore. However, in Visual Studio, ...
David Thielen
Answer Accepted

What does adding/using Application Insights from Visual Studio do? Adding Application Insights from Visual Studio provides an option to create/select the Application Insights Resource and updates the ...

Harshitha • 6,558
0 votes
1 answer
38 views

Can I cache BlobContainerClient?

I am constantly building the full url to a BLOB with the following: var (containerName, blobName) = SplitFullname(filename); var container = ServiceClient.GetBlobContainerClient(containerName); var ...
David Thielen
Answer Accepted

Can I cache BlobContainerClient? I agree with Andrew B's comment: the GetBlobContainerClient method is a lightweight call, so you may not notice a significant performance difference by caching the ...

Venkatesan • 7,826
0 votes
1 answer
26 views

Load a Registered Model in Azure ML Studio in an Interactive Notebook

I'm using Azure Machine Learning Studio and I have an sklearn mlflow model stored in my default datastore (blob storage) which I have then registered as a model asset. How can I load this model inside ...
Matt_Haythornthwaite
Answer

According to this documentation any one of the paths should be given for loading the model. /Users/me/path/to/local/model relative/path/to/local/model s3://my_bucket/path/to/model runs:/<...

JayashankarGS
0 votes
1 answer
23 views

Terraform - Azure Data Factory pipeline empty after running successfully

I'm currently testing Terraform to deploy Azure Data Factory pipelines. I generate the pipeline JSON file in Azure Data Factory, then copy it into my repo. When running terraform apply, it generates ...
Alexander Hanikel
Answer

While creating an Azure Data Factory pipeline using the azurerm_data_factory_pipeline Terraform resource, you should not give the entire pipeline JSON in the activities_json part of the Terraform code. You can copy only ...

Aswin • 6,661
0 votes
1 answer
44 views

Conditional sorting in Azure Data Factory

Let's assume I have 3 columns (id, language, name). I have multiple rows with the same ID but different language and name values. E.g.: id, language, name 123, German, Blume 123, English, Flower 123, ...
Julian • 99
Answer

You can follow the below approach to achieve your requirement. First create an array parameter cond_arr in the dataflow with your order values ["German","English","French"...
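The priority-array idea above can be sketched offline in Python: for each id, keep the row whose language appears earliest in cond_arr. This is an equivalent of the dataflow logic, not the dataflow itself.

```python
# Priority order, mirroring the dataflow parameter cond_arr described above.
cond_arr = ["German", "English", "French"]

rows = [
    {"id": 123, "language": "English", "name": "Flower"},
    {"id": 123, "language": "German", "name": "Blume"},
]

def pick(rows, priority):
    """For each id, keep the row whose language ranks earliest in priority."""
    best = {}
    for r in rows:
        rank = priority.index(r["language"]) if r["language"] in priority else len(priority)
        if r["id"] not in best or rank < best[r["id"]][0]:
            best[r["id"]] = (rank, r)
    return [r for _, r in best.values()]

print(pick(rows, cond_arr))
```

In the dataflow the same effect comes from ranking on the array index and filtering rank 1 per id.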

Rakesh Govindula
0 votes
1 answer
60 views

Not able to pass inputs as array to Terraform

Using the below code, I am able to pass object IDs as an array in Terraform, but not from the Terragrunt input file, and I have the following error. Cloud - Azure, service - Key Vault. Main.tf ...
Ashraf Baig
Answer

Passing inputs as an array to Terraform from a Terragrunt file: the blocker you're facing is because of the way you define object_ids in each policy, where it is required to be a set of strings, but the way you'...

Vinay B • 1,007
0 votes
1 answer
78 views

Max retry exceeded when using DeltaTable with Azure Blob Storage

I'm facing an issue when using the deltalake lib to save / loading data to Azure Blob Storage. Sometimes, I'm getting the following error: DatasetError: Failed while saving data to data set ...
Erick Figueiredo
Answer

You can control the number of retries and timeouts as shown below: datalake_vale = { 'account_name': account_name, 'client_id': client_id, 'tenant_id': tenant_id, 'client_secret': ...

Bhavani • 4,002
0 votes
1 answer
55 views

What's the resource group equivalent of subscriptionResourceId and managementGroupResourceId in Bicep?

In Bicep, I can use subscriptionResourceId() or managementGroupResourceId() to refer to a resource at the subscription or management group level, respectively. In the following code, I'm using ...
Shuzheng • 12.8k
Answer

How would I get the resource's resource group ID? Here is the updated code to fetch the resourcegroup ID from a resource (app service) and assign a role to the app service identity at the resource ...

Venkat V • 5,347
0 votes
1 answer
32 views

Move multiple folder structures along with files from one blob container to another in a different storage account using a Logic App

I have a blob container that contains multiple folders, subfolders and files. I have to create a Logic App to copy the folders, subfolders and files to another container in a different storage account. I used ...
Hemant Kumar
Answer Accepted

You can use the below design, which worked for me. Output: ...

RithwikBojja
1 vote
1 answer
57 views

Databricks Performance Tuning with Joins around 15 tables with around 200 Million Rows

As part of our Databricks notebook, we are trying to run SQL joining around 15 Delta tables with 1 fact and around 14 dimension tables. The data coming out of the joins is around 200 million records. ...
Nanda • 51
Answer

A broadcast join, also known as a map-side join, is a join execution strategy that distributes the join operation across cluster nodes. It is highly efficient for joining a large table (fact table) ...

Dileep Raj Narayan Thumula
0 votes
1 answer
36 views

When storing secrets within a key vault (such as 1Password developer tools or Azure Key Vault), how do you properly protect the key vault's secrets?

When looking to avoid keeping secrets in code (so they don't get checked into source control) and having them encrypted in some fashion (so they're not exposed if a machine is compromised), I'm ...
Owen • 3
Answer

You are not missing a key piece. Outside of the Azure environment, with no possibility of using managed identities, you need a service principal to access the key vault. You can configure the Azure ...

Peter Bons • 28.9k
0 votes
1 answer
33 views

Azure Function Trigger Not Processing Images in PDFs Correctly

I'm encountering an issue with my Azure Function trigger, which processes files uploaded to Azure Blob Storage and indexes them into Azure Cognitive Search. The function works perfectly for most file ...
Su Myat • 13
Answer

To resolve the failure to process images within PDFs, add the Text Merge skill to the existing skillset. Image processing is indexer-driven, which means that the raw inputs must be in ...

Sampath • 2,805
0 votes
1 answer
57 views

Use pfx certificate to get Microsoft Entra token - Error: secretOrPrivateKey must be an asymmetric key when using RS256

I tried different ways to get an Azure AD token using a pfx certificate. But the code below is persistently returning this error: Error: secretOrPrivateKey must be an asymmetric key when using RS256 I ...
Cristobal BL
Answer

In my case, I ran below OpenSSL commands to create private key and certificate like this: openssl genrsa -out sridemo.pem 2048 openssl req -new -key sridemo.pem -out sridemo.csr openssl x509 -req -...

Sridevi • 18.3k
0 votes
1 answer
16 views

Azure Traffic Manager and blue/green deployments

I found this blog post by Microsoft: https://azure.microsoft.com/en-us/blog/blue-green-deployments-using-azure-traffic-manager/ And am finding statements in that post contradict the Azure Traffic ...
cptully • 687
Answer Accepted

To stop all traffic, why don't you just disable the endpoint? It is also offered as the solution in the post you mention: in this example we set the first endpoint Blue.contoso.com with a weight of 1,...

Peter Bons • 28.9k
0 votes
1 answer
36 views

Server failing to authenticate request when using Azurite to copy blob from one container to another

I am getting the error: AuthorizationFailure, (Exception: HttpResponseError: Server failed to authenticate the request. Make sure the value of the Authorization header is formed correctly including ...
Siri Mudunuri
Answer Accepted

AuthorizationFailure, (Exception: HttpResponseError: Server failed to authenticate the request. Make sure the value of the Authorization header is formed correctly including the signature.) According ...

Venkatesan • 7,826
0 votes
1 answer
31 views

Spring Boot auto-configured Azure Event Hubs handshake issue

I am writing an application using Spring Boot auto-configuration that tries to send a message to Azure Event Hubs using an SPN for authentication, to no avail, as I'm getting a cert issue. My code looks ...
Klam • 151
Answer

Here, you need to change the above application.yml file format; additionally, a slight adjustment is needed, as you can check in the steps below. application.yml: spring: cloud: azure: ...

Suresh Chikkam
0 votes
1 answer
39 views

SharePoint List to Azure Synapse

I am trying to create a Copy activity in Azure Synapse Analytics that copies data from a SharePoint List. I already have an app registered and a linked service created. My issue is that once the data ...
Ahmed ELMANIALAWY
Answer

As per the documentation, for the Lookup data type in the source you need to give Int32 as the target data type, and for Yes/No, give the Boolean data type. There is no complex data type in the documentation. So, ...

Rakesh Govindula
1 vote
2 answers
51 views

Spark reading CSV with bad records

I am trying to read a csv file in spark using a pre-defined schema. For which I use: df = (spark.read.format("csv") .schema(schema) .option("sep", ";") ...
Tarique • 579
Answer

You can use the .option("mode", "DROPMALFORMED") to skip bad rows. df = sqlContext.read \ .format("com.databricks.spark.csv") \ .option("header", "...

Dileep Raj Narayan Thumula
0 votes
1 answer
49 views

Calculation of outlier score in series_outlier method

I want to implement the series_outlier method in Python & used the following code import pandas as pd import numpy as np from scipy.stats import norm # Load the data into a DataFrame data = { ...
New2015 • 29
Answer Accepted

You can use the below code to implement the series_outliers method: import pandas as r from scipy.stats import norm as r_nm import numpy as rn rith_test = { 'r_sr': [67.95675, 58.63898, 33.59188, 4906.018, 5....
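For intuition, Kusto's series_outliers() is based on Tukey's fence test over the quartiles. A simplified stdlib-only sketch (Kusto's exact scaling may differ, so treat the score values as illustrative):

```python
from statistics import quantiles

def outlier_scores(series, k=1.5):
    """Score points outside the Tukey fences [q1 - k*iqr, q3 + k*iqr];
    in-fence points score 0 (simplified model of series_outliers)."""
    q1, _, q3 = quantiles(series, n=4, method="inclusive")
    iqr = q3 - q1
    if iqr == 0:
        return [0.0 for _ in series]
    scores = []
    for x in series:
        if x > q3 + k * iqr:
            scores.append((x - q3) / iqr)    # positive score: high outlier
        elif x < q1 - k * iqr:
            scores.append((x - q1) / iqr)    # negative score: low outlier
        else:
            scores.append(0.0)
    return scores

print(outlier_scores([67.95, 58.63, 33.59, 4906.0, 55.0]))
```

Here 4906.0 gets a large positive score, matching the sample series in the answer.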

RithwikBojja
0 votes
1 answer
73 views

Azure Function .Net Code Issue for HttpRequest and BlobServiceClient Assembly or namespace

I have the below Azure Function code to read an Excel file from Azure Blob Storage, but it is giving the 2 errors below. I am using .NET Framework 4.8 and I also have the System.Net.Http package added to my ...
SRP • 1,081
Answer

I have made a few modifications to your code and am able to read the Excel file from Azure Blob Storage. Code snippet: //using .. using Azure.Storage.Blobs; using OfficeOpenXml; using System.Text; public ...

Pravallika KV
0 votes
1 answer
29 views

Scheduled alert rule created in Terraform doesn't work

I am trying to create a scheduled alert rule in Terraform. This is my code: resource "azurerm_monitor_scheduled_query_rules_alert_v2" "failed_alert" { name = "test&...
michasaucer • 4,980
Answer Accepted

Adding the kind property is not strictly required when you are scheduling a monitor log alert query, because the azurerm_monitor_scheduled_query_rules_alert_v2 resource itself states that it is ...

Jahnavi • 6,221
0 votes
1 answer
26 views

Azure Service Bus - Differentiating between Messages on the same Topic

I'm relatively new to Azure Service Bus (and message queueing in general), and have run into technical detail that seemed like a simple question but without a solution clearly available online. The C# ...
marcuthh • 596
Answer

There are two options I can think of for this scenario. Subscription/Function per message type Single subscription/Function for all messages Regardless of the method you choose, you'll need to stamp ...
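Either option relies on stamping each message with a type and dispatching on it. An offline Python sketch of that dispatch, where plain dicts stand in for ServiceBusReceivedMessage and "MessageType" is an assumed application-property name:

```python
# Handlers keyed by message type -- one per subscription/Function, or all in
# a single Function, depending on which of the two options you choose.
HANDLERS = {
    "OrderCreated": lambda body: f"order: {body}",
    "InvoicePaid": lambda body: f"invoice: {body}",
}

def dispatch(message: dict) -> str:
    """Route on the stamped application property (assumed name 'MessageType')."""
    msg_type = message["application_properties"]["MessageType"]
    return HANDLERS[msg_type](message["body"])

print(dispatch({"application_properties": {"MessageType": "OrderCreated"}, "body": "42"}))
```

With separate subscriptions, the same property would instead drive a subscription filter rule so each Function only ever sees its own type.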

Sean Feldman • 25.4k
-1 votes
1 answer
47 views

How can I import my Automation account job schedule into Terraform?

How can I import the job schedule of my Automation account runbook so it can be managed from Terraform? We created it outside our IaC and now we need it to be managed from Terraform so that ...
Dielam • 15
Answer

Importing an Automation account job schedule into Terraform: the error you're facing is because of the way you are providing the import command. To import the job schedule, we need to specify the ID of the job ...

Vinay B • 1,007
0 votes
1 answer
59 views

How to create a linked service in Azure Data Factory for an on-premises REST API

I have an on-premises API which is only accessible within a private network. I have to create a linked service for this API for a client. For testing purposes I created a simple Flask API in Python running ...
Ajax • 169
Answer

You can check the connection directly in the integration runtime manager only. Below are the steps to check. Open Integration runtime configuration manager. Click Diagnostics tab in the dialog box. ...

Aswin • 6,661
0 votes
1 answer
53 views

Unable to create Win11 VM with OS Image under 126GB

My goal is to create a Win11 image in Azure with an OS disk size of 80 GB (so it can be used as an image for AWS WorkSpaces). I can see that the UI in Azure does not let me create a VM with a disk size ...
degan93
Answer

I agree with Julian Hüppauff: you can use Hyper-V to create a VM with a disk size of 80 GB and install the OS on the same disk. In Hyper-V, you have more control over the disk size. In Azure, if you ...

Venkat V • 5,347
0 votes
1 answer
54 views

Why does PublicClientApplicationBuilder require a client (app) secret?

My final goal is to manage the user's access token via the refresh token. I need a token from the user, so I tried this logic: var app = PublicClientApplicationBuilder.Create(clientId) .WithRedirectUri(redirectUri)...
손동진
Answer Accepted

I agree with @TinyWang, PublicClientApplicationBuilder does not require client secret. Refer this MS Document. The error occurs if you missed enabling public client flow option or added redirect URI ...

Sridevi • 18.3k
0 votes
1 answer
83 views

Databricks Generating Error: AnalysisException: [ErrorClass=INVALID_PARAMETER_VALUE] Missing cloud file system scheme

When I attempt to create or save a table to a location in my Azure Datalake Gen 2 using example code: %sql CREATE TABLE IF NOT EXISTS events USING parquet OPTIONS (path "/mnt/training/ecommerce/...
Patterson • 2,599
Answer Accepted

INVALID_PARAMETER_VALUE] GenerateTemporaryPathCredential uri /mnt/files/Iris.parquet is not a valid URI. Error message: INVALID_PARAMETER_VALUE: Missing cloud file system scheme. When working with ...

Bhavani • 4,002
0 votes
1 answer
25 views

After upgrading Azure AI Search to v11, the new SearchOptions does not provide an option to specify the number of fields we can highlight

We recently upgraded Azure AI Search (ACS) to version 11. We now have a new class, SearchOptions. We are not able to find the required property where we can set the highlight count on a field. ...
chetan s
Answer

As per this document, below are the requirements for using highlighting: fields must be Edm.String or Collection(Edm.String), and fields must be attributed as searchable. If not, please configure them to meet ...

JayashankarGS
0 votes
1 answer
73 views

Using Azure Text Analytics to Detect Language per Sentence Instead of Paragraph

I am new to Azure. I want to use Azure AI's language detection service, which I have set up using the following code on my local machine: def authenticate_client(): endpoint = os.environ["...
Ibrar Babar
Answer

The Text Split cognitive skill in Azure is not what you are thinking of. Please follow this documentation to understand what Azure AI Search actually is. But if you wish to combine both services, it is possible ...
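A common workaround is to split the text into sentences client-side and call language detection once per sentence. In this sketch the detector is injected as a callable so the example runs offline; in practice you would pass a function wrapping the Text Analytics client's detect_language call.

```python
import re

def detect_per_sentence(text, detect):
    """Split on sentence-ending punctuation, then detect each piece.
    `detect` is any callable mapping a sentence to a language code."""
    sentences = [s for s in re.split(r"(?<=[.!?])\s+", text.strip()) if s]
    return [(s, detect(s)) for s in sentences]

# Stand-in detector for the offline example (real code would call Azure).
fake_detect = lambda s: "de" if "ist" in s else "en"
print(detect_per_sentence("This is English. Das ist Deutsch.", fake_detect))
```

Note that per-sentence calls multiply request volume; batching sentences into one documents array per request keeps this cheap.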

JayashankarGS
0 votes
2 answers
77 views

Azure DevOps release pipeline has suddenly started failing

2024-07-09T23:11:19.7718791Z Complete getting Artifacts From Template 2024-07-09T23:11:19.7719155Z Start deploying artifacts from the template. 2024-07-09T23:11:19.7722609Z Deploy ...
Su1tan • 47
Answer Accepted

Based on the error message, we can conclude that the deployment failed with the error "ClientIpAddressNotAuthorized", which means that the request was forbidden because the workspace was not ...

Nandan • 4,693
0 votes
1 answer
56 views

Unable to setup continuous deployment in new Azure Container App due to "deny assignment" error

I've successfully completed the tutorial for containerized functions on Azure Container Apps. Now when I go to setup up Github Actions for continuous deployment from the container app in the Azure ...
Peter Radocchia
Answer Accepted

Failed to set up continuous deployment with error: The client '{email-address}' with object id '{user-uuid}' has permission to perform action 'Microsoft.Resources/deployments/write' on scope '/...

Pravallika KV
0 votes
1 answer
56 views

Unable to create a Delta table in Databricks Premium; no problems creating a Delta table in the Databricks Community edition

I have created a new Databricks Premium instance in Azure, and when I attempt to create a Delta table with the following PySpark code: if chkDir == 'False' or chkTbl == False: ent....
Patterson • 2,599
Answer Accepted

When I tried to register the Delta table using the deltadf = DeltaTable.forName(spark, f"{stageName}.{regName}") Error's: AnalysisException: `BASEsqlArea2`.`Country_users` is not a Delta ...

Dileep Raj Narayan Thumula
0 votes
1 answer
36 views

"Operation not permitted" when trying to compile code in Databricks

I am trying to compile my code in Databricks using the following code: import subprocess process = subprocess.Popen(["python", "setup.py", "bdist_wheel"], ...
Iqram Choudhury
Answer

When you are using a shared cluster, this error occurs due to permission constraints on shared clusters. The recommended ways of installing or importing custom modules are given in this documentation. In ...

JayashankarGS
0 votes
1 answer
50 views

How to register a device in Entra ID programmatically

I'm trying to register a device programmatically with Entra ID (formerly known as Azure Active Directory) for my organization's internal application. I've read through the official Microsoft ...
service1 user
Answer

To create and register a new device in the organization, check the below: Create a Microsoft Entra ID application and grant Directory.AccessAsUser.All delegated API permission: Generate auth-code by ...

Rukmini • 11.7k
0 votes
1 answer
57 views

Verify token from Microsoft Entra ID fails?

I have set up IdentityServer4 with the quickstart template to authenticate the user against Microsoft Entra ID (with FIDO2). IdentityServer4 gets a token back and the verification starts in the ...
Banshee • 15.7k
Answer Accepted

Based on your code, you are generating access token for Microsoft Graph API: options.Scope.Add("openid"); options.Scope.Add("profile"); options.Scope.Add("email"); And ...

Rukmini • 11.7k
0 votes
1 answer
45 views

Send queue data with a custom expiration in Python

Currently I'm trying to send data to an Azure Storage queue. I am able to send data, but I am unable to customize the expiration time, e.g. to 10 seconds or 1 day. But in the Azure portal I am able to do ...
sai • 43
Answer

I am able to send data, but I am unable to customize the expiration time, e.g. to 10 seconds or 1 day, though in the Azure portal I am able to do it (see the portal image below). Here is the code to send a message ...
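For reference, QueueClient.send_message in azure-storage-queue accepts a time_to_live argument in seconds. A small stdlib helper makes "10 seconds" or "1 day" explicit; the actual send call is shown commented since it needs a real connection string, and the queue name is a placeholder.

```python
from datetime import timedelta

def ttl_seconds(delta: timedelta) -> int:
    """Convert a timedelta into the whole seconds expected by time_to_live."""
    return int(delta.total_seconds())

# Usage sketch (requires azure-storage-queue and a real connection string):
# from azure.storage.queue import QueueClient
# queue_client = QueueClient.from_connection_string(conn_str, "myqueue")
# queue_client.send_message("hello", time_to_live=ttl_seconds(timedelta(days=1)))

print(ttl_seconds(timedelta(days=1)), ttl_seconds(timedelta(seconds=10)))
```

Passing time_to_live=-1 requests a message that never expires, per the service's messages documentation.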

Venkatesan • 7,826

