12-11-2023 09:14 AM
We have set up the metastore with a managed identity, and when trying to create a managed table in the default location I am hitting the error below. The storage is ADLS Gen2.
AbfsRestOperationException: Operation failed: "This request is not authorized to perform this operation.", 403, GET, https://xxxxxxx(masked intentionally).dfs.core.windows.net/unity-catalog?upn=false&resource=filesystem&maxResults=5000&directory=xxxx((masked intentionally))/tables/3b26d0dd-43b9-45cf-afb1-3d4d83246851/_delta_log&continuation=NTU4NjA5OTE0NzQwMTQ1MDUyNyAwIDAwMDAwMDAwMDAwMDAwMDAwMDA=&timeout=90&recursive=false&st=2023-12-11T16:31:06Z&sv=2020-02-10&ske=2023-12-11T18:31:06Z&sig=XXXXX&sktid=e9aef9b7-25ca-4518-a881-33e546773136&se=2023-12-11T17:51:09Z&sdd=3&skoid=b9083a16-f310-4656XXXXXXXXXXXXXXXXXX&spr=https&sks=b&skt=2023-12-11T16:31:06Z&sp=rl&skv=2020-02-10&sr=d, AuthorizationFailure, "This request is not authorized to perform this operation. RequestId:
Accepted Solutions
02-05-2024 09:52 PM
Hi @Kris2, I completely understand your hesitation and appreciate your approach to seeking guidance!
This error generally means that the cluster has connectivity to the storage location configured for Unity Catalog but is not authorized to access the storage. It can happen for the following reasons:
- the access connector does not have the right role,
- storage firewall issues, or
- wrong storage credentials/Spark Azure keys being used.
You can check the following to resolve this error:
- Verify that the service principal/managed identity in use has access to the storage account. It should have the `Storage Blob Data Contributor` role assigned at the storage account level.
If you can't grant the `Storage Blob Data Contributor` role to the SP/MI at the storage account level but only at the container level, you can assign the `Storage Blob Delegator` role at the storage account level and the `Storage Blob Data Contributor` role at the container level (see the CLI sketch after this list).
Please refer to https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/azure-managed-ident... for details.
- If the access mechanism is based on a service principal (SP) and a storage firewall is in use, a managed identity (MI) should be used instead of the SP. Refer to https://docs.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/azure-managed-identi... for information on MI.
- If you are using Private Link for the Gen2 storage with public network access disabled, you need to enable the MI connector for access, as described in the next point.
- If the exception "Allow Azure services on the trusted services list to access this storage account" is enabled (this option is only visible when public network access is set to 'Enabled from selected virtual networks and IP addresses') and you then set public network access to Disabled, access will continue to work. If the exception is not enabled, add the MI connector as per https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/azure-managed-ident... and then switch public network access to Disabled.
- If a managed identity is being used, ensure that the storage account network rules (if any, as per https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/azure-managed-ident... ) are added with the right tenant ID and resource ID details if this is done via CLI/ARM templates. You can list the rules using the Azure CLI: https://learn.microsoft.com/en-us/cli/azure/storage/account/network-rule?view=azure-cli-latest#az-st...
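To make the checks above concrete, here is a minimal Azure CLI sketch covering the role assignment and the network-rule listing. This is an illustration, not an official procedure: all resource names are placeholders, it assumes the Azure CLI `databricks` extension is installed, and it assumes the access connector uses a system-assigned managed identity. The `unity-catalog` container name is taken from the error URL in the original post.

```bash
# Placeholders -- replace with your own values
RG="my-resource-group"            # hypothetical resource group
STORAGE="mystorageaccount"        # hypothetical ADLS Gen2 account
CONNECTOR="my-access-connector"   # hypothetical Access Connector for Azure Databricks

# Principal ID of the connector's system-assigned managed identity
# (requires the Azure CLI 'databricks' extension)
PRINCIPAL_ID=$(az databricks access-connector show \
  --resource-group "$RG" --name "$CONNECTOR" \
  --query identity.principalId -o tsv)

# Resource ID of the storage account, used as the role-assignment scope
STORAGE_ID=$(az storage account show \
  --resource-group "$RG" --name "$STORAGE" --query id -o tsv)

# 1) Grant Storage Blob Data Contributor at the storage account level
az role assignment create \
  --assignee "$PRINCIPAL_ID" \
  --role "Storage Blob Data Contributor" \
  --scope "$STORAGE_ID"

# Alternative when account-level Contributor is not allowed:
# Storage Blob Delegator at account scope, plus Storage Blob Data
# Contributor scoped to the container, e.g.
#   --scope "$STORAGE_ID/blobServices/default/containers/unity-catalog"

# 2) List the storage firewall / network rules; the resource-instance
#    rule for the access connector should appear here with the right
#    tenant ID and resource ID
az storage account network-rule list \
  --resource-group "$RG" --account-name "$STORAGE"
```

Note that role assignments can take a few minutes to propagate, so re-test after a short wait.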
Leave a like if this helps; follow-ups are appreciated.
Kudos,
Ayushi
09-09-2025 03:31 AM
Hi,
I am facing the same issue: I am getting a 403 error while creating a table.
The folder is created inside the metastore container in ADLS every time I run the CREATE TABLE command.
As for permissions, the `Storage Blob Data Contributor` role is assigned at the storage account level.
At the container level the permissions are inherited.
At the container level I have also added the `Storage Blob Data Owner` role.
I have also tried adding ACLs at deeper folder levels for the Access Connector for Azure Databricks, and I have added the access connector as a Databricks resource instance in the storage account's Networking tab.
I have ticked "Allow trusted Microsoft services to access this storage account" in the Networking tab as well.
I can create the catalog and schema, but table creation errors out, since that is the step that writes data to ADLS.
I have gone through the documents above.
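For reference, a quick way to double-check what is actually configured from the Azure CLI (resource names below are placeholders rather than my actual values, and this assumes the access connector has a system-assigned identity):

```bash
RG="my-resource-group"            # placeholder resource group
STORAGE="mystorageaccount"        # placeholder ADLS Gen2 account
CONNECTOR="my-access-connector"   # placeholder access connector

# Managed identity behind the access connector
PRINCIPAL_ID=$(az databricks access-connector show \
  --resource-group "$RG" --name "$CONNECTOR" \
  --query identity.principalId -o tsv)

# Roles currently assigned to that identity on the storage account
az role assignment list \
  --assignee "$PRINCIPAL_ID" \
  --scope "$(az storage account show -g "$RG" -n "$STORAGE" --query id -o tsv)" \
  --output table

# Firewall configuration and resource-instance rules on the account
az storage account show -g "$RG" -n "$STORAGE" \
  --query networkRuleSet --output json
```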

