Steps to create an API proxy in SAP API Management for Azure Blob Storage to read/write/delete/list blobs or files with Assign Message and JavaScript policies

Introduction: This document describes how to create an API proxy for Azure Blob Storage using Assign Message and JavaScript policies.

Here, we will create a container in Azure Blob Storage and then create an API proxy that can create/delete/read blobs or files in that container, and also read the list of blobs or files the container holds.

Prerequisite: Create a container in Azure Storage with any name and generate a SAS key to access it. Please click here for the steps to create a container and generate a SAS key for it.

I have created a container with the name “data”.

Copy the SAS key and URL to use in SAP API Management.

SAS Key: sp=r&st=2023-12-24T13:36:40Z&se=2023-12-24T21:36:40Z&spr=https&sv=2022-11-02&sr=c&sig=sdsad%dasdsdDFasdasda%2FgfUCnxte8NtVPtswt2iMA%5F

URL: https://<StorageAccount>


Create an API provider in APIM for Azure Blob Storage.

  • In Integration Suite, go to Configure -> APIs, select the API Provider tab, and click Create.
  • Give the name “AzureStorage” or any suitable name.

  • Enter the hostname “<your Azure Storage API Hostname>” and port “<Azure Storage API port>”.


  • Select authentication type “None”.
  • Save it.


Create an API proxy for the Azure Storage REST API.

  • In Integration Suite, go to Configure -> APIs and click Create.


  • Give the name “AzureBlob” (or any suitable name) and a title. Select a host alias if you have multiple hosts configured. Give any base path (I am using “/blob”). The service type should be REST.


  • You will see that the API proxy is created now, but it has no resources yet. Click Add to add resources.

  • The resource name should be your container name. I created the container “data”, so I am creating the resource with “data” as the path prefix.

Note: We need only three operations: GET, PUT, and DELETE. Remove all other operations and click OK.


  • All resources for this proxy have now been created. Save it and click Policies to add or edit policies.



  • In the policy editor, I have created three policies in the PreFlow of the Target Endpoint.



  • The Assign Message policy “AMtoGetList” is used to add query parameters when the consumer wants to get the list of all blobs or files in the container. If you look at the table above, you will find that the headers and query parameters are the same for all activities except the first one, where the list of blobs or files should come in the response.

To read the list, APIM has to send the additional query parameters “restype=container&comp=list” when the request comes with the GET operation and without a <FilePath or FileName>.

So the condition string of “AMtoGetList” checks whether the request verb is “GET” and there is no <FilePath or FileName> after the resource (container) name. If both are true, this policy executes; otherwise nothing happens and the flow moves on to the next policy.
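As a rough sketch (the exact XML and namespace may differ in your tenant), the policy body might add the two query parameters like this:

```xml
<!-- Hypothetical sketch of the "AMtoGetList" Assign Message policy -->
<AssignMessage async="false" continueOnError="false" enabled="true"
               xmlns="http://www.sap.com/apimgmt">
  <Add>
    <QueryParams>
      <!-- Tells Azure Blob Storage to list the blobs in the container -->
      <QueryParam name="restype">container</QueryParam>
      <QueryParam name="comp">list</QueryParam>
    </QueryParams>
  </Add>
  <IgnoreUnresolvedVariables>true</IgnoreUnresolvedVariables>
  <AssignTo createNew="false" type="request"/>
</AssignMessage>
```

The policy would be attached to the flow step with a condition string along the lines of `request.verb = "GET" and proxy.pathsuffix MatchesPath "/data"` (where “data” is the container/resource name; adjust to your own).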



  • The next policy is another Assign Message policy, “AMAddQueryParameters”, which is used to add all parameters of the SAS key except the sig parameter.
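A hypothetical sketch of “AMAddQueryParameters”, using the parameter values from the sample SAS key in the prerequisite section (everything except sig):

```xml
<!-- Sketch only; copy the parameter values from your own SAS token -->
<AssignMessage async="false" continueOnError="false" enabled="true"
               xmlns="http://www.sap.com/apimgmt">
  <Add>
    <QueryParams>
      <!-- SAS token parameters, minus sig -->
      <QueryParam name="sp">r</QueryParam>
      <QueryParam name="st">2023-12-24T13:36:40Z</QueryParam>
      <QueryParam name="se">2023-12-24T21:36:40Z</QueryParam>
      <QueryParam name="spr">https</QueryParam>
      <QueryParam name="sv">2022-11-02</QueryParam>
      <QueryParam name="sr">c</QueryParam>
    </QueryParams>
  </Add>
  <IgnoreUnresolvedVariables>true</IgnoreUnresolvedVariables>
  <AssignTo createNew="false" type="request"/>
</AssignMessage>
```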

  • Let’s add a JavaScript policy “setSig” in the PreFlow of the Target Endpoint, which will add the sig parameter to the query string.


Question: Why are we using a JavaScript policy for the sig parameter?

Answer: Because the signature value can contain special characters like “%”, which are encoded when the request is sent on the wire. “%” is converted into “%25”, which changes the signature value, so authentication would fail at Azure’s end.

Solution: Decode the signature value before sending it on the wire, so that after encoding it becomes the actual signature value again.

This policy calls the JavaScript file “setSig”, which you can create under Scripts.

To create a script, click “+”.

Give it any name and write the code in the script resource. Here we can use the decodeURIComponent() function to decode the value of the sig parameter.
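The effect can be seen with a plain-JavaScript round trip on a made-up signature value (this illustrates the encode/decode behaviour only; the actual setSig script would set the decoded value on the request through the policy context):

```javascript
// Made-up, already percent-encoded signature value, as copied from a SAS token.
var sig = "abc%2Fdef%3D";

// If sent as-is, the gateway would percent-encode the "%" again,
// producing "abc%252Fdef%253D" and breaking Azure's signature check.

// Decode first; the later on-the-wire encoding then restores the original.
var decoded = decodeURIComponent(sig);       // "abc/def="
var reEncoded = encodeURIComponent(decoded); // back to "abc%2Fdef%3D"

console.log(decoded);
console.log(reEncoded === sig); // true
```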


  • Go to the policy editor and add a Verify API Key policy to the PreFlow of the Proxy Endpoint; it will be used to authenticate the consumer.


  • Enter any suitable name (I am using “VAPIK”); the stream should be “Incoming Request”.


  • Now the policy has been added with default code, which we need to change per our requirement.
  • Replace “variable containing api key” with “request.header.x-api-key”. The consumer has to pass the API key in the “x-api-key” header, which will be verified by APIM.
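After the edit, the policy might look roughly like this (a sketch; the generated default in your tenant may carry additional attributes):

```xml
<!-- Verify API Key policy "VAPIK": the key is read from the
     x-api-key request header as described above -->
<VerifyAPIKey async="true" continueOnError="false" enabled="true"
              xmlns="http://www.sap.com/apimgmt">
  <APIKey ref="request.header.x-api-key"/>
</VerifyAPIKey>
```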


  • Click Update, then save and deploy the API proxy. We are now done with proxy creation, but authentication and consumer subscription are still pending for this proxy.


  • Authentication can be done via Basic/OAuth/Private Key or No Auth, with or without API key verification. In this example, we will use No Auth with API key verification.


  • Let’s add the API proxy to a product. I have created a product called “AzureStorage” and added the API to it.
  • Subscribe a consumer to the product; I subscribed the consumer “ConsumerOne” to the product “AzureStorage”.


  • Please click here for more details on creating a product and subscription.
  • Now we are all set to test the API proxy with the API key.


  • Open Postman to test this API and create a file in the container using the PUT operation. Set the header “x-api-key” to the value of the application key and write “test1 data” in the body. Use <proxy url>/<container or resource name>/<FilePath or FileName>. In the highlighted part of the image below, “data” is the container (resource) name, followed by the file path, so the file test1.txt will be created under container “data”, folder “section1”. Send the request: the file will be created with the content sent in the body, and it can be seen in Azure Storage under the container “data”.
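The steps above amount to a request of roughly this shape (host and key values are placeholders; any Azure-required headers such as x-ms-blob-type are assumed to be supplied by the proxy policies per the table referenced earlier):

```http
PUT /blob/data/section1/test1.txt HTTP/1.1
Host: <your APIM runtime host>
x-api-key: <application key>
Content-Type: text/plain

test1 data
```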

  • Now, let’s read the same file with the GET operation.


Here we can see the response body with the content we sent earlier while creating the file.

  • Let’s read the list of files or blobs in the container using the GET operation; no need to pass a file path here.
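The response is Azure’s blob-list XML. An abridged sketch of what it might look like for the file created above (element details vary by service version):

```xml
<?xml version="1.0" encoding="utf-8"?>
<EnumerationResults ServiceEndpoint="https://<StorageAccount>.blob.core.windows.net/"
                    ContainerName="data">
  <Blobs>
    <Blob>
      <Name>section1/test1.txt</Name>
      <!-- <Properties> with size, content type, etag, etc. omitted -->
    </Blob>
  </Blobs>
  <NextMarker />
</EnumerationResults>
```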



Conclusion: This document explained how to create an API proxy for Azure Blob Storage and perform GET/PUT/DELETE operations.






