Using Azure Blob storage with Dynamics 365 Finance and Supply Chain
Azure Blob storage has become synonymous with cloud infrastructure storage. It’s a scalable, cost-effective cloud storage option for all your unstructured data. Paying only for what you use, you save money compared with on-premises storage options.
Businesses have become very creative in their use of Azure Blob storage:
- A leading gaming company uses Blob as a cost-effective mechanism to store game data and game server logs
- A well-known consumer products company uses Blob to manage big data before it is compiled into a data warehouse from which analytics can be derived
- A prominent media company uses Blob to let its journalists upload or stream videos from their devices to Blob, where they are then indexed for quick access and use
Azure Blob storage is also very valuable within the context of Dynamics 365 Finance and Supply Chain. The Data Management Framework (DMF) can use Azure Blob to temporarily store exported files. We’ve also included a code sample showing how to delete these files from the Blob containers.
When you import or export a file using the DMF in Microsoft Dynamics 365 Finance and Supply Chain, a temporary copy of the file is created in Azure Blob storage. It’s temporary because the created files have an expiration date, currently hard-coded to 7 days (10,080 minutes), and the file is regenerated with a different GUID every time a user downloads it.
To demonstrate this on a developer VM we will use the Azure Storage Emulator and the Azure Storage Explorer which can be downloaded for free. In the steps below we will export and download the Customer Groups data entity.
Export Data Project
In this example we create an export data project to export two entities (Customer Groups and Vendor Groups) in Excel format:
There are two options to export the data: we can either click the download button on the action pane or click the export button.
Either action will create the Excel files in the “dmf” blob container as shown below:
Send file to user
The download action will further package the files into a single zipped file and send it to the user’s client browser. To send the file to the user, a temporary file is created in the blob storage with a download URL:
You can see the file in the temporary-file blob container of the Azure Emulator:
The download URL has an expiration time set in minutes which can be specified by navigating to System Administration > Setup > System parameters > Blob link expiration timespan. If this is left as “Zero”, a default expiration of 10 minutes is applied.
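Internally, the fallback can be pictured roughly like this. This is a sketch only; the parameter table lookup and field name are assumptions for illustration, not taken from the actual implementation:

```xpp
// Sketch of the fallback behaviour; the field name on the system
// parameters table is assumed, not the actual implementation.
int expirationMinutes = SystemParameters::find().BlobLinkExpirationTimespan; // field name assumed
if (expirationMinutes == 0)
{
    expirationMinutes = 10; // a default of 10 minutes applies when the parameter is left at zero
}
```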
Once the export has executed, it creates the files in the “dmf” storage but does not send them to the user, and therefore does not create the files in the “temporary-file” blob container just yet. When the DMFExecutionHistoryList form opens, you will have a “Download file” button that generates the file in the “temporary-file” blob container and provides a download URL.
The two Blob containers “dmf” and “temporary-file” can be found in the #DMF macro as:
The temporary container is also a public constant string in the FileUploadTemporaryStorageStrategy class.
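In code, the two names look roughly like this. The macro symbol matches the one used in the cleanup job later in this post; the constant name in FileUploadTemporaryStorageStrategy is an assumption and may differ by version:

```xpp
// From the #DMF macro: the container the Data Management Framework exports into.
#define.DmfExportContainer('dmf')

// Public constant on FileUploadTemporaryStorageStrategy for the download container
// (constant name assumed; the container itself is "temporary-file").
public const str AzureTempContainerName = 'temporary-file';
```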
Deleting the temporary files
In a recent project we opted to export files using the Data Management Framework by executing the data project export functionality from code. One of the requirements was to delete the files from the Blob container once the file was sent to the user browser via the download link.
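As background, a common community pattern for this scenario is to look up the exported file id on the DMFEntityExportDetails table and generate a time-limited download URL for the user. This is a sketch; verify the API against your version:

```xpp
// Sketch: fetch the export details of the latest run and send the
// generated download link to the user's browser.
// In practice you would filter by the ExecutionId of your export run.
DMFEntityExportDetails entityExportDetails;

select firstonly entityExportDetails
    order by RecId desc;

// getAzureBlobReadUrl returns a time-limited URL for the blob identified by the file id.
str downloadUrl = DMFDataPopulation::getAzureBlobReadUrl(str2Guid(entityExportDetails.SampleFilePath));

Browser browser = new Browser();
browser.navigate(downloadUrl);
```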
Firstly, the identifiers of files that are downloaded, and therefore created in the “temporary-file” blob container, are not stored anywhere in the D365 Finance and Supply Chain database, because they are generated on the fly with a new GUID each time (refer to the method uploadFile in the class FileUploadTemporaryStorageStrategy). Therefore, we have to create our own log table and extend the uploadFile method that generates the GUID, saving these file IDs in our custom table:
[ExtensionOf(classStr(FileUploadTemporaryStorageStrategy))]
public final class BFTFileUploadTemporaryStorageStrategy_Extension
{
    public FileUploadResultBase uploadFile(System.IO.Stream _stream, str _fileName, str _contentType, str _fileExtension, str _metaData)
    {
        FileUploadResultBase fileUploadResult = next uploadFile(_stream, _fileName, _contentType, _fileExtension, _metaData);
        if (fileUploadResult is FileUploadTemporaryStorageResult)
        {
            BFTFileUploadResult fileUpload; // Custom table to store the ids of files uploaded to the temporary blob
            FileUploadTemporaryStorageResult fileUploadResultTempStorage = fileUploadResult as FileUploadTemporaryStorageResult;
            fileUpload.Filename = fileUploadResultTempStorage.getFileName();
            fileUpload.FileId   = fileUploadResultTempStorage.getFileId();
            fileUpload.insert();
        }
        return fileUploadResult;
    }
}
Secondly, we have to create a class (which can be scheduled as a nightly batch job) that deletes these temporary files. In the example below we have a runnable class that loops over all files in both the “dmf” and “temporary-file” containers that were exported and/or downloaded:
public static void main(Args _args)
{
    #DMF
    var blobStorageService = new SharedServiceUnitStorage(SharedServiceUnitStorage::GetDefaultStorageContext());
    str azureStorageCategory = #DmfExportContainer;
    DMFEntityExportDetails entityExportDetails;
    BFTFileUploadResult bftFileUploadResultTemp; // our custom log table
    while select entityExportDetails // files exported to the "dmf" container
    {
        blobStorageService.DeleteData(str2Guid(entityExportDetails.SampleFilePath), azureStorageCategory); // DeleteData signature assumed; verify against your version
    }
    while select bftFileUploadResultTemp // files downloaded to the "temporary-file" container
    {
        blobStorageService.DeleteData(str2Guid(bftFileUploadResultTemp.FileId), FileUploadTemporaryStorageStrategy::AzureTempContainerName); // constant name assumed
    }
}
Blob storage handles trillions of stored objects, with millions of average requests per second, for customers around the world. Contact us here if you would like a fresh approach to managing unstructured data more securely and reducing storage costs.