
Replaceable Attributes in D365 Operations

Standard business scenario

We were working on an extension model in Dynamics 365 for Operations, in the standard Bank module. The standard process is to download the "Bank Positive Pay" file information, which is delivered as a text file.

Change Request

The goal was to avoid this manual process and automate the bank Positive Pay process. We bypassed the standard download and instead transferred the positive pay files into Azure BLOB storage automatically. From Azure BLOB storage, an external web service then picks up these files for automatic payment processing, usually an FTP service, which works for most banks.

Figure 1. Data Management in Dynamics 365 for Finance and Operations

The reasons include business policies on data security and operational efficiency. The reasons can differ based on the business case; the focus here, however, is on the programming aspect.

Analysis

If you examine the code that delivers the file (Classes/BankPositivePayExport/sendFileToDestination), the standard implementation downloads it to the local computer:

/// <summary>
/// Send file to destination.
/// </summary>
[Replaceable]
protected void sendFileToDestination()
{
    str downloadUrl = DMFDataPopulation::getAzureBlobReadUrl(str2Guid(fileId));

    Filename filename = strFmt('%1-%2%3',
        bankPositivePayTable.PayFormat,
        bankPositivePayTable.PositivePayNum,
        this.getFileExtensionFromURL(downloadUrl));

    System.IO.Stream stream = File::UseFileFromURL(downloadUrl);
    File::SendFileToUser(stream, filename);
}

With Chain of Command (or event handlers), we can only add custom code before or after the standard method execution. Since the standard method itself downloads the file, the challenge was to find a way to block or bypass the standard code. The next call is mandatory in a Chain of Command method, and the moment next is called, the standard code executes and the file is downloaded to the local computer:

<Chain of Command>
    Pre code
    next()  <standard code of sending positive pay> (mandatory)
    Post code
<End of Chain of Command>
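For illustration, here is a minimal sketch of what such a Chain of Command wrapper could look like; the extension class name is hypothetical and the wrapper does nothing useful, but it shows that the compiler forces the next call, which in turn runs the standard download:

[ExtensionOf(classStr(BankPositivePayExport))]
final class ASCBankPositivePayExportCoC_Extension
{
    protected void sendFileToDestination()
    {
        // Pre code: runs before the standard logic.

        // Mandatory: executes the standard code, which downloads
        // the file to the local computer.
        next sendFileToDestination();

        // Post code: by now the file has already been sent to the user.
    }
}

Because of this, a plain Chain of Command wrapper cannot prevent the local download.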
Resolution

About Replaceable Attributes

With a replaceable method, we do not need to call next in Chain of Command (CoC) and can completely override the logic in the standard Microsoft code. Microsoft, however, suggests calling next conditionally.

As part of this blog, let's see how to handle the scenario: block the standard local file download and store the file in Azure BLOB storage instead.

Create an extension class for Classes/BankPositivePayExport

using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

[ExtensionOf(classStr(BankPositivePayExport))]
final class BankPositivePayExport_Extension
{
}

Override the replaceable method to store the file in Azure BLOB

sendFileToDestination is the replaceable method in Classes/BankPositivePayExport. The following code shows our custom logic, which takes the generated file and sends it to Azure BLOB storage; next is never called, so the standard download is skipped entirely.

/// <summary>
/// Sends file to Blob storage
/// </summary>
protected void sendFileToDestination()
{
    #define.Underscore("_")
    #define.DOT(".")
    #define.XML("XML")
    #define.FILENAME("ASCPositivePay")

    if (this.parmBankPositivePayNum())
    {
        BankPositivePayTable ascBankPositivePayTable = BankPositivePayTable::find(this.parmBankPositivePayNum());

        ASCDownloadURL downloadUrl = DMFDataPopulation::getAzureBlobReadUrl(str2Guid(fileId));

        Filename filename = strFmt('%1%2%3%4%5',
            #FILENAME,
            ascBankPositivePayTable.PositivePayNum,
            #Underscore,
            System.String::Format('{0:MM.dd.yy}', today()),
            #DOT + #XML);

        System.IO.Stream stream = File::UseFileFromURL(downloadUrl);

        CloudBlobContainer blobContainer = this.connectToBlob(
            "<Name from Azure Key Vault>",
            "<Key from Azure Key Vault>",
            "<BLOB container name>");

        this.uploadFile(blobContainer, stream, filename);
    }
}

Establish the connection to the Azure BLOB container

The following code establishes the connection to Azure BLOB storage. As a safety measure, the storage account name, account key, and container name should not be hard-coded; store these values in Azure Key Vault and retrieve them by token. (Storing and retrieving secrets from Key Vault is outside the scope of this blog; refer to the Azure Key Vault documentation on how to store and retrieve storage keys.)

/// <summary>
/// Establish connection to blob storage
/// </summary>
/// <param name = "_storageAccountName">Storage account name</param>
/// <param name = "_accountKey">Account key</param>
/// <param name = "_blobContainerName">Blob container name</param>
/// <returns>Blob container</returns>
public CloudBlobContainer connectToBlob(ASCStorageAccount _storageAccountName,
    ASCAccountKey _accountKey,
    ASCBLOBContainer _blobContainerName)
{
    ASCConnectionString connectionString = strFmt("@ASC:ConnectionString", _storageAccountName, _accountKey);

    CloudStorageAccount storageAccount = CloudStorageAccount::Parse(connectionString);
    CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
    CloudBlobContainer blobContainer = blobClient.GetContainerReference(_blobContainerName);

    return blobContainer;
}

Labels

@ASC:ConnectionString = "DefaultEndpointsProtocol=https;AccountName=%1;AccountKey=%2;EndpointSuffix=core.windows.net"

Custom method to upload the file to Azure BLOB

This custom method uploads the file to the BLOB container. It takes the blob container, file stream, and file name as input to create the file in Azure BLOB storage.
/// <summary>
/// Uploads the file to the blob
/// </summary>
/// <param name = "_blobContainer">Blob container</param>
/// <param name = "_stream">File stream</param>
/// <param name = "_fileName">File name</param>
public void uploadFile(CloudBlobContainer _blobContainer, System.IO.Stream _stream, FileName _fileName)
{
    _blobContainer.CreateIfNotExistsAsync();

    try
    {
        ttsbegin;

        CloudBlockBlob blockBlob = _blobContainer.GetBlockBlobReference(_fileName);

        if (blockBlob && !blockBlob.Exists(null, null))
        {
            if (_stream)
            {
                blockBlob.UploadFromStreamAsync(_stream).Wait();
                blockBlob.FetchAttributes(null, null, null);

                BlobProperties blobProperties = blockBlob.Properties;

                // Verify the uploaded blob is the same size as the source stream
                if (blobProperties.Length == _stream.Length)
                {
                    info(strFmt("@ASC:FileUploadedSuccessfully", _fileName));
                }
            }
        }
        else
        {
            info(strFmt("@ASC:FileAlreadyExists", _fileName));
        }

        ttscommit;
    }
    catch
    {
        info("@ASC:ErrorUploadingFile");
    }
}

Labels

@ASC:ErrorUploadingFile = "Error while uploading file"
@ASC:FileAlreadyExists = "File already exists"
@ASC:FileUploadedSuccessfully = "File uploaded successfully"

Conclusion

We find the Replaceable attribute very useful for blocking the standard code completely and writing our own custom code. To learn more about the do's and don'ts of replaceable methods, refer to the Microsoft documentation:

https://docs.microsoft.com/en-us/dynamics365/fin-ops-core/dev-itpro/extensibility/extensibility-attributes#replaceable

This blog focuses on the ability to customize a standard behavior. However, we recommend thoroughly evaluating the need to customize before doing so.

Author: Krish Moorthy, Technical Specialist – Microsoft Dynamics 365


Importing custom data in D365 Data Entities

Every organization has custom data that is specific to its operations and processes. Most organizations see tremendous value in bringing that data into their enterprise software system, either interactively for transactions or for use in reports and dashboards. Microsoft Dynamics 365 Finance and Operations provides a Data Management Framework (DMF) that allows D365 to be that one fully-integrated, all-encompassing system. We will walk through the Data Management Framework of Dynamics 365 Finance & Operations.

Data Management & Data Entity

The Data Management Framework of Microsoft Dynamics 365 for Finance and Operations utilizes data entities in:

Data migration
Copying module setups and configurations
Integrations

In this example, we will see how data is imported through a data entity from an Excel file.

Figure 1. Data Management in Dynamics 365 for Finance and Operations

Sample Scenario

D365 Finance & Operations allows adding fields to existing data entities, so most custom data can be included within an existing entity. For the sake of explaining the programming concept, we will create a new entity related to an existing master data entity, say Customer. There are many scenarios where users bring data from external systems and import it into D365 on a daily or weekly basis.

Figure 2. Sample excel data

The user will make a list of customer accounts available in an Excel file and then import that file into Dynamics 365 for Finance and Operations through a data entity in the Data management workspace. During the import, the system will find the related customer account record in the Customer master and store its record Id (CustTable.RecId), as well as the monthly account balance (and any other custom data fields), in a new table, ASCCustRelatedTable. In case of any exception, the user will be able to view the error or warning details in the log of the Data management module.

Building the solution

First, we create a table ASCCustRelatedTable with all the required fields and an index; this is where we store the value of CustTable.RecId at the time of import from Excel. In our case, we have the fields and index shown here.

Figure 3. ASCCustRelatedTable structure

Figure 4. View of ASCCustRelatedTable in the development window

After creating the table ASCCustRelatedTable, we create a data entity. To do this, right-click on the table, go to Add-ins, and click Create data entity. This creates the new data entity ASCCustRelatedTableEntity along with some dependent security privilege objects.

Figure 5. Creating new entity from ASCCustRelatedTable

Figure 6. New entity and security objects from ASCCustRelatedTable

The user will only import an Excel file containing a list of customer accounts, but we need to store the record Id of the Customer master (CustTable.RecId) in the separate table ASCCustRelatedTable, along with the monthly account balance. To achieve this, we create a new class ASCCustRelatedTableEntityEvents and subscribe to the MappingEntityToDataSource event of the data entity ASCCustRelatedTableEntity. We initialize the values of the CustTableRecId and MAB fields in this MappingEntityToDataSource event, which executes during the Excel import. Here is how the assignment would be:

CustTableRecId = CustTable.RecId
MAB = CustTable.BalanceCurrency()

You can change and utilize the event to suit your needs and business scenarios.

Figure 7. Assigning values to the fields of the ASCCustRelatedTable through MappingEntityToDataSource
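Since the screenshot in Figure 7 is not reproduced here, the following is a minimal sketch of what such an event subscription could look like. The handler pattern (DataEventHandler with DataEventType::MappingEntityToDataSource) is the standard one for data entities; the entity field CustomerAccount used to look up the customer is an assumption, since the post does not name the actual field:

class ASCCustRelatedTableEntityEvents
{
    /// <summary>
    /// Initializes CustTableRecId and MAB while the entity is being
    /// mapped to its data source during import.
    /// </summary>
    [DataEventHandler(tableStr(ASCCustRelatedTableEntity), DataEventType::MappingEntityToDataSource)]
    public static void ASCCustRelatedTableEntity_onMappingEntityToDataSource(Common sender, DataEventArgs e)
    {
        ASCCustRelatedTableEntity entity = sender as ASCCustRelatedTableEntity;

        // CustomerAccount is a placeholder for whichever entity field
        // carries the customer account number from the Excel file.
        CustTable custTable = CustTable::find(entity.CustomerAccount);

        entity.CustTableRecId = custTable.RecId;
        entity.MAB            = custTable.balanceCurrency();
    }
}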
After all these steps, we build and synchronize our Visual Studio solution.

Testing

Log in to your Microsoft Dynamics 365 for Finance and Operations environment and open the Data management workspace. Click Import, choose the data entity ASCCustRelatedTableEntity, and add your source Excel file with the list of customers.

Figure 8. Data management workspace – Import

This is how the import project looks:

Figure 9. Data management – Import project

Click Import in the action pane, and records will be inserted into the table ASCCustRelatedTable, including the CustTableRecId and MAB fields.

Figure 10. ASCCustRelatedTable table browser

In case of an error, we can refer to the log.

Figure 11. Data management – Import project's log information

Author: Krish Moorthy, Technical Specialist – Microsoft Dynamics 365

All-in-one Deployable Package (ADP)

Introduction

Microsoft has recently released a feature called the All-in-one Deployable Package (ADP). It is designed to eliminate the deployment errors that many programmers deal with and to improve deployment efficiency and reliability. We analyzed how it will impact our projects and how this concept plays a vital role in the deployment process; this blog series is a result of that analysis. We have taken a business scenario to explain how package movement should be carried out before ADP and after ADP.

Scenario: Current model architecture

As part of an enhancement to a Human Resource Management solution, we created two new models: Time and Attendance and PTO Management.

Time and Attendance

The Time and Attendance model contains elements related to employees' working hours, clock-in, clock-out, and time taken on breaks.

Table – ASCTimeAndAttendanceTable
Form – ASCTimeAndAttendanceForm
Class – ASCTimeAndAttendanceClass

PTO Management

The PTO (Personal Time Off) Management, or Leave Management, model contains elements handling employees' time-off requests, approvals, and utilization.

Table – ASCLeaveRequestTable
Form – ASCLeaveRequestForm
Class – ASCLeaveRequestClass

Let us see how package deployment is done currently and how it should be done after the introduction of the All-in-one Deployable Package (ADP). To keep this blog series clear, we have split the content into two parts:

Part 1: Current deployment scenarios (before ADP)
Part 2: Deployment approach with ADP

Part 1: Current deployment scenarios (before ADP)

How model movement is done currently (before ADP)

To deploy our code and customizations to a runtime environment (demo, sandbox, production), we must create a deployable package. Currently, deployments are carried out in two ways:

Package generation from Visual Studio
Package generation from DevOps

Package generation from Visual Studio

We create a deployable package of our models through the Visual Studio dev tools (Dynamics 365 menu > Create Deployable Package) and then upload it to the Asset library in LCS.

Package generation from DevOps

Azure DevOps provides pipelines that can be used to automate builds. There are two types of pipelines: YAML and Classic. We followed the Classic pipeline with Team Foundation Version Control (TFVC). We installed the NuGet packages and built the solution/projects, after which the new deployable package is created; we can then publish the deployable package artifact as build output. If you generate packages from DevOps, ADP has no impact, as that process always packages all the models.

How it works

Initial deployment to Production

For the initial deployment, we selected all the models, i.e., Time and Attendance and Leave Management, so our deployable package contains both models. Going further, we add the changes below, explained deployment by deployment.

Deployment 1: Changes in a single model

We created a new inquiry form to list employees' clock-in and clock-out information. For this, we created a new form ASCTimeAndAttendanceInquiryForm with the existing table ASCTimeAndAttendanceTable as its data source. Our deployable package now contains only the Time and Attendance model.

Deployment 2: Changes in multiple models

Change Request 1: A request came in to add a new feature for Work From Home maintenance. To accommodate this, we created a new form ASCTimeAndAttendanceWFHForm with a new table ASCTimeAndAttendanceWFHTable.

Change Request 2: A new validation, "An employee cannot take more than 2 days of optional leave in a year", should run when an employee creates a new leave request. For this, we added validation in the validateWrite method of ASCLeaveRequestTable, as sketched below.
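The post does not show the validation code itself, so here is a minimal sketch of what the validateWrite override on ASCLeaveRequestTable could look like. The field names (EmplId, LeaveType, FromDate, NoOfDays) and the enum ASCLeaveType are assumptions, since the actual table design is not shown:

public boolean validateWrite()
{
    boolean ret = super();

    // Hypothetical fields and enum; adjust to the actual table design.
    if (ret && this.LeaveType == ASCLeaveType::Optional)
    {
        ASCLeaveRequestTable leaveRequest;
        date yearStart = mkDate(1, 1, year(this.FromDate));
        date yearEnd   = mkDate(31, 12, year(this.FromDate));

        // Total optional leave already requested by this employee in the
        // same year, excluding the record currently being written.
        select sum(NoOfDays) from leaveRequest
            where leaveRequest.EmplId    == this.EmplId
               && leaveRequest.LeaveType == ASCLeaveType::Optional
               && leaveRequest.FromDate  >= yearStart
               && leaveRequest.FromDate  <= yearEnd
               && leaveRequest.RecId     != this.RecId;

        if (leaveRequest.NoOfDays + this.NoOfDays > 2)
        {
            ret = checkFailed("An employee cannot take more than 2 days of optional leave in a year.");
        }
    }

    return ret;
}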
Both models are part of the deployable package. Now, in Part 2, let us see how the same scenario can be handled with the ADP approach.

Part 2: All-in-one Deployable Package

What is ADP?

In this concept, we create a deployable package with all the customized models instead of only the specific model(s) containing our changes. We follow the same set of procedures, but we select all of our customized models every time we create a new deployable package, regardless of whether changes were made to their elements. So an ADP (All-in-one Deployable Package) is a software deployable package that contains all the models and binaries that you currently have in an environment.

How it works

Let's consider the same set of scenarios we explained in Part 1 and implement them with the ADP approach.

Initial deployment / existing deployment

For the initial deployment, we selected all the models, i.e., Time and Attendance and Leave Management, so our deployment package contains both models.

Deployment 1: Changes in a single model

Even though only the Time and Attendance model changed, we again select all the models, i.e., Time and Attendance and Leave Management, so our deployment package contains both models.

Deployment 2: Changes in multiple models

Change 1: Employees' work-from-home details are listed in a new form ASCTimeAndAttendanceWFH with a new table ASCTimeAndAttendanceWFHTable.

Change 2: A new validation, "An employee cannot avail more than 2 days of optional leave in a year", should run when an employee creates a new leave request. For this, we added validation in the validateWrite method of ASCLeaveRequestTable.

As part of deployable package creation, both models are again included.

Benefits of following the ADP deployment approach

This approach helps multi-member development teams, and the process becomes even more important when dealing with multiple sprints, releasing features in parallel, and working in support mode. It helps developers tremendously by drastically reducing the risk of deployment errors and improves overall project efficiency.

Some of the common issues multi-member development teams face: consider that we have already moved a deployable package containing Time and Attendance and Leave Management to the Test environment. We then modify an object, ASCLeaveRequestTable, in the Leave Management model based on a change request. If we create a deployable package with only the Leave Management model, when this package gets applied to the Test …