Replaceable Attributes in D365 Operations

Standard business scenario

We were working on an extension model in Dynamics 365 for Operations, in the standard Bank module. The standard process is to download the "Bank Positive Pay" file information, which is delivered as a text file.

Change Request

The request was to avoid this manual process and automate the bank Positive Pay process. We bypassed the standard download and instead transferred the positive pay files into Azure BLOB storage automatically. From Azure BLOB storage, an external web service (typically an FTP-based service, which works for most banks) then picks up these files for automatic payment processing.

Figure 1. Data Management in Dynamics 365 for Finance and Operations

The reasons include business policies on data security and operational efficiency. The reasons can differ from one business case to another; the focus here is on the programming aspect.

Analysis

If you examine how the standard code delivers the file (Classes/BankPositivePayExport/sendFileToDestination), it downloads the file to the user's machine:

/// <summary>
/// Send file to destination.
/// </summary>
[Replaceable]
protected void sendFileToDestination()
{
    str downloadUrl = DMFDataPopulation::getAzureBlobReadUrl(str2Guid(fileId));

    Filename filename = strFmt('%1-%2%3',
        bankPositivePayTable.PayFormat,
        bankPositivePayTable.PositivePayNum,
        this.getFileExtensionFromURL(downloadUrl));

    System.IO.Stream stream = File::UseFileFromURL(downloadUrl);
    File::SendFileToUser(stream, filename);
}

With Chain of Command (or event handlers), we can only add custom code before or after the standard method executes. In this case, the standard method itself downloads the file, so the challenge was to block or bypass the standard code. Because the NEXT call is mandatory in a Chain of Command method, the standard code runs as soon as NEXT is called and the file is downloaded to the local computer.

<Chain of Command>
    Pre code
    NEXT ()    <standard code of sending positive pay – mandatory>
    Post code
<End of Chain of Command>
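To make the limitation concrete, here is a minimal, purely illustrative sketch of what a plain Chain of Command wrapper over sendFileToDestination would look like (the wrapper class name is arbitrary, and this is not the approach we used):

[ExtensionOf(classStr(BankPositivePayExport))]
final class BankPositivePayExportWrapper_Extension
{
    protected void sendFileToDestination()
    {
        // Pre code: can prepare logging, validation, etc., but cannot prevent the download.
        next sendFileToDestination();    // mandatory call – the standard code still sends the file to the user
        // Post code: runs only after the file has already been downloaded locally.
    }
}

Because the wrapper cannot skip the next() call, the only way to meet the requirement is to replace the method body itself, which is exactly what the Replaceable attribute allows.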
Resolution

About Replaceable Attributes

With a replaceable method, we do not have to call NEXT in the Chain of Command (CoC) wrapper and can completely override the logic in the standard Microsoft code. Microsoft, however, suggests calling NEXT conditionally where possible.

As part of this blog, let's see how to block the standard local file download and store the file in Azure BLOB storage instead.

Create an extension class for Classes/BankPositivePayExport

using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

[ExtensionOf(classStr(BankPositivePayExport))]
final class BankPositivePayExport_Extension
{
}

Override the replaceable method to store the file in Azure BLOB storage

sendFileToDestination is the replaceable method in Classes/BankPositivePayExport. The following code shows the custom logic we wrote to read the generated file and send it to Azure BLOB storage.

/// <summary>
/// Sends file to BLOB storage.
/// </summary>
protected void sendFileToDestination()
{
    #define.Underscore("_")
    #define.DOT(".")
    #define.XML("XML")
    #define.FILENAME("ASCPositivePay")

    if (this.parmBankPositivePayNum())
    {
        BankPositivePayTable ascBankPositivePayTable;
        ascBankPositivePayTable = BankPositivePayTable::find(this.parmPositivePayNum());

        ASCDownloadURL downloadUrl;
        downloadUrl = DMFDataPopulation::getAzureBlobReadUrl(str2Guid(fileId));

        Filename filename = strFmt('%1%2%3%4%5',
            #FILENAME,
            ascBankPositivePayTable.PositivePayNum,
            #Underscore,
            System.String::Format('{0:MM.dd.yy}', today()),
            #DOT + #XML);

        System.IO.Stream stream = File::UseFileFromURL(downloadUrl);

        CloudBlobContainer blobContainer;
        blobContainer = this.connectToBlob("<Name from Azure Key Vault>", "<Key from Azure Key Vault>", "<BLOB container name>");

        this.uploadFile(blobContainer, stream, filename);
    }
}

Establish a connection to the Azure BLOB container

The following code establishes the connection to Azure BLOB storage. As a safety measure, hard-coding the storage account name, account key, and container name directly is not advisable; these values should be stored in Azure Key Vault and retrieved based on tokens. (Note: storing and retrieving secrets from Key Vault is outside the scope of this blog; refer to the Azure Key Vault documentation on how to store and retrieve storage keys.)

/// <summary>
/// Establish connection to blob storage.
/// </summary>
/// <param name = "_storageAccountName">Storage account name</param>
/// <param name = "_accountKey">Account key</param>
/// <param name = "_blobContainerName">Blob container name</param>
/// <returns>Blob container</returns>
public CloudBlobContainer connectToBlob(ASCStorageAccount _storageAccountName, ASCAccountKey _accountKey, ASCBLOBContainer _blobContainerName)
{
    CloudBlobClient blobClient;
    CloudBlobContainer blobContainer;
    CloudStorageAccount storageAccount;
    ASCConnectionString connectionString;

    connectionString = strFmt(@ASC:ConnectionString, _storageAccountName, _accountKey);
    storageAccount = CloudStorageAccount::Parse(connectionString);
    blobClient = storageAccount.CreateCloudBlobClient();
    blobContainer = blobClient.GetContainerReference(_blobContainerName);

    return blobContainer;
}

Labels

@ASC:ConnectionString = "DefaultEndpointsProtocol=https;AccountName=%1;AccountKey=%2;EndpointSuffix=core.windows.net"
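To keep the secrets out of source entirely, the placeholder arguments above can be resolved at runtime. Here is a minimal sketch of such a helper on the extension class; ASCKeyVaultHelper and its getSecret() method are hypothetical placeholders for whatever Key Vault retrieval mechanism your project uses, and the secret names are assumptions:

// Hypothetical helper – resolves the storage settings at runtime so no secret appears in code.
private CloudBlobContainer getPositivePayBlobContainer()
{
    ASCStorageAccount storageAccountName = ASCKeyVaultHelper::getSecret("asc-storage-account-name");
    ASCAccountKey     accountKey         = ASCKeyVaultHelper::getSecret("asc-storage-account-key");
    ASCBLOBContainer  containerName      = ASCKeyVaultHelper::getSecret("asc-blob-container-name");

    return this.connectToBlob(storageAccountName, accountKey, containerName);
}

With such a helper, the call in sendFileToDestination becomes this.getPositivePayBlobContainer(), which keeps the account key out of source control.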
Custom method to upload the file to Azure BLOB storage

This custom method uploads the file to the Azure BLOB container. It takes the BLOB container, file stream, and file name as input and creates the file in Azure BLOB storage.

/// <summary>
/// Uploads the file to the blob.
/// </summary>
/// <param name = "_blobContainer">Blob container</param>
/// <param name = "_stream">File stream</param>
/// <param name = "_fileName">File name</param>
public void uploadFile(CloudBlobContainer _blobContainer, System.IO.Stream _stream, FileName _fileName)
{
    _blobContainer.CreateIfNotExistsAsync();

    try
    {
        ttsbegin;

        CloudBlockBlob blockBlob = _blobContainer.GetBlockBlobReference(_fileName);

        if (blockBlob && !blockBlob.Exists(null, null))
        {
            if (_stream)
            {
                blockBlob.UploadFromStreamAsync(_stream).Wait();
                blockBlob.FetchAttributes(null, null, null);

                BlobProperties blobProperties = blockBlob.Properties;

                if (blobProperties.Length == _stream.Length)
                {
                    info(strFmt("@ASC:FileUploadedSuccessfully", _fileName));
                }
            }
        }
        else
        {
            info(strFmt("@ASC:FileAlreadyExists", _fileName));
        }

        ttscommit;
    }
    catch
    {
        info("@ASC:ErrorUploadingFile");
    }
}

Labels

@ASC:ErrorUploadingFile = "Error while uploading file"
@ASC:FileAlreadyExists = "File already exists"
@ASC:FileUploadedSuccessfully = "File uploaded successfully"

Conclusion

We find the Replaceable attribute very useful for blocking the standard code completely and writing our own custom code. To learn more about the do's and don'ts of using replaceable methods, refer to the Microsoft documentation:

https://docs.microsoft.com/en-us/dynamics365/fin-ops-core/dev-itpro/extensibility/extensibility-attributes#replaceable

This blog focuses on the ability to customize a standard behavior; however, we recommend thoroughly evaluating the need to customize before doing so.

Author: Krish Moorthy, Technical Specialist – Microsoft Dynamics 365

Migrate EWS Apps to Microsoft Graph for Exchange Online in D365 Finance and Operations

Exchange Web Services (EWS) is a legacy protocol that has been in use since Exchange Server 2007. In August 2018, Microsoft announced that there would be no further active investment in the EWS APIs for Exchange Online, and the Exchange email provider stopped sending emails as of September 15, 2024. We recommend that you migrate your EWS apps that access Exchange Online to Microsoft Graph. Check out the recording below on how to set this up in D365 Finance and Operations.

https://www.ascent365.com/wp-content/uploads/2025/03/Switch-from-Exchange-to-Microsoft-Graph-for-email-settings.mp4

Author: Sohena Hafiz, President

Importing custom data in D365 Data Entities

Every organization has custom data that is specific to its operations and processes. Most organizations see tremendous value in bringing that data into their enterprise software system, either interactively for transactions or for reports and dashboards. Microsoft Dynamics 365 Finance and Operations provides a Data Management Framework (DMF) that allows the D365 system to be that one fully integrated, all-encompassing system. We will walk through the Data Management Framework of Dynamics 365 Finance & Operations.

Data Management & Data Entity

The Data Management Framework of Microsoft Dynamics 365 for Finance and Operations uses data entities in:

Data migration
Copying module setups and configurations
Integrations

In this example, we will see how data is imported through a data entity from an Excel file.

Figure 1. Data Management in Dynamics 365 for Finance and Operations

Sample Scenario

D365 Finance & Operations allows adding fields to existing data entities, so most custom data can be included within an existing entity. For the sake of explaining the programming concept, we will create a new entity related to an existing master data entity, say Customer. There are many scenarios where users bring data from external systems and import it into D365 on a daily or weekly basis.

Figure 2. Sample Excel data

The user will make a list of customer accounts available in an Excel file and then import that file into Dynamics 365 for Finance and Operations using the Data management workspace through a data entity. During the import, the system will find the related customer account record in the Customer master and store its record Id (CustTable.RecId), as well as the monthly account balance (and any other custom data fields), in a new table ASCCustRelatedTable. In case of any exception, the user can view the error or warning details in the log of the Data management module.

Building the solution

First, we create a table ASCCustRelatedTable with all the required fields and an index; this is where we store the values of CustTable.RecId at the time of the Excel import. In our case, the fields and index are as shown here.

Figure 3. ASCCustRelatedTable structure

Figure 4. View of ASCCustRelatedTable in the development window

After creating the table ASCCustRelatedTable, we create a data entity. For this, right-click on the table, go to Add-ins, and click Create data entity. This creates the new data entity ASCCustRelatedTableEntity along with some dependent security privilege objects.

Figure 5. Creating a new entity from ASCCustRelatedTable

Figure 6. New entity and security objects from ASCCustRelatedTable

The user only imports an Excel file that contains a list of customer accounts, but we need to store the record Id of the Customer master (CustTable.RecId), as well as the monthly account balance, in the separate table ASCCustRelatedTable. To achieve this, we create a new class ASCCustRelatedTableEntityEvents and subscribe to the MappingEntityToDataSource event of the data entity ASCCustRelatedTableEntity. We initialize the values of the CustTableRecId and MAB fields through this MappingEntityToDataSource event, which executes at the time of the Excel import. The assignments are:

CustTableRecId = CustTable.RecId
MAB = CustTable.BalanceCurrency()

You can change and utilize the event as per your needs and business scenarios.

Figure 7. Assigning values to the fields of ASCCustRelatedTable through MappingEntityToDataSource
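For reference, here is a minimal sketch of what the ASCCustRelatedTableEntityEvents subscriber might look like. The entity and field names follow the example above, the CustomerAccount entity field is an assumed name, and the exact event-handler signatures can vary between platform versions, so treat this as an outline rather than the verbatim implementation:

class ASCCustRelatedTableEntityEvents
{
    /// <summary>
    /// Initializes CustTableRecId and MAB while the entity row is mapped to ASCCustRelatedTable during import.
    /// </summary>
    [DataEventHandler(tableStr(ASCCustRelatedTableEntity), DataEventType::MappingEntityToDataSource)]
    public static void ASCCustRelatedTableEntity_onMappingEntityToDataSource(Common sender, DataEventArgs e)
    {
        ASCCustRelatedTableEntity entity = sender as ASCCustRelatedTableEntity;

        // Look up the customer from the account imported in the Excel file.
        // 'CustomerAccount' is an assumed name for the entity field holding the account number.
        CustTable custTable = CustTable::find(entity.CustomerAccount);

        if (custTable.RecId)
        {
            entity.CustTableRecId = custTable.RecId;
            entity.MAB            = custTable.balanceCurrency();   // monthly account balance, per the example above
        }
    }
}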
After all these steps, build and synchronize the Visual Studio solution.

Testing

Log in to your Microsoft Dynamics 365 for Finance and Operations environment and open the Data management workspace. Click Import, choose the data entity ASCCustRelatedTableEntity, and add your source Excel file with the list of customers.

Figure 8. Data management workspace – Import

This is how the import project looks:

Figure 9. Data management – Import project

Click Import in the action pane and the records will be inserted into the table ASCCustRelatedTable along with the CustTableRecId and MAB fields.

Figure 10. ASCCustRelatedTable table browser

In case of an error, we can refer to the log.

Figure 11. Data management – Import project's log information

Author: Krish Moorthy, Technical Specialist – Microsoft Dynamics 365

Microsoft Dynamics 365 Finance & Operations – Deployment options

Microsoft Dynamics 365 Finance & Operations – Deployment options Table of Contents Microsoft Dynamics 365 Finance & Operations – Deployment options You can now deploy Microsoft Dynamics 365 for Finance and Operations in the cloud or on-premises. Cloud deployments offer a software system that is fully managed by Microsoft, while on-premises deployments as name suggests, are deployed locally within a customer’s data center.   Important: On-premises deployments of Microsoft Dynamics 365 for Finance and Operations are NOT supported on any public cloud infrastructure, including Azure.   The following table provides a comparison of the capabilities provided by the two deployment options. Why cloud? Cloud deployments provide a cloud service that is easy to scale up or down as needed, as well as data centers that are fully managed by Microsoft. The time spent implementing Finance and Operations can be significantly shortened, fewer customizations may be required, and the costs of IT hardware and infrastructure are lower.   Cloud deployments include high availability, disaster recovery, sandbox environments, and application lifecycle management combined with cloud-based systems of intelligence, infrastructure, compute, and database services in a single offering. When needed, data failover in the cloud, automated deployment and continuous updates, and elastic compute capacity are available. A cloud deployment also provides data aggregation, financial reporting, and intelligence.   The cloud service provides customers with the greatest value, the broadest range of functionality, the best application lifecycle experience, the easiest and broadest integration with Microsoft Azure services, the best option for business insights and intelligence, and the most value for customers’ technology investments. Why on-premises? With an on-premises deployment, existing data center investments can be leveraged. Customers can also configure their enterprise preferences to meet their needs. The regulatory and compliance requirements of their business, Comply with data sovereignty rules in regions where there are no Azure Data Centers, or Ensure business continuity in areas with limited public infrastructure. A customer’s business data and processes are disconnected from the cloud and are stored and run locally in the customer’s or their partner’s data center. Some connectivity is required for system management and updates which are enabled through Microsoft Dynamics Lifecycle Services (LCS), a cloud-based application lifecycle management service. Customer data that is related to the configuration and application customization may be stored in the cloud.   For customers who choose to run Finance and Operations in their own data center, the on-premises deployment option will have a similar user-interface and application functionality as other deployment options. However, customers must take on the following responsibilities: Stand up their own infrastructure. Configure their own high-availability and disaster recovery solutions. Stand up sandbox environments. Manage their infrastructure, including scheduling operating system updates. The additional costs to deploy and manage these capabilities might lead to higher deployment costs and a greater Total Cost of Ownership (TCO). Tools for deploying the Finance and Operations software and updates will be available to partners and customers via Lifecycle Services. 
Unlike the cloud deployment option, Advanced Analytics and Azure Machine Learning services are not included in the on-premises deployment option. Talk to us regarding a TCO comparison based on the current and future state of your infrastructure.

Features not implemented in on-premises deployments

Some features have not yet been implemented and are planned to be part of future releases. However, a few features are not intended for on-premises deployments at all.

Features not yet implemented

The following features have not yet been implemented in on-premises deployments. These features have not been deprecated. If they are critical to your on-premises deployment, let Microsoft know to help with prioritization on the Dynamics 365 Application Ideas site.

Features not planned for on-premises deployments

The following features are not intended for use in on-premises deployments, and there are no plans to implement them.

Author: Abdul Hafiz, Enterprise Solution Architect

All-in-one Deployable Package (ADP)

Introduction

Microsoft has recently released a feature called the All-in-one Deployable Package (ADP). It is designed to eliminate deployment errors that many programmers deal with and to improve deployment efficiency and reliability. We analyzed how it will impact our projects and how this concept plays a vital role in the deployment process; this blog series is the result of our analysis. We have taken a business scenario to explain how package movement should be carried out before ADP and after ADP.

Scenario: Current model architecture

For an enhancement to a Human Resource Management solution, we have created two new models: Time and Attendance and PTO Management.

Time and Attendance

The Time and Attendance model contains elements related to an employee's working hours, clock-in, clock-out, and time taken on breaks.

Table – ASCTimeAndAttendanceTable
Form – ASCTimeAndAttendanceForm
Class – ASCTimeAndAttendanceClass

PTO Management

The PTO (Personal Time Off) Management, or Leave Management, model contains elements handling an employee's time-off requests, approvals, and utilization.

Table – ASCLeaveRequestTable
Form – ASCLeaveRequestForm
Class – ASCLeaveRequestClass

Let us see how package deployment is done currently and how it should be done after the introduction of the All-in-one Deployable Package (ADP). To keep this clear, we have split the content into a two-part series:

Part-1: Current deployment scenarios (Before ADP)
Part-2: Deployment approach with ADP

Part-1: Current deployment scenarios (Before ADP)

How model movement is done currently (Before ADP)

In order to deploy our code and customizations to a runtime environment (demo, sandbox, production), we must create a deployable package. Currently, deployments are carried out in two ways:

Package generation from Visual Studio
Package generation from DevOps

Package generation from Visual Studio

To deploy our code and customizations to a runtime environment (demo, sandbox, production), we create a deployable package of our models through the Visual Studio dev tools (Dynamics 365 menu > Create Deployable Package) and then upload it to the Asset library in LCS.

Package generation from DevOps

Azure DevOps provides pipelines that can be used to automate builds. There are two types of pipelines: YAML and Classic. We have followed the Classic pipeline with Team Foundation Version Control (TFVC). We install the NuGet packages, build the solution/projects, and the new deployable package is created; we can then publish the deployable package artifact as the build output. If you generate packages from DevOps, ADP has no impact, as that process packages all the models every time.

How it works

Initial deployment to Production

For the initial deployment, we selected all the models, i.e. Time and Attendance and Leave Management, so our deployable package contains both models. Going further, we add the changes below, explained deployment by deployment.

Deployment-1: Changes in a single model

We created a new inquiry form to list an employee's clock-in and clock-out information. For this, we created a new form ASCTimeAndAttendanceInquiryForm with the existing table ASCTimeAndAttendanceTable as its data source. Our deployable package now contains only the Time and Attendance model.
Deployment-2: Changes in multiple models

Change Request – 1: A request came in to add a new feature for Work From Home maintenance. To accommodate this, we created a new form ASCTimeAndAttendanceWFHForm with a new table ASCTimeAndAttendanceWFHTable.

Change Request – 2: A new validation, "Employee cannot take more than 2 days of Optional leave in a year", should be applied when an employee creates a new leave request. For this, we added the validation in the validateWrite method of ASCLeaveRequestTable (a sketch of this validation appears at the end of this post).

Both of these models would be part of the deployable package. Now, in Part-2, let us see how the same scenario can be handled with the ADP approach.

Part-2: All-in-one Deployable Package

What is ADP?

In this concept, we create a deployable package with all the customized models instead of only the specific model(s) containing our changes. We follow the same set of procedures, but we select all our customized models every time we create a new deployable package, regardless of whether their elements changed. So, an All-in-one Deployable Package (ADP) is a software deployable package that contains all the models and binaries that you currently have in an environment.

How it works

Let's consider the same set of scenarios explained in Part-1 and implement them with the ADP approach.

Initial deployment / Existing deployment

For the initial deployment, we select all the models, i.e. Time and Attendance and Leave Management, so our deployable package contains both models.

Deployment-1: Changes in a single model

As with the initial deployment, we select all the models, i.e. Time and Attendance and Leave Management, so our deployable package contains both models even though only one of them changed.

Deployment-2: Changes in multiple models

Change – 1: An employee's work-from-home details are listed in a new form ASCTimeAndAttendanceWFHForm with a new table ASCTimeAndAttendanceWFHTable.

Change – 2: A new validation, "Employee cannot avail more than 2 days of Optional leave in a year", should be applied when an employee creates a new leave request. For this, we added the validation in the validateWrite method of ASCLeaveRequestTable.

As part of deployable package creation, both of these models are included.

Benefits of following the ADP deployment approach

This approach helps multi-member development teams, and the process becomes even more important when dealing with multiple sprints, releasing features in parallel, and working in support mode. It helps developers tremendously by drastically reducing the risk of deployment errors and improves overall project efficiency.

Some of the common issues multi-member development teams face: consider that we have already moved our deployable package containing Time and Attendance and Leave Management to the Test environment, and we then modify an object, ASCLeaveRequestTable, in the Leave Management model based on a change request. If we create a deployable package with only the Leave Management model, when this package gets applied to the Test …
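Returning to the validateWrite change referenced above, a minimal sketch of what that validation on ASCLeaveRequestTable might look like is shown below. All field names (EmployeeId, LeaveType, LeaveDays, RequestDate) and the ASCLeaveType enum are assumptions drawn from the example scenario, not actual objects:

public boolean validateWrite()
{
    boolean ret = super();

    // Enforce: "Employee cannot take more than 2 days of Optional leave in a year".
    if (ret && this.LeaveType == ASCLeaveType::Optional)
    {
        ASCLeaveRequestTable existingRequests;
        date yearStart = mkDate(1, 1, year(this.RequestDate));
        date yearEnd   = mkDate(31, 12, year(this.RequestDate));

        // Sum the optional leave days already requested by this employee in the same year.
        select sum(LeaveDays) from existingRequests
            where existingRequests.EmployeeId  == this.EmployeeId
               && existingRequests.LeaveType   == ASCLeaveType::Optional
               && existingRequests.RequestDate >= yearStart
               && existingRequests.RequestDate <= yearEnd
               && existingRequests.RecId       != this.RecId;

        if (existingRequests.LeaveDays + this.LeaveDays > 2)
        {
            ret = checkFailed("Employee cannot take more than 2 days of Optional leave in a year");
        }
    }

    return ret;
}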

Microsoft Dynamics 365 for Finance and Operations begins the One Version era

Microsoft Dynamics 365 for Finance and Operations has entered the so-called One Version era, in which customers say goodbye to traditional ERP upgrades and instead apply an ongoing, mandatory series of smaller updates to their systems on a monthly to quarterly basis. Microsoft has made the case for the transition, hoping to convince current Finance and Operations (D365FO) customers that the new approach will benefit them in the long run, and that Dynamics AX customers that upgrade will never have to endure another upgrade project. This is in line with the Microsoft Office 365 upgrade philosophy.

The transition comes at a time when D365FO is growing and being adopted at a far higher pace than ever before in the ERP product family.

Customer mix: 30% enterprise and 70% midsize; 77% are new implementations vs. 23% upgrades.

Functional usage in D365FO production environments:

Finance: 98 percent
Supply chain management (SCM): 82 percent
Human capital management: 81 percent
Warehouse management: 63 percent
Project accounting: 16 percent

Other reasons are improved upgrade and migration tools and the approaching end of life of various versions of AX.

The current model calls for customers to apply at least one update every three months, meaning a customer can skip two monthly updates but must then deploy the next one. For each update, the customer can specify the week, the day (Saturday or Sunday for most, though some industries like retail opt for mid-week), and the time.

Another validation step for new updates is the Release Validation Program, which brings customer-specific projects into an environment where they can be validated ahead of a release. Test results are checked by Microsoft and any defects are shared with the customer. This program requires sharing code and data with a team at Microsoft.

Customer responsibilities and response

Users historically have been unwilling to apply frequent updates when they knew a system was working well, but One Version requires them to do just that. In theory, the risk from more frequent updates should be mitigated by two factors. First, early adopters will get these updates ahead of the majority of customers, and those teams should detect and report problems a month or more in advance of more conservative customers. Second, partners will see the same update across all customers, meaning they should be able to predict the most common issues in any given month or quarter across all their D365FO clients.

Users did not raise significant concerns or questions in response to this guidance, which helped Microsoft move in this direction at a faster pace. Business leaders have adapted to this SaaS model and see the value in the long run.

Microsoft is also actively investigating customer needs and has engaged an independent consulting firm to report back on what customers say they need in various scenarios and industries (for example, FDA-regulated manufacturers, public sector regulations, ITAR compliance). Microsoft has already heard feedback from the community, which initially raised concerns, and is now adapting these principles.

Author: Abdul Hafiz, Enterprise Solution Architect