Consume finance and operations data entities from C#

On this occasion, I want to share a piece of code that has been very useful to me when consuming Dynamics 365 Finance and Operations data entities, whether standard or custom, from any kind of application written in C#. In my specific case, it is a piece of code written to be consumed from Azure Functions.

As many of you know, Microsoft provides sample code in its GitHub repository for consuming these entities with the OData client. It is a perfectly valid example, which I have used more than once and highly recommend for learning how the entities work and how authentication through Azure Active Directory is handled. In my particular case, however, I decided to build this helper with plain HTTP calls through the standard .NET HttpClient, in order to keep the code much lighter and avoid generating the large number of proxy classes that the OData client needs to use all the entities of Dynamics 365 F&O.

As you already know at this point in the story, I am not a C# expert, not even close, so if you spot any point of improvement throughout this post, I will be more than grateful if you leave me a comment so I can improve it :).

One consideration to take into account when writing code that consumes data entities is that we can receive a 429 (Too Many Requests) error at any time because of priority-based throttling, so make sure you implement a consistent retry pattern to deal with it.
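
A simple way to deal with it is to wrap every call in a small retry helper that honours the Retry-After header when the service sends it. The sketch below is just an illustration; the helper name and the back-off values are illustrative, not part of my helper or of any standard library:

```csharp
// Minimal retry sketch for 429 (Too Many Requests) responses.
// The name and retry policy are illustrative.
using System;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

public static class ThrottlingRetry
{
    public static async Task<HttpResponseMessage> SendWithRetryAsync(
        Func<Task<HttpResponseMessage>> send, int maxAttempts = 5)
    {
        for (int attempt = 1; ; attempt++)
        {
            HttpResponseMessage response = await send();

            if (response.StatusCode != (HttpStatusCode)429 || attempt >= maxAttempts)
                return response;

            // Honour the Retry-After header when the service provides it,
            // otherwise fall back to a simple exponential back-off.
            TimeSpan delay = response.Headers.RetryAfter?.Delta
                ?? TimeSpan.FromSeconds(Math.Pow(2, attempt));

            response.Dispose();
            await Task.Delay(delay);
        }
    }
}
```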

Without further ado, here is the generic code I have written so that it can be used with any of the entities that exist in the system, which, as you already know, are not few.

As you will see below, I have developed a method for each of the operations that we can perform. These are GET, POST, PATCH and DELETE.

Authentication

The first step to be able to consume data entities is to obtain an access token through Azure Active Directory. For that, we need an app registration in our tenant; you can follow the instructions I wrote in this post to create it.

Next, we generate a data contract, which allows us to easily interact with the response we get when obtaining the token.
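
Something like the following sketch is enough; only the fields we actually use are mapped, and the class name is illustrative:

```csharp
// Sketch of a data contract matching the Azure AD token endpoint response.
// Only the fields used by the helper are mapped; names are illustrative.
using System.Runtime.Serialization;

[DataContract]
public class AuthenticationResponse
{
    [DataMember(Name = "token_type")]
    public string TokenType { get; set; }

    [DataMember(Name = "expires_in")]
    public string ExpiresIn { get; set; }

    [DataMember(Name = "access_token")]
    public string AccessToken { get; set; }

    [DataMember(Name = "resource")]
    public string Resource { get; set; }
}
```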

Now we can see, in a simple way, the method we will use to obtain the access token (sketched right after the parameter list), where:
domain is your Azure AD tenant, for example, jatomas.com
clientId is the application (client) ID of the App Registration
clientSecret is the secret generated within the App Registration
resource is your F&O instance URL (without the trailing slash ‘/’)
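
A simplified sketch of such a method could look like this; it calls the Azure AD (v1) token endpoint with the client credentials flow, reuses the AuthenticationResponse contract above and assumes Newtonsoft.Json for deserialization (swap in your preferred JSON library if needed):

```csharp
// Sketch of the token request against the Azure AD (v1) token endpoint.
// Assumes the AuthenticationResponse contract above and Newtonsoft.Json.
using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;
using Newtonsoft.Json;

public static class Authentication
{
    private static readonly HttpClient client = new HttpClient();

    public static async Task<string> GetAccessToken(
        string domain, string clientId, string clientSecret, string resource)
    {
        // Client credentials flow: the app authenticates with its own secret.
        var body = new FormUrlEncodedContent(new Dictionary<string, string>
        {
            ["grant_type"] = "client_credentials",
            ["client_id"] = clientId,
            ["client_secret"] = clientSecret,
            ["resource"] = resource
        });

        HttpResponseMessage response = await client.PostAsync(
            $"https://login.microsoftonline.com/{domain}/oauth2/token", body);

        response.EnsureSuccessStatusCode();

        string json = await response.Content.ReadAsStringAsync();
        var token = JsonConvert.DeserializeObject<AuthenticationResponse>(json);

        return token.AccessToken;
    }
}
```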

Once we have the access token, we can continue with the operation we need. Below you will find the method we will use for each of these operations.
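
All of them live in a small helper class built around HttpClient. The skeleton below is a simplified sketch (the class and member names are illustrative); the GetEntity, InsertEntity, UpdateEntity and DeleteEntity methods shown in the next sections extend this same class:

```csharp
// Skeleton of the helper; the operation methods in the following sections
// are further parts of this same (partial) class. Names are illustrative.
using System.Net.Http;
using System.Net.Http.Headers;

public partial class DataEntityClient
{
    private readonly HttpClient client;
    private readonly string resource;   // F&O instance URL, without the trailing '/'

    public DataEntityClient(string resource, string accessToken)
    {
        this.resource = resource;

        client = new HttpClient();
        client.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", accessToken);
        client.DefaultRequestHeaders.Accept.Add(
            new MediaTypeWithQualityHeaderValue("application/json"));
    }
}
```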

GET (Select)

Use of method GetEntity
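
A simplified sketch of what GetEntity can look like (the entity and key used in the comment are just an example):

```csharp
// Part of the DataEntityClient sketched above: GET a single record by entity key,
// e.g. GET {resource}/data/CustomersV3(dataAreaId='usmf',CustomerAccount='US-001')
using System.Net.Http;
using System.Threading.Tasks;

public partial class DataEntityClient
{
    public async Task<string> GetEntity(string entityName, string entityKey)
    {
        HttpResponseMessage response = await client.GetAsync(
            $"{resource}/data/{entityName}({entityKey})");

        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsStringAsync();
    }
}
```

A hypothetical call, using the standard CustomersV3 entity as an example, would be: await client.GetEntity("CustomersV3", "dataAreaId='usmf',CustomerAccount='US-001'").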

In the case of GET, note that there are different ways to call it. The one I am using (https://fnourl.com/data/DataEntity(EntityKey='Value')) is meant for retrieving a specific record through its entity key; it is the closest thing to the find methods we use in X++. We also have the option of retrieving several records, filtering by whatever fields we want depending on the information we have, following this nomenclature: https://fnourl.com/data/DataEntity?$filter=Field1 eq 'Value' and Field2 eq 'Value'. The difference when reading the response is that, instead of a single object, we get an array with all the records that meet the conditions of the $filter.
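
For that second style of call, a hypothetical variant of the method (GetEntities is just my naming for this illustration, it is not part of the helper above) could look like this; the JSON response wraps the matching records in a "value" array:

```csharp
// Hypothetical variant: reads every record matching an OData $filter expression.
using System;
using System.Net.Http;
using System.Threading.Tasks;

public partial class DataEntityClient
{
    public async Task<string> GetEntities(string entityName, string filter)
    {
        HttpResponseMessage response = await client.GetAsync(
            $"{resource}/data/{entityName}?$filter={Uri.EscapeDataString(filter)}");

        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsStringAsync();
    }
}
```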

POST (Insert)

Use of method InsertEntity
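
A simplified sketch of what InsertEntity can look like; the new record travels as a JSON payload in the body of the POST:

```csharp
// Part of the DataEntityClient sketched above: POST a new record.
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

public partial class DataEntityClient
{
    public async Task<string> InsertEntity(string entityName, string jsonPayload)
    {
        var content = new StringContent(jsonPayload, Encoding.UTF8, "application/json");

        HttpResponseMessage response = await client.PostAsync(
            $"{resource}/data/{entityName}", content);

        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsStringAsync();
    }
}
```

A hypothetical call: await client.InsertEntity("CustomersV3", jsonPayload), where jsonPayload contains the fields of the record to create.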

PATCH (Update)

Use of method UpdateEntity
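
A simplified sketch of what UpdateEntity can look like. The PATCH request is built by hand through SendAsync, so the same code also works on runtimes where HttpClient does not offer PatchAsync:

```csharp
// Part of the DataEntityClient sketched above: PATCH an existing record by key.
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

public partial class DataEntityClient
{
    public async Task UpdateEntity(string entityName, string entityKey, string jsonPayload)
    {
        var request = new HttpRequestMessage(
            new HttpMethod("PATCH"), $"{resource}/data/{entityName}({entityKey})")
        {
            Content = new StringContent(jsonPayload, Encoding.UTF8, "application/json")
        };

        HttpResponseMessage response = await client.SendAsync(request);
        response.EnsureSuccessStatusCode();
    }
}
```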

DELETE (Delete)

Use of method DeleteEntity
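
And a simplified sketch of what DeleteEntity can look like; it only needs the entity name and its key:

```csharp
// Part of the DataEntityClient sketched above: DELETE a record by its entity key.
using System.Net.Http;
using System.Threading.Tasks;

public partial class DataEntityClient
{
    public async Task DeleteEntity(string entityName, string entityKey)
    {
        HttpResponseMessage response = await client.DeleteAsync(
            $"{resource}/data/{entityName}({entityKey})");

        response.EnsureSuccessStatusCode();
    }
}
```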

Conclusion

That is all for today's post. I hope you find it useful and, as I said at the beginning, I will be happy to read any questions, (constructive) criticism or suggestions for improvement in the comments. Regards!

Comments

  1. This is a good piece of work! Thank you so much.
    I have a question about a situation where you have to read data from a third-party system into D365FO. In that scenario, the third party doesn’t push information to D365FO, but rather D365FO is the one that consumes information from the third-party application.
    How do you approach this scenario?
    Thank you for your reply in advance.

    1. Hi! Thanks for reading and commenting!
      Regarding your question, it really depends on the specific case. Could you give me more details about your scenario? Do you have to consume a REST API? Do you have to get files from an SFTP?… I mean, how does this third party expose the data you need to consume?

      Regards!!

      1. Hi Tomas,

        To my surprise I found myself reading the question I posted to you a year ago, lol. In the case I was referring to, you would be consuming a REST API, i.e. getting data and feeding it into D365.

        1. Hi again Huggins! 🙂
          In this case, it also depends hahaha.
          For example, one valid approach, if the volume of data is high and, after getting the data, you have to process it and “do things” in D365, like posting invoices, creating projects, etc., could be to consume this REST API, save the data in a custom table and, after that, create a batch process that reads the data and processes it. It will be more efficient and performant, and you can create multiple tasks within the batch process to create the stuff in parallel.

          Thanks for commenting again!

  2. Thank you for the nice post.
    I assume HTTP GET returns only the top 10,000 records. In order to fetch the remaining records we need to use the next-link property from the current GET output. I assume the GET code above does not handle this; maybe you can extend the existing code to cover it.

    1. Hello Ajay Kumar,
      Thanks for commenting. In this case it is not necessary, because I am using the GET method to get a specific record. You can see that I am using the “entity key”, i.e. the primary key, to get only one record.

      But of course you are absolutely right, if I had to loop through a large number of records, the next-link property should be considered!
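
      Something along these lines could do it (an untested sketch on top of the helper from the post; it assumes Newtonsoft.Json for parsing the pages):

```csharp
// Untested sketch: follow @odata.nextLink until there are no more pages.
// Builds on the DataEntityClient sketched in the post; names are illustrative.
using System.Net.Http;
using System.Threading.Tasks;
using Newtonsoft.Json.Linq;

public partial class DataEntityClient
{
    public async Task<JArray> GetAllPages(string firstUrl)
    {
        var records = new JArray();
        string url = firstUrl;

        while (!string.IsNullOrEmpty(url))
        {
            HttpResponseMessage response = await client.GetAsync(url);
            response.EnsureSuccessStatusCode();

            var page = JObject.Parse(await response.Content.ReadAsStringAsync());

            foreach (var record in (JArray)page["value"])
                records.Add(record);

            // The service returns @odata.nextLink while more pages are available.
            url = (string)page["@odata.nextLink"];
        }

        return records;
    }
}
```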
