
Category Archives: Uncategorized

Unit of Measure Showing Blank/Null on POS in D365 Retail (Commerce)

Dynamics 365 Finance and Operations uses the unit of measure to calculate the quantity of a product that is purchased, sold, or stored in inventory. When you perform sales or purchases, it is important to select the correct unit being sold or purchased, as the cost and price of the item depend on it.

I recently came across an issue where a unit of measure that I had created in HQ and assigned to a product was showing blank/null on POS, even though the product could still be added to the cart. The following steps resolve this issue:

1. Open the unit that is showing blank on POS.
2. Check the description; it must be filled in.
3. Click on Translated unit description.
4. Check the language; it must match the store language.
5. After performing these steps, run the 1040, 1070, and 1090 jobs.
6. Refresh the POS.

You will now be able to see the unit of measure on POS. Hope this helps!


Automated Statement Posting in D365 Retail (Commerce)

The Retail statement functionality in D365F&O is the process that puts everything together and makes sure transactions from POS flow into D365F&O HQ. If you are using shift-based statements, a statement is calculated when the shift is closed. Using shift-based closing can be tricky, but I highly recommend doing it! After the statement is calculated and there are no issues, the statement is posted and an invoiced sales order is created.

1. Manually create a new "blank" batch job.
2. Click on "View tasks".
3. Add the following 4 classes:

RetailCDXScheduleRunner: Upload channel transactions (P-job)
RetailTransactionSalesTransMark_Multi: Post inventory
RetailEodStatementCalculateBatchScheduler: Calculate statement
RetailEodStatementPostBatchScheduler: Post statement

Here I chose to include uploading transactions, posting inventory, calculating the statement, and posting the statement in a single batch job.

4. Click on each task and, under the General tab, set "Ignore task failure" to Yes. Do this for every task in the job.
5. Click on "Parameters" to set the parameters on each task, such as which organization nodes should be included. Add this parameter to Post inventory, Calculate statement, and Post statement.
6. On each batch task, also add conditions so that the previous step must complete before the batch job starts the next one. Set the conditions on Post inventory, Calculate statement, and Post statement according to their sequence.
7. Click on "Recurrence" and set when the statement processing should run.

The benefit of this is that when you open the Statements workspace, you mostly see statements where there are cash differences or issues with master data. Statements will now be posted automatically at the times set in the recurrence. Hope this helps!


Configuring Scheduled Jobs – Part I

Introduction: In this blog, we will look at Scheduled Job configuration for data replication using the Data Director. Scheduled Jobs comprise two parts: the Job Header and the Sub-jobs. In the Job Header, we define parameters for the job such as error handling, To and From locations, compression types, scheduling details, and the sub-jobs. In the Sub-jobs, we define where to get the table schemas, the tables to replicate, the methods of replication, filters on the data to be replicated, linked tables, etc.

References:
LS Retail Data Director User Guide (ls-one.com)
Isolation Levels in SQL Server – SQLServerCentral
Distribution Sublocations, Scheduler Job Header (lsretail.com)
Distribution Restrictions, Scheduler Job Header (lsretail.com)

Pre-requisites:
Microsoft Dynamics 365 Business Central
LS Data Director

Configuration:

General:
Job ID: a unique identifier for this Scheduled Job.
Scheduler Job Type Code: a kind of category for this job; we can use this category as a filter when we configure NAS services.
Subjobs Defined By Job: specifies where the system is supposed to fetch sub-jobs from for this job. Generally it is the same as the Job ID, but LS allows you to create a job with its sub-jobs defined in another job.

Location Settings: In this tab, we specify where the data is supposed to come from and where it is supposed to go. There are multiple ways to configure this.

From multiple locations to a single location:
1. Set "From Dist. Restrictions" to "Include List".
2. Click on Navigate -> Jobs -> Sender Location List and add all the locations that you want to pull the data from.
3. Set "Distribution Restrictions" to "Single Location".
4. Set the location in the "To-Location Code" field.

From a single location to multiple locations: this is similar to the previous one, simply reversing where we set the values.
1. Set "From Dist. Restrictions" to "Single Location" and set the location.
2. Set "Distribution Sublocations" to "Included in Replic."; this field specifies whether data should also be sent to sublocations (POS terminals).
3. Set "Distribution Restrictions" to "Include List", go to Navigate -> Jobs -> "Receiver Locations Include/Exclude", and add all the locations you want to send the data to.

From multiple locations to multiple locations: simply set "Include List" on both sides and add all the locations that the data is supposed to come from and go to.

Schedule Details: Here we specify how often this job is supposed to run. You can schedule the job to run every day, hour, minute, or second as per your needs. In the above example, I have scheduled the job to run every 15 minutes, every day. Do note that you need NAS services configured for the jobs to run automatically.

Data Replication: Here we define the "Subjobs", which are, in essence, the tables to be replicated.

Note: In the DD Setup tab, there are advanced settings that can be used to control the compression type and SQL isolation levels. More information can be found in the aforementioned references.

Conclusion: Thus, we saw how to create a Scheduled Job Header. There is a lot more that can be done with this, such as using different codeunits for data replication, replicating objects, or simply running codeunits to automate tasks. Thanks for reading!


Enabling/Disabling Services in Azure DevOps

We can control which services are made available by turning a service on or off. To do that, you need an organization in Azure DevOps and must be the organization owner or a member of the Project Administrators group.

Step 1: Open Azure DevOps and go to Project Settings.
Step 2: In Project Settings, select Overview; under Azure DevOps Services we can enable or disable the services. When we turn off any service (a confirmation popup appears), that service will no longer be visible to project members.
Step 3: Now that we have turned off the Boards service, if we open the project and check, Boards will not be visible.

Hope this helps!


Connect Azure Databricks to Power BI

Open Power BI, click on Get Data, search for Azure Databricks, and click on Connect. It will ask for the following details:

Server Hostname
HTTP Path

Now we will see how to get these details:

1. Go to Azure Databricks and click on Clusters.
2. Once the cluster is opened, go to Advanced Options > JDBC/ODBC. Here you can find the Server Hostname and HTTP Path to use in the steps above.
3. Fill in the details and click on OK. It will ask for user credentials, and then a popup will appear asking you to select from a list of tables. Select the tables and click on Load.

In this way, we can create a Power BI report based on the current data received from Azure Databricks and build fields on top of it.
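If you want to sanity-check the same Server Hostname and HTTP Path outside Power BI, one option is the databricks-sql-connector Python package. Below is a minimal sketch; the hostname, HTTP path, token, and table name are placeholder values you would replace with your own from the JDBC/ODBC tab.

```python
# pip install databricks-sql-connector
from databricks import sql

# Placeholders: copy your own values from Clusters > Advanced Options > JDBC/ODBC.
SERVER_HOSTNAME = "adb-1234567890123456.7.azuredatabricks.net"
HTTP_PATH = "sql/protocolv1/o/1234567890123456/0123-456789-abcdefgh"
ACCESS_TOKEN = "<your-databricks-personal-access-token>"

with sql.connect(
    server_hostname=SERVER_HOSTNAME,
    http_path=HTTP_PATH,
    access_token=ACCESS_TOKEN,
) as connection:
    with connection.cursor() as cursor:
        # Any table the cluster can see works here; this name is hypothetical.
        cursor.execute("SELECT * FROM default.sales LIMIT 5")
        for row in cursor.fetchall():
            print(row)
```

If this script returns rows, the same details should work in the Power BI connector dialog.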


Load Azure DevOps Data into Power BI Using OData

In this blog, we will see how to fetch Azure DevOps data and build visualizations on it as per requirements.

Step 1: Go to Azure DevOps > User Settings > Personal Access Tokens.
Step 2: Once on the Personal Access Tokens page, click on New Token; a popup will appear. Enter the name, organization, and expiration, and specify the scope. A new popup will appear with the generated access token; copy it.
Step 3: Prepare the OData URL.

Sample OData URL: https://analytics.dev.azure.com/{OrganizationName}/_odata/{version}/

In our case, the organization is gvishwakarma and the version is v1.0, so the URL is https://analytics.dev.azure.com/gvishwakarma/_odata/v1.0/

Step 4: Open Power BI, click Get Data, and select OData Feed.

Once the data is loaded into Power BI, we can create visualizations as per the requirement. Hope this helps!
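To verify the feed or the PAT outside Power BI, the same Analytics OData URL can be queried with a short Python script. This is a minimal sketch assuming the organization from the post; the PAT is a placeholder, and the entity set and columns (WorkItems, WorkItemId, Title, State) are common Analytics fields you may need to adjust for your project.

```python
import requests

# Placeholders: substitute your own organization and personal access token.
ORG = "gvishwakarma"
PAT = "<your-personal-access-token>"
URL = f"https://analytics.dev.azure.com/{ORG}/_odata/v1.0/WorkItems"

# Azure DevOps accepts the PAT as the password in HTTP Basic auth;
# the username can be left empty.
response = requests.get(
    URL,
    auth=("", PAT),
    params={"$select": "WorkItemId,Title,State", "$top": 10},
)
response.raise_for_status()

for item in response.json()["value"]:
    print(item["WorkItemId"], item["State"], item["Title"])
```

The same $select/$top query options can also be appended to the URL you paste into Power BI's OData Feed dialog to trim the data before it loads.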


How to Modify Existing Report Layout in Business Central

Introduction: To modify a layout, you can remove fields or tables, or change the position of fields. There is an option to extend tables and pages, but there is no extension mechanism yet for changing an existing report; however, there is a way to change the report's layout.

Steps to achieve the goal:
1. Search for Report Layout Selection.
2. Select the report ID that you want to modify.
3. Go to Process -> Custom Layouts.
4. Make a copy of the layout file: select the New action, and it will ask whether to insert an RDLC or Word layout and create the file accordingly.
5. Go to the Layout action on the Custom Layouts page -> Export Layout, and make your changes in that file.
6. Import the modified layout using the Import Layout action. Make sure to also update the layout; Import Layout only imports your file, so you need to update it as well.
7. Go back to the Report Layout Selection page, set the selected layout of the report to Custom Layout, and select your modified file in the Custom Layout Description.
8. Test your new report via Reports -> Run Report.

Conclusion: Thus we saw how to change the layout of existing reports using Report Layout Selection. Thank you very much for reading my work. Hope you enjoyed the blog!


Power BI Preview Feature: Small Multiple

Step 1: Open Power BI Desktop and navigate to File > Options and settings > Options.
Step 2: A popup will appear; select Preview features, select the required feature, and click OK. Once you click OK, another popup will appear asking you to restart Power BI.
Step 3: Small multiples. This feature allows us to view chart data across different dimensions. In this case, we have considered sales data across different regions and dates.

In ideal scenarios, we view the data by only a single dimension. But when we want to view data across different dimensions, we can place a dimension in the Small multiples well. In the snapshot above, we can see sales data across different regions, month-wise. Hope this helps!


Ability to specify locations as "Shipping" or "Pickup" enabled within Fulfillment group in D365 Commerce (Retail)

In Commerce version 10.0.12 and later, organizations can define whether the warehouses, or warehouse-and-store combinations, defined in fulfillment groups can be used for shipping, for pickup, or for both. This gives the business added flexibility to determine which warehouses can be selected when creating a customer order for items to ship versus which stores can be selected when creating a customer order for items to pick up.

To use these configuration options, turn on the "Ability to specify locations as 'Shipping' or 'Pickup' enabled within Fulfillment group" feature from Feature management. If a warehouse that is linked to a fulfillment group isn't a store, it can be configured only as a shipping location; it can't be used when orders for pickup are created in POS.

If you enable this feature, it is more than likely that you will get an error message. To get past the error, the warehouse must be selected manually by clicking on the origin/ship-from location. This means the cashier has to manually select the shipping location every time an order for shipping is created. Hope this works for you!


Handling Pagination in Logic App for JSON payloads having Linked

This blog will guide you through how paginated data from APIs can be handled and processed in Azure Logic Apps.
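The post itself walks through the Logic App steps; as a language-agnostic illustration of the underlying pattern (the same loop a Logic App "Until" action implements), here is a minimal Python sketch of following a next-link field across pages. The endpoint URL is hypothetical, and the "@odata.nextLink" field name is an assumption based on OData-style APIs; other APIs may call it "nextLink" or "next".

```python
import requests

# Hypothetical endpoint; replace with the paginated API you are calling.
url = "https://example.com/api/items"
all_items = []

# Keep requesting pages until the payload stops returning a next link.
while url:
    response = requests.get(url)
    response.raise_for_status()
    payload = response.json()
    all_items.extend(payload.get("value", []))
    # "@odata.nextLink" is absent on the last page, so url becomes None
    # and the loop ends.
    url = payload.get("@odata.nextLink")

print(f"Fetched {len(all_items)} items across all pages")
```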
