Thursday, November 25, 2021

Create an Inbound Interface process using HCM Extract - Automated process to upload an HDL file (Part-2)

 We will create only a dummy HCM Extract here, just to use the 'Initiate HCM Data Loader' feature, since the data extraction and transformation are already done in our BI report.

> My Client Group > Data Exchange > HCM Extracts > Extract Definitions 

Use the + sign to create a new extract of type 'Inbound Interface'


In the Parameters section, add a parameter called Auto Load (Tag Name = Auto_Load) with Data Type "Text" and Default Value "Y".
>> In newer releases this parameter already exists but is hidden. Click on 'Show hidden parameters', update the Default Value to "Y" and Save.



For the Data Group I have taken "PER_EXT_ASG_STATUS_DETAILS_UE" as the User Entity, as it returns a very small row count.


Save
In the Data Group filter criteria, I have put the condition "1=2" so that it doesn't return any data.
Edit > Put the condition 1=2 using 'Add Constant' and 'Ok'
Save
In the Data Group create a Data Record and then a dummy attribute called "Dummy Field" with Data Type "Text", Type "String" and the string value passed as "Demo".
Dummy Record
Save. It will generate a fast formula, and then create the dummy attribute.


Now for the Extract Delivery Option, you can pass the details as below :
    Start Date               : 01/01/1901
    End Date                 : 12/31/4712
    Delivery Option Name     : HDLDemoOutput
    Output Type              : Text
    Report                   : <pass the BI Report path here ; 
                           Ex: /Custom/UoE Data Migration/UserCatHDLReport/HDLCallDemoRpt.xdo
    Template Name    : <pass the eText template name here;
                           Ex: HDLCallDemo
    Output Name              : HDLCallDemoOpt
    Delivery Type            : Inbound Interface
    Required/Bursting Node   : ticked


In the "Additional Details" section please pass the details as below :
   Encryption Mode          : None
   Override File Extension  : .dat
   Run Time File Name       : User
   Integration Name         : UserCategoryUpdate
   Integration Type         : Data Loader
   Integration Parameters   : blank
   Key                      : blank
   Locale                   : blank
   Time Zone                : blank
   Compress                 : Yes Compress
   Compressed Delivery Group: UserCatUpdate.zip
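
With these delivery settings the extract output is delivered for HDL processing as a single file named from the Run Time File Name plus the Override File Extension, wrapped inside the Compressed Delivery Group, roughly as below (HDL expects the file to be named after the business object, here User):

   UserCatUpdate.zip
       User.dat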

** Integration Name : the first 8 characters will be used in the content ID (Data Set Name) during the HDL call for this extract.
The content ID follows the pattern DL_<first 8 characters of the Integration Name in upper case>_<Process ID>
              EX: DL_USERCATE_1092672
Save

Now navigate to Home > My Client Groups > Data Exchange > HCM Extracts > Refine Extracts
Search for the dummy HCM Extract which has been created. Select the HCM Extract and click on the "Edit" button.

Once the HCM Extract is opened, in the "Tasks" tab, under Flow Task the HCM Extract name will be present. Now click on the "Actions" button and select the option "Select and Add".


A search popup will appear; search for the task "Initiate HCM Data Loader" (Generate HCM Data Loader file and optionally perform a data load). Select the task and click on the "Done" button.
Save
Now select the task "Initiate HCM Data Loader" and click on the "Go to Task" button.
             Two options will be there:
                    a> Data Loader Archive Action
                    b> Data Loader Configuration

Select the 1st option "Data Loader Archive Action". Click on the "Edit" button.
In the "Parameter Basis" dropdown, select the option "Bind to Flow Task".
In the "Basis Value" dropdown, select the option "Extract Name , Submit , Payroll Process"
                                              Ex: HDLCallDemo, Submit, Payroll Process

Now select the 2nd option "Data Loader Configuration". Click on the "Edit" button.
In the "Parameter Basis" dropdown, select the option "Constant Bind".
In the "Basis Value" multiline text box, please provide the below details : ImportMaximumErrors=100,LoadMaximumErrors=100,LoadConcurrentThreads=8,LoadGroupSize=100

Now click on the "Next" button and then click on the "Submit" button. You might get an error asking you to compile the fast formula; please do so. You can ignore the warnings.

It's time now to submit the HCM Extract and check the Inbound Interface process. 

HCM Extracts > Submit Extract


You can check the extract run results under
HCM Data Extracts > View Extract Results


In case of any error, you can download the report for the stuck process from the Instance Details and check the error details.



Now check if it has triggered the HDL load or not

HCM Data Loader > Import and Load Data

Click on 'Show Filters', remove your User Name from the filters and search, as the data set will not be submitted under your user. You can see the data set that has been called; the content ID / Data Set Name convention is explained above.


Import and Load both are successful, so it's all good at the end :)

Thursday, October 28, 2021

Create an Inbound Interface process using HCM Extract - Automated process to upload an HDL file (Part-1)

There are requirements where you need to rectify an existing record in Fusion. Today's topic is to create a user-friendly automated process to extract the existing record, transform it, and load it back using HDL with a single click.

So to achieve this we would be creating an inbound interface process using HCM extract and BI report.

I am sharing step by step process with a simple example here which I had created with the help of my good friend Pavan.

Let's start: we will update the User Category of a person record.

1> Create a BI report to extract and transform the existing data into HDL format

     > Prepare an extract query

SELECT '1' KEY
      ,'MERGE|User|'||papf.person_number||'|EMPLOYEE_USER' mdata
  FROM fusion.per_users pu,
       per_all_people_f papf
 WHERE pu.person_id IS NOT NULL
   AND pu.active_flag = 'Y'
   AND papf.person_id = pu.person_id
   AND (    (trunc(sysdate) BETWEEN papf.effective_start_date AND papf.effective_end_date)
         OR papf.effective_start_date >= trunc(sysdate) )
   AND papf.person_number IS NOT NULL
   AND papf.person_number = '169980'
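
With the above query (and the template built on it), the generated HDL file content for this example should look roughly like the lines below. The METADATA line and the attribute names shown here are assumptions for illustration only; check the User business object's HDL template for the exact attributes your template needs to emit.

METADATA|User|PersonNumber|UserCategory
MERGE|User|169980|EMPLOYEE_USER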

> Now create a Data Model

             >  Tools > Reports and Analytics > Browse Catalog > New > Data Model


                  Data > View > Export > It will create an XML output


   > Based on the above XML data now create an RTF template 



 > Save the DM under the Custom folder and create a report using this RTF template; make sure your report is referring to the correct DM

                Please note down the
                Template Name : HDLCallDemo
                Report Path   : /Custom/HDL Call Demo/UserCatHDLReport/HDLCallDemoRpt.xdo

           > Now create bursting with the report template name
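
               For reference, the bursting control query typically splits and delivers by the KEY column selected in the data model query and points at the template noted above. Below is a minimal sketch; the OUTPUT_FORMAT code and DEL_CHANNEL value are assumptions here, so adjust them to your own report setup.

               SELECT '1'              KEY
                     ,'HDLCallDemo'    TEMPLATE
                     ,'en-US'          LOCALE
                     ,'ETEXT'          OUTPUT_FORMAT
                     ,'null'           DEL_CHANNEL
                     ,'HDLCallDemoOpt' OUTPUT_NAME
                 FROM dual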

           > And add an additional permission  

               Search for 'Enterprise Scheduler Job Application Identity for HCM' under All and add


          > Now check if report is returning the data


2> Let's move on to creating an Extract (Part-2)

Tuesday, October 19, 2021

HDL: End date a record/Alter the Effective Start date through HDL

 If you just pass the EffectiveEndDate field in an HDL load to end date a component, it will only create a date-track entry.

To end date a record completely, you should also pass the ReplaceLastEffectiveEndDate field with the value 'Y'.

Similarly, to alter the EffectiveStartDate, you need to pass the ReplaceFirstEffectiveStartDate field with the value 'Y'.
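
As an illustration (a hypothetical sketch, not from an actual load; attribute names vary by business object, so check the relevant HDL template), end-dating an element entry completely could look like the lines below. The ReplaceFirstEffectiveStartDate variant works the same way for the start date.

METADATA|ElementEntry|AssignmentNumber|ElementName|LegislativeDataGroupName|EffectiveStartDate|EffectiveEndDate|ReplaceLastEffectiveEndDate
MERGE|ElementEntry|E169980|Car Allowance|UK Legislative Data Group|2021/01/01|2021/10/31|Y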