Employee Work Assignment
Depending on how Dayforce is configured for your organization, employees may have multiple work assignments. As an employee's assignments change, those changes are tracked by DLH.io.
However, DLH.io does not update the is_active or is_deleted value in the raw landing zone to designate which assignment is the active one for the employee. While a Dayforce administrator can see that a previous assignment for an employee has ended, there is no actual active status in Dayforce, only a Date of Assignment Completion.
The same is true when the data lands in your target destination. Because business logic differs from one Dayforce customer to another, depending on industry and general people-management practices, DLH.io does not assume an active or inactive assignment status; that determination is left to the customer.
If determining the active assignment per employee is required, create a Transformation in DataLakeHouse.io that sets the __DLH_IS_ACTIVE flag to TRUE or FALSE according to your business logic requirements. The DLH.io support team can assist with this exercise for your implementation, or it can be accomplished with mid-level SQL skill by following the general advice below.
To identify an active employee assignment, these basic steps act as a general exercise: query the data, then determine whether the business logic meets the business requirement. If not, additional tuning or assistance from support will be required.
Using the SQL logic here, determine whether it returns the current active work assignment status of your employees. If so, move to the steps below to create a Post-SQL update in the system:
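The exact query depends on your schema, but a sketch along these lines can propose an active/inactive status per assignment. The column names EMPLOYEE_ID, EFFECTIVE_START, and EFFECTIVE_END are illustrative assumptions only; substitute the actual employee identifier and assignment-completion-date columns in your landed EMPLOYEE_WORK_ASSIGNMENT table.

```sql
-- Illustrative sketch: EMPLOYEE_ID, EFFECTIVE_START, and EFFECTIVE_END are
-- assumed column names; map them to your actual landed columns.
SELECT
    EMPLOYEE_ID,
    EFFECTIVE_START,
    EFFECTIVE_END,
    CASE
        -- An assignment with no completion date, or a future one,
        -- is treated here as active.
        WHEN EFFECTIVE_END IS NULL OR EFFECTIVE_END >= CURRENT_DATE
        THEN TRUE
        ELSE FALSE
    END AS PROPOSED_IS_ACTIVE
FROM EMPLOYEE_WORK_ASSIGNMENT
ORDER BY EMPLOYEE_ID, EFFECTIVE_START;
```

If an employee can hold several open assignments at once, you may need additional tie-breaking logic (for example, preferring the most recent EFFECTIVE_START).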
- Select your target connection
- Select the option for SQL (or POST-SQL)
- Enter a SQL (POST-SQL) update statement such as the one below.
- NOTE: Please test this Post-SQL logic before adding it to the transformation. You MUST run it in a SQL editor and verify the results before simply adding it to the transformation.
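As a hedged illustration of what such a Post-SQL statement might look like (again assuming a completion-date column named EFFECTIVE_END; verify every table and column name against your target schema before use):

```sql
-- Sketch only: adapt table and column names to your target schema.
-- Mark completed assignments as inactive...
UPDATE EMPLOYEE_WORK_ASSIGNMENT
SET __DLH_IS_ACTIVE = FALSE
WHERE EFFECTIVE_END IS NOT NULL
  AND EFFECTIVE_END < CURRENT_DATE;

-- ...and open (or future-dated) assignments as active.
UPDATE EMPLOYEE_WORK_ASSIGNMENT
SET __DLH_IS_ACTIVE = TRUE
WHERE EFFECTIVE_END IS NULL
   OR EFFECTIVE_END >= CURRENT_DATE;
```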
After you've entered the Post-SQL logic:
- Complete the remainder of the required fields in the form
- Save the transformation form
The next time the sync bridge runs and completes a load for the EMPLOYEE_WORK_ASSIGNMENT table, your transformation's update statement will run prior to loading the data into your target schema.
Ceridian Dayforce allows certain users to delete Employee Work Assignments. While deleted records remain visible in the Work Audit screen in Dayforce, they are no longer provided by the Dayforce API.
As a result, records already replicated to your Target Connection may never be marked IS_DELETED, which leads to incorrect data in downstream applications and/or data models.
Employee Work Assignments should never be deleted in Dayforce; the best practice is to set an End Date on the Work Assignment. But if records are deleted, DataLakeHouse.io still has you covered!
Use the following process in DataLakeHouse.io: when a work assignment is deleted in Dayforce, DLH.io will mark the record as deleted and no longer active for analytics, but will keep the record in your Target Connection:
To identify an employee work assignment that has been deleted, these basic steps act as a general exercise: query the data, then determine whether the business logic meets the business requirement. If not, additional tuning or assistance from support will be required.
If you've been running the Ceridian Dayforce connector in DLH.io for a while and you're unsure how many records have been deleted in Ceridian Dayforce, run a Historical Re-sync. Once the Sync Bridge has completed the historical sync, use the SQL logic here to determine how many records should be marked IS_DELETED = True. If there are such records, move to the steps below to create a Post-SQL update in DataLakeHouse.io:
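The referenced SQL is not reproduced here, but a sketch of the idea follows. It assumes a DLH.io sync metadata timestamp column, called __DLH_LOAD_TIMESTAMP purely for illustration, that is refreshed on every row returned by the sync; rows the historical re-sync did not touch are candidates for deletion. Verify the actual metadata column name in your landing schema before relying on this.

```sql
-- Illustrative: __DLH_LOAD_TIMESTAMP is an assumed metadata column name.
-- Rows not refreshed by the latest historical re-sync were likely deleted
-- in Dayforce and are candidates for IS_DELETED = TRUE.
SELECT COUNT(*) AS CANDIDATE_DELETED_ROWS
FROM EMPLOYEE_WORK_ASSIGNMENT
WHERE IS_DELETED = FALSE
  AND __DLH_LOAD_TIMESTAMP < (
        SELECT MAX(__DLH_LOAD_TIMESTAMP)
        FROM EMPLOYEE_WORK_ASSIGNMENT
      );
```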
- Select your target connection
- Select the option for SQL (or POST-SQL)
- Enter a SQL (POST-SQL) update statement such as the one below.
- NOTE: Please test this Post-SQL logic before adding it to the transformation. You MUST run it in a SQL editor and verify the results before simply adding it to the transformation.
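Continuing the same illustrative assumptions (__DLH_LOAD_TIMESTAMP as a hypothetical sync metadata column; confirm the real names first), the Post-SQL update might look like:

```sql
-- Sketch only: confirm column names and test in a SQL editor first.
-- Flag rows the latest historical re-sync did not refresh as deleted
-- and no longer active.
UPDATE EMPLOYEE_WORK_ASSIGNMENT
SET IS_DELETED = TRUE,
    __DLH_IS_ACTIVE = FALSE
WHERE __DLH_LOAD_TIMESTAMP < (
        SELECT MAX(__DLH_LOAD_TIMESTAMP)
        FROM EMPLOYEE_WORK_ASSIGNMENT
      );
```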
After you've entered the Post-SQL logic:
- Complete the remainder of the required fields in the form
- Save the Transformation form
The next time the sync bridge runs and completes a historical load for the EMPLOYEE_WORK_ASSIGNMENT table, your transformation's update statement will run prior to loading the data into your target schema.