In this article, we will demonstrate how to import tables from a CSV file, flag anomalies through transformations, and move the cleaned data into the destination database using Bold Data Hub. Follow the step-by-step process below.
Sample Data Source:
Sample CSC Data
Note: On-Demand Refresh will be triggered when the pipeline is saved. If needed, the pipeline can be scheduled in the Schedules tab.
Go to the Transform tab and click Add Table.
Enter the table name to create a transform table for customer satisfaction summary.
Note: The data is first loaded into the DuckDB database under the designated {pipeline_name} schema before being transformed and moved to the target database. For example, for a pipeline named “customer_service_data”, the data is loaded into the customer_service_data schema.
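As a quick illustration of the naming convention, the staged table can be queried in DuckDB with its schema-qualified name. Here, customer_service_data is the example pipeline name from the note above, and sample_csc_data is assumed to be the table created from the sample file:

```sql
-- Query the staged table inside the pipeline's DuckDB schema.
-- Schema name = pipeline name; table name = imported source table.
SELECT COUNT(*) FROM customer_service_data.sample_csc_data;
```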
Learn more about transformation here
Identifying anomalies in response and resolution times helps detect inefficiencies and potential service issues. Anomalies can also highlight customer dissatisfaction, requiring further investigation.
The query below flags two kinds of anomalies with statistical thresholds: a resolution time more than two standard deviations above the mean, and a customer satisfaction score below 2.
SELECT
    Ticket_ID,
    Customer_ID,
    Agent_ID,
    Resolution_Time,
    Customer_Satisfaction,
    CASE
        WHEN Resolution_Time > (
            SELECT AVG(Resolution_Time) + 2 * STDDEV(Resolution_Time)
            FROM {pipeline_name}.sample_csc_data
        ) THEN 'High Resolution Time'
        WHEN Customer_Satisfaction < 2 THEN 'Low Satisfaction'
        ELSE 'Normal'
    END AS Anomaly_Flag
FROM {pipeline_name}.sample_csc_data;
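The same mean-plus-two-standard-deviations rule can be sketched outside SQL to sanity-check the thresholds. The sketch below uses Python's standard `statistics` module with made-up sample values (the real values would come from the sample_csc_data table):

```python
import statistics

# Hypothetical resolution times (minutes) and satisfaction scores;
# illustrative values only, not taken from the sample dataset.
resolution_times = [30, 45, 38, 50, 42, 300, 35, 47, 40, 44]
satisfaction = [4, 5, 4, 3, 5, 1, 4, 2, 5, 4]

mean = statistics.mean(resolution_times)
stddev = statistics.stdev(resolution_times)  # sample standard deviation
threshold = mean + 2 * stddev

def flag(resolution_time, score):
    # Mirrors the CASE expression: resolution-time check first,
    # then the low-satisfaction check, otherwise 'Normal'.
    if resolution_time > threshold:
        return "High Resolution Time"
    if score < 2:
        return "Low Satisfaction"
    return "Normal"

flags = [flag(rt, cs) for rt, cs in zip(resolution_times, satisfaction)]
```

With these values, only the 300-minute ticket exceeds the threshold, so it is flagged even though its satisfaction score of 1 would also match the second rule; the CASE order decides which label wins.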