
Generating Time Intervals and Transforming Data Using Bold Data Hub

In this article, we will demonstrate how to import data from a CSV file, generate time intervals through transformations, and migrate the cleaned data into the destination database using Bold Data Hub. Follow the step-by-step process below.

Sample Data Source:
Sample CSC Data


Step-by-Step Process in Bold Data Hub

Step 1: Open Bold Data Hub

  • Click on Bold Data Hub.

Transformation Use Case

Step 2: Create a New Pipeline

  • Click Add Pipeline in the left-side panel.
  • Enter the pipeline name and click the tick icon.

Transformation Use Case

Step 3: Choose the Connector

  • Select the newly created pipeline and choose the CSV connector. You can either double-click the connector or click the Add Template option to add a template.

Transformation Use Case

Step 4: Upload Your CSV File

  • Click the “Upload File” button to select and upload your CSV file.

Transformation Use Case

Step 5: Set the Properties

  • Copy the file path and paste it into the filePath property field.

Transformation Use Case

Step 6: Save and Choose the Destination

  • Click Save, choose the destination, and confirm by clicking the Yes button.

Transformation Use Case

Note: On-Demand Refresh will be triggered when the pipeline is saved. If needed, the pipeline can be scheduled in the Schedules tab.

Step 7: View Logs and Outputs

  • Click the pipeline name in the left-side panel and switch to the Logs tab to view logs.

Transformation Use Case

Step 8: Apply Transformations

  • Go to the Transform tab and click Add Table.

  • Enter the table name to create a transform table for the response time summary.

Transformation Use Case

Note: The data will initially be loaded into the DuckDB database under the designated {pipeline_name} schema before being transformed and moved into the target database. For example, for a pipeline named “customer_service_data”, the data will be loaded into the customer_service_data schema.
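
Once the load has finished, the staged table can be previewed directly from the Transform tab. The query below is a minimal sketch; it assumes the staged table is named sample_csc_data after the uploaded file (as in the transformation query later in this article), with {pipeline_name} standing in for your pipeline's schema name.

-- Preview the first few rows of the staged table in the pipeline schema
SELECT *
FROM {pipeline_name}.sample_csc_data
LIMIT 10;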


Learn more about transformation here

Generating Time Intervals

Overview

Categorizing service request response times helps in performance analysis and identifying efficiency gaps. We classify response times into predefined thresholds:

  • “Fast” → Resolved within 1 day
  • “Medium” → Resolved within 3 days
  • “Slow” → Resolved after 3 days

Approach

We use a CASE statement to categorize response times based on the difference between ticket creation and resolution dates.

SQL Query for Generating Time Intervals

SELECT
    Ticket_ID,
    Ticket_Creation_Date,
    Ticket_Resolution_Date,
    -- Subtracting one DATE from another in DuckDB returns the difference in whole days
    CASE
        WHEN (CAST(Ticket_Resolution_Date AS DATE) - CAST(Ticket_Creation_Date AS DATE)) <= 1 THEN 'Fast'
        WHEN (CAST(Ticket_Resolution_Date AS DATE) - CAST(Ticket_Creation_Date AS DATE)) <= 3 THEN 'Medium'
        ELSE 'Slow'
    END AS Response_Time_Category
FROM {pipeline_name}.sample_csc_data;

Transformation Use Case
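
Once the categories are available, they can be rolled up for reporting. The following query is an optional follow-up sketch, not part of the walkthrough above; it assumes the same {pipeline_name}.sample_csc_data table and column names and counts the tickets that fall into each response-time bucket.

-- Count tickets per response-time category (DuckDB allows grouping by the column alias)
SELECT
    CASE
        WHEN (CAST(Ticket_Resolution_Date AS DATE) - CAST(Ticket_Creation_Date AS DATE)) <= 1 THEN 'Fast'
        WHEN (CAST(Ticket_Resolution_Date AS DATE) - CAST(Ticket_Creation_Date AS DATE)) <= 3 THEN 'Medium'
        ELSE 'Slow'
    END AS Response_Time_Category,
    COUNT(*) AS Ticket_Count
FROM {pipeline_name}.sample_csc_data
GROUP BY Response_Time_Category
ORDER BY Ticket_Count DESC;

Grouping by the categorized value rather than the raw date difference keeps the result compact and easy to chart or to load into the destination database as a summary table.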