To set the destination credentials, navigate to the Settings tab in the Bold Data Hub. Bold Data Hub supports eight different destinations:
1. Bold IMDB Datastore
2. PostgreSQL
3. Apache Doris
4. SQL Server
5. MySQL
6. Google BigQuery
7. Snowflake
8. Oracle
We can configure multiple data store destinations with the same server type and load data into them. This is common in scenarios where we might have multiple databases of the same type (for example, multiple MySQL or PostgreSQL databases) for different environments like development, testing, staging, or production, or for different segments of business operations.
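As a sketch of this multi-destination setup, the snippet below models several destinations of the same server type, one per environment. The field names (`type`, `host`, `database`) are illustrative assumptions, not Bold Data Hub's internal configuration keys.

```python
# Illustrative sketch: several destinations sharing one server type, one per
# environment. Field names are assumptions; real values come from the
# Bold Data Hub settings form.
destinations = {
    "development": {"type": "PostgreSQL", "host": "dev-db.example.com", "database": "analytics"},
    "staging":     {"type": "PostgreSQL", "host": "stg-db.example.com", "database": "analytics"},
    "production":  {"type": "PostgreSQL", "host": "prod-db.example.com", "database": "analytics"},
}

# Every destination shares the same server type but points at a different host.
server_types = {cfg["type"] for cfg in destinations.values()}
print(server_types)  # {'PostgreSQL'}
```

Each entry would be saved as its own connection in the settings tab, so data can be loaded into any of them independently.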
Step 1: Click on Settings.
Step 2: Choose the Connection Type.
New: Choose this option if you are creating a new connection to a destination for which you have not previously created credentials.
Existing: Select this option if you are updating or modifying the credentials or settings of a connection that you have already set up.
Step 3: Choose the destination where you want to move the data.
Step 4: Enter the credentials for the respective destination.
Step 5: Click on Save to save the credentials for the Bold BI Data Store. If all the given credentials are valid, the “Datastore settings are saved successfully” message will appear near the Save button.
The Bold IMDB Datastore fetches the destination credentials from the Bold BI Data Store configuration.
The data will be moved based on the credentials given in the Bold BI Data Store.
Enter the following credentials for PostgreSQL:
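As a sketch of how the PostgreSQL fields fit together, the helper below assembles them into a standard PostgreSQL connection URI. The parameter names mirror the settings form and are assumptions, not Bold Data Hub's internal keys.

```python
# Sketch: combine the PostgreSQL credential fields into a connection URI.
# Parameter names are illustrative assumptions.
from urllib.parse import quote

def postgres_uri(host: str, port: int, database: str, username: str, password: str) -> str:
    # Percent-encode the username and password so special characters
    # (e.g. "@" or "/") do not break the URI.
    return (
        f"postgresql://{quote(username, safe='')}:{quote(password, safe='')}"
        f"@{host}:{port}/{database}"
    )

print(postgres_uri("db.example.com", 5432, "analytics", "admin", "p@ss/word"))
# postgresql://admin:p%40ss%2Fword@db.example.com:5432/analytics
```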
Enter the credentials for Apache Doris.
Click on Save to save the credentials. If all the given credentials are valid, the “Datastore settings are saved successfully” message will appear near the save button.
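For context on the Doris fields: Apache Doris frontends speak the MySQL wire protocol, so the credentials resemble a MySQL connection, with 9030 as the default FE query port. The helper below is an illustrative sketch; the field names are assumptions.

```python
# Sketch: Apache Doris FE nodes accept MySQL-protocol connections; the
# default query port is 9030. Field names here are illustrative assumptions.
def doris_dsn(host: str, username: str, database: str, port: int = 9030) -> str:
    return f"mysql://{username}@{host}:{port}/{database}"

print(doris_dsn("doris-fe.example.com", "root", "analytics"))
# mysql://root@doris-fe.example.com:9030/analytics
```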
Enter the following credentials for SQL Server:
Enter the following credentials for MySQL:
Enter the credentials for Snowflake:
The account URL takes the form `account.snowflakecomputing.com`.
Click on Save to save the credentials. If all the given credentials are valid, the “Datastore settings are saved successfully” message will appear near the save button.
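The Snowflake account URL is derived from the account identifier, e.g. `myorg-myaccount` becomes `myorg-myaccount.snowflakecomputing.com`. The helper below is a sketch of that normalization; the function name is ours, Snowflake only defines the URL format.

```python
# Sketch: normalize a Snowflake account identifier into the account URL
# expected by the settings form.
def snowflake_account_url(account_identifier: str) -> str:
    # Strip a scheme or trailing slash in case a full URL was pasted.
    ident = account_identifier.removeprefix("https://").rstrip("/")
    if ident.endswith(".snowflakecomputing.com"):
        return ident
    return f"{ident}.snowflakecomputing.com"

print(snowflake_account_url("myorg-myaccount"))
# myorg-myaccount.snowflakecomputing.com
```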
Enter the credentials for Oracle:
Click on Save to save the credentials. If all the given credentials are valid, the “Datastore settings are saved successfully” message will appear near the save button.
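For reference, Oracle clients commonly accept an EZConnect string of the form `host:port/service_name` (1521 is the default listener port). The helper below is an illustrative sketch; the actual field names in the settings form may differ.

```python
# Sketch: build an Oracle EZConnect string (host:port/service_name) from the
# credential fields. Field names are illustrative assumptions.
def oracle_ezconnect(host: str, port: int, service_name: str) -> str:
    return f"{host}:{port}/{service_name}"

print(oracle_ezconnect("ora.example.com", 1521, "ORCLPDB1"))
# ora.example.com:1521/ORCLPDB1
```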
Enter the credentials for Google BigQuery:
Click here to learn how to get the Google BigQuery service account JSON file.
Click on Save to save the credentials. If all the given credentials are valid, the “Datastore settings are saved successfully” message will appear near the save button.
You will then need to create a service account. After clicking the Go to Create service account button on the linked docs page, select the project you created and name the service account whatever you would like.

1. Click the Continue button and grant the following roles so that schemas can be created and data loaded:
   - BigQuery Data Editor
   - BigQuery Job User
   - BigQuery Read Session User

   You don’t need to grant users access to this service account now, so click the Done button.
2. Download the service account JSON. On the service accounts page that you are redirected to after clicking Done, select the three dots under the Actions column for the service account you created and choose Manage keys. This takes you to a page where you can click the Add key button, then the Create new key button, and finally the Create button, keeping the preselected JSON option.
A JSON file that includes your service account private key will then be downloaded.
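Before using the downloaded file, it can help to sanity-check it. The sketch below verifies that the standard fields of a Google Cloud service account key file are present; the helper name is ours.

```python
# Sketch: check a downloaded service account JSON for the standard key-file
# fields before using it as the BigQuery destination credential.
import json

REQUIRED_KEYS = {"type", "project_id", "private_key", "client_email"}

def missing_service_account_keys(raw_json: str) -> set:
    """Return the standard key-file fields absent from the JSON document."""
    data = json.loads(raw_json)
    return REQUIRED_KEYS - data.keys()

# A truncated document is flagged; a complete one returns an empty set.
sample = '{"type": "service_account", "project_id": "my-project"}'
print(sorted(missing_service_account_keys(sample)))
# ['client_email', 'private_key']
```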
Note: Currently, Bold Data Hub does not support creating a data source in Bold BI when using Google BigQuery as the data store.