Configure CSV upload for your Databricks connection

CSV upload for Databricks is configured for each connection individually. Before configuring it, you must have a pre-defined schema and database in Databricks. When you upload a CSV file, the data is held in memory and then pushed to Databricks; no uploaded CSV data is stored in ThoughtSpot.

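The destination database and schema are ordinary Databricks objects; if they do not exist yet, you can create them in Databricks before you configure the connection. The following is a minimal sketch run from a Databricks notebook (where the `spark` session is predefined); it assumes a Unity Catalog workspace where the destination database corresponds to a catalog, and the names used are placeholders for whatever you plan to select as the upload destination.

```python
# Minimal sketch: pre-create the destination database (catalog) and schema in Databricks.
# Assumes a Databricks notebook, where `spark` is predefined, and a Unity Catalog workspace.
# "csv_uploads" and "ts_uploads" are placeholder names -- substitute your own.
spark.sql("CREATE CATALOG IF NOT EXISTS csv_uploads")            # destination database
spark.sql("CREATE SCHEMA IF NOT EXISTS csv_uploads.ts_uploads")  # destination schema
```
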
To enable CSV upload for Databricks on your cluster, contact your administrator. After the feature is enabled, your administrator can configure CSV upload for each Databricks connection.

Configuring CSV upload for your Databricks connection

To configure CSV upload for your Databricks connection:

  1. Select Data in the top navigation bar.

  2. Open the Connections tab in the left navigation bar.

  3. Select the name of the Databricks connection for which you want to enable CSV upload.

  4. Click the More menu icon for the connection, and select Configure data upload.

  5. In the Configure data upload window, select the toggle for Enable data upload to enable the feature.

  6. From the Select destination database menu, select the Databricks database where you want to store the uploaded CSV data.

  7. From the Select destination schema menu, select the schema where you want to store the uploaded CSV data.

  8. Select Save.
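
After you save the configuration, CSVs uploaded through this connection are pushed into the destination database and schema you selected. If you want to confirm from the Databricks side that an uploaded table arrived, a sketch along the following lines, using the databricks-sql-connector Python package, lists the tables in that schema; the hostname, HTTP path, access token, and the catalog and schema names are placeholders for your own workspace.

```python
# Minimal sketch using the databricks-sql-connector package (pip install databricks-sql-connector).
# All connection values and the catalog/schema names below are placeholders.
from databricks import sql

with sql.connect(
    server_hostname="<your-workspace>.cloud.databricks.com",
    http_path="/sql/1.0/warehouses/<warehouse-id>",
    access_token="<personal-access-token>",
) as connection:
    with connection.cursor() as cursor:
        # List tables in the destination database.schema selected for CSV upload.
        cursor.execute("SHOW TABLES IN csv_uploads.ts_uploads")
        for row in cursor.fetchall():
            print(row)
```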


