DataFlow requirements and guidelines

Here are some requirements and guidelines for working with DataFlow.

Requirements

Your organization must have the following setup:

Access to ThoughtSpot DataFlow: navigate to the Data tab, select Utilities, and click Open DataFlow.

Minimum disk space allocation of approximately 5 GB in the /etc/thoughtspot/ directory.

Guidelines

Temporary storage

Some data sources, such as Snowflake, Amazon S3, Google Cloud Storage, and Azure Blob Storage, temporarily store data as local files before loading into the internal ThoughtSpot database.

These sources require additional disk space, depending on how much data is in the source.

DataFlow TQL editor

The DataFlow TQL editor supports the following SQL commands. For details, see the TQL reference.

ALTER TABLE
  • rename, add, and remove columns
  • modify column data types
  • add and remove primary and foreign keys
CREATE DATABASE
CREATE SCHEMA
DELETE FROM <table> [WHERE …]
UPDATE <table> … SET … [WHERE …]
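As an illustration, the commands above might be combined in the TQL editor as follows. This is a minimal sketch: the database, schema, table, and column names (retail, sales, discount, and so on) are hypothetical, and the exact ALTER TABLE syntax can vary by release, so check the TQL reference before running anything similar.

```sql
-- Create a database and schema to load into (names are illustrative).
CREATE DATABASE retail;
CREATE SCHEMA retail_schema;

-- Add a column to a hypothetical "sales" table, then change its data type.
ALTER TABLE sales ADD COLUMN discount DOUBLE;
ALTER TABLE sales MODIFY COLUMN discount INT;

-- Remove rows and update values, with optional WHERE clauses.
DELETE FROM sales WHERE order_date < '2020-01-01';
UPDATE sales SET region = 'EMEA' WHERE region = 'Europe';
```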


Limitations

DataFlow can load data only into the Falcon in-memory database of your ThoughtSpot cluster. If your data resides in a cloud data warehouse and you want to connect your ThoughtSpot cluster directly to that warehouse, use Connections instead.

