Data Syncs

If you're seeking a centralized data storage solution or a method to visualize and interact with your data across subaccounts and campaigns, you can enable either a BigQuery or S3 sync.


Important Notes

  • We omit some tables and fields that are under active development, to avoid introducing breaking schema changes as those features evolve.

  • New columns and tables may be added at any time.

  • Once a table or column is included, we’ll do our best to support that indefinitely. 

  • We’ll mark tables and columns as deprecated if they’re slated to be discontinued in the future, but will continue to provide the data in the current format for as long as possible.

  • We won’t rename or delete columns. If we do have to discontinue a column, we’ll leave it in the table but with empty values to avoid breaking any queries that rely on it. You should also make sure that your queries can handle empty values in any column.
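Because a discontinued column stays in the schema with empty values, any code that consumes the synced data should treat blank and NULL values as absent. A minimal sketch of that defensive pattern (the row shape and column names here are hypothetical, not part of the actual schema):

```python
def get_column(row, column, default=None):
    """Return a row's column value, treating NULL and empty strings as absent.

    Discontinued columns remain in the table but may contain only empty
    values, so callers should always supply a sensible default.
    """
    value = row.get(column)
    if value is None or value == "":
        return default
    return value

# Hypothetical synced row; "legacy_status" stands in for a discontinued column.
row = {"id": "42", "status": "delivered", "legacy_status": ""}
print(get_column(row, "status"))                    # delivered
print(get_column(row, "legacy_status", "unknown"))  # unknown
```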



BigQuery

The BigQuery sync runs every 15 minutes, syncing over new and updated data.


Prepare to Enable Your BigQuery Export

  1. Create a GCS bucket in your Google Cloud Platform account. You can also use an existing bucket. You'll be able to specify a prefix for the files we store in the bucket, so you can, for example, have multiple texting accounts share the same GCS bucket with different prefixes.

  2. Create a BigQuery dataset. This dataset should be just for Scale to Win synced data, separate from all your other BigQuery data, since we will need to create and update tables.

  3. Create a GCP Service Account. This service account will need the following permissions:

    • The “BigQuery job user” permission (on the service account itself)

    • The “Storage legacy bucket owner” and “Storage legacy object owner” permissions on the GCS bucket you created

    • The “BigQuery data owner” permission on the BigQuery dataset you created

  4. Create a JSON credential for this service account.

  5. Have your GCS bucket name, GCS object prefix (if you need one), dataset name, and JSON credential handy.
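Before entering the JSON credential in the setup form, it can help to sanity-check that the file really is a service-account key. The fields checked below are the standard fields of a GCP service-account JSON key; the example key body is a non-functional dummy:

```python
import json

# Fields present in every GCP service-account JSON key.
REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email"}

def check_service_account_key(raw_json):
    """Return the parsed key if it looks like a GCP service-account credential."""
    key = json.loads(raw_json)
    missing = REQUIRED_FIELDS - key.keys()
    if missing or key.get("type") != "service_account":
        raise ValueError(f"not a service-account key; missing fields: {sorted(missing)}")
    return key

# Dummy (non-functional) key body for illustration only.
raw = json.dumps({
    "type": "service_account",
    "project_id": "my-project",  # hypothetical project name
    "private_key": "-----BEGIN PRIVATE KEY-----\n...",
    "client_email": "stw-sync@my-project.iam.gserviceaccount.com",
})
print(check_service_account_key(raw)["client_email"])
```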


Setup in STW Text

  1. Navigate to Organization Settings > Data Exports and click Configure in the BigQuery Export section (only Organization Owners can set this up).

  2. Fill out the form and hit “Save”. We'll run a series of checks to ensure the permissions are correct. The first sync should start within 15 minutes; if you have a lot of data, it may take up to a few hours to sync all of the existing data. Future runs will be faster, since only new and updated data is synced.


Data Dictionary

Once the BigQuery export is enabled, you can learn more about the data schema from the linked Data Dictionary in the BigQuery Export section of the Data Exports tab. This documentation is also synced to your BigQuery table and viewable in the BigQuery UI. Ignore tables ending with “__STAGING” as they are used by the sync process and should not be queried directly.
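If you list tables programmatically, you can skip the sync's internal staging tables with a simple name filter. A sketch (the table names shown are hypothetical):

```python
def queryable_tables(table_names):
    """Filter out the sync's internal "__STAGING" tables, which should not be queried."""
    return [name for name in table_names if not name.endswith("__STAGING")]

# Hypothetical table listing from the synced dataset.
tables = ["messages", "messages__STAGING", "campaigns"]
print(queryable_tables(tables))  # ['messages', 'campaigns']
```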



S3

The S3 sync runs once daily. We offer two syncs: an incremental sync, covering the previous 7 days of changes, and a full sync, which is capped at 1M rows per table.


Once enabled, a set of CSVs will be dumped into your S3 bucket nightly between midnight and 2AM UTC.
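If you maintain your own copy of a table from the incremental CSVs, each night's export can be merged in as an upsert. A sketch of that merge, assuming a key column for matching rows (the "id" and "status" columns here are hypothetical, not the actual export schema):

```python
import csv
import io

def upsert_incremental(existing_rows, incremental_csv, key="id"):
    """Merge a nightly incremental CSV export into a local copy of a table.

    Rows are matched on a key column ("id" is a hypothetical choice);
    incremental rows replace existing rows with the same key.
    """
    merged = {row[key]: row for row in existing_rows}
    for row in csv.DictReader(io.StringIO(incremental_csv)):
        merged[row[key]] = row
    return list(merged.values())

existing = [{"id": "1", "status": "queued"}, {"id": "2", "status": "queued"}]
nightly = "id,status\n1,delivered\n3,queued\n"
rows = upsert_incremental(existing, nightly)
print(sorted(r["id"] for r in rows))  # ['1', '2', '3']
```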


Prepare to Enable Your S3 Export

  1. Create an S3 bucket in your AWS account. You can also use an existing bucket. 

  2. Create an AWS IAM user. This IAM user will need read and write permission to the S3 bucket you created.

  3. Create an AWS access key for that IAM user.

  4. Have your S3 bucket name, S3 object prefix (if you need one), IAM user’s access key ID, and IAM user’s secret access key handy.
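The read and write permission from step 2 can be granted with an IAM policy attached to the user. A sketch of building that policy document in Python, using a reasonable minimal set of S3 actions (the bucket name is a placeholder; adjust the actions to your own security requirements):

```python
import json

def s3_readwrite_policy(bucket):
    """Build an IAM policy document granting read/write access to one S3 bucket."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {   # Listing objects requires permission on the bucket itself.
                "Effect": "Allow",
                "Action": ["s3:ListBucket", "s3:GetBucketLocation"],
                "Resource": f"arn:aws:s3:::{bucket}",
            },
            {   # Reading and writing the exported CSVs requires object-level actions.
                "Effect": "Allow",
                "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
                "Resource": f"arn:aws:s3:::{bucket}/*",
            },
        ],
    }

print(json.dumps(s3_readwrite_policy("my-stw-export-bucket"), indent=2))
```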


Setup in STW Text

  1. Navigate to Organization Settings > Data Exports and click Configure in the S3 Export section (only Organization Owners can set this up).

  2. Fill out the form and hit “Save”. We'll run a series of checks to make sure that the permissions are correct. The exports will begin to run nightly.


Data Dictionary

Once the S3 export is enabled, you can learn more about the data schema from the linked Data Dictionary in the S3 Export section of the Data Exports tab.