“BigQuery is Google's serverless, highly scalable, low-cost enterprise data warehouse designed to make all your data analysts productive. Remove the headache of planning for data warehouse capacity and reach for infinity with elastic capacity scaling that has no limit. Google BigQuery meets the challenges of real-time analytics by leveraging Google’s serverless infrastructure, which uses automatic scaling and high-performance streaming ingestion to load data.
BigQuery supports a standard SQL dialect that is ANSI:2011 compliant, reducing the need for code rewrites and allowing you to take advantage of advanced SQL features. BigQuery provides free ODBC and JDBC drivers to ensure your current applications can interact with BigQuery’s powerful engine.” - Google Cloud Platform
Indicative for BigQuery takes these powerful data insights to the next level by letting you combine them with other popular BI tools such as Tableau, MicroStrategy, Looker, and Google Data Studio.
With Indicative for BigQuery, your data analysis potential is limitless.
BigQuery as a Source
The BigQuery integration with Indicative is available for Enterprise customers only. If interested, please contact us. You must grant 'bigquery.dataViewer' access to Indicative for your BigQuery project.
In order to perform the following steps, you must have administrative access to the BigQuery console as well as to your BigQuery datasets.
Start in Indicative
1. In Indicative, go to Settings and click Data Sources.
2. Click on New Data Source.
3. Click on the Google BigQuery icon.
4. Select Next.
5. Enter the GCP Project ID. You can find this in your BigQuery console.
6. Enter the Dataset Name.
7. Enter the Table Name and click Next.
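Steps 5–7 collect the three parts of BigQuery's fully qualified table identifier, `project.dataset.table`. As a minimal sketch (the helper name and the example values below are hypothetical, not part of the Indicative product):

```python
def fq_table_name(project_id: str, dataset: str, table: str) -> str:
    """Compose BigQuery's fully qualified `project.dataset.table` identifier
    from the three values entered in steps 5-7."""
    for part in (project_id, dataset, table):
        if not part:
            raise ValueError("project, dataset, and table must all be non-empty")
    return f"{project_id}.{dataset}.{table}"

# Example values are illustrative only.
print(fq_table_name("my-gcp-project", "indicative_export", "track"))
# prints "my-gcp-project.indicative_export.track"
```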
Configuring in BigQuery
8. Click on Select a project.
15. Go back to Indicative and click Validate Integration.
16. If successful, you will see a scheduling screen.
17. Select a date and time to schedule a meeting with an Indicative specialist.
BigQuery as a Destination
Indicative for BigQuery is a fully managed data warehousing service that handles the work required to support, maintain, and load your Indicative data into BigQuery. Because there is no infrastructure to manage, you can focus on analyzing data to find meaningful insights. The modern, highly scalable data warehouse provided by BigQuery, combined with the enriched data from Indicative, lets you get additional value from your investment through a built-in SQL interface.
The BigQuery Export Integration provided by Indicative allows customers to easily export their raw Indicative data to BigQuery for further analysis via a SQL interface. The BigQuery Export Integration does not include derived properties, such as IP address or IP-address-based location information. The purpose of this section is to provide an overview of how Indicative loads raw data into BigQuery and what customers need to provide to configure and maintain the integration.
To use the BigQuery Export Integration, you must grant the Indicative platform programmatic access to BigQuery by assigning it the BigQuery dataViewer role.
In order to perform the following steps, you must have administrative access to the BigQuery Console as well as to your BigQuery datasets.
Create a BigQuery Dataset (Optional)
1. Open the BigQuery web UI in the GCP Console.
2. In the navigation panel, in the Resources section, select your project.
3. On the right side of the window, in the details panel, click Create dataset.
4. On the Create dataset page:
   - For Dataset ID, enter a unique dataset name.
   - (Optional) For Data location, choose a geographic location for the dataset. If you leave the value set to Default, the location is set to US. After a dataset is created, the location can't be changed.
   - For Default table expiration, choose one of the following options:
     - Never: (Default) Tables created in the dataset are never automatically deleted. You must delete them manually.
     - Number of days after table creation: This value determines when a newly created table in the dataset is deleted. This value is applied if you do not set a table expiration when the table is created.
5. Click Create dataset.
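The choices on the Create dataset form can be summarized as a small configuration mapping. The sketch below is illustrative only (the field names mirror the form, not any Indicative or BigQuery API); it encodes the two rules stated above: a Default location resolves to US, and omitting an expiration means tables are never auto-deleted.

```python
DAY_MS = 24 * 60 * 60 * 1000  # BigQuery expresses table expiration in milliseconds

def dataset_config(dataset_id, location="Default", expiration_days=None):
    """Mirror the Create dataset form: Default location resolves to US,
    and a missing expiration means tables are never auto-deleted."""
    return {
        "datasetId": dataset_id,
        "location": "US" if location == "Default" else location,
        "defaultTableExpirationMs": (
            None if expiration_days is None else expiration_days * DAY_MS
        ),
    }
```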
Share Your Dataset with Indicative
To share your dataset, provide Indicative with the following values:
- ProjectId: the Indicative project ID to export
- BQProject: the GCP project name to write data into
- BQDataset: the BQ dataset name to write data into
Each payload type sent to Indicative is represented as a separate table within the BigQuery dataset. Each table will contain columns relevant to the associated payload type (e.g. event, identify, and alias) in addition to common 'messageId' and 'receivedTimestamp' columns. See below for specific table structure.
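The per-payload-type layout above can be sketched as a small routing step: each payload goes to the table for its type, with the common 'messageId' and 'receivedTimestamp' columns attached. This is an illustrative sketch under assumed table names, not Indicative's actual loader.

```python
import time

# One destination table per Indicative payload type (per the text above);
# the table names here are illustrative assumptions.
TABLES = {"event": "track", "identify": "identify", "alias": "alias"}

def to_row(payload_type, payload, message_id, received_ts=None):
    """Route a payload to its table and attach the common columns
    shared by every table in the dataset."""
    if payload_type not in TABLES:
        raise ValueError(f"unknown payload type: {payload_type}")
    row = dict(payload)  # type-specific columns
    row["messageId"] = message_id
    row["receivedTimestamp"] = received_ts or int(time.time())
    return TABLES[payload_type], row
```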
Alias Table Schema
Indicative supports aliasing between anonymous IDs and user IDs to allow customers to unify event streams submitted with separate unique keys. Aliasing is typically used to connect the activity stream of an anonymous user to their known activity stream after they have been identified as a known user.
An anonymous ID is an ID used to identify a user before they’ve registered, logged in, or otherwise identified themselves. A user ID is the ID used to uniquely identify a single user within your application.
After receiving an alias call, all data submitted in the future from either the anonymous ID or a user ID will be processed as coming from the same user. As a best practice, we recommend that an alias call between a pair of IDs is made exactly once. It’s sensible to make this call the first time a user identifies themselves, such as upon registration within your application.
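The aliasing behavior described above can be sketched as a simple lookup: one alias call links an anonymous ID to a user ID, and data from either ID then resolves to the same user. This is a minimal illustration, not Indicative's actual implementation.

```python
class AliasResolver:
    """Resolve any known alias to its canonical user ID."""

    def __init__(self):
        self._canonical = {}  # anonymous_id -> user_id

    def alias(self, anonymous_id, user_id):
        # Best practice per the text above: call this exactly once per ID
        # pair, typically when the user first identifies (e.g. registration).
        self._canonical[anonymous_id] = user_id

    def resolve(self, any_id):
        # Both the anonymous ID and the user ID resolve to the same user.
        return self._canonical.get(any_id, any_id)
```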
Identify Table Schema
Use the Identify call to set or update user attributes of a particular user without sending an event. The Identify table schema is almost identical to the custom event table schema. It contains a timestamp, a user ID, and the relevant properties.
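The set-or-update semantics of Identify can be sketched as a merge into a user profile: later calls overwrite earlier values for the same attribute, and attributes not mentioned are left untouched. The payload shape and field names below are illustrative assumptions.

```python
def apply_identify(profile, payload):
    """Merge an Identify payload's attributes into an existing user profile.

    Later calls overwrite earlier values for the same attribute; attributes
    not mentioned in the payload are left untouched.
    """
    updated = dict(profile)
    updated.update(payload.get("properties", {}))
    return updated

# Example: a second Identify call upgrades the plan and adds a company.
user = {"userId": "user-42", "plan": "free"}
user = apply_identify(user, {"properties": {"plan": "pro", "company": "Acme"}})
```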
Track Table Schema
Google’s BigQuery data warehouse also supports additional data sets from within your existing infrastructure. Indicative is happy to assist customers with loading additional data sets into BigQuery. For more information, please contact firstname.lastname@example.org.