Snowflake: Define Your Own Schema

Prerequisites:

To integrate with Snowflake, you will need access to your Snowflake console.

For this self-service integration, we also have some data requirements:

  1.  All of your events must be unified into a single table, rather than split into separate tables for each event type (see the example query after this list).
  2.  By definition, an event must have a user_id, event_name, and timestamp field. The fields do not have to be named as such, and any additional fields will be treated as event properties.
  3.  There can be a maximum of one authenticated ID and one unauthenticated ID for aliasing.
  4.  The event timestamp must be in UTC.
  5.  JSON fields must be pre-parsed and flattened into their own fields.
  6.  All joins must be performed beforehand.
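
As a rough illustration of requirements 1, 4, 5, and 6, the sketch below unions two hypothetical source tables into a single events table. Every table, column, and JSON field name here is a placeholder; adapt them to your own warehouse.

    CREATE OR REPLACE TABLE analytics.events.unified_events AS
    SELECT
        o.user_id,
        'order_completed'                      AS event_name,
        CONVERT_TIMEZONE('UTC', o.created_at)  AS event_timestamp,  -- requirement 4: UTC
        o.payload:total::FLOAT                 AS order_total,      -- requirement 5: JSON pre-flattened
        u.plan_name                            AS plan_name,        -- requirement 6: join done up front
        NULL                                   AS page_path
    FROM raw.shop.orders o
    LEFT JOIN raw.shop.users u ON u.id = o.user_id
    UNION ALL
    SELECT
        v.user_id,
        'page_viewed',
        CONVERT_TIMEZONE('UTC', v.viewed_at),
        NULL,
        NULL,
        v.page_path
    FROM raw.web.page_views v;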

For any advanced enrichments or modeling, such as custom user aliasing logic, please contact us.

Instructions:

Adding a Data Source In Indicative

  1. In Indicative, click on Settings and select Data Sources

    Screen_Shot_2022-09-07_at_3.34.09_PM.png

  2. Click on New Data Source

    Screen_Shot_2022-09-08_at_4.02.42_PM.png

  3. Select Connect via Data Warehouse or Lake
    mceclip0.png
  4. Select Snowflake as your data connection, select Define your own schema as the connection schema, and click Connect
    mceclip0.png
  5. You should see this Snowflake Overview screen. Click Next
    mceclip1.png

Connection Setup

mceclip2.png

  1. Log in to your Snowflake account. 
  2. Enter the Account ID information into Indicative.
    mceclip2.png
    The Account ID is everything to the left of .gcp.snowflakecomputing.com/... If you are unsure of this or any of the other values below, the lookup queries after these steps show one way to find them.
  3. Enter the Warehouse name.
    mceclip3.png
  4. Enter the Database name.
    mceclip4.png
  5. Click into Warehouses and copy the Schema.
    mceclip5.png
  6. Enter the Table name.
    mceclip6.png
  7. For Auto-Generated Password, we randomly generate a password for you to use. If you would like to create your own password, please replace the autofilled value in that field.
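
If you are not sure of any of these values, the following standard Snowflake commands, run from a worksheet, will list them (the database and schema names here are placeholders):

    SELECT CURRENT_ACCOUNT();                  -- account locator (the part before .snowflakecomputing.com)
    SELECT CURRENT_REGION();                   -- the region segment of your account URL, if it has one
    SHOW WAREHOUSES;                           -- warehouse names
    SHOW DATABASES;                            -- database names
    SHOW SCHEMAS IN DATABASE analytics;        -- schemas in your database
    SHOW TABLES IN SCHEMA analytics.events;    -- tables in your schema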

Grant Permissions 

mceclip3.png

  1. You will need to copy and paste these code snippets into your Snowflake console; the last snippet is optional. (For a rough illustration of what such snippets typically look like, see the example after these steps.)
  2. Navigate to the Worksheets tab and paste the snippets into the SQL runner, check the All Queries checkbox, and hit Run.
  3. Click Next to test your connection
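
The exact snippets are generated for you in Indicative and should be used as-is. As a rough, hypothetical illustration only, they generally follow the pattern of creating a read-only role and user and granting that role access to the warehouse, database, schema, and table you entered earlier (every name and the password below is a placeholder):

    CREATE ROLE IF NOT EXISTS indicative_role;
    CREATE USER IF NOT EXISTS indicative_user
        PASSWORD = '<password from the Connection Setup step>'
        DEFAULT_ROLE = indicative_role;
    GRANT ROLE indicative_role TO USER indicative_user;

    GRANT USAGE ON WAREHOUSE analytics_wh                  TO ROLE indicative_role;
    GRANT USAGE ON DATABASE analytics                      TO ROLE indicative_role;
    GRANT USAGE ON SCHEMA analytics.events                 TO ROLE indicative_role;
    GRANT SELECT ON TABLE analytics.events.unified_events  TO ROLE indicative_role;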

Data Loading

mceclip0.png

  1. Load Timestamp Field
    Select the field used to identify new data. We recommend using a timestamp that denotes when the event was published to the table, rather than the actual event timestamp, so that late-arriving data is still collected. This will not impact your analyses, since our queries reference the event timestamp.

    For example, if an event with an event timestamp of 12/1 was published to the table on 12/3, it would be missed if we extracted by event timestamp, because the daily extract for 12/1 had already run. Using the publishing timestamp allows us to pick up all new data added to the table each night (see the sketch after this list for one way to populate such a field).

  2. Start Date
    Select the date from which Indicative should start loading your data.

    If your data history exceeds 1 billion events, a Solutions Engineer will contact you to assist with the integration.

  3. Schedule Interval
    Select how frequently new data should be made available in Indicative.

  4. Processing Delay
    Select when we should start extracting your data in UTC. This time should be when all of your previous day's data is fully available in your table for extraction.
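
One way to populate a publishing (load) timestamp, assuming a nightly job inserts new rows from a staging table and your events table has a loaded_at column (all names here are placeholders):

    INSERT INTO analytics.events.unified_events
        (user_id, event_name, event_timestamp, order_total, plan_name, page_path, loaded_at)
    SELECT
        user_id, event_name, event_timestamp, order_total, plan_name, page_path,
        CURRENT_TIMESTAMP()  -- when the row was published to the table, not when the event occurred
    FROM analytics.staging.new_events;

In the Load Timestamp Field setting, you would then point Indicative at the loaded_at column.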

Event Modeling

mceclip4.png

  1. In the Events Field section, input the field that should be used to derive Indicative event names.
  2. For Timestamp, input the field that represents the time that the event was performed.
  3. Click Next

    After this step, we will perform a few checks on your data with the model that you provided. The checks are:

    • Valid event field (Do at least 80% of your records have a value for the event field?)
    • Valid timestamp field (Do at least 80% of your records have a value for the timestamp field?)
    • Total number of unique events. We recommend 20-300 unique events; the limit is 2,000.
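
    If you would like to sanity-check these thresholds yourself before connecting, a query along these lines will do it (the table and column names are placeholders for your own):

      SELECT
          COUNT_IF(event_name IS NOT NULL) / COUNT(*)       AS event_field_fill_rate,      -- aim for at least 0.8
          COUNT_IF(event_timestamp IS NOT NULL) / COUNT(*)  AS timestamp_field_fill_rate,  -- aim for at least 0.8
          COUNT(DISTINCT event_name)                        AS unique_event_names          -- ideally 20-300, at most 2,000
      FROM analytics.events.unified_events;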

User Modeling (Aliasing)
mceclip5.png

After some basic checks, we can define your users within your data. For more information on User Identification (Aliasing), please refer to this article.

  1. If you choose to enable Aliasing:
    1. Unauthenticated ID - Input the field used to identify anonymous users.
    2. Authenticated ID - Input the field used to identify known users.

  2. If you choose to disable Aliasing, press Disabled:
    1. Unauthenticated ID - Enter the field used to identify your users. All users must have a value for this field.

If you have non-null values that represent null UserID values, please click on the Show Advanced button and enter those values in the field provided.

After this step, we will perform additional checks on your data with the user model that you provided. The checks are:

  • User Hotspot (Is there a single UserID that represents over 40% of your records?)
  • Anti-Hotspot (Does your data have too many unique UserIDs? A good events table contains multiple events per user.)
  • Aliasing
    • Too many unauthenticated IDs for a single authenticated userID
    • Too many authenticated IDs for a single anonymous ID
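
A quick hotspot check you can run yourself, assuming the same placeholder table and column names used above, looks at the share of records held by the most common user IDs:

    SELECT
        user_id,
        COUNT(*)                          AS record_count,
        COUNT(*) / SUM(COUNT(*)) OVER ()  AS share_of_records  -- investigate any single ID above ~0.4
    FROM analytics.events.unified_events
    GROUP BY user_id
    ORDER BY record_count DESC
    LIMIT 10;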

Assisted Modeling

 

mceclip6.png

You should see a summary of your data, based on the last 7 days, broken into the blocks described below.

You should only be concerned if the margin of error is significant. If so, please reach out to a product specialist:

  1. Events Summary
    You should see a daily breakdown of your Total Event Count, and the number of Unique Event Names. If there are certain events to exclude, please click on the Exclude checkbox for those events.

    If you would like to exclude any events by regex or property value, please contact a product specialist.

  2. Properties Summary
    Here you will see the number of Unique Property Names. If there are certain properties to exclude, please click on the Exclude checkbox for those properties.

    If you require more advanced configurations such as parsing out JSON fields, creating derived properties, or excluding properties based on regex, please contact a product specialist.

  3. Users Summary
    This section lists the number of Unique users seen. If the numbers do not look correct, please go back to the User Modeling section to confirm that the correct ID was chosen (the query after this list shows one way to cross-check these totals).
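
To cross-check the event and user totals against your own table over the same window, a query like this (placeholder names again) is usually enough:

    SELECT
        COUNT(*)                    AS total_event_count,
        COUNT(DISTINCT event_name)  AS unique_event_names,
        COUNT(DISTINCT user_id)     AS unique_users
    FROM analytics.events.unified_events
    WHERE event_timestamp >= DATEADD(day, -7, CURRENT_DATE());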

Waiting for Data

mceclip3.png

If you see this screen, you're all done! You should see your data in Indicative within 48-72 hours and will be notified by email. 

Advanced Settings

For additional advanced settings, such as excluding certain events and properties, please refer to this page.
