B2D Customer Configuration for Snowflake
  • 18 Apr 2022


Background

This article explains how to configure a new environment to use Build to Destination (B2D).

The configuration consists of two parts:

  • Setting properties and permissions in the client DB so that it can be used as a build destination.
  • Configuring Sisense to use these properties.


To enable B2D in Sisense:

  1. Select the Admin tab.
  2. From the navigation menu on the left side of the window, select Feature Management.
  3. Toggle ON the Build Destination switch.

B2D is performed through cloud storage. Therefore, for a given destination database, you must set up and configure the required storage and the destination database:

  • Snowflake and Redshift DB destinations are built through Amazon S3.
  • The Google BigQuery DB destination is built through GS (Google Storage).

This article only pertains to the Snowflake DB destination.

Snowflake Setup

This section covers the DB configuration (properties such as users/roles and their permissions) and the storage configuration (S3 parameters, bucket permissions).

Storage - Amazon S3

The authentication setup must grant all permissions on the staging buckets.

  1. On the top bar, select the IAM Service.
  2. On the left panel, under Access Management, click Policies.
  3. On the right side, click Create Policy.
  4. Select the JSON tab and paste the policy JSON into the AWS editor. Make sure to replace the placeholder fields (such as the bucket name) with the values that you require.
  5. On the bottom-right, click Next and add any tags needed.
  6. Click Next: Review.
  7. Type a meaningful name and description for the policy.
  8. Click Create Policy.
  9. Once the Policy is created, you can attach it to Groups, Roles, or Users.
    For example, attach the Policy to a Group; it is easier to manage several Users that belong to the same Group than to attach the Policy one User at a time.
  10. Select User groups on the left panel, and then Add permissions > Attach policies.
  11. Use the search box to find the Policy you created and click Add Permissions.
  12. If the bucket does not exist, by default Sisense attempts to create it. If there aren't permissions to create buckets, turn OFF the EnableBucketCreation flag. To set the flag on/off:
    1. Click the Sisense logo 5 times.
    2. In the Service Configuration > Build window, set the EnableBucketCreation flag.
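The policy body referenced in step 4 above is not reproduced here. As a sketch, assuming a hypothetical staging bucket named my-b2d-staging-bucket, a minimal policy granting full access to that bucket and its objects could look like:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "SisenseB2DStagingAccess",
      "Effect": "Allow",
      "Action": ["s3:*"],
      "Resource": [
        "arn:aws:s3:::my-b2d-staging-bucket",
        "arn:aws:s3:::my-b2d-staging-bucket/*"
      ]
    }
  ]
}
```

Replace the bucket name in both Resource ARNs with your own staging bucket; the bucket ARN covers bucket-level actions and the /* ARN covers object-level actions.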

Database - Snowflake

To set up the destination database you must be logged in to the DB as a user having the useradmin and accountadmin roles.

To build to Snowflake, you must define the DB to build to, and define users with read/write permissions. You can use an existing DB and existing users (provided they have the correct permissions - see the GRANT commands below), or create new ones and grant them the permissions, using the steps defined below.

The first step is to create the writer and viewer users. This is done using the useradmin role. Enter their names and passwords, as well as the roles, warehouse, and DB properties that will be used (it is OK if these do not yet exist). All of these properties must later be configured under Build (see the Destination Configuration section).


Note:
Each value that starts with "MY" should be replaced with the real value. For example, MY_WRITER_USER should be replaced with the name of the user that has write permissions.


USE ROLE useradmin;

CREATE OR REPLACE USER MY_WRITER_USER
   LOGIN_NAME='MY_WRITER_USER' PASSWORD='w123'
   DISPLAY_NAME='My Writer User' DEFAULT_ROLE='MY_WRITER_ROLE'
   DEFAULT_WAREHOUSE='MY_WAREHOUSE' DEFAULT_NAMESPACE='MY_DB';

CREATE OR REPLACE USER MY_VIEWER_USER
   LOGIN_NAME='MY_VIEWER_USER' PASSWORD='v123'
   DISPLAY_NAME='My Viewer User' DEFAULT_ROLE='MY_VIEWER_ROLE'
   DEFAULT_WAREHOUSE='MY_WAREHOUSE' DEFAULT_NAMESPACE='MY_DB';

The following commands require the accountadmin role:

USE ROLE accountadmin;

Creating the warehouse and database for all built data:

CREATE WAREHOUSE IF NOT EXISTS MY_WAREHOUSE;

CREATE DATABASE IF NOT EXISTS MY_DB;

Creating the roles and granting them to the users:

CREATE ROLE IF NOT EXISTS MY_WRITER_ROLE;

CREATE ROLE IF NOT EXISTS MY_VIEWER_ROLE;

GRANT ROLE MY_WRITER_ROLE TO USER MY_WRITER_USER;

GRANT ROLE MY_VIEWER_ROLE TO USER MY_VIEWER_USER;

Granting the warehouse-level permissions to the roles:

GRANT MODIFY, MONITOR, USAGE, OPERATE ON WAREHOUSE MY_WAREHOUSE
   TO ROLE MY_WRITER_ROLE;

GRANT USAGE, OPERATE ON WAREHOUSE MY_WAREHOUSE
   TO ROLE MY_VIEWER_ROLE;

Granting the database-level permissions to the roles:

GRANT MODIFY, MONITOR, USAGE, CREATE SCHEMA ON DATABASE MY_DB
   TO ROLE MY_WRITER_ROLE WITH GRANT OPTION;

GRANT USAGE ON DATABASE MY_DB TO ROLE MY_VIEWER_ROLE;

Granting query permissions to the viewer user (the writer user will be the owner of the destination schemas):

GRANT USAGE ON ALL SCHEMAS IN DATABASE MY_DB TO ROLE MY_VIEWER_ROLE;

GRANT USAGE ON FUTURE SCHEMAS IN DATABASE MY_DB TO ROLE MY_VIEWER_ROLE;

GRANT SELECT ON ALL TABLES IN DATABASE MY_DB TO ROLE MY_VIEWER_ROLE;

GRANT SELECT ON FUTURE TABLES IN DATABASE MY_DB TO ROLE MY_VIEWER_ROLE;
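Before configuring Sisense, you can sanity-check that the grants took effect from the same accountadmin session. This is a verification sketch, not part of the required setup:

```sql
SHOW GRANTS TO ROLE MY_WRITER_ROLE;
SHOW GRANTS TO ROLE MY_VIEWER_ROLE;
SHOW GRANTS TO USER MY_VIEWER_USER;
```

The output should list the warehouse- and database-level privileges granted above, and confirm that each role is granted to its corresponding user.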

If you are using an AWS environment where an access key and secret key are not used for authentication, you must set up an S3 storage integration with write permissions, named:

SISENSE_INTEGRATION_MY_WRITER_ROLE

Note:
The name should be SISENSE_INTEGRATION_ followed by the role of the user that has write permissions.

https://docs.snowflake.com/en/user-guide/data-load-s3-config-storage-integration.html
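Following the Snowflake guide linked above, the integration could be created along these lines. This is a sketch; the AWS role ARN and the bucket location are placeholders that you must replace with your own values:

```sql
USE ROLE accountadmin;

CREATE STORAGE INTEGRATION SISENSE_INTEGRATION_MY_WRITER_ROLE
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  ENABLED = TRUE
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::<account_id>:role/<my_s3_role>'
  STORAGE_ALLOWED_LOCATIONS = ('s3://<my-staging-bucket>/');
```

The AWS role referenced by STORAGE_AWS_ROLE_ARN must itself be granted access to the staging bucket, as described in the linked Snowflake documentation.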

Configuration

This section describes how to configure the properties defined above in the Sisense environment.

Configurations are applied globally (for all cubes) via CLI commands.

CLI Configuration

Storage
yaml Configuration

To create or update storage parameters, you must create a yaml file as follows. For example:

storage:

   - storageType: AMAZON_S3
     accessKey: the access key, if needed for authentication
     secretKey: the secret key, if needed for authentication
     iamRole: the IAM role
     bucketName: the staging bucket name
     bucketRegion: the bucket region

   - storageType: GOOGLE_STORAGE
     bucketName: the staging bucket name
     bucketRegion: the bucket region
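For instance, a filled-in sketch for an S3 staging bucket, where every value is a hypothetical placeholder to substitute with your own:

```yaml
storage:
   - storageType: AMAZON_S3
     accessKey: MY_ACCESS_KEY
     secretKey: MY_SECRET_KEY
     bucketName: my-b2d-staging-bucket
     bucketRegion: us-east-2
```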

Commands

si storage list : lists all existing storage parameters.

si storage create -path <yaml file path> : creates new storage parameters using the yaml file (as described above).

si storage update -path <yaml file path> : updates existing storage parameters using the yaml file (as described above).

si storage delete -type <storage type> : deletes the existing storage parameters for a specific type.
For example: si storage delete -type "AMAZON_S3".

si storage delete -type all : deletes all existing storage parameters.

DB Destination

Currently, you can only create one writer user and one viewer user for each destination.

yaml Configuration

To create/update connection parameters, you must create a yaml file as follows.
For example:

connections:

   - connectionString: the JDBC connection string (e.g., jdbc:snowflake://host_string/?warehouse=warehouse_name&role=role_type)
     user: the user name
     password: the password for the user above
     role: the role related to the user
     database: the database name
     permissions: the permissions granted to the role defined (e.g., writer)
     buildDestination: the destination (e.g., snowflake)

   - connectionString: the JDBC connection string (e.g., jdbc:snowflake://host_string/?warehouse=warehouse_name&role=role_type)
     user: the user name
     password: the password for the user above
     role: the role related to the user
     database: the database name
     permissions: the permissions granted to the role defined (e.g., viewer)
     buildDestination: the destination (e.g., snowflake)
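Tying this to the Snowflake objects created earlier, a filled-in sketch for the writer connection could look like the following (the account host name is a hypothetical placeholder):

```yaml
connections:
   - connectionString: jdbc:snowflake://myaccount.snowflakecomputing.com/?warehouse=MY_WAREHOUSE&role=MY_WRITER_ROLE
     user: MY_WRITER_USER
     password: w123
     role: MY_WRITER_ROLE
     database: MY_DB
     permissions: writer
     buildDestination: snowflake
```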


Note:
You must define a different connection for each user type (writer/viewer).

Commands

si connections list : lists all existing connections.

si connections create -path <yaml file path> : creates a new connection as described in the yaml file above.

si connections delete -destination <destination> : deletes the connections for a specific destination.

si connections delete -destination : deletes a connection for a specific destination and specific user permissions.

si connections delete -destination all : deletes all destination connections.

si connections update -path <yaml file path> : updates an existing connection as described in the yaml file above.

Troubleshooting

Guidelines

Most errors related specifically to the setup or configuration of B2D are identified as such in the opening line(s) of the error message, which describe the general area of the issue. The subsequent lines of the error message provide the details needed to address it, as with all build errors.

Notes:

  • Each issue might be discovered during a different phase of the build, but should fail relatively early (approximately during the first minute).
  • Usually only one issue is presented at a time, and not necessarily in a particular order. For instance, if both the storage authentication and the destination connection string are incorrect, the user will usually see one of these errors, and only once it is handled will get the other.
  • Configuration changes apply immediately (for builds that have yet to start). There is no need for restarts.

Identified General Issues

  • Destination storage error. Verify the build destination storage configurations.
  • Destination storage authentication failed. Make sure the connection credentials are valid, and the user has sufficient permissions.
  • Destination database error. Verify the build destination database configurations.
  • Destination database authentication failed. Make sure the connection credentials are valid, and the user has sufficient permissions.
  • Destination connection failed. Verify the database configurations of the build destination.
  • Destination connection failed: Unauthorized. Verify the roles and privileges of the destination database user.

Build Error Examples

As with any build error, the exact wording and the specific BE reference numbers may vary.

  • S3 bucket not configured:
    Destination storage error. Verify the build destination storage configurations.
    Error details: BE#366389 Failed to create S3 bucket.
    No bucket name configured; Cannot create bucket for this cube.
  • S3 region not configured:
    Destination storage error. Verify the build destination storage configurations.
    Error details: BE#366389 Failed to create S3 bucket: uploaddataredshift.
    No region configured; Cannot create bucket for this cube.
  • Bad S3 region name:
    Destination storage error. Verify the build destination storage configurations.
    Error details: BE#366389 Failed to create S3 bucket: uploaddataredshift.
    Sdk client: Unable to execute HTTP request.
    Unknown host: Uploaddataredshift.s3.us-eas-2.amazonaws.com; Cannot create bucket for this cube.
  • Missing or incorrect access keys or authentication:
    Destination storage authentication failed. Make sure the connection credentials are valid.
    Error details: BE#366389 Failed to create S3 bucket: uploaddataredshift.
    Sdk client: Unable to load AWS credentials from any provider in the chain: [...]; Cannot create bucket for this cube.
  • Typo in writer role in Snowflake:
    Destination database error. Verify the build destination database configurations.
    [...]
    Snowflake SQL: SQL compilation error:
    Role 'WRITERR' does not exist or not authorized.
  • Bad user or password:
    Destination database authentication failed. Make sure the connection credentials are valid.
    Error details: Could not connect to destination SnowflakeJDBC.
    Connector response: Failed to connect to data source - incorrect username or password was specified.
  • Role not granted in Snowflake:
    Destination connection failed: Unauthorized. Verify the roles and privileges of the destination database user.
    Error details: Could not connect to destination SnowflakeJDBC.
    Connector response: Failed to connect to data source - role 'B2D_WRITER' specified in the connect string is not granted to this user.
    Contact your local system administrator, or attempt to login with another role, e.g. PUBLIC.
  • Warehouse access not granted in Snowflake:
    Destination database error. Verify the build destination database configurations.
    [...] Connection parameter 'warehouse' is not specified or cannot be used.
  • Database access not granted in Snowflake:
    Failed to apply schema: [...]
    Unable to use database '...', make sure access was granted.


