TriAnthony
Community Team Member

Automatically Copy Data Security Rules Between Data Models (including Elasticube to Live and Live to Elasticube)

Overview

Managing data access consistently across multiple data models can be time-consuming and error-prone, especially with complex row-level security (RLS) configurations. This article provides a script and step-by-step instructions to automate copying data security rules from one model to another within the same environment. It helps ensure security alignment while reducing manual work.

The script copies both the RLS rules and scopes, and works for every possible combination of source and target data model types:

  • Elasticube to Live model
  • Live model to Elasticube
  • Elasticube to another Elasticube
  • Live model to another Live model

This script utilizes the Sisense REST API.
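At a high level, the copy flow is read, clear, re-create. The sketch below illustrates that flow with Python's standard library only; the endpoint path, token handling, and payload shape are illustrative assumptions, not the exact routes used by the attached script:

```python
import json
import urllib.parse
import urllib.request

BASE_URL = "https://YourSisenseServerAddress"  # assumption: your server address
TOKEN = "<API_TOKEN>"  # assumption: a Sisense REST API bearer token

def datasecurity_url(base: str, model: str) -> str:
    # Hypothetical route; the attached script resolves the real endpoint
    # per model type ("cube" vs. "live").
    return f"{base}/api/elasticubes/{urllib.parse.quote(model)}/datasecurity"

def _call(method: str, url: str, payload=None):
    # Minimal REST helper (defined for illustration; not executed here).
    req = urllib.request.Request(
        url,
        data=None if payload is None else json.dumps(payload).encode(),
        headers={"Authorization": f"Bearer {TOKEN}",
                 "Content-Type": "application/json"},
        method=method,
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read() or "null")

def copy_rls(source_model: str, target_model: str):
    rules = _call("GET", datasecurity_url(BASE_URL, source_model))   # read source rules and scopes
    _call("DELETE", datasecurity_url(BASE_URL, target_model))        # clear the target (no merge)
    _call("POST", datasecurity_url(BASE_URL, target_model), rules)   # re-create rules on the target
    return rules
```

The delete-before-copy step mirrors the no-merge behaviour described in the Limitations section below.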

Requirements

  • The names of the table(s) and column(s) used for data security in the source and target data models must match.
  • Admin access to the server.
  • The Custom Code feature must be enabled. To simplify implementation, the script can be viewed and executed in the Jupyter notebook of the Sisense server's Custom Code feature, so you don't need to install Python on your own computer. To enable Custom Code, go to the Admin tab > Feature Management > Advanced Analytics section, and toggle on Custom Code.

Limitations

  • This script does not merge RLS rules. All existing RLS rules will be deleted from the target data model before the RLS rules from the source data model are copied.
  • Sisense doesn't currently have built-in validation logic for table and column names when creating RLS rules via the API, and the script does not validate them during the copy process either. Ensure the security table and column names match between the source and target data models (see the Requirements section above). This limitation may be addressed in a future version of this script.
  • The script only supports one target data model at a time. This limitation may be addressed in a future version of this script.
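Because neither the API nor the script validates table and column names, a manual pre-check can help. The helper below is a hypothetical sketch: it assumes each rule dict exposes "table" and "column" keys and that you have built a set of (table, column) pairs from the target model's schema, neither of which is guaranteed by the actual payloads:

```python
def missing_security_fields(source_rules, target_schema):
    """Return (table, column) pairs referenced by the source RLS rules
    that are absent from the target model's schema.

    Assumes each rule dict has 'table' and 'column' keys and that
    target_schema is a set of (table, column) tuples -- both assumptions
    about the payload shape, not guaranteed by the Sisense API."""
    needed = {(rule["table"], rule["column"]) for rule in source_rules}
    return sorted(needed - target_schema)
```

An empty result suggests the naming requirement above is satisfied; any returned pairs should be fixed in the target model before copying.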

Instructions

1. Download the CopyDataSecurityRules_v1 Script

Download the script here. It's also attached at the bottom of this article.

Note: Our community site doesn't currently support Python/Jupyter files. Please change the file extension from .txt to .ipynb after downloading.

2. Upload the Script to Your Server

Upload the script to this location: /opt/sisense/storage/notebooks/. You can also use any of the existing subfolders or create a new subfolder in this location, if needed.

The easiest way to upload files to your Sisense server is to use the File Management feature. In the file browser UI, open the notebooks folder, then click the Upload (Up arrow) icon at the top right corner. Click File, then select the CopyDataSecurityRules_v1 file.

3. Open the Script in Jupyter

To open the Jupyter notebook, use this direct URL:

https://YourSisenseServerAddress/app/diag/notebooks/work/storage_notebooks/CopyDataSecurityRules_v1.ipynb
 
If you uploaded the script to a subfolder, the direct URL will look like this:
https://YourSisenseServerAddress/app/diag/notebooks/work/storage_notebooks/subFolderName/CopyDataSecurityRules_v1.ipynb

4. Configure the Script

Once the notebook is open, update the first cell to specify the configuration variables below. This is the only cell that requires your input.
  • source_datamodel_type: the type of the source data model you're copying the RLS from. The value should either be "cube" or "live".
  • target_datamodel_type: the type of the target data model you're copying the RLS to. The value should either be "cube" or "live".
  • source_datamodel_name: the name of the source data model.
  • target_datamodel_name: the name of the target data model.
  • batch_size: the number of RLS rules to copy at a time (100 in the example below). The script copies RLS rules in batches to stay within possible API limits, which is most relevant if you have a large number of RLS rules. This does not mean you need to run the script multiple times.
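The batching behaviour can be sketched with a small helper (illustrative only; the attached script's internals may differ). For example, 650 rules with a batch size of 100 are sent in 7 batches, the last containing 50 rules:

```python
def batches(rules, batch_size):
    """Yield successive slices of the rule list, batch_size rules each."""
    for start in range(0, len(rules), batch_size):
        yield rules[start:start + batch_size]
```

Each yielded slice would correspond to one API call, which is why a single run of the script covers all rules regardless of their count.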

Validations are in place to ensure:

  • The source and target types are either "cube" or "live".
  • The source and target data models exist on the server.
  • The types of the source and target data models match the specified types.
  • The batch size is a positive integer.

The script will throw a descriptive error message if any of the validations above fail.
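The validation checks listed above can be sketched as follows. This is a simplified illustration, not the script's actual code; in particular, the existing_models lookup (model name to type) is assumed to have been built from the server's model list, which is not shown here:

```python
def validate_config(source_type, source_name, target_type, target_name,
                    existing_models, batch_size):
    """Raise a descriptive ValueError if any configuration value is invalid.

    existing_models maps model name -> type ("cube" or "live"); building
    that lookup from the server's model list is assumed, not shown."""
    for label, m_type, m_name in (("source", source_type, source_name),
                                  ("target", target_type, target_name)):
        if m_type not in ("cube", "live"):
            raise ValueError(f"{label}_datamodel_type must be 'cube' or 'live', got {m_type!r}")
        if m_name not in existing_models:
            raise ValueError(f"{label} data model {m_name!r} was not found on the server")
        if existing_models[m_name] != m_type:
            raise ValueError(f"{m_name!r} is a {existing_models[m_name]!r} model, not {m_type!r}")
    if not isinstance(batch_size, int) or isinstance(batch_size, bool) or batch_size < 1:
        raise ValueError("batch_size must be a positive integer")
```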

Here is an example of the configurations:

# Specify the source and target model types and names
source_datamodel_type = 'cube'
target_datamodel_type = 'live'
source_datamodel_name = 'Sample Insurance - Elasticube'
target_datamodel_name = 'Sample Insurance - Live'
batch_size = 100  # Define batch size

5. Run the Script

Once you have specified the configuration values, run the script by opening the Run menu, then selecting Run All Cells.

 
[Image: Jupyter Notebook interface showing the "Run" menu expanded, with the "Run All Cells" option highlighted. Code cells are visible below.]

Monitor the progress at the bottom of the notebook. A completed log should look like this:

 
[Image: Completed log showing existing rules deleted from the target model and 650 data security rules copied successfully in 7 batches.]
Version history
Last update: 05-05-2025 09:45 AM