Add functionality for updating the Elasticube a Pulse Alert is associated with
When the Elasticube associated with a dashboard is changed, any Pulse alerts built off widgets from that dashboard are not updated and continue to point at the old Elasticube. This is an issue if you are replacing one Elasticube with another:

- There is no warning / message when changing a dashboard's Elasticube that its Pulse alerts will be affected.
- There is no way to see the Elasticube a Pulse alert is attached to - only the dashboard (but again, changing the dashboard's Elasticube doesn't change the Pulse alerts).
- There is no way to see which Pulse alerts are built on top of an Elasticube from the Elasticube screen (unlike with dashboards).

When changing the Elasticube for a dashboard that has Pulse alerts associated with it, I would like a prompt offering to update those Pulse alerts to the new Elasticube as well. This could be done either:

- via some new UI, or
- via a warning (similar to the one shown when you try to delete a cube with active dashboards / pulses) - although this would also need the secondary ability to manually change the Elasticube associated with a Pulse alert.
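Until such a prompt exists, one way to see which Pulse alerts would be affected before swapping an Elasticube is to inventory them over the REST API. The sketch below is a rough, hedged example: it assumes a Pulse alerts endpoint along the lines of GET /api/v1/alerts and a bearer API token, and since the alert payload differs between Sisense versions it simply dumps any field that looks like a cube or datasource reference rather than assuming a fixed schema.

```python
import requests

# Assumptions: base URL, token, and the /api/v1/alerts endpoint path
# may differ in your Sisense version -- verify against your REST API docs.
BASE_URL = "https://your-sisense-server.example.com"
HEADERS = {"Authorization": "Bearer YOUR_API_TOKEN"}

resp = requests.get(f"{BASE_URL}/api/v1/alerts", headers=HEADERS)
resp.raise_for_status()

for alert in resp.json():
    # Field names are not guaranteed, so surface anything that looks like
    # a cube/datasource reference to spot the affected Pulse alerts.
    datasource_fields = {
        key: value
        for key, value in alert.items()
        if "cube" in key.lower() or "datasource" in key.lower()
    }
    print(alert.get("name", alert.get("_id", "<unnamed alert>")), datasource_fields)
```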

Give developers the ability to explore production without being able to modify

Right now we can only give our developers either edit-model rights in test or read/use. With read/use they aren't able to actually see the cube on the data page. I need my developers to be able to explore connections, schemas, and tables in prod without having the ability to actually make any changes. https://community.sisense.com/category/support/discussions/supportportal#case-detail:500Pk00000mKN4vIAG

Data Security Synchronization between Elasticubes

I was wondering if there is a way to synchronize data security between Elasticubes. For example, I have Elasticube A and Elasticube B, which use the same field for data security. Elasticube A is updated monthly; Elasticube B is updated weekly. Is there functionality, or a workaround, to keep them in sync - e.g. if a change is made to Elasticube A, to synchronize the access rights of Elasticube B "automatically"?
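There is no built-in synchronization, as far as I know, but data security rules are exposed over the REST API, so a scheduled script can copy them from one cube to the other. The sketch below is a rough outline, assuming v0.9-style endpoints of the form GET/POST /api/elasticubes/{server}/{cube}/datasecurity and a bearer token; the exact paths and payload shape vary between Sisense versions, so treat it as a starting point and check your own API reference.

```python
import requests

# Assumptions: server names, cube titles, token, and the datasecurity
# endpoint paths below may differ in your Sisense version.
BASE_URL = "https://your-sisense-server.example.com"
HEADERS = {"Authorization": "Bearer YOUR_API_TOKEN"}

SOURCE = ("localhost", "Elasticube A")   # (ElastiCube server, cube title)
TARGET = ("localhost", "Elasticube B")

def get_rules(server, cube):
    url = f"{BASE_URL}/api/elasticubes/{server}/{cube}/datasecurity"
    resp = requests.get(url, headers=HEADERS)
    resp.raise_for_status()
    return resp.json()

def push_rules(server, cube, rules):
    # Strip per-cube identifiers so the target cube receives fresh copies.
    cleaned = [
        {k: v for k, v in rule.items() if k not in ("_id", "elasticube", "server")}
        for rule in rules
    ]
    url = f"{BASE_URL}/api/elasticubes/{server}/{cube}/datasecurity"
    resp = requests.post(url, headers=HEADERS, json=cleaned)
    resp.raise_for_status()

rules = get_rules(*SOURCE)
push_rules(*TARGET, rules)
print(f"Copied {len(rules)} data security rules from {SOURCE[1]} to {TARGET[1]}")
```

Run on a schedule right after Elasticube A's security is maintained, this keeps Elasticube B aligned; for safety you may want to diff the two rule sets first and only write when they actually differ.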

Create New Sample Data Sets with more data and more current date ranges

Create new sample data sets that have a lot more data, date ranges that are current to today's dates, and more of the scenarios we see across customer cases: more complex formula creation, a wide range of different KPIs, and a more complex data model that shows how dimension and fact tables can work together. It would also help to have a sample data set with custom code, custom tables, and custom columns, and to have one of the tables connect to a DW to show how that connection works as well.

Query Plan Analyzer - is it reliable?

It looks to me like the Query Plan Analyzer incorrectly labels some joins as "many" when they should be "one". Also, it seems to analyze the SQL constructed by Sisense, not the database's query plan. Is that correct, or am I missing something? (My goal with this post is to check my understanding of Sisense and the Query Plan Analyzer, and to point out potential issues.)

1) Incorrect "many"

See the attached "QPA-graph2.png". It looks like it's saying "Sum(Table1.Field1) Cross Join Sum(Table2.Field2) yields >1 rows". But the sums are not being sliced or grouped; each is just a sum over every row in its table. How can that possibly yield more than one row?

2) Sisense SQL vs database query plan

See the attached "QPA order of operations.png". It looks like it's saying that the plan is to:

[1] Select a row from Table1 (Episode dim table). (1 row.)
[2] Join Table2 (Dates) to Table3 (Patient/Episode fact table) on DateID. (80mil rows.)
[3] Join the results of [1] and [2]. (1 row.)

It would be more efficient to:

[1] Select a row from Table1 (Episode dim). (1 row.)
[2] Join [1] to Table3 (fact) on EpisodeID. (1 row.)
[3] Join [2] to Table2 (Dates) on DateID. (1 row.)

That way, it doesn't need to load all the Table2 and Table3 rows. The database's query planner knows that it can safely rearrange the joins this way, and it has cardinality estimates, so it can see which order will result in less work.
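To make the difference between the two orders concrete, here is a small, purely illustrative Python simulation of the join orders described above. The table names, sizes, and the selected EpisodeID are invented for the example (and the fact table is scaled down from 80 million to 80 thousand rows so it runs instantly); the point is only that filtering on the dimension table first keeps every intermediate result tiny, while joining Dates to the fact table first materializes the entire fact table.

```python
# Toy stand-ins for the Episode dim (Table1), Dates dim (Table2), and
# Patient/Episode fact (Table3) described above. Sizes are illustrative.
episodes = [{"EpisodeID": i} for i in range(1_000)]
dates = [{"DateID": d} for d in range(3_650)]
facts = [{"EpisodeID": i % 1_000, "DateID": i % 3_650} for i in range(80_000)]

selected_episode = 42  # the single dim row the query filters on
date_ids = {d["DateID"] for d in dates}

# Plan A (as the analyzer reports it): join Dates to the fact table first,
# then join the selected Episode row to that big intermediate result.
plan_a_step2 = [f for f in facts if f["DateID"] in date_ids]          # whole fact table
plan_a_step3 = [f for f in plan_a_step2 if f["EpisodeID"] == selected_episode]

# Plan B (cheaper): filter the fact table by the selected Episode first,
# then join the small result to Dates.
plan_b_step2 = [f for f in facts if f["EpisodeID"] == selected_episode]
plan_b_step3 = [f for f in plan_b_step2 if f["DateID"] in date_ids]

print("Plan A intermediate rows:", len(plan_a_step2), "->", len(plan_a_step3))
print("Plan B intermediate rows:", len(plan_b_step2), "->", len(plan_b_step3))
```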

Refresh schema for all tables

Can we get a "refresh schema for all tables" button?

Reason: our tables are usually "Select * From AnalyticsSchema.ViewName". We control which fields to return by editing the view, not the Sisense table definition. When a field gets added, removed, or changed, we need to refresh the schema. That's fine to do manually while you're working on that data model and its views, but we need to refresh all tables when:

- We copy a data model to a different server. We need to refresh the schema at least to double-check that the views are as expected on the new server. (If any fields have changed, then I'll need to go fix any widgets using those fields or, more likely, update the view to include them.)
- A view gets edited, perhaps for a different data model, and my data model hasn't been updated.
- I edit several views and want to refresh the schema for all of those Sisense tables. If I've changed fields that are in use, then I'll need to go into each table manually anyway, so it doesn't matter, but I've had a case where I removed unused fields from several views and then had to click "refresh schema" on every table individually.
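There is no bulk refresh today, but you can at least find out which tables are stale without opening each one. The sketch below is a rough outline under several assumptions: it uses v2-style endpoints of the form GET /api/v2/datamodels and GET /api/v2/datamodels/{oid}/schema (the payload field names vary by version), a hypothetical pyodbc DSN for the warehouse, and it assumes each Sisense table name matches the underlying view name in AnalyticsSchema. It compares the columns Sisense knows about with the columns the view currently exposes and prints the tables that need a manual refresh.

```python
import pyodbc
import requests

# Assumptions: endpoint paths, schema payload field names, and the DSN below
# are placeholders -- adjust them to your environment and Sisense version.
BASE_URL = "https://your-sisense-server.example.com"
HEADERS = {"Authorization": "Bearer YOUR_API_TOKEN"}
DB = pyodbc.connect("DSN=AnalyticsWarehouse")  # hypothetical warehouse DSN

def view_columns(schema_name, view_name):
    """Column names the database view currently exposes."""
    cur = DB.cursor()
    cur.execute(
        "SELECT COLUMN_NAME FROM INFORMATION_SCHEMA.COLUMNS "
        "WHERE TABLE_SCHEMA = ? AND TABLE_NAME = ?",
        (schema_name, view_name),
    )
    return {row[0].lower() for row in cur.fetchall()}

def get(path):
    resp = requests.get(f"{BASE_URL}{path}", headers=HEADERS)
    resp.raise_for_status()
    return resp.json()

for model in get("/api/v2/datamodels"):
    schema = get(f"/api/v2/datamodels/{model['oid']}/schema")
    for dataset in schema.get("datasets", []):
        for table in dataset.get("schema", {}).get("tables", []):
            sisense_cols = {c["name"].lower() for c in table.get("columns", [])}
            live_cols = view_columns("AnalyticsSchema", table["name"])
            if live_cols and sisense_cols != live_cols:
                print(f"{model['title']} / {table['name']} needs a schema refresh: "
                      f"added {sorted(live_cols - sisense_cols)}, "
                      f"removed {sorted(sisense_cols - live_cols)}")
```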

SingleStore (MemSQL) Connector: Use CONCAT instead of Pipe operator (||)

In some cases the queries Sisense generates use the pipe operator (||) to concatenate strings. However, SingleStore has two different ways of treating the pipe operator, depending on the @@sql_mode engine variable and its PIPES_AS_CONCAT flag:

- flag ON: the pipe operator is treated as CONCAT and the Sisense-generated query works
- flag OFF (the default): the pipe operator is treated as OR and the Sisense-generated query throws an error

To avoid the ambiguity, I suggest that the SingleStore (MemSQL) connector use CONCAT instead of the pipe operator by default.
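Until the connector switches to CONCAT, a workaround on the database side is to add PIPES_AS_CONCAT to sql_mode so the generated queries parse as intended. A minimal sketch, assuming a MySQL-wire-compatible connection via pymysql and sufficient privileges to change the global variable (host and credentials are placeholders):

```python
import pymysql

# Placeholder connection details -- SingleStore speaks the MySQL wire protocol.
conn = pymysql.connect(host="singlestore-host", port=3306,
                       user="admin", password="***", autocommit=True)

with conn.cursor() as cur:
    cur.execute("SELECT @@GLOBAL.sql_mode")
    current = cur.fetchone()[0]
    print("Current sql_mode:", current)

    if "PIPES_AS_CONCAT" not in current:
        new_mode = (current + ",PIPES_AS_CONCAT").lstrip(",")
        # Global so new sessions (including the Sisense connector's) pick it up,
        # session so the verification below works in this connection too.
        cur.execute("SET GLOBAL sql_mode = %s", (new_mode,))
        cur.execute("SET SESSION sql_mode = %s", (new_mode,))

    cur.execute("SELECT 'a' || 'b'")
    print("'a' || 'b' now evaluates to:", cur.fetchone()[0])
```

Note that changing the global sql_mode affects every client of that SingleStore cluster, not just Sisense, so confirm the change is acceptable in your environment first.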

Bug in data model list view rename field

Bug: I want to rename a field in my data model. In Diagram view, that works fine. In List view, when I click Rename on the field, the textbox accepts letters, but when I press the space or arrow keys the keyboard input is handled incorrectly: it saves my half-finished new field name.

Connection Tool - Programmatically Remove Unused Datasource Connections, and List All Connections

Managing connections within your Sisense environment can become complex over time if there are a large number of connections, and connections are frequently added or replace earlier datasource connections. In some scenarios unused connections can accumulate, cluttering the connection manager UI with entries that are no longer relevant. Although unused connections typically represent minimal direct security risk, it is considered best practice to maintain a clean, organized list of connections, and in some scenarios it is desirable to remove all unused connections.

Sisense prevents the deletion of connections actively used in datasources, safeguarding your dashboards and datasources from disruption. However, inactive or "orphaned" connections remain after datasources are deleted or a connection is replaced, potentially contributing to unnecessary complexity in the connection manager UI. Connections can be of any type Sisense supports; common types include various SQL connections, Excel files, and CSV files, as well as many data providers, such as Big Panda. This tool can also be used to list all connections, with no automatic deletion of unused connections.
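For reference, the core of such a tool is small: pull every connection, pull every data model's schema, and treat any connection that no dataset references as unused. The sketch below only lists candidates and deletes nothing (matching the list-only mode mentioned above); it assumes endpoints along the lines of GET /api/v1/connection and GET /api/v2/datamodels/{oid}/schema together with the field names shown in the comments, all of which vary by Sisense version, so verify against your own REST API reference before relying on it.

```python
import requests

# Assumptions: endpoint paths and payload field names are placeholders and
# may differ by Sisense version -- check your REST API reference.
BASE_URL = "https://your-sisense-server.example.com"
HEADERS = {"Authorization": "Bearer YOUR_API_TOKEN"}

def get(path):
    resp = requests.get(f"{BASE_URL}{path}", headers=HEADERS)
    resp.raise_for_status()
    return resp.json()

# 1. Every connection known to the connection manager.
all_connections = {c["oid"]: c for c in get("/api/v1/connection")}

# 2. Every connection referenced by a dataset in any data model.
used = set()
for model in get("/api/v2/datamodels"):
    schema = get(f"/api/v2/datamodels/{model['oid']}/schema")
    for dataset in schema.get("datasets", []):
        connection = dataset.get("connection") or {}
        if connection.get("oid"):
            used.add(connection["oid"])

# 3. Anything not referenced is a cleanup candidate (listed only, never deleted here).
for oid, connection in all_connections.items():
    status = "UNUSED" if oid not in used else "in use"
    print(f"{status:8} {connection.get('provider', '?'):12} {connection.get('title', oid)}")
```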

Help with CI/CD Workflow for Sisense + GitHub Integration

Hi everyone,

We're currently using two environments, test and prod, and recently began using Sisense's Git integration with GitHub. We're now looking to improve our CI/CD process and wondering if anyone has come across solid documentation or examples. Specifically, we're hoping to:

- automatically deploy changes from a feature branch in test to main in test after a pull request is merged, and
- then automatically deploy the same changes to prod.

Has anyone implemented a workflow like this, or found official guidance on automating these deployment steps? We'd love to avoid manually pulling these changes into each environment. Appreciate any advice, links, or lessons learned!