Connection management - Adding a Test option
Hello,

In the Connection Management window, I would like the ability to test that a connection is working.

Requirement: Be able to validate that all the connection parameters are correct: connect string, hostname, credentials...

Suggestion: In the contextual menu (the three dots at the right of the connection), there are options to Share, Edit, and Rename a connection. I would like a Test option there as well.

Thank you,
Franck
Data Source Dependency Report

I would like to be able to run a lineage/dependency report in Sisense. Something that will tell me, for all the cube/live data sources:

- Data source name
- Type (cube/live)
- Dates of the data model (created, modified, last built)
- Data sources being referenced and their type (web service, database, etc.)
- Details of each data source (e.g., for SQL Server: server name, database name)
- Tables or objects being referenced by those data sources

It's important for managing the environment to know what we are using, and to have an easy way to know which data sets/dashboards would break if a table's structure were changed in the database.
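Until such a report exists natively, parts of it can be approximated by flattening the data-model schemas returned by the Sisense REST API into report rows. A sketch of the flattening step only (every field name below — `title`, `type`, `datasets`, `connection`, `tables` — is an assumption to verify against the actual API response of your Sisense version):

```python
def flatten_datamodel(model: dict) -> list[dict]:
    """Flatten one data-model schema dict into one report row per referenced table.

    Field names are illustrative; check them against your version's API payload.
    """
    rows = []
    for dataset in model.get("datasets", []):
        conn = dataset.get("connection", {})
        for table in dataset.get("tables", []):
            rows.append({
                "model": model.get("title"),
                "model_type": model.get("type"),          # e.g. cube vs. live
                "source_provider": conn.get("provider"),  # e.g. SQL Server
                "source_server": conn.get("server"),
                "source_database": conn.get("database"),
                "table": table,
            })
    return rows

# Hypothetical payload shaped like the assumptions above.
sample = {
    "title": "Sales",
    "type": "extract",
    "datasets": [{
        "connection": {"provider": "SQLServer", "server": "db01", "database": "erp"},
        "tables": ["dbo.Orders", "dbo.Customers"],
    }],
}
rows = flatten_datamodel(sample)
```

Cross-referencing such rows with dashboard metadata would give the "what breaks if this table changes" view described above.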
Add Support for Oracle 26 as a VectorDB provider to the Sisense AI Assistant

We need more options for vector DB providers; the current version of Sisense supports only MongoDB. That makes this a single-dependency feature, which is likely to cause problems down the road. Oracle is also a very popular database choice, so supporting it would make the feature much more accessible for Sisense customers. Please consider it.
Native Support for Salesforce Connected App Authentication

Salesforce has announced that API Access Control must be enforced, and once fully enforced, the username/password + security token method will no longer be permitted for API integrations. https://help.salesforce.com/s/articleView?id=005228838&type=1

While the current Sisense guidance allows us to continue using token-based authentication, we believe this is only a temporary gap in Salesforce's enforcement. We expect that Salesforce will require all integrations to authenticate exclusively through an allowlisted Connected App using OAuth.

This is not a feature request in the traditional sense. It is a compliance requirement dictated by Salesforce's security model, and other analytics and reporting tools that integrate with Salesforce already support Connected App-based OAuth authentication natively.

To ensure long-term compatibility and security, Sisense needs to provide:

- Native support for OAuth via a Salesforce Connected App, without requiring manual JDBC string assembly or custom development
- A UI-driven configuration aligned with Salesforce's allowlisting and API Access Control policies
- Clear guidance for customers migrating away from token-based authentication

Without this capability, Sisense will no longer be able to integrate with Salesforce once Salesforce completes enforcement. Please escalate this as a priority compliance feature.
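For reference, the Connected App flow Salesforce is moving everyone toward is standard OAuth 2.0. A sketch of the client-credentials token request (the token URL is Salesforce's documented endpoint; the helper functions are just an illustration of what a native Sisense integration would do internally, with placeholder credentials):

```python
import json
import urllib.parse
import urllib.request

TOKEN_URL = "https://login.salesforce.com/services/oauth2/token"

def build_token_request(client_id: str, client_secret: str) -> bytes:
    """Form-encode the OAuth 2.0 client-credentials grant body."""
    return urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,          # the Connected App's consumer key
        "client_secret": client_secret,  # the Connected App's consumer secret
    }).encode("ascii")

def fetch_access_token(client_id: str, client_secret: str) -> str:
    """Exchange Connected App credentials for an access token (network call)."""
    req = urllib.request.Request(TOKEN_URL, data=build_token_request(client_id, client_secret))
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["access_token"]

# Placeholder values; no network call is made here.
body = build_token_request("my_consumer_key", "my_consumer_secret")
```

The point is that no username, password, or security token appears anywhere in the request, which is exactly what the enforced API Access Control model requires.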
Ability to Specify Provider in Pulse Alerts

Problem Statement: Inability to Scope System Alerts by Data Provider

Currently, Pulse System Alerts for build failures are binary: either enabled for everything or disabled. In complex enterprise environments, we often run hybrid deployments where ElastiCubes are powered by vastly different backend providers (e.g., legacy MSSQL/Oracle vs. modern Snowflake/Redshift/BigQuery). When a legacy database goes down for maintenance, or when non-critical local CSV cubes fail, our administrators are flooded with Pulse notifications. This noise often causes us to miss critical failures in our primary Snowflake cloud data warehouse, which has direct cost and SLA implications.

Proposed Feature: Provider-Based Alert Routing

We need the ability to configure Pulse System Alert rules based on the underlying Provider or Connectivity Type of the ElastiCube. Specifically, in the Pulse > System Alerts > Build Failed configuration, please add condition logic or a filter that allows us to include/exclude specific providers.

Configuration Example:

- Alert Rule 1: Send to CloudOps Team IF Build Fails AND Provider = Snowflake, Redshift.
- Alert Rule 2: Send to DBA Team IF Build Fails AND Provider = MSSQL, Oracle.
- Alert Rule 3: Do NOT send alert IF Provider = CSV, Excel.

Business Impact

- Noise Reduction: Eliminates "alert fatigue" by filtering out expected failures from dev/test environments or legacy systems.
- Targeted Incident Response: Ensures the right team (Cloud Ops vs. Legacy DBAs) receives the alert immediately, reducing Mean Time to Resolution (MTTR).
- Cost Management: Helps prioritize failures that impact billable cloud compute consumption.
Reusable/shared connection information for data models

It seems strange to me that each time I need to add another table to my model, I have to re-enter all the connection information (or pick a recent connection). If I have 10 tables in my model, they all have their own individual connection information. If a server ever gets renamed, it'll be a crazy headache for us.

There should be a place where you define the distinct list of actual database connections (one per server or database), give each one a name, apply security, etc. Then, when you go to add a table to a data model, you pick from the previously defined list of available connections.
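Conceptually this is a registry of named connections that tables reference indirectly, so a server rename is a single edit. A sketch of the idea (the class, names, and shapes are mine, purely to illustrate the requested behavior):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Connection:
    name: str
    server: str
    database: str

# Defined once, centrally. Renaming a server means editing one entry here.
CONNECTIONS = {
    "erp": Connection("erp", "sql01.corp.local", "ERP"),
    "crm": Connection("crm", "sql02.corp.local", "CRM"),
}

# Tables reference a connection by name instead of embedding its details.
MODEL_TABLES = [
    {"table": "dbo.Orders", "connection": "erp"},
    {"table": "dbo.Accounts", "connection": "crm"},
]

def resolve(table_ref: dict) -> Connection:
    """Look up the shared connection a table points at."""
    return CONNECTIONS[table_ref["connection"]]
```

With ten tables pointing at "erp", updating `CONNECTIONS["erp"]` retargets all of them at once, which is the whole point of the request.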
Different database connections on staging server vs production server

Hi,

We have two cloud servers in Sisense: one for development (staging) and one for production. The staging server connects to our staging database; the production server connects to our production database. All cubes and dashboards are identical except for database connection strings and names.

Our Git branching strategy follows these steps:

1. Create a feature branch from staging.
2. Make changes and push them to the feature branch.
3. Open a pull request from the feature branch to staging.
4. Test changes in staging.
5. If approved, merge staging into master (production) to deploy changes.

The Issue

Git integration tracks database connection names, meaning both servers must run on either staging data or production data. This is not feasible for us.

Proposed Solution

We suggest implementing a decentralized environment variable for storing database connections. For example:

- Use {database-server-name} as a placeholder in configurations.
- Set database-server-name = db_server_staging on staging.
- Set database-server-name = db_server_production on production.

This would allow the same codebase to dynamically connect to the appropriate database without manual adjustments. Would love to hear your thoughts on this!
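The substitution being proposed can be sketched in a few lines. The placeholder syntax is the one suggested above; the resolver function and the JDBC template are illustrative, not anything Sisense ships today:

```python
import os
import re

def resolve_connection(template: str, env=None) -> str:
    """Replace {placeholder-name} tokens in a connection string with
    values from the environment (or a supplied mapping, for testing)."""
    env = env if env is not None else os.environ
    return re.sub(r"\{([\w-]+)\}", lambda m: env[m.group(1)], template)

# The same template is committed to Git on both servers...
template = "jdbc:sqlserver://{database-server-name};databaseName=analytics"

# ...and each server resolves it against its own environment.
staging = resolve_connection(template, {"database-server-name": "db_server_staging"})
```

Because only the placeholder lives in Git, the staging-to-master merge never carries a concrete server name across environments.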
Add a native connector and dialect for Dremio

We are using Dremio as a universal semantic layer to connect to several backend databases: PostgreSQL, Oracle, HDFS, Snowflake. Using the Arrow Flight JDBC driver allows the connection to Dremio, but often results in bad queries. I believe a custom dialect will be needed.
Create New Sample Data Sets with more data and more current date ranges

Create new sample data sets that have a lot more data, date ranges that are current to today's date, and more of the scenarios we see across customer cases: more complex formula creation, a wide range of different KPIs, and a more complex data model that shows how dimension and fact tables can work together. The sample data sets should include custom code, custom tables, and custom columns. One of the tables could also connect to a DW to show how that connection works.
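A date range that stays "current to today" can be generated rather than fixed. A small stdlib sketch of the kind of rolling sample data this would take (column names and the order-generating logic are made up for illustration):

```python
import random
from datetime import date, timedelta

def make_sample_orders(n: int = 100, days_back: int = 365) -> list[dict]:
    """Generate sample order rows whose date range always ends at today's date."""
    today = date.today()
    rng = random.Random(42)  # fixed seed so the sample is reproducible
    return [
        {
            "order_id": i,
            "order_date": today - timedelta(days=rng.randrange(days_back)),
            "amount": round(rng.uniform(10, 500), 2),
        }
        for i in range(n)
    ]

orders = make_sample_orders()
```

Rebuilding the sample cube on a schedule with a generator like this would keep dashboards, relative-date filters, and trend widgets demonstrable year-round instead of going stale.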