Best Practices for Optimizing Sisense Dashboards for Large Datasets
Hi everyone, I’m currently working on optimizing some Sisense dashboards that handle large datasets, and I’d love to get some insights from the community on best practices. We’re pulling in millions of records, and while performance is decent, I’d like to make it even more efficient. Here are a few specific challenges I’m facing:

- Slow load times: Some dashboards take a while to load, especially when filters are applied. Are there specific design techniques or settings that can help speed this up?
- Aggregation strategy: Would it be more efficient to pre-aggregate certain calculations in the Elasticube rather than doing them in the dashboard? If so, how do you determine which ones to pre-aggregate?
- Materialized views vs. Elasticube optimization: Have you found materialized views in the source database to be a better approach than optimizing the Elasticube? What’s been your experience with balancing these two strategies?
- Best plugins or extensions: Are there any plugins or third-party tools you’d recommend for performance monitoring or caching that integrate well with Sisense?
- Data modeling tips: What are some best practices for structuring the Elasticube to minimize unnecessary joins and calculations?

I appreciate any advice or real-world experiences you can share! Looking forward to learning from the community. Thanks in advance!

Regards,
Delilah
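On the aggregation-strategy question, a common rule of thumb is that additive measures (SUM, COUNT) are safe to pre-aggregate, while non-additive ones (e.g. a distinct count of customers) generally are not. A toy Python sketch, with hypothetical column names, illustrating how collapsing fact rows to the grain the dashboard actually queries shrinks what has to be scanned:

```python
from collections import defaultdict

# Hypothetical raw fact rows: (date, region, sales_amount).
# In practice these would come from the source table feeding the Elasticube.
raw_rows = [
    ("2024-01-01", "EMEA", 120.0),
    ("2024-01-01", "EMEA", 80.0),
    ("2024-01-01", "APAC", 50.0),
    ("2024-01-02", "EMEA", 30.0),
]

def pre_aggregate(rows):
    """Collapse rows to the (date, region) grain, summing the additive measure.

    If every widget filters and groups at this grain or coarser, the
    pre-aggregated table answers the same questions with far fewer rows.
    """
    totals = defaultdict(float)
    for date, region, amount in rows:
        totals[(date, region)] += amount
    return [(d, r, amt) for (d, r), amt in sorted(totals.items())]

aggregated = pre_aggregate(raw_rows)
# Four raw rows collapse to three rows at the (date, region) grain.
```

The same collapse can be done as a materialized view in the source database instead; which side wins usually depends on how fresh the numbers need to be and where the build load is cheapest.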
Does Sisense free up RAM from crashed queries and builds?
I've been looking at RAM use. While experimenting, I sometimes build bad widgets or cubes that use lots of RAM and then fail with the "abnormal memory consumption" message. That's fine, but I've noticed that after RAM use climbs, it stays high. After the failure, shouldn't Sisense free up the RAM? I want to experiment, and I don't mind hogging server resources while building or querying, but I expect those resources to be released for other users after I cancel or crash.
Feedback loop with github integration
I can't push updates to sisense/periscope through our GitHub integration because the repository is larger than 500 MB. I also can't delete files in the repository to make it smaller, because the same error occurs and the Periscope ingest is canceled. How do I proceed to make the repository capable of pushing data to our dashboards again?

Error message received through the Slack integration:
Periscope ingest canceled. * Git repository too large: Repository cannot be larger than 500000000 bytes.
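Not Sisense-specific, but a common first step with an oversized Git repository is finding which blobs dominate its history (e.g. by joining the output of `git rev-list --objects --all` with `git cat-file --batch-check`). A minimal sketch of the reporting step, assuming you have already parsed that output into hypothetical (path, size) pairs:

```python
def largest_blobs(entries, top_n=5):
    """Return the top_n (size_bytes, path) pairs, largest first.

    `entries` is assumed to be (path, size_bytes) pairs parsed from
    git's object listing; the paths and sizes below are made up.
    """
    return sorted(((size, path) for path, size in entries), reverse=True)[:top_n]

# Hypothetical dump of repository objects:
objects = [
    ("dashboards/sales.yaml", 12_000),
    ("assets/demo_video.mp4", 420_000_000),
    ("views/orders.sql", 3_500),
    ("data/export.csv", 95_000_000),
]

for size, path in largest_blobs(objects, top_n=2):
    print(f"{size:>12}  {path}")
```

Note that deleting a large file in a new commit does not shrink the repository, since the blob remains in history; actually removing it typically requires rewriting history (e.g. with a tool like git filter-repo) and force-pushing, which is worth coordinating with Sisense support given the sync integration.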
Github changes are discarded
We are trying to make bulk edits to our dashboards using our existing GitHub integration. When a change to a file is merged in GitHub, the commit shows in the commit history for about four minutes, then disappears, and the file reverts to the last change saved through the web UI. The changes never appear in the dashboard on the web UI, and the commits never show up in the dashboard history. The sync appears to be working (at least in one direction), because the GitHub repository is updated immediately with any change made through the web UI. I followed the troubleshooting steps in this doc: https://dtdocs.sisense.com/article/git-integration and no Git tags are created during the process. What can I do to make changes via the GitHub repository?
Iframe Embedded SDK not loading dashboard in Chrome but functioning in Incognito mode
We've encountered an issue with our Sisense integration with the Salesforce platform over the past few days. Specifically, the Sisense dashboard isn't loading correctly in the Chrome browser, although it functions properly in Incognito mode. I'm uncertain whether this is a cache issue resulting from recent incidents or configuration changes, or whether it's caused by another factor. Could you provide some insight into this matter?