Taming Chaos and Clutter in Sisense Self-Service BI!
With an established self-service Sisense platform, data explorers can now freely analyze data by crafting their own widgets and dashboards. But beware! Without proper guidelines and guardrails, this newfound freedom can conjure chaos and clutter. While my previous blog delved into the reasons for embracing self-service capabilities, this one stresses the significance of maintaining control and governance for a clutter-free and productive Sisense experience.

Why Self-Service Matters: Allowing users to create their own dashboards empowers them to extract valuable insights without relying on dedicated teams or specialized knowledge. This democratization of data leads to quicker decision-making and fosters a data-driven culture within the organization.

The Challenge of Clutter: While self-service offers numerous benefits, it also presents challenges, with clutter being one of the most significant. Without proper oversight, the platform can quickly become inundated with irrelevant or poorly constructed dashboards, hindering productivity and data analysis.

Strategies for a Clutter-Free Sisense Environment

Standardization and Governance: To maintain consistency and ensure compliance, implement standardized templates, color schemes, and design guidelines. Define clear governance policies and data access controls to prevent unauthorized access to sensitive data. At Gousto we use pre-designed templates to maintain consistency across the board. We are also exploring options to use GitHub with Sisense to version-control dashboards; this will prevent inadvertent overwriting of dashboards while still allowing users to personalize them for their own purposes (a minimal export-and-commit sketch appears at the end of this post).

Training and Resources: Empower users with comprehensive training and resources to enhance their BI skills. Offer tutorials, documentation, and online courses to encourage best practices in dashboard creation.

Dashboard Approval Process: Introduce a review and approval process for user-generated dashboards. Designate administrators or data stewards to validate and approve dashboards before they are shared, promoting quality and relevance.

Encourage Reusability: Promote the use of shared data sources and pre-built widgets to discourage duplication and redundancy. Facilitate collaboration among users and maintain a clutter-free environment. Tip: a dashboard catalog listing all the available dashboards along with their data sources is a good place to start. Anyone intending to create a new dashboard can then skim the list before proceeding, to avoid duplication.

Usage Analytics: Utilize analytics tools within the Sisense platform to track dashboard usage and engagement metrics. Identify underused dashboards and consider archiving or removing them to reduce clutter. We connect to a (MongoDB) database that stores the dashboard, ElastiCube, and user metadata to run our Governance dashboard, which monitors dashboard usage and user activity (a simplified query sketch follows this list). This then informs the next step.

Regular Cleanup Campaigns: Conduct periodic clean-up campaigns where administrators review existing dashboards, remove unused or irrelevant ones, and promote best practices to maintain a clutter-free environment.

Establish a Community Forum: Create a community forum where users can share and discuss their dashboards. Encourage collaboration and identify dashboards that provide value to a broader audience.

Usage Quotas or Limitations: Consider setting usage quotas or limitations on certain resources to prevent the creation of an excessive number of dashboards and encourage thoughtful design.
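To make the Usage Analytics idea above concrete, here is a minimal sketch of the kind of staleness query a governance dashboard can be built on. It assumes a read-only connection to a metadata store; the database, collection, and field names below (sisense_metadata, dashboards, lastOpened, owner) are illustrative assumptions rather than the exact schema of a Sisense instance.

```python
# A minimal sketch of the kind of usage query behind a governance dashboard.
# Collection and field names ("dashboards", "lastOpened", "owner") are
# assumptions for illustration -- check the metadata your own instance exposes.
from datetime import datetime, timedelta

from pymongo import MongoClient

STALE_AFTER_DAYS = 90

client = MongoClient("mongodb://governance-reader:<password>@sisense-metadata:27017")
dashboards = client["sisense_metadata"]["dashboards"]

cutoff = datetime.utcnow() - timedelta(days=STALE_AFTER_DAYS)

# Dashboards that have not been opened within the cutoff window are
# candidates for archiving in the next clean-up campaign.
stale = dashboards.find(
    {"lastOpened": {"$lt": cutoff}},
    {"title": 1, "owner": 1, "lastOpened": 1},
)

for dash in stale:
    print(dash.get("title"), dash.get("owner"), dash.get("lastOpened"))
```

The output feeds the clean-up campaigns described below: owners of stale dashboards can be contacted before anything is archived.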
The options are endless, and I have outlined the top few that have proven useful to us. A self-service Sisense platform empowers users to explore and analyze data effectively. However, to prevent chaos and clutter, it is essential to implement control measures and governance. By standardizing processes, encouraging reusability, and conducting regular clean-up campaigns, organizations can strike the right balance between autonomy and order. This approach ensures a productive and clutter-free environment, allowing users to make the most of their Sisense platform while adhering to best practices and governance.
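On the GitHub idea mentioned earlier in this post, the sketch below shows the general shape of what we are exploring: export dashboard definitions from Sisense as JSON and commit them on a schedule. The endpoint path, the oid field, and the authentication details are assumptions to verify against your own instance's REST API reference; treat this as a starting point rather than a finished integration.

```python
# A rough sketch of exporting dashboard definitions so they can be tracked in Git.
# The endpoint path, auth header, and response shape are assumptions -- verify
# them against your instance's REST API reference before relying on this.
import json
import pathlib
import subprocess

import requests

SISENSE_URL = "https://sisense.example.com"   # hypothetical instance URL
API_TOKEN = "<api-token>"                     # placeholder
EXPORT_DIR = pathlib.Path("dashboard-exports")

headers = {"Authorization": f"Bearer {API_TOKEN}"}

# Assumed: GET /api/v1/dashboards returns a JSON list of dashboard definitions.
resp = requests.get(f"{SISENSE_URL}/api/v1/dashboards", headers=headers, timeout=30)
resp.raise_for_status()

EXPORT_DIR.mkdir(exist_ok=True)
for dashboard in resp.json():
    # One file per dashboard keeps Git diffs readable.
    out_file = EXPORT_DIR / f"{dashboard['oid']}.json"
    out_file.write_text(json.dumps(dashboard, indent=2, sort_keys=True))

# Commit the snapshot (note: "git commit" fails if nothing changed, which is
# fine for a scheduled job). Restoring an overwritten dashboard then becomes
# a matter of re-importing the JSON from an earlier commit.
subprocess.run(["git", "add", str(EXPORT_DIR)], check=True)
subprocess.run(["git", "commit", "-m", "Nightly dashboard snapshot"], check=True)
```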
Thank you all for an amazing cohort 🙏

I wanted to thank you all for joining the cohort and participating at any level. I know everyone has many obligations pulling them in different directions, and I’m thankful for the time we got to spend together during the cohort! Also, a huge thank you to Sofia and Jacqueline for making the cohort happen. It was great to meet all of you, and I’m looking forward to seeing everyone around in the future! If you ever want to chat about writing, give feedback on the cohort experience, or just stay connected, you can connect with me here, or on LinkedIn, Substack, or Twitter.
If you torture the data long enough, it will confess to anything: How to avoid misrepresentation

There is so much data you have access to within a company, and communicating accurate insights from it is challenging. This post covers the mental biases and common mistakes people make when analyzing data, namely confirmation bias, selection bias, and survivorship bias.

Understanding Confirmation Bias

Confirmation bias refers to the inclination to seek out or interpret information that aligns with preexisting beliefs, often disregarding contradictory evidence. In a business setting it can lead to ignoring indications that a feature, product, or business element isn't functioning optimally because attention is focused on a single favorable metric: negative signals are downplayed while positive aspects are emphasized.

How to Detect Confirmation Bias

Several indicators can signal the presence of confirmation bias in an analysis:

Only good news: If exclusively positive news is highlighted, confirmation bias is probably hiding some inconvenient metrics.

Limited metrics reported: People usually present only a few metrics because they are trying to be brief, which is desirable in most cases, yet it can also conceal contrary data.

Obscure metrics being used: Often we cannot measure exactly what we want and have to settle for proxy metrics. If people are confidently sharing convoluted proxy metrics, they are likely looking for ways to find positive signals in the data in the absence of direct measurements.

Counteracting Confirmation Bias

To approach confirmation bias systematically, consider these strategies:

1. Academic Approach: Establish a peer review system where a fellow analyst scrutinizes your analysis before presentation. This ensures diverse perspectives and helps validate conclusions. Additionally, replicating tests within the company, if resources permit, can corroborate findings.

2. Third-Party Audit: Leverage data analysts to randomly assess shared analyses. If discrepancies or misleading visuals emerge, engage in a collaborative review with the creator. Once corrections are made, distribute updates to affected stakeholders.

3. Insurance Methodology: Define clear parameters for data claims, restricting certain causal phrases, for instance. Claims beyond established boundaries aren't approved for sharing, aligning with an "insurance" mindset within the company.

4. Cultural Approach: Highlight the company's mission as paramount. Emphasize that the mission's importance surpasses personal satisfaction with a product or feature's success. Acknowledge that mistakes are inevitable and that they serve as stepping stones for better decision-making in the future.

By integrating these strategies, businesses can mitigate the influence of confirmation bias, fostering a more objective and holistic analysis of data.

Understanding Selection Bias
Selection bias arises when the group chosen for analysis does not accurately represent the broader population under study.

Acquiring an Unrepresentative Group Unintentionally

1. Convenience: This occurs when a group is chosen for analysis due to ease of measurement, leading to a non-representative sample. Example: when evaluating the value of a new feature, asking only three friends for their opinion may not reflect the broader customer base. Common convenient biases: engaged customers, the latest user cohort, a specific geographic region.

2. Self-Selection: This happens when the individuals who voluntarily participate in an analysis possess traits that differ from the population as a whole. Example: distributing a product satisfaction survey could attract highly opinionated respondents or those with time to spare, skewing results. Common biased self-selectors: extremely negative individuals, overly positive individuals, early adopters, power users.

Selection Bias in a Business Context

Suppose you’re launching a premium feature in your BI tool. If you invite only the most active users to try it, their motivations might cloud the analysis: they might test all new features, irrespective of value; they might seek contact with your company; they might aim to suggest new features. Although well-intentioned, their feedback might mislead you.

Remedies: Actively engage a representative sample for testing. Employ qualifying questions to select a balanced sample. If many early adopters respond, consider weighing their feedback against that of typical customers.

Grasping Survivorship Bias

These biased interpretations of events and data are prevalent within the business landscape. Imagine you’re overseeing a two-week trial phase for a recently launched dashboard/data product. At the midpoint of the trial, you observe only a few remaining active users. Let’s suppose that these users primarily consist of data analysts, and they are progressively generating more intricate analyses using your BI tool.

Possible Conclusions and Their Caveats

1. My dashboard/data product resonates with data analysts (inferring a norm): Without investigating the users who ceased engagement with your dashboard, you lack insight into whether the initial trial group included data analysts. If a majority of those who started the trial were indeed data analysts but a larger portion discontinued usage, your initial conclusion would be contradicted. It’s imperative to delve into the non-surviving trial participants before drawing conclusive statements. Engaged data analyst users do not necessarily imply that your product resonates with all data analysts. To reach a more informed conclusion: analyze the entire trial participant pool to unveil distinct patterns between engaged and non-engaged cohorts (a small sketch of this comparison follows at the end of this section). Understanding the characteristics of both groups before deducing a norm is crucial.

2. My dashboard empowers deep analysis (inferring causality): Currently, you lack a point of comparison for your users’ capabilities. While they may be utilizing your dashboard to perform advanced tasks, their proficiency might transcend your product’s design; their success could be due to their analytical skills rather than your dashboard’s influence. To reach a more informed conclusion: pre-assess users’ skill levels to gauge the true value added by your product’s features, and compare their performance using other tools for similar tasks. If your product demonstrates clear benefits, you can leverage these results as a testament to its effectiveness.
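To make the engaged-versus-non-engaged comparison concrete, here is a minimal sketch under assumed data: a table containing every trial participant, not just the survivors, with a role column. The column names and numbers are invented for illustration.

```python
# A minimal, hypothetical sketch of comparing the full trial cohort rather than
# only the "survivors". Column names (role, still_active, analyses_created) are
# illustrative assumptions, not a real schema.
import pandas as pd

# Imagine an export of every user who started the trial, not just those still active.
trial = pd.DataFrame(
    {
        "user_id": [1, 2, 3, 4, 5, 6],
        "role": ["analyst", "analyst", "manager", "analyst", "engineer", "manager"],
        "still_active": [True, False, False, True, False, False],
        "analyses_created": [14, 2, 1, 9, 0, 1],
    }
)

# Compare engaged vs. non-engaged cohorts before inferring any norm.
summary = trial.groupby("still_active").agg(
    users=("user_id", "count"),
    analyst_share=("role", lambda r: (r == "analyst").mean()),
    avg_analyses=("analyses_created", "mean"),
)
print(summary)

# If most of the churned users were also analysts, "my product resonates with
# analysts" is not supported by the surviving cohort alone.
```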
Key Takeaways

Analyze the entire trial cohort, encompassing both engaged and non-engaged users. Gauge your product’s impact by comparing user behavior with competitor tools. Evaluate users’ skill levels before attributing success solely to your product. By adopting these approaches, you can mitigate survivorship bias and arrive at more accurate and nuanced conclusions about your product’s performance.
The Importance of Thoughtful Design: A Prerequisite for Effective Dashboards

When tackling a problem, people tend to become emotionally invested in the solutions they come up with. The more time and effort they dedicate to a particular solution, the more they believe in its effectiveness, even if it may not actually address the core problem adequately. That's precisely why it is crucial to resist the temptation of rushing into creating a dashboard as a solution. We must avoid the creation of ineffective dashboards that won't serve their intended purpose. In my previous posts, I emphasized the importance of finding the right problem before jumping into crafting a solution and gave you the tools to do so.

To ensure the success of dashboard design, it is essential to invest time in understanding why we need a dashboard in the first place. Exploring multiple ideas and possibilities before settling on a specific solution is key. The dashboard design process should begin by identifying our stakeholders and their decision-making needs. This understanding will guide us in determining the relevant metrics that support those decisions. The next step involves prototyping the dashboards using simple tools like pen and paper, obtaining feedback, and iterating to refine the design. Only after confirming that the prototype aligns with the project's objectives should we proceed to gather the actual data and begin building the dashboard. Lastly, creating a useful dashboard requires sharing it effectively with the intended audience and maintaining it over time. By following this thoughtful approach to dashboard design, we can ensure that the end product becomes a valuable tool for its users.

Dashboard Design Process

We can summarize this process into four steps: Define, Prototype, Build, Deploy.

Step 1: The Initial and Crucial Step: Defining Stakeholders and Metrics for the Dashboard

The first and foremost step in dashboard creation is gaining absolute clarity regarding the intended audience and the metrics that hold significance to them. This clarity is paramount to ensure the dashboard's practicality and usability.

Stakeholders: In this endeavor, four main stakeholders play crucial roles: the Designer (you), responsible for crafting the dashboard; the User(s), those who will access and utilize the dashboard; the Key-User, an experienced user who provides valuable insights and feedback; and the Data Buddy, a team member offering assistance and support in dealing with data-related aspects.

Metrics: Working in collaboration with the key-user, you will transition from identifying the decisions that need to be made to determining the metrics that can be effectively queried and tracked. This process necessitates a thorough back-and-forth exchange to distinguish between interesting but non-essential metrics and the mission-critical data required for making informed decisions.

Step 2: Prototyping - Crafting a Useful Dashboard

After identifying the metrics to be included in the dashboard, the next step is to determine the most effective way to present them to the audience for maximum usefulness.

Visualizations: Select visualizations that convey the metrics clearly and accurately. Even during the sketching and prototyping phase, making the right visualization choices can significantly enhance the prototype and feedback process.

Dashboards: Utilize best practices for assembling the chosen visualizations into a cohesive dashboard.
The composition decisions might even lead you to reconsider your initial choice of optimal visualizations.

Sketching and Iteration: At this stage, it is highly recommended to sketch out the visualizations and dashboards either on paper or using a tool like mokkup.ai. This approach allows you to swiftly discard any unsuitable ideas without concerns about time investment. Moreover, it enables you to concentrate on the design itself rather than being preoccupied with validating the accuracy of the numbers. Links: Mokkup.AI

Step 3: Building the Dashboard: Transforming Prototype into Reality

After finalizing and being content with the prototype, the next step involves bringing the dashboard to life using real data.

Locating the Data: This stage can present various challenges. Questions such as where the data is stored, its cleanliness, and its availability may arise. Collaborating closely with the development team and the Data Buddy becomes essential in navigating this phase.

Creating Metrics and Dashboard: To build the dashboard, we must generate queries to power the metrics, formulate the necessary calculations, and transform the data into visually informative charts. Utilizing a framework to log the metrics, formulas, and data sources simplifies the process of creating queries and ensures a smoother development experience (a small example of such a metric log appears at the end of this post).

Step 4: Deploying and Sustaining the Dashboard: Ensuring Widespread Impact

After successfully creating a fully functional dashboard, the next crucial steps involve sharing it with the entire audience, enhancing its effectiveness at scale, and ensuring its ongoing maintenance as usage evolves.

Sharing: Given that users possess varying levels of data literacy and contextual understanding, it is imperative to ensure that the dashboard provides sufficient context within its design and offers adequate training resources. This approach enables people to easily derive valuable insights from the presented data.

Scaling: As the dashboard gains popularity, the number of views and viewers is likely to increase. To accommodate this growth, it is beneficial to incorporate links, interactivity, and comprehensive documentation, enabling the dashboard to serve a broader range of use cases and inspire other dashboard creators. Additionally, dedicating time to optimize queries becomes crucial in maintaining the dashboard's usefulness amid growing demands.

Maintenance: Data sources, tables, and fields are subject to change over time, necessitating corresponding adjustments in the dashboard. Establishing scheduled review periods is vital in ensuring that the dashboard remains relevant and fully functional. Furthermore, providing a means for the audience to report issues allows for informed improvements and ensures the dashboard's ongoing effectiveness.
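As an illustration of the metric-logging framework mentioned in Step 3, here is a minimal sketch of what such a log can look like in code. The metric names, formulas, and data sources are invented for the example; the point is simply that the name, formula, source, and owner of each metric live together in one reviewable place.

```python
# A hypothetical metric log: every dashboard metric is registered with its
# formula, data source, and owner, so queries can be generated and reviewed
# from a single place. Names and sources below are illustrative only.
from dataclasses import dataclass


@dataclass(frozen=True)
class Metric:
    name: str
    formula: str        # human-readable definition agreed with the key-user
    data_source: str    # table or ElastiCube the metric is queried from
    owner: str          # the Data Buddy or analyst who validates it


METRIC_LOG = [
    Metric(
        name="Station utilization",
        formula="picked_items / planned_items",
        data_source="warehouse.station_performance",
        owner="data-buddy@example.com",
    ),
    Metric(
        name="Orders per hour",
        formula="count(order_id) / active_hours",
        data_source="warehouse.orders",
        owner="data-buddy@example.com",
    ),
]

# During the build step, the log doubles as documentation and a review checklist.
for metric in METRIC_LOG:
    print(f"{metric.name}: {metric.formula} (from {metric.data_source})")
```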
Transforming Decision-Making: Real-Time Insights for Competitive Advantage

In the dynamic realm of modern business intelligence, real-time reporting has emerged as a pivotal force reshaping the landscape. Traditional approaches often involved generating static dashboards and reports linked to databases refreshed daily or less frequently; the advent of real-time reporting has opened new vistas of opportunity. While the traditional approach remains important in many scenarios, there are instances where real-time insights can provide a competitive edge and drive better decision-making.

Traditional vs. Real-Time Reporting

Imagine a scenario in warehouse management where operational efficiency is paramount. Traditional reporting methods often fail to detect underutilized stations until the next day's data refresh. This delay in identifying performance issues can result in reduced productivity and missed opportunities for immediate corrective action. The adoption of real-time reporting using tools like Sisense, however, introduces a paradigm shift in how businesses can harness the power of data.

A Case Study in Warehouse Management

To emphasize the impact of real-time reporting, consider an example that demonstrates how our stakeholders in the Warehouse Management team stand to benefit significantly. In this case study, the focal point is the integration of real-time Sisense reporting within warehouse management operations. The data flow originates from warehouse management systems and ultimately populates a data lake that undergoes daily updates. This data forms the basis for comprehensive Sisense dashboards that illuminate the operational performance of different warehouse stations, drilling down into the granular performance of individual pickers.

The drawback of relying solely on daily data refreshes becomes evident when considering the impact of underutilized stations. Traditionally, declining station performance might go unnoticed until the next day. For the sake of argument, if station xyz were to underperform by 1% at the 5th hour of a 12-hour shift, in a traditional set-up this issue would only be picked up by the dashboard the following day due to the nature of the data refresh. If that 1% drop persisted for each of the remaining seven hours, the station would end the shift roughly 7% below its expected overall performance. To mitigate this, the proposition to employ Sisense Live models gained traction. With live models and dashboards, underperforming stations can be detected immediately, thereby enabling prompt countermeasures. This approach prevents stations from underperforming for a prolonged period and stops overall performance from dropping.

Harnessing Sisense Live Models

The implementation of real-time reporting involves leveraging Sisense Live models, which enable the creation of dashboards with minimal delays. A prerequisite is a data source providing real-time data. Once connected, Sisense's live model integrates seamlessly with the data source, enabling dynamic visualization of data in real time. Additionally, Sisense allows widgets to be set to auto-refresh at specified intervals, making it possible to display these dashboards on factory floors for live monitoring. One key advantage of real-time reporting is enhanced situational awareness and proactive decision-making: by displaying real-time dashboards on factory floors, supervisors and employees can monitor performance at a glance.
Additionally, this live monitoring is further augmented by pulse alerts, which provide instant notifications for critical changes or anomalies in performance. For instance, when a station's efficiency drops below a certain threshold, it triggers an alert, enabling immediate intervention (a simplified sketch of this kind of threshold check appears at the end of this post).

Caching Periods and Sisense Live Models

One major challenge we encountered relates to the caching period for Sisense Live models. Currently, the caching period of live models is set for the whole instance, which removes flexibility when different caching periods are required, such as for exceptionally large models. For example, we have a table that must be ingested via a live connection because it is too large for ElastiCubes. However, we wouldn't want the caching period for this live connection to be too "low", as it would incur high costs through AWS. Conversely, a low caching period is desired for the low-latency warehouse management reporting explained above. We are exploring options and will update the thread once we have identified a solution.

In conclusion, real-time reporting is revolutionizing the field of business intelligence, and the warehouse management case study highlights its tangible impact on operational efficiency and productivity. By leveraging tools like Sisense, businesses can transition from reactive to proactive decision-making, ensuring that performance issues are promptly identified and addressed. As the business landscape continues to evolve, real-time reporting stands as a crucial tool for those seeking to stay ahead in a fast-paced world.
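To close, here is a simplified sketch of the threshold logic a pulse alert encodes, as referenced above. This is not how Sisense Pulse is implemented; the efficiency feed, function names, and notification hook are invented for the illustration.

```python
# A simplified illustration of the threshold logic that a pulse alert encodes.
# This is not the Sisense Pulse implementation -- the function names, the
# efficiency feed, and the notification hook are all invented for the example.
from typing import Callable

EFFICIENCY_THRESHOLD = 0.95  # alert when a station falls below 95% of target


def check_stations(
    latest_efficiency: dict[str, float],
    notify: Callable[[str], None],
    threshold: float = EFFICIENCY_THRESHOLD,
) -> list[str]:
    """Return the stations that breached the threshold and notify about each."""
    breached = [s for s, eff in latest_efficiency.items() if eff < threshold]
    for station in breached:
        notify(
            f"Station {station} at {latest_efficiency[station]:.0%} efficiency "
            f"(threshold {threshold:.0%}) -- intervene now rather than tomorrow."
        )
    return breached


# Example usage with made-up readings; in practice the readings would come from
# the live data source feeding the dashboard, and notify() might post to Teams.
readings = {"station_a": 0.99, "station_xyz": 0.94}
check_stations(readings, notify=print)
```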
Easy to learn, Hard to master

Creating insights that immediately grab the attention of the end-user, in a positive way, and inform them of the most important and relevant information is difficult. On the one hand, end-users might have demands regarding the look and feel of the dashboard and the chart types being used, which rarely leads to a positive outcome. On the other hand, designers tend to be either too concise or too long-winded in creating their dashboards and placing their widgets. Not to forget the understandability of the widgets and the dashboard itself, i.e. the storytelling you are trying to accomplish. For our designers we have created a Dashboard Handbook to support them in creating the best-looking and easiest-to-understand dashboards. We use two frameworks to support designers in doing the groundwork for designing a dashboard: the 4 W's and GPS.

The 4 W's

The 4 W's help you focus on your audience and determine the best way to deliver the relevant insights to your end-users. The 4 W's are Who, What, When and Where. As a company serving elderly care homes, I would explain our 4 W's as follows:

Who… → Doctors, Nurses, Practitioners, HR, Finance needs
What… → Patient, Ward, Measurements, Treatment
When… → Real-time, daily refresh, weekly refresh, monthly refresh
and Where… → Dashboard, Email report, Pulse Notification, Slack/Teams, Google Slides/Sheets

Example: A nurse needs an overview of all patients on a ward, three times a day (8.00, 13.00 and 16.00). They need to print it and hand it out, so an email report is the most usable medium for the nurse to consume the data.

Designers tend to think dashboards are the only way to bring insights to their end-users. However, with Sisense this is not the case. To really let end-users use data in their day-to-day work, we need to figure out what they need, when they need it and where they want to get their data. For example, in our own environment 60-70% of our end-users have never logged into Sisense. We ‘push’ the insights to them when they need them. The majority of these are email reports; however, messages to Teams are becoming more and more frequent. The remaining 30-40% are data-savvy users. These users want to interact with the data, click on it, filter on it and analyze it. The best way to serve these users is with dashboards. However, as mentioned before, designing a dashboard is easy to do but hard to master.

GPS

Since we started using GPS, we have been able to design dashboards that are quicker to build, easier to understand and goal-driven. GPS focuses on Goal, Problem and Solution. A goal of an end-user could be that they would like to reduce the number of incidents. The problem is that they only know the number of incidents; no additional information is known. The solution would then be to give insight into where, when and how those incidents take place. The GPS method can be used at both the dashboard and the widget level, creating a hierarchy of widget GPSs that are a solution to the dashboard GPS (a small worked example combining both frameworks follows below).

- Hamza
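As the worked example referenced above, here is one hypothetical way to capture the 4 W's and GPS together as a dashboard brief before any widget is built. The structure and field values are illustrative, not a prescribed template; it simply mirrors the nurse example from the post.

```python
# A hypothetical, minimal way to record the 4 W's and GPS for a dashboard brief
# before any widget is built. The field values below are illustrative only.
from dataclasses import dataclass


@dataclass
class DashboardBrief:
    # The 4 W's
    who: str      # the audience
    what: str     # the information they need
    when: str     # refresh cadence / delivery times
    where: str    # delivery medium (dashboard, email report, Teams, ...)
    # GPS
    goal: str
    problem: str
    solution: str


ward_overview = DashboardBrief(
    who="Nurses on a ward",
    what="Overview of all patients on the ward",
    when="Three times a day (08:00, 13:00, 16:00)",
    where="Email report (printed and handed out)",
    goal="Reduce the number of incidents",
    problem="Only the number of incidents is known, with no further context",
    solution="Show where, when, and how incidents take place",
)

# Reviewing the brief with the end-user before building keeps the design
# goal-driven instead of chart-driven.
print(ward_overview)
```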
Empowering Data-Driven Decisions: The Power of Self-Service Business Intelligence with Sisense

In today's fast-paced and data-driven business landscape, the ability to make timely and informed decisions is paramount for success. Traditional approaches to business intelligence (BI) often involved lengthy data requests, manual reporting, and reliance on IT teams, resulting in delays and missed opportunities. However, the advent of self-service BI tools has revolutionized the way organizations interact with data, empowering users to access, analyze, and visualize information independently. Among the leading self-service BI platforms, Sisense stands out as a trendsetter, offering powerful and user-friendly features that drive data discovery and democratize insights. In this blog, we will explore the remarkable potential of Sisense and its impact on businesses seeking to harness the full potential of their data.

What is Self-Service Business Intelligence?

Traditional BI systems often restricted access to data, relying on specialized IT teams to generate reports and analysis for business users. Self-service BI, on the other hand, flips the script, placing the power of data exploration and analysis directly into the hands of end-users. With self-service BI, business users can access and manipulate data without extensive technical expertise, fostering a culture of data-driven decision-making at all levels of an organization.

The Sisense Advantage

Sisense is a leading player in the self-service BI market, renowned for its cutting-edge features and ease of use. Here are some key aspects that set Sisense apart:

a. Data Integration Made Easy: Sisense simplifies the process of data integration by seamlessly connecting to various data sources, including databases, cloud services, spreadsheets, and more. Its powerful ETL (Extract, Transform, Load) capabilities enable users to merge, clean, and prepare data for analysis without complex coding or scripting.

b. Intuitive Data Visualization: Visualizing data is crucial for understanding trends, patterns, and insights. Sisense's drag-and-drop interface allows users to create visually appealing and interactive dashboards and reports effortlessly. With customizable widgets and a wide array of chart types, users can tell compelling data stories without relying on IT or design teams.

c. Elastic Scalability: As businesses grow, so does the demand for data insights. Sisense's elastic scalability ensures that the platform can handle vast amounts of data and concurrent users without compromising on performance. Whether you're a small startup or a multinational corporation, Sisense grows with you.

d. AI-Powered Analytics: Sisense leverages the power of artificial intelligence and machine learning to assist users in uncovering valuable insights from their data. From anomaly detection to predictive analytics, the platform empowers users to make data-driven decisions based on accurate forecasts and trends.

Democratizing Data Insights

One of the most significant advantages of Sisense is its role in democratizing data within an organization. By empowering business users with self-service BI capabilities, data insights are no longer confined to a select few. Teams across departments can explore data, ask questions, and derive insights independently, fostering a data-driven culture where decisions are grounded in evidence.

Enhanced Decision-Making Agility

In the dynamic business world, agility is a competitive advantage.
With Sisense's self-service capabilities, business users can access real-time data and respond promptly to market shifts and emerging opportunities. The reduced dependence on IT teams for data queries and reports enables faster decision-making, leading to a significant competitive edge.

Security and Governance

While self-service BI encourages data exploration, it must be balanced with proper security and governance measures. Sisense prioritizes data security and allows administrators to control access, permissions, and data-sharing settings. This ensures that sensitive information remains protected while promoting a collaborative and data-driven environment.

The era of self-service business intelligence has transformed how organizations interact with data. Sisense stands out as a leading provider of user-friendly and powerful self-service BI tools, empowering businesses to harness the true potential of their data assets. By enabling data democratization and fostering a culture of data-driven decision-making, Sisense empowers organizations to navigate the complexities of the modern business landscape with confidence and agility. Embrace the power of Sisense, and unlock the insights that will propel your business toward success in an increasingly data-centric world.
Welcome to Week 1!

Thank you to everyone who joined the Kickoff! As a reminder, we’ll be sharing our Week 1 topics in this group today — feel free to make a post or reply under this thread ✍️ Also, you should have received two emails from GrantNissly with more information:

📮 A Weekly Summary Email — everything you need to get started
📮 A Writers Tips Email — the start of an email series with actionable writing tips

With that, let’s start the week off by sharing our writing topics for the week right here. It can be as simple as the following:

Week 1 Topic: The importance of planning our work or workflow before we start using the software (a gem from Steven88's Bullet Draft)

Who has a Week 1 topic to share?
About Sisense Writers Cohort
This group is for the Taptive Writers Cohort. This cohort will help community members showcase what they’re building, become Sisense thought leaders, and connect with other Sisense builders.
Owned by: jpacheco, lindavinod, GrantNissly, and slosada. Created: 2 years ago. Open group.