Reporting & Analytics Technology Recommendations?

Kerri Wienbeck, Founding Member | Scholar ✭✭

What tool(s) are you using for going beyond basic reporting into advanced analytics and predictive data science? If Gainsight is part of your technology stack, are you using the reporting functionality and giving visibility to other functions in the organization or transferring the data to another BI tool?


  • Alexander Ziegler, Founding Member | Expert ✭✭✭

    Since we build our own analytics tools, we use those internally. But on the second half of your question: over the last few years we've adopted an important best practice that we follow internally: centralize your data in one spot, and make it available as people need it. Having multiple data sources, where someone uses one tool and then exports to another tool or database, may be fast and simple (you avoid the debate over agreeing on one dataset), but it is the wrong approach. Keeping everything in one place, with everyone seeing the same data and using the same reports, seems to be the most efficient strategy.

  • Cindy Young, Founding Member | Scholar ✭✭

    Our PS and Ops teams use Power BI for reporting on sales, resourcing, and utilization data. Gainsight is part of our technology stack, used by the Support Services/Customer Success areas, but it is not yet included in our Power BI reporting; that data is currently distributed via .ppt/email.

  • Joe Thomas, Founding Analyst | Scholar ✭✭

    We leverage the Einstein Platform from Salesforce for advanced analytics and predictive data science, including customer sentiment, across our entire organization. We use some of the core Gainsight reporting, but we also bring Gainsight data into our core dashboards.

    We've also productized the Einstein dashboards we've built as part of our core ERP and PSA offerings.

  • Alexander Mundorff, Founding Analyst | Scholar ✭✭

    We have pulled all our data into a data warehouse and use Power BI and Qlik for analysis, then transfer data back into other systems so people have the information in the systems they work with.

  • Jeremy DalleTezze, Member, TSIA Administrator | admin

    Great comments on this thread thus far. One persistent challenge, even with new technologies that come with embedded analytics and out-of-the-box integrations, is getting all of the data required for your analysis in one spot and in the correct shape. In most cases, our team still has to pull multiple tables from system A and a few tables from system B, reshape the data to a common level of detail, and then perform the analysis.

    From a tools perspective, we encourage members to leverage no-code/low-code Extract/Transform/Load (ETL) tools to minimize writing complex queries to get the data AND to keep data scientists/analysts from wasting time reshaping data; after all, we want them analyzing the data for patterns, not figuring out how to get the data ready for analysis.

    There are some ETL tools that also contain sophisticated data mining/analysis capabilities, so that analysts/scientists can minimize code creation/maintenance while performing the analysis. Lastly, per Alexander's comment, you can then push these transformed insights back into the systems where employees work. This enables you to get a positive return on your analysis investments.

    In terms of specific brands/tools, the below poll results highlight a few. The market is always maturing, and there are often a few good options for your specific environment. Let me know if you want to discuss!
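The reshape step Jeremy describes, bringing tables from two systems to a common level of detail before analysis, can be sketched in plain Python. This is a minimal illustration, not any specific ETL tool; the table shapes, field names, and "system A"/"system B" extracts are all hypothetical assumptions.

```python
# Hedged sketch: joining extracts from two systems at a common grain.
# All table and field names here are illustrative, not real system schemas.
from collections import defaultdict

accounts = [  # from "system A" (e.g. a CRM) -- one row per account
    {"account_id": "A1", "segment": "Enterprise"},
    {"account_id": "A2", "segment": "SMB"},
]
tickets = [  # from "system B" (e.g. a support tool) -- one row per ticket
    {"account_id": "A1", "status": "open"},
    {"account_id": "A1", "status": "closed"},
    {"account_id": "A2", "status": "open"},
]

def to_account_grain(accounts, tickets):
    """Reshape ticket-level rows up to the account level so both
    tables share one level of detail before any analysis runs."""
    counts = defaultdict(int)
    for t in tickets:
        counts[t["account_id"]] += 1
    return [
        {**a, "ticket_count": counts.get(a["account_id"], 0)}
        for a in accounts
    ]
```

In a real pipeline a no-code/low-code ETL tool performs this aggregation and join for you; the point of the sketch is only to show what "reshaping to a common level of detail" means concretely.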

  • At Qlik we use our own product (Qlik Sense) for data analysis :)

    We pull data from Salesforce, Coveo, Jira, the Cisco phone system, and others.

  • Steven Forth, Founding Partner | Expert ✭✭✭

    Our approach is to take data from multiple sources (our platform, customer success data, social media data) and aggregate it into a data lake. Everything is time-stamped. We are building a simple application using TensorFlow to generate something we call Predictive E (E for engagement). Our goal is to predict future engagement from trailing data.

    Eventually we plan to use this to build our own renewal prediction engine.

    We aspire to do this using an explainable AI (XAI) approach but are still working out how we would do it.

    We are applying the same concepts to value-based segmentation.

    The principles:

    No one system has all the data to generate accurate predictions.

    Predictive models need to be self-evolving.

    Engagement will help to predict renewals.
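The core idea above, predicting future engagement from trailing time-stamped data, starts with turning an event stream into trailing-window features. A minimal sketch follows; the event shape, window size, and feature are illustrative assumptions, not Steven's actual Predictive E implementation (which uses TensorFlow downstream of features like this).

```python
# Hedged sketch: a trailing-window feature over time-stamped engagement
# events, of the kind a predictive-engagement model could train on.
# The event schema and 30-day window are illustrative assumptions.
from datetime import datetime, timedelta

events = [  # every record is time-stamped, as the post describes
    {"ts": datetime(2024, 1, 1), "kind": "login"},
    {"ts": datetime(2024, 1, 20), "kind": "report_view"},
    {"ts": datetime(2024, 2, 1), "kind": "login"},
]

def trailing_count(events, as_of, window_days=30):
    """Count events in the trailing window ending at `as_of`.
    Features like this, computed at many points in time, become the
    training inputs for a model predicting future engagement."""
    start = as_of - timedelta(days=window_days)
    return sum(1 for e in events if start <= e["ts"] <= as_of)
```

Computing this feature at regular intervals per account yields a time series; pairing each point with the engagement actually observed afterward gives labeled examples for a supervised model such as the renewal-prediction engine mentioned above.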

  • Iurii Kim, Founding Analyst | Scholar ✭✭

    A whole bag of different things in our case, whatever works best. The approach is to find the most efficient and suitable tool that exists out there and implement it in our environment.

    Due to my empirical background, I tend to use Stata as my main software package; it allows for more complex calculations that, if set up correctly, lead to insights not really visible in standard surface-level reporting.

    What also helps is that the data is stored in a centralized system, so I don't waste too much time figuring out what goes where.