Bridging the Gap: Techniques for Data Analysts to Influence Decision-Making
“Nearly half (45%) of data experts say they struggle to interpret domain experts’ data questions or needs, and more than a third (36%) surveyed believe domain experts do not understand what can and can’t be done with data.”
— Sigma Computing
One way analytics professionals can differentiate themselves is by communicating effectively with executives. Many data professionals have little interest in this, focusing far more on the journey taken to produce their work than on the presentation of the result.
Regardless of your level of interest, you need to become an effective communicator. Effective communication and presentation are the best way for your work to have an impact on your organization.
Sigma Computing found that a third of data professionals feel unable to convey the value of their work.
“After all the time and resources that go into preparing these reports, data experts feel their work is under-appreciated by company leadership, with 34% sharing they’ve felt unable to convey the value of an analysis to an executive or key decision maker.”
For this reason, we’re going to discuss ways to approach analytical conversations when you’re in presentations or boardrooms. Specifically, we’ll discuss how to communicate:
- Certainty
- Consistency
- Insights & Recommendations
- Severity
TL;DR: Effectively communicating analytical topics requires a balance of (a) technical depth and thoroughness and (b) clarity and simplicity in your message. You are not in the room to prove your intelligence; you are there to translate the “thesis” of your work and its relevance to your organization.
1. Communicating Certainty
Statistics allow us to assess the magnitude of an impact or relationship and our certainty that it is real. We measure this certainty with two key measures: p-values and confidence intervals. When discussing these in a meeting, it is important to provide a simple explanation of how to interpret them.
- P-values indicate how surprising our results would be if the null hypothesis were true. For instance, a p-value of 0.05 in an A/B test means that if the variant truly performed no better than the control, we would observe a difference at least this large only 5% of the time. The lower the p-value, the more confident we can be that the observed difference is not due to randomness in user behavior.
- Confidence intervals represent the range of plausible values for the true effect. Because our tests consider only a sample of users, there is a level of uncertainty in our estimates; confidence intervals let us be precise about that uncertainty. For example, a 95% confidence interval with a lower bound of 2 and an upper bound of 4 tells us we can be 95% confident that the true value lies between 2 and 4.
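Both measures can be computed directly for a two-variant conversion test. Here is a minimal sketch using only the standard library; the conversion counts and sample sizes are hypothetical, not from the article:

```python
import math

# Hypothetical A/B test counts (illustrative numbers only)
control_conv, control_n = 480, 10_000   # 4.8% conversion
variant_conv, variant_n = 540, 10_000   # 5.4% conversion

p1 = control_conv / control_n
p2 = variant_conv / variant_n

# Pooled two-proportion z-test (null hypothesis: no difference)
p_pool = (control_conv + variant_conv) / (control_n + variant_n)
se_pool = math.sqrt(p_pool * (1 - p_pool) * (1 / control_n + 1 / variant_n))
z = (p2 - p1) / se_pool

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

# Two-sided p-value: probability of a difference at least this large
# if the variants truly performed the same
p_value = 2 * (1 - norm_cdf(abs(z)))

# 95% confidence interval for the lift (difference in conversion rates)
se_diff = math.sqrt(p1 * (1 - p1) / control_n + p2 * (1 - p2) / variant_n)
lo, hi = (p2 - p1) - 1.96 * se_diff, (p2 - p1) + 1.96 * se_diff

print(f"p-value: {p_value:.4f}")
print(f"95% CI for lift: [{lo:.4%}, {hi:.4%}]")
```

In practice you would typically reach for `scipy.stats` or `statsmodels` rather than hand-rolling the z-test; the point of the sketch is that the p-value and the confidence interval both come from the same estimate of sampling noise, which is why they should be presented together.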
We can also describe our confidence in the data itself. Assessing data quality is an important step in enterprise analytics. The following dimensions should be considered when classifying the quality of the underlying data.
- Data availability. We have the data required to complete the analysis (i.e., no missing columns or datasets).
- Data completeness. There is no missing data (i.e., no missing rows) in the dataset. Where data is missing, we have discussed how to address the missing values with the business stakeholders.
- Data triangulation. When our datasets are aggregated (for instance, by calculating total users and revenue), high-level metrics align with our team’s key business reports.
- Data semantics. Metrics and dimensions are defined in the data in the same way that business domain experts discuss them.
- Data bias. We are aware of outliers and potential biases in our data.
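Several of these dimensions can be checked programmatically before an analysis begins. The sketch below illustrates availability, completeness, and triangulation checks; the dataset, column names, reference total, and 1% tolerance are all hypothetical:

```python
# Hypothetical orders dataset (illustrative rows only)
orders = [
    {"order_id": 1, "user_id": "a", "revenue": 120.0},
    {"order_id": 2, "user_id": "b", "revenue": None},   # missing value
    {"order_id": 3, "user_id": "c", "revenue": 95.0},
]

REQUIRED_COLUMNS = {"order_id", "user_id", "revenue"}
FINANCE_REPORT_TOTAL = 215.0  # revenue per the team's key business report

# Availability: every required column is present in every row
available = all(REQUIRED_COLUMNS <= row.keys() for row in orders)

# Completeness: share of rows with no missing values
complete_rows = [r for r in orders if all(v is not None for v in r.values())]
completeness = len(complete_rows) / len(orders)

# Triangulation: aggregated revenue should align with the reference report
dataset_total = sum(r["revenue"] for r in complete_rows)
triangulates = abs(dataset_total - FINANCE_REPORT_TOTAL) / FINANCE_REPORT_TOTAL < 0.01

print(f"available={available}, completeness={completeness:.0%}, "
      f"triangulates={triangulates}")
```

Semantics and bias are harder to automate; they usually come out of conversations with domain experts rather than assertions in code.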
2. Communicating Consistency
We evolve our understanding of a business through iterative evidence generation, a process that bears resemblance to the Scientific Method. Though the following terms aren’t traditionally used within the scientific or data community, we can use them to precisely communicate the consistency of the findings we’ve observed.
Early indication. Use this caveat when discussing a finding for the first time with your audience.
- “The survey found that highly engaged business users have lower CSAT than the average. Some feedback points indicate that this is due to challenges with integrations. We will conduct deeper research to understand the drivers of this group’s lower CSAT.”
Trend. Link multiple findings across time or categories (e.g. marketing channel, product line).
- “The redesign of the new product pages has performed well. As you’ll see in the chart below, the redesign outperforms across all product categories.”
- “Our ‘buy-one-get-one’ offers have continued to outperform our ‘10% discount’ offers for the past 3 months. If we continue to see this outperformance, we’ll move a larger share of our promotional offers to BOGOs.”
Contrarian finding. Call out findings that run counter to trends we’ve seen previously, or where a subgroup’s behavior differs from the overarching customer base.
- “We see high levels of price elasticity in all product categories except for furniture and other large ticket items.”
- “This is the first year our holiday promotion didn’t drive incremental profits. We believe that the current economic climate has impacted consumer spending habits.”
Hypothesis. These are beliefs that you or other colleagues have about a business topic. The findings that you collect during your research will either support or refute this hypothesis.
- “Same-store sales are down 4% year-over-year. We believe that the current economic climate has impacted consumer spending habits. However, we need to conduct further analysis to understand the impact of market pressures on the business.”
Theory. These are collectively held beliefs about a particular topic in the organization. These beliefs are supported by insights and trends over time. That does not mean, however, that theories cannot change based on shifts in the competitive and economic climate.
- “After conversations with the Revenue and Internationalization teams, we collectively believe that we should add Spanish natively into the product and focus our expansion efforts in Latin America. This is for several reasons: (1) our existing LATAM user base is among the most engaged regions, (2) adding Spanish will have a greater potential impact on our user base than any other language, and (3) regulatory and operational hurdles are less significant in LATAM than in other target markets.”
3. Communicating Insights & Recommendations
Analysts should be careful when giving recommendations, especially when working as consultants.
- Firstly, the analyst may not have the complete business context and, therefore, may provide a misinformed insight.
- Secondly, the audience may (even subconsciously) disregard your recommendation if it counters their preconceived hypotheses — even if it’s a great insight.
For this reason, analysts (whether internal or external) should incrementally build to a recommendation using an iterative, collaborative process between the researcher and their stakeholders.
- Analysts present “early findings” to stakeholders to determine (a) whether the data that underlies the potential insight makes sense and (b) what business context can be added to produce value-added, relevant insights.
- Analysts receive feedback from stakeholders and adjust or deepen the existing insight as needed.
- Analysts present the finalized insight, which combines a visual of the underlying data point(s) and the business context.
- Analysts and stakeholders jointly review related insights to determine what actions should be taken based on the new findings. This recommendation can take the format: “because {insights} indicate that {reason for business trend}, we recommend {business action}.”
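Once the pieces have been agreed upon with stakeholders, the template above can be filled in mechanically. A trivial sketch; the insight, reason, and action strings are hypothetical:

```python
# Hypothetical insight fragments agreed upon with stakeholders
insights = "the BOGO offer results and the Q3 promo test"
reason = "bundled discounts drive higher basket sizes"
action = "shifting 30% of promotional budget to BOGO offers"

# Fill in the recommendation template from the process above
recommendation = f"Because {insights} indicate that {reason}, we recommend {action}."
print(recommendation)
```

Keeping the three slots separate is a useful discipline: if you cannot name the insights and the reason, the action is not yet a recommendation.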
4. Communicating Severity
It’s challenging being the bearer of bad news. We can use the following phrases to discuss findings that may have a significant impact on the business. The example conversations below have been tailored to the startup ClassPass.**
Monitor. Watching a particular metric or trend to see whether it continues or becomes stronger. This term suggests that the stakeholder should not take action on a finding, but may consider doing so in the future. Words like monitor communicate, “We’re aware of what’s going on and we’ll keep you informed.” Very reassuring for executives.
- “We will continue to monitor the decline in the share of luxury classes, which has led to a 6% YoY decline in Average Monthly Revenue per Active. The sustained decline may require a review of our pricing relative to our competitor set.”
Consider. This is a “soft” recommendation to take action, based on a finding or a body of business knowledge. There are several reasons we may want to use soft language: (a) the action may affect the strategic, operational, or political state of the business, or (b) the insights that support the recommendation have not yet been corroborated by a broader trend.
- “Our average price for running classes within the Chicago and Los Angeles markets exceeds our competitor set by 20%. We have seen preliminary indications that sales within this product segment have stagnated MoM due to this price disparity. Consider reducing the price to 10% above the competitive set.”
Suggest. “Suggest” has a moderate tone when used within boardrooms. Executives will most likely perk up and listen to what follows, but you’re not risking your job on a suggestion gone awry. That said, “suggestions” should be supported by multiple data points and collaboratively developed with strategic stakeholders. In my opinion, data scientists are often not equipped with the information necessary to give business recommendations to an executive on their own. However, insights combined with wisdom gleaned from executives result in a powerful perspective.
- “We’ve seen a substantial increase in boxing interest, with 220% year-over-year growth in boxing class signups. We spoke to the Partnerships team and suggested that expanding boxing studio partnerships should be a focus this year.”
Recommend. The term “recommendation” has a long history in business analytics and data science. Recommendations can be described as “suggestions for strategic change in an organization, based on information generated through analytics.” Due to the weight the phrase “we recommend” carries, building to a recommendation takes more collaboration and iteration than a suggestion (as discussed above).
- “The new variant of the user flow increased conversions by 12% versus our current design. We saw statistically significant gains across all of our primary user segments. Therefore, we recommend moving forward with a full release of the new design.”
Declare. Simply stating “we are doing {action}” is the most unequivocal delivery of a recommendation. This should only be used when:
- The finding or recommendation is an obvious and non-controversial decision.
- All relevant decision-makers are in favor of the declared action.
For example, “The Pink creative consistently has 10% to 20% higher click-through rates than the Purple creative. We will continue with the Pink creative until we see signs of creative fatigue.”
Conclusion
Communicating insights can be a challenging endeavor, even for an experienced data scientist. The words we use carry greater weight than we often realize. By remaining aware of our word choices during meetings and using the “communication conventions” above, analysts can have effective conversations with executives (or any strategic stakeholders) when presenting their work.
** References to specific companies are for hypothetical purposes only. We do not have access to these companies’ data nor do we have insight into their business, aside from the publicly available information.