In his excellent book on the research behind the theory of Deliberate Practice, Geoff Colvin distills volumes of often complicated academic research into clear, understandable prose that is accessible to a wide audience. Colvin uses a series of carefully chosen anecdotes to explain the various dimensions of this complicated theory (and yes, there is more to it than 10,000 hours). What is remarkable about the book is that it accurately reflects the messages and the strength of the research that has been done. There are varying levels of evidence for different aspects of the theory, and those points are clearly made. The goal of this post is not to sell more copies of Talent is Overrated (though I have provided a link in case you are interested), but rather to make the point that clearly communicating research is at least as important as the work itself. This is just as true for statistical analysis as it is for scholarly research.
At the 2010 Sloan Sports Analytics Conference, I was sitting next to a high-level NBA executive at a research presentation. The work being presented was interesting, if not revolutionary. When the presentation was over, I went to the front of the room to ask the presenter a few questions, but was beaten to the punch by the exec I had been sitting with, who said (and I paraphrase here a bit) "oh my God, you can talk". And it was true: the presenter had distilled some very complicated analysis down to its core message and accurately conveyed the strengths, potential, and limits of the work in such a way that the audience could clearly understand it. If the presenter had not been able to communicate his research to a non-geek audience (ok, it was SSAC 2010, so non-geek is overstating it), he would not have been speaking to a full room by the end of his presentation, never mind having extended conversations with NBA execs about it.
Communicating statistical analysis is a careful balance between conveying the strengths and the limits of the analysis. The story of the analysis has to be told in such a way that a non-geek user of the information can use it properly. Your projection may show that a player is going to improve their rebounding by 20% over the next 3 years, but you also need to convey the uncertainty associated with that projection. What is the range of likely outcomes? What are the risks?
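As a minimal sketch of what that might look like in practice, the snippet below takes a hypothetical point estimate of a 20% rebounding improvement, assumes a spread around it (both numbers are invented for illustration, not from any actual projection), and turns the result into the kind of plain-language summary an exec can act on: a likely range and a chance of no improvement, rather than a single number.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical projection: point estimate of +20% rebounding improvement,
# with an assumed spread representing the uncertainty in that estimate.
point_estimate = 0.20
uncertainty_sd = 0.12  # assumed standard deviation of the projection error

# Simulate a range of plausible outcomes rather than reporting one number.
simulated_outcomes = rng.normal(point_estimate, uncertainty_sd, size=10_000)

low, mid, high = np.percentile(simulated_outcomes, [10, 50, 90])
downside_risk = (simulated_outcomes < 0).mean()  # chance of no improvement at all

print(f"Most likely improvement: {mid:+.0%}")
print(f"Likely range (10th-90th percentile): {low:+.0%} to {high:+.0%}")
print(f"Chance the player does not improve: {downside_risk:.0%}")
```

The point of the exercise is the last three lines: the output is phrased in terms of ranges and risks, not in terms of the model that produced them.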
It is tempting, when working with team executives, to make it all too simple and speak in absolutes, especially when others are making similar statements about their own points of view. It is incumbent on the analyst not to fall into this trap, because our analysis does contain variance and we will be wrong. When we are wrong, it becomes easy to dismiss the analysis if we have spoken in absolutes. If instead we have clearly communicated (and accepted ourselves) what our research actually says, then, while we may not win every argument, we will win more over the long term.
It is also possible to become so pleased with the cleverness of the techniques we used to solve a problem that we lose sight of the problem we were trying to solve. It is rare that you will run into an exec who understands or truly cares about how cutting-edge the techniques are. They want to know that they are getting good information they can have confidence in, not that you used some slick new neural network algorithm in R to get it. This is one of the reasons the communication piece can be so tricky. We have confidence in our results because of the techniques used, and while you may want to have a one-sentence description of what you actually did ready in case they ask, management will only gain confidence by seeing the results.
After spending hours and hours carefully constructing your analysis, be sure to put a significant amount of time into deciding how to present it. Think like your audience, and consider what will help them use the analysis properly. If you don't communicate the analysis effectively, the analysis will be wasted.