- 1/29/2014 6:39:48 PM
There are two issues here.
First, there are the inherent objective limitations of science when it comes to behavioral/social phenomena. But this is all the more reason to be circumspect with respect to methodology and results.
Second, a practitioner with a good grasp of the scientific method who is cognizant of these limitations is cautious, strives to minimize their implications, and conveys both in a way acceptable to management. Practitioners without the proper education are not, and cannot.
The hype around BI, big data and "data science", with its emphasis on software tools, not only does not help in this context but exacerbates it: neither analysts nor management are fully aware of the problems and how to minimize them. It encourages management to expect more than is possible and discourages analysts from expressing uncertainty.
This is precisely what the article I've discussed reflects.
- by louisw900, Blogger
- 1/29/2014 6:18:56 PM
@dbdebunker Thank you for delving into the issues of cause and effect and sample size as well. This is a very difficult topic to understand (which is probably the reason most analysts take the easy route), but I do agree with your argument against business analytics being considered "data science" in the truest sense, since a scientific approach to the question is indeed lacking.
- 1/29/2014 2:15:55 PM
Exactly right. Due to the huge hype of "data science", which is a misnomer, a lot of practitioners without a scientific education have declared themselves analysts/scientists overnight, and because managers cannot assess who is and who isn't, a lot of what passes for analysis/science is not.
There is no easy solution to the problem. It is what I call systemic and a vicious cycle which cannot be addressed at the individual or company level. It requires a change in culture--education, management, hiring.
- by Hospice_Houngbo, Prospector
- 1/29/2014 5:27:44 AM
@dbdebunker: I get your point. Maybe it's that finding reliable analysts is not always easy. But you are right: one cannot appreciate or judge what one doesn't understand.
- 1/28/2014 6:16:00 PM
As I wrote in the post, that is why they must hire reliable analysts who employ the right methods for the purpose, and should trust them when they convey uncertainties and provide estimates of the costs and benefits of the options.
- by Hospice_Houngbo, Prospector
- 1/28/2014 5:18:49 PM
"How can management ensure that the results are accurate without knowing the methods?"--- Most management teams rely on extrinsic evaluation to validate the efficiency of their analytics methods. I guess it is because intrinsic evaluation is not always easy to apply in most cases, especially when you don't have labelled data to work with.
- 1/28/2014 2:35:10 AM
Read my post carefully and you'll see that I stressed the costs of wrong answers in either direction.
You focus on the cost of doing the research to improve accuracy, but ignore the cost of the wrong advice. Both need to be estimated for intelligent decisions, and the analyst must provide the information: it will cost this much to improve the odds vs. the wrong decision will cost this much.
He must also be creative and minimize the research costs as much as possible.
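To make the trade-off concrete, here is a minimal sketch of the comparison the analyst should put in front of management. All figures are hypothetical, purely for illustration:

```python
# Hypothetical numbers illustrating the trade-off described above:
# the cost of extra research vs. the expected cost of a wrong decision.
research_cost = 20_000          # cost of extra work to improve accuracy
wrong_decision_cost = 150_000   # cost if the advice turns out wrong
p_wrong_now = 0.30              # estimated odds of a wrong answer today
p_wrong_after = 0.10            # estimated odds after the extra research

# Expected loss without vs. with the additional research
expected_loss_now = p_wrong_now * wrong_decision_cost
expected_loss_after = research_cost + p_wrong_after * wrong_decision_cost

print(expected_loss_now)    # 45000.0
print(expected_loss_after)  # 35000.0
```

With these particular numbers the extra research pays for itself; with different estimates it may not, which is exactly why both costs must be estimated rather than only one.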
- by Michael Steinhart, Blogger
- 1/27/2014 10:47:54 PM
So how does the model or the scientist or analyst account for these other factors? Isn't that a matter of guesswork? Doesn't the scientific method, in this case, translate to high costs to the company?
- 1/27/2014 2:44:45 PM
Not necessarily. If there is another factor that causes both top selling and training attendance, extending training will not increase sales.
If, however, you established the causality training --> sales increase, you have to estimate the cost of training vs. the average increase in sales per salesman and decide whether training is worth it.
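The decision itself is simple arithmetic once causality is established. A minimal sketch, with entirely made-up figures (the margin assumption is mine, not from the discussion):

```python
# Hypothetical figures: assumes the causal link training -> sales increase
# has already been established, as discussed above.
training_cost_per_salesman = 2_000
avg_sales_increase = 5_000     # average extra sales per trained salesman
margin = 0.25                  # assumed profit margin on those extra sales

extra_profit = avg_sales_increase * margin        # profit per salesman
worth_it = extra_profit > training_cost_per_salesman

print(extra_profit)  # 1250.0
print(worth_it)      # False: at these numbers, training doesn't pay
```

Note that it is the profit on the extra sales, not the raw sales increase, that must beat the training cost; comparing raw sales to cost would flip the answer here.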