Imagine you are a quality manager at a solar cell manufacturing line, and you have just received a note from your testing group that the last batch produced had an alarming rate of low-efficiency product. Given the implications of such an event, you need to swing into full action to understand the reasons for the higher failure rate and how to fix it for the current and future batches. Meanwhile, the clock is running, because each incremental failure adds to the millions in lost revenue already incurred. What do you do?
Now imagine you are a treasure hunter who has just received a grant from a private investor to find lost treasures in the Pacific Ocean. You'll get a special bonus for finding the Japanese general Tomoyuki Yamashita's ship, which had precious treasures looted from Burma. The grant is enough to help sustain you and your team in the sea for 30 days. What do you do?
Although these are two very different situations, the paths to the solution -- or, should we say, the paths to the efficient solution -- would be similar.
As a treasure hunter, you could take a “Christopher Columbus, the explorer” approach: setting sail with a crew and a submarine in tow and starting to look for the gold. If you go the explorer route, you are guaranteed the views (nice corals, beautiful wildlife, emerald-green waters), but the chances of finding gold in 30 days are slim. Essentially, you are not directing your effort toward finding gold. Your actions of exploration are independent of what you are looking for. You'd take the same action when tasked with looking for killer whales.
Or you could take a “Sherlock Holmes, the detective” approach and identify potential areas for shipwrecks in general and where Yamashita’s ship might have sunk in particular. How do you identify top candidates for the shipwreck's location? Can you look at historical trade routes and records of wrecks and then eliminate some locations using depth information and possibly records of Yamashita’s post-World War II retreat? By using clues and facts, you can start identifying potential areas to explore.
Once you have identified a dozen-plus potential locations, you can prioritize the top three areas and then use your submarine or deep-sea divers to go explore. You will likely find your gold in a much shorter time. You will either succeed fast or fail faster and then restrategize to attack the problem again.
Going back to our quality manager in the solar cell line with a major malfunction, the approach to identifying and fixing the problem is no different than the treasure hunt. You could take the Columbus approach and start collecting data points, hoping to find the causes of the failure. But, as you can imagine, with multiple assembly lines, each having multitudes of processes and equipment, your chances of finding the problem area quickly are going to be slim. It could be because of a single valve malfunction, but imagine the probability of finding it among the millions of things you could inspect!
That changes if you take the Holmes approach and identify clues guided by where the failures happened. Are all the lines producing unusually faulty products? When did the problem start? Where exactly are the faults in the product, and to what processes/equipment do they correspond? You get the gist!
The important thing to note is that you don’t need to know all the answers to the guided questions you are asking. You can construct a solid hypothesis based on what you know and then use the hypothesis to unravel the potential problem candidates. With this approach, you can quickly find that high temperature in line 10 caused the faulty construction. Process recipe changes or malfunctioning hardware, such as heat exchangers, often cause temperature issues. By identifying the top things to consider, you can easily narrow down the candidates and find the cause of the problem -- in this case, a heat exchanger's faulty valve.
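This kind of guided drill-down can be sketched in a few lines of code. Everything below is hypothetical: the line names, temperatures, and failure flags are invented purely to illustrate how grouping failures by a clue (which line?) and then comparing a process parameter (temperature) narrows the candidates:

```python
from collections import defaultdict

# Hypothetical fault records: (line, temperature_C, failed).
# All values are invented for illustration -- not real factory data.
records = [
    ("line_09", 182, False), ("line_09", 185, False), ("line_09", 184, True),
    ("line_10", 214, True),  ("line_10", 217, True),  ("line_10", 211, False),
    ("line_11", 183, False), ("line_11", 186, False), ("line_11", 181, False),
]

def failure_rate_by_line(records):
    counts = defaultdict(lambda: [0, 0])  # line -> [failures, total]
    for line, _temp, failed in records:
        counts[line][0] += failed
        counts[line][1] += 1
    return {line: f / n for line, (f, n) in counts.items()}

rates = failure_rate_by_line(records)
suspect = max(rates, key=rates.get)  # line with the highest failure rate

# Next clue: does the suspect line run at a different temperature?
suspect_temps = [t for line, t, _ in records if line == suspect]
other_temps = [t for line, t, _ in records if line != suspect]
print(suspect, rates[suspect])
print(sum(suspect_temps) / len(suspect_temps), sum(other_temps) / len(other_temps))
```

Each question ("which line?", "what is different about it?") becomes one small computation, so you test a hypothesis in seconds instead of inspecting millions of things at random.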
I can’t imagine a treasure hunter worth any salt going the Columbus route, but I have seen enough explorers of data in the business world! Efficient managers and analysts use the guided Holmes approach to direct their efforts and look for answers that are relevant to the problems at hand, thereby finding gold nuggets and delivering a financial impact to the organization. In this case, you, the smart manager, can follow the clever Holmes approach and save your company millions by identifying and replacing the faulty valve in a few days!
To learn more about the detective approach to analytics, download this whitepaper on Aryng's five-step analytics framework for moving from data to decision. We elaborate on how hypothesis-driven analysis helped us identify $120,000 for a $1 million winery in just two hours!
I agree with everyone that less is more. In my exposure to organizational goals and objectives, strategic plans provide direction to medium-term and tactical initiatives, which in turn provide direction to organizational goals and metrics.
Critical few goals (e.g., Profitability, Cost, Quality & Safety, and Growth) provide laser-like focus to the organization. Within the organization, every employee has the same goals, but the manner and degree (% of contribution) would differ.
In my past experience, this has helped the organization keep a standard, simple, and focused pursuit of organizational alignment.
I forgot one last step in the problem-solving process.
After corrective action is verified, there is typically an activity widely known as "Lessons Learned," where the process owner shares lessons learned within the organization or, if it is a global organization, with other facilities, to prevent others from creating the same defect.
This step would be very beneficial in any endeavor of the organization.
The defective product in a manufacturing process typically has tell-tale signs in addition to product identification. Using a scientific approach to problem solving, the following could be collected in a matter of minutes to an hour:
1. Product produced on an assembly line (date code stamp on product, if available).
2. Review of the existing Failure Mode and Effects Analysis (FMEA).
3. Control plan for the line and product (shows inspection and test protocols and criteria, etc.).
4. Process flow diagram (process sequence with responsibilities, operations, etc.).
Based on the above, the following actions are typically taken:
1. Containment action (on the suspect product lot), so suspect product is quarantined from the rest of the "good" production parts.
2. Root cause analysis: a combination of brainstorming (with process experts), reviewing tell-tale signs of product failure, and "talking to parts" (identifying the differences between Best of the Best (BOB) and Worst of the Worst (WOW) products and their causes). A fishbone diagram can be drawn to move toward the most likely cause, which is then tested by turning the problem "on" and "off" to finalize the root cause.
3. The above step is critical in leading toward a problem resolution (permanent corrective action) that directly addresses the root cause identified above.
Typically, problem resolution happens in days to weeks, depending on the complexity of the problem, identification of causes, organizational culture, infrastructure conducive to problem solving, etc.
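The BOB/WOW comparison in step 2 can be sketched as a simple ranking of process parameters by how far they shift between the best and worst parts. All parameter names and measurements below are invented for illustration only:

```python
# Hypothetical process measurements for a handful of best (BOB) and
# worst (WOW) parts; parameter names and values are invented.
bob = {"temp_C": [181, 183, 182], "pressure_kPa": [101, 100, 102], "speed_rpm": [1200, 1190, 1210]}
wow = {"temp_C": [214, 217, 212], "pressure_kPa": [100, 102, 101], "speed_rpm": [1205, 1195, 1200]}

def mean(xs):
    return sum(xs) / len(xs)

# Rank parameters by the relative shift between the WOW and BOB populations;
# the biggest shift is the first candidate to test by turning it "on" and "off".
shifts = {p: abs(mean(wow[p]) - mean(bob[p])) / abs(mean(bob[p])) for p in bob}
prime_suspect = max(shifts, key=shifts.get)
print(prime_suspect, round(shifts[prime_suspect], 3))
```

Parameters that barely differ between BOB and WOW parts drop off the list immediately, which is exactly how "talking to parts" narrows a fishbone diagram to a testable root cause.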
I like the article on Holmes vs. Columbus. It is thought-provoking. Thanks, Piyanka.
"However, practically, I have seen, the better the tool, the higher the expectation/dependence on the "tool" showing the right answer"
Piyanka, you are right. There are many tools for the same reason, and the output depends only on the input data. Tools cannot create any outputs; they can only segregate the results based on built-in equations and decision blocks. So the data flow and decision approaches are more important.
Piyanka's view is what I strongly believe in as well. If tools were perfect, nobody would hire an analyst. Most tools, though advanced and user-friendly, unfortunately lack scope for customization. A perfect GUI can get you what you need, but it cannot interpret the results for you. How would a marketing manager understand "R-square," "chi-square," and "validation misclassification rate"?
It is the analysts who need to take the "statistics to business recommendations" path rather than relying on a tool, with its threat of "garbage in, garbage out."
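As a toy illustration of the point, here is the kind of raw statistic a tool would print. The numbers below are made up; computing R-square is trivial for a machine, but the figure alone tells a manager nothing about what action to take:

```python
# Toy data: actual outcomes vs. a model's predictions (all values invented).
y_true = [3.0, 5.0, 7.0, 9.0]
y_pred = [2.8, 5.1, 7.2, 8.9]

# R-square = 1 - (residual sum of squares / total sum of squares).
mean_y = sum(y_true) / len(y_true)
ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
ss_tot = sum((t - mean_y) ** 2 for t in y_true)
r_square = 1 - ss_res / ss_tot
print(round(r_square, 3))
```

A tool happily reports the 0.995 here; translating it into "this model is trustworthy enough to act on, and here is the recommendation" is the analyst's job.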
Ah ha! Good point, Piyanka. I do think some caution is needed when introducing data visualization, for the very reasons you spell out. Great for those who "get it," but too much of a distraction for those who only think they get it. Thanks!
However, practically, I have seen that the better the tool, the higher the expectation of, and dependence on, the "tool" showing the right answer. But unfortunately, the tool doesn't have the answer; the analyst with a proper method can find the answer. My experience with great visual tools is that they often distract: because they can visualize anything and everything, they allow a user to be lazy, dump everything into the tool, and see what comes up. A common mistake I see is that folks get super excited by a deviation from trend and spend a lot of time on it before realizing that the trend affects a very tiny percentage of the population (like a very small country in emerging markets) and thus has literally no impact on the overall global business, even though visually one sees a clear deviation from the mean. So, in my experience:
1. good tool + good skills = great business results