Japan was devastated by the earthquake and tsunami of March 11th 2011 and the nuclear crisis that resulted from the natural disaster.
More than 15,000 people died, and as the survivors began rebuilding, it became clear that the impact on the global economy and Japanese businesses would be immense.
After all, damage to businesses in Japan would cause huge supply-chain disruptions throughout the world.
But the predictive analytics John Wilenski, CPA, CGMA, helped put into place at Nissan Motor Co. Ltd. played a role in keeping the company strong in the disaster’s aftermath.
More than 45 of Nissan’s critical suppliers sustained severe damage as a result of the disaster, according to research by the Massachusetts Institute of Technology (MIT) and PwC. But analytics that had given Nissan comprehensive knowledge of its supply chain before the earthquake helped the company make good decisions after the disaster.
Wilenski, who over seven years at the company rose to deputy general manager of global corporate finance and risk management, had helped develop a multifactor model to predict the financial health of Nissan’s suppliers. The model was designed to assist the company in mitigating risk during the financial crisis, and it proved important after the earthquake.
“Since we’d done our homework and had been doing this for a period of time, we were able to move quicker than our competitors,” said Wilenski, who was based in Nashville, Tennessee, with Nissan and reported first to the Americas regional headquarters and later to the corporate headquarters in Yokohama, Japan. “It was less of a scramble. We did not have to search for data. We had it.”
Wilenski is no longer with Nissan. Today, he is creating a predictive analytics infrastructure in the higher-education industry. But it was his experience with the automaker that helped him truly see the power of predictive analytics.
At Nissan, using audited financial data and information provided by supplier CFOs — sometimes updated weekly or even daily — Wilenski’s team developed various analytics models for assessing suppliers’ financial health, including:
- A cash flow assessment tool, which proved valuable during the global cash crunch.
- A stress-test optimising programme, which enabled “what-if” scenario analysis of suppliers.
- A break-even tool, which enabled the team to perform a scenario analysis on what some suppliers needed to do to break even or become financially healthy.
These tools were based on inputs derived from financial data and local and global economic data as well as information about strategies Nissan and the suppliers were pursuing.
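The article does not describe the internals of these tools, but the break-even analysis they performed can be illustrated with a minimal sketch. All figures, field names, and the scoring rule below are hypothetical, not Nissan’s actual model: a supplier’s break-even volume is its fixed costs divided by the contribution margin per unit, and each demand scenario is then checked against that threshold.

```python
# Hypothetical sketch of a supplier break-even scenario analysis.
# Figures and field names are illustrative only.

def break_even_units(fixed_costs, price_per_unit, variable_cost_per_unit):
    """Units a supplier must sell to cover its fixed costs."""
    contribution_margin = price_per_unit - variable_cost_per_unit
    if contribution_margin <= 0:
        raise ValueError("supplier loses money on every unit sold")
    return fixed_costs / contribution_margin

def scenario_analysis(supplier, demand_scenarios):
    """For each scenario, does projected demand reach break-even?"""
    threshold = break_even_units(supplier["fixed_costs"],
                                 supplier["price"],
                                 supplier["variable_cost"])
    return {name: units >= threshold
            for name, units in demand_scenarios.items()}

supplier = {"fixed_costs": 4_000_000, "price": 50.0, "variable_cost": 30.0}
scenarios = {"baseline": 250_000, "downturn": 150_000}
print(scenario_analysis(supplier, scenarios))
# Break-even is 4,000,000 / 20 = 200,000 units, so only the
# baseline scenario clears the threshold.
```

A real model would layer in many more factors — cash position, debt load, customer concentration — but the “what-if” structure is the same: vary the inputs and watch which suppliers fall below the line.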
There were occasions when suppliers — often smaller, private companies — would hesitate to provide Nissan with data. But non-disclosure agreements proved helpful.
Data from these tools helped Nissan predict which of its suppliers would survive the disaster and which suppliers would need help, and the company was prepared to be agile following the earthquake.
“We wanted to know instantly the current situation [of suppliers after the earthquake],” Wilenski said. “Once those reports came in, the global team was ready to move forward.”
According to the MIT/PwC report, strong risk management and effective countermeasures helped Nissan end 2011 with a 9.3% increase in production, compared with a 9.3% decrease across the industry.
Prepare for some barriers
This is an exciting time in predictive analytics, as companies are developing models to enhance their strategies and operations in areas as diverse as reducing hospital readmissions and choosing the right athletes to help sports franchises win on the field.
“When used appropriately, predictive analytics accurately provide insight into the future so an organisation can plan, prepare, and take action,” Wilenski said.
Despite the potential usefulness of predictive analytics, Wilenski said implementers need to be prepared to face barriers and accept limits. For example, the analytics may uncover some risks that can’t be mitigated at a reasonable cost.
It’s also not unusual for predictive analytics to face resistance from within the organisation. At Nissan, Wilenski said, predictions of bankruptcy for certain suppliers during the financial crisis were met with natural scepticism. Nissan had been working with some of these suppliers for years and had developed long-term relationships with them.
In such cases, Wilenski recommends back-testing the model against historical results, running it alongside any alternative strategic approach for a test period, and comparing which yields better outcomes. At Nissan, analytics gained favour after the predictive model’s conclusions about suppliers proved consistent, accurate, timely, and useful.
“It took some pain,” Wilenski said. “And it took a credible track record.”
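Back-testing of this kind can be sketched simply: score each historical case with the model, compare the prediction with the known outcome, and report the share classified correctly. The toy distress rule and data below are invented for illustration; a real model would combine many weighted factors.

```python
# Illustrative back-test of a supplier-distress model against known
# historical outcomes. The rule and data are invented for the example.

def predict_distress(supplier):
    # Toy rule: flag suppliers with negative cash flow and high leverage.
    return supplier["cash_flow"] < 0 and supplier["debt_to_equity"] > 2.0

def back_test(model, history):
    """Share of historical cases the model classified correctly."""
    correct = sum(model(s) == s["went_bankrupt"] for s in history)
    return correct / len(history)

history = [
    {"cash_flow": -1.2, "debt_to_equity": 3.1, "went_bankrupt": True},
    {"cash_flow":  0.8, "debt_to_equity": 1.4, "went_bankrupt": False},
    {"cash_flow": -0.3, "debt_to_equity": 1.1, "went_bankrupt": False},
    {"cash_flow": -2.5, "debt_to_equity": 2.6, "went_bankrupt": True},
]
print(back_test(predict_distress, history))
```

As Wilenski notes elsewhere, the past does not guarantee the future, so a back-test like this is only the starting point; the credibility comes from running the model forward over a test period.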
Now Wilenski is applying the techniques he has learned to a new environment: the higher-education sector.
His models will focus on predicting financial statement results as well as forecasting key performance indicators related to business strategy and operational growth.
He is excited by how technology is constantly creating new possibilities for applying predictive analytics to business growth and risk mitigation.
Wilenski doesn’t advise being an early adopter of technology merely for the sake of being at the forefront. He believes it can be better to wait until the technology has been thoroughly tested — with bugs resolved — and the cost has come down a bit, because the return on investment may be limited for early adopters.
“In certain circumstances, I’m a believer that the second mouse gets the cheese,” Wilenski said.
But powerful computing capabilities have reduced the cost of crunching data, and cloud computing gives executives the power to merge and share data in new and innovative ways.
“There are better tools out there,” Wilenski said. “There are even new applications of statistics. New things are being found all the time, new tools to use, and new ways to apply different measures in business situations. ... Predictive analytics can provide a competitive advantage and drive the right business decisions.”
Ten keys to implementing predictive analytics
John Wilenski, CPA, CGMA, helped Nissan Motor Co. Ltd. develop a predictive analytics model for the company’s global supply chain that provided key strategic insight during the global financial crisis and following the earthquake and tsunami that devastated Japan in 2011.
Wilenski now is building a predictive analytics model in the higher-education sector. Here is his advice for organisations implementing predictive analytics:
1. Identify the business objective. Are you trying to predict bankruptcy or default? Mitigate risk? Boost revenue? Reduce costs? Retain employees? Attract new customers? A clearly defined goal is a necessary starting point.
2. Discover sources of data. Once you have defined your objective, you should have some idea of the types of data that could be predictive. Some of those data may already be readily available in company-owned systems such as the general ledger, marketing databases, or enterprise resource planning software. But other data may need to be purchased from a third party or obtained in a survey.
3. Build at the lowest level of detail. At publishing company McGraw-Hill, Wilenski helped build a software tool that allowed executives to study profitability by market channel, state, and country — and even by individual book title. Establishing data and detail at a granular level can help predictive analytics drive successful strategies. “As you build and create a predictive analytic infrastructure, there are things you are not thinking about today that will be important a year from now or five years from now,” Wilenski said.
4. Make sure data are accurate, timely, and useful. Data that have been fully vetted — such as audited financial statements — are highly reliable, but because the reporting and auditing processes are lengthy, the timeliness of the information they contain may not be ideal for some predictive analytics models. Inaccurate data will produce flawed results, and data that are not timely can drive decisions that are obsolete. If crunching the data produces strategies that cannot be operationalised, then the predictive analytics process is ineffective.
5. Address data quality concerns. Maybe you have 1,000 records, and 11 are missing some data points. Maybe there is a margin of error in a poll. Maybe one data set contains letters instead of numbers. You may need to do some “data cleansing” to make the inputs appropriate as they are entered into the model. Comparability throughout a data set is essential in predictive analytics, Wilenski said. “If you have exceptions here and there, it slows down what you’re doing,” he said. Where financial statements are being used for comparisons, differences in generally accepted accounting principles across countries can be a significant hurdle in the development of predictive analytics models. This is one reason some global companies are eager for worldwide adoption of IFRS.
6. Determine which data may be predictive. This is accomplished by testing. “Back-testing” through historical trends can be used as a starting point, but Wilenski said the past does not necessarily predict the future. So forward-looking testing is essential. Once you find a factor that’s predictive, you can combine it with other factors to see if the accuracy of the model increases. “A and B independently may not be as powerful, but when you put them together, maybe it’s more powerful, more predictive,” Wilenski said.
7. Balance automation and the human touch. Machines allow the rapid processing of data, but people need to decide what to measure and how to interpret the data. “You want to get to the point of machine learning,” Wilenski said. “But at the same time, there needs to be a release valve, a flexibility built in, where humans can get in and make adjustments as needed.”
8. Communication is critical. “Be able to explain things in simple terms so people can understand it,” Wilenski said. Methods of gathering data and calculating may be complex, but the resulting direction and focus need to be clear.
9. Be collaborative and cross-functional. Getting out in the field can help create support for your project and ensure that data are valid and accurate. “You can’t do this in a silo or an ivory tower,” Wilenski said.
10. Improvement must be continuous. Predictive analytics experts often say that although the initial model is useful, it becomes more effective as it gets refined. Changing conditions also make it necessary to update models. Even when data accuracy gets close to 100%, there may be ways to make the model more timely or lower the costs involved in acquiring or analysing the data.
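The data-cleansing pass described in step 5 — catching missing fields and letters where numbers belong before records enter a model — can be sketched as a simple filter. The field names and records below are hypothetical; the point is that bad records are routed aside for review rather than silently fed to the model.

```python
# Hypothetical data-cleansing pass: keep records whose required
# fields are present and numeric; set aside the rest for review.
# Field names and values are illustrative only.

def clean_records(records, required_fields):
    """Split records into (cleaned, rejected) lists."""
    cleaned, rejected = [], []
    for rec in records:
        try:
            fixed = {f: float(rec[f]) for f in required_fields}
        except (KeyError, TypeError, ValueError):
            rejected.append(rec)   # missing or non-numeric field
        else:
            cleaned.append(fixed)
    return cleaned, rejected

raw = [
    {"revenue": "120.5", "margin": "0.08"},
    {"revenue": "95.0"},                    # missing margin
    {"revenue": "N/A", "margin": "0.05"},   # letters instead of numbers
]
cleaned, rejected = clean_records(raw, ["revenue", "margin"])
print(len(cleaned), len(rejected))
```

Routing rejects to a separate list, rather than dropping them, supports the comparability point above: the exceptions are visible and can be repaired or investigated instead of quietly distorting the model’s inputs.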