In my last blog, I wrote about the forward-looking questions utilities can ask of their backward-looking data, and how real-time information sources can improve upon it.
To some degree, analytic progressions of this sort already occur at most utilities, but suspicions about accuracy often dissuade executives from taking decisive action.
Such suspicions have dogged decisive action for quite a while now. Consider a 1982 study conducted by the US General Accounting Office for the US Congress's Subcommittee on Energy Conservation and Power, titled "Analysis of Electric Utility Load Forecasting."
(As an aside, I absolutely loved reading this report as a piece of utility history. Think about it. The report was written before computing power was dispersed on desktops, when state-of-the-art data storage meant large rooms filled with tape machines. Even the typeface used in the report is quaint, probably coming right off an IBM Selectric, to my non-expert but age-experienced eye.)
The report is fascinating because it summarizes the utility energy forecasting methodologies that were considered cutting edge at the time:
Trend forecasting "predicts future power demand by assuming that the factors that influenced demand in the past will continue to do so in the same way in the future." Its weakness: past results don't guarantee identical future outcomes.
"Econometric forecasting uses mathematical equations based on the relationship between past demand and economic and demographic conditions to forecast future demand." The assumption that the relationship will continue into the future made this method weak.
End-use forecasting breaks electricity consumption into residential, commercial, and industrial demand profiles. While the data for this type of forecasting was considered in 1982 to be "expensive and time consuming to collect and maintain," the advantage was that it readily reflects changes in consumer tastes, increased efficiency of energy-using products, and changes in the economy, particularly technological shifts in the industrial base. But the ability to apply this approach was questionable given the technologies of the time.
A sum-of-the-utilities forecast gives regional or national perspective on utility demand as a combination of individual utility forecasts. "[T]his approach, because of its aggregated nature, is of limited use to individual utilities in planning new resources because the service areas are significantly smaller than the area covered by the forecast." Enough said.
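To make the first two methods concrete, here is a toy sketch of my own (not taken from the report) of trend and econometric forecasting, each reduced to an ordinary least-squares fit. Every number here is invented purely for illustration.

```python
# Toy illustration of two 1982-era load forecasting methods.
# All data below is hypothetical, invented for this sketch.
import numpy as np

years = np.arange(1970, 1983)                      # historical years
rng = np.random.default_rng(0)
# Made-up demand history: ~2.5 GWh/year growth plus noise.
demand = 50 + 2.5 * (years - 1970) + rng.normal(0, 1, years.size)
# Made-up economic index (a stand-in for GDP or similar drivers).
econ_index = 1.0 + 0.03 * (years - 1970)

# Trend forecasting: fit demand against time, then extrapolate,
# assuming past growth continues unchanged.
trend_coef = np.polyfit(years, demand, deg=1)
trend_1990 = np.polyval(trend_coef, 1990)

# Econometric forecasting: fit demand against the economic driver,
# then apply an assumed future value of that driver.
econ_coef = np.polyfit(econ_index, demand, deg=1)
assumed_index_1990 = 1.0 + 0.03 * 20               # assumed driver value
econ_1990 = np.polyval(econ_coef, assumed_index_1990)

print(f"Trend forecast for 1990:       {trend_1990:.1f} GWh")
print(f"Econometric forecast for 1990: {econ_1990:.1f} GWh")
```

Both fits land near the same answer here only because the fake data was built that way; the report's point is that either method breaks down the moment the underlying relationship shifts, which is exactly what end-use forecasting tried to address.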
So we've looked at the limitations of forecasting as perceived in 1982. Let's compare that to the forecasting capabilities considered important 31 years later, and the confidence asserted in their likely accuracy, as described in a recent press release from SAS, the sponsor of this site, about its new energy forecasting solution for utilities. Here are a few of the product's selling points, along with my commentary:
"[The new solution] helps utilities operate more efficiently and effectively by capitalizing on new interval data being returned from smart meters." Smart meters werenít included in the 1982 report. But the commission understood certain modern appliances had energy usage profiles that could be discovered, albeit with expense and time. My how smart meters and the development of energy usage profiles for every appliance have increased the possibility for confidence in energy forecasting.
"Unlike other load forecasting software, SAS Energy Forecasting supports multiple planning horizons -- from the next hour to the next 50 years. Utilities can leverage big data from smart meters, power plants and other sources to produce accurate and timely forecasts of short- and long-term load and demand. This helps the utilities better trade energy on the open market, while optimally managing power plants, generators and other assets." Ah, yes, big data -- a term and concept not even created in 1982. But it addresses so many of the factors that contributed to the low levels of confidence that utility manager once had.
"Utilities have successfully used forecasting in the past. Todayís new challenges, including the added complexity of wind and solar power generation, require even greater attention to the data sets and models that feed those forecasts." Once again, to be effective today, forecasting must include consideration of volatile wind and solar sources, as part of a US priority to include more renewable power sources in its mix.
How times have changed. Each of the three new energy forecasting considerations endeavors to increase confidence in methodology and capability to address modern challenges. I wonder how a similar "Analysis of Electric Utility Load Forecasting" report might be written today. Any ideas?
I was just reading about how the use of visual analytics is coming into play for preventive asset management of big manufacturing systems or really anything big with structure -- aircraft, ships, trains, buildings, bridges, and so on. Sensors are delivering vast amounts of data, as is the case in weather systems. I wonder if being able to explore the data visually will help researchers in finding patterns and understanding results in energy fields as well. I would think the application would be perfect in the energy industry as you describe it.
My thought is "No, alternative energy data sets are not readily enough available." It's my opinion that the data histories for weather are so colossal that the industry is struggling with its big data conundrums. Weather science algorithms are incredibly complex, and it's even possible that major factors, like sunspot influences, haven't been considered by many scientists over the past decade to the degree that some scientists believe they should. The use of analytics to deliver alternative-energy data analysis in bite-sized chunks will be an ongoing exercise of incredible value -- and difficulty.
Joe, interesting stuff here. We've come a long way in 30 years, that's for sure. And while I agree with you that new forecasting considerations are boosting confidence and capabilities, I have to wonder if we've come far enough. I'm especially wondering whether alternative energy data sets are readily enough available and being taken advantage of in energy modeling. Your thoughts?