ACT, in conjunction with the Industry Advisory Council (IAC), recommends government agencies increase investment in data analytics, technology to increase productivity, and other technological solutions to reduce waste, fraud, and abuse. It concludes the potential savings will far outweigh the $80 billion a year the federal government currently invests in technology.
By way of background, ACT was created in 1979 to improve government through the efficient and innovative application of IT. A decade later, ACT established the IAC to bring industry and government executives together to collaborate on IT issues of interest to the government.
The report concludes that investment in IT innovations is the best way for federal agencies to continue to deliver essential services at reasonable costs. Specifically, it makes three recommendations:
Accelerate the use of data analytics to identify opportunities to reduce government costs. "Gathering and analyzing data from a variety of internal and external sources can help determine performance and outcomes for federal programs," the report notes.
Invest in technology to increase productivity and reduce costs. "The federal government has the opportunity to accomplish its missions in a cost effective manner while providing the citizenry with expected levels of service by strategically using IT as a part of the solution."
Use technology to combat fraud, waste, and abuse. "The federal government is in a unique position to strategically apply IT investments to reduce federal outlays and the federal deficit. Further, IT investments typically result in many ancillary benefits, such as providing jobs and much needed tax revenue."
Although the government has reduced the amount of "improper payments" in recent years, it still wastes a lot of tax dollars -- about $115 billion in fiscal year 2011. In that period, federal agencies reported a government-wide improper payment rate of 4.69 percent, down from the 5.3 percent rate reported a year earlier, according to government data.
"Improper payments" occur when:
funds go to the wrong recipient
the right recipient receives the incorrect amount of funds (including overpayments and underpayments)
documentation is not available to support a payment
the recipient uses funds in an improper manner.
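The four categories above map naturally onto a simple screening check. The sketch below is purely illustrative -- the field names, eligibility lists, and record layout are hypothetical, not drawn from any actual federal payment system.

```python
# Hypothetical sketch: flag a payment record against the four
# improper-payment categories listed above. All field names are illustrative.

def classify_payment(payment, eligible_recipients, approved_amounts):
    """Return a list of improper-payment flags for one payment record."""
    flags = []
    recipient = payment["recipient"]
    if recipient not in eligible_recipients:
        # Category 1: funds went to the wrong recipient.
        flags.append("wrong_recipient")
    else:
        # Category 2: right recipient, incorrect amount (over- or underpayment).
        expected = approved_amounts.get(recipient)
        if expected is not None and payment["amount"] != expected:
            flags.append("overpayment" if payment["amount"] > expected
                         else "underpayment")
    if not payment.get("documentation"):
        # Category 3: no documentation available to support the payment.
        flags.append("missing_documentation")
    if payment.get("use") not in payment.get("approved_uses", []):
        # Category 4: funds used in an improper manner.
        flags.append("improper_use")
    return flags

# Example: $1,200 paid where $1,000 was approved, documentation on file.
record = {"recipient": "ACME", "amount": 1200, "documentation": True,
          "use": "equipment", "approved_uses": ["equipment"]}
print(classify_payment(record, {"ACME"}, {"ACME": 1000}))  # ['overpayment']
```

In practice each category would be backed by its own data source (eligibility rolls, obligation records, document repositories), but the decision logic reduces to checks of this shape.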
One of the key recommendations in the report is for the federal government to establish "analytical shared centers of excellence (COE) focused on enterprise-level operations to reduce fraud, waste, and abuse and strengthen program integrity."
These COEs, whether dedicated to a single agency or shared among several, would enable sophisticated modeling and simulation over historical data: identifying anomalous thresholds through multivariate factor analysis, fusing disparate data sources (including advanced geospatial analytics), and testing new methods and tools. Further, the COEs could develop more sophisticated rule systems (rather than simple filters) to screen incoming entities and outgoing transactions at acceptable false-positive rates.
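The idea of screening transactions "at acceptable false-positive rates" can be made concrete with a toy example: tune a cutoff on known-good historical data so that only a chosen fraction of legitimate activity gets flagged. The single z-score rule and the tiny data set below are illustrative stand-ins for the richer multivariate models a COE would actually build.

```python
# Minimal sketch: a rule-based transaction screen whose threshold is tuned
# on historical (assumed-legitimate) data to meet a target false-positive
# rate. Real COE systems would combine many such rules and models.
import statistics

def fit_threshold(historical_amounts, target_fpr=0.05):
    """Pick a z-score cutoff so roughly target_fpr of good history is flagged."""
    mean = statistics.mean(historical_amounts)
    stdev = statistics.stdev(historical_amounts)
    zscores = sorted(abs(a - mean) / stdev for a in historical_amounts)
    # Cut at the (1 - target_fpr) quantile of legitimate activity.
    idx = int(len(zscores) * (1 - target_fpr))
    cutoff = zscores[min(idx, len(zscores) - 1)]
    return mean, stdev, cutoff

def screen(amount, mean, stdev, cutoff):
    """Flag a transaction whose z-score exceeds the tuned cutoff."""
    return abs(amount - mean) / stdev > cutoff

history = [100, 102, 98, 105, 97, 101, 99, 103, 96, 104]
mean, stdev, cutoff = fit_threshold(history, target_fpr=0.10)
print(screen(5000, mean, stdev, cutoff))  # wildly anomalous amount: flagged
print(screen(100, mean, stdev, cutoff))   # typical amount: passes
```

The design point is that the threshold is a tunable policy knob: lowering the target false-positive rate reduces investigator workload at the cost of missing more anomalies, and vice versa.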
As data professionals, how do you feel about these recommendations? Do you believe investment in IT solutions, specifically analytics, can help bridge the gap between smaller budgets and essential services?
@SaneIT I haven't looked into that. I'm not sure the government is completely transparent about how much of our personal data it collects and shares among agencies. I have seen articles about the government extracting that data from Google, like this one:
Governments want Google's data more than ever.
That's the conclusion of the search engine giant's latest Transparency Report, which indicates that governments around the world filed an increasing number of requests for user data in the second half of 2012. In the United States, some 68 percent of those requests came through subpoenas, while 22 percent came through Electronic Communications Privacy Act (ECPA) search warrants; the remainder was "mostly court orders," according to Google's Jan. 23 blog posting on the matter.
Google received a total of 21,389 requests for information about 33,634 users for the July-through-December timeframe. The United States topped the list with 8,438 user data requests, followed by India with 2,431, France with 1,693, Germany with 1,550, and the United Kingdom in fifth with 1,458.
The United States also headed the list of countries issuing court orders to remove government data from Google services, with 209, followed by Germany with 180, Brazil with 143, Turkey with 48 and France with 37. When that same data is broken down by "Other requests (executive, police, etc.)" instead of court orders, Turkey tops the list with 453 requests, followed by the United Kingdom with 79, Germany with 67, and the United States and India with 64 apiece.
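A few lines of arithmetic put the quoted country figures in perspective; the numbers below come straight from the report excerpt above.

```python
# Quick arithmetic on the figures quoted from Google's Transparency Report
# (user-data requests, July-December 2012).
requests = {
    "United States": 8438, "India": 2431, "France": 1693,
    "Germany": 1550, "United Kingdom": 1458,
}
total = 21389  # worldwide user-data requests in the period

top_five = sum(requests.values())
print(f"Top five countries: {top_five} of {total} requests "
      f"({top_five / total:.0%})")
print(f"US share alone: {requests['United States'] / total:.0%}")
```

The five countries named account for roughly three-quarters of all requests worldwide, with the United States alone filing nearly two-fifths of them.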
That's great and exactly the kind of thing I was thinking of. The amount of weather data that NASA and NOAA both hold is immense -- why not share it and make our tax dollars go further? Have you seen any agencies doing this with personal data?
@SaneIT there is some attempt to do that now with collaborative projects. For example, in 2012 NASA listed a number of big data initiatives, and some are collaborative efforts with DOE, or with NOAA and the EPA. Much of what NASA investigates now relates to weather and climate change, which does overlap with the interests of other government agencies.
Absolutely, SaneIT. I don't think the federal government is willing to acknowledge that some of the data that was once proprietary to an agency is now readily available through many sources -- including Melissa Data, white pages, etc.
That's exactly what I was talking about, Noreen. There seem to be huge gaps in awareness of what other agencies have, can do, or have done. I'm wondering how much we could save by pulling some of that data into one pool and making agencies share it. The initial project would be expensive, but the end result would be less waste.
I agree that not all data is the same, but we have many overlapping government agencies with overlapping functions, and sometimes I think how nice it would be if they at least had similar procedures when you deal with them. When you consider all the government agencies that hold data on you, and the fact that updating your data with one doesn't necessarily trickle down to the others, it's obvious they could do better. I also think about local government issues we've seen around here lately, like agencies leasing space when there are empty government-owned buildings nearby. Chances are most of them don't know that space exists, which leads me to believe they don't know about other things going on around them either -- like databases they could be using to sanity-check their data, or could use instead of duplicating it.
In the interest of eliminating redundancy, striving for uniformity, and promoting efficient cost controls, all federal tech initiatives and implementation should be under the direction of a newly formed autonomous agency with the simple mandate of applying the best available technologies to government operations. Much like the concept of the GSA.
@SaneIT - Sorry, I was trying for irony rather than sarcasm but sometimes there's just a fine line between the two.
I agree with your sentiment that the agencies that know what to do with data should share. But all data is not created equal, and neither are agencies. Government agencies often don't know what the other hand is doing, or don't want to work together because of interagency competition (and jealousy) -- e.g., the FBI and CIA. Another issue is the contract funding process, which can frequently tie an agency's hands or limit its flexibility to make changes.
I've wondered the same thing. The Education Department outsourced an address canvassing operation for a survey in 2011 that duplicated work the Census Bureau had performed in 2009. And either agency could have obtained the same info from any one of a number of private companies for a lot less money!
@Callmebob, I'm reading that comment with a touch of sarcasm, but I'd rather the government spend money on real, usable systems and pay real companies that are doing more than just fraudulently soaking the government than let it keep paying out blindly. In the short term the cost might look similar to what is already being lost to fraudulent or less-than-optimal transactions, but in the long run, as waste is reduced, the cost is recovered. I just wonder whether we have any government agencies that are good at using the data they have, since we hear so many horror stories. If there is one agency that is good with data, why can't we get it to share some of that savvy with the others?
SAS Health Analytics Virtual Conference: Health care is rapidly transforming, and there has never been a greater need for analytics. We're tackling tough challenges like data transparency, care delivery, consumer engagement, and financial and clinical risk. And there are still numerous opportunities to use health data that we haven't even tapped into.
2014 VA Interactive Roadshow -- Houston, New York, Rockville (MD), Detroit, Chicago, Cary (NC), Boston, and Atlanta: Each stop on the 2014 VA Interactive Roadshow will feature SAS® Data Management and SAS® Visual Analytics experts covering topics like prepping data for VA and VA integration with SAS® Office Analytics. This year's events will keep presentations to a minimum and focus on giving attendees hands-on exposure to the latest version of VA.
Analytics 2014: The Analytics 2014 Conference is a two-day educational event for anyone who is serious about analytics. This annual event brings together hundreds of professionals, industry experts, and leading researchers in the field of analytics. Register before April 30 for the early-bird discount.
LEADERS FROM THE BUSINESS AND IT COMMUNITIES DUEL OVER CRITICAL TECHNOLOGY ISSUES
The Current Discussion
Visual Analytics: Who Carries the Onus? The Issue: Data visualization is an up-and-coming technology for businesses that want to deliver analytical results visually, giving analysts the ability to spot patterns more easily and business users the ability to absorb insights at a glance and better understand what questions to ask of the data. But does it make more sense to train everybody to handle the visualization mandate or to bring on visualization expertise? Our experts are divided on the question. The Speakers: Hyoun Park, Principal Analyst, Nucleus Research; Jonathan Schwabish, US Economist & Data Visualizer
The big-data analytics market can be a confusing place. Among the vendors vying for your dollars are traditional database management providers, Hadoop startup services, and IT giants. In this video, All Analytics editors Beth Schultz and Michael Steinhart sit down in a Google+ Hangout on Air with Doug Henschen, executive editor of InformationWeek. Henschen discusses use cases for big-data analytics, purchase considerations, and his recent roundup of the top 16 big-data analytics platforms.
At the National Retail Federation BIG Show last month, All Analytics executive editor Michael Steinhart noted a host of solutions for tracking and analyzing customer activity in retail stores. From Bluetooth beacons to RFID tags to NFC connections to video analytics, retailers must find the right combination of tools to help optimize the shopper experience, streamline operations, and boost revenues.
The days when historical shipment trends and gut feelings were enough to forecast retail demand accurately are long over. SAS chief industry consultant Charles Chase outlines the benefits of pulling real-time sales information from point-of-sale and product scanner systems, then flowing that data into dynamic forecasting tools from SAS.
Electronic shelf-edge labels (ESLs) equipped with low-energy Bluetooth beacons enable retailers to deliver real-time customer interaction and execute dynamic pricing strategies. Andrew Dark, CEO of Displaydata, outlines the ESL architecture and explains how it integrates with backend management and analytics systems.
Retailers like Family Dollar and suppliers like Procter & Gamble are using big-data analytics to maximize efficiency and revenue across the entire supply chain. Lori Schafer, Executive Advisor for the SAS Institute Retail Practice, moderated a panel with executives from these companies at the National Retail Federation BIG Show in New York last month. Here, she shares insights on retail supply chain optimization and in-store customer tracking for targeted sales.
EKN Research's "The Rising Importance of Customer Data Privacy in a SoLoMo Retailing Environment" report details the top challenges and opportunities that retailers face when embracing big data analytics. EKN SVP of Research and Principal Analyst Gaurav Pant explains the importance of data management and lays out seven steps that retailers can take to ensure customer privacy while reaping the benefits of big data.
Customer data is fueling a new phase of retail marketing across physical and online channels. Lori Bieda, executive lead for customer intelligence at SAS Americas, explains how integrated insight enables retailers to optimize offers and improve sales across product categories. She also shares some best-practices for leveraging analytics talent in retail.
This year's National Retail Federation BIG Show wrapped up on January 14. All Analytics executive editor Michael Steinhart reviews highlights of the conference and discusses trends around analytics, personalization, omnichannel, and retail security.
In the wake of 2008's financial meltdown, banks are subject to strict regulations around the soundness of their loan portfolios. Capgemini senior manager Rex Pruitt explains how advanced transition matrices -- driven by SAS analytics tools -- help banks perform effective credit loss forecasting and meet their regulatory requirements.
David Bencs, assistant director of Insight and Analytics for the Orlando Magic, outlines different analytics projects and the benefits they're delivering to the NBA franchise. The team put demand-based pricing in place a few years ago, for example, and single-game ticket revenue grew 28% despite a disappointing season. Next up for the Magic is to combine social media activity, television viewership stats, and ticket sales data to achieve a 360-degree customer view.
David Tishgart, senior director of marketing and alliances at security provider Gazzang, explains the importance of data encryption for companies that are rolling out Hadoop environments to leverage big data analytics.
At the Strata Conference / Hadoop World 2013, Samuel Kommu, technical marketing engineer at Cisco Systems, shares some of the benefits that Hadoop brings to analytics platforms that leverage next-generation hardware. Kommu looks at big data operations that required 3,500 nodes in 2009, 2,000 in 2011, and now require only 64 nodes.
With today's advanced visual analytics tools, you can stream data into memory for real-time processing, provide users the ability to explore and manipulate the data, and bring your data to life for the business.
Dynamic data visualizations let analysts and business users interact with the data, changing variables or drilling down into data points, and see results in a flash. Advance your use of data visualization with tools that support features like auto-charting, explanatory pop-ups, and mobile sharing.
No doubt your enterprise is amassing loads of data for fact-based decision-making. Hand in hand with all that data comes big computational requirements. Can traditional IT infrastructure handle the increasing number and complexity of your analytical work? Probably not, which is why you need a backend rethink. Big data calls for a high-performance analytics infrastructure, as Fern Halper, a partner at the IT consulting and research firm, Hurwitz & Associates, discusses here.
Redbox's bright-red DVD kiosks are all but ubiquitous these days, located in more than 28,000 spots across the country. Jayson Tipp, Redbox VP of Analytics and CRM, provides an insider's look at how the company has accomplished its phenomenal nine-year growth.
InterContinental Hotels Group (IHG), a seven-brand global hotelier, has woven analytics into the fabric of its operations. David Schmitt, director of performance strategy and planning, shares IHG's analytics story and his lessons learned.