ACT, in conjunction with the Industry Advisory Council (IAC), recommends that government agencies increase investment in data analytics, productivity-enhancing technology, and other technological solutions to reduce waste, fraud, and abuse. It concludes that the potential savings will far outweigh the roughly $80 billion a year the federal government currently invests in technology.
By way of background, ACT was created in 1979 to improve government through the efficient and innovative application of IT. A decade later, ACT established the IAC to bring industry and government executives together to collaborate on IT issues of interest to the government.
The report concludes that investment in IT innovations is the best way for federal agencies to continue to deliver essential services at reasonable costs. Specifically, it makes three recommendations:
Accelerate the use of data analytics to identify opportunities to reduce government costs. "Gathering and analyzing data from a variety of internal and external sources can help determine performance and outcomes for federal programs," the report notes.
Invest in technology to increase productivity and reduce costs. "The federal government has the opportunity to accomplish its missions in a cost effective manner while providing the citizenry with expected levels of service by strategically using IT as a part of the solution."
Use technology to combat fraud, waste, and abuse. "The federal government is in a unique position to strategically apply IT investments to reduce federal outlays and the federal deficit. Further, IT investments typically result in many ancillary benefits, such as providing jobs and much needed tax revenue."
Although the government has reduced the amount of "improper payments" in recent years, we still waste a lot of tax dollars -- about $115 billion in fiscal year 2011. In that period, federal agencies reported a government-wide improper payment rate of 4.69 percent, a decrease from the 5.3 percent improper payment rate reported a year earlier, government data shows.
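As a quick sanity check on those figures, the improper-payment total and rate together imply the size of the outlays being measured. This sketch uses only the numbers quoted above:

```python
# Back-of-the-envelope check on the FY2011 figures cited above.
improper_payments = 115e9   # improper payments, in dollars
improper_rate = 0.0469      # government-wide improper payment rate

# The rate is improper payments divided by total payments reviewed,
# so the implied base of reviewed outlays is:
implied_outlays = improper_payments / improper_rate
# roughly $2.45 trillion in payments reviewed
```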
"Improper payments" occur when:
funds go to the wrong recipient;
the right recipient receives the incorrect amount of funds (including overpayments and underpayments);
documentation is not available to support a payment; or
the recipient uses funds in an improper manner.
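Those four categories are concrete enough to check mechanically against a payment record. Here is a minimal sketch; all field names (`recipient_id`, `amount_paid`, and so on) are hypothetical, not taken from any actual federal system:

```python
# Hypothetical sketch: test a payment record against the four
# "improper payment" categories listed above. Field names are illustrative.
def improper_payment_reasons(payment):
    """Return a list of reasons a payment is improper (empty if none apply)."""
    reasons = []
    if payment["recipient_id"] != payment["entitled_recipient_id"]:
        reasons.append("wrong recipient")
    if payment["amount_paid"] != payment["amount_owed"]:
        kind = ("overpayment" if payment["amount_paid"] > payment["amount_owed"]
                else "underpayment")
        reasons.append(f"incorrect amount ({kind})")
    if not payment.get("supporting_docs"):
        reasons.append("missing documentation")
    if payment.get("funds_misused"):
        reasons.append("improper use of funds")
    return reasons
```

In practice, of course, the hard part is populating fields like `entitled_recipient_id` and `funds_misused` in the first place, which is exactly where the analytics investments the report describes come in.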
One of the key recommendations in the report is for the federal government to establish "analytical shared centers of excellence (COE) focused on enterprise-level operations to reduce fraud, waste, and abuse and strengthen program integrity."
These COEs, whether dedicated to a single agency or shared among several, would enable sophisticated modeling and simulation on historical data -- for example, identifying anomalous thresholds through multivariate factor analysis, fusing disparate data sources (including advanced geospatial analytics), and testing new methods and tools. Further, the COEs could develop more sophisticated rule systems (rather than simple filters) to screen incoming entities and outgoing transactions at acceptable false-positive rates.
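To make the distinction between a simple filter and a rule system concrete, here is one possible sketch: each rule contributes weighted evidence, and a transaction is flagged only when the combined score crosses a threshold that can be tuned to an acceptable false-positive rate. Every rule, field name, weight, and threshold here is hypothetical:

```python
# Illustrative rule system (not the report's actual design): multiple
# weighted rules vote, rather than a single pass/fail filter.
RULES = [
    ("amount far above payee's historical mean",
     lambda t: t["amount"] > t["payee_mean"] + 3 * t["payee_std"], 2.0),
    ("payee address outside program's service area",
     lambda t: not t["in_service_area"], 1.5),
    ("duplicate invoice number seen this quarter",
     lambda t: t["duplicate_invoice"], 2.5),
]

def screen(transaction, threshold=3.0):
    """Return (flagged, fired_rule_names); raising the threshold trades
    missed detections for fewer false positives."""
    fired = [(name, weight) for name, pred, weight in RULES
             if pred(transaction)]
    score = sum(weight for _, weight in fired)
    return score >= threshold, [name for name, _ in fired]
```

A single anomalous signal (say, an unusually large amount) is not enough to flag a payment on its own; it takes corroborating evidence from a second rule, which is one way such systems keep false-positive rates manageable.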
As data professionals, how do you feel about these recommendations? Do you believe investment in IT solutions, specifically analytics, can help bridge the gap between smaller budgets and essential services?
@SaneIT I haven't looked into that. I'm not sure the government is completely transparent about how much of our personal data it collects and shares among agencies. I have seen articles about the government extracting that data from Google, like this one:
Google is reporting yet another uptick in government requests for its users' data.
That's the conclusion of the search engine giant's latest Transparency Report, which indicates that governments around the world filed an increasing number of requests for user data in the second half of 2012. In the United States, some 68 percent of those requests came through subpoenas, while 22 percent came through Electronic Communications Privacy Act (ECPA) search warrants; the remainder was "mostly court orders," according to Google's Jan. 23 blog posting on the matter.
Google received a total of 21,389 requests for information about 33,634 users for the July-through-December timeframe. The United States topped the list with 8,438 user data requests, followed by India with 2,431, France with 1,693, Germany with 1,550, and the United Kingdom in fifth with 1,458.
The United States also headed the list of countries issuing court orders to remove government data from Google services, with 209, followed by Germany with 180, Brazil with 143, Turkey with 48 and France with 37. When that same data is broken down by "Other requests (executive, police, etc.)" instead of court orders, Turkey tops the list with 453 requests, followed by the United Kingdom with 79, Germany with 67, and the United States and India with 64 apiece.
That's great, and exactly the kind of thing I was thinking of. The amount of weather data that NASA and NOAA both hold is immense; why not share it and make our tax dollars go further? Have you seen any agencies doing this with personal data?
@SaneIT there is some attempt to do that now with collaborative projects. For example, in 2012 NASA listed a number of big data initiatives, and some are collaborative efforts with DOE or NOAA and the EPA. Much of what NASA investigates now relates to weather and climate change, which does overlap with the interests of other government agencies.
Absolutely, SaneIT. I don't think the federal government is willing to acknowledge that some of the data that was once proprietary to an agency is now readily available through many sources -- including Melissa Data, white pages, etc.
That's exactly what I was talking about, Noreen. There seem to be huge gaps in awareness of what other agencies have, can do, or have done. I wonder how much we could save by pulling some of that data into one pool and making them share it. The initial project would be expensive, but the end result would be less waste.
I agree that not all data is the same, but we have many government agencies with overlapping functions, and sometimes I think how nice it would be if they at least had similar procedures when you're dealing with them. When you consider all the government agencies that have data on you -- and the fact that updating your data with one doesn't necessarily trickle down to the others -- it's obvious they could do better. I also think about local government issues we've seen around here lately, like agencies leasing space when there are empty government-owned buildings. Chances are most of them don't know that space exists, which leads me to believe they don't know about other things going on around them either, like databases they could use to sanity-check their data, or use instead of duplicating data.
In the interest of eliminating redundancy, striving for uniformity, and promoting efficient cost controls, all federal tech initiatives and implementation should be under the directive of a newly formed autonomous agency with the simple mandate of applying the best available technologies to government operations. Much like the concept of the GSA.
@SaneIT - Sorry, I was trying for irony rather than sarcasm but sometimes there's just a fine line between the two.
I agree with your sentiment that if any agencies know what to do with data, they should share it. But not all data is created equal, and neither are agencies. Often government agencies don't know what the other hand is doing, or don't want to work together due to interagency competition (and jealousy) -- e.g., the FBI and CIA. Another issue is the contract funding process, which can frequently tie an agency's hands or limit its flexibility to make changes.
I've wondered the same thing. The Education Department outsourced an address canvassing operation for a survey in 2011 that duplicated work the Census Bureau had performed in 2009. And either agency could have obtained the same info from any one of a number of private companies for a lot less money!
@Callmebob, I'm reading that comment with a touch of sarcasm, but I'd rather the government spend money on real, usable systems and pay real companies doing real work than keep paying out blindly to those soaking it fraudulently. In the short term the cost might look similar to what they are already losing to fraudulent or less-than-optimal transactions, but in the long run, as waste is reduced, the cost is recovered. I just wonder if we have any government agencies that are good at using the data they have, since we hear so many horror stories. If there is one agency that is good with data, why can't we get it to share some of that savvy with other agencies?