5 Cool Demos for Data Analytics Pros


Dr. Jim Goodnight at SAS Global Forum

Dr. Jim Goodnight used Amazon's Alexa as a voice interface to SAS Visual Analytics during the opening keynote address at the SAS Global Forum in Orlando on April 2. And while Alexa wasn't perfectly responsive 100% of the time -- the executives called her "feisty" -- the fact that CEO Goodnight used the tool as part of the opening-session demonstration illustrates the focus SAS (the sponsor of AllAnalytics) is putting on technologies such as artificial intelligence, cognitive computing, machine learning, and deep neural networks.

The Alexa demo was one of several demonstrations, advances, and updates introduced during the first two sessions of the user and executive conference, which attracted nearly 5,600 attendees this year.

Several of these stood out. Here are 4 more cool things I saw in the first two days of the SAS Global Forum:

The GatherIQ App

SAS demonstrated the GatherIQ app on Sunday night, and it is now available in the iTunes store for download, with an Android client coming this summer. The app is intended to unite data power with crowd power to solve global humanitarian challenges. Its first project is with the International Organization for Migration (IOM), and is designed to collect and analyze data to help people better understand the dangers that migrants face. Every year thousands of migrants go missing or die on their journeys, and IOM's Missing Migrant Project is using GatherIQ to surface data about them, including demographics, migrant routes, and locations.

The GatherIQ project continues an ongoing collaboration between SAS and IOM. SAS helped analyze data from IOM shelters following Typhoon Haiyan's devastation of the Philippines in 2013, enabling IOM to target relief efforts and identify the most crucial health problems. SAS and IOM also worked together following the Nepal earthquake in 2015.

Machine Learning

Jorge Silva, senior machine learning scientist at SAS, demonstrated the power of text analytics for next-generation machine learning on the main stage during Monday's session, which was moderated by SAS CTO Oliver Schabenberger. Silva built a factorization machine on NCAA basketball data, designed to predict the outcome of the Monday night final. Using four variables and the Python client Jupyter as an interface to SAS, he processed over 20 million rows to predict that North Carolina would beat Gonzaga. (UNC did win.)

"You could have run this in SAS as well," he said. "All the heavy lifting is done by SAS Viya." But the Jupyter client option offers another choice to programmers who may be more comfortable with Python.
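SAS hasn't published the details of Silva's model, but the factorization machine idea itself is compact: a prediction is a global bias, plus per-feature weights, plus pairwise feature interactions whose strengths are learned as dot products of low-rank latent vectors. Here is a minimal plain-Python sketch of the scoring step; all feature values and weights below are invented for illustration, and this is not SAS Viya code.

```python
# Factorization machine prediction:
# y = w0 + sum_i w[i]*x[i] + sum_{i<j} dot(v[i], v[j]) * x[i]*x[j]

def fm_predict(x, w0, w, v):
    """Score a feature vector x with a factorization machine.

    w0 : global bias
    w  : linear weights, one per feature
    v  : one latent vector (length k) per feature; the interaction
         strength between features i and j is the dot product v[i].v[j]
    """
    linear = w0 + sum(wi * xi for wi, xi in zip(w, x))
    pairwise = 0.0
    for i in range(len(x)):
        for j in range(i + 1, len(x)):
            dot = sum(a * b for a, b in zip(v[i], v[j]))
            pairwise += dot * x[i] * x[j]
    return linear + pairwise

# Toy example: 3 features, k = 2 latent factors (made-up numbers)
x = [1.0, 0.5, 2.0]
w0 = 0.1
w = [0.2, -0.1, 0.05]
v = [[0.1, 0.2], [0.3, -0.1], [0.0, 0.4]]
score = fm_predict(x, w0, w, v)
```

The appeal of the low-rank trick is that the model can estimate an interaction between two feature values (say, two teams) even if that exact pair never appeared together in the training data, because each side contributes its own latent vector.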

SAS Graphics Accelerator

Ed Summers, senior manager of accessibility and applied assistive technology at SAS, gave a demonstration of sonification, the art and science of using sound to represent data. SAS has released a new product called the SAS Graphics Accelerator that displays data using sound.

"It will become as important to data as Braille is to text," Summers said. In the demo, Summers mapped the Y axis of a bar chart to musical pitch, or notes on a piano keyboard, with each note representing the top of a bar. Summers demonstrated a couple of bar charts this way -- one showing the popularity of various makes of automobiles, another providing an audio representation of the Dow Jones Industrial Average weekly closing prices over the course of about a decade.
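The mapping Summers described is simple to state in code: scale each bar's height into a range of piano keys, then convert the key number to a frequency (on a standard 88-key piano, key 49 is A4 at 440 Hz). The sketch below is my own illustration of that mapping, not SAS Graphics Accelerator code, and the sample values are invented.

```python
def key_to_hz(n):
    """Frequency of the n-th key on an 88-key piano (A4 = key 49 = 440 Hz)."""
    return 440.0 * 2 ** ((n - 49) / 12)

def sonify(values, low_key=28, high_key=76):
    """Map each bar height to a piano key, scaled linearly over the data range."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero on flat data
    keys = [round(low_key + (v - lo) / span * (high_key - low_key)) for v in values]
    return [(k, key_to_hz(k)) for k in keys]

# Made-up weekly closing prices: rising prices rise in pitch
notes = sonify([10000, 10500, 9800, 11200, 12000])
```

Playing the resulting frequencies in sequence gives a listener the shape of the chart: the lowest bar lands on the lowest key, the highest bar on the highest.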

Edge Analytics

SAS recently announced an IoT partnership with networking giant Cisco. During Monday's session SAS provided a look at how Edge-to-Enterprise IoT analytics will work, and why edge analytics are important. In IoT, data is generated on devices. The volume of devices -- say, sensors on each truck in a fleet of trucks -- makes it impractical to send all the data to a central location for analysis. And some data may need to be used at the edge immediately: maybe the truck has a maintenance issue that should be addressed sooner rather than later. So it makes sense for analytics to move to the edge, too. SAS and Cisco provided a look at why that is important and what the architecture will look like.

A great example of a use case is predictive maintenance. If you can predict which parts are about to fail, you can be proactive with replacements and reduce downtime. But to do this, you need analytics and models that allow you to predict failure, Schabenberger said.
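Neither SAS nor Cisco detailed their scoring logic, but the edge pattern itself can be sketched simply: keep a small rolling window of readings on the device, score each new reading locally, and forward only the flagged readings (rather than every data point) to the central system. A toy rolling z-score monitor, with invented window and threshold values:

```python
from collections import deque

class EdgeMonitor:
    """Flag outlier sensor readings on-device, so only anomalies
    (not the full data stream) travel to the central system."""

    def __init__(self, window=50, threshold=3.0):
        self.readings = deque(maxlen=window)  # rolling history
        self.threshold = threshold            # z-score cutoff

    def score(self, value):
        """Return True if `value` is an outlier vs. the rolling window."""
        if len(self.readings) >= 2:
            mean = sum(self.readings) / len(self.readings)
            var = sum((r - mean) ** 2 for r in self.readings) / len(self.readings)
            std = var ** 0.5 or 1.0  # guard against a flat window
            is_anomaly = abs(value - mean) / std > self.threshold
        else:
            is_anomaly = False  # not enough history yet
        self.readings.append(value)
        return is_anomaly

# A vibration sensor reporting steady values, then a spike
m = EdgeMonitor(window=10, threshold=3.0)
flags = [m.score(v) for v in [1.0, 1.1, 0.9, 1.0, 1.05, 5.0]]
```

Only the final spike is flagged, so only one event (not six readings) would need to leave the truck; a real deployment would of course use a trained failure model rather than a simple z-score.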

Jessica Davis, Senior Editor, Enterprise Apps, Informationweek

Jessica Davis has spent a career covering the intersection of business and technology at titles including IDG's Infoworld, Ziff Davis Enterprise's eWeek and Channel Insider, and Penton Technology's MSPmentor. She's passionate about the practical use of business intelligence, predictive analytics, and big data for smarter business and a better world. In her spare time she enjoys playing Minecraft and other video games with her sons. She's also a student and performer of improvisational comedy. Follow her on Twitter: @jessicadavis.


Re: Can someone explain this a little more?
  • 4/11/2017 8:35:03 AM

That's a great thing to watch out for, given the huge volumes possible for data. And speaking of that, "processing over 20 million rows to predict that North Carolina will beat Gonzaga" was pretty interesting as well.

Re: Can someone explain this a little more?
  • 4/6/2017 9:45:38 PM

Using sound to represent data doesn't sound too far-fetched to me. We've been using sound to tell us how to feel or to represent other factors in theater, movies, and TV programs. It helps to create a more realistic and 3D world for us.

Humans can generally hear from 20 to 20,000 hertz, and there are 88 notes on a piano. That's a lot of possible representation.

Re: Can someone explain this a little more?
  • 4/5/2017 2:39:08 PM

I agree. It really is another step toward a totally natural interface that anyone could use.

Re: Can someone explain this a little more?
  • 4/5/2017 2:14:36 PM

I see the auditory output as another step along the path of conforming machines to the way people work, instead of the other way around.

Representing a bar graph as a sequence of tones is, I would think, a rudimentary application. But it opens up the imagination to a future where we could hear a symphony of how the organization is currently running.

I realize some people are tone-deaf, so this doesn't work for everyone. However, for some of us, it will be musical and beautiful. And we'll wonder - how did we ever get along without this?

Re: Can someone explain this a little more?
  • 4/4/2017 1:33:59 PM

@saneit It sounds like it's just a variation on data visualization, using notes rather than standard plots or colors. I should think that spatial identification would be faster with something simpler, but maybe this appeals in particular to those who do read music.

Can someone explain this a little more?
  • 4/4/2017 8:36:54 AM

"It will become as important to data as Braille is to text." I'm not quite understanding how musical notes mapped to data points become useful. Even if you are visually impaired, wouldn't a 3D model of a chart or graph be more meaningful than an audible tone? All I can picture here is Close Encounters of the Third Kind, with an analyst reporting data via a keyboard and some flashing lights while board members answer back in Charlie Brown adult voices.
