Charles Babcock is an editor-at-large for InformationWeek and author of Management Strategies for the Cloud Revolution, a McGraw-Hill book. He is the former editor-in-chief of Digital News, former software editor of Computerworld, and former technology editor of Interactive Week. He is a graduate of Syracuse University, where he obtained a bachelor's degree in journalism. He joined the publication in 2003.
AI to Become Key Competitive Factor by 2020, Says Tata
- by Charles Babcock
- 5/1/2017 1:12:48 PM
@Michelle: With open APIs, open standards, and the like, this competitive advantage that the big boys have is largely going away for companies that want to leverage AI to compete.
- by Michelle, Data Doctor
- 4/30/2017 11:04:01 AM
@kq4ym That's the data I want to see too. It seems as though larger organizations could implement AI much more easily. Smaller organizations may not have as much to work with in terms of funding. Profitability is key.
- 4/30/2017 10:59:20 AM
@Ariella: It's kind of amazing how poorly most people adapt to change. They find that a few things or ways of thinking or doing work for them, so they stick with those for the rest of their lives regardless of change and regardless of circumstances.
- 4/30/2017 10:58:00 AM
@louis: I'm with you, to an extent. For businesses, autonomous vehicles have tremendous promise. (Think self-driving taxis/Ubers, self-driving freight trucks, etc.) For consumers, however, the only "advantages" that self-driving vehicles offer come down to two arguments:
1) You can multitask and relax instead of focusing on the road.
2) Computers are better drivers than pitiful humans.
#1 is insufficient, and #2 is, overall, just plain wrong.
Not to mention many other arguments against becoming a society of autonomous vehicles (primarily: the loss of human autonomy).
- 4/30/2017 10:53:18 AM
@T Sweeney: "AI" sounds sexier and is a more graspable concept than "machine learning," "deep learning," "neural networks," and the like.
"AI" today is kind of like the "cloud computing" that Larry Ellison was ranting about close to a decade ago...except that we actually did have the technology for "platonic" cloud computing back then.
- by kq4ym, Data Doctor
- 4/27/2017 4:45:07 PM
I was surprised by the numbers quoted: "Eighty-four percent of large companies around the world say they are using artificial intelligence." But after realizing those polled were gigantic corporations, I'm no longer so surprised. The interesting thing to watch is how long and to what extent moderate-sized companies will invest in AI and find it profitable to do so.
- by louisw900, Blogger
- 4/25/2017 6:54:43 PM
rbaz, I appreciate your insight. When I sit back and consider your wisdom, I can't help but agree. I think you are right: everyone does want to be at the party, whether they understand AI or not. Recently, before this onslaught of AI articles, I asked myself what the best plausible new frontier is. AI could not be denied, and I'm not sure anything else really comes close.
As a result, I enrolled in an edX class on AI, among others. I highly encourage others to do the same and to take advantage of the massive amount of resources available on A2, of course.
It has been pretty heady stuff, but one has to start somewhere, and I hope it's worth the time investment because, as you say, "the train is leaving the station"!
- by rbaz, Data Doctor
- 4/25/2017 4:27:47 PM
Louis, the rush is due to fear of being left behind and missing out. Everyone wants to be at the party. Even though they may not be able to make sense of it all presently, it will become clear later on. In the meantime, hop on board, because the train is leaving the station.
- by rbaz, Data Doctor
- 4/25/2017 4:20:29 PM
No surprise to me at all. Investment in technology seems to be more like wading in the water versus diving in. Both are engaged with the water, but the level of engagement is different.
- by Ariella, Data Doctor