AI: Doomed to Buzzword Status


It's a bit of a scary thought. For all of the good that machine learning and artificial intelligence promise, what happens if either one becomes the next hot marketing buzzword in tech and the business sector?

Credit: Pixabay

Picture AI falling helplessly into the morass that sucked in "big data," "smart" anything, and "Internet of" you name it. That rumble you feel might be Alan Turing fidgeting in his grave.

I can hear some marketing newbie now, "Well, our product is made of plastic. That's artificial, right? And, we think it's really intelligent. Artificial intelligence!"

Or, machine learning. "It's really smart!"

For more than 60 years, hundreds of very bright and accomplished computer scientists, from Turing to today's doctoral students, have researched and debated what AI is, and what it isn't. At what point is a computer actually thinking? The answers aren't easy.

Then, we have machine learning as a subset of, or precursor to, AI. Feed a neural network enough examples -- such as text and images -- and it advances to the point where it can translate English into another language, recognize faces, or identify the most successful treatments for diseases.
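
To make that learning-from-examples idea concrete, here's a minimal sketch using scikit-learn's small built-in digits dataset. The dataset, library, and network size are my own illustrative choices rather than anything from the systems above; the point is simply that the network is never handed rules for what a "7" looks like, only labeled samples.

    # Minimal sketch: "learning from examples" with a small neural network.
    # The model sees only labeled 8x8 digit images and learns the mapping
    # itself; no recognition rules are programmed in.
    from sklearn.datasets import load_digits
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    X, y = load_digits(return_X_y=True)  # 1,797 flattened images, labels 0-9
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0)

    net = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
    net.fit(X_train, y_train)            # train on examples only

    print(f"accuracy on unseen digits: {net.score(X_test, y_test):.2%}")

Swap in more data and bigger networks, and the same basic recipe underlies the translation and face-recognition successes mentioned above.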

I suppose AI is destined to be cast into Buzzword Hades once everyone from that marketing newbie to the CEO desperate for something innovative hears more about the real-world successes of AI and machine learning. Memos and meetings will be punctuated with shouts of, "We need to be doing that."

We already are seeing niche applications that utilize techniques such as image recognition in anti-terrorism initiatives and pattern recognition in cybersecurity. Applications in the commercial space seem poised to pop into public view.

A Forbes article cites three industry sectors -- healthcare, finance, and insurance -- as prime candidates for AI and machine learning applications.

The article notes, "Sequencing of individual genomes and then comparing them to a vast database will allow doctors -- and/or AI bots -- to predict the probability that you will contract a particular disease and the best ways to treat those diseases when they appear. Companies including Google, Apple, Samsung, and others are investing billions in developing new biometric sensors. Combined with big data, the information from these sensors could help prevent disease and extend lifespans."

In finance, writer Bernard Marr says, AI-based systems will replace human financial advisors, analyzing thousands more investment possibilities, along with big data drawn from our own social media and other activities, to help shape our financial strategies.

In insurance, systems already are utilizing AI to identify who should get discounts on their life, health, or car insurance, based on their lifestyles and past activities.

Incalculable numbers of applications in every industry will tag along, whether they use AI or not.

I guess it's inevitable that any success stories in AI will turn it into a "me too" buzzword. We have a tendency in the tech and business sectors to get bored with tech terms, perhaps frustrated when the concepts they represent aren't being adopted fast enough. Just yesterday, Gartner released data showing that enterprise investments in big data are up but that fewer companies plan future investments in the concept. Of course, that news was greeted by blogs effectively saying, "Alas, poor Big Data, I knew him..."

Over the decades I've come to the conclusion that our rush to adopt new buzzwords stems from the fact that we have a two- or three-year attention span for tech initiatives whose timelines to maturity are two or three times longer. If a concept isn't broadly adopted and providing ROI after those first two or three years, we declare it a failure and move on to the next buzzword. Patience is not in our dictionary.

We have to remember that testing technologies, conceiving and building out corporate applications, and implementing them across dozens of departments with thousands of employees isn't a snap-your-fingers type of thing. And, some situations don't justify a move to the new idea at all, as we are sure to learn with big data, IoT, and AI after that.

James M. Connolly, Editor of All Analytics

Jim Connolly is a versatile and experienced technology journalist who has reported on IT trends for more than two decades. As editor of All Analytics he writes about the move to big data analytics and data-driven decision making. Over the years he has covered enterprise computing, the PC revolution, client/server, the evolution of the Internet, the rise of web-based business, and IT management. He has covered breaking industry news and has led teams focused on product reviews and technology trends. Throughout his tech journalism career, he has concentrated on serving the information needs of IT decision-makers in large organizations and has worked with those managers to help them learn from their peers and share their experiences in implementing leading-edge technologies through publications including Computerworld. Jim also has helped to launch a technology-focused startup, as one of the founding editors at TechTarget, and has served as editor of an established news organization focused on technology startups and the Boston-area venture capital sector at MassHighTech. A former crime reporter for the Boston Herald, he majored in journalism at Northeastern University.

AI in the Workplace: Augment, Instead of Replacing Humans

AI and machine learning won't create massive job losses in the foreseeable future, but some societal issues do come to mind.

MIT's McAfee: Smart Machines Pick up the Pace

MIT's Andrew McAfee looks at all those things we said computers could never do, but they did. And, he warns of the HiPPO.


Will the real AI please stand up?
  • 12/27/2016 9:50:19 PM


In his blog post, Jim writes:


For more than 60 years, hundreds of very bright and accomplished computer scientists, from Turing to today's doctoral students, have researched and debated what AI is, and what it isn't. At what point is a computer actually thinking? The answers aren't easy.

Then, we have machine learning as a subset of, or precursor to, AI. Feed a neural network enough examples -- such as text and images -- and it advances to the point where it can translate English into another language, recognize faces, or identify the most successful treatments for diseases.


In my view of the issue, machine learning must form a key, integral component of true AI. You can't really have AI without the machine's ability to learn and thus expand its cognitive capabilities.


Re: Wheat and Chaff
  • 11/30/2016 9:23:30 PM

Good point, especially with legal. The questions that repeat are more valuable than how the information is stored. Storage can always be changed, and it would be better suited if it matched the queries.

Re: Wheat and Chaff
  • 10/17/2016 8:06:54 AM

I know the search-terms issue all too well. The way we worked, a lawyer or paralegal would hand me a list of search terms and I would plug away. Those search terms were refined over years, and most of the filing systems I worked with used naming conventions that mirrored the typical search terms. There were systems within systems to make it easier for a human to wade through the sea of data. I'm sure Watson would be a whole lot faster than a sleep-deprived 20-year-old with a printed sheet of resources, so I'm all for AI taking over in this regard. I think a Watson-type system would also teach us how to ask better questions; instead of assuming other people classify things the same way we do, Watson can look at things as a blob of data and find the relevant pieces. That means finding relevant data in areas we never would have thought to ask about.

I'm going to say that canceling those subscriptions would have resulted in some very embarrassed lawyers in my case, but it would have made my job a lot less tedious.


Re: Wheat and Chaff
  • 10/16/2016 8:41:10 PM

I am also in that supercomputing camp, Terry, though I think blending it with the AI camp is becoming more of a reality by the day. The number of iterations possible through a Watson is too enticing a fruit for data scientists to ignore.

Re: Wheat and Chaff
  • 10/16/2016 8:21:36 PM

@kq4ym I agree with you. I don't see AI taking over the financial sector like this, but the show will be interesting to watch. I have a feeling AI predictions will be in 'beta' for quite a while until they prove to be as good as human advisors.

Re: Wheat and Chaff
  • 10/14/2016 10:47:39 AM

@SaneIT. The other issue with services such as Lexis/Nexis is that the results are only as good as your search terms. Add that little bit of extra intelligence into the machine/system (as even Google sort of does) and you get "knowledge" instead of lists of documents.

I never ran a library but at various times in the days of paper reference books I did have to insert updates into binders. After a year or two of that it was apparent that nobody was even using the references. So we cancelled the subscriptions and I had one less thing to worry about.
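
A toy sketch of that search-terms point: even a little statistical ranking gets you closer to "knowledge" than an exact keyword lookup does. The corpus and query below are invented for illustration; this shows the general idea, not any vendor's system.

    # Toy sketch: exact keyword lookup vs. ranked retrieval.
    # A literal match on "fired" finds nothing in this corpus, while
    # TF-IDF scoring still surfaces the closest document via shared terms.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    docs = [
        "Employee termination dispute and severance precedent",
        "Wrongful dismissal case law, appellate ruling",
        "Office lease renewal terms",
    ]
    query = ["fired without cause precedent"]

    vec = TfidfVectorizer().fit(docs + query)
    scores = cosine_similarity(vec.transform(query), vec.transform(docs))[0]

    for score, doc in sorted(zip(scores, docs), reverse=True):
        print(f"{score:.2f}  {doc}")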

Re: Wheat and Chaff
  • 10/14/2016 9:13:34 AM

I'm skeptical of when "AI-based systems will replace human financial advisors," and of whether they will perform better or worse than the often haphazard financial predictions currently being offered by human advisors. Now, maybe AI will find the secret formula for near-perfect success in predicting economic futures, but I doubt it.

Re: Wheat and Chaff
  • 10/14/2016 8:15:54 AM

@Jamescon, LexisNexis is what I was thinking of. I did time in college managing a legal library for a very large company's HR legal department. I spent many afternoons pulling books for lawyers and pulling files in the name of research. In my time there I took a spreadsheet-based catalog and built a database to track the changes I made to the books. Many times when an update came for a book in our library, I had to track down who had it so that I could swap the pages. More than once I was trading out sections that someone was working with. From what I've seen, Watson is intelligent enough to know when information changes or conflicts. That ability tells me that with a little effort Watson could learn to tell you when you have conflicts and when the information you're using changes. That would be a huge benefit in research-heavy fields. Couple Watson with a couple of the bot projects I've been seeing a lot of lately, and workflow systems could get a big boost.

Re: Wheat and Chaff
  • 10/13/2016 9:32:01 AM

@SaneIT. The knowledge management systems that were promised in the late 1990s and early 2000s never really took off, probably because they required so many changes in workflow systems and even detailed interviews. Yet that is what systems like Watson are delivering, knowledge management. The pitch on 60 Minutes the other night was that the system could stay up to date on medical research and the effectiveness of treatments because a doctor could never read all of the relevant, ever-changing material.

Now, apply that to many types of support and service roles. Services such as Lexis, which automated legal research, still returned only somewhat relevant results and required lawyers (or law clerks) to read dozens of cases before they could find one that really was appropriate in terms of precedent. It's one of the reasons the justice system moves so slowly. If machine learning surfaced the best results, it would mean greater efficiency. (No, it won't change the bureaucracy.)

Consider any tech support scenario, or distribution and routing systems. How about real estate searches? Those currently are pretty much limited to a handful of factors (number of bedrooms and baths, acreage, neighborhood) and are still largely manual to the point where the system returns total lemons that you only discover when you drive up for a viewing. Suppose you could submit the address of a house that you love in terms of its design and configuration and search for something similar in the neighborhood that you love. That's a realistic future for this type of technology.
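
That "find something similar to a house you love" idea maps naturally onto nearest-neighbor search over listing features. Here's a hypothetical sketch; the listings, feature columns, and numbers are invented, and a real system would fold in far richer signals about design and neighborhood.

    # Hypothetical sketch: represent each listing as a feature vector and
    # return the nearest neighbors to a house the buyer already loves.
    import numpy as np
    from sklearn.neighbors import NearestNeighbors
    from sklearn.preprocessing import StandardScaler

    # columns: bedrooms, baths, acreage, square feet, year built
    listings = np.array([
        [3, 2.0, 0.25, 1800, 1995],
        [4, 2.5, 0.30, 2400, 2001],
        [3, 1.5, 0.20, 1650, 1988],
        [5, 3.0, 1.00, 3500, 2010],
    ])
    loved_house = np.array([[3, 2.0, 0.22, 1750, 1990]])

    scaler = StandardScaler().fit(listings)  # put features on a common scale
    nn = NearestNeighbors(n_neighbors=2).fit(scaler.transform(listings))
    distances, indices = nn.kneighbors(scaler.transform(loved_house))
    print("closest listings:", indices[0], "distances:", distances[0])

Similar vectors, similar houses; the mechanics stay the same whether the features are bedroom counts or learned representations of a floor plan.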

Re: Wheat and Chaff
  • 10/13/2016 8:45:48 AM

I think it is one of the most incredible combinations we've seen in a long time, and I wish we had more frequent updates. I believe the applications are incredibly broad for Watson or a Watson-like system, especially in research-heavy areas. Being able to ingest large amounts of data and draw connections between the various pieces is a big deal. Watson would make an incredible resource for case-law applications and for data tracking in lab environments.
