
AI Jitters Kick Off 2017

It's only halfway through January, and artificial intelligence is already making many people jittery in 2017.

The very thought of AI has a long track record of making smart people uncomfortable. In the last few years, Microsoft Corp. (Nasdaq: MSFT) co-founder Bill Gates has detailed his cautious feelings about the technology, and eminent physicist Stephen Hawking has expressed similar trepidation.

Despite those concerns, the tech field has forged ahead with bringing AI into the mainstream, at least as far as it can go at the moment. Google (Nasdaq: GOOG) has made significant strides with the technology; although it's mostly confined to games and other novelties for now, the momentum shows no sign of stopping. (See AI Threat Is Tech's Fart in the Room.)

Still, the hand-wringing continues.

On Jan. 10, two tech billionaires, along with several respected public research foundations, dedicated $27 million to studying the effects of AI and to applying what the fund's backers call "humanities, the social sciences and other disciplines" to the development of the technology over the coming decades.


Contributors include the Omidyar Network, a charitable fund overseen by eBay co-founder Pierre Omidyar; LinkedIn co-founder Reid Hoffman; and the John S. and James L. Knight Foundation, which is best known for its contributions to NPR and other journalism endeavors.

Other individuals and philanthropists also contributed to the fund, which is officially called the Ethics and Governance of Artificial Intelligence Fund, and more donations are expected. The MIT Media Lab and the Berkman Klein Center for Internet & Society at Harvard University will oversee the academic research.

While the fund's organizers have several different goals in mind, the main thrust is to make AI -- or at least the development of the technology at this early phase -- a little more human.

"Because of this pervasive but often concealed impact, it is imperative that AI research and development be shaped by a broad range of voices -- not only by engineers and corporations, but also by social scientists, ethicists, philosophers, faith leaders, economists, lawyers and policymakers," according to the announcement of the fund.

The notion of making AI a little more human -- or at least a little more understanding of its creators -- is also on the mind of the World Economic Forum, which is warning about the "weaponization" of AI.

That warning comes in a Jan. 11 article by John Drzik, president of Global Risk and Specialties at Marsh, an insurance and risk management firm. Drzik argues that the innovation driving AI, the Internet of Things (IoT) and biotechnology could also lead to a more unstable environment for businesses, as well as for society as a whole.

Drzik offers a general overview of the changes AI and other disruptive technologies have in store for the globe, but he doesn't go into much detail about how weaponized AI would work. He does, however, cite terrorism and cybersecurity risks in general terms.


As IoT and AI workloads spread across different clouds, they create security risks that enterprises should keep in mind now, not in some far-off future, Drzik warns. (See Artificial Intelligence Expert Weighs In for WiC.)

"Other innovations in the technology landscape, such as the migration of data and software to the Cloud and the use of AI and robotics in commercial applications, are also shifting the nature of cyber risk," Drzik wrote.

While AI is seen as one of the next great leaps in technology, there's some concern about how it will develop. That concern drives both the worry about weaponization and the effort to bring ethics into the development process. Since AI remains very much in its early stages, it's not yet clear whether these worries are warranted or an overreaction.


— Scott Ferguson, Editor, Enterprise Cloud. Follow him on Twitter @zdeferguson.

mendyk 1/17/2017 | 12:10:55 PM
Bring it on
Given the current state of "real" intelligence, we could use all the artificial intelligence we can get.