
The age of artificial intelligence is here

Computers can now learn from their mistakes, a shift that will usher the digital world into a new era in 2014, according to today’s N.Y. Times print edition. The vision of artificial intelligence is becoming real.

The first commercial version of the new kind of computer chip is scheduled to be released in 2014. Not only can it automate tasks that now require painstaking programming — for example, moving a robot’s arm smoothly and efficiently — but it can also sidestep and even tolerate errors, potentially making the term ‘computer crash’ obsolete.

This all points toward the technology that would come when systems are self-aware: systems that perceive their environments and take actions to maximize their chances of success. The new computing approach, already in use by some large technology companies, is based on the biological nervous system, specifically on how neurons react to stimuli and connect with other neurons to interpret information. It allows computers to absorb new information while carrying out a task and to adjust what they do based on the changing signals.
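
To make the contrast with conventional programming concrete, here is a minimal Python sketch of a single neuron-like unit that keeps adjusting its connection weights as signals stream in. The threshold rule, learning rate and data are illustrative assumptions, not the design of any actual chip.

```python
# Minimal sketch (illustrative only): one neuron-like unit that adjusts
# its connection weights as each new stimulus arrives, instead of
# following a fixed, painstakingly written program.

def fire(activation):
    """The unit 'fires' (1) when its weighted input crosses zero."""
    return 1 if activation >= 0 else 0

def online_step(weights, bias, inputs, target, rate=0.1):
    """One perceptron-style update: nudge weights toward the desired response."""
    output = fire(sum(w * x for w, x in zip(weights, inputs)) + bias)
    error = target - output
    weights = [w + rate * error * x for w, x in zip(weights, inputs)]
    return weights, bias + rate * error, output

# The unit absorbs new information while carrying out its task:
weights, bias = [0.0, 0.0], 0.0
stream = [([1, 0], 1), ([0, 1], 0), ([1, 1], 1)]  # (stimulus, desired response)
for inputs, target in stream:
    weights, bias, output = online_step(weights, bias, inputs, target)
    print(inputs, "->", output, "| weights now", weights)
```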

A new generation of artificial intelligence systems will perform some functions that humans do with ease: see, speak, listen, navigate, manipulate and control. That could have enormous consequences for tasks like facial and speech recognition, navigation and planning; the fast-developing biometrics field already spans recognition of facial, iris and palm features as well as voice characteristics.

‘We’re moving from engineering computing systems to something that has many of the characteristics of biological computing,’ said Larry Smarr, an astrophysicist who directs the California Institute for Telecommunications and Information Technology, one of many research centers devoted to developing these new kinds of computer circuits.

Instead of merely being programmed to carry out a series of steps, these systems learn from data. Last year, Google researchers got a machine-learning algorithm, known as a neural network, to perform an identification task without supervision: fed unlabeled images, the network taught itself to recognize cats.
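
The key point is that nobody labels the data. The toy Python sketch below conveys that idea with a simple clustering routine that discovers two groups in unlabeled numbers on its own; Google’s actual system was a vastly larger neural network trained on video frames, so everything here is only an illustrative stand-in.

```python
# Toy unsupervised learning: the routine is never told what the groups
# are; it finds them from structure in the data. (A stand-in for the
# idea only; the real experiment used a large neural network.)

def cluster(points, centers, rounds=10):
    """Simple 1-D k-means: alternately assign points and move centers."""
    for _ in range(rounds):
        groups = [[] for _ in centers]
        for p in points:
            nearest = min(range(len(centers)), key=lambda i: abs(p - centers[i]))
            groups[nearest].append(p)
        centers = [sum(g) / len(g) if g else c for g, c in zip(groups, centers)]
    return centers

data = [1.0, 1.2, 0.8, 9.7, 10.1, 10.3]   # two unlabeled clumps
print(cluster(data, centers=[0.0, 5.0]))  # converges near [1.0, 10.03]
```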

The new approach, used in both hardware and software, is being driven by the explosion of scientific knowledge about the brain. Kwabena Boahen, a computer scientist who leads Stanford’s Brains in Silicon research program, said that this is also its limitation, as scientists are far from fully understanding how brains function.

The design of computers has been dictated by ideas the mathematician John von Neumann originated about 65 years ago. Microprocessors perform operations at lightning speed, following instructions programmed using long strings of 1s and 0s. In this design, both the instructions and the data they act on live in memory.

Data is held in short-term memory while the computer carries out the programmed action; the result is then moved back to its main memory.
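
A small Python sketch of that cycle, with an invented three-instruction program, shows the von Neumann pattern of shuttling data between working storage and main memory; the instruction names are made up for illustration.

```python
# Toy von Neumann-style machine (illustrative only): the processor
# fetches one instruction at a time, works in a short-term register,
# and moves the result back to main memory.

memory = {"a": 2, "b": 3, "result": 0}             # main memory
program = [("LOAD", "a"), ("ADD", "b"), ("STORE", "result")]

accumulator = 0                                    # short-term working storage
for opcode, address in program:                    # fetch, decode, execute
    if opcode == "LOAD":
        accumulator = memory[address]
    elif opcode == "ADD":
        accumulator += memory[address]
    elif opcode == "STORE":
        memory[address] = accumulator              # result moved to main memory

print(memory["result"])  # 5
```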

The new electronic components can be connected by wires that mimic biological synapses. Because they are based on large groups of neuron-like elements, they are known as neuromorphic processors, a term credited to the California Institute of Technology physicist Carver Mead, who pioneered the concept in the late 1980s.

There is no longer simple step-by-step programming. Instead, the connections between the circuits are weighted according to correlations in data that the processor has already learned. Those weights are altered as data flows into the chip, causing some elements to fire a signal and changing the neural network, in essence programming its next actions in much the same way that information alters human thoughts and actions.
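
As a rough sketch of that mechanism, the Python snippet below models a single leaky integrate-and-fire neuron whose input weights are strengthened by a simple Hebbian-style rule whenever it spikes. The threshold, leak, learning rate and spike trains are invented for illustration and do not describe any particular neuromorphic chip.

```python
# Illustrative spiking neuron: incoming signals charge a potential; when
# it crosses a threshold the neuron 'spikes', and the weights of the
# inputs that helped it spike are strengthened (a simple Hebbian rule),
# reprogramming what the network does next.

THRESHOLD, LEAK, RATE = 1.0, 0.9, 0.05

weights = [0.4, 0.3]                    # synapse-like connection strengths
potential = 0.0
spike_trains = [(1, 0), (1, 1), (0, 1), (1, 1)]  # incoming data, step by step

for inputs in spike_trains:
    potential = potential * LEAK + sum(w * s for w, s in zip(weights, inputs))
    if potential >= THRESHOLD:
        print("spike on inputs", inputs)
        weights = [w + RATE * s for w, s in zip(weights, inputs)]
        potential = 0.0                 # reset after firing

print("learned weights:", weights)
```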

‘Instead of bringing data to computation as we do today, we can now bring computation to data,’ said Dharmendra Modha, an I.B.M. computer scientist who leads the company’s cognitive computing research effort. ‘Sensors become the computer, and it opens up a new way to use computer chips that can be everywhere.’

Qualcomm has said that it is coming out in 2014 with a commercial version of its neuromorphic processor, which is expected to be used largely for further development. Many universities are now focused on this new style of computing as well: this fall the National Science Foundation financed a new research center based at the Massachusetts Institute of Technology, involving Harvard and Cornell.

The largest class on campus at Stanford this fall was a graduate-level machine-learning course covering both statistical and biological approaches, taught by the computer scientist Andrew Ng.

Terry Sejnowski, a computational neuroscientist at the Salk Institute who pioneered early biologically inspired algorithms, says, ‘Everyone knows there is something big happening, and they’re trying to find out what it is.’
