
Dated: Jul. 04, 2011


Just like their human creators, computers can be broken down into a series of generations, each with its own distinct technology and feature set. At present, most computer scientists divide computing into four generations, with an exciting fifth generation on the horizon. This new generation, which takes the form of artificial intelligence, will be unlike its predecessors: notable more for its capabilities than for its hardware.

Earlier Generations of Computing

The first generation of computing is generally thought of as the "vacuum tube era." These computers used large vacuum tubes as their circuits and magnetic drums as their memory. The first generation lasted sixteen years, from 1940 to 1956.

Second-generation computing was characterized by a switch from vacuum tubes to transistors, which brought a significant decrease in the size of computing devices. Transistors remained the dominant technology until 1963, when integrated circuits supplanted them.

During the third generation of computing, between 1964 and 1971, integrated circuits increased the speed and efficiency of computers by leaps and bounds. These circuits placed miniaturized transistors, much smaller than the traditional transistors found in earlier computers, onto a single silicon chip.

In 1971, computing hit the big time: the microprocessor. Microprocessors can be found in every computing device today, from desktops and laptops to tablets and smartphones. They pack thousands of integrated circuits onto a single chip. Their parts are microscopic, allowing one small processor to handle many tasks at once.

The Fifth Generation of Computing

While the microprocessor has revolutionized the computing industry, the fifth generation of computing looks to turn the whole industry on its head once again. This fifth generation is called "artificial intelligence," and the goal of computer scientists and developers is to eventually create computers that outsmart, outwit, and maybe even outlast their human inventors.

Artificial intelligence can be broken into five distinct categories: game playing, robotics, expert systems, neural networks, and natural language. Each of these categories is being developed largely independently of the others; game playing, for instance, has seen great success over the past 15 years, while natural language has taken longer to fully develop and perfect.

Game Playing

No longer is computing simply a person playing a game alone on their computer; computers can now play along, and possibly win. One of the biggest breakthroughs in artificial intelligence came in 1997, when IBM's Deep Blue beat the reigning world chess champion, Garry Kasparov, at his own game. It was the first time a computer had beaten a reigning world champion in a match played under standard tournament conditions.

In 2011, IBM introduced "Watson" to Jeopardy! viewers in the United States. The event was designed as a test of the company's newest artificial intelligence technology, which can interpret human language and use logic to find the answers to common questions and trivia. Watson made a few minor errors when interpreting clues, but still managed to beat all of its opponents on the trivia-heavy game show, including the show's longest-running champion, Ken Jennings.
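
To make the game-playing idea concrete, here is a minimal Python sketch of minimax search, the classic technique behind computer chess programs. Deep Blue's actual search was vastly more sophisticated, so treat this as an illustration of the principle rather than IBM's method. Tic-tac-toe is small enough to search exhaustively:

    WIN_LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),
                 (0, 3, 6), (1, 4, 7), (2, 5, 8),
                 (0, 4, 8), (2, 4, 6)]

    def winner(board):
        # return 'X' or 'O' if a line is complete, else None
        for a, b, c in WIN_LINES:
            if board[a] != ' ' and board[a] == board[b] == board[c]:
                return board[a]
        return None

    def minimax(board, player):
        # score a position from X's point of view:
        # +1 = X wins, -1 = O wins, 0 = draw; also return the best move
        w = winner(board)
        if w:
            return (1, None) if w == 'X' else (-1, None)
        moves = [i for i, cell in enumerate(board) if cell == ' ']
        if not moves:
            return 0, None                      # board full: draw
        best_score, best_move = None, None
        for m in moves:
            board[m] = player                   # try the move...
            score, _ = minimax(board, 'O' if player == 'X' else 'X')
            board[m] = ' '                      # ...then undo it
            if best_score is None or \
               (player == 'X' and score > best_score) or \
               (player == 'O' and score < best_score):
                best_score, best_move = score, m
        return best_score, best_move

    score, move = minimax([' '] * 9, 'X')
    print(score, move)   # 0 0 -- with perfect play, tic-tac-toe is a draw

The same idea, looking ahead through possible moves while assuming the opponent replies as well as possible, scales to chess once the search is pruned and guided by evaluation heuristics.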

Neural Networks

A neural network tries to reproduce the connections and thought processes of human or animal brains, and it is one of the hottest areas of fifth-generation computing. This is, in fact, part of the secret behind IBM's Watson: a system that could largely understand language and do enough research to answer questions.

These neural networks are also becoming important in much smaller applications, such as the voice recognition feature on many current personal computers and mobile phones.
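
For a concrete, toy-sized example, here is a single artificial neuron trained with the classic perceptron learning rule to compute the logical OR function. Real networks, such as those behind voice recognition, chain together vast numbers of such units, so this is a sketch of the principle only:

    def step(x):
        # threshold activation: "fire" when the weighted input is non-negative
        return 1 if x >= 0 else 0

    # training data for logical OR: (input1, input2) -> expected output
    data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]

    w1, w2, bias = 0.0, 0.0, 0.0
    rate = 0.1                     # learning rate

    for epoch in range(20):
        for (x1, x2), target in data:
            output = step(w1 * x1 + w2 * x2 + bias)
            error = target - output
            # nudge each weight toward producing the right answer
            w1 += rate * error * x1
            w2 += rate * error * x2
            bias += rate * error

    for (x1, x2), target in data:
        print((x1, x2), '->', step(w1 * x1 + w2 * x2 + bias))

After a few passes over the data, the weights settle and the neuron answers all four cases correctly; "learning" here is nothing more than repeated small corrections.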

Natural Language

This is often considered one of the "holy grails" of artificial intelligence. Currently, the kind of voice recognition that is available to consumers falls more under the category of "dictation" than "conversation." That's because the computer can hear the words and transcribe them into text, but it doesn't really have the ability to understand their meaning or their context.

Likewise, natural language is currently limited to a single tone of voice, and most artificial intelligence computing devices can't distinguish between a softly spoken sentence and an angry sentence screamed at them at high volume.
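
A short Python sketch illustrates the dictation-versus-conversation gap. The keyword matcher below is a hypothetical toy, not any real product's approach; it can spot words in transcribed text but has no grasp of meaning, context, or tone:

    def naive_intent(text):
        # crude keyword matching: no grammar, no context, no tone
        text = text.lower()
        if 'weather' in text:
            return 'weather_query'
        if 'time' in text:
            return 'time_query'
        return 'unknown'

    print(naive_intent("What's the weather like today?"))    # weather_query
    print(naive_intent("It's about time you fixed this!"))   # time_query
    # The second result is wrong: the word "time" appears, but the sentence
    # is an exasperated complaint, not a question. That is exactly the gap
    # between transcribing words and understanding them.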

Expert Systems

We've all been victims of so-called "human error," whether at the doctor's office, the bank, or even while driving to one of those places. Increasingly, researchers are looking to artificial intelligence as a fail-safe way of diagnosing patients and carrying out other everyday human tasks.

These so-called expert systems can help people make the right decision in a tough environment. Not only can they store much more information than the human brain and have it more readily available, but their conclusions are not clouded by biases and other purely human errors in judgment. Expert systems are quite black and white, quite robotic, and artificial intelligence developers hope they will prove better at decision-making and diagnosing problems than their human counterparts.
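
Here is a minimal sketch of how such a rule-based expert system can work, using invented illustrative rules rather than real diagnostic criteria. A fact set either satisfies a rule or it doesn't, which is exactly the black-and-white behavior described above:

    # each rule: (set of required facts, conclusion to add)
    RULES = [
        ({'fever', 'cough'}, 'possible_flu'),
        ({'possible_flu', 'short_of_breath'}, 'see_doctor'),
    ]

    def forward_chain(facts):
        # apply every rule repeatedly until no new conclusions appear
        facts = set(facts)
        changed = True
        while changed:
            changed = False
            for conditions, conclusion in RULES:
                if conditions <= facts and conclusion not in facts:
                    facts.add(conclusion)
                    changed = True
        return facts

    print(sorted(forward_chain({'fever', 'cough', 'short_of_breath'})))
    # ['cough', 'fever', 'possible_flu', 'see_doctor', 'short_of_breath']

Note how one conclusion ('possible_flu') feeds the next rule; chaining simple rules like this is how an expert system reaches judgments a single rule couldn't express.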

Robotics

This might be the most popular area of artificial intelligence among those who are not familiar with more advanced concepts like "neural networks" or "expert systems." But these aren't the vacuum-cleaning robots of late-night infomercials. Robotics in the realm of artificial intelligence is about creating robots that can experience, and react to, external stimuli, just like their human counterparts.

That means these robots will be able to lead semi-autonomous lives, aware of their surroundings and able to independently modify their behavior based on their environment. It's one of the most promising, and most difficult, areas of artificial intelligence.
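
As a minimal sketch of the underlying idea, the reactive sense-act loop below maps hypothetical sensor readings to behavior; real robot control systems are, of course, far more involved:

    def choose_action(distance_cm, light_level):
        # map stimuli to behavior: obstacle avoidance overrides light-seeking
        if distance_cm < 20:
            return 'turn_away'       # obstacle too close
        if light_level > 0.7:
            return 'move_forward'    # bright ahead: head toward the light
        return 'wander'              # nothing interesting: explore

    # simulated stream of (distance, light) sensor readings
    readings = [(100, 0.2), (15, 0.9), (60, 0.8), (200, 0.1)]
    for distance, light in readings:
        print((distance, light), '->', choose_action(distance, light))

Priority-ordered reflexes like these are the simplest way a robot can modify its behavior based on its environment without any central plan.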
