Dated: Aug. 13, 2004
The Internet has revolutionized the computer and communications world like nothing before. The invention of the telegraph, telephone, radio, and computer set the stage for this unprecedented integration of capabilities. The Internet is at once a worldwide broadcasting capability, a mechanism for information dissemination, and a medium for collaboration and interaction between individuals and their computers without regard for geographic location.
The Internet represents one of the most successful examples of the benefits of sustained investment and commitment to research and development of information infrastructure. Beginning with the early research in packet switching, the government, industry and academia have been partners in evolving and deploying this exciting new technology.
No one actually owns the Internet, and no single person or organization controls the Internet in its entirety. More of a concept than an actual tangible entity, the Internet relies on a physical infrastructure that connects networks to other networks. There are many organizations, corporations, governments, schools, private citizens and service providers that all own pieces of the infrastructure, but there is no one body that owns it all. There are, however, organizations that oversee and standardize what happens on the Internet and assign IP addresses and domain names, such as the National Science Foundation, the Internet Engineering Task Force, ICANN, InterNIC and the Internet Architecture Board.
The IETF (Internet Engineering Task Force) is the main standards organization for the Internet. The IETF is a large open international community of network designers, operators, vendors, and researchers concerned with the evolution of the Internet architecture and the smooth operation of the Internet. It is open to any interested individual.
The InterNIC is currently an informational Web site established to provide the public with information about domain name registration.
ICANN is short for the Internet Corporation for Assigned Names and Numbers, a nonprofit organization that has assumed responsibility for the IP address space allocation, protocol parameter assignment, domain name system management and root server system management functions previously performed under U.S. Government contract.
Internet Architecture Board is a technical advisory group of the Internet Society, whose responsibilities include:
- Oversee the Internet Engineering Task Force (IETF)
- Oversee the Internet standards process
- Publish and manage Requests for Comments (RFCs)
InterNIC is responsible for assigning address classes to different organizations according to their number of hosts.
A Brief History of the Internet
The U.S. Department of Defense laid the foundation of the Internet roughly 30 years ago with a network called ARPANET. But the general public didn't use the Internet much until after the development of the World Wide Web in the early 1990s. As recently as June 1993, there were only 130 Web sites; now there are many millions.
ARPANET was a large wide-area network created by the United States Defense Advanced Research Projects Agency (ARPA). Established in 1969, ARPANET served as a test bed for new networking technologies, linking many universities and research centers. The first two nodes that formed the ARPANET were UCLA and the Stanford Research Institute, followed shortly thereafter by the University of Utah.
From ARPANET to Internet
In October 1972 ARPANET went 'public'. At the First International Conference on Computers and Communication, held in Washington DC, ARPA scientists demonstrated the system in operation, linking computers together from 40 different locations. This stimulated further research in the scientific community throughout the Western world. Soon other networks would appear. The Washington conference also set up an Internetworking Working Group (IWG) to coordinate the research taking place. Meanwhile ARPA scientists worked on refining the system and expanding its capabilities:
In 1972, they successfully employed a new program to allow the sending of messages over the net, allowing direct person-to-person communication that we now refer to as e-mail. This development will be dealt with at length in the next section.
Also in the early 70s, scientists developed host-to-host protocols. Before then the system only allowed a 'remote terminal' to access the files of each separate host.
In 1974, ARPA scientists, working closely with experts at Stanford, developed a common language that would allow different networks to communicate with each other. This was known as the Transmission Control Protocol/Internet Protocol (TCP/IP).
The development of TCP/IP marked a crucial stage in networking development, and it is important to reflect on the implications inherent in the design concepts, since it could all have turned out very differently. Although 1974 marked the beginning of TCP/IP, it would take several years of modification and redesign before it was completed and universally adopted.
Meanwhile computer networking developed apace. In 1974 BBN opened up Telenet, the first openly accessible public 'packet data service' (a commercial version of ARPANET). In the 1970s the US Department of Energy established MFENet for researchers into Magnetic Fusion Energy, which spawned HEPNet devoted to High Energy Physics. This inspired NASA physicists to establish SPAN for space physicists. In 1976 a Unix-to-Unix copy protocol (UUCP) was developed by AT&T Bell Laboratories and was freely distributed to all UNIX computer users (since UNIX was the main operating system employed by universities, this opened up networking to the broader academic community).
History of Usenet/Newsgroups
In 1979 Usenet was established, an open system focusing on e-mail-style communication and devoted to 'newsgroups', and it is still thriving today.
Usenet came into being in late 1979, shortly after the release of V7 Unix with UUCP. Two Duke University grad students in North Carolina, Tom Truscott and Jim Ellis, thought of hooking computers together to exchange information with the UNIX community. Steve Bellovin, a grad student at the University of North Carolina, put together the first version of the news software using shell scripts and installed it on the first two sites: unc and duke.
At the beginning of 1980 the network consisted of those two sites and PHS (another machine at Duke), and was described at the January Usenet conference. Steve Bellovin later rewrote the scripts into C programs, but they were never released beyond unc and duke. Shortly thereafter, Steve Daniel did another implementation in C for public distribution. Tom Truscott made further modifications, and this became the "A" news release.
In 1981 at U.C. Berkeley, grad student Mark Horton and high school student Matt Glickman rewrote the news software to add functionality and to cope with the ever-increasing volume of news -- "A" News was intended for only a few articles per group per day. This rewrite was the "B" News version. The first public release was version 2.1 in 1982; the 1.* versions were all beta test. As the net grew, the news software was expanded and modified. The last version maintained and released primarily by Mark was 2.10.1.
Rick Adams, at the Center for Seismic Studies, took over coordination of the maintenance and enhancement of the B News software with the 2.10.2 release in 1984. By this time, the increasing volume of news was becoming a concern, and the mechanism for moderated groups was added to the software at 2.10.2. Moderated groups were inspired by ARPA mailing lists and experience with other bulletin board systems.
In March 1986 a package was released implementing news transmission, posting, and reading using the Network News Transfer Protocol (NNTP), as specified in RFC 977. This protocol allows hosts to exchange articles via TCP/IP connections rather than using the traditional UUCP. It also permits users to read and post news (using a modified news user agent) from machines which cannot, or choose not to, install the USENET news software. Reading and posting are done using TCP/IP messages to a server host which does run the USENET software. Sites with many workstations, such as Sun, SGI and HP products, find this a convenient way to allow workstation users to read news without having to store articles on each system. Many of the Usenet hosts that are also on the Internet exchange news articles using NNTP, because the load impact of NNTP is much lower than UUCP (and NNTP ensures much faster propagation).
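The NNTP exchange described above is simple line-oriented text over a TCP connection: the client sends a command line, the server answers with a numeric status code followed by descriptive text. The sketch below (in Python, which is not part of the original package) illustrates that framing; the GROUP command is a real NNTP verb, but the newsgroup name and sample reply are invented for illustration.

```python
def make_command(verb, *args):
    """Format an NNTP command line; commands are plain text ending in CRLF."""
    parts = [verb] + [str(a) for a in args]
    return " ".join(parts) + "\r\n"

def parse_response(line):
    """Split a server reply such as '211 1234 3000 4234 comp.lang.c'
    into a numeric status code and the remaining text."""
    code, _, rest = line.strip().partition(" ")
    return int(code), rest

# A newsreader would open a TCP connection to port 119 and exchange lines:
cmd = make_command("GROUP", "comp.lang.c")   # select a newsgroup
status, info = parse_response("211 1234 3000 4234 comp.lang.c")
# status 211 means the group was selected; info carries the article counts
```

Because every command and reply is readable text, the same pattern works from any machine with a TCP/IP stack, which is exactly why NNTP let workstations read news without installing the full USENET software.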
NNTP grew out of independent work in 1984-1985 by Brian Kantor at U.C. San Diego and Phil Lapsley at U.C. Berkeley. NNTP includes support for System V UNIX with Excelan Ethernet cards and DECNET under Ultrix. NNTP was developed at U.C. Berkeley by Phil Lapsley with help from Erik Fair, Steven Grady, and Mike Meyer, among others. The NNTP package was distributed on the 4.3BSD release tape (although that was version 1.2a and out of date) and is also available from the various authors, many major hosts, and by anonymous FTP from lib.tmc.edu, mthvax.cs.miami.edu and ftp.uu.net.
One new version of news, known as C News, was developed at the University of Toronto by Geoff Collyer and Henry Spencer. This version is a rewrite of the lowest levels of news to increase article processing speed, decrease article expiration processing and improve the reliability of the news system through better locking, etc. The package was released to the net in the autumn of 1987. For more information, see the paper News Need Not Be Slow, published in the Winter 1987 Usenet Technical Conference Proceedings. The most recent version of C News is the 20 Feb 1993 "performance release." C News can be obtained from its official archive site, cs.toronto.edu, using FTP.
Another Usenet system, known as InterNetNews, or INN, was written by Rich Salz. INN is designed to run on UNIX hosts that have a socket interface. It is optimized for larger hosts where most traffic uses NNTP, but it does provide full UUCP support. INN is very fast, and since it integrates NNTP many people find it easier to administer only one package. The package was publicly released on August 20, 1992. For more information, see the paper InterNetNews: Usenet Transport for Internet Sites, published in the June 1992 Usenet Technical Conference Proceedings. INN can be obtained from many places; its official archive site is ftp.uu.net in the directory networking/news/nntp/inn. The current version is 1.4sec, last released 22 December 1993.
ANU-NEWS is a news package written by Geoff Huston of Australia for VMS systems. ANU-NEWS is a complete news system that allows reading, posting, direct replies, moderated newsgroups, etc. in a fashion closely related to regular news. The implementation includes the RFC 1036 news propagation algorithms and integrated use of the NNTP protocols to support remote news servers, implemented as a VAX/VMS DECnet object. An RFC 977 server implemented as a DECnet object is also included. ANU-NEWS currently includes support for the TCP/IP protocols. The ANU-NEWS interface is similar to standard DEC screen-oriented systems. The license for the software is free, and there are no restrictions on redistribution.
In 1981 Bitnet was developed at the City University of New York to link university scientists using IBM computers, regardless of discipline, in the Eastern US. CSNET, funded by the US National Science Foundation, was established to facilitate communication for computer scientists in universities, industry and government. In 1982 a European version of the Unix network, EUnet, was established, linking networks in the UK, Scandinavia and the Netherlands, followed in 1984 by a European version of Bitnet, known as EARN (European Academic and Research Network).
ARPANET remained the backbone of the entire system. When it finally adopted TCP/IP at the start of 1983, the Internet was born: a connected set of networks using the TCP/IP standard.
From Internet to World Wide Web
One early and essential development was the introduction in 1984 of the Domain Name System (DNS). Until then each host computer had been assigned a name, and there was a single integrated list of names and addresses that could easily be consulted. The new system introduced some tiering into US Internet addresses, such as edu (educational), com (commercial) and gov (governmental), in addition to org (organization) and a series of country codes. This made the names of host computers easier to remember.
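The tiering described above means a hostname reads from the most specific label on the left to the top-level domain on the right. A small Python sketch of that structure (the hostname is invented for illustration):

```python
def domain_levels(hostname):
    """Return the dot-separated labels of a hostname,
    ordered from most general (the top-level domain) to most specific."""
    return list(reversed(hostname.lower().split(".")))

levels = domain_levels("www.cs.example.edu")
# levels[0] is the top-level domain ('edu'); each later label
# narrows the name down: institution, department, then host
```

This hierarchy is also what lets DNS delegate responsibility: the authority for edu need only know who runs example.edu, not every host inside it.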
A second development was the decision by national governments to encourage the use of the Internet throughout the higher educational system, regardless of discipline. In 1984 the British government announced the construction of JANET (Joint Academic Network) to serve British universities, but more important was the decision, the following year, of the US National Science Foundation to establish NSFNet for the same purpose; the use of TCP/IP protocols was mandatory for all participants in the program.
NSFNet signed shared-infrastructure 'no-metered-cost' agreements with other scientific networks (including ARPANET), which formed the model for all subsequent agreements.
Finally, NSFNet agreed to provide the 'backbone' for the US Internet service, and provided five 'supercomputers' to service the envisaged traffic. The first computers provided a network capacity of 56,000 bits per second, but the capacity was upgraded in 1988 to 1,544,000 bits per second (a T1 line).
The effect of the creation of NSFNet was dramatic. In the first place it broke the capacity bottleneck in the system. Secondly, it encouraged a surge in Internet use. It had taken a decade for the number of computer hosts attached to 'the Net' to top the thousand mark. By 1986 the number of hosts had reached 5,000, and a year later the figure had climbed to 28,000 hosts.
Although commercial exploitation of the net had started, the expansion of the Internet continued to be driven by the government and academic communities. It was also becoming ever more international. By 1989 the number of hosts surpassed 100,000 for the first time and had climbed to 300,000 a year later. The end of the 1980s and the start of the 1990s provide a convenient cut-off point for several reasons:
In 1990 ARPANET (which had been stripped of its military research functions in 1983) became a victim of its own success. The network had been reduced to a pale shadow of its former self and was wound up.
In 1990, the first Internet search-engine for finding and retrieving computer files, Archie, was developed at McGill University, Montreal. The development of search-engines will be dealt with in the last lecture.
In 1991, the NSF removed its restriction on private access to its backbone computers.
"Information superhighway" project came into being. This was the name given to popularize Al Gore's HighPerformance Computing Act which provided funds forfurther research into computing and improving the infrastructure of the Internet's (US) structure. Its largest provisions from 1992-96 were $1,500 millions for the NSF, $600 millions for NASA and $660 for the Department of Energy.
The Difference Between the Two
Many people use the terms Internet and World Wide Web interchangeably, but in fact the two terms are not synonymous. The Internet and the Web are two separate but related things.
The Internet is a massive network of networks, a networking infrastructure. It connects millions of computers together globally, forming a network in which any computer can communicate with any other computer as long as they are both connected to the Internet. Information that travels over the Internet does so via a variety of languages known as protocols.
The World Wide Web, or simply the Web, is a way of accessing information over the medium of the Internet. It is an information-sharing model that is built on top of the Internet. The Web uses the HTTP protocol, only one of the languages spoken over the Internet, to transmit data. Web services, which use HTTP to allow applications to communicate in order to exchange business logic, use the Web to share information. The Web also utilizes browsers, such as Internet Explorer or Netscape, to access Web documents called Web pages that are linked to each other via hyperlinks. Web documents also contain graphics, sounds, text and video.
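The HTTP exchange the paragraph above describes is, at bottom, plain text: the browser sends a request naming a document, and the server answers with a status line followed by the page. A minimal Python sketch of that framing (the host and path are made up for illustration; real browsers add many more headers):

```python
def build_get(host, path="/"):
    """Compose an HTTP/1.0-style GET request as plain text.
    A blank line marks the end of the headers."""
    return (f"GET {path} HTTP/1.0\r\n"
            f"Host: {host}\r\n"
            "\r\n")

def parse_status(status_line):
    """Parse a reply line such as 'HTTP/1.0 200 OK'
    into (protocol version, numeric code, reason phrase)."""
    version, code, reason = status_line.strip().split(" ", 2)
    return version, int(code), reason

request = build_get("www.example.com", "/index.html")
# Sent over a TCP connection to port 80, this would elicit a reply
# beginning with a status line like 'HTTP/1.0 200 OK'
```

The hyperlinks in a returned page are simply further (host, path) pairs, so following a link is just composing and sending another request like this one.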
The Web is just one of the ways that information can be disseminated over the Internet. The Internet, not the Web, is also used for e-mail (which relies on SMTP), Usenet newsgroups, instant messaging and FTP. So the Web is just a portion of the Internet, albeit a large portion, but the two terms are not synonymous and should not be confused.
History of World Wide Web
The World Wide Web was proposed by Tim Berners-Lee at the European Laboratory for Particle Physics (CERN) in Geneva, Switzerland, in 1989. 1993 was the year of Mosaic, the first graphical Web browser. Mosaic was developed at the National Center for Supercomputing Applications (NCSA) at the University of Illinois. The free distribution of Mosaic over the Internet brought huge public attention to the World Wide Web.
In 1994, Marc Andreessen, one of the developers of Mosaic, left NCSA, co-founded Netscape Communications Corp., and released Netscape Navigator, a graphical Web browser, in October 1994. The freely distributed Netscape Navigator for UNIX, Windows and Macintosh OS brought worldwide public interest to the Internet and the Web. It marked the beginning of the Internet business era.
In 1995, Microsoft stepped into the Web browser market, releasing Internet Explorer version 1.0.
From that time on, the Web browser war began. The Internet gold rush began; the e-commerce era began.
Tim Berners-Lee: Father of the Web
The World Wide Web came into being in 1991, thanks to developer Tim Berners-Lee and others at the European Laboratory for Particle Physics, also known as the Conseil Européen pour la Recherche Nucléaire (CERN). The CERN team created the protocol based on hypertext that makes it possible to connect content on the Web with hyperlinks. Berners-Lee now directs the World Wide Web Consortium (W3C), a group of industry and university representatives that oversees the standards of Web technology.
Early on, the Internet was limited to noncommercial uses because its backbone was provided largely by the National Science Foundation, the National Aeronautics and Space Administration, and the U.S. Department of Energy, and funding came from the government. But as independent networks began to spring up, users could access commercial Web sites without using the government-funded network. By the end of 1992, the first commercial online service provider, Delphi, offered full Internet access to its subscribers, and several other providers followed.
In June 1993, the Web boasted just 130 sites. A year later, the number had risen to nearly 3,000. As of April 1998, there were more than 2.2 million sites on the Web.
History of Email
Email is far and away the most popular application on the Internet. Just about everyone uses email, and people generally use it all the time.
It all began in 1968 with a company called Bolt Beranek and Newman (BBN). This firm was hired by the United States Defense Department to create something called the ARPANET, which later became the Internet. ARPANET stands for Advanced Research Projects Agency Network, and its purpose was to create a means by which military and educational institutions could communicate with each other.
In 1971, an engineer named Ray Tomlinson was assigned to a project called SNDMSG. This program was not new; in fact it had existed for a number of years. By today's standards it was more than primitive. All it did was allow users on the same machine to send messages to each other. Users could create text files which would then be delivered to mailboxes on the same machine.
A mailbox was simply a text file which could have additional text added to the end. Data could be added, but not deleted or changed. The name of the mailbox was the name of the text file.
Ray was assigned to make this simple application do a little bit more. As it turned out, he had been working on something called CPYNET, which was intended to transfer files between computers within the ARPANET. "The idea occurred to me that CPYNET could append material to a mailbox file as readily as SNDMSG could," said Ray.
So he modified CPYNET to perform one additional task: appending to a file. This was pretty simple, and the change was quickly made.
After that, Ray made a decision which changed history. He created the format of the email address. He defined it as a mailbox name, the @ sign, and the machine's node name. He used the @ sign because "it seemed to make sense. I used the @ sign to indicate that the user was 'at' some other host rather than being local."
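Tomlinson's format is simple enough to capture in a few lines: everything before the @ is the mailbox, everything after it is the host. A small Python sketch of that split (the sample address is invented for illustration, not the lost first message):

```python
def split_address(address):
    """Split 'mailbox@host' into its two parts,
    following the format Tomlinson defined."""
    mailbox, sep, host = address.partition("@")
    if not sep or not mailbox or not host:
        raise ValueError(f"not a valid address: {address!r}")
    return mailbox, host

mailbox, host = split_address("tomlinson@bbn-tenexa")
# mailbox names the user's file; host names the machine to deliver to
```

The same two-part reading is what mail software still performs first on every address: route to the host, then append to the named mailbox.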
He sent himself a message, the contents of which have been lost in time. The first email message was unceremoniously sent between two PDP-10 nodes of the ARPANET network. History had been made.
Email usage grew quickly. In fact, a study two years later found that 75% of all ARPANET traffic was email.
One of the first big email programs available to the general public (at least the first major one to catch on) was Eudora. This email client was first written in 1988 by Steve Dorner. At the time he was an employee at the University of Illinois.
Eudora was named for the late American author Eudora Welty. Eudora was the first email client which provided a graphical interface. It was free when it first came out, although once it was purchased by Qualcomm in 1994 it became a commercial product.
Like most applications on the web, Eudora was king for a few years, then quickly supplanted by the email clients that came with Netscape and Internet Explorer. Both email clients became popular not because they were better than Eudora, but because they were provided for free with the web browser.
According to a recent report by Forrester Research, more than half of all Americans use email for an average of half an hour each day. They claim a total of 87 million Americans are active email users.
The History of the @ Sign
In 1971, Ray Tomlinson sent the first electronic message, now known as e-mail, using the @ symbol to indicate the location or institution of the e-mail recipient. Tomlinson, using a Model 33 Teletype device, understood that he needed to use a symbol that would not appear in anyone's name, so that there was no confusion. The logical choice for Tomlinson was the "at sign," both because it was unlikely to appear in anyone's name and also because it represented the word "at," as in a particular user is sitting @ this specific computer.
However, before the symbol became a standard key on typewriter keyboards in the 1880s and a standard on QWERTY keyboards in the 1940s, the @ sign had a long if somewhat sketchy history of use throughout the world. Linguists are divided as to when the symbol first appeared. Some argue that the symbol dates back to the 6th or 7th century, when Latin scribes adapted the symbol from the Latin word ad, meaning at, to or toward. The scribes, in an attempt to reduce the number of pen strokes they were using, created the ligature (a combination of two or more letters) by exaggerating the upstroke of the letter "d" and curving it to the left over the "a."
Other linguists argue that the @ sign is a more recent development, appearing sometime in the 18th century as a symbol used in commerce to indicate price per unit, as in 2 chickens @ 10 pence. While these theories are largely speculative, in 2000 Giorgio Stabile, a professor of the history of science at La Sapienza University in Italy, discovered some original 14th-century documents clearly marked with the @ sign to indicate a measure of quantity - the amphora, meaning jar. The amphora was a standard-sized terracotta vessel used to carry wine and grain among merchants, and, according to Stabile, the use of the @ symbol (an upper-case "A" embellished in the typical Florentine script) in trade led to its contemporary meaning of "at the price of."
While in the English language @ is referred to as the "at sign," other countries have different names for the symbol that is now so commonly used in e-mail transmissions throughout the world. Many of these countries associate the symbol with either food or animal names.