
The World Wide Web (WWW): history and technologies

Hello, dear readers of the blog. We all live in the era of the global Internet and use the terms site, web, and WWW (World Wide Web) quite often, without really going into what they mean.

I observe the same thing in other authors, and even in ordinary conversation. “Site”, “Internet”, “network” and the abbreviation “WWW” have become such common words that it never even occurs to us to think about their essence. Yet the first website was born only some twenty years ago.

The Internet itself has a rather long history, yet before the advent of the global web (WWW), 99.9% of the planet's inhabitants did not even suspect its existence, because it was the preserve of specialists and enthusiasts. Now even the Eskimos know about the World Wide Web; in their language the word is likened to the shamans' ability to find answers in the layers of the universe. So let's work out for ourselves what the Internet, websites, the World Wide Web, and everything else actually are.

What is the Internet, and how does it differ from the World Wide Web?

The most remarkable fact that can be stated today is that the Internet has no owner. In essence, it is an association of individual local networks (held together by the common standards adopted long ago, above all the TCP/IP protocol stack), kept in working order by network providers.

It is sometimes claimed that, because of ever-growing media traffic (video and other heavy content moving across the network in tons), the Internet will soon collapse under its limited bandwidth. The main difficulty here is upgrading the network equipment that makes up the global web to faster hardware, which is held back above all by the extra cost involved. But I think the problem will be solved as the threat of collapse matures, and individual segments of the network already operate at high speeds.

Also, given that the Internet essentially belongs to no one, it is worth mentioning that many states, trying to impose censorship on the global network, want to equate it (or rather its currently most popular component, the WWW) with the mass media.

There is actually no basis for this, because the Internet is just a means of communication, in other words, a carrier of information comparable to the telephone or even plain paper. Try applying sanctions to paper or to its distribution around the planet. In practice, individual states can apply sanctions only to particular sites (islands of information on the network) that become available to users via the World Wide Web.

The first steps toward creating the global web and the Internet were taken... in what year, do you think? Surprisingly, back in the distant days of 1957. Naturally, it was the military (and, naturally, in the United States, where would we be without them) who needed such a network for communication in the event of military operations involving nuclear weapons. It took quite a long time to build the network (about 12 years), which is understandable given that computers were then in their infancy.

Nevertheless, their power was quite enough to connect the military departments and the leading US universities into a network by 1971. E-mail thus became the first way ordinary users put the Internet to work. After a couple more years, people overseas already knew what the Internet was. By the beginning of the 80s the main data transfer protocols had been standardized (mail among them), and the protocol of the so-called Usenet newsgroups appeared, which worked much like mail but made it possible to organize something resembling forums.

A few years later the idea of a domain name system (DNS) appeared (it would play a vital role in the formation of the WWW), along with the world's first protocol for real-time communication over the Internet: IRC (in colloquial Russian, “irka”). It let you chat online. Science fiction, accessible and interesting to only a very small number of the inhabitants of planet Earth. But only for the time being.

At the turn of the 80s and 90s, events so significant took place in the history of the Internet's development that they effectively predetermined its fate. This spread of the global network into the minds of the planet's modern inhabitants is owed almost entirely to a single person: Tim Berners-Lee.

Berners-Lee is an Englishman, born into a family of two mathematicians who dedicated their lives to creating one of the world's first computers. It is thanks to him that the world learned what the Internet, websites, e-mail and so on are. He originally created the World Wide Web (WWW) for the needs of nuclear research at CERN (home of the famous collider). The task was to place all the scientific information available to the organization conveniently within its own network.

To solve this problem he came up with everything that now forms the fundamental elements of the WWW (which we take for the Internet without quite grasping its essence). As a basis he took a principle of organizing information called hypertext. What is it? The principle had been invented long before and consists of arranging text so that the linearity of the narrative is replaced by the ability to move along different links (connections).

The Internet is hypertext, hyperlinks, URLs and hardware

Thanks to this, hypertext can be read in different orders, each yielding its own variant of linear text (this should be clear and obvious to you now, as experienced Internet users, but back then it was a revolution). The role of hypertext nodes was to be played by what we now simply call hyperlinks.

As a result, all the information that now exists on computers can be represented as one vast hypertext containing countless nodes (hyperlinks). Everything Tim Berners-Lee developed was transferred from CERN's local network to what we now call the Internet, after which the Web began to gain popularity at breakneck speed (its first fifty million users registered within the first five years of its existence).

But to implement the principle of hypertext and hyperlinks, several things had to be created and developed from scratch. First, a new data transfer protocol was needed, which is now known to all of you as the HTTP protocol (at the beginning of every website address you will find a mention of it or of its secure version, HTTPS).
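At its core, HTTP is plain readable text. Here is a minimal sketch (with a hypothetical host and path) of what a browser sends and what a server answers, and how the reply splits into a status line, headers and a body:

```python
# A minimal sketch of an HTTP/1.1 exchange for the hypothetical
# address http://example.com/index.html.
request = (
    "GET /index.html HTTP/1.1\r\n"
    "Host: example.com\r\n"
    "Connection: close\r\n"
    "\r\n"                      # a blank line ends the header block
)

# A typical (canned) server reply; in reality it comes back over the
# same TCP connection the request was sent on.
response = (
    "HTTP/1.1 200 OK\r\n"
    "Content-Type: text/html\r\n"
    "\r\n"
    "<html><body>Hello</body></html>"
)

# Split the reply into its parts: status line, headers, body.
head, _, body = response.partition("\r\n\r\n")
status_line = head.split("\r\n")[0]
version, code, reason = status_line.split(" ", 2)

print(code, reason)   # -> 200 OK
print(body)           # -> <html><body>Hello</body></html>
```

The browser inspects the status code (200 means success) and then renders the body.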

Second, the HTML markup language was developed from scratch, an abbreviation now familiar to every webmaster in the world. So we have tools for transferring data and for creating websites (sets of web pages, or web documents). But how does one refer to these documents?

For that, identifiers were needed: the URI and the URL. The first made it possible to identify a document within a single server (site), while the second mixed into the identifier a domain name (clearly indicating which server's website the document belongs to) or an IP address (a unique numeric identifier of every device on a global or local network).
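A URL therefore carries several distinct pieces of information at once. The standard library's `urllib.parse` can take one apart; the address below is hypothetical, but any well-formed URL decomposes the same way:

```python
from urllib.parse import urlsplit

# A hypothetical address used only for illustration.
url = "https://example.com/articles/www.html?lang=en#history"

parts = urlsplit(url)
print(parts.scheme)    # the protocol: https
print(parts.netloc)    # the domain name (resolved to an IP via DNS): example.com
print(parts.path)      # the document's location on that server: /articles/www.html
print(parts.query)     # extra parameters: lang=en
print(parts.fragment)  # a position inside the page: history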

There is only one step left to take for the World Wide Web to finally work and become in demand by users. Do you know which one?

Well, of course, a program was needed that could display on the user's computer the contents of any web page requested on the Internet (by its URL address). That program became the browser. As for today, there are not so many major players left on this market, and I have managed to write about all of them in a short review:

  1. Internet Explorer (IE, MSIE) - the old guard, still in service
  2. Mozilla Firefox - another veteran, not about to give up its positions without a fight
  3. Google Chrome - an ambitious newcomer that managed to take the lead in the shortest possible time
  4. Opera - a browser beloved by many in RuNet, but gradually losing popularity
  5. Safari - a browser from the Apple stable

Timothy John Berners-Lee wrote the program for the world's first Internet browser himself and called it, without further ado, WorldWideWeb. Although it was far from perfect, it was with this browser that the victorious march of the World Wide Web (WWW) across the planet began.

In general, of course, it is striking that all the necessary tools for the modern Internet (meaning its most popular component) were created by just one person in such a short time. Bravo.

A little later the first graphical browser, Mosaic, appeared, from which many modern browsers (Mozilla and Explorer among them) descend. It was Mosaic that became the missing drop needed to spark interest in the Internet (namely in the World Wide Web) among the ordinary residents of planet Earth. A graphical browser is quite a different thing from a text one: everyone loves looking at pictures, and only a few love to read.

What is noteworthy is that Berners-Lee never received the fabulously large sums that some later figures of the network did, although he arguably did more for the global network than any of them.

Yes, over time, CSS appeared alongside the HTML language developed by Berners-Lee. Thanks to it, some HTML constructs were no longer needed, replaced by the far more flexible tools of cascading style sheets, which made it possible to greatly increase the attractiveness and design flexibility of the sites being created today. CSS rules are, of course, harder to learn than the markup language, but beauty demands sacrifice.

How do the Internet and the global network work from the inside?

But let's see what the Web (WWW) is and how information gets posted on the Internet. Here we come face to face with the phenomenon called a website (web meaning net, and site meaning place). So what is a “place on the network” (an analogue of a place in the sun in real life), and how does one actually get it?

So, what does the Internet consist of? At its base are channel-forming devices (routers, switches) that are invisible to users and of little interest to them. The WWW network (what we call the Web or World Wide Web) consists of millions of web servers: programs running on slightly modified computers that must be connected to the global network around the clock and use the HTTP protocol for data exchange.

The web server (the program) receives a request (most often from a user's browser, in which a link was clicked or a URL entered in the address bar) to open a document hosted on that very server. In the simplest case, the document is a physical file (with the .html extension, for example) lying on the server's hard drive.

In a more complex case (when server-side scripts are used), the requested document is generated programmatically on the fly.

To view the requested page of a site, special software on the client (user) side, called a browser, renders the downloaded fragment of hypertext in digestible form on the display of whatever device the browser is installed on (PC, phone, tablet, etc.). In general, everything is simple, if you don't go into the details.
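The whole server-and-client round trip described above fits in a few lines of Python. This is only a toy sketch (every request gets the same hard-coded page, and the page content is invented for the demo), but the division of labour is the real one: a server program answers HTTP requests, and a client fetches a document by URL:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# A hypothetical "document" the server will hand out.
PAGE = b"<html><body><h1>Hello, WWW</h1></body></html>"

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Simplest case: every requested path returns the same static page.
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(PAGE)))
        self.end_headers()
        self.wfile.write(PAGE)

    def log_message(self, *args):
        pass  # keep the demo quiet

# Port 0 lets the OS pick any free port.
server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The "browser" side: request the document over HTTP by its URL.
url = f"http://127.0.0.1:{server.server_port}/index.html"
with urllib.request.urlopen(url) as resp:
    body = resp.read()

server.shutdown()
print(body.decode())  # -> <html><body><h1>Hello, WWW</h1></body></html>
```

A real web server adds many layers (virtual hosts, caching, scripts generating pages on the fly), but the request-response cycle is exactly this.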

Previously, each individual website was physically hosted on a separate computer, mainly because of the weak computing power of the PCs available at the time. In any case, a computer with a web server program and a website hosted on it must be connected to the Internet around the clock. Doing this at home is difficult and expensive, so websites are usually stored with hosting companies that specialize in this service.

Owing to the popularity of the WWW, hosting services are now in considerable demand. As the power of modern PCs has grown, hosters have been able to place many websites on one physical computer (virtual hosting), while hosting a single website on a single physical machine has come to be called a dedicated server.

With virtual hosting, all the websites hosted on one computer (the one called the server) may be assigned a single IP address, or each may have its own. This does not change the essence and can only indirectly affect a website located there (a bad neighbourhood on a shared IP can hurt: search engines sometimes tar everyone with the same brush).

Now a little about website domain names and their meaning on the World Wide Web. Every resource on the Internet has its own domain name. A situation may even arise where the same site has several domain names (resulting in mirrors or aliases), or, conversely, the same domain name is reused for many resources.

Also, for some serious resources there is such a thing as mirrors. In this case, the site files may be located on different physical computers, and the resources themselves may have different domain names. But these are all nuances that only confuse novice users.

World Wide Web

The World Wide Web is a distributed system that provides access to interconnected documents located on various computers connected to the Internet. The word “web” (from the English web, “spider's web”) and the abbreviation WWW are also used to refer to the World Wide Web. The World Wide Web is made up of hundreds of millions of web servers. Most of the resources on the World Wide Web are based on hypertext technology. Hypertext documents posted on the World Wide Web are called web pages. Several web pages that share a common theme, design, and links, and are usually located on the same web server, are called a website. To download and view web pages, special programs called browsers are used. The World Wide Web has caused a real revolution in information technology and an explosion in the development of the Internet. When people talk about the Internet they often mean the World Wide Web, but it is important to understand that the two are not the same thing.

History of the World Wide Web

Tim Berners-Lee and, to a lesser extent, Robert Cailliau are considered the inventors of the World Wide Web. Tim Berners-Lee is the originator of the HTTP, URI/URL and HTML technologies. In 1980 he worked at the European Council for Nuclear Research (CERN) as a software consultant. It was there, in Geneva (Switzerland), that he wrote the Enquire program for his own needs, which used random associations to store data and laid the conceptual basis for the World Wide Web. In 1989, while working at CERN on the organization's intranet, Tim Berners-Lee proposed the global hypertext project now known as the World Wide Web.

The project involved the publication of hypertext documents linked by hyperlinks, which would facilitate the search for and consolidation of information for CERN scientists. To implement the project, Tim Berners-Lee invented URIs, the HTTP protocol and the HTML language: technologies without which the modern Internet can no longer be imagined. Between 1991 and 1993 Berners-Lee refined the technical specifications of these standards and published them. Nevertheless, the official year of birth of the World Wide Web should be considered 1989. As part of the project, Berners-Lee wrote the world's first web server, httpd, and the world's first hypertext web browser, called WorldWideWeb. This browser was also a WYSIWYG editor; its development began in October 1990 and was completed in December of the same year.

What is the World Wide Web?

The Web, or “web”, is a collection of interconnected pages carrying particular information. Each such page can contain text, images, video, audio and various other objects. Beyond that, web pages contain so-called hyperlinks, each of which points to another page located on some other computer on the Internet.

Various information resources, which are interconnected by means of telecommunications and based on hypertext representation of data, form the World Wide Web (or WWW for short).

Hyperlinks link pages that are located on different computers located in different parts of the world. A huge number of computers that are united into one network is the Internet, and the “World Wide Web” is a huge number of web pages hosted on network computers.

Each web page on the Internet has an address: a URL (Uniform Resource Locator, its unique address and name). It is by this address that any page can be found.

How was the World Wide Web created?

On March 12, 1989, Tim Berners-Lee presented to the CERN management a project for a unified system of organizing, storing and providing public access to information, intended to solve the problem of sharing knowledge and experience between the Centre's employees. To solve the problem of accessing information held on different computers, Berners-Lee proposed browser programs giving employees access to a server computer where hypertext information is stored. After the project's successful implementation, Berners-Lee was able to convince the rest of the world to adopt common Internet communication standards based on the Hypertext Transfer Protocol (HTTP) and the hypertext markup language (HTML).

It should be noted that Tim Berners-Lee was not the first creator of the Internet. The first system of protocols ensuring data transfer between networked computers was developed by Vinton Cerf and Robert Kahn of the US Defense Advanced Research Projects Agency (DARPA) in the late 60s and early 70s of the last century. Berners-Lee only proposed using the capabilities of computer networks to build a new system for organizing information and accessing it.

What was the prototype of the World Wide Web?

Back in the 1960s, the US Department of Defense set the task of developing a reliable information transmission system for wartime. The US Advanced Research Projects Agency (ARPA) proposed developing a computer network for this purpose, and it was named ARPANET (Advanced Research Projects Agency Network). The project brought together four scientific institutions: the University of California, Los Angeles, the Stanford Research Institute, and the Universities of Santa Barbara and Utah. All work was financed by the US Department of Defense.

The first data transmission over the computer network took place in 1969, when a Los Angeles professor and his students tried to log into Stanford's computer and transmit the word “login”. Only the first two letters, L and O, went through; when they typed the letter G, the communication system failed. But the Internet revolution had begun.

By 1971 a network with 23 users had been created in the United States, and the first program for sending e-mail over the network had been developed. In 1973, University College London and state services in Norway joined, and the network became international. In 1977 the number of Internet users reached 100; in 1984, 1,000; in 1986, more than 5,000; in 1989, more than 100,000. In 1991 the World Wide Web (WWW) project was implemented at CERN. By 1997 there were already 19.5 million Internet users.

Some sources give the date of the World Wide Web's emergence as a day later: March 13, 1989.


§ 4. Internet and World Wide Web

Main topics of this section:

What is the World Wide Web

The most interesting service available to Internet users since 1993 is the World Wide Web information system (abbreviated WWW), a phrase that can be translated as “worldwide spider's web”. It was working with the WWW that was meant when, at the beginning of this section, you were promised all sorts of information miracles.


It is very difficult to give an exact definition of what WWW is. This system can be compared to a huge encyclopedia, the pages of which are scattered across computer servers connected by the Internet. To get the right information, the user must get to the corresponding encyclopedia page. Perhaps with this analogy in mind, the creators of the WWW introduced the concept of a Web page.


Web server, Web page, Web site

A Web page is the basic information unit of the WWW: a separate document stored on a Web server. A page has a name (similar to a page number in an encyclopedia) by which it can be accessed.

The information on a Web page can be of the most varied kinds: text, drawings, photographs, multimedia. Web pages also carry advertisements, reference information, scientific articles, the latest news, illustrated publications, art catalogues, weather forecasts and much, much more. To put it simply, Web pages have “everything”.


A number of Web pages can be related thematically and form a Web site. Each site has a main page, called its home page. This is a kind of front page from which you can begin viewing the documents stored on the server. Typically the home page contains a table of contents: the names of the sections. To reach the desired section, simply move the mouse pointer to the section name and click the mouse button.


WWW hyperstructure

However, it is not at all necessary to view Web pages in a row, flipping through them, as in a book. The most important property of the WWW is the hypertext organization of connections between Web pages. Moreover, these connections operate not only between pages on the same server, but also between different WWW servers.

Typically, hyperlinked keywords are highlighted or underlined on a Web page. By clicking on such a word, you will follow a hidden link to view another document. Moreover, this document may be located on another server, in another country, on another continent. More often than not, the Internet user has no idea where the server he is currently communicating with is located. Figuratively speaking, in one session you can “fly” around the globe several times.


The role of a link anchor can be played not only by text but also by a drawing, a photograph, or a pointer to a sound document. In such cases the term “hypermedia” is used instead of “hypertext”.

You can reach the same Web page in many different ways. The analogy with the pages of a book no longer works here. In a book, the pages have a certain sequence. Web pages do not have such a sequence. The transition from one page to another occurs through hyperlinks, forming a network that resembles a web. This is where the name of the system comes from.
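The threads of that web are ordinary `<a href="...">` tags in a page's HTML. As a small sketch (the page below is invented for the demo), the standard library's `html.parser` can pull out every hyperlink a page offers:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects the href of every <a> tag: the 'threads' of the web."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A hypothetical page with three hyperlinks and no fixed reading order.
page = """
<html><body>
  <p>See the <a href="history.html">history</a>,
  the <a href="https://info.cern.ch/">first website</a>,
  or jump <a href="#top">back to the top</a>.</p>
</body></html>
"""

collector = LinkCollector()
collector.feed(page)
print(collector.links)
# -> ['history.html', 'https://info.cern.ch/', '#top']
```

Note the three kinds of destination: a page on the same server, a page on a different server, and a position within the same page. Following such links from page to page is exactly the web-like traversal the section describes.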


Summarizing the above, we can give the following definition:

The World Wide Web is a worldwide distributed information system with hyperlinks, existing on the technical basis of the global Internet.

The browser: a WWW client program. The problem of searching for information on the Internet

Special software called a Web browser (from the English “browse”: to look through, to examine) helps the user navigate the “web”. Using a browser, you can find the information you need in different ways. The shortest way is to use the web page's address: you type the address on the keyboard, press Enter, and are taken straight to the destination.


Another way is searching. You can start from the home page and move along hyperlinks, at the risk of taking a wrong turn, getting tangled in the “web” and ending up in a dead end. However, the browser lets you step back any number of moves and continue the search along a different route. Such a search resembles wandering through an unfamiliar forest (though it is less dangerous).
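That "step back any number of moves" behaviour is classically modelled with two stacks. The sketch below is a deliberate simplification (real browsers track much more state), but it captures the back/forward logic:

```python
class BrowserHistory:
    """Back/forward navigation modelled with two stacks (a simplified
    sketch, not how any particular browser is implemented)."""

    def __init__(self, start):
        self.current = start
        self.back_stack = []      # pages we can step back to
        self.forward_stack = []   # pages we stepped back from

    def visit(self, url):
        # Following a new hyperlink pushes the current page onto the
        # back stack and discards any forward history.
        self.back_stack.append(self.current)
        self.forward_stack.clear()
        self.current = url

    def back(self):
        if self.back_stack:
            self.forward_stack.append(self.current)
            self.current = self.back_stack.pop()
        return self.current

# Hypothetical browsing session over made-up page names.
h = BrowserHistory("home.html")
h.visit("history.html")
h.visit("www.html")
print(h.back())   # -> history.html
print(h.back())   # -> home.html
```

Each wrong turn in the “forest” is therefore recoverable: the back stack remembers the whole route taken.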


School of Informatics and Computing
"Abstract"
On the topic: World Wide Web.

The work was performed by student 190(1)

Grigorieva Anastasia

The work was checked by teacher Isaeva I.A.

Tallinn 2010

Introduction

History of the World Wide Web


Journey on the World Wide Web

Linking hypertext pages

Prospects for the development of the World Wide Web



Structure and principles of the World Wide Web

The World Wide Web is made up of millions of web servers located around the world. A web server is a program running on a computer connected to a network that uses the HTTP protocol to transfer data. In its simplest form, such a program receives an HTTP request for a particular resource over the network, finds the corresponding file on the local hard drive and sends it over the network to the requesting computer. More complex web servers can generate resources dynamically in response to an HTTP request. To identify resources (often files or parts of them) on the World Wide Web, Uniform Resource Identifiers (URIs) are used. To determine a resource's location on the web, Uniform Resource Locators (URLs) are used. A URL combines URI identification technology with the Domain Name System (DNS): a domain name (or an IP address in numeric notation) forms the part of the URL that designates the computer (more precisely, one of its network interfaces) running the code of the desired web server.

To view information received from a web server, a special program, a web browser, is used on the client computer. The main function of a web browser is to display hypertext. The World Wide Web is inseparable from the concepts of hypertext and hyperlinks, and most of the information on the Web is hypertext. To simplify the creation, storage and display of hypertext on the World Wide Web, HTML (HyperText Markup Language), the hypertext markup language, is traditionally used. The work of marking up hypertext is called layout, and the markup specialist is called a webmaster. After HTML markup, the resulting hypertext is placed in a file; such HTML files are the most common kind of resource on the World Wide Web. Once an HTML file is made available to a web server, it is called a “web page”, and a collection of web pages makes up a website. Hyperlinks are added to the hypertext of web pages; they help users navigate easily between resources (files), regardless of whether a resource is on the local computer or on a remote server. Web hyperlinks are based on URL technology.
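In the "simplest form" described above, serving a request amounts to mapping the URL's path onto a file under the server's document root. Here is a sketch of that mapping (the paths and root directory are hypothetical, and real servers add access checks, MIME types and much more):

```python
from pathlib import PurePosixPath
from urllib.parse import urlsplit

def resolve_document(url, document_root):
    """Sketch of how a simple static web server maps a URL to a file.

    The scheme and domain have already done their job (they brought the
    request to this server), so only the path matters here. Directory
    requests fall back to index.html, and '..' segments are dropped so
    a request cannot escape the document root.
    """
    path = urlsplit(url).path or "/"
    if path.endswith("/"):
        path += "index.html"          # directory request -> default page
    parts = [p for p in PurePosixPath(path).parts if p not in ("/", "..")]
    return str(PurePosixPath(document_root).joinpath(*parts))

print(resolve_document("http://example.com/", "/var/www"))
# -> /var/www/index.html
print(resolve_document("http://example.com/docs/www.html", "/var/www"))
# -> /var/www/docs/www.html
```

The file found at the resulting path is then sent back in the HTTP response; a dynamic server would instead run a script at this point and send its output.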

Control over information also shapes how people think and behave. Despite its recent opening up and the economic model it has adopted, China remains one step behind its recent past in this respect. Yet, like society itself, the Internet needs norms and oversight: the false sense of anonymity pushes many people toward criminal actions, which must be curtailed with the same energy and rigor as in the real world. The Internet has become too large a part of our lives not to treat it as one of their main characters.

History of the World Wide Web

Tim Berners-Lee and, to a lesser extent, Robert Cailliau are considered the inventors of the World Wide Web. Tim Berners-Lee is the originator of the HTTP, URI/URL and HTML technologies. In 1980 he worked at the European Organization for Nuclear Research (French: Conseil Européen pour la Recherche Nucléaire, CERN) as a software consultant. It was there, in Geneva (Switzerland), that he wrote the Enquire program for his own needs ("Enquire" can be loosely translated as "Interrogator"), which used random associations to store data and laid the conceptual foundation for the World Wide Web.

Many legal interests are tied to the virtual world, and protecting them is necessary for everyone to coexist adequately. For this to happen there must be reasonable government control: first, to create safety in the virtual world; second, to ensure freedom of expression and the free movement of information - as long as it is not offensive.

The Internet is both a communication system and an information system - a medium for people to communicate. There are currently many definitions of this concept. In our view, one definition that most fully characterizes the information interaction of the planet's population is this: "The Internet is a complex transport and information system of mushroom-shaped (dipole) structures, the cap of each of which (the dipole itself) represents the brain of a person sitting at a computer together with the computer itself, which is, as it were, an artificial extension of the brain, while the legs are, for example, the telephone network connecting the computers, or the airwaves over which radio waves are transmitted."

The advent of the Internet gave impetus to the development of new information technologies, leading not only to changes in the consciousness of people, but also the world as a whole. However, the worldwide computer network was not the first discovery of its kind. Today, the Internet is developing in the same way as its predecessors - telegraph, telephone and radio. However, unlike them, it combined their advantages - it became not only useful for communication between people, but also a publicly accessible means for receiving and exchanging information. It should be added that the capabilities of not only stationary, but also mobile television have already begun to be fully used on the Internet.

The history of the Internet begins around the 1960s.

The first documentation of the social interaction made possible by networking was a series of notes written by J. C. R. Licklider. These notes discussed the concept of a "Galactic Network". The author envisioned the creation of a global network of interconnected computers, through which everyone could quickly access data and programs located on any computer. In spirit this concept is very close to the current state of the Internet.

Leonard Kleinrock published the first paper on packet switching theory in July 1961. In it, he presented the advantages of his theory over the existing principle of data transmission - circuit switching. What is the difference between these concepts? With packet switching, there is no dedicated physical connection between the two end devices (computers). Instead, the data to be transmitted is divided into parts, and a header is appended to each part containing full information about delivering the packet to its destination. With circuit switching, the two computers are physically connected to each other for the entire transfer of information, and the connection is maintained until the transfer ends - just as in analog systems based on connection switching. In that case the utilization of the information channel is minimal.
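The packet principle described here can be sketched in a few lines; `packetize` and `reassemble` are illustrative names, and the header is reduced to just a destination and a sequence number:

```python
import random

def packetize(data: bytes, dest: str, size: int = 4):
    """Split a message into packets, each carrying a header with the
    destination and a sequence number for reassembly."""
    return [{"dest": dest, "seq": n, "payload": data[i:i + size]}
            for n, i in enumerate(range(0, len(data), size))]

def reassemble(packets) -> bytes:
    """Order packets by sequence number and join their payloads."""
    return b"".join(p["payload"]
                    for p in sorted(packets, key=lambda p: p["seq"]))

packets = packetize(b"HELLO, WORLD", "computer-B")
random.shuffle(packets)      # packets may arrive in any order
print(reassemble(packets))   # prints: b'HELLO, WORLD'
```

Because each packet is self-addressed, the channel never has to be reserved for one pair of computers: packets from many senders can share it and still be sorted out at the destination.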

To test the concept of packet switching, Lawrence Roberts and Thomas Merrill connected a TX-2 computer in Massachusetts to a Q-32 computer in California using low-speed telephone dial-up lines in 1965. Thus, the first ever (albeit small) non-local computer network was created. The result of the experiment was the understanding that time-shared computers could successfully work together, executing programs and retrieving data on a remote machine. It also became clear that the circuit-switched telephone system was absolutely unsuitable for building a computer network.

In 1969, the American agency ARPA (Advanced Research Projects Agency) began research on creating an experimental packet-switching network. This network was created and named ARPANET, i.e., the network of the Advanced Research Projects Agency. A sketch of the ARPANET, consisting of four nodes - the embryo of the Internet - is shown in Fig. 6.1.

At this early stage, research was conducted on both network infrastructure and network applications. At the same time, work was underway to create a functionally complete protocol for computer-to-computer interaction and other network software.

In December 1970, the Network Working Group (NWG), led by S. Crocker, completed work on the first version of the protocol, called the Network Control Protocol (NCP). After work was completed to implement NCP on ARPANET nodes in 1971–1972, network users were finally able to begin developing applications.

In 1972, the first application appeared - email.

In March 1972, Ray Tomlinson wrote the basic programs for sending and reading electronic messages. In July of the same year, Roberts added to these programs the ability to list messages, read them selectively, save them to a file, forward them, and prepare a reply.

Since then, email has become the largest network application. For its time, e-mail became what the World Wide Web is today - an extremely powerful catalyst for the growth of the exchange of all types of interpersonal data flows.

In 1974, the Internet Network Working Group (INWG) introduced a universal protocol for data transmission and network interconnection - TCP/IP. This is the protocol that is used on the modern Internet.

However, the ARPANET switched from NCP to TCP/IP only on January 1, 1983. This was a "flag day" transition, requiring simultaneous changes on all computers. It had been carefully planned by all parties involved over the previous several years and went surprisingly smoothly (it did, however, lead to the proliferation of "I Survived the TCP/IP Transition" badges). The transition also allowed the network to be divided into MILNET, the military network proper, and ARPANET, which was used for research purposes.

In the same year, another important event occurred: Paul Mockapetris developed the Domain Name System (DNS). This system provided a scalable, distributed mechanism for mapping hierarchical computer names (e.g., www.acm.org) to Internet addresses.

Also in 1983, a Domain Name Server was created at the University of Wisconsin. A DNS server automatically, and transparently to the user, translates the textual name of a site into its IP address.
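At its core, what a DNS server does can be reduced to a table lookup. The sketch below uses hypothetical records (the 192.0.2.0/24 range is reserved for documentation) and ignores the distributed, hierarchical resolution that real DNS performs:

```python
def resolve(name: str, records: dict) -> str:
    """Look a domain name up in a table of records; raise the DNS
    equivalent of 'no such domain' (NXDOMAIN) when it is absent."""
    ip = records.get(name.lower().rstrip("."))  # names are case-insensitive
    if ip is None:
        raise KeyError(f"NXDOMAIN: {name}")
    return ip

# Hypothetical records for illustration only.
records = {"example.com": "192.0.2.1", "www.example.com": "192.0.2.2"}
print(resolve("WWW.Example.Com.", records))  # prints: 192.0.2.2
```

A real resolver walks the name hierarchy from the root servers down and caches the answers, but the contract - name in, IP address out - is exactly this.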

As the Internet spread beyond the United States, first-level country-code domains such as ru, uk and ua appeared.

In 1985, the National Science Foundation (NSF) began creating its own network, NSFNet, which was soon connected to the Internet. Initially, NSFNet comprised five supercomputer centers - fewer nodes than ARPANET - and the data transmission speed in its communication channels did not exceed 56 kbit/s. Nevertheless, the creation of NSFNet was a significant contribution to the development of the Internet, since it prompted a new look at how the Internet could be used. The Foundation set the goal that every scientist and every engineer in the United States should be "connected" to a single network, and therefore began to build a network with faster channels that would unite numerous regional and local networks.

Based on ARPANET technology, the NSFNET network (the National Science Foundation NETwork) was created in 1986, with NASA and the Department of Energy directly involved. Six large research centers equipped with the latest supercomputers, located in different regions of the United States, were connected. The main purpose of the network was to provide US research centers with access to supercomputers over an interregional backbone. The network operated at a base speed of 56 kbit/s. When building it, it became obvious that it was not worth even trying to connect all universities and research organizations directly to the centers, since laying that much cable was not only very expensive but practically impossible. It was therefore decided to create networks on a regional basis: in every part of the country, the institutions concerned connected to their nearest neighbors. The resulting chains were connected to the supercomputer centers through one of their nodes, and the supercomputer centers were thus linked together. With this design, any computer could communicate with any other by passing messages through its neighbors.

One of the problems of the time was that early networks (including the ARPANET) were built specifically for the benefit of a narrow circle of interested organizations. They were to be used by a closed community of specialists, and as a rule the operation of a network did not extend beyond it. There was no particular need for networks to be compatible, and accordingly they were not. At the same time, alternative technologies began to appear in the commercial sector, such as XNS from Xerox, DECNet, and SNA from IBM. Therefore, under the auspices of DARPA and NSF, specialists from the Internet Engineering and Architecture Task Forces, together with members of NSF's Network Technical Advisory Group, developed "Requirements for Internet Gateways." These requirements formally guaranteed interoperability between the parts of the Internet administered by DARPA and by NSF. In addition to choosing TCP/IP as the basis for NSFNet, US federal agencies adopted and implemented a number of additional principles and rules that shaped the modern face of the Internet. Most importantly, NSFNET had a policy of "universal and equal access to the Internet": for an American university to receive NSF funding for an Internet connection, it, as the NSFNet program states, "must make that connection available to all qualified users on campus."

NSFNET worked quite successfully at first, but the time came when it could no longer cope with the growing demand. The network created for the use of supercomputers allowed the connected organizations to exchange a wealth of data unrelated to supercomputers. Network users in research centers, universities, schools and elsewhere realized that they now had access to vast amounts of information and direct access to their colleagues. The flow of messages on the network grew faster and faster until, in the end, it overloaded the computers that controlled the network and the telephone lines connecting them.

In 1987, NSF transferred to Merit Network Inc. a contract under which Merit, with the participation of IBM and MCI, was to provide management of the NSFNET core network, transition to higher-speed T-1 channels and continue its development. The growing core network already united more than 10 nodes.

In 1990, the concepts of ARPANET, NSFNET, MILNET, etc. finally left the stage, giving way to the concept of the Internet.

The scope of the NSFNET network, combined with the quality of its protocols, led to the fact that by 1990, when the ARPANET was finally dismantled, the TCP/IP family had supplanted or significantly displaced most other global computer network protocols around the world, and IP was confidently becoming the dominant data transport service of the global information infrastructure.

In 1990, the European Organization for Nuclear Research established the largest Internet site in Europe and provided Internet access to the Old World. To help promote and facilitate the concept of distributed computing over the Internet, Tim Berners-Lee at CERN (Geneva, Switzerland) developed the hypertext document technology known as the World Wide Web (WWW), allowing users to access any information located on computers around the Internet.

WWW technology is based on the URL (Uniform Resource Locator) specification, the HTTP protocol (HyperText Transfer Protocol) and the HTML language (HyperText Markup Language). Text can be marked up in HTML using any text editor. A page marked up in HTML is often called a Web page. To view a Web page, a client application - a Web browser - is used.
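The structure of a URL is easy to inspect with Python's standard library: `urlsplit` breaks a URL into the scheme (which protocol to speak), the host (resolved via DNS) and the path of the requested resource.

```python
from urllib.parse import urlsplit

# The address of the first web page ever served, split into its parts.
parts = urlsplit("http://info.cern.ch/hypertext/WWW/TheProject.html")
print(parts.scheme)    # prints: http
print(parts.hostname)  # prints: info.cern.ch
print(parts.path)      # prints: /hypertext/WWW/TheProject.html
```

The three printed parts correspond exactly to the three technologies named above: the scheme selects HTTP, the hostname is handed to DNS, and the path names the HTML file on the server.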

In 1994, the W3 Consortium was formed, bringing together scientists from different universities and companies (including Netscape and Microsoft). Since then, the consortium has dealt with all standards in the Internet world. Its first step was the development of the HTML 2.0 specification, a version that introduced the ability to transfer information from the user's computer to the server using forms. The next step was the HTML 3 project, work on which began in 1995. It was the first to introduce CSS (Cascading Style Sheets), which allows text to be formatted without disrupting the logical and structural markup. The HTML 3 standard was never approved; instead, HTML 3.2 was created and adopted in January 1997. Already in December 1997, the W3C adopted the HTML 4.0 standard, which distinguishes between logical and visual tags.

By 1995, the growth of the Internet showed that regulation of connectivity and funding issues could not be in the hands of NSF alone. In 1995, payments for connecting numerous private networks to the national backbone were transferred to regional networks.

The Internet has grown far beyond what it was envisioned and designed to be; it has outgrown the agencies and organizations that created it, and they can no longer play a dominant role in its growth. Today it is a powerful worldwide communication network based on distributed switching elements - hubs and communication channels. Since 1983, the Internet has grown exponentially, and hardly a single detail from those times has survived unchanged - yet the Internet still operates on the TCP/IP protocol suite.

If the term "Internet" was originally used to describe a network built on the Internet Protocol (IP), the word has now acquired a global meaning and is only sometimes used as a name for a set of interconnected networks. Strictly speaking, the Internet is any set of physically separate networks interconnected by the single IP protocol, which allows us to speak of them as one logical network. The rapid growth of the Internet generated increased interest in the TCP/IP protocols, and as a result specialists and companies appeared who found a number of other applications for them. TCP/IP began to be used to build local area networks (LAN - Local Area Network) even when no connection to the Internet was planned. It was also used to create corporate networks that adopted Internet technologies, including the WWW (World Wide Web), in order to establish an effective exchange of intra-corporate information. Such corporate networks are called "intranets" and may or may not be connected to the Internet.

Tim Berners-Lee, the author of the HTTP, URI/URL and HTML technologies, is considered the inventor of the World Wide Web. In 1980, for his own use, he wrote the Enquire program, which used random associations to store data and laid the conceptual basis for the World Wide Web. In 1989, Tim Berners-Lee proposed the global hypertext project now known as the World Wide Web. The project implied the publication of hypertext documents interconnected by hyperlinks, which would facilitate the search and consolidation of information for scientists. To implement the project, he invented URIs, the HTTP protocol, and the HTML language - technologies without which the modern Internet can no longer be imagined. Between 1991 and 1993, Berners-Lee refined the technical specifications of these standards and published them. He wrote the world's first web server, "httpd", and the world's first hypertext web browser, called "WorldWideWeb". This browser was also a WYSIWYG editor (short for What You See Is What You Get); its development began in October 1990 and was completed in December of the same year. The program ran in the NeXTStep environment and began to spread across the Internet in the summer of 1991. Berners-Lee created the world's first website at http://info.cern.ch/ (the site is now archived); it went online on August 6, 1991. The site described what the World Wide Web is, how to set up a web server, how to use a browser, and so on. It was also the world's first Internet directory, because Tim Berners-Lee later posted and maintained there a list of links to other sites.

Since 1994, the main work on the development of the World Wide Web has been taken over by the World Wide Web Consortium (W3C), founded by Tim Berners-Lee. This Consortium is an organization that develops and implements technology standards for the Internet and the World Wide Web. The W3C's mission is to "Unleash the full potential of the World Wide Web by establishing protocols and principles to ensure the long-term development of the Web." Two other major goals of the Consortium are to ensure complete “internationalization of the Network” and to make the Network accessible to people with disabilities.

The W3C develops uniform principles and standards for the Internet (called "Recommendations", English: W3C Recommendations), which are then implemented by software and hardware manufacturers. This ensures compatibility between the software and hardware of various companies, which makes the World Wide Web more universal and convenient. All World Wide Web Consortium Recommendations are open; that is, they are not protected by patents and can be implemented by anyone without any financial contributions to the Consortium.

In general, we can conclude that the World Wide Web rests on "three pillars": HTTP, HTML and the URL. Recently, though, HTML has begun to lose ground somewhat to more modern markup technologies: XHTML and XML. XML (eXtensible Markup Language) is positioned as a foundation for other markup languages. To improve the visual presentation of the Web, CSS technology has become widely used; it allows uniform design styles to be set for many web pages. Another innovation worth noting is the URN (Uniform Resource Name) naming system.

A popular concept for the development of the World Wide Web is the creation of a semantic web. The Semantic Web is an add-on to the existing World Wide Web that is designed to make information posted on the network more understandable to computers. It is a concept of a network in which every resource in human language would be provided with a description that a computer can understand. The Semantic Web opens up access to clearly structured information for any application, regardless of platform and programming language. Programs will be able to find the resources they need on their own, process information, classify data, identify logical connections, draw conclusions and even make decisions based on those conclusions. If widely adopted and implemented wisely, the Semantic Web has the potential to spark a revolution on the Internet. To create a machine-readable description of a resource on the Semantic Web, the RDF (Resource Description Framework) format is used, which is based on XML syntax and uses URIs to identify resources. Newer developments in this area are RDFS (RDF Schema) and SPARQL (SPARQL Protocol and RDF Query Language), a query language for fast access to RDF data.
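The RDF idea of describing resources can be sketched as subject-predicate-object triples queried with SPARQL-like pattern matching. Everything here is illustrative (`match`, the `ex:` names); real systems use RDF libraries and full SPARQL engines.

```python
def match(triples, s=None, p=None, o=None):
    """Return every triple matching the pattern; None acts as a
    wildcard, much like a variable in a SPARQL query."""
    return [(ts, tp, to) for ts, tp, to in triples
            if s in (None, ts) and p in (None, tp) and o in (None, to)]

# A tiny store of machine-readable statements about resources.
triples = [
    ("ex:TimBernersLee", "ex:invented", "ex:WorldWideWeb"),
    ("ex:WorldWideWeb", "ex:uses", "ex:HTTP"),
    ("ex:WorldWideWeb", "ex:uses", "ex:HTML"),
]
# "What does the World Wide Web use?" - two matching triples.
print(match(triples, s="ex:WorldWideWeb", p="ex:uses"))
```

Because each statement is explicit and uniformly structured, a program can chain such queries to classify data and follow logical connections, which is exactly the promise of the Semantic Web described above.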

Currently, there are two trends in the development of the World Wide Web: the semantic web and the social web. The Semantic Web involves improving the coherence and relevance of information on the World Wide Web through the introduction of new metadata formats. The Social Web relies on the organization of the information available on the Web being carried out by Web users themselves. In the second direction, developments that are part of the semantic web are actively used as tools (RSS and other web feed formats, OPML, XHTML microformats).

Internet telephony has become one of the most modern and economical forms of communication. Its birthday can be considered February 15, 1995, when VocalTec released its first softphone - a program for exchanging voice over an IP network. Microsoft then released the first version of NetMeeting in October 1996. By 1997, Internet connections between two ordinary telephone subscribers located in completely different parts of the planet had become quite common.

Why is regular long-distance and international telephone communication so expensive? This is explained by the fact that during a conversation the subscriber occupies an entire communication channel, not only when speaking or listening to the interlocutor, but also when he is silent or distracted from the conversation. This happens when voice is transmitted over the telephone using the usual analog method.

With the digital method, information can be transmitted not continuously, but in separate “packets”. Then, information can be sent simultaneously from many subscribers via one communication channel. This principle of packet transmission of information is similar to transporting many letters with different addresses in one mail car. After all, they don’t “drive” one mail car to transport each letter separately! This temporary “packet compaction” makes it possible to use existing communication channels much more efficiently and “compress” them. At one end of the communication channel, information is divided into packets, each of which, like a letter, is equipped with its own individual address. Over a communication channel, packets from many subscribers are transmitted “interspersed”. At the other end of the communication channel, packets with the same address are again combined and sent to their destination. This packet principle is widely used on the Internet.
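The "mail car" analogy above - many subscribers' packets interspersed on one channel and regrouped by address at the far end - can be sketched like this (all names are illustrative):

```python
from itertools import zip_longest

def interleave(*streams):
    """Merge several subscribers' packet streams onto one shared channel."""
    merged = []
    for group in zip_longest(*streams):
        merged.extend(p for p in group if p is not None)
    return merged

def demultiplex(channel, dest):
    """At the far end, regroup the packets addressed to one destination."""
    return [p for p in channel if p["dest"] == dest]

# Two subscribers, A and B, sharing one communication channel.
a = [{"dest": "A", "seq": i, "data": f"a{i}"} for i in range(3)]
b = [{"dest": "B", "seq": i, "data": f"b{i}"} for i in range(2)]
channel = interleave(a, b)
print([p["data"] for p in demultiplex(channel, "A")])
# prints: ['a0', 'a1', 'a2']
```

The shared channel carries a0, b0, a1, b1, a2 in turn, yet each subscriber receives only, and all of, the packets addressed to them - the "packet compaction" that lets one line serve many conversations.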

With a personal computer, a compatible sound card, a microphone and headphones (or speakers), a subscriber can use Internet telephony to call any subscriber who has an ordinary landline telephone. During such a conversation he, too, pays only for using the Internet. Before using Internet telephony, the subscriber must install a special program on his personal computer.

Internet telephony services can also be used without a personal computer; an ordinary telephone with tone dialing is enough. In tone mode, each dialed digit goes into the line not as a varying number of electrical pulses, as with a rotary dial, but as a combination of alternating currents of different frequencies; most modern telephones support it. To use Internet telephony from a telephone, you buy a credit card and call a powerful central server at the number indicated on the card. A machine voice on the server (in Russian or English, as you choose) gives the instructions: use the telephone buttons to enter the serial number and key of the card, then dial the country code and the number of your future interlocutor. The server converts the analog signal into digital form and sends it to a server in the other city, which converts the digital signal back into analog and delivers it to the desired subscriber. The interlocutors talk as if on a regular telephone, although sometimes there is a slight (fraction of a second) delay in the response. Recall that, to save communication channels, voice information is transmitted in "packets" of digital data: the voice stream is divided into segments - packets - carried using the Internet Protocol (IP).

In 2003, the Skype program (www.skype.com) appeared - completely free and requiring virtually no special knowledge to install or use. It allows you to talk, with video, to interlocutors at their computers in different parts of the world. For the interlocutors to see each other, each of their computers must be equipped with a web camera.

Humanity has come a long way in the development of communications: from signal fires and drums to the cellular mobile phone, which allows two people located anywhere on our planet to communicate almost instantly. And despite the distances involved, subscribers retain a sense of personal communication.


