Abstract
- Relevance
- Research Goal and Objectives
- Overview of Research and Development
- Conclusions
- List of sources
Introduction
Every year the Internet absorbs more and more information. Channel bandwidth keeps increasing, and users are switching from dial-up to unlimited broadband access. Sites are becoming larger, richer in content and harder to interact with. The sizes of downloaded files grow many times over, while users' waiting time declines at a much slower pace.
Over the past five years the average size of a web page has tripled (according to Akamai research), and over the last year it has grown by half (according to webo.in). At the same time, each page uses about 50 objects on average, which has a very negative effect on the total load time. Only about 5-10% of the total load time is spent on the server side; the rest is due to the client-side architecture.
On average, dozens of different objects are used on a web page, and they are far from always just images. The volume of the script files that provide user interaction with a web page now far exceeds the volume of the information presented on that page. And the world is moving towards more complex human-machine interaction, not the reverse. [1, 10]
Page load speed is very important to any Internet user. Research by Kissmetrics shows that 47% of customers expect a site to load within 2 seconds, and 40% of users abandon sites that take more than 3 seconds to load. A one-second delay in loading a site can reduce conversion by 7%, cut page views by 11% and decrease the satisfaction of your site's users by 16%. For example, if your site earns 100 thousand dollars per day, a one-second delay in page loading will cost you about 2.5 million dollars per year.
Likewise, a 500-millisecond delay in delivering Google's pages led to a 20% loss of traffic, a 100-millisecond delay on Amazon.com reduced revenue by 1%, and a 400-millisecond delay at Yahoo cut traffic by 9%. Slow page loading is perceived by the end user as a lasting property of the resource, so even after performance is improved, half of the lost users never return. [3]
Site loading speed becomes even more important when it comes to search engines. Even the most painstaking work on SEO and AdWords is useless if your site loads slowly: you will lose the battle before it even starts, at the ranking stage. Site speed is one of the mandatory factors in Google's ranking algorithm. Google's experts firmly believe that "...the faster the sites, the happier the users." [2]
Relevance
The relevance of optimizing the loading speed of HTTP resources follows from the number of sites available today and from the dynamics of their growth. In addition, with such a large number of resources on the Internet, the struggle of each of them for the end user keeps intensifying.
As noted above, page load speed strongly affects site conversion, which is why this topic deserves such close attention. Otherwise your web resource will simply never reach the first lines of search results, and as a result this will have a detrimental effect on the number of visitors to the site.
Another important factor is the dynamics of growth in the number of Internet users.
Animation 1 - Dynamics of growth in the number of Internet users (size: 122 KB; frames: 12; repetition: unlimited; delay: 0.2 s)
Now consider the dynamics of growth in the number of sites on the Internet since 1991.
Ordinary users actually began creating sites only at the end of the last century. The very first page on the Internet went live in 1991. It was dedicated to World Wide Web technology, based on the HTML markup language; this mini-site also described the principles of operation of servers and browsers.
After the creation of the first public network resource, things moved quickly. By 1993 there were already about 100 websites on the Internet. However, search engines did not yet exist, and these sites could only be reached from that very first page created in 1991.
In 1997 a real dot-com boom began on the network: companies whose activities were entirely connected with the World Wide Web. The excitement continued until about 2000, by which time the network already had more than 10 million sites. Fifteen years later the number of Internet sites exceeded a billion; the number of sites in Russia at the same time approached 5 million, the lion's share of them commercial resources.
Unfortunately, no official research has been carried out on how fast the number of sites on the Internet is growing today. However, many experts believe that the growth currently observed in the network is far from exponential. New sites still appear on the Internet, but there is not enough free audience for them. This is primarily due to the slowdown in the growth of the number of Internet users themselves: computers and all sorts of gadgets are available to almost everyone, and most topics of interest to any given category of people have already been covered.
According to informal research carried out by enthusiasts, about 100 thousand new sites now appear on the Internet every day, but approximately the same number disappear. The number of sites is growing, yet incomparably more slowly than in the period from 1997 to 2015. About 60% of all sites currently available on the network are non-working and unclaimed.
Thus, the growth in the number of sites on the Internet is likely to slow down even further in the near future and then stop altogether. Quantity on the Internet will finally give way to quality: new sites will keep appearing, but only the most informative and useful for the user will survive, and the total number of sites on the network will remain virtually unchanged. In other words, the standard market law of supply and demand will fully take effect on the network. [4]
Proceeding from this, it can be argued that even though forecasts say the growth rate of the number of sites may slow down, the number of already existing web resources is so large that competition among them remains very high. Therefore, no amount of painstaking SEO work will help if your site loses significantly to others in loading speed.
Research Goal and Objectives
The objectives of the study are:
- achieving the minimum possible load time for a particular page;
- achieving the minimum possible load time for a group of pages viewed in an arbitrary order;
- ensuring the minimum possible time from the moment the page is requested to the moment the user can view the page and interact with it. [5]
This is not a complete list of possible goals; sometimes a compromise is required and a choice has to be made between several mutually exclusive optimization options. In such situations it is better to have as much information as possible about the website and its visitors.
The list of critical pages that need the strongest optimization effect can be determined with the help of systems for collecting and analyzing statistics. It is also necessary to take into account the purpose and specifics of the site or service being optimized.
As a rule, optimization is required for the main page of the site and other pages with high traffic, but this is not always the case. Take, for example, the order pages of a commercial site. They may receive only 5% of all the site's visitors, yet if they load too slowly, those visitors may never become customers. [6]
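To see how a particular page measures up against goals like these, the timings can be read directly in the browser. Below is a minimal sketch in TypeScript using the standard Navigation Timing API; the 3-second threshold in the warning is only illustrative and is not taken from the cited studies.

```typescript
// A minimal sketch: reading load-time metrics from the browser's standard
// Navigation Timing API (run in the browser console or ship with the page).
// The 3-second threshold below is only illustrative.
const [nav] = performance.getEntriesByType("navigation") as PerformanceNavigationTiming[];

if (nav) {
  const metrics = {
    // Time until the first byte of the HTML arrives (network plus server share).
    ttfbMs: nav.responseStart - nav.startTime,
    // DOM parsed and deferred scripts executed: roughly when interaction becomes possible.
    domReadyMs: nav.domContentLoadedEventEnd - nav.startTime,
    // Load event finished: images, styles and scripts have been fetched.
    fullLoadMs: nav.loadEventEnd - nav.startTime,
  };
  console.table(metrics);
  if (metrics.fullLoadMs > 3000) {
    console.warn("Full load exceeds 3 s; a noticeable share of visitors may leave.");
  }
}
```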
Each web page consists of a main HTML file and a set of external resources. When speaking about the size of a page (or site), people very often mean the size of that first file only, which is, of course, incorrect.
Currently several dozen external objects are requested on each page, and the size of the source file is no more than 5% of the total size. As numerous studies have shown, the size of a web page has tripled over the past five years, and the number of objects on it has almost doubled. At the same time, the growth rate of average channel bandwidth is only slightly higher than these indicators. If we also take into account how users are split by access speed, then the desire to reduce, even by 1-5%, the number of users who exceed the acceptable waiting threshold forces us to apply increasingly complex and advanced technologies.
Naturally, these technologies are not limited to compressing text files (HTML, CSS, JavaScript) on the server side. As is easy to see, most of the external objects on a page are images or multimedia files, and they have optimization methods of their own. [7]
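The "several dozen objects" and "5%" figures above are averages; for a concrete page the breakdown can be estimated with the Resource Timing API. A rough browser-side sketch in TypeScript (transferSize is reported as 0 for cross-origin resources that do not send Timing-Allow-Origin, so the result is only an estimate):

```typescript
// A rough sketch: comparing the size of the main HTML document with the
// total size of external objects (images, scripts, styles, fonts, ...).
// Cross-origin resources without Timing-Allow-Origin report transferSize = 0.
const [nav] = performance.getEntriesByType("navigation") as PerformanceNavigationTiming[];
const resources = performance.getEntriesByType("resource") as PerformanceResourceTiming[];

const htmlBytes = nav ? nav.transferSize : 0;
const externalBytes = resources.reduce((sum, r) => sum + r.transferSize, 0);
const totalBytes = htmlBytes + externalBytes;

console.log(`External objects: ${resources.length}`);
console.log(`HTML document:    ${htmlBytes} bytes`);
console.log(`External objects: ${externalBytes} bytes`);
if (totalBytes > 0) {
  console.log(`HTML share of total page weight: ${(100 * htmlBytes / totalBytes).toFixed(1)}%`);
}
```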
There are three main tasks of client-side optimization:
- optimizing file sizes;
- optimizing loading delays;
- optimizing interaction with the user.
Overview of Research and Development
All the main methods can be divided into six groups, each of which helps to solve one of the tasks listed above:
- Reducing the size of objects. This includes compression and image optimization methods.
- Caching features, which can drastically reduce the number of requests on repeat visits.
- Combining objects. The main technologies here are merging text files and applying CSS Sprites or data: URI for images (a small data: URI sketch follows this list).
- Parallel loading of objects, with an effective timeout for each file.
- Optimizing CSS performance, which shows up in how quickly the initial picture appears in the user's browser and how quickly it changes afterwards.
- Optimizing JavaScript. There are quite a few problematic places in JavaScript that you need to know about when designing complex web applications.
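As an illustration of the data: URI technique from the "combining objects" group, here is a minimal Node.js sketch in TypeScript; the file names icon.png and inline.css are hypothetical examples.

```typescript
// A minimal sketch of the data: URI technique: a small image is embedded
// directly into a CSS rule, removing one HTTP request. The file names
// (icon.png, inline.css) are only examples.
import { readFileSync, writeFileSync } from "node:fs";

const imageBytes = readFileSync("icon.png");
// Inlining only pays off for small files: base64 adds roughly 33% overhead
// before gzip, and large data: URIs bloat the stylesheet.
if (imageBytes.length > 4096) {
  console.warn("Image is larger than 4 KB; a separate request or a sprite may be better.");
}

const dataUri = `data:image/png;base64,${imageBytes.toString("base64")}`;
const rule = `.icon {\n  background-image: url("${dataUri}");\n}\n`;
writeFileSync("inline.css", rule);
console.log(`Wrote inline.css (${rule.length} characters)`);
```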
It should be noted that, despite the complexity of the topic, an initial acceleration of web page loading can be achieved in a few very simple steps, reducing the time before the web page appears several times over (usually 2-3 times). For simple web projects it is enough to enable caching and archiving (gzip or deflate); a server-side sketch follows below.
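As a sketch of these two quick wins, caching headers plus gzip for text resources, here is a simplified static file server in TypeScript using only Node.js core modules. It is an illustration under assumptions (files live in a public/ directory, tiny MIME map, no path sanitization or ETag handling), not production code.

```typescript
// A simplified sketch of the two quick wins discussed above: long-lived
// caching headers for static assets and gzip for text resources.
// Assumptions: files live in ./public, tiny MIME map, no path sanitization.
import { createServer } from "node:http";
import { readFile } from "node:fs/promises";
import { gzipSync } from "node:zlib";
import { extname, join } from "node:path";

const MIME: Record<string, string> = {
  ".html": "text/html",
  ".css": "text/css",
  ".js": "application/javascript",
  ".png": "image/png",
};
const TEXT = new Set([".html", ".css", ".js"]);

createServer(async (req, res) => {
  const urlPath = !req.url || req.url === "/" ? "/index.html" : req.url;
  const file = join("public", urlPath);
  const ext = extname(file);
  try {
    let body = await readFile(file);
    res.setHeader("Content-Type", MIME[ext] ?? "application/octet-stream");
    // HTML should be revalidated; other assets can be cached for a year.
    res.setHeader("Cache-Control", ext === ".html" ? "no-cache" : "public, max-age=31536000");
    // Compress text resources when the client advertises gzip support.
    if (TEXT.has(ext) && String(req.headers["accept-encoding"] ?? "").includes("gzip")) {
      body = gzipSync(body);
      res.setHeader("Content-Encoding", "gzip");
    }
    res.end(body);
  } catch {
    res.statusCode = 404;
    res.end("Not found");
  }
}).listen(8080);
```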
More complex projects will also need changes to the layout, using CSS Sprites or data: URI, and several additional hosts for loading images (see the sketch below).
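The "multiple hosts" idea can be sketched as follows (TypeScript; the img1-img3.example.com host names are hypothetical). Spreading images over several host names lets older HTTP/1.x browsers open more parallel connections; the mapping must be stable so that the same image always comes from the same host and stays cacheable.

```typescript
// A sketch of distributing images across several hosts so the browser can
// download more of them in parallel. The host names are hypothetical.
const IMAGE_HOSTS = ["img1.example.com", "img2.example.com", "img3.example.com"];

function shardedImageUrl(path: string): string {
  // Simple stable hash of the path, so a given image always maps to one host.
  let hash = 0;
  for (const char of path) {
    hash = (hash * 31 + char.charCodeAt(0)) >>> 0;
  }
  const host = IMAGE_HOSTS[hash % IMAGE_HOSTS.length];
  return `https://${host}${path}`;
}

console.log(shardedImageUrl("/products/42/photo.jpg"));
```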
For highly loaded projects, all aspects of client-side optimization have to be considered from the very beginning, at the design stage, and applied consistently in order to achieve the best result. [7]
In addition, there are solutions for automatic web optimization. At the moment, the topic of automatic client-side optimization greatly occupies the minds of web programmers, entrepreneurs and ordinary enthusiasts. The benefits are quite obvious: a fast website has significant advantages over slow competitors, and where competition is high this can be decisive. Moreover, users do not tend to wait long, so fast loading can be the key to the prosperity of a company's entire online business. This has been understood for a long time. However, no one has yet managed to create a powerful and open web application that would accumulate all the experience gathered so far and optimize the final site on its own. Let us look at the products that can automate certain client-side optimization actions.
JSMin Ant Task
JSMin Ant Task. The application makes it possible to use the logic of JSMin (an algorithm that transforms JavaScript code by removing unnecessary characters from it) within Apache Ant builds.
JSMin PHP
JSMin PHP. A fairly well-known PHP application implementing the JSMin logic in PHP. Among its observed shortcomings: conditional comments are discarded, and there may be problems when parsing complex regular expressions. In all other respects it has proved itself well (including in code conversion speed). With additional gzip compression it is only slightly inferior to YUI Compressor, yet it needs nothing but PHP to run.
YUI Compressor
YUI Compressor. This tool originated from the Rhino optimizer and is actively developed by Yahoo! specialists. YUI Compressor goes further in optimizing JavaScript code: it replaces the names of all local variables with shortened variants (in most cases down to a single character). Used together with gzip compression, it gives the best result. Unfortunately, it requires Java to be installed on the server.
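To illustrate the variable-renaming ("munging") idea, here is a hand-written before/after pair in TypeScript. The shortened form in the comment shows the kind of output a YUI-Compressor-style tool produces; it is not actual tool output.

```typescript
// Original code with descriptive local names (readable, but verbose).
function totalTransferSize(sizes: number[]): number {
  let accumulated = 0;
  for (const currentSize of sizes) {
    accumulated += currentSize;
  }
  return accumulated;
}

// What a YUI-Compressor-style munger would emit (hand-written illustration):
// local identifiers shrink to one character, the public name stays intact.
//
//   function totalTransferSize(a){let b=0;for(const c of a){b+=c}return b}
//
console.log(totalTransferSize([15300, 48200, 9100])); // 72600
```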
Packer
Packer by Dean Edwards. Unlike all the previous tools, Packer creates a kind of self-extracting archive that is unpacked by built-in JavaScript, and its compression ratio is very high (up to 50%). However, with additional gzip compression, files compressed with Packer lose to their analogues converted with YUI Compressor. Additional disadvantages include some processor load when such an archive is unpacked (usually 30-300 ms). An implementation in PHP is also available.
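The gzip comparisons mentioned for JSMin PHP, YUI Compressor and Packer are easy to reproduce for one's own files. A small Node.js sketch in TypeScript; the two input file names are hypothetical placeholders for the outputs of different tools:

```typescript
// A small sketch for reproducing the "minified vs. minified + gzip"
// comparison discussed above. The input file names are placeholders for
// the outputs of different tools (e.g. YUI Compressor and Packer).
import { readFileSync } from "node:fs";
import { gzipSync } from "node:zlib";

for (const file of ["app.yui.js", "app.packer.js"]) {
  const raw = readFileSync(file);
  const gzipped = gzipSync(raw);
  const ratio = (100 * gzipped.length / raw.length).toFixed(1);
  console.log(`${file}: ${raw.length} bytes raw, ${gzipped.length} bytes gzipped (${ratio}%)`);
}
```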
List of sources
1. N. Matsievsky. Speed Up Your Site: Methods of Client-Side Optimization of Web Pages // «Бином» [Electronic resource]. - Access mode: [Link]
2. I. Alekseev. 7 Ways to Increase Site Loading Speed in 2017 // MOTOCMS [Electronic resource]. - Access mode: [Link]
3. T. Kalovskaya. Client-Side Optimization of Web Page Loading Speed [Electronic resource]. - Access mode: [Link]
4. How the Number of Sites on the Internet Is Growing // Как просто [Electronic resource]. - Access mode: [Link]
5. What Is Client-Side Optimization? // CWPRO [Electronic resource]. - Access mode: [Link]
6. Analysis of Web Pages. Defining the Optimization Goal. Overview of Client-Side Optimization Methods // cap-design [Electronic resource]. - Access mode: [Link]
7. What Is Client-Side Optimization? // PLAM.RU online library [Electronic resource]. - Access mode: [Link]
8. Overview of Automatic Client-Side Optimization Technologies // Проверка скорости загрузки сайтов [Electronic resource]. - Access mode: [Link]
9. D. Ilyichev. Building a Front-End Project with Gulp and Node.js // Частный вебмастер Ильичёв Дмитрий [Electronic resource]. - Access mode: [Link]
10. N. Matsievsky. Client-Side Optimization // Проверка скорости загрузки сайтов [Electronic resource]. - Access mode: [Link]
11. Basics of Using Gulp to Build JavaScript Applications // GetInstance, articles on front-end development [Electronic resource]. - Access mode: [Link]