Master of DonNTU Dmitry Maksimenko
Dmitry Maksimenko
Faculty of Computer Science and Technology
Department of Computer Engineering
Speciality: Computing Machines, Complexes, Systems and Networks
Development of methods to increase the speed of access to server and cloud storage
Scientific adviser: Ph.D., Prof. Raisa Malcheva

Abstract

Content
Introduction
1. Relevance of the topic
2. Aims and objectives of the research
3. Analysis of cloud computing
4. Fog computing as a way to improve data access
5. Hardware accelerators for data access
5.1 Main features and benefits of the Alveo U50 accelerator
Conclusions
References

Introduction

Today an enormous volume of data is transmitted and stored in electronic form, and this volume grows from year to year. This raises the problem of accessing that data at any time, from any place, and from any device available to the user (PC, tablet, smartphone).

We live in a world filled with electronic devices, almost all of which are connected to the Internet. As a result, an enormous amount of information is generated and transmitted over the network every hour. For these devices to operate correctly, a constant, fast, and stable Internet connection is required, which entails significant financial costs. Moreover, providing such a connection is sometimes difficult, since systems may be distributed over long distances.

1. Relevance of the topic

Cloud computing is becoming increasingly popular, which quite naturally drives the development of all kinds of cloud services; their adoption in enterprises is improving and gaining pace rapidly. Cloud services make it possible to use a large number of applications anywhere and on any device. However, given the growing number of Internet-connected devices and the emergence of new, more complex requirements, this technology needs additional support.

2. Aims and objectives of the research

The purpose of the master's thesis is to research and develop a method for increasing the speed of access to server and cloud storage.

To achieve this goal, the following tasks need to be solved:

3. Analysis of cloud computing

Today, cloud computing is one of the most modern and promising areas of development of information and computing Internet technologies. The National Institute of Standards and Technology (NIST) defines cloud computing as a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (networks, servers, data storage, applications, and services) that can be rapidly provisioned with minimal management effort or interaction with the service provider [1]. It is based on principles proposed by the developers of the Clouds operating system, created at the Georgia Institute of Technology, USA, and implemented in 1986 [2].

There are no common, uniform standards for working with clouds. However, a classification of service and deployment models for cloud computing platforms was formulated in 2011 by the US National Institute of Standards and Technology; it distinguishes three service models [3]:

By deployment model, clouds are divided into private, community, public, and hybrid clouds. Each cloud service provider offers its own capabilities for using its cloud services and provides an application programming interface (API) that enables fast integration.
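To illustrate what such an API typically looks like, below is a minimal sketch of uploading and downloading an object over a generic REST interface. The endpoint, bucket, and token are hypothetical placeholders for illustration only and do not reproduce the API of any specific provider.

    import requests  # generic HTTP client; a provider-specific SDK would work similarly

    # Hypothetical endpoint and credentials -- replace with a real provider's values.
    STORAGE_URL = "https://storage.example-cloud.com/v1/buckets/demo/objects/report.csv"
    API_TOKEN = "..."

    def upload_object(path: str) -> None:
        """Upload a local file to cloud object storage over the provider's REST API."""
        with open(path, "rb") as f:
            response = requests.put(
                STORAGE_URL,
                data=f,
                headers={"Authorization": f"Bearer {API_TOKEN}"},
                timeout=30,
            )
        response.raise_for_status()  # fail loudly if the provider rejected the request

    def download_object(destination: str) -> None:
        """Fetch the same object back, streaming it to disk."""
        with requests.get(
            STORAGE_URL,
            headers={"Authorization": f"Bearer {API_TOKEN}"},
            stream=True,
            timeout=30,
        ) as response:
            response.raise_for_status()
            with open(destination, "wb") as f:
                for chunk in response.iter_content(chunk_size=64 * 1024):
                    f.write(chunk)

    if __name__ == "__main__":
        upload_object("report.csv")
        download_object("report_copy.csv")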

Cloud computing is widely used in Internet of Things (IoT) applications due to its potentially unlimited computing power and data storage. Clouds offer a centralized solution for data analysis and storage, as well as for data visualization. However, because of the delay caused by transferring large amounts of data to the cloud, irrelevant and redundant information reaches the end user. In addition, sending irrelevant data to the cloud for processing and storage reduces network bandwidth and can degrade all applications.

4. Fog computing as a way to improve data access

To solve this problem, Cisco proposed the concept of fog computing in 2012 [4]. Fog computing is a new technology that serves as an intermediate layer between IoT devices and cloud platforms; it extends cloud computing to the edge of the network. Fog nodes provide local data aggregation and focus on removing unnecessary data from the network [5].

Fog computing is a decentralized computing architecture in which data is processed and stored between the information source and the cloud infrastructure [6]. This minimizes data transfer overhead and, as a result, improves the performance of cloud platforms by reducing the amount of redundant data that has to be processed and stored.
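A minimal sketch of this filtering idea is shown below: a fog node aggregates readings from local sensors and forwards to the cloud only values that have changed noticeably, so redundant samples never leave the local network. The threshold, the `send_to_cloud` stub, and the data format are assumptions made purely for illustration.

    from statistics import mean
    from typing import Dict, List

    THRESHOLD = 0.5  # minimum change (in sensor units) worth forwarding to the cloud

    def send_to_cloud(payload: Dict[str, float]) -> None:
        """Stub for the uplink to the cloud platform (HTTP, MQTT, etc.)."""
        print(f"uplink -> {payload}")

    class FogNode:
        """Aggregates local sensor data and forwards only significant changes."""

        def __init__(self) -> None:
            self.last_sent: Dict[str, float] = {}

        def process_batch(self, readings: Dict[str, List[float]]) -> None:
            payload = {}
            for sensor_id, values in readings.items():
                aggregated = mean(values)  # local aggregation replaces raw samples
                previous = self.last_sent.get(sensor_id)
                # Forward only if the aggregated value changed noticeably.
                if previous is None or abs(aggregated - previous) >= THRESHOLD:
                    payload[sensor_id] = aggregated
                    self.last_sent[sensor_id] = aggregated
            if payload:  # redundant data never leaves the local network
                send_to_cloud(payload)

    node = FogNode()
    node.process_batch({"temp-1": [21.1, 21.2, 21.0], "temp-2": [19.8, 19.9]})
    node.process_batch({"temp-1": [21.1, 21.1, 21.2], "temp-2": [23.0, 23.1]})  # only temp-2 is forwarded

The design choice here is simple change detection; a real fog node might instead use sliding-window statistics or application-specific rules, but the effect on the cloud uplink is the same: less redundant traffic.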

The main goal of fog computing is to place storage, computing, and communication resources in close proximity to IoT devices in order to provide high-speed wireless connectivity and minimize time and money costs. Fog nodes remain connected to cloud systems: they process data received from local devices and send the results to the cloud, which in turn performs deferred and heavy tasks, as shown in Figure 1 [7]. At the same time, fog devices do not depend on the Internet: they work autonomously and receive information from the external environment, which allows them to respond to changes in the environment in real time.

Figure 1. – Fog computing architecture (animation: 8 frames, repetition cycles: 5, 30 kilobytes)

Fog computing pushes the boundaries of cloud technology by making the network and data highly dispersed. Such an infrastructure is useful for a number of reasons [8]:

5. Hardware accelerators for data access

Another option for speeding up data access is to use accelerator cards. They provide optimized distribution of workloads to accelerate computing in areas such as financial computing, machine learning, data storage, and data retrieval and analysis [9].

As an example, consider the Xilinx Alveo U50 accelerator (Figure 2). It adapts to changing processing algorithms and acceleration requirements and is able to speed up data processing without modifying the server hardware, thus reducing the overall cost of operating the equipment.

Figure 2. – Xilinx Alveo U50 accelerator card

The Alveo accelerator card ecosystem includes an ever-growing number of applications from Xilinx partner companies. For custom solutions, there are developer tool packages: the SDAccel development environment and a machine learning suite that give developers a platform for building and bringing a variety of applications to market.
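As a rough illustration of how a host application offloads work to such a card, the sketch below follows the generic OpenCL flow: load a precompiled kernel binary, copy input buffers to the device, launch the kernel, and read back the result. It uses the generic pyopencl bindings rather than Xilinx's own toolchain examples; the binary name `vadd.xclbin`, the kernel name `vadd`, and the choice of the first platform and device are assumptions, and a real SDAccel project supplies its own binary and host code.

    import numpy as np
    import pyopencl as cl  # generic OpenCL bindings; the card's runtime sits underneath

    # Hypothetical artifacts of an accelerator build: kernel binary and kernel name.
    XCLBIN_PATH = "vadd.xclbin"
    KERNEL_NAME = "vadd"

    # Pick the first available device of the first platform (assumed to be the card).
    platform = cl.get_platforms()[0]
    device = platform.get_devices()[0]
    ctx = cl.Context(devices=[device])
    queue = cl.CommandQueue(ctx)

    # Load the precompiled kernel binary instead of compiling OpenCL C at run time.
    with open(XCLBIN_PATH, "rb") as f:
        binary = f.read()
    program = cl.Program(ctx, [device], [binary]).build()
    kernel = cl.Kernel(program, KERNEL_NAME)

    # Host-side data and device buffers.
    n = 1024
    a = np.arange(n, dtype=np.float32)
    b = np.arange(n, dtype=np.float32)
    result = np.empty_like(a)

    mf = cl.mem_flags
    buf_a = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
    buf_b = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
    buf_out = cl.Buffer(ctx, mf.WRITE_ONLY, size=result.nbytes)

    # Set arguments, launch the kernel, and copy the result back to the host.
    kernel.set_args(buf_a, buf_b, buf_out, np.int32(n))
    cl.enqueue_nd_range_kernel(queue, kernel, (1,), (1,))
    cl.enqueue_copy(queue, result, buf_out)
    queue.finish()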

5.1 Main features and benefits of the Alveo U50 accelerator

High performance – designed to achieve maximum performance and efficiency [10]:

Adaptable – accelerates a wide range of applications:

Accessible – available in cloud services with pre-installed applications:

Conclusions

At the time of writing this abstract: