Cloud computing threats and protection methods. Information security in cloud computing: vulnerabilities, methods and means of protection, and tools for auditing and investigating cloud security incidents

There are several approaches to building corporate IT infrastructure, and deploying all resources and services on a cloud platform is just one of them. However, prejudice about the security of cloud solutions often becomes an obstacle on this path. In this article, we will examine the security system in the cloud of one of the most famous Russian providers: Yandex.

A fairy tale is a lie, but there's a hint in it

The beginning of this story could be told like a famous fairy tale. Once upon a time there were three admins in a company: the eldest was a clever fellow, the middle one was so-so, and the youngest was altogether... an intern. They set up users in Active Directory and twisted the tails of switches. Then the time came for the company to expand, and the king, that is, the boss, summoned his admin host. I want, says he, new web services for our customers, our own file storage, managed databases, and virtual machines for testing software.

The junior immediately proposed building their own infrastructure from scratch: buy servers, install and configure software, expand the main Internet channel, and add a backup one for reliability. That way the company would be calmer: the hardware is always at hand and can be replaced or reconfigured at any time, and he himself would get an excellent opportunity to level up his admin skills. They did the math and thought better of it: the company could not carry such costs. A large business could afford it, but for small and medium-sized ones it comes out too expensive. After all, it is not enough just to purchase equipment and fit out a server room with air conditioning and a fire alarm; you also need to organize round-the-clock shifts to monitor everything and repel network attacks by Internet ne'er-do-wells. And for some reason the admins did not want to work nights and weekends. Unless for double pay.

The senior admin looked thoughtfully at the terminal window and proposed putting all the services in the cloud. But here his colleagues began scaring each other with stories: cloud infrastructure supposedly has unprotected interfaces and APIs, balances the load of different customers poorly (so your own resources may suffer), and is vulnerable to data theft and external attacks. And in general, it is frightening to hand control over critical data to strangers with whom you have not eaten a peck of salt or drunk a bucket of beer.

The middle admin suggested hosting the entire IT system in the provider's data center, on the provider's channels. They settled on that. However, several surprises awaited our trio here, and not all of them were pleasant.

First, any network infrastructure requires protection and security tools, which, of course, were deployed, configured, and launched. Only, as it turned out, the client must pay for the hardware resources those tools consume, and a modern information security system consumes considerable resources.

Second, the business kept growing, and the infrastructure as originally built quickly hit a scalability ceiling. Moreover, simply changing the tariff was not enough to expand it: many services would have had to be migrated to other servers and reconfigured, and some things redesigned from scratch.

Finally, one day the entire system went down because of a critical vulnerability in one of the applications. The admins quickly restored it from backups, but they could not quickly figure out the causes of the failure, because backups had not been set up for the logging services. Valuable time was lost, and time, as folk wisdom says, is money.

Adding up the expenses led the company's management to disappointing conclusions: the admin who had suggested from the very beginning using the IaaS cloud model, "infrastructure as a service", had been right. As for the security of such platforms, it is worth discussing separately, and we will do so using the most popular of such services as an example: Yandex.Cloud.

Security in Yandex.Cloud

Let's begin, as the Cheshire Cat advised Alice, at the beginning: with the question of division of responsibility. In Yandex.Cloud, as in any similar platform, the provider is responsible for the security of the services it offers users, while the client's area of responsibility covers ensuring the proper operation of the applications being developed, organizing and restricting remote access to allocated resources, configuring databases and virtual machines, and controlling logging. For all this, the client is provided with the necessary tools.

The security of Yandex's cloud infrastructure has several levels, each of which implements its own protection principles and applies its own arsenal of technologies.

Physical level

It is no secret that Yandex has its own data centers, served by its own security departments. This covers not only the video surveillance and access control services designed to keep outsiders out of the server rooms, but also the climate control, fire suppression, and uninterruptible power systems. Stern guards are of little help if the rack with your servers is one day flooded with water from fire sprinklers, or the servers overheat after an air conditioner fails. In Yandex's data centers, that will definitely not happen to them.

In addition, the cloud's hardware is physically separated from the "big Yandex": the machines sit in different racks, but otherwise undergo regular maintenance and component replacement in exactly the same way. At the boundary between these two infrastructures, hardware firewalls are used, and inside the cloud, a software host-based firewall. The top-of-rack switches also apply an access control list (ACL) system, which significantly improves the security of the entire infrastructure. Yandex continuously scans the cloud from the outside for open ports and configuration errors, so that a potential vulnerability can be recognized and eliminated in advance. For cloud employees working with resources, a centralized SSH-key authentication system with a role-based access model has been implemented, and all administrator sessions are logged. This approach is part of the Secure by Default model Yandex follows: security is built into the IT infrastructure at the design and development stage, not bolted on once everything is already in production.

Infrastructure level

At the level of hardware and software logic, Yandex.Cloud uses three infrastructure services: Compute Cloud, Virtual Private Cloud, and Yandex Managed Services. Here is a little more about each of them.

Compute Cloud.

This service provides scalable computing power for various tasks, such as hosting web projects and high-load services, testing and prototyping, or temporarily migrating IT infrastructure while your own equipment is being repaired or replaced. You can manage the service through the console, the command line interface (CLI), an SDK, or the API.

The security of Compute Cloud rests on the fact that all client virtual machines use at least two cores and that no overcommitment is applied when allocating memory. Since in this case only the client's code is executed on a given core, the system is not exposed to vulnerabilities such as L1TF, Spectre, and Meltdown, or to side-channel attacks.

In addition, Yandex uses its own build of QEMU/KVM, in which everything unnecessary is disabled, leaving only the minimal set of code and libraries needed for the hypervisors to work. The processes are launched under the control of AppArmor, which uses security policies to determine which system resources a given application may access and with what privileges. AppArmor, running on top of each virtual machine, reduces the risk that a client application could reach the hypervisor. To receive and process logs, Yandex built a pipeline that delivers data from AppArmor and the sandboxes to its own Splunk installation.
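To make this concrete, here is a minimal illustrative profile in the general style of AppArmor policies. The path and rules are invented for the example and are not taken from Yandex's actual configuration; the idea is simply that anything not explicitly allowed is denied.

```
# Hypothetical AppArmor profile for an emulator worker process.
#include <tunables/global>

/usr/bin/example-vm-worker {
  #include <abstractions/base>

  /usr/lib/**          r,    # shared libraries: read-only
  /var/log/worker.log  w,    # its own log file: write-only
  deny /proc/**        rwx,  # no poking at other processes
  deny capability sys_admin, # no administrative capabilities
}
```

A profile like this is loaded with `apparmor_parser`, and violations show up in the kernel audit log, which is exactly the event stream the text describes being shipped to Splunk.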

Virtual Private Cloud.

The Virtual Private Cloud service lets you create cloud networks used to transfer information between different resources and to connect them to the Internet. Physically, this service is backed by three independent data centers. In this environment, logical isolation is carried out at the level of multiprotocol label switching (MPLS). At the same time, Yandex constantly fuzzes the junction of the SDN and the hypervisor: from the virtual machines toward the external environment, a stream of malformed packets is continuously sent in order to obtain a response from the SDN, analyze it, and close possible gaps in the configuration. DDoS protection is enabled automatically when virtual machines are created.
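Fuzzing in this style boils down to taking well-formed traffic, corrupting it, and watching how the target reacts. The following sketch (not Yandex's actual tooling; the packet bytes are a toy example) shows the core mutation step: deterministic, byte-level corruption of a valid packet.

```python
import random

def mutate(packet: bytes, n_flips: int = 4, seed=None) -> bytes:
    """Return a copy of `packet` with `n_flips` distinct bytes corrupted."""
    rng = random.Random(seed)
    data = bytearray(packet)
    # Pick distinct positions and XOR each with a nonzero value,
    # so the result is guaranteed to differ from the input.
    for pos in rng.sample(range(len(data)), n_flips):
        data[pos] ^= rng.randrange(1, 256)
    return bytes(data)

# A well-formed (toy) IPv4-style header that a fuzzer would corrupt
# before sending it toward the SDN under test.
valid = bytes.fromhex("450000280001000040060000c0a80001c0a80002")
corrupted = mutate(valid, seed=42)
print(len(corrupted) == len(valid), corrupted != valid)  # True True
```

A real fuzzer would loop this over many seeds, send each mutant on the wire, and flag any response (or crash) that deviates from the expected rejection.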

Yandex Managed Services.

Yandex Managed Services is a software environment for managing various services: DBMSs, Kubernetes clusters, and virtual servers in the Yandex.Cloud infrastructure. Here the service itself takes on most of the security work: all backups, backup encryption, vulnerability management, and so on are performed automatically by the Yandex.Cloud software.

Incident Response Tools

Timely response to information security incidents requires identifying the source of the problem in time. For this, you need reliable monitoring tools that work around the clock without failures. Such systems inevitably consume resources, but Yandex.Cloud does not shift the cost of the computing used by security tools onto the platform's users.

When choosing its tooling, Yandex was guided by another important requirement: in the event of successful exploitation of a 0-day vulnerability in one of the applications, the attacker must not be able to get beyond the application's host, while the security team must learn about the incident immediately and react as needed.

The last, but not least, wish was that all the tools be open source. These criteria are fully met by the AppArmor + osquery combination, which it was decided to use in Yandex.Cloud.

AppArmor

AppArmor has already been mentioned above: it is a proactive protection tool based on customizable security profiles. The profiles use mandatory access control (MAC), implemented via LSM directly in the Linux kernel itself starting from version 2.6. Yandex's developers chose AppArmor for the following reasons:

  • lightness and speed, since the tool relies on part of the Linux kernel;
  • it is an open-source solution;
  • AppArmor can be deployed in Linux very quickly, without having to write code;
  • flexible configuration is possible via configuration files.

osquery.

osquery is a system security monitoring tool developed by Facebook and now successfully used across many IT industries. The tool is cross-platform and open source.

Using osquery, you can collect information about the state of various operating system components, accumulate it, transform it into a standardized JSON format, and send it to a chosen recipient. The tool lets you write and run standard SQL queries against system state, with results stored in a RocksDB database. You can configure the frequency and conditions for executing or processing these queries.
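osquery's core trick is presenting OS state as virtual SQL tables. As a rough illustration of the query-and-serialize flow (using Python's sqlite3 as a stand-in for osquery's engine, with hardcoded rows instead of live OS data), a "processes"-style table can be queried and the result emitted as JSON, the way osqueryd ships results to a log pipeline:

```python
import json
import sqlite3

# Stand-in for osquery's virtual `processes` table (real osquery builds
# these rows from the live OS; here they are hardcoded for illustration).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE processes (pid INTEGER, name TEXT, uid INTEGER)")
conn.executemany(
    "INSERT INTO processes VALUES (?, ?, ?)",
    [(1, "systemd", 0), (812, "sshd", 0), (1430, "nginx", 33)],
)

# An osquery-style scheduled query: all processes running as root.
rows = conn.execute(
    "SELECT pid, name FROM processes WHERE uid = 0 ORDER BY pid"
).fetchall()

# osquery emits results as JSON for downstream collectors (e.g. Splunk).
result = json.dumps([{"pid": pid, "name": name} for pid, name in rows])
print(result)  # [{"pid": 1, "name": "systemd"}, {"pid": 812, "name": "sshd"}]
```

In real osquery the same `SELECT` would be placed in the schedule of `osqueryd`, which evaluates it at a configured interval and appends the JSON results to its log.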

Many capabilities are already implemented in the standard tables: for example, you can get the list of processes running in the system, the installed packages, the current set of iptables rules, crontab entries, and so on. Out of the box there is support for receiving and parsing events from the kernel audit system (used in Yandex.Cloud to process AppArmor events).

osquery itself is written in C++ and distributed as open source: you can modify it, both adding new tables to the main code base and creating your own extensions in C++, Go, or Python.

A useful feature of osquery is its distributed query system, which lets you run queries in real time against all the virtual machines on the network. This can be helpful, for example, when a vulnerability is found in some package: with a single query you can get the list of machines on which that package is installed. This capability is widely used in administering large distributed systems with complex infrastructure.
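In distributed mode, every host answers the same SQL and a central service aggregates the replies. A toy sketch of that aggregation step (the host names and package data below are fabricated; real results come back through osquery's distributed query API):

```python
# Each entry mimics one host's reply to:
#   SELECT name, version FROM deb_packages WHERE name = 'openssl';
fleet_replies = {
    "web-01": [{"name": "openssl", "version": "1.1.1f"}],
    "web-02": [{"name": "openssl", "version": "3.0.2"}],
    "db-01":  [],  # package not installed on this host
}

def hosts_with_package(replies, name, bad_versions):
    """Return the hosts reporting a vulnerable version of `name`."""
    return sorted(
        host
        for host, rows in replies.items()
        if any(r["name"] == name and r["version"] in bad_versions
               for r in rows)
    )

affected = hosts_with_package(fleet_replies, "openssl", {"1.1.1f"})
print(affected)  # ['web-01']
```

The single fan-out query replaces logging in to every machine, which is exactly why the feature matters at fleet scale.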

Conclusions

Returning to the story told at the very beginning of this article, we can see that the concerns that made our heroes abandon deploying their infrastructure on a cloud platform were groundless, at least where Yandex.Cloud is concerned. The cloud infrastructure Yandex has created has a multi-level, layered security architecture and therefore provides a high level of protection against most currently known threats.

At the same time, thanks to savings on routine hardware maintenance and on paying for the resources consumed by monitoring and incident prevention systems, costs which Yandex takes on itself, using Yandex.Cloud noticeably saves money for small and medium-sized businesses. Of course, it will not be possible to completely do away with the IT department or the department responsible for information security (especially if both roles are combined in one team). But Yandex.Cloud will significantly reduce labor costs and overhead.

Since Yandex.Cloud provides its customers with a secure infrastructure complete with all the required security tools, they can focus on business processes, leaving hardware maintenance and monitoring tasks to the provider. This does not remove the need for day-to-day administration of VMs, databases, and applications, but that range of tasks would have to be handled in any case. Overall, it can be said that Yandex.Cloud saves not only money but also time. And the latter, unlike the former, is an irreplaceable resource.

Cloud computing is understood as a large pool of easily usable and accessible virtualized resources (such as hardware, services, etc.). These resources can be dynamically redistributed (scaled) to adjust to a dynamically changing load, ensuring optimal resource utilization. The pool is typically provided on a pay-per-use basis, with the cloud owner guaranteeing quality of service on the basis of certain agreements with the user.
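The pay-per-use principle is easy to make concrete with a back-of-the-envelope cost model. The hourly rates below are invented for illustration and are not any provider's actual pricing:

```python
# Hypothetical hourly rates (per vCPU and per GB of RAM).
VCPU_PER_HOUR = 0.012
RAM_GB_PER_HOUR = 0.004

def monthly_cost(vcpus, ram_gb, hours):
    """Cost of one VM billed only for the hours it actually runs."""
    return round((vcpus * VCPU_PER_HOUR + ram_gb * RAM_GB_PER_HOUR) * hours, 2)

# A dev VM that runs only during working hours (8 h x 22 days)...
dev = monthly_cost(vcpus=2, ram_gb=8, hours=8 * 22)
# ...versus keeping the same VM on around the clock (30 days).
always_on = monthly_cost(vcpus=2, ram_gb=8, hours=24 * 30)
print(dev, always_on)  # 9.86 40.32
```

The gap between the two figures is precisely the saving that dynamic scaling and per-use billing deliver over a statically provisioned server.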

In accordance with the above, the following main features of cloud computing can be highlighted:

1) cloud computing represents a new paradigm for provisioning computing resources;

2) basic infrastructure resources (hardware, data storage systems, system software) and applications are provided as services;

3) these services can be provided by an independent supplier to external users on a pay-per-use basis; the key features of cloud computing are virtualization and dynamic scalability;

4) cloud services can be delivered to the end user through a web browser or via a specific application programming interface (API).

The general model of cloud computing consists of an external and an internal part, connected over a network, in most cases the Internet. The user interacts with the system through the external part; the internal part is the cloud itself. The external part consists of the client computer, or a network of client computers, and the applications used to access the cloud. The internal part comprises the applications, computers, servers, and data storage that create the cloud of services through virtualization (Fig. 1).

When existing physical and virtual machines (VMs) are moved from the data center to external clouds, or when IT services are provided outside the secure perimeter in private clouds, the network perimeter completely loses its meaning, and the overall security level becomes rather low.

Whereas in a traditional data center engineers' access to servers is strictly controlled at the physical level, in cloud computing engineers' access occurs via the Internet, which gives rise to the corresponding threats. Accordingly, strict access control for administrators is critical, as is ensuring control and transparency of changes at the system level.

Virtual machines are dynamic. The variability of VMs makes it difficult to create and maintain a coherent security system. Vulnerabilities and configuration errors can spread uncontrollably. In addition, it is very difficult to record the state of protection at any specific point in time for subsequent audit.

Cloud servers use the same OSs and the same web applications as local virtual and physical servers. Accordingly, for cloud systems the threat of remote hacking or malware infection is just as high.

Another threat is to data integrity: compromise and theft of data. The integrity of the operating system and application files, as well as internal activity, must be monitored.

The use of multi-tenant cloud services makes it harder to comply with standards and laws, including requirements on the use of cryptographic tools to protect sensitive information such as cardholder data and personally identifiable information. This, in turn, creates the difficult task of ensuring reliable protection and secure access to important data.

Based on the analysis of possible threats to cloud computing, a comprehensive hardware and software approach to securing cloud computing is proposed, covering five technologies: firewalling, intrusion detection and prevention, integrity control, log analysis, and malware protection.
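Of the five technologies, integrity control is the easiest to sketch: record a cryptographic baseline of critical files, then flag anything that later deviates from it. A minimal illustration (the file set and contents are fabricated; a real agent would read them from disk):

```python
import hashlib

def sha256(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Baseline taken when the VM is known to be clean.
baseline = {
    "/etc/passwd": sha256(b"root:x:0:0:root:/root:/bin/bash\n"),
    "/usr/bin/sshd": sha256(b"\x7fELF...original-binary"),
}

def check_integrity(current, baseline):
    """Return the paths whose current hash no longer matches the baseline."""
    return sorted(
        path for path, content in current.items()
        if sha256(content) != baseline.get(path)
    )

# /usr/bin/sshd has been tampered with since the baseline was taken.
current_files = {
    "/etc/passwd": b"root:x:0:0:root:/root:/bin/bash\n",
    "/usr/bin/sshd": b"\x7fELF...trojaned-binary",
}
print(check_integrity(current_files, baseline))  # ['/usr/bin/sshd']
```

Production tools (AIDE, Tripwire, osquery's `file` and `hash` tables) apply the same idea with scheduled scans and a protected baseline store.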

Cloud computing providers use virtualization to give their customers access to inexpensive computing resources. Client VMs share the same hardware resources, which is necessary to achieve the greatest economic efficiency. Corporate customers interested in cloud computing as a way to extend their internal IT infrastructure must take into account the threats such a step creates.

Beyond the traditional network protection mechanisms, such as edge firewalls, demilitarized zones, network segmentation, network state monitoring tools, and intrusion detection and prevention systems, data protection mechanisms are also needed on the virtualization servers or on the VMs themselves. As VMs move to public cloud services, the corporate network perimeter gradually loses its meaning, and the least protected nodes begin to have a significant influence on the overall security level. It is precisely the impossibility of physically separating VMs and applying hardware security tools to repel attacks between them that makes it necessary to place the protection mechanism on the virtualization server or on the VM itself. Implementing a comprehensive protection method on the virtual machine itself, including software firewalling, intrusion detection and prevention, integrity control, log analysis, and malware protection, is the most effective way to protect integrity, satisfy regulators, and comply with security policies when moving virtual resources from the internal network to cloud environments.


2019

McAfee: 19 best practices in cloud security in 2019

Companies' greatest concerns involve the protection of external cloud services. Respondents worry that incidents may occur at the suppliers to whom business processes are outsourced, in third-party cloud services, or in the IT infrastructure where the company rents computing power. Yet despite all this anxiety, only 15% of companies audit third parties' compliance with security requirements.

"Although the latest large-scale hacks occurred inside data centers, traditional security systems are still focused on protecting the network perimeter and controlling access rights. Meanwhile, the negative impact of solutions for protecting physical infrastructure on the performance of virtual environments is rarely considered," explained Veniamin Levtsov, Vice President for Corporate Sales and Development at Kaspersky Lab. "That is why, in converged environments, it is so important to use appropriate comprehensive protection, securing virtual systems with specially designed solutions. We implement an approach in which, regardless of the type of infrastructure, the entire corporate network is covered by protection for all systems. And here our technologies and modern VMware developments (such as micro-segmentation) complement each other perfectly."

2015: Forrester: why are customers unhappy with cloud providers?

An opaque cloud

A recently published Forrester Consulting study shows that many organizations believe their cloud service providers do not give them enough information about their interaction with the cloud, and that this harms their business.

Besides insufficient transparency, other factors dampen enthusiasm for moving to the cloud: the level of customer service, additional costs, and adaptation during migration (on-boarding). Organizations love the cloud, but not its suppliers; at least, not as much.

The study, commissioned by iland, an enterprise cloud hosting provider, was carried out during May and covered infrastructure and operations professionals from 275 organizations in …, and Singapore.

"Among all the difficulties of today's clouds there are annoying flaws," writes Lilac Schoenbeck, vice president of product marketing at iland. "Such important metadata is being withheld, significantly slowing cloud adoption, and yet organizations build growth plans on the assumption of limitless cloud resources."

Where is the key to achieving harmony in business relations? Here is what VARs need to know in order to try to settle the problems and bring the parties to reconciliation.

Inattention to customers

Apparently, many cloud users do not feel they are getting an individual approach.

Thus, 44% of respondents answered that their provider does not know their company and does not understand their business needs, and 43% believe that if their organization were simply bigger, the supplier would probably pay them more attention. In short, they feel the chill of a commodity transaction when buying cloud services, and they do not like it.

And one more thing: there is a practice, pointed out by a third of the surveyed companies, that likewise instills a feeling of pettiness in the deal: being charged a fee for the slightest question or request for clarification.

Too many secrets

A supplier's unwillingness to provide all the information not only annoys customers but often costs them money.

All respondents in the Forrester survey answered that they feel certain financial consequences, and effects on their day-to-day work, because of missing or withheld data on cloud usage.

"The lack of clear data on cloud usage parameters leads to performance problems, difficulties reporting to management on the real cost of use, payment for resources never actually consumed by users, and unforeseen bills," Forrester states.

And where is the metadata?

The IT managers responsible for cloud infrastructure in their organizations want cost metrics and operational parameters that provide clarity and transparency, but they evidently find it difficult to get this across to their suppliers.

Survey participants noted that the metadata they receive about cloud workloads is usually incomplete. Almost half of the companies answered that there are no data on regulatory compliance, 44% pointed to a lack of data on usage parameters, 43% to missing historical data, 39% to missing security data, and 33% to missing billing and cost data.

The question of transparency

The lack of metadata causes all kinds of problems, respondents say. Almost two thirds of respondents reported that insufficient transparency prevents them from fully understanding all the benefits of the cloud.

"The lack of transparency gives rise to various problems, first and foremost questions about usage parameters and outages," the report says.

Approximately 40% try to close these gaps themselves by purchasing additional tools from their own cloud suppliers, while another 40% simply purchase the services of a different supplier where such transparency is present.

Compliance with regulatory norms

Like it or not, organizations are responsible for all their data, whether it sits in local storage or is sent to the cloud.

More than 70% of respondents in the study answered that their organizations are regularly audited and must confirm compliance with existing standards wherever their data resides. And this puts an obstacle in the way of cloud adoption for almost half of the surveyed companies.

"But the aspect of compliance with your regulatory standards should be transparent to your end users. When cloud providers withhold or do not disclose this information, they prevent you from achieving this," the report states.

Communication problems

More than 60% of the surveyed companies responded that regulatory compliance problems limit further cloud adoption.

The main problems are as follows:

  • 55% of companies subject to such requirements answered that the hardest thing for them is implementing the appropriate controls.
  • Approximately half say it is difficult for them to understand the level of compliance ensured by their cloud supplier.
  • Another half of respondents answered that it is difficult to obtain from the provider the compliance documentation needed to pass an audit. And 42% find it difficult to obtain documentation on their own compliance with these requirements for workloads running in the cloud.

Migration problems

The transition process (on-boarding) seems to be another area of general dissatisfaction: just over half of the surveyed companies replied that they were not satisfied with the migration and support process that cloud suppliers offered them.

Of the 51% dissatisfied with the migration process, 26% answered that it took too long, and 21% complained about a lack of live participation by the provider's staff.

More than half were also dissatisfied with the support process: 22% pointed to long waits for a response, 20% to insufficient knowledge among support personnel, 19% to protracted problem resolution, and 18% received bills with higher-than-expected support costs.

Obstacles on the way to the cloud

Many of the companies surveyed by Forrester are forced to restrain their cloud expansion plans because of the problems they experience with existing services.

At least 60% responded that the lack of transparency in usage, of compliance information, and of reliable support keeps them from broader use of the cloud. If it were not for these problems, they would move more workloads to the cloud, respondents say.

2014

  • The role of IT departments is gradually changing: they face the task of adapting to the new realities of cloud IT. IT departments must educate employees on security issues, develop comprehensive data governance policies, comply with legislative requirements, develop guidelines for implementing cloud services, and establish rules about which data can be stored in the cloud and which cannot.
  • IT departments are able to fulfill their mission of protecting corporate data and at the same time act as an enabler of "shadow IT" by implementing data security measures, for example by introducing an encryption-as-a-service approach. This approach allows IT departments to centrally manage data protection in the cloud while giving the company's other divisions the ability to find and use cloud services on their own as needed.
  • As more and more companies store their data in the cloud and their employees increasingly use cloud services, IT departments need to pay more attention to implementing more effective user access control mechanisms, such as multi-factor authentication. This is especially true for companies that give third parties and suppliers access to their data in the cloud. Multi-factor authentication solutions can be managed centrally and provide more secure access to all applications and data, wherever they reside: in the cloud or on the company's own equipment.
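One widely deployed second factor is the time-based one-time password (TOTP, RFC 6238), which both the server and the user's device derive from a shared secret and the current time, so a stolen password alone is not enough. A minimal sketch with Python's standard library (the base32 secret below is a throwaway example, not a real credential):

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, at: int, digits: int = 6, step: int = 30) -> str:
    """RFC 6238 TOTP: HMAC-SHA1 over the current time-step counter."""
    key = base64.b32decode(secret_b32)
    counter = struct.pack(">Q", at // step)          # 8-byte big-endian counter
    mac = hmac.new(key, counter, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                          # dynamic truncation (RFC 4226)
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF)
    return str(code % 10 ** digits).zfill(digits)

secret = "JBSWY3DPEHPK3PXP"          # example shared secret (base32)
print(totp(secret, at=int(time.time())))  # e.g. '492039'; changes every 30 s
```

Because both sides compute the same function, the server only needs to store the shared secret and compare codes within a small window of time steps.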

Ponemon and SafeNet data

Most IT organizations are in the dark about how their corporate data is protected in the cloud; as a result, companies put the accounts and confidential information of their users at risk. That is just one of the conclusions of a recent fall 2014 study conducted by the Ponemon Institute on behalf of SafeNet. As part of the study, entitled "Challenges of Information Governance in the Cloud: a Global Data Security Study", more than 1,800 information technology and IT security specialists were surveyed.

Among other conclusions, the study showed that although organizations increasingly use cloud computing capabilities, corporate IT departments face problems managing data and ensuring its security in the cloud. The survey showed that only 38% of organizations have clearly defined roles and responsibility for protecting confidential and other sensitive information in the cloud. Aggravating the situation, 44% of corporate data stored in the cloud environment is neither controlled nor managed by IT departments. In addition, more than two thirds (71%) of respondents noted that they face new difficulties when using traditional security mechanisms and techniques to protect confidential data in the cloud.

With the growing popularity of cloud infrastructures, the risks of confidential data leaks grow as well. Most of the IT specialists surveyed (71%) confirmed that cloud computing is of great importance to corporations today, and more than two thirds (78%) believe it will remain relevant in two years' time. In addition, according to respondents' estimates, about 33% of all their organizations' needs in information technology and data processing infrastructure can be satisfied with cloud resources today, and over the next two years this share will increase to an average of 41%.

However, the majority of respondents (70%) agree that complying with data privacy and data protection requirements in the cloud environment is becoming more difficult. In addition, respondents note that the types of data stored in the corporate cloud most exposed to leakage risk are email, consumer and customer data, and payment information.

On average, more than half of all cloud service deployments in enterprises are carried out by departments other than corporate IT, and about 44% of corporate data placed in the cloud is neither monitored nor managed by IT departments. As a result, only 19% of respondents could say with confidence that they know about all the cloud applications, platforms, or infrastructure services currently in use in their organizations.

Along with the lack of control over the installation and use of cloud services, there was no consensus on who is actually responsible for the security of data stored in the cloud. Thirty-five percent of respondents said responsibility is shared between users and cloud service providers, 33% believe responsibility lies entirely with users, and 32% believe the cloud service provider is responsible for data security.

More than two thirds (71%) of respondents noted that protecting confidential user data stored in the cloud with traditional security tools and methods is becoming more complicated, and about half (48%) note that it is becoming harder to control or limit end-user access to cloud data. As a result, more than a third (34%) of IT professionals stated that their organizations have already implemented corporate policies that make security mechanisms such as encryption a mandatory condition for working with certain cloud computing services. Seventy-one percent of respondents noted that the ability to encrypt or tokenize confidential or other sensitive data is of great importance to them, and 79% believe that the importance of these technologies will grow over the next two years.

Asked what their companies do to protect data in the cloud, 43% of respondents said that private networks are used in their organizations. Approximately two fifths (39%) of respondents said that their companies use encryption, tokenization and other cryptographic tools to protect data in the cloud. Another 33% of respondents do not know which security solutions are implemented in their organizations, and 29% said that they use paid security services provided by their cloud computing service providers.

Respondents also believe that corporate encryption key management is important for ensuring the safety of data in the cloud, given the increasing number of key management and encryption platforms used in their companies. In particular, 54% of respondents said that their organizations retain control over encryption keys when storing data in the cloud. However, 45% of respondents said that they store their encryption keys in software, in the same environment where the data itself is stored, and only 27% store keys in more protected environments, for example, on hardware devices.

As for access to data stored in the cloud, 68% of respondents state that managing user accounts in a cloud infrastructure is becoming more difficult, while 62% said that in their organizations access to the cloud is also provided to third parties. Approximately half (46%) of respondents said that their companies use multifactor authentication to protect third-party access to data stored in the cloud environment. Approximately the same share (48%) said that their companies use multifactor authentication technologies, among other things, to protect their own employees' access to the cloud.
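One widespread multifactor authentication mechanism of the kind the respondents describe is the one-time password. Below is a minimal sketch of a time-based one-time password (TOTP, RFC 6238) built only on the Python standard library; the shared secret and parameters here are illustrative and are not taken from any particular cloud service.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, for_time=None, digits=6, step=30):
    """Compute a time-based one-time password (RFC 6238, HMAC-SHA1)."""
    key = base64.b32decode(secret_b32, casefold=True)
    # The moving factor is the number of `step`-second intervals since the epoch.
    counter = int((time.time() if for_time is None else for_time) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): take 4 bytes at an offset given by the last nibble.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

The cloud service and the user's token compute the same code independently from the shared secret; a login succeeds only if the submitted code matches the one derived for the current time window.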

2013: Cloud Security Alliance study

Cloud Security Alliance (CSA), a non-profit industry organization promoting methods of protection in the cloud, recently updated its list of main threats in a report entitled "The Notorious Nine: Cloud Computing Top Threats in 2013".

CSA indicates that the report reflects the consensus of experts on the most significant threats to cloud security and focuses on threats arising from the sharing of common cloud resources and access by multiple users on demand.

So, the main threats ...

Data theft

The theft of confidential corporate information frightens any organization with any IT infrastructure, but the cloud model opens up "new, significant avenues of attack", CSA indicates. "If a multi-tenant cloud database is not properly designed, a flaw in one client's application can open access not only to that client's data, but to the data of all other cloud users," CSA warns.

Any "cloud" has several levels of protection, each of which protects information from a different type of attack.

Take physical server protection, for example. Here it is not even a matter of hacking, but of theft or damage of storage media. Removing a server from its room can be hard in the literal sense of the word. In addition, any self-respecting company stores information in data centers with security guards, video surveillance and access restrictions that apply not only to strangers, but also to most of the company's own employees. So the likelihood that an attacker will simply come and take the information is close to zero.

Just as an experienced traveler who fears robbery does not keep all his money and valuables in one place, a cloud provider does not store all data in a single location.

Coursework in the discipline

Software and hardware means of information security

"Information security in cloud computing: vulnerabilities, methods and means of protection, tools for auditing and investigating incidents"

Introduction

1. History and key development factors

2. Definition of cloud computing

3. Reference architecture

4. Service Level Agreement

5. Methods and means of protection in cloud computing

6. Safety of cloud models

7. Safety audit

8. Investigation of incidents and criminalistics in cloud computing

9. Threat model

10. International and domestic standards

11. Territorial data affiliation

12. State standards

13. Tools Protection in cloud technologies

14. Practical part

Conclusion

Literature

Introduction

The increasing uptake of cloud computing is explained by the fact that for comparatively little money the customer gets access to reliable infrastructure with the necessary performance, without the need to procure, install and maintain expensive computing equipment. The availability of such systems reaches 99.9%, which also saves on computing resources. And, just as important, there are practically unlimited possibilities for scalability. With ordinary hosting, trying to jump above one's head (during a sharp load burst) brings the risk of several hours of service downtime. In the cloud, additional resources are provided on first request.

The main problem of cloud computing is the unregulated level of security of the processed information, the degree of resource protection and, often, the complete absence of a regulatory and legislative framework.

The purpose of this study is an overview of the existing cloud computing market and of the means of ensuring security in it.

Keywords: cloud computing, safety, information.

1. History and key development factors

The idea of what we today call cloud computing was first voiced by J. C. R. Licklider in 1970. In those years he was responsible for the creation of ARPANET (Advanced Research Projects Agency Network). His idea was that every person on Earth would be connected to a network from which he would receive not only data but also programs. Another scientist, John McCarthy, expressed the idea that computing power would be provided to users as a service. After this, the development of cloud technologies was suspended until the 1990s, when a number of factors contributed to it.

The expansion of Internet bandwidth in the 1990s did not by itself produce a significant jump in the development of cloud technology, as practically no company or technology of that time was ready for it. However, the very fact of the acceleration of the Internet gave impetus to the early development of cloud computing.

One of the most significant events in this area was the appearance of Salesforce.com in 1999. It became the first company to provide access to its application through a website and, in effect, the first company to deliver its software on the principle of software as a service (SaaS).

The next step was the development of a cloud web service by Amazon in 2002. This service allowed information to be stored and calculations to be performed.

In 2006, Amazon launched a service called Elastic Compute Cloud (EC2) as a web service that allowed users to run their own applications. Amazon EC2 and Amazon S3 became the first widely accessible cloud computing services.

Another milestone in the development of cloud computing came with Google's creation of the Google Apps platform for web applications in the business sector.

A significant role in the development of cloud technologies was played by virtualization technologies, in particular software allowing the creation of a virtual infrastructure.

The development of hardware contributed not so much to the rapid growth of cloud technologies as to the accessibility of this technology for small businesses and individuals. As for technical progress, a significant role was played by the creation of multi-core processors and the increase in the capacity of storage drives.

2. Definition of cloud computing

According to the definition of the National Institute of Standards and Technology (NIST, USA):

Cloud computing is a model for providing ubiquitous and convenient on-demand network access to a shared pool of configurable computing resources (for example, networks, servers, storage systems, applications and services) that can be rapidly provisioned and released with minimal management effort or interaction with the service provider.

The cloud model supports high availability of services and is described by five essential characteristics (Essential Characteristics), three service models (Service Models) and four deployment models (Deployment Models).

Programs are launched and display the results of their work in a standard web browser window on the local PC, while all the applications and data required for work reside on a remote server on the Internet. Computers that perform cloud computing are called a "computing cloud". The load between the computers included in the "computing cloud" is distributed automatically. The simplest example of cloud computing is a P2P network.

To implement cloud computing, intermediate software products created with special technologies are used. They serve as an intermediate link between the equipment and the user and ensure monitoring of the status of equipment and programs, uniform load distribution and timely allocation of resources from the common pool. One of these technologies is virtualization.

Virtualization is the process of presenting a set of computing resources, or their logical combination, in a way that gives advantages over the original configuration. It is a new virtual view of the resources, not limited by the implementation, physical configuration or geographical position of their component parts. Virtualized resources typically include computing power and data storage. In scientific terms, virtualization is the isolation of computing processes and resources from each other.

An example of virtualization is symmetric multiprocessor computer architectures that use more than one processor. Operating systems are usually configured so that several processors appear as a single processor module. That is why software applications can be written for one logical (virtual) computational module, which is much easier than working with a large number of different processor configurations.

Grid computing is used for especially large and resource-intensive calculations.

Grid computing (grid - lattice, network) is a form of distributed computing in which a "virtual supercomputer" is represented as clusters of networked, loosely coupled, heterogeneous computers working together to perform a huge number of tasks (operations, jobs).

This technology is used to solve scientific and mathematical problems that require significant computing resources. Grid computing is also used in commercial infrastructure to solve labor-intensive tasks such as economic forecasting, seismic analysis, and the development and study of the properties of new drugs.

From the point of view of network organization, a grid is a coordinated, open and standardized environment that provides flexible, secure, coordinated sharing of the computing and storage resources that are part of this environment, within one virtual organization.
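The scheme described above can be illustrated with a toy sketch: a large computation is split into independent chunks that are farmed out to loosely coupled workers. Here a thread pool merely stands in for grid nodes, and the task (counting primes) is purely illustrative.

```python
from concurrent.futures import ThreadPoolExecutor

def count_primes(bounds):
    """One resource-intensive subtask: count primes in the half-open range [lo, hi)."""
    lo, hi = bounds
    def is_prime(n):
        if n < 2:
            return False
        d = 2
        while d * d <= n:
            if n % d == 0:
                return False
            d += 1
        return True
    return sum(1 for n in range(lo, hi) if is_prime(n))

def grid_count_primes(limit, workers=4):
    """Split the job into chunks and hand them to independent workers, grid-style."""
    step = max(1, limit // workers)
    chunks = [(lo, min(lo + step, limit)) for lo in range(2, limit, step)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(count_primes, chunks))
```

In a real grid the chunks would travel to heterogeneous machines over the network, and the middleware would handle scheduling, retries and result collection; the decomposition principle is the same.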

Paravirtualization is a virtualization method that provides virtual machines with a software interface similar, but not identical, to the underlying hardware. The purpose of this modified interface is to reduce the time spent by the guest operating system on operations that are much harder to perform in a virtual environment than in a non-virtualized one.

There are special "hooks" allowing the guest and host systems to request and confirm the execution of these complex tasks, which could be performed directly in the virtual environment, but much more slowly.

A hypervisor (or virtual machine monitor) is a program or hardware scheme that provides or allows the simultaneous, parallel execution of several or even many operating systems on the same host computer. The hypervisor also provides isolation of the operating systems from each other, protection and security, and the sharing and management of resources between the running operating systems.

A hypervisor may also (but is not obliged to) provide the operating systems running under its control on one host computer with means of communication and interaction (for example, through file sharing or network connections), as if these operating systems were running on different physical computers.

The hypervisor itself is in some sense a minimal operating system (a microkernel or nanokernel). It provides the operating systems running under its control with a virtual machine service, virtualizing or emulating the real (physical) hardware of a particular machine, and it manages these virtual machines and the allocation and release of resources for them. The hypervisor allows the independent "power-on", reboot and "shutdown" of any of the virtual machines with a particular OS. An operating system running in a virtual machine under the control of a hypervisor need not "know" that it is running in a virtual machine rather than on real hardware.

Models of cloud services

The options for providing computing capacity vary widely. Everything related to cloud computing is usually suffixed with the word aaS, which simply stands for "as a service".

Software as a service (SaaS) - the provider gives the customer an application ready for use. Applications are accessible from various client devices or through thin-client interfaces, such as a web browser (for example, webmail) or program interfaces. The consumer does not control the underlying cloud infrastructure, including networks, servers, operating systems, storage systems, or even individual application settings, except for some user configuration settings.

Under the SaaS model, customers pay not for owning the software as such, but for renting it (that is, using it through a web interface). Thus, unlike the classical software licensing scheme, the customer bears relatively small periodic costs and does not need to invest substantial sums in acquiring the software and supporting it. The periodic payment scheme means that if the need for the software is temporarily absent, the customer can suspend its use and freeze the payments to the developer.
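The economics of this payment scheme can be sketched with a simple total-cost comparison. The prices below are purely hypothetical and serve only to show how a periodic rent and an upfront license with yearly support diverge over time.

```python
def cumulative_cost(months, upfront=0.0, monthly=0.0, annual_support=0.0):
    """Total cost of ownership after a given number of months of use."""
    full_years = months // 12
    return upfront + monthly * months + annual_support * full_years

# Hypothetical figures: a perpetual license for 5000 with 1000/year support,
# versus SaaS rent of 150 per month.
license_3y = cumulative_cost(36, upfront=5000, annual_support=1000)  # 8000
saas_3y = cumulative_cost(36, monthly=150)                           # 5400
```

On these made-up numbers the rent is cheaper over three years, but the customer owns nothing at the end of the term; over a longer horizon the comparison can reverse, which is exactly the trade-off the SaaS model poses.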

From the developer's point of view, the SaaS model makes it possible to combat unlicensed use of software (piracy) effectively, since the software itself never reaches the end customers. In addition, the SaaS concept often reduces the cost of deploying and implementing information systems.

Fig. 1 Typical SaaS scheme

Platform as a service (PaaS) - the provider offers the client a software platform and tools for designing, developing, testing and deploying user applications. The consumer does not control the underlying cloud infrastructure, including networks, servers, operating systems and data storage systems, but has control over the deployed applications and, possibly, over some configuration parameters of the hosting environment.

Fig. 2 Typical PaaS scheme

Infrastructure as a service (IaaS) - the provider offers the client, for rent: servers, storage systems, network equipment, operating systems and system software, virtualization systems and resource management systems. The consumer does not control the underlying cloud infrastructure, but has control over operating systems, storage systems and deployed applications, and possibly limited control over the selection of network components (for example, a host with network firewalls).

Fig. 3 Typical IaaS scheme

Additionally, services such as the following are distinguished:

Communications as a service (CaaS) - communication services are provided as a service; usually this means IP telephony, mail and instant messaging (chats, IM).

Cloud data storage - the user is provided with a certain amount of space for storing information. Since the information is stored in a distributed and duplicated manner, such storage provides a much greater degree of data safety than local servers.

Workplace as a service (WaaS) - a user who has an insufficiently powerful computer at his disposal can buy computing resources from the supplier and use his PC as a terminal for accessing the service.

Cloud antivirus - an infrastructure used to process information coming from users in order to promptly recognize new, previously unknown threats. The cloud antivirus does not require any extra actions from the user - it simply sends a request about a suspicious program or link. If the danger is confirmed, all the necessary actions are performed automatically.

Deployment models

Among the deployment models, four main types of infrastructure are distinguished:

Private cloud (Private Cloud) - infrastructure intended for use by a single organization comprising several consumers (for example, units of one organization), and possibly the clients and contractors of that organization. A private cloud can be owned, managed and operated by the organization itself, by a third party, or by some combination of them, and it can physically exist both inside and outside the owner's jurisdiction.

Fig. 4 Private cloud.

Public cloud (Public Cloud) - infrastructure intended for free use by the general public. A public cloud may be owned, managed and operated by commercial, scientific and government organizations (or some combination of them). A public cloud physically exists in the jurisdiction of its owner, the service provider.

Fig. 5 Public cloud.

Hybrid cloud (Hybrid Cloud) - a combination of two or more different cloud infrastructures (private, community or public) that remain unique objects but are linked by standardized or proprietary technologies for data transmission and applications (for example, the short-term use of public clouds to balance load between clouds).

Fig. 6 hybrid cloud.

Community cloud (Community Cloud) - a type of infrastructure intended for use by a specific community of consumers from organizations with common tasks (for example, missions, security requirements, policies, and compliance with various requirements). A community cloud can be in the cooperative (joint) ownership, management and operation of one or more of the community's organizations or of third parties (or some combination of them), and it can physically exist both inside and outside the owner's jurisdiction.

Fig. 7 Description of the properties of clouds

Basic properties

NIST, in its document "The NIST Definition of Cloud Computing", defines the following cloud characteristics:

Self-service on demand (On-Demand Self-Service). The consumer can access the provided computing resources unilaterally, as needed, automatically, without having to interact with the staff of each service provider.

Broad network access (Broad Network Access). The computing resources provided are available over the network through standard mechanisms for various platforms and for thin and thick clients (mobile phones, tablets, laptops, workstations, etc.).

Resource pooling (Resource Pooling). The provider's computing resources are combined into pools to serve many consumers under a multi-tenant model. The pools include various physical and virtual resources that can be dynamically assigned and reassigned in accordance with consumer requests. The consumer does not need to know the exact location of resources, although their location can be specified at a higher level of abstraction (for example, country, region or data center). Examples of such resources are storage systems, computing power, memory and network bandwidth.

Rapid elasticity (Rapid Elasticity). Resources can be elastically allocated and released, in some cases automatically, for quick scaling in proportion to demand. To the consumer, the supply of resources appears unlimited: they can be assigned in any quantity and at any time.

Measured service (Measured Service). Cloud systems automatically control and optimize resources using measurement tools implemented at an abstraction level appropriate to each type of service (for example, control of external storage, processing, bandwidth or active user sessions). The resources used can be monitored and controlled, which ensures transparency for both the supplier and the consumer using the service.
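The measured-service characteristic can be sketched as a small usage meter that aggregates per-tenant samples and turns them into a bill. The resource names and unit prices below are invented for illustration only.

```python
from collections import defaultdict

class UsageMeter:
    """Aggregate per-tenant resource usage samples, as a metered cloud service might."""

    def __init__(self):
        self._usage = defaultdict(float)  # (tenant, resource) -> total amount

    def record(self, tenant, resource, amount):
        """One measurement sample, e.g. gigabyte-hours of storage consumed."""
        self._usage[(tenant, resource)] += amount

    def bill(self, tenant, unit_prices):
        """Charge for a tenant: the sum over resources of usage times unit price."""
        return sum(amount * unit_prices.get(resource, 0.0)
                   for (t, resource), amount in self._usage.items()
                   if t == tenant)

meter = UsageMeter()
meter.record("tenant-a", "gb_hours", 10)
meter.record("tenant-a", "egress_gb", 2)
charge = meter.bill("tenant-a", {"gb_hours": 0.05, "egress_gb": 0.09})
```

Both sides can inspect the recorded samples, which is precisely the transparency the measured-service property promises to supplier and consumer alike.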

Fig. 8 Cloud Server Structural Scheme

Advantages and disadvantages of cloud computing

Advantages

· Low requirements for the computing power of the PC (the only essential condition is the availability of Internet access);

· fault tolerance;

· safety;

· High data processing speed;

· Reduced costs for hardware, software, maintenance and electricity;

· Saving of disk space (both data and programs are stored on the Internet);

· Live migration - the transfer of a virtual machine from one physical server to another without stopping the virtual machine or interrupting service.

· At the end of 2010, during the DDoS attacks against companies that refused to provide resources to WikiLeaks, another advantage of cloud computing technology came to light. All of the companies that acted against WikiLeaks were attacked, but only Amazon proved insensitive to these attacks, since it used cloud computing tools ("Anonymous: Serious Threat or Mere Annoyance", Network Security, N1, 2011).

Disadvantages

· The safety of user data depends on the companies providing the cloud computing services;

· Permanent network connection - to access the services of the "cloud" you need a permanent Internet connection. In our time, however, this is not such a large drawback, especially with the arrival of 3G and 4G cellular technologies.

· Software restrictions - there are restrictions on the software that can be deployed in the "clouds" and provided to the user. The user is limited in the software he can use and sometimes has no ability to adjust it to his own purposes.

· Confidentiality - the confidentiality of data stored on public "clouds" currently causes much dispute, but in most cases experts agree that it is not recommended to store a company's most valuable documents on a public "cloud", since no current technology guarantees 100% confidentiality of stored data. That is why the use of encryption in the cloud is essential.

· Reliability - as for the reliability of stored information, it can be said with confidence that if you have lost information stored in the "cloud", you have lost it forever.

· Safety - the "cloud" in itself is a fairly reliable system, but on penetrating it an attacker gains access to a huge data store. Another minus is the use of virtualization systems in which standard OS kernels such as Linux, Windows and others serve as the hypervisor, which allows viruses to be used against it.

· High cost of equipment - to build a company's own cloud, significant material resources must be allocated, which does not benefit newly created and small companies.

3. Reference architecture

The NIST reference architecture of cloud computing contains five main actors. Each actor plays a role and performs actions and functions. The reference architecture is presented as successive diagrams with increasing levels of detail.

Fig. 9 Conceptual Scheme of Reference Architecture

Cloud consumer - a person or organization that maintains business relationships with cloud providers and uses their services.

Cloud consumers are divided into 3 groups:

· SaaS - uses applications to automate business processes.

· PaaS - develops, tests, deploys and manages applications deployed in the cloud environment.

· IaaS - creates and manages IT infrastructure services.

Cloud provider - a person, organization or entity responsible for the availability of cloud services to cloud consumers.

· SaaS - installs, manages, maintains and provides software deployed on the cloud infrastructure.

· PaaS - provides and manages the cloud infrastructure and intermediate software. Provides development and administration tools.

· IaaS - provides and services servers, databases and computing resources. Provides the cloud structure to the consumer.

The activities of cloud providers fall into 5 main typical activities:

Service deployment:

o Private cloud - a single organization is served. The infrastructure is managed either by the organization itself or by a third party, and can be deployed either at the provider (off premise) or at the organization (on premise).

o Community cloud - the infrastructure is used jointly by several organizations with similar requirements (security, regulatory compliance).

o Public cloud - the infrastructure is used by a large number of organizations with different requirements. Off premise only.

o Hybrid cloud - the infrastructure combines different infrastructures based on similar technologies.

Service management:

o Service level - defines the basic services provided by the provider.

§ SaaS - an application used by the consumer, accessed in the cloud from special programs.

§ PaaS - containers for consumer applications, development and administration tools.

§ IaaS - computing power, databases and fundamental resources, on top of which the consumer deploys his own infrastructure.

o Level of abstraction and resource control

§ Management of the hypervisor and of the virtual components necessary to implement the infrastructure.

o Level of physical resources

§ Computer equipment

§ Engineering infrastructure

Security:

o Availability

o Privacy

o Identity

o Security monitoring and processing of incidents

o Security Policies

Privacy:

o Protection of processing, storage and transfer of personal data.

Cloud auditor - a participant who can perform an independent assessment of cloud services, information system maintenance, and the productivity and safety of the cloud implementation.

The auditor can give an independent assessment of security, privacy, performance and other aspects in accordance with the approved documents.

Fig. 10 Activity provider

Cloud broker - an entity that manages the use, performance and delivery of cloud services and establishes relations between providers and consumers.

As cloud computing develops, the integration of cloud services may become too complicated for the consumer to handle alone.

o Service intermediation - the extension of a given service and the provision of new capabilities

o Aggregation - the combination of various services for delivery to the consumer

Cloud carrier - an intermediary providing connectivity and transport (communication services) for the delivery of cloud services from providers to consumers.

Provides access through communication devices

Provides a connection level in accordance with the SLA.

Of the five actors presented, the cloud broker is optional, because cloud consumers can receive services directly from the cloud provider.

The introduction of the actors is driven by the need to define the relationships between the subjects.

4. Service Level Agreement

A Service Level Agreement (SLA) is a document describing the level of service the customer expects from the supplier, based on indicators applicable to the service, and establishing the provider's responsibility if the agreed indicators are not achieved.

Here are some of the indicators that appear, in one combination or another, in operators' documents:

ASR (Answer Seizure Ratio) - a parameter defining the quality of the telephone connection in a given direction. ASR is calculated as the percentage of calls that resulted in a telephone connection relative to the total number of calls made in the given direction.

PDD (Post Dial Delay) - a parameter defining the period of time (in seconds) that passes from the moment of the call until the telephone connection is established.

Service availability ratio - the ratio of the time the service was interrupted to the total time during which the service should have been provided.

Packet loss ratio - the ratio of lost data packets to the total number of packets transferred over the network during a certain period of time.

Transmission delay - the time interval required to transmit an information packet between two network devices.

Reliability of information transfer - the ratio of erroneously transmitted data packets to the total number of transmitted data packets.

Periods of operation, subscriber notification times and service restoration times.

In other words, service availability of 99.99% means that the operator guarantees no more than 4.3 minutes of downtime per month, 99.9% means that the service may be unavailable for 43.2 minutes, and 99% means that the break can last more than 7 hours. In some practices the concept of network availability is treated separately, with a lower value of the parameter assumed during off-peak hours. Different types of services (traffic classes) also have different indicator values. For voice, for example, delay is the most important indicator and must be minimal, while the required speed is low and some packets can be lost without loss of quality (about 1%, depending on the codec). For data transfer, speed comes first, and packet losses should tend to zero.
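The arithmetic behind these figures is simple: the allowed downtime is the complement of the availability guarantee multiplied by the length of the period. A small sketch, using a 30-day month as the text above does:

```python
def allowed_downtime_minutes(availability_percent, period_hours=30 * 24):
    """Maximum outage, in minutes per period, consistent with an availability guarantee."""
    return (1 - availability_percent / 100) * period_hours * 60

# For a 30-day (720-hour) month:
#   99.99% -> about 4.3 minutes
#   99.9%  -> about 43.2 minutes
#   99%    -> 432 minutes, i.e. 7.2 hours
```

The same formula shows why each extra "nine" in an SLA is an order-of-magnitude tighter obligation for the operator.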

5. Methods and means of protection in cloud computing

Confidentiality must be ensured along the entire chain, including the "cloud" provider, the consumer and the communications connecting them.

The provider's task is to ensure both the physical and the software protection of the data against third-party encroachments. The consumer must put in place, "on his own territory", the appropriate policies and procedures that exclude the transfer of access rights to third parties.

The task of ensuring the integrity of information when using individual "cloud" applications can be solved thanks to modern database architectures, backup systems, integrity check algorithms and other industrial solutions. But that is not all. New problems may arise when it comes to integrating several "cloud" applications from different suppliers.
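One of the industrial solutions mentioned, integrity check algorithms, can be sketched with a cryptographic digest: the consumer records a fingerprint of the data before uploading it to the cloud and recomputes it after download. A minimal example on the Python standard library:

```python
import hashlib

def fingerprint(data):
    """SHA-256 digest used as a tamper-evident fingerprint of stored data."""
    return hashlib.sha256(data).hexdigest()

def verify_integrity(data, expected_digest):
    """True if the data still matches the digest recorded before upload."""
    return hashlib.sha256(data).hexdigest() == expected_digest
```

A mismatch only detects modification; protecting the recorded digest itself (for example, with an HMAC under a key the provider does not hold) is a separate task.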

In the near future, for companies in need of a secure virtual environment the only way out will be to create a private cloud system. The fact is that private clouds, unlike public or hybrid systems, most closely resemble the virtualized infrastructures that the IT departments of large corporations have already learned to implement and over which they can maintain complete control. The shortcomings of information protection in public cloud systems represent a serious problem. Most hacking incidents occur in public clouds.

6. Safety of cloud models

The level of risk in the three cloud models differs greatly, and the ways of solving security issues also differ depending on the level of interaction. The security requirements remain the same, but the level of security control changes across the SaaS, PaaS and IaaS models. From a logical point of view nothing changes, but the possibilities of physical implementation differ radically.

Fig. 11. The most relevant information security threats

In the SaaS model, the application runs on a cloud infrastructure and is accessible through a web browser. The client does not control the network, servers, operating systems, data storage, or even some application capabilities. For this reason, in the SaaS model the main responsibility for security falls almost entirely on the suppliers.

Problem number 1 is password management. In the SaaS model, applications reside in the cloud, so the main risk is the use of multiple accounts to access them. Organizations can solve this problem by unifying accounts for cloud and local systems. With a single sign-on system, users access workstations and cloud services with one account. This approach reduces the likelihood of "orphaned" accounts open to unauthorized use after employees are dismissed.
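The effect of unified accounts can be illustrated with a minimal sketch (all class and method names here are hypothetical, not a real directory API): every service consults one central directory, so deactivating an account once blocks access everywhere at the same time.

```python
# Sketch: centralized account management. Services hold no accounts of
# their own; the central directory is the single source of truth, so one
# deprovisioning step closes access to all cloud and local services.

class Directory:
    def __init__(self):
        self._active = {}          # login -> is the account active?

    def add_user(self, login):
        self._active[login] = True

    def deactivate(self, login):   # called once when an employee leaves
        self._active[login] = False

    def is_active(self, login):
        return self._active.get(login, False)

class Service:
    def __init__(self, name, directory):
        self.name = name
        self.directory = directory

    def login(self, user):
        # no per-service password: the directory is the only authority
        return self.directory.is_active(user)

directory = Directory()
directory.add_user("alice")
crm = Service("cloud-crm", directory)
files = Service("file-storage", directory)

assert crm.login("alice") and files.login("alice")
directory.deactivate("alice")          # single deprovisioning step
assert not crm.login("alice") and not files.login("alice")
```

With per-service accounts, the same dismissal would require remembering to disable each account separately, which is exactly how "orphaned" accounts appear.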

As the CSA explains, PaaS assumes that customers build applications using programming languages and tools supported by the vendor and then deploy them on the cloud infrastructure. As in the SaaS model, the client cannot manage or control the infrastructure (network, servers, operating systems, or data storage systems) but does control the deployment of applications.

In the PaaS model, users should pay attention to application security as well as questions related to API management, such as validation of access rights, authorization, and verification.
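One widely used way to verify API calls (shown here only as an illustration, not as something prescribed by the text) is to sign each request with a shared secret, so the platform can check both the caller's identity and the integrity of the request parameters:

```python
import hmac
import hashlib

def sign_request(secret: bytes, method: str, path: str, body: str) -> str:
    """Return a hex HMAC-SHA256 signature over the canonical request."""
    message = f"{method}\n{path}\n{body}".encode()
    return hmac.new(secret, message, hashlib.sha256).hexdigest()

def verify_request(secret: bytes, method: str, path: str,
                   body: str, signature: str) -> bool:
    expected = sign_request(secret, method, path, body)
    return hmac.compare_digest(expected, signature)  # constant-time compare

secret = b"shared-api-key"   # hypothetical key issued by the platform
sig = sign_request(secret, "POST", "/v1/vms", '{"image": "debian"}')
assert verify_request(secret, "POST", "/v1/vms", '{"image": "debian"}', sig)
# any tampering with the body invalidates the signature
assert not verify_request(secret, "POST", "/v1/vms", '{"image": "ubuntu"}', sig)
```

Unlike a bare API key sent in the clear, the secret itself never travels over the network, and a replayed request cannot be silently altered.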

Problem number 1 is data encryption. The PaaS model is secure by design, but the risk lies in insufficient system performance: when exchanging data with PaaS providers, encryption is recommended, and this requires additional processor capacity. Nevertheless, in any solution the transfer of confidential user data must be carried out over an encrypted channel.

In the IaaS model, clients do not control the underlying cloud infrastructure, but they do control the operating systems, data storage, and application deployment, and may have limited control over the selection of network components.

This model has few built-in security capabilities beyond the protection of the infrastructure itself. It means that users must manage and secure the operating systems, applications, and content themselves, as a rule via the API.

Translated into the language of protection methods, this means the provider must ensure:

· Reliable access control to the infrastructure itself;

· Infrastructure fault tolerance.

At the same time, the cloud consumer takes on a lot more protection functions:

· Firewalling within the infrastructure;

· Protection against network intrusions;

· Protection of operating systems and databases (access control, vulnerabilities, security settings control);

· Protection of end applications (antivirus protection, access control).

Thus, most of the protection measures fall on the consumer's shoulders. The provider can offer model recommendations for protection, or ready-made solutions, which simplify the task for end users.
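As an illustration of the consumer's share of the work, here is a minimal sketch of checking operating-system settings against a security baseline, one of the control tasks listed above (the parameter names and required values are illustrative only, not a real benchmark):

```python
# Sketch: compare actual OS settings against a security baseline.
# The baseline entries below are hypothetical examples.

BASELINE = {
    "PermitRootLogin": "no",        # no direct root logins over SSH
    "PasswordAuthentication": "no", # keys instead of passwords
    "MaxAuthTries": "3",
}

def audit_settings(actual: dict) -> list:
    """Return (parameter, expected, actual) for every baseline violation."""
    violations = []
    for key, expected in BASELINE.items():
        value = actual.get(key, "<not set>")
        if value != expected:
            violations.append((key, expected, value))
    return violations

current = {"PermitRootLogin": "yes", "PasswordAuthentication": "no"}
for key, expected, value in audit_settings(current):
    print(f"{key}: expected {expected!r}, found {value!r}")
```

The same pattern, a declared baseline plus an automated comparison, scales from one virtual machine to a whole fleet and produces the evidence an auditor will later ask for.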

Table 1. Division of responsibility for security between the client and the service provider (P = provider, K = client)

Layer                 Own enterprise server   IaaS   PaaS   SaaS
Applications          K                       K      K      P
Data                  K                       K      K      P
Runtime environment   K                       K      P      P
Middleware            K                       K      P      P
Operating system      K                       K      P      P
Virtualization        K                       P      P      P
Servers               K                       P      P      P
Data storage          K                       P      P      P
Network hardware      K                       P      P      P



7. Security audit

The tasks of a cloud auditor differ substantially from those of an auditor of conventional systems. A security audit in the cloud is divided into an audit of the provider and an audit of the user. A user audit is performed at the user's request, while an audit of the provider is one of the most important conditions of doing business.

It consists of:

· Initiation of the audit procedure;

· Collecting audit information;

· Analysis of audit data;

· Preparation of the audit report.

At the initiation stage, the auditor's authority and the timing of the audit should be determined, and the mandatory assistance of employees to the auditor should also be agreed upon.

In general, the auditor conducts an audit to determine the reliability of:

· Virtualization systems, hypervisor;

· Servers;

· Data warehouses;

· Network equipment.

If the provider offers the IaaS model on the server being checked, this check will be sufficient to identify vulnerabilities.

When using the PaaS model, the following must additionally be checked:

· operating system,

· Middleware,

· Execution environment.

When using the SaaS model, the following are also checked for vulnerabilities:

· Data storage and processing systems,

· Applications.

A security system audit is performed using the same methods and tools as an audit of ordinary servers. But unlike an ordinary server, in cloud technologies the hypervisor is additionally checked for resilience. In cloud technologies the hypervisor is one of the core components, and its audit should therefore be given particular importance.

8. Incident investigation and forensics in cloud computing

Information security measures can be divided into preventive (for example, encryption and other access-control mechanisms) and reactive (investigations). The preventive aspect of cloud security is an area of active scientific research, while the reactive aspect receives far less attention.

Incident investigation (including the investigation of crimes in the information sphere) is a well-known section of information security. The objectives of such investigations are usually:

Proving that a crime or incident occurred

Reconstructing the events surrounding the incident

Identifying the offenders

Proving the involvement and responsibility of the offenders

Proving dishonest intent on the part of the offenders.

The need for forensic analysis of digital systems gave rise to a new discipline: computer forensics. The objectives of a computer forensic examination are, as a rule:

Recovering data that may have been deleted

Reconstructing events inside and outside the digital systems associated with the incident

Identifying the users of digital systems

Detecting the presence of viruses and other malicious software

Detecting the presence of illegal materials and programs

Cracking passwords, encryption keys, and access codes
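A routine first step in such an examination is fixing the integrity of the collected evidence with cryptographic hashes, so that it can later be proved the data was not altered during analysis. A minimal sketch:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """SHA-256 digest used to fix the state of a piece of evidence."""
    return hashlib.sha256(data).hexdigest()

# The hash is computed once at seizure time and recorded in the case file...
evidence = b"raw disk image bytes"
recorded = fingerprint(evidence)

# ...and recomputed before analysis: any change in the data changes the hash.
assert fingerprint(evidence) == recorded
assert fingerprint(evidence + b"x") != recorded
```

In practice the digest is taken over a full disk image, but the principle is the same: matching hashes demonstrate that the examined copy is identical to what was seized.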

Ideally, computer forensics serves as a kind of time machine for the investigator, able to travel to any moment in a digital device's past and provide the researcher with information about:

the people who used the device at a given moment;

the actions of users (for example, opening documents, accessing a website, printing data in a word processor, etc.);

the data stored, created, and processed by the device at a given time.

Cloud services that replace stand-alone digital devices must provide a similar level of forensic readiness. However, this requires overcoming the problems associated with resource pooling, multi-tenancy, and the elasticity of the cloud computing infrastructure. The main tool in incident investigation is the audit log.

Audit logs, designed to track the history of user logins, the execution of administrative tasks, and data changes, are an essential part of a security system. In cloud technologies, the audit log is not only an investigation tool but also a tool for calculating the cost of server usage. Although the audit log does not eliminate gaps in the protection system, it allows one to look critically at what is happening and formulate proposals for correcting the situation.

Creating archives and backups is important, but it cannot replace a formal audit log that records who did what and when. The audit log is one of the security auditor's main tools.
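The "who, when, and what" record can be as simple as one structured line per event in an append-only log; a minimal sketch (the field names are arbitrary, not any provider's format):

```python
import json
import datetime

def audit_event(log: list, who: str, action: str, target: str) -> None:
    """Append one 'who did what and when' record to the log."""
    record = {
        "when": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "who": who,
        "action": action,
        "target": target,
    }
    # In practice this goes to an append-only file or a logging service
    # the administrators themselves cannot rewrite.
    log.append(json.dumps(record))

log = []
audit_event(log, "admin1", "create-vm", "vm-042")
audit_event(log, "admin2", "delete-snapshot", "snap-007")
for line in log:
    print(line)
```

Because each record is self-contained and timestamped, the same log can serve both the investigator reconstructing an incident and the billing system counting resource operations.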

The service agreement usually specifies exactly which audit logs will be kept and provided to the user.

9. Threat model

In 2010, the CSA analyzed the main threats to cloud technologies. The result was the document "Top Threats of Cloud Computing v1.0", which describes the threat model and the intruder model most fully to date. A more complete second version of this document is currently under development.

The current document describes intruders for the three service models: SaaS, PaaS, and IaaS. Seven main attack directions were identified. For the most part, the attacks considered are those inherent in conventional, "non-cloud" servers, but the cloud infrastructure imposes its own peculiarities on them. For example, attacks on vulnerabilities in the server software are joined by attacks on the hypervisor, which is also part of that software.

Security threat No. 1

Unlawful and dishonest use of cloud technologies.

Description:

To obtain resources from an IaaS cloud provider, a user only needs a credit card. The simplicity of registration and resource allocation allows spammers, virus authors, and others to use the cloud service for criminal purposes. Previously this kind of attack was observed only in PaaS, but recent studies have shown that IaaS can also be used for DDoS attacks, hosting malicious code, building botnets, and more.

Known examples include building a botnet based on the "Zeus" Trojan, storing the code of the "InfoStealer" Trojan horse, and hosting information about various vulnerabilities in MS Office and Adobe PDF.

In addition, botnets use IaaS to manage their peers and to send spam. Because of this, some IaaS services have ended up on blacklists, and their users have been completely ignored by mail servers.

· Improving user registration procedures

· Improving credit card verification procedures and monitoring the use of payments

· Comprehensive study of network activity of users of the service

· Monitoring the main blacklists for the appearance of the cloud provider's network there.

Affected service models:

Security threat No. 2

Insecure application programming interfaces (APIs)

Description:

Cloud infrastructure providers provide users with a set of software interfaces for resource management, virtual machines or services. The safety of the entire system depends on the security of these interfaces.

Anonymous access to the interface and transmission of credentials in clear text are the main signs of insecure APIs. Limited capabilities for monitoring API usage, the absence of logging systems, and unknown interrelations between different services only increase the risk of compromise.

· Perform a cloud provider security model analysis

· Make sure that strong encryption algorithms are used.

· Ensure that reliable authentication and authorization methods are used.

· Understand the entire chain of relationships between different services.

Affected service models:

Security threat No. 3

Malicious insiders

Description:

The problem of unlawful access to information from the inside is extremely dangerous. Often no system for monitoring employee activity is implemented on the provider's side, which means an attacker can gain access to client information by abusing an official position. Since the provider does not disclose its hiring policy, the threat may come from an amateur hacker as well as from an organized criminal structure that has penetrated the provider's staff.

At the moment there are no examples of this kind of abuse.

Implementation of strict rules for equipment procurement and the use of appropriate intrusion detection systems

Regulation of rules for hiring employees in public contracts with users

Creating a transparent security system, along with publishing reports on the security audit of internal provider systems

Affected service models:

Fig. 12. Example of a malicious insider

Security threat No. 4

Cloud technology vulnerabilities

Description:

IaaS providers abstract hardware resources using virtualization systems. However, the hardware may have been designed without shared-resource operation in mind. To minimize the impact of this factor, the hypervisor controls virtual machines' access to hardware resources; yet even hypervisors can contain serious vulnerabilities whose exploitation can lead to privilege escalation or unlawful access to physical equipment.

To protect systems from such problems, mechanisms for isolating virtual environments and failure detection systems must be implemented. Users of a virtual machine should not have access to shared resources.

There are examples of potential vulnerabilities, as well as theoretical methods for bypassing isolation in virtual environments.

· Implementation of the most advanced methods for installing, configuring, and protecting virtual environments

· Use of intrusion detection systems

· Application of reliable authentication and authorization rules for administrative work

· Tightening requirements for the use of patches and updates

· Conduct timely procedures for scanning and detecting vulnerabilities.

Security threat No. 5

Loss or leakage of data

Description:

Data loss can occur for a thousand reasons. For example, deliberate destruction of an encryption key means that the encrypted information can no longer be recovered. Deletion of data or part of the data, unauthorized access to important information, modification of records, or failure of a storage medium are also examples of such situations. In a complex cloud infrastructure, the likelihood of each of these events increases due to the close interaction of components.

Incorrect application of authentication, authorization, and audit rules, incorrect use of encryption rules and methods, and equipment failures can all lead to loss or leakage of data.

· Using a reliable and secure API

· Encryption and protection of transmitted data

· Analysis of data protection model at all stages of system functioning

· Implementation of a reliable encryption key management system

· Selection and acquisition of only the most reliable storage media

· Ensuring timely data backup

Affected service models:

Security threat No. 6

Theft of personal data and unauthorized access to the service

Description:

This type of threat is not new; millions of users face it every day. The attackers' main target is the user name (login) and password. In the context of cloud systems, the theft of a password and user name increases the risk to the data stored in the provider's cloud infrastructure. The attacker also gains the opportunity to use the victim's reputation for his own activities.

· Prohibiting the sharing of accounts

· Use of two-factor authentication methods

· Implementation of proactive monitoring of unauthorized access

· Description of the safety model of the cloud provider.
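Two-factor authentication is most often implemented with one-time passwords; a minimal sketch of the HOTP/TOTP scheme (RFC 4226 / RFC 6238), on which most authenticator apps are based:

```python
import hmac
import hashlib
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """One-time password per RFC 4226: HMAC-SHA1 + dynamic truncation."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                       # dynamic truncation offset
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, step: int = 30) -> str:
    """Time-based variant (RFC 6238): the counter is the 30-second interval."""
    return hotp(secret, int(time.time()) // step)

# RFC 4226 test vectors for the key "12345678901234567890":
assert hotp(b"12345678901234567890", 0) == "755224"
assert hotp(b"12345678901234567890", 1) == "287082"
```

Even if the static password leaks, the attacker still needs the shared secret held by the token or phone, which is exactly the property this threat calls for.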

Affected service models:

Security threat No. 7

Other vulnerabilities

Description:

Using cloud technologies to run a business lets a company focus on its core activity, leaving the care of the IT infrastructure and services to the cloud provider. When advertising its service, a cloud provider seeks to show off all its capabilities while disclosing implementation details. This can be a serious threat, since knowledge of the internal infrastructure gives an attacker a chance to find an unpatched vulnerability and attack the system. To avoid such situations, cloud providers may withhold information about the internal structure of the cloud, but this approach does not build trust either, since potential users cannot assess the degree of data security. In addition, it limits the ability to find and eliminate vulnerabilities in a timely manner.

· Amazon's refusal to allow a security audit of the EC2 cloud

· A vulnerability in processing software that led to the breach of the Heartland data center's security system

· Curl data disclosure

· Full or partial disclosure of data on the system architecture and details of the installed software

· Using vulnerabilities monitoring systems.

Affected service models:

1. Legal base

According to experts, 70% of security problems in the cloud can be avoided if the contract for the provision of services is drawn up competently.

The basis for such a contract can be the "Cloud Computing Bill of Rights".

"Bill about the Rights of the Cloud" was developed back in 2008. James Urquhart. He published this material in his blog, which caused so much interest and disputes that the author periodically updates its "manuscript" in line with the realities.

Article 1 (in part): Customers own their data

· No manufacturer (or supplier) should, in the course of any interaction with customers, dispute the rights to any data uploaded, created, generated, modified, or otherwise belonging to the client.

· Manufacturers should from the outset provide minimal access to customer data when developing solutions and services.

· Customers own their data, which means that they are responsible for the fact that these data comply with legislative standards and laws.

· Since compliance with regulatory standards on data use, security, and protection is very important, the client must be able to determine the geographical location of its own data. Otherwise, manufacturers must give users full guarantees that their data will be stored in accordance with all applicable rules and regulations.

Article 2: Manufacturers and customers jointly own and manage the level of service in the system

· Manufacturers own, and must also do everything to maintain, the level of service for each client individually. All resources and efforts required to achieve the proper level of service in working with clients must be free for the client, that is, not included in the cost of the service.

· Customers, in turn, are responsible for and own the level of service they provide to their own internal and external customers. When the manufacturer's solutions are used to provide the client's own services, the client's responsibility and the level of that service should not depend entirely on the manufacturer.

· If it is necessary to integrate the manufacturer's and client's systems, manufacturers must offer customers the ability to monitor the integration process. If the client has corporate standards for the integration of information systems, the manufacturer must comply with them.

· Under no conditions should manufacturers close customer accounts for political statements, inappropriate remarks, religious comments, and the like, unless this contradicts specific legal provisions, constitutes an expression of hatred, etc.

Article 3: Manufacturers own their interfaces

· Manufacturers are not required to provide standard or open-source interfaces unless the opposite is spelled out in the client agreements. Manufacturers hold the rights to their interfaces. If the manufacturer does not consider it possible to give the client the opportunity to extend the interface in a common programming language, the client can purchase services from the manufacturer or third-party developers to extend the interfaces according to its own requirements.

· The client, however, has the right to use the purchased service for its own purposes, as well as to expand, replicate, and improve its capabilities. This clause does not exempt customers from their obligations under patent law and intellectual property rights.

The three articles above are the foundation for customers and manufacturers "in the cloud". Their full text is freely available on the Internet. Of course, this Bill is not a finished legal document, much less an official one; its articles may change and expand at any time, and the Bill may be supplemented with new articles. This is an attempt to formalize "ownership" in the cloud in order to somehow standardize this free-spirited area of knowledge and technology.

Relationship between the parties

To date, the most authoritative cloud security expert body is the Cloud Security Alliance (CSA). This organization has released, and recently updated, a guide that includes descriptions of hundreds of nuances and recommendations to take into account when assessing risks in cloud computing.

Another organization whose activities touch on cloud security aspects is the Trusted Computing Group (TCG). It is the author of several standards in this and other areas, including the widely used Trusted Storage, Trusted Network Connect (TNC), and Trusted Platform Module (TPM).

These organizations have worked out a number of questions that the customer and the provider should settle in the contract. These questions cover most of the problems that arise when using clouds: force majeure, a change of cloud service provider, and other situations.

1. Safety of stored data. How does the service provider ensure the safety of stored data?

The best protection for data located in storage is the use of encryption technologies. The provider must always encrypt client information stored on its servers to prevent unauthorized access. The provider must also irreversibly delete data when it is no longer needed and will not be required in the future.

2. Data protection during transmission. How does the provider ensures the safety of data when they are transmitted (inside the cloud and on the path from / to the cloud)?

Transmitted data should always be encrypted and accessible to the user only after authentication. This approach ensures that the data cannot be modified or read by anyone, even someone who gains access to it through unreliable nodes in the network. These technologies were developed over "thousands of person-years" and have produced reliable protocols and algorithms (for example, TLS, IPsec, and AES). Providers should use these protocols rather than invent their own.
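In Python, for instance, following this advice means relying on the standard library's TLS support with its defaults rather than hand-rolled cryptography; a minimal sketch:

```python
import ssl
import socket

# ssl.create_default_context() enables certificate validation and hostname
# checking and disables protocol versions known to be broken.
context = ssl.create_default_context()
assert context.verify_mode == ssl.CERT_REQUIRED
assert context.check_hostname

def fetch_over_tls(host: str, request: bytes, port: int = 443) -> bytes:
    """Open an authenticated, encrypted channel and send one request."""
    with socket.create_connection((host, port)) as raw:
        with context.wrap_socket(raw, server_hostname=host) as tls:
            tls.sendall(request)
            return tls.recv(4096)

# Example (requires network access):
# fetch_over_tls("example.com", b"HEAD / HTTP/1.0\r\nHost: example.com\r\n\r\n")
```

The point is that the protocol negotiation, certificate checks, and cipher choices all come from a vetted implementation; the application code never touches the cryptography itself.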

3. Authentication. How does the provider recognize the authenticity of the client?

The most common authentication method is password protection. However, providers seeking to offer their customers higher reliability resort to more powerful means, such as certificates and tokens. Along with more reliable authentication tools, providers should be able to work with standards such as LDAP and SAML. This is necessary for the provider to interact with the client's user identification system when authorizing users and determining the permissions granted to them. Thanks to this, the provider always has up-to-date information about authorized users. The worst option is when the client supplies the provider with a fixed list of authorized users: as a rule, difficulties then arise when an employee is dismissed or moved to another position.
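Even plain password authentication can be handled more or less safely on the provider's side; a minimal sketch of storing and checking passwords with a salted key-derivation function (the iteration count here is illustrative):

```python
import hashlib
import hmac
import os

def hash_password(password: str, salt: bytes = None, iterations: int = 200_000):
    """Derive a verifier from the password; only this is stored, never the password."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, iterations, digest

def check_password(password: str, salt: bytes, iterations: int, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison

salt, iters, stored = hash_password("correct horse battery staple")
assert check_password("correct horse battery staple", salt, iters, stored)
assert not check_password("guess", salt, iters, stored)
```

A per-user salt and a deliberately slow derivation make stolen verifier databases far less useful to an attacker than a table of plain or simply hashed passwords.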

4. Isolation of users. How are the data and applications of one client separated from the data and applications of other clients?

The best option is when each client uses an individual virtual machine (VM) and a virtual network. The separation between VMs, and therefore between users, is provided by the hypervisor. Virtual networks, in turn, are deployed using standard technologies such as VLAN (Virtual Local Area Network), VPLS (Virtual Private LAN Service), and VPN (Virtual Private Network).

Some providers place all customer data in a single software environment and try to isolate customers' data from one another through changes in its code. This approach is risky and unreliable. First, an attacker may find a flaw in the non-standard code that gives him access to data he should not see. Second, an error in the code may cause one client to accidentally "see" another's data. Both kinds of cases have occurred recently. Therefore, using separate virtual machines and virtual networks is the more reasonable way to separate user data.

5. Regulatory issues. How much does the provider follow the laws and rules applicable to the cloud computing?

Laws, rules, and certain special provisions may vary depending on the jurisdiction. For example, they may prohibit data export, require the use of strictly defined protection measures, mandate compatibility with certain standards, and demand the availability of audit capabilities. Ultimately, they may require that government departments and judicial authorities be able to access information when necessary. A provider's careless attitude toward these points can expose its customers to substantial expenses due to the legal consequences.

The provider must follow strict rules and adhere to a unified strategy in the legal and regulatory spheres. This concerns the security of user data, data export, compliance with standards, auditing, the retention and deletion of data, and information disclosure (the latter is especially important when information belonging to several clients can be stored on one physical server). To find out how things stand, customers are strongly advised to seek help from specialists who will study these issues thoroughly.

6. Reaction to the incident. How the provider reacts to the incidents, and how many customers can be involved in the incident?

Sometimes not everything goes according to plan, so the service provider must adhere to specific, documented rules of conduct in unforeseen circumstances. Providers must have procedures for identifying incidents and minimizing their consequences, keeping users informed of the current situation. Ideally, they should regularly supply customers with maximally detailed information on the problem. In addition, customers themselves should assess the likelihood of security problems and take the necessary measures.

10. International and domestic standards

The evolution of cloud technologies is outpacing the creation and updating of the necessary industry standards, many of which have not been revised in years. Lawmaking in the field of cloud technologies is therefore one of the most important steps toward ensuring security.

IEEE, one of the largest standards development organizations, has announced the launch of a dedicated cloud technology project, the Cloud Computing Initiative. This is the first cloud standardization initiative put forward at the international level; until now, cloud computing standards have mostly been the work of industry consortia. The initiative currently includes two projects: IEEE P2301, "Draft Guide for Cloud Portability and Interoperability Profiles", and IEEE P2302, "Draft Standard for Intercloud Interoperability and Federation".

Within the IEEE Standards Development Association, two new working groups have been created to work on the IEEE P2301 and IEEE P2302 projects, respectively. IEEE P2301 will contain profiles of existing and developing standards in the areas of applications, portability, management, and interoperability interfaces, as well as file formats and operating conventions. The information in the document will be logically structured for the various target audiences: vendors, service providers, and other interested market participants. Upon completion, the standard is expected to be used in the procurement, development, construction, and use of cloud products and services based on standard technologies.

The IEEE P2302 standard will describe the basic topology, protocols, functionality, and management methods required for the interaction of different cloud structures (for example, for a private cloud to interact with a public one such as EC2). The standard will allow suppliers of cloud products and services to extract economic benefit from economies of scale while at the same time providing transparency for users of services and applications.

ISO is preparing a special standard on cloud computing. The main focus of the new standard is solving organizational issues related to clouds. However, owing to the complexity of ISO's coordination procedures, the final version of the document is not due until 2013.

The value of the document is that not only government organizations (NIST, ENISA) but also representatives of expert communities and associations such as ISACA and CSA are involved in its preparation. Moreover, the document brings together recommendations both for cloud service providers and for their consumers, the customer organizations.

The main task of this document is to describe in detail the best practices related to using cloud computing from an information security standpoint. The standard does not concentrate only on technical aspects but rather on the organizational aspects that must not be forgotten when moving to cloud computing: the division of rights and responsibilities, the signing of agreements with third parties, the management of assets owned by the various participants in the "cloud" process, personnel management issues, and so on.

The new document largely incorporates materials developed earlier in the IT industry.

Government of Australia

After several months of brainstorming, the Government of Australia issued a series of guides on the transition to cloud computing. On February 15, 2012, these guides were posted on the blog of the Australian Government Information Management Office (AGIMO).

To help agencies migrate to the clouds, recommendations were prepared on best practices for using cloud services in light of the requirements of the Financial Management and Accountability Act 1997 (Better Practice Guides for the Financial Management and Accountability Act 1997). The guides address financial and legal problems in general terms, as well as personal data protection issues.

The guides speak of the need to continuously monitor and control the use of cloud services through daily analysis of bills and reports. This helps avoid hidden overcharges and dependence on cloud service providers.

The first guide is called "Privacy and Cloud Computing for Australian Government Agencies" (9 pp.). This document pays special attention to privacy and security issues in data storage.

In addition to this guide, the document "Negotiating the Cloud - Legal Issues in Cloud Computing Agreements" (19 pp.) was prepared to help understand the provisions included in a contract.

Finally, the third guide, "Financial Considerations for Government Use of Cloud Computing" (6 pp.), considers the financial questions an organization should weigh when deciding to use cloud computing in its business.

Beyond those covered in the guides, there are a number of other issues to resolve when using cloud computing, including questions of public administration, procurement, and business management policy.

Public discussion of this analytical document makes it possible to consider and comment on the following problems:

· Unauthorized access to secret information;

· Loss of data access;

· Inability to ensure the integrity and authenticity of data, and

· Understanding the practical aspects related to the provision of cloud services.

11. Territorial data affiliation

A number of countries have standards requiring that confidential data remain within the country. Although storing data within a given territory does not, at first glance, look difficult, cloud service providers often cannot guarantee it. In systems with a high degree of virtualization, data and virtual machines can move from one country to another for various purposes: load balancing, fault tolerance, and so on.

Some major players in the SaaS market (such as Google and Symantec) can guarantee storage in the relevant country, but these are rather exceptions; in general, such requirements are met quite rarely. Even when the data does remain in the country, customers have no way to verify it. And one should not forget about the mobility of company employees: if a specialist working in Moscow is sent to New York, it is better (or at least faster) for him to receive data from a data center in the United States. Ensuring that is already an order of magnitude harder.

12. State standards

At the moment, there is no serious regulatory framework for cloud technologies in our country, although developments in this area are already underway. Thus, Order of the President of the Russian Federation No. 146 of 08.02.2012 determines that the federal executive authorities authorized in the field of data security in information systems created using supercomputer and grid technologies are the FSB of Russia and the FSTEC of Russia.

In connection with this order, the powers of the named services were expanded. The FSB of Russia now develops and approves regulatory and methodological documents on the security of these systems, and organizes and conducts research in the field of information security.

The service also performs expert cryptographic, engineering-cryptographic and special studies of these information systems and prepares expert opinions on proposals for their creation.

The document also establishes that the FSTEC of Russia develops a strategy and determines priority areas for ensuring the security of information in information systems created using supercomputer and grid technologies that process restricted-access data, and monitors the state of the corresponding security provision.

FSTEC commissioned a study which resulted in a beta version of a "term system in the field of cloud technologies".

As one can see, this entire term system is an adapted translation of two documents: "Focus Group on Cloud Computing Technical Report" and "The NIST Definition of Cloud Computing". That these two documents are not very consistent with each other is a separate question. And it is plainly visible that in the Russian "term system" the authors simply did not provide references to these English documents.

The fact is that for such work, the concept, goals, tasks and methods of solving them must first be discussed. There are many questions and comments. The main methodological note: it is necessary to formulate very clearly what problem this study solves and what its goal is. Note at once that "creating a term system" cannot be a goal; it is a means, but a means of achieving what is not very clear.

Not to mention that a normal study should include a section "Overview of the Existing State of Affairs."

It is difficult to discuss the results of the study without knowing the initial statement of the problem and how its authors solved it.

But one fundamental error of the term system is clearly visible: it is impossible to discuss "cloud" topics in isolation from "non-cloud" ones, outside the general context of IT. Yet precisely this context is not visible in the study.

The result is that in practice such a term system will be impossible to use. It can only confuse the situation even more.

13. Protection tools in cloud technologies

In its minimum configuration, a cloud server protection system should ensure the security of the network equipment, data storage, server and hypervisor. Additionally, an antivirus can be placed in a dedicated core to prevent infection of the hypervisor through a virtual machine, along with a data encryption system for storing user information in encrypted form and means of establishing an encrypted tunnel between the virtual server and the client machine.
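The encrypted tunnel between the client machine and the virtual server is typically built on TLS. A minimal sketch of the client-side configuration, using Python's standard ssl module (the host name mentioned in the comment is a placeholder, not a real service):

```python
import ssl

# Sketch: a client-side TLS context such as a client machine might use
# for an encrypted tunnel to a virtual server. Settings are illustrative.
def make_tunnel_context():
    ctx = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # forbid legacy protocols
    ctx.check_hostname = True                     # certificate must match the host
    ctx.verify_mode = ssl.CERT_REQUIRED           # reject unverified servers
    return ctx

ctx = make_tunnel_context()
# ctx.wrap_socket(sock, server_hostname="cloud.example.com") would then
# yield the encrypted channel; "cloud.example.com" is a placeholder.
```

The important design choice here is that encryption alone is not enough: hostname checking and mandatory certificate verification are what prevent a man-in-the-middle from impersonating the virtual server.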

To do this, we need a server that supports virtualization. Solutions of this kind are offered by Cisco, Microsoft and VMware, as well as the Xen and KVM projects.

It is also permissible to use a classic server and provide virtualization on it with the help of a hypervisor.

For virtualization of operating systems, any servers with processors compatible with the x86-64 platform are suitable.

Such a solution will simplify the transition to virtualization of computing without additional financial investment in equipment upgrades.

Scheme of work:

Fig. 11. An example of a "cloud" server

Fig. 12. Server reaction to the failure of an equipment unit

At the moment, the market for cloud computing protection tools is still rather empty, and this is not surprising: in the absence of a regulatory framework and with future standards unknown, developers do not know where to concentrate their efforts.

However, even in such conditions, specialized software and hardware complexes are appearing that make it possible to protect the cloud structure from the main types of threats and to implement basic security mechanisms:

· Integrity violation

· Hacking hypervisor

· Insiders

· Identification

· Authentication

· Encryption

Accord-V

The Accord-V hardware and software system is designed to protect the virtualization infrastructure of VMware vSphere 4.1, VMware vSphere 4.0 and VMware Infrastructure 3.5.

Accord-V provides protection of all components of the virtualization environment: ESX servers and the virtual machines themselves, vCenter management servers and additional servers with VMware services (for example, VMware Consolidated Backup).

The following protection mechanisms are implemented in the Accord-V software and hardware complex:

· Step-by-step control of the integrity of the hypervisor, virtual machines, files inside virtual machines and infrastructure management servers;

· Separation of access rights of virtual infrastructure administrators and security administrators;

· Separation of user access inside virtual machines;

· Hardware identification of all users and administrators of the virtualization infrastructure.
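Step-by-step integrity control of this kind generally reduces to comparing cryptographic checksums of the controlled objects against a trusted baseline. A simplified sketch (the file names and contents are illustrative, not taken from Accord-V):

```python
import hashlib

# Sketch of step-by-step integrity control: a reference checksum is taken
# for each controlled object (hypervisor binary, VM config, etc.), and any
# later mismatch is reported. Object names and contents are illustrative.

def sha256(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def take_baseline(objects: dict) -> dict:
    """objects maps an object name to its current contents (bytes)."""
    return {name: sha256(data) for name, data in objects.items()}

def check_integrity(objects: dict, baseline: dict) -> list:
    """Return the names whose contents no longer match the baseline."""
    return [name for name, data in objects.items()
            if sha256(data) != baseline.get(name)]

objects = {"esx/hypervisor.bin": b"\x7fELF...", "vm01/config.vmx": b"memsize=2048"}
baseline = take_baseline(objects)
objects["vm01/config.vmx"] = b"memsize=4096"   # an unauthorized change
print(check_integrity(objects, baseline))       # ['vm01/config.vmx']
```

A real product would store the baseline in tamper-resistant hardware and check it before the hypervisor boots; the comparison logic, however, is exactly this.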

· Information on the availability of certificates:

FSTEC of Russia certificate of conformity No. 2598 of March 20, 2012 certifies that the Accord-V software and hardware complex for protecting information from unauthorized access is a software and technical means of protecting information that does not constitute a state secret from unauthorized access; that it complies with the requirements of the guidance documents "Computing facilities. Protection against unauthorized access to information. Indicators of protection against unauthorized access to information" (State Technical Commission of Russia, 1992) for security class 5 and "Protection against unauthorized access to information. Part 1. Information security software. Classification by the level of control of non-declared capabilities" (State Technical Commission of Russia, 1999) for control level 4, as well as with the technical specifications TU 4012-028-11443195-2010; and that it can be used when creating automated systems up to security class 1G inclusive and to protect information in personal data information systems up to class 1 inclusive.

vGate R2

vGate R2 is a certified means of protecting information from unauthorized access and of monitoring the enforcement of information security policy for virtual infrastructures based on VMware vSphere 4 and VMware vSphere 5. The R2 version of the product is applicable to the protection of information in virtual infrastructures of state-owned companies whose systems are subject to requirements for the use of information protection tools with a high certification level.

It allows you to automate the work of administrators on configuring and operating the security system.

It helps counter errors and abuse in managing the virtual infrastructure.

It allows bringing the virtual infrastructure into compliance with legislation, industry standards and best world practices.


Fig. 13. Claimed features of vGate R2

Thus, summing up, here are the main tools with which vGate R2 protects the service provider's data center from internal threats emanating from its own administrators:

· Organizational and technical separation of the authority of vSphere administrators;

· Allocation of a separate information security administrator role, which manages the security of data center resources based on vSphere;

· Separation of the cloud into security zones, within which administrators operate with the corresponding level of authority;

· Monitoring the integrity of virtual machines

· The ability at any time to receive a report on the security of the vSphere infrastructure and to audit information security events.

In principle, this is almost everything needed to protect the infrastructure of a virtual data center from internal threats, from the point of view of the virtual infrastructure itself. Of course, protection will also be needed at the level of equipment, applications and the guest OS, but that is a separate problem, which is also solved by products of the Security Code company.

Fig. 14. Server structure.

To ensure the security of such an object, measures must be taken according to Table 2.

To do this, I suggest using the vGate R2 software product. It will solve such tasks as:

· Enhanced authentication of virtual infrastructure administrators and information security administrators;

· Protection of virtual infrastructure management tools from unauthorized access;

· Protection of ESX servers from unauthorized access;

· Mandatory access control;

· Monitoring of the integrity of virtual machine configuration and trusted boot;

· Control of administrators' access to virtual machines;

· Registration of information security events.

· Monitoring of the integrity of the protection system's components and their protection from unauthorized access;

· Centralized management and monitoring.
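The "mandatory access control" task from this list can be illustrated with a toy label model: a subject's clearance is compared with a virtual machine's confidentiality label on every access. The levels and rules below are a simplified Bell-LaPadula-style sketch, not the actual vGate R2 mechanism:

```python
# Sketch of mandatory access control between subjects and virtual
# machines: reads require sufficient clearance ("no read up"), writes to
# a less protected VM are forbidden ("no write down"). Levels are
# illustrative, not taken from any real product.

LEVELS = {"public": 0, "confidential": 1, "secret": 2}

def may_read(subject_clearance: str, vm_label: str) -> bool:
    # A subject may read a VM only at or below its own clearance level.
    return LEVELS[subject_clearance] >= LEVELS[vm_label]

def may_write(subject_clearance: str, vm_label: str) -> bool:
    # A subject must not write secrets down to a less protected VM.
    return LEVELS[subject_clearance] <= LEVELS[vm_label]

print(may_read("secret", "confidential"))   # True
print(may_write("secret", "confidential"))  # False
```

The point of a mandatory (as opposed to discretionary) model is exactly this: the rules are enforced by labels the administrator of a VM cannot override.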

Table 2. Compliance with security requirements for the PaaS model

The FSTEC of Russia certificate (SVT 5, NDV 4) allows the product to be applied in automated systems up to security class 1G inclusive and in personal data information systems up to class K1 inclusive. The cost of this solution will be 24,500 rubles per physical processor on a protected host.

In addition, to protect against insiders, a security alarm system is required. Such solutions are richly represented in the server protection market. The price of such a solution, with restriction of access to the controlled zone, an alarm system and video surveillance, starts at 200,000 rubles.

For the calculation, we will take the amount of 250,000 rubles.

To protect virtual machines from viral infections, McAfee Total Protection for Virtualization will be run on one server core. The cost of the solution is from 42,200 rubles.

Symantec NetBackup will be used to prevent data loss on the storage facilities. It allows you to reliably back up information and system images.

The final cost of implementing such a project will be:

A similar design solution based on Microsoft products can be downloaded here: http://www.microsoft.com/en-us/download/confirmation.aspx?id=2494.

Conclusion

"Cloud technologies" are one of the most actively developing areas of the IT market at present. If the growth rate of the technology does not decrease, then by 2015 it will bring more than 170 million euros a year into the treasuries of European countries. In our country, cloud technologies are still treated warily, partly because of the conservatism of management views, partly because of distrust of their security. But this type of technology, with all its advantages and disadvantages, is the new locomotive of IT progress.

To an application "on the other side of the cloud" it makes absolutely no difference whether you form your request on a computer with an x86 processor from Intel, AMD or VIA, or do it on a phone or smartphone based on a Freescale, OMAP or Tegra processor. Moreover, it will make no difference to it whether you are running Google Chrome OS, Android, Intel Moblin, Windows CE, Windows Mobile, Windows XP/Vista/7, or use something even more exotic for this purpose. The main thing is that the request is composed competently and understandably, and that your system is able to "digest" the response received.

The issue of security is one of the main ones in cloud computing, and its solution will make it possible to qualitatively raise the level of services in the computing sphere. However, much remains to be done in this direction.

In our country, it is worth starting with a unified dictionary of terms for the entire IT field, developing standards based on international experience, and establishing requirements for protection systems.

Literature

1. Financial Considerations for Government Use of Cloud Computing - Australian Government, 2010.

2. Privacy and Cloud Computing for Australian Government Agencies 2007.

3. Negotiating the Cloud - Legal Issues in Cloud Computing Agreements, 2009.

4. Magazine "Modern Science: Actual Problems of Theory and Practice", 2012.


Vitaly Robertovich Grigoriev (1), Candidate of Technical Sciences, Associate Professor; Vladimir Sergeevich Kuznetsov (2)

Problems of identifying vulnerabilities in the cloud computing model

The article provides an overview of approaches to constructing a conceptual model of cloud computing, as well as a comparison of existing views on identifying the vulnerabilities inherent in systems based on this model. Keywords: cloud computing, vulnerability, core of threats, virtualization.

The purpose of this article is to review the approaches to building a conceptual cloud computing model given in the document "NIST Cloud Computing Reference Architecture", and to compare the views of the leading organizations in this area, as well as of the main players in the market for creating cloud systems, on the vulnerabilities of this computing model.

Cloud computing is a model that provides convenient on-demand network access to a shared pool of configurable computing resources (networks, servers, data storage, applications and services) that can be rapidly provisioned with minimal management effort and interaction with the service provider. This definition of the National Institute of Standards and Technology (NIST) is widely accepted throughout the industry. The definition of cloud computing includes five main basic characteristics, three service models and four deployment models.

Five basic characteristics

Self-service on demand

Users are able to receive, monitor and manage computing resources without the help of system administrators.

Broad network access - computing services are provided over standard networks and to heterogeneous devices.

Operational elasticity - IT resources can be promptly scaled in any direction as needed.

Resource pooling - IT resources are shared among various applications and users in a decoupled fashion.

Metered service - the use of an IT resource is tracked for each application and user, as a rule, to provide billing for a public cloud and internal cost accounting for the use of private clouds.
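This metered-service characteristic amounts to multiplying tracked consumption by a tariff. A toy sketch (the tariffs and usage figures are invented for the example):

```python
# Sketch of per-user metering for the "metered service" characteristic.
# Tariff rates and usage amounts are illustrative assumptions.

TARIFF = {"cpu_hour": 2.0, "gb_stored": 0.1, "gb_transferred": 0.05}

def bill(usage: dict) -> float:
    """usage maps a metered resource to the consumed amount;
    the bill is the tariff-weighted sum, rounded to cents."""
    return round(sum(TARIFF[res] * amount for res, amount in usage.items()), 2)

# One tenant's monthly consumption:
usage = {"cpu_hour": 300, "gb_stored": 50, "gb_transferred": 120}
print(bill(usage))  # 611.0
```

In a public cloud this figure becomes an invoice; in a private cloud the same calculation is used for internal cost allocation between departments.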

Three service models

Software as a Service (SaaS) - applications are usually provided to end users as a service through a web browser. There are hundreds of SaaS offerings today, from horizontal enterprise applications to specialized offerings for individual industries, as well as consumer applications such as email.

Platform as a Service (PaaS) - an application development and deployment platform is provided to developers for creating, deploying and managing SaaS applications. The platform usually includes databases, middleware and development tools, all provided as a service via the Internet. PaaS is often focused on a programming language or API, such as Java or Python. A virtualized cluster architecture of distributed computing often serves as the base, since the grid structure of the network resource provides the necessary elastic scalability and pooling of resources.

Infrastructure as a Service (IaaS) - servers, data storage and network hardware are provided as a service. This infrastructure equipment is often virtualized, so virtualization, management and operating system software are also elements of IaaS.

(1) MSTU MIREA, Associate Professor, Department of Information Security;

(2) Moscow State Technical University of Radio Engineering, Electronics and Automation (MSTU MIREA), student.

Four deployment models

Private clouds are intended for the exclusive use of one organization and are usually controlled, managed and hosted by private data centers. The hosting and management of a private cloud can be outsourced to an external service provider, but the cloud remains in the exclusive use of one organization.

Public clouds are used by many organizations (users) together and are serviced and managed by external service providers.

Community clouds are used by a group of related organizations wishing to use a common cloud computing environment. For example, such a group might consist of various branches of the armed forces, all universities of a given region, or all suppliers of a large manufacturer.

Hybrid clouds appear when an organization uses both a private and a public cloud for the same application in order to take advantage of both. For example, in a "burst" scenario the organization uses the private cloud under normal application load, and when the load peaks, for example at the end of the quarter or during the holiday season, it draws on the potential of the public cloud, subsequently returning those resources to the common pool when they are no longer needed.
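The burst scenario can be sketched as a trivial placement rule: fill the private cloud first, overflow to the public one. The capacity figure below is an invented assumption:

```python
# Sketch of the hybrid-cloud "burst" scenario: requests are served from
# the private cloud until its capacity is exhausted, and the overflow
# goes to the public cloud. The capacity value is illustrative.

PRIVATE_CAPACITY = 100  # concurrent requests the private cloud can serve

def place_load(total_requests: int) -> dict:
    private = min(total_requests, PRIVATE_CAPACITY)
    public = total_requests - private
    return {"private": private, "public": public}

print(place_load(60))   # normal load: fits entirely in the private cloud
print(place_load(250))  # peak load: 150 requests burst to the public cloud
```

Real schedulers add hysteresis and cost thresholds, but the asymmetry is the same: the private cloud is the preferred, fixed-cost resource, and the public cloud absorbs only the peaks.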

Fig. 1 presents a conceptual cloud computing model according to the document "NIST Cloud Computing Reference Architecture". According to the model shown in Fig. 1, the standard identifies the main participants of a cloud system: the cloud consumer, cloud provider, cloud auditor, cloud broker and cloud carrier. Each participant is a person or organization performing its own functions in implementing or providing cloud computing. A cloud consumer is a person or organization that maintains business interaction with other

[Figure 1 (diagram): the main participants of the cloud system - the cloud consumer; the cloud auditor (security audit, configuration audit, audit of provided services); the cloud provider, with its service levels (user level; software as a service / platform as a service / infrastructure as a service; abstraction level; physical level), cloud service support and configuration, and portability; the cloud broker; and the cloud carrier.]

Fig. 1. Conceptual model developed by NIST specialists

participants of the cloud network and uses the services of cloud providers. A cloud provider is a person, organization or other entity responsible for the availability of the services provided to interested consumers. A cloud auditor is a participant who can conduct independent assessments of cloud services, operations and the security of a cloud implementation. A cloud broker is a participant who manages the use, performance and delivery of cloud services to the consumer, and also coordinates the interaction between cloud providers and cloud consumers. A cloud carrier is an intermediary that provides communication and delivery of cloud services between cloud providers and cloud consumers.

Advantages and problems of cloud computing

Recent surveys of IT specialists show that cloud computing offers two main advantages in organizing distributed services - speed and cost. Thanks to autonomous access to the pool of computing resources, users can join the processes that interest them in a matter of minutes, rather than after weeks or months, as before. Computing capacity can also be changed quickly thanks to the elastically scalable grid architecture of the computing environment. Since in cloud computing users pay only for what they use, and the possibilities of scaling and automation reach a high level, the cost-to-efficiency ratio of the services provided is also a very attractive factor for all participants in the exchange processes.

The same surveys show that there are a number of serious considerations holding some companies back from moving to the cloud. Among these considerations, cloud computing security issues lead by a large margin.

For an adequate assessment of security in cloud systems, it makes sense to examine the views on threats in this area held by the main market players. We will compare the threats to cloud systems presented in the NIST Cloud Computing Standards Roadmap with the approaches offered by IBM, Oracle and VMware.

The cloud computing security standard adopted by the National Institute of Standards and Technology (NIST Cloud Computing Standards Roadmap) covers the possible types of attacks on cloud computing services:

♦ compromising the confidentiality and availability of data transmitted to cloud providers;

♦ attacks that exploit the structural characteristics and capabilities of the cloud computing environment to amplify attacks and increase the damage from them;

♦ unauthorized access by a consumer (through incorrect authentication or authorization, or vulnerabilities introduced through periodic maintenance) to the software, data and resources used by an authorized consumer of the cloud service;

♦ an increase in the level of network attacks, such as DoS, exploiting software whose development did not take into account the threat model for distributed Internet resources, as well as vulnerabilities in resources that were previously accessible only from private networks;

♦ limited data encryption capabilities in an environment with a large number of participants;

♦ portability problems arising from the use of non-standard APIs, which make it difficult for a cloud consumer to move to a new cloud provider when availability requirements are not met;

♦ attacks exploiting the physical abstraction of cloud resources and flaws in audit records and procedures;

♦ Attacks on virtual machines that were not appropriately updated;

♦ attacks exploiting inconsistencies in global and private security policies.

The standard also identifies the main security tasks for cloud computing:

♦ protection of user data from unauthorized access, disclosure, modification or viewing; this implies support for an identification service such that the consumer can enforce identification and access control policies for the authorized users having access to cloud services; this approach also implies the ability to make one's data selectively available to other users;

♦ protection against supply chain threats; this includes confirming the degree of trust and reliability of the service provider to the same extent as the degree of trust in the software and hardware used;

♦ prevention of unauthorized access to cloud computing resources; this includes creating protected domains that are logically separated from resources (for example, the logical separation of workloads running on the same physical server by means of a hypervisor in a multi-tenant environment) and the use of secure configurations;

♦ development of web applications deployed in the cloud with account taken of the threat model for distributed Internet resources, and the embedding of security functions into the software development process;

♦ protection of web browsers from attacks in order to mitigate weaknesses in end-user security; this includes taking measures to protect the Internet connection of personal computers based on the use of secure software, firewalls and the periodic installation of updates;

♦ deployment of access control and intrusion detection technologies by the cloud provider, with independent evaluation to verify their presence; this includes (but is not limited to) traditional perimeter security measures combined with the domain security model; traditional perimeter security includes restricting physical access to the network and devices, protecting individual components from exploitation by deploying updates, setting most security settings by default, disabling all unused ports and services, using role-based access control, monitoring audit records, minimizing the privileges used, and using antivirus packages and connection encryption;

♦ establishment of trusted boundaries between service providers and consumers in order to make clear the limits of authorized responsibility for providing security;

♦ support for portability, so that the consumer can change cloud providers when necessary to meet requirements for integrity, availability and confidentiality; this includes the ability to close an account at any given moment and to copy data from one service provider to another.

Thus, the cloud computing security standard (NIST Cloud Computing Standards Roadmap) adopted by NIST defines a basic list of attacks on cloud systems and a list of the main tasks that should be solved by applying appropriate measures.

Let us formulate the information security threats for a cloud system:

♦ U1 - threats to data (compromise, availability, etc.);

♦ U2 - threats generated by the structural characteristics and capabilities of the architecture implementing distributed computing;

♦ U4 - threats associated with an incorrect threat model;

♦ U5 - threats associated with incorrect use of encryption (encryption must be used in an environment where there are several data streams);

♦ U6 - threats associated with the use of non-standard APIs during development;

♦ U7 - virtualization threats;

♦ U8 - threats exploiting inconsistencies in global security policies.

The view of cloud computing security issues adopted at IBM

The document "Cloud Security Guidance IBM Recommendations for the Implementation of Cloud Security" allows us to draw conclusions about the view of security formed by IBM specialists. On the basis of this document, we can expand the previously proposed list of threats, namely:

♦ U9 - threats associated with third-party access to physical resources/systems;

♦ U10 - threats associated with incorrect disposal (life cycle) of personal information;

♦ U11 - threats associated with violations of regional, national and international laws applying to the information being processed.

The approaches of IBM, Oracle and VMware to cloud computing security

The documentation provided by these companies describing their views on security in their systems does not reveal threats fundamentally different from those listed above.

Table 1 presents the main classes of vulnerabilities formulated by the companies for their products. Table 1 shows the absence of complete coverage of threats by any of the studied companies and allows us to formulate the "core of threats" addressed by the companies in their cloud systems:

♦ threat of data;

♦ Threats based on the structure of distributed computing capabilities;

♦ Threats associated with an incorrect threat model;

♦ virtualization threats.

Conclusion

An overview of the main classes of vulnerabilities of cloud platforms allows us to conclude that currently there are no ready-made solutions for full-fledged cloud protection, owing to the variety of attacks that use these vulnerabilities.

It should be noted that the constructed table of vulnerability classes (Table 1), which integrates the approaches of the leading players in this industry, is not exhausted by the threats presented in it.

Table 1. Classes of vulnerabilities declared by the sources

Source       U1  U2  U3  U4  U5  U6  U7  U8  U9  U10 U11
NIST         +   +   +   +   +   +   +   +   -   -   -
IBM          +   +   +   +   +   -   +   -   +   +   +
Sun/Oracle   +   +   +   +   -   -   +   -   -   +   -
VMware       +   +   +   +   -   -   +   -   -   -   -

For example, it does not reflect the threats associated with the erasure of boundaries between environments with different levels of data confidentiality, nor the erasure of the boundaries of responsibility for information security between the service consumer and the cloud provider.
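The "core of threats" read off Table 1 can be checked mechanically: it is the intersection of the threat sets declared by all four sources (U3, which the table marks but the text never defines, is excluded from the comparison):

```python
# Sketch reproducing the "core of threats" from Table 1: the core is the
# set of threat classes declared by every source. U3 is marked in the
# table but not defined in the text, so it is excluded here.

DECLARED = {
    "NIST":       {"U1", "U2", "U3", "U4", "U5", "U6", "U7", "U8"},
    "IBM":        {"U1", "U2", "U3", "U4", "U5", "U7", "U9", "U10", "U11"},
    "Sun/Oracle": {"U1", "U2", "U3", "U4", "U7", "U10"},
    "VMware":     {"U1", "U2", "U3", "U4", "U7"},
}

core = set.intersection(*DECLARED.values()) - {"U3"}
print(sorted(core))  # ['U1', 'U2', 'U4', 'U7']
```

The result matches the four classes named above: threats to data (U1), threats based on the structure of distributed computing (U2), threats associated with an incorrect threat model (U4) and virtualization threats (U7).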

It becomes obvious that in order to implement a complex cloud system, protection must be developed for a specific implementation. An important obstacle to the implementation of secure computing in virtual environments is the absence of FSTEC and FSB standards for cloud systems. It makes sense to use the "core of threats" identified in this work when studying the problem of building a unified model of vulnerability classes. This article is of an overview character; in the future it is planned to analyze in detail the threat classes associated with virtualization and to develop approaches to creating a protection system potentially capable of preventing these threats.

Literature

1. Cloud Security Guidance IBM Recommendations for the Implementation of Cloud Security, IBM.com/redbooks, November 2, 2009.

2. http://www.vmware.com/technical-resources/security/index.html.

3. NIST Cloud Computing Reference Architecture, National Institute of Standards and Technology, Special Publication 500-292, September 2011.

4. NIST Cloud Computing Standards Roadmap, National Institute of Standards and Technology, Special Publication 500-291, July 2011.

5. http://www.oracle.com/technetwork/indexes/documentation/index.html.