Cloud computing is the provision of computing as a service rather than a physical product. Resources, such as applications and data, are provided to other computers and users as a utility over the internet. Cloud computing offers computation, software applications, and data storage resources without the user knowing the location and details of the physical computing infrastructure.
Users normally access cloud-based applications with a web browser or mobile app, and the associated software and data are stored on servers at a remote location. Cloud applications aim to match the responsiveness of software installed locally on the user's computer, although this is not always achievable.
The foundation of cloud computing is infrastructure convergence and shared services. The cloud computing provider manages the physical infrastructure in the data centre and will move resources between customers according to demand. This can allow customers to release their applications more quickly because they do not have to build and maintain physical infrastructure. It also allows immediate adjustment of the quantity of servers, storage, and bandwidth available to an application to meet peaks in demand.
Cloud computing shares characteristics with utility computing, where a package of computing resources, such as computation and storage, is provided as a metered service similar to a traditional public utility, such as electricity.
Cloud computing has the following characteristics:
- Users of computing resources can re-provision resources without having to be involved with physical infrastructure or constrained by a centralized IT department managing the hardware. This can give improved agility.
- An Application Programming Interface (API) enables machines to interact with cloud services in the same way humans interact with them through a user interface. Cloud computing services usually offer a REST-based API.
- Costs may be reduced and capital expenditure converted to operational expenditure, which can lower barriers to entry. The infrastructure is provided by a third party and does not need to be purchased to cover peak demand. Pricing is fine-grained with usage-based options, and fewer in-house IT skills are required.
- Device and location independence are enabled for users because they usually access the systems using a web browser on their PC or mobile phone. Users can connect from anywhere because the infrastructure is accessed via the internet.
- Virtualization technology allows servers and storage devices to be shared and utilization increased. Applications can be transparently migrated from one physical server to another.
- Multi-tenancy shares physical resources and costs across a pool of users. This may allow infrastructure to be sited in a location with lower costs, such as cooling and electricity, and can improve utilization and efficiency for systems that are often only 10-20% utilized.
- Reliability can be improved with multiple redundant sites, which makes cloud computing suitable for business continuity and disaster recovery.
- Scalability and elasticity via dynamic on-demand provision of resources on a fine-grained, near real-time self-service basis.
- Performance is monitored, and consistent and loosely coupled architectures are constructed using web services as the system interface.
- Security can be better than in traditional systems, because providers are able to devote more resources to solving shared security issues than individual customers can afford. However, the complexity of security is increased by putting data in more locations and devices, and into multi-tenant systems shared with unrelated users. In addition, security audit logs may be unavailable or inaccessible to the end-user.
- Maintenance of cloud computing applications is easier, because they do not need to be installed on each user's computer.
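The REST-based APIs mentioned above follow a common convention: HTTP verbs map onto create, read, and delete operations against resource collections. The sketch below illustrates that convention with a hypothetical /instances resource handled in memory; the endpoint name, fields, and status codes are illustrative, not any particular provider's API.

```python
# Minimal sketch of the REST convention used by cloud APIs:
# POST creates a resource, GET reads it, DELETE removes it.
# The "instances" collection here is a hypothetical example.

instances = {}   # in-memory stand-in for a provider's instance store
next_id = [1]

def handle(method, path, body=None):
    """Dispatch a (method, path) pair the way a REST API would."""
    parts = path.strip("/").split("/")
    if parts[0] != "instances":
        return 404, None
    if method == "POST" and len(parts) == 1:            # create
        iid = str(next_id[0]); next_id[0] += 1
        instances[iid] = body
        return 201, iid                                  # 201 Created
    if method == "GET" and len(parts) == 2:             # read
        iid = parts[1]
        return (200, instances[iid]) if iid in instances else (404, None)
    if method == "DELETE" and len(parts) == 2:          # delete
        return 204, instances.pop(parts[1], None)        # 204 No Content
    return 405, None                                     # method not allowed

status, iid = handle("POST", "/instances", {"size": "small"})
status2, data = handle("GET", "/instances/" + iid)
```

In a real cloud API the same verb-to-operation mapping is carried over HTTP to the provider's endpoint, with authentication headers added to each request.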
The term "cloud" originated from the cloud drawn to represent the telephone network. The cloud symbol denoted the demarcation between the responsibilities of the telecoms provider and the user. It later depicted the Internet in computer network diagrams as an abstraction of the underlying infrastructure.
The availability of high capacity networks, cheap computers and storage devices and the widespread adoption of virtualisation have led to a rapid growth in cloud computing. Details are hidden from end-users, who no longer need expertise in the physical infrastructure that supports them.
The concept of cloud computing dates back to the 1960s, when John McCarthy said that "computation may someday be organized as a public utility." Almost all characteristics of cloud computing, and the comparison to the electricity industry, were explored in Douglas Parkhill's 1966 book, The Challenge of the Computer Utility. Others have said that cloud computing's roots go back to the 1950s when scientist Herb Grosch postulated that the entire world would operate on dumb terminals powered by about 15 large data centers.
Amazon played a key role in the development of cloud computing. Like other companies, their data centers were sometimes using only 10% of their total capacity, to allow room for occasional spikes. They found that redeveloping them to a cloud architecture gave significant internal efficiency improvements. As a result they developed Amazon Web Services (AWS) to provide cloud computing to external customers. This was launched in 2006.
In 2008 Eucalyptus became the first open-source, AWS API-compatible platform for private clouds, and OpenNebula became the first open-source software for deploying private and hybrid clouds.
By mid-2008, Gartner observed that "organisations are switching from company-owned hardware and software assets to per-use service-based models" so that the "projected shift to cloud computing ... will result in dramatic growth in IT products in some areas and significant reductions in other areas."
Cloud computing providers offer three types of service:
- Software as a Service (SaaS).
- Platform as a Service (PaaS).
- Infrastructure as a Service (IaaS).
Infrastructure as a Service is the lowest layer, then Platform as a Service, and Software as a Service at the top. The higher layers abstract details of those beneath.
Infrastructure as a Service (IaaS)
In this lowest-level cloud service, cloud providers offer computers (as physical or, more often, virtual machines), raw block storage, firewalls, load balancers, and networks. IaaS providers supply these resources on demand from large pools installed in data centers. Internet connectivity, dedicated virtual private networks, local area networks and IP addresses are also offered.
To deploy an application the cloud user installs an operating system image on the machine and then their application software. With IaaS the user is responsible for patching and maintaining the operating systems and application software. Cloud providers typically bill IaaS services on a utility computing basis, and the cost reflects the resources allocated and consumed.
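The utility-style billing described above can be sketched as a simple metered calculation: a charge for allocated capacity plus per-unit charges for consumption. All rates below are invented for illustration; real IaaS prices vary by provider, region, and instance type.

```python
# Hypothetical utility-billing sketch for IaaS. The rates are
# illustrative assumptions, not any real provider's price list.

RATES = {
    "vm_hour": 0.10,           # per VM-hour allocated
    "storage_gb_month": 0.05,  # per GB stored for the month
    "egress_gb": 0.09,         # per GB transferred out
}

def monthly_bill(vm_hours, storage_gb, egress_gb):
    """Combine allocation-based and consumption-based charges."""
    return round(
        vm_hours * RATES["vm_hour"]
        + storage_gb * RATES["storage_gb_month"]
        + egress_gb * RATES["egress_gb"],
        2,
    )

# e.g. one VM running all month (720 h), 100 GB stored, 50 GB egress
bill = monthly_bill(720, 100, 50)
```

The point of the model is that the bill tracks what was actually allocated and used, rather than a fixed fee for owned hardware.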
Platform as a Service (PaaS)
With PaaS the cloud provider delivers a computing platform with software which normally includes the operating system, programming language execution environment, database, and web server. Application developers can develop and run their software on this cloud platform without the complexity of buying and managing the underlying hardware and software layers. With some PaaS the compute and storage resources are scaled automatically to match demand and the cloud user does not have to allocate resources manually.
Software as a Service (SaaS)
With SaaS the cloud providers install and operate application software in the cloud, and the cloud users access the application from cloud clients. The users do not manage the cloud infrastructure or the platform on which the application is running. This eliminates the need to install and run the application, which simplifies maintenance and support. A cloud application can also provide elasticity, normally by cloning tasks onto multiple virtual machines at run-time, to meet changing demand. Load balancers distribute the work over the set of virtual machines. This process is transparent to the user. Cloud applications are often multi-tenant: a single physical machine serves more than one customer.
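The run-time elasticity described above can be sketched as a pool that clones a new (simulated) virtual machine whenever the existing ones reach capacity, while a balancer steers each task to the least-loaded machine. The class name and the capacity threshold are illustrative assumptions, not a real provider's mechanism.

```python
# Sketch of SaaS elasticity: clone VMs on demand and balance load.
# "VMs" here are just lists of assigned tasks; the 3-tasks-per-VM
# capacity is an arbitrary illustrative threshold.

class ElasticPool:
    def __init__(self, max_load_per_vm=3):
        self.vms = [[]]                  # start with a single VM
        self.max_load = max_load_per_vm

    def submit(self, task):
        # elasticity: clone a new VM when every existing one is full
        if all(len(vm) >= self.max_load for vm in self.vms):
            self.vms.append([])
        # load balancing: send the task to the least-loaded VM
        target = min(self.vms, key=len)
        target.append(task)

pool = ElasticPool()
for t in range(7):
    pool.submit(t)
# seven tasks at a capacity of three per VM force the pool to scale
```

All of this is transparent to the user, who only sees the application respond at the same speed under changing demand.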
The pricing model for SaaS applications is typically a monthly or yearly flat fee per user.
It is common to refer to special types of cloud-based application software with a similar naming convention: desktop as a service, business process as a service, test environment as a service, and communication as a service.
Users access cloud computing services using networked client devices, such as desktop computers, laptops, tablets and smartphones. Some of these devices, termed cloud clients, rely on cloud computing for most of their applications, and are essentially useless without it. Examples are thin clients and the browser-based Chromebook.
Many cloud applications do not require specific software on the client and instead use a web browser to interact with the cloud application. With Ajax and HTML5 a web user-interface can sometimes achieve a similar look and feel to a native application. Some cloud applications require specific client software, e.g. virtual desktop and email clients. Legacy applications (line-of-business applications that until now have been prevalent in Windows computing) may be delivered via a screen-sharing technology such as Remote Desktop.
A Public cloud makes applications, storage, and other resources available by a service provider to any customers who require them. Public cloud services may be free or offered on a pay-per-usage model. Access is normally only through the internet.
A Community Cloud shares infrastructure between several organizations from a specific community with common concerns, such as security, compliance, or jurisdiction. It may be managed and hosted internally, or externally by a third party. The costs are spread over fewer users than a public cloud, so only some of the cost savings of public cloud computing are likely to be realized.
A Private cloud is operated for a single organization. It may be managed and hosted internally, or externally by a third party. The cloud has to be sized for a single customer's peak demand, which reduces the benefits and makes it more expensive.
A Hybrid cloud is composed of two or more clouds (private, community, or public) that remain unique entities but are bound together, offering the benefits of multiple deployment models.
Private Cloud Rentals
Private Cloud Rentals are an option when security is a concern. Companies can use the Hybrid Cloud model to replace obsolete data center equipment by temporarily hosting the applications on a rented Private Cloud. If moving important company data off-site to a Public Cloud is not an option, renting a modular data center can be considered. Using Virtual Machine concepts, the live applications can be moved from the existing data center to the leased equipment without disrupting the users. The obsolete data center equipment can be removed and replaced with new hardware, and then the applications can be moved from the leased equipment onto the new hardware. The leased equipment is then returned, or kept on site as a backup or to cover increased demand.
Cloud computing sample architecture
Cloud architecture, the systems architecture of the software systems involved in the delivery of cloud computing, typically involves multiple cloud components communicating with each other over a loose coupling mechanism such as a messaging queue. Elastic provision requires intelligent use of tight or loose coupling for mechanisms such as these.
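The loose coupling via a messaging queue described above can be sketched with two components that never call each other directly: each knows only the queue. A real cloud deployment would use a network message broker rather than an in-process queue, so this only illustrates the shape of the pattern; the task names are invented.

```python
# Sketch of loose coupling through a messaging queue: the producer
# and consumer share no direct references, only the queue between
# them, so either side can be replaced or scaled independently.

import queue

def producer(q, jobs):
    for job in jobs:
        q.put(job)       # producer knows only the queue, not the consumer
    q.put(None)          # sentinel: no more work

def consumer(q):
    results = []
    while (job := q.get()) is not None:
        results.append(job.upper())   # stand-in for real processing
    return results

q = queue.Queue()
producer(q, ["resize-image", "send-email"])   # hypothetical task names
done = consumer(q)
```

Because the queue buffers messages, the producer and consumer can also run at different rates, which is what makes elastic provisioning of either side practical.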
The Intercloud is an interconnected global "cloud of clouds" and an extension of the Internet "network of networks" on which it is based.
Cloud computing poses privacy concerns because a cloud service provider can access or release the data held in the cloud. The cloud model makes it easy for companies hosting the cloud services to monitor, lawfully or unlawfully, the user's communication and data. Incidents such as the secret NSA program with AT&T and Verizon, which recorded over 10 million phone calls between American citizens, have caused privacy concerns.
Using a cloud service provider (CSP) can also complicate the privacy of data because of the virtualization of cloud processing and storage used to implement a cloud service. Through virtualisation data may not remain on the same system, data center or provider's cloud. This can lead to legal problems over jurisdiction. While there have been efforts, such as the US-EU Safe Harbor, to harmonize the legal environment, providers such as Amazon still cater to major markets, typically the United States and the European Union, by deploying local infrastructure and allowing customers to select "availability zones".
In order to be compliant with regulations such as FISMA, HIPAA, and SOX in the United States, the Data Protection Directive in the EU and the credit card industry's PCI DSS, users may have to adopt community or hybrid deployment. These are typically more expensive and may offer restricted benefits. This is how Google is able to "manage and meet additional government policy requirements beyond FISMA" and Rackspace Cloud and QubeSpace are able to meet PCI compliance requirements. Many providers also obtain SAS 70 Type II certification.
Customers in the EU contracting with cloud providers established outside the EU/EEA have to adhere to the EU regulations on export of personal data.
US Federal Agencies have to use a process called FedRAMP (Federal Risk and Authorization Management Program) to assess and authorize cloud services. FedRAMP is a subset of the NIST Special Publication 800-53 security controls selected to provide protection in cloud environments. A subset has been defined for the FIPS 199 low categorization and the FIPS 199 moderate categorization. The FedRAMP program also has a Joint Accreditation Board (JAB) responsible for accreditation standards for third-party organizations who perform assessments of cloud solutions.
Open Source - Free software for cloud computing
Open-source software has provided the foundation for many cloud computing implementations, one prominent example being the Hadoop framework.
In 2007 the Free Software Foundation released the Affero General Public License, a version of GPLv3 intended to close a perceived legal loophole associated with free software designed to be run over a network.
Open standards - Cloud standards
Most cloud providers expose APIs that are well-documented, often under a Creative Commons license, but also unique to their implementation and not interoperable. Some vendors have adopted others' APIs and there are a number of open standards under development to encourage interoperability and portability.
Security - Cloud computing security
As cloud computing has become popular, concerns are being voiced about its security. The traditional protection mechanisms need to be reconsidered as the characteristics of the cloud differ from those of traditional architectures. The security principles of shared multi-user mainframes may apply to cloud security.
The security of cloud computing services may be delaying their adoption. Physical control of Private Cloud equipment is considered more secure than having the equipment off-site and under someone else's control, because physical control and the ability to visually inspect the equipment make it easier to ensure that data links are not compromised. An issue that bars the adoption of cloud computing is the private and public sector unease about external management of security-sensitive services, yet it is the nature of cloud services, private or public, that they promote external management of services. This gives cloud computing providers a strong incentive to build and maintain secure services. Security issues include: sensitive data access, data segregation, privacy, bug exploitation, recovery, accountability, malicious insiders, account control, and multi-tenancy issues. Solutions to cloud security issues vary, from cryptography, particularly public key infrastructure (PKI), to use of multiple cloud providers, standardisation of APIs, and improving virtual machine and legal support.
Although cloud computing is assumed to be "green computing", there are no published studies to support this. Siting of the servers affects the environmental impact of cloud computing. In areas where the climate is cool and renewable electricity is readily available, the environmental effects will be more moderate. Countries such as Finland, Sweden and Switzerland are trying to attract cloud computing data centers on this basis. Energy efficiency in cloud computing can result from energy-aware scheduling and server consolidation. Where a cloud is distributed over data centers with different sources of energy, including renewable energy, a small compromise on location could result in a large reduction in carbon footprint.
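Server consolidation, one of the energy-saving techniques mentioned above, can be treated as a bin-packing problem: place VM loads onto as few servers as possible so that idle machines can be powered down. The sketch below uses the first-fit decreasing heuristic with illustrative utilization figures (in percent of server capacity); it is a simplification, since real schedulers also consider memory, I/O, and migration cost.

```python
# Server consolidation as first-fit decreasing bin packing.
# Loads are expressed as integer percentages of one server's capacity;
# the figures below are illustrative, echoing the 10-20% utilization
# typical of unconsolidated systems.

def consolidate(vm_loads, server_capacity=100):
    """Return a list of servers, each a list of the VM loads placed on it."""
    servers = []
    for load in sorted(vm_loads, reverse=True):   # largest loads first
        for s in servers:
            if sum(s) + load <= server_capacity:  # first server that fits
                s.append(load)
                break
        else:
            servers.append([load])                # power on another server
    return servers

# Ten lightly loaded VMs fit on two servers; the other eight machines
# they might have occupied can be powered down.
placement = consolidate([15, 10, 20, 12, 18, 10, 15, 20, 10, 12])
```

First-fit decreasing is a standard heuristic for this problem; exact bin packing is NP-hard, so heuristics like this are what practical consolidation schedulers use.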
As with privately purchased hardware, crackers posing as legitimate customers can purchase cloud services for nefarious purposes. This includes password cracking and launching attacks using purchased services. In 2009, a banking trojan used the Amazon service as a command and control channel that issued software updates and malicious instructions to PCs that were infected by the malware.
Many universities, vendors and government organisations are researching cloud computing:
- In 2009 the University of California, Santa Barbara released the first open-source platform-as-a-service, AppScale, which is capable of running Google App Engine applications on a multitude of infrastructures.
- In 2009 the St Andrews Cloud Computing Co-laboratory was launched, focusing on research in the area of cloud computing. Unique in the UK, StACC aims to become an international centre of excellence for research and teaching in cloud computing, and will provide advice and information to businesses interested in cloud-based services.
- In 2010 the TClouds (Trustworthy Clouds) project was started, funded by the European Commission's 7th Framework Programme. The project's goal is to research the legal foundation and architectural design of a resilient and trustworthy cloud-of-cloud infrastructure. The project will develop a prototype.
- In 2010 the TrustCloud research project was started by HP Labs Singapore to address transparency and accountability of cloud computing via detective, data-centric approaches encapsulated in a five-layer TrustCloud Framework. The team identified the need to monitor data life cycles and transfers in the cloud, leading to security issues such as data leakage, accountability and cross-national data transfer.
- In 2011 the High Performance Computing Cloud (HPCCLoud) project was started to find ways to enhance the performance of cloud environments running scientific applications, along with the associated HPCCLoud Performance Analysis Toolkit.
- In 2011 the Telecommunications Industry Association developed a Cloud Computing White Paper, to analyze the integration challenges and opportunities between cloud services and traditional US telecommunications standards.