
CLOUD COMPUTING & SATELLITE IMAGERY

Anupam Tiwari : anupam@blumail.org

INTRODUCTION

1. Opportunities for improving IT efficiency and performance through centralization of
resources have increased dramatically in the last few years with the maturation of
technologies such as SOA, virtualization, grid computing and management automation. A
natural outcome of this is what has become increasingly referred to as "cloud computing",
where a consumer of computational capabilities sets up or makes use of computing "in the
cloud" (i.e. over a network) in a self-service manner, without direct involvement in how that
computing is resourced. The cloud in cloud computing provides the means through which
everything, from computing power and computing infrastructure to applications, business
processes and personal collaboration, can be delivered as a service wherever and whenever
it is needed. Having brought an explosion of possibilities and advantages to the doorstep of
the ordinary commercial and individual user, how could the sky be left behind? The power of
cloud computing has not left the skies unaffected.

2. The growing prevalence of instruments and sensors deployed to continuously
observe nature at fine spatial and temporal granularity is causing a surge in the volume of
observational data collected, leading to increased resource needs for storing, managing,
analyzing and sharing it. This is particularly true for data that stream in from various
satellites. Much of this data is collected and managed at the long tail of scientific research,
with legacy tools running on single-system servers serving a small community of users.

3. Cloud computing is emerging as a viable platform for executing these scientific
workflows and/or their constituent tasks. Clouds hold the promise of on-demand availability
of computation and data resources through simple service abstractions. Cloud resources are
also more accessible to a large fraction of scientific users when compared with the costly
capital and management requirements of local clusters, or with the complexity and access
limitations such infrastructure entails.

AIM

4. To study the concept of cloud computing and explore possible applications of the
same in the field of satellite imagery.

SCOPE

5. This article will be covered as under :

(a) Part-I : Basic Concept


(b) Part-II : Applications
(c) Part-III : Deployment models
(d) Part-IV : Satellite imagery & Cloud
(e) Part-V : Case Study
(i) Nebula @ NASA Cloud Platform
(ii) Server Sky
(f) Part-VI : Security issues
(g) Part-VII : Conclusion
PART 1 : BASIC CONCEPT

6. Cloud computing portends a major change in how we store information and run
applications. Instead of running programs and keeping data on an individual desktop
computer, everything is hosted in the "cloud", a nebulous assemblage of computers and
servers accessed via the Internet. Cloud computing provides access to all applications and
documents from anywhere in the world, freeing one from the confines of the desktop and
making it easier for group members in different locations to collaborate. Cloud computing is a
model for enabling convenient, on-demand network access to a shared pool of configurable
computing resources (e.g., networks, servers, storage, applications and services) that can
be rapidly provisioned and released with minimal management effort or service provider
interaction.

7. To put it more simply: with traditional desktop computing, we run copies of software
programs on each computer we own, and the documents created are stored on the
computer on which they were created. Although documents can be accessed from other
computers on the network, they cannot be accessed by computers outside the network. The
whole scene is PC-centric. With cloud computing, the software programs no longer run on
the individual personal computer; they are instead stored on servers accessed via the
Internet. If our computer crashes, the software is still available for others to use.

8. Anyone with permission can not only access the documents but can also edit and
collaborate on those documents in real time. Unlike traditional computing, this cloud
computing model is not PC-centric; it is document-centric. It does not matter which PC is
being used to access a document.

What Cloud Computing Isn’t?

9. First, cloud computing is not network computing. With network computing,
applications and documents are hosted on a single company's server and accessed over the
company's network. Cloud computing is a lot bigger than that. It encompasses multiple
companies, multiple servers and multiple networks. In addition, unlike network computing,
cloud services and storage are accessible from anywhere in the world over an Internet
connection, whereas network computing is accessible only over the company's network.

What Cloud Computing Is?

10. Key to the definition of cloud computing is the "cloud" itself, which is a large group of
interconnected computers. These computers can be personal computers or network servers;
they can be public or private. For example, Google hosts a cloud that consists of both
smallish PCs and larger servers. Google's cloud is a private one (that is, Google owns it)
that is publicly accessible (by Google's users). This cloud of computers extends beyond a
single company or enterprise. The applications and data served by the cloud are available to
a broad group of users, cross-enterprise and cross-platform. Access is via the Internet: any
authorized user can access these documents and applications from any computer over any
Internet connection. And, to the user, the technology and infrastructure behind the cloud is
invisible; it is not apparent whether cloud services are based on HTTP, HTML, XML,
JavaScript or other specific technologies.

11. There are six key properties of cloud computing, which are depicted in the figure
below : it is user-centric, task-centric, powerful, accessible, intelligent and programmable.

(Figure : Key Properties of Cloud Computing)

PART 2 : APPLICATIONS & KEY FEATURES

12. The applications of cloud computing are practically limitless. With the right
middleware, a cloud computing system could execute all the programs a normal computer
could run. Potentially, everything from generic word processing software to customized
computer programs designed for a specific company could work on a cloud computing
system. In broad terms, the following services and applications are offered by cloud
computing :

(a) SaaS (Software as a Service)


(b) PaaS (Platform as a Service)
(c) IaaS (Infrastructure as a Service)
(d) DaaS (Data as a Service)
(e) Storage as a Service

13. SaaS : Cloud application services or "Software as a Service (SaaS)" deliver software
as a service over the Internet, eliminating the need to install and run the application on the
customer's own computers and simplifying maintenance and support.

14. PaaS : Cloud platform services, or "Platform as a Service (PaaS)", deliver a
computing platform and/or solution stack as a service, often consuming cloud infrastructure
and sustaining cloud applications. PaaS facilitates deployment of applications without the
cost and complexity of buying and managing the underlying hardware and software layers.
Platforms in regular use by various companies include iCloud, CLOUDO, eyeOS, etc.

15. IaaS : Cloud infrastructure services, also known as "Infrastructure as a Service
(IaaS)", deliver computer infrastructure, typically a platform virtualization environment, as
a service. Rather than purchasing servers, software, data-center space or network
equipment, clients instead buy those resources as a fully outsourced service. Suppliers
typically bill such services on a utility computing basis; the amount of resources consumed
(and therefore the cost) typically reflects the level of activity. IaaS evolved from virtual
private server offerings.
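
The essence of IaaS is self-service provisioning of infrastructure through an API rather than
a purchase order. As a minimal sketch, the Python snippet below provisions and then
terminates a virtual server using the AWS SDK (boto3). AWS is used only as one familiar
example of an IaaS provider; the machine image ID is a placeholder, and valid credentials
and network access are assumed.

    import boto3

    # Request a small virtual server from the IaaS provider (self-service, pay-per-use)
    ec2 = boto3.client("ec2", region_name="us-east-1")
    response = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",   # placeholder machine image ID
        InstanceType="t3.micro",           # small, utility-billed instance type
        MinCount=1,
        MaxCount=1,
    )
    instance_id = response["Instances"][0]["InstanceId"]
    print("Provisioned instance:", instance_id)

    # Release the resource when it is no longer needed; billing stops with the instance
    ec2.terminate_instances(InstanceIds=[instance_id])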

16. DaaS : Data as a Service is based on the concept that data can be provided on
demand to the user, regardless of the geographic or organizational separation of provider
and consumer. Additionally, the emergence of service-oriented architecture (SOA) has
rendered the actual platform on which the data resides irrelevant as well.

CLOUD STORAGE

17. Storage as a Service : Cloud storage is networked online storage where data is
stored on multiple virtual servers rather than being hosted on dedicated servers. Hosting
companies operate large data centers, and people who require their data to be hosted buy
or lease storage capacity from them and use it for their storage needs. The data center
operators, in the background, virtualize the resources according to the requirements of the
customer and expose them as storage pools, which the customers can themselves use to
store files or data objects. Physically, the resource may span multiple servers. Cloud
storage services may be accessed through a web-service application programming interface
(API) or through a web-based user interface.
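
As a brief illustration of access through such an API, the Python sketch below stores and
retrieves a satellite image file in an object store using Amazon S3 via boto3. S3 is only one
example of a storage-as-a-service API; the bucket and file names are hypothetical, and
suitable credentials are assumed.

    import boto3

    s3 = boto3.client("s3")
    bucket = "satellite-archive-demo"            # hypothetical bucket name

    # Upload a locally received scene to the cloud storage pool
    s3.upload_file("scene_20240101.tif", bucket, "raw/scene_20240101.tif")

    # Later, any authorized user or application can retrieve it from anywhere
    s3.download_file(bucket, "raw/scene_20240101.tif", "local_copy.tif")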

18. In addition to the key properties listed vide para 11, the following advantages augment
the concept : utilization of peak-load capacity, scalability, centralization and reliability,
sharing of resources, device and location independence, easier maintenance, agility, cost
benefits and metering, with security remaining an open question.

(Figure : Key Advantages of Cloud Computing)
PART 3 : DEPLOYMENT MODELS

19. Four deployment models are discussed below :

(a) Public Cloud: In simple terms, public cloud services are characterized as
being available to clients from a third-party service provider via the Internet. The term
"public" does not always mean free, even though it can be free or fairly inexpensive
to use. A public cloud does not mean that a user's data is publicly visible; public
cloud vendors typically provide an access control mechanism for their users. Public
clouds provide an elastic, cost-effective means to deploy solutions.

(b) Private Cloud: A private cloud offers many of the benefits of a public cloud
computing environment, such as being elastic and service based. The difference
between a private cloud and a public cloud is that in a private cloud-based service,
data and processes are managed within the organization without the restrictions of
network bandwidth, security exposures and legal requirements that using public
cloud services might entail. In addition, private cloud services offer the provider and
the user greater control of the cloud infrastructure, improving security and resiliency
because user access and the networks used are restricted and designated.

(c) Community Cloud: A community cloud is controlled and used by a group of
organizations that have shared interests, such as specific security requirements or a
common mission. The members of the community share access to the data and
applications in the cloud.

(d) Hybrid Cloud: A hybrid cloud is a combination of a public and a private cloud
that interoperate. In this model, users typically outsource non-business-critical
information and processing to the public cloud, while keeping business-critical
services and data under their own control.
PART 4 : SATELLITE IMAGERY & CLOUD STORAGE

20. Satellite imagery consists of photographs of Earth or other planets made by means of
artificial satellites. There are four types of resolution in the context of satellite imagery in
remote sensing, which include the following :

(a) Spatial resolution is defined as the pixel size of an image, representing the size of
the surface area (e.g. in square metres) being measured on the ground, determined by
the sensor's instantaneous field of view (IFOV);

(b) Spectral resolution is defined by the wavelength interval size (the discrete segment
of the electromagnetic spectrum) and the number of intervals that the sensor is
measuring;

(c) Temporal resolution is defined by the amount of time (e.g. days) that passes
between imagery collection periods;

(d) Radiometric resolution is defined as the ability of an imaging system to record
many levels of brightness (contrast, for example). It refers to the effective bit depth of
the sensor (the number of greyscale levels) and is typically expressed as 8-bit (0-255),
11-bit (0-2047), 12-bit (0-4095) or 16-bit (0-65,535), as illustrated in the sketch below.
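
The link between bit depth and the number of recordable brightness levels, and the routine
task of rescaling raw digital numbers (DNs) to an 8-bit display range, can be shown in a few
lines of Python (purely illustrative):

    def grey_levels(bit_depth: int) -> int:
        # Number of distinct brightness levels a sensor of this bit depth can record
        return 2 ** bit_depth

    def rescale_dn(dn: int, src_bits: int, dst_bits: int = 8) -> int:
        # Linearly rescale a digital number, e.g. an 11-bit DN, to an 8-bit display range
        return round(dn * ((2 ** dst_bits) - 1) / ((2 ** src_bits) - 1))

    for bits in (8, 11, 12, 16):
        print(bits, "bits ->", grey_levels(bits), "levels, max DN", grey_levels(bits) - 1)

    print(rescale_dn(1024, 11))   # a mid-range 11-bit value maps to about 128 in 8-bit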

21. The relatively demanding and critical requirement for high resolution in the received
satellite images translates into a demand for large-capacity data storage. In such scenarios,
cloud architectures can come to the rescue: they can easily and conveniently provision
larger amounts of storage and computing power; they offer easy access to centrally located
information reachable through any compatible device a user wishes to employ; they can
provide a back-up to locally stored data; and they allow people to easily share their data with
others. At the same time, there will always be an optimum 'balance point' that identifies the
'best' mix of local (PC) processing and storage, on-premise (enterprise data centre)
processing, storage and networking, and 'cloud' processing. That balance point shifts as the
relative cost-effectiveness of processing power, storage and data communication changes.
The advantages are briefly mentioned below, and a rough sizing sketch follows the list :

(a) Investment only for the storage that is actually used, with no idle or wasted capacity.

(b) No need to install physical storage devices in datacenters or offices, which
reduces IT and hosting costs.

(c) Storage maintenance tasks, such as backup, data replication and purchasing
additional storage devices, are offloaded to the service provider, allowing
organizations to focus on their core business.
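
A back-of-the-envelope sizing sketch makes the storage demand concrete. All the numbers
below (185 km swath, 30 m spatial resolution, 7 bands, 16-bit samples, 500 scenes per day)
are loosely Landsat-like but purely illustrative assumptions, not figures for any specific
mission:

    # Uncompressed size of one scene: (swath / pixel size)^2 pixels x bands x bytes per sample
    def scene_size_gb(swath_km: float, gsd_m: float, bands: int, bit_depth: int) -> float:
        pixels_per_side = swath_km * 1000 / gsd_m
        bytes_per_pixel = bands * bit_depth / 8
        return pixels_per_side ** 2 * bytes_per_pixel / 1e9

    per_scene = scene_size_gb(185, 30, 7, 16)     # roughly 0.5 GB uncompressed
    scenes_per_day = 500                          # hypothetical acquisition rate
    annual_tb = per_scene * scenes_per_day * 365 / 1000
    print(f"{per_scene:.2f} GB per scene, about {annual_tb:.0f} TB per year")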

PART 5 : CASE STUDY BRIEF

NEBULA @ NASA : Cloud Computing for a Universe of Data

22. Nebula combines cloud computing and data center containers. It is a new data
powerhouse which provides on-demand computing power for NASA researchers. The
Nebula application lives in a 40-foot container at the NASA Ames Research Center in
Mountain View, Calif. This "data center in a box" was built inside a container filled with
Cisco Systems' Unified Computing System and servers from Silicon Mechanics.
23. Nebula is a self-service platform built from open-source software that provides high-
capacity computing, storage and network connectivity for NASA research. It has been
designed to automatically increase the computing power and storage available to science-
and data-oriented web applications as demand rises. Nebula thus allows for rapid expansion
of IT infrastructure and can provide excellent energy efficiency by offering more precise
control of airflow within the container. It uses a virtualized, scalable approach to achieve
cost and energy efficiencies.

24. The project began in 2007. It is an open-source effort and uses a variety of open-
source components, including Eucalyptus, Lustre and RabbitMQ. The Ames Internet
Exchange hosts the Nebula cloud and connects it via 10 GigE links.

25. Nebula provides three classes of storage:

(a) Local Storage: Virtual machines use local storage to run, but the information resides
on local disk and is not saved by default. Nebula uses hot-swappable commodity drives in a
hardware RAID configuration, which allows up to 3 drives to fail before data loss occurs.

(b) Persistent Block Device (iSCSI): Nebula uses iSCSI to provide a persistent network
storage block device. This storage is always backed up, and it can be used by conventional
applications that have not been converted to cloud architectures. It provides highly reliable,
permanent storage and decouples the storage from the connected server, removing the
server as a single point of failure.

(c) Object Store: With the Object Store, it is easy to store petabytes of data and billions
of files. Open-source object-store implementations have been used together with custom
code that adds the access control layer (ACL), management functions and, potentially, the
API layer (a minimal illustration of the access-control idea follows below).
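
To make the access-control idea concrete, the following toy Python sketch wraps an in-
memory object store with a simple per-object ACL check. It is an illustration of the concept
only, not Nebula's actual code, and all names are hypothetical.

    class AccessControlledStore:
        # Toy object store that enforces a per-object ACL before any read
        def __init__(self):
            self._objects = {}   # key -> bytes
            self._acl = {}       # key -> set of users allowed to read

        def put(self, owner: str, key: str, data: bytes, readers: set) -> None:
            self._objects[key] = data
            self._acl[key] = {owner} | readers   # the owner can always read back

        def get(self, user: str, key: str) -> bytes:
            if user not in self._acl.get(key, set()):
                raise PermissionError(f"{user} may not read {key}")
            return self._objects[key]

    store = AccessControlledStore()
    store.put("alice", "scenes/2024/scene1.tif", b"...", readers={"bob"})
    print(store.get("bob", "scenes/2024/scene1.tif"))    # permitted
    # store.get("eve", "scenes/2024/scene1.tif")         # would raise PermissionError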

26. Thus, when we speak of its benefits, all the advantages listed vide para 18 apply here
as well, including rapid provisioning, cost savings, resource elasticity, scalability, integrated
reporting and policy compliance.
SERVER SKY

27. A typical server farm is a collection of computer servers, usually maintained by an
enterprise, to accomplish server needs far beyond the capability of one machine. Server
farms often have backup servers, which can take over the function of primary servers in the
event of a primary server failure. Server farms are commonly used for cluster computing.
Many modern supercomputers comprise giant server farms of high-speed processors
connected by either Gigabit Ethernet or custom interconnects such as InfiniBand or Myrinet.

28. Server Sky is a proposal to build large, dispersed arrays of ultra-light, solar-powered
server satellites and launch them into a 6,000 km Earth orbit, between the inner and outer
Van Allen belts. A 50 gram server-sat consists of a thinned 12-inch solar cell with an efficient
2 GIPS processor, a terabit solid-state disk and a microwave transmitter bonded to the back.
Thousands of server-sats position themselves into dozens of dispersed, three-dimensional
clouds (kilometres on a side), using light pressure for thrust and liquid-crystal shutters for
trim-tab steering. A server-sat array acts as a large phased-array antenna, permitting it to
steer thousands of communication beams at receiving stations and communities under its
position in orbit, handing off communication and control to the server-sat clouds that follow it
in orbit as it passes overhead.

29. Since server-sat arrays operate outside the biosphere, the environmental impact of
power generation and heat disposal is close to zero. Server-sat arrays can grow to
practically unlimited size – space is big, and filled with unused solar energy. In time, new
launch techniques, and solar cells made from lunar rock, can greatly reduce the
environmental and economic costs of manufacturing and launch. There is room for 1 trillion
server-sats within a 100 millisecond ping time distance from earth. Someday, quintillions of
server-sats scattered around the solar system will perform cluster computation.
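
To put the figures quoted above in perspective, the short Python sketch below computes the
orbital period of a circular 6,000 km altitude orbit from Kepler's third law, and the maximum
one-way distance consistent with a 100 millisecond round-trip ping. The Earth radius and
gravitational parameter are standard textbook constants, not values taken from the proposal.

    import math

    MU_EARTH = 398_600.4418    # km^3/s^2, Earth's standard gravitational parameter
    R_EARTH = 6_371.0          # km, mean Earth radius
    C = 299_792.458            # km/s, speed of light

    # Period of a circular orbit at 6,000 km altitude: T = 2*pi*sqrt(a^3 / mu)
    a = R_EARTH + 6_000.0
    period_h = 2 * math.pi * math.sqrt(a ** 3 / MU_EARTH) / 3600
    print(f"Orbital period: {period_h:.1f} h")        # about 3.8 h, hence frequent hand-offs

    # A 100 ms ping is a round trip, so the one-way path is covered in 50 ms
    max_one_way_km = C * 0.100 / 2
    print(f"Max one-way distance for a 100 ms ping: {max_one_way_km:,.0f} km")   # ~15,000 km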

PART 6 : SECURITY ISSUES

30. Cloud computing is revolutionizing how organizations construct their networks and
systems; it is changing how organizations invest in their information technology
infrastructure; and it is forcing organizations to reconsider how they secure critical
information. Security is therefore critical and at the forefront of cloud computing. The obvious
security concerns include data integrity, data availability, protection of personally identifiable
information, data protection, data destruction and communications security. Trust, security
and privacy always pose issues in any Internet-provided service, but owing to the specific
nature of clouds, additional aspects arise, including issues related to multi-tenancy and
control over data location. Thus, new security governance models and processes are
required that cater for these specific issues arising from the cloud model.

31. Data and operations security concerns are largely alleviated by today's mature,
scalable and redundant multi-tier architectures and shared-resource environments. Third-
party data centres offer facilities to isolate customer data, perform regular backups and
minimize failure through redundancy. Detailed service level agreements spell out
responsibilities, and there are standards for disaster recovery and business continuity to
protect SaaS, PaaS and IaaS customers. But irrespective of agreements and confidence-
building measures, it is well understood that all these security claims are somewhat limited
in nature.

32. The cloud computing industry is in the early stages of the technology adoption cycle,
and many products are still "vaporware" that face major hurdles, particularly in the area of
data security. However, cloud computing in one form or another will eventually be part of
most IT organizations owing to its significant cost savings.
33. Cloud security is an ill-defined, little-understood area of distributed computing.
However, gradual progress is being made to provide a level of assurance that
accommodates the resources needed to support organisations' information-processing
requirements.

PART 7 : CONCLUSION

34. Cloud computing represents an exciting opportunity to bring on-demand applications
to customers in an environment of reduced risk and enhanced reliability. However, it is
important to understand that existing applications cannot simply be unleashed on the cloud
as-is. Careful attention to design will help ensure a successful deployment. In particular,
cloud-based applications should be deployed as virtual appliances so that they contain all
the components needed to operate, update and manage them. Simple design will aid
scalability as demand for the application increases. And planning for failure will ensure that
the worst does not happen when the inevitable occurs.

35. The environmental benefits of cloud computing are a key driver, as many technology
companies are going to great lengths to build eco-friendly data centers. As an extreme
example, Google was recently awarded a patent for a floating data center, located 3 to 7
miles offshore, that would incorporate wave-energy machines to generate electricity from
ocean waves to power its servers. Whether Google will actually build these floating data
centres is debatable, but if Googlers can build a data center in the ocean, why can't the
satellite industry build one in space? This is the concept recently presented as the idea of
"Cloud Computing on Orbit", which envisages orbiting satellites powered by solar energy
acting as space-based server farms, as described in the Server Sky case study above.
