Big Data and Cloud Computing
Background
The emergence of cloud data storage and cloud computing is widely regarded as a major facilitator of, and precursor to, the rise of big data. Cloud computing can be described as the commodification of data storage and computing time by means of standardized technology.
This kind of computing offers numerous advantages over the traditional techniques previously used for data storage. However, cloud platforms sometimes also need to be combined with traditional computing architectures.
This creates a dilemma for the stakeholders who make decisions on large-scale big data projects. Several questions need to be answered, including which techniques and which kind of cloud computing infrastructure should be deployed to create an optimal solution for a given project.
Different big data projects present different challenges, including unpredictability, high-capacity storage needs, and immense computing power, while at the same time demanding inexpensive, dependable, and fast outcomes. The aim of this article is to introduce cloud storage, cloud computing, and the core architectures involved.
Cloud Providers
About ten years ago, a startup that required a reliable computing infrastructure with high-speed internet connectivity had to rent space in one data center, or sometimes in several. Today, by contrast, any of us can obtain whatever amount of storage and computing power we need with far less trouble. The offerings start with small virtual machines and range up to small supercomputers, billed by the hour; in other words, several hours of supercomputing power can be had for a few hundred dollars. The services offered by cloud computing companies are usually globally distributed, which ensures a level of durability that was almost impossible to achieve in the past.
Cloud Storage
A professional-grade cloud storage service needs a high level of durability and availability, along with the ability to scale from just a few bytes to several petabytes. Amazon's S3 is one of the best-known storage solutions in this regard. It promises 99.9% availability every month, while its per-annum durability is even higher. Similarly, many other providers offer different packages of readily available cloud storage and computing power.
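To make an availability figure like 99.9% concrete, it helps to translate it into a downtime budget. A minimal sketch (assuming a 30-day billing month; the helper name is ours, not a provider API):

```python
def downtime_budget_minutes(availability: float, days: int = 30) -> float:
    """Maximum downtime in minutes per billing period for a given
    availability target, e.g. 0.999 for 99.9%."""
    minutes_in_period = days * 24 * 60
    return (1.0 - availability) * minutes_in_period

# A 99.9% monthly target still permits roughly 43 minutes of downtime;
# one extra nine shrinks the budget tenfold.
print(downtime_budget_minutes(0.999))
print(downtime_budget_minutes(0.9999))
```

This is why durability figures, which describe the chance of losing data rather than of losing access, are usually quoted with far more nines than availability figures.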
Cloud Computing
Cloud computing uses virtualization of computing resources to run several standardized virtual servers on a single physical machine. Cloud service providers achieve this kind of accessibility through economies of scale, which allow low prices along with billing based on small intervals of time.
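Billing on small intervals makes cost estimation straightforward. A minimal sketch (the rate and cluster size below are made-up illustrative numbers, not any provider's pricing):

```python
def rental_cost(hours: float, hourly_rate: float, instances: int = 1) -> float:
    """Total cost of renting `instances` machines for `hours`,
    each billed at `hourly_rate` per hour."""
    return hours * hourly_rate * instances

# e.g. a 64-node cluster for 3 hours at a hypothetical $2.50/hour per node
print(rental_cost(3, 2.50, 64))  # 480.0
```

The same job run on owned hardware would require paying for the machines whether or not they are busy, which is exactly the economics the pay-per-interval model avoids.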
Cloud Big Data Challenges
Cloud big data systems face several challenges in giving users seamless access to storage and computing power. Software products such as Hadoop are designed on the distributed systems model in order to benefit from horizontal scaling: they process many small independent tasks on a massively parallel scale. Such distributed systems can also serve as high-end data stores or NoSQL databases; Hadoop's HDFS, HBase, and Cassandra are common examples. There are a few alternatives as well, designed to offer coordinated stream processing in approximately real time across a cluster of machines with complex workflows.
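The core idea of processing small independent tasks in parallel and then merging the results can be sketched in plain Python. This is only an illustration of the map-and-reduce pattern: a thread pool on one machine stands in for what a system like Hadoop distributes across a cluster.

```python
from collections import Counter
from concurrent.futures import ThreadPoolExecutor

def count_words(chunk: str) -> Counter:
    """Map step: count words in one independent chunk of text."""
    return Counter(chunk.split())

def word_count(chunks: list) -> Counter:
    """Run the map step in parallel, then reduce by merging the
    partial counts into a single total."""
    with ThreadPoolExecutor() as executor:
        partials = list(executor.map(count_words, chunks))
    total = Counter()
    for partial in partials:
        total.update(partial)
    return total

chunks = ["big data big cloud", "cloud data cloud"]
print(word_count(chunks))
```

Because each chunk is processed independently, adding more workers (or more machines) speeds up the map step without changing the program's logic, which is the essence of horizontal scaling.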
The interchangeability of the resources deployed with distributed software designs means the system can absorb most failures while scaling computing instances up or down. Bursting or spiking demand can be accommodated as well. Renting practically unlimited resources for short intervals lets us accomplish remarkable things for a very modest amount of money; web crawling and data mining are two of the biggest examples in this regard.
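Because workers are interchangeable, a failed task can simply be re-run elsewhere. A minimal sketch of that idea (a plain function call stands in for scheduling the task on another machine; the helper name is ours):

```python
def run_with_retries(task, max_attempts: int = 3):
    """Re-run a failed task, as a scheduler would on a fresh,
    interchangeable worker, up to `max_attempts` times."""
    for attempt in range(1, max_attempts + 1):
        try:
            return task()
        except RuntimeError:
            # Treat the failure as a lost worker; give up only
            # after the final attempt.
            if attempt == max_attempts:
                raise
```

Real frameworks such as Hadoop apply the same principle automatically: a task that fails on one node is rescheduled on another, so individual machine failures do not fail the job.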
Cloud Architecture
Three primary models of cloud architecture have been developed so far: private, public, and hybrid clouds. All of them share the same idea of commodification of resources, which lets us enjoy uninterrupted storage with enormous computing power.
Private Cloud
As the name suggests, private clouds offer dedicated services to a single organization and do not share any physical resources with other tenants. The resources may be provided externally or in house. The typical underlying motivations are regulatory and security requirements: the organization's resources must be isolated from malicious or accidental access through shared infrastructure. Private cloud setups are considered very challenging, as it is hard to achieve the economic benefits of scale within most projects even when all industry standards are followed. Even so, this kind of cloud is considered the safest of the three.
Public Cloud
As the name suggests, public clouds use shared physical resources for processing, storing, and transferring data, although each client gets isolated storage and a private virtualized computing environment. A few security concerns associated with this kind of cloud lead some high-end clients to switch to a private cloud.
Hybrid Cloud
A hybrid cloud architecture can be described as a merger of public and private cloud deployments. It is usually deployed to achieve both elasticity and security, or to combine inexpensive burst capacity with a privately handled base load. Many organizations experience short intervals of high load, especially at peak hours.
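The base-load-plus-burst idea can be sketched as a simple routing rule: serve traffic from the private cloud up to its capacity, and spill anything beyond that to the public cloud. The function and thresholds below are illustrative, not a real scheduler API.

```python
def route_request(current_load: int, private_capacity: int) -> str:
    """Hybrid-cloud burst routing: keep the base load on the
    private cloud and burst to the public cloud above capacity."""
    return "private" if current_load < private_capacity else "public"

# The everyday base load fits on the private cloud; a peak-hour
# spike bursts to the public one.
print(route_request(80, 100))   # private
print(route_request(150, 100))  # public
```

In practice the decision also weighs data-sensitivity rules (some workloads must never leave the private side), but the capacity threshold captures the cost logic of hybrid deployments.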