In cloud computing, the word cloud (also written as “the cloud”) is used as a metaphor for “the Internet,” so the phrase cloud computing means a type of Internet-based computing, in which different services, such as servers, storage, and applications, are delivered to an organization’s computers and devices over the Internet.
The goal of cloud computing is to bring the kind of supercomputing power, or high-performance computing, normally reserved for military and research facilities and capable of tens of trillions of computations per second, to consumer-oriented applications: managing financial portfolios, delivering personalized information, providing data storage, and powering large, immersive online games.
To do this, cloud computing uses networks of large groups of servers, typically running low-cost consumer PC technology, with specialized connections that spread data-processing chores across them. This shared IT infrastructure consists of large pools of systems linked together, and virtualization techniques are often used to maximize the utilization of that pooled hardware.
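The idea of spreading a data-processing chore across a pool of machines can be sketched in miniature. The Python example below is a minimal sketch, not any real cloud platform’s API: it fans a word-counting job out across local worker processes, which stand in for the cloud’s pooled servers. The task, the chunking, and the four-worker pool size are all illustrative assumptions.

```python
from concurrent.futures import ProcessPoolExecutor

def count_words(chunk: str) -> int:
    """The 'data-processing chore' each server (here, a worker process) performs."""
    return len(chunk.split())

def main() -> None:
    # A large job split into chunks, much as a cloud scheduler partitions data.
    chunks = ["the quick brown fox", "jumps over", "the lazy dog", "again and again"]

    # The worker pool stands in for the cloud's pool of linked servers.
    with ProcessPoolExecutor(max_workers=4) as pool:
        partial_counts = list(pool.map(count_words, chunks))

    # Partial results from each worker are combined into a single answer.
    print(f"total words: {sum(partial_counts)}")

if __name__ == "__main__":
    main()
```

The same split-process-combine pattern, at far larger scale and across real machines, is what software running on pooled cloud servers automates.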
In a cloud computing system, there is a significant workload shift. Local computers no longer have to do all the heavy lifting when it comes to running applications; the network of computers that makes up the cloud handles it instead, so hardware and software demands on the user’s side decrease. The only thing the user’s computer needs to run is the cloud computing system’s interface software, which can be as simple as a Web browser, and the cloud’s network takes care of the rest.
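What that thin-client model looks like from the user’s side can be shown in a few lines. The sketch below assumes a hypothetical cloud endpoint (the URL and the JSON response shape are invented for illustration): the client simply issues an HTTP request and reads the result, exactly as a Web browser would, while any heavy computation happens on the cloud’s servers.

```python
import json
import urllib.request

# Hypothetical cloud endpoint; in a real system this would be a provider's API.
SERVICE_URL = "https://api.example-cloud.com/v1/portfolio/summary"

def fetch_summary() -> dict:
    """All the heavy computation happens server-side; the client just sends
    a request and parses the response, as a Web browser would."""
    with urllib.request.urlopen(SERVICE_URL, timeout=10) as resp:
        return json.load(resp)

if __name__ == "__main__":
    print(fetch_summary())
```

All the client needs is the interface, here a dozen lines of standard-library Python in place of a browser; the demand on local hardware amounts to making a network request and displaying the answer.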