1950:- In the 1950s, mainframe computers were huge, often occupying an entire room. Because of the cost of buying and maintaining mainframes, organizations could not afford to purchase one for each user. The solution to that problem was “time sharing,” in which multiple users shared access to data and CPU time.

1969:- In 1969, ARPANET (Advanced Research Projects Agency Network) – the network that became the basis of the internet – was developed, building on the vision of J.C.R. Licklider. His vision was for everyone on the globe to be interconnected, so that programs and data could be accessed at any site, from anywhere.

1970:- IBM released an operating system called VM that allowed administrators to run multiple virtual systems, or “virtual machines” (VMs), on a single physical node.

1990:- Telecommunication companies started offering virtual private network connections, which made it possible to serve more users through shared access to the same physical infrastructure. This change enabled traffic to be shifted for better network balance and more control over bandwidth usage. Meanwhile, virtualization for PC-based systems started in earnest, and as the internet became more accessible, the next logical step was to take virtualization online.

1997:- The term “cloud computing” was coined by University of Texas professor Ramnath Chellappa in a talk on a “new computing paradigm.”

2002:- Amazon created Amazon Web Services (AWS), providing an advanced suite of cloud services, from storage to computation.

2009:- Google and Microsoft entered the playing field. Google App Engine brought low-cost computing and storage services, and Microsoft followed suit with Windows Azure.

How Does Cloud Computing Work?

Suppose that you are an executive at a very large company. Your responsibilities include making sure that all of your employees have the hardware and software they need to do their jobs. Buying a computer for everyone is not enough. You also need software and software licenses, you must provide that software to employees whenever they require it, and whenever you hire a new employee you may have to buy yet more software. In this situation, you end up spending a lot of money.

But there is an alternative for an executive like you. Instead of installing a suite of software on each computer, you only need to install one application. That application lets each employee log in to a web-based service that hosts all the programs the user needs for his or her job. Remote servers owned by another company run everything from e-mail to word processing to complex data-analysis programs. This is called cloud computing, and it could change the whole computer industry.

In a cloud computing system, there is a significant workload shift. Local computers no longer have to do all the heavy lifting when it comes to running applications; the network of computers that makes up the cloud handles that load instead. Hardware and software demands on the user's side decrease. The only thing the user's computer needs to be able to run is the cloud computing system's interface software, which can be as simple as a web browser; the cloud's network takes care of the rest.
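The workload shift described above can be sketched in a few lines of code. This is only a toy illustration, not any real provider's API: the names `cloud_sum_of_squares` and `ThinClient` are hypothetical, and the direct function call stands in for what would in practice be an HTTP request from the user's browser to the provider's servers.

```python
# Toy sketch of the cloud workload shift (all names hypothetical).

def cloud_sum_of_squares(numbers):
    # Heavy computation: in a real system this would run on the
    # provider's remote servers, not on the user's machine.
    return sum(n * n for n in numbers)

class ThinClient:
    # The user's machine: nothing but a thin interface. It sends a
    # request and displays the result; it performs no heavy work.
    def request(self, numbers):
        # Stands in for an HTTP call across the network to the cloud.
        return cloud_sum_of_squares(numbers)

print(ThinClient().request([1, 2, 3]))  # → 14
```

The point of the split is that the client class has no computation in it at all; everything demanding lives behind the request boundary, which is exactly why the user's hardware requirements shrink.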
