
The Story behind the Mainframe to Cloud Journey

Mainframe to cloud (image: Gettyimage.in)

Mainframe to CLOUD: Mainframe computing took off in the 1950s and gained much prominence throughout the 1960s. Corporations such as IBM (International Business Machines), Univac, DEC (Digital Equipment Corporation), and Control Data Corporation developed powerful mainframe systems.

These mainframe systems mainly carried out number-crunching for scientists and engineers, and the main programming language used was Fortran. Then, in the 1960s, the notion of database systems was conceived, and corporations developed database systems based on the network and hierarchical data models. The database applications of that era were written mainly in COBOL.

Cloud vs. Mainframe

In the 1970s, corporations such as DEC introduced minicomputers; an example is DEC's VAX machine. These machines were much smaller than mainframe systems. Around the same time, terminals were developed, so programmers no longer had to go to computing centers and use punch cards for their computations; they could submit jobs to the computing machines from their own terminals. This was a huge step forward. It was also during this period that languages such as C and operating systems such as UNIX were developed.



A significant development in the late 1970s was the emergence of the personal computer, which led to the founding of Apple Computer. Soon after, IBM developed its own personal computers, and Microsoft supplied the DOS operating system for these IBM machines. Powerful workstations were developed in the early 1980s by corporations such as Sun Microsystems, Apollo, and HP (Hewlett-Packard). Database systems based on the relational data model were developed by corporations such as IBM and Oracle. By the mid-1980s, computers were poised to take over the world.

Distributed Computing

With the invention of the Internet by DARPA (Defense Advanced Research Projects Agency), networked systems gained momentum in the 1970s, and the first products came out in the 1980s. Computers were networked together, communicating with each other and exchanging messages through what is now known as email. Several applications were developed for these distributed systems; the idea was to utilize the resources of multiple machines and carry out a computation across them. The late 1980s also saw the emergence of parallel computing.

A computing paradigm that exploded in the early 1990s was the distributed object paradigm. Here, computing resources were encapsulated as objects, and these objects communicated with each other by exchanging messages. This work led to the formation of consortia such as the Object Management Group (OMG). It was at this time that object-oriented languages such as Smalltalk and C++ rose to prominence.
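For illustration only, the sketch below uses Python's standard-library XML-RPC modules as a rough modern stand-in for that distributed-object idea: one process exposes an object, and a remote caller invokes its methods by sending messages over the network. The host, port, class, and method names are assumptions for the example, not anything from the OMG-era systems themselves.

```python
# A toy "distributed object": one process exposes an object over XML-RPC,
# and a client proxy turns each method call into a message sent to it.
# (Server and client run in one script here only for brevity.)
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

class Account:
    def __init__(self):
        self.balance = 0

    def deposit(self, amount):
        self.balance += amount
        return self.balance

# Server side: register the object so incoming messages become method calls.
server = SimpleXMLRPCServer(("localhost", 8000), logRequests=False, allow_none=True)
server.register_instance(Account())
threading.Thread(target=server.serve_forever, daemon=True).start()

# Client side: the proxy object forwards each call over the network.
account = ServerProxy("http://localhost:8000")
print(account.deposit(100))   # prints 100
print(account.deposit(50))    # prints 150
```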

Evolution of the WWW

In the early 1990s, one of the major innovations of the twentieth century was initiated: the World Wide Web (WWW). Tim Berners-Lee, the inventor of the WWW, was a programmer at CERN in Geneva, Switzerland. He started a project to support physicists sharing data, and this project resulted in the WWW. Around the same time, programmers at the National Center for Supercomputing Applications (NCSA) at the University of Illinois developed the Mosaic browser. These two innovations resulted in ordinary people using the WWW to query and search for information. The late 1990s saw the emergence of several search engines such as AltaVista and Lycos. Then two researchers from Stanford University started a company called Google, which is now the largest web search company in the world. Java became one of the most popular programming languages.

CLOUD Computing

A cloud stack is typically organized into the following layers:
  • Cloud application 
  • Cloud data layer 
  • Cloud storage layer 
  • Cloud operating system and hypervisor layer 

The late 1990s also saw what is now called the dot-com boom. Several companies that provided services were formed, and this resulted in electronic commerce. However, the infrastructure technologies were not mature at the time and, as a result, many of these companies did not survive. In the late 1990s and early 2000s, the notion of web services based on the service paradigm emerged. With service technologies, better infrastructures were built for e-commerce, and corporations delivered services to consumers based on the service paradigm.
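As a simple illustration of the service paradigm (not taken from the original post), a consumer application invokes a provider's functionality over HTTP without knowing anything about the servers behind it. The endpoint URL below is hypothetical, and the sketch assumes the third-party requests library is installed.

```python
import requests  # HTTP client library, assumed installed

# Hypothetical e-commerce web service: the caller only knows the service
# contract (URL and response format), not the provider's infrastructure.
response = requests.get("https://api.example.com/v1/orders/123", timeout=10)
response.raise_for_status()

order = response.json()
print(order)
```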

Developments in services computing, distributed computing, and the WWW have resulted in cloud computing. The idea is to provide computing as a service, just as we use electricity as a service: a cloud service provider offers different levels of service to the consumer. The service could be using the cloud for computing, for database management, or for application support such as organizing one's finances.
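To make "computing as a service" concrete, here is a minimal sketch (my addition, not from the original post) of a consumer using a provider's storage service through an SDK. It assumes the AWS boto3 library is installed and credentials are configured; the bucket and file names are made up.

```python
import boto3  # AWS SDK for Python, assumed installed and configured

# The provider runs the data centers, hardware, and storage software;
# the consumer simply calls the service's API and pays for what is used.
s3 = boto3.client("s3")
s3.upload_file("report.csv", "example-bucket", "reports/report.csv")

# Fetching the object back later works the same way, from anywhere.
s3.download_file("example-bucket", "reports/report.csv", "report_copy.csv")
```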
