
The Story Behind the Mainframe-to-Cloud Journey


Mainframe to Cloud: Mainframe computing took off in the 1950s and gained much prominence throughout the 1960s. Corporations such as IBM (International Business Machines), Univac, DEC (Digital Equipment Corporation), and Control Data Corporation began developing powerful mainframe systems.

These mainframe systems mainly carried out number-crunching for scientists and engineers, and the main programming language used was Fortran. Then, in the 1960s, the notion of database systems was conceived, and corporations developed systems based on the network and hierarchical data models. The database applications at that time were written mainly in COBOL.

Cloud vs. Mainframe

In the 1970s, corporations such as DEC introduced minicomputers; an example is DEC's VAX machine. These machines were much smaller than mainframe systems. Around that time, terminals were developed, so programmers no longer had to go to computing centers and use punch cards for their computations; they could submit jobs to the computing machines from their own terminals. This was a huge step forward. It was also during this time that languages such as C and operating systems such as UNIX were developed.



A significant development in the late 1970s was the emergence of the personal computer, which led to the founding of Apple Computer. Soon after, IBM developed its own personal computers, and Microsoft developed the DOS operating system for these IBM machines. Powerful workstations were developed in the early 1980s by corporations such as Sun Microsystems, Apollo, and HP (Hewlett-Packard). Database systems based on the relational data model were developed by corporations such as IBM and Oracle. By the mid-1980s, computers were poised to take over the world.

Distributed Computing

With the development of the Internet by DARPA (Defense Advanced Research Projects Agency), networked systems gained momentum in the 1970s, and the early products came out in the 1980s. Computers were networked together, communicating with each other and exchanging messages through what is now known as email. Several applications were developed for these distributed systems; the idea was to utilize resources across multiple machines to carry out a computation. The late 1980s also saw the emergence of parallel computing.

A computing paradigm that exploded in the early 1990s was the distributed object paradigm. Here, computing resources were encapsulated as objects that communicated with each other by exchanging messages. This work led to the formation of consortia such as the Object Management Group (OMG). It was at this time that object-oriented languages such as Smalltalk and C++ rose to prominence.

Evolution of the WWW

In the early 1990s, one of the major innovations of the twentieth century was initiated: the World Wide Web (WWW). Tim Berners-Lee, the inventor of the WWW, was a programmer at CERN in Geneva, Switzerland. He started a project to help physicists share data, and this project resulted in the WWW. Around the same time, programmers at the National Center for Supercomputing Applications (NCSA) at the University of Illinois developed the Mosaic browser. These two innovations enabled ordinary people to use the WWW to query and search for information. The late 1990s saw the emergence of several search engines such as AltaVista and Lycos. Then two researchers from Stanford University started a company called Google, which is now the largest web search company in the world. Java also became one of the most popular programming languages.

Cloud Computing

A cloud environment is typically organized into the following layers (illustrated in the sketch after the list):
  • Cloud application 
  • Cloud data layer 
  • Cloud storage layer 
  • Cloud operating system and hypervisor layer 
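
To make this layering concrete, here is a minimal sketch, added for illustration only and not taken from any real cloud provider's design: it models the four layers above as an ordered stack in Python, and every class and field name in it is an assumption chosen for the example.

    # Minimal sketch: the cloud stack described above, modeled as an ordered
    # list of layers from the application at the top to the hypervisor at the
    # bottom. Names and responsibilities are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class CloudLayer:
        name: str
        responsibility: str

    CLOUD_STACK = [
        CloudLayer("application", "user-facing services, e.g. a finance app"),
        CloudLayer("data", "databases and query processing"),
        CloudLayer("storage", "durable block and object storage"),
        CloudLayer("os/hypervisor", "virtual machines sharing physical hardware"),
    ]

    def describe_stack(stack):
        """Print each layer and the layer it runs on, top to bottom."""
        for upper, lower in zip(stack, stack[1:]):
            print(f"The {upper.name} layer runs on top of the {lower.name} layer.")
        print(f"The {stack[-1].name} layer runs directly on physical servers.")

    describe_stack(CLOUD_STACK)

Running the sketch simply prints the dependency order of the layers; the point is that each layer relies only on the layer directly beneath it.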

The late 1990s also saw what is now called the dot-com boom. Several companies that provided services were formed, and this resulted in electronic commerce. However, the infrastructure technologies were not mature at that time, and as a result many of these companies did not survive. In the late 1990s and early 2000s, the notion of web services, based on the service paradigm, emerged. With these service technologies, better infrastructures were built for e-commerce, and corporations began providing services to consumers on that model.

Developments in services computing, distributed computing, and the WWW have resulted in cloud computing. The idea is to provide computing as a service, much as we use electricity as a service: a cloud service provider offers different levels of service to the consumer. The service could be using the cloud for computing, for database management, or for application support such as organizing one's finances.
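
As a rough illustration of these "different levels of service" (again a sketch built on assumptions, not any provider's real interface), the three kinds of service mentioned above could all sit behind one common abstraction:

    # Minimal sketch: one abstract "cloud service" with three concrete levels
    # of service -- compute, database management, and application support.
    # All class names and behaviors are illustrative assumptions.
    from abc import ABC, abstractmethod

    class CloudService(ABC):
        """Anything the provider delivers over the network and meters for billing."""

        @abstractmethod
        def handle(self, request: str) -> str:
            ...

    class ComputeService(CloudService):
        def handle(self, request: str) -> str:
            return f"running job '{request}' on rented virtual machines"

    class DatabaseService(CloudService):
        def handle(self, request: str) -> str:
            return f"executing query '{request}' on a managed database"

    class FinanceAppService(CloudService):
        def handle(self, request: str) -> str:
            return f"updating personal finance records for '{request}'"

    # The consumer chooses a level of service and pays for what is used,
    # much as an electricity utility meters kilowatt-hours.
    for service in (ComputeService(), DatabaseService(), FinanceAppService()):
        print(service.handle("example-request"))

The design point mirrors the utility analogy: the consumer picks the level of abstraction they need, and the provider meters and bills the usage underneath.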
