
Scraping a Website: How to Write a Script in Python

Here's a Python script that you can use as a model to scrape a website. The logic below uses the BeautifulSoup package for web scraping.

```python
import requests
from bs4 import BeautifulSoup

url = 'https://www.example.com'
response = requests.get(url)
soup = BeautifulSoup(response.text, 'html.parser')

# Print the title of the webpage
print(soup.title.text)

# Print all the links in the webpage
for link in soup.find_all('a'):
    print(link.get('href'))
```

In this script, we first import the Requests and Beautiful Soup libraries. We then define the URL we want to scrape and use Requests to send a GET request to it, passing the response text to Beautiful Soup to parse the page's HTML. We then use Beautiful Soup to extract the title of the webpage and print it to the console, and a for loop to find all the links on the page and print their href attributes. This is just a basic example, but it shows the core pattern that most scraping scripts follow.
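To try the same pattern without making a network request, you can feed Beautiful Soup an HTML string directly. The sketch below does exactly that (the HTML snippet is invented for illustration) and also resolves relative links into absolute URLs with the standard library's urljoin, which a real scraper usually needs:

```python
# Same BeautifulSoup pattern as above, applied to an in-memory HTML
# string (no network needed). The HTML below is a made-up example.
from urllib.parse import urljoin
from bs4 import BeautifulSoup

html = """
<html><head><title>Example Domain</title></head>
<body>
  <a href="/about">About</a>
  <a href="https://other.example.org/page">External</a>
</body></html>
"""

base_url = 'https://www.example.com'
soup = BeautifulSoup(html, 'html.parser')

# Title works the same as when the HTML came from requests
print(soup.title.text)  # Example Domain

# urljoin turns "/about" into a full URL and leaves absolute URLs alone
links = [urljoin(base_url, a.get('href')) for a in soup.find_all('a')]
print(links)
```

Because the parser only sees a string, this version is also handy for unit-testing your extraction logic before pointing the script at a live site.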

Case Study On Cloud Computing

The concept of cloud computing has been in use for many years, but only in recent years has it come into the spotlight. Beginning in the 1990s, cloud computing was developed by major IT providers such as Sun, Microsoft, Google, and Amazon.

Different products came into use for different levels of users. The most popular services for end users are web-based email systems (SaaS), e.g. AOL, Gmail, Hotmail, and Yahoo! Mail, and office applications such as Google Docs, Microsoft Office Online, Cloud-canvas.com, and Write.fm.

Developers can run their programs on the cloud (PaaS) using platforms like Google App Engine, Windows Azure, and Force.com. Companies and organizations store or back up their large data sets on remote servers (IaaS), for example Rackspace, Microsoft Azure, Animoto, Jungle Disk, and Amazon's EC2 or S3 servers.

In 2011, the Primary Research Group (PRG) published a report on its recently conducted survey of library use of cloud computing (Primary Research Group, 2011). Participants included 70 libraries worldwide, with the majority from the United States. The report reveals that 61.97 percent of the libraries sampled used free SaaS, while 22.54 percent used paid-subscription SaaS; less than 3 percent used PaaS, and 4.23 percent used IaaS.

Most libraries using PaaS or IaaS had annual budgets over $5,000,000. Smaller libraries usually used the servers of their parent organizations, while libraries with multi-million dollar budgets tended to use their own servers.

  • A few case studies also show the exploration and adoption of other models of cloud computing in libraries. For example, California State University libraries have migrated their key library systems to vendors' cloud-based servers (i.e. a public cloud) and to campus IT's internally virtualized environment (i.e. a private cloud) (Wang, 2012). 
  • The Burritt Library at Central Connecticut State University used Amazon's S3 to back up their high-resolution digital objects (Iglesias, 2011). Murray State University library experimented with Dropbox for library services (Bagley, 2011). University of Arizona Libraries have migrated their ILS, Digital Libraries website, Interlibrary Loan system and repository software to cloud-based services (Han, 2010). 
  • The Z. Smith Reynolds Library in Winston Salem, NC, started to use Amazon's EC2 for hosting its website, discovery services, and digital library services in 2009 (Mitchell, 2010b).

In this service model, server administration and maintenance responsibilities move from local personnel to the hosting vendor, while management of the application remains traditional: librarians can still access the backend of the system for local customizations, as if they were managing the system locally.
