Featured Post

Scraping Website: How to Write a Script in Python

Here's a Python script that you can use as a model to scrape a website. The script below uses the BeautifulSoup package for web scraping.

import requests
from bs4 import BeautifulSoup

url = 'https://www.example.com'

# Send a GET request and parse the HTML response
response = requests.get(url)
soup = BeautifulSoup(response.text, 'html.parser')

# Print the title of the webpage
print(soup.title.text)

# Print all the links in the webpage
for link in soup.find_all('a'):
    print(link.get('href'))

In this script, we first import the Requests and Beautiful Soup libraries. We define the URL we want to scrape and use Requests to send a GET request to it, then pass the response text to Beautiful Soup to parse the HTML content of the page. Beautiful Soup extracts the title of the webpage, which we print to the console, and a for loop finds all the links on the page and prints their href attributes. This is just a basic example, but it shows the pattern you can build on to extract whatever elements you need.

Amazon Web Service Import/Export Commands

For a process like a Hadoop cluster that is already running in the cloud, the main input for data processing is a huge volume of data. The big question is how to send that data to the cloud from a local machine.
Sending a huge volume of data to the cloud over the network is NOT easy.

AWS Import/Export

AWS introduced a feature called Import/Export: you ship a hard drive to AWS, and they upload your data directly into S3 storage.
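As a minimal sketch of what the job-creation call can look like from Python, the snippet below assumes boto3's low-level importexport client for the legacy Import/Export (disk) service; the bucket name, device ID, and return address in the manifest are illustrative placeholders, and the full manifest format is defined in the AWS Import/Export documentation.

import boto3

# Legacy AWS Import/Export (disk) service; credentials and region come
# from your normal AWS configuration.
client = boto3.client('importexport')

# Illustrative manifest: every value below is a placeholder.
manifest = """\
manifestVersion: 2.0
deviceId: ABCDE
eraseDevice: yes
bucket: my-import-bucket
returnAddress:
  name: Your Name
  street1: 123 Example Street
  city: Anytown
  stateOrProvince: WA
  postalCode: 00000
  country: USA
  phoneNumber: 555-0100
"""

# ValidateOnly=True checks the manifest without creating a real job;
# set it to False to actually create the import job.
response = client.create_job(JobType='Import', Manifest=manifest, ValidateOnly=True)
print(response)

When a real job is created, the response also includes a signature file that goes in the root of the drive before you ship it.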

How does networking become a hurdle when moving data to the cloud? A couple of sample calculations, based on the time to transfer 1 TB at 80% network utilization (worked out in the sketch after the list):

A) At 1.544 Mbps (a T1 line), the transfer takes about 82 days - so once your data reaches 100 GB or more, Import/Export is the way to go.

B) At 10 Mbps, the transfer takes about 13 days - so once your data reaches 600 GB or more, Import/Export is the way to go.
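Here is a small Python sketch of that arithmetic; it assumes the same 80% sustained utilization, and the link speeds in the loop are just examples.

# Estimate how long a network transfer takes, to decide whether
# shipping a drive via Import/Export is the faster option.
def transfer_days(data_gib, link_mbps, utilization=0.8):
    bits = data_gib * 1024**3 * 8                   # data volume in bits
    effective_bps = link_mbps * 1_000_000 * utilization
    return bits / effective_bps / 86400             # seconds -> days

for mbps in (1.544, 10, 100):
    print(f"{mbps:>7} Mbps: {transfer_days(1024, mbps):5.1f} days for 1 TB")

At 1.544 Mbps this prints roughly 82 days for 1 TB and roughly 13 days at 10 Mbps, matching the figures above.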

