Scraping a Website: How to Write a Script in Python

Here's a Python script that you can use as a model to scrape a website. The logic below uses the Beautiful Soup package for web scraping.

```python
import requests
from bs4 import BeautifulSoup

url = 'https://www.example.com'
response = requests.get(url)
soup = BeautifulSoup(response.text, 'html.parser')

# Print the title of the webpage
print(soup.title.text)

# Print all the links in the webpage
for link in soup.find_all('a'):
    print(link.get('href'))
```

In this script, we first import the Requests and Beautiful Soup libraries. We then define the URL we want to scrape and use Requests to send a GET request to it. The response text is passed to Beautiful Soup, which parses the HTML of the page. From the parsed document we extract the page title and print it to the console, and a for loop finds all the links on the page and prints their href attributes. This is just a basic example, but it covers the core pattern: fetch the page, parse it, and pull out the elements you need.
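The same link-extraction idea can be sketched without any third-party packages, using only the standard library's `html.parser`. This is a minimal illustration of what `soup.find_all('a')` does under the hood, parsing a hard-coded HTML string rather than fetching a live page:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href attributes from <a> tags, similar to soup.find_all('a')."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the tag's attributes
        if tag == 'a':
            for name, value in attrs:
                if name == 'href' and value is not None:
                    self.links.append(value)

# A small inline document stands in for response.text
html = '<html><body><a href="/a">A</a> <a href="/b">B</a></body></html>'
parser = LinkExtractor()
parser.feed(html)
print(parser.links)  # → ['/a', '/b']
```

For real scraping work, Beautiful Soup remains the more convenient choice; the stdlib version is mainly useful when you cannot install dependencies.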

Hyderabad-Based Startup Built Largest-Ever Big Data Electoral Repository

I came across an email from my friend saying that they are creating a Hadoop project to analyze voter data. This project, in my view, is both academic and research oriented.
The real challenge was extracting voter information from 2.5 crore (25 million) PDF pages and translating it into English to fuse with other sources. The technology was a big hurdle.
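The article doesn't describe the startup's actual extraction pipeline, but one step in any such effort is turning raw text pulled from PDF pages into structured records. The sketch below assumes a hypothetical line format (`Name: … Age: … Booth: …`); the real electoral-roll layout would differ, and the PDF-to-text step itself (via OCR or a PDF library) is omitted:

```python
import re

# Hypothetical record layout for illustration; real electoral rolls differ.
VOTER_RE = re.compile(
    r'Name:\s*(?P<name>.+?)\s+Age:\s*(?P<age>\d+)\s+Booth:\s*(?P<booth>\S+)'
)

def parse_voters(page_text):
    """Extract structured voter records from one page of extracted PDF text."""
    return [m.groupdict() for m in VOTER_RE.finditer(page_text)]

page = "Name: A Kumar Age: 34 Booth: HYD-101\nName: B Rao Age: 52 Booth: HYD-102"
records = parse_voters(page)
print(records[0])  # → {'name': 'A Kumar', 'age': '34', 'booth': 'HYD-101'}
```

At 2.5 crore pages, a parser like this would run as a distributed job (e.g., a Hadoop map task per page batch) rather than a single-machine loop.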

Hadoop Project

The infrastructure, built especially for the project, included a 64-node Hadoop cluster, PostgreSQL, and servers that process a master file containing over 8 terabytes of data.

Besides this, testing and validation was another big task. ‘First of a kind’ heuristic (machine learning) algorithms were developed to classify people based on name, geography, and other attributes, which helped identify religion, caste, and even ethnicity.
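The article gives no detail on these algorithms, but a name-based heuristic in its simplest form might look like the toy sketch below: extract a surface feature (the last token of a name) and look it up in a table. The suffix table and the neutral `group-A`/`group-B` labels are entirely hypothetical; the startup's actual classifiers were described as machine-learning models, not fixed lookups:

```python
# Hypothetical suffix-to-group table, purely for illustration.
SUFFIX_HINTS = {'rao': 'group-A', 'kumar': 'group-B'}

def classify_name(full_name):
    """Toy heuristic: classify a person by the last token of their name."""
    last_token = full_name.strip().lower().split()[-1]
    return SUFFIX_HINTS.get(last_token, 'unknown')

print(classify_name('B Rao'))      # → group-A
print(classify_name('Jane Doe'))   # → unknown
```

A production system would combine many such weak signals (name tokens, geography, language of the source document) and, as the article notes, would need substantial testing and validation, since heuristics like this are inherently error-prone.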

Data from Sources

“Data from multiple sources like census, economic and social surveys were mapped to polling booths. Simultaneously, external and proprietary data sources had to be fused with individual voters’ data,” said Joshi.
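Mapping booth-level survey data onto individual voter records is essentially a join on a shared booth identifier. The sketch below shows the idea with plain dictionaries; the booth IDs and the `literacy_rate` field are made up for illustration (a real pipeline would do this join at scale in Hadoop or SQL):

```python
# Hypothetical booth-level census figures, keyed by booth ID.
census_by_booth = {
    'HYD-101': {'literacy_rate': 0.81},
    'HYD-102': {'literacy_rate': 0.74},
}

voters = [
    {'name': 'A Kumar', 'booth': 'HYD-101'},
    {'name': 'B Rao',   'booth': 'HYD-102'},
]

def fuse(voters, census):
    """Attach booth-level attributes to each individual voter record."""
    return [{**v, **census.get(v['booth'], {})} for v in voters]

fused = fuse(voters, census_by_booth)
print(fused[0])  # → {'name': 'A Kumar', 'booth': 'HYD-101', 'literacy_rate': 0.81}
```

Voters whose booth has no census entry simply keep their original fields, which is one reasonable way to handle gaps when fusing imperfect external sources.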
