Featured Post

Scraping a Website: How to Write a Script in Python

Here's a Python script that you can use as a model to scrape a website. The logic below uses the BeautifulSoup package for web scraping.

import requests
from bs4 import BeautifulSoup

url = 'https://www.example.com'
response = requests.get(url)
soup = BeautifulSoup(response.text, 'html.parser')

# Print the title of the webpage
print(soup.title.text)

# Print all the links in the webpage
for link in soup.find_all('a'):
    print(link.get('href'))

In this script, we first import the Requests and Beautiful Soup libraries. We then define the URL we want to scrape and use Requests to send a GET request to that URL, passing the response text to Beautiful Soup to parse the HTML content of the webpage. We use Beautiful Soup to extract the title of the webpage and print it to the console, and a for loop to find all the links on the webpage and print their href attributes. This is just a basic example, but you can extend it to pull out whatever other elements you need.
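A slightly more defensive variation - a minimal sketch, assuming the same Requests/BeautifulSoup setup and the placeholder URL above - might also check the response status and pull out other elements, such as paragraph text and image sources:

import requests
from bs4 import BeautifulSoup

url = 'https://www.example.com'
# A User-Agent header often helps with simple bot filters; the value here is just an example
headers = {'User-Agent': 'Mozilla/5.0'}

response = requests.get(url, headers=headers, timeout=10)
response.raise_for_status()  # raise an error if the request failed

soup = BeautifulSoup(response.text, 'html.parser')

# Print the text of every paragraph on the page
for p in soup.find_all('p'):
    print(p.get_text(strip=True))

# Print the source URL of every image on the page
for img in soup.find_all('img'):
    print(img.get('src'))

The call to raise_for_status() makes a failed request fail loudly instead of silently parsing an error page.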

Social Analytics - How Marketers Will Use It

Of all the windows through which a business can peer into an audience, social media seems the most enticing. The breadth of subjects, the range of observations and, above all, the ability to connect and draw inferences make social analytics hugely exciting for anyone interested in understanding and influencing past, present and potential customers, employees, or even investors.

As individuals leave traces of their activities - personal, social and professional - on the internet, they allow an unprecedented view into their lives, thoughts, influences and preferences. Social analytics attempts to draw useful understanding and inferences from these traces, which could be relevant to marketers, salespeople, HR managers, product designers, investors and so on. As social tools like Facebook, Twitter, LinkedIn, WhatsApp and many more host a plethora of social activities, a humongous amount of data is generated about people's preferences, behaviour and sentiments. Like any data, it is amenable to analysis to gain useful insights.

The challenge comes from the sheer volume, velocity and variety of this data. It is very difficult to ensure that the analysis is relevant and reliable. Besides the daunting technical intricacies of setting up the appropriate analytics, the choice of information sources, the filtering of the right data, and its interpretation and aggregation are all susceptible to errors and biases. For example, some social activities are relatively easy to access (like activity on Twitter, or public updates on Facebook), while many are not. Some types of data (like text or location) are easy to search and interpret, while many (like pictures) are not. A good analysis model must therefore judiciously compensate for the nature of the sources included, and it can at times be very difficult to assess whether the analysis is useful or just meaningless mumbo-jumbo.
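To make the filtering and aggregation step concrete, here is a toy sketch in Python. The sample posts, the brand keyword and the tiny sentiment word lists are all made up for illustration and stand in for data you would actually pull from a platform's API.

# Toy illustration of filtering and aggregating social text data.
# The posts, keyword and word lists below are hypothetical.
posts = [
    "Loving the new FooPhone camera, great upgrade!",
    "FooPhone battery is terrible, very disappointed.",
    "Weather is nice today.",
    "The FooPhone screen looks amazing in daylight.",
]

keyword = "foophone"
positive_words = {"loving", "great", "amazing", "good"}
negative_words = {"terrible", "disappointed", "bad"}

# Filter: keep only posts that mention the brand
relevant = [p for p in posts if keyword in p.lower()]

# Aggregate: a naive sentiment tally based on word matches
positive = negative = 0
for post in relevant:
    words = {w.strip(".,!?").lower() for w in post.split()}
    if words & positive_words:
        positive += 1
    if words & negative_words:
        negative += 1

print(f"{len(relevant)} relevant posts: {positive} positive, {negative} negative")

Even this toy example shows how strongly the result depends on which posts are kept and which word lists are used, which is exactly where the errors and biases mentioned above creep in.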

