
Featured post

5 Super SEO Blogger Tools

In this post, I explain the top blogging tools every blogger should consider. These tools help you write better, SEO-friendly blog posts.



1). Headline Analyzer: The best tool is the EMV Headline Analyzer. When you enter a headline, it analyzes it and gives you an EMV score. A score of 50 or above performs better for SEO.

2). Headline Length Checker: The usual headline length is 50 to 60 characters. Beyond that, the headline gets truncated and looks ugly to search engine users. The SERP Snippet Optimization Tool is useful for seeing how your headline appears in the search results.
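As a rough illustration, the truncation behaviour this tool previews can be sketched in a few lines of Python. The 60-character limit follows the guideline above; the function name and sample titles are my own choices for the example:

```python
def preview_headline(headline, max_chars=60):
    """Show a headline roughly as a search result would display it.

    Headlines longer than max_chars are cut off with an ellipsis,
    mimicking how long titles get truncated in search snippets.
    """
    if len(headline) <= max_chars:
        return headline
    return headline[:max_chars - 1].rstrip() + "…"

short = "5 Super SEO Blogger Tools"
long_title = "The Complete, Exhaustive and Absolutely Definitive Guide to Writing SEO Friendly Blog Posts"
```

Keeping your headlines within 50 to 60 characters avoids triggering this truncation in the first place.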

3). Free Submission to Search Engines: Ping-O-Matic is a nice free submission tool. After publishing your blog post, you can submit your feed to Ping-O-Matic, and it submits it to search engines for free.

4). Spell and Grammar Check: Grammarly is another free tool; it checks your spelling and grammar mistakes so that you can avoid small errors.

5). Keyword Analyzer: Wordstream Keyword Analyzer i…

Data Science Secrets: A Simple Project Useful for Practice

I want to share with you how to use Python for your data science or analytics projects. Many programmers struggle to learn data science because they do not know where to start. You can get hands-on experience if you start with a mini project.

I have used the Ubuntu operating system for this project.

Skills You Need to Become a Data Scientist

You need dual skills: learning and applying knowledge. In data science you need to learn and then apply what you have learned. After studying engineering, a person gets a B.Tech or M.Tech degree, but you become a real engineer only when you apply engineering principles. Data science is the same.


Data Visualization in Python is my simple project

Importance of Data

Data is a precious resource for solving machine learning and data science problems. First define what your problem is; then:
  1. Collect the data
  2. Wrangle the data and clean it
  3. Visualize the patterns
In olden days, you might have studied a subject called Statistical Analysis. In that subject, you studied the actual problem and collected the data in a notebook. Assume there were no computers in those days, and people used paper notebooks.

After that, they used pencil and graph paper to draw charts based on the selected data. It was a time-consuming and laborious process. Finally, based on the data visualization, people corrected the process. You can see the same concept in current data science projects.
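The three steps above can be sketched as a tiny Python pipeline. This is only an illustrative sketch with made-up numbers, not the project script itself:

```python
import math

# 1. Collect the data (here: a made-up list of daily readings;
#    None and -1.0 simulate missing and faulty entries)
raw_data = [12.5, None, 14.1, 13.8, -1.0, 15.2, 14.7]

# 2. Wrangle and clean: drop missing values and impossible negatives
clean_data = [x for x in raw_data if x is not None and x >= 0]

# 3. Visualize the patterns (summary statistics stand in for a chart here)
mean = sum(clean_data) / len(clean_data)
std = math.sqrt(sum((x - mean) ** 2 for x in clean_data) / len(clean_data))
print(f"n={len(clean_data)}, mean={mean:.2f}, std={std:.2f}")
```

On paper this took hours; in Python the collect-clean-visualize loop runs in milliseconds, which is why you can iterate on the process so quickly.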

Use Python to Write a Script


Steps You Need 

  • Install Ubuntu on a virtual machine
  • Install Python 3.7.x
  • Install Anaconda Python, which contains all the packages you need for data science projects

ls Command Shows the Scripts I Created for the Mini Project

You can list all the scripts with the ls command, and check the actual Python code inside a script with the less command.

Points to Remember

  1. Use the import command to import packages.
  2. matplotlib is a package; you need it to draw plots.
  3. NumPy is required for all mathematical and scientific calculations, so I imported NumPy.
  4. The 'as' keyword creates an alias, so you can save a lot of typing.
  5. I have drawn two plots: one uniform and the other normal.

Why I Used if __name__ == "__main__":

The real meaning is that the code under this check runs only when the script is executed as the main program. The module mypython.py can run standalone, or you can call it from another script using the import command.

In Python you create modules using the .py extension. Some people call them scripts and other people call them modules. All Python documentation and standard textbooks use the term Python module, so you can use it too.
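Putting the points above together, a script like mypython.py might look roughly like this. This is my own sketch of the idea, not the exact script from the screenshots; the sample sizes, random seed, and bin counts are illustrative choices:

```python
# mypython.py - draw a uniform and a normal distribution
import numpy as np                  # mathematical and scientific calculations
import matplotlib
matplotlib.use("Agg")               # render to a file, no display needed
import matplotlib.pyplot as plt     # 'as plt' is an alias to save typing

def draw_plots(filename="plots.png"):
    rng = np.random.default_rng(seed=42)
    uniform_data = rng.uniform(0, 1, size=1000)
    normal_data = rng.normal(loc=0, scale=1, size=1000)

    # Two plots side by side: one uniform, the other normal
    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
    ax1.hist(uniform_data, bins=30)
    ax1.set_title("Uniform Distribution")
    ax2.hist(normal_data, bins=30)
    ax2.set_title("Normal Distribution")
    fig.savefig(filename)
    return uniform_data, normal_data

if __name__ == "__main__":
    # Runs only when executed directly ($ python mypython.py),
    # not when this module is imported from another script.
    draw_plots()
```

Because of the if __name__ guard, importing mypython from another script will not trigger the plotting; only running it directly will.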


How to Execute mypython.py from the Command Line

$ python mypython.py

Two Plots Drawn - Normal Distribution and Uniform Distribution

Normal Distribution


 Uniform Distribution


Summary

These are simple projects you can try on your own. You need to draw these kinds of plots in data science projects using real-time data.

You can ask any questions in the comments section below.



Most Viewed

The Tokenization Story You Need: Vault-based vs Vault-less

The term tokenization refers to creating a numeric or alphanumeric value in place of the original card number, which makes it difficult for hackers to get the original card numbers.

Vault tokenization is a concept in which a vault server creates a new token for each transaction when a customer uses a credit or debit card at a merchant outlet. Here, card numbers are masked with other junk characters for security purposes.

Popular Tokenization Servers

There are two kinds of servers currently popular for implementing tokenization: vault-based and vault-less.
Vault-based server: the term vault-based means both the card number and the token are stored in a table, usually Teradata tables. As the volume of transactions grows, handling this table becomes a big challenge.
Every time tokenization happens, the server stores a record for the card and its token. When you use a card multiple times, it generates a new token each time. That is the fundamental concept.
So the challe…
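A vault-based token store can be modeled with a small Python class. This is a toy illustration only; a real vault sits behind a hardened database (such as the Teradata tables mentioned above) with encryption and access controls. It shows the key behaviour: reusing the same card still produces a fresh token for each transaction:

```python
import secrets

class VaultTokenizer:
    """Toy vault: maps every issued token back to the original card number."""

    def __init__(self):
        self._vault = {}  # token -> card number

    def tokenize(self, card_number):
        # A new random token is issued for every transaction,
        # even when the same card was tokenized before.
        token = secrets.token_hex(8)
        self._vault[token] = card_number
        return token

    def detokenize(self, token):
        # Look up the original card number stored for this token.
        return self._vault[token]

vault = VaultTokenizer()
t1 = vault.tokenize("4111111111111111")
t2 = vault.tokenize("4111111111111111")  # same card, new token
```

Because every transaction adds a row to the vault, the table grows with transaction volume, which is exactly the handling challenge described above.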