
5 Super SEO Blogger Tools

In this post, I explain the top blogging tools every blogger should consider. These tools help you write better, SEO-friendly blog posts.



1). Headline Analyzer: The best tool is the EMV Headline Analyzer. When you enter a headline, it analyzes it and gives you an EMV score. A headline scoring 50 or above generally performs better in SEO.

2). Headline Length Checker: The usual headline length is 50 to 60 characters. Beyond that, the headline gets truncated and looks ugly to search engine users. The SERP Snippet Optimization Tool is useful for previewing how your headline appears in search results.
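The length check described above can be sketched in a few lines of Python. This is a rough approximation: real search engines truncate titles by pixel width, not character count, so the 60-character limit here is an assumed rule of thumb, and `preview_headline` is a hypothetical helper, not part of any tool mentioned in this post.

```python
def preview_headline(headline: str, limit: int = 60) -> str:
    """Return the headline roughly as it might appear in search results.

    Headlines within the limit are shown as-is; longer ones are cut
    and given a trailing ellipsis, the way truncated SERP titles look.
    """
    if len(headline) <= limit:
        return headline
    # Truncate one character short of the limit to make room for "…".
    return headline[:limit - 1].rstrip() + "…"

print(preview_headline("5 Super SEO Blogger Tools"))
```

A headline like "5 Super SEO Blogger Tools" passes through unchanged, while anything longer than 60 characters comes back clipped with an ellipsis.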

3). Free Submission to Search Engines: Ping-O-Matic is a nice free submission tool. After you publish a blog post, you can submit your feed to Ping-O-Matic, and it notifies search engines for free.

4). Spell and Grammar Check: Another free tool is Grammarly. It checks your spelling and grammar so you can avoid small mistakes.

5). Keyword Analyzer: Wordstream Keyword Analyzer i…

How to Show a Data Science Project in a Resume

In any project, the data analyst's role is to deal with data, and the data for data science projects comes from multiple sources. This post explains how to present a data science project in your resume.

Data Science project for Resume

The first step toward any project interview is your resume. You need to describe your project clearly in it.

In interviews, you will be asked questions about your project, so the second step is to be able to explain the project.

The third point is to explain the roles you performed in your data science project. If you describe your roles correctly, your resume has a strong chance of being shortlisted. Depending on your experience, your resume can be one or two pages.

How to Show Technologies Used in Data Science Projects

In interviews, you will again be asked how you used different tools to complete your data science project.

So you need to be able to explain how you used the different options those tools offer. Sometimes interviewers ask about the specific features of a tool you worked with; you should be able to answer those questions too.

4 Key Points You Need to Present in Your Resume

  1. Write clear description
  2. Write specific role
  3. Explain Tools
  4. Write about data flow

1. Write Clear Description

Any data science project has a few common elements: the client's name, the client's expectations, and what you are going to deliver. You need to present these clearly in your resume.

2. Write Specific Role

To convince your interviewer, you need to describe your team's roles and your own specific role. In general, you will find the following roles.
  • Architect
  • Data scientist
  • Business Analyst
  • Development team
  • Testing Team
  • Integration testing team
  • Production release team

3. Explain Tools

You need to present all the tools your project uses, as well as the tools you worked with specifically. Then, in a face-to-face interview, you need to explain which options you used and what you achieved with them.

For example: "I used an integration tool to receive data into the development region and to send it out after unit testing."

If you explain these key points well, your chances of being selected improve greatly.

4. Write About Data Flow

You need to explain how the data arrives: is it a sequential dataset, or document data? You need to state this clearly.

You also need to explain in which form you send the data to the next region after unit testing. If you know this flow correctly, you can easily convince your interviewer.


The Tokenization Story You Need: Vault-Based vs Vault-less

The term tokenization refers to creating a numeric or alphanumeric value in place of the original card number, which makes it difficult for hackers to obtain original card numbers.

Vault tokenization is a concept in which a vault server creates a new token for each transaction when a customer uses a credit or debit card at a merchant outlet.

Let us look at an example from data analysis: here, card numbers are masked with other junk characters for security purposes.
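The masking idea above can be sketched as follows. This is a hedged illustration, not a production rule: keeping only the last four digits is a common display convention, but exact masking requirements vary by payment processor, and `mask_card` is a hypothetical helper invented for this example.

```python
def mask_card(card_number: str) -> str:
    """Mask a card number for display, keeping only the last four digits."""
    digits = card_number.replace(" ", "")
    # Replace every digit except the last four with a junk character.
    return "*" * (len(digits) - 4) + digits[-4:]

print(mask_card("4111 1111 1111 1234"))  # ************1234
```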

Popular Tokenization Servers

There are two kinds of servers currently popular for implementing tokenization:
  • Vault-based
  • Vault-less
Vault-based server: The term "vault-based" means both the card number and the token are stored in a table, usually a Teradata table. As transaction volume increases, handling this table becomes a big challenge.
Every tokenization stores a record for the card and its token. When you use a card multiple times, a new token is generated each time. That is the fundamental concept.
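That fundamental concept can be sketched with an in-memory vault. This is only an illustrative model under assumptions: the `VaultTokenizer` class and its methods are invented for this sketch, a real vault would be a database table (the post mentions Teradata), and real token formats are dictated by the payment platform.

```python
import secrets

class VaultTokenizer:
    """Toy model of a vault-based tokenizer: the vault maps tokens back
    to original card numbers, one record per transaction."""

    def __init__(self) -> None:
        self.vault: dict[str, str] = {}  # token -> original card number

    def tokenize(self, card_number: str) -> str:
        # A new random token is generated for every transaction, even
        # for the same card -- which is why the vault table keeps growing.
        token = secrets.token_hex(8)
        self.vault[token] = card_number
        return token

    def detokenize(self, token: str) -> str:
        # Look up the original card number; only the vault can do this.
        return self.vault[token]

vault = VaultTokenizer()
t1 = vault.tokenize("4111111111111234")
t2 = vault.tokenize("4111111111111234")
# Two transactions with the same card yield two different tokens,
# but both map back to the same card number.
```

This growth of one record per transaction is exactly the table-handling challenge the post describes for vault-based servers.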
So the challe…