Big Data: Top Hadoop Interview Questions (4 of 5)

Hadoop, Jobs, Career
1) What is a MapReduce program?
- A MapReduce program gives the actual steps the job will carry out
- You have to write the scripts and code that implement those steps

2) What is MapReduce?
- MapReduce is a data processing model
- It is a combination of two parts: Mappers and Reducers

3) What happens in the mapping phase?
It takes the input data and feeds each data element into the mapper
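
To make the mapping phase concrete, here is a minimal word-count mapper sketch using the Hadoop Java API. The class name TokenMapper and the word-count task are illustrative choices, not something from the original post.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Illustrative word-count mapper: every word in an input line is emitted
// as the (key, value) pair (word, 1) for the reducers to aggregate.
public class TokenMapper extends Mapper<LongWritable, Text, Text, IntWritable> {

    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    protected void map(LongWritable offset, Text line, Context context)
            throws IOException, InterruptedException {
        StringTokenizer tokens = new StringTokenizer(line.toString());
        while (tokens.hasMoreTokens()) {
            word.set(tokens.nextToken());
            context.write(word, ONE); // each data element becomes a (key, value) pair
        }
    }
}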

4) What is the function of the Reducer?
The reducer processes all the outputs from the mappers and arrives at a final result
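
A matching reducer sketch for the same illustrative word-count job: all the (word, 1) pairs emitted by the mappers for one word arrive together, and the reducer sums them into the final count.

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

// Illustrative word-count reducer: sums all counts emitted for a word
// and writes the final (word, total) pair.
public class TokenReducer extends Reducer<Text, IntWritable, Text, IntWritable> {

    @Override
    protected void reduce(Text word, Iterable<IntWritable> counts, Context context)
            throws IOException, InterruptedException {
        int sum = 0;
        for (IntWritable count : counts) {
            sum += count.get();
        }
        context.write(word, new IntWritable(sum));
    }
}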

5) What kind of input does MapReduce require?
The input should be structured in the form of (key, value) pairs
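
A minimal driver sketch for the same illustrative job shows where those (key, value) types are declared; the job name, class names, and path arguments are placeholders.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Illustrative driver: wires the mapper and reducer together and declares
// the (key, value) types that flow between them.
public class WordCountDriver {
    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCountDriver.class);

        job.setMapperClass(TokenMapper.class);
        job.setReducerClass(TokenReducer.class);

        job.setOutputKeyClass(Text.class);          // key type of the emitted pairs
        job.setOutputValueClass(IntWritable.class); // value type of the emitted pairs

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}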

6) What is HDFS?
HDFS is a file system designed for large-scale data processing under frameworks such as MapReduce.

7) Is HDFS like UNIX?
No, but many HDFS commands work similarly to their UNIX counterparts

8) What is a simple file listing command?
hadoop fs -ls

9) How do you copy data into HDFS?
Use hadoop fs -put (or hadoop fs -copyFromLocal) to copy a file from the local file system into HDFS, for example:
hadoop fs -put localfile.txt /user/$USER/
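
The same copy can also be done programmatically through the HDFS Java API; the file and directory paths below are placeholders.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Illustrative HDFS copy: the programmatic equivalent of hadoop fs -put,
// followed by a listing of the target directory (like hadoop fs -ls).
public class HdfsCopyExample {
    public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(new Configuration());

        fs.copyFromLocalFile(new Path("localfile.txt"), new Path("/user/hadoop/"));

        for (FileStatus status : fs.listStatus(new Path("/user/hadoop"))) {
            System.out.println(status.getPath());
        }
    }
}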

10) What is the default working directory in HDFS?
/user/$USER
$USER ==> your login user name
