Big Data Quiz 2: Top Hadoop Interview Questions

I hope you enjoyed my previous post. This is the second set of questions, exclusively for big data engineers.

Read QUIZ-1.

QUESTION 1
You have submitted a job on an input file that has 400 input splits in HDFS. How many map tasks will run?
A. At most 400.
B. At least 400.
C. Between 400 and 1200.
D. Between 100 and 400.
Ans: B (one map task runs per input split; failed or speculative re-executions can only push the count above 400)

QUESTION 2
What is not true about LocalJobRunner mode? (Choose two)
A. It requires a JobTracker to be up and running.
B. It runs the Mapper and Reducer in a single process.
C. It stores output in the local file system.
D. It allows use of the DistributedCache.
Ans: A, D

QUESTION 3
What command will you use to run a driver named "SalesAnalysis", whose compiled code is available in the jar file "SalesAnalytics.jar", with input data in the directory "/sales/data" and output in the directory "/sales/analysis"?
A. hadoop fs –jar SalesAnalytics.jar SalesAnalysis -input /sales/data -output /sales/analysis
B. hadoop fs jar SalesAnalytics.jar -input /sales/data -output /sales/analysis
C. hadoop –jar SalesAnalytics.jar SalesAnalysis -input /sales/data -output /sales/analysis
D. hadoop jar SalesAnalytics.jar SalesAnalysis /sales/data /sales/analysis
Ans: D
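
For context, option D works because a typical driver reads its input and output paths as plain positional arguments. Below is a hedged sketch of what such a SalesAnalysis driver might look like, assuming the common Tool/ToolRunner pattern; the class body is illustrative, not the actual code behind the question.

import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class SalesAnalysis extends Configured implements Tool {
    @Override
    public int run(String[] args) throws Exception {
        Job job = Job.getInstance(getConf(), "sales analysis");
        job.setJarByClass(SalesAnalysis.class);
        // The driver itself consumes the two positional arguments, which is
        // why option D passes the paths without -input/-output flags.
        FileInputFormat.addInputPath(job, new Path(args[0]));    // /sales/data
        FileOutputFormat.setOutputPath(job, new Path(args[1]));  // /sales/analysis
        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new SalesAnalysis(), args));
    }
}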

QUESTION 4
A MapReduce program takes a text file in which each line is one complete record and the line's byte offset is the key. The map method parses each record into words and, for each word, emits multiple key-value pairs whose key is the word itself and whose values are the characters of the word. The reducer then finds the characters used in each unique word. The program works correctly, but it produces many more key-value pairs in the mappers' intermediate output than it receives as input. This leads to an increase in which of the following? (Select one)
A. Disk I/O and network traffic.
B. Memory footprint of the mappers and network traffic.
C. Disk I/O and memory footprint of the mappers.
D. Block size and disk I/O.
Ans: A (every extra intermediate record must be spilled to local disk and then shuffled to the reducers over the network)
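
To make the problem concrete, here is a hedged sketch of the mapper the question describes (the class name WordCharsMapper is hypothetical): a single input line fans out into one intermediate pair per character of every word, which is exactly the output explosion that inflates spill I/O and shuffle traffic.

import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class WordCharsMapper extends Mapper<LongWritable, Text, Text, Text> {
    private final Text word = new Text();
    private final Text character = new Text();

    @Override
    protected void map(LongWritable offset, Text line, Context context)
            throws IOException, InterruptedException {
        for (String w : line.toString().split("\\s+")) {
            if (w.isEmpty()) continue;
            word.set(w);
            // One intermediate record per character: a 10-letter word turns
            // one input record into 10 map outputs, all of which must be
            // spilled to local disk and shuffled over the network.
            for (char c : w.toCharArray()) {
                character.set(String.valueOf(c));
                context.write(word, character);
            }
        }
    }
}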

QUESTION 5
What is true about HDFS? (Select one)
A. It is suitable for storing a large number of small files.
B. It is suitable for storing a small number of small files.
C. It is suitable for storing a large number of large files.
D. It is suitable for storing a small number of large files.
Ans: C

QUESTION 6
You have just executed a MapReduce job. Where is the intermediate data written after being emitted from the mapper's map method?
A. The intermediate data is transmitted directly to the reducers and is not written to disk anywhere.
B. The intermediate data is written to HDFS.
C. The intermediate data is written to in-memory buffers that spill over to the local file system of the tasktracker machine on which the map task runs.
D. The intermediate data is written to in-memory buffers that spill over to the local file system of the tasktracker machine on which the reduce task runs.
E. The intermediate data is written to in-memory buffers that spill over to HDFS on the tasktracker machine on which the reduce task runs.
Ans: C (map output never goes to HDFS; it spills to the local disk of the node running the map task)
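
The spill behavior in option C is tunable. As a small illustration, a minimal sketch using the classic MRv1 property names (newer releases use renamed mapreduce.* equivalents):

import org.apache.hadoop.conf.Configuration;

public class SpillTuning {
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        // Size of the in-memory buffer that collects map output, in MB.
        conf.setInt("io.sort.mb", 100);
        // Fraction of the buffer that may fill before a background thread
        // spills its contents to the tasktracker's local disk (not HDFS).
        conf.setFloat("io.sort.spill.percent", 0.80f);
    }
}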

QUESTION 7
You are developing a MapReduce job for reporting. The mapper will process input keys representing the year (IntWritable) and input values representing product identities (Text). What determines the data types used by the mapper for a given job?
A. The key and value types specified in the JobConf.setMapInputKeyClass and JobConf.setMapInputValueClass methods.
B. The data types specified in the HADOOP_MAP_DATATYPES environment variable.
C. A mapper-specification.xml file submitted with the job determines the mapper's input key and value types.
D. The InputFormat used by the job determines the mapper's input key and value types.
Ans: D
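
A short sketch of why option D is the right mental model. SequenceFileInputFormat over <IntWritable, Text> records is one plausible choice for year/product data, not something the question specifies:

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.SequenceFileInputFormat;

public class TypesFromInputFormat {
    // Because the InputFormat below delivers <IntWritable, Text> records, the
    // mapper must be declared with those same input key/value types; there is
    // no setMapInputKeyClass method, environment variable, or XML file for it.
    public static class YearProductMapper
            extends Mapper<IntWritable, Text, IntWritable, Text> {
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance();
        job.setInputFormatClass(SequenceFileInputFormat.class);
        job.setMapperClass(YearProductMapper.class);
    }
}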

QUESTION 8
What types of algorithms are difficult to express in MapReduce v1 (MRv1)?
A. Algorithms that require applying the same mathematical function to large numbers of individual binary records.
B. Relational operations on large amounts of structured and semi-structured data.
C. Algorithms that require global, shared state.
D. Large-scale graph algorithms that require one-step link traversal.
E. Text analysis algorithms on large collections of unstructured text (e.g., web crawls).
Ans: C

QUESTION 9
You wrote a map function that throws a runtime exception when it encounters any control character in the input data. The input you supplied contains 12 such characters spread across five input splits: the first four input splits have 2 control characters each, and the 5th input split has 4.
Identify the number of failed task attempts if the job is run with mapred.max.map.attempts = 4.
A. You will have 48 failed task attempts.
B. You will have 12 failed task attempts.
C. You will have 5 failed task attempts.
D. You will have 20 failed task attempts.
E. You will have 17 failed task attempts.
Ans: D (each of the 5 map tasks dies on its first control character, is retried up to the 4-attempt limit, and therefore contributes 4 failed attempts: 5 × 4 = 20)
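
For reference, the attempt limit in the question is an ordinary job property. A minimal sketch of setting it, using the classic MRv1 name (the MRv2 equivalent is mapreduce.map.maxattempts):

import org.apache.hadoop.conf.Configuration;

public class AttemptLimit {
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        // Each map task may run at most 4 attempts. Since every attempt here
        // dies on the first control character it meets, all 5 tasks exhaust
        // their attempts: 5 tasks x 4 attempts = 20 failed attempts.
        conf.setInt("mapred.max.map.attempts", 4);
    }
}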

QUESTION 10
Which programming languages does MapReduce support?
A. The most common language is Java, but scripting languages are also supported via Hadoop Streaming.
B. Any programming language that can comply with the MapReduce concept can be supported.
C. Only Java is supported, since Hadoop was written in Java.
D. Currently MapReduce supports Java, C, C++ and COBOL.
Ans: A, B

QUESTION 11
What is true about LocalJobRunner?
A. It can be configured with as many reducers as it needs.
B. You can use partitioners.
C. It can use the local file system as well as HDFS.
D. It can only use the local file system.
Ans: D
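
A minimal sketch of forcing local mode, using the classic MRv1 property names that this quiz's vintage assumes:

import org.apache.hadoop.conf.Configuration;

public class LocalModeSetup {
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        // "local" selects LocalJobRunner: no JobTracker, a single JVM, and at
        // most one reducer.
        conf.set("mapred.job.tracker", "local");
        // Pointing the default file system at file:/// matches option D.
        conf.set("fs.default.name", "file:///");
    }
}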
