
Top 100 Hadoop Complex interview questions (Part 4 of 4)

The Hadoop framework is one of the most popular choices for data analytics and data-related projects. Here is my fourth and final set of questions for you to read quickly.

1) What is MapReduce?
Ans) It is a framework or a programming model that is used for processing large data sets over clusters of computers using distributed programming.

2). What are ‘maps’ and ‘reduces’?
Ans). 'Map' and 'Reduce' are the two phases of solving a query in HDFS. The map is responsible for reading data from the input location and, based on the input type, generating key-value pairs, that is, the intermediate output, on the local machine. The reducer is responsible for processing the intermediate output received from the mapper and generating the final output.

3). What are the four basic parameters of a mapper?

Ans). The four basic parameters of a mapper are LongWritable, Text, Text, and IntWritable. The first two represent the input parameters and the second two represent the intermediate output parameters.
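To make those four parameters concrete, here is a minimal word-count style mapper sketch. The class name and the whitespace tokenizing are only illustrative assumptions, not something prescribed by the question.

```java
import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// The four type parameters: LongWritable (input key: byte offset of the line),
// Text (input value: the line itself), Text and IntWritable (intermediate output).
public class WordCountMapper
        extends Mapper<LongWritable, Text, Text, IntWritable> {

    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // split the line into words and emit an intermediate (word, 1) pair for each
        for (String token : value.toString().split("\\s+")) {
            if (!token.isEmpty()) {
                word.set(token);
                context.write(word, ONE);
            }
        }
    }
}
```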

4). What are the four basic parameters of a reducer?
Ans). The four basic parameters of a reducer are Text, IntWritable, Text, and IntWritable. The first two represent the intermediate output parameters and the second two represent the final output parameters.
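A matching reducer sketch for the hypothetical WordCountMapper above: the first pair of type parameters matches the mapper's intermediate output, and the second pair is the final output written to HDFS.

```java
import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

// Type parameters: Text, IntWritable (intermediate output received from the mapper)
// and Text, IntWritable (final output).
public class WordCountReducer
        extends Reducer<Text, IntWritable, Text, IntWritable> {

    private final IntWritable result = new IntWritable();

    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        int sum = 0;
        for (IntWritable value : values) {
            sum += value.get();        // add up all the counts for this key
        }
        result.set(sum);
        context.write(key, result);    // one final output record per key
    }
}
```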

5). What do the master class and the output class do?
Ans). The master class is defined to update the master (the JobTracker), and the output class is defined to write data to the output location.

6). What is the input type/format in MapReduce by default?
Ans). By default, the input type in MapReduce is ‘text’.

7). Is it mandatory to set input and output type/format in MapReduce?
Ans) No, it is not mandatory to set the input and output type/format in MapReduce. By default, the cluster takes the input and the output type as ‘text’.

8) What does the text input format do?
Ans). In the text input format, each line of the file becomes a record (a line object). The key is the byte offset of the line within the file and the value is the whole line of text. This is how the data gets processed by a mapper: the mapper receives the key as a ‘LongWritable‘ parameter and the value as a ‘Text‘ parameter.

9). What does the JobConf class do?
Ans). MapReduce needs to logically separate the different jobs running on the same cluster. The ‘JobConf class‘ helps to do job-level settings, such as declaring a job in the real environment. It is recommended that the job name be descriptive and represent the type of job being executed.

10). What does conf.setMapperClass do?
Ans). conf.setMapperClass sets the mapper class and everything related to the map job, such as reading the data and generating key-value pairs out of the mapper.
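The question uses the older JobConf API from org.apache.hadoop.mapred; in the newer MapReduce API, the same job-level settings, including the descriptive job name from question 9 and setMapperClass from question 10, live on the Job class. A minimal driver sketch, reusing the hypothetical mapper and reducer classes from the earlier examples:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCountDriver {

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "word-count");   // descriptive job name (question 9)
        job.setJarByClass(WordCountDriver.class);

        job.setMapperClass(WordCountMapper.class);       // question 10: wires in the map logic
        job.setReducerClass(WordCountReducer.class);

        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```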

11). What do sorting and shuffling do?
Ans). Sorting and shuffling are responsible for creating a unique key and a list of values. Bringing identical keys together in one location is known as sorting, and the process by which the intermediate output of the mapper is sorted and sent across to the reducers is known as shuffling.

12). What does a split do?
Ans). Before the data is transferred from its location on disk to the map method, there is a phase called the ‘split‘. The split pulls a block of data from HDFS into the framework.

The split does not write anything; it only reads data from the block and passes it to the mapper. By default, the split is taken care of by the framework, the split size is equal to the block size, and it is used to divide the block into a bunch of splits.

13). How can we change the split size if our commodity hardware has less storage space?
Ans). If our commodity hardware has less storage space, we can change the split size by writing a ‘custom splitter‘. Hadoop supports this kind of customization, and it can be invoked from the main (driver) method.
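Writing a full custom InputFormat is one way to do this; for simply capping the split size, the FileInputFormat helpers (or the equivalent configuration properties) are usually enough. A small sketch under that assumption; the 32 MB figure and the class name are just for illustration.

```java
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;

public class SplitSizeConfig {

    // Cap the split size so that each map task fits comfortably on low-end hardware.
    public static void useSmallSplits(Job job) {
        // at most 32 MB per split instead of the full HDFS block size
        FileInputFormat.setMaxInputSplitSize(job, 32L * 1024 * 1024);

        // the same effect can be had by setting the configuration properties
        // mapreduce.input.fileinputformat.split.maxsize and
        // mapreduce.input.fileinputformat.split.minsize directly.
    }
}
```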

14). What does a MapReduce partitioner do?
Ans). A MapReduce partitioner makes sure that all the values of a single key go to the same reducer, which allows an even distribution of the map output over the reducers. It redirects the mapper output to the reducers by determining which reducer is responsible for a particular key.
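A sketch of a custom partitioner; the class name and the first-letter rule are purely illustrative. The default HashPartitioner applies the same idea using the key's hash code, and a custom one is registered in the driver with job.setPartitionerClass(...).

```java
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Partitioner;

// Every record with the same key lands in the same partition, and therefore
// on the same reducer; here the partition is chosen by the key's first letter.
public class FirstLetterPartitioner extends Partitioner<Text, IntWritable> {

    @Override
    public int getPartition(Text key, IntWritable value, int numReduceTasks) {
        if (numReduceTasks == 0) {
            return 0;                                   // map-only job edge case
        }
        String k = key.toString();
        char first = k.isEmpty() ? ' ' : Character.toLowerCase(k.charAt(0));
        return first % numReduceTasks;                  // same first letter -> same reducer
    }
}
```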

15). How is Hadoop different from other data processing tools?
Ans). In Hadoop, based upon your requirements, you can increase or decrease the number of mappers without worrying about the volume of data to be processed. This is the beauty of parallel processing, in contrast to the other data processing tools available.

16). Can we rename the output file?
Ans). Yes, we can rename the output file by using the multiple-outputs mechanism (the MultipleOutputFormat/MultipleOutputs classes).
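A reducer sketch using MultipleOutputs from the newer API, assuming the Text/IntWritable word-count types from the earlier examples; the base name "wordcounts" is just an example.

```java
import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.output.MultipleOutputs;

public class RenamedOutputReducer
        extends Reducer<Text, IntWritable, Text, IntWritable> {

    private MultipleOutputs<Text, IntWritable> mos;

    @Override
    protected void setup(Context context) {
        mos = new MultipleOutputs<>(context);
    }

    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        int sum = 0;
        for (IntWritable value : values) {
            sum += value.get();
        }
        // output files are named wordcounts-r-00000 and so on, instead of part-r-00000
        mos.write(key, new IntWritable(sum), "wordcounts");
    }

    @Override
    protected void cleanup(Context context) throws IOException, InterruptedException {
        mos.close();
    }
}
```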

17). Why can we not do aggregation (addition) in a mapper? Why do we need a reducer for that?
Ans). We cannot do aggregation (addition) in a mapper because sorting is not done in a mapper; sorting happens only on the reducer side. Each mapper instance is initialized for its own input split, and the map method is called separately for every record, so while aggregating we would lose the values from the previous records. A mapper therefore never sees all the values of a key together and cannot keep track of values belonging to other records or other splits; only the reducer receives the complete list of values for a key, which is why aggregation is done there.

18) What is Streaming?
Ans). Streaming is a feature of the Hadoop framework that allows us to write MapReduce programs in any programming language that can accept standard input and produce standard output. It could be Perl, Python, or Ruby, and it does not necessarily have to be Java. However, customization of MapReduce internals can only be done using Java and not any other programming language.

19). What is a Combiner?
Ans). A ‘Combiner’ is a mini reducer that performs the local reduce task. It receives the input from the mapper on a particular node and sends its output to the reducer. Combiners help to improve the efficiency of MapReduce by reducing the amount of data that has to be sent to the reducers.
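Setting a combiner is a one-liner in the driver. In the word-count sketch from earlier, the reducer itself can double as the combiner because summing counts is associative and commutative; this is an illustration under that assumption, not something every job can safely do.

```java
import org.apache.hadoop.mapreduce.Job;

public class CombinerSetup {

    // Called from the driver after the mapper and reducer have been configured.
    // The combiner runs a local reduce on each mapper's node, so far less data
    // has to cross the network during the shuffle.
    public static void enableCombiner(Job job) {
        job.setCombinerClass(WordCountReducer.class);
    }
}
```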

20). What is the difference between an HDFS Block and Input Split?

Ans). HDFS Block is the physical division of the data and Input Split is the logical division of the data.

21). What happens in a text input format?
Ans). In text input format, each line in the text file is a record. The key is the byte offset of the line and the value is the content of the line. For instance, key: LongWritable, value: Text.

22). What do you know about key-value text input format?
Ans). In key-value text input format, each line in the text file is a ‘record‘. Each line is split at the first separator character: everything before the separator is the key and everything after the separator is the value. For instance, key: Text, value: Text.
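A driver-side sketch of switching to the key-value text input format; the comma separator is only an example, and the property name is the one used by the newer MapReduce API (an assumption worth verifying against your Hadoop version).

```java
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.KeyValueTextInputFormat;

public class KeyValueInputSetup {

    // Each line is split at the first separator character (a tab by default);
    // the part before it becomes the Text key and the rest becomes the Text value.
    public static void configure(Job job) {
        job.setInputFormatClass(KeyValueTextInputFormat.class);

        // use ',' instead of the default tab as the key/value separator
        job.getConfiguration().set(
                "mapreduce.input.keyvaluelinerecordreader.key.value.separator", ",");
    }
}
```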

23). What do you know about Sequence file input format?
Ans). SequenceFileInputFormat is an input format for reading sequence files. The key and value are user-defined. It is a specific compressed binary file format that is optimized for passing data from the output of one MapReduce job to the input of another MapReduce job.
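A sketch of the typical job-chaining use: the first job writes sequence files and the second reads them back without any text parsing in between; the class and method names here are illustrative.

```java
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.SequenceFileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat;

public class ChainedJobFormats {

    // First job: write the intermediate result as a compact binary sequence file.
    public static void configureProducer(Job firstJob) {
        firstJob.setOutputFormatClass(SequenceFileOutputFormat.class);
    }

    // Second job: read that sequence file directly, keeping the user-defined
    // key and value types intact between the two jobs.
    public static void configureConsumer(Job secondJob) {
        secondJob.setInputFormatClass(SequenceFileInputFormat.class);
    }
}
```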

24). What do you know about NLineInputFormat?

Ans). NLineInputFormat splits ‘N‘ lines of input as one split, so each mapper receives exactly N lines of the input file.
