The awesome basics about Hadoop Security

What is Hadoop Security?

We live in a very insecure world. From the key to your home's front door to those all-important virtual keys, your passwords, everything needs to be secured, and secured well. In the world of Big Data, where humongous amounts of data are processed, transformed, and stored, it's all the more important to secure your data.

Imagine that your company spent a couple of million dollars installing a Hadoop cluster to gather and analyze your customers' spending habits for a product category. Because that Big Data solution was not secure, your competitor got access to the data, and your sales dropped 20% for that product category.

How did the system allow unauthorized access to data? Wasn't there any authentication mechanism in place? Why were there no alerts? This scenario should make you think about the importance of security, especially where sensitive data is involved.
Although Hadoop does have inherent security concerns due to its distributed architecture, the scenario described above is extremely unlikely to occur on a Hadoop installation that's managed securely. An installation with clearly defined user roles and multiple levels of authentication (and encryption) for sensitive data will not let unauthorized access through; a minimal sketch of such a setup follows.
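As a rough illustration, here is a minimal Java sketch of what enabling Kerberos-based authentication looks like from client code. The principal (analyst@EXAMPLE.COM) and keytab path are illustrative placeholders, and on a real cluster these properties would normally be set in core-site.xml rather than in code.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class SecureLoginSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("hadoop.security.authentication", "kerberos"); // the default is "simple"
        conf.set("hadoop.security.authorization", "true");      // enable service-level authorization
        UserGroupInformation.setConfiguration(conf);

        // Authenticate with real credentials (a keytab) instead of trusting
        // whatever username the client asserts. Principal and path are placeholders.
        UserGroupInformation.loginUserFromKeytab(
                "analyst@EXAMPLE.COM", "/etc/security/keytabs/analyst.keytab");
        System.out.println("Authenticated as: " + UserGroupInformation.getLoginUser());
    }
}

With this in place, every RPC to the NameNode carries a Kerberos-verified identity, which is what makes role definitions and authorization checks meaningful in the first place.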
When talking about Hadoop security, you have to consider how Hadoop was conceptualized. When Doug Cutting and Mike Cafarella started developing Hadoop, security was not exactly the priority. Hadoop was meant to process large amounts of web data in the public domain, and hence security was not the focus of development. That's why it lacked a security model and provided only basic authentication for HDFS, which was not very useful, since it was extremely easy to impersonate another user, as the sketch below shows.
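To see how weak that early model was, here is a hedged Java sketch of impersonation under the default "simple" authentication mode, where the NameNode simply trusts the identity the client asserts. The hdfs://namenode:8020 URI is a placeholder for your cluster's NameNode address.

import java.net.URI;
import java.security.PrivilegedExceptionAction;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class ImpersonationSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Under "simple" authentication no credentials are verified, so a client
        // can claim to be any user, here the HDFS superuser.
        UserGroupInformation fakeSuperuser = UserGroupInformation.createRemoteUser("hdfs");
        fakeSuperuser.doAs((PrivilegedExceptionAction<Void>) () -> {
            // All filesystem calls inside doAs() are made as the asserted user.
            FileSystem fs = FileSystem.get(URI.create("hdfs://namenode:8020"), conf);
            for (FileStatus status : fs.listStatus(new Path("/"))) {
                System.out.println(status.getPath());
            }
            return null;
        });
    }
}

The same effect can be achieved from the command line by setting the HADOOP_USER_NAME environment variable before running an hdfs command, which is exactly why simple authentication offers no real protection against a determined insider.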
Another issue is that Hadoop was not designed and developed as a cohesive system with predefined modules. Rather, it grew as a collage of modules that correspond either to various open source projects or to (proprietary) extensions developed by vendors to supplement functionality lacking in the Hadoop ecosystem. Hadoop therefore assumes the cocoon of a trusted, isolated environment for its cluster to operate in without security violations, and that isolation is lacking most of the time.
