Posts

Featured Post

SQL Interview Success: Unlocking the Top 5 Frequently Asked Queries

Here are the top five commonly asked SQL queries in interviews. You can expect these in Data Analyst or Data Engineer interviews.

Top SQL Queries for Interviews

01. Joins

The commonly asked question gives you two tables and asks how many rows each join type returns, and what the result set looks like.

Table1      Table2
------      ------
id          id
--          --
1           1
1           3
2           1
3           NULL

Output
------
Inner join: 5 rows will return.

The result will be:
1    1
1    1
1    1
1    1
3    3

02. Substring and Concat

Here, we need to write an SQL query that makes the first letter upper case and the remaining letters lower case.

Table1
------
ename
-----
raJu
venKat
kRIshna

Solution:
SELECT CONCAT(UPPER(SUBSTRING(ename, 1, 1)), LOWER(SUBSTRING(ename, 2))) AS capitalized_name
FROM Table1;

03. Case statement

SQL Query:
SELECT Code1, Code2,
     CASE
        WHEN Code1 = 'A' AND Code2 = 'AA' THEN "A" | "A
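To make the first two answers easy to check, here is a small self-contained sketch using Python's built-in sqlite3 module. This is an added illustration rather than part of the original post; note that SQLite spells the string functions substr() and || instead of SUBSTRING() and CONCAT(), and the Emp table name is introduced here only to avoid clashing with Table1 above.

import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Recreate the two interview tables.
cur.execute("CREATE TABLE Table1 (id INTEGER)")
cur.execute("CREATE TABLE Table2 (id INTEGER)")
cur.executemany("INSERT INTO Table1 VALUES (?)", [(1,), (1,), (2,), (3,)])
cur.executemany("INSERT INTO Table2 VALUES (?)", [(1,), (3,), (1,), (None,)])

# Inner join: each 1 in Table1 matches the two 1s in Table2 (4 rows), plus the 3-3 match.
rows = cur.execute(
    "SELECT t1.id, t2.id FROM Table1 t1 INNER JOIN Table2 t2 ON t1.id = t2.id"
).fetchall()
print(len(rows), rows)  # 5 rows: (1, 1) four times and (3, 3) once

# Capitalization query, rewritten with SQLite string functions (hypothetical Emp table).
cur.execute("CREATE TABLE Emp (ename TEXT)")
cur.executemany("INSERT INTO Emp VALUES (?)", [("raJu",), ("venKat",), ("kRIshna",)])
names = cur.execute(
    "SELECT upper(substr(ename, 1, 1)) || lower(substr(ename, 2)) FROM Emp"
).fetchall()
print(names)  # [('Raju',), ('Venkat',), ('Krishna',)]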

Hadoop vs. RDBMS: Real Differences

Hadoop comes into the picture to process large volumes of unstructured data; structured data is already taken care of by traditional databases.

Traditional databases. Traditional relational databases have been able to store massive data sets for a long time. An Oracle 10g database can store over 8 petabytes, while for many years DB2 databases have been capable of storing well over 500 petabytes. Of course, this is all theoretical. No customer has an Oracle or DB2 database that approaches sizes even close to that. Why? Because the speed, or velocity, at which data can be loaded and queries can be executed approaches zero well before then. Similarly, all traditional relational databases can store any variety of data as text or binary large objects. The problem is that large volumes of unstructured data cannot be moved fast enough to enable rapid search and retrieval.

Hadoop Processing. Running constant and predictable workloads is what your existing data warehouse has be

The In-and-Out of Nodes in Blockchain

Blockchain is a decentralized technology, or distributed ledger, on which transactions are anonymously recorded. This means the transaction ledger is maintained simultaneously across a network of unrelated computers or servers called “nodes”, like a spreadsheet that is duplicated thousands of times across a network of computers. The ledger contains a continuous and complete record (the “chain”) of all transactions performed, which are grouped into blocks. A block is only added to the chain if the nodes, which are members of the blockchain network with high levels of computing power, reach consensus on the next ‘valid’ block to be added to the chain. A transaction can only be verified and form part of a candidate block if all the nodes on the network confirm that the transaction is valid.

Related: 11 Useful Blockchain Advantages to Read Now | Blockchain Smart Contract: The Perfect Example
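As an added toy sketch (not from the post, and not a real consensus implementation), the Python snippet below shows the "chain" idea: each block stores the hash of the previous block, so tampering with any earlier block invalidates every block after it. Real networks layer consensus (proof-of-work, voting, etc.) on top of this structure.

import hashlib
import json

def block_hash(block: dict) -> str:
    # Hash the block's contents deterministically.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain: list, transactions: list) -> None:
    # Append a new block that points at the hash of the current tip.
    prev_hash = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev_hash, "transactions": transactions})

chain: list = []
add_block(chain, ["alice pays bob 5"])
add_block(chain, ["bob pays carol 2"])

# Verification: every block's prev_hash must match the hash of the block before it.
ok = all(chain[i]["prev_hash"] == block_hash(chain[i - 1]) for i in range(1, len(chain)))
print("chain valid:", ok)  # True until any earlier block is modified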

Hadoop: How to Improve a College (a Mini Project)

This is based on my research into developing an engineering college using data analytics. It is a great subject that any engineering aspirant can take up as a final project. In my view it has dual benefits. The first is for students: they can gain a lot of analytics knowledge and apply it to help an engineering college stay on the list of top colleges. The second is for engineering colleges: they can use it to improve the quality of education and become one of the top colleges.

The project theme is data analytics, and there are two parts:

1. Use Hadoop technologies to study the student database and what the students did at school level. This gives a lot of insight into student interests. Approach each student and gather innovative ideas to improve the college.
2. Use the faculty database to get the skills and projects the faculty worked on in previous years. This helps to match the right faculty to new, innovative projects.

Basically the qualities of g

4 Layers of AWS Architecture: a Quick Answer

I have collected real interview questions on the key AWS architecture components: S3, EC2, SQS, and SimpleDB. AWS is one of the most popular skills in the area of cloud computing, and many companies are recruiting software developers to work on it.

AWS Key Architecture Components

AWS is the top cloud platform, and knowledge of it is helpful for learning other cloud platforms. Below are the questions asked in recent interviews.

What are the components involved in AWS?

Amazon S3. With this, one can retrieve the key information used in creating the cloud structural design, and the information produced as a result of that key can also be stored in this component.
Amazon EC2. Helpful to run a large distributed system on the Hadoop cluster. Automatic parallelization and job scheduling can be achieved with this component.
Amazon SQS. This component acts as a mediator between different controllers. It is also used for cushioning requirem
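As a hedged sketch (my addition, not part of the original answer), here is how two of these components are typically touched from code with the boto3 SDK. The bucket name and queue URL are placeholders, and AWS credentials/region configuration are assumed to be set up in the environment.

import boto3

s3 = boto3.client("s3")
sqs = boto3.client("sqs")

# Amazon S3: objects are stored and retrieved by key.
s3.put_object(Bucket="my-example-bucket", Key="reports/day1.csv", Body=b"id,value\n1,42\n")
body = s3.get_object(Bucket="my-example-bucket", Key="reports/day1.csv")["Body"].read()

# Amazon SQS: a queue decouples producers and consumers (the "mediator" role above).
queue_url = "https://sqs.us-east-1.amazonaws.com/123456789012/example-queue"
sqs.send_message(QueueUrl=queue_url, MessageBody="process reports/day1.csv")
messages = sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=1)

print(body, messages.get("Messages", []))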

Amazon Web Services Daily Career Tips: Subscribe Today

You will receive daily tips to your email id. They are suitable for all application developers and working professionals in Amazon Web Services, ranging from theory to practice, along with other relevant tips for finding suitable jobs. You can subscribe here. You will receive tips on the following points. Get ready, and share it with your friends too.

Tips about AWS
What developers will do if they are selected for AWS jobs
What types of roles are available in AWS
Interview tips on AWS
Best training on AWS

Also Read: Top 8 AWS Basic Questions and Answers | 5 Key Modules You Need to Learn in AWS

How to Work on Re-skilling from Legacy to Recent

How to work on your re-skilling plan to grow your career from any level: you just need to learn and be ready with proficiency, and someone will absorb you.

Sqoop Real Use in Hadoop Framework

Why do you need Sqoop while working on Hadoop? Sqoop's primary purpose is to import data from structured data sources such as Oracle or DB2 into HDFS (also called the Hadoop file system). For our readers, I have collected a good video from Edureka that helps you understand the functionality of Sqoop and the comparison between Sqoop and Flume.

How the name came about: the word Sqoop came from SQL + HADOOP = SQOOP. Sqoop is a data transfer tool, and its main use is to import and export large amounts of data from an RDBMS to HDFS and vice versa.

List of basic Sqoop commands:
Codegen - It helps to generate code to interact with database records.
Create-hive-table - It helps to import a table definition into Hive.
Eval - It helps to evaluate an SQL statement and display the results.
Export - It helps to export an HDFS directory into a database table.
Help - It helps to list the available commands.
Import - It helps to import a table from a database to HDFS.
Import-all-tables - It
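Following the command list, here is a hedged sketch (my addition, not from the post) of what a typical "sqoop import" invocation looks like, launched from Python purely for illustration; in practice the same command is usually run directly from the shell. The JDBC URL, credentials, table, and target directory are placeholders.

import subprocess

# Import one RDBMS table into an HDFS directory using 4 parallel map tasks.
subprocess.run(
    [
        "sqoop", "import",
        "--connect", "jdbc:oracle:thin:@//dbhost:1521/ORCL",  # source RDBMS (placeholder)
        "--username", "scott",
        "--password-file", "/user/etl/.db_password",          # avoids an inline password
        "--table", "EMPLOYEES",                                # table to import (placeholder)
        "--target-dir", "/data/employees",                     # HDFS destination (placeholder)
        "--num-mappers", "4",                                  # degree of parallelism
    ],
    check=True,
)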