Posts

Featured Post

SQL Interview Success: Unlocking the Top 5 Frequently Asked Queries

Here are the top five commonly asked SQL queries in interviews. You can expect these in Data Analyst or Data Engineer interviews.

Top SQL Queries for Interviews

01. Joins

A commonly asked question gives you two tables and asks how many rows each join type returns, and what the result set looks like.

Table1
------
id
--
1
1
2
3

Table2
------
id
--
1
3
1
NULL

Output
------
Inner join: 5 rows will return.

The result will be:

1   1
1   1
1   1
1   1
3   3

02. Substring and Concat

Here, we need to write an SQL query that upper-cases the first letter and lower-cases the remaining letters.

Table1
------
ename
-----
raJu
venKat
kRIshna

Solution:

SELECT CONCAT(UPPER(SUBSTRING(ename, 1, 1)), LOWER(SUBSTRING(ename, 2))) AS capitalized_name
FROM Table1;

03. Case statement

SQL query:

SELECT Code1, Code2,
     CASE
        WHEN Code1 = 'A' AND Code2 = 'AA' THEN ...
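Both answers above are easy to verify with Python's built-in sqlite3 module. This is a sketch, not part of the original post: SQLite uses substr and the || operator instead of SUBSTRING/CONCAT, and the Emp table name here is a stand-in so the two examples can share one database.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Build the two sample tables from the joins question.
cur.execute("CREATE TABLE Table1 (id INTEGER)")
cur.execute("CREATE TABLE Table2 (id INTEGER)")
cur.executemany("INSERT INTO Table1 VALUES (?)", [(1,), (1,), (2,), (3,)])
cur.executemany("INSERT INTO Table2 VALUES (?)", [(1,), (3,), (1,), (None,)])

# Inner join: id=1 matches 2 x 2 = 4 times, id=3 matches once -> 5 rows.
rows = cur.execute(
    "SELECT t1.id, t2.id FROM Table1 t1 JOIN Table2 t2 ON t1.id = t2.id"
).fetchall()
print(len(rows))  # 5

# Capitalize: upper-case the first letter, lower-case the rest.
cur.execute("CREATE TABLE Emp (ename TEXT)")
cur.executemany("INSERT INTO Emp VALUES (?)",
                [("raJu",), ("venKat",), ("kRIshna",)])
names = [r[0] for r in cur.execute(
    "SELECT upper(substr(ename, 1, 1)) || lower(substr(ename, 2)) FROM Emp"
)]
print(names)  # ['Raju', 'Venkat', 'Krishna']
```

Note that the NULL in Table2 never matches anything in an inner join, which is why it contributes no rows.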

3 Top IT Skills Every New IT Professional Should Learn to Progress in a Software Career

What skills do new IT professionals and job seekers need to help their organisation transition to IT-as-a-Service? To lead their organisations to the cloud, IT professionals must focus on three fundamental areas:

Core Virtualisation Skill Sets

IT professionals must think and operate in the virtual world. They can no longer be tied to the old paradigm of physical assets dedicated to specific users or applications. They must think in terms of "services" riding on top of a fully virtualised infrastructure, and of how applications will take advantage of shared server and storage resources. This requires comprehensive skills in both server and storage virtualisation technology, and enough experience as a practitioner to understand the intricacies and critical elements of managing virtual platforms.

Old IT and New IT Cross-training Competency

Leaders of IT innovation cannot be completely siloed and hyper-focused.

Linux Must Read Course Contents

The complete syllabus for the Linux certification course you need to know before starting preparation for the test.

List of Course Contents

- The Linux community and a career in open source
- Finding your way on a Linux system
- The power of the command line
- The Linux operating system
- Security and file permissions

Topic 1: The Linux Community and a Career in Open Source (weight: 7)

1.1 Linux Evolution and Popular Operating Systems

Weight: 2

Description: Knowledge of Linux development and major distributions.

Key Knowledge Areas:
- Open Source Philosophy
- Distributions
- Embedded Systems

The following is a partial list of the used files, terms and utilities:
- Android
- Debian, Ubuntu (LTS)
- CentOS, openSUSE, Red Hat
- Linux Mint, Scientific Linux

1.2 Major Open Source Applications

Weight: 2

Description: Awareness of major applications as well as their uses and development.

Key Knowledge Areas:
- Desktop Applications
- Server Applications
- Development Languages
- Package Management

AWS Certified Developer: Eligibility Criteria

The complete eligibility criteria are as follows:

- One or more years of hands-on experience designing and maintaining an AWS-based application.
- In-depth knowledge of at least one high-level programming language.
- Understanding of core AWS services, uses, and basic architecture best practices.
- ...
- Proficiency in designing, developing, and deploying cloud-based solutions using AWS.
- Experience with developing and maintaining applications written for Amazon Simple Storage Service, Amazon DynamoDB, Amazon Simple Queue Service, Amazon Simple Notification Service, Amazon Simple Workflow Service, AWS Elastic Beanstalk, and AWS CloudFormation.

Related: AWS Basics for Software Engineer

Requirements for the Developer Exam

- Professional experience using AWS technology
- Hands-on experience programming with AWS APIs
- Understanding of AWS Security best practices
- Understanding of automation and AWS deployment tools
- Understanding of storage options and their underlying consistency models

Unix: How to Write Shell Script Using vi Editor

When you log in to UNIX, you start in the home directory:

$/home:

Then you can change to your own directory:

$/home: cd jthomas
$/home/jthomas:

How to write your first script:

$/home/jthomas: vi test.sh

Here, you can write your script. The first line in the script is:

#!/bin/ksh

It denotes which shell you are going to use.

Example:

$ vi test.sh
#!/bin/ksh
###################################################
# Written By: Jason Thomas
# Purpose: This script was written to show users
#          how to develop their first script
###################################################
# A line starting with # denotes a comment
root daemon bin sys adm uucp nobody lpd

How to run the script:

$ sh test.sh

Also read: The complete list of UNIX basic commands
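If you want to check a script like this without opening vi, the same write-then-run steps can be driven from Python. This is a sketch, not part of the original post; it uses /bin/sh instead of ksh so it also runs on systems where ksh is not installed, and the echo line is a placeholder body.

```python
import os
import stat
import subprocess
import tempfile

# A minimal script mirroring test.sh above; /bin/sh stands in for /bin/ksh.
script = """#!/bin/sh
###################################################
# Written By: Jason Thomas
# Purpose: first example script
###################################################
echo "Hello from my first script"
"""

# Write the script to a temporary file, as vi test.sh would.
with tempfile.NamedTemporaryFile("w", suffix=".sh", delete=False) as f:
    f.write(script)
    path = f.name

# Make it executable, as chmod +x test.sh would.
os.chmod(path, os.stat(path).st_mode | stat.S_IXUSR)

# Run it the same way as: sh test.sh
result = subprocess.run(["sh", path], capture_output=True, text=True)
print(result.stdout.strip())  # Hello from my first script

os.unlink(path)
```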

11 Top PIG Interview Questions

Here are the top PIG interview questions. These are useful for your projects and interviews.

1). What is PIG?

PIG is a platform for analyzing large data sets. It consists of a high-level language for expressing data analysis programs, coupled with infrastructure for evaluating those programs. PIG's infrastructure layer consists of a compiler that produces a sequence of MapReduce programs.

2). What is the difference between logical and physical plans?

Pig goes through several steps when a Pig Latin script is converted into MapReduce jobs. After performing basic parsing and semantic checking, it produces a logical plan. The logical plan describes the logical operators that Pig has to execute. After this, Pig produces a physical plan. The physical plan describes the physical operators needed to execute the script.

3). Does 'ILLUSTRATE' run an MR job?

No, ILLUSTRATE does not run any MapReduce job; it works on a sample of the internal data.
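To make question 1 concrete, here is a toy word count in plain Python that mimics the map, shuffle, and reduce phases that a Pig Latin script ultimately compiles down to. This is an illustration only, not how Pig itself works internally; real Pig jobs run the same three phases distributed across a Hadoop cluster.

```python
from itertools import groupby
from operator import itemgetter

lines = [
    "pig compiles to mapreduce",
    "mapreduce runs on hadoop",
    "pig on hadoop",
]

# Map phase: each mapper emits a (word, 1) pair for every word it sees.
mapped = [(word, 1) for line in lines for word in line.split()]

# Shuffle/sort phase: pairs are sorted and grouped by key (the word).
mapped.sort(key=itemgetter(0))
grouped = {key: [v for _, v in group]
           for key, group in groupby(mapped, key=itemgetter(0))}

# Reduce phase: each reducer sums the counts for one word.
counts = {word: sum(values) for word, values in grouped.items()}
print(counts["pig"], counts["hadoop"])  # 2 2
```

The logical plan from question 2 would describe the grouping and counting abstractly; the physical plan pins those operators to concrete map and reduce stages like the ones above.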

How to Understand AWS CloudFormation Easily

AWS CloudFormation is a service that helps you model and set up your Amazon Web Services resources so that you can spend less time managing those resources and more time focusing on the applications that run in AWS. You create a template that describes all the AWS resources you want (such as Amazon EC2 instances or Amazon RDS DB instances), and AWS CloudFormation provisions and configures those resources for you. You don't need to individually create and configure AWS resources and figure out what's dependent on what; AWS CloudFormation handles all of that.

Managing Infrastructure

For a scalable web application that also includes a back-end database, you might use an Auto Scaling group, an Elastic Load Balancing load balancer, and an Amazon Relational Database Service database instance. Normally, you would use each individual service to provision these resources, and after you create the resources, you would have to configure them to work together.
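As a sketch of what such a template looks like, here is a minimal one that declares a single EC2 instance. The resource name, AMI ID, and instance type are placeholders for illustration, not values from the post:

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Description: Minimal example - one EC2 instance managed as a stack.
Resources:
  WebServerInstance:
    Type: AWS::EC2::Instance
    Properties:
      InstanceType: t2.micro          # placeholder instance size
      ImageId: ami-0123456789abcdef0  # placeholder AMI ID for your region
Outputs:
  InstanceId:
    Value: !Ref WebServerInstance
```

You hand a template like this to CloudFormation as a stack; it creates the declared resources in dependency order and deletes them together when the stack is removed.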

AWS EMR Vs. Hadoop: 5 Top Differences

With Amazon Elastic MapReduce (Amazon EMR), you can analyze and process vast amounts of data. It distributes the computational work across a cluster of virtual servers running in the Amazon cloud, managed by Hadoop, an open-source framework.

Amazon EMR (Elastic MapReduce): The Unique Features

Amazon EMR has made enhancements to Hadoop and other open-source applications so they work seamlessly with AWS. For instance, Hadoop clusters running on Amazon EMR use EC2 instances as virtual Linux servers for the master and slave nodes, Amazon S3 for bulk storage of input and output data, and CloudWatch to monitor cluster performance and raise alarms. You can also move data into and out of DynamoDB using Amazon EMR and Hive. All of this is orchestrated by Amazon EMR control software that launches and manages the Hadoop cluster. This is called an Amazon EMR cluster.

What does Hadoop do?

Hadoop uses a distributed processing architecture called MapReduce, in which a task is mapped to a set of servers for processing.