Posts

Showing posts with the label Amazon Web Service

Featured Post

SQL Interview Success: Unlocking the Top 5 Frequently Asked Queries

Here are the top five most commonly asked SQL queries in interviews. You can expect these in Data Analyst or Data Engineer interviews.

Top SQL Queries for Interviews

01. Joins

A commonly asked question gives you two tables and asks how many rows each join type will return, and what the result will look like.

Table1          Table2
------          ------
id              id
--              --
1               1
1               3
2               1
3               NULL

Output
------
Inner join: 5 rows will be returned.

The result will be:
Table1.id   Table2.id
1           1
1           1
1           1
1           1
3           3

02. Substring and Concat

Here, we need to write an SQL query that converts the first letter of each name to upper case and the remaining letters to lower case.

Table1
------
ename
-----
raJu
venKat
kRIshna

Solution:

SELECT CONCAT(UPPER(SUBSTRING(ename, 1, 1)), LOWER(SUBSTRING(ename, 2))) AS capitalized_name
FROM Table1;

03. Case statement

SQL Query:

SELECT Code1, Code2,
    CASE
        WHEN Code1 = 'A' AND Code2 = 'AA' THEN "A" | "A…
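As a quick illustration (not part of the original post), the same join-count reasoning can be checked with PySpark. Only the sample ids and the 5-row inner-join result come from the example above; the session and variable names are just a minimal sketch.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("join-count-check").getOrCreate()

# Sample data from the interview example above
table1 = spark.createDataFrame([(1,), (1,), (2,), (3,)], ["id"])
table2 = spark.createDataFrame([(1,), (3,), (1,), (None,)], ["id"])

# Inner join keeps only matching ids: the two 1s on each side pair up
# into 4 rows, and id 3 matches once, so 5 rows in total.
result = table1.join(table2, on="id", how="inner")
result.show()
print(result.count())  # 5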

AWS CLI and PySpark: A Beginner's Comprehensive Guide

AWS (Amazon Web Services) and PySpark are separate technologies, but they can be used together for certain purposes. Let me provide you with a beginner's guide to each, separately.

AWS (Amazon Web Services):

Amazon Web Services (AWS) is a cloud computing platform that offers a wide range of services for computing power, storage, databases, machine learning, analytics, and more.

1. Create an AWS Account: Go to the AWS homepage, click on "Create an AWS Account", and follow the instructions.
2. Set Up AWS CLI: Install the AWS Command Line Interface (AWS CLI) on your local machine and configure it with your AWS credentials using the aws configure command.
3. Explore AWS Services: AWS provides a variety of services. Familiarize yourself with core services like EC2 (Elastic Compute Cloud), S3 (Simple Storage Service), and IAM (Identity and Access Management).

PySpark:

PySpark is the Python API for Apache Spark, a fast and general-purpose cluster computing system. It allows you…
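To make the "used together" point concrete, here is a minimal sketch (not from the original post) that reads a CSV file from S3 with PySpark. It assumes your AWS credentials are already configured (for example via the aws configure step above) and that the hadoop-aws S3 connector is available to Spark; the bucket and file names are hypothetical placeholders.

from pyspark.sql import SparkSession

# Assumption: AWS credentials are already set up via `aws configure`
# and the hadoop-aws connector is on Spark's classpath.
spark = (
    SparkSession.builder
    .appName("aws-pyspark-starter")
    .getOrCreate()
)

# "my-example-bucket" and "data/input.csv" are hypothetical names.
df = spark.read.csv("s3a://my-example-bucket/data/input.csv",
                    header=True, inferSchema=True)
df.printSchema()
df.show(5)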