
Spark SQL Query: How to Write It in Ten Steps

Spark SQL example
This post explains how to write a SQL query in Spark in ten steps. The example demonstrates how to use sqlContext.sql to create and load two tables and select rows from them into two DataFrames.

The next steps use the DataFrame API to filter the rows with salaries greater than 150,000 in one of the tables and show the resulting DataFrame. The two DataFrames are then joined to create a third DataFrame. Finally, the new DataFrame is saved to a Hive table.

1. At the command line, copy the Hue sample_07 and sample_08 CSV files to HDFS:
$ hdfs dfs -put HUE_HOME/apps/beeswax/data/sample_07.csv /user/hdfs
$ hdfs dfs -put HUE_HOME/apps/beeswax/data/sample_08.csv /user/hdfs

where HUE_HOME defaults to /opt/cloudera/parcels/CDH/lib/hue (parcel installation) or /usr/lib/hue (package installation).
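Before moving on, it is worth confirming that both files actually landed in HDFS. The listing below is an optional check (the /user/hdfs target directory comes from the command above):

$ hdfs dfs -ls /user/hdfs/sample_07.csv /user/hdfs/sample_08.csv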

2. Start spark-shell:
$ spark-shell
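On a CDH installation the shell starts with a HiveContext already bound to the sqlContext variable (this post assumes Spark 1.x, where that is the default). A quick sanity check before running the Hive statements:

scala> sqlContext                 // should print an org.apache.spark.sql.hive.HiveContext
scala> sqlContext.tableNames()    // tables visible in the current (default) database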

3. Create Hive tables sample_07 and sample_08:

scala> sqlContext.sql("CREATE TABLE sample_07 (code string,description string,total_emp
 int,salary int) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t' STORED AS TextFile")
scala> sqlContext.sql("CREATE TABLE sample_08 (code string,description string,total_emp
 int,salary int) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t' STORED AS TextFile")
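You can also confirm from inside the shell that both tables were registered in the Hive metastore; this is an optional check, equivalent to the Beeline step that follows:

scala> sqlContext.sql("SHOW TABLES").show()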

Also read: Learn Spark SQL on your own with little money

4. In Beeline, show the Hive tables:
0: jdbc:hive2://hostname.com:> show tables;
+------------+--+
| tab_name |
+------------+--+
| sample_07 |
| sample_08 |
+------------+--+

Also read: The role of Spark in the Hadoop ecosystem

5. Load the data in the CSV files into the tables:
scala> sqlContext.sql("LOAD DATA INPATH '/user/hdfs/sample_07.csv' OVERWRITE INTO TABLE
 sample_07")
scala> sqlContext.sql("LOAD DATA INPATH '/user/hdfs/sample_08.csv' OVERWRITE INTO TABLE
 sample_08")

6. Create DataFrames containing the contents of the sample_07 and sample_08 tables:
scala> val df_07 = sqlContext.sql("SELECT * from sample_07")
scala> val df_08 = sqlContext.sql("SELECT * from sample_08")
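Before filtering, you can inspect the schema and a few rows to confirm that the four columns defined in step 3 came through as expected (an optional check):

scala> df_07.printSchema()
scala> df_07.show(5)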

7. Show all rows in df_07 with salary greater than 150,000:
scala> df_07.filter(df_07("salary") > 150000).show()
The output should be:
+-------+--------------------+---------+------+
| code| description|total_emp|salary|
+-------+--------------------+---------+------+
|11-1011| Chief executives| 299160|151370|
|29-1022|Oral and maxillof...| 5040|178440|
|29-1023| Orthodontists| 5350|185340|
|29-1024| Prosthodontists| 380|169360|
|29-1061| Anesthesiologists| 31030|192780|
|29-1062|Family and genera...| 113250|153640|
|29-1063| Internists, general| 46260|167270|
|29-1064|Obstetricians and...| 21340|183600|
|29-1067| Surgeons| 50260|191410|
|29-1069|Physicians and su...| 237400|155150|
+-------+--------------------+---------+------+
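The same filter can be expressed as a plain SQL query instead of the DataFrame API; this is just an alternative way to write step 7, not an extra step:

scala> sqlContext.sql("SELECT * FROM sample_07 WHERE salary > 150000").show()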

8. Create the DataFrame df_09 by joining df_07 and df_08, retaining only the code and description columns:
scala> val df_09 = df_07.join(df_08, df_07("code") === df_08("code")).select(df_07.col("code"), df_07.col("description"))
scala> df_09.show()

The new DataFrame looks like:
+-------+--------------------+
| code| description|
+-------+--------------------+
|00-0000| All Occupations|
|11-0000|Management occupa...|
|11-1011| Chief executives|
|11-1021|General and opera...|
|11-1031| Legislators|
|11-2011|Advertising and p...|
|11-2021| Marketing managers|
|11-2022| Sales managers|
|11-2031|Public relations ...|
|11-3011|Administrative se...|
|11-3021|Computer and info...|
|11-3031| Financial managers|
|11-3041|Compensation and ...|
|11-3042|Training and deve...|
|11-3049|Human resources m...|
|11-3051|Industrial produc...|
|11-3061| Purchasing managers|
|11-3071|Transportation, s...|
|11-9011|Farm, ranch, and ...|
+-------+--------------------+
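If you prefer SQL over the DataFrame API, the same join can be written as a query. The sketch below uses the Hive tables from step 3 and an illustrative variable name (df_09_sql); it should return the same code and description pairs:

scala> val df_09_sql = sqlContext.sql("SELECT a.code, a.description FROM sample_07 a JOIN sample_08 b ON a.code = b.code")
scala> df_09_sql.show()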

9. Save DataFrame df_09 as the Hive table sample_09:
scala> df_09.write.saveAsTable("sample_09")
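If sample_09 already exists from an earlier run, saveAsTable fails by default. One way to make the step repeatable is to set the save mode explicitly; this is an optional variant of the command above:

scala> df_09.write.mode("overwrite").saveAsTable("sample_09")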

10. In Beeline, show the Hive tables:
0: jdbc:hive2://hostname.com:> show tables;
+------------+--+
| tab_name |
+------------+--+
| sample_07 |
| sample_08 |
| sample_09 |
+------------+--+
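As a final check you can also read the saved table back from the Spark shell; this optional query confirms that sample_09 was written correctly:

scala> sqlContext.sql("SELECT * FROM sample_09 LIMIT 5").show()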
