The Most Helpful HDFS File System Commands (2 of 4)

#Top-Selected-HDFS-file-system-commands
copyFromLocal
Works similarly to the put command, except that the source is restricted to a local file reference.
hdfs dfs -copyFromLocal &lt;localsrc&gt; URI
hdfs dfs -copyFromLocal input/docs/data2.txt hdfs://localhost/user/rosemary/data2.txt
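On Hadoop 2 and later, copyFromLocal also accepts a -f flag that overwrites the destination if it already exists (confirm against your distribution's documentation). For example, to repeat the copy above without first deleting the target:
hdfs dfs -copyFromLocal -f input/docs/data2.txt hdfs://localhost/user/rosemary/data2.txt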

See also: HDFS Commands Part 1 of 4

copyToLocal
Works similarly to the get command, except that the destination is restricted to a local file reference.
hdfs dfs -copyToLocal [-ignorecrc] [-crc] URI &lt;localdst&gt;
hdfs dfs -copyToLocal data2.txt data2.copy.txt
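If a file fails its checksum verification during the copy, you can skip the CRC check with -ignorecrc, or pull the checksum files down alongside the data with -crc. For example:
hdfs dfs -copyToLocal -ignorecrc data2.txt data2.copy.txt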

count
Counts the number of directories, files, and bytes under the paths that match the specified file pattern.
hdfs dfs -count [-q] &lt;paths&gt;
hdfs dfs -count hdfs://nn1.example.com/file1 hdfs://nn2.example.com/file2
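Without options, count prints one line per path with the columns DIR_COUNT, FILE_COUNT, CONTENT_SIZE, and PATHNAME; the -q option prepends the quota columns. The figures below are illustrative, not real output:
hdfs dfs -count /user/hadoop/dir1
3 12 104857600 /user/hadoop/dir1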

cp
Copies one or more files from a specified source to a specified destination. If you specify multiple sources, the specified destination must be a directory.
hdfs dfs -cp URI [URI …] &lt;dest&gt;
hdfs dfs -cp /user/hadoop/file1 /user/hadoop/file2 /user/hadoop/dir
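Like copyFromLocal, cp accepts a -f flag on Hadoop 2 and later to overwrite an existing destination (again, verify against your version's documentation). A simple single-file backup might look like this, where file1.bak is a hypothetical target name:
hdfs dfs -cp -f /user/hadoop/file1 /user/hadoop/file1.bak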

du
Displays the size of the specified file, or the sizes of the files and directories contained in the specified directory. If you specify the -s option, the command displays an aggregate summary of file sizes rather than individual sizes. If you specify the -h option, it formats the sizes in a human-readable way (for example, 64.0m instead of 67108864).

hdfs dfs -du [-s] [-h] URI [URI …]
hdfs dfs -du /user/hadoop/dir1 /user/hadoop/file1
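The -s and -h options combine naturally when all you want is a directory's total footprint. Assuming a directory named /user/hadoop/dir1, the output would look roughly like this (the size shown is illustrative):
hdfs dfs -du -s -h /user/hadoop/dir1
1.2 G /user/hadoop/dir1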
