HDFS Exercises - Basic

This document describes commands for interacting with the Hadoop Distributed File System (HDFS) using the hadoop fs command-line tool. It covers common HDFS operations (creating and removing directories, copying and moving files between the local file system and HDFS, listing files, checking disk usage, and modifying permissions and ownership) and gives example commands for each. It also covers the WebHDFS REST API as an alternative way to interact with HDFS over HTTP using curl.


Usage and Command / Syntax

Make and remove directories:
$ bin/hadoop fs -mkdir [dir-name]
$ bin/hadoop fs -rmdir [dir-name]
Recursive delete (for non-empty directories):
$ bin/hadoop fs -rmr [dir-name]

Copy or move a file from the local system to HDFS:
$ bin/hadoop fs -copyFromLocal [local-source] [hdfs-destination]
$ bin/hadoop fs -moveFromLocal [local-source] [hdfs-destination]

List files in HDFS:
$ bin/hadoop fs -ls [dir-name]

Disk utilization:
$ bin/hadoop fs -du [path]

Move or copy files within HDFS:
$ bin/hadoop fs -mv [source] [destination]
$ bin/hadoop fs -cp [source] [destination]

Display file content:
$ bin/hadoop fs -cat [filename]

Display file content (last 1 KB):
$ bin/hadoop fs -tail [filename]

Copy or move a file from HDFS to the local system:
$ bin/hadoop fs -copyToLocal [source] [local-destination]
$ bin/hadoop fs -moveToLocal [source] [local-destination]

Modify file/directory permissions:
$ bin/hadoop fs -chmod [-R] mode[,mode...] [path]

Change owner of a file/directory:
$ bin/hadoop fs -chown [-R] [owner][:[group]] [path]

Add an entire local directory to HDFS:
$ bin/hadoop fs -put [local-source] [hdfs-destination]

Remove an HDFS file/directory:
$ bin/hadoop fs -rm [hdfs-destination]

Help:
$ bin/hadoop fs -help
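For illustration, the mode argument to chmod accepts both octal values and comma-separated symbolic modes, as in the local shell's chmod. The paths below are hypothetical, not part of the exercise:

$ bin/hadoop fs -chmod 644 /data/example.txt        # octal: rw-r--r--
$ bin/hadoop fs -chmod u+x,g-w /data/example.txt    # symbolic modes, comma-separated
$ bin/hadoop fs -chmod -R 755 /data                 # -R applies the change recursively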

Examples
$ bin/hadoop fs -mkdir /hduser00
$ bin/hadoop fs -mkdir /testing00
$ bin/hadoop fs -ls /
$ bin/hadoop fs -ls /hduser00
$ bin/hadoop fs -ls /testing00
$ bin/hadoop fs -rmdir /testing00
$ bin/hadoop fs -ls /

$ bin/hadoop fs -copyFromLocal $HOME/testing123.txt /hduser00/testing123.txt
$ ls -ltr $HOME
$ bin/hadoop fs -ls /hduser00
$ bin/hadoop fs -moveFromLocal $HOME/testing123.txt /hduser00/testingABC.txt
$ ls -ltr $HOME
$ bin/hadoop fs -ls /hduser00

$ bin/hadoop fs -ls /
$ bin/hadoop fs -ls /hduser00
$ bin/hadoop fs -du /
$ bin/hadoop fs -du /hduser00
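To get one aggregate total for a directory instead of per-entry sizes, Hadoop 1.x also offers -dus (later releases replace it with -du -s); a small sketch:

$ bin/hadoop fs -dus /hduser00    # single summary size for everything under /hduser00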
$ bin/hadoop fs -mv /hduser00/testing123.txt /hduser00/testing456.txt
$ bin/hadoop fs -ls /hduser00
$ bin/hadoop fs -cp /hduser00/testingABC.txt /hduser00/testingXYZ.txt
$ bin/hadoop fs -ls /hduser00

$ bin/hadoop fs -cat /hduser00/testingABC.txt


$ bin/hadoop fs -cat /hduser00/testingXYZ.txt
$ bin/hadoop fs -tail /hduser00/testingABC.txt
$ ls -ltr $HOME
$ bin/hadoop fs -copyToLocal /hduser00/testingXYZ.txt $HOME/testingXYZ.txt
$ ls -ltr $HOME

$ bin/hadoop fs -moveToLocal /hduser00/testingXYZ.txt $HOME/testingABC.txt


$ bin/hadoop fs -ls /hduser00
$ ls -ltr $HOME
$ bin/hadoop fs -ls /hduser00
$ bin/hadoop fs -chmod 777 /hduser00/testingABC.txt
$ bin/hadoop fs -ls /hduser00
$ bin/hadoop fs -ls /hduser00
$ bin/hadoop fs -chown root:root /hduser00/testingABC.txt
$ bin/hadoop fs -ls /hduser00
$ bin/hadoop fs -put /hduser/tmp /user/hduser/tmp
$ bin/hadoop fs -rm /user/hduser/tmp
$ bin/hadoop fs -help
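The -help command can also be given a specific command name to print usage for just that command (a sketch; behavior may vary slightly across Hadoop versions):

$ bin/hadoop fs -help chmod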

WebHDFS REST API

http://<HOST>:<PORT>/webhdfs/v1/<PATH>?user.name=hduser&op=
where
op=MKDIRS[&permission=<OCTAL>]

EXAMPLE
curl -i -X PUT -b cookie.jar "http://host:port/webhdfs/v1/tmp/csv.txt?op=RENAME&destination=/tmp"

REF: https://hadoop.apache.org/docs/r1.0.4/webhdfs.html

http://ubuntu.dhcp.blrl.sap.corp:50070/webhdfs/v1/testing00?op=LISTSTATUS
