Using the hadoop command we can issue familiar Unix-style filesystem commands to navigate HDFS. On Cloudera I send commands to port 8020 of the active NameNode. If you don't have the hadoop command available on your system, download Hadoop, extract it, and add its bin directory to your PATH. Using bash, I added the following to my ~/.bashrc file:
export PATH="/path/to/extracted/hadoop/bin:$PATH"
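For the change to take effect in your current session, reload your shell configuration (or simply open a new terminal):
source ~/.bashrc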
It is also possible to use the hdfs command in that bin directory instead of the hadoop command.
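For example, a listing equivalent to the first command below, written with the hdfs client instead (same placeholder namenode address as used throughout):
hdfs dfs -ls hdfs://namenode.cluster.tld:8020/user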
List files on the remote HDFS filesystem:
hadoop fs -ls hdfs://namenode.cluster.tld:8020/user
Make a directory:
hadoop fs -mkdir hdfs://namenode.cluster.tld:8020/user/joebloggs
Disk usage of a path (-s to summarise, -h for human-readable sizes):
hadoop fs -du -s -h hdfs://namenode.cluster.tld:8020/druid/*
Available and used disk space on the filesystem (again with -h for human-readable sizes):
hadoop fs -df -h hdfs://namenode.cluster.tld:8020