
Hadoop Commands


HDFS stands for ‘Hadoop Distributed File System’. HDFS is a sub-project of the Apache Hadoop project. This Apache Software Foundation project provides a fault-tolerant file system designed to run on commodity hardware. HDFS is accessed through a set of shell commands, which are discussed in this post.

A short note before starting: all Hadoop shell commands are invoked by the bin/hadoop script.

User Commands:

  • Check the health of the file system:

Usage: hadoop fsck /


  • Check version of Hadoop:

Usage: hadoop version

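As a quick illustration, the two user commands above can be run as follows (this assumes a configured, running cluster; the actual report output will vary):

```shell
# Check the health of the entire file system, starting from the root path
hadoop fsck /

# Print the version of the installed Hadoop distribution
hadoop version
```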

FS Shell Commands:

The hadoop fs command runs a generic filesystem user client that interacts with HDFS as well as the other filesystems Hadoop supports (such as the local filesystem).

  • View file listings:

Usage: hadoop fs -ls hdfs:/


  • Check free space of the file system:

Usage: hadoop fs -df hdfs:/


  • Count the directories, files, and bytes under the specified path and file pattern:

Usage: hadoop fs -count hdfs:/

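Taken together, the three inspection commands above might be used like this against a running cluster (paths and reported numbers are illustrative only):

```shell
# List the contents of the HDFS root directory
hadoop fs -ls hdfs:/

# Show the configured capacity, used space, and available space
hadoop fs -df hdfs:/

# Print directory count, file count, and total bytes under the root
hadoop fs -count hdfs:/
```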

  • Move a file from one location to another:

Usage: hadoop fs -mv <src> <dst>


  • Copy a file from source to destination:

Usage: hadoop fs -cp <src> <dst>


  • Delete a file:

Usage: hadoop fs -rm <path>

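A short workflow combining the three file operations above might look as follows. The /user/demo directory and file names here are hypothetical; substitute paths that exist on your cluster:

```shell
# Copy a file to a new name within HDFS
hadoop fs -cp /user/demo/input.txt /user/demo/backup.txt

# Move (rename) the copy to a different path
hadoop fs -mv /user/demo/backup.txt /user/demo/archive.txt

# Delete the moved file
hadoop fs -rm /user/demo/archive.txt
```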

  • Put a file from the local file system into the Hadoop Distributed File System:

Usage: hadoop fs -put <localsrc> … <dst>


  • Copy a file from the local file system to HDFS:

Usage: hadoop fs -copyFromLocal <localsrc> … <dst>


  • View a file in the Hadoop Distributed File System:

Usage: hadoop fs -cat <src>

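A typical round trip with the commands above: create a small local file, upload it, and read it back. This assumes a running cluster and an existing /user/demo directory (both hypothetical here):

```shell
# Create a small file on the local file system
echo "hello hdfs" > /tmp/sample.txt

# Upload it into HDFS
hadoop fs -put /tmp/sample.txt /user/demo/sample.txt

# Read the file back from HDFS; this prints its contents to the terminal
hadoop fs -cat /user/demo/sample.txt
```

Note that -put and -copyFromLocal are interchangeable for this task; -copyFromLocal is simply restricted to a local source.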

Administration Commands:

  • Format the namenode:

Usage: hadoop namenode -format


  • Start the secondary namenode:

Usage: hadoop secondarynamenode


  • Run the namenode:

Usage: hadoop namenode


  • Run the datanode:

Usage: hadoop datanode


  • Cluster Balancing:

Usage: hadoop balancer


  • Run MapReduce job tracker node:

Usage: hadoop jobtracker


  • Run MapReduce task tracker node:

Usage: hadoop tasktracker

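As a rough sketch, the administration commands above can be combined to bring up a classic (Hadoop 1.x) single-node cluster by hand. In practice the start-dfs.sh and start-mapred.sh wrapper scripts are usually preferred; the direct invocations below are for illustration only:

```shell
# One-time initialization; this erases any existing namenode metadata!
hadoop namenode -format

# Start the HDFS daemons in the background
hadoop namenode &
hadoop datanode &

# Start the MapReduce daemons in the background
hadoop jobtracker &
hadoop tasktracker &

# Once the cluster is loaded with data, rebalance blocks across datanodes
hadoop balancer
```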
