HDFS Explorer – Managing the Hadoop file system on a remote machine

Working with Hadoop means working with the Hadoop Distributed File System (HDFS), so you constantly have to read, write, and delete files via the command line. That can be tedious and error-prone if you are not familiar with the common Unix and Hadoop shell commands. To make this easier, I wrote a small application that can manage an HDFS instance running on Ubuntu.

So far, the application can do the following (see the API sketch below the list):

  • Browse the HDFS in a tree view
  • Upload and download files to/from the local machine
  • Delete files on the HDFS
  • Create directories on the HDFS
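
For readers who want to script the same operations directly, the sketch below shows the Hadoop Java API calls that cover each of these features. It is not the application's actual code, just a minimal illustration; the NameNode address hdfs://namenode:9000 and all paths are placeholders for your own cluster.

```java
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsOperationsSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder NameNode address; adjust to your cluster.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create("hdfs://namenode:9000"), conf);

        // List a directory – the basis for building the tree view.
        for (FileStatus status : fs.listStatus(new Path("/"))) {
            System.out.println((status.isDirectory() ? "[dir]  " : "[file] ")
                    + status.getPath().getName());
        }

        // Upload a local file to the HDFS.
        fs.copyFromLocalFile(new Path("/tmp/local.txt"),
                new Path("/user/demo/remote.txt"));

        // Download a file from the HDFS to the local machine.
        fs.copyToLocalFile(new Path("/user/demo/remote.txt"),
                new Path("/tmp/downloaded.txt"));

        // Create a directory on the HDFS.
        fs.mkdirs(new Path("/user/demo/newdir"));

        // Delete a file (second argument: delete recursively).
        fs.delete(new Path("/user/demo/remote.txt"), false);

        fs.close();
    }
}
```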

If there is demand (and if I get enough positive feedback 😉), I’ll add session management for multiple file systems as well as the ability to start MapReduce jobs from the application (as you can see in the lower group box).

A download will be available soon!

5 thoughts on “HDFS Explorer – Managing the Hadoop file system on a remote machine”

  1. Your HDFS explorer seems very nice.
    I would like to test it on my Hadoop cluster, so it would be great if you could also provide a download of a beta version for testing purposes.

    Good luck with your project, and I hope to see a downloadable version soon.

  2. Hi there,

    I am also working on the exact same software 🙂
    I even called mine “HDFS-Explorer” too.
    Maybe we could exchange experiences?
    I should also put a screenshot online 🙂

    Cheers,
    Felix

    1. Hi Felix,
      we are working on the same software AND apparently live in the same city 🙂 I stopped developing with .NET and switched to Java, since the Hadoop API is much better there. What environment do you use?
