Posted by: Wildan Maulana | February 25, 2009

Expose HDFS as a WebDAV Store & Use the FUSE Implementation

Want to make a mountable drive that points to HDFS? Just read and watch the issue on JIRA: https://issues.apache.org/jira/browse/HADOOP-496

At the end of the discussion on that issue, Doug Cutting points to the solution from iponweb, http://www.hadoop.iponweb.net/Home/hdfs-over-webdav/webdav-server …, which is worth trying.
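To give an idea of how the result is used: once that WebDAV server is running on the NameNode, the exported tree should be mountable with davfs2 along these lines (the port and mount point below are only assumptions, check the server's own configuration):

    $ sudo apt-get install davfs2
    $ sudo mkdir -p /mnt/hdfs-webdav
    # port 8080 is only an example; use whatever port the WebDAV server listens on
    $ sudo mount -t davfs http://<HDFS-NODE>:8080/ /mnt/hdfs-webdav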

Update:

Hadoop WebDAV interface (screenshot)


After successfully compiling and running the WebDAV support for Hadoop, I found that it is still read-only by default; I'm still figuring out how to make
it writable …
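A quick way to double-check that behaviour, by the way, is simply to attempt a write on the mounted tree (mount point from the sketch above):

    $ touch /mnt/hdfs-webdav/write-test.txt
    # while the server is read-only, this is expected to fail with something
    # like "Read-only file system" or a permission error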

FUSE

I’s located in the contrib section of hadoop source.

Steps (you can read the details in the README file of the fuse-dfs source):

From HADOOP_HOME:

  1. $ ant compile-libhdfs -Dlibhdfs=1
  2. $ ant package
  3. $ ant compile-contrib -Dlibhdfs=1 -Dfusedfs=1
  4. $ sudo apt-get install libfuse2 fuse-utils libfuse-dev
  5. Make sure everything is fine:
    hadoop@tobeThink:/opt/hadoop/bin$ ldd fuse_dfs
    linux-gate.so.1 =>  (0xb7fc2000)
    libhdfs.so => /usr/lib/libhdfs.so (0xb7f86000)
    libfuse.so.2 => /lib/libfuse.so.2 (0xb7f6a000)
    libjvm.so => /usr/lib/libjvm.so (0x06000000)
    libc.so.6 => /lib/tls/i686/cmov/libc.so.6 (0xb7e0b000)
    libm.so.6 => /lib/tls/i686/cmov/libm.so.6 (0xb7de5000)
    librt.so.1 => /lib/tls/i686/cmov/librt.so.1 (0xb7ddc000)
    libdl.so.2 => /lib/tls/i686/cmov/libdl.so.2 (0xb7dd8000)
    libpthread.so.0 => /lib/tls/i686/cmov/libpthread.so.0 (0xb7dbf000)
    /lib/ld-linux.so.2 (0xb7fa8000)
  6. Create the directory to use as the mount point:
    $ mkdir -p /export/hdfs
  7. OK, now mount the HDFS store using this command (a couple of sanity checks on the resulting mount are sketched below):
    $ sudo ./fuse_dfs_wrapper.sh hdfs://<HDFS-NODE>:54310 /export/hdfs/
  8. Just run ls on the mounted HDFS:
    hadoop@tobeThink:/opt/hadoop$ ls -lah  /export/hdfs/user/hadoop/
    total 28K
    drwxr-xr-x  8 hadoop 99 4.0K 2009-02-20 15:43 .
    drwxr-xr-x  3 hadoop 99 4.0K 2009-02-20 15:43 ..
    -rw-r--r--  1 hadoop 99 2.4K 2009-02-19 15:35 Gambar1.4.json
    drwxr-xr-x  3 hadoop 99 4.0K 2009-02-19 14:34 HelloHDFS
    drwxr-xr-x 14 hadoop 99 4.0K 2009-02-20 13:23 input
    drwxr-xr-x  3 hadoop 99 4.0K 2009-02-19 16:18 OCW4HighSchool
    drwxr-xr-x  4 hadoop 99 4.0K 2009-02-20 15:43 output
    -rw-r--r--  1 hadoop 99 2.4K 2009-02-19 15:35 output.txt

Voila …🙂

Exported HDFS store using FUSE (screenshot)
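A couple of quick sanity checks on the new mount (paths as above; fusermount comes with the fuse-utils package installed in step 4):

    # is the FUSE mount really there?
    $ mount | grep fuse_dfs
    # pull a file off the mount and compare it with what HDFS itself returns
    $ cp /export/hdfs/user/hadoop/output.txt /tmp/output.txt
    $ hadoop fs -cat /user/hadoop/output.txt | diff - /tmp/output.txt
    # unmount when done
    $ sudo fusermount -u /export/hdfs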

According to the README [1], the FUSE implementation for Hadoop supports the following directory operations:

cp, ls, more, cat, find, less, rm, mkdir, mv, rmdir
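So everyday shell work against the mount should go straight through to HDFS, for example (the directory names below are just examples under the mount point used above):

    $ mkdir /export/hdfs/user/hadoop/fuse-test
    $ cp /tmp/output.txt /export/hdfs/user/hadoop/fuse-test/
    $ mv /export/hdfs/user/hadoop/fuse-test/output.txt /export/hdfs/user/hadoop/fuse-test/copy.txt
    $ cat /export/hdfs/user/hadoop/fuse-test/copy.txt
    $ rm /export/hdfs/user/hadoop/fuse-test/copy.txt
    $ rmdir /export/hdfs/user/hadoop/fuse-test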

Ok .., that’s for today .., tomorrow i’ll try to benchmark this FUSE implementation for hadoop using Iozone [2].

Update:

After trying both IOzone and Bonnie, neither of them can be used to benchmark fuse-dfs, maybe because not all of the
FUSE spec is implemented, CMIIW.
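For reference, this is roughly the kind of run that was attempted (the file path on the mount is just an example); it does not get through on fuse-dfs, presumably because it hits one of the unimplemented operations:

    # IOzone in automatic mode against a file on the FUSE mount
    $ iozone -a -f /export/hdfs/user/hadoop/iozone.tmp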

IOzone error (screenshot)

Happy Hadooping!

References:

[1] Fuse-DFS README, http://svn.apache.org/repos/asf/hadoop/core/trunk/src/contrib/fuse-dfs/README
[2] IOzone - Filesystem Benchmark, http://www.iozone.org/
[3] Hadoop File System Browser discussion, http://www.nabble.com/hadoop-file-system-browser-td14997268.html
[4] Distributed Filesystem Benchmarking, http://www-pdsf.nersc.gov/~sychan/filesystems.html


Responses

  1. I managed to install hadoop webdav, followed all the instructions on “http://www.hadoop.iponweb.net/Home/hdfs-over-webdav/webdav-server” but I cannot connect to the server.

    When I issue:
    mount -t davfs hdfs://127.0.0.1:8180 /home/hadoop/mount_place/files/

    I use root as the username and root as the password, but I get:

    sbin/mount.davfs: mounting failed
    500 javax.jcr.repositoryException

    Am I missing something?

