22 July 2015

Big Data: Hadoop File System Checking Utility - fsck

Hadoop provides a file system checking utility called "fsck".

  • Basically, it checks the health of all files under a given path
  • It can also check the health of all files under '/' (the root)
bin/hadoop fsck /

- It checks the health of all files in the file system

bin/hadoop fsck /test/

- It checks the health of only the files under the given path

How to tell which files are healthy:

- fsck prints a dot for each healthy file

- It prints a message for each file that is not healthy, and it also reports under-replicated blocks, over-replicated blocks, mis-replicated blocks, and corrupt blocks

By default, the fsck utility does not repair under-replicated or over-replicated blocks; HDFS heals those blocks on its own.
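If you want to act on under-replication yourself rather than wait for HDFS, you can reset the replication factor of a path with the standard hadoop fs -setrep command. A minimal sketch, where the path /test/data and the target factor 3 are only example values:

bin/hadoop fs -setrep -w 3 /test/data

- Sets the replication factor of /test/data to 3 and, with -w, waits until the blocks actually reach that replication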

How to delete or move corrupted files:

bin/hadoop fsck <path> -delete

- It deletes the corrupted files under the given path

bin/hadoop fsck <path> -move

- It moves the corrupted files to the /lost+found directory
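Before running -delete, it is usually safer to first list which files actually have corrupt blocks. A minimal sketch, assuming the path /test/ and a Hadoop release that supports the -list-corruptfileblocks option:

bin/hadoop fsck /test/ -list-corruptfileblocks

- Lists the files under /test/ that contain corrupt blocks, without changing anything

bin/hadoop fsck /test/ -delete

- Removes those corrupted files once you have confirmed they cannot be recovered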

Other options we can use with fsck:
  • -files : lists the files being checked
  • -blocks : prints the block report for each file
  • -locations : prints the DataNode locations of every block
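These options are usually combined in a single command to get a detailed report. A minimal sketch, again using /test/ as an example path:

bin/hadoop fsck /test/ -files -blocks -locations

- Prints every file under /test/, the blocks that make up each file, and the DataNodes holding each block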
