
Hadoop 

The Hadoop Distributed File System (HDFS) is a distributed file system designed to run on commodity hardware.

It has many similarities with existing distributed file systems. However, the differences from other distributed file systems are significant.

HDFS is highly fault-tolerant and is designed to be deployed on low-cost hardware.

HDFS provides high throughput access to application data and is suitable for applications that have large data sets.
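The high throughput and fault tolerance described above both come from the same basic design: HDFS splits each file into large fixed-size blocks and stores several replicas of every block on different DataNodes. The following is a conceptual sketch only, not Hadoop code; the function name, the tiny block size, and the round-robin replica placement are illustrative assumptions (real HDFS defaults to 128 MB blocks, a replication factor of 3, and rack-aware placement).

```python
# Conceptual sketch -- NOT Hadoop code. Illustrates the idea of splitting a
# file into fixed-size blocks and replicating each block across DataNodes.
BLOCK_SIZE = 4    # HDFS defaults to 128 MB; tiny here for illustration
REPLICATION = 3   # HDFS default replication factor

def place_blocks(data: bytes, datanodes: list[str],
                 block_size: int = BLOCK_SIZE,
                 replication: int = REPLICATION):
    """Split data into blocks; assign each block to `replication` distinct nodes."""
    placement = []
    for i, start in enumerate(range(0, len(data), block_size)):
        block = data[start:start + block_size]
        # Round-robin stand-in for HDFS's rack-aware replica placement.
        nodes = [datanodes[(i + r) % len(datanodes)] for r in range(replication)]
        placement.append((block, nodes))
    return placement

nodes = ["dn1", "dn2", "dn3", "dn4"]
for block, replicas in place_blocks(b"abcdefghij", nodes):
    print(block, replicas)
```

Because each block lives on several DataNodes, losing one node loses no data, and clients can read different blocks of the same file from different nodes in parallel, which is where the high aggregate throughput comes from.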

HDFS was originally built as infrastructure for the Apache Nutch web search engine project.

Original article: https://www.cnblogs.com/masterSoul/p/7610147.html