
[Study Notes] A detailed walkthrough of installing Hadoop on Linux

Date: 2016-11-23 19:17   Source: Oracle Research Center   Author: Network

天萃荷净, Oracle Research Center study notes: this post shares a set of Hadoop notes that record, step by step, how to install Hadoop on a Linux system.

Articles on this site are original unless marked as reprints. This one is reprinted from Roger's Oracle technical blog, love wife & love life.
Original post: hadoop study notes (1) - for linux install

This is the first post in a series of NoSQL database study notes. I intend to keep the series going, and corrections from experts are welcome!
++++++ Remove the old JDK ++++++

[root@roger etc]# java -version

java version "1.4.2"
gij (GNU libgcj) version 4.1.2 20080704 (Red Hat 4.1.2-48)

Copyright (C) 2006 Free Software Foundation, Inc.
This is free software; see the source for copying conditions.  There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.

[root@roger etc]#  rpm -qa | grep gcj

libgcj-devel-4.1.2-48.el5
java-1.4.2-gcj-compat-devel-1.4.2.0-40jpp.115
java-1.4.2-gcj-compat-src-1.4.2.0-40jpp.115
libgcj-4.1.2-48.el5
libgcj-src-4.1.2-48.el5
java-1.4.2-gcj-compat-1.4.2.0-40jpp.115

[root@roger etc]# yum -y remove java-1.4.2-gcj-compat-1.4.2.0-40jpp.115

Loaded plugins: rhnplugin, security
This system is not registered with RHN.
RHN support will be disabled.
Setting up Remove Process
Resolving Dependencies
--> Running transaction check
---> Package java-1.4.2-gcj-compat.i386 0:1.4.2.0-40jpp.115 set to be erased
--> Processing Dependency: java-gcj-compat for package: jakarta-commons-codec
--> Processing Dependency: java-gcj-compat for package: antlr
--> Processing Dependency: java-gcj-compat for package: junit
--> Processing Dependency: java-gcj-compat for package: jakarta-commons-logging
--> Processing Dependency: java-gcj-compat >= 1.0.31 for package: tomcat5-jsp-2.0-api
--> Processing Dependency: java-gcj-compat >= 1.0.64 for package: gjdoc
--> Processing Dependency: java-gcj-compat for package: jakarta-commons-httpclient
--> Processing Dependency: java-gcj-compat >= 1.0.31 for package: tomcat5-servlet-2.4-api
--> Processing Dependency: java-gcj-compat for package: bsf
--> Processing Dependency: java-gcj-compat for package: xalan-j2
--> Processing Dependency: java-gcj-compat for package: xmlrpc
--> Processing Dependency: java-gcj-compat for package: bsh
--> Processing Dependency: java-1.4.2-gcj-compat = 1.4.2.0-40jpp.115 for package: java-1.4.2-gcj-compat-src
--> Processing Dependency: java-1.4.2-gcj-compat = 1.4.2.0-40jpp.115 for package: java-1.4.2-gcj-compat-devel
--> Running transaction check
---> Package antlr.i386 0:2.7.6-4jpp.2 set to be erased
---> Package bsf.i386 0:2.3.0-11jpp.1 set to be erased
---> Package bsh.i386 0:1.3.0-9jpp.1 set to be erased
---> Package gjdoc.i386 0:0.7.7-12.el5 set to be erased
---> Package jakarta-commons-codec.i386 0:1.3-7jpp.2 set to be erased
---> Package jakarta-commons-httpclient.i386 1:3.0-7jpp.1 set to be erased
---> Package jakarta-commons-logging.i386 0:1.0.4-6jpp.1 set to be erased
---> Package java-1.4.2-gcj-compat-devel.i386 0:1.4.2.0-40jpp.115 set to be erased
---> Package java-1.4.2-gcj-compat-src.i386 0:1.4.2.0-40jpp.115 set to be erased
---> Package junit.i386 0:3.8.2-3jpp.1 set to be erased
---> Package tomcat5-jsp-2.0-api.i386 0:5.5.23-0jpp.7.el5_3.2 set to be erased
---> Package tomcat5-servlet-2.4-api.i386 0:5.5.23-0jpp.7.el5_3.2 set to be erased
---> Package xalan-j2.i386 0:2.7.0-6jpp.1 set to be erased
---> Package xmlrpc.i386 0:2.0.1-3jpp.1 set to be erased
--> Processing Dependency: /usr/bin/rebuild-gcj-db for package: eclipse-ecj
--> Restarting Dependency Resolution with new changes.
--> Running transaction check
---> Package eclipse-ecj.i386 1:3.2.1-19.el5 set to be erased
--> Finished Dependency Resolution

Dependencies Resolved

====================================================================================
Package                        Arch    Version                  Repository    Size
====================================================================================
Removing:
java-1.4.2-gcj-compat          i386    1.4.2.0-40jpp.115        installed     441
Removing for dependencies:
antlr                          i386    2.7.6-4jpp.2             installed    2.5 M
bsf                            i386    2.3.0-11jpp.1            installed    812 k
bsh                            i386    1.3.0-9jpp.1             installed    1.2 M
eclipse-ecj                    i386    1:3.2.1-19.el5           installed     18 M
gjdoc                          i386    0.7.7-12.el5             installed    1.7 M
jakarta-commons-codec          i386    1.3-7jpp.2               installed    207 k
jakarta-commons-httpclient     i386    1:3.0-7jpp.1             installed    1.3 M
jakarta-commons-logging        i386    1.0.4-6jpp.1             installed    233 k
java-1.4.2-gcj-compat-devel    i386    1.4.2.0-40jpp.115        installed     81 k
java-1.4.2-gcj-compat-src      i386    1.4.2.0-40jpp.115        installed     0.0
junit                          i386    3.8.2-3jpp.1             installed    602 k
tomcat5-jsp-2.0-api            i386    5.5.23-0jpp.7.el5_3.2    installed    163 k
tomcat5-servlet-2.4-api        i386    5.5.23-0jpp.7.el5_3.2    installed    250 k
xalan-j2                       i386    2.7.0-6jpp.1             installed    5.1 M
xmlrpc                         i386    2.0.1-3jpp.1             installed    864 k

Transaction Summary
====================================================================================
Remove       16 Package(s)
Reinstall     0 Package(s)
Downgrade     0 Package(s)

Downloading Packages:
Running rpm_check_debug
Running Transaction Test
Finished Transaction Test
Transaction Test Succeeded
Running Transaction
Erasing        : java-1.4.2-gcj-compat-devel                1/16
Erasing        : bsf                                        2/16
Erasing        : antlr                                      3/16
Erasing        : tomcat5-servlet-2.4-api                    4/16
Erasing        : jakarta-commons-codec                      5/16
Erasing        : java-1.4.2-gcj-compat-src                  6/16
Erasing        : jakarta-commons-logging                    7/16
Erasing        : junit                                      8/16
Erasing        : tomcat5-jsp-2.0-api                        9/16
Erasing        : xmlrpc                                    10/16
Erasing        : java-1.4.2-gcj-compat                     11/16
Erasing        : xalan-j2                                  12/16
Erasing        : jakarta-commons-httpclient                13/16
Erasing        : bsh                                       14/16
Erasing        : gjdoc                                     15/16
Erasing        : eclipse-ecj                               16/16

Removed:
java-1.4.2-gcj-compat.i386 0:1.4.2.0-40jpp.115

Dependency Removed:
antlr.i386 0:2.7.6-4jpp.2                                 bsf.i386 0:2.3.0-11jpp.1                         bsh.i386 0:1.3.0-9jpp.1                                
eclipse-ecj.i386 1:3.2.1-19.el5                           gjdoc.i386 0:0.7.7-12.el5                        jakarta-commons-codec.i386 0:1.3-7jpp.2                
jakarta-commons-httpclient.i386 1:3.0-7jpp.1              jakarta-commons-logging.i386 0:1.0.4-6jpp.1      java-1.4.2-gcj-compat-devel.i386 0:1.4.2.0-40jpp.115   
java-1.4.2-gcj-compat-src.i386 0:1.4.2.0-40jpp.115        junit.i386 0:3.8.2-3jpp.1                        tomcat5-jsp-2.0-api.i386 0:5.5.23-0jpp.7.el5_3.2       
tomcat5-servlet-2.4-api.i386 0:5.5.23-0jpp.7.el5_3.2      xalan-j2.i386 0:2.7.0-6jpp.1                     xmlrpc.i386 0:2.0.1-3jpp.1

Complete!
++++++ Install the new JDK 1.6 (available from the Oracle website) ++++++

[root@roger java]# sudo ./jdk-6u27-linux-i586-rpm.bin

Unpacking...
Checksumming...
Extracting...
UnZipSFX 5.50 of 17 February 2002, by Info-ZIP (Zip-Bugs@lists.wku.edu).
inflating: jdk-6u27-linux-i586.rpm
inflating: sun-javadb-common-10.6.2-1.1.i386.rpm
inflating: sun-javadb-core-10.6.2-1.1.i386.rpm
inflating: sun-javadb-client-10.6.2-1.1.i386.rpm
inflating: sun-javadb-demo-10.6.2-1.1.i386.rpm
inflating: sun-javadb-docs-10.6.2-1.1.i386.rpm
inflating: sun-javadb-javadoc-10.6.2-1.1.i386.rpm
Preparing...                ########################################### [100%]
1:jdk                    ########################################### [100%]
Unpacking JAR files...
rt.jar...
jsse.jar...
charsets.jar...
tools.jar...
localedata.jar...
plugin.jar...
javaws.jar...
deploy.jar...
Installing JavaDB
Preparing...                ########################################### [100%]
1:sun-javadb-common      ########################################### [ 17%]
2:sun-javadb-core        ########################################### [ 33%]
3:sun-javadb-client      ########################################### [ 50%]
4:sun-javadb-demo        ########################################### [ 67%]
5:sun-javadb-docs        ########################################### [ 83%]
6:sun-javadb-javadoc     ########################################### [100%]

Java(TM) SE Development Kit 6 successfully installed.

Product Registration is FREE and includes many benefits:
* Notification of new versions, patches, and updates
* Special offers on Oracle products, services and training
* Access to early releases and documentation

Product and system data will be collected. If your configuration
supports a browser, the JDK Product Registration form will
be presented. If you do not register, none of this information
will be saved. You may also register your JDK later by
opening the register.html file (located in the JDK installation
directory) in a browser.

For more information on what data Registration collects and
how it is managed and used, see:
http://java.sun.com/javase/registration/JDKRegistrationPrivacy.html

Press Enter to continue.....

Done.

[root@roger java]# groupadd hadoop
[root@roger java]# useradd -g hadoop -G hadoop  hadoop

useradd: warning: the home directory already exists.
Not copying any file from skel directory into it.

[root@roger java]# passwd hadoop

Changing password for user hadoop.
New UNIX password:
BAD PASSWORD: it is based on a dictionary word
Retype new UNIX password:
passwd: all authentication tokens updated successfully.

====== vi /etc/profile (add the following lines) ======

export JAVA_HOME=/usr/java/jdk1.6.0_27
export PATH=$PATH:$JAVA_HOME/bin
export CLASSPATH=.:$JAVA_HOME/lib/tools.jar:$JAVA_HOME/lib/rt.jar
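The same edit can be scripted so that re-running it never duplicates the exports. A minimal sketch; the JAVA_HOME path is taken from the JDK install above, and the function name `add_java_env` is invented for illustration:

```shell
# Idempotently append the Java environment settings to a profile file;
# a second run is a no-op once the JAVA_HOME line is already present.
add_java_env() {
    profile="$1"
    if ! grep -q 'JAVA_HOME=/usr/java/jdk1.6.0_27' "$profile"; then
        cat >> "$profile" <<'EOF'
export JAVA_HOME=/usr/java/jdk1.6.0_27
export PATH=$PATH:$JAVA_HOME/bin
export CLASSPATH=.:$JAVA_HOME/lib/tools.jar:$JAVA_HOME/lib/rt.jar
EOF
    fi
}
```

Usage would be `add_java_env /etc/profile` as root, followed by `source /etc/profile`.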

====== source /etc/profile (then verify) ======

[root@roger bin]# source /etc/profile
[root@roger bin]# which java

/usr/java/jdk1.6.0_27/bin/java

[root@roger bin]# java -version

java version "1.6.0_27"
Java(TM) SE Runtime Environment (build 1.6.0_27-b07)
Java HotSpot(TM) Client VM (build 20.2-b06, mixed mode, sharing)
[root@roger bin]#

====== Verify the java environment for the hadoop user ======

[root@roger bin]# su - hadoop
-bash-3.2$ which java

/usr/java/jdk1.6.0_27/bin/java


++++++ Configure the SSH key ++++++

ssh-keygen -t rsa
cat /home/hadoop/.ssh/id_rsa.pub >> /home/hadoop/.ssh/authorized_keys
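One detail worth adding: with its default StrictModes setting, sshd silently ignores an authorized_keys file (or .ssh directory) that is group- or world-accessible, and ssh then falls back to password prompts, exactly the behavior seen during start-all.sh below. A hedged sketch; the helper name `fix_ssh_perms` is invented for illustration:

```shell
# Tighten permissions so sshd will actually accept the key; a lax mode
# on .ssh or authorized_keys makes sshd ignore the key entirely.
fix_ssh_perms() {
    chmod 700 "$1"
    chmod 600 "$1/authorized_keys"
}
```

Then run `fix_ssh_perms /home/hadoop/.ssh` and confirm with `ssh localhost date`; it should not prompt for a password.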

++++++ Edit the hadoop conf files ++++++

-bash-3.2$ pwd
/home/hadoop/hadoop-0.20.2/conf

-bash-3.2$ cat hadoop-env.sh|grep JAVA_HOME

# The only required environment variable is JAVA_HOME.  All others are
# set JAVA_HOME in this file, so that it is correctly defined on
# export JAVA_HOME=/usr/lib/j2sdk1.5-sun
export JAVA_HOME=/usr/java/jdk1.6.0_27
====== Modify the following files ======

++++++ Rename the original core-site.xml, create a new one, and add the following content ++++++

-bash-3.2$ cat core-site.xml

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<configuration>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/home/hadoop/hadooptmp</value>
    <description>A base for other temporary directories.</description>
  </property>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://192.168.2.130:9000</value>
    <description>The name of the default file system.  A URI whose
    scheme and authority determine the FileSystem implementation.  The
    uri's scheme determines the config property (fs.SCHEME.impl) naming
    the FileSystem implementation class.  The uri's authority is used to
    determine the host, port, etc. for a filesystem.</description>
  </property>
</configuration>
++++++ Rename the original hdfs-site.xml, create a new one, and add the following content ++++++

-bash-3.2$ cat hdfs-site.xml

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
    <description>Default block replication.
    The actual number of replications can be specified when the file is created.
    The default is used if replication is not specified in create time.
    </description>
  </property>
</configuration>
++++++ Rename the original mapred-site.xml, create a new one, and add the following content ++++++

-bash-3.2$ cat mapred-site.xml

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>192.168.2.130:9001</value>
    <description>The host and port that the MapReduce job tracker runs
    at.  If "local", then jobs are run in-process as a single map
    and reduce task.
    </description>
  </property>
</configuration>
<span style="color: rgb(0, 0, 0); font-family: Verdana, Arial, Helvetica, sans-serif; font-size: 12px;"></span>
++++++ Format the hadoop namenode ++++++

-bash-3.2$ pwd
/home/hadoop/hadoop-0.20.2/bin

-bash-3.2$ ./hadoop  namenode -format

11/10/12 07:01:40 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = hadoopName/192.168.2.130
STARTUP_MSG:   args = [-format]
STARTUP_MSG:   version = 0.20.2
STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20 -r 911707; compiled by 'chrisdo' on Fri Feb 19 08:07:34 UTC 2010
************************************************************/
11/10/12 07:01:40 INFO namenode.FSNamesystem: fsOwner=hadoop,hadoop
11/10/12 07:01:40 INFO namenode.FSNamesystem: supergroup=supergroup
11/10/12 07:01:40 INFO namenode.FSNamesystem: isPermissionEnabled=true
11/10/12 07:01:40 INFO common.Storage: Image file of size 96 saved in 0 seconds.
11/10/12 07:01:40 INFO common.Storage: Storage directory /home/hadoop/hadooptmp/dfs/name has been successfully formatted.
11/10/12 07:01:40 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at hadoopName/192.168.2.130
************************************************************/

++++++ Start hadoop ++++++

-bash-3.2$ ./start-all.sh

starting namenode, logging to /home/hadoop/hadoop-0.20.2/bin/../logs/hadoop-hadoop-namenode-hadoopName.out
hadoop@localhost's password:
localhost: starting datanode, logging to /home/hadoop/hadoop-0.20.2/bin/../logs/hadoop-hadoop-datanode-hadoopName.out
hadoop@localhost's password:
localhost: starting secondarynamenode, logging to /home/hadoop/hadoop-0.20.2/bin/../logs/hadoop-hadoop-secondarynamenode-hadoopName.out
starting jobtracker, logging to /home/hadoop/hadoop-0.20.2/bin/../logs/hadoop-hadoop-jobtracker-hadoopName.out
hadoop@localhost's password:
localhost: starting tasktracker, logging to /home/hadoop/hadoop-0.20.2/bin/../logs/hadoop-hadoop-tasktracker-hadoopName.out
localhost: [Fatal Error] mapred-site.xml:15:18: The markup in the document following the root element must be well-formed.

-bash-3.2$ jps
11597 Jps

-bash-3.2$ hadoop fs -ls
-bash: hadoop: command not found

-bash-3.2$ ./hadoop fs -ls

[Fatal Error] mapred-site.xml:15:18: The markup in the document following the root element must be well-formed.
11/10/12 07:06:35 FATAL conf.Configuration: error parsing conf file: org.xml.sax.SAXParseException: The markup in the document following the root element must be well-formed.
Exception in thread "main" java.lang.RuntimeException: org.xml.sax.SAXParseException: The markup in the document following the root element must be well-formed.
at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:1168)
at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:1030)
at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:980)
at org.apache.hadoop.conf.Configuration.get(Configuration.java:382)
at org.apache.hadoop.conf.Configuration.getInt(Configuration.java:451)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:182)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:82)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:95)
at org.apache.hadoop.fs.FsShell.init(FsShell.java:82)
at org.apache.hadoop.fs.FsShell.run(FsShell.java:1731)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
at org.apache.hadoop.fs.FsShell.main(FsShell.java:1880)
Caused by: org.xml.sax.SAXParseException: The markup in the document following the root element must be well-formed.
at com.sun.org.apache.xerces.internal.parsers.DOMParser.parse(DOMParser.java:249)
at com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderImpl.parse(DocumentBuilderImpl.java:284)
at javax.xml.parsers.DocumentBuilder.parse(DocumentBuilder.java:180)
at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:1079)
... 17 more


++++++ Now let's find out what the error actually is ++++++

-bash-3.2$ cat hadoop-hadoop-tasktracker-hadoopName.out

[Fatal Error] mapred-site.xml:15:18: The markup in the document following the root element must be well-formed.
-bash-3.2$ cat hadoop-hadoop-tasktracker-hadoopName.log
2011-10-12 07:03:08,906 INFO org.apache.hadoop.mapred.TaskTracker: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting TaskTracker
STARTUP_MSG:   host = hadoopName/192.168.2.130
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 0.20.2
STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20 -r 911707; compiled by 'chrisdo' on Fri Feb 19 08:07:34 UTC 2010
************************************************************/
2011-10-12 07:03:09,053 FATAL org.apache.hadoop.conf.Configuration: error parsing conf file: org.xml.sax.SAXParseException: The markup in the document following the root element must be well-formed.
2011-10-12 07:03:09,054 ERROR org.apache.hadoop.mapred.TaskTracker: Can not start task tracker because java.lang.RuntimeException: org.xml.sax.SAXParseException: The markup in the document following the root element must be well-formed.
at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:1168)
at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:1030)
at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:980)
at org.apache.hadoop.conf.Configuration.get(Configuration.java:382)
at org.apache.hadoop.mapred.JobConf.checkAndWarnDeprecation(JobConf.java:1662)
at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:165)
at org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:2829)
Caused by: org.xml.sax.SAXParseException: The markup in the document following the root element must be well-formed.
at com.sun.org.apache.xerces.internal.parsers.DOMParser.parse(DOMParser.java:249)
at com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderImpl.parse(DocumentBuilderImpl.java:284)
at javax.xml.parsers.DocumentBuilder.parse(DocumentBuilder.java:180)
at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:1079)
... 6 more

2011-10-12 07:03:09,055 INFO org.apache.hadoop.mapred.TaskTracker: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down TaskTracker at hadoopName/192.168.2.130
************************************************************/
++++++ The error shows the configuration file is malformed. After changing it to the following and starting again, everything is OK. ++++++

-bash-3.2$ cat mapred-site.xml

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>192.168.2.130:9001</value>
  </property>
</configuration>
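A parse error like the SAXParseException above can be caught before the daemons ever start by checking each *-site.xml for well-formedness. A sketch, assuming a python interpreter is available on the host (the function name is invented; `xmllint --noout` from libxml2 is an equivalent alternative where installed):

```shell
# Report whether each given XML file is well-formed; stray markup after
# </configuration> (the cause of the error above) is flagged here with a
# one-line message instead of a daemon stack trace.
check_site_xml() {
    for f in "$@"; do
        if python3 -c 'import sys, xml.dom.minidom as m; m.parse(sys.argv[1])' "$f" 2>/dev/null; then
            echo "$f: well-formed"
        else
            echo "$f: NOT well-formed"
        fi
    done
}
```

Run as `check_site_xml core-site.xml hdfs-site.xml mapred-site.xml` from the conf directory. (`python3` is an assumption of this sketch; on the RHEL 5 host in this transcript the interpreter would be `python`.)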
-bash-3.2$ ./start-all.sh

starting namenode, logging to /home/hadoop/hadoop-0.20.2/bin/../logs/hadoop-hadoop-namenode-hadoopName.out
hadoop@192.168.2.130's password:
192.168.2.130: starting datanode, logging to /home/hadoop/hadoop-0.20.2/bin/../logs/hadoop-hadoop-datanode-hadoopName.out
hadoop@192.168.2.130's password:
192.168.2.130: starting secondarynamenode, logging to /home/hadoop/hadoop-0.20.2/bin/../logs/hadoop-hadoop-secondarynamenode-hadoopName.out
starting jobtracker, logging to /home/hadoop/hadoop-0.20.2/bin/../logs/hadoop-hadoop-jobtracker-hadoopName.out
hadoop@192.168.2.130's password:
192.168.2.130: starting tasktracker, logging to /home/hadoop/hadoop-0.20.2/bin/../logs/hadoop-hadoop-tasktracker-hadoopName.out

-bash-3.2$ jps

17189 SecondaryNameNode
17461 Jps
17094 DataNode
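Note that this jps listing still shows only three Java processes: the NameNode, JobTracker and TaskTracker are missing, which is consistent with the "no jobtracker to stop" and "no namenode to stop" messages further down. A small checker over jps output makes such gaps explicit; the function name and the daemon list (hadoop 0.20.x in pseudo-distributed mode) are this sketch's assumptions:

```shell
# Compare `jps` output against the daemons expected from a running
# pseudo-distributed hadoop 0.20.x cluster and report what is missing.
check_daemons() {
    jps_output="$1"
    for d in NameNode DataNode SecondaryNameNode JobTracker TaskTracker; do
        if echo "$jps_output" | grep -qw "$d"; then
            echo "$d: running"
        else
            echo "$d: MISSING"
        fi
    done
}
```

Usage: `check_daemons "$(jps)"` as the hadoop user; anything reported MISSING usually has the reason in the corresponding log under logs/.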

-bash-3.2$ ./hadoop dfs

Usage: java FsShell
[-ls <path>]
[-lsr <path>]
[-du <path>]
[-dus <path>]
[-count[-q] <path>]
[-mv <src> <dst>]
[-cp <src> <dst>]
[-rm [-skipTrash] <path>]
[-rmr [-skipTrash] <path>]
[-expunge]
[-put <localsrc> ... <dst>]
[-copyFromLocal <localsrc> ... <dst>]
[-moveFromLocal <localsrc> ... <dst>]
[-get [-ignoreCrc] [-crc] <src> <localdst>]
[-getmerge <src> <localdst> [addnl]]
[-cat <src>]
[-text <src>]
[-copyToLocal [-ignoreCrc] [-crc] <src> <localdst>]
[-moveToLocal [-crc] <src> <localdst>]
[-mkdir <path>]
[-setrep [-R] [-w] <rep> <path/file>]
[-touchz <path>]
[-test -[ezd] <path>]
[-stat [format] <path>]
[-tail [-f] <file>]
[-chmod [-R] <MODE[,MODE]... | OCTALMODE> PATH...]
[-chown [-R] [OWNER][:[GROUP]] PATH...]
[-chgrp [-R] GROUP PATH...]
[-help [cmd]]

Generic options supported are
-conf <configuration file>     specify an application configuration file
-D <property=value>            use value for given property
-fs <local|namenode:port>      specify a namenode
-jt <local|jobtracker:port>    specify a job tracker
-files <comma separated list of files>     specify comma separated files to be copied to the map reduce cluster
-libjars <comma separated list of jars>    specify comma separated jar files to include in the classpath.
-archives <comma separated list of archives>    specify comma separated archives to be unarchived on the compute machines.

The general command line syntax is
bin/hadoop command [genericOptions] [commandOptions]

-bash-3.2$ ls
hadoop             hadoop-daemon.sh   rcc       
hadoop-config.sh   hadoop-daemons.sh  slaves.sh 
start-all.sh       start-dfs.sh       stop-all.sh       stop-dfs.sh
start-balancer.sh  start-mapred.sh    stop-balancer.sh  stop-mapred.sh

-bash-3.2$ ./stop-all.sh
no jobtracker to stop
hadoop@192.168.2.130's password:
192.168.2.130: no tasktracker to stop
no namenode to stop
hadoop@192.168.2.130's password:
192.168.2.130: stopping datanode
hadoop@192.168.2.130's password:
192.168.2.130: stopping secondarynamenode

This article was originally shared by 惜分飞: http://www.oracleplus.net/arch/1317.html

Keywords: hadoop installation and configuration; detailed steps for installing hadoop on Linux