Testing Hadoop 2.7.1
Three CentOS 7 machines (hostnames master-CentOS7, slave1-CentOS7, and slave2-CentOS7), each with 2 GB of RAM (out of necessity; I had just swapped in new memory sticks).
I covered the installation in an earlier post, "CentOS 7 安装Hadoop 2.7.1" (Installing Hadoop 2.7.1 on CentOS 7): http://blog.csdn.net/noob_f/article/details/52356779
wordcount統(tǒng)計單詞
On master-CentOS7 (Hadoop cluster already running):
[root@master ~]# cd /usr/local/hadoop/
[root@master hadoop]# bin/hdfs dfs -mkdir /test001
[root@master hadoop]# bin/hdfs dfs -ls /
Found 3 items
drwxr-xr-x   - root supergroup          0 2016-09-01 19:41 /test001
drwx------   - root supergroup          0 2016-08-29 20:26 /tmp
drwxr-xr-x   - root supergroup          0 2016-08-29 20:26 /user
[root@master hadoop]# ls
bin  etc      lib      LICENSE.txt  NOTICE.txt  sbin   tmp
dfs  include  libexec  logs         README.txt  share
[root@master hadoop]# wc -l LICENSE.txt
289 LICENSE.txt
[root@master hadoop]# du -sh !$
du -sh LICENSE.txt
16K     LICENSE.txt
[root@master hadoop]# bin/hdfs dfs -copyFromLocal ./LICENSE.txt /test001
[root@master hadoop]# bin/hdfs dfs -ls /test001
Found 1 items
-rw-r--r--   2 root supergroup      15429 2016-09-01 19:46 /test001/LICENSE.txt
[root@master hadoop]# bin/hadoop jar ./share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.1.jar wordcount /test001/LICENSE.txt /test001/
[root@master hadoop]# echo $?
255
The command failed with an error:
org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory hdfs://192.168.1.182:9000/test001 already exists
The output directory must not already exist, so change the command to write to a new subdirectory:
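For intuition, the wordcount example tokenizes each input line on whitespace and sums the occurrences of each token. The Hadoop example itself is Java; the following is only a minimal local sketch of the same map/reduce flow in Python:

```python
from collections import Counter
from itertools import chain

def mapper(line):
    # Map phase: emit a (token, 1) pair for every whitespace-delimited token.
    return [(tok, 1) for tok in line.split()]

def reducer(pairs):
    # Reduce (and combine) phase: sum the counts per token.
    counts = Counter()
    for tok, n in pairs:
        counts[tok] += n
    return dict(counts)

lines = ["Apache License 2.0", "Apache Hadoop"]
result = reducer(chain.from_iterable(mapper(l) for l in lines))
print(result)  # {'Apache': 2, 'License': 1, '2.0': 1, 'Hadoop': 1}
```

In the real job, mapper outputs are partitioned by key before the reduce phase, which is why the final part-r-00000 file comes out sorted by token.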
[root@master hadoop]# bin/hadoop jar ./share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.1.jar wordcount /test001/LICENSE.txt /test001/wordcount
16/09/02 17:09:35 INFO client.RMProxy: Connecting to ResourceManager at /192.168.1.182:8032
16/09/02 17:09:36 INFO input.FileInputFormat: Total input paths to process : 1
16/09/02 17:09:36 INFO mapreduce.JobSubmitter: number of splits:1
16/09/02 17:09:37 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1472804584592_0003
16/09/02 17:09:37 INFO impl.YarnClientImpl: Submitted application application_1472804584592_0003
16/09/02 17:09:37 INFO mapreduce.Job: The url to track the job: http://master:8088/proxy/application_1472804584592_0003/
16/09/02 17:09:37 INFO mapreduce.Job: Running job: job_1472804584592_0003
16/09/02 17:09:46 INFO mapreduce.Job: Job job_1472804584592_0003 running in uber mode : false
16/09/02 17:09:46 INFO mapreduce.Job:  map 0% reduce 0%
16/09/02 17:09:55 INFO mapreduce.Job:  map 100% reduce 0%
16/09/02 17:10:04 INFO mapreduce.Job:  map 100% reduce 100%
16/09/02 17:10:05 INFO mapreduce.Job: Job job_1472804584592_0003 completed successfully
16/09/02 17:10:05 INFO mapreduce.Job: Counters: 49
        File System Counters
                FILE: Number of bytes read=10992
                FILE: Number of bytes written=252973
                FILE: Number of read operations=0
                FILE: Number of large read operations=0
                FILE: Number of write operations=0
                HDFS: Number of bytes read=15539
                HDFS: Number of bytes written=8006
                HDFS: Number of read operations=6
                HDFS: Number of large read operations=0
                HDFS: Number of write operations=2
        Job Counters
                Launched map tasks=1
                Launched reduce tasks=1
                Data-local map tasks=1
                Total time spent by all maps in occupied slots (ms)=6493
                Total time spent by all reduces in occupied slots (ms)=6714
                Total time spent by all map tasks (ms)=6493
                Total time spent by all reduce tasks (ms)=6714
                Total vcore-seconds taken by all map tasks=6493
                Total vcore-seconds taken by all reduce tasks=6714
                Total megabyte-seconds taken by all map tasks=6648832
                Total megabyte-seconds taken by all reduce tasks=6875136
        Map-Reduce Framework
                Map input records=289
                Map output records=2157
                Map output bytes=22735
                Map output materialized bytes=10992
                Input split bytes=110
                Combine input records=2157
                Combine output records=755
                Reduce input groups=755
                Reduce shuffle bytes=10992
                Reduce input records=755
                Reduce output records=755
                Spilled Records=1510
                Shuffled Maps =1
                Failed Shuffles=0
                Merged Map outputs=1
                GC time elapsed (ms)=146
                CPU time spent (ms)=2360
                Physical memory (bytes) snapshot=312647680
                Virtual memory (bytes) snapshot=1717682176
                Total committed heap usage (bytes)=163123200
        Shuffle Errors
                BAD_ID=0
                CONNECTION=0
                IO_ERROR=0
                WRONG_LENGTH=0
                WRONG_MAP=0
                WRONG_REDUCE=0
        File Input Format Counters
                Bytes Read=15429
        File Output Format Counters
                Bytes Written=8006
[root@master hadoop]# echo $?
0
[root@master hadoop]# bin/hdfs dfs -ls /test001/
Found 2 items
-rw-r--r--   2 root supergroup      15429 2016-09-01 19:46 /test001/LICENSE.txt
drwxr-xr-x   - root supergroup          0 2016-09-02 17:10 /test001/wordcount
[root@master hadoop]# bin/hdfs dfs -ls /test001/wordcount
Found 2 items
-rw-r--r--   2 root supergroup          0 2016-09-02 17:10 /test001/wordcount/_SUCCESS
-rw-r--r--   2 root supergroup       8006 2016-09-02 17:10 /test001/wordcount/part-r-00000
[root@master hadoop]# bin/hdfs dfs -cat /test001/wordcount/part-r-00000
"AS 4 "Contribution" 1 "Contributor" 1 "Derivative 1 "Legal 1 "License" 1 "License"); 1 "Licensor" 1 "NOTICE" 1 "Not 1 "Object" 1 "Source" 1 "Work" 1 "You" 1 "Your") 1 "[]" 1 "control" 1 "printed 1 "submitted" 1 (50%) 1 (C) 1 (Don't 1 (INCLUDING 2 (INCLUDING, 2 (a) 1 (an 1 (and 1 (b) 1 (c) 2 (d) 1 (except 1 (http://www.one-lab.org) 1 (http://www.opensource.org/licenses/bsd-license.php) 1 (i) 1 (ii) 1 (iii) 1 (including 3 (or 3 (such 1 (the 1 * 34 */ 3 - 7 /* 1 /** 2 034819 1 1 1 1. 1 2-Clause 1 2. 1 2.0 1 2.0, 1 2004 1 2005, 1 2008,2009,2010 1 2011-2014, 1 3. 1 4. 1 5. 1 6. 1 7. 1 8. 1 9 1 9.
1 : 3 A 3 ADVISED 2 AND 11 ANY 10 APACHE 1 APPENDIX: 1 ARE 2 ARISING 2 Accepting 1 Additional 1 All 2 Apache 5 Appendix 1 BASIS, 2 BE 2 BSD 1 BSD-style 1 BUSINESS 2 BUT 4 BY 2 CAUSED 2 CONDITIONS 4 CONSEQUENTIAL 2 CONTRACT, 2 CONTRIBUTORS 4 COPYRIGHT 4 CRC 1 Catholique 1 Collet. 1 Commission 1 Contribution 3 Contribution(s) 3 Contribution." 1 Contributions) 1 Contributions. 2 Contributor 8 Contributor, 1 Copyright 5 DAMAGE. 2 DAMAGES 2 DATA, 2 DIRECT, 2 DISCLAIMED. 2 DISTRIBUTION 1 Definitions. 1 Derivative 17 Disclaimer 1 END 1 EVEN 2 EVENT 2 EXEMPLARY, 2 EXPRESS 2 Entity 3 Entity" 1 European 1 FITNESS 3 FOR 6 Fast 1 File 1 For 6 GOODS 2 Grant 2 HADOOP 1 HOLDERS 2 HOWEVER 2 Hadoop 1 Header 1 How 1 However, 1 IF 2 IMPLIED 4 IN 6 INCIDENTAL, 2 INCLUDING, 2 INDIRECT, 2 INTERRUPTION) 2 IS 2 IS" 4 If 2 In 1 Institute 1 January 1 KIND, 2 LIABILITY, 4 LIABLE 2 LICENSE 1 LIMITED 4 LOSS 2 LZ 1 LZ4 3 Legal 3 Liability. 2 License 10 License, 6 License. 11 License; 1 Licensed 1 Licensor 8 Licensor, 1 Limitation 1 Louvain 1 MERCHANTABILITY 2 MERCHANTABILITY, 1 Massachusetts 1 NEGLIGENCE 2 NO 2 NON-INFRINGEMENT, 1 NOT 4 NOTICE 5 Neither 1 Notwithstanding 1 OF 19 ON 2 OR 18 OTHERWISE) 2 OUT 2 OWNER 2 Object 4 OneLab 1 PARTICULAR 3 POSSIBILITY 2 PROCUREMENT 2 PROFITS; 2 PROVIDED 2 PURPOSE 2 PURPOSE. 1 Patent 1 REPRODUCTION, 1 Redistribution 2 Redistribution. 1 Redistributions 4 SERVICES; 2 SHALL 2 SOFTWARE 2 SOFTWARE, 2 SPECIAL, 2 STRICT 2 SUBCOMPONENTS: 1 SUBSTITUTE 2 SUCH 2 Sections 1 See 1 Source 8 Subject 2 Submission 1 TERMS 2 THE 10 THEORY 2 THIS 4 TITLE, 1 TO, 4 TORT 2 Technology. 1 The 3 This 1 To 1 Trademarks. 1 UCL 1 USE 2 USE, 3 University 1 Unless 3 Use 1 Version 2 WARRANTIES 4 WARRANTIES, 2 WAY 2 WHETHER 2 WITHOUT 2 Warranty 1 Warranty. 1 We 1 While 1 Work 20 Work, 4 Work. 
1 Works 12 Works" 1 Works, 2 Works; 3 Yann 1 You 24 Your 9 [name 1 [yyyy] 1 a 21 above 4 above, 1 acceptance 1 accepting 2 act 1 acting 1 acts) 1 add 2 addendum 1 additional 4 additions 1 advised 1 against 1 against, 1 agree 1 agreed 3 agreement 1 algorithm 1 all 3 alleging 1 alone 1 along 1 alongside 1 also 1 an 6 and 51 and/or 3 annotations, 1 any 28 appear. 1 applicable 3 applies 1 apply 2 appropriate 1 appropriateness 1 archives. 1 are 10 arising 1 as 15 asserted 1 associated 1 assume 1 at 3 attach 1 attached 1 attribution 4 author 1 authorized 2 authorship, 2 authorship. 1 available 1 based 1 be 7 been 2 behalf 5 below). 1 beneficial 1 binary 4 bind 1 boilerplate 1 brackets 1 brackets!) 1 but 5 by 21 by, 3 calculation 1 can 2 cannot 1 carry 1 cause 2 changed 1 character 1 charge 1 choose 1 claims 2 class 1 classes: 1 code 5 code, 2 combination 1 comment 1 commercial 1 common 1 communication 3 compiled 1 compliance 1 complies 1 compression 1 computer 1 conditions 14 conditions. 1 conditions: 1 configuration 1 consequential 1 consistent 1 conspicuously 1 constitutes 1 construed 1 contact 1 contained 1 contains 1 content 1 contents 1 contract 2 contract, 1 contributors 1 contributory 1 control 2 control, 1 controlled 1 conversions 1 copies 1 copy 3 copyright 15 copyright, 1 counterclaim 1 cross-claim 1 customary 1 damages 3 damages, 1 damages. 1 date 1 de 1 defend, 1 defined 1 definition, 2 deliberate 1 derived 2 describing 1 description 1 designated 1 determining 1 different 1 direct 2 direct, 1 direction 1 disclaimer 2 disclaimer. 2 discussing 1 display 1 display, 1 distribute 3 distribute, 2 distributed 3 distribution 3 distribution, 1 distribution. 2 do 3 document. 1 documentation 3 documentation, 2 does 1 each 4 easier 1 editorial 1 either 2 elaborations, 1 electronic 1 electronic, 1 enclosed 2 endorse 1 entities 1 entity 3 entity, 1 entity. 
2 even 1 event 1 example 1 except 2 excluding 3 executed 1 exercise 1 exercising 1 explicitly 1 express 2 failure 1 fee 1 fields 1 fifty 1 file 6 file, 1 file. 2 filed. 1 files 1 files. 1 files; 1 following 10 for 19 for, 1 form 10 form, 4 form. 1 format. 1 forms, 2 forum 1 found 1 from 4 from) 1 from, 1 generated 2 give 1 goodwill, 1 governed 1 governing 1 grant 1 granted 2 granting 1 grants 2 grossly 1 harmless 1 has 2 have 2 hereby 2 herein 1 hold 1 http://code.google.com/p/lz4/ 1 http://www.apache.org/licenses/ 1 http://www.apache.org/licenses/LICENSE-2.0 1 https://groups.google.com/forum/#!forum/lz4c 1 identification 1 identifying 1 if 4 implementation 1 implied, 1 implied. 1 import, 1 improving 1 in 31 inability 1 incidental, 1 include 3 included 2 includes 1 including 5 including, 1 inclusion 2 incorporated 2 incurred 1 indemnify, 1 indemnity, 1 indicated 1 indirect, 2 individual 3 information. 1 informational 1 infringed 1 infringement, 1 institute 1 intentionally 2 interfaces 1 irrevocable 2 is 10 issue 1 its 4 language 1 law 3 lawsuit) 1 least 1 legal 1 liability 2 liability. 1 liable 1 licensable 1 license 7 licenses 1 licenses. 1 limitation, 1 limitations 1 limited 4 link 1 list 4 lists, 1 litigation 2 loss 1 losses), 1 made 1 made, 1 mailing 1 make, 1 making 1 malfunction, 1 managed 1 management 1 marked 1 marks, 1 materials 2 may 10 mean 10 means 2 mechanical 1 media 1 medium, 1 meet 1 merely 1 met: 2 modification, 2 modifications 3 modifications, 3 modified 1 modify 2 modifying 1 more 1 must 8 name 2 name) 1 names 2 names, 1 native 1 necessarily 1 negligence), 1 negligent 1 no 2 no-charge, 2 non-exclusive, 2 nor 1 normally 1 not 11 nothing 1 notice 2 notice, 5 notices 9 object 1 obligations 1 obligations, 1 obtain 1 of 75 of, 3 offer 1 offer, 1 on 11 one 1 only 4 or 65 or, 1 org.apache.hadoop.util.bloom.* 1 origin 1 original 2 other 9 otherwise 3 otherwise, 3 out 1 outstanding 1 own 4 owner 4 owner. 
1 owner] 1 ownership 2 page" 1 part 4 patent 5 patent, 1 percent 1 perform, 1 permission 1 permission. 1 permissions 3 permitted 2 perpetual, 2 pertain 2 places: 1 portions 1 possibility 1 power, 1 preferred 1 prepare 1 prior 1 product 1 products 1 project 2 prominent 1 promote 1 provide 1 provided 9 provides 2 public 1 publicly 2 purpose 2 purposes 4 readable 1 reason 1 reasonable 1 received 1 recipients 1 recommend 1 redistributing 2 regarding 1 remain 1 replaced 1 repository 1 represent, 1 representatives, 1 reproduce 3 reproduce, 1 reproducing 1 reproduction, 3 required 4 reserved. 2 responsibility, 1 responsible 1 result 1 resulting 1 retain 2 retain, 1 revisions, 1 rights 3 risks 1 royalty-free, 2 same 1 section) 1 sell, 2 sent 1 separable 1 separate 2 service 1 shall 15 shares, 1 should 1 slicing-by-8 1 software 3 sole 1 solely 1 source 9 source, 1 special, 1 specific 2 src/main/native/src/org/apache/hadoop/io/compress/lz4/{lz4.h,lz4.c,lz4hc.h,lz4hc.c}, 1 src/main/native/src/org/apache/hadoop/util: 1 state 1 stated 2 statement 1 stating 1 stoppage, 1 subcomponents 2 subject 1 sublicense, 1 submit 1 submitted 2 submitted. 1 subsequently 1 such 17 supersede 1 support, 1 syntax 1 systems 1 systems, 1 terminate 1 terms 8 terms. 1 text 4 that 25 the 122 their 2 then 2 theory, 1 thereof 1 thereof, 2 thereof. 1 these 1 third-party 2 this 22 those 3 through 1 to 41 tort 1 tracking 1 trade 1 trademark, 1 trademarks, 1 transfer 1 transformation 1 translation 1 types. 1 under 10 union 1 unless 1 use 8 use, 4 used 1 using 1 verbal, 1 version 1 warranties 1 warranty 1 warranty, 1 was 1 where 1 wherever 1 whether 4 which 2 whole, 2 whom 1 with 16 within 8 without 6 work 5 work, 2 work. 1 works 1 worldwide, 2 writing 1 writing, 3 written 2 you 2 your 4

Running the pi example
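Before the cluster run, the idea behind the example: pi estimates π by sampling points in the unit square and counting how many land inside the quarter circle (Hadoop's version uses a Halton quasi-random sequence, with each map task checking its own batch of samples). A plain-Python Monte Carlo sketch of the same idea, using pseudo-random rather than quasi-random points:

```python
import random

def estimate_pi(num_points, seed=42):
    # Sample points in the unit square; the fraction falling inside
    # the quarter circle x^2 + y^2 <= 1 approaches pi/4.
    rng = random.Random(seed)
    inside = sum(
        1 for _ in range(num_points)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    return 4.0 * inside / num_points

print(estimate_pi(100_000))  # roughly 3.14
```

The `pi 100 100` invocation below does the equivalent of 100 map tasks with 100 samples each, then a single reduce that merges the inside/outside tallies.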
On master-CentOS7 (Hadoop cluster already running):
[root@master ~]# cd /usr/local/hadoop/
[root@master hadoop]# bin/hadoop jar ./share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.1.jar pi 100 100
Number of Maps = 100
Samples per Map = 100
Wrote input for Map #0 Wrote input for Map #1 Wrote input for Map #2 Wrote input for Map #3 Wrote input for Map #4 Wrote input for Map #5 Wrote input for Map #6 Wrote input for Map #7 Wrote input for Map #8 Wrote input for Map #9 Wrote input for Map #10 Wrote input for Map #11 Wrote input for Map #12 Wrote input for Map #13 Wrote input for Map #14 Wrote input for Map #15 Wrote input for Map #16 Wrote input for Map #17 Wrote input for Map #18 Wrote input for Map #19 Wrote input for Map #20 Wrote input for Map #21 Wrote input for Map #22 Wrote input for Map #23 Wrote input for Map #24 Wrote input for Map #25 Wrote input for Map #26 Wrote input for Map #27 Wrote input for Map #28 Wrote input for Map #29 Wrote input for Map #30 Wrote input for Map #31 Wrote input for Map #32 Wrote input for Map #33 Wrote input for Map #34 Wrote input for Map #35 Wrote input for Map #36 Wrote input for Map #37 Wrote input for Map #38 Wrote input for Map #39 Wrote input for Map #40 Wrote input for Map #41 Wrote input for Map #42 Wrote input for Map #43 Wrote input for Map #44 Wrote input for Map #45 Wrote input for Map #46 Wrote input for Map #47 Wrote input for Map #48 Wrote input for Map #49 Wrote input for Map #50 Wrote input for Map #51 Wrote input for Map #52 Wrote input for Map #53 Wrote input for Map #54 Wrote input for Map #55 Wrote input for Map #56 Wrote input for Map #57 Wrote input for Map #58 Wrote input for Map #59 Wrote input for Map #60 Wrote input for Map #61 Wrote input for Map #62 Wrote input for Map #63 Wrote input for Map #64 Wrote input for Map #65 Wrote input for Map #66 Wrote input for Map #67 Wrote input for Map #68 Wrote input for Map #69 Wrote input for Map #70 Wrote input for Map #71 Wrote input for Map #72 Wrote input for Map #73 Wrote input for Map #74 Wrote input for
Map #75 Wrote input for Map #76 Wrote input for Map #77 Wrote input for Map #78 Wrote input for Map #79 Wrote input for Map #80 Wrote input for Map #81 Wrote input for Map #82 Wrote input for Map #83 Wrote input for Map #84 Wrote input for Map #85 Wrote input for Map #86 Wrote input for Map #87 Wrote input for Map #88 Wrote input for Map #89 Wrote input for Map #90 Wrote input for Map #91 Wrote input for Map #92 Wrote input for Map #93 Wrote input for Map #94 Wrote input for Map #95 Wrote input for Map #96 Wrote input for Map #97 Wrote input for Map #98 Wrote input for Map #99 Starting Job 16/09/02 16:40:43 INFO client.RMProxy: Connecting to ResourceManager at /192.168.1.182:8032 16/09/02 16:40:44 INFO input.FileInputFormat: Total input paths to process : 100 16/09/02 16:40:45 INFO mapreduce.JobSubmitter: number of splits:100 16/09/02 16:40:45 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1472804584592_0002 16/09/02 16:40:45 INFO impl.YarnClientImpl: Submitted application application_1472804584592_0002 16/09/02 16:40:46 INFO mapreduce.Job: The url to track the job: http://master:8088/proxy/application_1472804584592_0002/ 16/09/02 16:40:46 INFO mapreduce.Job: Running job: job_1472804584592_0002 16/09/02 16:40:55 INFO mapreduce.Job: Job job_1472804584592_0002 running in uber mode : false 16/09/02 16:40:55 INFO mapreduce.Job: map 0% reduce 0% 16/09/02 16:41:16 INFO mapreduce.Job: map 2% reduce 0% 16/09/02 16:41:31 INFO mapreduce.Job: map 3% reduce 0% 16/09/02 16:41:32 INFO mapreduce.Job: map 4% reduce 0% 16/09/02 16:41:44 INFO mapreduce.Job: map 5% reduce 0% 16/09/02 16:41:46 INFO mapreduce.Job: map 6% reduce 0% 16/09/02 16:41:59 INFO mapreduce.Job: map 7% reduce 0% 16/09/02 16:42:00 INFO mapreduce.Job: map 8% reduce 0% 16/09/02 16:42:12 INFO mapreduce.Job: map 9% reduce 0% 16/09/02 16:42:13 INFO mapreduce.Job: map 10% reduce 0% 16/09/02 16:42:26 INFO mapreduce.Job: map 11% reduce 0% 16/09/02 16:42:27 INFO mapreduce.Job: map 12% reduce 0% 16/09/02 
16:42:40 INFO mapreduce.Job: map 13% reduce 0% 16/09/02 16:42:41 INFO mapreduce.Job: map 14% reduce 0% 16/09/02 16:42:55 INFO mapreduce.Job: map 15% reduce 0% 16/09/02 16:42:56 INFO mapreduce.Job: map 16% reduce 0% 16/09/02 16:43:10 INFO mapreduce.Job: map 17% reduce 0% 16/09/02 16:43:11 INFO mapreduce.Job: map 18% reduce 0% 16/09/02 16:43:25 INFO mapreduce.Job: map 19% reduce 0% 16/09/02 16:43:26 INFO mapreduce.Job: map 20% reduce 0% 16/09/02 16:43:39 INFO mapreduce.Job: map 21% reduce 0% 16/09/02 16:43:40 INFO mapreduce.Job: map 22% reduce 0% 16/09/02 16:43:52 INFO mapreduce.Job: map 23% reduce 0% 16/09/02 16:43:53 INFO mapreduce.Job: map 24% reduce 0% 16/09/02 16:44:06 INFO mapreduce.Job: map 25% reduce 0% 16/09/02 16:44:07 INFO mapreduce.Job: map 26% reduce 0% 16/09/02 16:44:21 INFO mapreduce.Job: map 27% reduce 0% 16/09/02 16:44:23 INFO mapreduce.Job: map 28% reduce 0% 16/09/02 16:44:35 INFO mapreduce.Job: map 29% reduce 0% 16/09/02 16:44:36 INFO mapreduce.Job: map 30% reduce 0% 16/09/02 16:44:48 INFO mapreduce.Job: map 31% reduce 0% 16/09/02 16:44:49 INFO mapreduce.Job: map 32% reduce 0% 16/09/02 16:44:59 INFO mapreduce.Job: map 33% reduce 0% 16/09/02 16:45:00 INFO mapreduce.Job: map 34% reduce 0% 16/09/02 16:45:11 INFO mapreduce.Job: map 35% reduce 0% 16/09/02 16:45:12 INFO mapreduce.Job: map 36% reduce 0% 16/09/02 16:45:22 INFO mapreduce.Job: map 37% reduce 0% 16/09/02 16:45:24 INFO mapreduce.Job: map 38% reduce 0% 16/09/02 16:45:35 INFO mapreduce.Job: map 39% reduce 0% 16/09/02 16:45:36 INFO mapreduce.Job: map 40% reduce 0% 16/09/02 16:45:46 INFO mapreduce.Job: map 41% reduce 0% 16/09/02 16:45:48 INFO mapreduce.Job: map 42% reduce 0% 16/09/02 16:45:58 INFO mapreduce.Job: map 43% reduce 0% 16/09/02 16:46:00 INFO mapreduce.Job: map 44% reduce 0% 16/09/02 16:46:12 INFO mapreduce.Job: map 45% reduce 0% 16/09/02 16:46:13 INFO mapreduce.Job: map 46% reduce 0% 16/09/02 16:46:23 INFO mapreduce.Job: map 47% reduce 0% 16/09/02 16:46:24 INFO mapreduce.Job: map 48% 
reduce 0% 16/09/02 16:46:34 INFO mapreduce.Job: map 49% reduce 0% 16/09/02 16:46:35 INFO mapreduce.Job: map 50% reduce 0% 16/09/02 16:46:45 INFO mapreduce.Job: map 51% reduce 0% 16/09/02 16:46:46 INFO mapreduce.Job: map 52% reduce 0% 16/09/02 16:46:59 INFO mapreduce.Job: map 53% reduce 0% 16/09/02 16:47:03 INFO mapreduce.Job: map 53% reduce 18% 16/09/02 16:47:07 INFO mapreduce.Job: map 54% reduce 18% 16/09/02 16:47:14 INFO mapreduce.Job: map 55% reduce 18% 16/09/02 16:47:21 INFO mapreduce.Job: map 56% reduce 18% 16/09/02 16:47:25 INFO mapreduce.Job: map 56% reduce 19% 16/09/02 16:47:28 INFO mapreduce.Job: map 57% reduce 19% 16/09/02 16:47:35 INFO mapreduce.Job: map 58% reduce 19% 16/09/02 16:47:42 INFO mapreduce.Job: map 59% reduce 19% 16/09/02 16:47:43 INFO mapreduce.Job: map 59% reduce 20% 16/09/02 16:47:49 INFO mapreduce.Job: map 60% reduce 20% 16/09/02 16:47:57 INFO mapreduce.Job: map 61% reduce 20% 16/09/02 16:48:05 INFO mapreduce.Job: map 62% reduce 20% 16/09/02 16:48:08 INFO mapreduce.Job: map 62% reduce 21% 16/09/02 16:48:14 INFO mapreduce.Job: map 63% reduce 21% 16/09/02 16:48:22 INFO mapreduce.Job: map 64% reduce 21% 16/09/02 16:48:31 INFO mapreduce.Job: map 65% reduce 21% 16/09/02 16:48:32 INFO mapreduce.Job: map 65% reduce 22% 16/09/02 16:48:41 INFO mapreduce.Job: map 66% reduce 22% 16/09/02 16:48:49 INFO mapreduce.Job: map 67% reduce 22% 16/09/02 16:48:57 INFO mapreduce.Job: map 68% reduce 22% 16/09/02 16:49:00 INFO mapreduce.Job: map 68% reduce 23% 16/09/02 16:49:05 INFO mapreduce.Job: map 69% reduce 23% 16/09/02 16:49:12 INFO mapreduce.Job: map 70% reduce 23% 16/09/02 16:49:20 INFO mapreduce.Job: map 71% reduce 23% 16/09/02 16:49:22 INFO mapreduce.Job: map 71% reduce 24% 16/09/02 16:49:28 INFO mapreduce.Job: map 72% reduce 24% 16/09/02 16:49:36 INFO mapreduce.Job: map 73% reduce 24% 16/09/02 16:49:43 INFO mapreduce.Job: map 74% reduce 24% 16/09/02 16:49:46 INFO mapreduce.Job: map 74% reduce 25% 16/09/02 16:49:50 INFO mapreduce.Job: map 75% reduce 25% 
16/09/02 16:49:58 INFO mapreduce.Job: map 76% reduce 25% 16/09/02 16:50:09 INFO mapreduce.Job: map 77% reduce 25% 16/09/02 16:50:11 INFO mapreduce.Job: map 77% reduce 26% 16/09/02 16:50:17 INFO mapreduce.Job: map 78% reduce 26% 16/09/02 16:50:25 INFO mapreduce.Job: map 79% reduce 26% 16/09/02 16:50:32 INFO mapreduce.Job: map 80% reduce 26% 16/09/02 16:50:35 INFO mapreduce.Job: map 80% reduce 27% 16/09/02 16:50:39 INFO mapreduce.Job: map 81% reduce 27% 16/09/02 16:50:47 INFO mapreduce.Job: map 82% reduce 27% 16/09/02 16:50:55 INFO mapreduce.Job: map 83% reduce 27% 16/09/02 16:50:56 INFO mapreduce.Job: map 83% reduce 28% 16/09/02 16:51:03 INFO mapreduce.Job: map 84% reduce 28% 16/09/02 16:51:10 INFO mapreduce.Job: map 85% reduce 28% 16/09/02 16:51:17 INFO mapreduce.Job: map 86% reduce 28% 16/09/02 16:51:20 INFO mapreduce.Job: map 86% reduce 29% 16/09/02 16:51:25 INFO mapreduce.Job: map 87% reduce 29% 16/09/02 16:51:34 INFO mapreduce.Job: map 88% reduce 29% 16/09/02 16:51:41 INFO mapreduce.Job: map 89% reduce 29% 16/09/02 16:51:44 INFO mapreduce.Job: map 89% reduce 30% 16/09/02 16:51:49 INFO mapreduce.Job: map 90% reduce 30% 16/09/02 16:51:56 INFO mapreduce.Job: map 91% reduce 30% 16/09/02 16:52:03 INFO mapreduce.Job: map 92% reduce 30% 16/09/02 16:52:06 INFO mapreduce.Job: map 92% reduce 31% 16/09/02 16:52:11 INFO mapreduce.Job: map 93% reduce 31% 16/09/02 16:52:18 INFO mapreduce.Job: map 94% reduce 31% 16/09/02 16:52:26 INFO mapreduce.Job: map 95% reduce 31% 16/09/02 16:52:27 INFO mapreduce.Job: map 95% reduce 32% 16/09/02 16:52:34 INFO mapreduce.Job: map 96% reduce 32% 16/09/02 16:52:41 INFO mapreduce.Job: map 97% reduce 32% 16/09/02 16:52:48 INFO mapreduce.Job: map 98% reduce 32% 16/09/02 16:52:52 INFO mapreduce.Job: map 98% reduce 33% 16/09/02 16:52:55 INFO mapreduce.Job: map 99% reduce 33% 16/09/02 16:53:02 INFO mapreduce.Job: map 100% reduce 33% 16/09/02 16:53:03 INFO mapreduce.Job: map 100% reduce 100% 16/09/02 16:53:04 INFO mapreduce.Job: Job 
job_1472804584592_0002 completed successfully
16/09/02 16:53:04 INFO mapreduce.Job: Counters: 49
        File System Counters
                FILE: Number of bytes read=2206
                FILE: Number of bytes written=11703871
                FILE: Number of read operations=0
                FILE: Number of large read operations=0
                FILE: Number of write operations=0
                HDFS: Number of bytes read=26890
                HDFS: Number of bytes written=215
                HDFS: Number of read operations=403
                HDFS: Number of large read operations=0
                HDFS: Number of write operations=3
        Job Counters
                Launched map tasks=100
                Launched reduce tasks=1
                Data-local map tasks=100
                Total time spent by all maps in occupied slots (ms)=921440
                Total time spent by all reduces in occupied slots (ms)=376555
                Total time spent by all map tasks (ms)=921440
                Total time spent by all reduce tasks (ms)=376555
                Total vcore-seconds taken by all map tasks=921440
                Total vcore-seconds taken by all reduce tasks=376555
                Total megabyte-seconds taken by all map tasks=943554560
                Total megabyte-seconds taken by all reduce tasks=385592320
        Map-Reduce Framework
                Map input records=100
                Map output records=200
                Map output bytes=1800
                Map output materialized bytes=2800
                Input split bytes=15090
                Combine input records=0
                Combine output records=0
                Reduce input groups=2
                Reduce shuffle bytes=2800
                Reduce input records=200
                Reduce output records=0
                Spilled Records=400
                Shuffled Maps =100
                Failed Shuffles=0
                Merged Map outputs=100
                GC time elapsed (ms)=12309
                CPU time spent (ms)=75150
                Physical memory (bytes) snapshot=20894449664
                Virtual memory (bytes) snapshot=86424981504
                Total committed heap usage (bytes)=13431619584
        Shuffle Errors
                BAD_ID=0
                CONNECTION=0
                IO_ERROR=0
                WRONG_LENGTH=0
                WRONG_MAP=0
                WRONG_REDUCE=0
        File Input Format Counters
                Bytes Read=11800
        File Output Format Counters
                Bytes Written=97
Job Finished in 741.887 seconds
Estimated value of Pi is 3.14080000000000000000

If a command fails with "copyFromLocal: Cannot create directory /123/. Name node is in safe mode.":
This is because safe mode is on (stopping and then restarting the Hadoop cluster can also leave the NameNode in safe mode; in that case, try turning safe mode off). Fix:
cd /usr/local/hadoop
bin/hdfs dfsadmin -safemode leave
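When scripting this check, the current status can be read with `bin/hdfs dfsadmin -safemode get`, which prints a line such as "Safe mode is ON". A small Python helper that parses that output (the exact wording is assumed from Hadoop 2.x; the subprocess calls are sketched in comments since they need a live cluster):

```python
def namenode_in_safe_mode(status_output: str) -> bool:
    # Parse the output of `hdfs dfsadmin -safemode get`, which looks like
    # "Safe mode is ON" or "Safe mode is OFF".
    first_line = status_output.strip().splitlines()[0]
    return first_line.upper().endswith("ON")

# On a live cluster (assuming hdfs is on PATH) you would obtain the status
# with something like:
#   out = subprocess.check_output(["hdfs", "dfsadmin", "-safemode", "get"], text=True)
#   if namenode_in_safe_mode(out):
#       subprocess.run(["hdfs", "dfsadmin", "-safemode", "leave"], check=True)

print(namenode_in_safe_mode("Safe mode is ON"))   # True
print(namenode_in_safe_mode("Safe mode is OFF"))  # False
```

Note that forcing safe mode off is only appropriate when you know why the NameNode entered it; during normal startup it leaves safe mode on its own once enough block reports arrive.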
Reposted from: https://www.cnblogs.com/Genesis2018/p/9079808.html