Bad connect ack with firstBadLink 192.168.*.*:50010
While setting up Hadoop 2.0 today, `hadoop fs -put` failed with this error. A solution found online is reposted below; it fixed the problem.
轉(zhuǎn)自:http://lykke.iteye.com/blog/1320558
Exception in thread "main" java.io.IOException: Bad connect ack with firstBadLink 192.168.1.14:50010
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.createBlockOutputStream(DFSClient.java:2903)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:2826)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2000(DFSClient.java:2102)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2288)
運(yùn)行hadoop put文件 的時(shí)候 回報(bào)這個(gè)錯(cuò)誤
這個(gè)在 DFSClient 里
?
// connects to the first datanode in the pipeline
// Returns true if success, otherwise return failure.
//
private boolean createBlockOutputStream(DatanodeInfo[] nodes, String client,
                boolean recoveryFlag) {
  String firstBadLink = "";
  if (LOG.isDebugEnabled()) {
    for (int i = 0; i < nodes.length; i++) {
      LOG.debug("pipeline = " + nodes[i].getName());
    }
  }

  // persist blocks on namenode on next flush
  persistBlocks = true;

  try {
    LOG.debug("Connecting to " + nodes[0].getName());
    InetSocketAddress target = NetUtils.createSocketAddr(nodes[0].getName());
    s = socketFactory.createSocket();
    int timeoutValue = 3000 * nodes.length + socketTimeout;
    NetUtils.connect(s, target, timeoutValue);
    s.setSoTimeout(timeoutValue);
    s.setSendBufferSize(DEFAULT_DATA_SOCKET_SIZE);
    LOG.debug("Send buf size " + s.getSendBufferSize());
    long writeTimeout = HdfsConstants.WRITE_TIMEOUT_EXTENSION * nodes.length +
                        datanodeWriteTimeout;

    //
    // Xmit header info to datanode
    //
    DataOutputStream out = new DataOutputStream(
        new BufferedOutputStream(NetUtils.getOutputStream(s, writeTimeout),
                                 DataNode.SMALL_BUFFER_SIZE));
    blockReplyStream = new DataInputStream(NetUtils.getInputStream(s));

    out.writeShort( DataTransferProtocol.DATA_TRANSFER_VERSION );
    out.write( DataTransferProtocol.OP_WRITE_BLOCK );
    out.writeLong( block.getBlockId() );
    out.writeLong( block.getGenerationStamp() );
    out.writeInt( nodes.length );
    out.writeBoolean( recoveryFlag );       // recovery flag
    Text.writeString( out, client );
    out.writeBoolean(false); // Not sending src node information
    out.writeInt( nodes.length - 1 );
    for (int i = 1; i < nodes.length; i++) {
      nodes[i].write(out);
    }
    checksum.writeHeader( out );
    out.flush();

    // receive ack for connect
    firstBadLink = Text.readString(blockReplyStream);
    if (firstBadLink.length() != 0) {
      throw new IOException("Bad connect ack with firstBadLink " + firstBadLink);
    }

    blockStream = out;
    return true;     // success

  } catch (IOException ie) {
    // ... (exception handling elided in the original excerpt)
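The ack check at the end of the excerpt is the part that produces the error. A minimal, self-contained sketch of just that logic (the `checkConnectAck` helper is hypothetical, not part of DFSClient): an empty ack string means every datanode in the pipeline accepted the connection, while a non-empty string names the first node that refused.

```java
import java.io.IOException;

public class ConnectAckDemo {
    // Hypothetical helper mirroring the check in createBlockOutputStream:
    // an empty string means success; anything else names the first bad datanode.
    static void checkConnectAck(String firstBadLink) throws IOException {
        if (firstBadLink.length() != 0) {
            throw new IOException("Bad connect ack with firstBadLink " + firstBadLink);
        }
    }

    public static void main(String[] args) {
        try {
            checkConnectAck("");                   // all datanodes acked
            System.out.println("pipeline ok");
        } catch (IOException e) {
            System.out.println(e.getMessage());
        }
        try {
            checkConnectAck("192.168.1.14:50010"); // one datanode refused
        } catch (IOException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

So a firewalled or down datanode shows up here as the address in `firstBadLink`, which is why the error message points at a specific host and port.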
The message means the client did not receive a clean ack from the write pipeline. I resolved it with two changes:

1) `/etc/init.d/iptables stop` --> stopped the firewall
2) Set `SELINUX=disabled` in `/etc/selinux/config` --> disabled SELinux

In general, Hadoop ack errors of this kind are most often caused by a firewall that was not shut down on the datanodes.
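On RHEL/CentOS-style systems the two fixes look roughly like this (a sketch to run as root on each datanode; service names and paths vary by distribution, and the SELinux edit only takes effect after a reboot). The `nc` probe against the `192.168.1.14:50010` address from the error is a quick way to confirm the port is reachable before and after:

```shell
# Check from the client whether the datanode's data-transfer port answers
nc -z -w 3 192.168.1.14 50010 && echo open || echo closed

# 1) Stop the firewall so port 50010 becomes reachable
/etc/init.d/iptables stop        # or: service iptables stop

# 2) Permanently disable SELinux
sed -i 's/^SELINUX=.*/SELINUX=disabled/' /etc/selinux/config
grep '^SELINUX=' /etc/selinux/config   # should now show SELINUX=disabled
```

Stopping iptables outright is the blunt fix used here; opening just the HDFS ports (50010 for data transfer, plus the namenode/datanode RPC ports) in the firewall rules would be the narrower alternative.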
轉(zhuǎn)載于:https://blog.51cto.com/gcjava/1426492