This article describes the Hadoop job runtime error java.lang.OutOfMemoryError: Java heap space and how to resolve it; hopefully it is a useful reference for developers who hit the same problem.
The full error output is as follows:
2015-12-04 01:21:46,557 FATAL [netty-server-worker-1] org.apache.giraph.graph.GraphTaskManager: uncaughtException: OverrideExceptionHandler on thread netty-server-worker-1, msg = Java heap space,
exiting...
java.lang.OutOfMemoryError: Java heap space
at java.lang.Throwable.getStackTraceElement(Native Method)
at java.lang.Throwable.getOurStackTrace(Throwable.java:827)
at java.lang.Throwable.getStackTrace(Throwable.java:816)
at io.netty.channel.DefaultChannelHandlerContext.inExceptionCaught(DefaultChannelHandlerContext.java:750)
at io.netty.channel.DefaultChannelHandlerContext.notifyHandlerException(DefaultChannelHandlerContext.java:736)
at io.netty.channel.DefaultChannelHandlerContext.invokeChannelRead(DefaultChannelHandlerContext.java:340)
at io.netty.channel.DefaultChannelHandlerContext.fireChannelRead(DefaultChannelHandlerContext.java:324)
at org.apache.giraph.comm.netty.handler.RequestDecoder.channelRead(RequestDecoder.java:100)
at io.netty.channel.DefaultChannelHandlerContext.invokeChannelRead(DefaultChannelHandlerContext.java:338)
at io.netty.channel.DefaultChannelHandlerContext.access$700(DefaultChannelHandlerContext.java:29)
at io.netty.channel.DefaultChannelHandlerContext$8.run(DefaultChannelHandlerContext.java:329)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:354)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:353)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:101)
at java.lang.Thread.run(Thread.java:745)
2015-12-04 01:21:47,210 ERROR [netty-server-worker-1] org.apache.giraph.worker.BspServiceWorker: unregisterHealth: Got failure, unregistering health on /_hadoopBsp/job_1449219833144_0001/_applicationAttemptsDir/0/_superstepDir/-1/_workerHealthyDir/master_3 on superstep -1
Cause: mapred.child.java.opts defaults to -Xmx200m in mapred-default.xml, which is too small for this job, so the task JVM runs out of heap space.
Fix: add the following to mapred-site.xml:
<property>
  <name>mapred.child.java.opts</name>
  <value>-Xmx1000m</value>
</property>
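Note that mapred.child.java.opts is the older (MRv1-era) property name. On Hadoop 2.x / YARN it still takes effect, but the map and reduce task heaps can also be set separately with mapreduce.map.java.opts and mapreduce.reduce.java.opts, and the YARN container sizes (mapreduce.map.memory.mb / mapreduce.reduce.memory.mb) must be large enough to hold the requested heap. A minimal sketch of equivalent mapred-site.xml entries (the 1000m/1536MB values are illustrative, not tuned for any particular job):

<!-- Heap size for map and reduce task JVMs -->
<property>
  <name>mapreduce.map.java.opts</name>
  <value>-Xmx1000m</value>
</property>
<property>
  <name>mapreduce.reduce.java.opts</name>
  <value>-Xmx1000m</value>
</property>
<!-- YARN container memory; must exceed the -Xmx value to leave room for non-heap overhead -->
<property>
  <name>mapreduce.map.memory.mb</name>
  <value>1536</value>
</property>
<property>
  <name>mapreduce.reduce.memory.mb</name>
  <value>1536</value>
</property>

If the job's main class uses ToolRunner/GenericOptionsParser, the setting can also be overridden per job on the command line instead of editing the cluster-wide config, for example (job.jar and MyJob are placeholders here):

hadoop jar job.jar MyJob -Dmapred.child.java.opts=-Xmx1000m <input> <output>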
That concludes this note on the Hadoop job error java.lang.OutOfMemoryError: Java heap space; hopefully it helps other developers who run into the same issue.