2017-12-07

I am trying to write data to Alluxio using MapReduce. I am writing about 11 GB of data from HDFS to Alluxio, and it works fine with the MUST_CACHE write type (the default value of alluxio.user.file.writetype.default). However, the write fails when I use CACHE_THROUGH.

When writing with CACHE_THROUGH (I also tried the shell command below), it fails with the following exception:

./alluxio fs -Dalluxio.user.file.writetype.default=CACHE_THROUGH copyFromLocal <hdfs_input_path> <alluxio_output_path> 

Error: alluxio.exception.status.UnavailableException: Channel to <hostname of one of the worker>:29999: <underfs path to file> (No such file or directory) 
      at alluxio.client.block.stream.NettyPacketWriter.close(NettyPacketWriter.java:263) 
      at com.google.common.io.Closer.close(Closer.java:206) 
      at alluxio.client.block.stream.BlockOutStream.close(BlockOutStream.java:166) 
      at alluxio.client.file.FileOutStream.close(FileOutStream.java:137) 
      at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.close(FSDataOutputStream.java:72) 
      at org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:106) 
      at org.apache.hadoop.mapreduce.lib.output.TextOutputFormat$LineRecordWriter.close(TextOutputFormat.java:111) 
      at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:679) 
      at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:802) 
      at org.apache.hadoop.mapred.MapTask.run(MapTask.java:346) 
      at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) 
      at java.security.AccessController.doPrivileged(Native Method) 
      at javax.security.auth.Subject.doAs(Subject.java:422) 
      at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1595) 
      at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 
    Caused by: alluxio.exception.status.NotFoundException: Channel to <hostname of one of the worker>:29999: <underfs path to file> (No such file or directory) 
      at alluxio.exception.status.AlluxioStatusException.from(AlluxioStatusException.java:153) 
      at alluxio.util.CommonUtils.unwrapResponseFrom(CommonUtils.java:548) 
      at alluxio.client.block.stream.NettyPacketWriter$PacketWriteHandler.channelRead(NettyPacketWriter.java:367) 
      at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333) 
      at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319) 
      at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86) 
      at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333) 
      at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319) 
      at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:254) 
      at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333) 
      at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319) 
      at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) 
      at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333) 
      at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319) 
      at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:163) 
      at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333) 
      at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319) 
      at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:787) 
      at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:130) 
      at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511) 
      at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468) 
      at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382) 
      at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354) 
      at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116) 
      at java.lang.Thread.run(Thread.java:748) 

I get the same error with the shell command above as well. Any help or pointers would be appreciated. Thanks.

Answer

The copyFromLocal shell command can only copy files that are available on the local filesystem. To copy a file from HDFS to Alluxio, you can first copy it to local disk and then write it to Alluxio:

hdfs dfs -get <hdfs_input_path> /tmp/tmp_file 
alluxio fs copyFromLocal /tmp/tmp_file <alluxio_output_path> 

To write to Alluxio directly from MapReduce, first update your core-site.xml to include:
<property> 
    <name>fs.alluxio.impl</name> 
    <value>alluxio.hadoop.FileSystem</value> 
    <description>The Alluxio FileSystem (Hadoop 1.x and 2.x)</description> 
</property> 
<property> 
    <name>fs.AbstractFileSystem.alluxio.impl</name> 
    <value>alluxio.hadoop.AlluxioFileSystem</value> 
    <description>The Alluxio AbstractFileSystem (Hadoop 2.x)</description> 
</property> 

With those properties in core-site.xml, add the Alluxio client jar to your application classpath with -libjars /path/to/client, and write directly from MapReduce to an alluxio://master_hostname:19998/alluxio_output_path URI. See the documentation for more details.
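Assuming a standard hadoop jar launch, the invocation might look like the sketch below. The jar name and main class are hypothetical placeholders, not values from the question:

```shell
# Sketch of launching a MapReduce job that writes to Alluxio with CACHE_THROUGH.
# my-app.jar and com.example.MyJob are assumed names; adjust to your application.
hadoop jar my-app.jar com.example.MyJob \
  -libjars /path/to/client \
  -Dalluxio.user.file.writetype.default=CACHE_THROUGH \
  <hdfs_input_path> \
  alluxio://master_hostname:19998/alluxio_output_path
```

Note that -libjars and -D are only honored when the driver goes through Hadoop's GenericOptionsParser (for example, by implementing the Tool interface and running via ToolRunner); otherwise the Alluxio client jar and write-type property must be supplied through the job configuration directly.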

Thanks. – user3811124
