Cannot connect to HDFS via Java
import java.io.IOException;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class App
{
    public static void main(String[] args) throws IOException
    {
        System.out.println("Hello World!");
        System.out.println("---143---");

        String localPath = "/home/user1/Documents/hdfspract.txt";
        String uri = "hdfs://172.16.32.139:9000";
        String hdfsDir = "hdfs://172.16.32.139:9000/fifo_tbl";

        // Connect to the NameNode and copy the local file into HDFS
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(uri), conf);
        fs.copyFromLocalFile(new Path(localPath), new Path(hdfsDir));
    }
}
When I try to execute the code above, it gives me the following error:
WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" org.apache.hadoop.fs.UnsupportedFileSystemException: No FileSystem for scheme "hdfs"
        at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:3332)
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3352)
        at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:124)
        at com.jambo.App.main(App.java:21)
Any other way to upload a file to Hadoop using the Java API would also be appreciated. Thanks. These are my pom.xml dependencies:
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>3.1.0</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-yarn-common</artifactId>
    <version>2.9.0</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-common</artifactId>
    <version>2.9.0</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-core</artifactId>
    <version>2.9.0</version>
</dependency>
<dependency>
    <groupId>jdk.tools</groupId>
    <artifactId>jdk.tools</artifactId>
    <version>1.8.0_161</version>
    <scope>system</scope>
    <systemPath>/usr/local/jdk1.8.0_161/lib/tools.jar</systemPath>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-core</artifactId>
    <version>1.2.1</version>
</dependency>