Configuring logging and log aggregation:
Step 1:
Step 2 — follow this guide:
https://blog.csdn.net/qq_43412289/article/details/89241271
Step 3 — pitfalls encountered are recorded here:
https://blog.csdn.net/qq_41515513/article/details/101873098
HDFS shell operations: hadoop fs ....
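A few common `hadoop fs` invocations as a sketch (the paths and file names are illustrative, and a running cluster is assumed):

```
hadoop fs -mkdir -p /user/demo        # create a directory, including parents
hadoop fs -put local.txt /user/demo   # upload a local file
hadoop fs -ls /user/demo              # list directory contents
hadoop fs -cat /user/demo/local.txt   # print a file to stdout
hadoop fs -get /user/demo/local.txt . # download a file
hadoop fs -rm -r /user/demo           # delete recursively
```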
Using Hadoop from a Maven project => 1. Set the HADOOP_HOME environment variable on the Mac; 2. Create a new Maven project; 3. Switch the mirror repository to Aliyun; 4. Import the dependencies and start developing.
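For step 3, the Aliyun mirror is normally configured in Maven's `settings.xml` (typically `~/.m2/settings.xml`); this fragment is a sketch of the usual setup:

```xml
<mirrors>
  <mirror>
    <id>aliyunmaven</id>
    <mirrorOf>central</mirrorOf>
    <name>Aliyun Maven mirror</name>
    <url>https://maven.aliyun.com/repository/public</url>
  </mirror>
</mirrors>
```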
The pom.xml is as follows:
<dependencies>
    <!-- https://mvnrepository.com/artifact/junit/junit -->
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>4.12</version>
        <scope>test</scope>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.logging.log4j/log4j-core -->
    <dependency>
        <groupId>org.apache.logging.log4j</groupId>
        <artifactId>log4j-core</artifactId>
        <version>2.13.1</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-common -->
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>3.2.1</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-client -->
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>3.2.1</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-hdfs -->
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-hdfs</artifactId>
        <version>3.2.1</version>
    </dependency>
</dependencies>
The test program is as follows:
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.junit.Test;

public class HdfsClient {
    @Test
    public void testMkdirs() throws Exception {
        // Connect to HDFS as user "liuli" and create a directory
        FileSystem fileSystem = FileSystem.get(
                new URI("hdfs://hadoop1:9000"), new Configuration(), "liuli");
        fileSystem.mkdirs(new Path("/javamkdir"));
        fileSystem.close();
    }
}
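Building on the same client, uploading a file and listing the directory can be sketched like this (the cluster address and user mirror the test above; the local path `/tmp/local.txt` is illustrative, and a running cluster is assumed):

```java
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.junit.Test;

public class HdfsUploadTest {
    @Test
    public void testUploadAndList() throws Exception {
        FileSystem fileSystem = FileSystem.get(
                new URI("hdfs://hadoop1:9000"), new Configuration(), "liuli");
        // Copy a local file into the directory created earlier
        fileSystem.copyFromLocalFile(new Path("/tmp/local.txt"),
                new Path("/javamkdir/local.txt"));
        // List the directory contents: path and size of each entry
        for (FileStatus status : fileSystem.listStatus(new Path("/javamkdir"))) {
            System.out.println(status.getPath() + "  " + status.getLen());
        }
        fileSystem.close();
    }
}
```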
Original-work statement: this article was published on the Tencent Cloud Developer Community with the author's authorization; reproduction without permission is prohibited.
For infringement concerns, contact cloudcommunity@tencent.com for removal.