Integrating HDFS with Spring Boot: file upload, download, delete and batch delete, with a Vue front end sending the requests for full front/back-end interaction


Part of the utility-class code is adapted from: https://blog.csdn.net/qq_27242695/article/details/119683823

Front-end result

(screenshot of the file-management page)

FileController (HDFS controller)

package com.jack.graduation.controller;

import cn.hutool.core.io.FileUtil;
import cn.hutool.core.util.IdUtil;
import cn.hutool.core.util.StrUtil;
import cn.hutool.crypto.SecureUtil;
import com.baomidou.mybatisplus.core.conditions.query.QueryWrapper;
import com.baomidou.mybatisplus.core.metadata.IPage;
import com.baomidou.mybatisplus.extension.plugins.pagination.Page;
import com.jack.graduation.bean.FileInfo;
import com.jack.graduation.common.Constants;
import com.jack.graduation.common.Result;
import com.jack.graduation.service.FileService;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.*;
import org.springframework.web.multipart.MultipartFile;
import javax.servlet.ServletOutputStream;
import javax.servlet.http.HttpServletResponse;
import java.io.IOException;
import java.net.URLEncoder;
import java.util.HashSet;
import java.util.List;

/**
 * @BelongsProject: graduation
 * @BelongsPackage: com.jack.graduation.controller
 * @Author: jack
 * @CreateTime: 2023-01-05  17:27
 * @Description: file upload/download controller
 * @Version: jdk1.8
 */

@RestController
@RequestMapping("/file")
public class FileController {
    @Autowired
    private FileService fileService;

    @PostMapping("/uploadToHdfs")
    public Result uploadToHdfs(@RequestParam MultipartFile file) throws Exception {
        String originalFilename = file.getOriginalFilename(); // original file name
        String type = FileUtil.extName(originalFilename); // file extension, without the dot
        if (!"csv".equals(type)) {
            return Result.error(Constants.CODE_400, "文件类型必须是csv逗号分隔文件!");
        }
        }
        // file size in bytes
        long size = file.getSize();
        // unique identifier for the stored file
        String uuid = IdUtil.fastSimpleUUID();
        // new file name: <uuid>.<ext>
        String newOriginalFilename = uuid + StrUtil.DOT + type;
        String md5 = SecureUtil.md5(file.getInputStream());
        // download URL exposed to the front end
        String url = "http://localhost:9090/file/" + newOriginalFilename;
        FileInfo fileInfo = new FileInfo(null, originalFilename, md5, uuid, type, size / 1024, url, null, null, null, null);
        // persist the metadata
        fileService.save(fileInfo);
        // store the file in HDFS
        boolean res = fileService.uploadHdfs(file, newOriginalFilename);
        if (res) {
            return Result.success("文件上传成功!");
        } else {
            return Result.error(Constants.CODE_500, "服务器错误!");
        }
    }

    /**
     * Download a raw or cleaned file from HDFS.
     *
     * @param newFileName unique file name (uuid + extension)
     * @param isEtl       flag: 0 = raw upload, otherwise cleaned file
     * @param response    servlet response the file is written to
     */
    @GetMapping("/{newFileName}/{isEtl}")
    public void downloadFile(@PathVariable String newFileName, @PathVariable Integer isEtl, HttpServletResponse response) {

        ServletOutputStream os = null;
        try {
            os = response.getOutputStream();
            // set the download headers before writing the body
            response.addHeader("Content-Disposition", "attachment;filename=" + URLEncoder.encode(newFileName, "UTF-8"));
            response.setContentType("application/octet-stream");
            byte[] resBytes = fileService.downloadHdfsFile(newFileName, isEtl);
            // write the file bytes to the response
            os.write(resBytes);
            os.flush();
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            if (os!=null){
                try {
                    os.close();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
    }

    @DeleteMapping("/deleteFile/{id}")
    public Result deleteFile(@PathVariable Integer id) {
        QueryWrapper<FileInfo> queryWrapper = new QueryWrapper<>();
        queryWrapper.eq("id", id);
        FileInfo fileInfo = fileService.getOne(queryWrapper);
        if (fileService.removeHdfsFile(fileInfo) && fileService.removeById(id)) {
            return Result.success("文件删除成功");
        } else {
            return Result.error(Constants.CODE_500, "hdfs文件删除失败");
        }
    }

    // batch delete files
    @PostMapping("/delFileBatch")
    public Result delFileBatch(@RequestBody List<Integer> ids) {
        QueryWrapper<FileInfo> queryWrapper = new QueryWrapper<>();
        queryWrapper.in("id", ids);
        List<FileInfo> fileInfoList = fileService.list(queryWrapper);
        HashSet<String> resSet = fileService.removeHdfsFileBatch(fileInfoList);
        if (resSet.isEmpty() && fileService.removeByIds(ids)) {
            return Result.success("批量删除文件成功");
        } else {
            return Result.error(Constants.CODE_500, resSet.toString());
        }
    }

    // look up a file by its MD5 digest (duplicate check)
    public FileInfo getFileByMd5(String md5) {
        QueryWrapper<FileInfo> queryWrapper = new QueryWrapper<>();
        queryWrapper.eq("file_md5", md5);
        return fileService.getOne(queryWrapper);
    }

    // paged query
    @RequestMapping("/page")
    public Result getPage(@RequestParam Integer pageNum,
                          @RequestParam Integer pageSize,
                          @RequestParam(defaultValue = "") String fileName,
                          @RequestParam(defaultValue = "") String id,
                          @RequestParam(defaultValue = "") String uuid
    ) {
        IPage<FileInfo> page = new Page<>(pageNum, pageSize);
        QueryWrapper<FileInfo> wrapper = new QueryWrapper<>();

        // filter by file name
        if (!"".equals(fileName)) {
            wrapper.eq("file_name", fileName);
        }
        // filter by id
        if (!"".equals(id)) {
            wrapper.and(wra -> wra.eq("id", Integer.valueOf(id)));
        }
        // filter by uuid
        if (!"".equals(uuid)) {
            wrapper.eq("uuid", uuid);
        }
        // newest first
        wrapper.orderByDesc("id");
        IPage<FileInfo> iPage = fileService.page(page, wrapper);
        return Result.success(iPage);
    }
}
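One detail worth noting in downloadFile: URLEncoder is designed for form data, so it encodes spaces in the Content-Disposition file name as "+", which browsers keep literally in the saved file name. A minimal stdlib-only sketch of the usual workaround; FileNameEncoder is a hypothetical helper, not part of the project:

```java
import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;

// Hypothetical helper for encoding a download file name for the
// Content-Disposition header. URLEncoder turns spaces into '+', so we swap
// '+' for "%20" to keep browsers from showing a literal plus sign.
public class FileNameEncoder {
    public static String encode(String fileName) throws UnsupportedEncodingException {
        return URLEncoder.encode(fileName, "UTF-8").replace("+", "%20");
    }

    public static void main(String[] args) throws UnsupportedEncodingException {
        System.out.println(encode("my file.csv"));
    }
}
```

The same one-liner can be dropped into downloadFile in place of the bare URLEncoder.encode call.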

FileService (HDFS file service interface)

package com.jack.graduation.service;

import com.baomidou.mybatisplus.extension.service.IService;
import com.jack.graduation.bean.FileInfo;
import org.springframework.web.multipart.MultipartFile;

import java.util.HashSet;
import java.util.List;

public interface FileService extends IService<FileInfo> {
    // upload a file to HDFS
    boolean uploadHdfs(MultipartFile file, String fileName);

    // delete a single file from HDFS
    boolean removeHdfsFile(FileInfo fileInfo);

    // read a file from HDFS as a byte array
    byte[] downloadHdfsFile(String newFileName, Integer isEtl);

    // batch delete; returns the names of files that failed to delete
    HashSet<String> removeHdfsFileBatch(List<FileInfo> fileInfoList);
}

FileServiceImpl (service implementation)

package com.jack.graduation.service.impl;

import cn.hutool.core.util.StrUtil;
import com.baomidou.mybatisplus.extension.service.impl.ServiceImpl;
import com.jack.graduation.bean.FileInfo;
import com.jack.graduation.common.Constants;
import com.jack.graduation.config.HdfsConfig;
import com.jack.graduation.exception.ServiceException;
import com.jack.graduation.mapper.FileMapper;
import com.jack.graduation.service.FileService;
import com.jack.graduation.utils.HdfsUtil;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.web.multipart.MultipartFile;

import java.io.IOException;
import java.util.HashSet;
import java.util.List;

/**
 * @BelongsProject: graduation
 * @BelongsPackage: com.jack.graduation.service.impl
 * @Author: jack
 * @CreateTime: 2023-01-05  18:48
 * @Description: HDFS file service implementation
 * @Version: jdk1.8
 */
@Service
public class FileServiceImpl extends ServiceImpl<FileMapper, FileInfo> implements FileService {


    @Autowired
    private HdfsUtil hdfsUtil;
    @Autowired
    private HdfsConfig hdfsConfig;

    /**
     * @param file     file received from the front end
     * @param fileName target file name in HDFS
     * @return true if the file exists in HDFS after the write
     */
    @Override
    public boolean uploadHdfs(MultipartFile file, String fileName) {
        try {
            hdfsUtil.createFile(hdfsConfig.getHdfsPath() + fileName, file, fileName);
            // verify the write by checking that the file now exists
            return hdfsUtil.existFile(hdfsConfig.getHdfsPath() + fileName);
        } catch (Exception e) {
            throw new ServiceException(Constants.CODE_500, "hdfs io error!");
        }
    }

    @Override
    public boolean removeHdfsFile(FileInfo fileInfo) {
        String filename = fileInfo.getUuid() + StrUtil.DOT + fileInfo.getFileType();
        try {
            // isEtl == 0: raw upload directory; otherwise the cleaned-file directory
            String basePath = fileInfo.getIsEtl() == 0 ? hdfsConfig.getHdfsPath() : hdfsConfig.getHdfsCleanPath();
            return hdfsUtil.deleteFile(basePath + filename);
        } catch (Exception e) {
            throw new ServiceException(Constants.CODE_500, "删除hdfs文件失败!");
        }
    }

    @Override
    public byte[] downloadHdfsFile(String newFileName, Integer isEtl) {
        FileSystem fs = null;
        FSDataInputStream fis = null;
        byte[] resBytes;
        try {
            fs = hdfsUtil.getFileSystem();
            // choose the raw or cleaned directory depending on the ETL flag
            String basePath = isEtl == 0 ? hdfsConfig.getHdfsPath() : hdfsConfig.getHdfsCleanPath();
            fis = fs.open(new Path(basePath + newFileName));
            resBytes = IOUtils.readFullyToByteArray(fis);
        } catch (Exception e) {
            throw new ServiceException(Constants.CODE_500, "hdfs文件下载失败!");
        } finally {
            IOUtils.closeStream(fis);
            if (fs != null) {
                try {
                    fs.close();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
        return resBytes;
    }

    @Override
    public HashSet<String> removeHdfsFileBatch(List<FileInfo> fileInfoList) {
        HashSet<String> resSet = new HashSet<>();
        for (FileInfo fileInfo : fileInfoList) {
            String filename = fileInfo.getUuid() + StrUtil.DOT + fileInfo.getFileType();
            try {
                // isEtl == 0: raw upload directory; otherwise the cleaned-file directory
                String basePath = fileInfo.getIsEtl() == 0 ? hdfsConfig.getHdfsPath() : hdfsConfig.getHdfsCleanPath();
                boolean res = hdfsUtil.deleteFile(basePath + filename);
                if (!res) {
                    resSet.add(fileInfo.getFileName() + "删除失败!");
                }
            } catch (Exception e) {
                throw new ServiceException(Constants.CODE_500, resSet.toString());
            }
        }
        return resSet;
    }

}
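For reference, the MD5 digest the controller obtains from Hutool's SecureUtil.md5 (and later matches in getFileByMd5) can be computed with the JDK alone. Md5Util below is a hypothetical stdlib-only equivalent, shown to make the deduplication key explicit:

```java
import java.io.IOException;
import java.io.InputStream;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

// Hypothetical stdlib equivalent of Hutool's SecureUtil.md5(InputStream):
// streams the input through MessageDigest and hex-encodes the 16-byte digest.
public class Md5Util {
    public static String md5Hex(InputStream in) throws IOException, NoSuchAlgorithmException {
        MessageDigest md = MessageDigest.getInstance("MD5");
        byte[] buf = new byte[8192];
        int n;
        while ((n = in.read(buf)) != -1) {
            md.update(buf, 0, n); // feed only the bytes actually read
        }
        StringBuilder sb = new StringBuilder();
        for (byte b : md.digest()) {
            sb.append(String.format("%02x", b)); // lower-case hex, zero-padded
        }
        return sb.toString();
    }
}
```

Hashing the stream rather than a byte[] keeps memory use flat for large CSV uploads.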

HdfsConfig (reads settings from application.yml)

package com.jack.graduation.config;

import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Configuration;

/**
 * @BelongsProject: graduation
 * @BelongsPackage: com.jack.graduation.config
 * @Author: jack
 * @CreateTime: 2023-01-03  01:38
 * @Description: HDFS configuration properties
 * @Version: jdk1.8
 */
@Configuration
@Data
@NoArgsConstructor
@AllArgsConstructor
public class HdfsConfig {
    // HDFS NameNode connection URL
    @Value("${nameNode.url}")
    private String nameNodeUrl;

    // user identity for HDFS operations
    @Value("${hdfs.userName}")
    private String hdfsUserName;

    // DataNode storage path
    @Value("${hdfs.dataNode}/")
    private String pdfDataNode;

    // HDFS directory for raw uploads
    @Value("${nameNode.hdfsPath}")
    private String hdfsPath;

    // HDFS directory for cleaned files
    @Value("${nameNode.hdfsCleanPath}")
    private String hdfsCleanPath;

}
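HdfsConfig binds the `@Value` placeholders above, so application.yml needs matching keys. A sketch of the expected layout; all hosts, ports, user names and paths here are placeholder assumptions to adjust for your cluster:

```yaml
# Example application.yml fragment for HdfsConfig (values are placeholders)
nameNode:
  url: hdfs://localhost:9000          # bound to ${nameNode.url}
  hdfsPath: /graduation/upload/       # bound to ${nameNode.hdfsPath}
  hdfsCleanPath: /graduation/clean/   # bound to ${nameNode.hdfsCleanPath}
hdfs:
  userName: hadoop                    # bound to ${hdfs.userName}
  dataNode: hdfs://localhost:9000/data  # bound to ${hdfs.dataNode}
```

Note that the two directory paths are concatenated with file names directly in the service layer, so they should end with a slash.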

HdfsUtil (HDFS utility class)

package com.jack.graduation.utils;

import com.alibaba.druid.util.StringUtils;
import com.jack.graduation.config.HdfsConfig;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.*;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;
import org.springframework.web.multipart.MultipartFile;

import java.io.*;
import java.net.URI;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

/**
 * @BelongsProject: graduation
 * @BelongsPackage: com.jack.graduation.utils
 * @Author: jack
 * @CreateTime: 2023-01-03  01:40
 * @Description: HDFS utility class
 * @Version: jdk1.8
 */
@Component
public class HdfsUtil {
    public static final Logger logger = LoggerFactory.getLogger(HdfsUtil.class);


    @Autowired
    private HdfsConfig hdfsConfig;

    /**
     * Build the HDFS client configuration. Precedence:
     * explicit Configuration settings > hdfs-site.xml on the classpath > hdfs-default.xml on the server
     *
     * @return the Hadoop client configuration
     */
    private Configuration getConfiguration() {
        Configuration configuration = new Configuration();
        configuration.set("dfs.support.append", "true");
        configuration.set("dfs.client.block.write.replace-datanode-on-failure.enable", "true");
        configuration.set("dfs.client.block.write.replace-datanode-on-failure.policy", "NEVER");
        return configuration;
    }


    /**
     * Get an HDFS FileSystem handle.
     *
     * The HDFS client operates under a user identity. By default it is taken
     * from the JVM environment (HADOOP_USER_NAME); it can also be passed
     * explicitly when constructing the FileSystem, as done here.
     */
    public FileSystem getFileSystem() throws Exception {
        FileSystem fileSystem = FileSystem.get(
                new URI(hdfsConfig.getNameNodeUrl()),
                getConfiguration(), hdfsConfig.getHdfsUserName());
        return fileSystem;
    }

    /**
     * Create a directory in HDFS.
     *
     * @param path directory path
     * @return true if the directory exists or was created
     * @throws Exception on I/O failure
     */
    public boolean mkdir(String path) throws Exception {
        FileSystem fs = null;
        boolean isOk = false;
        if (StringUtils.isEmpty(path)) {
            return false;
        }
        try {
            if (existFile(path)) {
                logger.warn("hdfs path already exists: {}", path);
                return true;
            }
            // target path
            fs = getFileSystem();
            Path srcPath = new Path(path);
            isOk = fs.mkdirs(srcPath);
            logger.info("hdfs mkdir success: {}", path);
        } catch (Exception e) {
            logger.error("hdfs mkdir error", e);
        } finally {
            if (fs != null) {
                fs.close();
            }
        }
        return isOk;
    }

    /**
     * Check whether an HDFS path exists.
     *
     * @param path HDFS path
     * @return true if it exists
     * @throws Exception on I/O failure
     */
    public boolean existFile(String path) throws Exception {
        Boolean isExists = false;
        FileSystem fs = null;
        if (StringUtils.isEmpty(path)) {
            return false;
        }
        try {
            fs = getFileSystem();
            Path srcPath = new Path(path);
            isExists = fs.exists(srcPath);
        } catch (Exception e) {
            logger.error("hdfs existFile error", e);
        } finally {
            if (fs != null) {
                fs.close();
            }
        }
        return isExists;
    }

    /**
     * Read status information for an HDFS directory.
     *
     * @param path HDFS directory
     * @return a list of {filePath, fileStatus} maps, or null
     * @throws Exception on I/O failure
     */
    public List<Map<String, Object>> readPathInfo(String path) throws Exception {
        if (StringUtils.isEmpty(path) || !existFile(path)) {
            return null;
        }
        FileSystem fs = null;
        try {
            fs = getFileSystem();
            // target path
            FileStatus[] statusList = fs.listStatus(new Path(path));
            List<Map<String, Object>> list = new ArrayList<>();
            if (null != statusList && statusList.length > 0) {
                for (FileStatus fileStatus : statusList) {
                    Map<String, Object> map = new HashMap<>();
                    map.put("filePath", fileStatus.getPath());
                    map.put("fileStatus", fileStatus.toString());
                    list.add(map);
                }
                return list;
            }
        } catch (Exception e) {
            logger.error("hdfs readPathInfo error", e);
        } finally {
            // close the FileSystem handle
            if (fs != null) {
                fs.close();
            }
        }
        return null;
    }

    /**
     * Create a file in HDFS from an uploaded MultipartFile.
     *
     * @param path target directory; the original file name is appended
     * @param file uploaded file
     * @throws Exception on I/O failure
     */
    public void createFile(String path, MultipartFile file) throws Exception {
        if (StringUtils.isEmpty(path) || null == file.getBytes()) {
            return;
        }
        FileSystem fs = null;
        FSDataOutputStream outputStream = null;
        try {
            fs = getFileSystem();
            String fileName = file.getOriginalFilename();
            // append the original file name under the given directory
            Path newPath = new Path(path + "/" + fileName);
            // open an output stream and write the upload
            outputStream = fs.create(newPath);
            outputStream.write(file.getBytes());
            outputStream.flush();
        } catch (Exception e) {
            throw e;
        } finally {
            if (outputStream != null) {
                outputStream.close();
            }

            if (fs != null) {
                fs.close();
            }
        }
    }

    // overload used by the service layer: path already contains the full
    // target file name, so the extra newFilename parameter is unused
    public void createFile(String path, MultipartFile file, String newFilename) throws Exception {
        if (StringUtils.isEmpty(path) || null == file.getBytes()) {
            return;
        }
        FileSystem fs = null;
        FSDataOutputStream outputStream = null;
        try {
            fs = getFileSystem();
            // the path is used as-is as the target file
            Path newPath = new Path(path);
            // open an output stream and write the upload
            outputStream = fs.create(newPath);
            outputStream.write(file.getBytes());
            outputStream.flush();
        } catch (Exception e) {
            throw e;
        } finally {
            if (outputStream != null) {
                outputStream.close();
            }

            if (fs != null) {
                fs.close();
            }
        }
    }


    /**
     * Write an HDFS file directly to an output stream.
     *
     * @param os   output stream to write to (e.g. the servlet response)
     * @param path remote HDFS file path
     * @throws Exception on I/O failure
     */
    public void writeOutputStreamFile(OutputStream os, String path) throws Exception {
        if (StringUtils.isEmpty(path)) {
            return;
        }
        FileSystem fs = null;
        FSDataInputStream inputStream = null;
        try {
            // target path
            Path srcPath = new Path(path);
            fs = getFileSystem();
            inputStream = fs.open(srcPath);
            fileDownload(os, new BufferedInputStream(inputStream));
        } catch (Exception e) {
            throw e;
        } finally {
            if (inputStream != null) {
                inputStream.close();
            }
            if (fs != null) {
                fs.close();
            }
        }
    }

    /**
     * Read the content of an HDFS text file.
     *
     * @param path HDFS file path
     * @return the file content as a single String (line breaks are dropped)
     * @throws Exception on I/O failure
     */
    public String readFile(String path) throws Exception {
        if (StringUtils.isEmpty(path)) {
            return null;
        }
        if (!existFile(path)) {
            return null;
        }
        FileSystem fs = null;
        FSDataInputStream inputStream = null;
        try {
            // 目标路径
            Path srcPath = new Path(path);
            fs = getFileSystem();
            inputStream = fs.open(srcPath);
            // use a Reader so multi-byte text (e.g. Chinese) decodes correctly
            BufferedReader reader = new BufferedReader(new InputStreamReader(inputStream));
            String lineTxt;
            StringBuilder sb = new StringBuilder();
            while ((lineTxt = reader.readLine()) != null) {
                sb.append(lineTxt);
            }
            return sb.toString();
        } finally {
            if (inputStream != null) {
                inputStream.close();
            }
            if (fs != null) {
                fs.close();
            }
        }
    }


    /**
     * Recursively list the files under an HDFS path.
     *
     * @param path HDFS directory
     * @return a list of {fileName, filePath} maps, or null
     * @throws Exception on I/O failure
     */
    public List<Map<String, String>> listFile(String path) throws Exception {
        if (StringUtils.isEmpty(path)) {
            return null;
        }
        if (!existFile(path)) {
            return null;
        }

        FileSystem fs = null;
        try {
            fs = getFileSystem();
            // target path
            Path srcPath = new Path(path);
            // recursively collect all files under the path
            RemoteIterator<LocatedFileStatus> filesList = fs.listFiles(srcPath, true);
            List<Map<String, String>> returnList = new ArrayList<>();
            while (filesList.hasNext()) {
                LocatedFileStatus next = filesList.next();
                String fileName = next.getPath().getName();
                Path filePath = next.getPath();
                Map<String, String> map = new HashMap<>();
                map.put("fileName", fileName);
                map.put("filePath", filePath.toString());
                returnList.add(map);
            }
            return returnList;
        } catch (Exception e) {
            logger.error("hdfs listFile error", e);
        } finally {
            if (fs != null) {
                fs.close();
            }
        }
        return null;
    }


    /**
     * Rename an HDFS file.
     *
     * @param oldName current path
     * @param newName new path
     * @return true on success
     * @throws Exception on I/O failure
     */
    public boolean renameFile(String oldName, String newName) throws Exception {
        if (StringUtils.isEmpty(oldName) || StringUtils.isEmpty(newName)) {
            return false;
        }
        FileSystem fs = null;
        Boolean isOk = false;
        try {
            fs = getFileSystem();
            // original path
            Path oldPath = new Path(oldName);
            // new path
            Path newPath = new Path(newName);
            isOk = fs.rename(oldPath, newPath);
            return isOk;
        } catch (Exception e) {
            logger.error("hdfs renameFile error", e);
        } finally {
            if (fs != null) {
                fs.close();
            }
        }
        return isOk;
    }


    /**
     * Delete an HDFS file.
     *
     * @param path HDFS file path
     * @return true if the file was deleted
     * @throws Exception on I/O failure
     */
    public boolean deleteFile(String path) throws Exception {
        if (StringUtils.isEmpty(path)) {
            return false;
        }

        FileSystem fs = null;
        Boolean isOk = false;
        try {
            if (!existFile(path)) {
                return false;
            }
            fs = getFileSystem();
            Path srcPath = new Path(path);
            // delete immediately; deleteOnExit would only remove the file
            // when the FileSystem handle is closed
            isOk = fs.delete(srcPath, true);
        } catch (Exception e) {
            logger.error("hdfs deleteFile error", e);
        } finally {
            if (fs != null) {
                fs.close();
            }
        }
        return isOk;
    }

    /**
     * Upload a local file to HDFS.
     *
     * @param path       local file path on this server
     * @param uploadPath target HDFS path
     * @throws Exception on I/O failure
     */
    public void uploadFile(String path, String uploadPath) throws Exception {
        if (StringUtils.isEmpty(path) || StringUtils.isEmpty(uploadPath)) {
            return;
        }
        FileSystem fs = null;
        try {
            fs = getFileSystem();
            // local source path
            Path clientPath = new Path(path);
            // HDFS target path
            Path serverPath = new Path(uploadPath);
            // first argument: whether to delete the local source (false keeps it)
            fs.copyFromLocalFile(false, clientPath, serverPath);
        } catch (Exception e) {
            logger.error("hdfs uploadFile error", e);
        } finally {
            if (fs != null) {
                fs.close();
            }
        }

    }


    /**
     * Download an HDFS file to the local file system.
     *
     * @param path         HDFS source path
     * @param downloadPath local target path
     * @throws Exception on I/O failure
     */
    public void downloadFile(String path, String downloadPath) throws Exception {
        if (StringUtils.isEmpty(path) || StringUtils.isEmpty(downloadPath)) {
            return;
        }
        FileSystem fs = null;
        try {
            fs = getFileSystem();
            // HDFS source path
            Path clientPath = new Path(path);
            // local target path
            Path serverPath = new Path(downloadPath);
            // first argument: whether to delete the HDFS source (false keeps it)
            fs.copyToLocalFile(false, clientPath, serverPath);
        } catch (Exception e) {
            logger.error("hdfs downloadFile error", e);
        } finally {
            if (fs != null) {
                fs.close();
            }
        }
    }

    /**
     * Copy a file within HDFS.
     * @param sourcePath
     * @param targetPath
     * @throws Exception
     */
    /*public void copyFile(String sourcePath, String targetPath) throws Exception {
        if (StringUtils.isEmpty(sourcePath) || StringUtils.isEmpty(targetPath)) {
            return;
        }
        FileSystem fs = getFileSystem();
        // source path
        Path oldPath = new Path(sourcePath);
        // target path
        Path newPath = new Path(targetPath);

        FSDataInputStream inputStream = null;
        FSDataOutputStream outputStream = null;
        try {
            inputStream = fs.open(oldPath);
            outputStream = fs.create(newPath);

            IOUtils.copyBytes(inputStream, outputStream, bufferSize, false);
        } finally {
            inputStream.close();
            outputStream.close();
            fs.close();
        }
    }

    *//**
     * Open an HDFS file and return its content as a byte array.
     * @param path
     * @return
     * @throws Exception
     *//*
    public byte[] openFileToBytes(String path) throws Exception {
        if (StringUtils.isEmpty(path)) {
            return null;
        }
        if (!existFile(path)) {
            return null;
        }
        FileSystem fs = getFileSystem();
        // target path
        Path srcPath = new Path(path);
        try {
            FSDataInputStream inputStream = fs.open(srcPath);
            return IOUtils.readFullyToByteArray(inputStream);
        } finally {
            fs.close();
        }
    }

    *//**
     * Open an HDFS file and deserialize it into a Java object.
     * @param path
     * @return
     * @throws Exception
     *//*
    public <T extends Object> T openFileToObject(String path, Class<T> clazz) throws Exception {
        if (StringUtils.isEmpty(path)) {
            return null;
        }
        if (!existFile(path)) {
            return null;
        }
        String jsonStr = readFile(path);
        return JsonUtil.fromObject(jsonStr, clazz);
    }

    *//**
     * Get the cluster block locations of an HDFS file.
     * @param path
     * @return
     * @throws Exception
     *//*
    public BlockLocation[] getFileBlockLocations(String path) throws Exception {
        if (StringUtils.isEmpty(path)) {
            return null;
        }
        if (!existFile(path)) {
            return null;
        }
        FileSystem fs = getFileSystem();
        // target path
        Path srcPath = new Path(path);
        FileStatus fileStatus = fs.getFileStatus(srcPath);
        return fs.getFileBlockLocations(fileStatus, 0, fileStatus.getLen());
    }
*/

    /**
     * Copy a buffered input stream to the response output stream.
     *
     * @param os  response output stream
     * @param bis buffered input stream read from HDFS
     */
    private void fileDownload(OutputStream os, BufferedInputStream bis) throws Exception {
        if (bis == null) {
            return;
        }
        try {
            byte[] buff = new byte[1024];
            int i = bis.read(buff);
            while (i != -1) {
                os.write(buff, 0, i);
                os.flush();
                i = bis.read(buff);
            }
        } catch (IOException e) {
            throw e;
        } finally {
            if (bis != null) {
                try {
                    bis.close();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
    }

}
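The manual 1 KB read/write loop in fileDownload predates InputStream.transferTo; where the project can assume Java 9 or later (the listings above target jdk1.8), the copy collapses to one call. A minimal stdlib-only sketch, with a ByteArrayInputStream standing in for the HDFS stream:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

// Sketch of the copy performed by fileDownload, using InputStream.transferTo
// (Java 9+) instead of a hand-rolled buffer loop.
public class StreamCopy {
    public static void copy(InputStream in, OutputStream out) throws IOException {
        try (InputStream is = in) {          // close the source even on failure
            is.transferTo(out);              // reads until EOF in internal chunks
            out.flush();
        }
    }

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        copy(new ByteArrayInputStream("hello hdfs".getBytes()), bos);
        System.out.println(bos.toString());
    }
}
```

The try-with-resources also closes the stream on error paths, which the original loop only handles in its finally block.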

Front-end Vue code:

<template>
    <div>
        <div class="searchForm">
            <el-input style="width: 200px" placeholder="请输入ID" v-model="id"
                      prefix-icon="el-icon-search"></el-input>
            <el-input class="ml-5" style="width: 200px" placeholder="请输入文件名" v-model="fileName"
                      prefix-icon="el-icon-search"></el-input>
            <el-input class="ml-5" style="width: 200px" placeholder="请输入uuid" v-model="uuid"
                      prefix-icon="el-icon-search"></el-input>
            <el-button class="ml-5" type="primary" @click="rigthId();getData()">搜索</el-button>
            <el-button class="ml-5" type="warning" @click="reset">重置</el-button>
        </div>

        <el-table :data="tableData" border stripe :header-cell-class-name="headerBg"
                  @selection-change="handleSelectionChange"
                  :header-cell-style="{'text-align':'center'}" :cell-style="{'text-align':'center'}">
            <el-table-column type="selection" width="40"></el-table-column>
            <el-table-column prop="id" label="ID" width="60"></el-table-column>
            <el-table-column prop="fileName" label="文件名" width="70"></el-table-column>
            <el-table-column prop="fileType" label="文件类型" width="70"></el-table-column>
            <el-table-column :formatter="formatIsEtl" prop="isEtl" label="是否清洗" width="70"></el-table-column>
            <el-table-column prop="uploadTime" label="上传时间" width="90"></el-table-column>
            <!--            <el-table-column prop="updateTime" label="修改时间" width="90"></el-table-column>-->
            <el-table-column prop="etlTime" label="清洗时间" width="90"></el-table-column>
            <el-table-column prop="fileSize" label="大小(kb)" width="70"></el-table-column>
            <el-table-column prop="uuid" label="uuid" width="245"></el-table-column>
            <el-table-column prop="url" label="下载地址" width="440"></el-table-column>
            <el-table-column label="操作" width="220" align="center">
                <template slot-scope="scope">
                    <el-button style="width: 60px;margin-left: 1px;text-align: center" type="success"
                               @click="cleanFile(scope.row)">清洗
                        <i
                                class="el-icon-coin"></i>
                    </el-button>
                    <el-button type="primary" @click="downloadFile(scope.row)"
                               style="width: 60px;margin-left: 1px;text-align: center">下载 <i
                            class="el-icon-caret-bottom"></i></el-button>
                    <el-popconfirm
                            class="ml-5"
                            confirm-button-text='确定'
                            cancel-button-text='我再想想'
                            icon="el-icon-info"
                            icon-color="red"
                            title="您确定删除吗?"
                            @confirm="delFile(scope.row.id)">
                        <el-button type="danger" slot="reference"
                                   style="width: 60px;margin-right: 1px;text-align: center">删除 <i
                                class="el-icon-remove-outline"></i>
                        </el-button>
                    </el-popconfirm>
                </template>
            </el-table-column>
        </el-table>
        <div style=" margin: 10px 0">
            <!-- action: full URL of the backend /file/uploadToHdfs endpoint -->
            <el-upload action="http://" :show-file-list="false"
                       :on-success="uploadToHdfsSuccess" style="display: inline-block;">
                <el-button type="primary" class="ml-5" style="width: 90px;">上传<i
                        class="el-icon-caret-top"></i>
                </el-button>
            </el-upload>
            <el-popconfirm
                    class="ml-5"
                    confirm-button-text='确定'
                    cancel-button-text='我再想想'
                    icon="el-icon-info"
                    icon-color="red"
                    title="您确定批量删除这些数据吗?"
                    @confirm="delFileBatch"
            >
                <el-button type="danger" slot="reference" class="ml-5" style="width: 90px;">批量删除 <i
                        class="el-icon-remove-outline"></i>
                </el-button>
            </el-popconfirm>

        </div>

        <div class="pagination">
            <el-pagination
                    @size-change="handleSizeChange"
                    @current-change="handleCurrentChange"
                    :current-page="pageNum"
                    :page-sizes="[9, 18, 27, 36]"
                    :page-size="pageSize"
                    layout="total, sizes, prev, pager, next, jumper"
                    :total="total">     <!--分页插件-->
            </el-pagination>
        </div>
    </div>
</template>

<script>
    export default {
        name: "file",
        data() {
            return {
                tableData: [],
                total: 0,
                pageNum: 1,
                fileName: '',
                pageSize: 9,
                dialogFormVisible: false,
                addfileForm: {},
                uuid: '',
                id: '',
                multipleSelection: [],
                headerBg: 'headerBg'
            }
        },
        created() {
            this.getData()
        },
        methods: {
            // validate that the id search box holds a number before querying
            rigthId() {
                if (isNaN(this.id)) {
                    this.$message({
                        type: "warning",
                        message: "请输入正确的数字id!"
                    })
                    this.reset()
                }
            },
            reset() {
                this.id = ''
                this.fileName = ''
                this.uuid = ''
                this.getData()
            },
            // query a page of file records from the backend
            getData() {
                this.request.get(
                    "/file/page", {
                        params: {
                            pageNum: this.pageNum,
                            pageSize: this.pageSize,
                            fileName: this.fileName,
                            uuid: this.uuid,
                            id: this.id
                        }
                    }
                ).then(res => {
                    console.log(res.data);
                    this.tableData = res.data.records
                    this.total = res.data.total
                })
            },
            // save a file record to the backend
            savefile() {
                this.request.post("/file/savefile", this.addfileForm).then(res => {
                    if (res.data) {
                        this.$message.success("添加成功")
                        this.dialogFormVisible = false
                        this.getData()
                    } else {
                        this.$message.error("添加失败")
                        this.dialogFormVisible = false
                    }
                })
            },
            // delete a single file by id
            delFile(id) {
                this.request.delete("/file/deleteFile/" + id).then(res => {
                    if (res.code === "200") {
                        this.$message.success(res.data)
                        this.getData()
                    } else {
                        this.$message.error(res.msg)
                        this.getData()
                    }
                })
            },
            // track the table rows selected for batch deletion
            handleSelectionChange(val) {
                console.log(val)
                this.multipleSelection = val
            },
            // batch delete the files selected in the table
            delFileBatch() {
                // map the selected row objects to an array of ids
                let ids = this.multipleSelection.map(row => row.id);
                // post the id array to the backend
                this.request.post("/file/delFileBatch", ids).then(res => {
                    if (res.code === "200") {
                        this.$message.success(res.data)
                        this.getData()
                    } else {
                        this.$message.error(res.msg)
                        this.getData()
                    }
                })
            },
            // pagination: re-query when the page size or page number changes
            handleSizeChange(pageSize) {
                console.log(pageSize)
                this.pageSize = pageSize
                this.getData()

            },
            handleCurrentChange(pageNum) {
                console.log(pageNum)
                this.pageNum = pageNum
                this.getData()
            },
            // map the backend isEtl flag (0/1) to the label shown in the table
            formatIsEtl(row) {
                return row.isEtl === 1 ? "已清洗" : "未清洗";
            },
            // callback after the upload request returns
            uploadToHdfsSuccess(res) {
                console.log(res);
                if (res.code === '200') {
                    this.getData()
                    this.$message.success("文件上传成功")
                } else {
                    this.getData()
                    this.$message.error(res.msg)
                }
            },

            downloadFile(row) {
                // open the download URL; the trailing isEtl flag tells the
                // backend whether to serve the cleaned file from HDFS
                window.open(row.url + "/" + row.isEtl)
            }
        }
    }
</script>
<style>
    .headerBg {
        background: #eee !important;
    }

    .searchForm {
        margin: 10px 0;
    }

    .pagination {
        padding: 10px 0;
        width: max-content;
        margin: 0 auto;
        position: fixed;
        bottom: 10px;
        left: 40%;
    }

</style>
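The component's table logic comes down to a few pure transformations: reducing the selected rows to the id array posted to `/file/delFileBatch`, mapping the backend `isEtl` flag to a display label, and building the download URL from `row.url` and `row.isEtl`. A minimal standalone sketch of those helpers (the function names `toIdList`, `formatIsEtl`, and `buildDownloadUrl` are illustrative, not part of the original component):

```javascript
// Reduce el-table's selection (an array of row objects) to the id array
// that delFileBatch posts to /file/delFileBatch.
function toIdList(selection) {
    return selection.map(row => row.id);
}

// Map the backend's 0/1 isEtl flag to the label shown in the table column.
function formatIsEtl(row) {
    return row.isEtl === 1 ? "已清洗" : "未清洗";
}

// Build the URL opened by downloadFile: the isEtl flag is appended so the
// backend knows whether to serve the cleaned file from HDFS.
function buildDownloadUrl(row) {
    return row.url + "/" + row.isEtl;
}
```

Keeping these as pure functions makes them trivial to unit-test outside the Vue instance, since none of them touch `this` or the network.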
