Upload Multer File to Google Cloud Bucket

In this tutorial, I will show you how to upload files to Google Cloud Storage (GCS) with a Node.js example.

This Node.js App works with:
– Angular 8 Client / Angular 10 Client / Angular 11 Client / Angular 12
– Angular Material 12
– Vue Client / Vuetify Client
– React Client / React Hooks Client
– Material UI Client
– Axios Client

Related Posts:
– Node.js Express File Upload (to static folder) example
– How to upload multiple files in Node.js
– Upload/store images in MySQL using Node.js, Express & Multer
– How to upload/store images in MongoDB using Node.js, Express & Multer
– Node.js Rest APIs example with Express, Sequelize & MySQL

Contents

  • Node.js upload File to Google Cloud Storage example
  • Technology
  • Project Structure
  • Setup Node.js File Upload to GCS project
  • Setup Google Cloud Service Bucket with Credentials
  • Create middleware for processing file
  • Restrict file size before uploading to GCS
  • Create Controller for GCS file upload/download
    • GCS Upload File API
    • GCS Read Files and Download API
  • Define Route for uploading file
  • Create Express app server
  • Run & Check
  • Conclusion
  • Further Reading
  • Source Code

Node.js upload File to Google Cloud Storage example

Our Node.js Application will provide APIs for:

  • uploading File to Google Cloud Storage bucket (restricted file size: 2MB)
  • downloading File from server with the link
  • getting the list of Files' information (file name & url)

Here are the APIs to be exported:

Methods   Urls                  Actions
POST      /upload               upload a File
GET       /files                get List of Files (name & url)
GET       /files/[filename]     download a File

– Upload a File:

[image: google-cloud-storage-nodejs-upload-file-example-post]

– This is the bucket that stores all uploaded files:

[image: google-cloud-storage-nodejs-upload-file-example-bucket]

– If we get the list of files, the Node.js Rest APIs will return:

[image: google-cloud-storage-nodejs-upload-file-example-read-files]
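
For reference, a response body from GET /files has roughly this shape (the file names and mediaLink URLs below are made-up placeholders, not real output from the tutorial's bucket):

    [
      {
        "name": "1.png",
        "url": "https://storage.googleapis.com/download/storage/v1/b/bezkoder-e-commerce/o/1.png?generation=...&alt=media"
      },
      {
        "name": "2.jpg",
        "url": "https://storage.googleapis.com/download/storage/v1/b/bezkoder-e-commerce/o/2.jpg?generation=...&alt=media"
      }
    ]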

– Upload a File with size larger than max file size (2MB):

[image: google-cloud-storage-nodejs-upload-file-example-max-file-size]

– Download any file from one of the paths above, or via the API.
For example: http://localhost:8080/files/1.png.

[image: google-cloud-storage-nodejs-upload-file-example-download]

Technology

  • @google-cloud/storage 5.8.5
  • express 4.17.1
  • multer 1.4.2
  • cors 2.8.5

Project Structure

This is the project directory that we're gonna build:

[image: google-cloud-storage-nodejs-upload-file-example-project-structure]

google-cloud-key.json contains credentials for working with Google Cloud Storage.
middleware/upload.js: initializes the Multer Storage engine and defines a middleware function to process the file before uploading it to Google Cloud Storage.
file.controller.js exports the Rest APIs: POST a file, GET all files' information, download a File with url.
routes/index.js: defines routes for the endpoints that are called from the HTTP Client, using the controller to handle requests.
server.js: initializes routes, runs Express app.

Setup Node.js File Upload to GCS project

Open command prompt, change current directory to the root folder of our project.
Install the GCS, Express, Multer and CORS modules with the following command:

    npm install @google-cloud/storage express multer cors

The package.json file will look like this:

    {
      "name": "nodejs-upload-file-google-cloud-storage",
      "version": "1.0.0",
      "description": "Node.js Upload file to Google Cloud Storage example",
      "main": "server.js",
      "scripts": {
        "test": "echo \"Error: no test specified\" && exit 1"
      },
      "keywords": [
        "nodejs",
        "upload",
        "file",
        "google cloud storage",
        "rest api"
      ],
      "author": "bezkoder",
      "license": "ISC",
      "dependencies": {
        "@google-cloud/storage": "^5.8.5",
        "cors": "^2.8.5",
        "express": "^4.17.1",
        "multer": "^1.4.2"
      }
    }

Setup Google Cloud Service Bucket with Credentials

First you need to follow the step-by-step instructions in this post for creating a new Bucket named bezkoder-e-commerce with a Credentials JSON file.

Once you get the file, rename it to google-cloud-key.json and put it into the root folder of the Node.js project.
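
Before wiring the key file into the app, you may want to sanity-check it. Here is a minimal sketch (the file name google-cloud-key.json and bucket name bezkoder-e-commerce are the ones used in this tutorial; adjust them to your own setup):

    // check-gcs.js - quick check that the credentials and bucket are reachable
    const { Storage } = require("@google-cloud/storage");

    const storage = new Storage({ keyFilename: "google-cloud-key.json" });

    storage
      .bucket("bezkoder-e-commerce")
      .exists()
      .then(([exists]) => console.log("Bucket found:", exists))
      .catch((err) => console.error("Credential or bucket problem:", err.message));

Run it with node check-gcs.js; if it prints Bucket found: true, the key file and bucket name line up.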

Create middleware for processing file

The middleware will use Multer for handling multipart/form-data along with uploading files.

Inside middleware folder, create upload.js file:

    const util = require("util");
    const Multer = require("multer");

    let processFile = Multer({
      storage: Multer.memoryStorage()
    }).single("file");

    let processFileMiddleware = util.promisify(processFile);
    module.exports = processFileMiddleware;

In the code above, we've done these steps:
– First, we import the multer module.
– Next, we configure multer to use the Memory Storage engine.

util.promisify() makes the exported middleware object usable with async/await.
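
If util.promisify() is new to you, here is a tiny standalone sketch (not part of the project) showing how it turns a callback-style function, like the Multer middleware above, into one that can be awaited:

    const util = require("util");

    // A callback-style function: the result is delivered through an error-first callback,
    // similar in spirit to the (req, res, callback) signature Multer's middleware uses.
    function double(value, callback) {
      setTimeout(() => callback(null, value * 2), 10);
    }

    const doubleAsync = util.promisify(double);

    (async () => {
      console.log(await doubleAsync(21)); // prints 42
    })();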

Restrict file size before uploading to GCS

With the multer API, we can add limits: { fileSize: maxSize } to the object passed to multer() to restrict the file size.

    const util = require("util");
    const Multer = require("multer");

    const maxSize = 2 * 1024 * 1024;

    let processFile = Multer({
      storage: Multer.memoryStorage(),
      limits: { fileSize: maxSize },
    }).single("file");

    let processFileMiddleware = util.promisify(processFile);
    module.exports = processFileMiddleware;

Create Controller for GCS file upload/download

In controller folder, create file.controller.js:

    const upload = async (req, res) => {
      ...
    };

    const getListFiles = async (req, res) => {
      ...
    };

    const download = async (req, res) => {
      ...
    };

    module.exports = {
      upload,
      getListFiles,
      download,
    };

GCS Upload File API

For the File Upload method, we will export an upload() function that will:

  • use the middleware function to process the file
  • use the Storage Bucket object for the file upload
  • catch the error
  • return a response with a message
    const processFile = require("../middleware/upload");
    const { format } = require("util");
    const { Storage } = require("@google-cloud/storage");

    // Instantiate a storage client with credentials
    const storage = new Storage({ keyFilename: "google-cloud-key.json" });
    const bucket = storage.bucket("bezkoder-e-commerce");

    const upload = async (req, res) => {
      try {
        await processFile(req, res);

        if (!req.file) {
          return res.status(400).send({ message: "Please upload a file!" });
        }

        // Create a new blob in the bucket and upload the file data.
        const blob = bucket.file(req.file.originalname);
        const blobStream = blob.createWriteStream({
          resumable: false,
        });

        blobStream.on("error", (err) => {
          res.status(500).send({ message: err.message });
        });

        blobStream.on("finish", async (data) => {
          // Create URL for direct file access via HTTP.
          const publicUrl = format(
            `https://storage.googleapis.com/${bucket.name}/${blob.name}`
          );

          try {
            // Make the file public
            await bucket.file(req.file.originalname).makePublic();
          } catch {
            return res.status(500).send({
              message:
                `Uploaded the file successfully: ${req.file.originalname}, but public access is denied!`,
              url: publicUrl,
            });
          }

          res.status(200).send({
            message: "Uploaded the file successfully: " + req.file.originalname,
            url: publicUrl,
          });
        });

        blobStream.end(req.file.buffer);
      } catch (err) {
        res.status(500).send({
          message: `Could not upload the file: ${req.file.originalname}. ${err}`,
        });
      }
    };

    module.exports = {
      upload,
      ...
    };

– We call the middleware function processFile() first.
– If the HTTP request doesn't include a file, send a 400 status in the response.
– We also catch the error and send a 500 status with the error message.

So, how do we handle the case where the user uploads a file exceeding the size limit?
We check the error code (LIMIT_FILE_SIZE) in the catch() block:

    const upload = async (req, res) => {
      try {
        await processFile(req, res);
        ...
      } catch (err) {
        if (err.code == "LIMIT_FILE_SIZE") {
          return res.status(500).send({
            message: "File size cannot be larger than 2MB!",
          });
        }

        res.status(500).send({
          message: `Could not upload the file: ${req.file.originalname}. ${err}`,
        });
      }
    };

GCS Read Files and Download API

For File Information and Download:

  • getListFiles(): read all files in the GCS bucket, return the list of files' information (name, url)
  • download(): receives the file name as an input parameter, gets the Media Link from the file's metadata, then redirects the browser to the Media Link.
    ...
    const { Storage } = require("@google-cloud/storage");

    const storage = new Storage({ keyFilename: "google-cloud-key.json" });
    const bucket = storage.bucket("bezkoder-e-commerce");

    const getListFiles = async (req, res) => {
      try {
        const [files] = await bucket.getFiles();
        let fileInfos = [];

        files.forEach((file) => {
          fileInfos.push({
            name: file.name,
            url: file.metadata.mediaLink,
          });
        });

        res.status(200).send(fileInfos);
      } catch (err) {
        console.log(err);

        res.status(500).send({
          message: "Unable to read list of files!",
        });
      }
    };

    const download = async (req, res) => {
      try {
        const [metaData] = await bucket.file(req.params.name).getMetadata();
        res.redirect(metaData.mediaLink);
      } catch (err) {
        res.status(500).send({
          message: "Could not download the file. " + err,
        });
      }
    };

    module.exports = {
      ...
      getListFiles,
      download,
    };

Define Route for uploading file

When a client sends HTTP requests, we need to determine how the server will respond by setting up the routes.

There are 3 routes with corresponding controller methods:

  • POST /upload: upload()
  • GET /files: getListFiles()
  • GET /files/[fileName]: download()

Create an index.js file inside the routes folder with content like this:

    const express = require("express");
    const router = express.Router();
    const controller = require("../controller/file.controller");

    let routes = (app) => {
      router.post("/upload", controller.upload);
      router.get("/files", controller.getListFiles);
      router.get("/files/:name", controller.download);

      app.use(router);
    };

    module.exports = routes;

You can see that we use the controller from file.controller.js.

Create Express app server

Finally, we create an Express server in server.js:

    const cors = require("cors");
    const express = require("express");
    const app = express();

    let corsOptions = {
      origin: "http://localhost:8081",
    };
    app.use(cors(corsOptions));

    const initRoutes = require("./src/routes");

    app.use(express.urlencoded({ extended: true }));
    initRoutes(app);

    const port = 8080;
    app.listen(port, () => {
      console.log(`Running at localhost:${port}`);
    });

What we do here:
– import the express and cors modules:

  • Express is for building the Rest APIs
  • cors provides Express middleware to enable CORS with various options.

– create an Express app, then add the cors middleware using the app.use() method. Notice that we set origin: http://localhost:8081.
– listen on port 8080 for incoming requests.

Run & Check

From the project root folder, run this command: node server.js.
Then use Postman to make HTTP POST/GET requests.
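
If you prefer a script over Postman, here is a minimal sketch of a test client (assuming Node 18+ for the built-in fetch/FormData/Blob, and a local file named 1.png next to the script; both are assumptions, not part of the tutorial):

    // test-client.js - exercise the upload and list endpoints
    const fs = require("fs");

    (async () => {
      // POST /upload with multipart/form-data; field name "file" matches the middleware
      const form = new FormData();
      form.append("file", new Blob([fs.readFileSync("./1.png")]), "1.png");

      const uploadRes = await fetch("http://localhost:8080/upload", {
        method: "POST",
        body: form,
      });
      console.log(await uploadRes.json());

      // GET /files to list the uploaded files
      const listRes = await fetch("http://localhost:8080/files");
      console.log(await listRes.json());
    })();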

Conclusion

Today we've learned how to build a Node.js File Upload to Google Cloud Storage example using Express for the Rest API and Multer as the file-processing middleware. You also know how to restrict the file size and catch the Multer file size limit error.

The following tutorials explain how to build Front-end Apps to work with our Node.js Express Server:
– Angular 8 Client / Angular 10 Client / Angular 11 Client / Angular 12
– Angular Material 12
– Vue Client / Vuetify Client
– React Client / React Hooks Client
– Material UI Client
– React Image Upload with Preview
– Axios Client

If you want to upload files into the server static folder, please visit:
Node.js Express File Upload Rest API example using Multer

Happy Learning! See you again.

Further Reading

  • https://cloud.google.com/storage/docs/how-to
  • https://www.npmjs.com/package/express
  • https://www.npmjs.com/package/multer

Source Code

You can find the complete source code for this tutorial on Github.


Source: https://www.bezkoder.com/google-cloud-storage-nodejs-upload-file/
