Uploading files to AWS S3 Bucket using Spring Boot | ORIL

Account Configuration

To start using an S3 bucket you need to create an account on the AWS website. The registration procedure is straightforward, but you will have to verify your phone number and enter your credit card details (don’t worry, your card will not be charged unless you use paid services beyond the free tier).

After creating the account, we need to create an S3 bucket. Go to Services -> S3, or enter ‘S3’ in the search field.

Then press the ‘Create bucket’ button.

Enter your bucket name (it must be globally unique) and choose the region closest to you. Press the ‘Create’ button.

NOTE: Amazon’s free tier gives you 5 GB of storage, 20,000 GET requests, and 2,000 PUT requests for the first year. After reaching these limits you will be charged for usage.

Now your bucket is created, but we need to grant users permission to access it. It is not secure to hand out your root user’s access keys to your development team or anyone else. Instead, we will create a new IAM user and give them permission to use only the S3 bucket.

AWS Identity and Access Management (IAM) is a web service that helps you securely control access to AWS resources.

Let’s create such a user. Go to Services -> IAM. In the navigation panel, choose Users and then press the ‘Create user’ button.

Enter the user’s name and press the ‘Next’ button.

Then we need to set the permissions for this user.
Select ‘Attach policies directly’. In the search field enter ‘s3full’ and choose AmazonS3FullAccess.

Then press ‘Next’ and ‘Create user’. If you did everything right, you should see the new user in your list of users.

The next step is to create an access key for this user. Open the user’s details by clicking on the user name and click the ‘Create access key’ link.

From the list of use cases, choose the one that fits your needs. In this example we will choose ‘Application running outside AWS’.

Then press ‘Next’, add an optional description tag if needed, and press ‘Create access key’.

On the next screen you will see your access key and secret access key. Save those values or download the .csv file, because you will not be able to view the secret key again later.

Our S3 bucket configuration is done, so let’s proceed to the Spring Boot application.

Spring Boot Part

Let’s create a Spring Boot project and add the AWS SDK dependency:

<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk</artifactId>
    <version>1.12.581</version>
</dependency>

Now let’s add the S3 bucket properties to our application.yml file:

amazonProperties:
  accessKey: XXXXXXXXXXXXXXXXX
  secretKey: XXXXXXXXXXXXXXXXXXXXXXXXXX
  bucketName: your-bucket-name

It’s time to create our RestController with four endpoints:

“/files/upload” – upload a file
“/files/{fileName}/base64” – get a file as a base64 string by filename
“/files/{fileName}/download” – download a file by filename
“/files/{fileName:.+}” – delete a file by filename

@RestController
public class FileController {

    private final FileManagerService fileManager;

    @Autowired
    FileController(FileManagerService fileManager) {
        this.fileManager = fileManager;
    }

    @PostMapping("/files/upload")
    public ResponseEntity<SavedFileDTO> uploadFile(@RequestBody FileDTO fileDTO) {
        return ResponseEntity.ok(fileManager.uploadFile(fileDTO));
    }

    @GetMapping("/files/{fileName}/base64")
    public ResponseEntity<String> getFileInBase64(@PathVariable("fileName") String fileName) {
        return ResponseEntity.ok(fileManager.getFileInBase64(fileName));
    }

    @GetMapping("/files/{fileName}/download")
    public ResponseEntity<Resource> downloadFile(@PathVariable("fileName") String fileName) {
        byte[] content = fileManager.getFileAsBytes(fileName);
        return ResponseEntity.ok()
                .header(HttpHeaders.CONTENT_TYPE, getFileMediaType(fileName))
                // Content-Disposition tells the browser to download the file instead of rendering it
                .header(HttpHeaders.CONTENT_DISPOSITION, "attachment; filename=\"" + fileName + "\"")
                .header(HttpHeaders.CONTENT_LENGTH, String.valueOf(content.length))
                .body(new ByteArrayResource(content));
    }

    @DeleteMapping("/files/{fileName:.+}")
    public ResponseEntity<Void> deleteFile(@PathVariable("fileName") String fileName) {
        fileManager.deleteFile(fileName);
        return ResponseEntity.noContent().build();
    }

    private String getFileMediaType(String fileName) {
        String fileExtension = fileName.substring(fileName.lastIndexOf('.') + 1);
        switch (fileExtension.toLowerCase()) {
            case "pdf":
                return MediaType.APPLICATION_PDF_VALUE;
            case "png":
                return MediaType.IMAGE_PNG_VALUE;
            case "jpg":
            case "jpeg":
                return MediaType.IMAGE_JPEG_VALUE;
            default:
                return MediaType.TEXT_PLAIN_VALUE;
        }
    }
}

The upload method accepts a FileDTO as the request body. Here is how this class looks – just two fields, fileName and base64, because we will send the file to this endpoint as a base64 string.

public class FileDTO {

    private String fileName;
    private String base64;

    public String getFileName() {
        return fileName;
    }

    public void setFileName(String fileName) {
        this.fileName = fileName;
    }

    public String getBase64() {
        return base64;
    }

    public void setBase64(String base64) {
        this.base64 = base64;
    }
}
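
The controller also returns a SavedFileDTO, which the original snippets never show. Based on the fields used later in the service (generatedFileName, originalFileName, uploadedAt set with java.util.Date) and the JSON response in the testing section, a minimal sketch could look like this:

public class SavedFileDTO {

    private String originalFileName;
    private String generatedFileName;
    // java.util.Date, matching new Date() used in the service
    private Date uploadedAt;

    public String getOriginalFileName() {
        return originalFileName;
    }

    public void setOriginalFileName(String originalFileName) {
        this.originalFileName = originalFileName;
    }

    public String getGeneratedFileName() {
        return generatedFileName;
    }

    public void setGeneratedFileName(String generatedFileName) {
        this.generatedFileName = generatedFileName;
    }

    public Date getUploadedAt() {
        return uploadedAt;
    }

    public void setUploadedAt(Date uploadedAt) {
        this.uploadedAt = uploadedAt;
    }
}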

This code won’t compile yet because we don’t have the AmazonClient and FileManagerService classes, so let’s create them and add all the methods we need.

AmazonClient will have the following fields and methods:

@Component
public class AmazonClient {

    private final Logger logger = LoggerFactory.getLogger(AmazonClient.class);

    private AmazonS3 s3client;

    @Value("${amazonProperties.bucketName}")
    private String bucketName;

    @Value("${amazonProperties.accessKey}")
    private String accessKey;

    @Value("${amazonProperties.secretKey}")
    private String secretKey;

    @PostConstruct
    private void initializeAmazonClient() {
        AWSCredentials credentials = new BasicAWSCredentials(accessKey, secretKey);
        this.s3client = AmazonS3ClientBuilder.standard()
                .withCredentials(new AWSStaticCredentialsProvider(credentials))
                .withRegion(Regions.US_EAST_1)
                .build();
        createBucket();
    }

    public void uploadFileToBucket(String fileName, File file, String folderToUpload) {
        logger.info("Uploading file {} to {}", fileName, folderToUpload);
        s3client.putObject(new PutObjectRequest(bucketName, folderToUpload + "/" + fileName, file));
    }

    public void deleteFileFromBucket(String filename, String folderName) {
        logger.info("Deleting file {} from {}", filename, folderName);
        DeleteObjectRequest delObjReq = new DeleteObjectRequest(bucketName, folderName + "/" + filename);
        s3client.deleteObject(delObjReq);
    }

    public void deleteMultipleFilesFromBucket(List<String> files) {
        DeleteObjectsRequest delObjReq = new DeleteObjectsRequest(bucketName)
                .withKeys(files.toArray(new String[0]));
        logger.info("Deleting files...");
        s3client.deleteObjects(delObjReq);
    }

    public File getFileFromBucket(String filename, String folderName) {
        InputStream inputStream = getFileInputStream(filename, folderName);
        File file = new File(filename);
        try {
            FileUtils.copyInputStreamToFile(inputStream, file);
        } catch (IOException e) {
            logger.error(ExceptionUtils.getStackTrace(e));
        }
        return file;
    }

    public InputStream getFileInputStream(String filename, String folderName) {
        S3Object s3object = s3client.getObject(bucketName, folderName + "/" + filename);
        return s3object.getObjectContent();
    }

    private void createBucket() {
        if (s3client.doesBucketExistV2(bucketName)) {
            logger.info("Bucket {} already exists", bucketName);
            return;
        }
        try {
            logger.info("Creating bucket {}", bucketName);
            s3client.createBucket(bucketName);
        } catch (Exception e) {
            logger.error(ExceptionUtils.getStackTrace(e));
        }
    }
}

AmazonS3 is the client interface from the AWS SDK dependency. All other fields simply mirror the values from our application.yml file; the @Value annotation binds application properties directly to class fields during application initialization.

We added the @PostConstruct method initializeAmazonClient() to pass the Amazon credentials to the S3 client. The @PostConstruct annotation makes this method run after the constructor has been called, because fields annotated with @Value are still null inside the constructor. The createBucket() method is then called to create the S3 bucket if it doesn’t exist yet.
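
To illustrate why the client is built in a @PostConstruct method rather than in the constructor, here is a small standalone sketch (the class name is only for illustration; the property key comes from our application.yml):

@Component
public class ValueTimingDemo {

    @Value("${amazonProperties.bucketName}")
    private String bucketName;

    public ValueTimingDemo() {
        // Still null here: Spring injects @Value fields only after the constructor has run
        System.out.println("constructor: bucketName = " + bucketName);
    }

    @PostConstruct
    public void init() {
        // By now the @Value fields are populated, so it is safe to build the S3 client here
        System.out.println("postConstruct: bucketName = " + bucketName);
    }
}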

The remaining methods of AmazonClient handle uploading, deleting, and getting files from the S3 bucket.

Now let’s see what fields and methods are available in our FileManagerService class.

private static final String UPLOAD_FOLDER_NAME = "public-files";

private final AmazonClient amazonClient;

These fields are just the folder name where our files will be stored and the AmazonClient we created earlier.
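
The article shows the service piece by piece; assembled, the class skeleton might look like the sketch below (the @Service annotation and constructor injection are assumptions, since the original only lists the fields). The uploadFile method that follows then does the actual work.

@Service // assumption: any Spring bean annotation would do here
public class FileManagerService {

    private static final String UPLOAD_FOLDER_NAME = "public-files";

    private final AmazonClient amazonClient;

    @Autowired
    public FileManagerService(AmazonClient amazonClient) {
        this.amazonClient = amazonClient;
    }

    // uploadFile, getFileInBase64, getFileAsBytes and deleteFile go here
}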

public SavedFileDTO uploadFile(FileDTO fileDTO) {
    SavedFileDTO savedFile = new SavedFileDTO();
    savedFile.setGeneratedFileName(generateFileName(fileDTO));
    savedFile.setOriginalFileName(fileDTO.getFileName());
    File file = convertBase64ToFile(fileDTO.getBase64(), fileDTO.getFileName());
    this.amazonClient.uploadFileToBucket(savedFile.getGeneratedFileName(), file, UPLOAD_FOLDER_NAME);
    savedFile.setUploadedAt(new Date());
    try {
        FileUtils.forceDelete(file);
    } catch (IOException e) {
        throw new RuntimeException(e);
    }
    return savedFile;
}

The method above generates a unique filename, converts the base64 string to a File, and uploads it to the bucket. It delegates to two helper methods for name generation and conversion:

private String generateFileName(FileDTO fileDTO) {
    String name = fileDTO.getFileName().replaceAll("[^a-zA-Z0-9.-]", "_");
    return new Date().getTime() + "_" + name;
}

private File convertBase64ToFile(String base64Content, String filename) {
    byte[] decodedContent = Base64.getDecoder().decode(base64Content.getBytes(StandardCharsets.UTF_8));
    return bytesToFile(decodedContent, filename);
}

private File bytesToFile(byte[] content, String fileName) {
    File file = new File(fileName);
    try (FileOutputStream fos = new FileOutputStream(file)) {
        fos.write(content);
    } catch (IOException e) {
        return null;
    }
    return file;
}

The next two methods show how to get a file as a base64 string or as raw bytes:

public String getFileInBase64(String fileName) {
    File file = amazonClient.getFileFromBucket(fileName, UPLOAD_FOLDER_NAME);
    try {
        return Base64.getEncoder().encodeToString(FileUtils.readFileToByteArray(file));
    } catch (IOException e) {
        e.printStackTrace();
    }
    return null;
}

public byte[] getFileAsBytes(String fileName) {
    InputStream inputStream = amazonClient.getFileInputStream(fileName, UPLOAD_FOLDER_NAME);
    try {
        return IOUtils.toByteArray(inputStream);
    } catch (IOException e) {
        e.printStackTrace();
    }
    return new byte[0];
}

And the last method simply deletes a file:

public void deleteFile(String fileName) {
    amazonClient.deleteFileFromBucket(fileName, UPLOAD_FOLDER_NAME);
}

NOTE: Every time we upload, get, or delete files from the S3 bucket we need to specify the folder name as well, so the resulting object key has the form public-files/<fileName>.

Testing time

Let’s test our application by making requests with Postman. Choose the POST method and, in the Body, add JSON with two fields: fileName and base64. You can convert any file to base64 using an online converter (or the short Java snippet shown below).

The endpoint URL is http://localhost:8080/files/upload.
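
If you prefer not to use an online converter, a few lines of plain Java will produce the base64 string for the request body (the file path below is just an example):

import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Base64;

public class Base64Encoder {

    public static void main(String[] args) throws Exception {
        // Read any local file and print its base64 representation for the Postman request body
        byte[] bytes = Files.readAllBytes(Paths.get("testfile.jpg"));
        System.out.println(Base64.getEncoder().encodeToString(bytes));
    }
}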

If you did everything correctly, you should receive a response body similar to this:

{ "originalFileName": "testfile.jpg", "generatedFileName": "1699358552994_testfile.jpg", "uploadedAt": "2023-11-07T12:02:34.644+00:00"}

And if you open your S3 bucket in the AWS console, you should see the uploaded image there.

Now let’s test our delete method. Choose the DELETE method with the endpoint URL http://localhost:8080/files/1699358552994_testfile.jpg.

If the file is deleted successfully, you should receive HTTP status 204 No Content.

Let’s upload another file and test getting a file as a base64 string using the GET method with the URL:

http://localhost:8080/files/1699358552994_testfile.jpg/base64

You should receive a base64 string as the response.

Conclusion

That’s basically it. Now you can easily use an S3 bucket in your own projects. I hope this was helpful. If you have any questions, feel free to leave a comment. Thank you for reading.

You can find the full example of this application on the Oril Software GitHub.
