Run Lambda Functions on a Schedule using Amazon EventBridge Event Rules
Amazon EventBridge is a serverless event bus service that makes it easy to connect your applications with data from a variety of sources. You can create rules in EventBridge that self-trigger on an automated schedule using cron or rate expressions. These rules can in turn trigger Lambda functions, even with custom input if required. All scheduled events use the UTC time zone & the minimum precision…
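The full walkthrough is behind the cut, but the flow it describes can be sketched with boto3. This is a minimal sketch, not the article's exact code; the function name, account ID & input payload below are placeholders:

```python
import boto3

events = boto3.client("events")
lambda_ = boto3.client("lambda")

# A rule that fires every 5 minutes; cron(...) expressions work here too.
rule_arn = events.put_rule(
    Name="every-5-minutes",
    ScheduleExpression="rate(5 minutes)",
    State="ENABLED",
)["RuleArn"]

# Let EventBridge invoke the function, then attach it as the rule's target.
lambda_.add_permission(
    FunctionName="my-function",
    StatementId="allow-eventbridge-invoke",
    Action="lambda:InvokeFunction",
    Principal="events.amazonaws.com",
    SourceArn=rule_arn,
)
events.put_targets(
    Rule="every-5-minutes",
    Targets=[{
        "Id": "my-function-target",
        "Arn": "arn:aws:lambda:us-east-1:123456789012:function:my-function",
        # Optional custom input delivered to the function on every run:
        "Input": '{"source": "scheduled-rule"}',
    }],
)
```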
Deliver S3 Events Across Regions by Routing them Through an SNS Topic
If you’ve ever tried to use an SQS queue in another region as the destination for S3 events, you’ve likely seen this error:
The notification destination service region is not valid for the bucket location constraint.
What that cryptic error message essentially means is that the S3 bucket & the event destination must be in the same region. But what if you need to deliver your S3 events to an…
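The full recipe is behind the cut, but the routing it describes can be sketched with boto3: the SNS topic lives in the bucket’s region & relays to a queue elsewhere. The bucket name, regions & ARNs below are placeholders, & the topic’s access policy must already allow s3.amazonaws.com to publish:

```python
import boto3

s3 = boto3.client("s3", region_name="us-east-1")    # the bucket's region
sns = boto3.client("sns", region_name="us-east-1")  # topic in the same region

# Point the bucket's event notifications at the SNS topic.
s3.put_bucket_notification_configuration(
    Bucket="my-bucket",
    NotificationConfiguration={
        "TopicConfigurations": [{
            "TopicArn": "arn:aws:sns:us-east-1:123456789012:s3-events",
            "Events": ["s3:ObjectCreated:*"],
        }]
    },
)

# Subscribe a queue in a *different* region to the topic.
sns.subscribe(
    TopicArn="arn:aws:sns:us-east-1:123456789012:s3-events",
    Protocol="sqs",
    Endpoint="arn:aws:sqs:eu-west-1:123456789012:s3-events-eu",
)
```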
Use Pre-Installed Terraform Plugins Instead of Downloading them with “terraform init”
In normal usage, terraform init automatically downloads & installs the plugins for any providers used in the configuration, into the .terraform directory. It’s sometimes desirable to disable this behavior, either because you want to do away with the re-download every time on your dev system, or because you’re running Terraform in a CI/CD pipeline where it’s sensible to download the plugins just once &…
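The article’s exact steps are behind the cut; as a sketch, the Terraform CLI of this era offers both a flag & an environment variable for this (paths below are placeholders):

```sh
# Skip plugin downloads entirely & use binaries already present locally:
terraform init -plugin-dir=/opt/terraform/plugins

# Or keep downloads but share one plugin cache across all projects:
export TF_PLUGIN_CACHE_DIR="$HOME/.terraform.d/plugin-cache"
terraform init
```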
Deploy to Multiple AWS Accounts with Terraform
If you’re looking for a way to deploy copies of an infrastructure, or parts of an infrastructure to several AWS accounts simultaneously, there’s an easy way to do this. It’s done by using multiple “provider configurations”.
As you might be aware, the Terraform provider for AWS must be configured with a way to authenticate itself with AWS, in order to perform infrastructure operations. There…
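The mechanism looks roughly like this in HCL: a default provider plus an aliased one, each authenticating to a different account. The profile names & bucket are placeholders, not the article’s exact code:

```hcl
# Default provider: the "main" account, via a named credentials profile.
provider "aws" {
  region  = "us-east-1"
  profile = "account-main"
}

# A second configuration of the same provider, for another account.
provider "aws" {
  alias   = "secondary"
  region  = "us-east-1"
  profile = "account-secondary"
}

# Resources pick an account by referencing the provider alias.
resource "aws_s3_bucket" "replica" {
  provider = aws.secondary
  bucket   = "my-replica-bucket"
}
```

Resources without an explicit provider argument land in the default (main) account.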
Terraform State Management in Multi-Customer Multi-Account Multi-Environment Scenarios
This post explores ways to structure your Terraform configuration when it’s to be used to deploy infrastructure across multiple cloud accounts, for multiple customers of yours & for multiple environments for each app involved — development, staging, production. One prime example where this might be very useful to you is if you build a multi-tenant SaaS application.
The key advantage we’re…
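The excerpt ends before the details, but one way such a layout is commonly expressed — purely as an illustration, not necessarily the structure this post lands on — is to key the remote state by customer & environment:

```hcl
terraform {
  backend "s3" {
    bucket = "my-terraform-state"                   # placeholder state bucket
    key    = "customer-a/staging/terraform.tfstate" # one state per customer per environment
    region = "us-east-1"
  }
}
```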
Programmatically Stream (Upload) Large Files to Amazon S3
The upload() method in the AWS JavaScript SDK does a good job of uploading objects to S3 even if they’re large enough to warrant a multipart upload. It’s also possible to pipe a data stream to it in order to upload very large objects. To do this, pass a Node.js stream.PassThrough() stream as the upload body & pipe your data into it:
```javascript
const AWS = require('aws-sdk');
const S3 = new AWS.S3();
const…
```
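A fuller sketch of the pattern, with placeholder bucket, key & file path (the excerpt truncates before the post’s own version):

```javascript
const AWS = require('aws-sdk');
const stream = require('stream');
const fs = require('fs');

const S3 = new AWS.S3();
const pass = new stream.PassThrough();

// upload() consumes the PassThrough as data arrives, switching to a
// multipart upload automatically for large payloads.
const managedUpload = S3.upload({
  Bucket: 'my-bucket',
  Key: 'big-file.bin',
  Body: pass,
});

// Any readable stream can feed the PassThrough; a local file is used here.
fs.createReadStream('/path/to/big-file.bin').pipe(pass);

managedUpload.promise()
  .then((data) => console.log('Uploaded to', data.Location))
  .catch(console.error);
```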
Use Python Packages like NumPy & Pandas with AWS Glue
According to AWS Glue documentation:
Only pure Python libraries can be used. Libraries that rely on C extensions, such as the pandas Python Data Analysis Library, are not yet supported.
— Providing Your Own Custom Scripts
But if you’re using Python shell jobs in Glue, there is a way to use Python packages like Pandas, via Easy Install.
Easy Install is a Python module (easy_install)…
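The excerpt truncates before the steps, but the general shape of the trick — sketched here with placeholder paths, & with the caveat that Glue’s Python shell environment may differ from this — is to install into a writable directory at job start & put it on the import path:

```python
import os
import site
import sys
from setuptools.command import easy_install

install_dir = "/tmp/glue-extra-libs"  # placeholder writable path
os.makedirs(install_dir, exist_ok=True)

# easy_install refuses to install into a directory that isn't importable,
# so put it on sys.path first, then install & register the .pth entries.
sys.path.insert(0, install_dir)
easy_install.main(["--install-dir", install_dir, "pandas"])
site.addsitedir(install_dir)

import pandas as pd
print(pd.__version__)
```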
Submit Apache Spark Jobs to an Amazon EMR Cluster from Apache Airflow
If you’re running Spark on EMR & need to submit jobs remotely, you’re in the right place! You can have Airflow running on an EC2 instance & use it to submit jobs to EMR, provided they can reach each other. There are several ways you can trigger a spark-submit to a remote Spark server, EMR or otherwise, via Airflow:
Use SparkSubmitOperator
This operator requires you to have a spark-submit binary…
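A minimal sketch of that first option, assuming an Airflow 1.x-era install & a Spark connection (spark_emr here, a placeholder) whose host points at the EMR master:

```python
from airflow.contrib.operators.spark_submit_operator import SparkSubmitOperator

submit_etl = SparkSubmitOperator(
    task_id="submit_etl",
    application="s3://my-bucket/jobs/etl.py",  # placeholder job location
    conn_id="spark_emr",                       # connection pointing at the EMR master
    verbose=True,
    dag=dag,                                   # the surrounding DAG object
)
```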
How Amazon RDS Aurora MySQL Cross-Region Replication Really Works Under the Hood
RDS Aurora MySQL in AWS provides an in-built feature to create a cross-region read replica of a database, easily accessible from the console.
This article describes at a high-level the basic logistics of how replication really happens.
First things first:
Amazon RDS Aurora MySQL cross-region replication uses native MySQL binlog replication.
How MySQL Binlog…
Run Celery Tasks in the Background with a Django App in AWS Elastic Beanstalk
Django is a high-level Python Web framework that encourages rapid development and clean, pragmatic design. Built by experienced developers, it takes care of much of the hassle of Web development, so you can focus on writing your app without needing to reinvent the wheel. It’s free and open source.
Celery is a task queue implementation for Python web applications used to asynchronously execute…
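For context, the smallest possible Celery setup looks like this; the broker URL is a placeholder (any supported broker works) & the post’s Beanstalk-specific wiring is behind the cut:

```python
# tasks.py -- a minimal Celery app, independent of the Beanstalk setup.
from celery import Celery

app = Celery("myapp", broker="redis://localhost:6379/0")  # placeholder broker

@app.task
def send_welcome_email(user_id):
    # Long-running work runs here, outside the web request/response cycle.
    print(f"Emailing user {user_id}")
```

A Django view then enqueues work with send_welcome_email.delay(user.id) & returns immediately.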
Troubleshooting Permission Denied Errors When Trying to SSH Into an AWS EC2 Instance
If you’re new to AWS (or not), you might run into issues trying to SSH into EC2 instances. This article summarizes things you can try to fix the issue. If you get a “Permission denied (publickey)” error or simply aren’t able to SSH into an instance, try the following:
Ensure the file permissions on the private key are proper. Run chmod 400 key.pem.
Use the right ssh command — ssh -i key.pem…
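Put together, a typical working invocation looks like this; the hostname is a placeholder, & the login user depends on the AMI (ec2-user for Amazon Linux, ubuntu for Ubuntu, etc.):

```sh
chmod 400 key.pem
ssh -i key.pem ec2-user@ec2-203-0-113-25.compute-1.amazonaws.com
```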
Uploading Files to Amazon S3 Directly from a Browser using JavaScript
It’s common in many webapps to let the user upload files to your system — profile pictures, for instance. In AWS, the ideal place to keep these files is S3, but if you’re uploading the files to your server first & then to S3, you don’t have to: you can upload to S3 directly from the browser. Let’s see how.
What you want is browser-based uploads to S3 using HTTP POST…
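The server-side half of that can be sketched with the JavaScript SDK’s createPresignedPost(); the bucket, key prefix & size limit below are placeholders:

```javascript
const AWS = require('aws-sdk');
const S3 = new AWS.S3();

S3.createPresignedPost({
  Bucket: 'my-uploads-bucket',
  Fields: { key: 'profile-pics/${filename}' },                 // S3 substitutes the filename
  Conditions: [['content-length-range', 0, 10 * 1024 * 1024]], // cap uploads at 10 MB
  Expires: 300,                                                // policy valid for 5 minutes
}, (err, data) => {
  if (err) throw err;
  // Hand data.url & data.fields to the browser; it POSTs the file straight to S3.
  console.log(data.url, data.fields);
});
```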
POSTing Binary or Multipart Form-Data to an API in Amazon API Gateway
Do you have a need to upload a file to an API endpoint in API Gateway? Whether you need to HTTP POST a binary or a text file to an API (i.e., multipart form-data), this article explains how to accomplish it.
The way to do this is to define multipart/form-data as a binary media type for your API & proxy the payload directly to a Lambda function. From there you can parse the body to get…
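On the Lambda side, a proxy-integration handler receives the payload base64-encoded once multipart/form-data is registered as a binary media type. A minimal sketch (the actual multipart parsing is left as a comment; any parser works):

```python
import base64

def handler(event, context):
    body = event["body"]
    if event.get("isBase64Encoded"):
        body = base64.b64decode(body)

    # "body" now holds the raw multipart payload; split it on the boundary
    # taken from the Content-Type header to extract the uploaded file(s).
    headers = event.get("headers") or {}
    content_type = headers.get("content-type") or headers.get("Content-Type", "")

    return {"statusCode": 200, "body": f"received {len(body)} bytes ({content_type})"}
```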
Troubleshooting .htaccess Issues with Apache Server
Are you using .htaccess to configure your Apache server? Or are you trying to use it but nothing you put in there seems to work? When should you use .htaccess instead of editing the server config directly? This article explains the role of .htaccess in Apache server & answers these questions.
First things first — know that the .htaccess file is specifically for people who do NOT have access to the…
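One detail worth checking up front (a common cause of “nothing in .htaccess seems to work”, though the post’s full checklist is behind the cut): the server config must permit overrides at all. The directory path here is a placeholder for your DocumentRoot:

```apache
# httpd.conf -- .htaccess directives are silently ignored unless
# AllowOverride permits them for the directory in question.
<Directory "/var/www/html">
    AllowOverride All
</Directory>
```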
Redirect HTTP to HTTPS in AWS Elastic Beanstalk
There are many ways to redirect incoming HTTP requests to HTTPS in Beanstalk. This article explores a few, with pros & cons of each. But first, the prerequisites:
Prerequisites
Before we get to Beanstalk itself, there are a few things to take care of, to make HTTPS work. The rest of this article assumes that you have already:
Pointed your domain to the Beanstalk environment using either…
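As a taste of one common approach (an assumption here: an nginx-based platform behind a load balancer; details vary by platform version), the redirect itself keys off the X-Forwarded-Proto header the ELB sets:

```nginx
# Inside the server block nginx serves the app from:
if ($http_x_forwarded_proto = "http") {
    return 301 https://$host$request_uri;
}
```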
How to Make All Objects in an Amazon S3 Bucket Public by Default?
There are several ways to make objects in an S3 bucket public. The first is to use the following bucket policy:
{ "Version": "2012-10-17", "Statement": { "Action": "s3:GetObject", "Effect": "Allow", "Resource": "arn:aws:s3:::my-bucket/*", "Principal": "*" } }
This (& other) policies can be generated using AWS’s official policy generator at awspolicygen.s3.amazonaws.com…
Enhance API Client Experience by Deploying a CloudFront Distribution to Serve APIs from Amazon API Gateway
Just like you speed up delivery of your static assets from S3 buckets using the CloudFront CDN, you can do the same for your APIs. In most cases, it’s enough to make your API “edge-optimized”:
If you do this, API Gateway creates & manages a CloudFront distribution for you behind the scenes, but if you need more control over CloudFront, leave the Endpoint Type of the API as Regional &…
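Creating the API as Regional, so you can attach a distribution you control, looks like this in boto3 (the API name is a placeholder):

```python
import boto3

apigw = boto3.client("apigateway")

# A Regional endpoint skips the managed CloudFront layer, leaving you
# free to put your own distribution in front of the API.
api = apigw.create_rest_api(
    name="my-api",
    endpointConfiguration={"types": ["REGIONAL"]},
)
print(api["id"])
```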