#AWSCloudFormation
Explore tagged Tumblr posts
fazex615 · 3 months ago
Text
AWS DevOps Services
A Comprehensive Overview of AWS DevOps Services
AWS DevOps services give organizations the tools and practices to speed up the software development lifecycle and deliver applications at high velocity. By bringing development and operations together, AWS supports continuous integration and continuous delivery (CI/CD), enabling fast and dependable software releases. Key services include AWS CodeBuild for building and testing code, AWS CodeDeploy for automating deployments, and AWS CodePipeline for automating release pipelines. Used alongside AWS CloudFormation for infrastructure as code, these services let teams manage and deploy resources efficiently. The result is better collaboration between development and operations teams, a shorter time to market, and a more agile development process.
Advantages and Features of DevOps Services on AWS
AWS DevOps services make the software development process more scalable, secure, and automated. With AWS Identity and Access Management (IAM), businesses can configure granular permissions to ensure safe access to resources. Automation features such as automated deployment and testing reduce the need for manual intervention, which minimizes errors and boosts productivity. Amazon CloudWatch adds logging and monitoring, giving teams insight into infrastructure health and application performance. This end-to-end approach speeds up development cycles while ensuring high-quality software, and it helps businesses stay competitive in a fast-moving digital market.
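As a rough sketch of how one of these services is declared through CloudFormation, here is a minimal, hypothetical AWS CodeBuild project; the project name, role ARN, and repository URL are placeholders, not anything referenced above.

Resources:
  SampleBuildProject:
    Type: AWS::CodeBuild::Project
    Properties:
      Name: sample-app-build                                              # placeholder project name
      ServiceRole: arn:aws:iam::123456789012:role/sample-codebuild-role   # placeholder role ARN
      Source:
        Type: GITHUB
        Location: https://github.com/example/sample-app.git               # placeholder repository
      Environment:
        Type: LINUX_CONTAINER
        ComputeType: BUILD_GENERAL1_SMALL
        Image: aws/codebuild/standard:5.0                                 # AWS-managed build image
      Artifacts:
        Type: NO_ARTIFACTS                                                # build and test only, no output artifact

A full CI/CD stack would add an AWS::CodePipeline::Pipeline and deployment resources around a project like this.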
0 notes
spearheadtechnology · 1 year ago
Text
Cloud Migration for a Large U.S. Clothing Retailer - How Spearhead Technology Helped Save the Retailer $2 Million in the First Year - Case Study
1 note · View note
atlantisitgroup · 3 years ago
Text
Three decades of IT Solutioning experience
From MS-DOS to Windows to the web to cloud apps, partnering with customers across all technologies.
Contact email: [email protected] | Call us: +1.833.561.3093
0 notes
holytheoristtastemaker · 5 years ago
Quote
In this post, we are going to leverage AWS Amplify authentication while still building the UI we want.

Prerequisites

Seeing as this is a post about AWS and AWS Amplify, you should be set up with both of those. Don't have an AWS account yet? You can set one up here. To interact with AWS Amplify you need to install the CLI via npm.

$ yarn global add @aws-amplify/cli

Setting up our project

Before we can show how to build a custom UI using Amplify, we first need a project to work from. Let's use create-react-app to get a React app going.

$ npx create-react-app amplify-demo
$ cd amplify-demo

With our boilerplate project created we can now add the Amplify libraries we are going to need to it.

$ yarn add aws-amplify aws-amplify-react

Now we need to initialize Amplify and add authentication to our application. From the root of our new amplify-demo application, run the following commands with the following answers to each question.

$ amplify init
Note: It is recommended to run this command from the root of your app directory
? Enter a name for the project amplify-demo
? Enter a name for the environment prod
? Choose your default editor: Visual Studio Code
? Choose the type of app that you're building: javascript
? What javascript framework are you using react
? Source Directory Path: src
? Distribution Directory Path: build
? Build Command: npm run-script build
? Start Command: npm run-script start

$ amplify add auth
Using service: Cognito, provided by: awscloudformation
The current configured provider is Amazon Cognito.
Do you want to use the default authentication and security configuration? Default configuration
Warning: you will not be able to edit these selections.
How do you want users to be able to sign in? Username
Do you want to configure advanced settings? No, I am done.
Successfully added resource amplifydemobc1364f5 locally

Now that we have the default authentication via Amplify added to our application we can add the default login. To do that go ahead and update your App component located at src/App.js to have the following code.

import React from "react";
import logo from "./logo.svg";
import "./App.css";
import { withAuthenticator } from "aws-amplify-react";
import Amplify from "aws-amplify";
import awsconfig from "./aws-exports";

Amplify.configure(awsconfig);

function App() {
  return ( Internal Application behind Login );
}

export default withAuthenticator(App);

The default Amplify authentication above leverages the higher-order component, withAuthenticator. We should now be able to see that our App component is behind a login. Go ahead and start the app up in development mode by running yarn start. We should see something like below.

Customizing The Amplify Authentication UI

Now that we have the default authentication wired up it's time to customize it. In the previous blog post we essentially inherited from the internal Amplify components like SignIn. This allowed us to leverage the functions already defined in that component. But this felt like the wrong abstraction and a bit of a hack for the long term. It was/is a valid way to get something working, but it required knowing quite a few of the implementation details of the parent component. Things like knowing how handleInputChange and _validAuthStates were getting used in SignIn were critical to making the brute force version below work as expected.
import React from "react"; import { SignIn } from "aws-amplify-react"; export class CustomSignIn extends SignIn { constructor(props) { super(props); this._validAuthStates = ["signIn", "signedOut", "signedUp"]; } showComponent(theme) { return ( Username .....omitted..... ); } } But in running with this brute force approach for a bit I was able to form up a better way to customize the Amplify authentication UI. The approach, as we are going to see, boils down to three changes. Instead of using the higher-order component, withAuthenticator. We are going to instead use the component instead. This is the component built into the framework that allows for more customization. We are going to change our App component to make use of an AuthWrapper component that we will write. This is the component that can manage the various states of authentication we can be in. Finally, we will write our own CustomSignIn component to have it's own UI and logic. Let's go ahead and dive in with 1️⃣. Below is what our App component is going to look like now. import React from "react"; import { Authenticator } from "aws-amplify-react"; import "./App.css"; import Amplify from "aws-amplify"; import awsconfig from "./aws-exports"; import AuthWrapper from "./AuthWrapper"; Amplify.configure(awsconfig); function App() { return ( ); } export default App; Notice that our App component is now an entry point into our application. It uses the Authenticator component provided by Amplify instead of the higher-order component. We tell that component to hide all the default authentication UI, we are going to create our own. Then inside of that, we make use of a new component we are going to create called AuthWrapper. This new component is going to act as our router for the different authentication pieces we want to have. For this blog post, we are just going to implement the login workflow. But the idea is transferrable to other things like signing up and forgot password. Here is what AuthWrapper ends up looking like. import React, { Component } from "react"; import { InternalApp } from "./InternalApp"; import { CustomSignIn } from "./SignIn"; class AuthWrapper extends Component { constructor(props) { super(props); this.state = { username: "" }; this.updateUsername = this.updateUsername.bind(this); } updateUsername(newUsername) { this.setState({ username: newUsername }); } render() { return ( ); } } export default AuthWrapper; Here we can see that AuthWrapper is a router for two other components. The first one is CustomSignIn, this is the custom login UI we can build-out. The second one is our InternalApp which is the application UI signed in users can access. Note that both components get the authState passed into them. Internally the components can use this state to determine what they should do. Before taking a look at the CustomSignIn component, let's look at InternalApp to see how authState is leveraged. import React, { Component } from "react"; import logo from "../src/logo.svg"; export class InternalApp extends Component { render() { if (this.props.authState === "signedIn") { return ( Internal Application behind Login ); } else { return null; } } } Notice that we are checking that authState === "signedIn" to determine if we should render the application UI. This is a piece of state that is set by the authentication components defined in AuthWrapper. Now let's see what our customized authentication for the login prompt looks like. Here is what CustomSignIn looks like. 
import React, { Component } from "react"; import { Auth } from "aws-amplify"; export class CustomSignIn extends Component { constructor(props) { super(props); this._validAuthStates = ["signIn", "signedOut", "signedUp"]; this.signIn = this.signIn.bind(this); this.handleInputChange = this.handleInputChange.bind(this); this.handleFormSubmission = this.handleFormSubmission.bind(this); this.state = {}; } handleFormSubmission(evt) { evt.preventDefault(); this.signIn(); } async signIn() { const username = this.inputs.username; const password = this.inputs.password; try { await Auth.signIn(username, password); this.props.onStateChange("signedIn", {}); } catch (err) { if (err.code === "UserNotConfirmedException") { this.props.updateUsername(username); await Auth.resendSignUp(username); this.props.onStateChange("confirmSignUp", {}); } else if (err.code === "NotAuthorizedException") { // The error happens when the incorrect password is provided this.setState({ error: "Login failed." }); } else if (err.code === "UserNotFoundException") { // The error happens when the supplied username/email does not exist in the Cognito user pool this.setState({ error: "Login failed." }); } else { this.setState({ error: "An error has occurred." }); console.error(err); } } } handleInputChange(evt) { this.inputs = this.inputs || {}; const { name, value, type, checked } = evt.target; const check_type = ["radio", "checkbox"].includes(type); this.inputs[name] = check_type ? checked : value; this.inputs["checkedValue"] = check_type ? value : null; this.setState({ error: "" }); } render() { return ( {this._validAuthStates.includes(this.props.authState) && ( Username Password Login )} ); } } What we have defined up above is a React component that is leveraging the Amplify Authentication API. If we take a look at signIn we see many calls to Auth to sign a user in or resend them a confirmation code. We also see that this._validAuthStates still exists. This internal parameter to determines whether we should show this component inside of the render function. This is a lot cleaner and is not relying on knowing the implementation details of base components provided by Amplify. Making this not only more customizable but a lot less error-prone as well. If you take a look at the class names inside of the markup you'll see that this component is also making use of TailwindCSS. Speaking as a non-designer, Tailwind is a lifesaver. It allows you to build out clean looking interfaces with utility first classes. To add Tailwind into your own React project, complete these steps. Run yarn add tailwindcss --dev in the root of your project. Run ./node_modules/.bin/tailwind init tailwind.js to initialize Tailwind in the root of your project. Create a CSS directory mkdir src/css. Add a tailwind source CSS file at src/css/tailwind.src.css with the following inside of it. @tailwind base; @tailwind components; @tailwind utilities; From there we need to update the scripts in our package.json to build our CSS before anything else. "scripts": { "tailwind:css":"tailwind build src/css/tailwind.src.css -c tailwind.js -o src/css/tailwind.css", "start": "yarn tailwind:css && react-scripts start", "build": "yarn tailwind:css && react-scripts build", "test": "yarn tailwind:css && react-scripts test", "eject": "yarn tailwind:css && react-scripts eject" } Then it is a matter of importing our new Tailwind CSS file, import "./css/tailwind.css"; into the root of our app which is App.js. 
💥 We can now make use of Tailwind utility classes inside of our React components.

Conclusion

AWS Amplify is gaining a lot of traction and it's not hard to see why. They are making it easier and easier to integrate apps into the AWS ecosystem. By abstracting away things like authentication, hosting, etc., folks are able to get apps into AWS at lightning speed. But with abstractions can come guard rails. Frameworks walk a fine line between providing structure and compressing creativity. They need to provide a solid foundation to build upon, but at the same time they need to provide avenues for customization. As we saw in this post, the default Amplify authentication works fine, but we probably don't want exactly that when it comes to deploying our own applications. With a bit of work and extending the framework into our application, we were able to add that customization.
http://damianfallon.blogspot.com/2020/04/customizing-aws-amplify-authentication.html
1 note · View note
qshoreonlinetraining-blog · 6 years ago
Photo
Cloud Computing Architecture is an AWS Academy curriculum designed to help students develop technical expertise in cloud computing and prepare them for the AWS Certified Solutions Architect – Associate certification exam.
Register Now: https://goo.gl/Wi5qpN
Course Content: http://qshore.com/course/view/46
0 notes
adult-social-networks · 6 years ago
Text
AWS CloudFormation Masterclass
AWS Certification
AWS CloudFormation is a comprehensive templating language that enables you to create managed 'stacks' of AWS resources, with a growing library of templates available for you to use. But how do you create one from scratch? This webinar will take you through building an AWS CloudFormation template from the ground up, so you can see all the essential template constructs in action. You can find the slides from this webinar on SlideShare here. Check out other upcoming webinars in the Masterclass Series here, and find details of our Journey Through the Cloud series here. Read the full article
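A from-scratch CloudFormation template typically exercises three essential constructs: Parameters, Resources (with intrinsic functions such as !Sub and !GetAtt), and Outputs. Here is a minimal sketch, with purely illustrative resource naming, not material from the webinar itself.

AWSTemplateFormatVersion: "2010-09-09"
Description: Minimal example showing the core template sections.
Parameters:
  EnvName:
    Type: String
    Default: dev
    Description: Environment name used to build resource names.
Resources:
  DataBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: !Sub "example-data-${EnvName}-${AWS::AccountId}"   # illustrative naming scheme
Outputs:
  DataBucketArn:
    Description: ARN of the bucket created by this stack.
    Value: !GetAtt DataBucket.Arn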
0 notes
phungthaihy · 5 years ago
Photo
How to build Chatbot using Amazon Lex Part - 2 | AWS Chatbot Tutorial | AWS Training | Edureka AWS Architect Certification Trai... #awscertification #awscertifiedcloudpractitioner #awscertifieddeveloper #awscertifiedsolutionsarchitect #awscertifiedsysopsadministrator #awschatbot #awschatbotarchitecture #awschatbotdemo #awschatbotexample #awschatbotpython #awschatbotsetup #awschatbottutorial #awschatbotyoutube #awscloudformation #awscodepipeline #buildchatbotusingaws #chatbotdevelopmentusingaws #chatbotinaws #chatbotusingamazonlex #chatbotusingaws #chatbotusingawslex #chatbotwithawslambda #chatbotwithawslex #ciscoccna #comptiaa #comptianetwork #comptiasecurity #createchatbotusingaws #cybersecurity #edureka #ethicalhacking #it #kubernetes #linux #microsoftaz-900 #microsoftazure #networksecurity #software #windowsserver #ytccon
0 notes
awsexchage · 5 years ago
Photo
I built an AWS CloudFormation template that sets up an Amazon CloudWatch Events rule with Amazon SQS as the target and processes the messages with AWS Lambda using the queue as an event source https://ift.tt/2VCgm4X
Setting this up from the AWS Management Console was easy, but turning it into an AWS CloudFormation template tripped me up in a number of places, so here are my notes.
Resources
The minimal configuration ended up looking like this.
[architecture diagram]
The services used are as follows.
Amazon S3
AWS CloudTrail
Amazon CloudWatch Events
Amazon SQS
AWS Lambda
AWS CloudFormation (for resource management)
Key points
First, here are the key points. The finished template comes after them.
Prepare more than one Amazon S3 bucket
Handling Amazon S3 events with Amazon CloudWatch Events also requires AWS CloudTrail.
Create a CloudWatch Events rule for an Amazon S3 source (console) – CodePipeline https://docs.aws.amazon.com/ja_jp/codepipeline/latest/userguide/create-cloudtrail-S3-source-console.html
Creating an AWS CloudTrail trail requires an Amazon S3 bucket for the log files, but if you make that the same bucket whose events you want to handle... well, you can guess the rest. You end up in an infinite loop.
Also, when I configured AWS CloudTrail not to deliver logs at all, the events never fired.
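For reference, here is the trail excerpt in the same style as the excerpts below; it matches the full template at the end of the post. The trail writes its log files to the separate output bucket and records write-only data events for the input bucket.

CloudTrail:
  Type: AWS::CloudTrail::Trail
  DependsOn:
    - CloudTrailBucketPolicy
  Properties:
    TrailName: !Sub "${ProjectName}-Trail"
    S3BucketName: !Ref OutputBucket        # log bucket, separate from the event bucket
    EventSelectors:
      - DataResources:
          - Type: AWS::S3::Object
            Values:
              - Fn::Sub:
                  - "${InputBucketArn}/"
                  - InputBucketArn: !GetAtt InputBucket.Arn
        ReadWriteType: WriteOnly
        IncludeManagementEvents: false
    IsLogging: true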
The key name for AWS CloudTrail log files is fixed
The log file destination must be specified as bucket-name/AWSLogs/AWS-account-ID/*.
Template excerpt
CloudTrailBucketPolicy:
  Type: AWS::S3::BucketPolicy
  Properties:
    Bucket: !Ref OutputBucket
    PolicyDocument:
      Version: "2012-10-17"
      Statement:
        - Effect: Allow
          Principal:
            Service: cloudtrail.amazonaws.com
          Action: s3:GetBucketAcl
          Resource: !GetAtt OutputBucket.Arn
        - Effect: Allow
          Principal:
            Service: cloudtrail.amazonaws.com
          Action: s3:PutObject
          Resource: !Join
            - ""
            - - !GetAtt OutputBucket.Arn
              - "/AWSLogs/"
              - !Ref "AWS::AccountId"
              - "/*"
          Condition:
            StringEquals:
              s3:x-amz-acl: bucket-owner-full-control
When I tried renaming AWSLogs to something else, AWS CloudFormation stack creation failed with the following error.
Incorrect S3 bucket policy is detected for bucket: <ProjectName>-output (Service: AWSCloudTrail; Status Code: 400; Error Code: InsufficientS3BucketPolicyException; Request ID: 3a9a7575-5226-4f19-b62a-737e87acc9b8)
And if I left out the AWS account ID and used bucket-name/AWSLogs/* instead, the events never fired.
The AWS Lambda function does not need to delete the messages
When an AWS Lambda function polls Amazon SQS itself, you have to delete each message once it has been processed successfully, but when the queue is wired up as an event source this is apparently no longer necessary.
AWS Lambda now supports SQS as an event source! | Developers.IO https://dev.classmethod.jp/articles/aws-lambda-support-sqs-event-source/
"Next, enter the function code. Save the sample function with the contents below, as in the tutorial. You can see that no SQS message deletion is included here."
At first I assumed that was just because it was a tutorial sample, but when I actually ran it, the message was deleted automatically on successful completion. Very convenient.
So the implementation simply takes the key of the object stored in the S3 bucket from the queue message and prints it.
Template excerpt
ReceiveQueFunction:
  Type: AWS::Lambda::Function
  Properties:
    FunctionName: !Sub "${ProjectName}-ReceiveQueFunction"
    Handler: "index.lambda_handler"
    Role: !GetAtt LambdaExecutionRole.Arn
    Code:
      ZipFile: |
        from __future__ import print_function
        import json
        import os
        import boto3

        def lambda_handler(event, context):
            for record in event["Records"]:
                requestParameters = json.loads(record["body"])["detail"]["requestParameters"]
                print(str(requestParameters))
    Runtime: "python3.7"
    Timeout: "60"
    ReservedConcurrentExecutions: 3
Tune the AWS Lambda function's concurrency
The template above sets ReservedConcurrentExecutions: 3 to limit concurrency, but the right value has to be chosen case by case.
AWS::Lambda::Function – AWS CloudFormation https://docs.aws.amazon.com/ja_jp/AWSCloudFormation/latest/UserGuide/aws-resource-lambda-function.html#cfn-lambda-function-reservedconcurrentexecutions
If there is a large backlog of queue messages, Lambda will poll as hard as the concurrency limit allows. So if you don't set a reserved concurrency, it will scale up to the default limit of 1,000 concurrent executions. Since the account-wide concurrency limit is 1,000, be aware that this can affect any other functions in the account.
The event rule target can match bucket names and keys by prefix
See the article below for details. It is quietly useful.
You can specify an Amazon S3 key prefix in an Amazon CloudWatch Events rule – Qiita https://cloudpack.media/52397
This time I set the key to prefix: hoge/ so that the event fires whenever an object is PUT under s3://bucket-name/hoge/.
Template excerpt
CloudWatchEventRule:
  Type: AWS::Events::Rule
  Properties:
    Name: !Sub "${ProjectName}-EventRule"
    EventPattern:
      source:
        - aws.s3
      detail-type:
        - "AWS API Call via CloudTrail"
      detail:
        eventSource:
          - s3.amazonaws.com
        eventName:
          - CopyObject
          - PutObject
          - CompleteMultipartUpload
        requestParameters:
          bucketName:
            - !Ref InputBucket
          key:
            - prefix: hoge/
    Targets:
      - Arn: !GetAtt S3EventQueue.Arn
        Id: !Sub "${ProjectName}-TarfgetQueue"
Set a queue policy on the SQS queue
This is where I got stuck the longest. The resources can be created without a queue policy, but then the event never fired when an object was PUT into the bucket. I couldn't find anything about this in the AWS documentation, but the forum thread below had the answer.
AWS Developer Forums: CloudWatch event rule not sending to … https://forums.aws.amazon.com/message.jspa?messageID=742808
When you click through the setup in the AWS Management Console, the policy is created for you automatically, so this seems to be an easy point to trip over when writing the template.
Template excerpt
SQSQueuePolicy:
  Type: AWS::SQS::QueuePolicy
  Properties:
    PolicyDocument:
      Version: "2012-10-17"
      Statement:
        - Effect: "Allow"
          Principal:
            AWS: "*"
          Action:
            - "sqs:SendMessage"
          Resource:
            - !GetAtt S3EventQueue.Arn
          Condition:
            ArnEquals:
              "aws:SourceArn": !GetAtt CloudWatchEventRule.Arn
    Queues:
      - Ref: S3EventQueue
Template
It's a bit long, but here is the full template.
AWSTemplateFormatVersion: "2010-09-09"
Parameters:
  ProjectName:
    Type: String
    Default: "<your choice>"
Resources:
  InputBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: !Sub "${ProjectName}-input"
      AccessControl: Private
      PublicAccessBlockConfiguration:
        BlockPublicAcls: True
        BlockPublicPolicy: True
        IgnorePublicAcls: True
        RestrictPublicBuckets: True
  OutputBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: !Sub "${ProjectName}-output"
      AccessControl: Private
      PublicAccessBlockConfiguration:
        BlockPublicAcls: True
        BlockPublicPolicy: True
        IgnorePublicAcls: True
        RestrictPublicBuckets: True
  S3EventQueue:
    Type: AWS::SQS::Queue
    Properties:
      DelaySeconds: 0
      VisibilityTimeout: 360
  LambdaExecutionRole:
    Type: AWS::IAM::Role
    Properties:
      RoleName: !Sub "${ProjectName}-LambdaRolePolicy"
      AssumeRolePolicyDocument:
        Version: "2012-10-17"
        Statement:
          - Effect: Allow
            Principal:
              Service: lambda.amazonaws.com
            Action: "sts:AssumeRole"
      Path: "/"
      ManagedPolicyArns:
        - arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole
      Policies:
        - PolicyName: !Sub "${ProjectName}-LambdaRolePolices"
          PolicyDocument:
            Version: "2012-10-17"
            Statement:
              - Effect: Allow
                Action:
                  - s3:*
                Resource: "*"
        - PolicyName: !Sub "${ProjectName}-LambdaRoleSQSPolices"
          PolicyDocument:
            Version: "2012-10-17"
            Statement:
              - Effect: Allow
                Action:
                  - sqs:ReceiveMessage
                  - sqs:DeleteMessage
                  - sqs:GetQueueAttributes
                  - sqs:ChangeMessageVisibility
                Resource: !GetAtt S3EventQueue.Arn
  ReceiveQueFunction:
    Type: AWS::Lambda::Function
    Properties:
      FunctionName: !Sub "${ProjectName}-ReceiveQueFunction"
      Handler: "index.lambda_handler"
      Role: !GetAtt LambdaExecutionRole.Arn
      Code:
        ZipFile: |
          from __future__ import print_function
          import json
          import os
          import boto3

          def lambda_handler(event, context):
              for record in event["Records"]:
                  requestParameters = json.loads(record["body"])["detail"]["requestParameters"]
                  print(str(requestParameters))
      Runtime: "python3.7"
      Timeout: "60"
      ReservedConcurrentExecutions: 3
  CloudTrailBucketPolicy:
    Type: AWS::S3::BucketPolicy
    Properties:
      Bucket: !Ref OutputBucket
      PolicyDocument:
        Version: "2012-10-17"
        Statement:
          - Effect: Allow
            Principal:
              Service: cloudtrail.amazonaws.com
            Action: s3:GetBucketAcl
            Resource: !GetAtt OutputBucket.Arn
          - Effect: Allow
            Principal:
              Service: cloudtrail.amazonaws.com
            Action: s3:PutObject
            Resource: !Join
              - ""
              - - !GetAtt OutputBucket.Arn
                - "/AWSLogs/"
                - !Ref "AWS::AccountId"
                - "/*"
            Condition:
              StringEquals:
                s3:x-amz-acl: bucket-owner-full-control
  CloudTrail:
    Type: AWS::CloudTrail::Trail
    DependsOn:
      - CloudTrailBucketPolicy
    Properties:
      TrailName: !Sub "${ProjectName}-Trail"
      S3BucketName: !Ref OutputBucket
      EventSelectors:
        - DataResources:
            - Type: AWS::S3::Object
              Values:
                - Fn::Sub:
                    - "${InputBucketArn}/"
                    - InputBucketArn: !GetAtt InputBucket.Arn
          ReadWriteType: WriteOnly
          IncludeManagementEvents: false
      IncludeGlobalServiceEvents: true
      IsLogging: true
      IsMultiRegionTrail: false
  CloudWatchEventRule:
    Type: AWS::Events::Rule
    Properties:
      Name: !Sub "${ProjectName}-EventRule"
      EventPattern:
        source:
          - aws.s3
        detail-type:
          - "AWS API Call via CloudTrail"
        detail:
          eventSource:
            - s3.amazonaws.com
          eventName:
            - CopyObject
            - PutObject
            - CompleteMultipartUpload
          requestParameters:
            bucketName:
              - !Ref InputBucket
            key:
              - prefix: hoge/
      Targets:
        - Arn: !GetAtt S3EventQueue.Arn
          Id: !Sub "${ProjectName}-TarfgetQueue"
  SQSQueuePolicy:
    Type: AWS::SQS::QueuePolicy
    Properties:
      PolicyDocument:
        Version: "2012-10-17"
        Statement:
          - Effect: "Allow"
            Principal:
              AWS: "*"
            Action:
              - "sqs:SendMessage"
            Resource:
              - !GetAtt S3EventQueue.Arn
            Condition:
              ArnEquals:
                "aws:SourceArn": !GetAtt CloudWatchEventRule.Arn
      Queues:
        - Ref: S3EventQueue
  LambdaFunctionEventSourceMapping:
    Type: AWS::Lambda::EventSourceMapping
    DependsOn:
      - S3EventQueue
      - ReceiveQueFunction
    Properties:
      BatchSize: 10
      Enabled: true
      EventSourceArn: !GetAtt S3EventQueue.Arn
      FunctionName: !GetAtt ReceiveQueFunction.Arn
Create the stack and try it out
Finally, a quick end-to-end test.

# Create the resources
> cd <directory containing the template file>
> aws cloudformation create-stack \
    --stack-name <your choice> \
    --template-body file://<template file name> \
    --capabilities CAPABILITY_NAMED_IAM \
    --region <your preferred region> \
    --parameters '[
        {
          "ParameterKey": "ProjectName",
          "ParameterValue": "<your choice>"
        }
      ]'
{
    "StackId": "arn:aws:cloudformation:<your preferred region>:xxxxxxxxxxxx:stack/<your stack name>/18686480-6f21-11ea-bcf3-020de04cec9a"
}

# Upload files
> touch hoge.txt

# Not uploaded under the hoge/ key prefix
> aws s3 cp hoge.txt s3://<ProjectName>-input/
upload: ./hoge.txt to s3://<ProjectName>-input/hoge.txt

> aws s3 cp hoge.txt s3://<ProjectName>-input/hoge/
upload: ./hoge.txt to s3://<ProjectName>-input/hoge/hoge.txt

> aws s3 ls --recursive s3://<ProjectName>-input
2020-03-26 06:28:03          0 hoge.txt
2020-03-26 06:26:12          0 hoge/hoge.txt

# Check the Lambda function's logs
> aws logs get-log-events \
    --region <your preferred region> \
    --log-group-name '/aws/lambda/<ProjectName>-ReceiveQueFunction' \
    --log-stream-name '2020/03/26/[$LATEST]ae8735ef9a1c46c38ab241f23a26b384' \
    --query "events[].[message]" \
    --output text
START RequestId: 10f98cf9-c39b-531b-9514-da0f8ea72a42 Version: $LATEST
{'bucketName': '<ProjectName>-input', 'Host': '<ProjectName>-input.s3.<your preferred region>.amazonaws.com', 'key': 'hoge/hoge.txt'}
END RequestId: 10f98cf9-c39b-531b-9514-da0f8ea72a42
REPORT RequestId: 10f98cf9-c39b-531b-9514-da0f8ea72a42 Duration: 1.85 ms Billed Duration: 100 ms Memory Size: 128 MB Max Memory Used: 69 MB Init Duration: 275.58 ms
It works!
Summary
Setting this up through the AWS Management Console is relatively easy, but turning it into a CFn template was a fair amount of work. Still, I picked up some useful knowledge along the way.
References
Create a CloudWatch Events rule for an Amazon S3 source (console) – CodePipeline https://docs.aws.amazon.com/ja_jp/codepipeline/latest/userguide/create-cloudtrail-S3-source-console.html
AWS LambdaがSQSをイベントソースとしてサポートしました! | Developers.IO https://dev.classmethod.jp/articles/aws-lambda-support-sqs-event-source/
AWS::Lambda::Function – AWS CloudFormation https://docs.aws.amazon.com/ja_jp/AWSCloudFormation/latest/UserGuide/aws-resource-lambda-function.html#cfn-lambda-function-reservedconcurrentexecutions
You can specify an Amazon S3 key prefix in an Amazon CloudWatch Events rule – Qiita https://cloudpack.media/52397
AWS Developer Forums: CloudWatch event rule not sending to … https://forums.aws.amazon.com/message.jspa?messageID=742808
Fetching Amazon CloudWatch Logs output nicely with the AWS CLI – Qiita https://cloudpack.media/50416
Things to watch out for when fetching AWS Lambda logs with the AWS CLI | Developers.IO https://dev.classmethod.jp/articles/note-log-of-lambda-using-awscli/
Original article:
"I built an AWS CloudFormation template that sets up an Amazon CloudWatch Events rule with Amazon SQS as the target and processes the messages with AWS Lambda using the queue as an event source"
April 13, 2020 at 04:00PM
0 notes
iwillreadthesesomeday · 5 years ago
Link
via Instapaper: Unread
0 notes
mbarczyk · 5 years ago
Link
0 notes
nodejstutorial4y · 6 years ago
Photo
📰 Just published an article by @julian_duque on using N|Solid for AWS Lambda with @AWSCloudFormer and @goserverless to deploy an @apollographql function as an example: https://t.co/OXk7Fp6tzQ
0 notes
nodejstutorial4u · 6 years ago
Photo
📰 Just published an article by @julian_duque on using N|Solid for AWS Lambda with @AWSCloudFormer and @goserverless to deploy an @apollographql function as an example: https://t.co/OXk7Fp6tzQ #node #nodejs #angular #angularjs #javascript #react
0 notes
spearheadtechnology · 1 year ago
Text
Implementing a Continuous Integration and Continuous Delivery (CI/CD) pipeline
Hey everyone, I just came across this interesting case study on implementing a Continuous Integration and Continuous Delivery (CI/CD) pipeline and reducing the time to market for applications.
1 note · View note
nodejstutorial4you-blog · 6 years ago
Photo
NodeSource: 📰 Just published an article by julian_duque on using N|Solid for AWS Lambda with AWSCloudFormer and goserverless to deploy an apollographql function as an example: https://t.co/W41EOPjy0b http://twitter.com/NodejsTutorial1/status/1073647153317404672
0 notes
northxnisha-blog · 7 years ago
Text
#AWSCloudFormation now supports #AWS Budgets as a resource for CloudFormation templates, stacks, and StackSets. To get more #insights on daily #technology happenings: http://tech.rithanya.com/tech/movers-and-shakers
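For a rough illustration of what declaring a budget in a template looks like, here is a minimal sketch of an AWS::Budgets::Budget resource; the budget name, limit, threshold, and notification email are placeholder values, not anything from the announcement.

AWSTemplateFormatVersion: "2010-09-09"
Resources:
  MonthlyCostBudget:
    Type: AWS::Budgets::Budget
    Properties:
      Budget:
        BudgetName: monthly-cost-budget            # placeholder name
        BudgetType: COST
        TimeUnit: MONTHLY
        BudgetLimit:
          Amount: 100                              # placeholder limit
          Unit: USD
      NotificationsWithSubscribers:
        - Notification:
            NotificationType: ACTUAL
            ComparisonOperator: GREATER_THAN
            Threshold: 80                          # alert at 80% of the budgeted amount
          Subscribers:
            - SubscriptionType: EMAIL
              Address: [email protected]           # placeholder address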
0 notes
knifetopodes · 7 years ago
Link
jesus FUCKING christ what is this bullshit lmfao
0 notes