Friday, December 30, 2016

Follow-up: Using AWS Simple Email Service (SES) for Inbound Mail

As I alluded to in my first post on this subject, it's possible to use AWS Lambda to process mail you receive as well.   In this final post on AWS Simple Email Service, I'll show you how to forward your email using a Lambda function.

Thursday, December 29, 2016

Part 2: Using AWS Simple Email Service (SES) for Inbound Mail


Delete, delete, delete, delete, forward...

In part one of my two-part blog on Amazon's Simple Email Service, we set up the necessary resources to receive and process inbound email. In part two, we'll create a worker that reads an SQS queue and forwards the mail to another email address.
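The worker described above can be sketched with the AWS CLI and jq. Everything below is a hedged sketch, not code from the post: the queue URL and addresses are placeholders, and it assumes the SES notifications reach SQS via an SNS subscription.

```shell
#!/bin/bash
# Hypothetical SQS-to-email forwarder. QUEUE_URL, FORWARD_TO and
# FORWARD_FROM are placeholder values; a production worker would fetch
# the raw message from S3 and forward it with send-raw-email instead.

QUEUE_URL=${QUEUE_URL:-https://sqs.us-east-1.amazonaws.com/123456789012/inbound-mail}
FORWARD_TO=${FORWARD_TO:-someone@example.com}
FORWARD_FROM=${FORWARD_FROM:-forwarder@openbedrock.net}

forward_one() {
  local msg receipt body ses_json from_addr subject

  # long-poll the queue for a single message
  msg=$(aws sqs receive-message --queue-url "$QUEUE_URL" \
          --wait-time-seconds 20 --max-number-of-messages 1) || return 1
  test -n "$msg" || return 1   # queue was empty

  receipt=$(echo "$msg" | jq -r '.Messages[0].ReceiptHandle')
  body=$(echo "$msg" | jq -r '.Messages[0].Body')

  # the SNS envelope's Message field carries the SES notification JSON
  ses_json=$(echo "$body" | jq -r '.Message // empty')
  from_addr=$(echo "$ses_json" | jq -r '.mail.source // "unknown"')
  subject=$(echo "$ses_json" | jq -r '.mail.commonHeaders.subject // "(no subject)"')

  # shorthand syntax keeps the sketch brief; commas in the subject would
  # require --cli-input-json instead
  aws ses send-email --from "$FORWARD_FROM" \
      --destination "ToAddresses=$FORWARD_TO" \
      --message "Subject={Data=Fwd: $subject},Body={Text={Data=New mail from $from_addr}}" \
    && aws sqs delete-message --queue-url "$QUEUE_URL" --receipt-handle "$receipt"
}
```

Run forward_one in a loop (or from cron) to drain the queue; deleting the message only after a successful send means a failed forward will be retried.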

Sunday, December 18, 2016

Using AWS Simple Email Service (SES) for Inbound Mail

Oy! Good thing no one else can see THIS message!
We all know what a pain in the rump it is to set up, manage, and secure an inbound mail server.  It's a thankless job that is increasingly a point of attack for bad guys.  It's also possible that if you screw it up you might find yourself in front of Congress!

In our architectures, now more than ever, it is important to reduce the surface area for attacks. That means closing down as many access points to your network as possible.  SMTP running on port 25 is a gaping hole that most architects interested in securing their networks want turned off, like yesterday!

If you don't want to completely outsource your inbound mail to a managed service, the AWS SES inbound email service is one way to have your cake and eat it too.  It's especially useful if you want your application to receive mail but you don't necessarily want or need to host an email service that includes an IMAP or POP server.  You may only need to receive mail, in which case AWS SES is the perfect solution.  Along with being a scalable managed service, SES also includes spam filtering capabilities.

In this two-part blog, we'll explore setting up a simple inbound mail handler for openbedrock.net using Amazon Web Services Simple Email Service (SES).
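To give a feel for what part one sets up, here's a hedged sketch of the receiving side: a bucket for the raw mail plus an SES receipt rule that stores each message and notifies an SNS topic. Every name here (bucket, rule set, topic ARN) is a placeholder, not a value from the post.

```shell
# create a bucket to hold the raw inbound messages (placeholder name)
aws s3 mb s3://openbedrock-mail

# create a receipt rule set and a rule that stores mail for the domain
# in the bucket and publishes a notification to an SNS topic
aws ses create-receipt-rule-set --rule-set-name inbound-mail

aws ses create-receipt-rule --rule-set-name inbound-mail --rule '{
  "Name": "store-and-notify",
  "Enabled": true,
  "Recipients": ["openbedrock.net"],
  "Actions": [{
    "S3Action": {
      "BucketName": "openbedrock-mail",
      "TopicArn": "arn:aws:sns:us-east-1:123456789012:inbound-mail"
    }
  }]
}'

# only one rule set can be active at a time
aws ses set-active-receipt-rule-set --rule-set-name inbound-mail
```

Two things the sketch glosses over: the bucket needs a policy allowing ses.amazonaws.com to write objects into it, and the domain's MX record has to point at the SES inbound SMTP endpoint for your region before any mail arrives.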

Saturday, December 17, 2016

Using AWS S3 as a Yum Repository

In an earlier post I described how you can use an S3 bucket to host a yum repository.  In this post we'll give the repository a friendlier name and create an index page that's a tad more helpful.  If you're not using AWS S3 and CloudFront to host your static assets, you might want to consider this simple-to-use solution for running a website without having to manage a webserver.

Visit http://repo.openbedrock.net to see the final product.
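One way to get a name like repo.openbedrock.net in front of a bucket is a Route 53 record pointing at the bucket's website endpoint (CloudFront users would point the record at their distribution instead). This is a hypothetical sketch: the hosted-zone id and region are placeholders, and note that S3 website hosting resolves the bucket by Host header, so the bucket itself must be named after the record.

```shell
# build a change batch mapping repo.openbedrock.net to the website
# endpoint of a bucket with the same name (region is a placeholder)
batch=$(mktemp)
cat > "$batch" <<'EOF'
{
  "Changes": [{
    "Action": "UPSERT",
    "ResourceRecordSet": {
      "Name": "repo.openbedrock.net",
      "Type": "CNAME",
      "TTL": 300,
      "ResourceRecords": [
        {"Value": "repo.openbedrock.net.s3-website-us-east-1.amazonaws.com"}
      ]
    }
  }]
}
EOF

# apply it against your hosted zone (the zone id here is a placeholder):
# aws route53 change-resource-record-sets --hosted-zone-id Z0000000000000 \
#     --change-batch "file://$batch"
```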

Monday, December 5, 2016

AWS re:Invent - Wrap Up

It's Friday, I'm blogging at 33,000 feet and the captain has just turned off the seat belt signs, informed us that it's 50 degrees back at our final destination, Philadelphia, Pennsylvania, and directed us to just sit back and relax.  I get very nervous when the captain of a plane flying at 33,000 feet tells me my final destination is Philadelphia.  I'm hoping to do a few more trips to AWS re:Invent before my "final destination" arrives.

CloudBlogging (pun intended) about my 5 days at one of the most important cloud computing events each year thanks to my Acer Chromebook and my free GoGo Inflight passes.  I still have 6 left of the 12 I received when I purchased the C720 nearly three years ago prior to attending my second AWS re:Invent extravaganza. Wifi at 33,000 feet is spotty but it is possible to complete a blog post on a 5 hour trip across the country.

No blog here would be complete without a shameless plug (I promise this is the only plug in the blog) for Chromebooks - #LoveMeSomeChromebook.

If you haven't read the rest of this somewhat tantalizing series on AWS re:Invent, start here.  I'll wrap up the blog series with a recap of the week's highlights, opine a bit and provide you with some of the key takeaways.  Enjoy!

Friday, December 2, 2016

AWS re:Invent Day 5

Werner Vogels does his Superman impression

Werner Vogels - Keynote

Lots of announcements from the AWS CTO. Some user stories (Twilio, Netflix, Mapbox, TrainLine), lots of focus on developer tools and big data management tools.

Some highlights:


AWS Glue - ETL pipeline tool
AWS Batch - Ummm, batch processing.
Lambda@Edge - run Lambda functions at CloudFront edge locations, moving processing closer to your users for faster results.

Read more about the announcements here.

...on to my agenda for the day.

Thursday, December 1, 2016

AWS re:Invent Day 4

Poolside lunch at Mirage
Yet another incredibly long day - breakfast @7:00am, keynote by Andy Jassy, a breakout at 11:00am at the Venetian, a very fast pit stop to shovel down some lunch before another 1:00pm breakout session at the Mirage, followed by sessions until 6:30pm.  Pub crawl with 30,000 other thirsty re:Invent geeks until 7:30pm.

Wednesday, November 30, 2016

AWS re:Invent Day 3

How do you feed 30,000 people?  Hangar 1
Intense day 3!
  • 5 breakout sessions.
  • Breakfast at 7:30am, first session at 10:00am, last session ends at 6:00pm.
  • Reception in the vendor hall from 6-7pm (lots of tee shirts and goodies!)


Read on for some session recaps...

AWS re:Invent Day 2

Monday, November 28th is bootcamp day.  Attended an AWS Solutions Architect Associate prep course given by some Amazon employees.  Excellent overview of what to expect in the Associate exam, with pointers on how to interpret the questions and what you should be focusing on.

Yeah, but where's the holodeck?
The bootcamp was 4 hours (8am-12pm) at the Wynn Encore.  After taking the bootcamp I feel confident in passing the test, but am reminded of how many services AWS encompasses and how deep the information for each service is.  I'll study some more.  ;-)

A lot of this is also a moving target as the certification test questions are not keeping up with the announcements being made regarding increasing capacity, capabilities and decreasing prices.

AWS re:Invent Day 1

Here we are at the Amazon annual re:Invent for another year of mind blowing technological advancement in cloud computing.  This is my fourth year attending and so far I can tell this is going to be filled with exciting technology, even more walking than 2015 (argh!) and even more standing patiently in line.

Over 30,000 people are registered, so it's grown beyond the ability of the Venetian to hold the event.  This year there are events at the Mirage and the Wynn hotels in addition to the Venetian, making the logistics of schedule making a real challenge for both attendees and the conference coordinators.

Arrived on Sunday and picked up my SWAG...yet another hoodie...and a surprise: an Echo Dot sponsored by Capital One!

I'll be posting some notes about each day...

Thursday, November 10, 2016

Bedrock & Regular Expressions

Regular expressions are a powerful tool that every programmer should be familiar with.  Perl programmers are typically well versed in using regular expressions but you can find many tools and languages that support regular expressions.   Even Bedrock!


Saturday, April 23, 2016

Creating a yum repository using AWS S3 buckets

Here's a short bash script I use to create a yum repo in an S3 bucket.  The script does five (5) things:

  1. creates a local repo in a temporary directory
  2. copies an RPM file to the local repo
  3. creates the yum repository using createrepo
  4. syncs the local directory with the S3 bucket
  5. sets public permissions for the bucket (make sure this is what you actually want to do)

Before you get started, create the bucket and configure it to host a static website.  No worries if you don't actually have an index.html.

Create the bucket:


$ aws s3 mb s3://openbedrock-repo

Then configure it to host static files:


$ aws s3 website s3://openbedrock-repo --index-document index.html

Here's the bash script to create the repo:

#!/bin/bash

# usage: sudo $0 path-to-rpm [bucket]

# create a temporary local repo
repo=$(mktemp -d)
mkdir "${repo}/noarch"

if test -n "$2"; then
  BUCKET="$2"
else
  BUCKET=openbedrock-repo
fi

if test -e "$1"; then
  # copy the RPM into the local repo and generate the yum metadata
  cp "$1" "${repo}/noarch"
  createrepo "$repo"
  # sync the local repo with the S3 bucket and make it PUBLIC
  # (sync is always recursive; no --recursive flag needed)
  PERMISSION="--grants read=uri=http://acs.amazonaws.com/groups/global/AllUsers"
  aws s3 sync "$repo" "s3://$BUCKET/" $PERMISSION
  aws s3 ls "s3://$BUCKET/"
  # clean up the local copy of the repo
  rm -rf "$repo"
fi
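On the client side, machines that consume the repo need a yum configuration pointing at the bucket's website endpoint. A sketch follows; the bucket name and region (and therefore the endpoint) are assumptions, not values from the post. Since createrepo ran at the top of the synced tree, baseurl points at the bucket root, with the packages underneath noarch/.

```shell
# write a hypothetical yum repo definition to a temp file; installing it
# under /etc/yum.repos.d (needs root) enables the repository
repo_file=$(mktemp)
cat > "$repo_file" <<'EOF'
[openbedrock]
name=openbedrock yum repository
baseurl=http://openbedrock-repo.s3-website-us-east-1.amazonaws.com/
enabled=1
gpgcheck=0
EOF

# sudo cp "$repo_file" /etc/yum.repos.d/openbedrock.repo
# sudo yum makecache
```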

Tuesday, April 12, 2016

Catching Exceptions in Bedrock

When things go wrong, it's a good idea to actually try to handle the mess.  Most higher-level programming languages have exception handling built in.  While Bedrock is not a programming language, it is useful for a templating language to have the ability to catch exceptions.  Today's blog describes how Bedrock's <try/catch> blocks can be used with your Application Plugins.