Chapter 5

  • Importing custom libraries in your function
  • Subscribing functions to events coming from other AWS services
  • Creating back-end resources such as S3 buckets and DynamoDB tables
  • Using binaries with your function
  • Implementing a serverless face detection function
  • Scheduling functions for recurring execution
In this chapter you'll learn how to create standalone functions that can be scheduled for periodic execution or triggered by events coming from other AWS services, such as Amazon S3.

5.1. Packaging libraries and modules with your function


For modules managed by standard package managers, such as npm for Node.js, you can use those tools to install the modules locally in your development environment, in the same folder where you have the source of the function. You then create a zip archive that includes your function (in the root of the archive) and all its dependencies. This archive is the deployment package, which you can upload directly to AWS Lambda (if smaller than 10 MB) or via Amazon S3.
For example, to include the popular async[1] module in your Node.js function, you can run the following command in the same directory as your function source code:
1For more information on the “async” Node.js module, see https://github.com/caolan/async.
npm install async
To download and install Node.js, including npm, in your development environment, follow the instructions at https://nodejs.org/.

CREATING THE DEPLOYMENT PACKAGE
Your function must be in the root folder of the archive, so what I usually suggest is to zip from inside the folder that contains the function source code. Let's see how to package custom modules with a Lambda function, using a common use case: reacting to new or updated content on Amazon S3.

 

5.2. Subscribing functions to events

In event-driven applications, you want back-end logic to be executed in reaction to changes in data. For example, when users upload pictures and you want to show those pictures at different resolutions, you need to build thumbnails. The pictures probably have metadata, such as the photographer or a description, that you want to store in a database.

One option is to implement a front-end component that manages the upload and then processes the picture and the data according to a workflow that you define (figure 5.1). Alternatively, you can use features of Amazon S3 to store metadata together with the objects, and subscribe a function that processes the information in the metadata when an object is created or updated.
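User-defined metadata is attached to an S3 object at upload time (S3 stores it as x-amz-meta-* headers) and is returned as the Metadata property when the object is read. As a minimal sketch (the bucket and key names here are hypothetical), the parameters of an AWS SDK putObject call carrying such metadata could be assembled like this:

```javascript
// Assemble putObject parameters with user-defined metadata.
// S3 returns these values back as response.Metadata when the object is read.
function buildUploadParams(bucket, key, body, metadata) {
  return {
    Bucket: bucket,
    Key: key,
    Body: body,
    Metadata: metadata // all metadata values must be strings
  };
}

var params = buildUploadParams(
  'my-picture-bucket',   // hypothetical bucket name
  'images/sunset.jpg',
  null,                  // the image bytes would go here
  { title: 'Sunset', author: 'Jane', width: '300' }
);
console.log(params.Metadata.title); // Sunset
```

Note that metadata values are always strings, which is why the function in this chapter treats width and height from the metadata as it does.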

5.2.1. Creating the back-end resources

First, create an S3 bucket to use for this example. Bucket names are globally unique across all AWS accounts, so you need to choose a name that isn't already in use.



To store the metadata for the pictures, use Amazon DynamoDB. In the AWS console, choose DynamoDB from the Database section and then Create table. Use “images” as the Table name and “name” (a string) as the Partition key. Write down the Amazon Resource Name (ARN) of the table; you'll need the table ARN to give access to the resource with AWS IAM. For example:

arn:aws:dynamodb:us-east-1:655726309340:table/images
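If you prefer the command line to the console, the same resources can be provisioned with the AWS CLI. This is a sketch, assuming the CLI is already configured with your credentials; the bucket name is hypothetical and must be replaced with one that's free:

```
# Create the S3 bucket (names are globally unique; pick your own)
aws s3 mb s3://my-picture-bucket

# Create the DynamoDB table with "name" (a string) as the partition key
aws dynamodb create-table \
  --table-name images \
  --attribute-definitions AttributeName=name,AttributeType=S \
  --key-schema AttributeName=name,KeyType=HASH \
  --provisioned-throughput ReadCapacityUnits=1,WriteCapacityUnits=1
```

The create-table output includes the table ARN, so you can copy it from there instead of looking it up in the console.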

You'll now create a Lambda function subscribed to this bucket that runs every time an object with the images/ prefix is created or updated in the S3 bucket.
The S3 object metadata can optionally contain the following values:
  • width, the maximum width of the thumbnail in pixels
  • height, the maximum height of the thumbnail in pixels
  • author, the author of the picture
  • title, the title of the picture
  • description, a description of the picture
When triggered, the function will:
1.  Create a thumbnail of the picture and store it in the same bucket, adding the thumbs/ prefix at the beginning of the object key.
2.  Extract the metadata and store the information about the picture, including the link to the thumbnail, in the DynamoDB table you created.

5.2.2. Packaging the function

The code for the createThumbnailAndStoreInDB function in Node.js is in listing 5.1.
This function uses external modules, such as async, gm, and util, that aren’t included by default
in the AWS Lambda execution environment. You need to install them locally and then create a
deployment package.
var async = require('async');
var AWS = require('aws-sdk');
var gm = require('gm')
           .subClass({ imageMagick: true }); // Enable ImageMagick integration.
var util = require('util');
var DEFAULT_MAX_WIDTH  = 200;
var DEFAULT_MAX_HEIGHT = 200;
var DDB_TABLE = 'images';
var s3 = new AWS.S3();
var dynamodb = new AWS.DynamoDB();
function getImageType(key, callback) {
 var typeMatch = key.match(/\.([^.]*)$/);
 if (!typeMatch) {
     callback(`Could not determine the image type for key: ${key}`);
     return;
 }
 var imageType = typeMatch[1];
 if (imageType != "jpg" && imageType != "png") {
     callback(`Unsupported image type: ${imageType}`);
     return;
 }
 return imageType;
}
exports.handler = (event, context, callback) => {
 console.log("Reading options from event:\n",
   util.inspect(event, {depth: 5}));
 var srcBucket = event.Records[0].s3.bucket.name;
 var srcKey    = event.Records[0].s3.object.key;
 var dstBucket = srcBucket;
 var dstKey    = "thumbs/" + srcKey;
 var imageType = getImageType(srcKey, callback);
 async.waterfall([
   function downloadImage(next) {
     s3.getObject({
         Bucket: srcBucket,
         Key: srcKey
       },
       next);
     },
    function transformImage(response, next) {
      gm(response.Body).size(function(err, size) {
        if (err) {
          next(err);
          return;
        }
       var metadata = response.Metadata;
       console.log("Metadata:\n", util.inspect(metadata, {depth: 5}));
       var max_width;
       if ('width' in metadata) {
         max_width = metadata.width;
       } else {
         max_width = DEFAULT_MAX_WIDTH;
       }
       var max_height;
       if ('height' in metadata) {
         max_height = metadata.height;
       } else {
         max_height = DEFAULT_MAX_HEIGHT;
       }
       var scalingFactor = Math.min(
         max_width / size.width,
         max_height / size.height
       );
       var width  = scalingFactor * size.width;
       var height = scalingFactor * size.height;
       this.resize(width, height)
         .toBuffer(imageType, function(err, buffer) {
           if (err) {
             next(err);
           } else {
             next(null, response.ContentType, metadata, buffer);
           }
         });
     });
   },
   function uploadThumbnail(contentType, metadata, data, next) {
     // Stream the transformed image to a different S3 bucket.
     s3.putObject({
         Bucket: dstBucket,
         Key: dstKey,
         Body: data,
         ContentType: contentType,
         Metadata: metadata
     }, function(err, buffer) {
       if (err) {
         next(err);
       } else {
         next(null, metadata);
       }
     });
   },
   function storeMetadata(metadata, next) {
      // add the metadata to DynamoDB
     var params = {
       TableName: DDB_TABLE,
       Item: {
         name: { S: srcKey },
         thumbnail: { S: dstKey },
         timestamp: { S: (new Date().toJSON()).toString() },
       }
     };
     if ('author' in metadata) {
       params.Item.author = { S: metadata.author };
     }
     if ('title' in metadata) {
       params.Item.title = { S: metadata.title };
     }
     if ('description' in metadata) {
       params.Item.description = { S: metadata.description };
     }
     dynamodb.putItem(params, next);
   }], function (err) {
     if (err) {
       console.error(err);
     } else {
       console.log(
         'Successfully resized ' + srcBucket + '/' + srcKey +
         ' and uploaded to ' + dstBucket + '/' + dstKey
       );
     }
     callback();
   }
 );
};
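The resizing math in the listing preserves the image's aspect ratio by applying the same scaling factor to both dimensions, so the thumbnail fits within the maximum width and height without distortion. Isolated as a plain function (a sketch, independent of gm and AWS), the computation looks like this:

```javascript
// Compute thumbnail dimensions that fit within maxWidth x maxHeight
// while preserving the source aspect ratio (same math as in the listing).
function thumbnailSize(srcWidth, srcHeight, maxWidth, maxHeight) {
  var scalingFactor = Math.min(
    maxWidth / srcWidth,
    maxHeight / srcHeight
  );
  return {
    width: scalingFactor * srcWidth,
    height: scalingFactor * srcHeight
  };
}

// A 1000x500 image constrained to 200x200 scales to 200x100.
console.log(thumbnailSize(1000, 500, 200, 200)); // { width: 200, height: 100 }
```

Note that, as written, the scaling factor can be greater than 1, so images smaller than the maximum dimensions are scaled up.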

Create an index.js file with the content of the createThumbnailAndStoreInDB function. In the same directory containing the index.js file, use npm, the Node.js package manager, to install the required modules locally:

npm install async gm util

This creates a node_modules folder in the directory, containing a local installation of the modules. The next step is to create a ZIP file that contains the function and all its dependencies.


From within the directory containing the index.js file and the node_modules folder, create the deployment package by running the following command:

zip -9 -r ../createThumbnailAndStoreInDB-v1.0.zip *


5.2.3. Configuring permissions

The function needs permissions to:
  • Read objects with the images/ prefix from the S3 bucket.
  • Write objects with the thumbs/ prefix in the S3 bucket.
  • Put an item in the DynamoDB table you created.
To grant those permissions, create a managed policy and then attach the policy to the function's role. In the AWS IAM console, select Policies on the left and then Create Policy. Choose Create Your Own Policy, use CreateThumbnailAndStoreInDB as the Policy Name, and write a meaningful description; for example, “To read the source image, write the thumbnail, and store the metadata in the DB.”
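A policy document granting exactly those three permissions could look like the following sketch. Replace BUCKET_NAME with the name of your bucket; the table ARN is the one you wrote down earlier, shown here with the example account ID:

```
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject"],
      "Resource": "arn:aws:s3:::BUCKET_NAME/images/*"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject"],
      "Resource": "arn:aws:s3:::BUCKET_NAME/thumbs/*"
    },
    {
      "Effect": "Allow",
      "Action": ["dynamodb:PutItem"],
      "Resource": "arn:aws:dynamodb:us-east-1:655726309340:table/images"
    }
  ]
}
```

Scoping the S3 actions to the images/ and thumbs/ prefixes follows the least-privilege principle: the function can't read or overwrite anything else in the bucket.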



Next, choose Roles on the left and then Create New Role. Use lambda_createThumbnailAndStoreInDB as the Role Name and attach two policies to the role:
  • CreateThumbnailAndStoreInDB, the policy you just created
  • AWSLambdaBasicExecutionRole, the AWS managed policy that allows the function to write logs

5.2.4. Creating the function

In the Lambda console, choose Create a Lambda function.


Configure Amazon S3 as the trigger for this function: select the ellipse representing the source of the events and choose S3 from the list. Select the bucket you created as the Event source. As the Event type, choose Object Created, and in the Prefix field type images/ to trigger the function only for objects uploaded with that prefix.

You can't write the function code inline this time, because you have to bring several dependencies with you, as discussed in the previous section. Choose Upload a zip file and use the Upload button to select, in the file dialog, the zip archive you previously created.
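For reference, the event that Amazon S3 delivers to the function is a JSON document with a Records array; the handler reads the bucket name and object key from it, as you saw at the top of the listing. A trimmed sketch of such an event (bucket and key names are hypothetical):

```
{
  "Records": [
    {
      "eventSource": "aws:s3",
      "eventName": "ObjectCreated:Put",
      "s3": {
        "bucket": { "name": "my-picture-bucket" },
        "object": { "key": "images/sunset.jpg", "size": 1024 }
      }
    }
  ]
}
```

Keeping this shape in mind makes it easier to test the function from the Lambda console, where you can paste a sample event like this one instead of uploading a real object.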



