The serverless landscape just got massively expanded
Container Image Support has just been announced for AWS Lambda and it’s a pretty big deal — I’m very excited because it’s something I’ve wanted for years!
I maintain a distribution of thousands of packages called yumda that I created specifically to deal with the problem of bundling native binaries and libraries for Lambda — I’m happy to now say that AWS has essentially made this project redundant 😄
To be clear, what’s been announced is not actually Lambda running Docker per se — it’s specifically using container images as the packaging…
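To give a feel for the packaging model, here's a minimal sketch of a function image — assuming AWS's published Node.js base image and a hypothetical `app.js` that exports a `handler` function:

```dockerfile
# Build on AWS's Lambda base image, which includes the runtime interface client
FROM public.ecr.aws/lambda/nodejs:12

# Copy the function code into the image's task directory
COPY app.js ./

# Tell the runtime which handler to invoke: <file>.<exported function>
CMD ["app.handler"]
```

The image is pushed to ECR and referenced when creating the function — the container image is the deployment package, not a general-purpose Docker host.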
AWS Solutions Architects hate him.
AWS launched Provisioned Concurrency for Lambda at re:Invent 2019 last week — essentially a way to keep warm Lambdas provisioned for you so you don’t experience any cold start latency in your function invocations. It also may save you money if you happen to have the ideal workload for it, as it’s priced at $0.05/hr (for 1 GB of memory) instead of the usual $0.06/hr.
This theoretical 16.67% saving is not what this article’s about though — it was only as I was exploring this new feature that I was reminded of an interesting factor…
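The back-of-envelope arithmetic behind that figure, assuming a fully utilized 1 GB function at the per-GB-second prices current when this was written:

```javascript
// On-demand Lambda duration price vs Provisioned Concurrency
// (provisioning charge + reduced duration charge), per GB-second
const onDemandPerGbSec = 0.0000166667;
const provisionedPerGbSec = 0.0000041667 + 0.0000097222;

// Per GB-hour this works out to roughly $0.06 vs $0.05
const saving = 1 - provisionedPerGbSec / onDemandPerGbSec;
console.log((saving * 100).toFixed(2) + '%'); // → 16.67%
```

The catch, of course, is "fully utilized" — you only see this saving if your provisioned instances are actually busy.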
Sometime in the last few days, docker pulls of lambci/lambda hit 35 million.
Which is more than twice what it was six months ago.

I don’t know where serverless is on the hype cycle these days, but if local testing and building of Lambda services is anything to go by, doubling every six months ain’t bad 📈
I have no doubt docker-lambda’s growth is fueled by Amazon’s decision to use it as the base of their AWS SAM CLI tool for local testing — as well as tools like Serverless Framework, localstack and many others.
Part of its appeal…
A month ago I dug into the nodejs10.x runtime and highlighted some issues with it — including some bugs and style problems. I’m glad to now report that all the issues I raised have been addressed in the latest runtime code, which should be running on all nodejs10.x Lambdas as of the time of writing.
There are also a couple of changes that will break functions that use relative handler paths or rely on logs preserving newlines — I cover these further down.
To summarize briefly how the issues were addressed:
the bootstrap processes are now named correctly,
Update 2019–06–25: All of the issues I raise here have been addressed in the latest runtime! I’ll leave this story here for posterity, but it no longer reflects the current state of affairs. See my latest blog post on how AWS addressed the issues I raise here.
Last week AWS announced official support for Node.js v10 on Lambda, which is great! Or at least, it will be once it stabilizes a bit… Here’s what I found after digging into the code.
I run the docker-lambda project which allows you to execute a docker container that’s a replica of the Lambda…
I’m excited to share a hyperparameter optimization method we use at Bustle to train text classification models on AWS Lambda incredibly quickly — an implementation of the recently released Asynchronous Successive Halving Algorithm paper by Liam Li et al, which proved more effective than Google’s own internal Vizier tool. We extend this method using evolutionary algorithm techniques to fine-tune likely candidates as the training progresses.
(Coincidentally, there’s a talk on the ASHA paper at the AWS Loft in NYC tonight, 7 Feb)
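For intuition, here's a minimal sketch of the synchronous successive-halving idea that ASHA builds on (ASHA removes the synchronization barriers, and this is not Bustle's actual implementation). The function names and parameters are illustrative: sample `n` random configs, evaluate on a small budget, keep the top `1/eta` fraction, and repeat with `eta` times the budget:

```javascript
// Synchronous successive halving: cheap evaluations eliminate most
// configs early, so the full budget is spent only on promising ones.
function successiveHalving(sample, evaluate, { n = 27, eta = 3, minBudget = 1 } = {}) {
  let configs = Array.from({ length: n }, sample);
  let budget = minBudget;
  while (configs.length > 1) {
    // Score every surviving config at the current budget, best first
    const scored = configs
      .map((c) => ({ c, score: evaluate(c, budget) }))
      .sort((a, b) => b.score - a.score);
    // Promote the top 1/eta fraction to the next, larger budget
    configs = scored.slice(0, Math.max(1, Math.floor(configs.length / eta))).map((s) => s.c);
    budget *= eta;
  }
  return configs[0];
}
```

ASHA's contribution is doing these promotions asynchronously, which is exactly what makes it a natural fit for thousands of short-lived Lambda invocations.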
LambCI is a tool I began building over a year ago to run tests on our pull requests and branches at Uniqlo Mobile. Inspired by the inaugural ServerlessConf a few weeks ago, I recently put some work into hammering it into shape for public consumption.
It was born of a dissatisfaction with the two current choices for automated testing on private projects. You can either pay for it as a service (Travis, CircleCI, etc) — where 3 developers needing their own…