One Hour to Save the Data

I needed to build a disaster recovery solution. The goal was to get DynamoDB data backed up to a different region on a continuous basis. Not a one-off export. Something reliable that runs on its own.

The approach

DynamoDB has an export API that moves table data into an S3 bucket, and that bucket can live in a different region. We could have run exports manually, but we needed something reliable and continuous, or at least as close to that as possible.
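The export call itself is small. Here's a minimal sketch of what it might look like with boto3, assuming point-in-time recovery is enabled on the table (the export API requires it). The function names are mine, not the actual implementation:

```python
def build_export_request(table_arn, bucket, prefix):
    """Build the request for DynamoDB's ExportTableToPointInTime API.

    The destination bucket can be in a different region than the table,
    which is what makes this work for cross-region disaster recovery.
    """
    return {
        "TableArn": table_arn,
        "S3Bucket": bucket,
        "S3Prefix": prefix,
        "ExportFormat": "DYNAMODB_JSON",  # or "ION"
    }


def run_export(table_arn, bucket, prefix):
    # boto3 is available in the Lambda runtime; imported lazily here so the
    # module can be loaded and unit-tested without the AWS SDK installed.
    import boto3

    client = boto3.client("dynamodb")
    # Requires point-in-time recovery (PITR) enabled on the source table.
    return client.export_table_to_point_in_time(
        **build_export_request(table_arn, bucket, prefix)
    )
```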

I decided to deploy a Lambda function that runs on a schedule and exports the tables into an S3 bucket. It had to accept a list of tables and a destination. Ideally the solution would be repeatable for future applications (which, as it turned out, we needed the very next day). A config-driven approach deployed via a pipeline seemed ideal.
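Stitched together, the scheduled Lambda might look something like the sketch below. The environment variable names and the helper are assumptions of mine, not the actual implementation:

```python
import os
from datetime import datetime, timezone


def plan_exports(table_arns, bucket):
    """Build one export request per table, keyed under a date-stamped prefix."""
    stamp = datetime.now(timezone.utc).strftime("%Y-%m-%d")
    return [
        {
            "TableArn": arn,
            "S3Bucket": bucket,
            # Table name is the last segment of the ARN; one prefix per day.
            "S3Prefix": f"{stamp}/{arn.split('/')[-1]}",
            "ExportFormat": "DYNAMODB_JSON",
        }
        for arn in table_arns
    ]


def handler(event, context):
    # Lazy import: boto3 ships with the Lambda runtime, and this keeps the
    # module importable for local tests without the AWS SDK.
    import boto3

    client = boto3.client("dynamodb")
    # Hypothetical env vars the CDK stack would wire in from the config.
    tables = os.environ["TABLE_ARNS"].split(",")
    bucket = os.environ["EXPORT_BUCKET"]
    return [
        client.export_table_to_point_in_time(**req)["ExportDescription"]["ExportArn"]
        for req in plan_exports(tables, bucket)
    ]
```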

Building it

I created a CDK project and told Kiro to start a spec. I needed a Lambda function, an S3 bucket, IAM roles, and a schedule. Some alarms and monitoring would be nice too, but nothing crazy. Oh yeah, and I needed the inputs (table names, buckets) to come from a config, so the CDK stack could be deployed over and over with different parameters.
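As a sketch of what config-driven means here, the stack could read a small JSON file at synth time and refuse to deploy if it's incomplete. The field names below are illustrative, not the actual schema:

```python
import json

# Hypothetical required fields for one application's backup config.
REQUIRED_KEYS = {"tableArns", "exportBucket", "scheduleCron"}


def load_backup_config(text):
    """Parse and validate one app's backup config.

    The CDK stack would read these values at synth time, so the same
    stack can be deployed repeatedly with different parameters.
    """
    config = json.loads(text)
    missing = REQUIRED_KEYS - config.keys()
    if missing:
        raise ValueError(f"config missing keys: {sorted(missing)}")
    return config
```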

Writing a spec with Kiro is an iterative process. It starts out getting some things right and some wrong. You read the spec and make corrections, and when you're happy you tell it to go implement. Then you check the implementation, and most of the time it's good; sometimes you tweak a few spots. The spec took a few minutes to write, and so did the implementation. Then a quick deployment to preprod to make sure the whole thing worked, a code review, sign-off, and a manual deployment to production to get it running while the pipeline did its thing. Total time: about an hour, maybe 90 minutes.

It spread

I shared it with my team, and it became the default for other teams too. We deployed it to another region the following day.

To onboard a team, they add their account IDs, table names, and desired S3 bucket names to the config. The pipeline does the rest.
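An onboarding entry might look something like this (the field names and values are illustrative, not the actual schema):

```json
{
  "team-orders": {
    "accountId": "111122223333",
    "tableArns": ["arn:aws:dynamodb:us-east-1:111122223333:table/Orders"],
    "exportBucket": "orders-dr-backups",
    "scheduleCron": "cron(0 2 * * ? *)"
  }
}
```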

What I did vs. what the AI did

Kiro acted like a personal assistant. It took my thoughts and turned them into a spec, organized them (I can be all over the place sometimes), and when I was ready it implemented everything, confirming its work with a successful build.

Kiro didn’t know we needed cross-region replication. It didn’t know which tables we needed to save. And it wouldn’t have known that we wanted to build this future-proof and configuration-driven. All of that came from the experience of the human operator, but a human could not have built something this sophisticated this quickly, or at least this one couldn’t have.

One hour

With the help of AI I was able to build something quickly and with minimal issues. Generally those two things are in tension. It worked because of the friendly ghost (Kiro) combined with the experience of building applications and knowing the underlying systems.

AI is good at exactly that. And if you know how to use it, that’s the difference between having a solution in an hour and still working on it at the end of the week.