asilgag/aws-s3-incremental-deployer
Latest stable version: v2.0.1
Composer install command:
composer require asilgag/aws-s3-incremental-deployer
Package summary
Incremental and atomic high-performance deployments for AWS S3
README
Incremental and atomic high-performance deployments for AWS S3.
- incremental: using a fast diff algorithm, this package uploads only the files whose contents have changed.
- atomic: even though atomicity is not supported by AWS S3, this package orders uploads in the most optimal way to come as close as possible to the same result.
- high-performance: updates to sites with 100K files are usually deployed in ~10 seconds
Why not use aws s3 sync --delete?
aws s3 sync --delete uses two metrics to decide whether a file should be synced from source to destination:
- file size
- timestamp
This works well in most cases, but not when you are deploying a site generated by a Static Site Generator such as
Gatsby. Gatsby regenerates all files on each build, so their timestamps change and aws s3 sync --delete uploads
all files, since all of them appear "new".
This package, on the contrary, uses a smarter (and faster) strategy: it compares file contents to decide whether a file should be synced. Hence, if your site contains thousands of files but only 4 of them have changed, this package uploads only those 4 files.
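The content-based diff can be sketched in plain shell: build a sha1sum manifest of the current build, compare it against the manifest of the previous deploy, and keep only the paths whose checksums differ. This is an illustrative sketch, not the package's exact internals; the manifest file names are made up for the example.

```shell
# Illustrative sketch of a content-based diff (not this package's exact internals).
workdir=$(mktemp -d)
cd "$workdir"
mkdir site
printf 'v1' > site/index.html
printf 'same' > site/main.css

# Manifest of the previously deployed build: "checksum  path" lines.
(cd site && find . -type f | sort | xargs sha1sum) > old-manifest.txt

# Simulate a rebuild where only index.html actually changed.
printf 'v2' > site/index.html
(cd site && find . -type f | sort | xargs sha1sum) > new-manifest.txt

# Lines unique to the new manifest = new or modified files to upload.
sort old-manifest.txt > old.sorted
sort new-manifest.txt > new.sorted
comm -13 old.sorted new.sorted | awk '{print $2}'
# prints: ./index.html
```

Because the checksum changes whenever the content changes, a rebuilt-but-identical file produces the same manifest line and is skipped, which is exactly what timestamp-based sync cannot do.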
Installation
composer require asilgag/aws-s3-incremental-deployer
Dependencies
This package makes intensive use of the Unix shell. The following Unix commands MUST be present so this package can work as expected:
- aws (AWS CLI)
- find
- sort
- xargs
- sha1sum
Therefore, this package does not work on non-Unix operating systems (Windows, etc.).
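A quick preflight check for these dependencies can be written in a few lines of shell (a sketch, not part of the package's API):

```shell
# Verify that every Unix command this package shells out to is on PATH.
missing=""
for cmd in aws find sort xargs sha1sum; do
  command -v "$cmd" > /dev/null 2>&1 || missing="$missing $cmd"
done
if [ -n "$missing" ]; then
  echo "Missing required commands:$missing" >&2
else
  echo "All dependencies found"
fi
```

Running this before a deploy fails fast with a clear message instead of a mid-deploy shell error.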
ACL Permissions
The user performing the deployment MUST be granted the following ACL Permissions:
- READ
- WRITE
- READ_ACP
Incremental deploys
Instead of running an "aws s3 sync" command to synchronize a local folder with a bucket, this package uses a different strategy.
It uses the Unix command sha1sum to get the checksums of all files in a local folder, and then
compares them with those of the last deployed version hosted in the bucket. It then detects which files are new, edited
or deleted, and uploads them in the most "atomic" way possible:
- first, new assets and pages
- second, updated assets and pages
- third, homepage
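The three-phase ordering above can be sketched as a dry run that simply prints the upload plan; the file names and phase labels are illustrative, not the package's real output.

```shell
# Illustrative dry run of the three-phase upload order:
# new files first, updated files second, homepage last,
# so visitors never load a homepage that references missing assets.
new_files="assets/app.2.js page-2/index.html"
updated_files="assets/app.css page-1/index.html"
homepage="index.html"

for f in $new_files; do
  echo "upload (phase 1, new):     $f"
done
for f in $updated_files; do
  echo "upload (phase 2, updated): $f"
done
echo "upload (phase 3, homepage): $homepage"
```

Uploading the homepage last matters because it is the entry point: until it is replaced, visitors keep getting the old homepage, which still references the old (still present) assets.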
You should use a CDN (CloudFront, Akamai, etc.) with a proper cache policy to ensure no one accesses a broken version of your site while it is being uploaded.
Usage
use Asilgag\Aws\S3\AwsS3IncrementalDeployer;
use Asilgag\Aws\S3\Logger\MultiLogger;
// Use any PSR-3 compatible logger
$multiLogger = new MultiLogger('/path/to/log/file', TRUE);
// Create an AwsS3IncrementalDeployer object and configure its behaviour.
$deployer = new AwsS3IncrementalDeployer($multiLogger);
// Optional. Define an array of relative paths that shouldn't be uploaded.
$deployer->setExcludedPaths(['relative/path/to/exclude/in/all/cases/*']);
// Optional. Add options to the "aws" part of the command (--region, --profile, etc)
$deployer->getAwsCli()->getAwsOptions()->add('--region eu-west-1');
// Optional. Set environment for the command.
$deployer->getAwsCli()->setEnvironment('AWS_ACCESS_KEY_ID', '***');
$deployer->getAwsCli()->setEnvironment('AWS_SECRET_ACCESS_KEY', '***');
// Execute deploy operation.
try {
  $deployer->deploy('/path/to/local/site', 'bucket-name');
}
catch (RuntimeException $e) {
  // Do some logging.
}
Statistics
- Total downloads: 9.23k
- Monthly downloads: 0
- Daily downloads: 0
- Favorites: 0
- Hits: 3
- Dependents: 0
- Suggesters: 0
Other information
- License: MIT
- Last updated: 2020-03-05