I experimented with the Webpacker configuration and `add_local_files`, but our circumstances required a bit of tweaking. In particular, we have multiple web servers and want to avoid multiple asset compilation jobs, and we need to work with an existing CDN configuration that looks for everything under the pathname `/assets`.

This Capistrano task assumes there is a single server with the `assets` role and one or more with the `web` role. After asset compilation, this task downloads the Sprockets manifest `.sprockets-<fingerprint>.json` and the Webpacker manifest `manifest.json`, and uploads them to all the web servers.
```ruby
# config/deploy.rb
set :manifest_dir_sprockets, -> { File.join(current_path, 'public', 'assets') }
set :manifest_dir_webpacker, -> { File.join(current_path, 'public', 'packs') }
set :manifest_location_sprockets, -> { File.join(fetch(:manifest_dir_sprockets), '.sprockets-*.json') }
set :manifest_location_webpacker, -> { File.join(fetch(:manifest_dir_webpacker), 'manifest.json') }

after :compile_assets, :copy_manifests

desc 'Download manifest files from asset compilation server and upload them to other web servers'
task :copy_manifests do
  on roles(:assets) do
    within(shared_path) do
      %i[sprockets webpacker].each do |type|
        # Remove any manifest left over from a previous run (guarded, since
        # the path is unset on the first run), then resolve the newest
        # manifest on disk: `ls -t` sorts by modification time, newest first.
        execute(:rm, '-f', fetch(:"manifest_path_#{type}")) if fetch(:"manifest_path_#{type}")
        set(:"manifest_path_#{type}", capture(:ls, '-t', fetch(:"manifest_location_#{type}"), '|', 'head', '-n1'))
        set(:"manifest_filename_#{type}", fetch(:"manifest_path_#{type}").split('/').last)
        download!(fetch(:"manifest_path_#{type}"), "tmp/#{fetch(:"manifest_filename_#{type}")}")
      end
    end
    within(current_path) do
      rake('webpacker:upload')
    end
  end

  on roles(:all) do
    within(shared_path) do
      %i[sprockets webpacker].each do |type|
        # Clear out stale manifests, then push the freshly downloaded one.
        execute(:rm, '-rf', File.join(fetch(:"manifest_dir_#{type}"), '*'))
        upload!("tmp/#{fetch(:"manifest_filename_#{type}")}", fetch(:"manifest_path_#{type}"))
      end
    end
  end
end
```
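The `ls -t … | head -n1` pipeline above simply picks the most recently modified manifest matching the glob. Purely for illustration (this is not part of the deploy flow, and `newest_manifest` is a made-up helper), the equivalent lookup in plain Ruby would be something like:

```ruby
require 'tmpdir'

# Illustrative only: a local equivalent of `ls -t <glob> | head -n1`.
# `manifest_dir` is a hypothetical directory holding fingerprinted
# Sprockets manifests such as .sprockets-abc123.json.
def newest_manifest(manifest_dir)
  Dir.glob(File.join(manifest_dir, '.sprockets-*.json'))
     .max_by { |path| File.mtime(path) }
end

Dir.mktmpdir do |dir|
  old_manifest = File.join(dir, '.sprockets-old.json')
  new_manifest = File.join(dir, '.sprockets-new.json')
  File.write(old_manifest, '{}')
  File.write(new_manifest, '{}')
  # Backdate the first file so the second is unambiguously newer.
  File.utime(Time.now - 60, Time.now - 60, old_manifest)
  puts newest_manifest(dir) == new_manifest
end
```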
The Rake task invoked above uploads the compiled asset packs to our desired location on S3. It also rewrites the manifest itself so that each filename is prefixed with `assets/`.
```ruby
# lib/tasks/webpacker.rake
namespace :webpacker do
  desc 'Upload webpacker files to S3 asset host'
  task :upload do
    s3 = Aws::S3::Client.new
    file = Rails.root.join('public', 'packs', 'manifest.json')
    payload = JSON.parse(File.read(file))
    payload.each do |key, filename|
      # Strip the leading slash, then key the object under the assets/
      # prefix the CDN expects.
      filename.gsub!(%r{\A/}, '')
      target = "assets/#{filename}"
      payload[key] = target
      # Skip the upload if the fingerprinted object is already on S3.
      next if s3.list_objects(bucket: S3_ASSETS_BUCKET, prefix: target).contents.any?

      body = File.read(Rails.root.join('public', filename.gsub('_/', '../')))
      s3.put_object(acl: 'public-read', body: body, bucket: S3_ASSETS_BUCKET, key: target)
    end
    # Write the rewritten manifest back so the app serves the CDN paths.
    File.open(file, 'w') { |f| f.puts(payload.to_json) }
  end
end
```
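The manifest rewrite itself is a small, self-contained transformation. A minimal sketch with made-up entries (the keys and digests below are invented for illustration and do not come from a real build):

```ruby
require 'json'

# Hypothetical webpacker manifest entries; real digests will differ.
manifest = {
  'application.js'  => '/packs/js/application-5d0bb4b2.js',
  'application.css' => '/packs/css/application-9a1c3f77.css'
}

# Strip the leading slash and prefix each path with assets/,
# matching what the rake task above does in place on the parsed JSON.
rewritten = manifest.transform_values { |path| "assets/#{path.sub(%r{\A/}, '')}" }

puts rewritten['application.js'] # => assets/packs/js/application-5d0bb4b2.js
```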
It's a bit brittle and I'd like it to be more configurable and transparent, but it does what's required.