
Figure out a long-term deployment mechanism that balances the benefits of shared hosting and modern automation #4

Open
lgarron opened this issue Jan 9, 2025 · 0 comments

lgarron commented Jan 9, 2025

Moved from: #3

Right now we mainly have two kinds of deployment:

  • From GitHub to GitHub Pages.
  • From local development to Dreamhost.

This is because Dreamhost does not support per-project access controls. If we want to deploy from GitHub with deployment credentials that have limited access (read: cannot mess with other deployments, whether by accident or due to malicious compromise), we'd have to:

  • Create separate user credentials for every separately deployed project, which I don't think can be done programmatically.
  • Separate any projects that currently deploy to a shared domain (e.g. experiments.cubing.net).

I think that this would make debugging unnecessarily painful.

That said, Dreamhost has been stable for serving files[^1] and gives us flexibility for some things we might need in the future[^2]. It's also much, much faster to deploy to than GCE or AWS Lambda, and I'd prefer to avoid any flashy hosting service that might disappear or sell us out over a 10-year timespan.

I think a medium-term approach might be to have GitHub Actions trigger a simple cgi-bin setup on Dreamhost that pulls and deploys an artifact from GitHub Pages. As a long-term solution, I'd love to find something that's similar to Dreamhost but based on Caddy. I'm reluctant to self-host, as much as we have the resources and skills to do so, since that can more easily lead to uptime or maintenance challenges compared to shared hosting[^3].
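To make the cgi-bin idea concrete, here's a minimal sketch of what the Dreamhost side could look like. This is not a decided design: the secret mechanism, repository, branch, and target paths are all placeholders, and it pulls a tarball of a hypothetical `gh-pages` branch rather than an actual Pages build artifact.

```python
#!/usr/bin/env python3
# Hypothetical cgi-bin endpoint on Dreamhost: a GitHub Actions step would POST
# to this script (with a shared secret) after a Pages build, and the script
# pulls the published branch and unpacks it into the site directory.
# All names, paths, URLs, and the secret mechanism below are placeholders.

import hmac
import os
import shutil
import sys
import tarfile
import tempfile
import urllib.request

SECRET = os.environ.get("DEPLOY_SECRET", "")  # configured outside the repo
TARBALL_URL = "https://github.com/cubing/example-project/archive/refs/heads/gh-pages.tar.gz"
TARGET_DIR = "/home/example-user/experiments.cubing.net/example-project"


def respond(status: str, body: str) -> None:
    # Minimal CGI response.
    sys.stdout.write(f"Status: {status}\r\nContent-Type: text/plain\r\n\r\n{body}\n")


def main() -> None:
    # The deploying workflow would send the secret as an `X-Deploy-Secret` header,
    # which CGI exposes as the HTTP_X_DEPLOY_SECRET environment variable.
    provided = os.environ.get("HTTP_X_DEPLOY_SECRET", "")
    if not SECRET or not hmac.compare_digest(provided, SECRET):
        respond("403 Forbidden", "bad or missing deploy secret")
        return

    # Download the branch tarball and unpack it into a staging directory,
    # then swap it in so the site never serves a half-written tree.
    with tempfile.TemporaryDirectory() as tmp:
        archive_path = os.path.join(tmp, "site.tar.gz")
        with urllib.request.urlopen(TARBALL_URL) as response, open(archive_path, "wb") as f:
            shutil.copyfileobj(response, f)
        with tarfile.open(archive_path, "r:gz") as archive:
            archive.extractall(tmp)
        # GitHub branch tarballs unpack into a single "<repo>-<branch>" directory.
        extracted = next(
            os.path.join(tmp, entry)
            for entry in os.listdir(tmp)
            if os.path.isdir(os.path.join(tmp, entry))
        )
        staging = TARGET_DIR + ".new"
        shutil.rmtree(staging, ignore_errors=True)
        shutil.copytree(extracted, staging)
        shutil.rmtree(TARGET_DIR, ignore_errors=True)
        os.rename(staging, TARGET_DIR)

    respond("200 OK", "deployed")


if __name__ == "__main__":
    main()
```

On the GitHub Actions side this would just be an HTTP POST with the shared secret after the Pages build. The point is that the secret only lets that one project redeploy itself, which is the per-project boundary that Dreamhost user accounts can't express.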

Footnotes

[^1]: If you overlook their atrocious track record of failing to renew HTTPS certificates. But at least I have a monitoring service for that.

[^2]: Like a simple database or simple site templating. The latter is unfortunately necessary due to badly designed things like the Open Graph protocol; there's a rough sketch of what that templating means after these footnotes. This could be solved by a cloud worker using something like the HTMLRewriter API, but that requires additional configuration and maintenance.

[^3]: Say what you want about Dreamhost, but https://live.deprecated.cubing.net/StanfordFall2011/ is still running perfectly over 13 years later without any human maintenance.
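To make footnote 2 concrete, here's a rough sketch (in Python, not the HTMLRewriter API, and with made-up placeholder names and example data) of the kind of per-URL templating that Open Graph forces: crawlers don't execute JavaScript, so the og:* tags have to be baked into the HTML served for each URL.

```python
# Rough illustration of the "simple site templating" from footnote 2: Open Graph
# tags have to be present in the served HTML (crawlers don't run JavaScript), so
# each URL needs its metadata stamped into an otherwise static page.
# The placeholder token and the page data below are made up for illustration.

from html import escape

OG_TEMPLATE = """\
<meta property="og:title" content="{title}">
<meta property="og:description" content="{description}">
<meta property="og:url" content="{url}">
"""


def render_page(template_html: str, page: dict) -> str:
    """Replace a single placeholder in a static template with per-URL OG tags."""
    og_tags = OG_TEMPLATE.format(
        title=escape(page["title"], quote=True),
        description=escape(page["description"], quote=True),
        url=escape(page["url"], quote=True),
    )
    return template_html.replace("<!-- OG_TAGS -->", og_tags)


if __name__ == "__main__":
    template = "<html><head><!-- OG_TAGS --></head><body>…</body></html>"
    print(render_page(template, {
        "title": "Stanford Fall 2011",
        "description": "Live competition results",
        "url": "https://live.deprecated.cubing.net/StanfordFall2011/",
    }))
```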
