[Feature Request]: provider profile parameter #358
Comments
Hi, are you only interested in the "profile" parameter? I interpret your request to mean that you would like our provider to read credentials from the exoscale.toml configuration file. Could you also share more details on the challenges you face when using our provider with Atlantis, please?
Hi Philipp, yes, you understood correctly. I need to use the profiles stored in ~/.config/exoscale/exoscale.toml. In Exoscale we have multiple organizations, each with its own IAM keys, and I need to manage resources for several of them within the same Terraform plan. For Atlantis on SKS, one way to provide secrets is to set Helm values like this.
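The original Helm snippet did not survive in this thread; a minimal sketch of what such values might look like follows. The key names are assumptions and vary between Atlantis chart versions, so treat this as illustrative only:

```yaml
# Illustrative Atlantis Helm values: inject Exoscale credentials as
# environment variables (key names may differ by chart version).
environment:
  EXOSCALE_API_KEY: EXOxxxxxxxxxxxxxxxxxxxxxxxx
  EXOSCALE_API_SECRET: "my-secret"
```

Note the limitation this illustrates: environment variables can only carry one key/secret pair at a time.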
Terraform offers the possibility to use aliases for providers; for example, we can have multiple AWS resources using different profiles.
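For reference, provider aliasing with the AWS provider looks like this; the Exoscale block at the end is hypothetical, since a `profile` argument is exactly the feature being requested here:

```hcl
# Real AWS pattern: two provider configurations distinguished by alias,
# each reading a different named profile from ~/.aws/credentials.
provider "aws" {
  alias   = "org_a"
  profile = "org-a"
}

provider "aws" {
  alias   = "org_b"
  profile = "org-b"
}

resource "aws_s3_bucket" "logs" {
  provider = aws.org_b # select the aliased configuration per resource
  bucket   = "logs-org-b"
}

# Hypothetical Exoscale equivalent -- the `profile` argument does not
# exist in the provider today; this is the requested feature.
provider "exoscale" {
  alias   = "org_a"
  profile = "org-a"
}
```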
Using environment variables restricts us to a single set of credentials. Does that make sense?
Any update? Is there a chance to get this implemented? Since you already have a ~/Library/Application\ Support/exoscale/exoscale.toml profiles file, you could do the same as AWS does.
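For context, the exoscale.toml profiles file referenced above typically looks something like the following. The field names are based on the exo CLI configuration format and should be verified against your installed version:

```toml
# Illustrative exoscale.toml layout with one account per organization.
defaultaccount = "org-a"

[[accounts]]
  name   = "org-a"
  key    = "EXOxxxxxxxxxxxxxxxxxxxxxxxx"
  secret = "..."

[[accounts]]
  name   = "org-b"
  key    = "EXOyyyyyyyyyyyyyyyyyyyyyyyy"
  secret = "..."
```

A `profile` provider argument could then select one of these named accounts per aliased provider block, mirroring the AWS workflow.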
Hey, thanks again for your interest in this. We are discussing internally how to better standardize configuration for our tooling. It's not our highest-priority task at the moment, so I can't give an ETA, but this is planned.
What feature would you like to have in the provider?
The following provider-level settings are supported, either via HCL parameters or environment variables.
It would be useful to have a profile parameter, similar to what AWS provides.
Using the Exoscale Terraform provider with Atlantis is challenging when managing multiple IAM keys for different environments.