Extra state secret when using the kubernetes backend #209
I believe this is caused by the way the terraform CLI is implemented. If I remember correctly we have to run … We don't use the `default` workspace for anything; it's just an artifact of the way the CLI works. Is there a problem with having it there? I don't know that we can delete it permanently, since it will come back every time the terraform CLI is executed.
It's not really a problem, just confusing, and I don't think it's documented. I do wonder, though, what would happen if I named a workspace "default".
I think it would just use the default workspace. The problem would be if you have more than one Workspace with `crossplane.io/external-name` set to "default", because that's where we get the workspace name from. That's true for any external-name: since it's not unique, it would cause collisions between Workspaces/workspaces. We should probably change that to use …
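To illustrate the collision described above, here is a hypothetical sketch of a Workspace that forces its terraform workspace name via the external-name annotation. The metadata name, module body, and `apiVersion` are assumptions for illustration and may differ by provider version:

```yaml
apiVersion: tf.upbound.io/v1beta1
kind: Workspace
metadata:
  name: my-workspace            # hypothetical name
  annotations:
    # The provider derives the terraform workspace name from the
    # external-name, so two Workspace resources both annotated
    # "default" would collide on the same terraform workspace and
    # the same state secret.
    crossplane.io/external-name: default
spec:
  forProvider:
    source: Inline
    module: |
      output "greeting" {
        value = "hello"
      }
```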
I did just try out making a workspace called "default".
It will not try to delete the …
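For context on where the extra secret's name comes from: terraform's kubernetes backend stores state in Secrets named `tfstate-<workspace>-<secret_suffix>`, and the built-in `default` workspace always exists, which yields the `tfstate-default-...` secret seen in this issue. A minimal sketch of such a backend configuration (the `secret_suffix` value here is an assumption inferred from the secret name reported below):

```hcl
terraform {
  backend "kubernetes" {
    # The resulting Secret is named "tfstate-<workspace>-<secret_suffix>",
    # e.g. "tfstate-default-providerconfig-default" for the built-in
    # "default" workspace.
    secret_suffix     = "providerconfig-default"
    in_cluster_config = true
  }
}
```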
What happened?
I have the terraform provider installed and configured to use kubernetes in-cluster config as a state store. This seems to work great, and I get a state file for my Workspaces. However, once I apply a Workspace (`terraform-test`), I also get a state file secret that says it is from a non-existent `default` Workspace. This seems to happen the first time I apply a workspace with a new ProviderConfig. And if I delete my workspace, the `tfstate-default-providerconfig-default` secret remains.

How can we reproduce it?
`aws-irsa`
DeploymentRuntimeConfig not included

What environment did it happen in?