
feat(cli): reduce redundancy on context to cluster flags in command deployment create #1156

Open
wants to merge 1 commit into base: main

Conversation

instamenta
Contributor

Description

Removes the --context-cluster flag and its usages, replacing them with the data provided by the local config's deployments.clusters.
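
For context, a minimal sketch (illustrative names, not the actual solo API) of what replacing the flag with local config data means: the cluster list is resolved from the deployments section of local-config.yaml, whose shape is shown later in this conversation.

// Sketch only; LocalConfigData is an assumed shape mirroring local-config.yaml,
// not solo's real LocalConfig class.
interface LocalConfigData {
  currentDeploymentName: string;
  deployments: Record<string, {clusters: string[]}>;
}

// Resolve the cluster list from the local config instead of --context-cluster.
function getDeploymentClusters(config: LocalConfigData, deploymentName?: string): string[] {
  const name = deploymentName ?? config.currentDeploymentName;
  const deployment = config.deployments[name];
  if (!deployment) {
    throw new Error(`deployment '${name}' is not defined in the local config`);
  }
  return deployment.clusters;
}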

Related Issues

…data from flag.deploymentClusters and parsed data inside local config

Signed-off-by: instamenta <[email protected]>
instamenta self-assigned this Jan 14, 2025
instamenta requested review from leninmehedy and a team as code owners January 14, 2025 11:33

Unit Test Results - Linux

  1 files  ±0   58 suites  ±0   3s ⏱️ ±0s
227 tests ±0  227 ✅ ±0  0 💤 ±0  0 ❌ ±0 
232 runs  ±0  232 ✅ ±0  0 💤 ±0  0 ❌ ±0 

Results for commit 446b8c9. ± Comparison against base commit 1cf5893.


Unit Test Results - Windows

  1 files  ±0   58 suites  ±0   12s ⏱️ -1s
227 tests ±0  227 ✅ ±0  0 💤 ±0  0 ❌ ±0 
232 runs  ±0  232 ✅ ±0  0 💤 ±0  0 ❌ ±0 

Results for commit 446b8c9. ± Comparison against base commit 1cf5893.


github-actions bot commented Jan 14, 2025

E2E Test Report

 17 files  126 suites   1h 30m 12s ⏱️
258 tests 258 ✅ 0 💤 0 ❌
269 runs  269 ✅ 0 💤 0 ❌

Results for commit 446b8c9.

♻️ This comment has been updated with latest results.


Coverage summary from Codacy

See diff coverage on Codacy

Coverage variation                  Diff coverage
Report missing for 1cf5893 [1]      66.67%

Coverage variation details
                                    Coverable lines   Covered lines    Coverage
Common ancestor commit (1cf5893)    Report Missing    Report Missing   Report Missing
Head commit (446b8c9)               21104             17707            83.90%

Coverage variation is the difference between the coverage for the head and common ancestor commits of the pull request branch: <coverage of head commit> - <coverage of common ancestor commit>

Diff coverage details
                        Coverable lines   Covered lines   Diff coverage
Pull request (#1156)    3                 2               66.67%

Diff coverage is the percentage of lines that are covered by tests out of the coverable lines that the pull request added or modified: <covered lines added or modified>/<coverable lines added or modified> * 100%
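
For this pull request, that is 2 covered lines out of 3 coverable changed lines: 2 / 3 × 100% ≈ 66.67%, matching the diff coverage reported above.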


Codacy stopped sending the deprecated coverage status on June 5th, 2024.

Footnotes

  1. Codacy didn't receive coverage data for the commit, or there was an error processing the received data. Check your integration for errors and validate that your coverage setup is correct.


codecov bot commented Jan 14, 2025

Codecov Report

Attention: Patch coverage is 66.66667% with 1 line in your changes missing coverage. Please review.

Project coverage is 83.13%. Comparing base (9bc6fcd) to head (446b8c9).
Report is 2 commits behind head on main.

Files with missing lines           Patch %   Lines
src/core/config/local_config.ts    0.00%     1 Missing ⚠️
Additional details and impacted files

Impacted file tree graph

@@            Coverage Diff             @@
##             main    #1156      +/-   ##
==========================================
- Coverage   83.37%   83.13%   -0.24%     
==========================================
  Files          77       77              
  Lines       20809    21104     +295     
  Branches     1717     1460     -257     
==========================================
+ Hits        17349    17545     +196     
- Misses       3363     3485     +122     
+ Partials       97       74      -23     
Files with missing lines                           Coverage Δ
src/commands/flags.ts                              75.62% <ø> (+0.35%) ⬆️
src/core/config/remote/remote_config_manager.ts    81.18% <100.00%> (-0.07%) ⬇️
src/core/templates.ts                              77.77% <100.00%> (+5.43%) ⬆️
src/core/config/local_config.ts                    67.07% <0.00%> (ø)

... and 33 files with indirect coverage changes


subTasks.push({
  title: `Testing connection to cluster: ${chalk.cyan(cluster)}`,
  task: async (_, task) => {
    if (!(await self.k8.testClusterConnection(cluster))) {

We need to pass the context, not the cluster. We need to look it up prior to passing it.

rm -Rf ~/.solo
npm run solo -- init
npm run solo -- node keys --gossip-keys --tls-keys --node-aliases-unparsed node1,node2
npm run solo -- deployment create -n jeromy --email [email protected] --deployment-clusters solo-e2e
❯ cat local-config.yaml
userEmailAddress: [email protected]
deployments:
  jeromy:
    clusters:
      - solo-e2e
currentDeploymentName: jeromy
clusterContextMapping:
  solo-e2e: kind-solo-e2e
******************************* Solo *********************************************
Version                 : 0.33.0
Kubernetes Context      : kind-solo-e2e
Kubernetes Cluster      : kind-solo-e2e
Kubernetes Namespace    : jeromy
**********************************************************************************
✔ Initialize
  ✔ Acquire lease - lease acquired successfully, attempt: 1/10
↓ Prompt local configuration
❯ Validate cluster connections
  ✖ No active cluster!
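
A minimal sketch of the lookup described above, with assumed interfaces rather than solo's real classes: resolve the kube context for the cluster alias from the local config's clusterContextMapping before testing the connection.

// Sketch only; LocalConfigLike and K8Like are assumed shapes, not solo's real types.
interface LocalConfigLike {
  clusterContextMapping: Record<string, string>;
}

interface K8Like {
  testClusterConnection(context: string): Promise<boolean>;
}

async function testClusterByAlias(localConfig: LocalConfigLike, k8: K8Like, cluster: string): Promise<boolean> {
  // Look up the context (e.g. kind-solo-e2e) for the cluster alias (e.g. solo-e2e).
  const context = localConfig.clusterContextMapping[cluster];
  if (!context) {
    throw new Error(`no context is mapped for cluster '${cluster}' in the local config`);
  }
  // Pass the resolved context, not the cluster alias, to the connection test.
  return k8.testClusterConnection(context);
}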


jeromy-cannon left a comment


@instamenta ,

some changes:

  • check and make sure that K8s testClusterConnection validates that the context exists in the kube config prior to setting it
  • put K8s testClusterConnection logic in a try/catch block, and if there is a failure, revert the context back to its original value (see the sketch after this comment)

In this case:

npm run solo -- deployment create -n jeromy --email [email protected] --deployment-clusters solo-e2e

the cluster solo-e2e is what solo will use as an alias to map to a context kind-solo-e2e which I supplied when it prompted me.

The local-config.yaml looks good:

❯ cat local-config.yaml
userEmailAddress: [email protected]
deployments:
  jeromy:
    clusters:
      - solo-e2e
currentDeploymentName: jeromy
clusterContextMapping:
  solo-e2e: kind-solo-e2e

Currently, testClusterConnection fails because solo-e2e isn't a valid context, but it has already updated the current kube context. So when the program attempts to exit, the lease release fails because k8 is a singleton and is still pointing to context = solo-e2e.

✔ Initialize
  ✔ Acquire lease - lease acquired successfully, attempt: 1/10
↓ Prompt local configuration
❯ Validate cluster connections
  ✖ No active cluster!
◼ Create remote config
*********************************** ERROR *****************************************
failed to read existing leases, unexpected server response of '500' received
***********************************************************************************
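
A minimal sketch of the two suggestions above, with assumed method names (getContexts, getCurrentContext, setCurrentContext, and the KubeClientLike shape are illustrative, not necessarily the real K8 API): verify the context exists in the kube config before switching to it, and revert the context on failure so the singleton k8 client can still release the lease.

// Sketch only; KubeClientLike is an assumed shape, not the actual K8 class.
interface KubeClientLike {
  getContexts(): string[];
  getCurrentContext(): string;
  setCurrentContext(context: string): void;
  testClusterConnection(context: string): Promise<boolean>;
}

async function testConnectionSafely(k8: KubeClientLike, context: string): Promise<boolean> {
  // Refuse to switch to a context that is not defined in the kube config.
  if (!k8.getContexts().includes(context)) {
    return false;
  }
  const originalContext = k8.getCurrentContext();
  try {
    return await k8.testClusterConnection(context);
  } catch {
    // Revert so later calls on the singleton client (e.g. releasing the lease)
    // still target a working context.
    k8.setCurrentContext(originalContext);
    return false;
  }
}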

Successfully merging this pull request may close these issues.

reduce redundancy on context to cluster flags in solo deployment create