🐛 Fix duplicate deployment names in klusterlet-agent availability message #671


Merged

Conversation

RokibulHasan7
Member

Summary

Deduplicate the deployment names listed in the klusterlet-agent availability condition message, so each deployment appears only once.

Related issue(s)

Fixes #

@openshift-ci openshift-ci bot requested review from xuezhaojun and zhujian7 October 25, 2024 04:29

codecov bot commented Oct 25, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 63.05%. Comparing base (865ae06) to head (980cd44).
Report is 2 commits behind head on main.

Additional details and impacted files
@@            Coverage Diff             @@
##             main     #671      +/-   ##
==========================================
+ Coverage   63.04%   63.05%   +0.01%     
==========================================
  Files         182      182              
  Lines       17606    17612       +6     
==========================================
+ Hits        11100    11106       +6     
  Misses       5594     5594              
  Partials      912      912              
Flag   Coverage            Δ
unit   63.05% <100.00%>    (+0.01%) ⬆️


@zhujian7
Member

Another option is to deduplicate the agents in the checkAgentsDeploymentAvailable func, like:

// Check the agent deployments; if each has at least one available replica, return an available condition.
func checkAgentsDeploymentAvailable(ctx context.Context, kubeClient kubernetes.Interface, agents []klusterletAgent) metav1.Condition {
	var availableMessages []string
	components := sets.Set[string]{} // import "k8s.io/apimachinery/pkg/util/sets"
	for _, agent := range agents {
		componentID := fmt.Sprintf("%s-%s", agent.namespace, agent.deploymentName)
		if components.Has(componentID) {
			continue
		}
		components.Insert(componentID)
		deployment, err := kubeClient.AppsV1().Deployments(agent.namespace).Get(ctx, agent.deploymentName, metav1.GetOptions{})
		...
}
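For illustration, the deduplication pattern above can be sketched without the Kubernetes client. This self-contained example uses a plain map as a set in place of sets.Set; the agent struct and the availableMessage helper are assumptions mirroring the klusterletAgent shape and the message-building step in the PR, not the actual implementation:

```go
package main

import (
	"fmt"
	"strings"
)

// agent mirrors the klusterletAgent shape from the PR (hypothetical fields).
type agent struct {
	namespace      string
	deploymentName string
}

// availableMessage builds the condition message, skipping duplicate
// namespace/deployment pairs via a map used as a set.
func availableMessage(agents []agent) string {
	seen := map[string]struct{}{}
	var names []string
	for _, a := range agents {
		id := fmt.Sprintf("%s-%s", a.namespace, a.deploymentName)
		if _, ok := seen[id]; ok {
			continue // same deployment listed twice; report it once
		}
		seen[id] = struct{}{}
		names = append(names, a.deploymentName)
	}
	return "deployments are ready: " + strings.Join(names, ",")
}

func main() {
	agents := []agent{
		{"open-cluster-management-agent", "klusterlet-agent"},
		{"open-cluster-management-agent", "klusterlet-agent"}, // duplicate entry
	}
	fmt.Println(availableMessage(agents))
}
```

With the duplicate entry present, the message contains "klusterlet-agent" once instead of twice, matching the after-fix output shown later in the thread.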

Signed-off-by: Rokibul Hasan <[email protected]>
@zhujian7
Member

/lgtm

@xuezhaojun
Member

@RokibulHasan7 Can we have a comparison of how the message looks before and after the change? This will help followers and code readers understand the fix better. Thank you!

@RokibulHasan7
Member Author

> @RokibulHasan7 Can we have a comparison of how the message looks before and after the change. This will help followers and code readers understand the fix better. Thank you!

Before:

  Type:			Available
  Status:		True
  LastTransitionTime:	2024-10-27 13:25:38 +0600 +06
  Reason:		KlusterletAvailable
  Message:		deployments are ready: klusterlet-agent,klusterlet-agent

After:

  Type:			Available
  Status:		True
  LastTransitionTime:	2024-10-27 13:25:38 +0600 +06
  Reason:		KlusterletAvailable
  Message:		deployments are ready: klusterlet-agent

@zhujian7
Member

/cc @qiujian16

@openshift-ci openshift-ci bot requested a review from qiujian16 October 28, 2024 01:49
@qiujian16
Member

good catch, thanks!
/approve

Contributor

openshift-ci bot commented Oct 28, 2024

[APPROVALNOTIFIER] This PR is APPROVED

This pull-request has been approved by: qiujian16, RokibulHasan7

The full list of commands accepted by this bot can be found here.

The pull request process is described here

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@openshift-merge-bot openshift-merge-bot bot merged commit e9245d4 into open-cluster-management-io:main Oct 28, 2024
15 checks passed